hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
7c40c883205e041bca866b03efc2614c4229e71f | 9,606 | py | Python | musco/pytorch/compressor/decompositions/svd_layer.py | juliagusak/musco-pytorch | 74b9f4abcbf01ed3d8aee20cd97d56617cd1314f | [
"BSD-3-Clause"
] | 48 | 2019-10-11T19:11:15.000Z | 2022-03-22T09:20:09.000Z | musco/pytorch/compressor/decompositions/svd_layer.py | juliagusak/musco-pytorch | 74b9f4abcbf01ed3d8aee20cd97d56617cd1314f | [
"BSD-3-Clause"
] | 11 | 2019-11-12T09:59:27.000Z | 2021-04-01T21:12:11.000Z | musco/pytorch/compressor/decompositions/svd_layer.py | juliagusak/musco-pytorch | 74b9f4abcbf01ed3d8aee20cd97d56617cd1314f | [
"BSD-3-Clause"
] | 13 | 2019-10-16T09:08:02.000Z | 2022-03-10T23:08:47.000Z | import numpy as np
import torch
from torch import nn
from musco.pytorch.compressor.rank_selection.estimator import estimate_rank_for_compression_rate, estimate_vbmf_ranks
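# Wraps a fully-connected layer and replaces it with two smaller nn.Linear layers
# obtained from a truncated SVD of its weight matrix.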
class SVDDecomposedLayer:
def __init__(self, layer, layer_name,
rank_selection,
rank = None,
pretrained = None,
vbmf_weaken_factor = None,
param_reduction_rate = None):
"""
rank_selection: str, 'vbmf'/'param_reduction'/'manual'
"""
self.layer_name = layer_name
self.layer = layer
self.pretrained = pretrained
self.min_rank = 8
if isinstance(self.layer, nn.Sequential):
self.in_features = self.layer[0].in_features
self.out_features = self.layer[1].out_features
else:
if not isinstance(self.layer, nn.Linear):
raise TypeError('only nn.Linear layers can be decomposed')
self.in_features = self.layer.in_features
self.out_features = self.layer.out_features
self.weight, self.bias = self.get_weights_to_decompose()
if rank_selection == 'vbmf':
self.rank = estimate_vbmf_ranks(self.weight, vbmf_weaken_factor, min_rank = self.min_rank)
elif rank_selection == 'manual':
self.rank = rank
elif rank_selection == 'param_reduction':
if isinstance(self.layer, nn.Sequential):
prev_rank = self.layer[0].out_features
else:
prev_rank = None
self.rank = estimate_rank_for_compression_rate((self.out_features, self.in_features),
rate = param_reduction_rate,
key = 'svd',
prev_rank = prev_rank,
min_rank = self.min_rank)
##### create decomposed layers
self.new_layers = nn.Sequential()
for j, l in enumerate(self.create_new_layers()):
self.new_layers.add_module('{}-{}'.format(self.layer_name, j), l)
weights, biases = self.get_svd_factors()
for j, (w, b) in enumerate(zip(weights, biases)):
new_layer = getattr(self.new_layers, '{}-{}'.format(self.layer_name, j))
new_layer.weight.data = w
if b is not None:
new_layer.bias.data = b
else:
new_layer.bias = None
self.layer = None
self.weight = None
self.bias = None
def create_new_layers(self):
layers = []
layers.append(nn.Linear(in_features = self.in_features,
out_features = self.rank,
bias = False))
layers.append(nn.Linear(in_features = self.rank,
out_features = self.out_features))
return layers
def get_weights_to_decompose(self):
if isinstance(self.layer, nn.Sequential):
#weight = self.layer[1].weight.data @ self.layer[0].weight.data
weight = self.layer[1].weight.data
try:
bias = self.layer[1].bias.data
except AttributeError:
bias = None
else:
weight = self.layer.weight.data
try:
bias = self.layer.bias.data
except AttributeError:
bias = None
return weight, bias
def get_svd_factors(self):
if self.pretrained is not None:
raise NotImplementedError('Not implemented')
else:
weights = self.weight.cpu()
if self.bias is not None:
bias = self.bias.cpu()
else:
bias = self.bias
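# Truncated SVD of the weight matrix; sqrt(S) is split between the two factors
# so that w1 @ w0 approximates the original weight.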
U, S, Vt = np.linalg.svd(weights.data.numpy(), full_matrices=False)
w0 = np.dot(np.diag(np.sqrt(S[0:self.rank])),Vt[0:self.rank, :])
w1 = np.dot(U[:, 0:self.rank], np.diag(np.sqrt(S[0:self.rank])))
if isinstance(self.layer, nn.Sequential):
w0_old = self.layer[0].weight.cpu().data
w0 = np.dot(w0, w0_old)
w0 = torch.FloatTensor(w0).contiguous()
w1 = torch.FloatTensor(w1).contiguous()
return [w0, w1], [None, bias]
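# Same idea for convolutions: a 1x1 conv kernel is treated as an
# (out_channels, in_channels) matrix and factored into two rank-r 1x1 convolutions.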
class SVDDecomposedConvLayer:
def __init__(self, layer, layer_name,
rank_selection,
rank = None,
pretrained = None,
vbmf_weaken_factor = None,
param_reduction_rate = None):
self.layer_name = layer_name
self.layer = layer
self.pretrained = pretrained
self.min_rank = 2
#print(layer)
if isinstance(self.layer, nn.Sequential):
self.in_channels = self.layer[0].in_channels
self.out_channels = self.layer[1].out_channels
self.padding = self.layer[1].padding
self.stride = self.layer[1].stride
else:
if not isinstance(self.layer, nn.Conv2d):
raise TypeError('only nn.Conv2d layers can be decomposed')
self.in_channels = self.layer.in_channels
self.out_channels = self.layer.out_channels
self.padding = self.layer.padding
self.stride = self.layer.stride
self.weight, self.bias = self.get_weights_to_decompose()
if rank_selection == 'vbmf':
self.rank = estimate_vbmf_ranks(self.weight, vbmf_weaken_factor, min_rank = self.min_rank)
elif rank_selection == 'manual':
self.rank = rank
elif rank_selection == 'param_reduction':
if isinstance(self.layer, nn.Sequential):
prev_rank = self.layer[0].out_channels
else:
prev_rank = None
self.rank = estimate_rank_for_compression_rate((self.out_channels, self.in_channels),
rate = param_reduction_rate,
key = 'svd',
prev_rank = prev_rank,
min_rank = self.min_rank)
##### create decomposed layers
self.new_layers = nn.Sequential()
for j, l in enumerate(self.create_new_layers()):
self.new_layers.add_module('{}-{}'.format(self.layer_name, j), l)
weights, biases = self.get_svd_factors()
for j, (w, b) in enumerate(zip(weights, biases)):
new_layer = getattr(self.new_layers, '{}-{}'.format(self.layer_name, j))
new_layer.weight.data = w
if b is not None:
new_layer.bias.data = b
else:
new_layer.bias = None
self.layer = None
self.weight = None
self.bias = None
def create_new_layers(self):
layers = []
layers.append(nn.Conv2d(in_channels = self.in_channels,
out_channels = self.rank,
kernel_size = 1,
bias = False))
layers.append(nn.Conv2d(in_channels = self.rank,
out_channels = self.out_channels,
kernel_size = 1,
padding = self.padding,
stride = self.stride))
return layers
def get_weights_to_decompose(self):
if isinstance(self.layer, nn.Sequential):
#weight = self.layer[1].weight.data @ self.layer[0].weight.data
weight = self.layer[1].weight.data
try:
bias = self.layer[1].bias.data
except AttributeError:
bias = None
else:
weight = self.layer.weight.data
try:
bias = self.layer.bias.data
except AttributeError:
bias = None
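# The decomposition assumes a 1x1 kernel, so only the (out_channels, in_channels)
# slice of the weight tensor is returned.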
return weight[:,:,0,0], bias
def get_svd_factors(self):
if self.pretrained is not None:
raise NotImplementedError('Not implemented')
else:
weights = self.weight.cpu()
if self.bias is not None:
bias = self.bias.cpu()
else:
bias = self.bias
U, S, Vt = np.linalg.svd(weights.data.numpy(), full_matrices=False)
w0 = np.dot(np.diag(np.sqrt(S[0:self.rank])),Vt[0:self.rank, :])
w1 = np.dot(U[:, 0:self.rank], np.diag(np.sqrt(S[0:self.rank])))
if isinstance(self.layer, nn.Sequential):
w0_old = self.layer[0].weight[:,:,0,0].cpu().data
w0 = np.dot(w0, w0_old)
w0 = torch.FloatTensor(w0[:,:, np.newaxis, np.newaxis]).contiguous()
w1 = torch.FloatTensor(w1[:,:, np.newaxis, np.newaxis]).contiguous()
return [w0, w1], [None, bias]
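# Minimal usage sketch (illustrative only; `model.fc` is a hypothetical nn.Linear attribute):
#   decomposed = SVDDecomposedLayer(model.fc, 'fc', rank_selection='manual', rank=64)
#   model.fc = decomposed.new_layers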
| 38.424 | 117 | 0.501041 | 1,023 | 9,606 | 4.516129 | 0.109482 | 0.105195 | 0.028139 | 0.045455 | 0.878355 | 0.821212 | 0.784416 | 0.718615 | 0.701732 | 0.701732 | 0 | 0.010548 | 0.39798 | 9,606 | 249 | 118 | 38.578313 | 0.788345 | 0.025193 | 0 | 0.752688 | 0 | 0 | 0.0208 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043011 | false | 0 | 0.021505 | 0 | 0.107527 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7c5d9e8881286ff3d8caef9ecf3d2fe7afc9871a | 18,077 | py | Python | saleor/payment/gateways/stripe/tests/test_webhooks.py | felipearmat/saleor | 34c01912fede74dae45edfd23c1bfdca8ad26e35 | [
"CC-BY-4.0"
] | 1 | 2021-08-12T04:16:08.000Z | 2021-08-12T04:16:08.000Z | saleor/payment/gateways/stripe/tests/test_webhooks.py | felipearmat/saleor | 34c01912fede74dae45edfd23c1bfdca8ad26e35 | [
"CC-BY-4.0"
] | 101 | 2018-06-02T17:33:17.000Z | 2022-03-28T04:46:22.000Z | saleor/payment/gateways/stripe/tests/test_webhooks.py | aminziadna/saleor | 2e78fb5bcf8b83a6278af02551a104cfa555a1fb | [
"CC-BY-4.0"
] | null | null | null | import json
from decimal import Decimal
from unittest.mock import Mock, patch
import pytest
from stripe.stripe_object import StripeObject
from .....checkout.complete_checkout import complete_checkout
from .... import ChargeStatus, TransactionKind
from ....utils import price_to_minor_unit
from ..consts import (
AUTHORIZED_STATUS,
FAILED_STATUSES,
PROCESSING_STATUS,
SUCCESS_STATUS,
WEBHOOK_AUTHORIZED_EVENT,
WEBHOOK_CANCELED_EVENT,
WEBHOOK_FAILED_EVENT,
WEBHOOK_PROCESSING_EVENT,
WEBHOOK_SUCCESS_EVENT,
)
from ..webhooks import (
handle_authorized_payment_intent,
handle_failed_payment_intent,
handle_processing_payment_intent,
handle_refund,
handle_successful_payment_intent,
)
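# Checkout payments: a successful payment intent webhook should complete the
# checkout, attach the resulting order to the payment and record a CAPTURE transaction.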
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_successful_payment_intent_for_checkout(
wrapped_checkout_complete,
payment_stripe_for_checkout,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_checkout
payment.to_confirm = True
payment.save()
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.ACTION_TO_CONFIRM,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount_received"] = price_to_minor_unit(
payment.total, payment.currency
)
payment_intent["setup_future_usage"] = None
payment_intent["currency"] = payment.currency
payment_intent["status"] = SUCCESS_STATUS
handle_successful_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert wrapped_checkout_complete.called
assert payment.checkout_id is None
assert payment.order
assert payment.order.checkout_token == str(checkout_with_items.token)
transaction = payment.transactions.get(kind=TransactionKind.CAPTURE)
assert transaction.token == payment_intent.id
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentMethod.modify")
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_successful_payment_intent_with_future_usage(
_wrapped_checkout_complete,
mocked_payment_method_modify,
payment_stripe_for_checkout,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_checkout
payment.to_confirm = True
payment.save()
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.ACTION_TO_CONFIRM,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount_received"] = price_to_minor_unit(
payment.total, payment.currency
)
payment_intent["payment_method"] = "payment_method_id"
payment_intent["setup_future_usage"] = "off_line"
payment_intent["currency"] = payment.currency
payment_intent["status"] = SUCCESS_STATUS
handle_successful_payment_intent(payment_intent, plugin.config, channel_USD.slug)
mocked_payment_method_modify.assert_called_once_with(
"payment_method_id",
api_key="secret_key",
metadata={"channel": channel_USD.slug},
)
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_successful_payment_intent_for_order(
wrapped_checkout_complete, payment_stripe_for_order, stripe_plugin, channel_USD
):
payment = payment_stripe_for_order
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = payment.total
payment_intent["currency"] = payment.currency
payment_intent["capture_method"] = "automatic"
handle_successful_payment_intent(payment_intent, plugin.config, channel_USD.slug)
assert wrapped_checkout_complete.called is False
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_successful_payment_intent_for_order_with_auth_payment(
wrapped_checkout_complete, payment_stripe_for_order, stripe_plugin, channel_USD
):
payment = payment_stripe_for_order
plugin = stripe_plugin()
payment_intent = StripeObject(id="token", last_response={})
payment_intent["amount_received"] = price_to_minor_unit(
payment.total, payment.currency
)
payment_intent["currency"] = payment.currency
payment_intent["setup_future_usage"] = None
payment_intent["status"] = SUCCESS_STATUS
handle_successful_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert payment.is_active
assert payment.charge_status == ChargeStatus.FULLY_CHARGED
assert payment.captured_amount == payment.total
assert payment.transactions.filter(kind=TransactionKind.CAPTURE).exists()
assert wrapped_checkout_complete.called is False
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_successful_payment_intent_for_order_with_pending_payment(
wrapped_checkout_complete, payment_stripe_for_order, stripe_plugin, channel_USD
):
payment = payment_stripe_for_order
transaction = payment.transactions.first()
transaction.kind = TransactionKind.PENDING
transaction.save()
plugin = stripe_plugin()
payment_intent = StripeObject(id="token", last_response={})
payment_intent["amount_received"] = price_to_minor_unit(
payment.total, payment.currency
)
payment_intent["currency"] = payment.currency
payment_intent["setup_future_usage"] = None
payment_intent["status"] = SUCCESS_STATUS
handle_successful_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert payment.is_active
assert payment.charge_status == ChargeStatus.FULLY_CHARGED
assert payment.captured_amount == payment.total
assert payment.transactions.filter(kind=TransactionKind.CAPTURE).exists()
assert wrapped_checkout_complete.called is False
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_authorized_payment_intent_for_checkout(
wrapped_checkout_complete,
payment_stripe_for_checkout,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_checkout
payment.to_confirm = True
payment.save()
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.ACTION_TO_CONFIRM,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["currency"] = payment.currency
payment_intent["status"] = AUTHORIZED_STATUS
handle_authorized_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert wrapped_checkout_complete.called
assert payment.checkout_id is None
assert payment.order
assert payment.order.checkout_token == str(checkout_with_items.token)
transaction = payment.transactions.get(kind=TransactionKind.AUTH)
assert transaction.token == payment_intent.id
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_authorized_payment_intent_for_order(
wrapped_checkout_complete,
payment_stripe_for_order,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_order
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = payment.total
payment_intent["currency"] = payment.currency
payment_intent["status"] = AUTHORIZED_STATUS
handle_authorized_payment_intent(payment_intent, plugin.config, channel_USD.slug)
assert wrapped_checkout_complete.called is False
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_authorized_payment_intent_for_processing_order_payment(
wrapped_checkout_complete,
payment_stripe_for_order,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_order
payment.charge_status = ChargeStatus.PENDING
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = payment.total
payment_intent["currency"] = payment.currency
payment_intent["status"] = AUTHORIZED_STATUS
handle_authorized_payment_intent(payment_intent, plugin.config, channel_USD.slug)
assert wrapped_checkout_complete.called is False
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_processing_payment_intent_for_order(
wrapped_checkout_complete,
payment_stripe_for_order,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_order
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = payment.total
payment_intent["currency"] = payment.currency
payment_intent["status"] = PROCESSING_STATUS
handle_processing_payment_intent(payment_intent, plugin.config, channel_USD.slug)
assert wrapped_checkout_complete.called is False
@patch(
"saleor.payment.gateways.stripe.webhooks.complete_checkout", wraps=complete_checkout
)
def test_handle_processing_payment_intent_for_checkout(
wrapped_checkout_complete,
payment_stripe_for_checkout,
checkout_with_items,
stripe_plugin,
channel_USD,
):
payment = payment_stripe_for_checkout
payment.to_confirm = True
payment.save()
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.ACTION_TO_CONFIRM,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["currency"] = payment.currency
payment_intent["status"] = PROCESSING_STATUS
handle_processing_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert wrapped_checkout_complete.called
assert payment.checkout_id is None
assert payment.order
assert payment.order.checkout_token == str(checkout_with_items.token)
transaction = payment.transactions.get(kind=TransactionKind.PENDING)
assert transaction.token == payment_intent.id
def test_handle_failed_payment_intent_for_checkout(
stripe_plugin, payment_stripe_for_checkout, channel_USD
):
payment = payment_stripe_for_checkout
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.ACTION_TO_CONFIRM,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = payment.total
payment_intent["currency"] = payment.currency
payment_intent["status"] = FAILED_STATUSES[0]
handle_failed_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert not payment.order_id
assert not payment.is_active
assert payment.charge_status == ChargeStatus.CANCELLED
assert payment.transactions.filter(kind=TransactionKind.CANCEL).exists()
def test_handle_failed_payment_intent_for_order(
stripe_plugin, payment_stripe_for_order, channel_USD
):
payment = payment_stripe_for_order
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.ACTION_TO_CONFIRM,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
payment_intent = StripeObject(id="ABC", last_response={})
payment_intent["amount"] = payment.total
payment_intent["currency"] = payment.currency
payment_intent["status"] = FAILED_STATUSES[0]
handle_failed_payment_intent(payment_intent, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert not payment.is_active
assert payment.charge_status == ChargeStatus.CANCELLED
assert payment.transactions.filter(kind=TransactionKind.CANCEL).exists()
def test_handle_fully_refund(stripe_plugin, payment_stripe_for_order, channel_USD):
payment = payment_stripe_for_order
payment.captured_amount = payment.total
payment.save()
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.CAPTURE,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
refund = StripeObject(id="refund_id")
refund["amount"] = price_to_minor_unit(payment.total, payment.currency)
refund["currency"] = payment.currency
refund["last_response"] = None
charge = StripeObject()
charge["payment_intent"] = "ABC"
charge["refunds"] = StripeObject()
charge["refunds"]["data"] = [refund]
handle_refund(charge, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert payment.charge_status == ChargeStatus.FULLY_REFUNDED
assert payment.is_active is False
assert payment.captured_amount == Decimal("0")
def test_handle_partial_refund(stripe_plugin, payment_stripe_for_order, channel_USD):
payment = payment_stripe_for_order
payment.captured_amount = payment.total
payment.save()
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.CAPTURE,
amount=payment.total,
currency=payment.currency,
token="ABC",
gateway_response={},
)
plugin = stripe_plugin()
refund = StripeObject(id="refund_id")
refund["amount"] = price_to_minor_unit(Decimal("10"), payment.currency)
refund["currency"] = payment.currency
refund["last_response"] = None
charge = StripeObject()
charge["payment_intent"] = "ABC"
charge["refunds"] = StripeObject()
charge["refunds"]["data"] = [refund]
handle_refund(charge, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert payment.charge_status == ChargeStatus.PARTIALLY_REFUNDED
assert payment.is_active is True
assert payment.captured_amount == payment.total - Decimal("10")
def test_handle_refund_already_processed(
stripe_plugin, payment_stripe_for_order, channel_USD
):
payment = payment_stripe_for_order
payment.charge_status = ChargeStatus.PARTIALLY_REFUNDED
payment.captured_amount = payment.total - Decimal("10")
payment.save()
refund_id = "refund_abc"
payment.transactions.create(
is_success=True,
action_required=True,
kind=TransactionKind.REFUND,
amount=payment.total,
currency=payment.currency,
token=refund_id,
gateway_response={},
)
plugin = stripe_plugin()
refund = StripeObject(id=refund_id)
refund["amount"] = price_to_minor_unit(Decimal("10"), payment.currency)
refund["currency"] = payment.currency
refund["last_response"] = None
charge = StripeObject()
charge["payment_intent"] = "ABC"
charge["refunds"] = StripeObject()
charge["refunds"]["data"] = [refund]
handle_refund(charge, plugin.config, channel_USD.slug)
payment.refresh_from_db()
assert payment.charge_status == ChargeStatus.PARTIALLY_REFUNDED
assert payment.is_active is True
assert payment.captured_amount == payment.total - Decimal("10")
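# Each supported Stripe webhook event type should be dispatched to its dedicated handler.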
@pytest.mark.parametrize(
"webhook_type, fun_to_mock",
[
(WEBHOOK_SUCCESS_EVENT, "handle_successful_payment_intent"),
(WEBHOOK_PROCESSING_EVENT, "handle_processing_payment_intent"),
(WEBHOOK_FAILED_EVENT, "handle_failed_payment_intent"),
(WEBHOOK_AUTHORIZED_EVENT, "handle_authorized_payment_intent"),
(WEBHOOK_CANCELED_EVENT, "handle_failed_payment_intent"),
],
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Webhook.construct_event")
def test_handle_webhook_events(
mocked_webhook_event, webhook_type, fun_to_mock, stripe_plugin, rf, channel_USD
):
dummy_payload = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
request = rf.post(
path="/webhooks/", data=dummy_payload, content_type="application/json"
)
stripe_signature = "1234"
request.META["HTTP_STRIPE_SIGNATURE"] = stripe_signature
event = Mock()
event.type = webhook_type
event.data.object = StripeObject()
mocked_webhook_event.return_value = event
plugin = stripe_plugin()
with patch(f"saleor.payment.gateways.stripe.webhooks.{fun_to_mock}") as mocked_fun:
plugin.webhook(request, "/webhooks/", None)
mocked_fun.assert_called_once_with(
event.data.object, plugin.config, channel_USD.slug
)
api_key = plugin.config.connection_params["secret_api_key"]
endpoint_secret = plugin.config.connection_params["webhook_secret"]
mocked_webhook_event.assert_called_once_with(
json.dumps(dummy_payload).encode("utf-8"),
stripe_signature,
endpoint_secret,
api_key=api_key,
)
| 32.454219 | 88 | 0.742214 | 2,063 | 18,077 | 6.148812 | 0.07174 | 0.106583 | 0.03784 | 0.03311 | 0.844225 | 0.828065 | 0.82097 | 0.807489 | 0.798818 | 0.797872 | 0 | 0.001659 | 0.166289 | 18,077 | 556 | 89 | 32.51259 | 0.840024 | 0 | 0 | 0.713666 | 0 | 0 | 0.099131 | 0.053383 | 0 | 0 | 0 | 0 | 0.104121 | 1 | 0.034707 | false | 0 | 0.021692 | 0 | 0.056399 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7c658671eca1190b1b37c212e938f681ce0dfae7 | 21,225 | py | Python | lfs/order/migrations/0007_auto_20210503_2013.py | michael-hahn/django-lfs | 26c3471a8f8d88269c84f714f507b952dfdb6397 | [
"BSD-3-Clause"
] | null | null | null | lfs/order/migrations/0007_auto_20210503_2013.py | michael-hahn/django-lfs | 26c3471a8f8d88269c84f714f507b952dfdb6397 | [
"BSD-3-Clause"
] | null | null | null | lfs/order/migrations/0007_auto_20210503_2013.py | michael-hahn/django-lfs | 26c3471a8f8d88269c84f714f507b952dfdb6397 | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 3.1.2 on 2021-05-03 20:13
from django.db import migrations, models
import django.splice.splicefields
import lfs.order.models
class Migration(migrations.Migration):
dependencies = [
('order', '0006_auto_20210406_1809'),
]
operations = [
migrations.AddField(
model_name='order',
name='account_number_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='account_number_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='bank_identification_code_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='bank_identification_code_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='bank_name_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='bank_name_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='created_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='created_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='customer_email_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='customer_email_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='customer_firstname_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='customer_firstname_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='customer_lastname_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='customer_lastname_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='depositor_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='depositor_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='ia_object_id_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='ia_object_id_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='message_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='message_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='number_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='number_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='pay_link_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='pay_link_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='payment_price_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='payment_price_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='payment_tax_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='payment_tax_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='price_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='price_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='requested_delivery_date_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='requested_delivery_date_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='sa_object_id_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='sa_object_id_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='session_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='session_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='shipping_price_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='shipping_price_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='shipping_tax_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='shipping_tax_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='state_modified_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='state_modified_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='state_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='state_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='tax_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='tax_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='uuid_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='uuid_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='voucher_number_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='voucher_number_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='voucher_price_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='voucher_price_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='order',
name='voucher_tax_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='order',
name='voucher_tax_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='price_gross_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='price_gross_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='price_net_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='price_net_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='product_amount_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='product_amount_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='product_name_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='product_name_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='product_price_gross_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='product_price_gross_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='product_price_net_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='product_price_net_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='product_sku_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='product_sku_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='product_tax_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='product_tax_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitem',
name='tax_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitem',
name='tax_taint',
field=models.BigIntegerField(default=0),
),
migrations.AddField(
model_name='orderitempropertyvalue',
name='value_synthesized',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='orderitempropertyvalue',
name='value_taint',
field=models.BigIntegerField(default=0),
),
migrations.AlterField(
model_name='order',
name='account_number',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=30, null=True, verbose_name='Account number'),
),
migrations.AlterField(
model_name='order',
name='bank_identification_code',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=30, null=True, verbose_name='Bank identication code'),
),
migrations.AlterField(
model_name='order',
name='bank_name',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Bank name'),
),
migrations.AlterField(
model_name='order',
name='created',
field=django.splice.splicefields.SpliceDateTimeField(auto_now_add=True, null=True, verbose_name='Created'),
),
migrations.AlterField(
model_name='order',
name='customer_email',
field=django.splice.splicefields.SpliceCharField(max_length=75, null=True, verbose_name='email'),
),
migrations.AlterField(
model_name='order',
name='customer_firstname',
field=django.splice.splicefields.SpliceCharField(max_length=50, null=True, verbose_name='firstname'),
),
migrations.AlterField(
model_name='order',
name='customer_lastname',
field=django.splice.splicefields.SpliceCharField(max_length=50, null=True, verbose_name='lastname'),
),
migrations.AlterField(
model_name='order',
name='depositor',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Depositor'),
),
migrations.AlterField(
model_name='order',
name='ia_object_id',
field=django.splice.splicefields.SplicePositiveIntegerField(null=True),
),
migrations.AlterField(
model_name='order',
name='message',
field=django.splice.splicefields.SpliceTextField(blank=True, null=True, verbose_name='Message'),
),
migrations.AlterField(
model_name='order',
name='number',
field=django.splice.splicefields.SpliceCharField(max_length=30, null=True),
),
migrations.AlterField(
model_name='order',
name='pay_link',
field=django.splice.splicefields.SpliceTextField(blank=True, null=True, verbose_name='pay_link'),
),
migrations.AlterField(
model_name='order',
name='payment_price',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Payment Price'),
),
migrations.AlterField(
model_name='order',
name='payment_tax',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Payment Tax'),
),
migrations.AlterField(
model_name='order',
name='price',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Price'),
),
migrations.AlterField(
model_name='order',
name='requested_delivery_date',
field=django.splice.splicefields.SpliceDateTimeField(blank=True, null=True, verbose_name='Delivery Date'),
),
migrations.AlterField(
model_name='order',
name='sa_object_id',
field=django.splice.splicefields.SplicePositiveIntegerField(null=True),
),
migrations.AlterField(
model_name='order',
name='session',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Session'),
),
migrations.AlterField(
model_name='order',
name='shipping_price',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Shipping Price'),
),
migrations.AlterField(
model_name='order',
name='shipping_tax',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Shipping Tax'),
),
migrations.AlterField(
model_name='order',
name='state',
field=django.splice.splicefields.SplicePositiveSmallIntegerField(choices=[(0, 'Submitted'), (1, 'Paid'), (7, 'Prepared'), (2, 'Sent'), (3, 'Closed'), (4, 'Canceled'), (5, 'Payment Failed'), (6, 'Payment Flagged')], default=0, null=True, verbose_name='State'),
),
migrations.AlterField(
model_name='order',
name='state_modified',
field=django.splice.splicefields.SpliceDateTimeField(auto_now_add=True, null=True, verbose_name='State modified'),
),
migrations.AlterField(
model_name='order',
name='tax',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Tax'),
),
migrations.AlterField(
model_name='order',
name='uuid',
field=django.splice.splicefields.SpliceCharField(default=lfs.order.models.get_unique_id_str, editable=False, max_length=50, null=True, unique=True),
),
migrations.AlterField(
model_name='order',
name='voucher_number',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Voucher number'),
),
migrations.AlterField(
model_name='order',
name='voucher_price',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Voucher value'),
),
migrations.AlterField(
model_name='order',
name='voucher_tax',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Voucher tax'),
),
migrations.AlterField(
model_name='orderitem',
name='price_gross',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Price gross'),
),
migrations.AlterField(
model_name='orderitem',
name='price_net',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Price net'),
),
migrations.AlterField(
model_name='orderitem',
name='product_amount',
field=django.splice.splicefields.SpliceFloatField(blank=True, null=True, verbose_name='Product quantity'),
),
migrations.AlterField(
model_name='orderitem',
name='product_name',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Product name'),
),
migrations.AlterField(
model_name='orderitem',
name='product_price_gross',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Product price gross'),
),
migrations.AlterField(
model_name='orderitem',
name='product_price_net',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Product price net'),
),
migrations.AlterField(
model_name='orderitem',
name='product_sku',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Product SKU'),
),
migrations.AlterField(
model_name='orderitem',
name='product_tax',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Product tax'),
),
migrations.AlterField(
model_name='orderitem',
name='tax',
field=django.splice.splicefields.SpliceFloatField(default=0.0, null=True, verbose_name='Tax'),
),
migrations.AlterField(
model_name='orderitempropertyvalue',
name='value',
field=django.splice.splicefields.SpliceCharField(blank=True, max_length=100, null=True, verbose_name='Value'),
),
]
| 37.171629 | 271 | 0.580589 | 1,887 | 21,225 | 6.350821 | 0.061473 | 0.083361 | 0.094626 | 0.121662 | 0.954606 | 0.930991 | 0.903371 | 0.795728 | 0.767523 | 0.751669 | 0 | 0.009512 | 0.306525 | 21,225 | 570 | 272 | 37.236842 | 0.804674 | 0.00212 | 0 | 0.739362 | 1 | 0 | 0.145292 | 0.04533 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.005319 | 0 | 0.010638 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7c75fe52eb5797c6572e0aa6c91249e8bb11c7cf | 48,342 | py | Python | tests/test_core.py | QDucasse/sdvs | 642275220704b373b7fc87340da9b4d917088a02 | [
"MIT"
] | null | null | null | tests/test_core.py | QDucasse/sdvs | 642275220704b373b7fc87340da9b4d917088a02 | [
"MIT"
] | null | null | null | tests/test_core.py | QDucasse/sdvs | 642275220704b373b7fc87340da9b4d917088a02 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# ===========================================
# author: Quentin Ducasse
# email: quentin.ducasse@ensta-bretagne.org
# github: https://github.com/QDucasse
# ===========================================
# Simulator: Process instructions one by one and show the results of their execution
# Test File!
import unittest
from unittest.mock import patch, mock_open
from sdvs.asm import ASM
from sdvs.constants import *
from sdvs.decoder import Decoder, Instruction
from sdvs.memory import Memory
from sdvs.core import Core
# Instructions file setup
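# Index of the first instruction for each mnemonic in the assembled mock program
# (every binary op appears in four operand configurations: RR, RI, IR, II).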
ADD_INDEX = 0
SUB_INDEX = 4
MUL_INDEX = 8
DIV_INDEX = 12
MOD_INDEX = 16
AND_INDEX = 20
OR_INDEX = 24
LT_INDEX = 28
GT_INDEX = 32
EQ_INDEX = 36
NOT_INDEX = 40
JMP_INDEX = 41
MOV_INDEX = 42
LOADBOOL_INDEX = 44
LOADBYTE_INDEX = 46
LOADINT_INDEX = 48
LOADSTATE_INDEX = 50
STOREBOOL_INDEX = 52
STOREBYTE_INDEX = 54
STOREINT_INDEX = 56
STORESTATE_INDEX = 58
binops = ["add", "sub", "mul", "div", "mod",
"and", "or", "lt", "gt", "eq"]
mock_file = """
not r3 r1
jmp r3 32
mov r3 r1
mov r3 234
loadbool r3 r1
loadbool r3 8
loadbyte r3 r1
loadbyte r3 8
loadint r3 r1
loadint r3 8
loadstate r3 r1
loadstate r3 8
storebool r3 r1
storebool r3 8
storebyte r3 r1
storebyte r3 8
storeint r3 r1
storeint r3 8
storestate r3 r1
storestate r3 8
"""
def append_bin_mock_file(mock_file):
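# Prepend the four operand configurations (RR, RI, IR, II) of every binary op
# to the hand-written listing in mock_file.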
bin_mock_file = ""
for op in binops:
bin_mock_file += op + " r3 r1 r2\n"
bin_mock_file += op + " r3 r1 122\n"
bin_mock_file += op + " r3 122 r2\n"
bin_mock_file += op + " r3 123 124\n"
return bin_mock_file + mock_file
mock_file = append_bin_mock_file(mock_file)
# Dummy instruction setup
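# Builds an Instruction by hand with fixed registers (rd=1, ra=2, rb=3) and
# immediates, so operand handling can be exercised without going through the assembler.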
def setUpInstruction(op_code, cfg, data_type=VAL_BOOL):
instruction = Instruction(op_code)
instruction.cfg_mask = cfg
instruction.rd = 1
instruction.ra = 2
instruction.rb = 3
instruction.imma = 122
instruction.immb = 123
instruction.address = 124
instruction.type = data_type
return instruction
class TestSimulator(unittest.TestCase):
def setUpSimOnInstruction(self, instruction):
asm = ASM()
bit_instructions = [asm.process_line(instruction)]
decoder = Decoder(bit_instructions)
memory = Memory(0, 0)
self.simulator = Core(decoder, memory)
@patch('builtins.open', mock_open(read_data=mock_file))
def setUp(self):
asm = ASM()
bit_instructions = asm.process_file("path/to/mock/file")
decoder = Decoder(bit_instructions)
memory = Memory(0, 0)
self.simulator = Core(decoder, memory)
def testAssignRegisterValue(self):
for reg in self.simulator.registers:
self.assertEqual(0, reg.value)
self.simulator.assign_register_value(7, 32)
for reg in self.simulator.registers:
if reg.number == 7:
self.assertEqual(32, reg.value)
else:
self.assertEqual(0, reg.value)
def testRetrieveRegisterValue(self):
self.simulator.registers[7].value = 32
for i, reg in enumerate(self.simulator.registers):
if reg.number == 7:
self.assertEqual(32, self.simulator.retrieve_register_value(i))
else:
self.assertEqual(0, self.simulator.retrieve_register_value(i))
def testProcessBinaryOperandsRR(self):
self.simulator.registers[2].value = 2
self.simulator.registers[3].value = 4
for op in range(OP_EQ + 1):
self.simulator.current_instruction = setUpInstruction(op, CFG_RR)
left_operand, right_operand = self.simulator.process_binary_operands()
self.assertEqual(2, left_operand)
self.assertEqual(4, right_operand)
def testProcessBinaryOperandsRI(self):
self.simulator.registers[2].value = 2
for op in range(OP_EQ + 1):
self.simulator.current_instruction = setUpInstruction(op, CFG_RI)
left_operand, right_operand = self.simulator.process_binary_operands()
self.assertEqual(2, left_operand)
self.assertEqual(123, right_operand)
def testProcessBinaryOperandsIR(self):
self.simulator.registers[3].value = 4
for op in range(OP_EQ + 1):
self.simulator.current_instruction = setUpInstruction(op, CFG_IR)
left_operand, right_operand = self.simulator.process_binary_operands()
self.assertEqual(122, left_operand)
self.assertEqual(4, right_operand)
def testProcessBinaryOperandsII(self):
for op in range(OP_EQ + 1):
self.simulator.current_instruction = setUpInstruction(op, CFG_II)
left_operand, right_operand = self.simulator.process_binary_operands()
self.assertEqual(122, left_operand)
self.assertEqual(123, right_operand)
# --------------
# ADD OPERATIONS
# --------------
def testProcessAddRR(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_add()
self.assertEqual(1 + 2, self.simulator.registers[3].value)
def testProcessAddRI(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_add()
self.assertEqual(1 + 122, self.simulator.registers[3].value)
def testProcessAddIR(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_add()
self.assertEqual(122 + 2, self.simulator.registers[3].value)
def testProcessAddII(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_add()
self.assertEqual(123 + 124, self.simulator.registers[3].value)
def testProcessOneInstructionAddRR(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 + 2, self.simulator.registers[3].value)
def testProcessOneInstructionAddRI(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 + 122, self.simulator.registers[3].value)
def testProcessOneInstructionAddIR(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 + 2, self.simulator.registers[3].value)
def testProcessOneInstructionAddII(self):
self.simulator.decoder.next_instruction_index = ADD_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 + 124, self.simulator.registers[3].value)
# --------------
# SUB OPERATIONS
# --------------
def testProcessSubRR(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_sub()
self.assertEqual(1 - 2, self.simulator.registers[3].value)
def testProcessSubRI(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_sub()
self.assertEqual(1 - 122, self.simulator.registers[3].value)
def testProcessSubIR(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_sub()
self.assertEqual(122 - 2, self.simulator.registers[3].value)
def testProcessSubII(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_sub()
self.assertEqual(123 - 124, self.simulator.registers[3].value)
def testProcessOneInstructionSubRR(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 - 2, self.simulator.registers[3].value)
def testProcessOneInstructionSubRI(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 - 122, self.simulator.registers[3].value)
def testProcessOneInstructionSubIR(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 - 2, self.simulator.registers[3].value)
def testProcessOneInstructionSubII(self):
self.simulator.decoder.next_instruction_index = SUB_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 - 124, self.simulator.registers[3].value)
# --------------
# MUL OPERATIONS
# --------------
def testProcessMulRR(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_mul()
self.assertEqual(1 * 2, self.simulator.registers[3].value)
def testProcessMulRI(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_mul()
self.assertEqual(1 * 122, self.simulator.registers[3].value)
def testProcessMulIR(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_mul()
self.assertEqual(122 * 2, self.simulator.registers[3].value)
def testProcessMulII(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_mul()
self.assertEqual(123 * 124, self.simulator.registers[3].value)
def testProcessOneInstructionMulRR(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 * 2, self.simulator.registers[3].value)
def testProcessOneInstructionMulRI(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 * 122, self.simulator.registers[3].value)
def testProcessOneInstructionMulIR(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 * 2, self.simulator.registers[3].value)
def testProcessOneInstructionMulII(self):
self.simulator.decoder.next_instruction_index = MUL_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 * 124, self.simulator.registers[3].value)
# --------------
# DIV OPERATIONS
# --------------
def testProcessDivRR(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_div()
self.assertEqual(1 // 2, self.simulator.registers[3].value)
def testProcessDivRI(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_div()
self.assertEqual(1 // 122, self.simulator.registers[3].value)
def testProcessDivIR(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_div()
self.assertEqual(122 // 2, self.simulator.registers[3].value)
def testProcessDivII(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_div()
self.assertEqual(123 // 124, self.simulator.registers[3].value)
def testProcessOneInstructionDivRR(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 // 2, self.simulator.registers[3].value)
def testProcessOneInstructionDivRI(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 // 122, self.simulator.registers[3].value)
def testProcessOneInstructionDivIR(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 // 2, self.simulator.registers[3].value)
def testProcessOneInstructionDivII(self):
self.simulator.decoder.next_instruction_index = DIV_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 // 124, self.simulator.registers[3].value)
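# NOTE: the DIV expectations use Python's floor division, so the simulator is
# assumed to perform integer division, e.g. 1 // 2 == 0, 122 // 2 == 61 and
# 123 // 124 == 0.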
# --------------
# MOD OPERATIONS
# --------------
def testProcessModRR(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_mod()
self.assertEqual(1 % 2, self.simulator.registers[3].value)
def testProcessModRI(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_mod()
self.assertEqual(1 % 122, self.simulator.registers[3].value)
def testProcessModIR(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_mod()
self.assertEqual(122 % 2, self.simulator.registers[3].value)
def testProcessModII(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_mod()
self.assertEqual(123 % 124, self.simulator.registers[3].value)
def testProcessOneInstructionModRR(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 % 2, self.simulator.registers[3].value)
def testProcessOneInstructionModRI(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 % 122, self.simulator.registers[3].value)
def testProcessOneInstructionModIR(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 % 2, self.simulator.registers[3].value)
def testProcessOneInstructionModII(self):
self.simulator.decoder.next_instruction_index = MOD_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 % 124, self.simulator.registers[3].value)
# --------------
# AND OPERATIONS
# --------------
def testProcessAndRR(self):
self.simulator.decoder.next_instruction_index = AND_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_and()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessAndRI(self):
self.simulator.decoder.next_instruction_index = AND_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_and()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessAndIR(self):
self.simulator.decoder.next_instruction_index = AND_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_and()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessAndII(self):
self.simulator.decoder.next_instruction_index = AND_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_and()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessFalseAndRR(self):
self.setUpSimOnInstruction("and r3 r2 r1")
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 0
self.simulator.registers[2].value = 123
self.simulator.process_and()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessFalseAndRI(self):
self.setUpSimOnInstruction("and r3 r2 0")
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 123
self.simulator.process_and()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessFalseAndIR(self):
self.setUpSimOnInstruction("and r3 123 r1")
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 0
self.simulator.process_and()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessFalseAndII(self):
self.setUpSimOnInstruction("and r3 0 123")
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_and()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessOneInstructionAndRR(self):
self.simulator.decoder.next_instruction_index = AND_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionAndRI(self):
self.simulator.decoder.next_instruction_index = AND_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionAndIR(self):
self.simulator.decoder.next_instruction_index = AND_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionAndII(self):
self.simulator.decoder.next_instruction_index = AND_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionFalseAndRR(self):
self.setUpSimOnInstruction("and r3 r2 r1")
self.simulator.registers[1].value = 0
self.simulator.registers[2].value = 123
self.simulator.process_one_instruction()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessOneInstructionFalseAndRI(self):
self.setUpSimOnInstruction("and r3 r2 0")
self.simulator.registers[2].value = 123
self.simulator.process_one_instruction()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessOneInstructionFalseAndIR(self):
self.setUpSimOnInstruction("and r3 123 r1")
self.simulator.registers[1].value = 0
self.simulator.process_one_instruction()
self.assertEqual(0, self.simulator.registers[3].value)
def testProcessOneInstructionFalseAndII(self):
self.setUpSimOnInstruction("and r3 0 123")
self.simulator.process_one_instruction()
self.assertEqual(0, self.simulator.registers[3].value)
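# NOTE: the AND expectations are logical rather than bitwise: with operands 1
# and 2 the tests expect 1 (both truthy), whereas bitwise 1 & 2 == 0.  The
# "False" variants expect 0 whenever one operand is 0, and the OR tests below
# make the same assumption (expecting 1 rather than the bitwise 1 | 2 == 3).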
# --------------
# OR OPERATIONS
# --------------
def testProcessOrRR(self):
self.simulator.decoder.next_instruction_index = OR_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_or()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOrRI(self):
self.simulator.decoder.next_instruction_index = OR_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_or()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOrIR(self):
self.simulator.decoder.next_instruction_index = OR_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_or()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOrII(self):
self.simulator.decoder.next_instruction_index = OR_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_or()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionOrRR(self):
self.simulator.decoder.next_instruction_index = OR_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionOrRI(self):
self.simulator.decoder.next_instruction_index = OR_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionOrIR(self):
self.simulator.decoder.next_instruction_index = OR_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionOrII(self):
self.simulator.decoder.next_instruction_index = OR_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
# --------------
# LT OPERATIONS
# --------------
def testProcessLtRR(self):
self.simulator.decoder.next_instruction_index = LT_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_less_than()
self.assertEqual(1 < 2, self.simulator.registers[3].value)
def testProcessLtRI(self):
self.simulator.decoder.next_instruction_index = LT_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_less_than()
self.assertEqual(1 < 122, self.simulator.registers[3].value)
def testProcessLtIR(self):
self.simulator.decoder.next_instruction_index = LT_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_less_than()
self.assertEqual(122 < 2, self.simulator.registers[3].value)
def testProcessLtII(self):
self.simulator.decoder.next_instruction_index = LT_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_less_than()
self.assertEqual(123 < 124, self.simulator.registers[3].value)
def testProcessOneInstructionLtRR(self):
self.simulator.decoder.next_instruction_index = LT_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 < 2, self.simulator.registers[3].value)
def testProcessOneInstructionLtRI(self):
self.simulator.decoder.next_instruction_index = LT_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 < 122, self.simulator.registers[3].value)
def testProcessOneInstructionLtIR(self):
self.simulator.decoder.next_instruction_index = LT_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 < 2, self.simulator.registers[3].value)
def testProcessOneInstructionLtII(self):
self.simulator.decoder.next_instruction_index = LT_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 < 124, self.simulator.registers[3].value)
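# NOTE: these comparisons (and the GT/EQ sections below) check a Python bool
# against the register value, so they rely on bool/int equivalence
# (True == 1, False == 0); the simulator is assumed to store 1 or 0 as the
# comparison result.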
# --------------
# GT OPERATIONS
# --------------
def testProcessGtRR(self):
self.simulator.decoder.next_instruction_index = GT_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_greater_than()
self.assertEqual(1 > 2, self.simulator.registers[3].value)
def testProcessGtRI(self):
self.simulator.decoder.next_instruction_index = GT_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_greater_than()
self.assertEqual(1 > 122, self.simulator.registers[3].value)
def testProcessGtIR(self):
self.simulator.decoder.next_instruction_index = GT_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_greater_than()
self.assertEqual(122 > 2, self.simulator.registers[3].value)
def testProcessGtII(self):
self.simulator.decoder.next_instruction_index = GT_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_greater_than()
self.assertEqual(123 > 124, self.simulator.registers[3].value)
def testProcessOneInstructionGtRR(self):
self.simulator.decoder.next_instruction_index = GT_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 > 2, self.simulator.registers[3].value)
def testProcessOneInstructionGtRI(self):
self.simulator.decoder.next_instruction_index = GT_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 > 122, self.simulator.registers[3].value)
def testProcessOneInstructionGtIR(self):
self.simulator.decoder.next_instruction_index = GT_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 > 2, self.simulator.registers[3].value)
def testProcessOneInstructionGtII(self):
self.simulator.decoder.next_instruction_index = GT_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 > 124, self.simulator.registers[3].value)
# --------------
# EQ OPERATIONS
# --------------
def testProcessEqRR(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_equal()
self.assertEqual(1 == 2, self.simulator.registers[3].value)
def testProcessEqRI(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 1
self.simulator.process_equal()
self.assertEqual(1 == 122, self.simulator.registers[3].value)
def testProcessEqIR(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX + 2
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[2].value = 2
self.simulator.process_equal()
self.assertEqual(122 == 2, self.simulator.registers[3].value)
def testProcessEqII(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX + 3
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_equal()
self.assertEqual(123 == 124, self.simulator.registers[3].value)
def testProcessOneInstructionEqRR(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX
self.simulator.registers[1].value = 1
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(1 == 2, self.simulator.registers[3].value)
def testProcessOneInstructionEqRI(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX + 1
self.simulator.registers[1].value = 1
self.simulator.process_one_instruction()
self.assertEqual(1 == 122, self.simulator.registers[3].value)
def testProcessOneInstructionEqIR(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX + 2
self.simulator.registers[2].value = 2
self.simulator.process_one_instruction()
self.assertEqual(122 == 2, self.simulator.registers[3].value)
def testProcessOneInstructionEqII(self):
self.simulator.decoder.next_instruction_index = EQ_INDEX + 3
self.simulator.process_one_instruction()
self.assertEqual(123 == 124, self.simulator.registers[3].value)
# --------------
# NOT OPERATIONS
# --------------
def testProcessNot(self):
self.simulator.decoder.next_instruction_index = NOT_INDEX
self.simulator.registers[1].value = 23
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_not()
self.assertEqual(not 23, self.simulator.registers[3].value)
def testProcessOneInstructionNot(self):
self.simulator.decoder.next_instruction_index = NOT_INDEX
self.simulator.registers[1].value = 23
self.simulator.process_one_instruction()
self.assertEqual(not 23, self.simulator.registers[3].value)
# --------------
# JMP OPERATIONS
# --------------
def testProcessJmpTrue(self):
self.simulator.decoder.next_instruction_index = JMP_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[3].value = 1 # True
self.simulator.process_jmp()
self.assertEqual(JMP_INDEX + 1, self.simulator.decoder.next_instruction_index)
def testProcessOneInstructionJmpTrue(self):
self.simulator.decoder.next_instruction_index = JMP_INDEX
self.simulator.registers[3].value = 1 # True
self.simulator.process_one_instruction()
self.assertEqual(JMP_INDEX + 1, self.simulator.decoder.next_instruction_index)
def testProcessJmpFalse(self):
self.simulator.decoder.next_instruction_index = JMP_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[3].value = 0 # False
self.simulator.process_jmp()
self.assertEqual(32, self.simulator.decoder.next_instruction_index)
def testProcessOneInstructionJmpFalse(self):
self.simulator.decoder.next_instruction_index = JMP_INDEX
self.simulator.registers[3].value = 0 # False
self.simulator.process_one_instruction()
self.assertEqual(32, self.simulator.decoder.next_instruction_index)
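# NOTE: read together, these expectations suggest jmp behaves as a
# branch-if-false: with r3 truthy the decoder simply advances to JMP_INDEX + 1,
# and with r3 falsy it is redirected to index 32, which is presumably the
# target encoded in the fixture's jmp instruction.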
# --------------
# MOV OPERATIONS
# --------------
def testProcessMovReg(self):
self.simulator.decoder.next_instruction_index = MOV_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.registers[1].value = 32
self.simulator.process_load()
self.assertEqual(32, self.simulator.registers[3].value)
def testProcessMovImm(self):
self.simulator.decoder.next_instruction_index = MOV_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.process_load()
self.assertEqual(234, self.simulator.registers[3].value)
def testProcessOneInstructionMovReg(self):
self.simulator.decoder.next_instruction_index = MOV_INDEX
self.simulator.registers[1].value = 32
self.simulator.process_one_instruction()
self.assertEqual(32, self.simulator.registers[3].value)
def testProcessOneInstructionMovImm(self):
self.simulator.decoder.next_instruction_index = MOV_INDEX + 1
self.simulator.process_one_instruction()
self.assertEqual(234, self.simulator.registers[3].value)
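# NOTE: the mov tests dispatch through process_load(), so mov is presumably
# handled by the same load path (register or immediate source into r3) rather
# than by a dedicated process_mov() method.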
# ---------------
# LOAD OPERATIONS
# ---------------
def testProcessLoadBoolRAA(self):
self.simulator.decoder.next_instruction_index = LOADBOOL_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeee01ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_load()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessLoadBoolADR(self):
self.simulator.decoder.next_instruction_index = LOADBOOL_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeee01ee)
self.simulator.process_load()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionLoadBoolRAA(self):
self.simulator.decoder.next_instruction_index = LOADBOOL_INDEX
self.simulator.memory = Memory(40, 0xeeeeee01ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessOneInstructionLoadBoolADR(self):
self.simulator.decoder.next_instruction_index = LOADBOOL_INDEX + 1
self.simulator.memory = Memory(40, 0xeeeeee01ee)
self.simulator.process_one_instruction()
self.assertEqual(1, self.simulator.registers[3].value)
def testProcessLoadByteRAA(self):
self.simulator.decoder.next_instruction_index = LOADBYTE_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeee24ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_load()
self.assertEqual(0x24, self.simulator.registers[3].value)
def testProcessLoadByteADR(self):
self.simulator.decoder.next_instruction_index = LOADBYTE_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeee24ee)
self.simulator.process_load()
self.assertEqual(0x24, self.simulator.registers[3].value)
def testProcessOneInstructionLoadByteRAA(self):
self.simulator.decoder.next_instruction_index = LOADBYTE_INDEX
self.simulator.memory = Memory(40, 0xeeeeee24ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_one_instruction()
self.assertEqual(0x24, self.simulator.registers[3].value)
def testProcessOneInstructionLoadByteADR(self):
self.simulator.decoder.next_instruction_index = LOADBYTE_INDEX + 1
self.simulator.memory = Memory(40, 0xeeeeee24ee)
self.simulator.process_one_instruction()
self.assertEqual(0x24, self.simulator.registers[3].value)
def testProcessLoadIntRAA(self):
self.simulator.decoder.next_instruction_index = LOADINT_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(48, 0xee12341234ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_load()
self.assertEqual(0x12341234, self.simulator.registers[3].value)
def testProcessLoadIntADR(self):
self.simulator.decoder.next_instruction_index = LOADINT_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(48, 0xee12341234ee)
self.simulator.process_load()
self.assertEqual(0x12341234, self.simulator.registers[3].value)
def testProcessOneInstructionLoadIntRAA(self):
self.simulator.decoder.next_instruction_index = LOADINT_INDEX
self.simulator.memory = Memory(48, 0xee12341234ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_one_instruction()
self.assertEqual(0x12341234, self.simulator.registers[3].value)
def testProcessOneInstructionLoadIntADR(self):
self.simulator.decoder.next_instruction_index = LOADINT_INDEX + 1
self.simulator.memory = Memory(48, 0xee12341234ee)
self.simulator.process_one_instruction()
self.assertEqual(0x12341234, self.simulator.registers[3].value)
def testProcessLoadStateRAA(self):
self.simulator.decoder.next_instruction_index = LOADSTATE_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeee1234ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_load()
self.assertEqual(0x1234, self.simulator.registers[3].value)
def testProcessLoadStateADR(self):
self.simulator.decoder.next_instruction_index = LOADSTATE_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeee1234ee)
self.simulator.process_load()
self.assertEqual(0x1234, self.simulator.registers[3].value)
def testProcessOneInstructionLoadStateRAA(self):
self.simulator.decoder.next_instruction_index = LOADSTATE_INDEX
self.simulator.memory = Memory(40, 0xeeee1234ee)
self.simulator.registers[1].value = 8 # address
self.simulator.process_one_instruction()
self.assertEqual(0x1234, self.simulator.registers[3].value)
def testProcessOneInstructionLoadStateADR(self):
self.simulator.decoder.next_instruction_index = LOADSTATE_INDEX + 1
self.simulator.memory = Memory(40, 0xeeee1234ee)
self.simulator.process_one_instruction()
self.assertEqual(0x1234, self.simulator.registers[3].value)
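# NOTE: the load expectations imply that an address is a bit offset from the
# least-significant end of Memory.raw_memory, with assumed widths of 8 bits
# for bool/byte, 16 bits for state and 32 bits for int, e.g.
#   (0xeeeeee01ee >> 8) & 0xff == 0x01
#   (0xeeee1234ee >> 8) & 0xffff == 0x1234
#   (0xee12341234ee >> 8) & 0xffffffff == 0x12341234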
# ----------------
# STORE OPERATIONS
# ----------------
def testProcessStoreBoolRAA(self):
self.simulator.decoder.next_instruction_index = STOREBOOL_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x01 # value
self.simulator.process_store()
self.assertEqual(0xeeeeee01ee, self.simulator.memory.raw_memory)
def testProcessStoreBoolADR(self):
self.simulator.decoder.next_instruction_index = STOREBOOL_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[3].value = 0x01 # value
self.simulator.process_store()
self.assertEqual(0xeeeeee01ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreBoolRAA(self):
self.simulator.decoder.next_instruction_index = STOREBOOL_INDEX
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x01 # value
self.simulator.process_one_instruction()
self.assertEqual(0xeeeeee01ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreBoolADR(self):
self.simulator.decoder.next_instruction_index = STOREBOOL_INDEX + 1
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[3].value = 0x01 # value
self.simulator.process_one_instruction()
self.assertEqual(0xeeeeee01ee, self.simulator.memory.raw_memory)
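# NOTE: the store expectations mirror the load convention above; writing 0x01
# at bit offset 8 turns 0xeeeeeeeeee into
# (0xeeeeeeeeee & ~(0xff << 8)) | (0x01 << 8) == 0xeeeeee01ee.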
def testProcessStoreByteRAA(self):
self.simulator.decoder.next_instruction_index = STOREBYTE_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x24 # value
self.simulator.process_store()
self.assertEqual(0xeeeeee24ee, self.simulator.memory.raw_memory)
def testProcessStoreByteADR(self):
self.simulator.decoder.next_instruction_index = STOREBYTE_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[3].value = 0x24 # value
self.simulator.process_store()
self.assertEqual(0xeeeeee24ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreByteRAA(self):
self.simulator.decoder.next_instruction_index = STOREBYTE_INDEX
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x24 # value
self.simulator.process_one_instruction()
self.assertEqual(0xeeeeee24ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreByteADR(self):
self.simulator.decoder.next_instruction_index = STOREBYTE_INDEX + 1
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[3].value = 0x24 # value
self.simulator.process_one_instruction()
self.assertEqual(0xeeeeee24ee, self.simulator.memory.raw_memory)
def testProcessStoreIntRAA(self):
self.simulator.decoder.next_instruction_index = STOREINT_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(48, 0xeeeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x12341234 # value
self.simulator.process_store()
self.assertEqual(0xee12341234ee, self.simulator.memory.raw_memory)
def testProcessStoreIntADR(self):
self.simulator.decoder.next_instruction_index = STOREINT_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(48, 0xeeeeeeeeeeee)
self.simulator.registers[3].value = 0x12341234 # value
self.simulator.process_store()
self.assertEqual(0xee12341234ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreIntRAA(self):
self.simulator.decoder.next_instruction_index = STOREINT_INDEX
self.simulator.memory = Memory(48, 0xeeeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x12341234 # value
self.simulator.process_one_instruction()
self.assertEqual(0xee12341234ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreIntADR(self):
self.simulator.decoder.next_instruction_index = STOREINT_INDEX + 1
self.simulator.memory = Memory(48, 0xeeeeeeeeeeee)
self.simulator.registers[3].value = 0x12341234 # value
self.simulator.process_one_instruction()
self.assertEqual(0xee12341234ee, self.simulator.memory.raw_memory)
def testProcessStoreStateRAA(self):
self.simulator.decoder.next_instruction_index = STORESTATE_INDEX
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x1234 # value
self.simulator.process_store()
self.assertEqual(0xeeee1234ee, self.simulator.memory.raw_memory)
def testProcessStoreStateADR(self):
self.simulator.decoder.next_instruction_index = STORESTATE_INDEX + 1
self.simulator.current_instruction = self.simulator.decoder.decode_next()
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x1234 # value
self.simulator.process_store()
self.assertEqual(0xeeee1234ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreStateRAA(self):
self.simulator.decoder.next_instruction_index = STORESTATE_INDEX
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[1].value = 8 # address
self.simulator.registers[3].value = 0x1234 # value
self.simulator.process_one_instruction()
self.assertEqual(0xeeee1234ee, self.simulator.memory.raw_memory)
def testProcessOneInstructionStoreStateADR(self):
self.simulator.decoder.next_instruction_index = STORESTATE_INDEX + 1
self.simulator.memory = Memory(40, 0xeeeeeeeeee)
self.simulator.registers[3].value = 0x1234 # value
self.simulator.process_one_instruction()
self.assertEqual(0xeeee1234ee, self.simulator.memory.raw_memory)
| 43.987261 | 86 | 0.701771 | 5,445 | 48,342 | 6.078604 | 0.057117 | 0.272584 | 0.164179 | 0.091728 | 0.846244 | 0.843737 | 0.834824 | 0.829537 | 0.823947 | 0.802133 | 0 | 0.033907 | 0.189814 | 48,342 | 1,098 | 87 | 44.027322 | 0.811163 | 0.026871 | 0 | 0.743151 | 0 | 0 | 0.010266 | 0 | 0 | 0 | 0.016528 | 0 | 0.163242 | 1 | 0.159817 | false | 0 | 0.007991 | 0 | 0.171233 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7c77cc8f2c561785e48737f1fec48d3c9ef2ce0b | 103145 | py | Python | assets/useragent/chrome.py | ice-melt/python-lib | 345e34fff7386d91acbb03a01fd4127c5dfed037 | [
"MIT"
] | 74 | 2018-07-31T05:04:26.000Z | 2021-02-18T05:51:22.000Z | assets/useragent/chrome.py | ice-melt/python-lib | 345e34fff7386d91acbb03a01fd4127c5dfed037 | [
"MIT"
] | null | null | null | assets/useragent/chrome.py | ice-melt/python-lib | 345e34fff7386d91acbb03a01fd4127c5dfed037 | [
"MIT"
] | 39 | 2018-08-30T07:02:51.000Z | 2021-03-22T11:47:01.000Z | chrome = [
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2226.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.4; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2225.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2225.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2224.3 Safari/537.36',
'Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.93 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.124 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 4.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.67 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.67 Safari/537.36',
'Mozilla/5.0 (X11; OpenBSD i386) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.125 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1944.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.3319.102 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.2309.372 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.2117.157 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.47 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1866.237 Safari/537.36',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.137 Safari/4E423F',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.116 Safari/537.36 Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.10',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.517 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1667.0 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1664.3 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1664.3 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.16 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1623.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.17 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.62 Safari/537.36',
'Mozilla/5.0 (X11; CrOS i686 4319.74.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.57 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.2 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1468.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1467.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1464.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1500.55 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.93 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.93 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.93 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.93 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.93 Safari/537.36',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.93 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.90 Safari/537.36',
'Mozilla/5.0 (X11; NetBSD) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.116 Safari/537.36',
'Mozilla/5.0 (X11; CrOS i686 3912.101.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.116 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.60 Safari/537.17',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1309.0 Safari/537.17',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.15 (KHTML, like Gecko) Chrome/24.0.1295.0 Safari/537.15',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.14 (KHTML, like Gecko) Chrome/24.0.1292.0 Safari/537.14',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.13 (KHTML, like Gecko) Chrome/24.0.1290.1 Safari/537.13',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.13 (KHTML, like Gecko) Chrome/24.0.1290.1 Safari/537.13',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.13 (KHTML, like Gecko) Chrome/24.0.1290.1 Safari/537.13',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_4) AppleWebKit/537.13 (KHTML, like Gecko) Chrome/24.0.1290.1 Safari/537.13',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.13 (KHTML, like Gecko) Chrome/24.0.1284.0 Safari/537.13',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.6 Safari/537.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.6 Safari/537.11',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.26 Safari/537.11',
'Mozilla/5.0 (Windows NT 6.0) yi; AppleWebKit/345667.12221 (KHTML, like Gecko) Chrome/23.0.1271.26 Safari/453667.1221',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.17 Safari/537.11',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_0) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.79 Safari/537.4',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.2 (KHTML, like Gecko) Chrome/22.0.1216.0 Safari/537.2',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1',
'Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5',
'Mozilla/5.0 (X11; FreeBSD amd64) AppleWebKit/536.5 (KHTML like Gecko) Chrome/19.0.1084.56 Safari/1EA69',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/535.22 (KHTML, like Gecko) Chrome/19.0.1047.0 Safari/535.22',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.21 (KHTML, like Gecko) Chrome/19.0.1042.0 Safari/535.21',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.21 (KHTML, like Gecko) Chrome/19.0.1041.0 Safari/535.21',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/535.20 (KHTML, like Gecko) Chrome/19.0.1036.7 Safari/535.20',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/18.6.872.0 Safari/535.2 UNTRUSTED/1.0 3gpp-gba UNTRUSTED/1.0',
'Mozilla/5.0 (Macintosh; AMD Mac OS X 10_8_2) AppleWebKit/535.22 (KHTML, like Gecko) Chrome/18.6.872',
'Mozilla/5.0 (X11; CrOS i686 1660.57.0) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.46 Safari/535.19',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.45 Safari/535.19',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.45 Safari/535.19',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.45 Safari/535.19',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.151 Safari/535.19',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.19 (KHTML, like Gecko) Ubuntu/11.10 Chromium/18.0.1025.142 Chrome/18.0.1025.142 Safari/535.19',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.11 Safari/535.19',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.66 Safari/535.11',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Ubuntu/11.10 Chromium/17.0.963.65 Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Ubuntu/11.04 Chromium/17.0.963.65 Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Ubuntu/10.10 Chromium/17.0.963.65 Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.11 (KHTML, like Gecko) Ubuntu/11.10 Chromium/17.0.963.65 Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (X11; FreeBSD amd64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_4) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.65 Safari/535.11',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Ubuntu/11.04 Chromium/17.0.963.56 Chrome/17.0.963.56 Safari/535.11',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.12 Safari/535.11',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.8 (KHTML, like Gecko) Chrome/17.0.940.0 Safari/535.8',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.77 Safari/535.7ad-imcjapan-syosyaman-xkgi3lqg03!wgz',
'Mozilla/5.0 (X11; CrOS i686 1193.158.0) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.75 Safari/535.7',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.75 Safari/535.7',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.75 Safari/535.7',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.63 Safari/535.7xs5D9rRDFpg2g',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.8 (KHTML, like Gecko) Chrome/16.0.912.63 Safari/535.8',
'Mozilla/5.0 (Windows NT 5.2; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.63 Safari/535.7',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.36 Safari/535.7',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.36 Safari/535.7',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.36 Safari/535.7',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.6 (KHTML, like Gecko) Chrome/16.0.897.0 Safari/535.6',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.54 Safari/535.2',
'Mozilla/5.0 (X11; FreeBSD i386) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.2 (KHTML, like Gecko) Ubuntu/11.10 Chromium/15.0.874.120 Chrome/15.0.874.120 Safari/535.2',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.120 Safari/535.2',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.872.0 Safari/535.2',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.2 (KHTML, like Gecko) Ubuntu/11.04 Chromium/15.0.871.0 Chrome/15.0.871.0 Safari/535.2',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.864.0 Safari/535.2',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.861.0 Safari/535.2',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.861.0 Safari/535.2',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.861.0 Safari/535.2',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.860.0 Safari/535.2',
'Chrome/15.0.860.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/15.0.860.0',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.835.186 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.834.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/11.04 Chromium/14.0.825.0 Chrome/14.0.825.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.824.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.815.10913 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.815.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/11.04 Chromium/14.0.814.0 Chrome/14.0.814.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.814.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/10.04 Chromium/14.0.813.0 Chrome/14.0.813.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.813.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.813.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.813.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_7) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.813.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.812.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.811.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.810.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.810.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.809.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/10.10 Chromium/14.0.808.0 Chrome/14.0.808.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/10.04 Chromium/14.0.808.0 Chrome/14.0.808.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/10.04 Chromium/14.0.804.0 Chrome/14.0.804.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.803.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/11.04 Chromium/14.0.803.0 Chrome/14.0.803.0 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.803.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.803.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_7) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.803.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.803.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.801.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.801.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.794.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.794.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.792.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.792.0 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.792.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; PPC Mac OS X 10_6_7) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.790.0 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_7) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.790.0 Safari/535.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.1) AppleWebKit/526.3 (KHTML, like Gecko) Chrome/14.0.564.21 Safari/526.3',
'Mozilla/5.0 (X11; CrOS i686 13.587.48) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.43 Safari/535.1',
'Mozilla/5.0 Slackware/13.37 (X11; U; Linux x86_64; en-US) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41',
'Mozilla/5.0 ArchLinux (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Ubuntu/11.04 Chromium/13.0.782.41 Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.2; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_7) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_3) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.41 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_3) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.32 Safari/535.1',
'Mozilla/5.0 (X11; Linux amd64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.24 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.24 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.24 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.220 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.220 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.220 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.215 Safari/535.1',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.215 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.215 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.215 Safari/535.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.20 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.20 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.20 Safari/535.1',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.20 Safari/535.1',
'Mozilla/5.0 (X11; CrOS i686 0.13.587) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.14 Safari/535.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.107 Safari/535.1',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.107 Safari/535.1',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.1 Safari/535.1',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.36 (KHTML, like Gecko) Chrome/13.0.766.0 Safari/534.36',
'Mozilla/5.0 (X11; Linux amd64) AppleWebKit/534.36 (KHTML, like Gecko) Chrome/13.0.766.0 Safari/534.36',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.35 (KHTML, like Gecko) Ubuntu/10.10 Chromium/13.0.764.0 Chrome/13.0.764.0 Safari/534.35',
'Mozilla/5.0 (X11; CrOS i686 0.13.507) AppleWebKit/534.35 (KHTML, like Gecko) Chrome/13.0.763.0 Safari/534.35',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.33 (KHTML, like Gecko) Ubuntu/9.10 Chromium/13.0.752.0 Chrome/13.0.752.0 Safari/534.33',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/534.31 (KHTML, like Gecko) Chrome/13.0.748.0 Safari/534.31',
'Mozilla/5.0 (Windows NT 6.1; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.750.0 Safari/534.30',
'Mozilla/5.0 (X11; CrOS i686 12.433.109) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.93 Safari/534.30',
'Mozilla/5.0 (X11; CrOS i686 12.0.742.91) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.93 Safari/534.30',
'Mozilla/5.0 Slackware/13.37 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/12.0.742.91',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.91 Chromium/12.0.742.91 Safari/534.30',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.68 Safari/534.30',
'Mozilla/5.0 ArchLinux (X11; U; Linux x86_64; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.60 Safari/534.30',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.53 Safari/534.30',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.113 Safari/534.30',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/11.04 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/10.10 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/10.04 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/11.04 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/10.10 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/10.04 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (Windows NT 7.1) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (Windows NT 5.2) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (Windows 8) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_6) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_4) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30',
'Mozilla/5.0 (X11; CrOS i686 12.433.216) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.105 Safari/534.30',
'Mozilla/5.0 ArchLinux (X11; U; Linux x86_64; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30',
'Mozilla/5.0 ArchLinux (X11; U; Linux x86_64; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Slackware/Chrome/12.0.742.100 Safari/534.30',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_4) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.724.100 Safari/534.30',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/534.25 (KHTML, like Gecko) Chrome/12.0.706.0 Safari/534.25',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/534.25 (KHTML, like Gecko) Chrome/12.0.704.0 Safari/534.25',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Ubuntu/10.10 Chromium/12.0.703.0 Chrome/12.0.703.0 Safari/534.24',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.24 (KHTML, like Gecko) Ubuntu/10.10 Chromium/12.0.702.0 Chrome/12.0.702.0 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/12.0.702.0 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/12.0.702.0 Safari/534.24',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.700.3 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.699.0 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.699.0 Safari/534.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_6) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.698.0 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.697.0 Safari/534.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.71 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.68 Safari/534.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_7) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.68 Safari/534.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.68 Safari/534.24',
'Mozilla/5.0 Slackware/13.37 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/11.0.696.50',
'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.43 Safari/534.24',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.34 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.34 Safari/534.24',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.3 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.3 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.0) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.3 Safari/534.24',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.14 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.12 Safari/534.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_6) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.12 Safari/534.24',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Ubuntu/10.04 Chromium/11.0.696.0 Chrome/11.0.696.0 Safari/534.24',
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.0 Safari/534.24',
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.694.0 Safari/534.24',
'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.23 (KHTML, like Gecko) Chrome/11.0.686.3 Safari/534.23',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.682.0 Safari/534.21',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.672.2 Safari/534.20',
'Mozilla/5.0 (Windows NT) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.672.2 Safari/534.20',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.672.2 Safari/534.20',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.669.0 Safari/534.20',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.19 (KHTML, like Gecko) Chrome/11.0.661.0 Safari/534.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.18 (KHTML, like Gecko) Chrome/11.0.661.0 Safari/534.18',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.18 (KHTML, like Gecko) Chrome/11.0.660.0 Safari/534.18',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.655.0 Safari/534.17',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.655.0 Safari/534.17',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.654.0 Safari/534.17',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.652.0 Safari/534.17',
'Mozilla/4.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/11.0.1245.0 Safari/537.36',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/10.0.649.0 Safari/534.17',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/10.0.649.0 Safari/534.17',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.82 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux armv7l; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16',
'Mozilla/5.0 (X11; U; FreeBSD x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16',
'Mozilla/5.0 (X11; U; FreeBSD i386; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.133 Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.133 Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.127 Chrome/10.0.648.127 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.127 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.127 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.127 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.11 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; ru-RU; AppleWebKit/534.16; KHTML; like Gecko; Chrome/10.0.648.11;Safari/534.16)',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; ru-RU) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.11 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.11 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.0 Chrome/10.0.648.0 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.0 Chrome/10.0.648.0 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.0 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.642.0 Chrome/10.0.642.0 Safari/534.16',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.639.0 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.638.0 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.634.0 Safari/534.16',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.634.0 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 SUSE/10.0.626.0 (KHTML, like Gecko) Chrome/10.0.626.0 Safari/534.16',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Chrome/10.0.613.0 Safari/534.15',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.613.0 Chrome/10.0.613.0 Safari/534.15',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Ubuntu/10.04 Chromium/10.0.612.3 Chrome/10.0.612.3 Safari/534.15',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Chrome/10.0.612.1 Safari/534.15',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.611.0 Chrome/10.0.611.0 Safari/534.15',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/10.0.602.0 Safari/534.14',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/10.0.601.0 Safari/534.14',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/10.0.601.0 Safari/534.14',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML,like Gecko) Chrome/9.1.0.0 Safari/540.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML, like Gecko) Ubuntu/10.10 Chrome/9.1.0.0 Safari/540.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.601.0 Safari/534.14',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Ubuntu/10.10 Chromium/9.0.600.0 Chrome/9.0.600.0 Safari/534.14',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.600.0 Safari/534.14',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.599.0 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-CA) AppleWebKit/534.13 (KHTML like Gecko) Chrome/9.0.597.98 Safari/534.13',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.84 Safari/534.13',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.44 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.19 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.15 Safari/534.13',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.15 Safari/534.13',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.107 Safari/534.13 v1416758524.9051',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.107 Safari/534.13 v1416748405.3871',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.107 Safari/534.13 v1416670950.695',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.107 Safari/534.13 v1416664997.4379',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.107 Safari/534.13 v1333515017.9196',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.596.0 Safari/534.13',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Ubuntu/10.04 Chromium/9.0.595.0 Chrome/9.0.595.0 Safari/534.13',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Ubuntu/9.10 Chromium/9.0.592.0 Chrome/9.0.592.0 Safari/534.13',
'Mozilla/5.0 (X11; U; Windows NT 6; en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.587.0 Safari/534.12',
'Mozilla/5.0 (Windows U Windows NT 5.1 en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.583.0 Safari/534.12',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.579.0 Safari/534.12',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.576.0 Safari/534.12',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML, like Gecko) Ubuntu/10.10 Chrome/8.1.0.0 Safari/540.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.558.0 Safari/534.10',
'Mozilla/5.0 (X11; U; CrOS i686 0.9.130; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.344 Safari/534.10',
'Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.343 Safari/534.10',
'Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.341 Safari/534.10',
'Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.339 Safari/534.10',
'Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.339',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Ubuntu/10.10 Chromium/8.0.552.237 Chrome/8.0.552.237 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/533.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.215 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.215 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.215 Safari/534.10',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.210 Safari/534.10',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.200 Safari/534.10',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.551.0 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.548.0 Safari/534.10',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.544.0 Safari/534.10',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.15) Gecko/20101027 Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.9 (KHTML, like Gecko) Chrome/7.0.531.0 Safari/534.9',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.8 (KHTML, like Gecko) Chrome/7.0.521.0 Safari/534.8',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.517.24 Safari/534.7',
'Mozilla/5.0 (X11; U; Linux x86_64; fr-FR) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.514.0 Safari/534.7',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.514.0 Safari/534.7',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.514.0 Safari/534.7',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.6 (KHTML, like Gecko) Chrome/7.0.500.0 Safari/534.6',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.6 (KHTML, like Gecko) Chrome/7.0.498.0 Safari/534.6',
'Mozilla/5.0 (ipad Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.6 (KHTML, like Gecko) Chrome/7.0.498.0 Safari/534.6',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/7.0.0 Safari/700.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.4 (KHTML, like Gecko) Chrome/6.0.481.0 Safari/534.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.63 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.53 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.33 Safari/534.3',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.470.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.464.0 Safari/534.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.464.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.463.0 Safari/534.3',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.462.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.462.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.461.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.461.0 Safari/534.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.461.0 Safari/534.3',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.460.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.460.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.460.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.459.0 Safari/534.3',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.0 Safari/534.3',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.457.0 Safari/534.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.456.0 Safari/534.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.454.0 Safari/534.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.454.0 Safari/534.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.453.1 Safari/534.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.453.1 Safari/534.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.453.1 Safari/534.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.451.0 Safari/534.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.1 SUSE/6.0.428.0 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.427.0 Safari/534.1',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.422.0 Safari/534.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.417.0 Safari/534.1',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.416.0 Safari/534.1',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.414.0 Safari/534.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.9 (KHTML, like Gecko) Chrome/6.0.400.0 Safari/533.9',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.8 (KHTML, like Gecko) Chrome/6.0.397.0 Safari/533.8',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/6.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.999 Safari/533.4',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.86 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.86 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.86 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.70 Safari/533.4',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.127 Safari/533.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.126 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; fr-FR) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.126 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.125 Safari/533.4',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.370.0 Safari/533.4',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.368.0 Safari/533.4',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.366.2 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.366.0 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.366.0 Safari/533.4',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.363.0 Safari/533.3',
'Mozilla/5.0 (X11; U; OpenBSD i386; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.359.0 Safari/533.3',
'Mozilla/5.0 (X11; U; x86_64 Linux; en_GB, en_US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.358.0 Safari/533.3',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.358.0 Safari/533.3',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.358.0 Safari/533.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.357.0 Safari/533.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.356.0 Safari/533.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.355.0 Safari/533.3',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.354.0 Safari/533.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.354.0 Safari/533.3',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.353.0 Safari/533.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.353.0 Safari/533.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.343.0 Safari/533.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.343.0 Safari/533.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.7 Safari/533.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.7 Safari/533.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.5 Safari/533.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.3 Safari/533.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.3 Safari/533.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.2 Safari/533.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.1 Safari/533.2',
'Mozilla/5.0 (X11; U; Linux i586; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.1 Safari/533.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.1 Safari/533.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.1 (KHTML, like Gecko) Chrome/5.0.335.0 Safari/533.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN) AppleWebKit/533.16 (KHTML, like Gecko) Chrome/5.0.335.0 Safari/533.16',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.310.0 Safari/532.9',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.309.0 Safari/532.9',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.308.0 Safari/532.9',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.307.11 Safari/532.9',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.307.1 Safari/532.9',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.1.249.1025 Safari/532.5',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.8 (KHTML, like Gecko) Chrome/4.0.302.2 Safari/532.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.8 (KHTML, like Gecko) Chrome/4.0.288.1 Safari/532.8',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.8 (KHTML, like Gecko) Chrome/4.0.277.0 Safari/532.8',
'Mozilla/5.0 (X11; U; Slackware Linux x86_64; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.249.30 Safari/532.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; it-IT) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.249.25 Safari/532.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.249.0 Safari/532.5',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_8; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.249.0 Safari/532.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.246.0 Safari/532.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.4 (KHTML, like Gecko) Chrome/4.0.241.0 Safari/532.4',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.4 (KHTML, like Gecko) Chrome/4.0.237.0 Safari/532.4 Debian',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.3 (KHTML, like Gecko) Chrome/4.0.227.0 Safari/532.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.3 (KHTML, like Gecko) Chrome/4.0.224.2 Safari/532.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.3 (KHTML, like Gecko) Chrome/4.0.223.5 Safari/532.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.4 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.3 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; de-DE) Chrome/4.0.223.3 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.1 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.1 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.1 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.0 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.8 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.7 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.6 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.6 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.6 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.3 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.3 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.3 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.2 Safari/532.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.2 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.12 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.12 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.12 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.1 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.0 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.7 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.6 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.6 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.6 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.3 Safari/532.2',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.0 Safari/532.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.220.1 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.6 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.5 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.5 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.4 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.3 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.3 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.3 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.0 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.212.1 Safari/532.1',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.212.1 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.1',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.7 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.7 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.4 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.4 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.4 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.210.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.210.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (X11; U; FreeBSD i386; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.205.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.4 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0 (x86_64); de-DE) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; de-DE) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/525.13.',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.201.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.1 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.1 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.2 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.2 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.0 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.0 Safari/532.0',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.4 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.33 Safari/532.0',
'Mozilla/4.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.33 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.3 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.3 Safari/532.0',
'Mozilla/6.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML,like Gecko) Chrome/3.0.195.27',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.24 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.24 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.20 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.20 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.17 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.17 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.10 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.10 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.10 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0',
'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/531.4 (KHTML, like Gecko) Chrome/3.0.194.0 Safari/531.4',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.4 (KHTML, like Gecko) Chrome/3.0.194.0 Safari/531.4',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.2 Safari/531.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.2 Safari/531.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.2 Safari/531.3',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.0 Safari/531.3',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.192 Safari/531.3',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/531.2 (KHTML, like Gecko) Chrome/3.0.191.3 Safari/531.2',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/3.0.191.0 Safari/531.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/3.0.191.0 Safari/531.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/2.0.182.0 Safari/532.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/2.0.182.0 Safari/531.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.0 (KHTML, like Gecko) Chrome/2.0.182.0 Safari/531.0',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.8 (KHTML, like Gecko) Chrome/2.0.178.0 Safari/530.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.8 (KHTML, like Gecko) Chrome/2.0.177.1 Safari/530.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.8 (KHTML, like Gecko) Chrome/2.0.177.0 Safari/530.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.177.0 Safari/530.7',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.176.0 Safari/530.7',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.176.0 Safari/530.7',
'Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.175.0 Safari/530.7',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.175.0 Safari/530.7',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.175.0 Safari/530.6',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.5',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.173.1 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.173.1 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.173.0 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.8 Safari/530.5',
'Mozilla/6.0 (Windows; U; Windows NT 6.0; en-US) Gecko/2009032609 Chrome/2.0.172.6 Safari/530.7',
'Mozilla/6.0 (Windows; U; Windows NT 6.0; en-US) Gecko/2009032609 (KHTML, like Gecko) Chrome/2.0.172.6 Safari/530.7',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.6 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.42 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.40 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.40 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.39 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.39 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.23 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.2 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.2 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; eu) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.5',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.171.0 Safari/530.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.170.0 Safari/530.1',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.169.0 Safari/530.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.168.0 Safari/530.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.164.0 Safari/530.1',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.0 (KHTML, like Gecko) Chrome/2.0.162.0 Safari/530.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.0 (KHTML, like Gecko) Chrome/2.0.160.0 Safari/530.0',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.2 Safari/528.10',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.2 Safari/528.10',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.2 Safari/528.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.11 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.11',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.9 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.9',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.11 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.11',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.10',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.1 Safari/528.8',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.1 Safari/528.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.1 Safari/528.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.0 Version/3.2.1 Safari/528.8',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.0 Safari/528.8',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/1.0.156.0 Safari/528.8',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.59 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.59 Safari/525.19',
'Mozilla/4.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.59 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.55 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.55 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.50 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.50 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.48 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.46 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.42 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.39 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.4.154.31 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.4.154.18 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.4 (KHTML, like Gecko) Chrome/0.3.155.0 Safari/528.4',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.3.155.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.3.154.9 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.3.154.6 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.153.1 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.153.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.153.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.152.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.152.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.151.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.151.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.151.0 Safari/525.19',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.6 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.6 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.30 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.30 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.29 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.29 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.29 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 6.0; de) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13(KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Linux; U; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13',
'Mozilla/5.0 (Macintosh; U; Mac OS X 10_6_1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/ Safari/530.5',
'Mozilla/5.0 (Macintosh; U; Mac OS X 10_5_7; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/ Safari/530.5',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/530.9 (KHTML, like Gecko) Chrome/ Safari/530.9',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/ Safari/530.6',
'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/ Safari/530.5'
] | 122.209716 | 251 | 0.686878 | 20,667 | 103,145 | 3.408332 | 0.022113 | 0.025327 | 0.167149 | 0.226008 | 0.974603 | 0.972572 | 0.963714 | 0.954869 | 0.947885 | 0.93312 | 0 | 0.207677 | 0.134674 | 103,145 | 844 | 252 | 122.209716 | 0.581531 | 0 | 0 | 0.004739 | 0 | 0.99763 | 0.934578 | 0.004644 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
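The row above closes with a long list of browser user-agent strings; the usual purpose of such a constant is to rotate the User-Agent header on outgoing HTTP requests. Below is a minimal illustrative sketch of that use, not code from the file itself: the two-entry list is a stand-in for the full list above, and the fetch helper and URL handling are assumptions using only the Python standard library.

```python
import random
import urllib.request

# Hypothetical two-entry stand-in for the full user-agent list in the row above.
USER_AGENTS = [
    'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 '
    '(KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0',
    'Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 '
    '(KHTML, like Gecko) Chrome/3.0.198.1 Safari/532.0',
]

def fetch(url):
    """Fetch a URL, presenting a randomly chosen User-Agent header."""
    request = urllib.request.Request(
        url, headers={'User-Agent': random.choice(USER_AGENTS)})
    with urllib.request.urlopen(request) as response:
        return response.read()
```

Picking a fresh entry per request is the typical reason such lists are kept this large.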
7ccc53a013c72a90b4171725c76ffd660d6c2670 | 187 | py | Python | tensorflow_tts/losses/__init__.py | ishine/TensorFlowTTS-1 | dd04992f2b05d2845f862f86cfae006b91e3e870 | [
"Apache-2.0"
] | null | null | null | tensorflow_tts/losses/__init__.py | ishine/TensorFlowTTS-1 | dd04992f2b05d2845f862f86cfae006b91e3e870 | [
"Apache-2.0"
] | null | null | null | tensorflow_tts/losses/__init__.py | ishine/TensorFlowTTS-1 | dd04992f2b05d2845f862f86cfae006b91e3e870 | [
"Apache-2.0"
] | 4 | 2021-02-23T13:05:59.000Z | 2021-04-23T05:15:32.000Z | from tensorflow_tts.losses.stft import TFMultiResolutionSTFT
from tensorflow_tts.losses.spectrogram import TFMelSpectrogram
from tensorflow_tts.losses.ganloss import GANCritic, GanLoss
| 31.166667 | 62 | 0.882353 | 22 | 187 | 7.363636 | 0.5 | 0.259259 | 0.314815 | 0.425926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080214 | 187 | 5 | 63 | 37.4 | 0.94186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
7cccd3dfc279ee55bb4e26d1f36b00c9e064bec2 | 110 | py | Python | test/unit/jobs/test_rules_override/10_rule.py | rikeshi/galaxy | c536a877e4a9b3d12aa0d00fd4d5e705109a0d0a | [
"CC-BY-3.0"
] | 1,085 | 2015-02-18T16:14:38.000Z | 2022-03-30T23:52:07.000Z | test/unit/jobs/test_rules_override/10_rule.py | rikeshi/galaxy | c536a877e4a9b3d12aa0d00fd4d5e705109a0d0a | [
"CC-BY-3.0"
] | 11,253 | 2015-02-18T17:47:32.000Z | 2022-03-31T21:47:03.000Z | test/unit/jobs/test_rules_override/10_rule.py | rikeshi/galaxy | c536a877e4a9b3d12aa0d00fd4d5e705109a0d0a | [
"CC-BY-3.0"
] | 1,000 | 2015-02-18T16:18:10.000Z | 2022-03-29T08:22:56.000Z | def rule_module_override():
# Dummy rule for testing rule module overrides
return 'new_rules_package'
| 27.5 | 50 | 0.763636 | 15 | 110 | 5.333333 | 0.8 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 110 | 3 | 51 | 36.666667 | 0.888889 | 0.4 | 0 | 0 | 0 | 0 | 0.265625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
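The file above is a one-function dummy rule whose only job is to return a recognizable marker, so a test can verify that the overriding rule module was the one actually loaded. A hedged sketch of how a framework might resolve such a rule by dotted name at runtime follows; the resolver and the example path are assumptions for illustration, not Galaxy's actual loader.

```python
import importlib

def resolve_rule(dotted_path):
    """Resolve 'package.module.function' to the callable it names."""
    module_name, func_name = dotted_path.rsplit('.', 1)
    return getattr(importlib.import_module(module_name), func_name)

# Hypothetical usage: the overriding module wins, so its marker comes back.
# resolve_rule('my_rules.10_rule.rule_module_override')()  # -> 'new_rules_package'
```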
7cdddd8f45b5198792b13378e51bad8752418d4f | 11,785 | py | Python | nmt/utils/iterator_utils_test.py | godblessforhimself/nmt | 1d71bbe4d69932fbe92998abc6c23443c75ebbf9 | [
"Apache-2.0"
] | 6,575 | 2017-07-12T18:34:44.000Z | 2022-03-30T08:36:18.000Z | nmt/utils/iterator_utils_test.py | godblessforhimself/nmt | 1d71bbe4d69932fbe92998abc6c23443c75ebbf9 | [
"Apache-2.0"
] | 458 | 2017-07-13T01:57:19.000Z | 2022-03-23T23:19:03.000Z | nmt/utils/iterator_utils_test.py | godblessforhimself/nmt | 1d71bbe4d69932fbe92998abc6c23443c75ebbf9 | [
"Apache-2.0"
] | 2,251 | 2017-07-12T19:35:53.000Z | 2022-03-26T19:57:51.000Z | # Copyright 2017 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for iterator_utils.py"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
from tensorflow.python.ops import lookup_ops
from ..utils import iterator_utils
class IteratorUtilsTest(tf.test.TestCase):
def testGetIterator(self):
tf.set_random_seed(1)
tgt_vocab_table = src_vocab_table = lookup_ops.index_table_from_tensor(
tf.constant(["a", "b", "c", "eos", "sos"]))
src_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["f e a g", "c c a", "d", "c a"]))
tgt_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["c c", "a b", "", "b c"]))
hparams = tf.contrib.training.HParams(
random_seed=3,
num_buckets=5,
eos="eos",
sos="sos")
batch_size = 2
src_max_len = 3
iterator = iterator_utils.get_iterator(
src_dataset=src_dataset,
tgt_dataset=tgt_dataset,
src_vocab_table=src_vocab_table,
tgt_vocab_table=tgt_vocab_table,
batch_size=batch_size,
sos=hparams.sos,
eos=hparams.eos,
random_seed=hparams.random_seed,
num_buckets=hparams.num_buckets,
src_max_len=src_max_len,
reshuffle_each_iteration=False)
table_initializer = tf.tables_initializer()
source = iterator.source
target_input = iterator.target_input
target_output = iterator.target_output
src_seq_len = iterator.source_sequence_length
tgt_seq_len = iterator.target_sequence_length
self.assertEqual([None, None], source.shape.as_list())
self.assertEqual([None, None], target_input.shape.as_list())
self.assertEqual([None, None], target_output.shape.as_list())
self.assertEqual([None], src_seq_len.shape.as_list())
self.assertEqual([None], tgt_seq_len.shape.as_list())
with self.test_session() as sess:
sess.run(table_initializer)
sess.run(iterator.initializer)
(source_v, src_len_v, target_input_v, target_output_v, tgt_len_v) = (
sess.run((source, src_seq_len, target_input, target_output,
tgt_seq_len)))
self.assertAllEqual(
[[2, 0, 3], # c a eos -- eos is padding
[-1, -1, 0]], # "f" == unknown, "e" == unknown, a
source_v)
self.assertAllEqual([2, 3], src_len_v)
self.assertAllEqual(
[[4, 1, 2], # sos b c
[4, 2, 2]], # sos c c
target_input_v)
self.assertAllEqual(
[[1, 2, 3], # b c eos
[2, 2, 3]], # c c eos
target_output_v)
self.assertAllEqual([3, 3], tgt_len_v)
(source_v, src_len_v, target_input_v, target_output_v, tgt_len_v) = (
sess.run((source, src_seq_len, target_input, target_output,
tgt_seq_len)))
self.assertAllEqual(
[[2, 2, 0]], # c c a
source_v)
self.assertAllEqual([3], src_len_v)
self.assertAllEqual(
[[4, 0, 1]], # sos a b
target_input_v)
self.assertAllEqual(
[[0, 1, 3]], # a b eos
target_output_v)
self.assertAllEqual([3], tgt_len_v)
with self.assertRaisesOpError("End of sequence"):
sess.run(source)
def testGetIteratorWithShard(self):
tf.set_random_seed(1)
tgt_vocab_table = src_vocab_table = lookup_ops.index_table_from_tensor(
tf.constant(["a", "b", "c", "eos", "sos"]))
src_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["c c a", "f e a g", "d", "c a"]))
tgt_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["a b", "c c", "", "b c"]))
hparams = tf.contrib.training.HParams(
random_seed=3,
num_buckets=5,
eos="eos",
sos="sos")
batch_size = 2
src_max_len = 3
iterator = iterator_utils.get_iterator(
src_dataset=src_dataset,
tgt_dataset=tgt_dataset,
src_vocab_table=src_vocab_table,
tgt_vocab_table=tgt_vocab_table,
batch_size=batch_size,
sos=hparams.sos,
eos=hparams.eos,
random_seed=hparams.random_seed,
num_buckets=hparams.num_buckets,
src_max_len=src_max_len,
num_shards=2,
shard_index=1,
reshuffle_each_iteration=False)
table_initializer = tf.tables_initializer()
source = iterator.source
target_input = iterator.target_input
target_output = iterator.target_output
src_seq_len = iterator.source_sequence_length
tgt_seq_len = iterator.target_sequence_length
self.assertEqual([None, None], source.shape.as_list())
self.assertEqual([None, None], target_input.shape.as_list())
self.assertEqual([None, None], target_output.shape.as_list())
self.assertEqual([None], src_seq_len.shape.as_list())
self.assertEqual([None], tgt_seq_len.shape.as_list())
with self.test_session() as sess:
sess.run(table_initializer)
sess.run(iterator.initializer)
(source_v, src_len_v, target_input_v, target_output_v, tgt_len_v) = (
sess.run((source, src_seq_len, target_input, target_output,
tgt_seq_len)))
self.assertAllEqual(
[[2, 0, 3], # c a eos -- eos is padding
[-1, -1, 0]], # "f" == unknown, "e" == unknown, a
source_v)
self.assertAllEqual([2, 3], src_len_v)
self.assertAllEqual(
[[4, 1, 2], # sos b c
[4, 2, 2]], # sos c c
target_input_v)
self.assertAllEqual(
[[1, 2, 3], # b c eos
[2, 2, 3]], # c c eos
target_output_v)
self.assertAllEqual([3, 3], tgt_len_v)
with self.assertRaisesOpError("End of sequence"):
sess.run(source)
def testGetIteratorWithSkipCount(self):
tf.set_random_seed(1)
tgt_vocab_table = src_vocab_table = lookup_ops.index_table_from_tensor(
tf.constant(["a", "b", "c", "eos", "sos"]))
src_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["c a", "c c a", "d", "f e a g"]))
tgt_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["b c", "a b", "", "c c"]))
hparams = tf.contrib.training.HParams(
random_seed=3,
num_buckets=5,
eos="eos",
sos="sos")
batch_size = 2
src_max_len = 3
skip_count = tf.placeholder(shape=(), dtype=tf.int64)
iterator = iterator_utils.get_iterator(
src_dataset=src_dataset,
tgt_dataset=tgt_dataset,
src_vocab_table=src_vocab_table,
tgt_vocab_table=tgt_vocab_table,
batch_size=batch_size,
sos=hparams.sos,
eos=hparams.eos,
random_seed=hparams.random_seed,
num_buckets=hparams.num_buckets,
src_max_len=src_max_len,
skip_count=skip_count,
reshuffle_each_iteration=False)
table_initializer = tf.tables_initializer()
source = iterator.source
target_input = iterator.target_input
target_output = iterator.target_output
src_seq_len = iterator.source_sequence_length
tgt_seq_len = iterator.target_sequence_length
self.assertEqual([None, None], source.shape.as_list())
self.assertEqual([None, None], target_input.shape.as_list())
self.assertEqual([None, None], target_output.shape.as_list())
self.assertEqual([None], src_seq_len.shape.as_list())
self.assertEqual([None], tgt_seq_len.shape.as_list())
with self.test_session() as sess:
sess.run(table_initializer)
sess.run(iterator.initializer, feed_dict={skip_count: 3})
(source_v, src_len_v, target_input_v, target_output_v, tgt_len_v) = (
sess.run((source, src_seq_len, target_input, target_output,
tgt_seq_len)))
self.assertAllEqual(
[[-1, -1, 0]], # "f" == unknown, "e" == unknown, a
source_v)
self.assertAllEqual([3], src_len_v)
self.assertAllEqual(
[[4, 2, 2]], # sos c c
target_input_v)
self.assertAllEqual(
[[2, 2, 3]], # c c eos
target_output_v)
self.assertAllEqual([3], tgt_len_v)
with self.assertRaisesOpError("End of sequence"):
sess.run(source)
# Re-init iterator with skip_count=0.
sess.run(iterator.initializer, feed_dict={skip_count: 0})
(source_v, src_len_v, target_input_v, target_output_v, tgt_len_v) = (
sess.run((source, src_seq_len, target_input, target_output,
tgt_seq_len)))
self.assertAllEqual(
[[-1, -1, 0], # "f" == unknown, "e" == unknown, a
[2, 0, 3]], # c a eos -- eos is padding
source_v)
self.assertAllEqual([3, 2], src_len_v)
self.assertAllEqual(
[[4, 2, 2], # sos c c
[4, 1, 2]], # sos b c
target_input_v)
self.assertAllEqual(
[[2, 2, 3], # c c eos
[1, 2, 3]], # b c eos
target_output_v)
self.assertAllEqual([3, 3], tgt_len_v)
(source_v, src_len_v, target_input_v, target_output_v, tgt_len_v) = (
sess.run((source, src_seq_len, target_input, target_output,
tgt_seq_len)))
self.assertAllEqual(
[[2, 2, 0]], # c c a
source_v)
self.assertAllEqual([3], src_len_v)
self.assertAllEqual(
[[4, 0, 1]], # sos a b
target_input_v)
self.assertAllEqual(
[[0, 1, 3]], # a b eos
target_output_v)
self.assertAllEqual([3], tgt_len_v)
with self.assertRaisesOpError("End of sequence"):
sess.run(source)
def testGetInferIterator(self):
src_vocab_table = lookup_ops.index_table_from_tensor(
tf.constant(["a", "b", "c", "eos", "sos"]))
src_dataset = tf.data.Dataset.from_tensor_slices(
tf.constant(["c c a", "c a", "d", "f e a g"]))
hparams = tf.contrib.training.HParams(
random_seed=3,
eos="eos",
sos="sos")
batch_size = 2
src_max_len = 3
iterator = iterator_utils.get_infer_iterator(
src_dataset=src_dataset,
src_vocab_table=src_vocab_table,
batch_size=batch_size,
eos=hparams.eos,
src_max_len=src_max_len)
table_initializer = tf.tables_initializer()
source = iterator.source
seq_len = iterator.source_sequence_length
self.assertEqual([None, None], source.shape.as_list())
self.assertEqual([None], seq_len.shape.as_list())
with self.test_session() as sess:
sess.run(table_initializer)
sess.run(iterator.initializer)
(source_v, seq_len_v) = sess.run((source, seq_len))
self.assertAllEqual(
[[2, 2, 0], # c c a
[2, 0, 3]], # c a eos
source_v)
self.assertAllEqual([3, 2], seq_len_v)
(source_v, seq_len_v) = sess.run((source, seq_len))
self.assertAllEqual(
[[-1, 3, 3], # "d" == unknown, eos eos
[-1, -1, 0]], # "f" == unknown, "e" == unknown, a
source_v)
self.assertAllEqual([1, 3], seq_len_v)
with self.assertRaisesOpError("End of sequence"):
sess.run((source, seq_len))
if __name__ == "__main__":
tf.test.main()
| 36.599379 | 80 | 0.620874 | 1,618 | 11,785 | 4.241656 | 0.103832 | 0.089174 | 0.07198 | 0.028413 | 0.858517 | 0.854 | 0.836369 | 0.828355 | 0.805479 | 0.802564 | 0 | 0.016744 | 0.249979 | 11,785 | 321 | 81 | 36.713396 | 0.759701 | 0.098006 | 0 | 0.837545 | 0 | 0 | 0.022119 | 0 | 0 | 0 | 0 | 0 | 0.202166 | 1 | 0.01444 | false | 0 | 0.021661 | 0 | 0.039711 | 0.00361 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
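The test file above drives iterator_utils.get_iterator end to end; condensed from those tests, the basic call pattern looks like the sketch below. It assumes TensorFlow 1.x with tf.contrib available (as the tests do) and reuses the same toy fixtures; it is a distillation of the test code above, not an additional API.

```python
import tensorflow as tf
from tensorflow.python.ops import lookup_ops
from nmt.utils import iterator_utils

# Toy vocabulary and parallel corpus, mirroring the fixtures in the tests above.
vocab_table = lookup_ops.index_table_from_tensor(
    tf.constant(["a", "b", "c", "eos", "sos"]))
src_dataset = tf.data.Dataset.from_tensor_slices(tf.constant(["c c a", "c a"]))
tgt_dataset = tf.data.Dataset.from_tensor_slices(tf.constant(["a b", "b c"]))

iterator = iterator_utils.get_iterator(
    src_dataset=src_dataset,
    tgt_dataset=tgt_dataset,
    src_vocab_table=vocab_table,
    tgt_vocab_table=vocab_table,
    batch_size=2,
    sos="sos",
    eos="eos",
    random_seed=3,
    num_buckets=5,
    src_max_len=3)

with tf.Session() as sess:
    sess.run(tf.tables_initializer())
    sess.run(iterator.initializer)
    # Each fetch yields padded id matrices plus the true sequence lengths.
    src, src_len = sess.run((iterator.source, iterator.source_sequence_length))
```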
7ce785317b370700a018bf6fe4c64135e3bcf45e | 82,366 | py | Python | python/candig/schemas/candig/pipeline_metadata_pb2.py | ljdursi/ga4gh-schemas | 8255e66d247e65688d0b5320173340f3eb52ce7c | [
"Apache-2.0"
] | 1 | 2019-12-06T14:06:37.000Z | 2019-12-06T14:06:37.000Z | python/candig/schemas/candig/pipeline_metadata_pb2.py | ljdursi/ga4gh-schemas | 8255e66d247e65688d0b5320173340f3eb52ce7c | [
"Apache-2.0"
] | 9 | 2019-03-25T22:35:49.000Z | 2019-12-16T22:02:14.000Z | python/candig/schemas/candig/pipeline_metadata_pb2.py | ljdursi/ga4gh-schemas | 8255e66d247e65688d0b5320173340f3eb52ce7c | [
"Apache-2.0"
] | 1 | 2017-12-04T17:29:14.000Z | 2017-12-04T17:29:14.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: candig/schemas/candig/pipeline_metadata.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from candig.schemas.candig import common_pb2 as candig_dot_schemas_dot_candig_dot_common__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='candig/schemas/candig/pipeline_metadata.proto',
package='candig.schemas.candig',
syntax='proto3',
serialized_pb=_b('\n-candig/schemas/candig/pipeline_metadata.proto\x12\x15\x63\x61ndig.schemas.candig\x1a\"candig/schemas/candig/common.proto\"\xc4\x03\n\nExtraction\x12\n\n\x02id\x18\x01 \x01(\t\x12\x12\n\ndataset_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07\x63reated\x18\x05 \x01(\t\x12\x0f\n\x07updated\x18\x06 \x01(\t\x12\x35\n\nattributes\x18\x07 \x01(\x0b\x32!.candig.schemas.candig.Attributes\x12\x10\n\x08sampleId\x18\x08 \x01(\t\x12\x14\n\x0csampleIdTier\x18\t \x01(\x05\x12\x10\n\x08rnaBlood\x18\n \x01(\t\x12\x14\n\x0crnaBloodTier\x18\x0b \x01(\x05\x12\x10\n\x08\x64naBlood\x18\x0c \x01(\t\x12\x14\n\x0c\x64naBloodTier\x18\r \x01(\x05\x12\x11\n\trnaTissue\x18\x0e \x01(\t\x12\x15\n\rrnaTissueTier\x18\x0f \x01(\x05\x12\x11\n\tdnaTissue\x18\x10 \x01(\t\x12\x15\n\rdnaTissueTier\x18\x11 \x01(\x05\x12\x14\n\x0c\x65xtractionId\x18\x12 \x01(\t\x12\x18\n\x10\x65xtractionIdTier\x18\x13 \x01(\x05\x12\x0c\n\x04site\x18\x14 \x01(\t\x12\x10\n\x08siteTier\x18\x15 \x01(\x05\"\xaa\x05\n\nSequencing\x12\n\n\x02id\x18\x01 \x01(\t\x12\x12\n\ndataset_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07\x63reated\x18\x05 \x01(\t\x12\x0f\n\x07updated\x18\x06 \x01(\t\x12\x35\n\nattributes\x18\x07 \x01(\x0b\x32!.candig.schemas.candig.Attributes\x12\x10\n\x08sampleId\x18\x08 \x01(\t\x12\x14\n\x0csampleIdTier\x18\t \x01(\x05\x12\x15\n\rdnaLibraryKit\x18\n \x01(\t\x12\x19\n\x11\x64naLibraryKitTier\x18\x0b \x01(\x05\x12\x16\n\x0e\x64naSeqPlatform\x18\x0c \x01(\t\x12\x1a\n\x12\x64naSeqPlatformTier\x18\r \x01(\x05\x12\x15\n\rdnaReadLength\x18\x0e \x01(\t\x12\x19\n\x11\x64naReadLengthTier\x18\x0f \x01(\x05\x12\x15\n\rrnaLibraryKit\x18\x10 \x01(\t\x12\x19\n\x11rnaLibraryKitTier\x18\x11 \x01(\x05\x12\x16\n\x0ernaSeqPlatform\x18\x12 \x01(\t\x12\x1a\n\x12rnaSeqPlatformTier\x18\x13 \x01(\x05\x12\x15\n\rrnaReadLength\x18\x14 \x01(\t\x12\x19\n\x11rnaReadLengthTier\x18\x15 \x01(\x05\x12\x11\n\tpcrCycles\x18\x16 \x01(\t\x12\x15\n\rpcrCyclesTier\x18\x17 \x01(\x05\x12\x14\n\x0csequencingId\x18\x18 \x01(\t\x12\x18\n\x10sequencingIdTier\x18\x19 \x01(\x05\x12\x14\n\x0c\x65xtractionId\x18\x1a \x01(\t\x12\x18\n\x10\x65xtractionIdTier\x18\x1b \x01(\x05\x12\x0c\n\x04site\x18\x1c \x01(\t\x12\x10\n\x08siteTier\x18\x1d \x01(\x05\"\xe1\x07\n\tAlignment\x12\n\n\x02id\x18\x01 \x01(\t\x12\x12\n\ndataset_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07\x63reated\x18\x05 \x01(\t\x12\x0f\n\x07updated\x18\x06 \x01(\t\x12\x35\n\nattributes\x18\x07 \x01(\x0b\x32!.candig.schemas.candig.Attributes\x12\x10\n\x08sampleId\x18\x08 \x01(\t\x12\x14\n\x0csampleIdTier\x18\t \x01(\x05\x12\x17\n\x0finHousePipeline\x18\n \x01(\t\x12\x1b\n\x13inHousePipelineTier\x18\x0b \x01(\x05\x12\x15\n\ralignmentTool\x18\x0c \x01(\t\x12\x19\n\x11\x61lignmentToolTier\x18\r \x01(\x05\x12\x11\n\tmergeTool\x18\x0e \x01(\t\x12\x15\n\rmergeToolTier\x18\x0f \x01(\x05\x12\x16\n\x0emarkDuplicates\x18\x10 \x01(\t\x12\x1a\n\x12markDuplicatesTier\x18\x11 \x01(\x05\x12\x17\n\x0frealignerTarget\x18\x12 \x01(\t\x12\x1b\n\x13realignerTargetTier\x18\x13 \x01(\x05\x12\x16\n\x0eindelRealigner\x18\x14 \x01(\t\x12\x1a\n\x12indelRealignerTier\x18\x15 \x01(\x05\x12\x18\n\x10\x62\x61seRecalibrator\x18\x16 \x01(\t\x12\x1c\n\x14\x62\x61seRecalibratorTier\x18\x17 \x01(\x05\x12\x12\n\nprintReads\x18\x18 \x01(\t\x12\x16\n\x0eprintReadsTier\x18\x19 \x01(\x05\x12\x10\n\x08idxStats\x18\x1a 
\x01(\t\x12\x14\n\x0cidxStatsTier\x18\x1b \x01(\x05\x12\x10\n\x08\x66lagStat\x18\x1c \x01(\t\x12\x14\n\x0c\x66lagStatTier\x18\x1d \x01(\x05\x12\x10\n\x08\x63overage\x18\x1e \x01(\t\x12\x14\n\x0c\x63overageTier\x18\x1f \x01(\x05\x12\x19\n\x11insertSizeMetrics\x18 \x01(\t\x12\x1d\n\x15insertSizeMetricsTier\x18! \x01(\x05\x12\x0e\n\x06\x66\x61stqc\x18\" \x01(\t\x12\x12\n\nfastqcTier\x18# \x01(\x05\x12\x11\n\treference\x18$ \x01(\t\x12\x15\n\rreferenceTier\x18% \x01(\x05\x12\x13\n\x0b\x61lignmentId\x18& \x01(\t\x12\x17\n\x0f\x61lignmentIdTier\x18\' \x01(\x05\x12\x14\n\x0csequencingId\x18( \x01(\t\x12\x18\n\x10sequencingIdTier\x18) \x01(\x05\x12\x0c\n\x04site\x18* \x01(\t\x12\x10\n\x08siteTier\x18+ \x01(\x05\"\xa8\x06\n\x0eVariantCalling\x12\n\n\x02id\x18\x01 \x01(\t\x12\x12\n\ndataset_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07\x63reated\x18\x05 \x01(\t\x12\x0f\n\x07updated\x18\x06 \x01(\t\x12\x35\n\nattributes\x18\x07 \x01(\x0b\x32!.candig.schemas.candig.Attributes\x12\x10\n\x08sampleId\x18\x08 \x01(\t\x12\x14\n\x0csampleIdTier\x18\t \x01(\x05\x12\x17\n\x0finHousePipeline\x18\n \x01(\t\x12\x1b\n\x13inHousePipelineTier\x18\x0b \x01(\x05\x12\x15\n\rvariantCaller\x18\x0c \x01(\t\x12\x19\n\x11variantCallerTier\x18\r \x01(\x05\x12\x10\n\x08tabulate\x18\x0e \x01(\t\x12\x14\n\x0ctabulateTier\x18\x0f \x01(\x05\x12\x12\n\nannotation\x18\x10 \x01(\t\x12\x16\n\x0e\x61nnotationTier\x18\x11 \x01(\x05\x12\x11\n\tmergeTool\x18\x12 \x01(\t\x12\x15\n\rmergeToolTier\x18\x13 \x01(\x05\x12\x10\n\x08rdaToTab\x18\x14 \x01(\t\x12\x14\n\x0crdaToTabTier\x18\x15 \x01(\x05\x12\r\n\x05\x64\x65lly\x18\x16 \x01(\t\x12\x11\n\tdellyTier\x18\x17 \x01(\x05\x12\x12\n\npostFilter\x18\x18 \x01(\t\x12\x16\n\x0epostFilterTier\x18\x19 \x01(\x05\x12\x12\n\nclipFilter\x18\x1a \x01(\t\x12\x16\n\x0e\x63lipFilterTier\x18\x1b \x01(\x05\x12\x0e\n\x06\x63osmic\x18\x1c \x01(\t\x12\x12\n\ncosmicTier\x18\x1d \x01(\x05\x12\r\n\x05\x64\x62Snp\x18\x1e \x01(\t\x12\x11\n\tdbSnpTier\x18\x1f \x01(\x05\x12\x18\n\x10variantCallingId\x18 \x01(\t\x12\x1c\n\x14variantCallingIdTier\x18! 
\x01(\x05\x12\x13\n\x0b\x61lignmentId\x18\" \x01(\t\x12\x17\n\x0f\x61lignmentIdTier\x18# \x01(\x05\x12\x0c\n\x04site\x18$ \x01(\t\x12\x10\n\x08siteTier\x18% \x01(\x05\"\xb3\x05\n\x0f\x46usionDetection\x12\n\n\x02id\x18\x01 \x01(\t\x12\x12\n\ndataset_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07\x63reated\x18\x05 \x01(\t\x12\x0f\n\x07updated\x18\x06 \x01(\t\x12\x35\n\nattributes\x18\x07 \x01(\x0b\x32!.candig.schemas.candig.Attributes\x12\x10\n\x08sampleId\x18\x08 \x01(\t\x12\x14\n\x0csampleIdTier\x18\t \x01(\x05\x12\x17\n\x0finHousePipeline\x18\n \x01(\t\x12\x1b\n\x13inHousePipelineTier\x18\x0b \x01(\x05\x12\x13\n\x0bsvDetection\x18\x0c \x01(\t\x12\x17\n\x0fsvDetectionTier\x18\r \x01(\x05\x12\x17\n\x0f\x66usionDetection\x18\x0e \x01(\t\x12\x1b\n\x13\x66usionDetectionTier\x18\x0f \x01(\x05\x12\x13\n\x0brealignment\x18\x10 \x01(\t\x12\x17\n\x0frealignmentTier\x18\x11 \x01(\x05\x12\x12\n\nannotation\x18\x12 \x01(\t\x12\x16\n\x0e\x61nnotationTier\x18\x13 \x01(\x05\x12\x17\n\x0fgenomeReference\x18\x14 \x01(\t\x12\x1b\n\x13genomeReferenceTier\x18\x15 \x01(\x05\x12\x12\n\ngeneModels\x18\x16 \x01(\t\x12\x16\n\x0egeneModelsTier\x18\x17 \x01(\x05\x12\x19\n\x11\x66usionDetectionId\x18\x18 \x01(\t\x12\x1d\n\x15\x66usionDetectionIdTier\x18\x19 \x01(\x05\x12\x13\n\x0b\x61lignmentId\x18\x1a \x01(\t\x12\x17\n\x0f\x61lignmentIdTier\x18\x1b \x01(\x05\x12\x0c\n\x04site\x18\x1c \x01(\t\x12\x10\n\x08siteTier\x18\x1d \x01(\x05\"\xde\x04\n\x12\x45xpressionAnalysis\x12\n\n\x02id\x18\x01 \x01(\t\x12\x12\n\ndataset_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07\x63reated\x18\x05 \x01(\t\x12\x0f\n\x07updated\x18\x06 \x01(\t\x12\x35\n\nattributes\x18\x07 \x01(\x0b\x32!.candig.schemas.candig.Attributes\x12\x10\n\x08sampleId\x18\x08 \x01(\t\x12\x14\n\x0csampleIdTier\x18\t \x01(\x05\x12\x12\n\nreadLength\x18\n \x01(\t\x12\x16\n\x0ereadLengthTier\x18\x0b \x01(\x05\x12\x11\n\treference\x18\x0c \x01(\t\x12\x15\n\rreferenceTier\x18\r \x01(\x05\x12\x15\n\ralignmentTool\x18\x0e \x01(\t\x12\x19\n\x11\x61lignmentToolTier\x18\x0f \x01(\x05\x12\x13\n\x0b\x62\x61mHandling\x18\x10 \x01(\t\x12\x17\n\x0f\x62\x61mHandlingTier\x18\x11 \x01(\x05\x12\x1c\n\x14\x65xpressionEstimation\x18\x12 \x01(\t\x12 \n\x18\x65xpressionEstimationTier\x18\x13 \x01(\x05\x12\x1c\n\x14\x65xpressionAnalysisId\x18\x14 \x01(\t\x12 \n\x18\x65xpressionAnalysisIdTier\x18\x15 \x01(\x05\x12\x14\n\x0csequencingId\x18\x16 \x01(\t\x12\x18\n\x10sequencingIdTier\x18\x17 \x01(\x05\x12\x0c\n\x04site\x18\x18 \x01(\t\x12\x10\n\x08siteTier\x18\x19 \x01(\x05\x62\x06proto3')
,
dependencies=[candig_dot_schemas_dot_candig_dot_common__pb2.DESCRIPTOR,])
_EXTRACTION = _descriptor.Descriptor(
name='Extraction',
full_name='candig.schemas.candig.Extraction',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='candig.schemas.candig.Extraction.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dataset_id', full_name='candig.schemas.candig.Extraction.dataset_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name', full_name='candig.schemas.candig.Extraction.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description', full_name='candig.schemas.candig.Extraction.description', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='created', full_name='candig.schemas.candig.Extraction.created', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='updated', full_name='candig.schemas.candig.Extraction.updated', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='attributes', full_name='candig.schemas.candig.Extraction.attributes', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleId', full_name='candig.schemas.candig.Extraction.sampleId', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleIdTier', full_name='candig.schemas.candig.Extraction.sampleIdTier', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaBlood', full_name='candig.schemas.candig.Extraction.rnaBlood', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaBloodTier', full_name='candig.schemas.candig.Extraction.rnaBloodTier', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaBlood', full_name='candig.schemas.candig.Extraction.dnaBlood', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaBloodTier', full_name='candig.schemas.candig.Extraction.dnaBloodTier', index=12,
number=13, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaTissue', full_name='candig.schemas.candig.Extraction.rnaTissue', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaTissueTier', full_name='candig.schemas.candig.Extraction.rnaTissueTier', index=14,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaTissue', full_name='candig.schemas.candig.Extraction.dnaTissue', index=15,
number=16, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaTissueTier', full_name='candig.schemas.candig.Extraction.dnaTissueTier', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='extractionId', full_name='candig.schemas.candig.Extraction.extractionId', index=17,
number=18, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='extractionIdTier', full_name='candig.schemas.candig.Extraction.extractionIdTier', index=18,
number=19, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='site', full_name='candig.schemas.candig.Extraction.site', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='siteTier', full_name='candig.schemas.candig.Extraction.siteTier', index=20,
number=21, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=109,
serialized_end=561,
)
_SEQUENCING = _descriptor.Descriptor(
name='Sequencing',
full_name='candig.schemas.candig.Sequencing',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='candig.schemas.candig.Sequencing.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dataset_id', full_name='candig.schemas.candig.Sequencing.dataset_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name', full_name='candig.schemas.candig.Sequencing.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description', full_name='candig.schemas.candig.Sequencing.description', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='created', full_name='candig.schemas.candig.Sequencing.created', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='updated', full_name='candig.schemas.candig.Sequencing.updated', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='attributes', full_name='candig.schemas.candig.Sequencing.attributes', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleId', full_name='candig.schemas.candig.Sequencing.sampleId', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleIdTier', full_name='candig.schemas.candig.Sequencing.sampleIdTier', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaLibraryKit', full_name='candig.schemas.candig.Sequencing.dnaLibraryKit', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaLibraryKitTier', full_name='candig.schemas.candig.Sequencing.dnaLibraryKitTier', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaSeqPlatform', full_name='candig.schemas.candig.Sequencing.dnaSeqPlatform', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaSeqPlatformTier', full_name='candig.schemas.candig.Sequencing.dnaSeqPlatformTier', index=12,
number=13, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaReadLength', full_name='candig.schemas.candig.Sequencing.dnaReadLength', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dnaReadLengthTier', full_name='candig.schemas.candig.Sequencing.dnaReadLengthTier', index=14,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaLibraryKit', full_name='candig.schemas.candig.Sequencing.rnaLibraryKit', index=15,
number=16, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaLibraryKitTier', full_name='candig.schemas.candig.Sequencing.rnaLibraryKitTier', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaSeqPlatform', full_name='candig.schemas.candig.Sequencing.rnaSeqPlatform', index=17,
number=18, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaSeqPlatformTier', full_name='candig.schemas.candig.Sequencing.rnaSeqPlatformTier', index=18,
number=19, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaReadLength', full_name='candig.schemas.candig.Sequencing.rnaReadLength', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rnaReadLengthTier', full_name='candig.schemas.candig.Sequencing.rnaReadLengthTier', index=20,
number=21, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pcrCycles', full_name='candig.schemas.candig.Sequencing.pcrCycles', index=21,
number=22, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pcrCyclesTier', full_name='candig.schemas.candig.Sequencing.pcrCyclesTier', index=22,
number=23, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sequencingId', full_name='candig.schemas.candig.Sequencing.sequencingId', index=23,
number=24, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sequencingIdTier', full_name='candig.schemas.candig.Sequencing.sequencingIdTier', index=24,
number=25, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='extractionId', full_name='candig.schemas.candig.Sequencing.extractionId', index=25,
number=26, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='extractionIdTier', full_name='candig.schemas.candig.Sequencing.extractionIdTier', index=26,
number=27, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='site', full_name='candig.schemas.candig.Sequencing.site', index=27,
number=28, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='siteTier', full_name='candig.schemas.candig.Sequencing.siteTier', index=28,
number=29, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=564,
serialized_end=1246,
)
_ALIGNMENT = _descriptor.Descriptor(
name='Alignment',
full_name='candig.schemas.candig.Alignment',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='candig.schemas.candig.Alignment.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dataset_id', full_name='candig.schemas.candig.Alignment.dataset_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name', full_name='candig.schemas.candig.Alignment.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description', full_name='candig.schemas.candig.Alignment.description', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='created', full_name='candig.schemas.candig.Alignment.created', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='updated', full_name='candig.schemas.candig.Alignment.updated', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='attributes', full_name='candig.schemas.candig.Alignment.attributes', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleId', full_name='candig.schemas.candig.Alignment.sampleId', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleIdTier', full_name='candig.schemas.candig.Alignment.sampleIdTier', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inHousePipeline', full_name='candig.schemas.candig.Alignment.inHousePipeline', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inHousePipelineTier', full_name='candig.schemas.candig.Alignment.inHousePipelineTier', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentTool', full_name='candig.schemas.candig.Alignment.alignmentTool', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentToolTier', full_name='candig.schemas.candig.Alignment.alignmentToolTier', index=12,
number=13, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='mergeTool', full_name='candig.schemas.candig.Alignment.mergeTool', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='mergeToolTier', full_name='candig.schemas.candig.Alignment.mergeToolTier', index=14,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='markDuplicates', full_name='candig.schemas.candig.Alignment.markDuplicates', index=15,
number=16, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='markDuplicatesTier', full_name='candig.schemas.candig.Alignment.markDuplicatesTier', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='realignerTarget', full_name='candig.schemas.candig.Alignment.realignerTarget', index=17,
number=18, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='realignerTargetTier', full_name='candig.schemas.candig.Alignment.realignerTargetTier', index=18,
number=19, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='indelRealigner', full_name='candig.schemas.candig.Alignment.indelRealigner', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='indelRealignerTier', full_name='candig.schemas.candig.Alignment.indelRealignerTier', index=20,
number=21, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='baseRecalibrator', full_name='candig.schemas.candig.Alignment.baseRecalibrator', index=21,
number=22, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='baseRecalibratorTier', full_name='candig.schemas.candig.Alignment.baseRecalibratorTier', index=22,
number=23, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='printReads', full_name='candig.schemas.candig.Alignment.printReads', index=23,
number=24, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='printReadsTier', full_name='candig.schemas.candig.Alignment.printReadsTier', index=24,
number=25, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='idxStats', full_name='candig.schemas.candig.Alignment.idxStats', index=25,
number=26, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='idxStatsTier', full_name='candig.schemas.candig.Alignment.idxStatsTier', index=26,
number=27, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='flagStat', full_name='candig.schemas.candig.Alignment.flagStat', index=27,
number=28, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='flagStatTier', full_name='candig.schemas.candig.Alignment.flagStatTier', index=28,
number=29, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='coverage', full_name='candig.schemas.candig.Alignment.coverage', index=29,
number=30, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='coverageTier', full_name='candig.schemas.candig.Alignment.coverageTier', index=30,
number=31, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='insertSizeMetrics', full_name='candig.schemas.candig.Alignment.insertSizeMetrics', index=31,
number=32, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='insertSizeMetricsTier', full_name='candig.schemas.candig.Alignment.insertSizeMetricsTier', index=32,
number=33, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fastqc', full_name='candig.schemas.candig.Alignment.fastqc', index=33,
number=34, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fastqcTier', full_name='candig.schemas.candig.Alignment.fastqcTier', index=34,
number=35, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='reference', full_name='candig.schemas.candig.Alignment.reference', index=35,
number=36, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='referenceTier', full_name='candig.schemas.candig.Alignment.referenceTier', index=36,
number=37, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentId', full_name='candig.schemas.candig.Alignment.alignmentId', index=37,
number=38, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentIdTier', full_name='candig.schemas.candig.Alignment.alignmentIdTier', index=38,
number=39, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sequencingId', full_name='candig.schemas.candig.Alignment.sequencingId', index=39,
number=40, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sequencingIdTier', full_name='candig.schemas.candig.Alignment.sequencingIdTier', index=40,
number=41, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='site', full_name='candig.schemas.candig.Alignment.site', index=41,
number=42, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='siteTier', full_name='candig.schemas.candig.Alignment.siteTier', index=42,
number=43, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1249,
serialized_end=2242,
)
_VARIANTCALLING = _descriptor.Descriptor(
name='VariantCalling',
full_name='candig.schemas.candig.VariantCalling',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='candig.schemas.candig.VariantCalling.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dataset_id', full_name='candig.schemas.candig.VariantCalling.dataset_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name', full_name='candig.schemas.candig.VariantCalling.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description', full_name='candig.schemas.candig.VariantCalling.description', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='created', full_name='candig.schemas.candig.VariantCalling.created', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='updated', full_name='candig.schemas.candig.VariantCalling.updated', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='attributes', full_name='candig.schemas.candig.VariantCalling.attributes', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleId', full_name='candig.schemas.candig.VariantCalling.sampleId', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleIdTier', full_name='candig.schemas.candig.VariantCalling.sampleIdTier', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inHousePipeline', full_name='candig.schemas.candig.VariantCalling.inHousePipeline', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inHousePipelineTier', full_name='candig.schemas.candig.VariantCalling.inHousePipelineTier', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='variantCaller', full_name='candig.schemas.candig.VariantCalling.variantCaller', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='variantCallerTier', full_name='candig.schemas.candig.VariantCalling.variantCallerTier', index=12,
number=13, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='tabulate', full_name='candig.schemas.candig.VariantCalling.tabulate', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='tabulateTier', full_name='candig.schemas.candig.VariantCalling.tabulateTier', index=14,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='annotation', full_name='candig.schemas.candig.VariantCalling.annotation', index=15,
number=16, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='annotationTier', full_name='candig.schemas.candig.VariantCalling.annotationTier', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='mergeTool', full_name='candig.schemas.candig.VariantCalling.mergeTool', index=17,
number=18, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='mergeToolTier', full_name='candig.schemas.candig.VariantCalling.mergeToolTier', index=18,
number=19, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rdaToTab', full_name='candig.schemas.candig.VariantCalling.rdaToTab', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rdaToTabTier', full_name='candig.schemas.candig.VariantCalling.rdaToTabTier', index=20,
number=21, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='delly', full_name='candig.schemas.candig.VariantCalling.delly', index=21,
number=22, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dellyTier', full_name='candig.schemas.candig.VariantCalling.dellyTier', index=22,
number=23, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='postFilter', full_name='candig.schemas.candig.VariantCalling.postFilter', index=23,
number=24, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='postFilterTier', full_name='candig.schemas.candig.VariantCalling.postFilterTier', index=24,
number=25, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='clipFilter', full_name='candig.schemas.candig.VariantCalling.clipFilter', index=25,
number=26, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='clipFilterTier', full_name='candig.schemas.candig.VariantCalling.clipFilterTier', index=26,
number=27, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cosmic', full_name='candig.schemas.candig.VariantCalling.cosmic', index=27,
number=28, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cosmicTier', full_name='candig.schemas.candig.VariantCalling.cosmicTier', index=28,
number=29, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dbSnp', full_name='candig.schemas.candig.VariantCalling.dbSnp', index=29,
number=30, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dbSnpTier', full_name='candig.schemas.candig.VariantCalling.dbSnpTier', index=30,
number=31, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='variantCallingId', full_name='candig.schemas.candig.VariantCalling.variantCallingId', index=31,
number=32, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='variantCallingIdTier', full_name='candig.schemas.candig.VariantCalling.variantCallingIdTier', index=32,
number=33, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentId', full_name='candig.schemas.candig.VariantCalling.alignmentId', index=33,
number=34, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentIdTier', full_name='candig.schemas.candig.VariantCalling.alignmentIdTier', index=34,
number=35, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='site', full_name='candig.schemas.candig.VariantCalling.site', index=35,
number=36, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='siteTier', full_name='candig.schemas.candig.VariantCalling.siteTier', index=36,
number=37, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2245,
serialized_end=3053,
)
_FUSIONDETECTION = _descriptor.Descriptor(
name='FusionDetection',
full_name='candig.schemas.candig.FusionDetection',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='candig.schemas.candig.FusionDetection.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dataset_id', full_name='candig.schemas.candig.FusionDetection.dataset_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name', full_name='candig.schemas.candig.FusionDetection.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description', full_name='candig.schemas.candig.FusionDetection.description', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='created', full_name='candig.schemas.candig.FusionDetection.created', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='updated', full_name='candig.schemas.candig.FusionDetection.updated', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='attributes', full_name='candig.schemas.candig.FusionDetection.attributes', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleId', full_name='candig.schemas.candig.FusionDetection.sampleId', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleIdTier', full_name='candig.schemas.candig.FusionDetection.sampleIdTier', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inHousePipeline', full_name='candig.schemas.candig.FusionDetection.inHousePipeline', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inHousePipelineTier', full_name='candig.schemas.candig.FusionDetection.inHousePipelineTier', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='svDetection', full_name='candig.schemas.candig.FusionDetection.svDetection', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='svDetectionTier', full_name='candig.schemas.candig.FusionDetection.svDetectionTier', index=12,
number=13, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fusionDetection', full_name='candig.schemas.candig.FusionDetection.fusionDetection', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fusionDetectionTier', full_name='candig.schemas.candig.FusionDetection.fusionDetectionTier', index=14,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='realignment', full_name='candig.schemas.candig.FusionDetection.realignment', index=15,
number=16, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='realignmentTier', full_name='candig.schemas.candig.FusionDetection.realignmentTier', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='annotation', full_name='candig.schemas.candig.FusionDetection.annotation', index=17,
number=18, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='annotationTier', full_name='candig.schemas.candig.FusionDetection.annotationTier', index=18,
number=19, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='genomeReference', full_name='candig.schemas.candig.FusionDetection.genomeReference', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='genomeReferenceTier', full_name='candig.schemas.candig.FusionDetection.genomeReferenceTier', index=20,
number=21, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='geneModels', full_name='candig.schemas.candig.FusionDetection.geneModels', index=21,
number=22, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='geneModelsTier', full_name='candig.schemas.candig.FusionDetection.geneModelsTier', index=22,
number=23, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fusionDetectionId', full_name='candig.schemas.candig.FusionDetection.fusionDetectionId', index=23,
number=24, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fusionDetectionIdTier', full_name='candig.schemas.candig.FusionDetection.fusionDetectionIdTier', index=24,
number=25, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentId', full_name='candig.schemas.candig.FusionDetection.alignmentId', index=25,
number=26, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentIdTier', full_name='candig.schemas.candig.FusionDetection.alignmentIdTier', index=26,
number=27, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='site', full_name='candig.schemas.candig.FusionDetection.site', index=27,
number=28, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='siteTier', full_name='candig.schemas.candig.FusionDetection.siteTier', index=28,
number=29, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=3056,
serialized_end=3747,
)
_EXPRESSIONANALYSIS = _descriptor.Descriptor(
name='ExpressionAnalysis',
full_name='candig.schemas.candig.ExpressionAnalysis',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='candig.schemas.candig.ExpressionAnalysis.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='dataset_id', full_name='candig.schemas.candig.ExpressionAnalysis.dataset_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name', full_name='candig.schemas.candig.ExpressionAnalysis.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description', full_name='candig.schemas.candig.ExpressionAnalysis.description', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='created', full_name='candig.schemas.candig.ExpressionAnalysis.created', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='updated', full_name='candig.schemas.candig.ExpressionAnalysis.updated', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='attributes', full_name='candig.schemas.candig.ExpressionAnalysis.attributes', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleId', full_name='candig.schemas.candig.ExpressionAnalysis.sampleId', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sampleIdTier', full_name='candig.schemas.candig.ExpressionAnalysis.sampleIdTier', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='readLength', full_name='candig.schemas.candig.ExpressionAnalysis.readLength', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='readLengthTier', full_name='candig.schemas.candig.ExpressionAnalysis.readLengthTier', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='reference', full_name='candig.schemas.candig.ExpressionAnalysis.reference', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='referenceTier', full_name='candig.schemas.candig.ExpressionAnalysis.referenceTier', index=12,
number=13, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentTool', full_name='candig.schemas.candig.ExpressionAnalysis.alignmentTool', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='alignmentToolTier', full_name='candig.schemas.candig.ExpressionAnalysis.alignmentToolTier', index=14,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bamHandling', full_name='candig.schemas.candig.ExpressionAnalysis.bamHandling', index=15,
number=16, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bamHandlingTier', full_name='candig.schemas.candig.ExpressionAnalysis.bamHandlingTier', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='expressionEstimation', full_name='candig.schemas.candig.ExpressionAnalysis.expressionEstimation', index=17,
number=18, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='expressionEstimationTier', full_name='candig.schemas.candig.ExpressionAnalysis.expressionEstimationTier', index=18,
number=19, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='expressionAnalysisId', full_name='candig.schemas.candig.ExpressionAnalysis.expressionAnalysisId', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='expressionAnalysisIdTier', full_name='candig.schemas.candig.ExpressionAnalysis.expressionAnalysisIdTier', index=20,
number=21, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sequencingId', full_name='candig.schemas.candig.ExpressionAnalysis.sequencingId', index=21,
number=22, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sequencingIdTier', full_name='candig.schemas.candig.ExpressionAnalysis.sequencingIdTier', index=22,
number=23, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='site', full_name='candig.schemas.candig.ExpressionAnalysis.site', index=23,
number=24, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='siteTier', full_name='candig.schemas.candig.ExpressionAnalysis.siteTier', index=24,
number=25, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=3750,
serialized_end=4356,
)
_EXTRACTION.fields_by_name['attributes'].message_type = candig_dot_schemas_dot_candig_dot_common__pb2._ATTRIBUTES
_SEQUENCING.fields_by_name['attributes'].message_type = candig_dot_schemas_dot_candig_dot_common__pb2._ATTRIBUTES
_ALIGNMENT.fields_by_name['attributes'].message_type = candig_dot_schemas_dot_candig_dot_common__pb2._ATTRIBUTES
_VARIANTCALLING.fields_by_name['attributes'].message_type = candig_dot_schemas_dot_candig_dot_common__pb2._ATTRIBUTES
_FUSIONDETECTION.fields_by_name['attributes'].message_type = candig_dot_schemas_dot_candig_dot_common__pb2._ATTRIBUTES
_EXPRESSIONANALYSIS.fields_by_name['attributes'].message_type = candig_dot_schemas_dot_candig_dot_common__pb2._ATTRIBUTES
DESCRIPTOR.message_types_by_name['Extraction'] = _EXTRACTION
DESCRIPTOR.message_types_by_name['Sequencing'] = _SEQUENCING
DESCRIPTOR.message_types_by_name['Alignment'] = _ALIGNMENT
DESCRIPTOR.message_types_by_name['VariantCalling'] = _VARIANTCALLING
DESCRIPTOR.message_types_by_name['FusionDetection'] = _FUSIONDETECTION
DESCRIPTOR.message_types_by_name['ExpressionAnalysis'] = _EXPRESSIONANALYSIS
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Extraction = _reflection.GeneratedProtocolMessageType('Extraction', (_message.Message,), dict(
DESCRIPTOR = _EXTRACTION,
__module__ = 'candig.schemas.candig.pipeline_metadata_pb2'
# @@protoc_insertion_point(class_scope:candig.schemas.candig.Extraction)
))
_sym_db.RegisterMessage(Extraction)
Sequencing = _reflection.GeneratedProtocolMessageType('Sequencing', (_message.Message,), dict(
DESCRIPTOR = _SEQUENCING,
__module__ = 'candig.schemas.candig.pipeline_metadata_pb2'
# @@protoc_insertion_point(class_scope:candig.schemas.candig.Sequencing)
))
_sym_db.RegisterMessage(Sequencing)
Alignment = _reflection.GeneratedProtocolMessageType('Alignment', (_message.Message,), dict(
DESCRIPTOR = _ALIGNMENT,
__module__ = 'candig.schemas.candig.pipeline_metadata_pb2'
# @@protoc_insertion_point(class_scope:candig.schemas.candig.Alignment)
))
_sym_db.RegisterMessage(Alignment)
VariantCalling = _reflection.GeneratedProtocolMessageType('VariantCalling', (_message.Message,), dict(
DESCRIPTOR = _VARIANTCALLING,
__module__ = 'candig.schemas.candig.pipeline_metadata_pb2'
# @@protoc_insertion_point(class_scope:candig.schemas.candig.VariantCalling)
))
_sym_db.RegisterMessage(VariantCalling)
FusionDetection = _reflection.GeneratedProtocolMessageType('FusionDetection', (_message.Message,), dict(
DESCRIPTOR = _FUSIONDETECTION,
__module__ = 'candig.schemas.candig.pipeline_metadata_pb2'
# @@protoc_insertion_point(class_scope:candig.schemas.candig.FusionDetection)
))
_sym_db.RegisterMessage(FusionDetection)
ExpressionAnalysis = _reflection.GeneratedProtocolMessageType('ExpressionAnalysis', (_message.Message,), dict(
DESCRIPTOR = _EXPRESSIONANALYSIS,
__module__ = 'candig.schemas.candig.pipeline_metadata_pb2'
# @@protoc_insertion_point(class_scope:candig.schemas.candig.ExpressionAnalysis)
))
_sym_db.RegisterMessage(ExpressionAnalysis)
# @@protoc_insertion_point(module_scope)
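# ----------------------------------------------------------------------------
# Illustrative usage sketch (not emitted by protoc; added for clarity). It
# assumes only the standard protobuf Python message API — keyword constructor,
# SerializeToString(), FromString() — exposed by the classes registered above.
# Field names come from the Alignment descriptor; the values are hypothetical.
# The __main__ guard keeps import behavior of the module unchanged.
if __name__ == '__main__':
    _example = Alignment(
        name='example-alignment',      # hypothetical value
        sampleId='SAMPLE-001',         # hypothetical value
        sampleIdTier=1,
        alignmentTool='bwa-mem',       # hypothetical value
        alignmentToolTier=1,
    )
    _wire = _example.SerializeToString()        # serialize to the proto3 wire format
    _roundtrip = Alignment.FromString(_wire)    # parse the bytes back into a message
    assert _roundtrip.sampleId == 'SAMPLE-001'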
| 54.223831 | 8,328 | 0.729172 | 11,083 | 82,366 | 5.199044 | 0.037355 | 0.077472 | 0.070565 | 0.07624 | 0.891324 | 0.870412 | 0.787317 | 0.771715 | 0.761736 | 0.760105 | 0 | 0.05132 | 0.135554 | 82,366 | 1,518 | 8,329 | 54.259552 | 0.757953 | 0.007588 | 0 | 0.816092 | 1 | 0.002028 | 0.231971 | 0.194812 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.004733 | 0 | 0.004733 | 0.002028 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
6b0a50bf29dc89ebcf6e147652567c48604a4b1d | 67,713 | py | Python | tests/test_gateway.py | ykris45/aioupnp | b0e44dfb9cddb58065517dadb14a9769af7afc08 | ["MIT"] | null | null | null | tests/test_gateway.py | ykris45/aioupnp | b0e44dfb9cddb58065517dadb14a9769af7afc08 | ["MIT"] | null | null | null | tests/test_gateway.py | ykris45/aioupnp | b0e44dfb9cddb58065517dadb14a9769af7afc08 | ["MIT"] | null | null | null | from aioupnp.fault import UPnPError
from tests import AsyncioTestCase, mock_tcp_and_udp
from collections import OrderedDict
from aioupnp.gateway import Gateway, get_action_list
from aioupnp.serialization.ssdp import SSDPDatagram
def gen_get_bytes(location: str, host: str) -> bytes:
return (
'GET %s HTTP/1.1\r\nAccept-Encoding: gzip\r\nHost: %s\r\nConnection: Close\r\n\r\n' % (location, host)
).encode()
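# Illustrative note (argument values assumed for this sketch): calling
#   gen_get_bytes('/InternetGatewayDevice.xml', '10.0.0.1:49152')
# returns the raw request used to fetch a gateway's device description, i.e.
#   b'GET /InternetGatewayDevice.xml HTTP/1.1\r\nAccept-Encoding: gzip\r\n'
#   b'Host: 10.0.0.1:49152\r\nConnection: Close\r\n\r\n'
# (shown split across two adjacent literals here for readability).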
class TestParseActionList(AsyncioTestCase):
test_action_list = {'actionList': {
'action': [OrderedDict([('name', 'SetConnectionType'), ('argumentList', OrderedDict([('argument', OrderedDict(
[('name', 'NewConnectionType'), ('direction', 'in'), ('relatedStateVariable', 'ConnectionType')]))]))]),
OrderedDict([('name', 'GetConnectionTypeInfo'), ('argumentList', OrderedDict([('argument', [
OrderedDict([('name', 'NewConnectionType'), ('direction', 'out'),
('relatedStateVariable', 'ConnectionType')]), OrderedDict(
[('name', 'NewPossibleConnectionTypes'), ('direction', 'out'),
('relatedStateVariable', 'PossibleConnectionTypes')])])]))]),
OrderedDict([('name', 'RequestConnection')]), OrderedDict([('name', 'ForceTermination')]),
OrderedDict([('name', 'GetStatusInfo'), ('argumentList', OrderedDict([('argument', [OrderedDict(
[('name', 'NewConnectionStatus'), ('direction', 'out'),
('relatedStateVariable', 'ConnectionStatus')]), OrderedDict(
[('name', 'NewLastConnectionError'), ('direction', 'out'),
('relatedStateVariable', 'LastConnectionError')]), OrderedDict(
[('name', 'NewUptime'), ('direction', 'out'), ('relatedStateVariable', 'Uptime')])])]))]),
OrderedDict([('name', 'GetNATRSIPStatus'), ('argumentList', OrderedDict([('argument', [OrderedDict(
[('name', 'NewRSIPAvailable'), ('direction', 'out'),
('relatedStateVariable', 'RSIPAvailable')]), OrderedDict(
[('name', 'NewNATEnabled'), ('direction', 'out'),
('relatedStateVariable', 'NATEnabled')])])]))]), OrderedDict(
[('name', 'GetGenericPortMappingEntry'), ('argumentList', OrderedDict([('argument', [OrderedDict(
[('name', 'NewPortMappingIndex'), ('direction', 'in'),
('relatedStateVariable', 'PortMappingNumberOfEntries')]), OrderedDict(
[('name', 'NewRemoteHost'), ('direction', 'out'), ('relatedStateVariable', 'RemoteHost')]),
OrderedDict(
[('name', 'NewExternalPort'), ('direction', 'out'), ('relatedStateVariable', 'ExternalPort')]),
OrderedDict(
[('name', 'NewProtocol'), ('direction', 'out'),
('relatedStateVariable', 'PortMappingProtocol')]),
OrderedDict([('name', 'NewInternalPort'), ('direction', 'out'),
             ('relatedStateVariable', 'InternalPort')]),
OrderedDict([('name', 'NewInternalClient'), ('direction', 'out'),
             ('relatedStateVariable', 'InternalClient')]),
OrderedDict([('name', 'NewEnabled'), ('direction', 'out'),
             ('relatedStateVariable', 'PortMappingEnabled')]),
OrderedDict([('name', 'NewPortMappingDescription'), ('direction', 'out'),
             ('relatedStateVariable', 'PortMappingDescription')]),
OrderedDict([('name', 'NewLeaseDuration'), ('direction', 'out'),
             ('relatedStateVariable', 'PortMappingLeaseDuration')])])]))]),
OrderedDict([('name', 'GetSpecificPortMappingEntry'), ('argumentList', OrderedDict([('argument', [
OrderedDict(
[('name', 'NewRemoteHost'), ('direction', 'in'), ('relatedStateVariable', 'RemoteHost')]),
OrderedDict([('name', 'NewExternalPort'), ('direction', 'in'),
('relatedStateVariable', 'ExternalPort')]), OrderedDict(
[('name', 'NewProtocol'), ('direction', 'in'),
('relatedStateVariable', 'PortMappingProtocol')]), OrderedDict(
[('name', 'NewInternalPort'), ('direction', 'out'),
('relatedStateVariable', 'InternalPort')]), OrderedDict(
[('name', 'NewInternalClient'), ('direction', 'out'),
('relatedStateVariable', 'InternalClient')]), OrderedDict(
[('name', 'NewEnabled'), ('direction', 'out'),
('relatedStateVariable', 'PortMappingEnabled')]), OrderedDict(
[('name', 'NewPortMappingDescription'), ('direction', 'out'),
('relatedStateVariable', 'PortMappingDescription')]), OrderedDict(
[('name', 'NewLeaseDuration'), ('direction', 'out'),
('relatedStateVariable', 'PortMappingLeaseDuration')])])]))]), OrderedDict(
[('name', 'AddPortMapping'), ('argumentList', OrderedDict([('argument', [
OrderedDict(
[('name', 'NewRemoteHost'), ('direction', 'in'), ('relatedStateVariable', 'RemoteHost')]),
OrderedDict(
[('name', 'NewExternalPort'), ('direction', 'in'), ('relatedStateVariable', 'ExternalPort')]),
OrderedDict(
[('name', 'NewProtocol'), ('direction', 'in'),
('relatedStateVariable', 'PortMappingProtocol')]),
OrderedDict(
[('name', 'NewInternalPort'), ('direction', 'in'), ('relatedStateVariable', 'InternalPort')]),
OrderedDict(
[('name', 'NewInternalClient'), ('direction', 'in'),
('relatedStateVariable', 'InternalClient')]),
OrderedDict(
[('name', 'NewEnabled'), ('direction', 'in'), ('relatedStateVariable', 'PortMappingEnabled')]),
OrderedDict([('name', 'NewPortMappingDescription'), ('direction', 'in'),
('relatedStateVariable', 'PortMappingDescription')]), OrderedDict(
[('name', 'NewLeaseDuration'), ('direction', 'in'),
('relatedStateVariable', 'PortMappingLeaseDuration')])])]))]), OrderedDict(
[('name', 'DeletePortMapping'), ('argumentList', OrderedDict([('argument', [
OrderedDict(
[('name', 'NewRemoteHost'), ('direction', 'in'), ('relatedStateVariable', 'RemoteHost')]),
OrderedDict(
[('name', 'NewExternalPort'), ('direction', 'in'), ('relatedStateVariable', 'ExternalPort')]),
OrderedDict(
[('name', 'NewProtocol'), ('direction', 'in'),
('relatedStateVariable', 'PortMappingProtocol')])])]))]),
OrderedDict([('name', 'GetExternalIPAddress'),
('argumentList', OrderedDict(
[('argument', OrderedDict([('name', 'NewExternalIPAddress'),
('direction', 'out'),
('relatedStateVariable', 'ExternalIPAddress')]))]))])]}}
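    # get_action_list() is expected to flatten the actionList fixture above into
    # (action name, input argument names, output argument names) tuples, checked below.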
    def test_parse_expected_action_list(self):
        expected = [
            ('SetConnectionType', ['NewConnectionType'], []),
            ('GetConnectionTypeInfo', [], ['NewConnectionType', 'NewPossibleConnectionTypes']),
            ('RequestConnection', [], []),
            ('ForceTermination', [], []),
            ('GetStatusInfo', [], ['NewConnectionStatus', 'NewLastConnectionError', 'NewUptime']),
            ('GetNATRSIPStatus', [], ['NewRSIPAvailable', 'NewNATEnabled']),
            ('GetGenericPortMappingEntry', ['NewPortMappingIndex'],
             ['NewRemoteHost', 'NewExternalPort', 'NewProtocol', 'NewInternalPort', 'NewInternalClient',
              'NewEnabled', 'NewPortMappingDescription', 'NewLeaseDuration']),
            ('GetSpecificPortMappingEntry', ['NewRemoteHost', 'NewExternalPort', 'NewProtocol'],
             ['NewInternalPort', 'NewInternalClient', 'NewEnabled', 'NewPortMappingDescription',
              'NewLeaseDuration']),
            ('AddPortMapping',
             ['NewRemoteHost', 'NewExternalPort', 'NewProtocol', 'NewInternalPort',
              'NewInternalClient', 'NewEnabled', 'NewPortMappingDescription', 'NewLeaseDuration'], []),
            ('DeletePortMapping', ['NewRemoteHost', 'NewExternalPort', 'NewProtocol'], []),
            ('GetExternalIPAddress', [], ['NewExternalIPAddress'])
        ]
        self.assertEqual(expected, get_action_list(self.test_action_list))
class TestDiscoverDLinkDIR890L(AsyncioTestCase):
    gateway_info = \
{'manufacturer_string': 'D-Link DIR-890L', 'gateway_address': '10.0.0.1',
'server': 'Linux, UPnP/1.0, DIR-890L Ver 1.20', 'urlBase': 'http://10.0.0.1:49152',
'location': 'http://10.0.0.1:49152/InternetGatewayDevice.xml', 'specVersion': {'major': '1', 'minor': '0'},
'usn': 'uuid:11111111-2222-3333-4444-555555555555::urn:schemas-upnp-org:device:WANDevice:1',
'urn': 'urn:schemas-upnp-org:device:WANDevice:1',
'gateway_xml': 'HTTP/1.1 200 OK\r\nServer: WebServer\r\nDate: Thu, 11 Oct 2018 22:16:16 GMT\r\nContent-Type: text/xml\r\nContent-Length: 3921\r\nLast-Modified: Thu, 09 Aug 2018 12:41:07 GMT\r\nConnection: close\r\n\r\n<?xml version="1.0"?>\n<root xmlns="urn:schemas-upnp-org:device-1-0">\n\t<specVersion>\n\t\t<major>1</major>\n\t\t<minor>0</minor>\n\t</specVersion>\n\t<URLBase>http://10.0.0.1:49152</URLBase>\n\t<device>\n\t\t<deviceType>urn:schemas-upnp-org:device:InternetGatewayDevice:1</deviceType>\n\t\t<friendlyName>Wireless Broadband Router</friendlyName>\n\t\t<manufacturer>D-Link Corporation</manufacturer>\n\t\t<manufacturerURL>http://www.dlink.com</manufacturerURL>\n\t\t<modelDescription>D-Link Router</modelDescription>\n\t\t<modelName>D-Link Router</modelName>\n\t\t<modelNumber>DIR-890L</modelNumber>\n\t\t<modelURL>http://www.dlink.com</modelURL>\n\t\t<serialNumber>120</serialNumber>\n\t\t<UDN>uuid:11111111-2222-3333-4444-555555555555</UDN>\n\t\t<iconList>\n\t\t\t<icon>\n\t\t\t\t<mimetype>image/gif</mimetype>\n\t\t\t\t<width>118</width>\n\t\t\t\t<height>119</height>\n\t\t\t\t<depth>8</depth>\n\t\t\t\t<url>/ligd.gif</url>\n\t\t\t</icon>\n\t\t</iconList>\n\t\t<serviceList>\n\t\t\t<service>\n\t\t\t\t<serviceType>urn:schemas-microsoft-com:service:OSInfo:1</serviceType>\n\t\t\t\t<serviceId>urn:microsoft-com:serviceId:OSInfo1</serviceId>\n\t\t\t\t<controlURL>/soap.cgi?service=OSInfo1</controlURL>\n\t\t\t\t<eventSubURL>/gena.cgi?service=OSInfo1</eventSubURL>\n\t\t\t\t<SCPDURL>/OSInfo.xml</SCPDURL>\n\t\t\t</service>\n\t\t\t<service>\n\t\t\t\t<serviceType>urn:schemas-upnp-org:service:Layer3Forwarding:1</serviceType>\n\t\t\t\t<serviceId>urn:upnp-org:serviceId:L3Forwarding1</serviceId>\n\t\t\t\t<controlURL>/soap.cgi?service=L3Forwarding1</controlURL>\n\t\t\t\t<eventSubURL>/gena.cgi?service=L3Forwarding1</eventSubURL>\n\t\t\t\t<SCPDURL>/Layer3Forwarding.xml</SCPDURL>\n\t\t\t</service>\n\t\t</serviceList>\n\t\t<deviceList>\n\t\t\t<device>\n\t\t\t\t<deviceType>urn:schemas-upnp-org:device:WANDevice:1</deviceType>\n\t\t\t\t<friendlyName>WANDevice</friendlyName>\n\t\t\t\t<manufacturer>D-Link</manufacturer>\n\t\t\t\t<manufacturerURL>http://www.dlink.com</manufacturerURL>\n\t\t\t\t<modelDescription>WANDevice</modelDescription>\n\t\t\t\t<modelName>DIR-890L</modelName>\n\t\t\t\t<modelNumber>1</modelNumber>\n\t\t\t\t<modelURL>http://www.dlink.com</modelURL>\n\t\t\t\t<serialNumber>120</serialNumber>\n\t\t\t\t<UDN>uuid:11111111-2222-3333-4444-555555555555</UDN>\n\t\t\t\t<serviceList>\n\t\t\t\t\t<service>\n\t\t\t\t\t\t<serviceType>urn:schemas-upnp-org:service:WANCommonInterfaceConfig:1</serviceType>\n\t\t\t\t\t\t<serviceId>urn:upnp-org:serviceId:WANCommonIFC1</serviceId>\n\t\t\t\t\t\t<controlURL>/soap.cgi?service=WANCommonIFC1</controlURL>\n\t\t\t\t\t\t<eventSubURL>/gena.cgi?service=WANCommonIFC1</eventSubURL>\n\t\t\t\t\t\t<SCPDURL>/WANCommonInterfaceConfig.xml</SCPDURL>\n\t\t\t\t\t</service>\n\t\t\t\t</serviceList>\n\t\t\t\t<deviceList>\n\t\t\t\t\t<device>\n\t\t\t\t\t\t<deviceType>urn:schemas-upnp-org:device:WANConnectionDevice:1</deviceType>\n\t\t\t\t\t\t<friendlyName>WANConnectionDevice</friendlyName>\n\t\t\t\t\t\t<manufacturer>D-Link</manufacturer>\n\t\t\t\t\t\t<manufacturerURL>http://www.dlink.com</manufacturerURL>\n\t\t\t\t\t\t<modelDescription>WanConnectionDevice</modelDescription>\n\t\t\t\t\t\t<modelName>DIR-890L</modelName>\n\t\t\t\t\t\t<modelNumber>1</modelNumber>\n\t\t\t\t\t\t<modelURL>http://www.dlink.com</modelURL>\n\t\t\t\t\t\t<serialNumber>120</serialNumber>\n\t\t\t\t\t\t<UDN>uuid:11111111-2222
-3333-4444-555555555555</UDN>\n\t\t\t\t\t\t<serviceList>\n\t\t\t\t\t\t\t<service>\n\t\t\t\t\t\t\t\t<serviceType>urn:schemas-upnp-org:service:WANEthernetLinkConfig:1</serviceType>\n\t\t\t\t\t\t\t\t<serviceId>urn:upnp-org:serviceId:WANEthLinkC1</serviceId>\n\t\t\t\t\t\t\t\t<controlURL>/soap.cgi?service=WANEthLinkC1</controlURL>\n\t\t\t\t\t\t\t\t<eventSubURL>/gena.cgi?service=WANEthLinkC1</eventSubURL>\n\t\t\t\t\t\t\t\t<SCPDURL>/WANEthernetLinkConfig.xml</SCPDURL>\n\t\t\t\t\t\t\t</service>\n\t\t\t\t\t\t\t<service>\n\t\t\t\t\t\t\t\t<serviceType>urn:schemas-upnp-org:service:WANIPConnection:1</serviceType>\n\t\t\t\t\t\t\t\t<serviceId>urn:upnp-org:serviceId:WANIPConn1</serviceId>\n\t\t\t\t\t\t\t\t<controlURL>/soap.cgi?service=WANIPConn1</controlURL>\n\t\t\t\t\t\t\t\t<eventSubURL>/gena.cgi?service=WANIPConn1</eventSubURL>\n\t\t\t\t\t\t\t\t<SCPDURL>/WANIPConnection.xml</SCPDURL>\n\t\t\t\t\t\t\t</service>\n\t\t\t\t\t\t</serviceList>\n\t\t\t\t\t</device>\n\t\t\t\t</deviceList>\n\t\t\t</device>\n\t\t</deviceList>\n\t\t<presentationURL>http://10.0.0.1</presentationURL>\n\t</device>\n</root>\n',
'services_xml': {
'/OSInfo.xml': 'HTTP/1.1 200 OK\r\nServer: WebServer\r\nDate: Thu, 11 Oct 2018 22:16:16 GMT\r\nContent-Type: text/xml\r\nContent-Length: 219\r\nLast-Modified: Thu, 09 Aug 2018 12:41:07 GMT\r\nConnection: close\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0">\n\t<specVersion>\n\t\t<major>1</major>\n\t\t<minor>0</minor>\n\t</specVersion>\n\t<actionList>\n\t</actionList>\n\t<serviceStateTable>\n\t</serviceStateTable>\n</scpd>\n',
'/Layer3Forwarding.xml': 'HTTP/1.1 200 OK\r\nServer: WebServer\r\nDate: Thu, 11 Oct 2018 22:16:16 GMT\r\nContent-Type: text/xml\r\nContent-Length: 920\r\nLast-Modified: Thu, 09 Aug 2018 12:41:07 GMT\r\nConnection: close\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0">\n\t<specVersion>\n\t\t<major>1</major>\n\t\t<minor>0</minor>\n\t</specVersion>\n\t<actionList>\n\t\t<action>\n\t\t\t<name>GetDefaultConnectionService</name>\n\t\t\t<argumentList>\n\t\t\t\t<argument>\n\t\t\t\t\t<name>NewDefaultConnectionService</name>\n\t\t\t\t\t<direction>out</direction>\n\t\t\t\t\t<relatedStateVariable>DefaultConnectionService</relatedStateVariable>\n\t\t\t\t</argument>\n\t\t\t</argumentList>\n\t\t</action>\n\t\t<action>\n\t\t\t<name>SetDefaultConnectionService</name>\n\t\t\t<argumentList>\n\t\t\t\t<argument>\n\t\t\t\t\t<name>NewDefaultConnectionService</name>\n\t\t\t\t\t<direction>in</direction>\n\t\t\t\t\t<relatedStateVariable>DefaultConnectionService</relatedStateVariable>\n\t\t\t\t</argument>\n\t\t\t</argumentList>\n\t\t</action>\n\t</actionList>\n\t<serviceStateTable>\n\t\t<stateVariable sendEvents="yes">\n\t\t\t<name>DefaultConnectionService</name>\n\t\t\t<dataType>string</dataType>\n\t\t</stateVariable>\n\t</serviceStateTable>\n</scpd>\n',
'/WANCommonInterfaceConfig.xml': 'HTTP/1.1 200 OK\r\nServer: WebServer\r\nDate: Thu, 11 Oct 2018 22:16:16 GMT\r\nContent-Type: text/xml\r\nContent-Length: 5343\r\nLast-Modified: Thu, 09 Aug 2018 12:41:07 GMT\r\nConnection: close\r\n\r\n<?xml version="1.0"?>\r\n<scpd xmlns="urn:schemas-upnp-org:service-1-0">\r\n\t<specVersion>\r\n\t\t<major>1</major>\r\n\t\t<minor>0</minor>\r\n\t</specVersion>\r\n\t<actionList>\r\n\t\t<action>\r\n\t\t\t<name>GetCommonLinkProperties</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewWANAccessType</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>WANAccessType</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewLayer1UpstreamMaxBitRate</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>Layer1UpstreamMaxBitRate</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewLayer1DownstreamMaxBitRate</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>Layer1DownstreamMaxBitRate</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewPhysicalLinkStatus</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PhysicalLinkStatus</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetTotalBytesSent</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewTotalBytesSent</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalBytesSent</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetTotalBytesReceived</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewTotalBytesReceived</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalBytesReceived</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetTotalPacketsSent</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewTotalPacketsSent</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalPacketsSent</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetTotalPacketsReceived</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewTotalPacketsReceived</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalPacketsReceived</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>X_GetICSStatistics</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>TotalBytesSent</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalBytesSent</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>TotalBytesReceived</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalBytesReceived</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>TotalPacketsSent</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>TotalPacketsSent</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>TotalPacketsReceived</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedSta
teVariable>TotalPacketsReceived</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>Layer1DownstreamMaxBitRate</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>Layer1DownstreamMaxBitRate</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>Uptime</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>X_Uptime</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t</actionList>\r\n\t<serviceStateTable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>WANAccessType</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<allowedValueList>\r\n\t\t\t\t<allowedValue>DSL</allowedValue>\r\n\t\t\t\t<allowedValue>POTS</allowedValue>\r\n\t\t\t\t<allowedValue>Cable</allowedValue>\r\n\t\t\t\t<allowedValue>Ethernet</allowedValue>\r\n\t\t\t\t<allowedValue>Other</allowedValue>\r\n\t\t\t</allowedValueList>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>Layer1UpstreamMaxBitRate</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>Layer1DownstreamMaxBitRate</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="yes">\r\n\t\t\t<name>PhysicalLinkStatus</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<allowedValueList>\r\n\t\t\t\t<allowedValue>Up</allowedValue>\r\n\t\t\t\t<allowedValue>Down</allowedValue>\r\n\t\t\t\t<allowedValue>Initializing</allowedValue>\r\n\t\t\t\t<allowedValue>Unavailable</allowedValue>\r\n\t\t\t</allowedValueList>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>WANAccessProvider</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>MaximumActiveConnections</name>\r\n\t\t\t<dataType>ui2</dataType>\r\n\t\t\t<allowedValueRange>\r\n\t\t\t\t<minimum>1</minimum>\r\n\t\t\t\t<maximum></maximum>\r\n\t\t\t\t<step>1</step>\r\n\t\t\t</allowedValueRange>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>TotalBytesSent</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>TotalBytesReceived</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>TotalPacketsSent</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>TotalPacketsReceived</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>X_PersonalFirewallEnabled</name>\r\n\t\t\t<dataType>boolean</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>X_Uptime</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t</serviceStateTable>\r\n</scpd>\r\n',
'/WANEthernetLinkConfig.xml': 'HTTP/1.1 200 OK\r\nServer: WebServer\r\nDate: Thu, 11 Oct 2018 22:16:16 GMT\r\nContent-Type: text/xml\r\nContent-Length: 773\r\nLast-Modified: Thu, 09 Aug 2018 12:41:07 GMT\r\nConnection: close\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0">\n\t<specVersion>\n\t\t<major>1</major>\n\t\t<minor>0</minor>\n\t</specVersion>\n\t<actionList>\n\t\t<action>\n\t\t\t<name>GetEthernetLinkStatus</name>\n\t\t\t<argumentList>\n\t\t\t\t<argument>\n\t\t\t\t\t<name>NewEthernetLinkStatus</name>\n\t\t\t\t\t<direction>out</direction>\n\t\t\t\t\t<relatedStateVariable>EthernetLinkStatus</relatedStateVariable>\n\t\t\t\t</argument>\n\t\t\t</argumentList>\n\t\t</action>\n\t</actionList>\n\t<serviceStateTable>\n\t\t<stateVariable sendEvents="yes">\n\t\t\t<name>EthernetLinkStatus</name>\n\t\t\t<dataType>string</dataType>\n\t\t\t<allowedValueList>\n\t\t\t\t<allowedValue>Up</allowedValue>\n\t\t\t\t<allowedValue>Down</allowedValue>\n\t\t\t\t<allowedValue>Unavailable</allowedValue>\n\t\t\t</allowedValueList>\n\t\t</stateVariable>\n\t</serviceStateTable>\n</scpd>\n',
'/WANIPConnection.xml': 'HTTP/1.1 200 OK\r\nServer: WebServer\r\nDate: Thu, 11 Oct 2018 22:16:16 GMT\r\nContent-Type: text/xml\r\nContent-Length: 12078\r\nLast-Modified: Thu, 09 Aug 2018 12:41:07 GMT\r\nConnection: close\r\n\r\n<?xml version="1.0"?>\r\n<scpd xmlns="urn:schemas-upnp-org:service-1-0">\r\n\t<specVersion>\r\n\t\t<major>1</major>\r\n\t\t<minor>0</minor>\r\n\t</specVersion>\r\n\t<actionList>\r\n\t\t<action>\r\n\t\t\t<name>SetConnectionType</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewConnectionType</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>ConnectionType</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action> \r\n\t\t<action>\r\n\t\t\t<name>GetConnectionTypeInfo</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewConnectionType</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>ConnectionType</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewPossibleConnectionTypes</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PossibleConnectionTypes</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>RequestConnection</name>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>ForceTermination</name>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetStatusInfo</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewConnectionStatus</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>ConnectionStatus</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewLastConnectionError</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>LastConnectionError</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewUptime</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>Uptime</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetNATRSIPStatus</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewRSIPAvailable</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>RSIPAvailable</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewNATEnabled</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>NATEnabled</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetGenericPortMappingEntry</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewPortMappingIndex</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingNumberOfEntries</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewRemoteHost</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>RemoteHost</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewExternalPort</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>ExternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewProtocol</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingProtocol</relatedStateVariable>\r\n\t\t\t\t</argument
>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewInternalPort</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>InternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewInternalClient</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>InternalClient</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewEnabled</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingEnabled</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewPortMappingDescription</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingDescription</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewLeaseDuration</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingLeaseDuration</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetSpecificPortMappingEntry</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewRemoteHost</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>RemoteHost</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewExternalPort</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>ExternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewProtocol</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingProtocol</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewInternalPort</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>InternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewInternalClient</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>InternalClient</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewEnabled</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingEnabled</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewPortMappingDescription</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingDescription</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewLeaseDuration</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingLeaseDuration</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>AddPortMapping</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewRemoteHost</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>RemoteHost</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewExternalPort</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>ExternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewProtocol</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingProtocol</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewInternalPort</name>\r\
n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>InternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewInternalClient</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>InternalClient</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewEnabled</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingEnabled</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewPortMappingDescription</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingDescription</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewLeaseDuration</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingLeaseDuration</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>DeletePortMapping</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewRemoteHost</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>RemoteHost</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewExternalPort</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>ExternalPort</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewProtocol</name>\r\n\t\t\t\t\t<direction>in</direction>\r\n\t\t\t\t\t<relatedStateVariable>PortMappingProtocol</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t\t<action>\r\n\t\t\t<name>GetExternalIPAddress</name>\r\n\t\t\t<argumentList>\r\n\t\t\t\t<argument>\r\n\t\t\t\t\t<name>NewExternalIPAddress</name>\r\n\t\t\t\t\t<direction>out</direction>\r\n\t\t\t\t\t<relatedStateVariable>ExternalIPAddress</relatedStateVariable>\r\n\t\t\t\t</argument>\r\n\t\t\t</argumentList>\r\n\t\t</action>\r\n\t</actionList>\r\n\t<serviceStateTable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>ConnectionType</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<defaultValue>Unconfigured</defaultValue>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="yes">\r\n\t\t\t<name>PossibleConnectionTypes</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<allowedValueList>\r\n\t\t\t\t<allowedValue>Unconfigured</allowedValue>\r\n\t\t\t\t<allowedValue>IP_Routed</allowedValue>\r\n\t\t\t\t<allowedValue>IP_Bridged</allowedValue>\r\n\t\t\t</allowedValueList>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="yes">\r\n\t\t\t<name>ConnectionStatus</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<defaultValue>Unconfigured</defaultValue>\r\n\t\t\t<allowedValueList>\r\n\t\t\t\t<allowedValue>Unconfigured</allowedValue>\r\n\t\t\t\t<allowedValue>Connecting</allowedValue>\r\n\t\t\t\t<allowedValue>Authenticating</allowedValue>\r\n\t\t\t\t<allowedValue>PendingDisconnect</allowedValue>\r\n\t\t\t\t<allowedValue>Disconnecting</allowedValue>\r\n\t\t\t\t<allowedValue>Disconnected</allowedValue>\r\n\t\t\t\t<allowedValue>Connected</allowedValue>\r\n\t\t\t</allowedValueList>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable 
sendEvents="no">\r\n\t\t\t<name>Uptime</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t\t<defaultValue>0</defaultValue>\r\n\t\t\t<allowedValueRange>\r\n\t\t\t\t<minimum>0</minimum>\r\n\t\t\t\t<maximum></maximum>\r\n\t\t\t\t<step>1</step>\r\n\t\t\t</allowedValueRange>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>RSIPAvailable</name>\r\n\t\t<dataType>boolean</dataType>\r\n\t\t\t<defaultValue>0</defaultValue>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>NATEnabled</name>\r\n\t\t\t<dataType>boolean</dataType>\r\n\t\t\t<defaultValue>1</defaultValue>\r\n\t\t</stateVariable> \r\n\t\t<stateVariable sendEvents="yes">\r\n\t\t\t<name>X_Name</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>LastConnectionError</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<defaultValue>ERROR_NONE</defaultValue>\r\n\t\t\t<allowedValueList>\r\n\t\t\t\t<allowedValue>ERROR_NONE</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_ISP_TIME_OUT</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_COMMAND_ABORTED</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_NOT_ENABLED_FOR_INTERNET</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_BAD_PHONE_NUMBER</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_USER_DISCONNECT</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_ISP_DISCONNECT</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_IDLE_DISCONNECT</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_FORCED_DISCONNECT</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_SERVER_OUT_OF_RESOURCES</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_RESTRICTED_LOGON_HOURS</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_ACCOUNT_DISABLED</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_ACCOUNT_EXPIRED</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_PASSWORD_EXPIRED</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_AUTHENTICATION_FAILURE</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_NO_DIALTONE</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_NO_CARRIER</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_NO_ANSWER</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_LINE_BUSY</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_UNSUPPORTED_BITSPERSECOND</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_TOO_MANY_LINE_ERRORS</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_IP_CONFIGURATION</allowedValue>\r\n\t\t\t\t<allowedValue>ERROR_UNKNOWN</allowedValue>\r\n\t\t\t</allowedValueList>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="yes">\r\n\t\t\t<name>ExternalIPAddress</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>RemoteHost</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>ExternalPort</name>\r\n\t\t\t<dataType>ui2</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>InternalPort</name>\r\n\t\t\t<dataType>ui2</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>PortMappingProtocol</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t\t<allowedValueList>\r\n\t\t\t\t<allowedValue>TCP</allowedValue>\r\n\t\t\t\t<allowedValue>UDP</allowedValue>\r\n\t\t\t</allowedValueList>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>InternalClient</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable 
sendEvents="no">\r\n\t\t\t<name>PortMappingDescription</name>\r\n\t\t\t<dataType>string</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>PortMappingEnabled</name>\r\n\t\t\t<dataType>boolean</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="no">\r\n\t\t\t<name>PortMappingLeaseDuration</name>\r\n\t\t\t<dataType>ui4</dataType>\r\n\t\t</stateVariable>\r\n\t\t<stateVariable sendEvents="yes">\r\n\t\t\t<name>PortMappingNumberOfEntries</name>\r\n\t\t\t<dataType>ui2</dataType>\r\n\t\t</stateVariable>\r\n\t</serviceStateTable>\r\n</scpd>\r\n'},
'services': {'/OSInfo.xml': OrderedDict([('serviceType', 'urn:schemas-microsoft-com:service:OSInfo:1'),
('serviceId', 'urn:microsoft-com:serviceId:OSInfo1'),
('controlURL', '/soap.cgi?service=OSInfo1'),
('eventSubURL', '/gena.cgi?service=OSInfo1'),
('SCPDURL', '/OSInfo.xml')]), '/Layer3Forwarding.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:Layer3Forwarding:1'),
('serviceId', 'urn:upnp-org:serviceId:L3Forwarding1'), ('controlURL', '/soap.cgi?service=L3Forwarding1'),
('eventSubURL', '/gena.cgi?service=L3Forwarding1'), ('SCPDURL', '/Layer3Forwarding.xml')]),
'/WANCommonInterfaceConfig.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:WANCommonInterfaceConfig:1'),
('serviceId', 'urn:upnp-org:serviceId:WANCommonIFC1'),
('controlURL', '/soap.cgi?service=WANCommonIFC1'),
('eventSubURL', '/gena.cgi?service=WANCommonIFC1'),
('SCPDURL', '/WANCommonInterfaceConfig.xml')]), '/WANEthernetLinkConfig.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:WANEthernetLinkConfig:1'),
('serviceId', 'urn:upnp-org:serviceId:WANEthLinkC1'),
('controlURL', '/soap.cgi?service=WANEthLinkC1'), ('eventSubURL', '/gena.cgi?service=WANEthLinkC1'),
('SCPDURL', '/WANEthernetLinkConfig.xml')]), '/WANIPConnection.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:WANIPConnection:1'),
('serviceId', 'urn:upnp-org:serviceId:WANIPConn1'), ('controlURL', '/soap.cgi?service=WANIPConn1'),
('eventSubURL', '/gena.cgi?service=WANIPConn1'), ('SCPDURL', '/WANIPConnection.xml')])},
'reply': OrderedDict(
[('CACHE_CONTROL', 'max-age=1800'), ('LOCATION', 'http://10.0.0.1:49152/InternetGatewayDevice.xml'),
('SERVER', 'Linux, UPnP/1.0, DIR-890L Ver 1.20'), ('ST', 'urn:schemas-upnp-org:device:WANDevice:1'),
('USN', 'uuid:11111111-2222-3333-4444-555555555555::urn:schemas-upnp-org:device:WANDevice:1')]),
'soap_port': 49152,
'registered_soap_commands': {'GetGenericPortMappingEntry': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'GetSpecificPortMappingEntry': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'AddPortMapping': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'DeletePortMapping': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'GetExternalIPAddress': 'urn:schemas-upnp-org:service:WANIPConnection:1'},
'unsupported_soap_commands': {
'urn:schemas-upnp-org:service:Layer3Forwarding:1': ['GetDefaultConnectionService',
'SetDefaultConnectionService'],
'urn:schemas-upnp-org:service:WANCommonInterfaceConfig:1': ['GetCommonLinkProperties', 'GetTotalBytesSent',
'GetTotalBytesReceived', 'GetTotalPacketsSent',
'GetTotalPacketsReceived',
'X_GetICSStatistics'],
'urn:schemas-upnp-org:service:WANEthernetLinkConfig:1': ['GetEthernetLinkStatus'],
'urn:schemas-upnp-org:service:WANIPConnection:1': ['SetConnectionType', 'GetConnectionTypeInfo',
'RequestConnection', 'ForceTermination',
'GetStatusInfo', 'GetNATRSIPStatus']},
'soap_requests': []}
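    # The fixture above bundles everything the tests need for one device: the recorded SSDP reply,
    # the root device and SCPD XML documents, and the services/commands expected after discovery.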
    client_address = "10.0.0.2"

    def setUp(self) -> None:
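        # Map each expected raw HTTP GET request (one per SCPD document) to the canned XML reply
        # from the fixture, so the mocked TCP transport can serve them verbatim.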
        self.replies = {
            (
                f"GET {path} HTTP/1.1\r\n"
                f"Accept-Encoding: gzip\r\n"
                f"Host: {self.gateway_info['gateway_address']}\r\n"
                f"Connection: Close\r\n"
                f"\r\n"
            ).encode(): xml_bytes.encode()
            for path, xml_bytes in self.gateway_info['services_xml'].items()
        }
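        # Also serve the root device description. lstrip() removes the urlBase characters from the
        # front of the location URL, which for these fixtures leaves the bare descriptor path.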
        self.replies.update({
            (
                f"GET /{self.gateway_info['location'].lstrip(self.gateway_info['urlBase'])} HTTP/1.1\r\n"
                f"Accept-Encoding: gzip\r\n"
                f"Host: {self.gateway_info['gateway_address']}\r\n"
                f"Connection: Close\r\n"
                f"\r\n"
            ).encode(): self.gateway_info['gateway_xml'].encode()
        })
        super().setUp()
    async def test_discover_gateway(self):
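        # No UDP replies are registered with the mocked transports, so the M-SEARCH should time out
        # and discover_gateway should raise UPnPError both times it is attempted.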
        with self.assertRaises(UPnPError) as e1:
            with mock_tcp_and_udp(self.loop):
                await Gateway.discover_gateway(self.client_address, self.gateway_info['gateway_address'], 2,
                                               loop=self.loop)
        with self.assertRaises(UPnPError) as e2:
            with mock_tcp_and_udp(self.loop):
                await Gateway.discover_gateway(self.client_address, self.gateway_info['gateway_address'], 2,
                                               loop=self.loop)
        self.assertEqual(str(e1.exception), f"M-SEARCH for {self.gateway_info['gateway_address']}:1900 timed out")
        self.assertEqual(str(e2.exception), f"M-SEARCH for {self.gateway_info['gateway_address']}:1900 timed out")
    async def test_discover_commands(self):
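        # With the canned TCP replies in place, build a Gateway from the recorded SSDP response and
        # check that the SOAP commands it discovers (and its full debug dump) match the fixture.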
        with mock_tcp_and_udp(self.loop, tcp_replies=self.replies):
            gateway = Gateway(
                SSDPDatagram("OK", self.gateway_info['reply']),
                self.client_address, self.gateway_info['gateway_address'], loop=self.loop
            )
            await gateway.discover_commands()
            self.assertDictEqual(self.gateway_info['registered_soap_commands'], gateway._registered_commands)
            self.assertDictEqual(gateway.debug_gateway(), self.gateway_info)
class TestDiscoverNetgearNighthawkAC2350(TestDiscoverDLinkDIR890L):
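    # Reuses the discovery test cases from TestDiscoverDLinkDIR890L, swapping in fixture data
    # captured from a NETGEAR R7500v2 (miniupnpd) so the same assertions run against a second device.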
    gateway_info = {'manufacturer_string': 'NETGEAR NETGEAR Nighthawk X4 AC2350 Smart WiFi Router',
'gateway_address': '192.168.0.1', 'server': 'R7500v2 UPnP/1.0 miniupnpd/1.0',
'urlBase': 'http://192.168.0.1:5555', 'location': 'http://192.168.0.1:5555/rootDesc.xml',
'specVersion': {'major': '1', 'minor': '0'},
'usn': 'uuid:11111111-2222-3333-4444-555555555555::upnp:rootdevice', 'urn': 'upnp:rootdevice',
'gateway_xml': 'HTTP/1.1 200 OK\r\nContent-Type: text/xml; charset="utf-8"\r\nConnection: close\r\nContent-Length: 3720\r\nServer: R7500v2 UPnP/1.0 miniupnpd/1.0\r\nExt: \r\nContent-Language: en-US\r\n\r\n<?xml version="1.0"?>\n<root xmlns="urn:schemas-upnp-org:device-1-0" \txmlns:pnpx="http://schemas.microsoft.com/windows/pnpx/2005/11" \txmlns:df="http://schemas.microsoft.com/windows/2008/09/devicefoundation"><specVersion><major>1</major><minor>0</minor></specVersion><URLBase>http://192.168.0.1:5555</URLBase><device><pnpx:X_hardwareId>VEN_01f2&DEV_0018&REV_02 VEN_01f2&DEV_8000&SUBSYS_01&REV_01 VEN_01f2&DEV_8000&REV_01 VEN_0033&DEV_0008&REV_01</pnpx:X_hardwareId><pnpx:X_compatibleId>urn:schemas-upnp-org:device:InternetGatewayDevice:1</pnpx:X_compatibleId><pnpx:X_deviceCategory>NetworkInfrastructure.Router</pnpx:X_deviceCategory><df:X_deviceCategory>Network.Router.Wireless</df:X_deviceCategory><deviceType>urn:schemas-upnp-org:device:InternetGatewayDevice:1</deviceType><friendlyName>R7500v2 (Gateway)</friendlyName><manufacturer>NETGEAR, Inc.</manufacturer><manufacturerURL>http://www.netgear.com</manufacturerURL><modelDescription>NETGEAR R7500v2 NETGEAR Nighthawk X4 AC2350 Smart WiFi Router</modelDescription><modelName>NETGEAR Nighthawk X4 AC2350 Smart WiFi Router</modelName><modelNumber>R7500v2</modelNumber><modelURL>http://www.netgear.com/home/products/wirelessrouters</modelURL><serialNumber>v1</serialNumber><UDN>uuid:11111111-2222-3333-4444-555555555555</UDN><UPC>606449084528</UPC><serviceList><service><serviceType>urn:schemas-upnp-org:service:Layer3Forwarding:1</serviceType><serviceId>urn:upnp-org:serviceId:L3Forwarding1</serviceId><controlURL>/ctl/L3Forwarding</controlURL><eventSubURL>/evt/L3Forwarding</eventSubURL><SCPDURL>/Layer3F.xml</SCPDURL></service></serviceList><deviceList><device><deviceType>urn:schemas-upnp-org:device:WANDevice:1</deviceType><friendlyName>WAN Device</friendlyName><manufacturer>NETGEAR</manufacturer><manufacturerURL>http://www.netgear.com</manufacturerURL><modelDescription>WAN Device on NETGEAR R7500v2 Wireless Router</modelDescription><modelName>NETGEAR Nighthawk X4 AC2350 Smart WiFi Router</modelName><modelNumber>R7500v2</modelNumber><modelURL>http://www.netgear.com</modelURL><serialNumber>v1</serialNumber><UDN>uuid:11111111-2222-3333-4444-555555555555</UDN><UPC>1234567890ab</UPC><serviceList><service><serviceType>urn:schemas-upnp-org:service:WANCommonInterfaceConfig:1</serviceType><serviceId>urn:upnp-org:serviceId:WANCommonIFC1</serviceId><controlURL>/ctl/CommonIfCfg</controlURL><eventSubURL>/evt/CommonIfCfg</eventSubURL><SCPDURL>/WANCfg.xml</SCPDURL></service></serviceList><deviceList><device><deviceType>urn:schemas-upnp-org:device:WANConnectionDevice:1</deviceType><friendlyName>WAN Connection Device</friendlyName><manufacturer>NETGEAR</manufacturer><manufacturerURL>http://www.netgear.com</manufacturerURL><modelDescription>WANConnectionDevice on NETGEAR R7500v2 Wireless Router</modelDescription><modelName>NETGEAR Nighthawk X4 AC2350 Smart WiFi 
Router</modelName><modelNumber>R7500v2</modelNumber><modelURL>http://www.netgear.com</modelURL><serialNumber>v1</serialNumber><UDN>uuid:4d696e69-444c-164e-9d44-b0b98a4cd3c3</UDN><UPC>1234567890ab</UPC><serviceList><service><serviceType>urn:schemas-upnp-org:service:WANEthernetLinkConfig:1</serviceType><serviceId>urn:upnp-org:serviceId:WANEthLinkC1</serviceId><controlURL>/ctl/WanEth</controlURL><eventSubURL>/evt/WanEth</eventSubURL><SCPDURL>/WanEth.xml</SCPDURL></service><service><serviceType>urn:schemas-upnp-org:service:WANIPConnection:1</serviceType><serviceId>urn:upnp-org:serviceId:WANIPConn1</serviceId><controlURL>/ctl/IPConn</controlURL><eventSubURL>/evt/IPConn</eventSubURL><SCPDURL>/WANIPCn.xml</SCPDURL></service></serviceList></device></deviceList></device></deviceList><presentationURL>http://www.routerlogin.net</presentationURL></device></root>',
'services_xml': {
'/Layer3F.xml': 'HTTP/1.1 200 OK\r\nContent-Type: text/xml; charset="utf-8"\r\nConnection: close\r\nContent-Length: 794\r\nServer: R7500v2 UPnP/1.0 miniupnpd/1.0\r\nExt: \r\nContent-Language: en-US\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0"><specVersion><major>1</major><minor>0</minor></specVersion><actionList><action><name>SetDefaultConnectionService</name><argumentList><argument><name>NewDefaultConnectionService</name><direction>in</direction><relatedStateVariable>DefaultConnectionService</relatedStateVariable></argument></argumentList></action><action><name>GetDefaultConnectionService</name><argumentList><argument><name>NewDefaultConnectionService</name><direction>out</direction><relatedStateVariable>DefaultConnectionService</relatedStateVariable></argument></argumentList></action></actionList><serviceStateTable><stateVariable sendEvents="yes"><name>DefaultConnectionService</name><dataType>string</dataType></stateVariable></serviceStateTable></scpd>',
'/WANCfg.xml': 'HTTP/1.1 200 OK\r\nContent-Type: text/xml; charset="utf-8"\r\nConnection: close\r\nContent-Length: 2942\r\nServer: R7500v2 UPnP/1.0 miniupnpd/1.0\r\nExt: \r\nContent-Language: en-US\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0"><specVersion><major>1</major><minor>0</minor></specVersion><actionList><action><name>GetCommonLinkProperties</name><argumentList><argument><name>NewWANAccessType</name><direction>out</direction><relatedStateVariable>WANAccessType</relatedStateVariable></argument><argument><name>NewLayer1UpstreamMaxBitRate</name><direction>out</direction><relatedStateVariable>Layer1UpstreamMaxBitRate</relatedStateVariable></argument><argument><name>NewLayer1DownstreamMaxBitRate</name><direction>out</direction><relatedStateVariable>Layer1DownstreamMaxBitRate</relatedStateVariable></argument><argument><name>NewPhysicalLinkStatus</name><direction>out</direction><relatedStateVariable>PhysicalLinkStatus</relatedStateVariable></argument></argumentList></action><action><name>GetTotalBytesSent</name><argumentList><argument><name>NewTotalBytesSent</name><direction>out</direction><relatedStateVariable>TotalBytesSent</relatedStateVariable></argument></argumentList></action><action><name>GetTotalBytesReceived</name><argumentList><argument><name>NewTotalBytesReceived</name><direction>out</direction><relatedStateVariable>TotalBytesReceived</relatedStateVariable></argument></argumentList></action><action><name>GetTotalPacketsSent</name><argumentList><argument><name>NewTotalPacketsSent</name><direction>out</direction><relatedStateVariable>TotalPacketsSent</relatedStateVariable></argument></argumentList></action><action><name>GetTotalPacketsReceived</name><argumentList><argument><name>NewTotalPacketsReceived</name><direction>out</direction><relatedStateVariable>TotalPacketsReceived</relatedStateVariable></argument></argumentList></action></actionList><serviceStateTable><stateVariable sendEvents="no"><name>WANAccessType</name><dataType>string</dataType><allowedValueList><allowedValue>DSL</allowedValue><allowedValue>POTS</allowedValue><allowedValue>Cable</allowedValue><allowedValue>Ethernet</allowedValue></allowedValueList></stateVariable><stateVariable sendEvents="no"><name>Layer1UpstreamMaxBitRate</name><dataType>ui4</dataType></stateVariable><stateVariable sendEvents="no"><name>Layer1DownstreamMaxBitRate</name><dataType>ui4</dataType></stateVariable><stateVariable sendEvents="yes"><name>PhysicalLinkStatus</name><dataType>string</dataType><allowedValueList><allowedValue>Up</allowedValue><allowedValue>Down</allowedValue><allowedValue>Initializing</allowedValue><allowedValue>Unavailable</allowedValue></allowedValueList></stateVariable><stateVariable sendEvents="no"><name>TotalBytesSent</name><dataType>ui4</dataType></stateVariable><stateVariable sendEvents="no"><name>TotalBytesReceived</name><dataType>ui4</dataType></stateVariable><stateVariable sendEvents="no"><name>TotalPacketsSent</name><dataType>ui4</dataType></stateVariable><stateVariable sendEvents="no"><name>TotalPacketsReceived</name><dataType>ui4</dataType></stateVariable></serviceStateTable></scpd>',
'/WanEth.xml': 'HTTP/1.1 200 OK\r\nContent-Type: text/xml; charset="utf-8"\r\nConnection: close\r\nContent-Length: 711\r\nServer: R7500v2 UPnP/1.0 miniupnpd/1.0\r\nExt: \r\nContent-Language: en-US\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0"><specVersion><major>1</major><minor>0</minor></specVersion><actionList><action><name>GetEthernetLinkStatus</name><argumentList><argument><name>NewEthernetLinkStatus</name><direction>out</direction><relatedStateVariable>EthernetLinkStatus</relatedStateVariable></argument></argumentList></action></actionList><serviceStateTable><stateVariable sendEvents="yes"><name>EthernetLinkStatus</name><dataType>string</dataType><allowedValueList><allowedValue>Up</allowedValue><allowedValue>Down</allowedValue><allowedValue>Initializing</allowedValue><allowedValue>Unavailable</allowedValue></allowedValueList></stateVariable></serviceStateTable></scpd>',
'/WANIPCn.xml': 'HTTP/1.1 200 OK\r\nContent-Type: text/xml; charset="utf-8"\r\nConnection: close\r\nContent-Length: 8400\r\nServer: R7500v2 UPnP/1.0 miniupnpd/1.0\r\nExt: \r\nContent-Language: en-US\r\n\r\n<?xml version="1.0"?>\n<scpd xmlns="urn:schemas-upnp-org:service-1-0"><specVersion><major>1</major><minor>0</minor></specVersion><actionList><action><name>AddPortMapping</name><argumentList><argument><name>NewRemoteHost</name><direction>in</direction><relatedStateVariable>RemoteHost</relatedStateVariable></argument><argument><name>NewExternalPort</name><direction>in</direction><relatedStateVariable>ExternalPort</relatedStateVariable></argument><argument><name>NewProtocol</name><direction>in</direction><relatedStateVariable>PortMappingProtocol</relatedStateVariable></argument><argument><name>NewInternalPort</name><direction>in</direction><relatedStateVariable>InternalPort</relatedStateVariable></argument><argument><name>NewInternalClient</name><direction>in</direction><relatedStateVariable>InternalClient</relatedStateVariable></argument><argument><name>NewEnabled</name><direction>in</direction><relatedStateVariable>PortMappingEnabled</relatedStateVariable></argument><argument><name>NewPortMappingDescription</name><direction>in</direction><relatedStateVariable>PortMappingDescription</relatedStateVariable></argument><argument><name>NewLeaseDuration</name><direction>in</direction><relatedStateVariable>PortMappingLeaseDuration</relatedStateVariable></argument></argumentList></action><action><name>GetExternalIPAddress</name><argumentList><argument><name>NewExternalIPAddress</name><direction>out</direction><relatedStateVariable>ExternalIPAddress</relatedStateVariable></argument></argumentList></action><action><name>DeletePortMapping</name><argumentList><argument><name>NewRemoteHost</name><direction>in</direction><relatedStateVariable>RemoteHost</relatedStateVariable></argument><argument><name>NewExternalPort</name><direction>in</direction><relatedStateVariable>ExternalPort</relatedStateVariable></argument><argument><name>NewProtocol</name><direction>in</direction><relatedStateVariable>PortMappingProtocol</relatedStateVariable></argument></argumentList></action><action><name>SetConnectionType</name><argumentList><argument><name>NewConnectionType</name><direction>in</direction><relatedStateVariable>ConnectionType</relatedStateVariable></argument></argumentList></action><action><name>GetConnectionTypeInfo</name><argumentList><argument><name>NewConnectionType</name><direction>out</direction><relatedStateVariable>ConnectionType</relatedStateVariable></argument><argument><name>NewPossibleConnectionTypes</name><direction>out</direction><relatedStateVariable>PossibleConnectionTypes</relatedStateVariable></argument></argumentList></action><action><name>RequestConnection</name></action><action><name>ForceTermination</name></action><action><name>GetStatusInfo</name><argumentList><argument><name>NewConnectionStatus</name><direction>out</direction><relatedStateVariable>ConnectionStatus</relatedStateVariable></argument><argument><name>NewLastConnectionError</name><direction>out</direction><relatedStateVariable>LastConnectionError</relatedStateVariable></argument><argument><name>NewUptime</name><direction>out</direction><relatedStateVariable>Uptime</relatedStateVariable></argument></argumentList></action><action><name>GetNATRSIPStatus</name><argumentList><argument><name>NewRSIPAvailable</name><direction>out</direction><relatedStateVariable>RSIPAvailable</relatedStateVariable></argument><argument><name>NewNATEn
abled</name><direction>out</direction><relatedStateVariable>NATEnabled</relatedStateVariable></argument></argumentList></action><action><name>GetGenericPortMappingEntry</name><argumentList><argument><name>NewPortMappingIndex</name><direction>in</direction><relatedStateVariable>PortMappingNumberOfEntries</relatedStateVariable></argument><argument><name>NewRemoteHost</name><direction>out</direction><relatedStateVariable>RemoteHost</relatedStateVariable></argument><argument><name>NewExternalPort</name><direction>out</direction><relatedStateVariable>ExternalPort</relatedStateVariable></argument><argument><name>NewProtocol</name><direction>out</direction><relatedStateVariable>PortMappingProtocol</relatedStateVariable></argument><argument><name>NewInternalPort</name><direction>out</direction><relatedStateVariable>InternalPort</relatedStateVariable></argument><argument><name>NewInternalClient</name><direction>out</direction><relatedStateVariable>InternalClient</relatedStateVariable></argument><argument><name>NewEnabled</name><direction>out</direction><relatedStateVariable>PortMappingEnabled</relatedStateVariable></argument><argument><name>NewPortMappingDescription</name><direction>out</direction><relatedStateVariable>PortMappingDescription</relatedStateVariable></argument><argument><name>NewLeaseDuration</name><direction>out</direction><relatedStateVariable>PortMappingLeaseDuration</relatedStateVariable></argument></argumentList></action><action><name>GetSpecificPortMappingEntry</name><argumentList><argument><name>NewRemoteHost</name><direction>in</direction><relatedStateVariable>RemoteHost</relatedStateVariable></argument><argument><name>NewExternalPort</name><direction>in</direction><relatedStateVariable>ExternalPort</relatedStateVariable></argument><argument><name>NewProtocol</name><direction>in</direction><relatedStateVariable>PortMappingProtocol</relatedStateVariable></argument><argument><name>NewInternalPort</name><direction>out</direction><relatedStateVariable>InternalPort</relatedStateVariable></argument><argument><name>NewInternalClient</name><direction>out</direction><relatedStateVariable>InternalClient</relatedStateVariable></argument><argument><name>NewEnabled</name><direction>out</direction><relatedStateVariable>PortMappingEnabled</relatedStateVariable></argument><argument><name>NewPortMappingDescription</name><direction>out</direction><relatedStateVariable>PortMappingDescription</relatedStateVariable></argument><argument><name>NewLeaseDuration</name><direction>out</direction><relatedStateVariable>PortMappingLeaseDuration</relatedStateVariable></argument></argumentList></action></actionList><serviceStateTable><stateVariable sendEvents="no"><name>ConnectionType</name><dataType>string</dataType></stateVariable><stateVariable sendEvents="yes"><name>PossibleConnectionTypes</name><dataType>string</dataType><allowedValueList><allowedValue>Unconfigured</allowedValue><allowedValue>IP_Routed</allowedValue><allowedValue>IP_Bridged</allowedValue></allowedValueList></stateVariable><stateVariable sendEvents="yes"><name>ConnectionStatus</name><dataType>string</dataType><allowedValueList><allowedValue>Unconfigured</allowedValue><allowedValue>Connecting</allowedValue><allowedValue>Connected</allowedValue><allowedValue>PendingDisconnect</allowedValue><allowedValue>Disconnecting</allowedValue><allowedValue>Disconnected</allowedValue></allowedValueList></stateVariable><stateVariable sendEvents="no"><name>Uptime</name><dataType>ui4</dataType></stateVariable><stateVariable 
sendEvents="no"><name>LastConnectionError</name><dataType>string</dataType><allowedValueList><allowedValue>ERROR_NONE</allowedValue></allowedValueList></stateVariable><stateVariable sendEvents="no"><name>RSIPAvailable</name><dataType>boolean</dataType></stateVariable><stateVariable sendEvents="no"><name>NATEnabled</name><dataType>boolean</dataType></stateVariable><stateVariable sendEvents="yes"><name>ExternalIPAddress</name><dataType>string</dataType></stateVariable><stateVariable sendEvents="yes"><name>PortMappingNumberOfEntries</name><dataType>ui2</dataType></stateVariable><stateVariable sendEvents="no"><name>PortMappingEnabled</name><dataType>boolean</dataType></stateVariable><stateVariable sendEvents="no"><name>PortMappingLeaseDuration</name><dataType>ui4</dataType></stateVariable><stateVariable sendEvents="no"><name>RemoteHost</name><dataType>string</dataType></stateVariable><stateVariable sendEvents="no"><name>ExternalPort</name><dataType>ui2</dataType></stateVariable><stateVariable sendEvents="no"><name>InternalPort</name><dataType>ui2</dataType></stateVariable><stateVariable sendEvents="no"><name>PortMappingProtocol</name><dataType>string</dataType><allowedValueList><allowedValue>TCP</allowedValue><allowedValue>UDP</allowedValue></allowedValueList></stateVariable><stateVariable sendEvents="no"><name>InternalClient</name><dataType>string</dataType></stateVariable><stateVariable sendEvents="no"><name>PortMappingDescription</name><dataType>string</dataType></stateVariable></serviceStateTable></scpd>'},
'services': {'/Layer3F.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:Layer3Forwarding:1'),
('serviceId', 'urn:upnp-org:serviceId:L3Forwarding1'), ('controlURL', '/ctl/L3Forwarding'),
('eventSubURL', '/evt/L3Forwarding'), ('SCPDURL', '/Layer3F.xml')]),
'/WANCfg.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:WANCommonInterfaceConfig:1'),
('serviceId', 'urn:upnp-org:serviceId:WANCommonIFC1'),
('controlURL', '/ctl/CommonIfCfg'), ('eventSubURL', '/evt/CommonIfCfg'),
('SCPDURL', '/WANCfg.xml')]), '/WanEth.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:WANEthernetLinkConfig:1'),
('serviceId', 'urn:upnp-org:serviceId:WANEthLinkC1'), ('controlURL', '/ctl/WanEth'),
('eventSubURL', '/evt/WanEth'), ('SCPDURL', '/WanEth.xml')]), '/WANIPCn.xml': OrderedDict(
[('serviceType', 'urn:schemas-upnp-org:service:WANIPConnection:1'),
('serviceId', 'urn:upnp-org:serviceId:WANIPConn1'), ('controlURL', '/ctl/IPConn'),
('eventSubURL', '/evt/IPConn'), ('SCPDURL', '/WANIPCn.xml')])},
'reply': OrderedDict(
[('CACHE_CONTROL', 'max-age=1800'), ('ST', 'upnp:rootdevice'),
('USN', 'uuid:11111111-2222-3333-4444-555555555555::upnp:rootdevice'),
('Server', 'R7500v2 UPnP/1.0 miniupnpd/1.0'), ('Location', 'http://192.168.0.1:5555/rootDesc.xml')]),
'soap_port': 5555,
'registered_soap_commands': {'AddPortMapping': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'GetExternalIPAddress': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'DeletePortMapping': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'GetGenericPortMappingEntry': 'urn:schemas-upnp-org:service:WANIPConnection:1',
'GetSpecificPortMappingEntry': 'urn:schemas-upnp-org:service:WANIPConnection:1'},
'unsupported_soap_commands': {
'urn:schemas-upnp-org:service:Layer3Forwarding:1': ['SetDefaultConnectionService',
'GetDefaultConnectionService'],
'urn:schemas-upnp-org:service:WANCommonInterfaceConfig:1': ['GetCommonLinkProperties',
'GetTotalBytesSent',
'GetTotalBytesReceived',
'GetTotalPacketsSent',
'GetTotalPacketsReceived'],
'urn:schemas-upnp-org:service:WANEthernetLinkConfig:1': ['GetEthernetLinkStatus'],
'urn:schemas-upnp-org:service:WANIPConnection:1': ['SetConnectionType', 'GetConnectionTypeInfo',
'RequestConnection', 'ForceTermination',
'GetStatusInfo', 'GetNATRSIPStatus']},
'soap_requests': []}
| 226.464883 | 14,439 | 0.684891 | 9,124 | 67,713 | 5.063349 | 0.048772 | 0.080047 | 0.076432 | 0.055327 | 0.800078 | 0.757912 | 0.728625 | 0.689727 | 0.657864 | 0.603143 | 0 | 0.019972 | 0.118559 | 67,713 | 298 | 14,440 | 227.224832 | 0.754059 | 0 | 0 | 0.308772 | 0 | 0.052632 | 0.814201 | 0.603429 | 0 | 0 | 0 | 0 | 0.024561 | 1 | 0.010526 | false | 0.003509 | 0.017544 | 0.003509 | 0.05614 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6b0e9385e798386e9c4b809d508224e22ab38361 | 162 | py | Python | benchmark/plotting/__init__.py | choderalab/integrator-benchmark | bb307e6ebf476b652e62e41ae49730f530732da3 | [
"MIT"
] | 5 | 2017-02-22T09:08:21.000Z | 2021-09-08T21:21:35.000Z | benchmark/plotting/__init__.py | choderalab/integrator-benchmark | bb307e6ebf476b652e62e41ae49730f530732da3 | [
"MIT"
] | 36 | 2017-04-15T21:34:25.000Z | 2018-07-22T13:56:40.000Z | benchmark/plotting/__init__.py | choderalab/integrator-benchmark | bb307e6ebf476b652e62e41ae49730f530732da3 | [
"MIT"
] | 2 | 2019-12-06T05:43:10.000Z | 2021-04-01T01:00:24.000Z | from .plotting_utilities import savefig, plot, generate_figure_filename, plot_scheme_comparison
__all__ = ["generate_figure_filename", "plot_scheme_comparison"]
| 40.5 | 95 | 0.851852 | 19 | 162 | 6.578947 | 0.631579 | 0.224 | 0.352 | 0.416 | 0.672 | 0.672 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 162 | 3 | 96 | 54 | 0.833333 | 0 | 0 | 0 | 1 | 0 | 0.283951 | 0.283951 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
6b2a59faa5b01c68f563d5b981f4d8e19eaf0be2 | 5,828 | py | Python | serial_scripts/k8s_scripts/test_deployment.py | lmadhusudhanan/contrail-test | bd39ff19da06a20bd79af8c25e3cde07375577cf | [
"Apache-2.0"
] | null | null | null | serial_scripts/k8s_scripts/test_deployment.py | lmadhusudhanan/contrail-test | bd39ff19da06a20bd79af8c25e3cde07375577cf | [
"Apache-2.0"
] | 1 | 2021-06-01T22:19:48.000Z | 2021-06-01T22:19:48.000Z | serial_scripts/k8s_scripts/test_deployment.py | lmadhusudhanan/contrail-test | bd39ff19da06a20bd79af8c25e3cde07375577cf | [
"Apache-2.0"
] | null | null | null | import time
import test
from common.k8s.base import BaseK8sTest
from tcutils.wrappers import preposttest_wrapper
from tcutils.contrail_status_check import ContrailStatusChecker
class TestDeployment(BaseK8sTest):
@classmethod
def setUpClass(cls):
super(TestDeployment, cls).setUpClass()
@classmethod
def tearDownClass(cls):
super(TestDeployment, cls).tearDownClass()
@test.attr(type=['k8s_sanity'])
@preposttest_wrapper
def test_deployment_with_kube_manager_restart(self):
        ''' Create a deployment object with 3 pod replicas and verify the http service works across the pod replicas
            Verify that deleting the deployment object cleans up all the pods it had created
            Restart kube manager on all the control nodes and verify that redeploying the deployment object with new pod replicas takes effect
            Re-verify that the deployment passes and the pods work as expected through the http service with the new set of replicas
        '''
client_pod = self.setup_busybox_pod()
namespace = 'default'
labels = {'deployment': 'test'}
dep = self.setup_nginx_deployment(name='dep-test',
replicas=3,
pod_labels=labels)
assert dep.verify_on_setup()
service = self.setup_http_service(namespace=namespace,
labels=labels)
server_pods = dep.get_pods_list()
s_pod_fixtures = []
for x in server_pods:
s_pod_fixture = self.setup_nginx_pod(name=x.metadata.name)
self.verify_nginx_pod(s_pod_fixture)
s_pod_fixtures.append(s_pod_fixture)
assert self.validate_nginx_lb(s_pod_fixtures, service.cluster_ip,
test_pod=client_pod)
self.restart_kube_manager()
self.sleep(5)
assert self.validate_nginx_lb(s_pod_fixtures, service.cluster_ip,
test_pod=client_pod)
self.perform_cleanup(dep)
self.sleep(1)
        '''After the restart of the Kube Manager, recreate the deployment object
        with additional pod replicas'''
dep = self.setup_nginx_deployment(name='dep-test',
replicas=5,
pod_labels=labels)
assert dep.verify_on_setup()
service = self.setup_http_service(namespace=namespace,
labels=labels)
server_pods = dep.get_pods_list()
s_pod_fixtures = []
for x in server_pods:
s_pod_fixture = self.setup_nginx_pod(name=x.metadata.name)
self.verify_nginx_pod(s_pod_fixture)
s_pod_fixtures.append(s_pod_fixture)
assert self.validate_nginx_lb(s_pod_fixtures, service.cluster_ip,
test_pod=client_pod)
@test.attr(type=['k8s_sanity'])
@preposttest_wrapper
def test_deployment_with_agent_restart(self):
        ''' Create a deployment object with 3 pod replicas and verify the http service works across the pod replicas
            Verify that deleting the deployment object cleans up all the pods it had created
            Restart vrouter agent on all the nodes and verify that redeploying the deployment object with new pod replicas takes effect
            Re-verify that the deployment passes and the pods work as expected through the http service with the new set of replicas
        '''
client_pod = self.setup_busybox_pod()
namespace = 'default'
labels = {'deployment': 'test'}
dep = self.setup_nginx_deployment(name='dep-test',
replicas=3,
pod_labels=labels)
assert dep.verify_on_setup()
service = self.setup_http_service(namespace=namespace,
labels=labels)
server_pods = dep.get_pods_list()
s_pod_fixtures = []
for x in server_pods:
s_pod_fixture = self.setup_nginx_pod(name=x.metadata.name)
self.verify_nginx_pod(s_pod_fixture)
s_pod_fixtures.append(s_pod_fixture)
assert self.validate_nginx_lb(s_pod_fixtures, service.cluster_ip,
test_pod=client_pod)
for compute_ip in self.inputs.compute_ips:
self.inputs.restart_service('contrail-vrouter-agent',[compute_ip],
container='agent')
cluster_status, error_nodes = ContrailStatusChecker().wait_till_contrail_cluster_stable()
assert cluster_status, 'Cluster is not stable after restart'
self.sleep(5)
assert self.validate_nginx_lb(s_pod_fixtures, service.cluster_ip,
test_pod=client_pod)
self.perform_cleanup(dep)
self.sleep(1)
        '''After the restart of the vrouter agent, recreate the deployment object
        with additional pod replicas'''
dep = self.setup_nginx_deployment(name='dep-test',
replicas=5,
pod_labels=labels)
assert dep.verify_on_setup()
service = self.setup_http_service(namespace=namespace,
labels=labels)
server_pods = dep.get_pods_list()
s_pod_fixtures = []
for x in server_pods:
s_pod_fixture = self.setup_nginx_pod(name=x.metadata.name)
self.verify_nginx_pod(s_pod_fixture)
s_pod_fixtures.append(s_pod_fixture)
assert self.validate_nginx_lb(s_pod_fixtures, service.cluster_ip,
test_pod=client_pod)
| 48.165289 | 138 | 0.607241 | 676 | 5,828 | 4.970414 | 0.173077 | 0.030952 | 0.05 | 0.041071 | 0.81131 | 0.81131 | 0.81131 | 0.81131 | 0.81131 | 0.81131 | 0 | 0.003835 | 0.328929 | 5,828 | 120 | 139 | 48.566667 | 0.85528 | 0.141386 | 0 | 0.8125 | 0 | 0 | 0.033333 | 0.004701 | 0 | 0 | 0 | 0 | 0.114583 | 1 | 0.041667 | false | 0 | 0.052083 | 0 | 0.104167 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
86174c8df6e51a1b4dff79569859a4fac81901fe | 153,937 | py | Python | src/segmentpy/tf114/model.py | ZeliangSu/LRCS-Xlearn | 50ff9c64f36c0d80417aa44aac2db68f392130f0 | [
"Apache-2.0"
] | 4 | 2021-06-08T07:53:55.000Z | 2022-02-16T15:10:15.000Z | src/segmentpy/tf114/model.py | ZeliangSu/LRCS-Xlearn | 50ff9c64f36c0d80417aa44aac2db68f392130f0 | [
"Apache-2.0"
] | 7 | 2021-06-01T21:19:47.000Z | 2022-02-25T07:36:58.000Z | src/segmentpy/tf114/model.py | ZeliangSu/LRCS-Xlearn | 50ff9c64f36c0d80417aa44aac2db68f392130f0 | [
"Apache-2.0"
] | 1 | 2021-11-13T16:44:32.000Z | 2021-11-13T16:44:32.000Z | import tensorflow as tf
from segmentpy.tf114.layers import *
from segmentpy.tf114.util import print_nodes_name_shape
def classification_nodes(pipeline,
placeholders=None,
model_name='LRCS',
patch_size=512,
batch_size=200,
conv_size=9,
nb_conv=80,
activation='relu',
batch_norm=True,
loss_option='cross_entropy',
is_training=False,
grad_view=False,
nb_classes=3
):
# check entries
assert isinstance(placeholders, list), 'placeholders should be a list.'
# get placeholder
drop_prob, lr, BN_phase = placeholders
# build model
logits, list_params = model_dict[model_name](pipeline=pipeline,
patch_size=patch_size,
batch_size=batch_size,
conv_size=conv_size,
nb_conv=nb_conv,
drop_prob=drop_prob,
activation=activation,
if_BN=batch_norm,
BN_phase=BN_phase,
reuse=not is_training,
mode='classification',
nb_classes=nb_classes, # todo: automatize here
)
# logits shape [B, H, W, nb_class]
with tf.name_scope('Loss'):
if loss_option == 'DSC':
softmax = customized_softmax(logits)
loss = DSC(pipeline['label'], softmax, name='loss_fn')
elif loss_option == 'cross_entropy':
softmax = customized_softmax(logits)
loss = Cross_Entropy(pipeline['label'], softmax, name='CE')
else:
raise NotImplementedError('Cannot find the loss option')
# gradients
if is_training:
with tf.name_scope('operation'):
# optimizer/train operation
opt = optimizer(lr, name='optimizeR')
# program gradients
grads = opt.compute_gradients(loss)
# train operation
train_op = opt.apply_gradients(grads, name='train_op')
with tf.name_scope('train_metrics'):
m_loss, loss_up_op, m_acc, acc_up_op, lss, acc = metrics(softmax, #[B, W, H, 3]
pipeline['label'], #[B, W, H, 3]
loss,
is_training)
with tf.name_scope('summary'):
tmp = []
for layer_param in list_params:
for k, v in layer_param.items():
tmp.append(tf.summary.histogram(k, v))
if len(tmp) > 0:
m_param = tf.summary.merge(tmp)
merged = tf.summary.merge([m_param, m_loss, m_acc])
else:
merged = tf.summary.merge([m_loss, m_acc])
if grad_view:
grad_sum = tf.summary.merge([tf.summary.histogram('{}/grad'.format(g[1].name), g[0]) for g in grads])
merged = tf.summary.merge([merged, grad_sum])
else:
with tf.name_scope('operation'):
train_op = tf.no_op(name='no_op')
with tf.name_scope('test_metrics'):
m_loss, loss_up_op, m_acc, acc_up_op, lss, acc = metrics(softmax, pipeline['label'], loss, is_training)
with tf.name_scope('summary'):
tmp = []
for layer_param in list_params:
for k, v in layer_param.items():
tmp.append(tf.summary.histogram(k, v))
if len(tmp) > 0:
m_param = tf.summary.merge(tmp)
merged = tf.summary.merge([m_param, m_loss, m_acc])
else:
merged = tf.summary.merge([m_loss, m_acc])
return {
'y_pred': logits,
'train_op': train_op,
'learning_rate': lr,
'summary': merged,
'drop': drop_prob,
'BN_phase': BN_phase,
'loss_update_op': loss_up_op,
'acc_update_op': acc_up_op,
'val_lss': lss,
'val_acc': acc,
}
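# --- Hedged usage sketch (not part of the original training script) -----------
# A minimal illustration of how the node dictionary returned above could be
# consumed in a TF1.x session loop. The dict keys come from the return value
# above; the feed values (learning rate 1e-4, dropout 0.5) are assumptions and
# not taken from the original pipeline.
def _example_train_step(sess, nodes, lr_value=1e-4, drop_value=0.5):
    """Run one train_op step (training graph only) and return the merged summary."""
    _, summary = sess.run(
        [nodes['train_op'], nodes['summary']],
        feed_dict={
            nodes['learning_rate']: lr_value,
            nodes['drop']: drop_value,
            nodes['BN_phase']: True,  # training phase for the Batch Norm layers
        })
    return summary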
def model_LRCS(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
"""
lite version (less GPU occupancy) of xlearn segmentation convolutional neural net model with summary. histograms are
saved in
input:
-------
train_inputs: (tf.iterator?)
test_inputs: (tf.iterator?)
patch_size: (int) height and width (here we assume the same length for both)
batch_size: (int) number of images per batch (average the gradient within a batch,
the weights and bias upgrade after one batch)
conv_size: (int) size of the convolution matrix e.g. 5x5, 7x7, ...
nb_conv: (int) number of convolution per layer e.g. 32, 64, ...
learning_rate: (float) learning rate for the optimizer
return:
-------
(dictionary) dictionary of nodes in the conv net
'y_pred': output of the neural net,
'train_op': node of the trainning operation, once called, it will update weights and bias,
'drop': dropout layers' probability parameters,
'summary': compared to the original model, only summary of loss, accuracy and histograms of gradients are invovled,
which lighten GPU resource occupancy,
'train_or_test': switch button for a training/testing input pipeline,
'loss_update_op': node of updating loss function summary,
'acc_update_op': node of updating accuracy summary
"""
#note: Batch Norm automatically applied, can be tuned manually
with tf.name_scope('LRCS'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1bis, _ = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bis, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2bis, _ = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2bis, name='maxp2')
conv3, _ = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bis', reuse=reuse)
conv4bisbis, m4bb = conv2d_layer(conv4bis, shape=[conv_size, conv_size, nb_conv * 8, 1],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bisbis', reuse=reuse)
with tf.name_scope('dnn'):
conv4_flat = reshape(conv4bisbis, [-1, patch_size ** 2 // 64], name='flatten')
full_layer_1, mf1 = normal_full_layer(conv4_flat, patch_size ** 2 // 128, activation=activation, # OOM: //128 --> //512
if_BN=if_BN, is_train=BN_phase, name='dnn1', reuse=reuse)
full_dropout1 = dropout(full_layer_1, drop_prob, name='dropout1')
full_layer_2, mf2 = normal_full_layer(full_dropout1, patch_size ** 2 // 128, activation=activation, # OOM: //128 --> //512
if_BN=if_BN, is_train=BN_phase, name='dnn2', reuse=reuse)
full_dropout2 = dropout(full_layer_2, drop_prob, name='dropout2')
full_layer_3, mf3 = normal_full_layer(full_dropout2, patch_size ** 2 // 64, activation=activation, # OOM: //64 --> //512
if_BN=if_BN, is_train=BN_phase, name='dnn3', reuse=reuse)
            full_dropout3 = dropout(full_layer_3, drop_prob, name='dropout3')
dnn_reshape = reshape(full_dropout3, [-1, patch_size // 8, patch_size // 8, 1], name='reshape')
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(dnn_reshape, [conv_size, conv_size, 1, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse) # [height, width, in_channels, output_channels]
deconv_5bis, _ = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
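# --- Hedged sketch, not the library implementation ----------------------------
# max_pool_2by2_with_arg() and up_2by2_ind() are imported from
# segmentpy.tf114.layers and are not shown in this file. The helper below only
# illustrates the SegNet-style idea they are assumed to follow: keep the argmax
# indices returned by tf.nn.max_pool_with_argmax and scatter the activations
# back to those positions when upsampling. It assumes the indices are flattened
# over the whole output tensor and that out_shape is a static Python list,
# e.g. [batch, height, width, channels].
def _example_unpool_2by2(pooled, argmax, out_shape):
    """Scatter `pooled` values back to the positions recorded in `argmax`."""
    indices = tf.reshape(tf.cast(argmax, tf.int32), [-1, 1])
    updates = tf.reshape(pooled, [-1])
    flat_size = out_shape[0] * out_shape[1] * out_shape[2] * out_shape[3]
    unpooled_flat = tf.scatter_nd(indices, updates, [flat_size])
    return tf.reshape(unpooled_flat, out_shape)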
def model_LRCS_improved(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS2'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1bis, _ = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bis, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2bis, _ = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2bis, name='maxp2')
conv3, _ = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bis', reuse=reuse)
conv4bisbis, m4bb = conv2d_layer(conv4bis, shape=[conv_size, conv_size, nb_conv * 8, nb_classes],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bisbis', reuse=reuse)
with tf.name_scope('dnn'):
conv4_flat = reshape(conv4bisbis, [-1, patch_size ** 2 // 64 * nb_classes], name='flatten')
full_layer_1, mf1 = normal_full_layer(conv4_flat, patch_size ** 2 // 1024 * nb_classes, activation=activation,
if_BN=if_BN, is_train=BN_phase, name='dnn1', reuse=reuse)
full_dropout1 = dropout(full_layer_1, drop_prob, name='dropout1')
            # adding a second layer can reduce NxN --> 2xMxN
full_layer_2, mf2 = normal_full_layer(full_dropout1, nb_classes, activation=activation, # note: shoudnt do nb_classes * batch
if_BN=if_BN, is_train=BN_phase, name='dnn2', reuse=reuse)
full_dropout2 = dropout(full_layer_2, drop_prob, name='dropout2')
full_layer_3, mf3 = normal_full_layer(full_dropout2, patch_size ** 2 // 64 * nb_classes,
activation=activation,
if_BN=if_BN, is_train=BN_phase, name='dnn3',
reuse=reuse)
full_dropout3 = dropout(full_layer_3, drop_prob, name='dropout3')
dnn_reshape = reshape(full_dropout3, [-1, patch_size // 8, patch_size // 8, nb_classes], name='reshape')
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(dnn_reshape, [conv_size, conv_size, nb_classes, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse) # [height, width, in_channels, output_channels]
deconv_5bis, _ = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_constant(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
#note: Batch Norm automatically applied, can be tuned manually
with tf.name_scope('LRCS3'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1bis, _ = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bis, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2bis, _ = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2bis, name='maxp2')
conv3, _ = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bis', reuse=reuse)
conv4bisbis, m4bb = conv2d_layer(conv4bis, shape=[conv_size, conv_size, nb_conv * 8, nb_classes],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bisbis', reuse=reuse)
with tf.name_scope('dnn'):
dnn_reshape = constant_layer(conv4bisbis, constant=1.0)
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(dnn_reshape, [conv_size, conv_size, nb_classes, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse) # [height, width, in_channels, output_channels]
deconv_5bis, m5b = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, [m3b, m4bb, m5, m5b, m8bb]
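# --- Hedged sketch, assumption about constant_layer() -------------------------
# constant_layer() comes from segmentpy.tf114.layers; its real implementation is
# not shown in this file. For the ablation above it only needs to replace the
# encoder output with a constant tensor of the same shape, e.g.:
def _example_constant_layer(x, constant=1.0, name='constant'):
    """Return a tensor shaped like `x` filled with `constant`."""
    with tf.name_scope(name):
        return tf.ones_like(x) * constant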
def model_LRCS_shallow(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS4'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[1, 1, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4bisbis, m4bb = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_classes],
is_train=BN_phase, activation=activation,if_BN=False,
name='conv4bisbis', reuse=reuse)
with tf.name_scope('dnn'):
dnn_reshape = constant_layer(conv4bisbis, constant=1.0, name='constant')
with tf.name_scope('decoder'):
deconv_5bis, m5b = conv2d_layer(dnn_reshape, [conv_size, conv_size, nb_classes, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_simple(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS5'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[1, 1, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse, if_BN=False) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse, if_BN=False)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
with tf.name_scope('connexion'):
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
with tf.name_scope('decoder'):
up1 = up_2by2_ind(conv3, ind2, name='up2')
concat1 = concat([up1, conv1_pooling])
deconv_7, m7 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 2 + nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up2 = up_2by2_ind(deconv_7bis, ind1, name='up3')
# concat2 = concat([up2, pipeline['img']])
deconv_8, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, [m1, m3, m7, m8bb]
def model_LRCS_purConv(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS6'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1b, m1b = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1b', reuse=reuse)
conv1bb, m1bb = conv2d_layer(conv1b, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bb', reuse=reuse)
logits, m1bbb = conv2d_layer(conv1bb, shape=[conv_size, conv_size, nb_conv * 2, nb_classes], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bbb', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_LeCun(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS7'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1b, m1b = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1bis', reuse=reuse)
conv1bb, m1bb = conv2d_layer(conv1b, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1bisbis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bb, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2b, m2b = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2b, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3b, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3b, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation=activation, if_BN=False,
name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation='sigmoid', if_BN=False,
name='conv4bis', reuse=reuse)
with tf.name_scope('decoder'):
deconv5, m5 = conv2d_layer(conv4bis, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation=activation, if_BN=if_BN,
name='conv5', reuse=reuse)
deconv_5bis, m5b = conv2d_layer(deconv5, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_Weka(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS8'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 10, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=True,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=True,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=True,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4bisbis, m4bb = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_classes],
is_train=BN_phase, activation='sigmoid', if_BN=True,
name='conv4bisbis', reuse=reuse)
with tf.name_scope('decoder'):
deconv_5bis, m5b = conv2d_layer(conv4bisbis, [conv_size, conv_size, nb_classes, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_weka_constant(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS9'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 10, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4bisbis, m4bb = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_classes],
is_train=BN_phase, activation=activation,if_BN=False,
name='conv4bisbis', reuse=reuse)
with tf.name_scope('dnn'):
dnn_reshape = constant_layer(conv4bisbis, constant=1.0, name='constant')
with tf.name_scope('decoder'):
deconv_5bis, m5b = conv2d_layer(dnn_reshape, [conv_size, conv_size, nb_classes, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 6], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 6, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_lecun_thinner_weka_encoder(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS10'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 10, 20],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, 20, 40], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, 40, 80], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4bisbis, m4bb = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, 80, 10],
is_train=BN_phase, activation='sigmoid', if_BN=False,
name='conv4bisbis', reuse=reuse)
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(conv4bisbis, [conv_size, conv_size, 10, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse)
deconv_5bis, m5b = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 4, 80],
if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis',
reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, 80, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, 40], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, 40, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, 20], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, 20, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_lecun_thinner_encoder(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS11'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation='sigmoid', if_BN=False,
name='conv4', reuse=reuse)
# note: wider connexion for the bottom layers
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(conv4, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse)
deconv_5bis, m5b = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis',
reuse=reuse)
up1 = up_2by2_ind(deconv_5bis, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_mix_skipconnect(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS12'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation='sigmoid', if_BN=False,
name='conv4', reuse=reuse)
# note: wider connexion for the bottom layers
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(conv4, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse)
deconv_5bis, m5b = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis',
reuse=reuse)
concat1 = concat([up_2by2_ind(deconv_5bis, ind3, name='up1'), conv3], name='concat1')
deconv_6, _ = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
concat2 = concat([up_2by2_ind(deconv_6bis, ind2, name='up2'), conv2], name='concat2')
deconv_7, _ = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
concat3 = concat([up_2by2_ind(deconv_7bis, ind1, name='up3'), conv1], name='concat3')
deconv_8, _ = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_dropout_on_conv(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS13'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
drop1 = dropout(conv1, drop_prob, name='do1')
conv1_pooling, ind1 = max_pool_2by2_with_arg(drop1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
drop2 = dropout(conv2, drop_prob, name='do2')
conv2_pooling, ind2 = max_pool_2by2_with_arg(drop2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
drop3 = dropout(conv3, drop_prob, name='do3')
conv3_pooling, ind3 = max_pool_2by2_with_arg(drop3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation='sigmoid', if_BN=False,
name='conv4', reuse=reuse)
drop4 = dropout(conv4, drop_prob, name='do4')
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(drop4, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse)
deconv_5bis, m5b = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis',
reuse=reuse)
drop5 = dropout(deconv_5bis, drop_prob, name='do5')
up1 = up_2by2_ind(drop5, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
drop6 = dropout(deconv_6bis, drop_prob, name='do6')
up2 = up_2by2_ind(drop6, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
drop7 = dropout(deconv_7bis, drop_prob, name='do7')
up3 = up_2by2_ind(drop7, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False,is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def model_LRCS_full_FCLs(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
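    # NOTE: placeholder only -- the encoder/decoder bodies below are not
    # implemented yet and the function implicitly returns None, so calling it
    # through classification_nodes() would fail when unpacking the return value.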
with tf.name_scope('LRCS14'):
with tf.name_scope('encoder'):
pass
with tf.name_scope('decoder'):
pass
def model_LRCS_deeper_with_dropout_on_conv(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('LRCS15'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
drop1 = dropout(conv1, drop_prob, name='do1')
conv1b, m1 = conv2d_layer(drop1, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
# [height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation, if_BN=False,
name='conv1b', reuse=reuse) # [height, width, in_channels, output_channels]
drop1b = dropout(conv1b, drop_prob, name='do1b')
conv1_pooling, ind1 = max_pool_2by2_with_arg(drop1b, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
drop2 = dropout(conv2, drop_prob, name='do2')
conv2b, m2 = conv2d_layer(drop2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=False,
is_train=BN_phase, activation=activation, name='conv2b', reuse=reuse)
drop2b = dropout(conv2b, drop_prob, name='do2b')
conv2_pooling, ind2 = max_pool_2by2_with_arg(drop2b, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
drop3 = dropout(conv3, drop_prob, name='do3')
conv3b, m3 = conv2d_layer(drop3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=False,
is_train=BN_phase, activation=activation, name='conv3b', reuse=reuse)
drop3b = dropout(conv3b, drop_prob, name='do3b')
conv3_pooling, ind3 = max_pool_2by2_with_arg(drop3b, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation=activation, if_BN=False,
name='conv4', reuse=reuse)
drop4 = dropout(conv4, drop_prob, name='do4')
conv4b, m4 = conv2d_layer(drop4, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation='sigmoid', if_BN=False,
name='conv4b', reuse=reuse)
drop4 = dropout(conv4b, drop_prob, name='do4b')
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_layer(drop4, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse)
drop5 = dropout(deconv_5, drop_prob, name='do5')
deconv_5bis, m5b = conv2d_layer(drop5, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis',
reuse=reuse)
drop5b = dropout(deconv_5bis, drop_prob, name='do5b')
up1 = up_2by2_ind(drop5b, ind3, name='up1')
deconv_6, _ = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
drop6 = dropout(deconv_6, drop_prob, name='do6')
deconv_6bis, _ = conv2d_layer(drop6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
drop6b = dropout(deconv_6bis, drop_prob, name='do6b')
up2 = up_2by2_ind(drop6b, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
drop7 = dropout(deconv_7, drop_prob, name='do7')
deconv_7bis, _ = conv2d_layer(drop7, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
drop7b = dropout(deconv_7bis, drop_prob, name='do7b')
up3 = up_2by2_ind(drop7b, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
drop8 = dropout(deconv_8, drop_prob, name='do8')
deconv_8bis, _ = conv2d_layer(drop8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
drop8b = dropout(deconv_8bis, drop_prob, name='do8b')
logits, m8bb = conv2d_layer(drop8b,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
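# SegNet-style encoder-decoder (description inferred from the layer names): paired
# convolution blocks, max-pooling that also returns the argmax indices
# (max_pool_2by2_with_arg), and a decoder that unpools with those stored indices via
# up_2by2_ind before further convolutions. conv2d_layer, max_pool_2by2_with_arg and
# up_2by2_ind are assumed to be helpers defined earlier in this module.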
def model_Segnet_like(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Segnet'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1bis, _ = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bis, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2bis, _ = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2bis, name='maxp2')
conv3, _ = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bis', reuse=reuse)
conv4bisbis, m4bb = conv2d_layer(conv4bis, shape=[conv_size, conv_size, nb_conv * 8, 1],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bisbis', reuse=reuse)
conv4_pooling, ind4 = max_pool_2by2_with_arg(conv4bisbis, name='maxp4')
with tf.name_scope('decoder'):
up0 = up_2by2_ind(conv4_pooling, ind4, name='up0')
deconv_5, m5 = conv2d_layer(up0, [conv_size, conv_size, 1, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5', reuse=reuse) # [height, width, in_channels, output_channels]
deconv_5bis, _ = conv2d_layer(deconv_5, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bis', reuse=reuse)
deconv_5bisbis, _ = conv2d_layer(deconv_5bis, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv5bisbis', reuse=reuse)
up1 = up_2by2_ind(deconv_5bisbis, ind3, name='up1')
deconv_6, m6 = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6', reuse=reuse)
deconv_6bis, _ = conv2d_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv6bis', reuse=reuse)
up2 = up_2by2_ind(deconv_6bis, ind2, name='up2')
deconv_7, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7', reuse=reuse)
deconv_7bis, _ = conv2d_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv7bis', reuse=reuse)
up3 = up_2by2_ind(deconv_7bis, ind1, name='up3')
deconv_8, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8', reuse=reuse)
deconv_8bis, _ = conv2d_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='deconv8bis', reuse=reuse)
logits, m8bb = conv2d_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
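# Variant of the SegNet-like model above with an explicit 'connexion' bottleneck
# (up to nb_conv * 16 channels) between encoder and decoder, instead of compressing
# the deepest feature map to a single channel as conv4bisbis does in model_Segnet_like.
# The helpers are assumed to behave as in the previous models.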
def model_Segnet_improved(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Segnet2'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1bis, _ = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bis, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2b, _ = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2b, name='maxp2')
conv3, _ = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3b, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3b, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4b, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bis', reuse=reuse)
conv4_pooling, ind4 = max_pool_2by2_with_arg(conv4b, name='maxp4')
with tf.name_scope('connexion'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5', reuse=reuse)
conv5b, m5b = conv2d_layer(conv5, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5bis', reuse=reuse)
conv5bb, m5u = conv2d_layer(conv5b, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='bot5bb', reuse=reuse)
with tf.name_scope('decoder'):
up0 = up_2by2_ind(conv5bb, ind4, name='up0')
conv6, m6 = conv2d_layer(up0, [conv_size, conv_size, nb_conv * 8, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv6', reuse=reuse) # [height, width, in_channels, output_channels]
conv6b, _ = conv2d_layer(conv6, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv6bis', reuse=reuse)
up1 = up_2by2_ind(conv6b, ind3, name='up1')
conv7, m7 = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv7', reuse=reuse)
conv7b, _ = conv2d_layer(conv7, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv7bis', reuse=reuse)
up2 = up_2by2_ind(conv7b, ind2, name='up2')
conv8, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv8', reuse=reuse)
conv8b, _ = conv2d_layer(conv8, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv8bis', reuse=reuse)
up3 = up_2by2_ind(conv8b, ind1, name='up3')
conv9, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv9', reuse=reuse)
conv9b, _ = conv2d_layer(conv9, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(conv9b,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
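# Same topology as the improved SegNet variant, but the bottleneck is replaced by
# constant_layer(..., constant=1.0) - presumably an ablation that blanks the deepest
# feature map so that only the pooling indices carry information to the decoder.
# constant_layer is assumed to be defined elsewhere in this module.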
def model_Segnet_constant(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Segnet3'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1bis, _ = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1bis', reuse=reuse)
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1bis, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2b, _ = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2b, name='maxp2')
conv3, _ = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3b, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3b, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4b, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4bis', reuse=reuse)
conv4_pooling, ind4 = max_pool_2by2_with_arg(conv4b, name='maxp4')
with tf.name_scope('connexion'):
connex = constant_layer(conv4_pooling, constant=1.0, name='constant')
with tf.name_scope('decoder'):
up0 = up_2by2_ind(connex, ind4, name='up0')
conv6, m6 = conv2d_layer(up0, [conv_size, conv_size, nb_conv * 8, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv6', reuse=reuse) # [height, width, in_channels, output_channels]
conv6b, _ = conv2d_layer(conv6, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv6bis', reuse=reuse)
up1 = up_2by2_ind(conv6b, ind3, name='up1')
conv7, m7 = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv7', reuse=reuse)
conv7b, _ = conv2d_layer(conv7, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv7bis', reuse=reuse)
up2 = up_2by2_ind(conv7b, ind2, name='up2')
conv8, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv8', reuse=reuse)
conv8b, _ = conv2d_layer(conv8, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv8bis', reuse=reuse)
up3 = up_2by2_ind(conv8b, ind1, name='up3')
conv9, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv9', reuse=reuse)
conv9b, _ = conv2d_layer(conv9, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(conv9b,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
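# Shallower SegNet variant: a single convolution per resolution level in the encoder,
# with the deepest encoder block reduced to nb_classes channels before the
# index-based unpooling decoder.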
def model_Segnet_shallow(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Segnet4'):
with tf.name_scope('encoder'):
conv1, _ = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse) # [height, width, in_channels, output_channels]
conv1_pooling, ind1 = max_pool_2by2_with_arg(conv1, name='maxp1')
conv2, _ = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling, ind2 = max_pool_2by2_with_arg(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv3', reuse=reuse)
conv3_pooling, ind3 = max_pool_2by2_with_arg(conv3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_classes],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv4', reuse=reuse)
conv4_pooling, ind4 = max_pool_2by2_with_arg(conv4, name='maxp4')
with tf.name_scope('decoder'):
up0 = up_2by2_ind(conv4_pooling, ind4, name='up0')
conv6, m6 = conv2d_layer(up0, [conv_size, conv_size, nb_classes, nb_conv * 8], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv6', reuse=reuse) # [height, width, in_channels, output_channels]
conv6b, _ = conv2d_layer(conv6, [conv_size, conv_size, nb_conv * 8, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv6bis', reuse=reuse)
up1 = up_2by2_ind(conv6b, ind3, name='up1')
conv7, m7 = conv2d_layer(up1, [conv_size, conv_size, nb_conv * 4, nb_conv * 4], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv7', reuse=reuse)
conv7b, _ = conv2d_layer(conv7, [conv_size, conv_size, nb_conv * 4, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv7bis', reuse=reuse)
up2 = up_2by2_ind(conv7b, ind2, name='up2')
conv8, _ = conv2d_layer(up2, [conv_size, conv_size, nb_conv * 2, nb_conv * 2], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv8', reuse=reuse)
conv8b, _ = conv2d_layer(conv8, [conv_size, conv_size, nb_conv * 2, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv8bis', reuse=reuse)
up3 = up_2by2_ind(conv8b, ind1, name='up3')
conv9, _ = conv2d_layer(up3, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv9', reuse=reuse)
conv9b, _ = conv2d_layer(conv9, [conv_size, conv_size, nb_conv, nb_conv], if_BN=if_BN,
is_train=BN_phase, activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(conv9b,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
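# Classic U-Net layout: a contracting path (paired convolutions + max-pooling), a
# bottleneck, and an expanding path built from strided conv2d_transpose_layer calls,
# with skip connections made by concatenating encoder feature maps (concat) before
# each decoder block.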
def model_Unet(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Unet'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=if_BN, is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse)
conv1bis, m1b = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv1bis', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1bis, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2bis, m2b = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2bis, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase, activation=activation,
name='conv4bisbis', reuse=reuse)
conv4_pooling = max_pool_2by2(conv4bis, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5', reuse=reuse)
conv5bis, m5b = conv2d_layer(conv5, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5bis', reuse=reuse)
deconv1, m5u = conv2d_transpose_layer(conv5bis, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 8],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv1', reuse=reuse)
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4bis], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6', reuse=reuse)
conv_6bis, m6b = conv2d_layer(conv_6, [conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6bis', reuse=reuse)
deconv2, m6u = conv2d_transpose_layer(conv_6bis, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 4],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation,
name='deconv2', reuse=reuse)
concat2 = concat([deconv2, conv3bis], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7', reuse=reuse)
conv_7bis, m7b = conv2d_layer(conv_7, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7bis', reuse=reuse)
deconv3, m7u = conv2d_transpose_layer(conv_7bis, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv3', reuse=reuse)
concat3 = concat([deconv3, conv2bis], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8', reuse=reuse)
conv_8bis, m8b = conv2d_layer(conv_8, [conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8bis', reuse=reuse)
deconv4, m8u = conv2d_transpose_layer(conv_8bis, shape=[conv_size, conv_size, nb_conv * 2, nb_conv],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv4', reuse=reuse)
concat4 = concat([deconv4, conv1bis], name='concat4')
deconv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9', reuse=reuse)
deconv_9bis, m9b = conv2d_layer(deconv_9, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(deconv_9bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
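# Thinner U-Net variant: only one convolution per level in the contracting path and
# batch normalisation disabled there (if_BN=False); the bottleneck and expanding path
# keep the same structure as model_Unet.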
def model_Unet_shallow(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Unet2'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=False, is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=False, is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=False, is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=False, is_train=BN_phase, activation=activation, name='conv4', reuse=reuse)
conv4_pooling = max_pool_2by2(conv4, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5', reuse=reuse)
conv5bis, m5b = conv2d_layer(conv5, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5bis', reuse=reuse)
deconv1, m5u = conv2d_transpose_layer(conv5bis, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 8],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv1', reuse=reuse)
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6', reuse=reuse)
conv_6bis, m6b = conv2d_layer(conv_6, [conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6bis', reuse=reuse)
deconv2, m6u = conv2d_transpose_layer(conv_6bis, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 4],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation,
name='deconv2', reuse=reuse)
concat2 = concat([deconv2, conv3], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7', reuse=reuse)
conv_7bis, m7b = conv2d_layer(conv_7, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7bis', reuse=reuse)
deconv3, m7u = conv2d_transpose_layer(conv_7bis, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv3', reuse=reuse)
concat3 = concat([deconv3, conv2], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8', reuse=reuse)
conv_8bis, m8b = conv2d_layer(conv_8, [conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8bis', reuse=reuse)
deconv4, m8u = conv2d_transpose_layer(conv_8bis, shape=[conv_size, conv_size, nb_conv * 2, nb_conv],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv4', reuse=reuse)
concat4 = concat([deconv4, conv1], name='concat4')
deconv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9', reuse=reuse)
deconv_9bis, m9b = conv2d_layer(deconv_9, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(deconv_9bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
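# U-Net variant that appears to ingest a precomputed 10-channel feature stack
# (Weka-style features), hence in_channels=10 on conv1. The decoder upsamples with
# up_2by2 except for the last stage, which uses a transposed convolution with an
# explicit output shape (see the fixme about batch_size at inference time).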
def model_Unet_weka(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Unet3'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 10, nb_conv], #[height, width, in_channels, output_channels]
if_BN=if_BN, is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv4', reuse=reuse)
conv4_pooling = max_pool_2by2(conv4, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5', reuse=reuse)
deconv1, m5u = up_2by2(conv5, name='up1')
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6', reuse=reuse) #[height, width, in_channels, output_channels]
deconv2, m6u = up_2by2(conv_6, name='up2')
concat2 = concat([deconv2, conv3], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7', reuse=reuse)
deconv3, m7u = up_2by2(conv_7, name='up3')
concat3 = concat([deconv3, conv2], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8', reuse=reuse)
deconv4, m8u = conv2d_transpose_layer(conv_8, [conv_size, conv_size, nb_conv * 2, nb_conv],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size, patch_size, nb_conv],
if_BN=if_BN, is_train=BN_phase,
stride=2, activation=activation,
name='deconv4', reuse=reuse)
concat4 = concat([deconv4, conv1], name='concat4')
deconv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9', reuse=reuse)
logits, m9bb = conv2d_layer(deconv_9,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
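# U-Net variant whose decoder relies entirely on up_2by2 upsampling instead of
# transposed convolutions. The registry at the bottom of this file keeps its 'Unet4'
# entry commented out ("upsampling2d not working"), so this model is likely experimental.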
def model_Unet_upsample(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Unet4'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=if_BN, is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='conv4', reuse=reuse)
conv4_pooling = max_pool_2by2(conv4, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5', reuse=reuse)
conv5b, m5b = conv2d_layer(conv5, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5bis', reuse=reuse)
deconv1, m5u = up_2by2(conv5b, name='up1')
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6', reuse=reuse)
conv_6b, m6b = conv2d_layer(conv_6, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6bis', reuse=reuse) #[height, width, in_channels, output_channels]
deconv2, m6u = up_2by2(conv_6b, name='up2')
concat2 = concat([deconv2, conv3], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7', reuse=reuse)
conv_7b, m7b = conv2d_layer(conv_7, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7bis', reuse=reuse)
deconv3, m7u = up_2by2(conv_7b, name='up3')
concat3 = concat([deconv3, conv2], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8', reuse=reuse)
conv_8b, m8b = conv2d_layer(conv_8, [conv_size, conv_size, nb_conv * 2, nb_conv * 1],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8bis', reuse=reuse)
deconv4, m8u = up_2by2(conv_8b, name='up4')
concat4 = concat([deconv4, conv1], name='concat4')
conv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9', reuse=reuse)
conv_9b, m9b = conv2d_layer(conv_9, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(conv_9b,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
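# U-Net whose contracting path deliberately skips batch normalisation (if_BN=False on
# the encoder convolutions) while the bottleneck and decoder still honour the if_BN flag.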
def model_Unet_encoder_no_BN(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Unet5'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=False, activation=activation,
name='conv1', reuse=reuse)
conv1bis, m1b = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv],
if_BN=False, activation=activation, name='conv1bis', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1bis, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=False, activation=activation, name='conv2', reuse=reuse)
conv2bis, m2b = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=False, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2bis, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=False, activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=False, activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=False, activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=False, activation=activation,
name='conv4bisbis', reuse=reuse)
conv4_pooling = max_pool_2by2(conv4bis, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5', reuse=reuse)
conv5bis, m5b = conv2d_layer(conv5, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 16],
if_BN=if_BN, is_train=BN_phase, activation=activation, name='bot5bis', reuse=reuse)
deconv1, m5u = conv2d_transpose_layer(conv5bis, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 8],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv1', reuse=reuse)
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4bis], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6', reuse=reuse)
conv_6bis, m6b = conv2d_layer(conv_6, [conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv6bis', reuse=reuse)
deconv2, m6u = conv2d_transpose_layer(conv_6bis, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 4],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation,
name='deconv2', reuse=reuse)
concat2 = concat([deconv2, conv3bis], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7', reuse=reuse)
conv_7bis, m7b = conv2d_layer(conv_7, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv7bis', reuse=reuse)
deconv3, m7u = conv2d_transpose_layer(conv_7bis, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv3', reuse=reuse)
concat3 = concat([deconv3, conv2bis], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8', reuse=reuse)
conv_8bis, m8b = conv2d_layer(conv_8, [conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv8bis', reuse=reuse)
deconv4, m8u = conv2d_transpose_layer(conv_8bis, shape=[conv_size, conv_size, nb_conv * 2, nb_conv],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv4', reuse=reuse)
concat4 = concat([deconv4, conv1bis], name='concat4')
deconv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9', reuse=reuse)
deconv_9bis, m9b = conv2d_layer(deconv_9, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=if_BN, is_train=BN_phase,
activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(deconv_9bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
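# U-Net with batch normalisation disabled on every layer (if_BN=False throughout);
# the extra 'device' argument is accepted for interface compatibility but appears
# unused in the body.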
def model_Unet_without_BN(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
device=0,
):
with tf.name_scope('Unet6'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=False, activation=activation,
name='conv1', reuse=reuse)
conv1bis, m1b = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv],
if_BN=False, activation=activation, name='conv1bis', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1bis, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=False, activation=activation, name='conv2', reuse=reuse)
conv2bis, m2b = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=False, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2bis, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=False, activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=False, activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=False, activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=False, activation=activation,
name='conv4bisbis', reuse=reuse)
conv4_pooling = max_pool_2by2(conv4bis, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=False, activation=activation, name='bot5', reuse=reuse)
conv5bis, m5b = conv2d_layer(conv5, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 16],
if_BN=False, activation=activation, name='bot5bis', reuse=reuse)
deconv1, m5u = conv2d_transpose_layer(conv5bis, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 8],
stride=2, if_BN=False,
activation=activation, name='deconv1', reuse=reuse)
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4bis], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=False,
activation=activation, name='conv6', reuse=reuse)
conv_6bis, m6b = conv2d_layer(conv_6, [conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=False,
activation=activation, name='conv6bis', reuse=reuse)
deconv2, m6u = conv2d_transpose_layer(conv_6bis, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 4],
stride=2, if_BN=False,
activation=activation,
name='deconv2', reuse=reuse)
concat2 = concat([deconv2, conv3bis], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=False, activation=activation, name='conv7', reuse=reuse)
conv_7bis, m7b = conv2d_layer(conv_7, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=False, activation=activation, name='conv7bis', reuse=reuse)
deconv3, m7u = conv2d_transpose_layer(conv_7bis, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2],
stride=2, if_BN=False,
activation=activation, name='deconv3', reuse=reuse)
concat3 = concat([deconv3, conv2bis], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=False, activation=activation, name='conv8', reuse=reuse)
conv_8bis, m8b = conv2d_layer(conv_8, [conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=False, activation=activation, name='conv8bis', reuse=reuse)
deconv4, m8u = conv2d_transpose_layer(conv_8bis, shape=[conv_size, conv_size, nb_conv * 2, nb_conv],
stride=2, if_BN=False, activation=activation, name='deconv4', reuse=reuse)
concat4 = concat([deconv4, conv1bis], name='concat4')
deconv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=False, activation=activation, name='conv9', reuse=reuse)
deconv_9bis, m9b = conv2d_layer(deconv_9, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=False, activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(deconv_9bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
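# U-Net regularised with dropout after every convolution (drop_prob). Batch
# normalisation is off in the contracting path and on the transposed convolutions,
# while the remaining decoder convolutions follow the if_BN flag.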
def model_Unet_with_droupout(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
device=0,
):
with tf.name_scope('Unet7'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=False, activation=activation,
name='conv1', reuse=reuse)
conv1D = dropout(conv1, drop_prob, 'do1')
conv1bis, m1b = conv2d_layer(conv1D, shape=[conv_size, conv_size, nb_conv, nb_conv],
if_BN=False, activation=activation, name='conv1bis', reuse=reuse)
conv1bisD = dropout(conv1bis, drop_prob, 'do1b')
conv1_pooling = max_pool_2by2(conv1bisD, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=False, activation=activation, name='conv2', reuse=reuse)
conv2D = dropout(conv2, drop_prob, 'do2')
conv2bis, m2b = conv2d_layer(conv2D, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=False, activation=activation, name='conv2bis', reuse=reuse)
conv2bisD = dropout(conv2bis, drop_prob, 'do2b')
conv2_pooling = max_pool_2by2(conv2bisD, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=False, activation=activation, name='conv3', reuse=reuse)
conv3D = dropout(conv3, drop_prob, 'do3')
conv3bis, m3b = conv2d_layer(conv3D, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=False, activation=activation, name='conv3bis', reuse=reuse)
conv3bisD = dropout(conv3bis, drop_prob, 'do3b')
conv3_pooling = max_pool_2by2(conv3bisD, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=False, activation=activation, name='conv4', reuse=reuse)
conv4D = dropout(conv4, drop_prob, 'do4')
conv4bis, m4b = conv2d_layer(conv4D, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=False, activation=activation,
name='conv4bisbis', reuse=reuse)
conv4bisD = dropout(conv4bis, drop_prob, 'do4b')
conv4_pooling = max_pool_2by2(conv4bisD, name='maxp4')
with tf.name_scope('bottom'):
conv5, m5 = conv2d_layer(conv4_pooling, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 16],
if_BN=if_BN, activation=activation, name='bot5', reuse=reuse)
conv5D = dropout(conv5, drop_prob, 'do5')
conv5bis, m5b = conv2d_layer(conv5D, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 16],
if_BN=if_BN, activation=activation, name='bot5bis', reuse=reuse)
conv5bisD = dropout(conv5bis, drop_prob, 'do5b')
deconv1, m5u = conv2d_transpose_layer(conv5bisD, shape=[conv_size, conv_size, nb_conv * 16, nb_conv * 8],
stride=2, if_BN=False,
activation=activation, name='deconv1', reuse=reuse)
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv4bis], name='concat1')
conv_6, m6 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 16, nb_conv * 8],
if_BN=if_BN,
activation=activation, name='conv6', reuse=reuse)
conv6D = dropout(conv_6, drop_prob, 'do6')
conv_6bis, m6b = conv2d_layer(conv6D, [conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=if_BN,
activation=activation, name='conv6bis', reuse=reuse)
conv6bisD = dropout(conv_6bis, drop_prob, 'do6b')
deconv2, m6u = conv2d_transpose_layer(conv6bisD, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 4],
stride=2, if_BN=False,
activation=activation,
name='deconv2', reuse=reuse)
concat2 = concat([deconv2, conv3bis], name='concat2')
conv_7, m7 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=if_BN, activation=activation, name='conv7', reuse=reuse)
conv7D = dropout(conv_7, drop_prob, 'do7')
conv_7bis, m7b = conv2d_layer(conv7D, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=if_BN, activation=activation, name='conv7bis', reuse=reuse)
conv7bisD = dropout(conv_7bis, drop_prob, 'do7b')
deconv3, m7u = conv2d_transpose_layer(conv7bisD, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2],
stride=2, if_BN=False, is_train=BN_phase,
activation=activation, name='deconv3', reuse=reuse)
concat3 = concat([deconv3, conv2bis], name='concat3')
conv_8, m8 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=if_BN, activation=activation, name='conv8', reuse=reuse)
conv8D = dropout(conv_8, drop_prob, 'do8')
conv_8bis, m8b = conv2d_layer(conv8D, [conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=if_BN, activation=activation, name='conv8bis', reuse=reuse)
conv8bisD = dropout(conv_8bis, drop_prob, 'do8b')
deconv4, m8u = conv2d_transpose_layer(conv8bisD, shape=[conv_size, conv_size, nb_conv * 2, nb_conv],
stride=2, if_BN=False, activation=activation, name='deconv4', reuse=reuse)
concat4 = concat([deconv4, conv1bis], name='concat4')
conv_9, m9 = conv2d_layer(concat4, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=if_BN, activation=activation, name='conv9', reuse=reuse)
conv9D = dropout(conv_9, drop_prob, 'do9')
conv_9bis, m9b = conv2d_layer(conv9D, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=if_BN, activation=activation, name='conv9bis', reuse=reuse)
logits, m9bb = conv2d_layer(conv_9bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
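# Shallower three-level version of the dropout U-Net above: dropout after each
# convolution, and batch normalisation disabled everywhere except the deconv2
# transposed convolution, which follows the if_BN flag.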
def model_Unet_with_droupout_shallow(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
device=0,
):
with tf.name_scope('Unet8'):
with tf.name_scope('contractor'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
if_BN=False, activation=activation,
name='conv1', reuse=reuse)
conv1D = dropout(conv1, drop_prob, 'do1')
conv1bis, m1b = conv2d_layer(conv1D, shape=[conv_size, conv_size, nb_conv, nb_conv],
if_BN=False, activation=activation, name='conv1bis', reuse=reuse)
conv1bisD = dropout(conv1bis, drop_prob, 'do1b')
conv1_pooling = max_pool_2by2(conv1bisD, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
if_BN=False, activation=activation, name='conv2', reuse=reuse)
conv2D = dropout(conv2, drop_prob, 'do2')
conv2bis, m2b = conv2d_layer(conv2D, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=False, activation=activation, name='conv2bis', reuse=reuse)
conv2bisD = dropout(conv2bis, drop_prob, 'do2b')
conv2_pooling = max_pool_2by2(conv2bisD, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
if_BN=False, activation=activation, name='conv3', reuse=reuse)
conv3D = dropout(conv3, drop_prob, 'do3')
conv3bis, m3b = conv2d_layer(conv3D, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=False, activation=activation, name='conv3bis', reuse=reuse)
conv3bisD = dropout(conv3bis, drop_prob, 'do3b')
conv3_pooling = max_pool_2by2(conv3bisD, name='maxp3')
with tf.name_scope('bottom'):
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
if_BN=False, activation=activation, name='bot4', reuse=reuse)
conv4D = dropout(conv4, drop_prob, 'do4')
conv4bis, m4b = conv2d_layer(conv4D, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
if_BN=False, activation=activation, name='bot4bis', reuse=reuse)
conv4bisD = dropout(conv4bis, drop_prob, 'do4b')
deconv1, m4u = conv2d_transpose_layer(conv4bisD, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 4],
stride=2, if_BN=False,
activation=activation, name='deconv1', reuse=reuse)
with tf.name_scope('decontractor'):
concat1 = concat([deconv1, conv3bis], name='concat1')
conv_5, m5 = conv2d_layer(concat1, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
if_BN=False, activation=activation, name='conv5', reuse=reuse)
conv5D = dropout(conv_5, drop_prob, 'do5')
conv_5bis, m5b = conv2d_layer(conv5D, [conv_size, conv_size, nb_conv * 4, nb_conv * 4],
if_BN=False, activation=activation, name='conv5bis', reuse=reuse)
conv5bisD = dropout(conv_5bis, drop_prob, 'do5b')
deconv2, m5u = conv2d_transpose_layer(conv5bisD, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 2],
stride=2, if_BN=if_BN, is_train=BN_phase,
activation=activation, name='deconv2', reuse=reuse)
concat2 = concat([deconv2, conv2bis], name='concat2')
conv_6, m6 = conv2d_layer(concat2, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
if_BN=False, activation=activation, name='conv6', reuse=reuse)
conv6D = dropout(conv_6, drop_prob, 'do6')
conv_6bis, m6b = conv2d_layer(conv6D, [conv_size, conv_size, nb_conv * 2, nb_conv * 2],
if_BN=False, activation=activation, name='conv6bis', reuse=reuse)
conv6bisD = dropout(conv_6bis, drop_prob, 'do6b')
deconv3, m6u = conv2d_transpose_layer(conv6bisD, shape=[conv_size, conv_size, nb_conv * 2, nb_conv],
stride=2, if_BN=False, activation=activation, name='deconv3', reuse=reuse)
concat3 = concat([deconv3, conv1bis], name='concat3')
conv_7, m7 = conv2d_layer(concat3, [conv_size, conv_size, nb_conv * 2, nb_conv],
if_BN=False, activation=activation, name='conv7', reuse=reuse)
conv7D = dropout(conv_7, drop_prob, 'do7')
conv_7bis, m7b = conv2d_layer(conv7D, [conv_size, conv_size, nb_conv, nb_conv],
if_BN=False, activation=activation, name='conv7bis', reuse=reuse)
conv7bisD = dropout(conv_7bis, drop_prob, 'do7b')
logits, m7bb = conv2d_layer(conv7bisD,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
if_BN=False, name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
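# Xlearn-like model: a convolutional encoder compressed to a single channel, a fully
# connected bottleneck (normal_full_layer) applied to the flattened patch with dropout,
# and a decoder of transposed convolutions with up_2by2 upsampling and concat skip
# connections. The explicit output shapes passed to conv2d_transpose_layer depend on
# batch_size (see the fixme notes inside the function).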
def model_xlearn_like(pipeline,
patch_size,
batch_size,
conv_size,
nb_conv,
drop_prob,
if_BN=True,
BN_phase=None,
activation='relu',
reuse=False,
mode='regression',
nb_classes=3,
):
with tf.name_scope('Xlearn'):
with tf.name_scope('encoder'):
conv1, m1 = conv2d_layer(pipeline['img'], shape=[conv_size, conv_size, 1, nb_conv], #[height, width, in_channels, output_channels]
is_train=BN_phase, activation=activation,
name='conv1', reuse=reuse)#[height, width, in_channels, output_channels]
conv1bis, m1b = conv2d_layer(conv1, shape=[conv_size, conv_size, nb_conv, nb_conv],
is_train=BN_phase,
activation=activation, name='conv1bis', reuse=reuse)
conv1_pooling = max_pool_2by2(conv1bis, name='maxp1')
conv2, m2 = conv2d_layer(conv1_pooling, shape=[conv_size, conv_size, nb_conv, nb_conv * 2],
is_train=BN_phase, activation=activation, name='conv2', reuse=reuse)
conv2bis, m2b = conv2d_layer(conv2, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 2],
is_train=BN_phase, activation=activation, name='conv2bis', reuse=reuse)
conv2_pooling = max_pool_2by2(conv2bis, name='maxp2')
conv3, m3 = conv2d_layer(conv2_pooling, shape=[conv_size, conv_size, nb_conv * 2, nb_conv * 4],
is_train=BN_phase, activation=activation, name='conv3', reuse=reuse)
conv3bis, m3b = conv2d_layer(conv3, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 4],
is_train=BN_phase, activation=activation, name='conv3bis', reuse=reuse)
conv3_pooling = max_pool_2by2(conv3bis, name='maxp3')
conv4, m4 = conv2d_layer(conv3_pooling, shape=[conv_size, conv_size, nb_conv * 4, nb_conv * 8],
is_train=BN_phase, activation=activation, name='conv4', reuse=reuse)
conv4bis, m4b = conv2d_layer(conv4, shape=[conv_size, conv_size, nb_conv * 8, nb_conv * 8],
is_train=BN_phase, activation=activation,
name='conv4bis', reuse=reuse)
conv4bisbis, m4bb = conv2d_layer(conv4bis, shape=[conv_size, conv_size, nb_conv * 8, 1],
is_train=BN_phase, activation=activation,
name='conv4bisbis', reuse=reuse)
with tf.name_scope('dnn'):
conv4_flat = reshape(conv4bisbis, [-1, patch_size ** 2 // 64], name='flatten')
full_layer_1, mf1 = normal_full_layer(conv4_flat, patch_size ** 2 // 128, activation=activation,
if_BN=if_BN, is_train=BN_phase, name='dnn1', reuse=reuse)
full_dropout1 = dropout(full_layer_1, drop_prob, name='dropout1')
full_layer_2, mf2 = normal_full_layer(full_dropout1, patch_size ** 2 // 128, activation=activation,
if_BN=if_BN, is_train=BN_phase, name='dnn2', reuse=reuse)
full_dropout2 = dropout(full_layer_2, drop_prob, name='dropout2')
full_layer_3, mf3 = normal_full_layer(full_dropout2, patch_size ** 2 // 64, activation=activation,
if_BN=if_BN, is_train=BN_phase, name='dnn3', reuse=reuse)
full_dropout3 = dropout(full_layer_3, drop_prob, name='dropout3')
dnn_reshape = reshape(full_dropout3, [-1, patch_size // 8, patch_size // 8, 1], name='reshape')
with tf.name_scope('decoder'):
deconv_5, m5 = conv2d_transpose_layer(dnn_reshape, [conv_size, conv_size, 1, nb_conv * 8],
[batch_size, patch_size // 8, patch_size // 8, nb_conv * 8],
if_BN=if_BN, is_train=BN_phase, name='deconv5',
activation=activation, reuse=reuse) #[height, width, in_channels, output_channels]
deconv_5bis, m5b = conv2d_transpose_layer(deconv_5, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size // 8, patch_size // 8, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase, name='deconv5bis',
activation=activation, reuse=reuse)
concat1 = concat([up_2by2(deconv_5bis, name='up1'), conv3bis], name='concat1')
deconv_6, m6 = conv2d_transpose_layer(concat1, [conv_size, conv_size, nb_conv * 8, nb_conv * 4],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size // 4, patch_size // 4, nb_conv * 4],
if_BN=if_BN, is_train=BN_phase, name='deconv6',
activation=activation, reuse=reuse)
deconv_6bis, m6b = conv2d_transpose_layer(deconv_6, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size // 4, patch_size // 4, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase, name='deconv6bis',
activation=activation, reuse=reuse)
concat2 = concat([up_2by2(deconv_6bis, name='up2'), conv2bis], name='concat2')
deconv_7, m7 = conv2d_transpose_layer(concat2, [conv_size, conv_size, nb_conv * 4, nb_conv * 2],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size // 2, patch_size // 2, nb_conv * 2],
if_BN=if_BN, is_train=BN_phase, name='deconv7',
activation=activation, reuse=reuse)
deconv_7bis, m7b = conv2d_transpose_layer(deconv_7, [conv_size, conv_size, nb_conv * 2, nb_conv],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size // 2, patch_size //2, nb_conv],
if_BN=if_BN, is_train=BN_phase, name='deconv7bis',
activation=activation, reuse=reuse)
concat3 = concat([up_2by2(deconv_7bis, name='up3'), conv1bis], name='concat3')
deconv_8, m8 = conv2d_transpose_layer(concat3, [conv_size, conv_size, nb_conv * 2, nb_conv],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size, patch_size, nb_conv],
if_BN=if_BN, is_train=BN_phase, name='deconv8',
activation=activation, reuse=reuse)
deconv_8bis, m8b = conv2d_transpose_layer(deconv_8, [conv_size, conv_size, nb_conv, nb_conv],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size, patch_size, nb_conv],
if_BN=if_BN, is_train=BN_phase, name='deconv8bis',
activation=activation, reuse=reuse)
logits, m8bb = conv2d_transpose_layer(deconv_8bis,
[conv_size, conv_size, nb_conv, 1 if mode == 'regression' else nb_classes],
# fixme: batch_size here might not be automatic while inference
[batch_size, patch_size, patch_size, 1 if mode == 'regression' else nb_classes],
if_BN=False, is_train=BN_phase,
name='logits', reuse=reuse)
print_nodes_name_shape(tf.get_default_graph())
return logits, []
def custom(pipeline,
           patch_size,
           batch_size,
           conv_size,
           nb_conv,
           drop_prob,
           BN_phase,
           activation='relu',
           reuse=False,
           mode='regression',
           nb_classes=3,
           ):
    pass
model_dict = {
    'LRCS': model_LRCS,
    'LRCS2': model_LRCS_improved,
    'LRCS3': model_LRCS_constant,
    'LRCS4': model_LRCS_shallow,
    'LRCS5': model_LRCS_simple,
    'LRCS6': model_LRCS_purConv,
    'LRCS7': model_LRCS_LeCun,
    'LRCS8': model_LRCS_Weka,
    'LRCS9': model_LRCS_weka_constant,
    'LRCS10': model_LRCS_lecun_thinner_weka_encoder,
    'LRCS11': model_LRCS_lecun_thinner_encoder,
    'LRCS12': model_LRCS_mix_skipconnect,
    'LRCS13': model_LRCS_dropout_on_conv,
    'LRCS14': model_LRCS_full_FCLs,
    'LRCS15': model_LRCS_deeper_with_dropout_on_conv,
    'Xlearn': model_xlearn_like,
    'Unet': model_Unet,
    'Unet2': model_Unet_shallow,
    'Unet3': model_Unet_weka,
    # 'Unet4': model_Unet_upsample,  # upsampling2d not working
    'Unet5': model_Unet_encoder_no_BN,
    'Unet6': model_Unet_without_BN,
    'Unet7': model_Unet_with_droupout,
    'Unet8': model_Unet_with_droupout_shallow,
    'Segnet': model_Segnet_like,
    'Segnet2': model_Segnet_improved,
    'Segnet3': model_Segnet_constant,
    'Segnet4': model_Segnet_shallow,
    'custom': custom,
} | 64.274322 | 158 | 0.538526 | 17,812 | 153,937 | 4.332416 | 0.026611 | 0.092369 | 0.071531 | 0.089155 | 0.931566 | 0.921134 | 0.914512 | 0.911195 | 0.904392 | 0.900945 | 0 | 0.04245 | 0.361706 | 153,937 | 2,395 | 159 | 64.274322 | 0.742929 | 0.030623 | 0 | 0.815309 | 0 | 0 | 0.041667 | 0 | 0 | 0 | 0 | 0.000835 | 0.000494 | 1 | 0.014815 | false | 0.001481 | 0.001481 | 0 | 0.030123 | 0.013827 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
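For reference, a minimal usage sketch of the registry above. It assumes every builder in model_dict shares the signature of custom() shown earlier; the model key and the hyperparameter values in the commented call are placeholders, not values taken from the original code.

def build_model(name, pipeline, **hparams):
    # Look up a builder by its registry key and delegate to it (illustrative only).
    try:
        builder = model_dict[name]
    except KeyError:
        raise ValueError("unknown model '%s'; choices: %s" % (name, sorted(model_dict)))
    return builder(pipeline, **hparams)

# Placeholder call, not from the original code:
# logits, ops = build_model('LRCS2', train_pipeline, patch_size=512, batch_size=8,
#                           conv_size=3, nb_conv=32, drop_prob=0.5, BN_phase=True,
#                           activation='relu', reuse=False, mode='regression', nb_classes=3)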
862fbe942c12ab246a8426fceb753bab3b6f8d81 | 3,641 | py | Python | tools/conan/conans/test/command/new_test.py | aversiveplusplus/aversiveplusplus | 5f5fe9faca50197fd6207e2c816efa7e9af6c804 | [
"BSD-3-Clause"
] | 29 | 2016-01-27T09:43:44.000Z | 2020-03-12T04:16:02.000Z | tools/conan/conans/test/command/new_test.py | aversiveplusplus/aversiveplusplus | 5f5fe9faca50197fd6207e2c816efa7e9af6c804 | [
"BSD-3-Clause"
] | 20 | 2016-01-22T15:59:33.000Z | 2016-10-28T10:22:45.000Z | tools/conan/conans/test/command/new_test.py | aversiveplusplus/aversiveplusplus | 5f5fe9faca50197fd6207e2c816efa7e9af6c804 | [
"BSD-3-Clause"
] | 6 | 2016-02-11T14:09:04.000Z | 2018-03-17T00:18:35.000Z | import unittest
from conans.test.tools import TestClient
import os
from conans.util.files import load


class NewTest(unittest.TestCase):

    def new_test(self):
        """ 'conan new' with -t generates a conanfile.py with the given name/version
        plus a test_package scaffold
        """
        client = TestClient()
        client.run('new MyPackage/1.3@myuser/testing -t')
        root = client.current_folder
        self.assertTrue(os.path.exists(os.path.join(root, "conanfile.py")))
        content = load(os.path.join(root, "conanfile.py"))
        self.assertIn('name = "MyPackage"', content)
        self.assertIn('version = "1.3"', content)
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/conanfile.py")))
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/CMakeLists.txt")))
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/example.cpp")))
        # assert they are correct at least
        client.run("export myuser/testing")
        client.run("info test_package")
        self.assertIn("MyPackage/1.3@myuser/testing", client.user_io.out)

    def new_dash_test(self):
        """ packages with dash
        """
        client = TestClient()
        client.run('new My-Package/1.3@myuser/testing -t')
        root = client.current_folder
        self.assertTrue(os.path.exists(os.path.join(root, "conanfile.py")))
        content = load(os.path.join(root, "conanfile.py"))
        self.assertIn('name = "My-Package"', content)
        self.assertIn('version = "1.3"', content)
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/conanfile.py")))
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/CMakeLists.txt")))
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/example.cpp")))
        # assert they are correct at least
        client.run("export myuser/testing")
        client.run("info test_package")
        self.assertIn("My-Package/1.3@myuser/testing", client.user_io.out)

    def new_header_test(self):
        """ 'conan new' with the -i flag still generates conanfile.py and the
        test_package scaffold
        """
        client = TestClient()
        client.run('new MyPackage/1.3@myuser/testing -t -i')
        root = client.current_folder
        self.assertTrue(os.path.exists(os.path.join(root, "conanfile.py")))
        content = load(os.path.join(root, "conanfile.py"))
        self.assertIn('name = "MyPackage"', content)
        self.assertIn('version = "1.3"', content)
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/conanfile.py")))
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/CMakeLists.txt")))
        self.assertTrue(os.path.exists(os.path.join(root, "test_package/example.cpp")))
        # assert they are correct at least
        client.run("export myuser/testing")
        client.run("info test_package")
        self.assertIn("MyPackage/1.3@myuser/testing", client.user_io.out)

    def new_without_test(self):
        """ 'conan new' without -t should not generate the test_package folder
        """
        client = TestClient()
        client.run('new MyPackage/1.3@myuser/testing')
        root = client.current_folder
        self.assertTrue(os.path.exists(os.path.join(root, "conanfile.py")))
        self.assertFalse(os.path.exists(os.path.join(root, "test_package/conanfile.py")))
        self.assertFalse(os.path.exists(os.path.join(root, "test_package/CMakeLists.txt")))
        self.assertFalse(os.path.exists(os.path.join(root, "test_package/example.cpp")))
| 47.907895 | 91 | 0.65504 | 497 | 3,641 | 4.740443 | 0.150905 | 0.089134 | 0.080645 | 0.112903 | 0.927419 | 0.915535 | 0.907895 | 0.907895 | 0.907895 | 0.907895 | 0 | 0.006875 | 0.201044 | 3,641 | 75 | 92 | 48.546667 | 0.803025 | 0.10986 | 0 | 0.672727 | 0 | 0 | 0.261777 | 0.15871 | 0 | 0 | 0 | 0 | 0.454545 | 1 | 0.072727 | false | 0 | 0.072727 | 0 | 0.163636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
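The four tests above repeat the same filesystem assertions; below is a sketch (illustrative only, not part of the original suite) of a helper they could share.

import os

def assert_generated_files(testcase, root, expect_test_package=True):
    # Mirrors the assertions repeated in NewTest above (illustrative helper only).
    testcase.assertTrue(os.path.exists(os.path.join(root, "conanfile.py")))
    for name in ("conanfile.py", "CMakeLists.txt", "example.cpp"):
        found = os.path.exists(os.path.join(root, "test_package", name))
        testcase.assertEqual(expect_test_package, found)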
865a91b1eb4d2632f571d60a681151f3190435a0 | 16,112 | py | Python | modules/security.py | barrukurniawan/mhc_dashboard | 1fb409fcc7e09934af4c898c7985309c58fe5655 | [
"MIT"
] | null | null | null | modules/security.py | barrukurniawan/mhc_dashboard | 1fb409fcc7e09934af4c898c7985309c58fe5655 | [
"MIT"
] | null | null | null | modules/security.py | barrukurniawan/mhc_dashboard | 1fb409fcc7e09934af4c898c7985309c58fe5655 | [
"MIT"
] | null | null | null | import urllib2
import urllib
import json
import traceback
import random
from core import config
from core import database
from modules import signature
from stdlib import idgen
from bson.objectid import ObjectId
class security:
wmsDB = database.get_db_conn( config.wms_userDB_core )
def __init__(self):
pass
def wms_register_id(self, params):
response = {
"message_code" : config.SUCCESS_REGISTER_WMS_ACCOUNT_CODE,
"message_action" : config.SUCCESS_REGISTER_WMS_ACCOUNT_ACN ,
"message_desc" : "",
"message_data" : {}
}
try:
merchant_api_key_rec = database.get_record("db_merchant_api_key")
merchant_api_key_rec["merchant_label"] = params["merchant_label"]
merchant_api_key_rec["merchant_id" ] = params["merchant_id" ]
merchant_api_key_rec["merchant_key" ] = params["merchant_key" ]
merchant_api_key_rec["pic_name" ] = params["pic_name" ]
merchant_api_key_rec["pic_phone" ] = params["pic_phone" ]
merchant_api_key_rec["company" ] = params["company" ]
self.wmsDB.db_merchant_api_key.insert( merchant_api_key_rec )
except Exception, e:
response["message_code" ] = config.FAILED_REGISTER_WMS_ACCOUNT_CODE
response["message_action"] = config.FAILED_REGISTER_WMS_ACCOUNT_ACN
# end try
return response
# end def
def wms_locked(self, params):
pass
# end def
def wms_void_trans(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
partner_trx_id = params["partner_trx_id"]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(partner_trx_id)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
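# Shared pattern: each wms_* verifier in this class rebuilds the one-way hash from
# (merchant_label, dlk_code, request-specific "sequance" string) via
# signature.signature()._create_onway_hash and returns "VERIFY_FAILED" when the
# caller-supplied token does not match, "VERIFY_SUCCESS" otherwise.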
def wms_check_balance(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_auth_login(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id" ]
password = params["password" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(password)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_merchant_buy(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
merch_wms_id = params["to" ]
user_wms_id = params["from" ]
amount = params["amount" ]
pin = params["pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(merch_wms_id) + str(user_wms_id) + str(amount) + str(pin)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_register_user(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
name = params["name" ]
phone = params["phone" ]
dob = params["dob" ]
pin = params["pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(name) + str(phone) + str(dob)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_register_merchant(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id"]
password = params["password" ]
pin = params["pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(password) + str(pin)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_show_trans(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
target_id = params["target_id"]
start_dt = params["start_dt" ]
end_dt = params["end_dt" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(target_id) + str(start_dt) + str(end_dt)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_transfer(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
user_from = params["from" ]
user_to = params["to" ]
amount = params["amount" ]
pin = params["pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(user_from) + str(user_to) + str(amount) + str(pin)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_token_gen(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id"]
pin = params["pin" ]
valid_tm = params["valid_tm" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(pin) + str(valid_tm)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_token_redeem(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id" ]
trans_token = params["trans_token"]
amount = params["amount" ]
to = params["to" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(trans_token) + str(amount) + str(to)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_token_status(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id" ]
trans_token = params["trans_token"]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(trans_token)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_cashin(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
user_to = params["to" ]
amount = params["amount" ]
pin = params["pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(user_to) + str(amount) + str(pin)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_merchant_cashin(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
user_from = params["from" ]
user_to = params["to" ]
amount = params["amount" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(user_from) + str(user_to) + str(amount)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_process_update_pin(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id"]
new_pin = params["new_pin" ]
old_pin = params["old_pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(new_pin) + str(old_pin)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_process_reset_pin(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id"]
new_pin = params["new_pin" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(new_pin)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_process_update_password(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id"]
new_password = params["new_password"]
old_password = params["old_password"]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(new_password) + str(old_password)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_process_reset_password(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id" ]
new_password = params["new_password"]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(new_password)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_edit_user(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label"]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id"]
name = params["name" ]
email = params["email" ]
address = params["address" ]
phone = params["phone" ]
ktp = params["ktp" ]
mother = params["mother" ]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(name) + str(email) + str(address) +\
str(phone) + str(ktp) + str(mother)
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
def wms_edit_merchant(self, params):
verify_status = "VERIFY_SUCCESS"
h2h_label = params["h2h_label" ]
dlk_code = params["dlk_code" ]
token = params["token" ]
wallet_id = params["wallet_id" ]
ktp = params["ktp" ]
otp_phone = params["otp_phone" ]
contact = params["contact" ]
email = params["email" ]
siup = params["siup" ]
npwp = params["npwp" ]
owner = params["owner" ]
bank_name = params["bank_name" ]
bank_accn_num = params["bank_accn_num" ]
bank_accn_owner = params["bank_accn_owner"]
has_key_value = signature.signature()._create_onway_hash({
"merchant_label" : h2h_label,
"dlk_code" : dlk_code,
"sequance" : str(wallet_id) + str(ktp) + str(otp_phone) + str(contact) +\
str(email) + str(siup) + str(npwp) + str( bank_name ) +\
str( bank_accn_num )
})
if token != has_key_value:
verify_status = "VERIFY_FAILED"
# end if
return verify_status
# end def
# end class
| 38.270784 | 92 | 0.526502 | 1,682 | 16,112 | 4.683115 | 0.06956 | 0.067538 | 0.086835 | 0.072363 | 0.804494 | 0.773645 | 0.773645 | 0.773645 | 0.773645 | 0.773645 | 0 | 0.005769 | 0.375993 | 16,112 | 420 | 93 | 38.361905 | 0.7777 | 0.019737 | 0 | 0.726496 | 0 | 0 | 0.136626 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.037037 | 0.02849 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
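A caller-side sketch of how one of the verifiers above is exercised. The merchant label, DLK code and wallet id below are placeholders, and it assumes the same signature helper used by the service is importable on the caller's side.

from modules import signature
from modules.security import security

params = {
    "h2h_label": "ACME_H2H",   # placeholder merchant label
    "dlk_code": "DLK001",      # placeholder
    "wallet_id": "W-123",      # placeholder
}
# Rebuild the token with the exact field order used by wms_check_balance above
# (the "sequance" key spelling follows the service code).
params["token"] = signature.signature()._create_onway_hash({
    "merchant_label": params["h2h_label"],
    "dlk_code": params["dlk_code"],
    "sequance": str(params["wallet_id"]),
})
status = security().wms_check_balance(params)  # "VERIFY_SUCCESS" when the token matches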
86b2f47cb721d13a7292c9220132bd27ed55ef18 | 4,706 | py | Python | test_scaling_sla_bfo_2.py | chasebk/flnn_code | a561d4c697d1aa545a677f9e7d126ace7bb40068 | [
"Apache-2.0"
] | 36 | 2019-07-28T02:26:28.000Z | 2022-03-29T03:00:56.000Z | test_scaling_sla_bfo_2.py | chasebk/flnn_code | a561d4c697d1aa545a677f9e7d126ace7bb40068 | [
"Apache-2.0"
] | 1 | 2021-09-14T13:21:54.000Z | 2021-09-14T13:21:54.000Z | test_scaling_sla_bfo_2.py | chasebk/flnn_code | a561d4c697d1aa545a677f9e7d126ace7bb40068 | [
"Apache-2.0"
] | 16 | 2020-02-28T06:55:42.000Z | 2022-03-31T01:58:51.000Z | from model.scaling.ProactiveSLAScaling import SLABasedOnVms as BrokerScaling
from utils.IOUtil import load_number_of_vms, save_scaling_results_to_csv
import numpy as np
model_names = {"fl_bfonn": "fl_bfonn"}
input_types = {"uni": "uni", "multi": "multi"}
models = [
{"name": model_names["fl_bfonn"],
"sliding": 3,
"input_type": input_types["uni"],
"cpu": "FL_BFONN-sliding_3-ex_func_3-act_func_0-pop_size_70-elim_disp_steps_2-repro_steps_5-chem_steps_80-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_4",
"ram": "FL_BFONN-sliding_3-ex_func_2-act_func_0-pop_size_100-elim_disp_steps_1-repro_steps_5-chem_steps_80-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_8"},
{"name": model_names["fl_bfonn"],
"sliding": 3,
"input_type": input_types["multi"],
"cpu": "FL_BFONN-sliding_3-ex_func_3-act_func_0-pop_size_100-elim_disp_steps_2-repro_steps_5-chem_steps_60-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_8",
"ram": "FL_BFONN-sliding_3-ex_func_3-act_func_0-pop_size_100-elim_disp_steps_1-repro_steps_3-chem_steps_80-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.1-p_eliminate_0.25-swim_length_8"},
{"name": model_names["fl_bfonn"],
"sliding": 4,
"input_type": input_types["uni"],
"cpu": "FL_BFONN-sliding_4-ex_func_3-act_func_0-pop_size_100-elim_disp_steps_2-repro_steps_3-chem_steps_60-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.1-p_eliminate_0.25-swim_length_8",
"ram": "FL_BFONN-sliding_4-ex_func_2-act_func_0-pop_size_100-elim_disp_steps_2-repro_steps_5-chem_steps_80-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.15-p_eliminate_0.25-swim_length_4"},
{"name": model_names["fl_bfonn"],
"sliding": 4,
"input_type": input_types["multi"],
"cpu": "FL_BFONN-sliding_4-ex_func_3-act_func_0-pop_size_100-elim_disp_steps_2-repro_steps_5-chem_steps_60-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_8",
"ram": "FL_BFONN-sliding_4-ex_func_3-act_func_0-pop_size_70-elim_disp_steps_1-repro_steps_5-chem_steps_60-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_4"},
{"name": model_names["fl_bfonn"],
"sliding": 5,
"input_type": input_types["uni"],
"cpu": "FL_BFONN-sliding_5-ex_func_3-act_func_0-pop_size_100-elim_disp_steps_1-repro_steps_3-chem_steps_80-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_4",
"ram": "FL_BFONN-sliding_5-ex_func_3-act_func_0-pop_size_50-elim_disp_steps_2-repro_steps_5-chem_steps_80-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.1-p_eliminate_0.25-swim_length_4"},
{"name": model_names["fl_bfonn"],
"sliding": 5,
"input_type": input_types["multi"],
"cpu": "FL_BFONN-sliding_5-ex_func_3-act_func_0-pop_size_100-elim_disp_steps_2-repro_steps_5-chem_steps_60-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_8",
"ram": "FL_BFONN-sliding_2-ex_func_3-act_func_0-pop_size_50-elim_disp_steps_2-repro_steps_3-chem_steps_60-d_attr_0.1_w_attr_0.2-h_rep_0.1_w_rep_10-step_size_0.05-p_eliminate_0.25-swim_length_4"},
]
s_coffs = [ 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0, 2.5 ]
L_adaps = [ 5 ]
resource_real_used = load_number_of_vms('vms_real_used_CPU_RAM.csv')
for model in models:
    if model["input_type"] == "multi":
        cpu_file = "results/" + model["name"] + "/multi_cpu/" + model["cpu"] + ".csv"
        ram_file = "results/" + model["name"] + "/multi_ram/" + model["ram"] + ".csv"
    else:
        cpu_file = "results/" + model["name"] + "/cpu/" + model["cpu"] + ".csv"
        ram_file = "results/" + model["name"] + "/ram/" + model["ram"] + ".csv"
    for s_coff in s_coffs:
        for L_adap in L_adaps:
            broker = BrokerScaling(scaling_coefficient=s_coff, adaptation_len=L_adap)
            vm_predicted, vm_actual, vm_allocated, sla = broker.get_predicted_and_allocated_vms(cpu_file, ram_file)
            vms_arr = np.concatenate((vm_predicted, vm_allocated, vm_actual), axis=1)
            filepathresults = "results/scaling3/sliding" + str(model["sliding"]) + "/" + model["input_type"] + "/" + model["name"] + "_vms-s_" + str(s_coff) + "-L_" + str(L_adap)
            filepathsla = "results/scaling3/sliding" + str(model["sliding"]) + "/" + model["input_type"] + "/" + model["name"] + "_SLA-s_" + str(s_coff) + "-L_" + str(L_adap)
            save_scaling_results_to_csv(vms_arr, filepathresults)
            save_scaling_results_to_csv(sla, filepathsla)
            del vms_arr
            del broker
| 64.465753 | 201 | 0.733532 | 913 | 4,706 | 3.268346 | 0.115005 | 0.018767 | 0.024129 | 0.044236 | 0.783177 | 0.741287 | 0.741287 | 0.740952 | 0.728887 | 0.69571 | 0 | 0.078852 | 0.118785 | 4,706 | 72 | 202 | 65.361111 | 0.640704 | 0 | 0 | 0.310345 | 0 | 0.206897 | 0.582819 | 0.486073 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.051724 | 0 | 0.051724 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
86c9c9e34e314dbf9dbb82a60d57b9d70463e636 | 57,481 | py | Python | app/uniquepair/service/tests/site-packages/buzzblog/gen/TLikeService.py | ChandanaNT/BuzzBlogApp | 7f27409b36eb2aa9c38931ad3b4a7540340e242c | [
"Apache-2.0"
] | 1 | 2021-02-19T00:37:29.000Z | 2021-02-19T00:37:29.000Z | app/uniquepair/service/tests/site-packages/buzzblog/gen/TLikeService.py | ChandanaNT/BuzzBlogApp | 7f27409b36eb2aa9c38931ad3b4a7540340e242c | [
"Apache-2.0"
] | null | null | null | app/uniquepair/service/tests/site-packages/buzzblog/gen/TLikeService.py | ChandanaNT/BuzzBlogApp | 7f27409b36eb2aa9c38931ad3b4a7540340e242c | [
"Apache-2.0"
] | 2 | 2021-04-13T01:06:06.000Z | 2021-11-16T16:14:46.000Z | #
# Autogenerated by Thrift Compiler (0.13.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
def like_post(self, request_metadata, post_id):
"""
Parameters:
- request_metadata
- post_id
"""
pass
def retrieve_standard_like(self, request_metadata, like_id):
"""
Parameters:
- request_metadata
- like_id
"""
pass
def retrieve_expanded_like(self, request_metadata, like_id):
"""
Parameters:
- request_metadata
- like_id
"""
pass
def delete_like(self, request_metadata, like_id):
"""
Parameters:
- request_metadata
- like_id
"""
pass
def list_likes(self, request_metadata, query, limit, offset):
"""
Parameters:
- request_metadata
- query
- limit
- offset
"""
pass
def count_likes_by_account(self, request_metadata, account_id):
"""
Parameters:
- request_metadata
- account_id
"""
pass
def count_likes_of_post(self, request_metadata, post_id):
"""
Parameters:
- request_metadata
- post_id
"""
pass
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def like_post(self, request_metadata, post_id):
"""
Parameters:
- request_metadata
- post_id
"""
self.send_like_post(request_metadata, post_id)
return self.recv_like_post()
def send_like_post(self, request_metadata, post_id):
self._oprot.writeMessageBegin('like_post', TMessageType.CALL, self._seqid)
args = like_post_args()
args.request_metadata = request_metadata
args.post_id = post_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_like_post(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = like_post_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "like_post failed: unknown result")
def retrieve_standard_like(self, request_metadata, like_id):
"""
Parameters:
- request_metadata
- like_id
"""
self.send_retrieve_standard_like(request_metadata, like_id)
return self.recv_retrieve_standard_like()
def send_retrieve_standard_like(self, request_metadata, like_id):
self._oprot.writeMessageBegin('retrieve_standard_like', TMessageType.CALL, self._seqid)
args = retrieve_standard_like_args()
args.request_metadata = request_metadata
args.like_id = like_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_retrieve_standard_like(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = retrieve_standard_like_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "retrieve_standard_like failed: unknown result")
def retrieve_expanded_like(self, request_metadata, like_id):
"""
Parameters:
- request_metadata
- like_id
"""
self.send_retrieve_expanded_like(request_metadata, like_id)
return self.recv_retrieve_expanded_like()
def send_retrieve_expanded_like(self, request_metadata, like_id):
self._oprot.writeMessageBegin('retrieve_expanded_like', TMessageType.CALL, self._seqid)
args = retrieve_expanded_like_args()
args.request_metadata = request_metadata
args.like_id = like_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_retrieve_expanded_like(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = retrieve_expanded_like_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e1 is not None:
raise result.e1
if result.e2 is not None:
raise result.e2
if result.e3 is not None:
raise result.e3
raise TApplicationException(TApplicationException.MISSING_RESULT, "retrieve_expanded_like failed: unknown result")
def delete_like(self, request_metadata, like_id):
"""
Parameters:
- request_metadata
- like_id
"""
self.send_delete_like(request_metadata, like_id)
self.recv_delete_like()
def send_delete_like(self, request_metadata, like_id):
self._oprot.writeMessageBegin('delete_like', TMessageType.CALL, self._seqid)
args = delete_like_args()
args.request_metadata = request_metadata
args.like_id = like_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_delete_like(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = delete_like_result()
result.read(iprot)
iprot.readMessageEnd()
if result.e1 is not None:
raise result.e1
if result.e2 is not None:
raise result.e2
return
def list_likes(self, request_metadata, query, limit, offset):
"""
Parameters:
- request_metadata
- query
- limit
- offset
"""
self.send_list_likes(request_metadata, query, limit, offset)
return self.recv_list_likes()
def send_list_likes(self, request_metadata, query, limit, offset):
self._oprot.writeMessageBegin('list_likes', TMessageType.CALL, self._seqid)
args = list_likes_args()
args.request_metadata = request_metadata
args.query = query
args.limit = limit
args.offset = offset
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_list_likes(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = list_likes_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e1 is not None:
raise result.e1
if result.e2 is not None:
raise result.e2
raise TApplicationException(TApplicationException.MISSING_RESULT, "list_likes failed: unknown result")
def count_likes_by_account(self, request_metadata, account_id):
"""
Parameters:
- request_metadata
- account_id
"""
self.send_count_likes_by_account(request_metadata, account_id)
return self.recv_count_likes_by_account()
def send_count_likes_by_account(self, request_metadata, account_id):
self._oprot.writeMessageBegin('count_likes_by_account', TMessageType.CALL, self._seqid)
args = count_likes_by_account_args()
args.request_metadata = request_metadata
args.account_id = account_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_count_likes_by_account(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = count_likes_by_account_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "count_likes_by_account failed: unknown result")
def count_likes_of_post(self, request_metadata, post_id):
"""
Parameters:
- request_metadata
- post_id
"""
self.send_count_likes_of_post(request_metadata, post_id)
return self.recv_count_likes_of_post()
def send_count_likes_of_post(self, request_metadata, post_id):
self._oprot.writeMessageBegin('count_likes_of_post', TMessageType.CALL, self._seqid)
args = count_likes_of_post_args()
args.request_metadata = request_metadata
args.post_id = post_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_count_likes_of_post(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = count_likes_of_post_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "count_likes_of_post failed: unknown result")
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["like_post"] = Processor.process_like_post
self._processMap["retrieve_standard_like"] = Processor.process_retrieve_standard_like
self._processMap["retrieve_expanded_like"] = Processor.process_retrieve_expanded_like
self._processMap["delete_like"] = Processor.process_delete_like
self._processMap["list_likes"] = Processor.process_list_likes
self._processMap["count_likes_by_account"] = Processor.process_count_likes_by_account
self._processMap["count_likes_of_post"] = Processor.process_count_likes_of_post
self._on_message_begin = None
def on_message_begin(self, func):
self._on_message_begin = func
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if self._on_message_begin:
self._on_message_begin(name, type, seqid)
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_like_post(self, seqid, iprot, oprot):
args = like_post_args()
args.read(iprot)
iprot.readMessageEnd()
result = like_post_result()
try:
result.success = self._handler.like_post(args.request_metadata, args.post_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TLikeAlreadyExistsException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("like_post", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_retrieve_standard_like(self, seqid, iprot, oprot):
args = retrieve_standard_like_args()
args.read(iprot)
iprot.readMessageEnd()
result = retrieve_standard_like_result()
try:
result.success = self._handler.retrieve_standard_like(args.request_metadata, args.like_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TLikeNotFoundException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("retrieve_standard_like", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_retrieve_expanded_like(self, seqid, iprot, oprot):
args = retrieve_expanded_like_args()
args.read(iprot)
iprot.readMessageEnd()
result = retrieve_expanded_like_result()
try:
result.success = self._handler.retrieve_expanded_like(args.request_metadata, args.like_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TLikeNotFoundException as e1:
msg_type = TMessageType.REPLY
result.e1 = e1
except TAccountNotFoundException as e2:
msg_type = TMessageType.REPLY
result.e2 = e2
except TPostNotFoundException as e3:
msg_type = TMessageType.REPLY
result.e3 = e3
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("retrieve_expanded_like", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_delete_like(self, seqid, iprot, oprot):
args = delete_like_args()
args.read(iprot)
iprot.readMessageEnd()
result = delete_like_result()
try:
self._handler.delete_like(args.request_metadata, args.like_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TLikeNotFoundException as e1:
msg_type = TMessageType.REPLY
result.e1 = e1
except TLikeNotAuthorizedException as e2:
msg_type = TMessageType.REPLY
result.e2 = e2
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("delete_like", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_list_likes(self, seqid, iprot, oprot):
args = list_likes_args()
args.read(iprot)
iprot.readMessageEnd()
result = list_likes_result()
try:
result.success = self._handler.list_likes(args.request_metadata, args.query, args.limit, args.offset)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TAccountNotFoundException as e1:
msg_type = TMessageType.REPLY
result.e1 = e1
except TPostNotFoundException as e2:
msg_type = TMessageType.REPLY
result.e2 = e2
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("list_likes", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_count_likes_by_account(self, seqid, iprot, oprot):
args = count_likes_by_account_args()
args.read(iprot)
iprot.readMessageEnd()
result = count_likes_by_account_result()
try:
result.success = self._handler.count_likes_by_account(args.request_metadata, args.account_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("count_likes_by_account", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_count_likes_of_post(self, seqid, iprot, oprot):
args = count_likes_of_post_args()
args.read(iprot)
iprot.readMessageEnd()
result = count_likes_of_post_result()
try:
result.success = self._handler.count_likes_of_post(args.request_metadata, args.post_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("count_likes_of_post", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
class like_post_args(object):
"""
Attributes:
- request_metadata
- post_id
"""
def __init__(self, request_metadata=None, post_id=None,):
self.request_metadata = request_metadata
self.post_id = post_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.post_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('like_post_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.post_id is not None:
oprot.writeFieldBegin('post_id', TType.I32, 2)
oprot.writeI32(self.post_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(like_post_args)
like_post_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.I32, 'post_id', None, None, ), # 2
)
class like_post_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = TLike()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TLikeAlreadyExistsException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('like_post_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(like_post_result)
like_post_result.thrift_spec = (
(0, TType.STRUCT, 'success', [TLike, None], None, ), # 0
(1, TType.STRUCT, 'e', [TLikeAlreadyExistsException, None], None, ), # 1
)
class retrieve_standard_like_args(object):
"""
Attributes:
- request_metadata
- like_id
"""
def __init__(self, request_metadata=None, like_id=None,):
self.request_metadata = request_metadata
self.like_id = like_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.like_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('retrieve_standard_like_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.like_id is not None:
oprot.writeFieldBegin('like_id', TType.I32, 2)
oprot.writeI32(self.like_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(retrieve_standard_like_args)
retrieve_standard_like_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.I32, 'like_id', None, None, ), # 2
)
class retrieve_standard_like_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = TLike()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TLikeNotFoundException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('retrieve_standard_like_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(retrieve_standard_like_result)
retrieve_standard_like_result.thrift_spec = (
(0, TType.STRUCT, 'success', [TLike, None], None, ), # 0
(1, TType.STRUCT, 'e', [TLikeNotFoundException, None], None, ), # 1
)
class retrieve_expanded_like_args(object):
"""
Attributes:
- request_metadata
- like_id
"""
def __init__(self, request_metadata=None, like_id=None,):
self.request_metadata = request_metadata
self.like_id = like_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.like_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('retrieve_expanded_like_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.like_id is not None:
oprot.writeFieldBegin('like_id', TType.I32, 2)
oprot.writeI32(self.like_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(retrieve_expanded_like_args)
retrieve_expanded_like_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.I32, 'like_id', None, None, ), # 2
)
class retrieve_expanded_like_result(object):
"""
Attributes:
- success
- e1
- e2
- e3
"""
def __init__(self, success=None, e1=None, e2=None, e3=None,):
self.success = success
self.e1 = e1
self.e2 = e2
self.e3 = e3
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = TLike()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e1 = TLikeNotFoundException()
self.e1.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.e2 = TAccountNotFoundException()
self.e2.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.e3 = TPostNotFoundException()
self.e3.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('retrieve_expanded_like_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e1 is not None:
oprot.writeFieldBegin('e1', TType.STRUCT, 1)
self.e1.write(oprot)
oprot.writeFieldEnd()
if self.e2 is not None:
oprot.writeFieldBegin('e2', TType.STRUCT, 2)
self.e2.write(oprot)
oprot.writeFieldEnd()
if self.e3 is not None:
oprot.writeFieldBegin('e3', TType.STRUCT, 3)
self.e3.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(retrieve_expanded_like_result)
retrieve_expanded_like_result.thrift_spec = (
(0, TType.STRUCT, 'success', [TLike, None], None, ), # 0
(1, TType.STRUCT, 'e1', [TLikeNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'e2', [TAccountNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'e3', [TPostNotFoundException, None], None, ), # 3
)
class delete_like_args(object):
"""
Attributes:
- request_metadata
- like_id
"""
def __init__(self, request_metadata=None, like_id=None,):
self.request_metadata = request_metadata
self.like_id = like_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.like_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('delete_like_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.like_id is not None:
oprot.writeFieldBegin('like_id', TType.I32, 2)
oprot.writeI32(self.like_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(delete_like_args)
delete_like_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.I32, 'like_id', None, None, ), # 2
)
class delete_like_result(object):
"""
Attributes:
- e1
- e2
"""
def __init__(self, e1=None, e2=None,):
self.e1 = e1
self.e2 = e2
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.e1 = TLikeNotFoundException()
self.e1.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.e2 = TLikeNotAuthorizedException()
self.e2.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('delete_like_result')
if self.e1 is not None:
oprot.writeFieldBegin('e1', TType.STRUCT, 1)
self.e1.write(oprot)
oprot.writeFieldEnd()
if self.e2 is not None:
oprot.writeFieldBegin('e2', TType.STRUCT, 2)
self.e2.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(delete_like_result)
delete_like_result.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'e1', [TLikeNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'e2', [TLikeNotAuthorizedException, None], None, ), # 2
)
class list_likes_args(object):
"""
Attributes:
- request_metadata
- query
- limit
- offset
"""
def __init__(self, request_metadata=None, query=None, limit=None, offset=None,):
self.request_metadata = request_metadata
self.query = query
self.limit = limit
self.offset = offset
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.query = TLikeQuery()
self.query.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.limit = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.I32:
self.offset = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('list_likes_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.query is not None:
oprot.writeFieldBegin('query', TType.STRUCT, 2)
self.query.write(oprot)
oprot.writeFieldEnd()
if self.limit is not None:
oprot.writeFieldBegin('limit', TType.I32, 3)
oprot.writeI32(self.limit)
oprot.writeFieldEnd()
if self.offset is not None:
oprot.writeFieldBegin('offset', TType.I32, 4)
oprot.writeI32(self.offset)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(list_likes_args)
list_likes_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.STRUCT, 'query', [TLikeQuery, None], None, ), # 2
(3, TType.I32, 'limit', None, None, ), # 3
(4, TType.I32, 'offset', None, None, ), # 4
)
class list_likes_result(object):
"""
Attributes:
- success
- e1
- e2
"""
def __init__(self, success=None, e1=None, e2=None,):
self.success = success
self.e1 = e1
self.e2 = e2
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype10, _size7) = iprot.readListBegin()
for _i11 in range(_size7):
_elem12 = TLike()
_elem12.read(iprot)
self.success.append(_elem12)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e1 = TAccountNotFoundException()
self.e1.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.e2 = TPostNotFoundException()
self.e2.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('list_likes_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter13 in self.success:
iter13.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.e1 is not None:
oprot.writeFieldBegin('e1', TType.STRUCT, 1)
self.e1.write(oprot)
oprot.writeFieldEnd()
if self.e2 is not None:
oprot.writeFieldBegin('e2', TType.STRUCT, 2)
self.e2.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(list_likes_result)
list_likes_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [TLike, None], False), None, ), # 0
(1, TType.STRUCT, 'e1', [TAccountNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'e2', [TPostNotFoundException, None], None, ), # 2
)
class count_likes_by_account_args(object):
"""
Attributes:
- request_metadata
- account_id
"""
def __init__(self, request_metadata=None, account_id=None,):
self.request_metadata = request_metadata
self.account_id = account_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.account_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('count_likes_by_account_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.account_id is not None:
oprot.writeFieldBegin('account_id', TType.I32, 2)
oprot.writeI32(self.account_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(count_likes_by_account_args)
count_likes_by_account_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.I32, 'account_id', None, None, ), # 2
)
class count_likes_by_account_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I32:
self.success = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('count_likes_by_account_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I32, 0)
oprot.writeI32(self.success)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(count_likes_by_account_result)
count_likes_by_account_result.thrift_spec = (
(0, TType.I32, 'success', None, None, ), # 0
)
class count_likes_of_post_args(object):
"""
Attributes:
- request_metadata
- post_id
"""
def __init__(self, request_metadata=None, post_id=None,):
self.request_metadata = request_metadata
self.post_id = post_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request_metadata = TRequestMetadata()
self.request_metadata.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.post_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('count_likes_of_post_args')
if self.request_metadata is not None:
oprot.writeFieldBegin('request_metadata', TType.STRUCT, 1)
self.request_metadata.write(oprot)
oprot.writeFieldEnd()
if self.post_id is not None:
oprot.writeFieldBegin('post_id', TType.I32, 2)
oprot.writeI32(self.post_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(count_likes_of_post_args)
count_likes_of_post_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request_metadata', [TRequestMetadata, None], None, ), # 1
(2, TType.I32, 'post_id', None, None, ), # 2
)
class count_likes_of_post_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I32:
self.success = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('count_likes_of_post_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I32, 0)
oprot.writeI32(self.success)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(count_likes_of_post_result)
count_likes_of_post_result.thrift_spec = (
(0, TType.I32, 'success', None, None, ), # 0
)
fix_spec(all_structs)
del all_structs
| 34.337515 | 134 | 0.592718 | 6,186 | 57,481 | 5.23149 | 0.031846 | 0.061646 | 0.028645 | 0.025029 | 0.897534 | 0.864007 | 0.838422 | 0.817193 | 0.800445 | 0.778567 | 0 | 0.009032 | 0.310468 | 57,481 | 1,673 | 135 | 34.358039 | 0.807468 | 0.024391 | 0 | 0.805982 | 1 | 0 | 0.0383 | 0.008749 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105061 | false | 0.005368 | 0.006135 | 0.032209 | 0.200153 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
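The sample above ends a block of Apache Thrift-generated Python: each RPC gets an *_args and *_result wrapper whose read/write methods walk the wire protocol field by field, preferring the C fast path (_fast_encode/_fast_decode) and otherwise following the generic loop driven by thrift_spec. The sketch below is illustrative only: it round-trips one of these wrappers through Thrift's in-memory transport and stock binary protocol, and it assumes the generated classes (e.g. delete_like_args) are importable from wherever the generated module lives, which the sample does not name.

# Illustrative sketch only: round-trip a generated wrapper through TMemoryBuffer.
from thrift.transport import TTransport
from thrift.protocol import TBinaryProtocol

def thrift_roundtrip(struct, struct_cls):
    # Serialize: the generated write() emits each non-None field per thrift_spec.
    out_buf = TTransport.TMemoryBuffer()
    struct.write(TBinaryProtocol.TBinaryProtocol(out_buf))
    payload = out_buf.getvalue()
    # Deserialize into a fresh instance of the same generated class.
    in_buf = TTransport.TMemoryBuffer(payload)
    restored = struct_cls()
    restored.read(TBinaryProtocol.TBinaryProtocol(in_buf))
    return restored

# e.g. thrift_roundtrip(delete_like_args(request_metadata=None, like_id=7), delete_like_args)

Fields left as None are simply skipped by write(), which is why every field emitted by the generated code is guarded by an `is not None` check.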
86d3390539a1a8d66e545580ddcf1bfd03fd1bc3 | 168 | py | Python | django-openstack/src/django_openstack/nova/tests/__init__.py | canarie/openstack-dashboard | 88f4d456fd044e14fd145c707d3d8cc141eaeb04 | ["Apache-2.0"] | 2 | 2015-05-18T13:50:24.000Z | 2015-05-18T14:47:11.000Z | django-openstack/src/django_openstack/nova/tests/__init__.py | canarie/openstack-dashboard | 88f4d456fd044e14fd145c707d3d8cc141eaeb04 | ["Apache-2.0"] | null | null | null | django-openstack/src/django_openstack/nova/tests/__init__.py | canarie/openstack-dashboard | 88f4d456fd044e14fd145c707d3d8cc141eaeb04 | ["Apache-2.0"] | null | null | null | from credential_tests import *
from image_tests import *
from instance_tests import *
from keypair_tests import *
from region_tests import *
from volume_tests import *
| 24 | 30 | 0.821429 | 24 | 168 | 5.5 | 0.375 | 0.5 | 0.568182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 168 | 6 | 31 | 28 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
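The django-openstack sample above is just an aggregator: the tests package's __init__.py star-imports each test module so that a loader scanning the package sees every TestCase class in one namespace. A small sketch of collecting them with the standard unittest loader follows; the dotted package path is taken from the sample's repo layout, and actually running it would additionally require the project's Django settings to be configured.

# Illustrative sketch only: collect the TestCases re-exported by the package above.
import unittest
import django_openstack.nova.tests as nova_tests  # path from the sample's repo layout

suite = unittest.defaultTestLoader.loadTestsFromModule(nova_tests)
unittest.TextTestRunner(verbosity=2).run(suite)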
86d348f0ac2f414ddfebce8b8e6aaaa6e2dc2b16 | 10,150 | py | Python | ambari-server/src/test/python/stacks/2.0.6/HIVE/test_hive_service_check.py | nexr/ambari | 8452f207d7b9343a162698f2a2b79bf2c512e9d3 | ["Apache-2.0"] | 1 | 2015-05-04T12:19:05.000Z | 2015-05-04T12:19:05.000Z | ambari-server/src/test/python/stacks/2.0.6/HIVE/test_hive_service_check.py | nexr/ambari | 8452f207d7b9343a162698f2a2b79bf2c512e9d3 | ["Apache-2.0"] | null | null | null | ambari-server/src/test/python/stacks/2.0.6/HIVE/test_hive_service_check.py | nexr/ambari | 8452f207d7b9343a162698f2a2b79bf2c512e9d3 | ["Apache-2.0"] | 1 | 2021-01-07T08:55:01.000Z | 2021-01-07T08:55:01.000Z | #!/usr/bin/env python
'''
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
'''
import os
from mock.mock import MagicMock, call, patch
from stacks.utils.RMFTestCase import *
import datetime, sys, socket
import resource_management.libraries.functions
@patch.object(resource_management.libraries.functions, "get_unique_id_and_date", new = MagicMock(return_value=''))
@patch("socket.socket")
@patch("time.time", new=MagicMock(return_value=1431110511.43))
class TestServiceCheck(RMFTestCase):
COMMON_SERVICES_PACKAGE_DIR = "HIVE/0.12.0.2.0/package"
STACK_VERSION = "2.0.6"
def test_service_check_default(self, socket_mock):
self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/service_check.py",
classname="HiveServiceCheck",
command="service_check",
config_file="default.json",
hdp_stack_version = self.STACK_VERSION,
target = RMFTestCase.TARGET_COMMON_SERVICES
)
self.assertResourceCalled('Execute', "! beeline -u 'jdbc:hive2://c6402.ambari.apache.org:10000/;transportMode=binary;auth=noSasl' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL'",
path = ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'],
user = 'ambari-qa',
timeout = 30,
)
self.assertResourceCalled('File', '/tmp/hcatSmoke.sh',
content = StaticFile('hcatSmoke.sh'),
mode = 0755,
)
self.assertResourceCalled('Execute', 'env JAVA_HOME=/usr/jdk64/jdk1.7.0_45 /tmp/hcatSmoke.sh hcatsmoke prepare',
logoutput = True,
path = ['/usr/sbin',
'/usr/local/bin',
'/bin',
'/usr/bin',
'/bin:/usr/lib/hive/bin:/usr/bin'],
tries = 3,
user = 'ambari-qa',
try_sleep = 5,
)
self.assertResourceCalled('ExecuteHadoop', 'fs -test -e /apps/hive/warehouse/hcatsmoke',
security_enabled = False,
keytab = UnknownConfigurationMock(),
conf_dir = '/etc/hadoop/conf',
logoutput = True,
kinit_path_local = '/usr/bin/kinit',
user = 'hdfs',
bin_dir = '/bin:/usr/lib/hive/bin:/usr/bin',
)
self.assertResourceCalled('Execute', ' /tmp/hcatSmoke.sh hcatsmoke cleanup',
logoutput = True,
path = ['/usr/sbin',
'/usr/local/bin',
'/bin',
'/usr/bin',
'/bin:/usr/lib/hive/bin:/usr/bin'],
tries = 3,
user = 'ambari-qa',
try_sleep = 5,
)
self.assertResourceCalled('File', '/tmp/templetonSmoke.sh',
content = StaticFile('templetonSmoke.sh'),
mode = 0755,
)
self.assertResourceCalled('File', '/tmp/idtest.ambari-qa.1431110511.43.pig',
content = Template('templeton_smoke.pig.j2', templeton_test_input='/tmp/idtest.ambari-qa.1431110511.43.in', templeton_test_output='/tmp/idtest.ambari-qa.1431110511.43.out'),
)
self.assertResourceCalled('HdfsResource', '/tmp/idtest.ambari-qa.1431110511.43.pig',
security_enabled = False,
hadoop_bin_dir = '/usr/bin',
keytab = UnknownConfigurationMock(),
kinit_path_local = '/usr/bin/kinit',
source = '/tmp/idtest.ambari-qa.1431110511.43.pig',
user = 'hdfs',
owner = 'ambari-qa',
hadoop_conf_dir = '/etc/hadoop/conf',
type = 'file',
action = ['create_on_execute'],
)
self.assertResourceCalled('HdfsResource', '/tmp/idtest.ambari-qa.1431110511.43.in',
security_enabled = False,
hadoop_bin_dir = '/usr/bin',
keytab = UnknownConfigurationMock(),
kinit_path_local = '/usr/bin/kinit',
source = '/etc/passwd',
user = 'hdfs',
owner = 'ambari-qa',
hadoop_conf_dir = '/etc/hadoop/conf',
type = 'file',
action = ['create_on_execute'],
)
self.assertResourceCalled('HdfsResource', None,
security_enabled = False,
hadoop_bin_dir = '/usr/bin',
keytab = UnknownConfigurationMock(),
kinit_path_local = '/usr/bin/kinit',
user = 'hdfs',
action = ['execute'],
hadoop_conf_dir = '/etc/hadoop/conf',
)
self.assertResourceCalled('Execute', '/tmp/templetonSmoke.sh c6402.ambari.apache.org ambari-qa 50111 idtest.ambari-qa.1431110511.43.pig no_keytab false /usr/bin/kinit no_principal',
logoutput = True,
path = ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'],
tries = 3,
try_sleep = 5,
)
self.assertNoMoreResources()
def test_service_check_secured(self, socket_mock):
self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/service_check.py",
classname="HiveServiceCheck",
command="service_check",
config_file="secured.json",
hdp_stack_version = self.STACK_VERSION,
target = RMFTestCase.TARGET_COMMON_SERVICES
)
self.assertResourceCalled('Execute', '/usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa@EXAMPLE.COM; ',
user = 'ambari-qa',
)
self.assertResourceCalled('Execute', "! beeline -u 'jdbc:hive2://c6402.ambari.apache.org:10000/;transportMode=binary;principal=hive/_HOST@EXAMPLE.COM' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL'",
path = ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'],
user = 'ambari-qa',
timeout = 30,
)
self.assertResourceCalled('File', '/tmp/hcatSmoke.sh',
content = StaticFile('hcatSmoke.sh'),
mode = 0755,
)
self.maxDiff = None
self.assertResourceCalled('Execute', '/usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa; env JAVA_HOME=/usr/jdk64/jdk1.7.0_45 /tmp/hcatSmoke.sh hcatsmoke prepare',
logoutput = True,
path = ['/usr/sbin','/usr/local/bin','/bin','/usr/bin', '/bin:/usr/lib/hive/bin:/usr/bin'],
tries = 3,
user = 'ambari-qa',
try_sleep = 5,
)
self.assertResourceCalled('ExecuteHadoop', 'fs -test -e /apps/hive/warehouse/hcatsmoke',
security_enabled = True,
keytab = '/etc/security/keytabs/hdfs.headless.keytab',
conf_dir = '/etc/hadoop/conf',
logoutput = True,
kinit_path_local = '/usr/bin/kinit',
user = 'hdfs',
bin_dir = '/bin:/usr/lib/hive/bin:/usr/bin',
principal = 'hdfs',
)
self.assertResourceCalled('Execute', '/usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa; /tmp/hcatSmoke.sh hcatsmoke cleanup',
logoutput = True,
path = ['/usr/sbin',
'/usr/local/bin',
'/bin',
'/usr/bin',
'/bin:/usr/lib/hive/bin:/usr/bin'],
tries = 3,
user = 'ambari-qa',
try_sleep = 5,
)
self.assertResourceCalled('File', '/tmp/templetonSmoke.sh',
content = StaticFile('templetonSmoke.sh'),
mode = 0755,
)
self.assertResourceCalled('File', '/tmp/idtest.ambari-qa.1431110511.43.pig',
content = Template('templeton_smoke.pig.j2', templeton_test_input='/tmp/idtest.ambari-qa.1431110511.43.in', templeton_test_output='/tmp/idtest.ambari-qa.1431110511.43.out'),
)
self.assertResourceCalled('HdfsResource', '/tmp/idtest.ambari-qa.1431110511.43.pig',
action = ['create_on_execute'],
security_enabled = True,
hadoop_bin_dir = '/usr/bin',
keytab = '/etc/security/keytabs/hdfs.headless.keytab',
kinit_path_local = '/usr/bin/kinit',
source = '/tmp/idtest.ambari-qa.1431110511.43.pig',
user = 'hdfs',
owner = 'ambari-qa',
hadoop_conf_dir = '/etc/hadoop/conf',
type = 'file',
)
self.assertResourceCalled('HdfsResource', '/tmp/idtest.ambari-qa.1431110511.43.in',
action = ['create_on_execute'],
security_enabled = True,
hadoop_bin_dir = '/usr/bin',
keytab = '/etc/security/keytabs/hdfs.headless.keytab',
kinit_path_local = '/usr/bin/kinit',
source = '/etc/passwd',
user = 'hdfs',
owner = 'ambari-qa',
hadoop_conf_dir = '/etc/hadoop/conf',
type = 'file',
)
self.assertResourceCalled('HdfsResource', None,
security_enabled = True,
hadoop_bin_dir = '/usr/bin',
keytab = '/etc/security/keytabs/hdfs.headless.keytab',
kinit_path_local = '/usr/bin/kinit',
user = 'hdfs',
action = ['execute'],
hadoop_conf_dir = '/etc/hadoop/conf',
)
self.assertResourceCalled('Execute', '/tmp/templetonSmoke.sh c6402.ambari.apache.org ambari-qa 50111 idtest.ambari-qa.1431110511.43.pig /etc/security/keytabs/smokeuser.headless.keytab true /usr/bin/kinit ambari-qa@EXAMPLE.COM',
logoutput = True,
path = ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'],
tries = 3,
try_sleep = 5,
)
self.assertNoMoreResources() | 44.517544 | 231 | 0.598621 | 1,135 | 10,150 | 5.243172 | 0.200881 | 0.03428 | 0.021173 | 0.056461 | 0.781885 | 0.778525 | 0.770291 | 0.763233 | 0.763233 | 0.763233 | 0 | 0.037784 | 0.262069 | 10,150 | 228 | 232 | 44.517544 | 0.756742 | 0.00197 | 0 | 0.742574 | 0 | 0.034653 | 0.370343 | 0.189922 | 0 | 0 | 0 | 0 | 0.123762 | 0 | null | null | 0.009901 | 0.024752 | null | null | 0.009901 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
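The ambari-server sample above follows the RMFTestCase pattern used throughout Ambari's Python stack tests: executeScript replays a stack script (here the Hive service check) against a canned JSON configuration, each assertResourceCalled then verifies, in order, one resource the script is expected to declare, and assertNoMoreResources closes the sequence. The stripped-down sketch below shows only that shape, reusing names that appear in the sample; it is meant to run inside Ambari's test harness, not standalone.

# Illustrative sketch only: the general shape of an RMFTestCase-style check.
from stacks.utils.RMFTestCase import *

class TestHiveCheckShape(RMFTestCase):
    COMMON_SERVICES_PACKAGE_DIR = "HIVE/0.12.0.2.0/package"   # as in the sample
    STACK_VERSION = "2.0.6"

    def test_default(self):
        # Replay the service_check script against the default configuration blob.
        self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/service_check.py",
                           classname="HiveServiceCheck",
                           command="service_check",
                           config_file="default.json",
                           hdp_stack_version=self.STACK_VERSION,
                           target=RMFTestCase.TARGET_COMMON_SERVICES)
        # Each expected resource is asserted in declaration order...
        self.assertResourceCalled('File', '/tmp/hcatSmoke.sh',
                                  content=StaticFile('hcatSmoke.sh'),
                                  mode=0o755)
        # ...and the sequence must be exhausted at the end.
        self.assertNoMoreResources()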
86de665e5c7f9de29ec18fc04a6a1219c531bfc6 | 268,401 | py | Python | sympy/integrals/rubi/rules/miscellaneous_trig.py | STALKER2010/sympy-bleeding-edge | 81233029a9a30866747f6da2c0e9604d1681d474 | ["BSD-3-Clause"] | 2 | 2018-12-05T02:30:43.000Z | 2020-11-14T01:43:15.000Z | sympy/integrals/rubi/rules/miscellaneous_trig.py | STALKER2010/sympy-bleeding-edge | 81233029a9a30866747f6da2c0e9604d1681d474 | ["BSD-3-Clause"] | 1 | 2017-10-23T06:56:43.000Z | 2017-10-23T06:56:43.000Z | sympy/integrals/rubi/rules/miscellaneous_trig.py | STALKER2010/sympy-bleeding-edge | 81233029a9a30866747f6da2c0e9604d1681d474 | ["BSD-3-Clause"] | 1 | 2020-01-01T19:49:22.000Z | 2020-01-01T19:49:22.000Z | from sympy.external import import_module
matchpy = import_module("matchpy")
from sympy.utilities.decorator import doctest_depends_on
if matchpy:
from matchpy import Pattern, ReplacementRule, CustomConstraint
from sympy.integrals.rubi.utility_function import (Int, Set, With, Module, Scan, MapAnd, FalseQ, ZeroQ, NegativeQ, NonzeroQ, FreeQ, NFreeQ, List, Log, PositiveQ, PositiveIntegerQ, NegativeIntegerQ, IntegerQ, IntegersQ, ComplexNumberQ, PureComplexNumberQ, RealNumericQ, PositiveOrZeroQ, NegativeOrZeroQ, FractionOrNegativeQ, NegQ, Equal, Unequal, IntPart, FracPart, RationalQ, ProductQ, SumQ, NonsumQ, Subst, First, Rest, SqrtNumberQ, SqrtNumberSumQ, LinearQ, Sqrt, ArcCosh, Coefficient, Denominator, Hypergeometric2F1, Not, Simplify, FractionalPart, IntegerPart, AppellF1, EllipticPi, EllipticE, EllipticF, ArcTan, ArcCot, ArcCoth, ArcTanh, ArcSin, ArcSinh, ArcCos, ArcCsc, ArcSec, ArcCsch, ArcSech, Sinh, Tanh, Cosh, Sech, Csch, Coth, LessEqual, Less, Greater, GreaterEqual, FractionQ, IntLinearcQ, Expand, IndependentQ, PowerQ, IntegerPowerQ, PositiveIntegerPowerQ, FractionalPowerQ, AtomQ, ExpQ, LogQ, Head, MemberQ, TrigQ, SinQ, CosQ, TanQ, CotQ, SecQ, CscQ, Sin, Cos, Tan, Cot, Sec, Csc, HyperbolicQ, SinhQ, CoshQ, TanhQ, CothQ, SechQ, CschQ, InverseTrigQ, SinCosQ, SinhCoshQ, LeafCount, Numerator, NumberQ, NumericQ, Length, ListQ, Im, Re, InverseHyperbolicQ, InverseFunctionQ, TrigHyperbolicFreeQ, InverseFunctionFreeQ, RealQ, EqQ, FractionalPowerFreeQ, ComplexFreeQ, PolynomialQ, FactorSquareFree, PowerOfLinearQ, Exponent, QuadraticQ, LinearPairQ, BinomialParts, TrinomialParts, PolyQ, EvenQ, OddQ, PerfectSquareQ, NiceSqrtAuxQ, NiceSqrtQ, Together, PosAux, PosQ, CoefficientList, ReplaceAll, ExpandLinearProduct, GCD, ContentFactor, NumericFactor, NonnumericFactors, MakeAssocList, GensymSubst, KernelSubst, ExpandExpression, Apart, SmartApart, MatchQ, PolynomialQuotientRemainder, FreeFactors, NonfreeFactors, RemoveContentAux, RemoveContent, FreeTerms, NonfreeTerms, ExpandAlgebraicFunction, CollectReciprocals, ExpandCleanup, AlgebraicFunctionQ, Coeff, LeadTerm, RemainingTerms, LeadFactor, RemainingFactors, LeadBase, LeadDegree, Numer, Denom, hypergeom, Expon, MergeMonomials, PolynomialDivide, BinomialQ, TrinomialQ, GeneralizedBinomialQ, GeneralizedTrinomialQ, FactorSquareFreeList, PerfectPowerTest, SquareFreeFactorTest, RationalFunctionQ, RationalFunctionFactors, NonrationalFunctionFactors, Reverse, RationalFunctionExponents, RationalFunctionExpand, ExpandIntegrand, SimplerQ, SimplerSqrtQ, SumSimplerQ, BinomialDegree, TrinomialDegree, CancelCommonFactors, SimplerIntegrandQ, GeneralizedBinomialDegree, GeneralizedBinomialParts, GeneralizedTrinomialDegree, GeneralizedTrinomialParts, MonomialQ, MonomialSumQ, MinimumMonomialExponent, MonomialExponent, LinearMatchQ, PowerOfLinearMatchQ, QuadraticMatchQ, CubicMatchQ, BinomialMatchQ, TrinomialMatchQ, GeneralizedBinomialMatchQ, GeneralizedTrinomialMatchQ, QuotientOfLinearsMatchQ, PolynomialTermQ, PolynomialTerms, NonpolynomialTerms, PseudoBinomialParts, NormalizePseudoBinomial, PseudoBinomialPairQ, PseudoBinomialQ, PolynomialGCD, PolyGCD, AlgebraicFunctionFactors, NonalgebraicFunctionFactors, QuotientOfLinearsP, QuotientOfLinearsParts, QuotientOfLinearsQ, Flatten, Sort, AbsurdNumberQ, AbsurdNumberFactors, NonabsurdNumberFactors, SumSimplerAuxQ, Prepend, Drop, CombineExponents, FactorInteger, FactorAbsurdNumber, SubstForInverseFunction, SubstForFractionalPower, SubstForFractionalPowerOfQuotientOfLinears, FractionalPowerOfQuotientOfLinears, SubstForFractionalPowerQ, SubstForFractionalPowerAuxQ, FractionalPowerOfSquareQ, FractionalPowerSubexpressionQ, Apply, FactorNumericGcd, MergeableFactorQ, MergeFactor, MergeFactors, TrigSimplifyQ, TrigSimplify, TrigSimplifyRecur, 
Order, FactorOrder, Smallest, OrderedQ, MinimumDegree, PositiveFactors, Sign, NonpositiveFactors, PolynomialInAuxQ, PolynomialInQ, ExponentInAux, ExponentIn, PolynomialInSubstAux, PolynomialInSubst, Distrib, DistributeDegree, FunctionOfPower, DivideDegreesOfFactors, MonomialFactor, FullSimplify, FunctionOfLinearSubst, FunctionOfLinear, NormalizeIntegrand, NormalizeIntegrandAux, NormalizeIntegrandFactor, NormalizeIntegrandFactorBase, NormalizeTogether, NormalizeLeadTermSigns, AbsorbMinusSign, NormalizeSumFactors, SignOfFactor, NormalizePowerOfLinear, SimplifyIntegrand, SimplifyTerm, TogetherSimplify, SmartSimplify, SubstForExpn, ExpandToSum, UnifySum, UnifyTerms, UnifyTerm, CalculusQ, FunctionOfInverseLinear, PureFunctionOfSinhQ, PureFunctionOfTanhQ, PureFunctionOfCoshQ, IntegerQuotientQ, OddQuotientQ, EvenQuotientQ, FindTrigFactor, FunctionOfSinhQ, FunctionOfCoshQ, OddHyperbolicPowerQ, FunctionOfTanhQ, FunctionOfTanhWeight, FunctionOfHyperbolicQ, SmartNumerator, SmartDenominator, SubstForAux, ActivateTrig, ExpandTrig, TrigExpand, SubstForTrig, SubstForHyperbolic, InertTrigFreeQ, LCM, SubstForFractionalPowerOfLinear, FractionalPowerOfLinear, InverseFunctionOfLinear, InertTrigQ, InertReciprocalQ, DeactivateTrig, FixInertTrigFunction, DeactivateTrigAux, PowerOfInertTrigSumQ, PiecewiseLinearQ, KnownTrigIntegrandQ, KnownSineIntegrandQ, KnownTangentIntegrandQ, KnownCotangentIntegrandQ, KnownSecantIntegrandQ, TryPureTanSubst, TryTanhSubst, TryPureTanhSubst, AbsurdNumberGCD, AbsurdNumberGCDList, ExpandTrigExpand, ExpandTrigReduce, ExpandTrigReduceAux, NormalizeTrig, TrigToExp, ExpandTrigToExp, TrigReduce, FunctionOfTrig, AlgebraicTrigFunctionQ, FunctionOfHyperbolic, FunctionOfQ, FunctionOfExpnQ, PureFunctionOfSinQ, PureFunctionOfCosQ, PureFunctionOfTanQ, PureFunctionOfCotQ, FunctionOfCosQ, FunctionOfSinQ, OddTrigPowerQ, FunctionOfTanQ, FunctionOfTanWeight, FunctionOfTrigQ, FunctionOfDensePolynomialsQ, FunctionOfLog, PowerVariableExpn, PowerVariableDegree, PowerVariableSubst, EulerIntegrandQ, FunctionOfSquareRootOfQuadratic, SquareRootOfQuadraticSubst, Divides, EasyDQ, ProductOfLinearPowersQ, Rt, NthRoot, AtomBaseQ, SumBaseQ, NegSumBaseQ, AllNegTermQ, SomeNegTermQ, TrigSquareQ, RtAux, TrigSquare, IntSum, IntTerm, Map2, ConstantFactor, SameQ, ReplacePart, CommonFactors, MostMainFactorPosition, FunctionOfExponentialQ, FunctionOfExponential, FunctionOfExponentialFunction, FunctionOfExponentialFunctionAux, FunctionOfExponentialTest, FunctionOfExponentialTestAux, stdev, rubi_test, If, IntQuadraticQ, IntBinomialQ, RectifyTangent, RectifyCotangent, Inequality, Condition, Simp, SimpHelp, SplitProduct, SplitSum, SubstFor, SubstForAux, FresnelS, FresnelC, Erfc, Erfi, Gamma, FunctionOfTrigOfLinearQ, ElementaryFunctionQ, Complex, UnsameQ, _SimpFixFactor, SimpFixFactor, _FixSimplify, FixSimplify, _SimplifyAntiderivativeSum, SimplifyAntiderivativeSum, _SimplifyAntiderivative, SimplifyAntiderivative, _TrigSimplifyAux, TrigSimplifyAux, Cancel, Part, PolyLog, D, Dist)
from sympy import Integral, S, sqrt
from sympy.integrals.rubi.symbol import WC
from sympy.core.symbol import symbols, Symbol
from sympy.functions import (log, sin, cos, tan, cot, csc, sec, sqrt, erf, exp)
from sympy.functions.elementary.hyperbolic import (acosh, asinh, atanh, acoth, acsch, asech, cosh, sinh, tanh, coth, sech, csch)
from sympy.functions.elementary.trigonometric import (atan, acsc, asin, acot, acos, asec)
A_, B_, C_, F_, G_, H_, a_, b_, c_, d_, e_, f_, g_, h_, i_, j_, k_, l_, m_, n_, p_, q_, r_, t_, u_, v_, s_, w_, x_, y_, z_ = [WC(i) for i in 'ABCFGHabcdefghijklmnpqrtuvswxyz']
a1_, a2_, b1_, b2_, c1_, c2_, d1_, d2_, n1_, n2_, e1_, e2_, f1_, f2_, g1_, g2_, n1_, n2_, n3_, Pq_, Pm_, Px_, Qm_, Qr_, Qx_, jn_, mn_, non2_, RFx_, RGx_ = [WC(i) for i in ['a1', 'a2', 'b1', 'b2', 'c1', 'c2', 'd1', 'd2', 'n1', 'n2', 'e1', 'e2', 'f1', 'f2', 'g1', 'g2', 'n1', 'n2', 'n3', 'Pq', 'Pm', 'Px', 'Qm', 'Qr', 'Qx', 'jn', 'mn', 'non2', 'RFx', 'RGx']]
_UseGamma = False
def miscellaneous_trig(rubi):
pattern1 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule1 = ReplacementRule(pattern1, lambda m, u, a, d, n, b, x, c : (c*tan(a + b*x))**m*(d*sin(a + b*x))**(-m)*(d*cos(a + b*x))**m*Int((d*sin(a + b*x))**(m + n)*(d*cos(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule1)
pattern2 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule2 = ReplacementRule(pattern2, lambda m, u, a, d, n, b, x, c : (c*tan(a + b*x))**m*(d*sin(a + b*x))**(-m)*(d*cos(a + b*x))**m*Int((d*sin(a + b*x))**m*(d*cos(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule2)
pattern3 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule3 = ReplacementRule(pattern3, lambda m, u, a, d, n, b, x, c : (c*cot(a + b*x))**m*(d*sin(a + b*x))**m*(d*cos(a + b*x))**(-m)*Int((d*sin(a + b*x))**(-m + n)*(d*cos(a + b*x))**m*ActivateTrig(u), x))
rubi.add(rule3)
pattern4 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule4 = ReplacementRule(pattern4, lambda m, u, a, d, n, b, x, c : (c*cot(a + b*x))**m*(d*sin(a + b*x))**m*(d*cos(a + b*x))**(-m)*Int((d*sin(a + b*x))**(-m)*(d*cos(a + b*x))**(m + n)*ActivateTrig(u), x))
rubi.add(rule4)
pattern5 = Pattern(Integral(u_*(WC('c', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule5 = ReplacementRule(pattern5, lambda m, u, a, d, n, b, x, c : (c*csc(a + b*x))**m*(d*sin(a + b*x))**m*Int((d*sin(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule5)
pattern6 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule6 = ReplacementRule(pattern6, lambda m, u, a, b, x, c : (c*sin(a + b*x))**(-m)*(c*cos(a + b*x))**m*(c*tan(a + b*x))**m*Int((c*sin(a + b*x))**m*(c*cos(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule6)
pattern7 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule7 = ReplacementRule(pattern7, lambda m, u, a, b, x, c : (c*sin(a + b*x))**m*(c*cos(a + b*x))**(-m)*(c*cot(a + b*x))**m*Int((c*sin(a + b*x))**(-m)*(c*cos(a + b*x))**m*ActivateTrig(u), x))
rubi.add(rule7)
pattern8 = Pattern(Integral(u_*(WC('c', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule8 = ReplacementRule(pattern8, lambda m, u, a, b, x, c : (c*cos(a + b*x))**m*(c*sec(a + b*x))**m*Int((c*cos(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule8)
pattern9 = Pattern(Integral(u_*(WC('c', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule9 = ReplacementRule(pattern9, lambda m, u, a, b, x, c : (c*sin(a + b*x))**m*(c*csc(a + b*x))**m*Int((c*sin(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule9)
pattern10 = Pattern(Integral(u_*(WC('c', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('B', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule10 = ReplacementRule(pattern10, lambda u, A, a, n, b, B, x, c : c*Int((c*sin(a + b*x))**(n + S(-1))*(A*sin(a + b*x) + B)*ActivateTrig(u), x))
rubi.add(rule10)
pattern11 = Pattern(Integral(u_*(WC('c', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('B', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule11 = ReplacementRule(pattern11, lambda u, A, a, n, b, B, x, c : c*Int((c*cos(a + b*x))**(n + S(-1))*(A*cos(a + b*x) + B)*ActivateTrig(u), x))
rubi.add(rule11)
pattern12 = Pattern(Integral(u_*(A_ + WC('B', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule12 = ReplacementRule(pattern12, lambda u, A, a, b, B, x : Int((A*sin(a + b*x) + B)*ActivateTrig(u)/sin(a + b*x), x))
rubi.add(rule12)
pattern13 = Pattern(Integral(u_*(A_ + WC('B', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule13 = ReplacementRule(pattern13, lambda u, A, b, a, B, x : Int((A*cos(a + b*x) + B)*ActivateTrig(u)/cos(a + b*x), x))
rubi.add(rule13)
pattern14 = Pattern(Integral((WC('c', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('A', S(0)) + WC('B', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule14 = ReplacementRule(pattern14, lambda u, A, C, a, n, b, B, x, c : c**S(2)*Int((c*sin(a + b*x))**(n + S(-2))*(A*sin(a + b*x)**S(2) + B*sin(a + b*x) + C)*ActivateTrig(u), x))
rubi.add(rule14)
pattern15 = Pattern(Integral((WC('c', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('A', S(0)) + WC('B', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule15 = ReplacementRule(pattern15, lambda u, A, a, C, n, b, B, x, c : c**S(2)*Int((c*cos(a + b*x))**(n + S(-2))*(A*cos(a + b*x)**S(2) + B*cos(a + b*x) + C)*ActivateTrig(u), x))
rubi.add(rule15)
pattern16 = Pattern(Integral((WC('c', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('C', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule16 = ReplacementRule(pattern16, lambda u, A, C, a, n, b, x, c : c**S(2)*Int((c*sin(a + b*x))**(n + S(-2))*(A*sin(a + b*x)**S(2) + C)*ActivateTrig(u), x))
rubi.add(rule16)
pattern17 = Pattern(Integral((WC('c', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('C', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule17 = ReplacementRule(pattern17, lambda u, A, a, C, n, b, x, c : c**S(2)*Int((c*cos(a + b*x))**(n + S(-2))*(A*cos(a + b*x)**S(2) + C)*ActivateTrig(u), x))
rubi.add(rule17)
pattern18 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule18 = ReplacementRule(pattern18, lambda u, A, C, a, b, B, x : Int((A*sin(a + b*x)**S(2) + B*sin(a + b*x) + C)*ActivateTrig(u)/sin(a + b*x)**S(2), x))
rubi.add(rule18)
pattern19 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule19 = ReplacementRule(pattern19, lambda u, A, b, C, a, B, x : Int((A*cos(a + b*x)**S(2) + B*cos(a + b*x) + C)*ActivateTrig(u)/cos(a + b*x)**S(2), x))
rubi.add(rule19)
pattern20 = Pattern(Integral(u_*(A_ + WC('C', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule20 = ReplacementRule(pattern20, lambda u, A, C, a, b, x : Int((A*sin(a + b*x)**S(2) + C)*ActivateTrig(u)/sin(a + b*x)**S(2), x))
rubi.add(rule20)
pattern21 = Pattern(Integral(u_*(A_ + WC('C', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSineIntegrandQ(u, x)))
rule21 = ReplacementRule(pattern21, lambda u, A, a, C, b, x : Int((A*cos(a + b*x)**S(2) + C)*ActivateTrig(u)/cos(a + b*x)**S(2), x))
rubi.add(rule21)
pattern22 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)))
rule22 = ReplacementRule(pattern22, lambda u, A, C, a, b, B, x : Int((A*sin(a + b*x) + B*sin(a + b*x)**S(2) + C)*ActivateTrig(u)/sin(a + b*x), x))
rubi.add(rule22)
pattern23 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)))
rule23 = ReplacementRule(pattern23, lambda u, A, C, a, b, B, x : Int((A*cos(a + b*x) + B*cos(a + b*x)**S(2) + C)*ActivateTrig(u)/cos(a + b*x), x))
rubi.add(rule23)
pattern24 = Pattern(Integral(u_*(WC('A', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)) + WC('B', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**n1_ + WC('C', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**n2_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n, n1: ZeroQ(-n + n1 + S(-1))), CustomConstraint(lambda n2, n: ZeroQ(-n + n2 + S(-2))))
rule24 = ReplacementRule(pattern24, lambda u, A, n1, a, C, n, b, n2, B, x : Int((A + B*sin(a + b*x) + C*sin(a + b*x)**S(2))*ActivateTrig(u)*sin(a + b*x)**n, x))
rubi.add(rule24)
pattern25 = Pattern(Integral(u_*(WC('A', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)) + WC('B', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**n1_ + WC('C', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**n2_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n, n1: ZeroQ(-n + n1 + S(-1))), CustomConstraint(lambda n2, n: ZeroQ(-n + n2 + S(-2))))
rule25 = ReplacementRule(pattern25, lambda u, A, n1, a, C, n, b, n2, B, x : Int((A + B*cos(a + b*x) + C*cos(a + b*x)**S(2))*ActivateTrig(u)*cos(a + b*x)**n, x))
rubi.add(rule25)
pattern26 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule26 = ReplacementRule(pattern26, lambda m, u, a, d, n, b, x, c : (c*cot(a + b*x))**m*(d*tan(a + b*x))**m*Int((d*tan(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule26)
pattern27 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule27 = ReplacementRule(pattern27, lambda m, u, a, d, n, b, x, c : (c*tan(a + b*x))**m*(d*sin(a + b*x))**(-m)*(d*cos(a + b*x))**m*Int((d*sin(a + b*x))**m*(d*cos(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule27)
pattern28 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule28 = ReplacementRule(pattern28, lambda m, u, a, b, x, c : (c*tan(a + b*x))**m*(c*cot(a + b*x))**m*Int((c*tan(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule28)
pattern29 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule29 = ReplacementRule(pattern29, lambda m, u, a, b, x, c : (c*tan(a + b*x))**m*(c*cot(a + b*x))**m*Int((c*cot(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule29)
pattern30 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('B', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule30 = ReplacementRule(pattern30, lambda u, A, a, n, b, B, x, c : c*Int((c*tan(a + b*x))**(n + S(-1))*(A*tan(a + b*x) + B)*ActivateTrig(u), x))
rubi.add(rule30)
pattern31 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('B', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule31 = ReplacementRule(pattern31, lambda u, A, a, n, b, B, x, c : c*Int((c*cot(a + b*x))**(n + S(-1))*(A*cot(a + b*x) + B)*ActivateTrig(u), x))
rubi.add(rule31)
pattern32 = Pattern(Integral(u_*(A_ + WC('B', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule32 = ReplacementRule(pattern32, lambda u, A, a, b, B, x : Int((A*tan(a + b*x) + B)*ActivateTrig(u)/tan(a + b*x), x))
rubi.add(rule32)
pattern33 = Pattern(Integral(u_*(A_ + WC('B', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule33 = ReplacementRule(pattern33, lambda u, A, b, a, B, x : Int((A*cot(a + b*x) + B)*ActivateTrig(u)/cot(a + b*x), x))
rubi.add(rule33)
pattern34 = Pattern(Integral((WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('A', S(0)) + WC('B', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule34 = ReplacementRule(pattern34, lambda u, A, C, a, n, b, B, x, c : c**S(2)*Int((c*tan(a + b*x))**(n + S(-2))*(A*tan(a + b*x)**S(2) + B*tan(a + b*x) + C)*ActivateTrig(u), x))
rubi.add(rule34)
pattern35 = Pattern(Integral((WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('A', S(0)) + WC('B', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule35 = ReplacementRule(pattern35, lambda u, A, a, C, n, b, B, x, c : c**S(2)*Int((c*cot(a + b*x))**(n + S(-2))*(A*cot(a + b*x)**S(2) + B*cot(a + b*x) + C)*ActivateTrig(u), x))
rubi.add(rule35)
pattern36 = Pattern(Integral((WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('C', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule36 = ReplacementRule(pattern36, lambda u, A, C, a, n, b, x, c : c**S(2)*Int((c*tan(a + b*x))**(n + S(-2))*(A*tan(a + b*x)**S(2) + C)*ActivateTrig(u), x))
rubi.add(rule36)
pattern37 = Pattern(Integral((WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('C', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule37 = ReplacementRule(pattern37, lambda u, A, a, C, n, b, x, c : c**S(2)*Int((c*cot(a + b*x))**(n + S(-2))*(A*cot(a + b*x)**S(2) + C)*ActivateTrig(u), x))
rubi.add(rule37)
pattern38 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule38 = ReplacementRule(pattern38, lambda u, A, C, a, b, B, x : Int((A*tan(a + b*x)**S(2) + B*tan(a + b*x) + C)*ActivateTrig(u)/tan(a + b*x)**S(2), x))
rubi.add(rule38)
pattern39 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule39 = ReplacementRule(pattern39, lambda u, A, b, C, a, B, x : Int((A*cot(a + b*x)**S(2) + B*cot(a + b*x) + C)*ActivateTrig(u)/cot(a + b*x)**S(2), x))
rubi.add(rule39)
pattern40 = Pattern(Integral(u_*(A_ + WC('C', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownTangentIntegrandQ(u, x)))
rule40 = ReplacementRule(pattern40, lambda u, A, C, a, b, x : Int((A*tan(a + b*x)**S(2) + C)*ActivateTrig(u)/tan(a + b*x)**S(2), x))
rubi.add(rule40)
pattern41 = Pattern(Integral(u_*(A_ + WC('C', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownCotangentIntegrandQ(u, x)))
rule41 = ReplacementRule(pattern41, lambda u, A, a, C, b, x : Int((A*cot(a + b*x)**S(2) + C)*ActivateTrig(u)/cot(a + b*x)**S(2), x))
rubi.add(rule41)
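    # Rule 42: the mixed combination A + B*tan(a + b*x) + C*cot(a + b*x) is cleared of cot by rewriting it as (A*tan + B*tan**2 + C)/tan.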
pattern42 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)))
rule42 = ReplacementRule(pattern42, lambda u, A, C, a, b, B, x : Int((A*tan(a + b*x) + B*tan(a + b*x)**S(2) + C)*ActivateTrig(u)/tan(a + b*x), x))
rubi.add(rule42)
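    # Rules 43-44: sums of three consecutive powers of tan or cot (exponents n, n + 1, n + 2, enforced by the n1/n2 constraints) are factored as f**n*(A + B*f + C*f**2).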
pattern43 = Pattern(Integral(u_*(WC('A', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)) + WC('B', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**n1_ + WC('C', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**n2_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n, n1: ZeroQ(-n + n1 + S(-1))), CustomConstraint(lambda n2, n: ZeroQ(-n + n2 + S(-2))))
rule43 = ReplacementRule(pattern43, lambda u, A, n1, a, C, n, b, n2, B, x : Int((A + B*tan(a + b*x) + C*tan(a + b*x)**S(2))*ActivateTrig(u)*tan(a + b*x)**n, x))
rubi.add(rule43)
pattern44 = Pattern(Integral(u_*(WC('A', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)) + WC('B', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**n1_ + WC('C', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**n2_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n, n1: ZeroQ(-n + n1 + S(-1))), CustomConstraint(lambda n2, n: ZeroQ(-n + n2 + S(-2))))
rule44 = ReplacementRule(pattern44, lambda u, A, n1, a, C, n, b, n2, B, x : Int((A + B*cot(a + b*x) + C*cot(a + b*x)**S(2))*ActivateTrig(u)*cot(a + b*x)**n, x))
rubi.add(rule44)
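    # Rules 45-50: for known secant integrands, mixed products such as (c*sin)**m*(d*csc)**n, (c*cos)**m*(d*sec)**n and the tan/cot variants are normalized to powers of sec and csc only; the leftover ratio is a constant and is pulled outside Int.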
pattern45 = Pattern(Integral(u_*(WC('c', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule45 = ReplacementRule(pattern45, lambda m, u, a, d, n, b, x, c : (c*sin(a + b*x))**m*(d*csc(a + b*x))**m*Int((d*csc(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule45)
pattern46 = Pattern(Integral(u_*(WC('c', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule46 = ReplacementRule(pattern46, lambda m, u, a, d, n, b, x, c : (c*cos(a + b*x))**m*(d*sec(a + b*x))**m*Int((d*sec(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule46)
pattern47 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule47 = ReplacementRule(pattern47, lambda m, u, a, d, n, b, x, c : (c*tan(a + b*x))**m*(d*csc(a + b*x))**m*(d*sec(a + b*x))**(-m)*Int((d*csc(a + b*x))**(-m)*(d*sec(a + b*x))**(m + n)*ActivateTrig(u), x))
rubi.add(rule47)
pattern48 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule48 = ReplacementRule(pattern48, lambda m, u, a, d, n, b, x, c : (c*tan(a + b*x))**m*(d*csc(a + b*x))**m*(d*sec(a + b*x))**(-m)*Int((d*csc(a + b*x))**(-m + n)*(d*sec(a + b*x))**m*ActivateTrig(u), x))
rubi.add(rule48)
pattern49 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule49 = ReplacementRule(pattern49, lambda m, u, a, d, n, b, x, c : (c*cot(a + b*x))**m*(d*csc(a + b*x))**(-m)*(d*sec(a + b*x))**m*Int((d*csc(a + b*x))**m*(d*sec(a + b*x))**(-m + n)*ActivateTrig(u), x))
rubi.add(rule49)
pattern50 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('d', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)), CustomConstraint(lambda m: Not(IntegerQ(m))))
rule50 = ReplacementRule(pattern50, lambda m, u, a, d, n, b, x, c : (c*cot(a + b*x))**m*(d*csc(a + b*x))**(-m)*(d*sec(a + b*x))**m*Int((d*csc(a + b*x))**(m + n)*(d*sec(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule50)
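    # Rules 51-54: a single non-integer power of sin, cos, tan or cot in a known secant integrand is likewise rewritten through (c*csc)**m and (c*sec)**m.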
pattern51 = Pattern(Integral(u_*(WC('c', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule51 = ReplacementRule(pattern51, lambda m, u, a, b, x, c : (c*sin(a + b*x))**m*(c*csc(a + b*x))**m*Int((c*csc(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule51)
pattern52 = Pattern(Integral(u_*(WC('c', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule52 = ReplacementRule(pattern52, lambda m, u, a, b, x, c : (c*cos(a + b*x))**m*(c*sec(a + b*x))**m*Int((c*sec(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule52)
pattern53 = Pattern(Integral(u_*(WC('c', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule53 = ReplacementRule(pattern53, lambda m, u, a, b, x, c : (c*tan(a + b*x))**m*(c*csc(a + b*x))**m*(c*sec(a + b*x))**(-m)*Int((c*csc(a + b*x))**(-m)*(c*sec(a + b*x))**m*ActivateTrig(u), x))
rubi.add(rule53)
pattern54 = Pattern(Integral(u_*(WC('c', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: Not(IntegerQ(m))), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule54 = ReplacementRule(pattern54, lambda m, u, a, b, x, c : (c*cot(a + b*x))**m*(c*csc(a + b*x))**(-m)*(c*sec(a + b*x))**m*Int((c*csc(a + b*x))**m*(c*sec(a + b*x))**(-m)*ActivateTrig(u), x))
rubi.add(rule54)
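    # Rules 55-58: a linear factor A + B*cos(a + b*x) (resp. A + B*sin(a + b*x)) is absorbed into an adjacent sec (resp. csc) power as (A*sec + B)/sec, lowering the exponent by one.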
pattern55 = Pattern(Integral(u_*(WC('c', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('B', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule55 = ReplacementRule(pattern55, lambda u, A, a, n, b, B, x, c : c*Int((c*sec(a + b*x))**(n + S(-1))*(A*sec(a + b*x) + B)*ActivateTrig(u), x))
rubi.add(rule55)
pattern56 = Pattern(Integral(u_*(WC('c', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('B', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule56 = ReplacementRule(pattern56, lambda u, A, a, n, b, B, x, c : c*Int((c*csc(a + b*x))**(n + S(-1))*(A*csc(a + b*x) + B)*ActivateTrig(u), x))
rubi.add(rule56)
pattern57 = Pattern(Integral(u_*(A_ + WC('B', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule57 = ReplacementRule(pattern57, lambda u, A, a, b, B, x : Int((A*sec(a + b*x) + B)*ActivateTrig(u)/sec(a + b*x), x))
rubi.add(rule57)
pattern58 = Pattern(Integral(u_*(A_ + WC('B', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule58 = ReplacementRule(pattern58, lambda u, A, b, a, B, x : Int((A*csc(a + b*x) + B)*ActivateTrig(u)/csc(a + b*x), x))
rubi.add(rule58)
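    # Rules 59-66: the quadratic factors A + B*cos + C*cos**2 and A + B*sin + C*sin**2 (and their B == 0 cases) are absorbed the same way, lowering the sec/csc exponent by two, or dividing by sec**2/csc**2 when no explicit power is present.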
pattern59 = Pattern(Integral((WC('c', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('A', S(0)) + WC('B', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule59 = ReplacementRule(pattern59, lambda u, A, C, a, n, b, B, x, c : c**S(2)*Int((c*sec(a + b*x))**(n + S(-2))*(A*sec(a + b*x)**S(2) + B*sec(a + b*x) + C)*ActivateTrig(u), x))
rubi.add(rule59)
pattern60 = Pattern(Integral((WC('c', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('A', S(0)) + WC('B', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule60 = ReplacementRule(pattern60, lambda u, A, a, C, n, b, B, x, c : c**S(2)*Int((c*csc(a + b*x))**(n + S(-2))*(A*csc(a + b*x)**S(2) + B*csc(a + b*x) + C)*ActivateTrig(u), x))
rubi.add(rule60)
pattern61 = Pattern(Integral((WC('c', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('C', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule61 = ReplacementRule(pattern61, lambda u, A, C, a, n, b, x, c : c**S(2)*Int((c*sec(a + b*x))**(n + S(-2))*(A*sec(a + b*x)**S(2) + C)*ActivateTrig(u), x))
rubi.add(rule61)
pattern62 = Pattern(Integral((WC('c', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(A_ + WC('C', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**S(2))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule62 = ReplacementRule(pattern62, lambda u, A, a, C, n, b, x, c : c**S(2)*Int((c*csc(a + b*x))**(n + S(-2))*(A*csc(a + b*x)**S(2) + C)*ActivateTrig(u), x))
rubi.add(rule62)
pattern63 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule63 = ReplacementRule(pattern63, lambda u, A, C, a, b, B, x : Int((A*sec(a + b*x)**S(2) + B*sec(a + b*x) + C)*ActivateTrig(u)/sec(a + b*x)**S(2), x))
rubi.add(rule63)
pattern64 = Pattern(Integral(u_*(WC('A', S(0)) + WC('B', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))) + WC('C', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule64 = ReplacementRule(pattern64, lambda u, A, b, C, a, B, x : Int((A*csc(a + b*x)**S(2) + B*csc(a + b*x) + C)*ActivateTrig(u)/csc(a + b*x)**S(2), x))
rubi.add(rule64)
pattern65 = Pattern(Integral(u_*(A_ + WC('C', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule65 = ReplacementRule(pattern65, lambda u, A, C, a, b, x : Int((A*sec(a + b*x)**S(2) + C)*ActivateTrig(u)/sec(a + b*x)**S(2), x))
rubi.add(rule65)
pattern66 = Pattern(Integral(u_*(A_ + WC('C', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda u, x: KnownSecantIntegrandQ(u, x)))
rule66 = ReplacementRule(pattern66, lambda u, A, a, C, b, x : Int((A*csc(a + b*x)**S(2) + C)*ActivateTrig(u)/csc(a + b*x)**S(2), x))
rubi.add(rule66)
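    # Rules 67-68: sums of three consecutive powers of sec or csc (exponents n, n + 1, n + 2) are factored as f**n*(A + B*f + C*f**2).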
pattern67 = Pattern(Integral(u_*(WC('A', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)) + WC('B', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**n1_ + WC('C', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**n2_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n, n1: ZeroQ(-n + n1 + S(-1))), CustomConstraint(lambda n2, n: ZeroQ(-n + n2 + S(-2))))
rule67 = ReplacementRule(pattern67, lambda u, A, n1, a, C, n, b, n2, B, x : Int((A + B*sec(a + b*x) + C*sec(a + b*x)**S(2))*ActivateTrig(u)*sec(a + b*x)**n, x))
rubi.add(rule67)
pattern68 = Pattern(Integral(u_*(WC('A', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)) + WC('B', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**n1_ + WC('C', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**n2_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda C, x: FreeQ(C, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n, n1: ZeroQ(-n + n1 + S(-1))), CustomConstraint(lambda n2, n: ZeroQ(-n + n2 + S(-2))))
rule68 = ReplacementRule(pattern68, lambda u, A, n1, a, C, n, b, n2, B, x : Int((A + B*csc(a + b*x) + C*csc(a + b*x)**S(2))*ActivateTrig(u)*csc(a + b*x)**n, x))
rubi.add(rule68)
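    # Rules 69-71: products of sin/cos with different linear arguments (b**2 != d**2) integrate directly via the product-to-sum identities.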
pattern69 = Pattern(Integral(sin(x_*WC('b', S(1)) + WC('a', S(0)))*sin(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: NonzeroQ(b**S(2) - d**S(2))))
rule69 = ReplacementRule(pattern69, lambda a, d, b, x, c : -sin(a + c + x*(b + d))/(S(2)*b + S(2)*d) + sin(a - c + x*(b - d))/(S(2)*b - S(2)*d))
rubi.add(rule69)
pattern70 = Pattern(Integral(cos(x_*WC('b', S(1)) + WC('a', S(0)))*cos(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: NonzeroQ(b**S(2) - d**S(2))))
rule70 = ReplacementRule(pattern70, lambda a, d, b, x, c : sin(a + c + x*(b + d))/(S(2)*b + S(2)*d) + sin(a - c + x*(b - d))/(S(2)*b - S(2)*d))
rubi.add(rule70)
pattern71 = Pattern(Integral(sin(x_*WC('b', S(1)) + WC('a', S(0)))*cos(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: NonzeroQ(b**S(2) - d**S(2))))
rule71 = ReplacementRule(pattern71, lambda a, d, b, x, c : -cos(a + c + x*(b + d))/(S(2)*b + S(2)*d) - cos(a - c + x*(b - d))/(S(2)*b - S(2)*d))
rubi.add(rule71)
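    # Rules 72 onward: double-angle rules.  The constraints ZeroQ(-a*d + b*c) and ZeroQ(-2 + d/b) force c + d*x == 2*(a + b*x), so sin(c + d*x) == 2*sin(a + b*x)*cos(a + b*x) is used to reduce (g*sin(c + d*x))**p against powers of sin(a + b*x) and cos(a + b*x).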
pattern72 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*cos(x_*WC('b', S(1)) + WC('a', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule72 = ReplacementRule(pattern72, lambda a, d, g, p, b, x, c : Int((g*sin(c + d*x))**p, x)/S(2) + Int((g*sin(c + d*x))**p*cos(c + d*x), x)/S(2))
rubi.add(rule72)
pattern73 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*sin(x_*WC('b', S(1)) + WC('a', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule73 = ReplacementRule(pattern73, lambda a, d, g, p, b, x, c : Int((g*sin(c + d*x))**p, x)/S(2) - Int((g*sin(c + d*x))**p*cos(c + d*x), x)/S(2))
rubi.add(rule73)
pattern74 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: IntegerQ(p)))
rule74 = ReplacementRule(pattern74, lambda m, a, d, x, p, b, e, c : S(2)**p*e**(-p)*Int((e*cos(a + b*x))**(m + p)*sin(a + b*x)**p, x))
rubi.add(rule74)
pattern75 = Pattern(Integral((WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: IntegerQ(p)))
rule75 = ReplacementRule(pattern75, lambda a, d, n, f, p, b, x, c : S(2)**p*f**(-p)*Int((f*sin(a + b*x))**(n + p)*cos(a + b*x)**p, x))
rubi.add(rule75)
pattern76 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: ZeroQ(m + p + S(-1))))
rule76 = ReplacementRule(pattern76, lambda m, a, d, x, g, p, b, e, c : e**S(2)*(e*cos(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(p + S(1))))
rubi.add(rule76)
pattern77 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: ZeroQ(m + p + S(-1))))
rule77 = ReplacementRule(pattern77, lambda m, a, d, x, g, p, b, e, c : -e**S(2)*(e*sin(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(p + S(1))))
rubi.add(rule77)
pattern78 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: ZeroQ(m + S(2)*p + S(2))))
rule78 = ReplacementRule(pattern78, lambda m, a, d, x, g, p, b, e, c : -(e*cos(a + b*x))**m*(g*sin(c + d*x))**(p + S(1))/(b*g*m))
rubi.add(rule78)
pattern79 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: ZeroQ(m + S(2)*p + S(2))))
rule79 = ReplacementRule(pattern79, lambda m, a, d, x, g, p, b, e, c : (e*sin(a + b*x))**m*(g*sin(c + d*x))**(p + S(1))/(b*g*m))
rubi.add(rule79)
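    # Rules 80-87: recurrences for (e*cos(a + b*x))**m or (e*sin(a + b*x))**m times (g*sin(c + d*x))**p, stepping m down by 2 or 4 (or up by 2 when m < -1) and adjusting p accordingly.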
pattern80 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(2))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, p: Equal(p, S(-3)/2) | Greater(m, S(3))), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule80 = ReplacementRule(pattern80, lambda m, a, d, x, g, p, b, e, c : e**S(4)*(m + p + S(-1))*Int((e*cos(a + b*x))**(m + S(-4))*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(p + S(1))) + e**S(2)*(e*cos(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(p + S(1))))
rubi.add(rule80)
pattern81 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(2))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, p: Equal(p, S(-3)/2) | Greater(m, S(3))), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule81 = ReplacementRule(pattern81, lambda m, a, d, x, g, p, b, e, c : e**S(4)*(m + p + S(-1))*Int((e*sin(a + b*x))**(m + S(-4))*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(p + S(1))) - e**S(2)*(e*sin(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(p + S(1))))
rubi.add(rule81)
pattern82 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, p: NonzeroQ(m + S(2)*p + S(2))), CustomConstraint(lambda m, p: Equal(m, S(2)) | Less(p, S(-2))), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule82 = ReplacementRule(pattern82, lambda m, a, d, x, g, p, b, e, c : e**S(2)*(m + S(2)*p + S(2))*Int((e*cos(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(p + S(1))) + (e*cos(a + b*x))**m*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(p + S(1))))
rubi.add(rule82)
pattern83 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, p: NonzeroQ(m + S(2)*p + S(2))), CustomConstraint(lambda m, p: Equal(m, S(2)) | Less(p, S(-2))), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule83 = ReplacementRule(pattern83, lambda m, a, d, x, g, p, b, e, c : e**S(2)*(m + S(2)*p + S(2))*Int((e*sin(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(p + S(1))) - (e*sin(a + b*x))**m*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(p + S(1))))
rubi.add(rule83)
pattern84 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda m, p: NonzeroQ(m + S(2)*p)), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule84 = ReplacementRule(pattern84, lambda m, a, d, x, g, p, b, e, c : e**S(2)*(m + p + S(-1))*Int((e*cos(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**p, x)/(m + S(2)*p) + e**S(2)*(e*cos(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(m + S(2)*p)))
rubi.add(rule84)
pattern85 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda m, p: NonzeroQ(m + S(2)*p)), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule85 = ReplacementRule(pattern85, lambda m, a, d, x, g, p, b, e, c : e**S(2)*(m + p + S(-1))*Int((e*sin(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**p, x)/(m + S(2)*p) - e**S(2)*(e*sin(a + b*x))**(m + S(-2))*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(m + S(2)*p)))
rubi.add(rule85)
pattern86 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda m, p: NonzeroQ(m + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule86 = ReplacementRule(pattern86, lambda m, a, d, x, g, p, b, e, c : (m + S(2)*p + S(2))*Int((e*cos(a + b*x))**(m + S(2))*(g*sin(c + d*x))**p, x)/(e**S(2)*(m + p + S(1))) - (e*cos(a + b*x))**m*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(m + p + S(1))))
rubi.add(rule86)
pattern87 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda m, p: NonzeroQ(m + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))), CustomConstraint(lambda m, p: IntegersQ(S(2)*m, S(2)*p)))
rule87 = ReplacementRule(pattern87, lambda m, a, d, x, g, p, b, e, c : (m + S(2)*p + S(2))*Int((e*sin(a + b*x))**(m + S(2))*(g*sin(c + d*x))**p, x)/(e**S(2)*(m + p + S(1))) + (e*sin(a + b*x))**m*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(m + p + S(1))))
rubi.add(rule87)
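    # Rules 88-91: (g*sin(c + d*x))**p times a single cos(a + b*x) or sin(a + b*x); the recurrences step p toward zero (down by 1 for p > 0, up by 1 for p < -1), exchanging the cos and sin factors.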
pattern88 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*cos(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))), CustomConstraint(lambda p: IntegerQ(S(2)*p)))
rule88 = ReplacementRule(pattern88, lambda a, d, g, p, b, x, c : S(2)*g*p*Int((g*sin(c + d*x))**(p + S(-1))*sin(a + b*x), x)/(S(2)*p + S(1)) + S(2)*(g*sin(c + d*x))**p*sin(a + b*x)/(d*(S(2)*p + S(1))))
rubi.add(rule88)
pattern89 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*sin(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))), CustomConstraint(lambda p: IntegerQ(S(2)*p)))
rule89 = ReplacementRule(pattern89, lambda a, d, g, p, b, x, c : S(2)*g*p*Int((g*sin(c + d*x))**(p + S(-1))*cos(a + b*x), x)/(S(2)*p + S(1)) - S(2)*(g*sin(c + d*x))**p*cos(a + b*x)/(d*(S(2)*p + S(1))))
rubi.add(rule89)
pattern90 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*cos(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda p: IntegerQ(S(2)*p)))
rule90 = ReplacementRule(pattern90, lambda a, d, g, p, b, x, c : (S(2)*p + S(3))*Int((g*sin(c + d*x))**(p + S(1))*sin(a + b*x), x)/(S(2)*g*(p + S(1))) + (g*sin(c + d*x))**(p + S(1))*cos(a + b*x)/(S(2)*b*g*(p + S(1))))
rubi.add(rule90)
pattern91 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*sin(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda p: IntegerQ(S(2)*p)))
rule91 = ReplacementRule(pattern91, lambda a, d, g, p, b, x, c : (S(2)*p + S(3))*Int((g*sin(c + d*x))**(p + S(1))*cos(a + b*x), x)/(S(2)*g*(p + S(1))) - (g*sin(c + d*x))**(p + S(1))*sin(a + b*x)/(S(2)*b*g*(p + S(1))))
rubi.add(rule91)
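    # Rules 92-93: closed forms for cos(a + b*x)/sqrt(sin(c + d*x)) and sin(a + b*x)/sqrt(sin(c + d*x)) in terms of log and asin.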
pattern92 = Pattern(Integral(cos(x_*WC('b', S(1)) + WC('a', S(0)))/sqrt(sin(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)))
rule92 = ReplacementRule(pattern92, lambda a, d, b, x, c : log(sin(a + b*x) + sqrt(sin(c + d*x)) + cos(a + b*x))/d + asin(sin(a + b*x) - cos(a + b*x))/d)
rubi.add(rule92)
pattern93 = Pattern(Integral(sin(x_*WC('b', S(1)) + WC('a', S(0)))/sqrt(sin(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)))
rule93 = ReplacementRule(pattern93, lambda a, d, b, x, c : -log(sin(a + b*x) + sqrt(sin(c + d*x)) + cos(a + b*x))/d + asin(sin(a + b*x) - cos(a + b*x))/d)
rubi.add(rule93)
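    # Rules 94-95: a cos(a + b*x) or sin(a + b*x) in the denominator cancels against one factor of sin(c + d*x) == 2*sin(a + b*x)*cos(a + b*x).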
pattern94 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_/cos(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda p: IntegerQ(S(2)*p)))
rule94 = ReplacementRule(pattern94, lambda a, d, g, p, b, x, c : S(2)*g*Int((g*sin(c + d*x))**(p + S(-1))*sin(a + b*x), x))
rubi.add(rule94)
pattern95 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_/sin(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda p: IntegerQ(S(2)*p)))
rule95 = ReplacementRule(pattern95, lambda a, d, g, p, b, x, c : S(2)*g*Int((g*sin(c + d*x))**(p + S(-1))*cos(a + b*x), x))
rubi.add(rule95)
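    # Rules 96-97: for general non-integer p, (g*sin(c + d*x))**p is traded for sin(a + b*x)**p*cos(a + b*x)**p; the remaining ratio is constant and is pulled outside Int.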
pattern96 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))))
rule96 = ReplacementRule(pattern96, lambda m, a, d, x, g, p, b, e, c : (e*cos(a + b*x))**(-p)*(g*sin(c + d*x))**p*Int((e*cos(a + b*x))**(m + p)*sin(a + b*x)**p, x)*sin(a + b*x)**(-p))
rubi.add(rule96)
pattern97 = Pattern(Integral((WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))))
rule97 = ReplacementRule(pattern97, lambda p, a, d, n, g, f, b, x, c : (f*sin(a + b*x))**(-p)*(g*sin(c + d*x))**p*Int((f*sin(a + b*x))**(n + p)*cos(a + b*x)**p, x)*cos(a + b*x)**(-p))
rubi.add(rule97)
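    # Rules 98 onward: the same double-angle reduction for three-factor products (e*cos(a + b*x))**m*(f*sin(a + b*x))**n*(g*sin(c + d*x))**p, starting with the sin**2*cos**2 special case and followed by recurrences in m, n and p.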
pattern98 = Pattern(Integral((WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_*sin(x_*WC('b', S(1)) + WC('a', S(0)))**S(2)*cos(x_*WC('b', S(1)) + WC('a', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule98 = ReplacementRule(pattern98, lambda a, d, g, p, b, x, c : Int((g*sin(c + d*x))**p, x)/S(4) - Int((g*sin(c + d*x))**p*cos(c + d*x)**S(2), x)/S(4))
rubi.add(rule98)
pattern99 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: IntegerQ(p)))
rule99 = ReplacementRule(pattern99, lambda m, a, d, n, x, f, p, b, e, c : S(2)**p*e**(-p)*f**(-p)*Int((e*cos(a + b*x))**(m + p)*(f*sin(a + b*x))**(n + p), x))
rubi.add(rule99)
pattern100 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: ZeroQ(m + p + S(1))))
rule100 = ReplacementRule(pattern100, lambda m, p, a, d, n, x, f, g, b, e, c : e*(e*cos(a + b*x))**(m + S(-1))*(f*sin(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*f*(n + p + S(1))))
rubi.add(rule100)
pattern101 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**n_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: ZeroQ(m + p + S(1))))
rule101 = ReplacementRule(pattern101, lambda m, p, a, d, n, x, g, f, b, e, c : -e*(e*sin(a + b*x))**(m + S(-1))*(f*cos(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*f*(n + p + S(1))))
rubi.add(rule101)
pattern102 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n, p: ZeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))))
rule102 = ReplacementRule(pattern102, lambda m, p, a, d, n, x, f, g, b, e, c : -(e*cos(a + b*x))**(m + S(1))*(f*sin(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*e*f*(m + p + S(1))))
rubi.add(rule102)
pattern103 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**n_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(3))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda n, p: NonzeroQ(n + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule103 = ReplacementRule(pattern103, lambda m, p, a, d, n, x, g, f, b, e, c : e**S(4)*(m + p + S(-1))*Int((e*cos(a + b*x))**(m + S(-4))*(f*sin(a + b*x))**n*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(n + p + S(1))) + e**S(2)*(e*cos(a + b*x))**(m + S(-2))*(f*sin(a + b*x))**n*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(n + p + S(1))))
rubi.add(rule103)
pattern104 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**n_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(3))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda n, p: NonzeroQ(n + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule104 = ReplacementRule(pattern104, lambda m, p, a, d, n, x, g, f, b, e, c : e**S(4)*(m + p + S(-1))*Int((e*sin(a + b*x))**(m + S(-4))*(f*cos(a + b*x))**n*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(n + p + S(1))) - e**S(2)*(e*sin(a + b*x))**(m + S(-2))*(f*cos(a + b*x))**n*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(n + p + S(1))))
rubi.add(rule104)
pattern105 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda n, p: NonzeroQ(n + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)), CustomConstraint(lambda m, p: Equal(m, S(2)) | Equal(m, S(3)) | Less(p, S(-2))))
rule105 = ReplacementRule(pattern105, lambda m, p, a, d, n, x, f, g, b, e, c : e**S(2)*(m + n + S(2)*p + S(2))*Int((e*cos(a + b*x))**(m + S(-2))*(f*sin(a + b*x))**n*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(n + p + S(1))) + (e*cos(a + b*x))**m*(f*sin(a + b*x))**n*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(n + p + S(1))))
rubi.add(rule105)
pattern106 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, p: RationalQ(m, p)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda n, p: NonzeroQ(n + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)), CustomConstraint(lambda m, p: Equal(m, S(2)) | Equal(m, S(3)) | Less(p, S(-2))))
rule106 = ReplacementRule(pattern106, lambda m, p, a, d, n, x, f, g, b, e, c : e**S(2)*(m + n + S(2)*p + S(2))*Int((e*sin(a + b*x))**(m + S(-2))*(f*cos(a + b*x))**n*(g*sin(c + d*x))**(p + S(2)), x)/(S(4)*g**S(2)*(n + p + S(1))) - (e*sin(a + b*x))**m*(f*cos(a + b*x))**n*(g*sin(c + d*x))**(p + S(1))/(S(2)*b*g*(n + p + S(1))))
rubi.add(rule106)
pattern107 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**n_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n: RationalQ(m, n)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda n: Less(n, S(-1))), CustomConstraint(lambda n, p: NonzeroQ(n + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule107 = ReplacementRule(pattern107, lambda m, p, a, d, n, x, g, f, b, e, c : e**S(2)*(m + p + S(-1))*Int((e*cos(a + b*x))**(m + S(-2))*(f*sin(a + b*x))**(n + S(2))*(g*sin(c + d*x))**p, x)/(f**S(2)*(n + p + S(1))) + e*(e*cos(a + b*x))**(m + S(-1))*(f*sin(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*f*(n + p + S(1))))
rubi.add(rule107)
pattern108 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**n_*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n: RationalQ(m, n)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda n: Less(n, S(-1))), CustomConstraint(lambda n, p: NonzeroQ(n + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule108 = ReplacementRule(pattern108, lambda m, p, a, d, n, x, g, f, b, e, c : e**S(2)*(m + p + S(-1))*Int((e*sin(a + b*x))**(m + S(-2))*(f*cos(a + b*x))**(n + S(2))*(g*sin(c + d*x))**p, x)/(f**S(2)*(n + p + S(1))) - e*(e*sin(a + b*x))**(m + S(-1))*(f*cos(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*f*(n + p + S(1))))
rubi.add(rule108)
pattern109 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p)), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule109 = ReplacementRule(pattern109, lambda m, p, a, d, n, x, f, g, b, e, c : e**S(2)*(m + p + S(-1))*Int((e*cos(a + b*x))**(m + S(-2))*(f*sin(a + b*x))**n*(g*sin(c + d*x))**p, x)/(m + n + S(2)*p) + e*(e*cos(a + b*x))**(m + S(-1))*(f*sin(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*f*(m + n + S(2)*p)))
rubi.add(rule109)
pattern110 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p)), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule110 = ReplacementRule(pattern110, lambda m, p, a, d, n, x, f, g, b, e, c : e**S(2)*(m + p + S(-1))*Int((e*sin(a + b*x))**(m + S(-2))*(f*cos(a + b*x))**n*(g*sin(c + d*x))**p, x)/(m + n + S(2)*p) - e*(e*sin(a + b*x))**(m + S(-1))*(f*cos(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*f*(m + n + S(2)*p)))
rubi.add(rule110)
pattern111 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n, p: RationalQ(m, n, p)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda p: Greater(p, S(0))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p)), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule111 = ReplacementRule(pattern111, lambda m, p, a, d, n, x, f, g, b, e, c : S(2)*f*g*(n + p + S(-1))*Int((e*cos(a + b*x))**(m + S(1))*(f*sin(a + b*x))**(n + S(-1))*(g*sin(c + d*x))**(p + S(-1)), x)/(e*(m + n + S(2)*p)) - f*(e*cos(a + b*x))**(m + S(1))*(f*sin(a + b*x))**(n + S(-1))*(g*sin(c + d*x))**p/(b*e*(m + n + S(2)*p)))
rubi.add(rule111)
pattern112 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n, p: RationalQ(m, n, p)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda p: Greater(p, S(0))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p)), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule112 = ReplacementRule(pattern112, lambda m, p, a, d, n, x, f, g, b, e, c : S(2)*f*g*(n + p + S(-1))*Int((e*sin(a + b*x))**(m + S(1))*(f*cos(a + b*x))**(n + S(-1))*(g*sin(c + d*x))**(p + S(-1)), x)/(e*(m + n + S(2)*p)) + f*(e*sin(a + b*x))**(m + S(1))*(f*cos(a + b*x))**(n + S(-1))*(g*sin(c + d*x))**p/(b*e*(m + n + S(2)*p)))
rubi.add(rule112)
pattern113 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n, p: RationalQ(m, n, p)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule113 = ReplacementRule(pattern113, lambda m, p, a, d, n, x, f, g, b, e, c : f*(m + n + S(2)*p + S(2))*Int((e*cos(a + b*x))**(m + S(1))*(f*sin(a + b*x))**(n + S(-1))*(g*sin(c + d*x))**(p + S(1)), x)/(S(2)*e*g*(m + p + S(1))) - (e*cos(a + b*x))**(m + S(1))*(f*sin(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*e*f*(m + p + S(1))))
rubi.add(rule113)
pattern114 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m, n, p: RationalQ(m, n, p)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule114 = ReplacementRule(pattern114, lambda m, p, a, d, n, x, f, g, b, e, c : f*(m + n + S(2)*p + S(2))*Int((e*sin(a + b*x))**(m + S(1))*(f*cos(a + b*x))**(n + S(-1))*(g*sin(c + d*x))**(p + S(1)), x)/(S(2)*e*g*(m + p + S(1))) + (e*sin(a + b*x))**(m + S(1))*(f*cos(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*e*f*(m + p + S(1))))
rubi.add(rule114)
pattern115 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule115 = ReplacementRule(pattern115, lambda m, p, a, d, n, x, f, g, b, e, c : (m + n + S(2)*p + S(2))*Int((e*cos(a + b*x))**(m + S(2))*(f*sin(a + b*x))**n*(g*sin(c + d*x))**p, x)/(e**S(2)*(m + p + S(1))) - (e*cos(a + b*x))**(m + S(1))*(f*sin(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*e*f*(m + p + S(1))))
rubi.add(rule115)
pattern116 = Pattern(Integral((WC('e', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**m_*(WC('f', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Less(m, S(-1))), CustomConstraint(lambda m, n, p: NonzeroQ(m + n + S(2)*p + S(2))), CustomConstraint(lambda m, p: NonzeroQ(m + p + S(1))), CustomConstraint(lambda m, n, p: IntegersQ(S(2)*m, S(2)*n, S(2)*p)))
rule116 = ReplacementRule(pattern116, lambda m, p, a, d, n, x, f, g, b, e, c : (m + n + S(2)*p + S(2))*Int((e*sin(a + b*x))**(m + S(2))*(f*cos(a + b*x))**n*(g*sin(c + d*x))**p, x)/(e**S(2)*(m + p + S(1))) + (e*sin(a + b*x))**(m + S(1))*(f*cos(a + b*x))**(n + S(1))*(g*sin(c + d*x))**p/(b*e*f*(m + p + S(1))))
rubi.add(rule116)
pattern117 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*(WC('f', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0))))**WC('n', S(1))*(WC('g', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: ZeroQ(S(-2) + d/b)), CustomConstraint(lambda p: Not(IntegerQ(p))))
rule117 = ReplacementRule(pattern117, lambda m, p, a, d, n, x, f, g, b, e, c : (e*cos(a + b*x))**(-p)*(f*sin(a + b*x))**(-p)*(g*sin(c + d*x))**p*Int((e*cos(a + b*x))**(m + p)*(f*sin(a + b*x))**(n + p), x))
rubi.add(rule117)
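# Rule 118: integrates (e*cos(a + b*x))**m * sin(c + d*x) when b*c == a*d and d/b == Abs(m + 2).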
pattern118 = Pattern(Integral((WC('e', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b, m: ZeroQ(-Abs(m + S(2)) + d/b)))
rule118 = ReplacementRule(pattern118, lambda m, a, d, x, b, e, c : (e*cos(a + b*x))**(m + S(1))*(-m + S(-2))*cos((a + b*x)*(m + S(1)))/(d*e*(m + S(1))))
rubi.add(rule118)
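# Rules 119-122: integer powers and reciprocals of a + b*F(c + d*x)**n for an inert trig
# function F, handled by expansion (ExpandTrig) or, for even n, by splitting the reciprocal
# into a sum of terms whose denominators are quadratic in F(c + d*x).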
pattern119 = Pattern(Integral((F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)) + a_)**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda p: PositiveIntegerQ(p)))
rule119 = ReplacementRule(pattern119, lambda F, a, d, n, p, b, x, c : Int((a + b*F(c + d*x)**n)**p, x))
rubi.add(rule119)
pattern120 = Pattern(Integral(1/(F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)) + a_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: EvenQ(n)), CustomConstraint(lambda n: Greater(n, S(2))))
rule120 = ReplacementRule(pattern120, lambda F, a, d, n, b, x, c : Dist(S(2)/(a*n), Sum(Int(1/(S(1) - (S(-1))**(-S(4)*k/n)*F(c + d*x)**S(2)/Rt(-a/b, n/S(2))), x), List(k, S(1), n/S(2))), x))
rubi.add(rule120)
pattern121 = Pattern(Integral(1/(F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)) + a_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda n: Greater(n, S(2))))
rule121 = ReplacementRule(pattern121, lambda F, a, d, n, b, x, c : Int(ExpandTrig(1/(a + b*F(c + d*x)**n), x), x))
rubi.add(rule121)
pattern122 = Pattern(Integral(G_**(x_*WC('d', S(1)) + WC('c', S(0)))/(F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)) + a_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda G, F: InertTrigQ(F, G)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda n: Greater(n, S(2))))
rule122 = ReplacementRule(pattern122, lambda m, G, F, a, d, n, b, x, c : Int(ExpandTrig(G(c + d*x)**m, 1/(a + b*F(c + d*x)**n), x), x))
rubi.add(rule122)
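# Rules 123-124: normalize (a*F(c + d*x)**p)**n with non-integer n by separating the
# integer and fractional parts of the exponent before integrating.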
pattern123 = Pattern(Integral((F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('a', S(1)))**n_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: Not(IntegerQ(n))), CustomConstraint(lambda p: IntegerQ(p)), )
def With123(F, a, d, n, p, x, c):
v = ActivateTrig(F(c + d*x))
return a**IntPart(n)*(a*v**p)**FracPart(n)*(v/NonfreeFactors(v, x))**(p*IntPart(n))*Int(NonfreeFactors(v, x)**(n*p), x)*NonfreeFactors(v, x)**(-p*FracPart(n))
rule123 = ReplacementRule(pattern123, lambda F, a, d, n, p, x, c : With123(F, a, d, n, p, x, c))
rubi.add(rule123)
pattern124 = Pattern(Integral(((F_*(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)))**p_*WC('a', S(1)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: Not(IntegerQ(n))), CustomConstraint(lambda p: Not(IntegerQ(p))), )
def With124(F, a, d, n, p, b, x, c):
v = ActivateTrig(F(c + d*x))
return a**IntPart(n)*(a*(b*v)**p)**FracPart(n)*(b*v)**(-p*FracPart(n))*Int((b*v)**(n*p), x)
rule124 = ReplacementRule(pattern124, lambda F, a, d, n, p, b, x, c : With124(F, a, d, n, p, b, x, c))
rubi.add(rule124)
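# Rules 125-132: derivative-based substitutions.  When the remaining factor u is a function
# of sin, cos, tan or cot of the same linear argument (FunctionOfQ with its final argument
# True), substitute t = sin(...)/d, cos(...)/d, tan(...)/d or cot(...)/d respectively.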
pattern125 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, cos) | SameQ(F, Cos)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x, True)))
def With125(u, F, a, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return d*Subst(Int(SubstFor(S(1), sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule125 = ReplacementRule(pattern125, lambda u, F, a, b, x, c : With125(u, F, a, b, x, c))
rubi.add(rule125)
pattern126 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, sin) | SameQ(F, Sin)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x, True)))
def With126(u, F, a, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -d*Subst(Int(SubstFor(S(1), cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule126 = ReplacementRule(pattern126, lambda u, F, a, b, x, c : With126(u, F, a, b, x, c))
rubi.add(rule126)
pattern127 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, cot) | SameQ(F, Cot)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x, True)))
def With127(u, F, a, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return Subst(Int(SubstFor(1/x, sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule127 = ReplacementRule(pattern127, lambda u, F, a, b, x, c : With127(u, F, a, b, x, c))
rubi.add(rule127)
pattern128 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, tan) | SameQ(F, Tan)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x, True)))
def With128(u, F, a, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -Subst(Int(SubstFor(1/x, cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule128 = ReplacementRule(pattern128, lambda u, F, a, b, x, c : With128(u, F, a, b, x, c))
rubi.add(rule128)
pattern129 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, sec) | SameQ(F, Sec)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(tan(c*(a + b*x))/d, u, x, True)))
def With129(u, F, a, b, x, c):
d = FreeFactors(tan(c*(a + b*x)), x)
return d*Subst(Int(SubstFor(S(1), tan(c*(a + b*x))/d, u, x), x), x, tan(c*(a + b*x))/d)/(b*c)
rule129 = ReplacementRule(pattern129, lambda u, F, a, b, x, c : With129(u, F, a, b, x, c))
rubi.add(rule129)
pattern130 = Pattern(Integral(u_/cos((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(tan(c*(a + b*x))/d, u, x, True)))
def With130(u, a, b, x, c):
d = FreeFactors(tan(c*(a + b*x)), x)
return d*Subst(Int(SubstFor(S(1), tan(c*(a + b*x))/d, u, x), x), x, tan(c*(a + b*x))/d)/(b*c)
rule130 = ReplacementRule(pattern130, lambda u, a, b, x, c : With130(u, a, b, x, c))
rubi.add(rule130)
pattern131 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, csc) | SameQ(F, Csc)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(cot(c*(a + b*x))/d, u, x, True)))
def With131(u, F, a, b, x, c):
d = FreeFactors(cot(c*(a + b*x)), x)
return -d*Subst(Int(SubstFor(S(1), cot(c*(a + b*x))/d, u, x), x), x, cot(c*(a + b*x))/d)/(b*c)
rule131 = ReplacementRule(pattern131, lambda u, F, a, b, x, c : With131(u, F, a, b, x, c))
rubi.add(rule131)
pattern132 = Pattern(Integral(u_/sin((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(cot(c*(a + b*x))/d, u, x, True)))
def With132(u, a, b, x, c):
d = FreeFactors(cot(c*(a + b*x)), x)
return -d*Subst(Int(SubstFor(S(1), cot(c*(a + b*x))/d, u, x), x), x, cot(c*(a + b*x))/d)/(b*c)
rule132 = ReplacementRule(pattern132, lambda u, a, b, x, c : With132(u, a, b, x, c))
rubi.add(rule132)
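# Rules 133-136: pure tangent/cotangent substitutions guarded by TryPureTanSubst
# (rules 135 and 136 are currently not registered).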
pattern133 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda F: SameQ(F, cot) | SameQ(F, Cot)), CustomConstraint(lambda x, u, d, a, b, n, c: TryPureTanSubst(ActivateTrig(u)*cot(c*(a + b*x))**n, x) & FunctionOfQ(tan(c*(a + b*x))/d, u, x, True)))
def With133(u, F, a, n, b, x, c):
d = FreeFactors(tan(c*(a + b*x)), x)
return d**(-n + S(1))*Subst(Int(SubstFor(x**(-n)/(d**S(2)*x**S(2) + S(1)), tan(c*(a + b*x))/d, u, x), x), x, tan(c*(a + b*x))/d)/(b*c)
rule133 = ReplacementRule(pattern133, lambda u, F, a, n, b, x, c : With133(u, F, a, n, b, x, c))
rubi.add(rule133)
pattern134 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda F: SameQ(F, tan) | SameQ(F, Tan)), CustomConstraint(lambda x, u, d, a, b, n, c: TryPureTanSubst(ActivateTrig(u)*tan(c*(a + b*x))**n, x) & FunctionOfQ(cot(c*(a + b*x))/d, u, x, True)))
def With134(u, F, a, n, b, x, c):
d = FreeFactors(cot(c*(a + b*x)), x)
return -d**(-n + S(1))*Subst(Int(SubstFor(x**(-n)/(d**S(2)*x**S(2) + S(1)), cot(c*(a + b*x))/d, u, x), x), x, cot(c*(a + b*x))/d)/(b*c)
rule134 = ReplacementRule(pattern134, lambda u, F, a, n, b, x, c : With134(u, F, a, n, b, x, c))
rubi.add(rule134)
pattern135 = Pattern(Integral(u_, x_), CustomConstraint(lambda x, u, d, v: Not(FalseQ(v)) & TryPureTanSubst(ActivateTrig(u), x) & FunctionOfQ(NonfreeFactors(cot(v), x), u, x, True)))
def With135(u, x):
v = FunctionOfTrig(u, x)
d = FreeFactors(cot(v), x)
return Dist(-d/Coefficient(v, x, S(1)), Subst(Int(SubstFor(1/(d**S(2)*x**S(2) + S(1)), cot(v)/d, u, x), x), x, cot(v)/d), x)
rule135 = ReplacementRule(pattern135, lambda u, x : With135(u, x))
#rubi.add(rule135)
pattern136 = Pattern(Integral(u_, x_), CustomConstraint(lambda x, u, d, v: Not(FalseQ(v)) and TryPureTanSubst(ActivateTrig(u), x) and FunctionOfQ(NonfreeFactors(tan(v), x), u, x, True)))
def With136(u, x):
v = FunctionOfTrig(u, x)
d = FreeFactors(tan(v), x)
return Dist(d/Coefficient(v, x, S(1)), Subst(Int(SubstFor(1/(d**S(2)*x**S(2) + S(1)), tan(v)/d, u, x), x), x, tan(v)/d), x)
rule136 = ReplacementRule(pattern136, lambda u, x : With136(u, x))
#rubi.add(rule136)
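# Rules 137-138: products of sines and cosines with different linear arguments are
# expanded with ExpandTrigReduce.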
pattern137 = Pattern(Integral(F_**(x_*WC('b', S(1)) + WC('a', S(0)))*G_**(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda F: SameQ(F, cos) | SameQ(F, sin)), CustomConstraint(lambda G: SameQ(G, cos) | SameQ(G, sin)), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)))
rule137 = ReplacementRule(pattern137, lambda G, q, F, a, d, p, b, x, c : Int(ExpandTrigReduce(ActivateTrig(F(a + b*x)**p*G(c + d*x)**q), x), x))
rubi.add(rule137)
pattern138 = Pattern(Integral(F_**(x_*WC('b', S(1)) + WC('a', S(0)))*G_**(x_*WC('d', S(1)) + WC('c', S(0)))*H_**(x_*WC('f', S(1)) + WC('e', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda F: SameQ(F, cos) | SameQ(F, sin)), CustomConstraint(lambda G: SameQ(G, cos) | SameQ(G, sin)), CustomConstraint(lambda H: SameQ(H, cos) | SameQ(H, sin)), CustomConstraint(lambda r, q, p: PositiveIntegerQ(p, q, r)))
rule138 = ReplacementRule(pattern138, lambda G, q, r, F, H, a, d, x, f, p, b, e, c : Int(ExpandTrigReduce(ActivateTrig(F(a + b*x)**p*G(c + d*x)**q*H(e + f*x)**r), x), x))
rubi.add(rule138)
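# Rules 139-150: the same sine/cosine substitutions as rules 125-132 but with the plain
# FunctionOfQ test, odd-power sec/csc/tan/cot variants, and sums d*F(...)**n + v that are
# integrated termwise.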
pattern139 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, cos) | SameQ(F, Cos)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x)))
def With139(u, F, a, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return d*Subst(Int(SubstFor(S(1), sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule139 = ReplacementRule(pattern139, lambda u, F, a, b, x, c : With139(u, F, a, b, x, c))
rubi.add(rule139)
pattern140 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, sin) | SameQ(F, Sin)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x)))
def With140(u, F, a, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -d*Subst(Int(SubstFor(S(1), cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule140 = ReplacementRule(pattern140, lambda u, F, a, b, x, c : With140(u, F, a, b, x, c))
rubi.add(rule140)
pattern141 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, cot) | SameQ(F, Cot)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x)))
def With141(u, F, a, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return Subst(Int(SubstFor(1/x, sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule141 = ReplacementRule(pattern141, lambda u, F, a, b, x, c : With141(u, F, a, b, x, c))
rubi.add(rule141)
pattern142 = Pattern(Integral(F_*u_*(x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda F: SameQ(F, tan) | SameQ(F, Tan)), CustomConstraint(lambda x, u, d, a, b, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x)))
def With142(u, F, a, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -Subst(Int(SubstFor(1/x, cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule142 = ReplacementRule(pattern142, lambda u, F, a, b, x, c : With142(u, F, a, b, x, c))
rubi.add(rule142)
pattern143 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, cos) | SameQ(F, Cos)), CustomConstraint(lambda x, u, d, a, b, n, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x)))
def With143(u, F, a, n, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return d*Subst(Int(SubstFor((-d**S(2)*x**S(2) + S(1))**(n/S(2) + S(-1)/2), sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule143 = ReplacementRule(pattern143, lambda u, F, a, n, b, x, c : With143(u, F, a, n, b, x, c))
rubi.add(rule143)
pattern144 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, sec) | SameQ(F, Sec)), CustomConstraint(lambda x, u, d, a, b, n, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x)))
def With144(u, F, a, n, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return d*Subst(Int(SubstFor((-d**S(2)*x**S(2) + S(1))**(-n/S(2) + S(-1)/2), sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule144 = ReplacementRule(pattern144, lambda u, F, a, n, b, x, c : With144(u, F, a, n, b, x, c))
rubi.add(rule144)
pattern145 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, sin) | SameQ(F, Sin)), CustomConstraint(lambda x, u, d, a, b, n, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x)))
def With145(u, F, a, n, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -d*Subst(Int(SubstFor((-d**S(2)*x**S(2) + S(1))**(n/S(2) + S(-1)/2), cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule145 = ReplacementRule(pattern145, lambda u, F, a, n, b, x, c : With145(u, F, a, n, b, x, c))
rubi.add(rule145)
pattern146 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, csc) | SameQ(F, Csc)), CustomConstraint(lambda x, u, d, a, b, n, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x)))
def With146(u, F, a, n, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -d*Subst(Int(SubstFor((-d**S(2)*x**S(2) + S(1))**(-n/S(2) + S(-1)/2), cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule146 = ReplacementRule(pattern146, lambda u, F, a, n, b, x, c : With146(u, F, a, n, b, x, c))
rubi.add(rule146)
pattern147 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, cot) | SameQ(F, Cot)), CustomConstraint(lambda x, u, d, a, b, n, c: FunctionOfQ(sin(c*(a + b*x))/d, u, x)))
def With147(u, F, a, n, b, x, c):
d = FreeFactors(sin(c*(a + b*x)), x)
return d**(-n + S(1))*Subst(Int(SubstFor(x**(-n)*(-d**S(2)*x**S(2) + S(1))**(n/S(2) + S(-1)/2), sin(c*(a + b*x))/d, u, x), x), x, sin(c*(a + b*x))/d)/(b*c)
rule147 = ReplacementRule(pattern147, lambda u, F, a, n, b, x, c : With147(u, F, a, n, b, x, c))
rubi.add(rule147)
pattern148 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*u_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, tan) | SameQ(F, Tan)), CustomConstraint(lambda x, u, d, a, b, n, c: FunctionOfQ(cos(c*(a + b*x))/d, u, x)))
def With148(u, F, a, n, b, x, c):
d = FreeFactors(cos(c*(a + b*x)), x)
return -d**(-n + S(1))*Subst(Int(SubstFor(x**(-n)*(-d**S(2)*x**S(2) + S(1))**(n/S(2) + S(-1)/2), cos(c*(a + b*x))/d, u, x), x), x, cos(c*(a + b*x))/d)/(b*c)
rule148 = ReplacementRule(pattern148, lambda u, F, a, n, b, x, c : With148(u, F, a, n, b, x, c))
rubi.add(rule148)
pattern149 = Pattern(Integral(u_*(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*WC('d', S(1)) + v_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda v, x: NFreeQ(v, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, cos) | SameQ(F, Cos)), CustomConstraint(lambda x, u, d, v, b, n, c, e, a: FunctionOfQ(sin(c*(a + b*x))/e, u, x)))
def With149(u, F, v, a, d, n, b, x, c):
e = FreeFactors(sin(c*(a + b*x)), x)
return d*Int(ActivateTrig(u)*cos(c*(a + b*x))**n, x) + Int(ActivateTrig(u*v), x)
rule149 = ReplacementRule(pattern149, lambda u, F, v, a, d, n, b, x, c : With149(u, F, v, a, d, n, b, x, c))
rubi.add(rule149)
pattern150 = Pattern(Integral(u_*(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*WC('d', S(1)) + v_), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda v, x: NFreeQ(v, x)), CustomConstraint(lambda n: OddQ(n)), CustomConstraint(lambda u: NonsumQ(u)), CustomConstraint(lambda F: SameQ(F, sin) | SameQ(F, Sin)), CustomConstraint(lambda x, u, d, v, b, n, c, e, a: FunctionOfQ(cos(c*(a + b*x))/e, u, x)))
def With150(u, F, v, a, d, n, b, x, c):
e = FreeFactors(cos(c*(a + b*x)), x)
return d*Int(ActivateTrig(u)*sin(c*(a + b*x))**n, x) + Int(ActivateTrig(u*v), x)
rule150 = ReplacementRule(pattern150, lambda u, F, v, a, d, n, b, x, c : With150(u, F, v, a, d, n, b, x, c))
rubi.add(rule150)
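# Rules 151-152: substitute sin(v) or cos(v) after dividing out the matching derivative
# factor (currently not registered).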
pattern151 = Pattern(Integral(u_, x_), CustomConstraint(lambda x, u, d, v: Not(FalseQ(v)) & FunctionOfQ(NonfreeFactors(sin(v), x), u/cos(v), x)))
def With151(u, x):
v = FunctionOfTrig(u, x)
d = FreeFactors(sin(v), x)
return Dist(d/Coefficient(v, x, S(1)), Subst(Int(SubstFor(S(1), sin(v)/d, u/cos(v), x), x), x, sin(v)/d), x)
rule151 = ReplacementRule(pattern151, lambda u, x : With151(u, x))
#rubi.add(rule151)
pattern152 = Pattern(Integral(u_, x_), CustomConstraint(lambda x, u, d, v: Not(FalseQ(v)) & FunctionOfQ(NonfreeFactors(cos(v), x), u/sin(v), x)))
def With152(u, x):
v = FunctionOfTrig(u, x)
d = FreeFactors(cos(v), x)
return Dist(-d/Coefficient(v, x, S(1)), Subst(Int(SubstFor(S(1), cos(v)/d, u/sin(v), x), x), x, cos(v)/d), x)
rule152 = ReplacementRule(pattern152, lambda u, x : With152(u, x))
#rubi.add(rule152)
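# Rules 153-155: Pythagorean collapses -- a + b*cos**2 + c*sin**2 with b == c,
# a + b*tan**2 + c*sec**2 with b == -c, and a + b*cot**2 + c*csc**2 with b == -c
# all reduce to the constant factor (a + c)**p.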
pattern153 = Pattern(Integral((WC('a', S(0)) + WC('b', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('c', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**S(2))**WC('p', S(1))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, c: ZeroQ(b - c)))
rule153 = ReplacementRule(pattern153, lambda u, a, d, x, p, b, e, c : (a + c)**p*Int(ActivateTrig(u), x))
rubi.add(rule153)
pattern154 = Pattern(Integral((WC('a', S(0)) + WC('b', S(1))*tan(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('c', S(1))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**S(2))**WC('p', S(1))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, c: ZeroQ(b + c)))
rule154 = ReplacementRule(pattern154, lambda u, a, d, x, p, b, e, c : (a + c)**p*Int(ActivateTrig(u), x))
rubi.add(rule154)
pattern155 = Pattern(Integral((WC('a', S(0)) + WC('b', S(1))*cot(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('c', S(1))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**S(2))**WC('p', S(1))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, c: ZeroQ(b + c)))
rule155 = ReplacementRule(pattern155, lambda u, a, d, x, p, b, e, c : (a + c)**p*Int(ActivateTrig(u), x))
rubi.add(rule155)
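# Rules 156-159: if u is proportional to the derivative of y (q = DerivativeDivides(...)),
# integrate u/y to q*log(y) and u*y**m to q*y**(m+1)/(m+1).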
pattern156 = Pattern(Integral(u_/y_, x_), CustomConstraint(lambda u: Not(InertTrigFreeQ(u))), CustomConstraint(lambda q, x, y: Not(FalseQ(q))))
def With156(y, u, x):
q = DerivativeDivides(ActivateTrig(y), ActivateTrig(u), x)
return q*log(RemoveContent(ActivateTrig(y), x))
rule156 = ReplacementRule(pattern156, lambda y, u, x : With156(y, u, x))
rubi.add(rule156)
pattern157 = Pattern(Integral(u_/(w_*y_), x_), CustomConstraint(lambda u: Not(InertTrigFreeQ(u))), CustomConstraint(lambda x, q, w, y: Not(FalseQ(q))))
def With157(y, u, x, w):
q = DerivativeDivides(ActivateTrig(w*y), ActivateTrig(u), x)
return q*log(RemoveContent(ActivateTrig(w*y), x))
rule157 = ReplacementRule(pattern157, lambda y, u, x, w : With157(y, u, x, w))
rubi.add(rule157)
pattern158 = Pattern(Integral(u_*y_**WC('m', S(1)), x_), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda m: NonzeroQ(m + S(1))), CustomConstraint(lambda u: Not(InertTrigFreeQ(u))), CustomConstraint(lambda q, m, y: Not(FalseQ(q))))
def With158(y, m, u, x):
q = DerivativeDivides(ActivateTrig(y), ActivateTrig(u), x)
return q*ActivateTrig(y**(m + S(1)))/(m + S(1))
rule158 = ReplacementRule(pattern158, lambda y, m, u, x : With158(y, m, u, x))
rubi.add(rule158)
pattern159 = Pattern(Integral(u_*y_**WC('m', S(1))*z_**WC('n', S(1)), x_), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: NonzeroQ(m + S(1))), CustomConstraint(lambda u: Not(InertTrigFreeQ(u))), CustomConstraint(lambda q, z, m, y: Not(FalseQ(q))))
def With159(m, u, z, y, n, x):
q = DerivativeDivides(ActivateTrig(y*z), ActivateTrig(u*z**(-m + n)), x)
return q*ActivateTrig(y**(m + S(1))*z**(m + S(1)))/(m + S(1))
rule159 = ReplacementRule(pattern159, lambda m, u, z, y, n, x : With159(m, u, z, y, n, x))
rubi.add(rule159)
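# Rules 160-161: the exponent normalization of rules 123-124 applied with an extra factor u.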
pattern160 = Pattern(Integral((F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('a', S(1)))**n_*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: Not(IntegerQ(n))), CustomConstraint(lambda p: IntegerQ(p)), )
def With160(u, F, a, d, n, p, x, c):
v = ActivateTrig(F(c + d*x))
return a**IntPart(n)*(a*v**p)**FracPart(n)*(v/NonfreeFactors(v, x))**(p*IntPart(n))*Int(ActivateTrig(u)*NonfreeFactors(v, x)**(n*p), x)*NonfreeFactors(v, x)**(-p*FracPart(n))
rule160 = ReplacementRule(pattern160, lambda u, F, a, d, n, p, x, c : With160(u, F, a, d, n, p, x, c))
rubi.add(rule160)
pattern161 = Pattern(Integral(((F_*(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)))**p_*WC('a', S(1)))**WC('n', S(1))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: Not(IntegerQ(n))), CustomConstraint(lambda p: Not(IntegerQ(p))), )
def With161(u, F, a, d, n, p, b, x, c):
v = ActivateTrig(F(c + d*x))
return a**IntPart(n)*(a*(b*v)**p)**FracPart(n)*(b*v)**(-p*FracPart(n))*Int((b*v)**(n*p)*ActivateTrig(u), x)
rule161 = ReplacementRule(pattern161, lambda u, F, a, d, n, p, b, x, c : With161(u, F, a, d, n, p, b, x, c))
rubi.add(rule161)
pattern162 = Pattern(Integral(u_, x_), CustomConstraint(lambda u, x: InverseFunctionFreeQ(u, x)), CustomConstraint(lambda x, u, d, v: Not(FalseQ(v)) & FunctionOfQ(NonfreeFactors(tan(v), x), u, x)))
def With162(u, x):
v = FunctionOfTrig(u, x)
d = FreeFactors(tan(v), x)
return Dist(d/Coefficient(v, x, S(1)), Subst(Int(SubstFor(1/(d**S(2)*x**S(2) + S(1)), tan(v)/d, u, x), x), x, tan(v)/d), x)
rule162 = ReplacementRule(pattern162, lambda u, x : With162(u, x))
#rubi.add(rule162)
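# Rules 163-167: rewrite sums of inert trig powers, e.g. a*tan**n + b*sec**n, over a common
# power of sec or csc (or of F itself) before integrating.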
pattern163 = Pattern(Integral((WC('a', S(1))*tan(x_*WC('d', S(1)) + WC('c', S(0)))**WC('n', S(1)) + WC('b', S(1))*sec(x_*WC('d', S(1)) + WC('c', S(0)))**WC('n', S(1)))**p_*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, p: IntegersQ(n, p)))
rule163 = ReplacementRule(pattern163, lambda u, a, d, n, p, b, x, c : Int((a*sin(c + d*x)**n + b)**p*ActivateTrig(u)*sec(c + d*x)**(n*p), x))
rubi.add(rule163)
pattern164 = Pattern(Integral((WC('a', S(1))*cot(x_*WC('d', S(1)) + WC('c', S(0)))**WC('n', S(1)) + WC('b', S(1))*csc(x_*WC('d', S(1)) + WC('c', S(0)))**WC('n', S(1)))**p_*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, p: IntegersQ(n, p)))
rule164 = ReplacementRule(pattern164, lambda u, a, d, n, p, b, x, c : Int((a*cos(c + d*x)**n + b)**p*ActivateTrig(u)*csc(c + d*x)**(n*p), x))
rubi.add(rule164)
pattern165 = Pattern(Integral(u_*(F_**(x_*WC('d', S(1)) + WC('c', S(0)))*a_ + F_**(x_*WC('d', S(1)) + WC('c', S(0)))*WC('b', S(1)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda q, x: FreeQ(q, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda q, p: PosQ(-p + q)))
rule165 = ReplacementRule(pattern165, lambda u, q, F, a, d, n, p, b, x, c : Int(ActivateTrig(u*(a + b*F(c + d*x)**(-p + q))**n*F(c + d*x)**(n*p)), x))
rubi.add(rule165)
pattern166 = Pattern(Integral(u_*(F_**(x_*WC('e', S(1)) + WC('d', S(0)))*a_ + F_**(x_*WC('e', S(1)) + WC('d', S(0)))*WC('b', S(1)) + F_**(x_*WC('e', S(1)) + WC('d', S(0)))*WC('c', S(1)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda q, x: FreeQ(q, x)), CustomConstraint(lambda r, x: FreeQ(r, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda q, p: PosQ(-p + q)), CustomConstraint(lambda r, p: PosQ(-p + r)))
rule166 = ReplacementRule(pattern166, lambda u, q, r, F, a, d, n, x, p, b, e, c : Int(ActivateTrig(u*(a + b*F(d + e*x)**(-p + q) + c*F(d + e*x)**(-p + r))**n*F(d + e*x)**(n*p)), x))
rubi.add(rule166)
pattern167 = Pattern(Integral(u_*(F_**(x_*WC('e', S(1)) + WC('d', S(0)))*WC('b', S(1)) + F_**(x_*WC('e', S(1)) + WC('d', S(0)))*WC('c', S(1)) + a_)**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda q, x: FreeQ(q, x)), CustomConstraint(lambda F: InertTrigQ(F)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda p: NegQ(p)))
rule167 = ReplacementRule(pattern167, lambda u, q, F, a, d, n, x, p, b, e, c : Int(ActivateTrig(u*(a*F(d + e*x)**(-p) + b + c*F(d + e*x)**(-p + q))**n*F(d + e*x)**(n*p)), x))
rubi.add(rule167)
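# Rule 168: a*cos(c + d*x) + b*sin(c + d*x) with a**2 + b**2 == 0 is rewritten as the
# exponential a*exp(-a*(c + d*x)/b).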
pattern168 = Pattern(Integral((WC('a', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0))) + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, a: ZeroQ(a**S(2) + b**S(2))))
rule168 = ReplacementRule(pattern168, lambda u, a, d, n, b, x, c : Int((a*exp(-a*(c + d*x)/b))**n*ActivateTrig(u), x))
rubi.add(rule168)
pattern169 = Pattern(Integral(u_, x_), CustomConstraint(lambda u: TrigSimplifyQ(u)))
rule169 = ReplacementRule(pattern169, lambda u, x : Int(TrigSimplify(u), x))
rubi.add(rule169)
pattern170 = Pattern(Integral((a_*v_)**p_*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda v: Not(InertTrigFreeQ(v))), )
def With170(u, v, a, p, x):
uu = ActivateTrig(u)
vv = ActivateTrig(v)
return a**IntPart(p)*vv**(-FracPart(p))*(a*vv)**FracPart(p)*Int(uu*vv**p, x)
rule170 = ReplacementRule(pattern170, lambda u, v, a, p, x : With170(u, v, a, p, x))
rubi.add(rule170)
pattern171 = Pattern(Integral((v_**m_)**p_*WC('u', S(1)), x_), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda v: Not(InertTrigFreeQ(v))), )
def With171(m, u, v, p, x):
uu = ActivateTrig(u)
vv = ActivateTrig(v)
return vv**(-m*FracPart(p))*(vv**m)**FracPart(p)*Int(uu*vv**(m*p), x)
rule171 = ReplacementRule(pattern171, lambda m, u, v, p, x : With171(m, u, v, p, x))
rubi.add(rule171)
pattern172 = Pattern(Integral((v_**WC('m', S(1))*w_**WC('n', S(1)))**p_*WC('u', S(1)), x_), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda p: Not(IntegerQ(p))), CustomConstraint(lambda v, w: Not(InertTrigFreeQ(v)) | Not(InertTrigFreeQ(w))), )
def With172(m, u, v, n, p, x, w):
uu = ActivateTrig(u)
vv = ActivateTrig(v)
ww = ActivateTrig(w)
return vv**(-m*FracPart(p))*ww**(-n*FracPart(p))*(vv**m*ww**n)**FracPart(p)*Int(uu*vv**(m*p)*ww**(n*p), x)
rule172 = ReplacementRule(pattern172, lambda m, u, v, n, p, x, w : With172(m, u, v, n, p, x, w))
rubi.add(rule172)
pattern173 = Pattern(Integral(u_, x_), CustomConstraint(lambda u: Not(InertTrigFreeQ(u))), CustomConstraint(lambda v, x: SumQ(v)))
def With173(u, x):
v = ExpandTrig(u, x)
return Int(v, x)
rule173 = ReplacementRule(pattern173, lambda u, x : With173(u, x))
rubi.add(rule173)
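# Rule 174: fallback half-angle tangent substitution t = tan(v/2) for inverse-function-free
# integrands whose trig argument v is found by FunctionOfTrig.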
pattern174 = Pattern(Integral(u_, x_), CustomConstraint(lambda u, x: InverseFunctionFreeQ(u, x)), CustomConstraint(lambda u, x: Not(FalseQ(FunctionOfTrig(u, x)))), )
def With174(u, x):
w = Block(List(Set(ShowSteps, False), Set(StepCounter, Null)), Int(SubstFor(1/(x**S(2)*FreeFactors(tan(FunctionOfTrig(u, x)/S(2)), x)**S(2) + S(1)), tan(FunctionOfTrig(u, x)/S(2))/FreeFactors(tan(FunctionOfTrig(u, x)/S(2)), x), u, x), x))
return Module(List(Set(v, FunctionOfTrig(u, x)), d), CompoundExpression(Set(d, FreeFactors(tan(v/S(2)), x)), Dist(S(2)*d/Coefficient(v, x, S(1)), Subst(Int(SubstFor(1/(d**S(2)*x**S(2) + S(1)), tan(v/S(2))/d, u, x), x), x, tan(v/S(2))/d), x)))
rule174 = ReplacementRule(pattern174, lambda u, x : With174(u, x))
rubi.add(rule174)
pattern175 = Pattern(Integral(u_, x_), CustomConstraint(lambda u: Not(InertTrigFreeQ(u))), )
def With175(u, x):
v = ActivateTrig(u)
return Int(v, x)
rule175 = ReplacementRule(pattern175, lambda u, x : With175(u, x))
rubi.add(rule175)
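# Rules 176 onward: integrands of the form (c + d*x)**m times trig powers, handled by
# integration by parts (IntHide) or by ExpandTrigReduce.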
pattern176 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule176 = ReplacementRule(pattern176, lambda m, a, d, n, b, x, c : -d*m*Int((c + d*x)**(m + S(-1))*sin(a + b*x)**(n + S(1)), x)/(b*(n + S(1))) + (c + d*x)**m*sin(a + b*x)**(n + S(1))/(b*(n + S(1))))
rubi.add(rule176)
pattern177 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule177 = ReplacementRule(pattern177, lambda m, a, d, n, b, x, c : d*m*Int((c + d*x)**(m + S(-1))*cos(a + b*x)**(n + S(1)), x)/(b*(n + S(1))) - (c + d*x)**m*cos(a + b*x)**(n + S(1))/(b*(n + S(1))))
rubi.add(rule177)
pattern178 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, p: PositiveIntegerQ(n, p)))
rule178 = ReplacementRule(pattern178, lambda m, a, d, n, p, b, x, c : Int(ExpandTrigReduce((c + d*x)**m, sin(a + b*x)**n*cos(a + b*x)**p, x), x))
rubi.add(rule178)
pattern179 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, p: PositiveIntegerQ(n, p)))
rule179 = ReplacementRule(pattern179, lambda m, a, d, n, p, b, x, c : -Int((c + d*x)**m*sin(a + b*x)**n*tan(a + b*x)**(p + S(-2)), x) + Int((c + d*x)**m*sin(a + b*x)**(n + S(-2))*tan(a + b*x)**p, x))
rubi.add(rule179)
pattern180 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, p: PositiveIntegerQ(n, p)))
rule180 = ReplacementRule(pattern180, lambda m, a, d, n, p, b, x, c : -Int((c + d*x)**m*cos(a + b*x)**n*cot(a + b*x)**(p + S(-2)), x) + Int((c + d*x)**m*cos(a + b*x)**(n + S(-2))*cot(a + b*x)**p, x))
rubi.add(rule180)
pattern181 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: SameQ(p, S(1))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(0))))
rule181 = ReplacementRule(pattern181, lambda m, a, d, n, p, b, x, c : -d*m*Int((c + d*x)**(m + S(-1))*sec(a + b*x)**n, x)/(b*n) + (c + d*x)**m*sec(a + b*x)**n/(b*n))
rubi.add(rule181)
pattern182 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: SameQ(p, S(1))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(0))))
rule182 = ReplacementRule(pattern182, lambda m, a, d, n, p, b, x, c : d*m*Int((c + d*x)**(m + S(-1))*csc(a + b*x)**n, x)/(b*n) - (c + d*x)**m*csc(a + b*x)**n/(b*n))
rubi.add(rule182)
pattern183 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule183 = ReplacementRule(pattern183, lambda m, a, d, n, b, x, c : -d*m*Int((c + d*x)**(m + S(-1))*tan(a + b*x)**(n + S(1)), x)/(b*(n + S(1))) + (c + d*x)**m*tan(a + b*x)**(n + S(1))/(b*(n + S(1))))
rubi.add(rule183)
pattern184 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule184 = ReplacementRule(pattern184, lambda m, a, d, n, b, x, c : d*m*Int((c + d*x)**(m + S(-1))*cot(a + b*x)**(n + S(1)), x)/(b*(n + S(1))) - (c + d*x)**m*cot(a + b*x)**(n + S(1))/(b*(n + S(1))))
rubi.add(rule184)
pattern185 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**p_*sec(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule185 = ReplacementRule(pattern185, lambda m, a, d, p, b, x, c : -Int((c + d*x)**m*tan(a + b*x)**(p + S(-2))*sec(a + b*x), x) + Int((c + d*x)**m*tan(a + b*x)**(p + S(-2))*sec(a + b*x)**S(3), x))
rubi.add(rule185)
pattern186 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**p_*sec(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule186 = ReplacementRule(pattern186, lambda m, a, d, n, p, b, x, c : -Int((c + d*x)**m*tan(a + b*x)**(p + S(-2))*sec(a + b*x)**n, x) + Int((c + d*x)**m*tan(a + b*x)**(p + S(-2))*sec(a + b*x)**(n + S(2)), x))
rubi.add(rule186)
pattern187 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**p_*csc(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule187 = ReplacementRule(pattern187, lambda m, a, d, p, b, x, c : -Int((c + d*x)**m*cot(a + b*x)**(p + S(-2))*csc(a + b*x), x) + Int((c + d*x)**m*cot(a + b*x)**(p + S(-2))*csc(a + b*x)**S(3), x))
rubi.add(rule187)
pattern188 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**p_*csc(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: PositiveIntegerQ(p/S(2))))
rule188 = ReplacementRule(pattern188, lambda m, a, d, n, p, b, x, c : -Int((c + d*x)**m*cot(a + b*x)**(p + S(-2))*csc(a + b*x)**n, x) + Int((c + d*x)**m*cot(a + b*x)**(p + S(-2))*csc(a + b*x)**(n + S(2)), x))
rubi.add(rule188)
pattern189 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*tan(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n, p: EvenQ(n) | OddQ(p)), )
def With189(m, a, d, n, p, b, x, c):
    u = IntHide(tan(a + b*x)**p*sec(a + b*x)**n, x)
    return -d*m*Int(u*(c + d*x)**(m + S(-1)), x) + Dist((c + d*x)**m, u, x)
rule189 = ReplacementRule(pattern189, lambda m, a, d, n, p, b, x, c : With189(m, a, d, n, p, b, x, c))
rubi.add(rule189)
pattern190 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*cot(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n, p: EvenQ(n) | OddQ(p)), )
def With190(m, a, d, n, p, b, x, c):
    u = IntHide(cot(a + b*x)**p*csc(a + b*x)**n, x)
    return -d*m*Int(u*(c + d*x)**(m + S(-1)), x) + Dist((c + d*x)**m, u, x)
rule190 = ReplacementRule(pattern190, lambda m, a, d, n, p, b, x, c : With190(m, a, d, n, p, b, x, c))
rubi.add(rule190)
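# Rule 191: equal integer powers of sec and csc collapse via sec(t)*csc(t) = 2*csc(2*t).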
pattern191 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda n: IntegerQ(n)))
rule191 = ReplacementRule(pattern191, lambda m, a, d, n, b, x, c : S(2)**n*Int((c + d*x)**m*csc(S(2)*a + S(2)*b*x)**n, x))
rubi.add(rule191)
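# Rule 192: unequal integer powers of csc and sec; same IntHide / integration-by-parts scheme as rules 189-190.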
pattern192 = Pattern(Integral((x_*WC('d', S(1)) + WC('c', S(0)))**WC('m', S(1))*csc(x_*WC('b', S(1)) + WC('a', S(0)))**WC('n', S(1))*sec(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n, p: IntegersQ(n, p)), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(0))), CustomConstraint(lambda n, p: Unequal(n, p)), )
def With192(m, a, d, n, p, b, x, c):
    u = IntHide(csc(a + b*x)**n*sec(a + b*x)**p, x)
    return -d*m*Int(u*(c + d*x)**(m + S(-1)), x) + Dist((c + d*x)**m, u, x)
rule192 = ReplacementRule(pattern192, lambda m, a, d, n, p, b, x, c : With192(m, a, d, n, p, b, x, c))
rubi.add(rule192)
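# Rule 193: linear arguments not in canonical form are normalized with ExpandToSum before re-dispatching.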
pattern193 = Pattern(Integral(F_**v_*G_**w_*u_**WC('m', S(1)), x_), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda F: TrigQ(F)), CustomConstraint(lambda G: TrigQ(G)), CustomConstraint(lambda v, w: ZeroQ(v - w)), CustomConstraint(lambda u, x, w, v: LinearQ(List(u, v, w), x)), CustomConstraint(lambda u, x, w, v: Not(LinearMatchQ(List(u, v, w), x))))
rule193 = ReplacementRule(pattern193, lambda m, u, G, F, v, n, p, x, w : Int(ExpandToSum(u, x)**m*F(ExpandToSum(v, x))**n*G(ExpandToSum(v, x))**p, x))
rubi.add(rule193)
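# Rules 194-199: (e + f*x)**m times (a + b*Trig(c + d*x))**n times the derivative factor of that Trig; the trig part integrates to (a + b*Trig)**(n + 1)/(b*d*(n + 1)) and integration by parts lowers m.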
pattern194 = Pattern(Integral((a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule194 = ReplacementRule(pattern194, lambda m, a, d, n, x, f, b, e, c : -f*m*Int((a + b*sin(c + d*x))**(n + S(1))*(e + f*x)**(m + S(-1)), x)/(b*d*(n + S(1))) + (a + b*sin(c + d*x))**(n + S(1))*(e + f*x)**m/(b*d*(n + S(1))))
rubi.add(rule194)
pattern195 = Pattern(Integral((a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule195 = ReplacementRule(pattern195, lambda m, a, d, n, x, f, b, e, c : f*m*Int((a + b*cos(c + d*x))**(n + S(1))*(e + f*x)**(m + S(-1)), x)/(b*d*(n + S(1))) - (a + b*cos(c + d*x))**(n + S(1))*(e + f*x)**m/(b*d*(n + S(1))))
rubi.add(rule195)
pattern196 = Pattern(Integral((a_ + WC('b', S(1))*tan(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sec(x_*WC('d', S(1)) + WC('c', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule196 = ReplacementRule(pattern196, lambda m, a, d, n, x, f, b, e, c : -f*m*Int((a + b*tan(c + d*x))**(n + S(1))*(e + f*x)**(m + S(-1)), x)/(b*d*(n + S(1))) + (a + b*tan(c + d*x))**(n + S(1))*(e + f*x)**m/(b*d*(n + S(1))))
rubi.add(rule196)
pattern197 = Pattern(Integral((a_ + WC('b', S(1))*cot(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*csc(x_*WC('d', S(1)) + WC('c', S(0)))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule197 = ReplacementRule(pattern197, lambda m, a, d, n, x, f, b, e, c : f*m*Int((a + b*cot(c + d*x))**(n + S(1))*(e + f*x)**(m + S(-1)), x)/(b*d*(n + S(1))) - (a + b*cot(c + d*x))**(n + S(1))*(e + f*x)**m/(b*d*(n + S(1))))
rubi.add(rule197)
pattern198 = Pattern(Integral((a_ + WC('b', S(1))*sec(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*tan(x_*WC('d', S(1)) + WC('c', S(0)))*sec(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule198 = ReplacementRule(pattern198, lambda m, a, d, n, x, f, b, e, c : -f*m*Int((a + b*sec(c + d*x))**(n + S(1))*(e + f*x)**(m + S(-1)), x)/(b*d*(n + S(1))) + (a + b*sec(c + d*x))**(n + S(1))*(e + f*x)**m/(b*d*(n + S(1))))
rubi.add(rule198)
pattern199 = Pattern(Integral((a_ + WC('b', S(1))*csc(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cot(x_*WC('d', S(1)) + WC('c', S(0)))*csc(x_*WC('d', S(1)) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: NonzeroQ(n + S(1))))
rule199 = ReplacementRule(pattern199, lambda m, a, d, n, x, f, b, e, c : f*m*Int((a + b*csc(c + d*x))**(n + S(1))*(e + f*x)**(m + S(-1)), x)/(b*d*(n + S(1))) - (a + b*csc(c + d*x))**(n + S(1))*(e + f*x)**m/(b*d*(n + S(1))))
rubi.add(rule199)
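# Rules 200-202: products of sin/cos powers with two different linear arguments; ExpandTrigReduce rewrites the product as a sum of single sines and cosines.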
pattern200 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**WC('q', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)), CustomConstraint(lambda m: IntegerQ(m)))
rule200 = ReplacementRule(pattern200, lambda m, q, a, d, x, f, p, b, e, c : Int(ExpandTrigReduce((e + f*x)**m, sin(a + b*x)**p*sin(c + d*x)**q, x), x))
rubi.add(rule200)
pattern201 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))**WC('q', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)), CustomConstraint(lambda m: IntegerQ(m)))
rule201 = ReplacementRule(pattern201, lambda m, q, a, d, x, f, p, b, e, c : Int(ExpandTrigReduce((e + f*x)**m, cos(a + b*x)**p*cos(c + d*x)**q, x), x))
rubi.add(rule201)
pattern202 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))**WC('q', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)))
rule202 = ReplacementRule(pattern202, lambda m, q, a, d, x, f, p, b, e, c : Int(ExpandTrigReduce((e + f*x)**m, sin(a + b*x)**p*cos(c + d*x)**q, x), x))
rubi.add(rule202)
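# Rule 203: Sin or Cos of one linear argument times Sec or Csc of another, with a*d == b*c and b/d - 1 a positive integer; expanded via ExpandTrigExpand.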
pattern203 = Pattern(Integral(F_**(x_*WC('b', S(1)) + WC('a', S(0)))*G_**(x_*WC('d', S(1)) + WC('c', S(0)))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda F: MemberQ(List(Sin, Cos), F)), CustomConstraint(lambda G: MemberQ(List(Sec, Csc), G)), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)), CustomConstraint(lambda d, b, a, c: ZeroQ(-a*d + b*c)), CustomConstraint(lambda d, b: PositiveIntegerQ(b/d + S(-1))))
rule203 = ReplacementRule(pattern203, lambda m, G, q, F, a, d, x, f, p, b, e, c : Int(ExpandTrigExpand((e + f*x)**m*G(c + d*x)**q, F, c + d*x, p, b/d, x), x))
rubi.add(rule203)
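# Rules 204-225: exponentials F**(c*(a + b*x)) times powers of trig functions of d + e*x. Rules 204-205 give the basic sin/cos antiderivatives, 206-211 step the trig power up or down by two, 212-215 and 224-225 rewrite tan/cot and non-integer powers through complex exponentials, and 216-223 are the sec/csc analogues ending in Hypergeometric2F1 closed forms.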
pattern204 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sin(x_*WC('e', S(1)) + WC('d', S(0))), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda b, F, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2))))
rule204 = ReplacementRule(pattern204, lambda F, a, d, x, b, e, c : F**(c*(a + b*x))*b*c*log(F)*sin(d + e*x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)) - F**(c*(a + b*x))*e*cos(d + e*x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)))
rubi.add(rule204)
pattern205 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*cos(x_*WC('e', S(1)) + WC('d', S(0))), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda b, F, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2))))
rule205 = ReplacementRule(pattern205, lambda F, a, d, x, b, e, c : F**(c*(a + b*x))*b*c*log(F)*cos(d + e*x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)) + F**(c*(a + b*x))*e*sin(d + e*x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)))
rubi.add(rule205)
pattern206 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(1))))
rule206 = ReplacementRule(pattern206, lambda F, a, d, n, x, b, e, c : F**(c*(a + b*x))*b*c*log(F)*sin(d + e*x)**n/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)) - F**(c*(a + b*x))*e*n*sin(d + e*x)**(n + S(-1))*cos(d + e*x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)) + e**S(2)*n*(n + S(-1))*Int(F**(c*(a + b*x))*sin(d + e*x)**(n + S(-2)), x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)))
rubi.add(rule206)
pattern207 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**m_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda m, F, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*m**S(2))), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(1))))
rule207 = ReplacementRule(pattern207, lambda m, F, a, d, x, b, e, c : F**(c*(a + b*x))*b*c*log(F)*cos(d + e*x)**m/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*m**S(2)) + F**(c*(a + b*x))*e*m*sin(d + e*x)*cos(d + e*x)**(m + S(-1))/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*m**S(2)) + e**S(2)*m*(m + S(-1))*Int(F**(c*(a + b*x))*cos(d + e*x)**(m + S(-2)), x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*m**S(2)))
rubi.add(rule207)
pattern208 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda F, n, b, e, c: ZeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(2))**S(2))), CustomConstraint(lambda n: NonzeroQ(n + S(1))), CustomConstraint(lambda n: NonzeroQ(n + S(2))))
rule208 = ReplacementRule(pattern208, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*sin(d + e*x)**(n + S(2))/(e**S(2)*(n + S(1))*(n + S(2))) + F**(c*(a + b*x))*sin(d + e*x)**(n + S(1))*cos(d + e*x)/(e*(n + S(1))))
rubi.add(rule208)
pattern209 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda F, n, b, e, c: ZeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(2))**S(2))), CustomConstraint(lambda n: NonzeroQ(n + S(1))), CustomConstraint(lambda n: NonzeroQ(n + S(2))))
rule209 = ReplacementRule(pattern209, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*cos(d + e*x)**(n + S(2))/(e**S(2)*(n + S(1))*(n + S(2))) - F**(c*(a + b*x))*sin(d + e*x)*cos(d + e*x)**(n + S(1))/(e*(n + S(1))))
rubi.add(rule209)
pattern210 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(2))**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Less(n, S(-1))), CustomConstraint(lambda n: Unequal(n, S(-2))))
rule210 = ReplacementRule(pattern210, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*sin(d + e*x)**(n + S(2))/(e**S(2)*(n + S(1))*(n + S(2))) + F**(c*(a + b*x))*sin(d + e*x)**(n + S(1))*cos(d + e*x)/(e*(n + S(1))) + (b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(2))**S(2))*Int(F**(c*(a + b*x))*sin(d + e*x)**(n + S(2)), x)/(e**S(2)*(n + S(1))*(n + S(2))))
rubi.add(rule210)
pattern211 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(2))**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Less(n, S(-1))), CustomConstraint(lambda n: Unequal(n, S(-2))))
rule211 = ReplacementRule(pattern211, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*cos(d + e*x)**(n + S(2))/(e**S(2)*(n + S(1))*(n + S(2))) - F**(c*(a + b*x))*sin(d + e*x)*cos(d + e*x)**(n + S(1))/(e*(n + S(1))) + (b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(2))**S(2))*Int(F**(c*(a + b*x))*cos(d + e*x)**(n + S(2)), x)/(e**S(2)*(n + S(1))*(n + S(2))))
rubi.add(rule211)
pattern212 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n: Not(IntegerQ(n))))
rule212 = ReplacementRule(pattern212, lambda F, a, d, n, x, b, e, c : (exp(S(2)*ImaginaryI*(d + e*x)) + S(-1))**(-n)*Int(F**(c*(a + b*x))*(exp(S(2)*ImaginaryI*(d + e*x)) + S(-1))**n*exp(-ImaginaryI*n*(d + e*x)), x)*exp(ImaginaryI*n*(d + e*x))*sin(d + e*x)**n)
rubi.add(rule212)
pattern213 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda n: Not(IntegerQ(n))))
rule213 = ReplacementRule(pattern213, lambda F, a, d, n, x, b, e, c : (exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**(-n)*Int(F**(c*(a + b*x))*(exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**n*exp(-ImaginaryI*n*(d + e*x)), x)*exp(ImaginaryI*n*(d + e*x))*cos(d + e*x)**n)
rubi.add(rule213)
pattern214 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*tan(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n: IntegerQ(n)))
rule214 = ReplacementRule(pattern214, lambda F, a, d, n, x, b, e, c : ImaginaryI**n*Int(ExpandIntegrand(F**(c*(a + b*x))*(-exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**n*(exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**(-n), x), x))
rubi.add(rule214)
pattern215 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*cot(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n: IntegerQ(n)))
rule215 = ReplacementRule(pattern215, lambda F, a, d, n, x, b, e, c : (-ImaginaryI)**n*Int(ExpandIntegrand(F**(c*(a + b*x))*(-exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**(-n)*(exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**n, x), x))
rubi.add(rule215)
pattern216 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Less(n, S(-1))))
rule216 = ReplacementRule(pattern216, lambda F, a, d, n, x, b, e, c : F**(c*(a + b*x))*b*c*log(F)*sec(d + e*x)**n/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)) - F**(c*(a + b*x))*e*n*sin(d + e*x)*sec(d + e*x)**(n + S(1))/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)) + e**S(2)*n*(n + S(1))*Int(F**(c*(a + b*x))*sec(d + e*x)**(n + S(2)), x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)))
rubi.add(rule216)
pattern217 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Less(n, S(-1))))
rule217 = ReplacementRule(pattern217, lambda F, a, d, n, x, b, e, c : F**(c*(a + b*x))*b*c*log(F)*csc(d + e*x)**n/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)) + F**(c*(a + b*x))*e*n*cos(d + e*x)*csc(d + e*x)**(n + S(1))/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)) + e**S(2)*n*(n + S(1))*Int(F**(c*(a + b*x))*csc(d + e*x)**(n + S(2)), x)/(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*n**S(2)))
rubi.add(rule217)
pattern218 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda F, n, b, e, c: ZeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(-2))**S(2))), CustomConstraint(lambda n: NonzeroQ(n + S(-1))), CustomConstraint(lambda n: NonzeroQ(n + S(-2))))
rule218 = ReplacementRule(pattern218, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*sec(d + e*x)**(n + S(-2))/(e**S(2)*(n + S(-2))*(n + S(-1))) + F**(c*(a + b*x))*sin(d + e*x)*sec(d + e*x)**(n + S(-1))/(e*(n + S(-1))))
rubi.add(rule218)
pattern219 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda F, n, b, e, c: ZeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(-2))**S(2))), CustomConstraint(lambda n: NonzeroQ(n + S(-1))), CustomConstraint(lambda n: NonzeroQ(n + S(-2))))
rule219 = ReplacementRule(pattern219, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*csc(d + e*x)**(n + S(-2))/(e**S(2)*(n + S(-2))*(n + S(-1))) + F**(c*(a + b*x))*cos(d + e*x)*csc(d + e*x)**(n + S(-1))/(e*(n + S(-1))))
rubi.add(rule219)
pattern220 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(-2))**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(1))), CustomConstraint(lambda n: Unequal(n, S(2))))
rule220 = ReplacementRule(pattern220, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*sec(d + e*x)**(n + S(-2))/(e**S(2)*(n + S(-2))*(n + S(-1))) + F**(c*(a + b*x))*sin(d + e*x)*sec(d + e*x)**(n + S(-1))/(e*(n + S(-1))) + (b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(-2))**S(2))*Int(F**(c*(a + b*x))*sec(d + e*x)**(n + S(-2)), x)/(e**S(2)*(n + S(-2))*(n + S(-1))))
rubi.add(rule220)
pattern221 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**n_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda F, n, b, e, c: NonzeroQ(b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(-2))**S(2))), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(1))), CustomConstraint(lambda n: Unequal(n, S(2))))
rule221 = ReplacementRule(pattern221, lambda F, a, d, n, x, b, e, c : -F**(c*(a + b*x))*b*c*log(F)*csc(d + e*x)**(n + S(-2))/(e**S(2)*(n + S(-2))*(n + S(-1))) - F**(c*(a + b*x))*cos(d + e*x)*csc(d + e*x)**(n + S(-1))/(e*(n + S(-1))) + (b**S(2)*c**S(2)*log(F)**S(2) + e**S(2)*(n + S(-2))**S(2))*Int(F**(c*(a + b*x))*csc(d + e*x)**(n + S(-2)), x)/(e**S(2)*(n + S(-2))*(n + S(-1))))
rubi.add(rule221)
pattern222 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n: IntegerQ(n)))
rule222 = ReplacementRule(pattern222, lambda F, a, d, n, x, b, e, c : S(2)**n*F**(c*(a + b*x))*Hypergeometric2F1(n, -ImaginaryI*b*c*log(F)/(S(2)*e) + n/S(2), -ImaginaryI*b*c*log(F)/(S(2)*e) + n/S(2) + S(1), -exp(S(2)*ImaginaryI*(d + e*x)))*exp(ImaginaryI*n*(d + e*x))/(ImaginaryI*e*n + b*c*log(F)))
rubi.add(rule222)
pattern223 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n: IntegerQ(n)))
rule223 = ReplacementRule(pattern223, lambda F, a, d, n, x, b, e, c : F**(c*(a + b*x))*(-S(2)*ImaginaryI)**n*Hypergeometric2F1(n, -ImaginaryI*b*c*log(F)/(S(2)*e) + n/S(2), -ImaginaryI*b*c*log(F)/(S(2)*e) + n/S(2) + S(1), exp(S(2)*ImaginaryI*(d + e*x)))*exp(ImaginaryI*n*(d + e*x))/(ImaginaryI*e*n + b*c*log(F)))
rubi.add(rule223)
pattern224 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n: Not(IntegerQ(n))))
rule224 = ReplacementRule(pattern224, lambda F, a, d, n, x, b, e, c : (exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**n*Int(SimplifyIntegrand(F**(c*(a + b*x))*(exp(S(2)*ImaginaryI*(d + e*x)) + S(1))**(-n)*exp(ImaginaryI*n*(d + e*x)), x), x)*exp(-ImaginaryI*n*(d + e*x))*sec(d + e*x)**n)
rubi.add(rule224)
pattern225 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda n: Not(IntegerQ(n))))
rule225 = ReplacementRule(pattern225, lambda F, a, d, n, x, b, e, c : (S(1) - exp(-S(2)*ImaginaryI*(d + e*x)))**n*Int(SimplifyIntegrand(F**(c*(a + b*x))*(S(1) - exp(-S(2)*ImaginaryI*(d + e*x)))**(-n)*exp(-ImaginaryI*n*(d + e*x)), x), x)*exp(ImaginaryI*n*(d + e*x))*csc(d + e*x)**n)
rubi.add(rule225)
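# Rules 226-233: exponentials times (f + g*sin)**n or (f + g*cos)**n with f**2 == g**2; half-angle identities reduce the sum to a single squared cosine/sine (or to half-angle tangents/cotangents when a matching cos**m or sin**m factor is present), and rules 232-233 split (h + i*cos)/(f + g*sin) and its mirror into two simpler integrals.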
pattern226 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(f_ + WC('g', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda g, f: ZeroQ(f**S(2) - g**S(2))), CustomConstraint(lambda n: NegativeIntegerQ(n)))
rule226 = ReplacementRule(pattern226, lambda F, a, d, n, x, g, f, b, e, c : S(2)**n*f**n*Int(F**(c*(a + b*x))*cos(-Pi*f/(S(4)*g) + d/S(2) + e*x/S(2))**(S(2)*n), x))
rubi.add(rule226)
pattern227 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(f_ + WC('g', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda g, f: ZeroQ(f - g)), CustomConstraint(lambda n: NegativeIntegerQ(n)))
rule227 = ReplacementRule(pattern227, lambda F, b, a, d, n, x, f, g, e, c : S(2)**n*f**n*Int(F**(c*(a + b*x))*cos(d/S(2) + e*x/S(2))**(S(2)*n), x))
rubi.add(rule227)
pattern228 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(f_ + WC('g', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda g, f: ZeroQ(f + g)), CustomConstraint(lambda n: NegativeIntegerQ(n)))
rule228 = ReplacementRule(pattern228, lambda F, b, a, d, n, x, f, g, e, c : S(2)**n*f**n*Int(F**(c*(a + b*x))*sin(d/S(2) + e*x/S(2))**(S(2)*n), x))
rubi.add(rule228)
pattern229 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(f_ + WC('g', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0))))**WC('n', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**WC('m', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda g, f: ZeroQ(f**S(2) - g**S(2))), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m, n: Equal(m + n, S(0))))
rule229 = ReplacementRule(pattern229, lambda m, F, a, d, n, x, g, f, b, e, c : g**n*Int(F**(c*(a + b*x))*(-tan(-Pi*f/(S(4)*g) + d/S(2) + e*x/S(2)))**m, x))
rubi.add(rule229)
pattern230 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(f_ + WC('g', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0))))**WC('n', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**WC('m', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda g, f: ZeroQ(f - g)), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m, n: Equal(m + n, S(0))))
rule230 = ReplacementRule(pattern230, lambda m, F, b, a, d, n, x, f, g, e, c : f**n*Int(F**(c*(a + b*x))*tan(d/S(2) + e*x/S(2))**m, x))
rubi.add(rule230)
pattern231 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(f_ + WC('g', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0))))**WC('n', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**WC('m', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda g, f: ZeroQ(f + g)), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m, n: Equal(m + n, S(0))))
rule231 = ReplacementRule(pattern231, lambda m, F, b, a, d, n, x, f, g, e, c : f**n*Int(F**(c*(a + b*x))*cot(d/S(2) + e*x/S(2))**m, x))
rubi.add(rule231)
pattern232 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(h_ + WC('i', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0))))/(f_ + WC('g', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda h, x: FreeQ(h, x)), CustomConstraint(lambda i, x: FreeQ(i, x)), CustomConstraint(lambda g, f: ZeroQ(f**S(2) - g**S(2))), CustomConstraint(lambda h, i: ZeroQ(h**S(2) - i**S(2))), CustomConstraint(lambda g, h, i, f: ZeroQ(-f*i + g*h)))
rule232 = ReplacementRule(pattern232, lambda h, F, a, d, x, g, f, b, e, i, c : S(2)*i*Int(F**(c*(a + b*x))*cos(d + e*x)/(f + g*sin(d + e*x)), x) + Int(F**(c*(a + b*x))*(h - i*cos(d + e*x))/(f + g*sin(d + e*x)), x))
rubi.add(rule232)
pattern233 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*(h_ + WC('i', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0))))/(f_ + WC('g', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0)))), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda h, x: FreeQ(h, x)), CustomConstraint(lambda i, x: FreeQ(i, x)), CustomConstraint(lambda g, f: ZeroQ(f**S(2) - g**S(2))), CustomConstraint(lambda h, i: ZeroQ(h**S(2) - i**S(2))), CustomConstraint(lambda g, h, i, f: ZeroQ(f*i + g*h)))
rule233 = ReplacementRule(pattern233, lambda h, F, b, a, d, x, f, g, e, i, c : S(2)*i*Int(F**(c*(a + b*x))*sin(d + e*x)/(f + g*cos(d + e*x)), x) + Int(F**(c*(a + b*x))*(h - i*sin(d + e*x))/(f + g*cos(d + e*x)), x))
rubi.add(rule233)
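# Rule 234: non-canonical linear arguments of the exponential and trig factor are normalized with ExpandToSum.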
pattern234 = Pattern(Integral(F_**(u_*WC('c', S(1)))*G_**v_, x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda G: TrigQ(G)), CustomConstraint(lambda u, x, v: LinearQ(List(u, v), x)), CustomConstraint(lambda u, x, v: Not(LinearMatchQ(List(u, v), x))))
rule234 = ReplacementRule(pattern234, lambda u, G, F, v, n, x, c : Int(F**(c*ExpandToSum(u, x))*G(ExpandToSum(v, x))**n, x))
rubi.add(rule234)
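# Rules 235-238: an extra x**m (or x**p) factor multiplies the exponential-times-trig integrand; rules 235-236 integrate the exponential-trig part with IntHide and lower m by parts, while 237-238 apply ExpandTrigReduce to sin/cos products with different arguments.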
pattern235 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*x_**WC('m', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(0))), CustomConstraint(lambda n: PositiveIntegerQ(n)), )
def With235(m, F, a, d, n, x, b, e, c):
    u = IntHide(F**(c*(a + b*x))*sin(d + e*x)**n, x)
    return -m*Int(u*x**(m + S(-1)), x) + Dist(x**m, u, x)
rule235 = ReplacementRule(pattern235, lambda m, F, a, d, n, x, b, e, c : With235(m, F, a, d, n, x, b, e, c))
rubi.add(rule235)
pattern236 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*x_**WC('m', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda m: RationalQ(m)), CustomConstraint(lambda m: Greater(m, S(0))), CustomConstraint(lambda n: PositiveIntegerQ(n)), )
def With236(m, F, a, d, n, x, b, e, c):
    u = IntHide(F**(c*(a + b*x))*cos(d + e*x)**n, x)
    return -m*Int(u*x**(m + S(-1)), x) + Dist(x**m, u, x)
rule236 = ReplacementRule(pattern236, lambda m, F, a, d, n, x, b, e, c : With236(m, F, a, d, n, x, b, e, c))
rubi.add(rule236)
pattern237 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**WC('m', S(1))*cos(x_*WC('g', S(1)) + WC('f', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, n: PositiveIntegerQ(m, n)))
rule237 = ReplacementRule(pattern237, lambda m, F, b, a, d, n, x, f, g, e, c : Int(ExpandTrigReduce(F**(c*(a + b*x)), sin(d + e*x)**m*cos(f + g*x)**n, x), x))
rubi.add(rule237)
pattern238 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*x_**WC('p', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**WC('m', S(1))*cos(x_*WC('g', S(1)) + WC('f', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m, n, p: PositiveIntegerQ(m, n, p)))
rule238 = ReplacementRule(pattern238, lambda m, F, b, a, d, n, x, f, p, g, e, c : Int(ExpandTrigReduce(F**(c*(a + b*x))*x**p, sin(d + e*x)**m*cos(f + g*x)**n, x), x))
rubi.add(rule238)
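# Rule 239: an exponential times a product of two trig functions of the same argument is converted to exponential form with ExpandTrigToExp.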
pattern239 = Pattern(Integral(F_**((x_*WC('b', S(1)) + WC('a', S(0)))*WC('c', S(1)))*G_**(x_*WC('e', S(1)) + WC('d', S(0)))*H_**(x_*WC('e', S(1)) + WC('d', S(0))), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda m, n: PositiveIntegerQ(m, n)), CustomConstraint(lambda G: TrigQ(G)), CustomConstraint(lambda H: TrigQ(H)))
rule239 = ReplacementRule(pattern239, lambda m, G, F, H, a, d, n, x, b, e, c : Int(ExpandTrigToExp(F**(c*(a + b*x)), G(d + e*x)**m*H(d + e*x)**n, x), x))
rubi.add(rule239)
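# Rules 240-242: F**u times sin(v)**n, cos(v)**n, or their product, where u and v may be linear or quadratic in x; also handled through ExpandTrigToExp.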
pattern240 = Pattern(Integral(F_**u_*sin(v_)**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda u, x: LinearQ(u, x) | PolyQ(u, x, S(2))), CustomConstraint(lambda v, x: LinearQ(v, x) | PolyQ(v, x, S(2))), CustomConstraint(lambda n: PositiveIntegerQ(n)))
rule240 = ReplacementRule(pattern240, lambda u, F, v, n, x : Int(ExpandTrigToExp(F**u, sin(v)**n, x), x))
rubi.add(rule240)
pattern241 = Pattern(Integral(F_**u_*cos(v_)**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda u, x: LinearQ(u, x) | PolyQ(u, x, S(2))), CustomConstraint(lambda v, x: LinearQ(v, x) | PolyQ(v, x, S(2))), CustomConstraint(lambda n: PositiveIntegerQ(n)))
rule241 = ReplacementRule(pattern241, lambda u, F, v, n, x : Int(ExpandTrigToExp(F**u, cos(v)**n, x), x))
rubi.add(rule241)
pattern242 = Pattern(Integral(F_**u_*sin(v_)**WC('m', S(1))*cos(v_)**WC('n', S(1)), x_), CustomConstraint(lambda F, x: FreeQ(F, x)), CustomConstraint(lambda u, x: LinearQ(u, x) | PolyQ(u, x, S(2))), CustomConstraint(lambda v, x: LinearQ(v, x) | PolyQ(v, x, S(2))), CustomConstraint(lambda m, n: PositiveIntegerQ(m, n)))
rule242 = ReplacementRule(pattern242, lambda m, u, F, v, n, x : Int(ExpandTrigToExp(F**u, sin(v)**m*cos(v)**n, x), x))
rubi.add(rule242)
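# Rules 243 onward: powers of sin/cos of a logarithm, sin(a + b*log(c*x**n))**p and cos(a + b*log(c*x**n))**p, optionally weighted by x**m. Special parameter relations give closed forms, ExpandIntegrand handles positive integer p, recursions step p by two, and the remaining cases reduce to Hypergeometric2F1.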
pattern243 = Pattern(Integral(sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + S(1))), CustomConstraint(lambda p: NonzeroQ(p + S(1))))
rule243 = ReplacementRule(pattern243, lambda a, n, p, b, x, c : x*(p + S(2))*sin(a + b*log(c*x**n))**(p + S(2))/(p + S(1)) + x*sin(a + b*log(c*x**n))**(p + S(2))*cot(a + b*log(c*x**n))/(b*n*(p + S(1))))
rubi.add(rule243)
pattern244 = Pattern(Integral(cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + S(1))), CustomConstraint(lambda p: NonzeroQ(p + S(1))))
rule244 = ReplacementRule(pattern244, lambda a, n, p, b, x, c : x*(p + S(2))*cos(a + b*log(c*x**n))**(p + S(2))/(p + S(1)) - x*cos(a + b*log(c*x**n))**(p + S(2))*tan(a + b*log(c*x**n))/(b*n*(p + S(1))))
rubi.add(rule244)
pattern245 = Pattern(Integral(sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: PositiveIntegerQ(p)), CustomConstraint(lambda b, n, p: ZeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule245 = ReplacementRule(pattern245, lambda a, n, p, b, x, c : Int(ExpandIntegrand((-(c*x**n)**(S(1)/(n*p))*exp(-a*b*n*p)/(S(2)*b*n*p) + (c*x**n)**(-S(1)/(n*p))*exp(a*b*n*p)/(S(2)*b*n*p))**p, x), x))
rubi.add(rule245)
pattern246 = Pattern(Integral(cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: PositiveIntegerQ(p)), CustomConstraint(lambda b, n, p: ZeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule246 = ReplacementRule(pattern246, lambda a, n, p, b, x, c : Int(ExpandIntegrand((-(c*x**n)**(S(1)/(n*p))*exp(-a*b*n*p)/S(2) + (c*x**n)**(-S(1)/(n*p))*exp(a*b*n*p)/S(2))**p, x), x))
rubi.add(rule246)
pattern247 = Pattern(Integral(sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, n: NonzeroQ(b**S(2)*n**S(2) + S(1))))
rule247 = ReplacementRule(pattern247, lambda a, n, b, x, c : -b*n*x*cos(a + b*log(c*x**n))/(b**S(2)*n**S(2) + S(1)) + x*sin(a + b*log(c*x**n))/(b**S(2)*n**S(2) + S(1)))
rubi.add(rule247)
pattern248 = Pattern(Integral(cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, n: NonzeroQ(b**S(2)*n**S(2) + S(1))))
rule248 = ReplacementRule(pattern248, lambda a, n, b, x, c : b*n*x*sin(a + b*log(c*x**n))/(b**S(2)*n**S(2) + S(1)) + x*cos(a + b*log(c*x**n))/(b**S(2)*n**S(2) + S(1)))
rubi.add(rule248)
pattern249 = Pattern(Integral(sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule249 = ReplacementRule(pattern249, lambda a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(-1))*Int(sin(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*p**S(2) + S(1)) - b*n*p*x*sin(a + b*log(c*x**n))**(p + S(-1))*cos(a + b*log(c*x**n))/(b**S(2)*n**S(2)*p**S(2) + S(1)) + x*sin(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + S(1)))
rubi.add(rule249)
pattern250 = Pattern(Integral(cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule250 = ReplacementRule(pattern250, lambda a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(-1))*Int(cos(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*p**S(2) + S(1)) + b*n*p*x*sin(a + b*log(c*x**n))*cos(a + b*log(c*x**n))**(p + S(-1))/(b**S(2)*n**S(2)*p**S(2) + S(1)) + x*cos(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + S(1)))
rubi.add(rule250)
pattern251 = Pattern(Integral(sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda p: Unequal(p, S(-2))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + S(1))))
rule251 = ReplacementRule(pattern251, lambda a, n, p, b, x, c : x*sin(a + b*log(c*x**n))**(p + S(2))*cot(a + b*log(c*x**n))/(b*n*(p + S(1))) - x*sin(a + b*log(c*x**n))**(p + S(2))/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))) + (b**S(2)*n**S(2)*(p + S(2))**S(2) + S(1))*Int(sin(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))))
rubi.add(rule251)
pattern252 = Pattern(Integral(cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda p: Unequal(p, S(-2))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + S(1))))
rule252 = ReplacementRule(pattern252, lambda a, n, p, b, x, c : -x*cos(a + b*log(c*x**n))**(p + S(2))*tan(a + b*log(c*x**n))/(b*n*(p + S(1))) - x*cos(a + b*log(c*x**n))**(p + S(2))/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))) + (b**S(2)*n**S(2)*(p + S(2))**S(2) + S(1))*Int(cos(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))))
rubi.add(rule252)
pattern253 = Pattern(Integral(sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule253 = ReplacementRule(pattern253, lambda a, n, p, b, x, c : x*(-S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**(-p)*(-ImaginaryI*(c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a) + ImaginaryI*(c*x**n)**(-ImaginaryI*b)*exp(-ImaginaryI*a))**p*Hypergeometric2F1(-p, (-ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (-ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), (c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(-ImaginaryI*b*n*p + S(1)))
rubi.add(rule253)
pattern254 = Pattern(Integral(cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule254 = ReplacementRule(pattern254, lambda a, n, p, b, x, c : x*((c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a) + (c*x**n)**(-ImaginaryI*b)*exp(-ImaginaryI*a))**p*(S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**(-p)*Hypergeometric2F1(-p, (-ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (-ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), -(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(-ImaginaryI*b*n*p + S(1)))
rubi.add(rule254)
pattern255 = Pattern(Integral(x_**WC('m', S(1))*sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + (m + S(1))**S(2))), CustomConstraint(lambda p: NonzeroQ(p + S(1))), CustomConstraint(lambda m: NonzeroQ(m + S(1))))
rule255 = ReplacementRule(pattern255, lambda m, a, n, p, b, x, c : x**(m + S(1))*(p + S(2))*sin(a + b*log(c*x**n))**(p + S(2))/((m + S(1))*(p + S(1))) + x**(m + S(1))*sin(a + b*log(c*x**n))**(p + S(2))*cot(a + b*log(c*x**n))/(b*n*(p + S(1))))
rubi.add(rule255)
pattern256 = Pattern(Integral(x_**WC('m', S(1))*cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + (m + S(1))**S(2))), CustomConstraint(lambda p: NonzeroQ(p + S(1))), CustomConstraint(lambda m: NonzeroQ(m + S(1))))
rule256 = ReplacementRule(pattern256, lambda m, a, n, p, b, x, c : x**(m + S(1))*(p + S(2))*cos(a + b*log(c*x**n))**(p + S(2))/((m + S(1))*(p + S(1))) - x**(m + S(1))*cos(a + b*log(c*x**n))**(p + S(2))*tan(a + b*log(c*x**n))/(b*n*(p + S(1))))
rubi.add(rule256)
pattern257 = Pattern(Integral(x_**WC('m', S(1))*sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: PositiveIntegerQ(p)), CustomConstraint(lambda b, m, n, p: ZeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule257 = ReplacementRule(pattern257, lambda m, a, n, p, b, x, c : S(2)**(-p)*Int(ExpandIntegrand(x**m*((c*x**n)**((-m + S(-1))/(n*p))*(m + S(1))*exp(a*b*n*p/(m + S(1)))/(b*n*p) - (c*x**n)**((m + S(1))/(n*p))*(m + S(1))*exp(-a*b*n*p/(m + S(1)))/(b*n*p))**p, x), x))
rubi.add(rule257)
pattern258 = Pattern(Integral(x_**WC('m', S(1))*cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: PositiveIntegerQ(p)), CustomConstraint(lambda b, m, n, p: ZeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule258 = ReplacementRule(pattern258, lambda m, a, n, p, b, x, c : S(2)**(-p)*Int(ExpandIntegrand(x**m*((c*x**n)**((-m + S(-1))/(n*p))*exp(a*b*n*p/(m + S(1))) - (c*x**n)**((m + S(1))/(n*p))*exp(-a*b*n*p/(m + S(1))))**p, x), x))
rubi.add(rule258)
pattern259 = Pattern(Integral(x_**WC('m', S(1))*sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, m, n: NonzeroQ(b**S(2)*n**S(2) + (m + S(1))**S(2))))
rule259 = ReplacementRule(pattern259, lambda m, a, n, b, x, c : -b*n*x**(m + S(1))*cos(a + b*log(c*x**n))/(b**S(2)*n**S(2) + (m + S(1))**S(2)) + x**(m + S(1))*(m + S(1))*sin(a + b*log(c*x**n))/(b**S(2)*n**S(2) + (m + S(1))**S(2)))
rubi.add(rule259)
pattern260 = Pattern(Integral(x_**WC('m', S(1))*cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, m, n: NonzeroQ(b**S(2)*n**S(2) + (m + S(1))**S(2))))
rule260 = ReplacementRule(pattern260, lambda m, a, n, b, x, c : b*n*x**(m + S(1))*sin(a + b*log(c*x**n))/(b**S(2)*n**S(2) + (m + S(1))**S(2)) + x**(m + S(1))*(m + S(1))*cos(a + b*log(c*x**n))/(b**S(2)*n**S(2) + (m + S(1))**S(2)))
rubi.add(rule260)
pattern261 = Pattern(Integral(x_**WC('m', S(1))*sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule261 = ReplacementRule(pattern261, lambda m, a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(-1))*Int(x**m*sin(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) - b*n*p*x**(m + S(1))*sin(a + b*log(c*x**n))**(p + S(-1))*cos(a + b*log(c*x**n))/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) + x**(m + S(1))*(m + S(1))*sin(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)))
rubi.add(rule261)
pattern262 = Pattern(Integral(x_**WC('m', S(1))*cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule262 = ReplacementRule(pattern262, lambda m, a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(-1))*Int(x**m*cos(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) + b*n*p*x**(m + S(1))*sin(a + b*log(c*x**n))*cos(a + b*log(c*x**n))**(p + S(-1))/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) + x**(m + S(1))*(m + S(1))*cos(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)))
rubi.add(rule262)
pattern263 = Pattern(Integral(x_**WC('m', S(1))*sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda p: Unequal(p, S(-2))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + (m + S(1))**S(2))))
rule263 = ReplacementRule(pattern263, lambda m, a, n, p, b, x, c : x**(m + S(1))*sin(a + b*log(c*x**n))**(p + S(2))*cot(a + b*log(c*x**n))/(b*n*(p + S(1))) - x**(m + S(1))*(m + S(1))*sin(a + b*log(c*x**n))**(p + S(2))/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))) + (b**S(2)*n**S(2)*(p + S(2))**S(2) + (m + S(1))**S(2))*Int(x**m*sin(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))))
rubi.add(rule263)
pattern264 = Pattern(Integral(x_**WC('m', S(1))*cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda p: Unequal(p, S(-2))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(2))**S(2) + (m + S(1))**S(2))))
rule264 = ReplacementRule(pattern264, lambda m, a, n, p, b, x, c : -x**(m + S(1))*cos(a + b*log(c*x**n))**(p + S(2))*tan(a + b*log(c*x**n))/(b*n*(p + S(1))) - x**(m + S(1))*(m + S(1))*cos(a + b*log(c*x**n))**(p + S(2))/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))) + (b**S(2)*n**S(2)*(p + S(2))**S(2) + (m + S(1))**S(2))*Int(x**m*cos(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*(p + S(1))*(p + S(2))))
rubi.add(rule264)
pattern265 = Pattern(Integral(x_**WC('m', S(1))*sin(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule265 = ReplacementRule(pattern265, lambda m, a, n, p, b, x, c : x**(m + S(1))*(-S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**(-p)*(-ImaginaryI*(c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a) + ImaginaryI*(c*x**n)**(-ImaginaryI*b)*exp(-ImaginaryI*a))**p*Hypergeometric2F1(-p, (-ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (-ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), (c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(-ImaginaryI*b*n*p + m + S(1)))
rubi.add(rule265)
pattern266 = Pattern(Integral(x_**WC('m', S(1))*cos(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule266 = ReplacementRule(pattern266, lambda m, a, n, p, b, x, c : x**(m + S(1))*((c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a) + (c*x**n)**(-ImaginaryI*b)*exp(-ImaginaryI*a))**p*(S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**(-p)*Hypergeometric2F1(-p, (-ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (-ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), -(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(-ImaginaryI*b*n*p + m + S(1)))
rubi.add(rule266)
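# Rules 267-276: powers of sec/csc of a + b*log(c*x**n).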
pattern267 = Pattern(Integral(sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, n: ZeroQ(b**S(2)*n**S(2) + S(1))))
rule267 = ReplacementRule(pattern267, lambda a, n, b, x, c : S(2)*Int((c*x**n)**(1/n)/((c*x**n)**(S(2)/n) + exp(S(2)*a*b*n)), x)*exp(a*b*n))
rubi.add(rule267)
pattern268 = Pattern(Integral(csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, n: ZeroQ(b**S(2)*n**S(2) + S(1))))
rule268 = ReplacementRule(pattern268, lambda a, n, b, x, c : S(2)*b*n*Int((c*x**n)**(1/n)/(-(c*x**n)**(S(2)/n) + exp(S(2)*a*b*n)), x)*exp(a*b*n))
rubi.add(rule268)
pattern269 = Pattern(Integral(sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + S(1))), CustomConstraint(lambda p: NonzeroQ(p + S(-1))))
rule269 = ReplacementRule(pattern269, lambda a, n, p, b, x, c : x*(p + S(-2))*sec(a + b*log(c*x**n))**(p + S(-2))/(p + S(-1)) + x*tan(a + b*log(c*x**n))*sec(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))))
rubi.add(rule269)
pattern270 = Pattern(Integral(csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + S(1))), CustomConstraint(lambda p: NonzeroQ(p + S(-1))))
rule270 = ReplacementRule(pattern270, lambda a, n, p, b, x, c : x*(p + S(-2))*csc(a + b*log(c*x**n))**(p + S(-2))/(p + S(-1)) - x*cot(a + b*log(c*x**n))*csc(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))))
rubi.add(rule270)
pattern271 = Pattern(Integral(sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda p: Unequal(p, S(2))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + S(1))))
rule271 = ReplacementRule(pattern271, lambda a, n, p, b, x, c : x*tan(a + b*log(c*x**n))*sec(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))) - x*sec(a + b*log(c*x**n))**(p + S(-2))/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))) + (b**S(2)*n**S(2)*(p + S(-2))**S(2) + S(1))*Int(sec(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))))
rubi.add(rule271)
pattern272 = Pattern(Integral(csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda p: Unequal(p, S(2))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + S(1))))
rule272 = ReplacementRule(pattern272, lambda a, n, p, b, x, c : -x*cot(a + b*log(c*x**n))*csc(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))) - x*csc(a + b*log(c*x**n))**(p + S(-2))/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))) + (b**S(2)*n**S(2)*(p + S(-2))**S(2) + S(1))*Int(csc(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))))
rubi.add(rule272)
pattern273 = Pattern(Integral(sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule273 = ReplacementRule(pattern273, lambda a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(1))*Int(sec(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*p**S(2) + S(1)) - b*n*p*x*sin(a + b*log(c*x**n))*sec(a + b*log(c*x**n))**(p + S(1))/(b**S(2)*n**S(2)*p**S(2) + S(1)) + x*sec(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + S(1)))
rubi.add(rule273)
pattern274 = Pattern(Integral(csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule274 = ReplacementRule(pattern274, lambda a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(1))*Int(csc(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*p**S(2) + S(1)) + b*n*p*x*cos(a + b*log(c*x**n))*csc(a + b*log(c*x**n))**(p + S(1))/(b**S(2)*n**S(2)*p**S(2) + S(1)) + x*csc(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + S(1)))
rubi.add(rule274)
pattern275 = Pattern(Integral(sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule275 = ReplacementRule(pattern275, lambda a, n, p, b, x, c : x*((c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a)/((c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(1)))**p*(S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**p*Hypergeometric2F1(p, (ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), -(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(ImaginaryI*b*n*p + S(1)))
rubi.add(rule275)
pattern276 = Pattern(Integral(csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + S(1))))
rule276 = ReplacementRule(pattern276, lambda a, n, p, b, x, c : x*(-ImaginaryI*(c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a)/(-(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(1)))**p*(-S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**p*Hypergeometric2F1(p, (ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (ImaginaryI*b*n*p + S(1))/(S(2)*ImaginaryI*b*n), (c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(ImaginaryI*b*n*p + S(1)))
rubi.add(rule276)
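# Rules 277-286: the same sec/csc-of-logarithm integrands multiplied by x**m.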
pattern277 = Pattern(Integral(x_**WC('m', S(1))*sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, m, n: ZeroQ(b**S(2)*n**S(2) + (m + S(1))**S(2))))
rule277 = ReplacementRule(pattern277, lambda m, a, n, b, x, c : S(2)*Int(x**m*(c*x**n)**((m + S(1))/n)/((c*x**n)**(S(2)*(m + S(1))/n) + exp(S(2)*a*b*n/(m + S(1)))), x)*exp(a*b*n/(m + S(1))))
rubi.add(rule277)
pattern278 = Pattern(Integral(x_**WC('m', S(1))*csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, m, n: ZeroQ(b**S(2)*n**S(2) + (m + S(1))**S(2))))
rule278 = ReplacementRule(pattern278, lambda m, a, n, b, x, c : S(2)*b*n*Int(x**m*(c*x**n)**((m + S(1))/n)/(-(c*x**n)**(S(2)*(m + S(1))/n) + exp(S(2)*a*b*n/(m + S(1)))), x)*exp(a*b*n/(m + S(1)))/(m + S(1)))
rubi.add(rule278)
pattern279 = Pattern(Integral(x_**WC('m', S(1))*sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + (m + S(1))**S(2))), CustomConstraint(lambda m: NonzeroQ(m + S(1))), CustomConstraint(lambda p: NonzeroQ(p + S(-1))))
rule279 = ReplacementRule(pattern279, lambda m, a, n, p, b, x, c : x**(m + S(1))*(p + S(-2))*sec(a + b*log(c*x**n))**(p + S(-2))/((m + S(1))*(p + S(-1))) + x**(m + S(1))*tan(a + b*log(c*x**n))*sec(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))))
rubi.add(rule279)
pattern280 = Pattern(Integral(x_**WC('m', S(1))*csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: ZeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + (m + S(1))**S(2))), CustomConstraint(lambda m: NonzeroQ(m + S(1))), CustomConstraint(lambda p: NonzeroQ(p + S(-1))))
rule280 = ReplacementRule(pattern280, lambda m, a, n, p, b, x, c : x**(m + S(1))*(p + S(-2))*csc(a + b*log(c*x**n))**(p + S(-2))/((m + S(1))*(p + S(-1))) - x**(m + S(1))*cot(a + b*log(c*x**n))*csc(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))))
rubi.add(rule280)
pattern281 = Pattern(Integral(x_**WC('m', S(1))*sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda p: Unequal(p, S(2))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + (m + S(1))**S(2))))
rule281 = ReplacementRule(pattern281, lambda m, a, n, p, b, x, c : x**(m + S(1))*tan(a + b*log(c*x**n))*sec(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))) - x**(m + S(1))*(m + S(1))*sec(a + b*log(c*x**n))**(p + S(-2))/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))) + (b**S(2)*n**S(2)*(p + S(-2))**S(2) + (m + S(1))**S(2))*Int(x**m*sec(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))))
rubi.add(rule281)
pattern282 = Pattern(Integral(x_**WC('m', S(1))*csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(1))), CustomConstraint(lambda p: Unequal(p, S(2))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*(p + S(-2))**S(2) + (m + S(1))**S(2))))
rule282 = ReplacementRule(pattern282, lambda m, a, n, p, b, x, c : -x**(m + S(1))*cot(a + b*log(c*x**n))*csc(a + b*log(c*x**n))**(p + S(-2))/(b*n*(p + S(-1))) - x**(m + S(1))*(m + S(1))*csc(a + b*log(c*x**n))**(p + S(-2))/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))) + (b**S(2)*n**S(2)*(p + S(-2))**S(2) + (m + S(1))**S(2))*Int(x**m*csc(a + b*log(c*x**n))**(p + S(-2)), x)/(b**S(2)*n**S(2)*(p + S(-2))*(p + S(-1))))
rubi.add(rule282)
pattern283 = Pattern(Integral(x_**WC('m', S(1))*sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule283 = ReplacementRule(pattern283, lambda m, a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(1))*Int(x**m*sec(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) - b*n*p*x**(m + S(1))*sin(a + b*log(c*x**n))*sec(a + b*log(c*x**n))**(p + S(1))/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) + x**(m + S(1))*(m + S(1))*sec(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)))
rubi.add(rule283)
pattern284 = Pattern(Integral(x_**WC('m', S(1))*csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**p_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Less(p, S(-1))), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule284 = ReplacementRule(pattern284, lambda m, a, n, p, b, x, c : b**S(2)*n**S(2)*p*(p + S(1))*Int(x**m*csc(a + b*log(c*x**n))**(p + S(2)), x)/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) + b*n*p*x**(m + S(1))*cos(a + b*log(c*x**n))*csc(a + b*log(c*x**n))**(p + S(1))/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)) + x**(m + S(1))*(m + S(1))*csc(a + b*log(c*x**n))**p/(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2)))
rubi.add(rule284)
pattern285 = Pattern(Integral(x_**WC('m', S(1))*sec(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule285 = ReplacementRule(pattern285, lambda m, a, n, p, b, x, c : x**(m + S(1))*((c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a)/((c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(1)))**p*(S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**p*Hypergeometric2F1(p, (ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), -(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(ImaginaryI*b*n*p + m + S(1)))
rubi.add(rule285)
pattern286 = Pattern(Integral(x_**WC('m', S(1))*csc(WC('a', S(0)) + WC('b', S(1))*log(x_**WC('n', S(1))*WC('c', S(1))))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p, x: FreeQ(p, x)), CustomConstraint(lambda b, m, n, p: NonzeroQ(b**S(2)*n**S(2)*p**S(2) + (m + S(1))**S(2))))
rule286 = ReplacementRule(pattern286, lambda m, a, n, p, b, x, c : x**(m + S(1))*(-ImaginaryI*(c*x**n)**(ImaginaryI*b)*exp(ImaginaryI*a)/(-(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(1)))**p*(-S(2)*(c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a) + S(2))**p*Hypergeometric2F1(p, (ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), S(1) + (ImaginaryI*b*n*p + m + S(1))/(S(2)*ImaginaryI*b*n), (c*x**n)**(S(2)*ImaginaryI*b)*exp(S(2)*ImaginaryI*a))/(ImaginaryI*b*n*p + m + S(1)))
rubi.add(rule286)
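# Rules 287-294: sin/cos of a*x**n*log(b*x)**p weighted by log(b*x)**p and optionally x**m.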
pattern287 = Pattern(Integral(log(x_*WC('b', S(1)))**WC('p', S(1))*sin(x_*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))))
rule287 = ReplacementRule(pattern287, lambda b, x, a, p : -p*Int(log(b*x)**(p + S(-1))*sin(a*x*log(b*x)**p), x) - cos(a*x*log(b*x)**p)/a)
rubi.add(rule287)
pattern288 = Pattern(Integral(log(x_*WC('b', S(1)))**WC('p', S(1))*cos(x_*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))))
rule288 = ReplacementRule(pattern288, lambda b, x, a, p : -p*Int(log(b*x)**(p + S(-1))*cos(a*x*log(b*x)**p), x) + sin(a*x*log(b*x)**p)/a)
rubi.add(rule288)
pattern289 = Pattern(Integral(log(x_*WC('b', S(1)))**WC('p', S(1))*sin(x_**n_*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda n, p: RationalQ(n, p)), CustomConstraint(lambda p: Greater(p, S(0))))
rule289 = ReplacementRule(pattern289, lambda a, n, p, b, x : -p*Int(log(b*x)**(p + S(-1))*sin(a*x**n*log(b*x)**p), x)/n - x**(-n + S(1))*cos(a*x**n*log(b*x)**p)/(a*n) - (n + S(-1))*Int(x**(-n)*cos(a*x**n*log(b*x)**p), x)/(a*n))
rubi.add(rule289)
pattern290 = Pattern(Integral(log(x_*WC('b', S(1)))**WC('p', S(1))*cos(x_**n_*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda n, p: RationalQ(n, p)), CustomConstraint(lambda p: Greater(p, S(0))))
rule290 = ReplacementRule(pattern290, lambda a, n, p, b, x : -p*Int(log(b*x)**(p + S(-1))*cos(a*x**n*log(b*x)**p), x)/n + x**(-n + S(1))*sin(a*x**n*log(b*x)**p)/(a*n) + (n + S(-1))*Int(x**(-n)*sin(a*x**n*log(b*x)**p), x)/(a*n))
rubi.add(rule290)
pattern291 = Pattern(Integral(x_**WC('m', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))*sin(x_**WC('n', S(1))*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m, n: ZeroQ(m - n + S(1))), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))))
rule291 = ReplacementRule(pattern291, lambda m, a, n, p, b, x : -p*Int(x**m*log(b*x)**(p + S(-1))*sin(a*x**n*log(b*x)**p), x)/n - cos(a*x**n*log(b*x)**p)/(a*n))
rubi.add(rule291)
pattern292 = Pattern(Integral(x_**WC('m', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))*cos(x_**WC('n', S(1))*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m, n: ZeroQ(m - n + S(1))), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))))
rule292 = ReplacementRule(pattern292, lambda m, a, n, p, b, x : -p*Int(x**m*log(b*x)**(p + S(-1))*cos(a*x**n*log(b*x)**p), x)/n + sin(a*x**n*log(b*x)**p)/(a*n))
rubi.add(rule292)
pattern293 = Pattern(Integral(x_**WC('m', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))*sin(x_**WC('n', S(1))*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))), CustomConstraint(lambda m, n: NonzeroQ(m - n + S(1))))
rule293 = ReplacementRule(pattern293, lambda m, a, n, p, b, x : -p*Int(x**m*log(b*x)**(p + S(-1))*sin(a*x**n*log(b*x)**p), x)/n - x**(m - n + S(1))*cos(a*x**n*log(b*x)**p)/(a*n) + (m - n + S(1))*Int(x**(m - n)*cos(a*x**n*log(b*x)**p), x)/(a*n))
rubi.add(rule293)
pattern294 = Pattern(Integral(x_**m_*log(x_*WC('b', S(1)))**WC('p', S(1))*cos(x_**WC('n', S(1))*WC('a', S(1))*log(x_*WC('b', S(1)))**WC('p', S(1))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda p: RationalQ(p)), CustomConstraint(lambda p: Greater(p, S(0))), CustomConstraint(lambda m, n: NonzeroQ(m - n + S(1))))
rule294 = ReplacementRule(pattern294, lambda m, a, n, p, b, x : -p*Int(x**m*log(b*x)**(p + S(-1))*cos(a*x**n*log(b*x)**p), x)/n + x**(m - n + S(1))*sin(a*x**n*log(b*x)**p)/(a*n) - (m - n + S(1))*Int(x**(m - n)*sin(a*x**n*log(b*x)**p), x)/(a*n))
rubi.add(rule294)
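# Rules 295-300: powers of sin/cos whose argument is a quotient of linear functions of x.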
pattern295 = Pattern(Integral(sin(WC('a', S(1))/(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n: PositiveIntegerQ(n)))
rule295 = ReplacementRule(pattern295, lambda a, d, n, x, c : -Subst(Int(sin(a*x)**n/x**S(2), x), x, 1/(c + d*x))/d)
rubi.add(rule295)
pattern296 = Pattern(Integral(cos(WC('a', S(1))/(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n: PositiveIntegerQ(n)))
rule296 = ReplacementRule(pattern296, lambda a, d, n, x, c : -Subst(Int(cos(a*x)**n/x**S(2), x), x, 1/(c + d*x))/d)
rubi.add(rule296)
pattern297 = Pattern(Integral(sin((x_*WC('b', S(1)) + WC('a', S(0)))*WC('e', S(1))/(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n: PositiveIntegerQ(n)), CustomConstraint(lambda d, b, a, c: NonzeroQ(-a*d + b*c)))
rule297 = ReplacementRule(pattern297, lambda a, d, n, x, b, e, c : -Subst(Int(sin(b*e/d - e*x*(-a*d + b*c)/d)**n/x**S(2), x), x, 1/(c + d*x))/d)
rubi.add(rule297)
pattern298 = Pattern(Integral(cos((x_*WC('b', S(1)) + WC('a', S(0)))*WC('e', S(1))/(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda n: PositiveIntegerQ(n)), CustomConstraint(lambda d, b, a, c: NonzeroQ(-a*d + b*c)))
rule298 = ReplacementRule(pattern298, lambda a, d, n, x, b, e, c : -Subst(Int(cos(b*e/d - e*x*(-a*d + b*c)/d)**n/x**S(2), x), x, 1/(c + d*x))/d)
rubi.add(rule298)
pattern299 = Pattern(Integral(sin(u_)**WC('n', S(1)), x_), CustomConstraint(lambda n: PositiveIntegerQ(n)), CustomConstraint(lambda u, x: QuotientOfLinearsQ(u, x)), )
def With299(n, x, u):
    lst = QuotientOfLinearsParts(u, x)
    return Int(sin((x*Part(lst, S(2)) + Part(lst, S(1)))/(x*Part(lst, S(4)) + Part(lst, S(3))))**n, x)
rule299 = ReplacementRule(pattern299, lambda n, x, u : With299(n, x, u))
rubi.add(rule299)
pattern300 = Pattern(Integral(cos(u_)**WC('n', S(1)), x_), CustomConstraint(lambda n: PositiveIntegerQ(n)), CustomConstraint(lambda u, x: QuotientOfLinearsQ(u, x)), )
def With300(n, x, u):
    lst = QuotientOfLinearsParts(u, x)
    return Int(cos((x*Part(lst, S(2)) + Part(lst, S(1)))/(x*Part(lst, S(4)) + Part(lst, S(3))))**n, x)
rule300 = ReplacementRule(pattern300, lambda n, x, u : With300(n, x, u))
rubi.add(rule300)
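# Rules 301-309: products of powers of sin and cos with two (possibly different) arguments v and w.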
pattern301 = Pattern(Integral(WC('u', S(1))*sin(v_)**WC('p', S(1))*sin(w_)**WC('q', S(1)), x_), CustomConstraint(lambda v, w: ZeroQ(v - w)))
rule301 = ReplacementRule(pattern301, lambda u, q, v, p, x, w : Int(u*sin(v)**(p + q), x))
rubi.add(rule301)
pattern302 = Pattern(Integral(WC('u', S(1))*cos(v_)**WC('p', S(1))*cos(w_)**WC('q', S(1)), x_), CustomConstraint(lambda v, w: ZeroQ(v - w)))
rule302 = ReplacementRule(pattern302, lambda u, q, v, p, x, w : Int(u*cos(v)**(p + q), x))
rubi.add(rule302)
pattern303 = Pattern(Integral(sin(v_)**WC('p', S(1))*sin(w_)**WC('q', S(1)), x_), CustomConstraint(lambda v, x, w: (PolynomialQ(v, x) & PolynomialQ(w, x)) | (BinomialQ(List(v, w), x) & IndependentQ(v/w, x))), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)))
rule303 = ReplacementRule(pattern303, lambda q, v, p, x, w : Int(ExpandTrigReduce(sin(v)**p*sin(w)**q, x), x))
rubi.add(rule303)
pattern304 = Pattern(Integral(cos(v_)**WC('p', S(1))*cos(w_)**WC('q', S(1)), x_), CustomConstraint(lambda v, x, w: (PolynomialQ(v, x) & PolynomialQ(w, x)) | (BinomialQ(List(v, w), x) & IndependentQ(v/w, x))), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)))
rule304 = ReplacementRule(pattern304, lambda q, v, p, x, w : Int(ExpandTrigReduce(cos(v)**p*cos(w)**q, x), x))
rubi.add(rule304)
pattern305 = Pattern(Integral(x_**WC('m', S(1))*sin(v_)**WC('p', S(1))*sin(w_)**WC('q', S(1)), x_), CustomConstraint(lambda m, q, p: PositiveIntegerQ(m, p, q)), CustomConstraint(lambda v, x, w: (PolynomialQ(v, x) & PolynomialQ(w, x)) | (BinomialQ(List(v, w), x) & IndependentQ(v/w, x))))
rule305 = ReplacementRule(pattern305, lambda m, q, v, p, x, w : Int(ExpandTrigReduce(x**m, sin(v)**p*sin(w)**q, x), x))
rubi.add(rule305)
pattern306 = Pattern(Integral(x_**WC('m', S(1))*cos(v_)**WC('p', S(1))*cos(w_)**WC('q', S(1)), x_), CustomConstraint(lambda m, q, p: PositiveIntegerQ(m, p, q)), CustomConstraint(lambda v, x, w: (PolynomialQ(v, x) & PolynomialQ(w, x)) | (BinomialQ(List(v, w), x) & IndependentQ(v/w, x))))
rule306 = ReplacementRule(pattern306, lambda m, q, v, p, x, w : Int(ExpandTrigReduce(x**m, cos(v)**p*cos(w)**q, x), x))
rubi.add(rule306)
pattern307 = Pattern(Integral(WC('u', S(1))*sin(v_)**WC('p', S(1))*cos(w_)**WC('p', S(1)), x_), CustomConstraint(lambda v, w: ZeroQ(v - w)), CustomConstraint(lambda p: IntegerQ(p)))
rule307 = ReplacementRule(pattern307, lambda u, v, p, x, w : S(2)**(-p)*Int(u*sin(S(2)*v)**p, x))
rubi.add(rule307)
pattern308 = Pattern(Integral(sin(v_)**WC('p', S(1))*cos(w_)**WC('q', S(1)), x_), CustomConstraint(lambda q, p: PositiveIntegerQ(p, q)), CustomConstraint(lambda v, x, w: (PolynomialQ(v, x) & PolynomialQ(w, x)) | (BinomialQ(List(v, w), x) & IndependentQ(v/w, x))))
rule308 = ReplacementRule(pattern308, lambda q, v, p, x, w : Int(ExpandTrigReduce(sin(v)**p*cos(w)**q, x), x))
rubi.add(rule308)
pattern309 = Pattern(Integral(x_**WC('m', S(1))*sin(v_)**WC('p', S(1))*cos(w_)**WC('q', S(1)), x_), CustomConstraint(lambda m, q, p: PositiveIntegerQ(m, p, q)), CustomConstraint(lambda v, x, w: (PolynomialQ(v, x) & PolynomialQ(w, x)) | (BinomialQ(List(v, w), x) & IndependentQ(v/w, x))))
rule309 = ReplacementRule(pattern309, lambda m, q, v, p, x, w : Int(ExpandTrigReduce(x**m, sin(v)**p*cos(w)**q, x), x))
rubi.add(rule309)
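# Rules 310-317: sin(v) or cos(v) times a power of tan/cot/sec/csc of a different argument w.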
pattern310 = Pattern(Integral(sin(v_)*tan(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule310 = ReplacementRule(pattern310, lambda n, x, w, v : -Int(cos(v)*tan(w)**(n + S(-1)), x) + Int(tan(w)**(n + S(-1))*sec(w), x)*cos(v - w))
rubi.add(rule310)
pattern311 = Pattern(Integral(cos(v_)*cot(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule311 = ReplacementRule(pattern311, lambda n, x, w, v : -Int(sin(v)*cot(w)**(n + S(-1)), x) + Int(cot(w)**(n + S(-1))*csc(w), x)*cos(v - w))
rubi.add(rule311)
pattern312 = Pattern(Integral(sin(v_)*cot(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule312 = ReplacementRule(pattern312, lambda n, x, w, v : Int(cos(v)*cot(w)**(n + S(-1)), x) + Int(cot(w)**(n + S(-1))*csc(w), x)*sin(v - w))
rubi.add(rule312)
pattern313 = Pattern(Integral(cos(v_)*tan(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule313 = ReplacementRule(pattern313, lambda n, x, w, v : Int(sin(v)*tan(w)**(n + S(-1)), x) - Int(tan(w)**(n + S(-1))*sec(w), x)*sin(v - w))
rubi.add(rule313)
pattern314 = Pattern(Integral(sin(v_)*sec(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule314 = ReplacementRule(pattern314, lambda n, x, w, v : Int(tan(w)*sec(w)**(n + S(-1)), x)*cos(v - w) + Int(sec(w)**(n + S(-1)), x)*sin(v - w))
rubi.add(rule314)
pattern315 = Pattern(Integral(cos(v_)*csc(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule315 = ReplacementRule(pattern315, lambda n, x, w, v : Int(cot(w)*csc(w)**(n + S(-1)), x)*cos(v - w) - Int(csc(w)**(n + S(-1)), x)*sin(v - w))
rubi.add(rule315)
pattern316 = Pattern(Integral(sin(v_)*csc(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule316 = ReplacementRule(pattern316, lambda n, x, w, v : Int(cot(w)*csc(w)**(n + S(-1)), x)*sin(v - w) + Int(csc(w)**(n + S(-1)), x)*cos(v - w))
rubi.add(rule316)
pattern317 = Pattern(Integral(cos(v_)*sec(w_)**WC('n', S(1)), x_), CustomConstraint(lambda v, x: FreeQ(v, x)), CustomConstraint(lambda w: FreeQ(Mul(S(-1), w), x)), CustomConstraint(lambda n: RationalQ(n)), CustomConstraint(lambda n: Greater(n, S(0))), CustomConstraint(lambda v, w: NonzeroQ(v - w)))
rule317 = ReplacementRule(pattern317, lambda n, x, w, v : -Int(tan(w)*sec(w)**(n + S(-1)), x)*sin(v - w) + Int(sec(w)**(n + S(-1)), x)*cos(v - w))
rubi.add(rule317)
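# Rules 318-337: powers of a linear factor (e + f*x) or x multiplied by assorted expressions in sin and cos of linear (or power-of-linear) arguments.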
pattern318 = Pattern(Integral((a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))*cos(x_*WC('d', S(1)) + WC('c', S(0))))**WC('n', S(1))*(x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m, x: FreeQ(m, x)), CustomConstraint(lambda n, x: FreeQ(n, x)))
rule318 = ReplacementRule(pattern318, lambda m, a, d, n, x, f, b, e, c : Int((a + b*sin(S(2)*c + S(2)*d*x)/S(2))**n*(e + f*x)**m, x))
rubi.add(rule318)
pattern319 = Pattern(Integral(x_**WC('m', S(1))*(a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**S(2))**n_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda b, a: NonzeroQ(a + b)), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m: Greater(m, S(0))), CustomConstraint(lambda n: Less(n, S(0))), CustomConstraint(lambda m, n: Equal(n, S(-1)) | (Equal(m, S(1)) & Equal(n, S(-2)))))
rule319 = ReplacementRule(pattern319, lambda m, a, d, n, b, x, c : S(2)**(-n)*Int(x**m*(S(2)*a - b*cos(S(2)*c + S(2)*d*x) + b)**n, x))
rubi.add(rule319)
pattern320 = Pattern(Integral(x_**WC('m', S(1))*(a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))**S(2))**n_, x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda b, a: NonzeroQ(a + b)), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m: Greater(m, S(0))), CustomConstraint(lambda n: Less(n, S(0))), CustomConstraint(lambda m, n: Equal(n, S(-1)) | (Equal(m, S(1)) & Equal(n, S(-2)))))
rule320 = ReplacementRule(pattern320, lambda m, a, d, n, b, x, c : S(2)**(-n)*Int(x**m*(S(2)*a + b*cos(S(2)*c + S(2)*d*x) + b)**n, x))
rubi.add(rule320)
pattern321 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin((c_ + x_*WC('d', S(1)))**n_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda p: RationalQ(p)))
rule321 = ReplacementRule(pattern321, lambda m, a, d, n, x, f, p, b, e, c : d**(-m + S(-1))*Subst(Int((-c*f + d*e + f*x)**m*sin(a + b*x**n)**p, x), x, c + d*x))
rubi.add(rule321)
pattern322 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos((c_ + x_*WC('d', S(1)))**n_*WC('b', S(1)) + WC('a', S(0)))**WC('p', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda p: RationalQ(p)))
rule322 = ReplacementRule(pattern322, lambda m, a, d, n, x, f, p, b, e, c : d**(-m + S(-1))*Subst(Int((-c*f + d*e + f*x)**m*cos(a + b*x**n)**p, x), x, c + d*x))
rubi.add(rule322)
pattern323 = Pattern(Integral((x_*WC('g', S(1)) + WC('f', S(0)))**WC('m', S(1))/(WC('a', S(0)) + WC('b', S(1))*cos(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('c', S(1))*sin(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: NonzeroQ(a + b)), CustomConstraint(lambda c, a: NonzeroQ(a + c)))
rule323 = ReplacementRule(pattern323, lambda m, b, a, d, x, f, g, e, c : S(2)*Int((f + g*x)**m/(S(2)*a + b + c + (b - c)*cos(S(2)*d + S(2)*e*x)), x))
rubi.add(rule323)
pattern324 = Pattern(Integral((x_*WC('g', S(1)) + WC('f', S(0)))**WC('m', S(1))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)/(b_ + WC('c', S(1))*tan(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)), x_), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)))
rule324 = ReplacementRule(pattern324, lambda m, b, d, x, f, g, e, c : S(2)*Int((f + g*x)**m/(b + c + (b - c)*cos(S(2)*d + S(2)*e*x)), x))
rubi.add(rule324)
pattern325 = Pattern(Integral((x_*WC('g', S(1)) + WC('f', S(0)))**WC('m', S(1))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)/(WC('a', S(1))*sec(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('b', S(0)) + WC('c', S(1))*tan(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: NonzeroQ(a + b)), CustomConstraint(lambda c, a: NonzeroQ(a + c)))
rule325 = ReplacementRule(pattern325, lambda m, b, a, d, x, f, g, e, c : S(2)*Int((f + g*x)**m/(S(2)*a + b + c + (b - c)*cos(S(2)*d + S(2)*e*x)), x))
rubi.add(rule325)
pattern326 = Pattern(Integral((x_*WC('g', S(1)) + WC('f', S(0)))**WC('m', S(1))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)/(c_ + WC('b', S(1))*cot(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)), x_), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)))
rule326 = ReplacementRule(pattern326, lambda m, b, d, x, f, g, e, c : S(2)*Int((f + g*x)**m/(b + c + (b - c)*cos(S(2)*d + S(2)*e*x)), x))
rubi.add(rule326)
pattern327 = Pattern(Integral((x_*WC('g', S(1)) + WC('f', S(0)))**WC('m', S(1))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**S(2)/(WC('a', S(1))*csc(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('b', S(1))*cot(x_*WC('e', S(1)) + WC('d', S(0)))**S(2) + WC('c', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda g, x: FreeQ(g, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: NonzeroQ(a + b)), CustomConstraint(lambda c, a: NonzeroQ(a + c)))
rule327 = ReplacementRule(pattern327, lambda m, a, d, x, f, g, b, e, c : S(2)*Int((f + g*x)**m/(S(2)*a + b + c + (b - c)*cos(S(2)*d + S(2)*e*x)), x))
rubi.add(rule327)
pattern328 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))/(a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: PosQ(a**S(2) - b**S(2))))
rule328 = ReplacementRule(pattern328, lambda m, a, d, x, f, b, e, c : -ImaginaryI*(e + f*x)**(m + S(1))/(b*f*(m + S(1))) + Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(-ImaginaryI*b*exp(ImaginaryI*(c + d*x)) + a - Rt(a**S(2) - b**S(2), S(2))), x) + Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(-ImaginaryI*b*exp(ImaginaryI*(c + d*x)) + a + Rt(a**S(2) - b**S(2), S(2))), x))
rubi.add(rule328)
pattern329 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))/(a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: PosQ(a**S(2) - b**S(2))))
rule329 = ReplacementRule(pattern329, lambda m, a, d, x, f, b, e, c : -ImaginaryI*Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(a + b*exp(ImaginaryI*(c + d*x)) - Rt(a**S(2) - b**S(2), S(2))), x) - ImaginaryI*Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(a + b*exp(ImaginaryI*(c + d*x)) + Rt(a**S(2) - b**S(2), S(2))), x) + ImaginaryI*(e + f*x)**(m + S(1))/(b*f*(m + S(1))))
rubi.add(rule329)
pattern330 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))/(a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: NegQ(a**S(2) - b**S(2))))
rule330 = ReplacementRule(pattern330, lambda m, a, d, x, f, b, e, c : ImaginaryI*Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(ImaginaryI*a + b*exp(ImaginaryI*(c + d*x)) - Rt(-a**S(2) + b**S(2), S(2))), x) + ImaginaryI*Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(ImaginaryI*a + b*exp(ImaginaryI*(c + d*x)) + Rt(-a**S(2) + b**S(2), S(2))), x) - ImaginaryI*(e + f*x)**(m + S(1))/(b*f*(m + S(1))))
rubi.add(rule330)
pattern331 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))/(a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda b, a: NegQ(a**S(2) - b**S(2))))
rule331 = ReplacementRule(pattern331, lambda m, a, d, x, f, b, e, c : ImaginaryI*(e + f*x)**(m + S(1))/(b*f*(m + S(1))) + Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(ImaginaryI*a + ImaginaryI*b*exp(ImaginaryI*(c + d*x)) - Rt(-a**S(2) + b**S(2), S(2))), x) + Int((e + f*x)**m*exp(ImaginaryI*(c + d*x))/(ImaginaryI*a + ImaginaryI*b*exp(ImaginaryI*(c + d*x)) + Rt(-a**S(2) + b**S(2), S(2))), x))
rubi.add(rule331)
pattern332 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))**n_/(a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda n: Greater(n, S(1))), CustomConstraint(lambda b, a: ZeroQ(a**S(2) - b**S(2))))
rule332 = ReplacementRule(pattern332, lambda m, a, d, n, x, f, b, e, c : -Int((e + f*x)**m*sin(c + d*x)*cos(c + d*x)**(n + S(-2)), x)/b + Int((e + f*x)**m*cos(c + d*x)**(n + S(-2)), x)/a)
rubi.add(rule332)
pattern333 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**n_/(a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda n: Greater(n, S(1))), CustomConstraint(lambda b, a: ZeroQ(a**S(2) - b**S(2))))
rule333 = ReplacementRule(pattern333, lambda m, a, d, n, x, f, b, e, c : -Int((e + f*x)**m*sin(c + d*x)**(n + S(-2))*cos(c + d*x), x)/b + Int((e + f*x)**m*sin(c + d*x)**(n + S(-2)), x)/a)
rubi.add(rule333)
pattern334 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))**n_/(a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda n: Greater(n, S(1))), CustomConstraint(lambda b, a: NonzeroQ(a**S(2) - b**S(2))))
rule334 = ReplacementRule(pattern334, lambda m, a, d, n, x, f, b, e, c : a*Int((e + f*x)**m*cos(c + d*x)**(n + S(-2)), x)/b**S(2) - Int((e + f*x)**m*sin(c + d*x)*cos(c + d*x)**(n + S(-2)), x)/b - (a**S(2) - b**S(2))*Int((e + f*x)**m*cos(c + d*x)**(n + S(-2))/(a + b*sin(c + d*x)), x)/b**S(2))
rubi.add(rule334)
pattern335 = Pattern(Integral((x_*WC('f', S(1)) + WC('e', S(0)))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**n_/(a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda m: PositiveIntegerQ(m)), CustomConstraint(lambda n: IntegerQ(n)), CustomConstraint(lambda n: Greater(n, S(1))), CustomConstraint(lambda b, a: NonzeroQ(a**S(2) - b**S(2))))
rule335 = ReplacementRule(pattern335, lambda m, a, d, n, x, f, b, e, c : a*Int((e + f*x)**m*sin(c + d*x)**(n + S(-2)), x)/b**S(2) - Int((e + f*x)**m*sin(c + d*x)**(n + S(-2))*cos(c + d*x), x)/b - (a**S(2) - b**S(2))*Int((e + f*x)**m*sin(c + d*x)**(n + S(-2))/(a + b*cos(c + d*x)), x)/b**S(2))
rubi.add(rule335)
pattern336 = Pattern(Integral((A_ + WC('B', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))*(x_*WC('f', S(1)) + WC('e', S(0)))/(a_ + WC('b', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0))))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda b, B, A, a: ZeroQ(A*a - B*b)))
rule336 = ReplacementRule(pattern336, lambda A, a, d, x, f, b, B, e, c : B*f*Int(cos(c + d*x)/(a + b*sin(c + d*x)), x)/(a*d) - B*(e + f*x)*cos(c + d*x)/(a*d*(a + b*sin(c + d*x))))
rubi.add(rule336)
pattern337 = Pattern(Integral((A_ + WC('B', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0))))*(x_*WC('f', S(1)) + WC('e', S(0)))/(a_ + WC('b', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0))))**S(2), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda e, x: FreeQ(e, x)), CustomConstraint(lambda f, x: FreeQ(f, x)), CustomConstraint(lambda A, x: FreeQ(A, x)), CustomConstraint(lambda B, x: FreeQ(B, x)), CustomConstraint(lambda b, B, A, a: ZeroQ(A*a - B*b)))
rule337 = ReplacementRule(pattern337, lambda A, a, d, x, f, b, B, e, c : -B*f*Int(sin(c + d*x)/(a + b*cos(c + d*x)), x)/(a*d) + B*(e + f*x)*sin(c + d*x)/(a*d*(a + b*cos(c + d*x))))
rubi.add(rule337)
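# Rules 338-346: remaining mixed forms: (a + b*tan(v))**n*sec(v)**m, products of trig functions with different linear arguments, and (a*cos(v) + b*sin(v))**n.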
pattern338 = Pattern(Integral((a_ + WC('b', S(1))*tan(v_))**WC('n', S(1))*sec(v_)**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m, n: Equal(m + n, S(0))), CustomConstraint(lambda m: OddQ(m)))
rule338 = ReplacementRule(pattern338, lambda m, v, a, n, b, x : Int((a*cos(v) + b*sin(v))**n, x))
rubi.add(rule338)
pattern339 = Pattern(Integral((a_ + WC('b', S(1))*cot(v_))**WC('n', S(1))*csc(v_)**WC('m', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda m, n: IntegersQ(m, n)), CustomConstraint(lambda m, n: Equal(m + n, S(0))), CustomConstraint(lambda m: OddQ(m)))
rule339 = ReplacementRule(pattern339, lambda m, v, a, n, b, x : Int((a*sin(v) + b*cos(v))**n, x))
rubi.add(rule339)
pattern340 = Pattern(Integral(WC('u', S(1))*sin(x_*WC('b', S(1)) + WC('a', S(0)))**WC('m', S(1))*sin(x_*WC('d', S(1)) + WC('c', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, n: PositiveIntegerQ(m, n)))
rule340 = ReplacementRule(pattern340, lambda m, u, a, d, n, b, x, c : Int(ExpandTrigReduce(u, sin(a + b*x)**m*sin(c + d*x)**n, x), x))
rubi.add(rule340)
pattern341 = Pattern(Integral(WC('u', S(1))*cos(x_*WC('b', S(1)) + WC('a', S(0)))**WC('m', S(1))*cos(x_*WC('d', S(1)) + WC('c', S(0)))**WC('n', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda m, n: PositiveIntegerQ(m, n)))
rule341 = ReplacementRule(pattern341, lambda m, u, a, d, n, b, x, c : Int(ExpandTrigReduce(u, cos(a + b*x)**m*cos(c + d*x)**n, x), x))
rubi.add(rule341)
pattern342 = Pattern(Integral(sec(c_ + x_*WC('d', S(1)))*sec(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: ZeroQ(b**S(2) - d**S(2))), CustomConstraint(lambda d, b, a, c: NonzeroQ(-a*d + b*c)))
rule342 = ReplacementRule(pattern342, lambda a, d, b, x, c : -Int(tan(a + b*x), x)*csc((-a*d + b*c)/d) + Int(tan(c + d*x), x)*csc((-a*d + b*c)/b))
rubi.add(rule342)
pattern343 = Pattern(Integral(csc(c_ + x_*WC('d', S(1)))*csc(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: ZeroQ(b**S(2) - d**S(2))), CustomConstraint(lambda d, b, a, c: NonzeroQ(-a*d + b*c)))
rule343 = ReplacementRule(pattern343, lambda a, d, b, x, c : Int(cot(a + b*x), x)*csc((-a*d + b*c)/b) - Int(cot(c + d*x), x)*csc((-a*d + b*c)/d))
rubi.add(rule343)
pattern344 = Pattern(Integral(tan(c_ + x_*WC('d', S(1)))*tan(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: ZeroQ(b**S(2) - d**S(2))), CustomConstraint(lambda d, b, a, c: NonzeroQ(-a*d + b*c)))
rule344 = ReplacementRule(pattern344, lambda a, d, b, x, c : -b*x/d + b*Int(sec(a + b*x)*sec(c + d*x), x)*cos((-a*d + b*c)/d)/d)
rubi.add(rule344)
pattern345 = Pattern(Integral(cot(c_ + x_*WC('d', S(1)))*cot(x_*WC('b', S(1)) + WC('a', S(0))), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda c, x: FreeQ(c, x)), CustomConstraint(lambda d, x: FreeQ(d, x)), CustomConstraint(lambda d, b: ZeroQ(b**S(2) - d**S(2))), CustomConstraint(lambda d, b, a, c: NonzeroQ(-a*d + b*c)))
rule345 = ReplacementRule(pattern345, lambda a, d, b, x, c : -b*x/d + Int(csc(a + b*x)*csc(c + d*x), x)*cos((-a*d + b*c)/d))
rubi.add(rule345)
pattern346 = Pattern(Integral((WC('a', S(1))*cos(v_) + WC('b', S(1))*sin(v_))**WC('n', S(1))*WC('u', S(1)), x_), CustomConstraint(lambda a, x: FreeQ(a, x)), CustomConstraint(lambda b, x: FreeQ(b, x)), CustomConstraint(lambda n, x: FreeQ(n, x)), CustomConstraint(lambda b, a: ZeroQ(a**S(2) + b**S(2))))
rule346 = ReplacementRule(pattern346, lambda u, v, a, n, b, x : Int(u*(a*exp(-a*v/b))**n, x))
rubi.add(rule346)
return rubi
| 171.831626 | 6,553 | 0.593295 | 50,110 | 268,401 | 3.147316 | 0.026003 | 0.364918 | 0.302317 | 0.013886 | 0.841198 | 0.836537 | 0.831053 | 0.821523 | 0.809869 | 0.798963 | 0 | 0.035848 | 0.132589 | 268,401 | 1,561 | 6,554 | 171.941704 | 0.641568 | 0.000317 | 0 | 0.051495 | 0 | 0 | 0.007804 | 0.000116 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040698 | false | 0 | 0.009136 | 0 | 0.090532 | 0.000831 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
81173935e8be86b595abcbb88c0b74008c09ccbb | 1,722 | py | Python | server/datasource/migrations/0003_change_number_fields_to_decimals.py | yizhang7210/Acre | c98cf8a4fdfb223a1958e8e61df759f889a1b13f | [
"MIT"
] | 2 | 2017-11-27T21:55:21.000Z | 2017-12-30T03:34:40.000Z | server/datasource/migrations/0003_change_number_fields_to_decimals.py | yizhang7210/Acre | c98cf8a4fdfb223a1958e8e61df759f889a1b13f | [
"MIT"
] | 30 | 2017-09-06T12:00:08.000Z | 2018-06-20T22:47:46.000Z | server/datasource/migrations/0003_change_number_fields_to_decimals.py | yizhang7210/Acre | c98cf8a4fdfb223a1958e8e61df759f889a1b13f | [
"MIT"
] | 1 | 2021-04-05T13:59:37.000Z | 2021-04-05T13:59:37.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.4 on 2017-09-10 00:47
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('datasource', '0002_add_unique_together_for_candles'),
]
operations = [
migrations.AlterField(
model_name='candle',
name='close_ask',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='close_bid',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='high_ask',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='high_bid',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='low_ask',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='low_bid',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='open_ask',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
migrations.AlterField(
model_name='candle',
name='open_bid',
field=models.DecimalField(decimal_places=6, max_digits=12),
),
]
| 30.75 | 71 | 0.58072 | 175 | 1,722 | 5.474286 | 0.297143 | 0.167015 | 0.208768 | 0.242171 | 0.784969 | 0.784969 | 0.784969 | 0.73382 | 0.73382 | 0.73382 | 0 | 0.037657 | 0.306039 | 1,722 | 55 | 72 | 31.309091 | 0.764017 | 0.039489 | 0 | 0.666667 | 1 | 0 | 0.0957 | 0.021805 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.104167 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d4a8ba708d497e9b460ff8d1dd2a6e9d15fce64f | 2,322 | py | Python | CODES/flood fill/with inner walls/01_without backtracing.py | YashwanthYarala/MICROMOUSE | 69ba518ee81e1e6b70a13f7480844459d240ed11 | [
"MIT"
] | null | null | null | CODES/flood fill/with inner walls/01_without backtracing.py | YashwanthYarala/MICROMOUSE | 69ba518ee81e1e6b70a13f7480844459d240ed11 | [
"MIT"
] | null | null | null | CODES/flood fill/with inner walls/01_without backtracing.py | YashwanthYarala/MICROMOUSE | 69ba518ee81e1e6b70a13f7480844459d240ed11 | [
"MIT"
] | null | null | null | from numpy import *
#creating maze with no walls.
m = array([[10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10],
[10,6,7,5,7,4,7,3,7,3,7,4,7,5,7,6,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,5,7,4,7,3,7,2,7,2,7,3,7,4,7,5,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,4,7,3,7,2,7,1,7,1,7,2,7,3,7,4,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,3,7,2,7,1,7,0,7,0,7,1,7,2,7,3,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,3,7,2,7,1,7,0,7,0,7,1,7,2,7,3,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,4,7,3,7,2,7,1,7,1,7,2,7,3,7,4,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,5,7,4,7,3,7,2,7,2,7,3,7,4,7,5,10],
[10,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,10],
[10,6,7,5,7,4,7,3,7,3,7,4,7,5,7,6,10],
[10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10]])
i = array([[10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10],
[10,6,7,5,10,4,7,3,7,3,10,4,10,5,7,6,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],#done till here
[10,5,10,4,7,3,10,2,10,2,7,3,7,4,10,5,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],
[10,4,10,3,10,2,10,1,7,1,7,2,10,3,7,4,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],
[10,3,10,2,10,1,7,0,7,0,10,1,7,2,7,3,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],
[10,3,7,2,7,1,10,0,7,0,10,1,7,2,10,3,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],
[10,4,7,3,10,2,7,1,10,1,7,2,7,3,7,4,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],
[10,5,7,4,10,3,10,2,10,2,7,3,7,4,10,5,10],
[10,7,10,7,10,7,10,7,10,7,10,7,10,7,10,7,10],
[10,6,10,5,7,4,7,3,7,3,7,4,7,5,7,6,10],
[10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10]])
#starting position is (1,1); x indexes the vertical (row) direction and y the horizontal (column) direction.
x = y = 1
while(i[x][y]!=0):
a = i[x][y]
d = i[x+1][y] #downward direction one unit
u = i[x-1][y] #upward direction one unit
l = i[x][y-1] #left direction one unit
r = i[x][y+1] #right direction one unit
D = i[x+2][y] #downward direction two units
U = i[x-2][y] #upward direction two units
L = i[x][y-2] #left direction two units
R = i[x][y+2] #right direction two units
# now walls are to be checked
if(d == 7):
| 43 | 71 | 0.453058 | 678 | 2,322 | 1.551622 | 0.075221 | 0.186312 | 0.259506 | 0.319392 | 0.738593 | 0.724335 | 0.706274 | 0.675856 | 0.668251 | 0.668251 | 0 | 0.441094 | 0.228682 | 2,322 | 53 | 72 | 43.811321 | 0.146287 | 0.105082 | 0 | 0.510638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.021277 | null | null | 0 | 0 | 0 | 1 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
d4cb1da781969c91c56fa58f6b50afee05956c75 | 182,219 | py | Python | kflash.py | valentin7121/kflash.py | 5783d8d85e7397007a3ba1394a93d3f9f8ad982f | [
"MIT"
] | 78 | 2018-11-14T16:51:48.000Z | 2022-03-23T07:46:51.000Z | tools/kflash.py | hzcx998/xv6-k210 | 34857cc2cb03445672772d26a18a5785b33fee7f | [
"MIT"
] | 35 | 2018-11-11T15:47:21.000Z | 2022-03-16T06:45:31.000Z | tools/kflash.py | hzcx998/xv6-k210 | 34857cc2cb03445672772d26a18a5785b33fee7f | [
"MIT"
] | 41 | 2018-12-19T07:55:22.000Z | 2021-12-21T12:09:18.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
from __future__ import (division, print_function)
import sys
import time
import zlib
import copy
import struct
import binascii
import hashlib
import argparse
import math
import zipfile, tempfile
import json
import re
import os
class KFlash:
print_callback = None
def __init__(self, print_callback = None):
self.killProcess = False
self.loader = None
self.print_callback = print_callback
@staticmethod
def log(*args, **kwargs):
if KFlash.print_callback:
KFlash.print_callback(*args, **kwargs)
else:
print(*args, **kwargs)
def process(self, terminal=True, dev="", baudrate=1500000, board=None, sram = False, file="", callback=None, noansi=False, terminal_auto_size=False, terminal_size=(50, 1), slow_mode = False):
self.killProcess = False
BASH_TIPS = dict(NORMAL='\033[0m',BOLD='\033[1m',DIM='\033[2m',UNDERLINE='\033[4m',
DEFAULT='\033[0m', RED='\033[31m', YELLOW='\033[33m', GREEN='\033[32m',
BG_DEFAULT='\033[49m', BG_WHITE='\033[107m')
ERROR_MSG = BASH_TIPS['RED']+BASH_TIPS['BOLD']+'[ERROR]'+BASH_TIPS['NORMAL']
WARN_MSG = BASH_TIPS['YELLOW']+BASH_TIPS['BOLD']+'[WARN]'+BASH_TIPS['NORMAL']
INFO_MSG = BASH_TIPS['GREEN']+BASH_TIPS['BOLD']+'[INFO]'+BASH_TIPS['NORMAL']
VID_LIST_FOR_AUTO_LOOKUP = "(1A86)|(0403)|(067B)|(10C4)|(C251)|(0403)"
# 1A86:WCH  0403:FTDI  067B:PL  10C4:CL  C251:DAP  0403:OPENEC
ISP_RECEIVE_TIMEOUT = 0.5
MAX_RETRY_TIMES = 10
ISP_FLASH_SECTOR_SIZE = 4096
ISP_FLASH_DATA_FRAME_SIZE = ISP_FLASH_SECTOR_SIZE * 16
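# note: each ISP flash data frame spans 16 sectors, i.e. 16 * 4096 = 65536 bytes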
def tuple2str(t):
ret = ""
for i in t:
ret += i+" "
return ret
def raise_exception(exception):
if self.loader:
try:
self.loader._port.close()
except Exception:
pass
raise exception
try:
from enum import Enum
except ImportError:
err = (ERROR_MSG,'enum34 must be installed, run '+BASH_TIPS['GREEN']+'`' + ('pip', 'pip3')[sys.version_info > (3, 0)] + ' install enum34`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise Exception(err)
try:
import serial
import serial.tools.list_ports
except ImportError:
err = (ERROR_MSG,'PySerial must be installed, run '+BASH_TIPS['GREEN']+'`' + ('pip', 'pip3')[sys.version_info > (3, 0)] + ' install pyserial`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise Exception(err)
class TimeoutError(Exception): pass
class ProgramFileFormat(Enum):
FMT_BINARY = 0
FMT_ELF = 1
FMT_KFPKG = 2
# AES is from: https://github.com/ricmoo/pyaes, Copyright by Richard Moore
class AES:
'''Encapsulates the raw AES block cipher.
You generally should not need this directly; use the AES_128_CBC
wrapper class defined below instead.'''
@staticmethod
def _compact_word(word):
return (word[0] << 24) | (word[1] << 16) | (word[2] << 8) | word[3]
# Number of rounds by keysize
number_of_rounds = {16: 10, 24: 12, 32: 14}
# Round constant words
rcon = [ 0x01, 0x02, 0x04, 0x08, 0x10, 0x20, 0x40, 0x80, 0x1b, 0x36, 0x6c, 0xd8, 0xab, 0x4d, 0x9a, 0x2f, 0x5e, 0xbc, 0x63, 0xc6, 0x97, 0x35, 0x6a, 0xd4, 0xb3, 0x7d, 0xfa, 0xef, 0xc5, 0x91 ]
# S-box and Inverse S-box (S is for Substitution)
S = [ 0x63, 0x7c, 0x77, 0x7b, 0xf2, 0x6b, 0x6f, 0xc5, 0x30, 0x01, 0x67, 0x2b, 0xfe, 0xd7, 0xab, 0x76, 0xca, 0x82, 0xc9, 0x7d, 0xfa, 0x59, 0x47, 0xf0, 0xad, 0xd4, 0xa2, 0xaf, 0x9c, 0xa4, 0x72, 0xc0, 0xb7, 0xfd, 0x93, 0x26, 0x36, 0x3f, 0xf7, 0xcc, 0x34, 0xa5, 0xe5, 0xf1, 0x71, 0xd8, 0x31, 0x15, 0x04, 0xc7, 0x23, 0xc3, 0x18, 0x96, 0x05, 0x9a, 0x07, 0x12, 0x80, 0xe2, 0xeb, 0x27, 0xb2, 0x75, 0x09, 0x83, 0x2c, 0x1a, 0x1b, 0x6e, 0x5a, 0xa0, 0x52, 0x3b, 0xd6, 0xb3, 0x29, 0xe3, 0x2f, 0x84, 0x53, 0xd1, 0x00, 0xed, 0x20, 0xfc, 0xb1, 0x5b, 0x6a, 0xcb, 0xbe, 0x39, 0x4a, 0x4c, 0x58, 0xcf, 0xd0, 0xef, 0xaa, 0xfb, 0x43, 0x4d, 0x33, 0x85, 0x45, 0xf9, 0x02, 0x7f, 0x50, 0x3c, 0x9f, 0xa8, 0x51, 0xa3, 0x40, 0x8f, 0x92, 0x9d, 0x38, 0xf5, 0xbc, 0xb6, 0xda, 0x21, 0x10, 0xff, 0xf3, 0xd2, 0xcd, 0x0c, 0x13, 0xec, 0x5f, 0x97, 0x44, 0x17, 0xc4, 0xa7, 0x7e, 0x3d, 0x64, 0x5d, 0x19, 0x73, 0x60, 0x81, 0x4f, 0xdc, 0x22, 0x2a, 0x90, 0x88, 0x46, 0xee, 0xb8, 0x14, 0xde, 0x5e, 0x0b, 0xdb, 0xe0, 0x32, 0x3a, 0x0a, 0x49, 0x06, 0x24, 0x5c, 0xc2, 0xd3, 0xac, 0x62, 0x91, 0x95, 0xe4, 0x79, 0xe7, 0xc8, 0x37, 0x6d, 0x8d, 0xd5, 0x4e, 0xa9, 0x6c, 0x56, 0xf4, 0xea, 0x65, 0x7a, 0xae, 0x08, 0xba, 0x78, 0x25, 0x2e, 0x1c, 0xa6, 0xb4, 0xc6, 0xe8, 0xdd, 0x74, 0x1f, 0x4b, 0xbd, 0x8b, 0x8a, 0x70, 0x3e, 0xb5, 0x66, 0x48, 0x03, 0xf6, 0x0e, 0x61, 0x35, 0x57, 0xb9, 0x86, 0xc1, 0x1d, 0x9e, 0xe1, 0xf8, 0x98, 0x11, 0x69, 0xd9, 0x8e, 0x94, 0x9b, 0x1e, 0x87, 0xe9, 0xce, 0x55, 0x28, 0xdf, 0x8c, 0xa1, 0x89, 0x0d, 0xbf, 0xe6, 0x42, 0x68, 0x41, 0x99, 0x2d, 0x0f, 0xb0, 0x54, 0xbb, 0x16 ]
Si =[ 0x52, 0x09, 0x6a, 0xd5, 0x30, 0x36, 0xa5, 0x38, 0xbf, 0x40, 0xa3, 0x9e, 0x81, 0xf3, 0xd7, 0xfb, 0x7c, 0xe3, 0x39, 0x82, 0x9b, 0x2f, 0xff, 0x87, 0x34, 0x8e, 0x43, 0x44, 0xc4, 0xde, 0xe9, 0xcb, 0x54, 0x7b, 0x94, 0x32, 0xa6, 0xc2, 0x23, 0x3d, 0xee, 0x4c, 0x95, 0x0b, 0x42, 0xfa, 0xc3, 0x4e, 0x08, 0x2e, 0xa1, 0x66, 0x28, 0xd9, 0x24, 0xb2, 0x76, 0x5b, 0xa2, 0x49, 0x6d, 0x8b, 0xd1, 0x25, 0x72, 0xf8, 0xf6, 0x64, 0x86, 0x68, 0x98, 0x16, 0xd4, 0xa4, 0x5c, 0xcc, 0x5d, 0x65, 0xb6, 0x92, 0x6c, 0x70, 0x48, 0x50, 0xfd, 0xed, 0xb9, 0xda, 0x5e, 0x15, 0x46, 0x57, 0xa7, 0x8d, 0x9d, 0x84, 0x90, 0xd8, 0xab, 0x00, 0x8c, 0xbc, 0xd3, 0x0a, 0xf7, 0xe4, 0x58, 0x05, 0xb8, 0xb3, 0x45, 0x06, 0xd0, 0x2c, 0x1e, 0x8f, 0xca, 0x3f, 0x0f, 0x02, 0xc1, 0xaf, 0xbd, 0x03, 0x01, 0x13, 0x8a, 0x6b, 0x3a, 0x91, 0x11, 0x41, 0x4f, 0x67, 0xdc, 0xea, 0x97, 0xf2, 0xcf, 0xce, 0xf0, 0xb4, 0xe6, 0x73, 0x96, 0xac, 0x74, 0x22, 0xe7, 0xad, 0x35, 0x85, 0xe2, 0xf9, 0x37, 0xe8, 0x1c, 0x75, 0xdf, 0x6e, 0x47, 0xf1, 0x1a, 0x71, 0x1d, 0x29, 0xc5, 0x89, 0x6f, 0xb7, 0x62, 0x0e, 0xaa, 0x18, 0xbe, 0x1b, 0xfc, 0x56, 0x3e, 0x4b, 0xc6, 0xd2, 0x79, 0x20, 0x9a, 0xdb, 0xc0, 0xfe, 0x78, 0xcd, 0x5a, 0xf4, 0x1f, 0xdd, 0xa8, 0x33, 0x88, 0x07, 0xc7, 0x31, 0xb1, 0x12, 0x10, 0x59, 0x27, 0x80, 0xec, 0x5f, 0x60, 0x51, 0x7f, 0xa9, 0x19, 0xb5, 0x4a, 0x0d, 0x2d, 0xe5, 0x7a, 0x9f, 0x93, 0xc9, 0x9c, 0xef, 0xa0, 0xe0, 0x3b, 0x4d, 0xae, 0x2a, 0xf5, 0xb0, 0xc8, 0xeb, 0xbb, 0x3c, 0x83, 0x53, 0x99, 0x61, 0x17, 0x2b, 0x04, 0x7e, 0xba, 0x77, 0xd6, 0x26, 0xe1, 0x69, 0x14, 0x63, 0x55, 0x21, 0x0c, 0x7d ]
# Transformations for encryption
T1 = [ 0xc66363a5, 0xf87c7c84, 0xee777799, 0xf67b7b8d, 0xfff2f20d, 0xd66b6bbd, 0xde6f6fb1, 0x91c5c554, 0x60303050, 0x02010103, 0xce6767a9, 0x562b2b7d, 0xe7fefe19, 0xb5d7d762, 0x4dababe6, 0xec76769a, 0x8fcaca45, 0x1f82829d, 0x89c9c940, 0xfa7d7d87, 0xeffafa15, 0xb25959eb, 0x8e4747c9, 0xfbf0f00b, 0x41adadec, 0xb3d4d467, 0x5fa2a2fd, 0x45afafea, 0x239c9cbf, 0x53a4a4f7, 0xe4727296, 0x9bc0c05b, 0x75b7b7c2, 0xe1fdfd1c, 0x3d9393ae, 0x4c26266a, 0x6c36365a, 0x7e3f3f41, 0xf5f7f702, 0x83cccc4f, 0x6834345c, 0x51a5a5f4, 0xd1e5e534, 0xf9f1f108, 0xe2717193, 0xabd8d873, 0x62313153, 0x2a15153f, 0x0804040c, 0x95c7c752, 0x46232365, 0x9dc3c35e, 0x30181828, 0x379696a1, 0x0a05050f, 0x2f9a9ab5, 0x0e070709, 0x24121236, 0x1b80809b, 0xdfe2e23d, 0xcdebeb26, 0x4e272769, 0x7fb2b2cd, 0xea75759f, 0x1209091b, 0x1d83839e, 0x582c2c74, 0x341a1a2e, 0x361b1b2d, 0xdc6e6eb2, 0xb45a5aee, 0x5ba0a0fb, 0xa45252f6, 0x763b3b4d, 0xb7d6d661, 0x7db3b3ce, 0x5229297b, 0xdde3e33e, 0x5e2f2f71, 0x13848497, 0xa65353f5, 0xb9d1d168, 0x00000000, 0xc1eded2c, 0x40202060, 0xe3fcfc1f, 0x79b1b1c8, 0xb65b5bed, 0xd46a6abe, 0x8dcbcb46, 0x67bebed9, 0x7239394b, 0x944a4ade, 0x984c4cd4, 0xb05858e8, 0x85cfcf4a, 0xbbd0d06b, 0xc5efef2a, 0x4faaaae5, 0xedfbfb16, 0x864343c5, 0x9a4d4dd7, 0x66333355, 0x11858594, 0x8a4545cf, 0xe9f9f910, 0x04020206, 0xfe7f7f81, 0xa05050f0, 0x783c3c44, 0x259f9fba, 0x4ba8a8e3, 0xa25151f3, 0x5da3a3fe, 0x804040c0, 0x058f8f8a, 0x3f9292ad, 0x219d9dbc, 0x70383848, 0xf1f5f504, 0x63bcbcdf, 0x77b6b6c1, 0xafdada75, 0x42212163, 0x20101030, 0xe5ffff1a, 0xfdf3f30e, 0xbfd2d26d, 0x81cdcd4c, 0x180c0c14, 0x26131335, 0xc3ecec2f, 0xbe5f5fe1, 0x359797a2, 0x884444cc, 0x2e171739, 0x93c4c457, 0x55a7a7f2, 0xfc7e7e82, 0x7a3d3d47, 0xc86464ac, 0xba5d5de7, 0x3219192b, 0xe6737395, 0xc06060a0, 0x19818198, 0x9e4f4fd1, 0xa3dcdc7f, 0x44222266, 0x542a2a7e, 0x3b9090ab, 0x0b888883, 0x8c4646ca, 0xc7eeee29, 0x6bb8b8d3, 0x2814143c, 0xa7dede79, 0xbc5e5ee2, 0x160b0b1d, 0xaddbdb76, 0xdbe0e03b, 0x64323256, 0x743a3a4e, 0x140a0a1e, 0x924949db, 0x0c06060a, 0x4824246c, 0xb85c5ce4, 0x9fc2c25d, 0xbdd3d36e, 0x43acacef, 0xc46262a6, 0x399191a8, 0x319595a4, 0xd3e4e437, 0xf279798b, 0xd5e7e732, 0x8bc8c843, 0x6e373759, 0xda6d6db7, 0x018d8d8c, 0xb1d5d564, 0x9c4e4ed2, 0x49a9a9e0, 0xd86c6cb4, 0xac5656fa, 0xf3f4f407, 0xcfeaea25, 0xca6565af, 0xf47a7a8e, 0x47aeaee9, 0x10080818, 0x6fbabad5, 0xf0787888, 0x4a25256f, 0x5c2e2e72, 0x381c1c24, 0x57a6a6f1, 0x73b4b4c7, 0x97c6c651, 0xcbe8e823, 0xa1dddd7c, 0xe874749c, 0x3e1f1f21, 0x964b4bdd, 0x61bdbddc, 0x0d8b8b86, 0x0f8a8a85, 0xe0707090, 0x7c3e3e42, 0x71b5b5c4, 0xcc6666aa, 0x904848d8, 0x06030305, 0xf7f6f601, 0x1c0e0e12, 0xc26161a3, 0x6a35355f, 0xae5757f9, 0x69b9b9d0, 0x17868691, 0x99c1c158, 0x3a1d1d27, 0x279e9eb9, 0xd9e1e138, 0xebf8f813, 0x2b9898b3, 0x22111133, 0xd26969bb, 0xa9d9d970, 0x078e8e89, 0x339494a7, 0x2d9b9bb6, 0x3c1e1e22, 0x15878792, 0xc9e9e920, 0x87cece49, 0xaa5555ff, 0x50282878, 0xa5dfdf7a, 0x038c8c8f, 0x59a1a1f8, 0x09898980, 0x1a0d0d17, 0x65bfbfda, 0xd7e6e631, 0x844242c6, 0xd06868b8, 0x824141c3, 0x299999b0, 0x5a2d2d77, 0x1e0f0f11, 0x7bb0b0cb, 0xa85454fc, 0x6dbbbbd6, 0x2c16163a ]
T2 = [ 0xa5c66363, 0x84f87c7c, 0x99ee7777, 0x8df67b7b, 0x0dfff2f2, 0xbdd66b6b, 0xb1de6f6f, 0x5491c5c5, 0x50603030, 0x03020101, 0xa9ce6767, 0x7d562b2b, 0x19e7fefe, 0x62b5d7d7, 0xe64dabab, 0x9aec7676, 0x458fcaca, 0x9d1f8282, 0x4089c9c9, 0x87fa7d7d, 0x15effafa, 0xebb25959, 0xc98e4747, 0x0bfbf0f0, 0xec41adad, 0x67b3d4d4, 0xfd5fa2a2, 0xea45afaf, 0xbf239c9c, 0xf753a4a4, 0x96e47272, 0x5b9bc0c0, 0xc275b7b7, 0x1ce1fdfd, 0xae3d9393, 0x6a4c2626, 0x5a6c3636, 0x417e3f3f, 0x02f5f7f7, 0x4f83cccc, 0x5c683434, 0xf451a5a5, 0x34d1e5e5, 0x08f9f1f1, 0x93e27171, 0x73abd8d8, 0x53623131, 0x3f2a1515, 0x0c080404, 0x5295c7c7, 0x65462323, 0x5e9dc3c3, 0x28301818, 0xa1379696, 0x0f0a0505, 0xb52f9a9a, 0x090e0707, 0x36241212, 0x9b1b8080, 0x3ddfe2e2, 0x26cdebeb, 0x694e2727, 0xcd7fb2b2, 0x9fea7575, 0x1b120909, 0x9e1d8383, 0x74582c2c, 0x2e341a1a, 0x2d361b1b, 0xb2dc6e6e, 0xeeb45a5a, 0xfb5ba0a0, 0xf6a45252, 0x4d763b3b, 0x61b7d6d6, 0xce7db3b3, 0x7b522929, 0x3edde3e3, 0x715e2f2f, 0x97138484, 0xf5a65353, 0x68b9d1d1, 0x00000000, 0x2cc1eded, 0x60402020, 0x1fe3fcfc, 0xc879b1b1, 0xedb65b5b, 0xbed46a6a, 0x468dcbcb, 0xd967bebe, 0x4b723939, 0xde944a4a, 0xd4984c4c, 0xe8b05858, 0x4a85cfcf, 0x6bbbd0d0, 0x2ac5efef, 0xe54faaaa, 0x16edfbfb, 0xc5864343, 0xd79a4d4d, 0x55663333, 0x94118585, 0xcf8a4545, 0x10e9f9f9, 0x06040202, 0x81fe7f7f, 0xf0a05050, 0x44783c3c, 0xba259f9f, 0xe34ba8a8, 0xf3a25151, 0xfe5da3a3, 0xc0804040, 0x8a058f8f, 0xad3f9292, 0xbc219d9d, 0x48703838, 0x04f1f5f5, 0xdf63bcbc, 0xc177b6b6, 0x75afdada, 0x63422121, 0x30201010, 0x1ae5ffff, 0x0efdf3f3, 0x6dbfd2d2, 0x4c81cdcd, 0x14180c0c, 0x35261313, 0x2fc3ecec, 0xe1be5f5f, 0xa2359797, 0xcc884444, 0x392e1717, 0x5793c4c4, 0xf255a7a7, 0x82fc7e7e, 0x477a3d3d, 0xacc86464, 0xe7ba5d5d, 0x2b321919, 0x95e67373, 0xa0c06060, 0x98198181, 0xd19e4f4f, 0x7fa3dcdc, 0x66442222, 0x7e542a2a, 0xab3b9090, 0x830b8888, 0xca8c4646, 0x29c7eeee, 0xd36bb8b8, 0x3c281414, 0x79a7dede, 0xe2bc5e5e, 0x1d160b0b, 0x76addbdb, 0x3bdbe0e0, 0x56643232, 0x4e743a3a, 0x1e140a0a, 0xdb924949, 0x0a0c0606, 0x6c482424, 0xe4b85c5c, 0x5d9fc2c2, 0x6ebdd3d3, 0xef43acac, 0xa6c46262, 0xa8399191, 0xa4319595, 0x37d3e4e4, 0x8bf27979, 0x32d5e7e7, 0x438bc8c8, 0x596e3737, 0xb7da6d6d, 0x8c018d8d, 0x64b1d5d5, 0xd29c4e4e, 0xe049a9a9, 0xb4d86c6c, 0xfaac5656, 0x07f3f4f4, 0x25cfeaea, 0xafca6565, 0x8ef47a7a, 0xe947aeae, 0x18100808, 0xd56fbaba, 0x88f07878, 0x6f4a2525, 0x725c2e2e, 0x24381c1c, 0xf157a6a6, 0xc773b4b4, 0x5197c6c6, 0x23cbe8e8, 0x7ca1dddd, 0x9ce87474, 0x213e1f1f, 0xdd964b4b, 0xdc61bdbd, 0x860d8b8b, 0x850f8a8a, 0x90e07070, 0x427c3e3e, 0xc471b5b5, 0xaacc6666, 0xd8904848, 0x05060303, 0x01f7f6f6, 0x121c0e0e, 0xa3c26161, 0x5f6a3535, 0xf9ae5757, 0xd069b9b9, 0x91178686, 0x5899c1c1, 0x273a1d1d, 0xb9279e9e, 0x38d9e1e1, 0x13ebf8f8, 0xb32b9898, 0x33221111, 0xbbd26969, 0x70a9d9d9, 0x89078e8e, 0xa7339494, 0xb62d9b9b, 0x223c1e1e, 0x92158787, 0x20c9e9e9, 0x4987cece, 0xffaa5555, 0x78502828, 0x7aa5dfdf, 0x8f038c8c, 0xf859a1a1, 0x80098989, 0x171a0d0d, 0xda65bfbf, 0x31d7e6e6, 0xc6844242, 0xb8d06868, 0xc3824141, 0xb0299999, 0x775a2d2d, 0x111e0f0f, 0xcb7bb0b0, 0xfca85454, 0xd66dbbbb, 0x3a2c1616 ]
T3 = [ 0x63a5c663, 0x7c84f87c, 0x7799ee77, 0x7b8df67b, 0xf20dfff2, 0x6bbdd66b, 0x6fb1de6f, 0xc55491c5, 0x30506030, 0x01030201, 0x67a9ce67, 0x2b7d562b, 0xfe19e7fe, 0xd762b5d7, 0xabe64dab, 0x769aec76, 0xca458fca, 0x829d1f82, 0xc94089c9, 0x7d87fa7d, 0xfa15effa, 0x59ebb259, 0x47c98e47, 0xf00bfbf0, 0xadec41ad, 0xd467b3d4, 0xa2fd5fa2, 0xafea45af, 0x9cbf239c, 0xa4f753a4, 0x7296e472, 0xc05b9bc0, 0xb7c275b7, 0xfd1ce1fd, 0x93ae3d93, 0x266a4c26, 0x365a6c36, 0x3f417e3f, 0xf702f5f7, 0xcc4f83cc, 0x345c6834, 0xa5f451a5, 0xe534d1e5, 0xf108f9f1, 0x7193e271, 0xd873abd8, 0x31536231, 0x153f2a15, 0x040c0804, 0xc75295c7, 0x23654623, 0xc35e9dc3, 0x18283018, 0x96a13796, 0x050f0a05, 0x9ab52f9a, 0x07090e07, 0x12362412, 0x809b1b80, 0xe23ddfe2, 0xeb26cdeb, 0x27694e27, 0xb2cd7fb2, 0x759fea75, 0x091b1209, 0x839e1d83, 0x2c74582c, 0x1a2e341a, 0x1b2d361b, 0x6eb2dc6e, 0x5aeeb45a, 0xa0fb5ba0, 0x52f6a452, 0x3b4d763b, 0xd661b7d6, 0xb3ce7db3, 0x297b5229, 0xe33edde3, 0x2f715e2f, 0x84971384, 0x53f5a653, 0xd168b9d1, 0x00000000, 0xed2cc1ed, 0x20604020, 0xfc1fe3fc, 0xb1c879b1, 0x5bedb65b, 0x6abed46a, 0xcb468dcb, 0xbed967be, 0x394b7239, 0x4ade944a, 0x4cd4984c, 0x58e8b058, 0xcf4a85cf, 0xd06bbbd0, 0xef2ac5ef, 0xaae54faa, 0xfb16edfb, 0x43c58643, 0x4dd79a4d, 0x33556633, 0x85941185, 0x45cf8a45, 0xf910e9f9, 0x02060402, 0x7f81fe7f, 0x50f0a050, 0x3c44783c, 0x9fba259f, 0xa8e34ba8, 0x51f3a251, 0xa3fe5da3, 0x40c08040, 0x8f8a058f, 0x92ad3f92, 0x9dbc219d, 0x38487038, 0xf504f1f5, 0xbcdf63bc, 0xb6c177b6, 0xda75afda, 0x21634221, 0x10302010, 0xff1ae5ff, 0xf30efdf3, 0xd26dbfd2, 0xcd4c81cd, 0x0c14180c, 0x13352613, 0xec2fc3ec, 0x5fe1be5f, 0x97a23597, 0x44cc8844, 0x17392e17, 0xc45793c4, 0xa7f255a7, 0x7e82fc7e, 0x3d477a3d, 0x64acc864, 0x5de7ba5d, 0x192b3219, 0x7395e673, 0x60a0c060, 0x81981981, 0x4fd19e4f, 0xdc7fa3dc, 0x22664422, 0x2a7e542a, 0x90ab3b90, 0x88830b88, 0x46ca8c46, 0xee29c7ee, 0xb8d36bb8, 0x143c2814, 0xde79a7de, 0x5ee2bc5e, 0x0b1d160b, 0xdb76addb, 0xe03bdbe0, 0x32566432, 0x3a4e743a, 0x0a1e140a, 0x49db9249, 0x060a0c06, 0x246c4824, 0x5ce4b85c, 0xc25d9fc2, 0xd36ebdd3, 0xacef43ac, 0x62a6c462, 0x91a83991, 0x95a43195, 0xe437d3e4, 0x798bf279, 0xe732d5e7, 0xc8438bc8, 0x37596e37, 0x6db7da6d, 0x8d8c018d, 0xd564b1d5, 0x4ed29c4e, 0xa9e049a9, 0x6cb4d86c, 0x56faac56, 0xf407f3f4, 0xea25cfea, 0x65afca65, 0x7a8ef47a, 0xaee947ae, 0x08181008, 0xbad56fba, 0x7888f078, 0x256f4a25, 0x2e725c2e, 0x1c24381c, 0xa6f157a6, 0xb4c773b4, 0xc65197c6, 0xe823cbe8, 0xdd7ca1dd, 0x749ce874, 0x1f213e1f, 0x4bdd964b, 0xbddc61bd, 0x8b860d8b, 0x8a850f8a, 0x7090e070, 0x3e427c3e, 0xb5c471b5, 0x66aacc66, 0x48d89048, 0x03050603, 0xf601f7f6, 0x0e121c0e, 0x61a3c261, 0x355f6a35, 0x57f9ae57, 0xb9d069b9, 0x86911786, 0xc15899c1, 0x1d273a1d, 0x9eb9279e, 0xe138d9e1, 0xf813ebf8, 0x98b32b98, 0x11332211, 0x69bbd269, 0xd970a9d9, 0x8e89078e, 0x94a73394, 0x9bb62d9b, 0x1e223c1e, 0x87921587, 0xe920c9e9, 0xce4987ce, 0x55ffaa55, 0x28785028, 0xdf7aa5df, 0x8c8f038c, 0xa1f859a1, 0x89800989, 0x0d171a0d, 0xbfda65bf, 0xe631d7e6, 0x42c68442, 0x68b8d068, 0x41c38241, 0x99b02999, 0x2d775a2d, 0x0f111e0f, 0xb0cb7bb0, 0x54fca854, 0xbbd66dbb, 0x163a2c16 ]
T4 = [ 0x6363a5c6, 0x7c7c84f8, 0x777799ee, 0x7b7b8df6, 0xf2f20dff, 0x6b6bbdd6, 0x6f6fb1de, 0xc5c55491, 0x30305060, 0x01010302, 0x6767a9ce, 0x2b2b7d56, 0xfefe19e7, 0xd7d762b5, 0xababe64d, 0x76769aec, 0xcaca458f, 0x82829d1f, 0xc9c94089, 0x7d7d87fa, 0xfafa15ef, 0x5959ebb2, 0x4747c98e, 0xf0f00bfb, 0xadadec41, 0xd4d467b3, 0xa2a2fd5f, 0xafafea45, 0x9c9cbf23, 0xa4a4f753, 0x727296e4, 0xc0c05b9b, 0xb7b7c275, 0xfdfd1ce1, 0x9393ae3d, 0x26266a4c, 0x36365a6c, 0x3f3f417e, 0xf7f702f5, 0xcccc4f83, 0x34345c68, 0xa5a5f451, 0xe5e534d1, 0xf1f108f9, 0x717193e2, 0xd8d873ab, 0x31315362, 0x15153f2a, 0x04040c08, 0xc7c75295, 0x23236546, 0xc3c35e9d, 0x18182830, 0x9696a137, 0x05050f0a, 0x9a9ab52f, 0x0707090e, 0x12123624, 0x80809b1b, 0xe2e23ddf, 0xebeb26cd, 0x2727694e, 0xb2b2cd7f, 0x75759fea, 0x09091b12, 0x83839e1d, 0x2c2c7458, 0x1a1a2e34, 0x1b1b2d36, 0x6e6eb2dc, 0x5a5aeeb4, 0xa0a0fb5b, 0x5252f6a4, 0x3b3b4d76, 0xd6d661b7, 0xb3b3ce7d, 0x29297b52, 0xe3e33edd, 0x2f2f715e, 0x84849713, 0x5353f5a6, 0xd1d168b9, 0x00000000, 0xeded2cc1, 0x20206040, 0xfcfc1fe3, 0xb1b1c879, 0x5b5bedb6, 0x6a6abed4, 0xcbcb468d, 0xbebed967, 0x39394b72, 0x4a4ade94, 0x4c4cd498, 0x5858e8b0, 0xcfcf4a85, 0xd0d06bbb, 0xefef2ac5, 0xaaaae54f, 0xfbfb16ed, 0x4343c586, 0x4d4dd79a, 0x33335566, 0x85859411, 0x4545cf8a, 0xf9f910e9, 0x02020604, 0x7f7f81fe, 0x5050f0a0, 0x3c3c4478, 0x9f9fba25, 0xa8a8e34b, 0x5151f3a2, 0xa3a3fe5d, 0x4040c080, 0x8f8f8a05, 0x9292ad3f, 0x9d9dbc21, 0x38384870, 0xf5f504f1, 0xbcbcdf63, 0xb6b6c177, 0xdada75af, 0x21216342, 0x10103020, 0xffff1ae5, 0xf3f30efd, 0xd2d26dbf, 0xcdcd4c81, 0x0c0c1418, 0x13133526, 0xecec2fc3, 0x5f5fe1be, 0x9797a235, 0x4444cc88, 0x1717392e, 0xc4c45793, 0xa7a7f255, 0x7e7e82fc, 0x3d3d477a, 0x6464acc8, 0x5d5de7ba, 0x19192b32, 0x737395e6, 0x6060a0c0, 0x81819819, 0x4f4fd19e, 0xdcdc7fa3, 0x22226644, 0x2a2a7e54, 0x9090ab3b, 0x8888830b, 0x4646ca8c, 0xeeee29c7, 0xb8b8d36b, 0x14143c28, 0xdede79a7, 0x5e5ee2bc, 0x0b0b1d16, 0xdbdb76ad, 0xe0e03bdb, 0x32325664, 0x3a3a4e74, 0x0a0a1e14, 0x4949db92, 0x06060a0c, 0x24246c48, 0x5c5ce4b8, 0xc2c25d9f, 0xd3d36ebd, 0xacacef43, 0x6262a6c4, 0x9191a839, 0x9595a431, 0xe4e437d3, 0x79798bf2, 0xe7e732d5, 0xc8c8438b, 0x3737596e, 0x6d6db7da, 0x8d8d8c01, 0xd5d564b1, 0x4e4ed29c, 0xa9a9e049, 0x6c6cb4d8, 0x5656faac, 0xf4f407f3, 0xeaea25cf, 0x6565afca, 0x7a7a8ef4, 0xaeaee947, 0x08081810, 0xbabad56f, 0x787888f0, 0x25256f4a, 0x2e2e725c, 0x1c1c2438, 0xa6a6f157, 0xb4b4c773, 0xc6c65197, 0xe8e823cb, 0xdddd7ca1, 0x74749ce8, 0x1f1f213e, 0x4b4bdd96, 0xbdbddc61, 0x8b8b860d, 0x8a8a850f, 0x707090e0, 0x3e3e427c, 0xb5b5c471, 0x6666aacc, 0x4848d890, 0x03030506, 0xf6f601f7, 0x0e0e121c, 0x6161a3c2, 0x35355f6a, 0x5757f9ae, 0xb9b9d069, 0x86869117, 0xc1c15899, 0x1d1d273a, 0x9e9eb927, 0xe1e138d9, 0xf8f813eb, 0x9898b32b, 0x11113322, 0x6969bbd2, 0xd9d970a9, 0x8e8e8907, 0x9494a733, 0x9b9bb62d, 0x1e1e223c, 0x87879215, 0xe9e920c9, 0xcece4987, 0x5555ffaa, 0x28287850, 0xdfdf7aa5, 0x8c8c8f03, 0xa1a1f859, 0x89898009, 0x0d0d171a, 0xbfbfda65, 0xe6e631d7, 0x4242c684, 0x6868b8d0, 0x4141c382, 0x9999b029, 0x2d2d775a, 0x0f0f111e, 0xb0b0cb7b, 0x5454fca8, 0xbbbbd66d, 0x16163a2c ]
# Transformations for decryption
T5 = [ 0x51f4a750, 0x7e416553, 0x1a17a4c3, 0x3a275e96, 0x3bab6bcb, 0x1f9d45f1, 0xacfa58ab, 0x4be30393, 0x2030fa55, 0xad766df6, 0x88cc7691, 0xf5024c25, 0x4fe5d7fc, 0xc52acbd7, 0x26354480, 0xb562a38f, 0xdeb15a49, 0x25ba1b67, 0x45ea0e98, 0x5dfec0e1, 0xc32f7502, 0x814cf012, 0x8d4697a3, 0x6bd3f9c6, 0x038f5fe7, 0x15929c95, 0xbf6d7aeb, 0x955259da, 0xd4be832d, 0x587421d3, 0x49e06929, 0x8ec9c844, 0x75c2896a, 0xf48e7978, 0x99583e6b, 0x27b971dd, 0xbee14fb6, 0xf088ad17, 0xc920ac66, 0x7dce3ab4, 0x63df4a18, 0xe51a3182, 0x97513360, 0x62537f45, 0xb16477e0, 0xbb6bae84, 0xfe81a01c, 0xf9082b94, 0x70486858, 0x8f45fd19, 0x94de6c87, 0x527bf8b7, 0xab73d323, 0x724b02e2, 0xe31f8f57, 0x6655ab2a, 0xb2eb2807, 0x2fb5c203, 0x86c57b9a, 0xd33708a5, 0x302887f2, 0x23bfa5b2, 0x02036aba, 0xed16825c, 0x8acf1c2b, 0xa779b492, 0xf307f2f0, 0x4e69e2a1, 0x65daf4cd, 0x0605bed5, 0xd134621f, 0xc4a6fe8a, 0x342e539d, 0xa2f355a0, 0x058ae132, 0xa4f6eb75, 0x0b83ec39, 0x4060efaa, 0x5e719f06, 0xbd6e1051, 0x3e218af9, 0x96dd063d, 0xdd3e05ae, 0x4de6bd46, 0x91548db5, 0x71c45d05, 0x0406d46f, 0x605015ff, 0x1998fb24, 0xd6bde997, 0x894043cc, 0x67d99e77, 0xb0e842bd, 0x07898b88, 0xe7195b38, 0x79c8eedb, 0xa17c0a47, 0x7c420fe9, 0xf8841ec9, 0x00000000, 0x09808683, 0x322bed48, 0x1e1170ac, 0x6c5a724e, 0xfd0efffb, 0x0f853856, 0x3daed51e, 0x362d3927, 0x0a0fd964, 0x685ca621, 0x9b5b54d1, 0x24362e3a, 0x0c0a67b1, 0x9357e70f, 0xb4ee96d2, 0x1b9b919e, 0x80c0c54f, 0x61dc20a2, 0x5a774b69, 0x1c121a16, 0xe293ba0a, 0xc0a02ae5, 0x3c22e043, 0x121b171d, 0x0e090d0b, 0xf28bc7ad, 0x2db6a8b9, 0x141ea9c8, 0x57f11985, 0xaf75074c, 0xee99ddbb, 0xa37f60fd, 0xf701269f, 0x5c72f5bc, 0x44663bc5, 0x5bfb7e34, 0x8b432976, 0xcb23c6dc, 0xb6edfc68, 0xb8e4f163, 0xd731dcca, 0x42638510, 0x13972240, 0x84c61120, 0x854a247d, 0xd2bb3df8, 0xaef93211, 0xc729a16d, 0x1d9e2f4b, 0xdcb230f3, 0x0d8652ec, 0x77c1e3d0, 0x2bb3166c, 0xa970b999, 0x119448fa, 0x47e96422, 0xa8fc8cc4, 0xa0f03f1a, 0x567d2cd8, 0x223390ef, 0x87494ec7, 0xd938d1c1, 0x8ccaa2fe, 0x98d40b36, 0xa6f581cf, 0xa57ade28, 0xdab78e26, 0x3fadbfa4, 0x2c3a9de4, 0x5078920d, 0x6a5fcc9b, 0x547e4662, 0xf68d13c2, 0x90d8b8e8, 0x2e39f75e, 0x82c3aff5, 0x9f5d80be, 0x69d0937c, 0x6fd52da9, 0xcf2512b3, 0xc8ac993b, 0x10187da7, 0xe89c636e, 0xdb3bbb7b, 0xcd267809, 0x6e5918f4, 0xec9ab701, 0x834f9aa8, 0xe6956e65, 0xaaffe67e, 0x21bccf08, 0xef15e8e6, 0xbae79bd9, 0x4a6f36ce, 0xea9f09d4, 0x29b07cd6, 0x31a4b2af, 0x2a3f2331, 0xc6a59430, 0x35a266c0, 0x744ebc37, 0xfc82caa6, 0xe090d0b0, 0x33a7d815, 0xf104984a, 0x41ecdaf7, 0x7fcd500e, 0x1791f62f, 0x764dd68d, 0x43efb04d, 0xccaa4d54, 0xe49604df, 0x9ed1b5e3, 0x4c6a881b, 0xc12c1fb8, 0x4665517f, 0x9d5eea04, 0x018c355d, 0xfa877473, 0xfb0b412e, 0xb3671d5a, 0x92dbd252, 0xe9105633, 0x6dd64713, 0x9ad7618c, 0x37a10c7a, 0x59f8148e, 0xeb133c89, 0xcea927ee, 0xb761c935, 0xe11ce5ed, 0x7a47b13c, 0x9cd2df59, 0x55f2733f, 0x1814ce79, 0x73c737bf, 0x53f7cdea, 0x5ffdaa5b, 0xdf3d6f14, 0x7844db86, 0xcaaff381, 0xb968c43e, 0x3824342c, 0xc2a3405f, 0x161dc372, 0xbce2250c, 0x283c498b, 0xff0d9541, 0x39a80171, 0x080cb3de, 0xd8b4e49c, 0x6456c190, 0x7bcb8461, 0xd532b670, 0x486c5c74, 0xd0b85742 ]
T6 = [ 0x5051f4a7, 0x537e4165, 0xc31a17a4, 0x963a275e, 0xcb3bab6b, 0xf11f9d45, 0xabacfa58, 0x934be303, 0x552030fa, 0xf6ad766d, 0x9188cc76, 0x25f5024c, 0xfc4fe5d7, 0xd7c52acb, 0x80263544, 0x8fb562a3, 0x49deb15a, 0x6725ba1b, 0x9845ea0e, 0xe15dfec0, 0x02c32f75, 0x12814cf0, 0xa38d4697, 0xc66bd3f9, 0xe7038f5f, 0x9515929c, 0xebbf6d7a, 0xda955259, 0x2dd4be83, 0xd3587421, 0x2949e069, 0x448ec9c8, 0x6a75c289, 0x78f48e79, 0x6b99583e, 0xdd27b971, 0xb6bee14f, 0x17f088ad, 0x66c920ac, 0xb47dce3a, 0x1863df4a, 0x82e51a31, 0x60975133, 0x4562537f, 0xe0b16477, 0x84bb6bae, 0x1cfe81a0, 0x94f9082b, 0x58704868, 0x198f45fd, 0x8794de6c, 0xb7527bf8, 0x23ab73d3, 0xe2724b02, 0x57e31f8f, 0x2a6655ab, 0x07b2eb28, 0x032fb5c2, 0x9a86c57b, 0xa5d33708, 0xf2302887, 0xb223bfa5, 0xba02036a, 0x5ced1682, 0x2b8acf1c, 0x92a779b4, 0xf0f307f2, 0xa14e69e2, 0xcd65daf4, 0xd50605be, 0x1fd13462, 0x8ac4a6fe, 0x9d342e53, 0xa0a2f355, 0x32058ae1, 0x75a4f6eb, 0x390b83ec, 0xaa4060ef, 0x065e719f, 0x51bd6e10, 0xf93e218a, 0x3d96dd06, 0xaedd3e05, 0x464de6bd, 0xb591548d, 0x0571c45d, 0x6f0406d4, 0xff605015, 0x241998fb, 0x97d6bde9, 0xcc894043, 0x7767d99e, 0xbdb0e842, 0x8807898b, 0x38e7195b, 0xdb79c8ee, 0x47a17c0a, 0xe97c420f, 0xc9f8841e, 0x00000000, 0x83098086, 0x48322bed, 0xac1e1170, 0x4e6c5a72, 0xfbfd0eff, 0x560f8538, 0x1e3daed5, 0x27362d39, 0x640a0fd9, 0x21685ca6, 0xd19b5b54, 0x3a24362e, 0xb10c0a67, 0x0f9357e7, 0xd2b4ee96, 0x9e1b9b91, 0x4f80c0c5, 0xa261dc20, 0x695a774b, 0x161c121a, 0x0ae293ba, 0xe5c0a02a, 0x433c22e0, 0x1d121b17, 0x0b0e090d, 0xadf28bc7, 0xb92db6a8, 0xc8141ea9, 0x8557f119, 0x4caf7507, 0xbbee99dd, 0xfda37f60, 0x9ff70126, 0xbc5c72f5, 0xc544663b, 0x345bfb7e, 0x768b4329, 0xdccb23c6, 0x68b6edfc, 0x63b8e4f1, 0xcad731dc, 0x10426385, 0x40139722, 0x2084c611, 0x7d854a24, 0xf8d2bb3d, 0x11aef932, 0x6dc729a1, 0x4b1d9e2f, 0xf3dcb230, 0xec0d8652, 0xd077c1e3, 0x6c2bb316, 0x99a970b9, 0xfa119448, 0x2247e964, 0xc4a8fc8c, 0x1aa0f03f, 0xd8567d2c, 0xef223390, 0xc787494e, 0xc1d938d1, 0xfe8ccaa2, 0x3698d40b, 0xcfa6f581, 0x28a57ade, 0x26dab78e, 0xa43fadbf, 0xe42c3a9d, 0x0d507892, 0x9b6a5fcc, 0x62547e46, 0xc2f68d13, 0xe890d8b8, 0x5e2e39f7, 0xf582c3af, 0xbe9f5d80, 0x7c69d093, 0xa96fd52d, 0xb3cf2512, 0x3bc8ac99, 0xa710187d, 0x6ee89c63, 0x7bdb3bbb, 0x09cd2678, 0xf46e5918, 0x01ec9ab7, 0xa8834f9a, 0x65e6956e, 0x7eaaffe6, 0x0821bccf, 0xe6ef15e8, 0xd9bae79b, 0xce4a6f36, 0xd4ea9f09, 0xd629b07c, 0xaf31a4b2, 0x312a3f23, 0x30c6a594, 0xc035a266, 0x37744ebc, 0xa6fc82ca, 0xb0e090d0, 0x1533a7d8, 0x4af10498, 0xf741ecda, 0x0e7fcd50, 0x2f1791f6, 0x8d764dd6, 0x4d43efb0, 0x54ccaa4d, 0xdfe49604, 0xe39ed1b5, 0x1b4c6a88, 0xb8c12c1f, 0x7f466551, 0x049d5eea, 0x5d018c35, 0x73fa8774, 0x2efb0b41, 0x5ab3671d, 0x5292dbd2, 0x33e91056, 0x136dd647, 0x8c9ad761, 0x7a37a10c, 0x8e59f814, 0x89eb133c, 0xeecea927, 0x35b761c9, 0xede11ce5, 0x3c7a47b1, 0x599cd2df, 0x3f55f273, 0x791814ce, 0xbf73c737, 0xea53f7cd, 0x5b5ffdaa, 0x14df3d6f, 0x867844db, 0x81caaff3, 0x3eb968c4, 0x2c382434, 0x5fc2a340, 0x72161dc3, 0x0cbce225, 0x8b283c49, 0x41ff0d95, 0x7139a801, 0xde080cb3, 0x9cd8b4e4, 0x906456c1, 0x617bcb84, 0x70d532b6, 0x74486c5c, 0x42d0b857 ]
T7 = [ 0xa75051f4, 0x65537e41, 0xa4c31a17, 0x5e963a27, 0x6bcb3bab, 0x45f11f9d, 0x58abacfa, 0x03934be3, 0xfa552030, 0x6df6ad76, 0x769188cc, 0x4c25f502, 0xd7fc4fe5, 0xcbd7c52a, 0x44802635, 0xa38fb562, 0x5a49deb1, 0x1b6725ba, 0x0e9845ea, 0xc0e15dfe, 0x7502c32f, 0xf012814c, 0x97a38d46, 0xf9c66bd3, 0x5fe7038f, 0x9c951592, 0x7aebbf6d, 0x59da9552, 0x832dd4be, 0x21d35874, 0x692949e0, 0xc8448ec9, 0x896a75c2, 0x7978f48e, 0x3e6b9958, 0x71dd27b9, 0x4fb6bee1, 0xad17f088, 0xac66c920, 0x3ab47dce, 0x4a1863df, 0x3182e51a, 0x33609751, 0x7f456253, 0x77e0b164, 0xae84bb6b, 0xa01cfe81, 0x2b94f908, 0x68587048, 0xfd198f45, 0x6c8794de, 0xf8b7527b, 0xd323ab73, 0x02e2724b, 0x8f57e31f, 0xab2a6655, 0x2807b2eb, 0xc2032fb5, 0x7b9a86c5, 0x08a5d337, 0x87f23028, 0xa5b223bf, 0x6aba0203, 0x825ced16, 0x1c2b8acf, 0xb492a779, 0xf2f0f307, 0xe2a14e69, 0xf4cd65da, 0xbed50605, 0x621fd134, 0xfe8ac4a6, 0x539d342e, 0x55a0a2f3, 0xe132058a, 0xeb75a4f6, 0xec390b83, 0xefaa4060, 0x9f065e71, 0x1051bd6e, 0x8af93e21, 0x063d96dd, 0x05aedd3e, 0xbd464de6, 0x8db59154, 0x5d0571c4, 0xd46f0406, 0x15ff6050, 0xfb241998, 0xe997d6bd, 0x43cc8940, 0x9e7767d9, 0x42bdb0e8, 0x8b880789, 0x5b38e719, 0xeedb79c8, 0x0a47a17c, 0x0fe97c42, 0x1ec9f884, 0x00000000, 0x86830980, 0xed48322b, 0x70ac1e11, 0x724e6c5a, 0xfffbfd0e, 0x38560f85, 0xd51e3dae, 0x3927362d, 0xd9640a0f, 0xa621685c, 0x54d19b5b, 0x2e3a2436, 0x67b10c0a, 0xe70f9357, 0x96d2b4ee, 0x919e1b9b, 0xc54f80c0, 0x20a261dc, 0x4b695a77, 0x1a161c12, 0xba0ae293, 0x2ae5c0a0, 0xe0433c22, 0x171d121b, 0x0d0b0e09, 0xc7adf28b, 0xa8b92db6, 0xa9c8141e, 0x198557f1, 0x074caf75, 0xddbbee99, 0x60fda37f, 0x269ff701, 0xf5bc5c72, 0x3bc54466, 0x7e345bfb, 0x29768b43, 0xc6dccb23, 0xfc68b6ed, 0xf163b8e4, 0xdccad731, 0x85104263, 0x22401397, 0x112084c6, 0x247d854a, 0x3df8d2bb, 0x3211aef9, 0xa16dc729, 0x2f4b1d9e, 0x30f3dcb2, 0x52ec0d86, 0xe3d077c1, 0x166c2bb3, 0xb999a970, 0x48fa1194, 0x642247e9, 0x8cc4a8fc, 0x3f1aa0f0, 0x2cd8567d, 0x90ef2233, 0x4ec78749, 0xd1c1d938, 0xa2fe8cca, 0x0b3698d4, 0x81cfa6f5, 0xde28a57a, 0x8e26dab7, 0xbfa43fad, 0x9de42c3a, 0x920d5078, 0xcc9b6a5f, 0x4662547e, 0x13c2f68d, 0xb8e890d8, 0xf75e2e39, 0xaff582c3, 0x80be9f5d, 0x937c69d0, 0x2da96fd5, 0x12b3cf25, 0x993bc8ac, 0x7da71018, 0x636ee89c, 0xbb7bdb3b, 0x7809cd26, 0x18f46e59, 0xb701ec9a, 0x9aa8834f, 0x6e65e695, 0xe67eaaff, 0xcf0821bc, 0xe8e6ef15, 0x9bd9bae7, 0x36ce4a6f, 0x09d4ea9f, 0x7cd629b0, 0xb2af31a4, 0x23312a3f, 0x9430c6a5, 0x66c035a2, 0xbc37744e, 0xcaa6fc82, 0xd0b0e090, 0xd81533a7, 0x984af104, 0xdaf741ec, 0x500e7fcd, 0xf62f1791, 0xd68d764d, 0xb04d43ef, 0x4d54ccaa, 0x04dfe496, 0xb5e39ed1, 0x881b4c6a, 0x1fb8c12c, 0x517f4665, 0xea049d5e, 0x355d018c, 0x7473fa87, 0x412efb0b, 0x1d5ab367, 0xd25292db, 0x5633e910, 0x47136dd6, 0x618c9ad7, 0x0c7a37a1, 0x148e59f8, 0x3c89eb13, 0x27eecea9, 0xc935b761, 0xe5ede11c, 0xb13c7a47, 0xdf599cd2, 0x733f55f2, 0xce791814, 0x37bf73c7, 0xcdea53f7, 0xaa5b5ffd, 0x6f14df3d, 0xdb867844, 0xf381caaf, 0xc43eb968, 0x342c3824, 0x405fc2a3, 0xc372161d, 0x250cbce2, 0x498b283c, 0x9541ff0d, 0x017139a8, 0xb3de080c, 0xe49cd8b4, 0xc1906456, 0x84617bcb, 0xb670d532, 0x5c74486c, 0x5742d0b8 ]
T8 = [ 0xf4a75051, 0x4165537e, 0x17a4c31a, 0x275e963a, 0xab6bcb3b, 0x9d45f11f, 0xfa58abac, 0xe303934b, 0x30fa5520, 0x766df6ad, 0xcc769188, 0x024c25f5, 0xe5d7fc4f, 0x2acbd7c5, 0x35448026, 0x62a38fb5, 0xb15a49de, 0xba1b6725, 0xea0e9845, 0xfec0e15d, 0x2f7502c3, 0x4cf01281, 0x4697a38d, 0xd3f9c66b, 0x8f5fe703, 0x929c9515, 0x6d7aebbf, 0x5259da95, 0xbe832dd4, 0x7421d358, 0xe0692949, 0xc9c8448e, 0xc2896a75, 0x8e7978f4, 0x583e6b99, 0xb971dd27, 0xe14fb6be, 0x88ad17f0, 0x20ac66c9, 0xce3ab47d, 0xdf4a1863, 0x1a3182e5, 0x51336097, 0x537f4562, 0x6477e0b1, 0x6bae84bb, 0x81a01cfe, 0x082b94f9, 0x48685870, 0x45fd198f, 0xde6c8794, 0x7bf8b752, 0x73d323ab, 0x4b02e272, 0x1f8f57e3, 0x55ab2a66, 0xeb2807b2, 0xb5c2032f, 0xc57b9a86, 0x3708a5d3, 0x2887f230, 0xbfa5b223, 0x036aba02, 0x16825ced, 0xcf1c2b8a, 0x79b492a7, 0x07f2f0f3, 0x69e2a14e, 0xdaf4cd65, 0x05bed506, 0x34621fd1, 0xa6fe8ac4, 0x2e539d34, 0xf355a0a2, 0x8ae13205, 0xf6eb75a4, 0x83ec390b, 0x60efaa40, 0x719f065e, 0x6e1051bd, 0x218af93e, 0xdd063d96, 0x3e05aedd, 0xe6bd464d, 0x548db591, 0xc45d0571, 0x06d46f04, 0x5015ff60, 0x98fb2419, 0xbde997d6, 0x4043cc89, 0xd99e7767, 0xe842bdb0, 0x898b8807, 0x195b38e7, 0xc8eedb79, 0x7c0a47a1, 0x420fe97c, 0x841ec9f8, 0x00000000, 0x80868309, 0x2bed4832, 0x1170ac1e, 0x5a724e6c, 0x0efffbfd, 0x8538560f, 0xaed51e3d, 0x2d392736, 0x0fd9640a, 0x5ca62168, 0x5b54d19b, 0x362e3a24, 0x0a67b10c, 0x57e70f93, 0xee96d2b4, 0x9b919e1b, 0xc0c54f80, 0xdc20a261, 0x774b695a, 0x121a161c, 0x93ba0ae2, 0xa02ae5c0, 0x22e0433c, 0x1b171d12, 0x090d0b0e, 0x8bc7adf2, 0xb6a8b92d, 0x1ea9c814, 0xf1198557, 0x75074caf, 0x99ddbbee, 0x7f60fda3, 0x01269ff7, 0x72f5bc5c, 0x663bc544, 0xfb7e345b, 0x4329768b, 0x23c6dccb, 0xedfc68b6, 0xe4f163b8, 0x31dccad7, 0x63851042, 0x97224013, 0xc6112084, 0x4a247d85, 0xbb3df8d2, 0xf93211ae, 0x29a16dc7, 0x9e2f4b1d, 0xb230f3dc, 0x8652ec0d, 0xc1e3d077, 0xb3166c2b, 0x70b999a9, 0x9448fa11, 0xe9642247, 0xfc8cc4a8, 0xf03f1aa0, 0x7d2cd856, 0x3390ef22, 0x494ec787, 0x38d1c1d9, 0xcaa2fe8c, 0xd40b3698, 0xf581cfa6, 0x7ade28a5, 0xb78e26da, 0xadbfa43f, 0x3a9de42c, 0x78920d50, 0x5fcc9b6a, 0x7e466254, 0x8d13c2f6, 0xd8b8e890, 0x39f75e2e, 0xc3aff582, 0x5d80be9f, 0xd0937c69, 0xd52da96f, 0x2512b3cf, 0xac993bc8, 0x187da710, 0x9c636ee8, 0x3bbb7bdb, 0x267809cd, 0x5918f46e, 0x9ab701ec, 0x4f9aa883, 0x956e65e6, 0xffe67eaa, 0xbccf0821, 0x15e8e6ef, 0xe79bd9ba, 0x6f36ce4a, 0x9f09d4ea, 0xb07cd629, 0xa4b2af31, 0x3f23312a, 0xa59430c6, 0xa266c035, 0x4ebc3774, 0x82caa6fc, 0x90d0b0e0, 0xa7d81533, 0x04984af1, 0xecdaf741, 0xcd500e7f, 0x91f62f17, 0x4dd68d76, 0xefb04d43, 0xaa4d54cc, 0x9604dfe4, 0xd1b5e39e, 0x6a881b4c, 0x2c1fb8c1, 0x65517f46, 0x5eea049d, 0x8c355d01, 0x877473fa, 0x0b412efb, 0x671d5ab3, 0xdbd25292, 0x105633e9, 0xd647136d, 0xd7618c9a, 0xa10c7a37, 0xf8148e59, 0x133c89eb, 0xa927eece, 0x61c935b7, 0x1ce5ede1, 0x47b13c7a, 0xd2df599c, 0xf2733f55, 0x14ce7918, 0xc737bf73, 0xf7cdea53, 0xfdaa5b5f, 0x3d6f14df, 0x44db8678, 0xaff381ca, 0x68c43eb9, 0x24342c38, 0xa3405fc2, 0x1dc37216, 0xe2250cbc, 0x3c498b28, 0x0d9541ff, 0xa8017139, 0x0cb3de08, 0xb4e49cd8, 0x56c19064, 0xcb84617b, 0x32b670d5, 0x6c5c7448, 0xb85742d0 ]
# Transformations for decryption key expansion
U1 = [ 0x00000000, 0x0e090d0b, 0x1c121a16, 0x121b171d, 0x3824342c, 0x362d3927, 0x24362e3a, 0x2a3f2331, 0x70486858, 0x7e416553, 0x6c5a724e, 0x62537f45, 0x486c5c74, 0x4665517f, 0x547e4662, 0x5a774b69, 0xe090d0b0, 0xee99ddbb, 0xfc82caa6, 0xf28bc7ad, 0xd8b4e49c, 0xd6bde997, 0xc4a6fe8a, 0xcaaff381, 0x90d8b8e8, 0x9ed1b5e3, 0x8ccaa2fe, 0x82c3aff5, 0xa8fc8cc4, 0xa6f581cf, 0xb4ee96d2, 0xbae79bd9, 0xdb3bbb7b, 0xd532b670, 0xc729a16d, 0xc920ac66, 0xe31f8f57, 0xed16825c, 0xff0d9541, 0xf104984a, 0xab73d323, 0xa57ade28, 0xb761c935, 0xb968c43e, 0x9357e70f, 0x9d5eea04, 0x8f45fd19, 0x814cf012, 0x3bab6bcb, 0x35a266c0, 0x27b971dd, 0x29b07cd6, 0x038f5fe7, 0x0d8652ec, 0x1f9d45f1, 0x119448fa, 0x4be30393, 0x45ea0e98, 0x57f11985, 0x59f8148e, 0x73c737bf, 0x7dce3ab4, 0x6fd52da9, 0x61dc20a2, 0xad766df6, 0xa37f60fd, 0xb16477e0, 0xbf6d7aeb, 0x955259da, 0x9b5b54d1, 0x894043cc, 0x87494ec7, 0xdd3e05ae, 0xd33708a5, 0xc12c1fb8, 0xcf2512b3, 0xe51a3182, 0xeb133c89, 0xf9082b94, 0xf701269f, 0x4de6bd46, 0x43efb04d, 0x51f4a750, 0x5ffdaa5b, 0x75c2896a, 0x7bcb8461, 0x69d0937c, 0x67d99e77, 0x3daed51e, 0x33a7d815, 0x21bccf08, 0x2fb5c203, 0x058ae132, 0x0b83ec39, 0x1998fb24, 0x1791f62f, 0x764dd68d, 0x7844db86, 0x6a5fcc9b, 0x6456c190, 0x4e69e2a1, 0x4060efaa, 0x527bf8b7, 0x5c72f5bc, 0x0605bed5, 0x080cb3de, 0x1a17a4c3, 0x141ea9c8, 0x3e218af9, 0x302887f2, 0x223390ef, 0x2c3a9de4, 0x96dd063d, 0x98d40b36, 0x8acf1c2b, 0x84c61120, 0xaef93211, 0xa0f03f1a, 0xb2eb2807, 0xbce2250c, 0xe6956e65, 0xe89c636e, 0xfa877473, 0xf48e7978, 0xdeb15a49, 0xd0b85742, 0xc2a3405f, 0xccaa4d54, 0x41ecdaf7, 0x4fe5d7fc, 0x5dfec0e1, 0x53f7cdea, 0x79c8eedb, 0x77c1e3d0, 0x65daf4cd, 0x6bd3f9c6, 0x31a4b2af, 0x3fadbfa4, 0x2db6a8b9, 0x23bfa5b2, 0x09808683, 0x07898b88, 0x15929c95, 0x1b9b919e, 0xa17c0a47, 0xaf75074c, 0xbd6e1051, 0xb3671d5a, 0x99583e6b, 0x97513360, 0x854a247d, 0x8b432976, 0xd134621f, 0xdf3d6f14, 0xcd267809, 0xc32f7502, 0xe9105633, 0xe7195b38, 0xf5024c25, 0xfb0b412e, 0x9ad7618c, 0x94de6c87, 0x86c57b9a, 0x88cc7691, 0xa2f355a0, 0xacfa58ab, 0xbee14fb6, 0xb0e842bd, 0xea9f09d4, 0xe49604df, 0xf68d13c2, 0xf8841ec9, 0xd2bb3df8, 0xdcb230f3, 0xcea927ee, 0xc0a02ae5, 0x7a47b13c, 0x744ebc37, 0x6655ab2a, 0x685ca621, 0x42638510, 0x4c6a881b, 0x5e719f06, 0x5078920d, 0x0a0fd964, 0x0406d46f, 0x161dc372, 0x1814ce79, 0x322bed48, 0x3c22e043, 0x2e39f75e, 0x2030fa55, 0xec9ab701, 0xe293ba0a, 0xf088ad17, 0xfe81a01c, 0xd4be832d, 0xdab78e26, 0xc8ac993b, 0xc6a59430, 0x9cd2df59, 0x92dbd252, 0x80c0c54f, 0x8ec9c844, 0xa4f6eb75, 0xaaffe67e, 0xb8e4f163, 0xb6edfc68, 0x0c0a67b1, 0x02036aba, 0x10187da7, 0x1e1170ac, 0x342e539d, 0x3a275e96, 0x283c498b, 0x26354480, 0x7c420fe9, 0x724b02e2, 0x605015ff, 0x6e5918f4, 0x44663bc5, 0x4a6f36ce, 0x587421d3, 0x567d2cd8, 0x37a10c7a, 0x39a80171, 0x2bb3166c, 0x25ba1b67, 0x0f853856, 0x018c355d, 0x13972240, 0x1d9e2f4b, 0x47e96422, 0x49e06929, 0x5bfb7e34, 0x55f2733f, 0x7fcd500e, 0x71c45d05, 0x63df4a18, 0x6dd64713, 0xd731dcca, 0xd938d1c1, 0xcb23c6dc, 0xc52acbd7, 0xef15e8e6, 0xe11ce5ed, 0xf307f2f0, 0xfd0efffb, 0xa779b492, 0xa970b999, 0xbb6bae84, 0xb562a38f, 0x9f5d80be, 0x91548db5, 0x834f9aa8, 0x8d4697a3 ]
U2 = [ 0x00000000, 0x0b0e090d, 0x161c121a, 0x1d121b17, 0x2c382434, 0x27362d39, 0x3a24362e, 0x312a3f23, 0x58704868, 0x537e4165, 0x4e6c5a72, 0x4562537f, 0x74486c5c, 0x7f466551, 0x62547e46, 0x695a774b, 0xb0e090d0, 0xbbee99dd, 0xa6fc82ca, 0xadf28bc7, 0x9cd8b4e4, 0x97d6bde9, 0x8ac4a6fe, 0x81caaff3, 0xe890d8b8, 0xe39ed1b5, 0xfe8ccaa2, 0xf582c3af, 0xc4a8fc8c, 0xcfa6f581, 0xd2b4ee96, 0xd9bae79b, 0x7bdb3bbb, 0x70d532b6, 0x6dc729a1, 0x66c920ac, 0x57e31f8f, 0x5ced1682, 0x41ff0d95, 0x4af10498, 0x23ab73d3, 0x28a57ade, 0x35b761c9, 0x3eb968c4, 0x0f9357e7, 0x049d5eea, 0x198f45fd, 0x12814cf0, 0xcb3bab6b, 0xc035a266, 0xdd27b971, 0xd629b07c, 0xe7038f5f, 0xec0d8652, 0xf11f9d45, 0xfa119448, 0x934be303, 0x9845ea0e, 0x8557f119, 0x8e59f814, 0xbf73c737, 0xb47dce3a, 0xa96fd52d, 0xa261dc20, 0xf6ad766d, 0xfda37f60, 0xe0b16477, 0xebbf6d7a, 0xda955259, 0xd19b5b54, 0xcc894043, 0xc787494e, 0xaedd3e05, 0xa5d33708, 0xb8c12c1f, 0xb3cf2512, 0x82e51a31, 0x89eb133c, 0x94f9082b, 0x9ff70126, 0x464de6bd, 0x4d43efb0, 0x5051f4a7, 0x5b5ffdaa, 0x6a75c289, 0x617bcb84, 0x7c69d093, 0x7767d99e, 0x1e3daed5, 0x1533a7d8, 0x0821bccf, 0x032fb5c2, 0x32058ae1, 0x390b83ec, 0x241998fb, 0x2f1791f6, 0x8d764dd6, 0x867844db, 0x9b6a5fcc, 0x906456c1, 0xa14e69e2, 0xaa4060ef, 0xb7527bf8, 0xbc5c72f5, 0xd50605be, 0xde080cb3, 0xc31a17a4, 0xc8141ea9, 0xf93e218a, 0xf2302887, 0xef223390, 0xe42c3a9d, 0x3d96dd06, 0x3698d40b, 0x2b8acf1c, 0x2084c611, 0x11aef932, 0x1aa0f03f, 0x07b2eb28, 0x0cbce225, 0x65e6956e, 0x6ee89c63, 0x73fa8774, 0x78f48e79, 0x49deb15a, 0x42d0b857, 0x5fc2a340, 0x54ccaa4d, 0xf741ecda, 0xfc4fe5d7, 0xe15dfec0, 0xea53f7cd, 0xdb79c8ee, 0xd077c1e3, 0xcd65daf4, 0xc66bd3f9, 0xaf31a4b2, 0xa43fadbf, 0xb92db6a8, 0xb223bfa5, 0x83098086, 0x8807898b, 0x9515929c, 0x9e1b9b91, 0x47a17c0a, 0x4caf7507, 0x51bd6e10, 0x5ab3671d, 0x6b99583e, 0x60975133, 0x7d854a24, 0x768b4329, 0x1fd13462, 0x14df3d6f, 0x09cd2678, 0x02c32f75, 0x33e91056, 0x38e7195b, 0x25f5024c, 0x2efb0b41, 0x8c9ad761, 0x8794de6c, 0x9a86c57b, 0x9188cc76, 0xa0a2f355, 0xabacfa58, 0xb6bee14f, 0xbdb0e842, 0xd4ea9f09, 0xdfe49604, 0xc2f68d13, 0xc9f8841e, 0xf8d2bb3d, 0xf3dcb230, 0xeecea927, 0xe5c0a02a, 0x3c7a47b1, 0x37744ebc, 0x2a6655ab, 0x21685ca6, 0x10426385, 0x1b4c6a88, 0x065e719f, 0x0d507892, 0x640a0fd9, 0x6f0406d4, 0x72161dc3, 0x791814ce, 0x48322bed, 0x433c22e0, 0x5e2e39f7, 0x552030fa, 0x01ec9ab7, 0x0ae293ba, 0x17f088ad, 0x1cfe81a0, 0x2dd4be83, 0x26dab78e, 0x3bc8ac99, 0x30c6a594, 0x599cd2df, 0x5292dbd2, 0x4f80c0c5, 0x448ec9c8, 0x75a4f6eb, 0x7eaaffe6, 0x63b8e4f1, 0x68b6edfc, 0xb10c0a67, 0xba02036a, 0xa710187d, 0xac1e1170, 0x9d342e53, 0x963a275e, 0x8b283c49, 0x80263544, 0xe97c420f, 0xe2724b02, 0xff605015, 0xf46e5918, 0xc544663b, 0xce4a6f36, 0xd3587421, 0xd8567d2c, 0x7a37a10c, 0x7139a801, 0x6c2bb316, 0x6725ba1b, 0x560f8538, 0x5d018c35, 0x40139722, 0x4b1d9e2f, 0x2247e964, 0x2949e069, 0x345bfb7e, 0x3f55f273, 0x0e7fcd50, 0x0571c45d, 0x1863df4a, 0x136dd647, 0xcad731dc, 0xc1d938d1, 0xdccb23c6, 0xd7c52acb, 0xe6ef15e8, 0xede11ce5, 0xf0f307f2, 0xfbfd0eff, 0x92a779b4, 0x99a970b9, 0x84bb6bae, 0x8fb562a3, 0xbe9f5d80, 0xb591548d, 0xa8834f9a, 0xa38d4697 ]
U3 = [ 0x00000000, 0x0d0b0e09, 0x1a161c12, 0x171d121b, 0x342c3824, 0x3927362d, 0x2e3a2436, 0x23312a3f, 0x68587048, 0x65537e41, 0x724e6c5a, 0x7f456253, 0x5c74486c, 0x517f4665, 0x4662547e, 0x4b695a77, 0xd0b0e090, 0xddbbee99, 0xcaa6fc82, 0xc7adf28b, 0xe49cd8b4, 0xe997d6bd, 0xfe8ac4a6, 0xf381caaf, 0xb8e890d8, 0xb5e39ed1, 0xa2fe8cca, 0xaff582c3, 0x8cc4a8fc, 0x81cfa6f5, 0x96d2b4ee, 0x9bd9bae7, 0xbb7bdb3b, 0xb670d532, 0xa16dc729, 0xac66c920, 0x8f57e31f, 0x825ced16, 0x9541ff0d, 0x984af104, 0xd323ab73, 0xde28a57a, 0xc935b761, 0xc43eb968, 0xe70f9357, 0xea049d5e, 0xfd198f45, 0xf012814c, 0x6bcb3bab, 0x66c035a2, 0x71dd27b9, 0x7cd629b0, 0x5fe7038f, 0x52ec0d86, 0x45f11f9d, 0x48fa1194, 0x03934be3, 0x0e9845ea, 0x198557f1, 0x148e59f8, 0x37bf73c7, 0x3ab47dce, 0x2da96fd5, 0x20a261dc, 0x6df6ad76, 0x60fda37f, 0x77e0b164, 0x7aebbf6d, 0x59da9552, 0x54d19b5b, 0x43cc8940, 0x4ec78749, 0x05aedd3e, 0x08a5d337, 0x1fb8c12c, 0x12b3cf25, 0x3182e51a, 0x3c89eb13, 0x2b94f908, 0x269ff701, 0xbd464de6, 0xb04d43ef, 0xa75051f4, 0xaa5b5ffd, 0x896a75c2, 0x84617bcb, 0x937c69d0, 0x9e7767d9, 0xd51e3dae, 0xd81533a7, 0xcf0821bc, 0xc2032fb5, 0xe132058a, 0xec390b83, 0xfb241998, 0xf62f1791, 0xd68d764d, 0xdb867844, 0xcc9b6a5f, 0xc1906456, 0xe2a14e69, 0xefaa4060, 0xf8b7527b, 0xf5bc5c72, 0xbed50605, 0xb3de080c, 0xa4c31a17, 0xa9c8141e, 0x8af93e21, 0x87f23028, 0x90ef2233, 0x9de42c3a, 0x063d96dd, 0x0b3698d4, 0x1c2b8acf, 0x112084c6, 0x3211aef9, 0x3f1aa0f0, 0x2807b2eb, 0x250cbce2, 0x6e65e695, 0x636ee89c, 0x7473fa87, 0x7978f48e, 0x5a49deb1, 0x5742d0b8, 0x405fc2a3, 0x4d54ccaa, 0xdaf741ec, 0xd7fc4fe5, 0xc0e15dfe, 0xcdea53f7, 0xeedb79c8, 0xe3d077c1, 0xf4cd65da, 0xf9c66bd3, 0xb2af31a4, 0xbfa43fad, 0xa8b92db6, 0xa5b223bf, 0x86830980, 0x8b880789, 0x9c951592, 0x919e1b9b, 0x0a47a17c, 0x074caf75, 0x1051bd6e, 0x1d5ab367, 0x3e6b9958, 0x33609751, 0x247d854a, 0x29768b43, 0x621fd134, 0x6f14df3d, 0x7809cd26, 0x7502c32f, 0x5633e910, 0x5b38e719, 0x4c25f502, 0x412efb0b, 0x618c9ad7, 0x6c8794de, 0x7b9a86c5, 0x769188cc, 0x55a0a2f3, 0x58abacfa, 0x4fb6bee1, 0x42bdb0e8, 0x09d4ea9f, 0x04dfe496, 0x13c2f68d, 0x1ec9f884, 0x3df8d2bb, 0x30f3dcb2, 0x27eecea9, 0x2ae5c0a0, 0xb13c7a47, 0xbc37744e, 0xab2a6655, 0xa621685c, 0x85104263, 0x881b4c6a, 0x9f065e71, 0x920d5078, 0xd9640a0f, 0xd46f0406, 0xc372161d, 0xce791814, 0xed48322b, 0xe0433c22, 0xf75e2e39, 0xfa552030, 0xb701ec9a, 0xba0ae293, 0xad17f088, 0xa01cfe81, 0x832dd4be, 0x8e26dab7, 0x993bc8ac, 0x9430c6a5, 0xdf599cd2, 0xd25292db, 0xc54f80c0, 0xc8448ec9, 0xeb75a4f6, 0xe67eaaff, 0xf163b8e4, 0xfc68b6ed, 0x67b10c0a, 0x6aba0203, 0x7da71018, 0x70ac1e11, 0x539d342e, 0x5e963a27, 0x498b283c, 0x44802635, 0x0fe97c42, 0x02e2724b, 0x15ff6050, 0x18f46e59, 0x3bc54466, 0x36ce4a6f, 0x21d35874, 0x2cd8567d, 0x0c7a37a1, 0x017139a8, 0x166c2bb3, 0x1b6725ba, 0x38560f85, 0x355d018c, 0x22401397, 0x2f4b1d9e, 0x642247e9, 0x692949e0, 0x7e345bfb, 0x733f55f2, 0x500e7fcd, 0x5d0571c4, 0x4a1863df, 0x47136dd6, 0xdccad731, 0xd1c1d938, 0xc6dccb23, 0xcbd7c52a, 0xe8e6ef15, 0xe5ede11c, 0xf2f0f307, 0xfffbfd0e, 0xb492a779, 0xb999a970, 0xae84bb6b, 0xa38fb562, 0x80be9f5d, 0x8db59154, 0x9aa8834f, 0x97a38d46 ]
U4 = [ 0x00000000, 0x090d0b0e, 0x121a161c, 0x1b171d12, 0x24342c38, 0x2d392736, 0x362e3a24, 0x3f23312a, 0x48685870, 0x4165537e, 0x5a724e6c, 0x537f4562, 0x6c5c7448, 0x65517f46, 0x7e466254, 0x774b695a, 0x90d0b0e0, 0x99ddbbee, 0x82caa6fc, 0x8bc7adf2, 0xb4e49cd8, 0xbde997d6, 0xa6fe8ac4, 0xaff381ca, 0xd8b8e890, 0xd1b5e39e, 0xcaa2fe8c, 0xc3aff582, 0xfc8cc4a8, 0xf581cfa6, 0xee96d2b4, 0xe79bd9ba, 0x3bbb7bdb, 0x32b670d5, 0x29a16dc7, 0x20ac66c9, 0x1f8f57e3, 0x16825ced, 0x0d9541ff, 0x04984af1, 0x73d323ab, 0x7ade28a5, 0x61c935b7, 0x68c43eb9, 0x57e70f93, 0x5eea049d, 0x45fd198f, 0x4cf01281, 0xab6bcb3b, 0xa266c035, 0xb971dd27, 0xb07cd629, 0x8f5fe703, 0x8652ec0d, 0x9d45f11f, 0x9448fa11, 0xe303934b, 0xea0e9845, 0xf1198557, 0xf8148e59, 0xc737bf73, 0xce3ab47d, 0xd52da96f, 0xdc20a261, 0x766df6ad, 0x7f60fda3, 0x6477e0b1, 0x6d7aebbf, 0x5259da95, 0x5b54d19b, 0x4043cc89, 0x494ec787, 0x3e05aedd, 0x3708a5d3, 0x2c1fb8c1, 0x2512b3cf, 0x1a3182e5, 0x133c89eb, 0x082b94f9, 0x01269ff7, 0xe6bd464d, 0xefb04d43, 0xf4a75051, 0xfdaa5b5f, 0xc2896a75, 0xcb84617b, 0xd0937c69, 0xd99e7767, 0xaed51e3d, 0xa7d81533, 0xbccf0821, 0xb5c2032f, 0x8ae13205, 0x83ec390b, 0x98fb2419, 0x91f62f17, 0x4dd68d76, 0x44db8678, 0x5fcc9b6a, 0x56c19064, 0x69e2a14e, 0x60efaa40, 0x7bf8b752, 0x72f5bc5c, 0x05bed506, 0x0cb3de08, 0x17a4c31a, 0x1ea9c814, 0x218af93e, 0x2887f230, 0x3390ef22, 0x3a9de42c, 0xdd063d96, 0xd40b3698, 0xcf1c2b8a, 0xc6112084, 0xf93211ae, 0xf03f1aa0, 0xeb2807b2, 0xe2250cbc, 0x956e65e6, 0x9c636ee8, 0x877473fa, 0x8e7978f4, 0xb15a49de, 0xb85742d0, 0xa3405fc2, 0xaa4d54cc, 0xecdaf741, 0xe5d7fc4f, 0xfec0e15d, 0xf7cdea53, 0xc8eedb79, 0xc1e3d077, 0xdaf4cd65, 0xd3f9c66b, 0xa4b2af31, 0xadbfa43f, 0xb6a8b92d, 0xbfa5b223, 0x80868309, 0x898b8807, 0x929c9515, 0x9b919e1b, 0x7c0a47a1, 0x75074caf, 0x6e1051bd, 0x671d5ab3, 0x583e6b99, 0x51336097, 0x4a247d85, 0x4329768b, 0x34621fd1, 0x3d6f14df, 0x267809cd, 0x2f7502c3, 0x105633e9, 0x195b38e7, 0x024c25f5, 0x0b412efb, 0xd7618c9a, 0xde6c8794, 0xc57b9a86, 0xcc769188, 0xf355a0a2, 0xfa58abac, 0xe14fb6be, 0xe842bdb0, 0x9f09d4ea, 0x9604dfe4, 0x8d13c2f6, 0x841ec9f8, 0xbb3df8d2, 0xb230f3dc, 0xa927eece, 0xa02ae5c0, 0x47b13c7a, 0x4ebc3774, 0x55ab2a66, 0x5ca62168, 0x63851042, 0x6a881b4c, 0x719f065e, 0x78920d50, 0x0fd9640a, 0x06d46f04, 0x1dc37216, 0x14ce7918, 0x2bed4832, 0x22e0433c, 0x39f75e2e, 0x30fa5520, 0x9ab701ec, 0x93ba0ae2, 0x88ad17f0, 0x81a01cfe, 0xbe832dd4, 0xb78e26da, 0xac993bc8, 0xa59430c6, 0xd2df599c, 0xdbd25292, 0xc0c54f80, 0xc9c8448e, 0xf6eb75a4, 0xffe67eaa, 0xe4f163b8, 0xedfc68b6, 0x0a67b10c, 0x036aba02, 0x187da710, 0x1170ac1e, 0x2e539d34, 0x275e963a, 0x3c498b28, 0x35448026, 0x420fe97c, 0x4b02e272, 0x5015ff60, 0x5918f46e, 0x663bc544, 0x6f36ce4a, 0x7421d358, 0x7d2cd856, 0xa10c7a37, 0xa8017139, 0xb3166c2b, 0xba1b6725, 0x8538560f, 0x8c355d01, 0x97224013, 0x9e2f4b1d, 0xe9642247, 0xe0692949, 0xfb7e345b, 0xf2733f55, 0xcd500e7f, 0xc45d0571, 0xdf4a1863, 0xd647136d, 0x31dccad7, 0x38d1c1d9, 0x23c6dccb, 0x2acbd7c5, 0x15e8e6ef, 0x1ce5ede1, 0x07f2f0f3, 0x0efffbfd, 0x79b492a7, 0x70b999a9, 0x6bae84bb, 0x62a38fb5, 0x5d80be9f, 0x548db591, 0x4f9aa883, 0x4697a38d ]
def __init__(self, key):
if len(key) not in (16, 24, 32):
raise_exception( ValueError('Invalid key size') )
rounds = self.number_of_rounds[len(key)]
# Encryption round keys
self._Ke = [[0] * 4 for i in range(rounds + 1)]
# Decryption round keys
self._Kd = [[0] * 4 for i in range(rounds + 1)]
round_key_count = (rounds + 1) * 4
KC = len(key) // 4
# Convert the key into ints
tk = [ struct.unpack('>i', key[i:i + 4])[0] for i in range(0, len(key), 4) ]
# Copy values into round key arrays
for i in range(0, KC):
self._Ke[i // 4][i % 4] = tk[i]
self._Kd[rounds - (i // 4)][i % 4] = tk[i]
# Key expansion (fips-197 section 5.2)
rconpointer = 0
t = KC
while t < round_key_count:
tt = tk[KC - 1]
tk[0] ^= ((self.S[(tt >> 16) & 0xFF] << 24) ^
(self.S[(tt >> 8) & 0xFF] << 16) ^
(self.S[ tt & 0xFF] << 8) ^
self.S[(tt >> 24) & 0xFF] ^
(self.rcon[rconpointer] << 24))
rconpointer += 1
if KC != 8:
for i in range(1, KC):
tk[i] ^= tk[i - 1]
# Key expansion for 256-bit keys is "slightly different" (fips-197)
else:
for i in range(1, KC // 2):
tk[i] ^= tk[i - 1]
tt = tk[KC // 2 - 1]
tk[KC // 2] ^= (self.S[ tt & 0xFF] ^
(self.S[(tt >> 8) & 0xFF] << 8) ^
(self.S[(tt >> 16) & 0xFF] << 16) ^
(self.S[(tt >> 24) & 0xFF] << 24))
for i in range(KC // 2 + 1, KC):
tk[i] ^= tk[i - 1]
# Copy values into round key arrays
j = 0
while j < KC and t < round_key_count:
self._Ke[t // 4][t % 4] = tk[j]
self._Kd[rounds - (t // 4)][t % 4] = tk[j]
j += 1
t += 1
# Inverse-Cipher-ify the decryption round key (fips-197 section 5.3)
for r in range(1, rounds):
for j in range(0, 4):
tt = self._Kd[r][j]
self._Kd[r][j] = (self.U1[(tt >> 24) & 0xFF] ^
self.U2[(tt >> 16) & 0xFF] ^
self.U3[(tt >> 8) & 0xFF] ^
self.U4[ tt & 0xFF])
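# (The U1..U4 transform above applies InvMixColumns to the decryption
#  round keys, which is what lets decrypt() reuse the same table-driven
#  round structure as encrypt(); see FIPS-197, "equivalent inverse cipher".)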
def encrypt(self, plaintext):
'Encrypt a block of plain text using the AES block cipher.'
if len(plaintext) != 16:
raise_exception( ValueError('wrong block length') )
rounds = len(self._Ke) - 1
(s1, s2, s3) = [1, 2, 3]
a = [0, 0, 0, 0]
# Convert plaintext to (ints ^ key)
t = [(AES._compact_word(plaintext[4 * i:4 * i + 4]) ^ self._Ke[0][i]) for i in range(0, 4)]
# Apply round transforms
for r in range(1, rounds):
for i in range(0, 4):
a[i] = (self.T1[(t[ i ] >> 24) & 0xFF] ^
self.T2[(t[(i + s1) % 4] >> 16) & 0xFF] ^
self.T3[(t[(i + s2) % 4] >> 8) & 0xFF] ^
self.T4[ t[(i + s3) % 4] & 0xFF] ^
self._Ke[r][i])
t = copy.copy(a)
# The last round is special
result = [ ]
for i in range(0, 4):
tt = self._Ke[rounds][i]
result.append((self.S[(t[ i ] >> 24) & 0xFF] ^ (tt >> 24)) & 0xFF)
result.append((self.S[(t[(i + s1) % 4] >> 16) & 0xFF] ^ (tt >> 16)) & 0xFF)
result.append((self.S[(t[(i + s2) % 4] >> 8) & 0xFF] ^ (tt >> 8)) & 0xFF)
result.append((self.S[ t[(i + s3) % 4] & 0xFF] ^ tt ) & 0xFF)
return result
def decrypt(self, ciphertext):
'Decrypt a block of cipher text using the AES block cipher.'
if len(ciphertext) != 16:
raise_exception( ValueError('wrong block length') )
rounds = len(self._Kd) - 1
(s1, s2, s3) = [3, 2, 1]
a = [0, 0, 0, 0]
# Convert ciphertext to (ints ^ key)
t = [(AES._compact_word(ciphertext[4 * i:4 * i + 4]) ^ self._Kd[0][i]) for i in range(0, 4)]
# Apply round transforms
for r in range(1, rounds):
for i in range(0, 4):
a[i] = (self.T5[(t[ i ] >> 24) & 0xFF] ^
self.T6[(t[(i + s1) % 4] >> 16) & 0xFF] ^
self.T7[(t[(i + s2) % 4] >> 8) & 0xFF] ^
self.T8[ t[(i + s3) % 4] & 0xFF] ^
self._Kd[r][i])
t = copy.copy(a)
# The last round is special
result = [ ]
for i in range(0, 4):
tt = self._Kd[rounds][i]
result.append((self.Si[(t[ i ] >> 24) & 0xFF] ^ (tt >> 24)) & 0xFF)
result.append((self.Si[(t[(i + s1) % 4] >> 16) & 0xFF] ^ (tt >> 16)) & 0xFF)
result.append((self.Si[(t[(i + s2) % 4] >> 8) & 0xFF] ^ (tt >> 8)) & 0xFF)
result.append((self.Si[ t[(i + s3) % 4] & 0xFF] ^ tt ) & 0xFF)
return result
class AES_128_CBC:
def __init__(self, key, iv = None):
self._aes = AES(key)
if iv is None:
self._last_cipherblock = [ 0 ] * 16
elif len(iv) != 16:
raise_exception( ValueError('initialization vector must be 16 bytes') )
else:
self._last_cipherblock = iv
def encrypt(self, plaintext):
if len(plaintext) != 16:
raise_exception( ValueError('plaintext block must be 16 bytes') )
precipherblock = [ (p ^ l) for (p, l) in zip(plaintext, self._last_cipherblock) ]
self._last_cipherblock = self._aes.encrypt(precipherblock)
return b''.join(map(lambda x: x.to_bytes(1, 'little'), self._last_cipherblock))
def decrypt(self, ciphertext):
if len(ciphertext) != 16:
raise_exception( ValueError('ciphertext block must be 16 bytes') )
cipherblock = ciphertext
plaintext = [ (p ^ l) for (p, l) in zip(self._aes.decrypt(cipherblock), self._last_cipherblock) ]
self._last_cipherblock = cipherblock
return b''.join(map(lambda x: x.to_bytes(1, 'little'), plaintext))
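# Minimal usage sketch (illustrative only, not part of the flashing flow):
# AES_128_CBC consumes exactly one 16-byte block per call and chains the
# previous ciphertext block into the next encryption, so a caller is expected
# to pad its payload to a multiple of 16 bytes and loop over it. The names
# `example_key`, `example_iv` and `padded_data` below are hypothetical
# placeholders, not identifiers used elsewhere in this file.
#
#   cipher = AES_128_CBC(example_key, iv=example_iv)   # key and iv: 16 bytes each
#   encrypted = b''
#   for offset in range(0, len(padded_data), 16):
#       encrypted += cipher.encrypt(padded_data[offset:offset + 16])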
ISP_PROG = '789cbcbc7d5c1357d6007c6792c924080a0e1890d82201a2acebc38a4aab960d428862eb636b85ea56171d10b5dacafa416dcb4a48263122453a60c0604bb182b25bd71535ada8011569edf787623fb4688080a2420522cac77befcc04d0769f7dff787faffec2ccbdf7dc73cf3df79c73cfb91f33f1e0dd331f5ec300061efdb7695144c4a64541f0a746bfe9161c00363b7bcca6dd2f44e8d4588c2e069bab9b8bc5ea62b1385d1ca6d169b0785d3ca6d569b179ba79d87cdd7c2c4197802dd02dc09ed53d8b3da77b0e5ba85b88fdafee7f8b576d2ac52336794780c500747c077f8b31f884bfc5387cc2df62117cc2df62317cc2df62023ee16fb1043ee16f31099ff0b7580a9ff0b758069ff0b7d8033ee16ff128f884bfc59ef0097f8bbde013fe168f864ff85b3c063ee1cfe283015617532cff9b0c50f85117f537bc6f53a8ba93c2fc40343636ffb5a08db12b7dc635de0b0e9a4eff8ce346bc6d2c5d84e32b7dc6375a7058371bbfbd49a9ee1c2ddafb0e4e4fc4f4477140efc041f6aa89d86b20287013088a40ef58baa4c9a37574fbd88e715de3ef3fd157d658de54d97ab8fd68c789ae93f7cff4cdbd1edf9cd0b6f0f60b9d89ddcb7a57f4dfb8ded27cb3edceed5f3b7bba1ff40ef4ab82306f55888fb76ad2446fd5943f79ab8262c6aa429e1fab9ab472ac6a4afa585590ce571592efab9af4beaf6acabf7d5541b5e35421df8c534dba3e4e35a5731cac1f00eb07c0fa01b07e00ac1f08eb07c2fa81b07e20ac3f01d69f00eb4f80f527c0fa4fc2fa4fc2fa4fc2fa4f6e0a0d9a7e2fd47bba87d117a79710e06ed0918501a57723364edae4ed37bd2c79eeca1b2bf1bf95fd6deea61b9b2429e529f1a92da992ade55be3335a323cd654ae49587b73adc71b956f24bc79f3cdd1eb0faf5fb8e1ce86d17f3ffcf785dbef6c2761df4c413a6f7222e6639aa8f32183b1b1a660dd5852895126a58e2243305f5388ce970cc5fc4ca13a3f320c1b670ad38d235598dca4d2c9c94998bf6992ce9f9c8c059826eb02c8706cbc295c379efc031668fa832e909c82294c53740af28fd804d31f7513c8a9d813a6a9ba27c8ffc19e34fd8fee490ac382d038010047a9d1a3696cebb8f6f11d4f7405df0febab6c3cdc74b4f544fbc98e335d67ef5fe84bb8beb0f985b6c4dbcb3a5774afea5ddd7ff3fa9de65fdb7a6e3fe81ce8be171c01f6f4de0bf69ebea71f2f93947b548e3e3cf6e8b813e34f3e71a6acacbcbcb2f2f0e1a3474f9c3879f2cc99b9fbe32b12feb1f05f2f5425da9655afb0dfd8df5271f31f77fef56b558fed41f5809d97adb99fbb650bc995f24830b02c02c099551a22945f74978bfe8d037ce5977fdbb47b1ab803ff8e9d6e0b06c09a8def409a969d1cbc72eeaa1babf04d659be66ebeb159925a9e1abfba65b524a33c23fef596d73dd656ae4d5877739dc79b956f26bc75f3add11b0e6f58f8ea9d57476f3fbc7d6cd0d1a01726fe3a71ecb4a3d35e88fc35725cc88990c4d09ed071334fcc4c8cea891a3fe9e4a465931f4c1e3febe4ac65b31fcc7e62ca99292bfe38f0c727a2cf44aff8f3c09fdddc8df191346d52a93b192dad7001426351a39e349aee65c4456c028bbcbbed166f94937ca42c19d3c460d95972020314c1b45848d85713d9429b2b805c04b22851ec715a2402b8de0da369b2c027cbd439ccb9b98c595b55d78deb3ee561f547ddb0088fbede749db98b68b100d45ad0030b09b25893ea88f933a1bd5fb812f82b4b66b472584789bbae217a1006049bf661eeb9181fbec5d026be56e8cf0246ae5e0c06db066940df2397624034efa51f27686743547c4fd27eb2c05cfd276b7ef29a5742cff6f7c722d3480c53159d0b34f50c5e3947688bea2b0be909c771f8f6d9e78b75a1812602d05d2740a04902749a4006a6e8669cf523715d7ef1bb53f7740e280f9240f90109304d76164188b4dd831f7d9769171d242591c409c0104a4060d76d3c35af7fc353c3cc13e8f996cf5ff32da4a6483a448df2033906e918775cbc6b314502bc9ab0a9a79a1ea363fd301d8476ea9ea2cf2802e0cccdfcf8a224bf1cc7130dbf4bd9161ba24aa421206507bf4bb5ef5a8c30ce961e57d39baf8040528ab0ef80d8d7b4021efbd43dc5ef6eb0454a61ad7952d49f5292ecb5633ec5da4900ce39f18c46d2ae1477027e541b3b6f40192256f263ce9ab4e74be840d2cd5bdf9863dfbd087f459f7dbf5817865aa6375f86adc23ec5073230f56a3360c7c156df0978f74461fa233de038ca5c028c0471f453fbb6f4b0f6fea44ac3164d7087a44be9db09e44d0050e20f8ef2f29d75a5b969825bd6bf7f46780b6ad069428577f5d57f6b266a2c119c5ea7d5a17ceebd7175be66b2f01eb4ba50f347e13d39b558f327377cca3ecd0c373cad48e6a475b6eb582d27b7fb25aea3eff3124cb88e5de2dfa4aea3823e5595d032c8bd2b9f29cb49dcd
d4728e148223c39d859aec3ba783ea7622e166fd670b991aec33c97b37e2adb886fbe9182693490d39385fe645d982fbcd57ca24394f27dff52b19eab3dc75589dadfbf833c7888a7887455fea02db4613ec1ad85da704304f060187e3c817b3cc1e77aad0ea03155822e3037058d2ce643119d40d22ed0d227d2744208c28ccab6a59bb54acf4e604d22b0fdf0c926318075b9c60413cb73255dc11d539ba6352a27bcefc6fe73596af6ea49d2a9c67c8d203f5f107142d9032cd4318a19403de5fb89f9b064563f92350ebbf831eced4a3f37deaccbbc6cbab1667de2964a840348464a6ccd776e3916d2756ed8e5e9658d0bbb5ee8486c5fd6baa2c9d814d69770bfd23c75e73403eb01bca95100b78c82e3999353206b2f4f96a4dcd83c775bd936fc8d1bebe6be52f60aa6590971b0aede31b34d0fd4274df4a6265066b21a4830a124cc4c6fb6024a068da36d14f89659c8044a6420dbc2ca65b88ead3684c454cb9ec6a2f75eef5b6e9b90fec27d6aaf044fec5ad6b1aa955a7613937f0ee5dcf39fff237b257f5d59ebc5c6857d2bda8dede54de186a9e6a3e61339abf6068e0a0cb6eedc39917e4f065a92e3a15720597b63dbdc37cadec0dfbcf1cadcf5982614d1062d8bd52083f62d0a63c5002fd4d27732447a9507b861c936e56a794e8b054e1b00253770dcfe214fe9d709ca8d0970d426b9397e88b58821eda125d1c5d5e662f552db87ffa57e7f9252c26329d46cb0b3e41151f479413f740534924f91d69965d6565b9c40de0380287e4f26b29ef5e0873c3e2df93b21a4832fe854256795958963c28c1719374dea8f8edb20de51d1e7adb0e5e80b1617d227d59b103b7c1369c93747623f91f128f6b0d7ddd8116e96f40e5656f68cd6208c124829872b6deb6ea8c9e85d5fafda6ae986cfba199bf6c557c306aaf79c0097f22c1097fec2ea4db3e50d029d8ab3caf0c431efa31e8f7e943253fa30366d3a8f6dc9abbfc5667c95c7c6e35a3de64f36269e9ff5f65f7d3a3da1234a2605d083ea8a1487629506229e85da2ad32c6c97f4ed47f3f0d30620580909f408bcbbc5487f59e253acbc90e75bf2ff5a4938037c46e28a67034d72902dd7b1fe9f7f6951665500ff676bd36a49badb092047462bbebeb171eee6b2cdf8b61b6b31cd44a4ff490e8b73c037c5468a413e4999e240b985dedb027c5b154d3638bb9c2950fe530cda6c8a56ffa63a3bd2466f2f497b702bb2dfe106c106cc310fc9558d3a9811de52ad0401dc29f59f052f62d68d64de2a553230fe899f182fd3c863112f684c5fef0744f50428237fc8b39a9cbfb27b4851ed3b7e5a7a6dc313706001453afb584b1838692989dd961edc11d69ed02ae95ad874d272d430d5106ece3689ea0ae1583080d0220ba3af3f00ce92e8791cdc203f85fca737bb406ed29252914a0664398483ea768d11f4610f169a9f34bf590f4b58a70a28837a812ee9c3838a18734e6f030f536a13f8fd0f2c34b3f70b1b2afbe2277dfd1238c61a48791da4fc529efe422aa08c0c382b41ef5bc00dc9d5b7f5f597802e0cf54b54df0ce672cf6e60633ebacd32c4b455e4f2bc89aafea480bb6cbd6b907ebb073727edcfea02fc9c31776dd95a644d451a09389437511378450e989c6ab2437d3bafaa9ec43267e8545bba09496654ad4daf2d00a2f906484795538589e67f00e6caaad27a61de31b04a16b9b6176093a1cd12d5b2c55fe19bcb36f2d8e57590b7c48156cd4a5e2a453762b6446e39e4b6f91f30f38571b6220b3f6350affd1eeadb79288362ac8cfc264f14df041298b912f4d605564996bf8d8509a35eb1e5b408f65d0f79130c7ba987bd3fcb3dbbe188208df0d1d20627609230fb0e772bd7561ecfce8af16109b5d847037d457efe6e2d4b668945623fe8efca23398fd1b92d1dcd50480a04d9eae761d577047fe113997b7ebddf99ca4b1be477903b57fd23df73e247615c9f8e9248012531e23a8dbc1b9534cf164a9e146cd16cbe46f315b94b8ae6f75982accfd069dc584b7f853601ea3d6b943c2597406849d859f350a9fa571e834690a9a069f22ed8a6386e062b0eea97f7c2329979bae71aa659b1e8f62f8a45fd3f21dd42165929ce0716d4ea07e474812aafeb763dd49704260220dbbd5f8cb9b5f0334e2311f471324290708fe7a0feee27df07c8be1050efa047c0d1947c1e7a0350c2a03feac94b80a31ff9fdd093f5a4bd9b718a38e225b4d84711355ec35e3bfd90228f7839fc9b0784f20714f1b5579d8dd74e4957a5018bb342cb423b6d82342349ae321980a88e04539b08b1405b6f59ea75489da43dbc696aa390e7ca4e0d4e15f03e44140ad2d983e82a89a1039b81cec9684b6a271e549612a037299a817eb396363981399e7ebb01ce58fd362b0c3a108dd60632c6e1dd3c60d6d6424b5ee3458feb0645b57046f56cb3b192afbdfa931c3e97fa50ce11af54fb33b1e1263afd1cce66b8
40ade3f47576ab64f0999d27f7783a3c9d31a46343c380cf0dd80fc939f54e095aebd9b3272dab1cfa6736e849da8012fef613f0097ffbc5f0097ffb45f0097ffb71f884bffd187cc2df7e009ff00767f7eeee31ec1e065cca65f71018f717bf94bbd4c6ee5101be0d63b6332becc19eac5af984bce83d8ef473f7f7d996a697b7571aa63265d0533376093235da52cfcd4f47cf3432da334d0affb2d4e876b336faeb53f3c2fb0862eafdf0aeac35f9ce30c13667f93f2b1d8a3263f52af128264777570f3d01d64506153b2942ec21c8b6875e2ec6288b4a443f4702591af40cfcbb00aab12d2353de9ba79cd4e511c64448cbdc734040ff8b3ed70fc5fd5f38c1981734cf4a85f739e51b259bc3989694541491f03af76b25b38316a2c4ce4a13438aa095dc3018c954a833fd90a5086e94405f4ae97f1c2803ea80328200cae9f03789f050ce81cf2002578610f8c4789d06c6ac1e8265ba619173de7a97a28f7f4b73cd6914defae634f16fa641707f4e2ba2154fe1a49e9ba9627b2c7e0045b4be3c55b1cff0d682e914caef0be90e3eed007c9ae870d717d27785f201771f84f1e89f334ff799d022af67fd7c6ce1bc3fa75d9a023dbd90ae1758d2b058f954d70bc86b5646885f10eade95ae21e20f08be71e91de97a59fca76e4ff936a39df349aa4db14ee7d46991b6e81b082c20bdecfec58e6f5b8d7de55d5fb65f6efab17145dfaafbabbbd675bcdafe7eceb49d6866bd545fb2a1ae1e8d9efe690f40b954801e4b7ac3114da0b3487f4ca35b802da0c1a849b2dbcb8c87342b18617d6207ed45841144b5c79698c45df4a6bf60cb76499315eb7d17d15d15617171c74be8af0daa6b4c9851807f47915cc62430e51645a3901346fb13ded53bf7aa090d5dec0150295f521392d9a68ff2c0caad91a3acea2a8345ad7c6a2f38de77a8741806b4f627fde9d21f0fea4fca80fe9819d4cafaaf941be94d56f0a3714eb2e7ebbab0446302936da113a15c435c957bcdbb9054299faa04db7a75da968df19bcb374bb6b5ac8d5f57be4ef2ca8d37e7be55f6169e796343342383f219350a64e9127471d4825135cb6da815d169d44a5c5cb9c921f31a10417e4993d9ad64109d2df1a637ec93c2f7799f5e142ff25c5f2d2ec1bd72ad5e041cd509e0a6a5fa7c8188eeee024468b5f800466590eae5e7ca514ff839f809bfe67f6b914fc26990931423bdeb4ffad88e686017dc1ac49ea3163c1cf47ad6fcec7c9be7fa578de1c2aa4d4d3aa2a22c8dca70d9a83452aaf46f028a76f1d7772cd58663318ac6b8f3374d370df4a67e70d3244d8e365e66a2859a59db5e8a2b4b9bf0de4be7fd934fc211e1b8bd37d26bdf10b7757d9da7c38c6e1a83322b4dcb3b3dd7bbb99f1ccc6bbbcb856835db14ebe9e60a00c7fecdab38bd691fb8690883ad0954feede5b8d9268b5a91fcd27934facf3f1c1ec520c9d26f26db6e9ae87bffc07e3455d737e04fc1b14323576d6cc0714ba5c1115bdad37c5c5f0df95f6d0698a65656586df50804e955f4829f011ae915dc3843c805a53f669e5e65bce6f64a5f7939ee86a1da648a79f9bc22f9066ab91a4591617dc35164c2fd855de13b0fef64c5c40b581cb1a0ff2fe6f660e6b7f1238a1e451f430d396400fd4fd3dfcb8021f907bb3e4986d16f3e047af9648c36656017991798ecbdf3776121709e1f75dd3e21ddd85ed65ade84a2c217eea3987145fbaad6d54dac690b9cb71a71fa0d17ce76918336b49e86bcc32e0bfe6fada216d3fc5e9c58b69e8b13616ce8b873ac2fdc2c8ce44a96dcf8a4a50d5a10f3a7bfd28d1631db301de85428560894cbc16872a105a4b11272901ef73d00af9955f468935891e2bfde774da7b5cc5274c7f756511a95260745ce95a55443018c253a44d867f95abad929651b5280d564123d868d20073335b4ec18f04f49b7464a2c6afa853a8ffe2846253a26c32dcdc87e12b768fc98b77f4a7fbc00f1f30529ebf4058aa62a490180811e501e6db8c93d4f36b4b269e3c0ae3bbb6e96b4045bbc1d6ea9ab79115a67ecd2e96f102dc04ada8803707e5713b413463c0de721654ec93e1803aafdf89c2e989341e6db59897a094b36928ee6962eb62104b39aeac967ecbeebe97912a0585370b7e46615d90426db661a66d82d2d9082d3271ab9e7f1138d0826d8a248639d72a070ee83b83a9e147d6406f48d96715e7a4c1598a852132a261e71a36acf058e23552d124c8eea3f7df9475f9a82bcf66ef76d058eac3b810d103a36927200910de2589526a9da9a068a5b3aad12117acb57355fcb5c38ccdd18f942cbac1dd18c68fa0e8c0e100710a1ecd6192046dafc80ed8902f43e8902e161b5f7067bf3e4887711aeef518e4444a7c0992440a4809a82898ee500f68df383b4732f8834ed55f76be92f9ce1fc8c4b9a503d8a24dfcca27de9fea4c85d7fc1c20a66eda4f7f5f85362315e3664979233fb93c273e831d
d41f46606a3adb3275afaa480ddd5be5354e10926bec779a0a5ce9df4b8f57256feac887e871ca7b7c9b0c4022ac51764398576b6eb21b43f4d6f75018ee23fbbdee24b9c993077a34bc1e53ee57a6b4925956801f9f1f4f50b6236310d588d4651a0711c081c370e8c962cdc43119241ef55b47f9d84d05a2f87a9abc635008ef3a8fe9c86b3d276ef56d006eea41fff269e6ebe00ac9213442df4ff90bc5c006c623dc4d822c944397e7c8e0be66c25d3ed1401e54502e5656ddd7d365185598d17c8a536ef75b4a40ef8aeeedceb78a56140faaac6e69be2bdbe736fd58e3d6afa2f4f88786956ac89240b40d99edd7715bc34db9ca7b8a7dd799275fa03cfbb9eb7144ee59171d81fabe8cfa410b2643592ad9216c51a54c7a77a9a21e0748cdcf795fe28c7db2d8d5ea7f355bc6c345f6baeaa345db5136999e4ac82fe8ccc5d85c7f5333de0cc40efe9f1e0da389ef657eaad8740a7a5259ea0565b056d5db495488ddc5901961564fa55657c00223d1ea823dfa804df0e8d6be35f2186806e18bdd38427f4543c07830bb104344a91446a4cb5498b85ef953be1d806bbd653ebc6f3ba7a24ed5fcae90b308771b623534bbf9a2462b4cae9d0f2c1f12e49ebb44642fb4e375b0094b44f9ce221fdf59f70a68acc89f9f83d0e5fb86bb5d9aeb7e500cab07320a688c9a9cd531e7b080ec45799724081b3e09623c3f433cb8cc286e6f8535baaa145c104bb01f9cc69e9c99e0f746790471dbe87f6ade3d6a3f55a0238726d0304d16fabaa3f00f6e54492753082d8a7a5efd4834a13bdb1d9437f81000cc910fbf31a061c379d03d0966edd02cbf22f8e2c617b3240bf8ddd6a1aa48c923ebaa35ea2af2280fe2803f4760213d9194c7f86c04567185c544588f41f31229661c4ca231940f41133a83c9604941fb930e52927a63cd220521eab17293f4ac395a7e422e8718a95a7c8c12db6dd2bdf27df2794a7ea071cb7202d4dce9eec2cfade017195b10ed0db2fe1aca66750a7ed97d36c378c1d224d3d40348f849a5c0fc22d4a4062576d91a66ea08f87796437a45c091a8043de037ba4d336dbe5cd9cff399f8fa4639f13d2f3f8b463c984f4b25663fb54c3a3f3d25126a03d3ea525f9d159275488cbceb1953ba735b9e3c0b254fd4cc320b4b0386bde82b1a3c48354ef8541fa419408cd64d443d9203f9ba589a05f53fa006009683e7b81791c378c4a47c1d9ecd5ea3e3a69ed0cde2a988cdeaf5499fe8015a5d1d21e406d710e46174a5309f3c4dbe9d62a591ba0bd7bc414310aa72f1c12ff68ad32af8de1e5338361db0280f2d86ce8c51a061399e09d6c6f00a86a5b8b29e74c1f14e4f1ee013b7d712d56f45ac92b457715b7ae196694a17d7a66279d2f09d64f170f5619ac6ad14903a0deb83898b0973692fe55322d462b7a3c5812b6f965efb8a23525eb29897990da29c614778b6e516496984ead97b2928df8b726baff02502497b42a524adae1bc2ca21df5e218316b12e3fec9450eff9422a7ff9aa25b55a6140c512e6f00689de1759ef228ac2485c9a96a4bc394c7a66389eef8ade99be38a35251d8af5255d34d902908d86f8441abb62237c1795dc1745c1b13089c5457de9f632e88108a3f4d35a3b7d61ad17ea2b472f23c6509f11fd886e16d1dd58ef41911b713ab31e40fa5a217ded14a2fa354435ac812b9215d715298a66c51a455b514a95716d4c663c9153659a8ed11b2f8faf5a7b0b205aab98148c6d19070a9a8bd26a894486f726049e6f77f7a4b101f1ec6a35e44087fffa22d89b1eae37b01dd14ad41b462ce67ba4873d2ae97bdee68e344bbf7cde46abd7c94ad617ad71f765a80f77eac5bfdb870df5e0b77df8d6fc47fbefb5dff93bed5f1f6effc2751bec45c7e020f458e9f09d0e9c68413014490cf65aaf59e82227d03fcd0cb2bd9f0e5e2cde508568e4686ba99770b46daf17fb27fb5ff74ff16fe668a3eb718a84fe334960be7481c337adc0e9ff4ad1ad32c6cdabe4f5d871ff35fe6dfeebfd6f73749270d44902ff14d1498a458a4edfcd302df2edaeb5559a824d1bec6e5a1bab37d82f9a275455b5b57163e3f6ed3e3c8de2d070c354c6922403ca7f925f4b9a2c47382b5bc8cf07281ffa075f97c4e69f53d49425e32968b717d59a6ad0178662109eb3caa57be8f90478794da5e5e55637148a0632f95502180fdcb6a1d8d3bd8abbb049d2c5ba48ef4ac354c334f309b3f93384cd1c8fc50b12fed6a3b5f57e21185abffcd298602c5b3b776d1883569684f521e2117af8faaff727a138880eec06fd19fd021637658cb08e31afb5df866a096b6063dddc38cacc34721cd94fd6b8391294339223ce338f7204f341168735a5b58435551a76d0c1ad12684d03eae6cc373b7919516322391c5d3929aa65d91e13f8f80edb4362e63b07b42535304e13a3382d73095a2b2fe356cbf37398cfa6acd77dc19271240d2dbb7bed063cab9c2316
23ee2843c48075d9fa514f14eb61bcc6458a26c0a78bb9b5178a044190cf93e81d4e184bbb71346ee4ca5c7240d42b1629c3c400ad2e2bbe57acc734a707b9151f17299df1933bee3c6033e760da7c87ffa297ce77da1f919932f230e45029c7f3e5233954ffcf473934bc767bbc168dfa48de64dd5d8f4e3df07bb5e3ffc3da11bf2a7c1d51b7cfc1c33c52d2364db31e08efe3b0f9cf950ef7b7d463ce9a707371bde7eb6128fa5f4c80cca5027629cc5def8653cffc3d8e374e471c9ff296ee53774e4d24cc91f0b4d7fc826851fac3b1108b3d981c6e958da322abd1fd56f32d0f0bae0dd398ec89d6471318b4228fd648f5165e3b280b298a61d15a2ad212e87300c42dfa1d893791063de7ac1ff28a34caa26e403fd7083e5a4a2734425fe7d17d83a752321b4487203d7ba244ca5231a07f944817ad41fdcdcf29ae57c44d787bcefae273735e82cf14cfd775f18827fd49fa44a82d13ba41897a8b964e7549a3487510414626b90091f3c7affa3358936bbbfffaee3cab85c4d0daaa7f23920c248394254a44bf44221fd417d5cf77cc6b9db25e591a829dfee6715e0605215e7e6cf37c1d8b6760cb386c39dce818dbdd53a2fef817d832e3044bb57481538cea4fb0b14e092781994bfe64d77fcc608e9db20ed1c704f0ca511e9981cd963d1da3d3fa384a9e7ff9bc6379d05d14f95f342e3472a3ac2d1d5a3bc9b45716cf961d5277daddebd409ad8fcba0ba1e491636fff8b961590cfaf5ff92c5c68f86252cab0941869bff136c8d7d78e46bda1f1df9619b38d520e98231fc0f286e9fef12e42b05f9d3449a218d255dfd910693bab328ace04f5999bed3c4f45fd13e8c403ffdb81ce83443ab49650847b9c531bebbaf8ac852bbf3b34ae1984de8c6872193df1341c8fef82ab259edf8aee701f2ad59a3b1b860151ffbfae55a5aa0d7feaff2ca95368453b78b6df20506a705faf2d097ad2c894d6066431a515af94fd34141bf97129a421b6d249e402719c29ac20d870d9276a17f89fae3d057318a07e597904f5ae7645dcd8391c61ee0bf5af4148131e6742b3de113f1498bb94d64130f16bfbd7bdd494bd1abfe77bc6f5190a6a216ce639ad5d0c4fb8ada3c1841b63b81608f8b77d1ee1dc9c6b660666eca8d64645fe5f70607e1ccb1c8bf059b57785d7f5c3c68b902e177b436e6be7dd8b26b1dbdd34fa6ffc8087d763f4cf9d147a08a6801bb52197361f3498bff3a7a825462b98456349aafd2729b447e195230e3f235eef9cce59f8bee14dd24e6f1d4214f5de17050d2fb14d9f1249de212530d51805f3360b455967ae09dfa4d1e57737acf152b2907568b1cbc4d565a8a68ca480eee86b8a0377037df46ccfbd0eefdaaf75dff9bcb6dd4650bf0d1d22ba19fd3900602499328502217a2e2f03d2c1f1517d549186d60924a1d0999a1485d9e67419cfab7f3db5d7776df2c68f3bd5377fa79a8ebf5b03e8a8a29b48a92560fa80694e3845131855651b81c17ccc98051310ba3620aada2b4d5dda71a54582059cf47c5304af15f9d6e75dc720e78c2a898f71186e6522d9c293e2413dc3345a9549829b468a650253c3a53cc480f6b3ecccde6c16d92dbd30cf4cf30d66a80f35da904208bcddc6549a09e7030423ad31d7d8c2d5bedf85fa2ef57ed3369446a6f9e2cad388fb5c178fa2c825434c248276b72297c172b1a31cdcde48494ca14af95fc6a38d23e4917d23f73ce5483f92b96dc4c2a1a15ed8c56f1b55b9f9036bd0822a02516e6899eb2d47073985bf3ce61b7853933689babd3caed26439bb5dc6e710e422fbce2ad53edc826481ba5edd01a49394f244df1f53b5ae61cf26b866c880cfb14c1210c720e034aa13d5ca57f085668ff0d473f2067ba395ad33392a3e48cdffa6b7c4daede41729abb5e63ebc87aa63ffd7e3d61fca6ba6ba96f3e327e531fad85fe8f262c0da8efce29488e337f41d608d34cd3e37ab4e737bf0d49226c6f991c6280f15df8c4d27c5279981c70dc760e4c6db20471b44d117ccb5eb9065986504638219625ec4d9904b8361eaed14b483bcb52e59707072949cb3a4c63698096a89a5c35f414f61a050b9f294dcbb65cb287bbfd58198fa3d4e1de89acb9278735a15df9ab82ee7c884e63629a6f6c01e9173bbe6c0f371f36cb0f70fbc37d96039c3d7860ec2bbb5fdef56debe5a6a3869982bc80f485c68b0c0fa97928e07dcd7288b3f3a388f9964af8b6b3f221e5e18d8b3e32702b2805c9a213329060b212bd3072954d76c700ac5986d78a1d13e6f7cb0fa1bde3f92a268eaf1fced5f74d163d2543f12e58dc3df36bab490ce6f645df9fd92578462b443365b0e4e9afad4cd7e0b4f6a9ade11d7c0958beeaf5b3197846d9d655abcfa686377eb92391318708f4a75232191e2376bc13f79022762f1678a5a7e43380a582df4b3b6911559100d797688a36e43b645fe85ba0edd6462676e3f4eb97c18179ba5b251b26e
41575a2dde006bcd956054b441026708704446ee9c6ab1a60fc230122fa672926fa84009464a3887a7d1ea07dd10efd468220fffdeefe310d03a9f6ff7f5af834aff8ab82f98c36a0b4dad483cf86b81c6f5d1e3830cfec74b732fba56e7cf7fcd910f35963f5ff4bdc9fda96a72f6b15cd96e02f74bc3c57146e84767514286b9ced71533ddbe0548bb417003ac5b6b02bb1ddd8b4a20949d851837e9611f33997395d5f2f06e69cfcfa9279315a3adf29d9369d7e518dcd16c7c594245b4789c1819c03e7f475e741716d02935dcc43ef3bc741bfe714b30410b9f359228b80dea0033e4994476bd404ca0ff4100326e7d0ad44eb81d2895a47bef341e67464f5018ea0fa673b166c1c0c24bb06cb4cf46231d069678b5dea6d3608b9dbd987f02fb1bb5bc99cb5c536fcee58a0fe4dbd743bd7b2711dc678e8eb1e0cdeee2d499ea8c5420ee4d0d94efcf7f75bdc94644e772cc81ae8b44f65e44a6e9fb8b56025a60d8df59b7768dee4855692e867075c63742a7afb779895ecee45a7dae8e4fbb8952c7c403372ace0baff593882e41d9c9188ea7a069f79afd306e11e2238c7aafb0310aeefba1dcd4409d7bfecfbf6fee52ec9ede0b61f3ba0d7689ed6c4ca309cf2f81a673d80a82c8d32c9c1ac267d1d098adbf8b9c5636dcbb6f837cadf90bce9c59de25b84d366e403cf6cc48bb357cf349419db6c7aad4e845f53e298b8d6aaf4ecc4fb93ced47cd99add4e492270374e7466625bb3084252300ec02e6c8390d7718a2cc569aa793465527130709e9b4b198c20fa1afd130910348ae6a834180942e8e86b684d0e5aeb47b072387c9a3d791cfe6a8485250d805e4c82cca49335d1068996d6920051d9cbd1c792a564b811e2f1284b638d723093c3c36859c96e093dea9227ed28f766255904ed75692c7dabdcc3318ab8837a0e7dce0e965884435fbf78a28d1ea5c168b044e41857f72bc28de6cac9a5fc1ba2b713f7b1f12939f8c2ce8d231c9150ed4c836383ebd78dd229eed38cd77fd54eb00d97d3afb8c07069e92fbf6acfd43c63cfaef9a39d95d448745ac7844b5d65ab59d338100de99640ba1dbe755dac298c4b8f03666d28848c90d0e24b6359895ae278f252d3a1f813e7e9174940684fd6c8b4d428c89d12a7549a4cbf44405e7d8d4712a56a25de299ee22f58f423a907210e92f6b8349abe510e39d621a1b16e4fe877900e7977a3bbbfa8978c5606611791347109875c83e597aed5c2f6b0507d3c1c6d827940cdf003b7f77a104b72959eefe301a5bf43cd3e276c5b9a3cdf46206b80ff67cad41f3e7d50aa7697ce7eac34ebdf11fe17872299e413c441423bc51f6207d2e45adbfbda335cbb0ce4286a15b6a8be6ed36bc4509e511a2f5e6620aa15c9e8fd4ceb33ef598d24a0250d8021f5175c83ba1624d71a5bcb6a4672a2fd926d42fa99a639dee83ce1d4ae69f767f69d6c3cdb7aa13dbae95e98ba93f709ea0f6626461ba699fea7c2e989a9c6808e315b1e5e89f573305acecf9fe42ae7bcd8f2867254c6bdff93fc0003b9a5255abaa807102a2eef03e70775b621887f906518e84f24e25f56eb54dd83e8aec509d3bf545d9e3aae85dbae97d5dfdc421ebdf2605d29f72c6f2e45255c9b53c87731d07b8fa74f7b29dc58b67a076d21d07c9ffa9130f6a7510fa0e68cc0dafd5011475b7bc42fabad46d720e7e14fbebc974ef81a70341d70eee59eff228b86e8ac208b3080d680f8f3bf9e7174800d302a8e8a3fb8f6e8b4c76dfd490786fbf52fb20003acf46b82bf61b609d2b2c48ef98435210e571a368580235e1a7eb6562fc1fdb0fc48a210b769c7201f31247249377e4f058ee8f21cb72b06f89527e10cb686ff899e324a85f99dc67ca88cfb634e5af4f524086e9d668051bf30f757a49fb48435e93f22c161c39cbf3ff5c6f4ad53d64f5a13f137497b76d1a456849762e200bfaa854e99859b10afee858223f272745aae7c31ef191d582cac115cc34a79b8c3a6b3460e8e2b0f4d9cb683af615ce43e658f95224f6caab99291172298d414defffab21d7960d35a4f1866093e58cd95b9ab2f0a50042dd46fe07d30990cfa607b910fb58efeef3e58d0f78ffb60b250e88371f56faefaef3e58f285ffe483d59cffcf3e58cde7c33ed8a3e754641c5f679a8e1aa71a2b052f33b5cdbd129a7c87f3d8f8feb6fe06ee861baef1d6301c68791cae79082eb96d043ec76ff0fd3c84af6504beebbfc1f7d310bea611f8aefd06df8f43f8ae8fc0f7b3fe1a01d05a82a49d1d8806b9daa9067af78078570ca10dbcb205047ed70cac57ba81b54183b183dda4f5521dc83553afdfc702c788007bff3e16fdf66cb90dd03b7e11edd3300b69c68ecf345532478d7cabccb560a340d18fc3ad967eaf5b18b8e28c9a968c46a7e4ff8cf6b3de961e2dd21f9702a87ff30e5baa886ea0af2340e0133300415415d66155ad1f01742ada71ae79808abd37486d7912
509b28c0fa3d05bcdae8bda371733c3dba01ed7083069c89a7c735006bd10ecc7a6547ccd1c28083d622297c97c65cb2eb26d5d91da6d103270a9547fcb061edb9f0e4d6a6f18d6502e5a99f09dc3a441171c07df21ac6b902b77273bc10aff63811af688c0a841ce239b50550033ca7223739c03ec82d17d69f4368ab9c6900f18a252270c439e853c9cee02cb111276269e2cc237c23be73f30dd40ff32da8168b652fd9d4e6bac7b946412e1cb6d032917866a1fe2902dd31da331354310d20f2f227e81e51561d08344a8043d33230c25e780ff7b7f9bcd0dfbd23fb3badf0397be60c382b2657e0313e320d3a6b37cd20442dbec18c3b52a588ac07d959136b2df57f45762c86ee41f092a6e1b65a64423b12415a0bca929f5f82eaa213974a5bc59aec2cf90c0fb46e15939d155a67b171e729628ba0bdfc771a6f87a32da210d3a05e553f5869c07db3e5d86e2fcdf25d42ccfad0e67a805b4da4b43aa91b8fd28c415c0cb9918c4ef4572575e17353221b2af02a57055ed488694a9a782b8a6c33531be33395a12459121d3a33fd35b7befc625932ead370df8623b4fa127784c668e9db158095878228e84d47172a62b8b5bd5a84357309c4aca5ef56000fc2068161f41e0be7849c80faa25842b3cd5a748e87599ebe4563143413bc05231fb14ee3c1f054942ec0e2e5fb21d744225f4a0abca9a444353b2e0ccabc0a9480bd7bce1496ccd51f24faa82d3d98f5b209637656ef2907552de3b089a4b2f4529f075900665af461649fb5c1823139d5b00397f29441ae3e22cc315ef270a709d346117f5834317ed762769c0a54ee51969e18bd2b9650e5c3fa273c1d7ef503e5c9fced1e8aecc02b4df4ba0ae835778cc95c42afb6e1d97ef20ab452a8f2e8d59d2c2c8ab149fa40ff388471275969d9925774cecd09e8c1e1f49d0ac9f01ad1c2a6171a11a6a986fe247a8d4dcc925924369f7ebb1baf344479fc61112b03a24042c671f75b4b492c1777ec35015a5a88622e52e0988b98ec98f074ff881d996e9d7678f5165f17cda035a77eee84f33de8ad502612b74980673039d3a20c82febbc5044eee91a5c154300994212440f72544da34b59fc346004f4ae22d46bc39bc07ad9a112ac7f8fa01426bb9006556d712cdee69005124f09c69b14982201e6e654d02e6a2b8d972410a58e3eae7294990f8844517af3c02f11f83fe5f582de9d87b6100d5807ebd64a6e5c5585d58f1d9fea8e641260d4a70d6bd0a6d04e299b7986eaec0918f654d5a02c693272d3bc92815f03c096589894763bd240fdda11ce632ac71b302e07e50e23cb7e9a20b3d876491eb3b1924ae323971e4bfc8ebd15a8c495b697907e2dd05c34e0e663a65d18ae845a4c7d3e9923e74b326a163613baa7b6fb23a02f18d2a6800330ba2c820106c898b154d26812e84da23c6b17728bf3028410573b1c9ba30df73149925dd14a58dd894268e70b7cbb7a9551fe5da2ca265f53bd2a636e196b2541bc4175d1057732f431c61463cd871af828cd8b47b7e843228041bbe79339541376fa6c2b1ecd4d2372a46e382049221db74b3a0fe31e693966c531419b748f1975eebd55ccf9fb2b350ea652ef5ec50ca5ae807baadfd36772a14dcb6d60da56680abd64fedee94069cb67e3354b604cd58fd4b864a5351fae192a1f22d28ed7a7ea89cbbb577f7f9a1f24294fea570a8fc004adb0b6d3649dca29f976566580b8fc3feb8b6f7e65dcdfdfcc7ec2c3edf5a5807d01eee6d9b3b7d096ccba8b3bb53cd203de38ba1b26eb032e39ba132028bc9f861a8cc0fc3325287ca42b128b26f3075a874064cdf1f5c3954ae81e98ec19543e54b60ba71f0c050792a4cd70c1e182adf02d35983ef0f9533300d06df1f2a2f44e981d0a1f20328dd173adc2b946e9c68e722fdd60adc3dbe5aaf91e38ba04be211a64c39c28978b5eb0292740ecf63259f5fb8cae12f4978acc6c55e6e5414499ca434d40aa991b28152c3b28152c3b28152c3b281528fc80687f5d9860ff9917f04ebcb8f607df911ac2f3f8215a578895932542ab432542e48d850392f51cff33c4d7c4ca22eb79de6f31f93a8c447242af111894a7c44a2121f91a8c447242af131894a1c92a0a1725e82560e950b123454ce4bd081a1725e42de1f2ae725f4fda1725ec24287ca05091b2a3f8ed2f7270e950b1236c43d5e7f8b4fbbd3bcfe161f8ff161a0378056ed057fe5e3c91a61eff9c96006dd30d069dc770cf6db58b40ae4a890209f27900c1dbcaaa51b9d38da957282a87a15089567e6118e28ed15909937c1519cb3cf397407f71bf79708782942ab378a862f6cee14897d611f7ac7afda902d6fd3d2ce8ab021dbd7ccebc690e55b9c6b46d2d66cfd224ff11dd20a3e779f19c919ebcae86bb60de770fd77fcc0f1a7e485cc0ce8af92ae2791d4f5e67d9157f2ada0157f3d6dae455af28bc6ee4e233985753b602bbface5ebaf386d46f2c62
0791bc3d5bf166373972089d340896be6724e9b4b5620498a223b07bbf9d1f82b778fc7457640bdf94518e115282fe6119c3f5f9bc195a19228121bfc99c39329bf24d480e38b212d6778edbfd6cf8ded9cd861ec53cedd3e8e604fcdfd2df653677b8f8fc47e6aee48eca7e2511e2a432d101857e342fa6977d9c896ff79e1f9e3a8e5b7b4c32dafaf6738d8bef8dfb6dc7721fff4c896fbe247b6dcf718f6f60b1338ec73168ce8d767329e9284980c948b30a316382a2f5eade6a8893deb57bbb536bf1be2587fee8b2a3e0fb574565ecbf22331473b0c33a53eb31ac95cbe966e479e052f73a6f3c33287244c383fe44768f837252ed35cb543cf19174a7ca3ea7661d0233f38f2fccac226e82113dd1ea15f98b534e8f6e4bef0331fd3f03b4741e72bcd090cff9e7cae9259289c5b2d9dcfed18f23b54b5958c1b06e82b0dc16ef89a4ae411f2f0f17c5ecd4758e8a3fbfa36d857d121133068b3b4d4b65e60fd3e0eb05d5d2242c688035ffc003cb303514d8811ddc76d3c743d38afadf90d34b4dd62a815f3bcb498c6da5081e5e638729cf785b92b03c60b75a4fbab11b74493c598781e98e7c6815a42789ed98185225c8418b361021ff7dfe62c4b7bc51437e7558779ce234b13dcc8f3df1d858dbcf714db476846a6456d324d1d4fe52226877e9b0cc334d71bdc7708158b904f686d9083dc1c4615b9a41b60a11b6c68fc043ffb3eea7dc95f4fe7e4960ee796f6f0b9d6062db83a023aa8db9d4f62bd23f24befb9f3ebc172fb08f85fb9fcc5d68628b00fb63f7b49373e6324be0e77791aa81b89ef8e3bdf02aedb08c1a226ffe86ec504747699900b7e40b97db148ca030eba734b1bf85c4429a342e7059eb7b9cb1a2fa1b253da5e6ee40f2ea0b6748f2142ad723f80a42210ca049c95b892c0250cf842b8e15cd384f27efe2b8a57326deed51d3e0f51d46c73dfa273e791d83776f7ea0e478df6ba3bddc8a73bdde95ff834c48399edeebed7032647f715a6c9af17faf43316cae3be6e776b5ddd1074c363d0353ff0d0a7623fb6fd16daf91874e36537b4dfefe0763d060dbe77434f1e01ed7e3b5435c2727ccbc7fbdab53b62d1a9427714740f460fd9597c597dda8ed86813d1cc59b817ac0d2ad85afe5766ade5045abfed49e1f351047eea5b29c48260a2204cf1b95cc84194d2c254db2f0c1c517d1ac1ad4f0cafbda6e94f5ad0b74a4ab468e4d81e17a0f35b70bee5b4e492589dca5caf585b7c4eba21dbe29eb1f7e5f8681d7b9d038a95fd0dd2d4661bee47255dc284f5d3bfebb4b30ae9c27a10c890203bcb77e595042267cb4f0573d1e972fe54f917420dd48ab2945cc6d7ac7b4358fb583aabb0281653a173253b62b150c8174254df3d287a8a00fab07918db23c19b07309532a81514c1989de3c3872d2f76dad0f762ace21383e88e7fb839b8239011835c2ddd5d815bf6a05348615b2909186329e4d6ddff2eac77e9792b99958b95166a69ff1e1cda646b8f98af61dc424922841a9a3785fdf72cc1d6eec44aa11d2fec11b0c76fa6241b0558e675be4f4ca6701ac08495ce6c9ada08713739bd71b9b0ee1d8ab1d196008d28ce6d25c12fd9a97800b5f4076cf692e378d9ea40dfe9a06ca335a90e50f75d6302f7d4039b6b0cc87cdb62e1eaa74517eba36460b6a44e1d98180f4415f51e986a9b5559eef4e0df978f787f867bd71fa8f7a0327ac6048e8b07ba50b6590394e5362ed7671fd2745d6877319f438ed285f6162b0fda3c3ed7224842fa79bdb28cc48ace45467d0d02e52aa0c953965678c6e429cbe114adba527f45cb7db1464565b85c57ea8b1660aa746bd16757e058a27b8457606cbc5b5454cf9f86e17926a1294996b7c0df549e679a75c26dc0cd58696a287fcecab114f76387642cf4a5e842c54ae5619b94fbda8b9cc0d12a19334fc7ca3f8191b6f4e017f43e3f922d84325e84ce644d5a72b848542505fa49b14037896d9d0494c114c64ab3f0aa1d72d1d922e5479f00e5a92b00d2825799eaf140f17170928994dc07d26f8399694d82fffb2078359c67b95d9be6cf8473690ff81361eef360fbbbccf31cfbfc1e0eb7bde6b947dbf67fbced23b0ed6357008a94c153beb1d6cb3dd8ed01452cf26c3ecd53706b1c313e5e9aa155ec539dc84a04a1b9c2066b14fd6fedd62d799933e6fac5e42fcdddf15591963b990dc7023dafd41768a3c8dec14c79efdb05f5df6bb7e4c1515ac0957c869e2af82c5ab00d8e5211972afa0ccd9cc2aa1fa78dceae9316425be4e0fb77ae869f7fa70ad141d0b3c2de1023fb54b0c209829def1b5e2545a78004ddeea07c6780326871244d4367774f0b36f06e9905ad1509b607a6507d94c39f8f0e6b4a68e4f6fd843b8a208e8554ca6d9c5fd61e6d61e27d1d7caae266b4c59fae34d04f3600f931f4c528c3ade8026972409d40f572658478e84469560f5af54d60ca52f0b5fc0e5370238f47d582
a80db62846505b7a54a0cff1287da309815f8e4acb0e3a724629ba9f4f423f3649d1ecffa9e2b6a213dd32cfce52a416bd52d25cd456f48aa773d72dc5ca1d9d0ca9b80eed64aaff5a457ad14d5f67411acd96832267110d6be099f222c70e1aea0bb7aa66acd069679aa01553ef5a45ef3d00b2e52b353ab6ca548147122e889d515d9738d8fa012435c23adc07c335f26fa355e1ebf16895eea4658703adca293f22015a23e46abe57dfa7d37e6147bb8398f67fa29a3d318893db938ceaf644b9fd0f61af1ed92d44238346058dcea6c9eacefe2437ec69873b862ced9a70dd7d8612b864b787ce53f60ab2f2501f9505f0dd8ad44207a355342b9f0ac574716ea82c171a9f30065fcbaf5dfbafed7fd131563508f9d6dc66ef7ff10b9b70e3bdf3d13bf18d5d6e0ca5ed01754377fc9f425f9819e921ea3bf99d0d74aaf45e88ba53a0b373e8ab34adc3d407dd1aa61edc11fcaebb6eeacd7711f5e291a74479ca6ef9afa50b9b00a2f8926da8fe4dcb79384286b89d2529fd8343bdbdb5a177f80cbf7b2736a1b1d280b82bf06be83b0ce09770834fdbd01ab06023d42d8a35b4bc1757a4f4cb154df4bc0ed86b31ec3dfc41b947b4e16bc39811f45d1f3a972e7c4923e8469b5dbe07ed5fac1e8bcee008769712bcf3a04d68b79d97f713fd4ba24d878dff2a6ff0d48571bbd60fbe8f2dbcc168910e2927bb8e71b37c45fd316ecf19bd7f481e45725582665bc0847179071a8e0abbed366e3ffe0806321319cdcbb168b71dedf61e35fd4be51ade6d8f2d74c88f23fcc70f734fd5a5c35c6dce1790fc13edb6676749b574713960780fe1839e7feaa03fd49f74dd8ebe41caef2407376e4223ce9f234ee36765ed3f4a684a3e1dc67c272d3e6968cd19cb2f4b46dfacbb12ab7b074fc1e54bf298d550179f8a5c72092f889d5548c11908638bce61f957623f39c75b3c49136a45188fea11df6578fed1b2c653b26661e4e911503739a8212e9b4afb13673e72a2a1484bcb5bc43fc7069a5c8372d4bf909677d1597edea76a78977b569225433c2d234b3030dc6ef271bfe611edfd7508ee303a41f0a88e6fb1f3b302aa2dc0bf8ba2d9f0f6a9add39a366d1447b8a511ed7b95a5cd4dbbb19a971e63b7b05fde6369e46c7d53e612213a6de367b6a16f453a396b2e47a739554ba3a1cdf5200a68bef7aa8f8a5b2a2d994965c9451c77a16db7f8bf06ff6bfb6dcbd38d4d658dfc79b765ade816fe8aa6558dee7bec35d7c37304dada28dfd9c2dc52915b6629a22d1fa39dfbb61ccb7819da775f4c90458e489319f8d27481c45719a1c28a68e8df8d7709dfff00575971966795b8100fdfa95be0f7ece58230861925b4f33e6b30e08ea29c4108835789f7e27205ec89e7ae344b20c49e336afd688fb092556e3d7db7dcbac26d51de95fbc9d0fee0ab61c2b7b9c03e3ea72ec39dd368e5730ea4bb736af65ae4dc79dbf86fe19c69c2f854457cf45e65c41b98309f66a35e8a669809d43361fee5f2e84d1560878325b3e0acd18dc3a7274b9841e4a887f8b2c244779bee678330df65a159fbe3fa85058ae48f5ed34799093add2596a494273f7ac60e5fbf52b02afaabc2bee15ba8e67197823ec6d5735c6fe841dc3d64ffed093d543b3b0b7d5f047d5b049d9045a730467e5f44d0a8f3686616f0ff0de1a7df21c7b3e343851136155e867a1b6e0ef494436d3061f4584f09cfa3b4c8489309f8d3670cfe34718e4915e4f412cb145608fee797421bdff2d2a8fd13fafa50b0a580f6a6773bf8364daffa3a102739cef223f29d057a1aac2125b7caf42256126b0980ad99cd53e59510a7c7ce549d0a7b8e5ed787b39e50863c8b45722817d4a89c70cb3f00fabee09a2a722708b69659a31bf551a340097dc071b9a03fa9a42678af3eca03b0d0333df51a9ab7e19b67a4ec073cba7861b17ea60c4009bdd70088e778bac8b468383ef9e788e7d0098f75ccef7f8b049df7e05a794d7fda84e9cb7301bde56770b9a0a066be9d6572c15466aadbe7cdd09ff022ca52751729c64b24d8c92d65a9a2e3b984fec42840779503e5917f00b72cad2e5c58c84335a667a7e94fe412bbe939af3a9c3dbf12cf85da84f14a449cc39e43bc3b645b9a7eb1d5d855d651defe65d3b78d897dcbeeafe85ad5b1ba7d5debabee9b7807a800f758932f5e449acbdbe99788fae86246ab8ce805e81b402d29f16bcbd74ad6dd7863ee9b656fe26fdd583f7743d906f455a08b16cb07686c429e43de5aa42904f3a51574e5ce8255a736d1371b8094f6a461cc26ecd037eea6a09e471a76e202bd0b234d2a4c340b7aa6dc98cef7615e89de05c7115f9823df0747d7ebc3236cce3f3eb6c077367743659564afa8ca64045f424a27d617a5c5886716f8a72d2c2872d2be50f2d28a9c682507695d95e9017eb160557124518947172e2b54a496bc56d2a88c988f9524c3be7aa0952fe52c578a978d0ef000eb8cd79
857dd5ec2c1ccf12beddc1c316c7d59e883f271a5a6126a44f16702fdcf4417faaf9cb39d21e70c2a56f2967cd8ee96e6b96b69e7a25a85e78475815928c63ab53dff3354b75f8eea9eda5eecd8942c8e403f1ecff274f4bd13f4f5136481f92fa01cced9141e043297b24b35226ae9a780ededdd8e85476e3b04d865dd223a4b86ce356ea7737b71463b5a764d8eee27b0e6801a6c812e2ef4204dc8a4ee9bfaa5d958a895640659d9d7359101bda0ca7c48dd66a74999849543bcb28eed5e0b6ae31c75641fdbb04094f9e29fb258976b7be4e6cd18fbd343d1c7f5146a47ef9410dad1e215f2a5bb5893bc06d3b2e3fe007e78bbcadcabee1ff7b136f2c21b18ddd2201162d0a29d92c37b4ae61eb6649b74f38905a76da815d9827d36af4f86be4bb88da72ad2b70b9d86ac31e76db1c93e67e5e701fd2589be73b2dd4b5b1bb7d6aed312dae7ed282dd3a6dbf8fce76d949f01d0af9bbdfb0374f1d48b045ef59219ef4fd2b16d79ba772233ccf8052612c685c8c7a0df22b1700bcbdc1fa4d32aa0869218cbb40e865b22493226f225131e9961c2ab8ca4889593f85c13ddd7002a2d1409a19d10babb015aaed641985340e085794b7755457580e59003bd7fa697748ca6323603af73322dba7b458b6f7a7a69e9d1373d687f0f406574a1b344a206e023a6e49b017f2fef86bc5f9e096bcbd414d9b8dd2b8bfeab0b98619d07d25c2d2d7920a5090f69e68b2c2c639b5e04994bd038fc49b732f75b39e5db235a9e1b935b5cb3a9d417fc8733d9e37aa1a40480bad389c604c33213c2ec422b37b8cb97d13ac6b9bab038a26c85b1d9c6361900168778996e27e216fa62f15eda4fed1307268af3ed937fbc6d9f31c0977e684f1d60ec8476df2d76eb66106cda7756e8eba89b52d857d14d69b009e914bb15f656431b4e406ec581952ca595032287d05491dd6a686945f4d91e295ac742377fccef55257e00d0a8474ecf024b76b19087b9e7cc3c5ee28118d23be6018e78c8f23cd4231eb27238de57481c416171b0373879876d0a019da7834d310f28d200d6da824d4bbfbd6adff660a2f853fb4ad80baf2c9dd6319abc49b97c41adddab4668037b80c336c807c02f8e967800222ea034346ec3805716c2a7d3ae3d1e1ab7e5c16d1be20be201165f1bb7eff470eac3e3bfafadf7a0b6f6bf48416dfd7716b5ad77bb2ebc6d17bb146aea181927cdf428a4a96f8bafc9afeea264484fb104bf83b5e787eebccee3f581825a5a5500b9c3dd933c7d9eb59c0714a7a331098efadfd7d1f73fe374d4f0fb3a7ad10235541379611b46ed9003baf9928785415a7a60e5b09662504b193bc569e981115ada18fdb8966ef88d96c624ace4b474bea0a56b052d9d3f82873109bf27cdb4d8036c0a1aff1fe439330963d9cb629c5d22c67b776ddb154914e0910623d4d61378155388b350efcb0a0f17d0dd06406871f9c93d992f62bb235d06b0321797875b285f0bd0ed4652b69c3d511093572e8e6c6a029c362e23a56c922f2076405d26efff995ed4e1e1d665f455153aefa604cab7f9a6842efa2fba4c4a055d762c72fd0a35cecc49b0ee8198cef210f3ba4cad5b0664b685c6305e2709d758a4edaeb1508ac7baee63714f733a19bd639ffdb95f278ab7d99fbf76dbbeed575e03bfb1a50e6c784c03390a73380dd4731a58fcff8d067278198efe3ca88145bfaf81088ad3402979835a170e428fff170d14938d54c6784103b936749c069a380d348ed440844fa7fda19ad7409e03d7ab86a5684bf53d15d4b2242a096a999eca706d67258d7f6ecbcbcca72fb400e2eebe5b281a9ad0067363e88b2d38138fa9be1d471111b197f202f605920d3591972a00da55c742cd5f31da4da572b4daa3c53457edb5dadbf097b9a4dbe68e69bdbea07c6144ba7b68d562341ccd58f37b1419113b9b70a9237d3f0054b76b7b7fd2bff599bb5ef08bc91dfa0a419c4e7bd536bca2b398201a11e536320b042edd82f5cbdbf2a08d18a353595fecc6282200500590b35b9ac760bb7dea2842f6d456b14eab2ba5c7ca70a8599f0fad7548ace412411777cf9acd54c0797c624d14a906bd798466a11fba6bb9553c7fd73b2445746ca7b6746fa7e41a51e612a47503be4b7335b9c5e7509ff995117e9d0e41a2f9354b1499d40d65dd3526c0715ddefd9082923221af0ae6c9196e5f21b892db879c4d54a8d13784d197c0310d7feb476411fccaafd01a5b8c8f4e23ac2071abf6ea52619dfdd00e7ace86fc73d916d89f64ceff4ac47cf8b2b4cae05679015a638cab9cb00badd67819ce187c9a86d770327d59f974500e7dab6c4bb3cd5d4f5eeeaeb7f900aa87d636bd0cbab86833bdc780f330aa0f7888900f26ec12cade333cc203d6321d5c84981971b6e5ba6d64ea07fb8474743fe24cd3d42efe8e84e0d7d594d0d3bdc38dbb5e896624edeefbafea513792d17d01c53af42d3bfe3be8c3df459f6628
beab0c91004cf3fbf64e5f418c3a6ce6bef1b982148b9e9680ca9d997fd14f3703fd6c19c012cc59ba102be901e8af2c40b166a7384686be07286eededc334dcca5488182cb7b1f27070cdb22b76a13151d8cdae791f0b253c1c546a7fe692eb76cb112efa68e8b55118f69372ba4f88724e7eb0f2a9894ae5f43f29954fad9c78d830cd3c334719a10b5146c42895739e9fa89c9e3e11a627b60cde7c78c75569164ec9c728564633671a798cc967c25bb3dbb26fd33b65405f2123f29d0b19be24e8e294004238fd92d5adaf106ef2370cddce67c41ec20dde537acb635f392dea02a886f095d3835d1ea8defb39c5e74ad6bfef2847bbfa355c9c54e3f5e9ff89b966dd922901c2fbedfea4e876dd41043759cbf65460f4be1689c81282d1f34989086221727c6e3d8e21e8d43086a0b61f8d09c6f2a13305357f096fd7437e89a66321ab52f5d3754ad1d39852ffb42e58340b0bd6cfd24d14cdc626ea67eb824473b0a081b756649ec984dc0fcdce2a197ba633bb37fb129610bd2bbb3dbb156fc4fbb29bf0f60403452ec2f55a31283e475b9dd26a8f901834a37c99a18c988229a7bf8c5db69ce93d9c14e389be4f77f2d23e2d7dd50976fb7034f1d6ea0a4d112f529ed8b81b69d5a6242cba9dc3e678cd1b3cc792be603f8979f42fc73bb0b8430b429fd35f24017f2b8d95941294a414b79a3c06024d2470dc6a89cb96cb4ddcfec0afbd689f33169d1b20c4b9b685c2773582262a52a1243409b2b52fbccf5d92f5a4626d74e38d8cdc73d969c55fa13b2bbaafa814b11a0bc10eeab59d1e6627fab6fe7eb2d3a32cedb81d7d9beeaa4daf1a05268e8aee70349003e8065d74079512077ad1f7e4f11fecc1696569fa04123024fdee455c347514a42b07b0e346e3f4771290b9c2ea351accca896ed2d867bf5d2d6248fd45d760b5d71d60653cfaa79a1c5b4f0f4437f9d974cfd5c1d6c4d203f6b234bc03cfb0eef4e867c8d9a65bf86cd34d80eaccea083479f43b36dd1ea8859006f0b1ad2c4d548f66d02c09edef026569b07e1043426c0893f899a1f21aa11c95c112dc0795634f43fa73407417ea1de2bdd2b37334b680790ed362cfe916e89ea564ba31278df4621b76c7288aeff4b09a2420fa3e76c18aeecadd67b786009473a643577f36c3ac82391971d2335d1fdfcdce38bbb5dad8a0a6977d2dae9d27df01c7492abdbf6fe1e125ca084f6ce1d640c60f5432d4184fccf1d59549cae93f83a37b4ef4565eda17ffc62efaeb729c979993f785f5976301b5af79abc3e23390e4e8211d87ceb2924e0f65c4cf808947b5bfdc1ae3c9484ef41e4e44dfff3b0af1d05f5f780c4bf21184a571228fc598f1fe39df0084c13c84c1fc9f30740896f49f0843cd040987a13c83950029bdbb4132b2b731f3fe9ff6de04ae896bfd1f3e936d121605230404351871a1ad55e3be50628188556aad806b0b0e8b581565116d4bcb164258448c8a8a0b5ab77a7bfdb5a61a2b95c50db555ab15b0b65a34026ad5caad082890f739331308a35deef2fe7f9ff7ffdee1f3e599b36fcf9c6532df73b64e3e1054a0a5df9fdf550c6d45b8ac812bf66bf4dda0b493aff6b12eed0aebd2b2a9d4ecc6a984ca9854c693461535ab868e774830136fd46d1c6fa1d019723d24432ac4f1de71578cb88f46151c6d3950d95bed9543bd63ecf2e42517e158510f1ca77e85c02130d15420723f10a25cfd00e1a7173fb18a115f13ff68c1cfadcace74a97e4297f05be8f0dd70f818b46c1cd61ea23fa33dfa44016ffd45fe785b44f9daa21807790fcc24c4cf8fd23654a5512785509a07eff7d34ae31f3711fda9beb62b4dc9f662a19ada5abf12e27480f0c4dddfe0bf20a582ef668b9694c2bd1cee6d6d519291ff922d8a30ea13d3d1b18a89b6d20f6ccd7ad296009d1d6d8b4aeea68dce46250f32252c4ffd86d78f309beddb28088c7818814f5563ce7d2afb840a66f88d38577a729940a6a3f9f167a9aa1082b9979c65f247bc910bfa9fb2973fd21631e563e605353bf6df3df06076a9464d3937f21e465155e40b621c7cca743de419737ff7649251ffec691bfd0dbbfe99187a15be42dc6023251d44981b40a5dfc34faef0136c17de1fee878aa07eae195f29d4f4461d23d5762f1dd553f8323d82e03d97d8d1e705a30afd1ebe664ee79850a3657d084cd9e4b65547db42a894b64138b7c41bd48667885a678f96666ba6e9a3040e4b17a50c2c7994da7a6129351bef19d92e534e794a97b0a4952ec19a678356682eb3b3a4b2ad535dd9deb49aad9f3c463bc92fd645116cbd8716e036f0ba39248bdd5b7e85eb4d8ddafb47a6e78e3240bca271cd78d6cd84ad3078371764e0af07f6fc7d4ba49bb6cd7c2ec7bb767fa40db4e565cdd1c74cac68ed902caf5cf644af95f8b48b92663657c90732bc322e6b166918579fdf8e27fa276aecbc6e1a32ead180c4b4d3307fc92c244954f2f893bc9fd
adb4286e40ec9a26c9a9c9346e39c8ebbebf5803d23e0bd2951f716ce6e60725ab19b3f4e8b087f7fcde58c7b1aff9beceefd71d4a4225ed642c85b86bfe69e65078a85d05a3d4f27e21db4a8ee648fe38975b5f87736fee926742052aa7d623e5acbaf7862a678150e52d11a3ee558e1703c915fb1177d9257d71e992585d6196fe3a5ba368f5afb8b7b420875f23ea2bedbaa68195798d16a1ef5a3f4c9d336a9d041206d7ada3dc58f72de8326e2af32ba6bfc14e8a61d1e8fa86a11f48bc5191b54b47ea206bbdd6aea5a752f29d960374ef3a99f543c54545c7f98271535990bb524ff93ee4dedc515877954e05567c6fca47d954c2af2b0a316ee75de4214934f7c4aee14e7aa09a1faa196f25313d2c49e88a89814559c5b8fe0a929859e0034a8e411336ad425ea2aa4f49371d346144fa8938e126fe09124bad82b7b7631f1c6a669aae2b69092c741873ec93bdcde784848e75f6214da7cd2fd5afb78ad5a659a4bb6d71edb931899057d2cbf58db8a1ec048152aa0b29f617e8960692eb5ee1972cbed0dbdb76bd1d2dc5cff79ae4f347a718a9d545c6ebb483b232350b3a299fdfaa711ebbb97eef847540184869c6f320d59147d4c0af119ec3e25a84dbff0d2fcf1af01888fbf8f5aa1c19a70198767daf38de31f494e09d5e3edf7aa16fd3827e4e816ec4e3f098c7b80e34d6861c8e7b022a948860ed6e2efa0864499161598c34a712aa675bfdcc635a6ac7f80888154d01aa1eae2f1a8f1da02a2d8de48e86d4245c78a866b4b320e1d129269ea26b350845be0ad4352d2c36e532ddd6b100d765272981d8cb67c9c860235a1dd87c67e733c0af7db4b4aa96fa7d26d52974b541c875629805897893cf61d8fc212b71ce5bf8618bf7aabaad0be0fb498c007ef08414c3365dbde4e79c3346b0de4ff51f7de4575b9b9a5383f9a690da590d3e905fcd4a8e29cfb3efc0a19b23fa121352f79d7a4d2330a936375fbadc4f1dabf11e3c9261e2ec7e043c2572a8d78de33eeb1144604dc3fbb562c88da68eaf741bfe898476ffe04b644c433f82f9080bd624d3dfec2325e205f10459c5424d7a39ba51b4f5c2ae58e2cf9dfe09185d1b214359ea5f81dc523c683263c62e0d1415731c94d3ad796dfc08e1bd20f9e752f2fe5613bde4d76867333515f41a6e15e56e1d2605390488fa3ef95dcc77113fd756abd4d3219c4faa567a121fcb1309e25b7f19666d03b60317df116af6c134fb8add8de5345bc71f728d817d1fd6c13debbbdcdc8cdbbae02c7af517f6ac4396e7cc6cdf17563678eefb2393e5bcae48ea44aee77c6b495ae05acbd2917295b3c32213e5d1a183b3e11bf654bf4676a48a76e39c684d7ce2fa753bdf95c3d2db1aaa7242393ea7c633fad46fdd511a1da945eef2c515379f536ad36d2c40ff0d7f8fc4ff2eadba573c723a9aec94cb9d8f2f42b7a22dd39acf398cdff49f71fdaa5ba7ab3b45bb9d3522df146b1fd1015d313e0563fa785f43e80d9f594569b4fa7a6bc24153feaee394d9958c8c7b109bddcd8f8a93c5b5e6a149e8d53bd9ef2a84c924fad58c2bf95b8278298a294b4f8786758be8a967f9c1a85634903ad4ca948c573da1a434821dfb46c498b77f6396d90512a29778a8cd2273e4349ae0c2b5d3af70d786a5a90c9c9b67d472976fff499307b6c35aec74fec1b7ac62007b149632350d8bdd51db7ff9cc7318ff22eaa448a11eec441985351d5072cf3b4d6c3cdec9acd9692093f8c4c3cbd42da9d70de1a285b89e7b99901e593cb47bd9f0365bf84c3db31e1bfdf6709fff8f023363c09e1570e48cc58a1ef06e127ab20dc0a08f7ea4fe574b8c32d072ba9ef8e74ce0f99319d4fb90be34589993d7008d5682f089177432f9a2ea2aa82675be63de09b19a3cca2c4250f587d7f0dcf1bf13eccade84088903cda6259a7a9ecac566acc9ab88d72152ec5b3bd6b21d4693581358fd292bd32a22645a9ec14635adb8f472926b4b65f483c4aaff95eb0e2637e656ca19c848b703c6e92a7662a582e2b9fb2f58d21211712a5223ba210fab2515aea84175f96488fb3a31423602539f43ef2de74aee530c4373c87faa9ba47cca398ea8204fafdf088725231f46b84579b253057a57efab16b7affa07a0b17ace9a11a3b3c27baf45a30e5fb9d33fb3cbcfad74b5ff480ea257c87293dd6f5c04443d45354999732f07c454694946f47e0b8a00620ae3f29ff5d28ff1c291f629a4355db088a6da6ab366bcbdfe83ded407060a28adeaffc6006553157a2f83284285881e73e91038e6e5e67aa82d6c7f155748def36946f268e0fafba15b0ea7e9db84ce7406ac7273a7271a2fea46326d54310c883b5496a08f576353a1eb220eaabeddef72f24e67f631d63cd0d98af4c4fad8479f90c5c62ddd74c4d0bfba8c81c93e2cb44e228b40435ff7c57fdf811f4632ace47f9b8cd5aa7d23a58256e262e6714e68a1035f773fe5258890c0961de0b
7c4de0fde70ec2ba046a7a3ea73cd5503f9371ba0fb3a8efed51ca94e2ac792a2a64bab07c1aaca48299b6533bd12b29a8abc0c483197a58f150c7e73b611d38ba19d7536ff5901c6ecca1972817e1247abd9328f031d86c22f22f1ac80062ebb4921c6aa36d0fcac5be073653bf85102559a0e5a7182d5f7ff19fd5f1b27394543881d696b933368d97cc561593efab4ab2a9992da8d05682a879cbf8b8b5e6415d7c48dcf8fd77244c7b9c82e77b148eed8696fade16a5a8e7cc1e2ff1528d873ed0142c373dfffc982e7b9d67daec14ef8f9f0e7919d5473894793a124a55246e1bba75af9fe9aaffc740ff87e4400f533e6e48cedf8f51427b9ba4897ba2162432f5722bf1f97aa1ae3fe85aff5f41fd7be172f0a28ab5f3547c35f4ec57ea117e7f43cdafe1596ae43efb04a9ec0a336408f2f2f375d4b5c72bfb12ea7700b31ec66f234894e29f1aa59a8cfbf5f5dba9c98f04ccb851b2d9f3506a54dbf83699774d7ea92c81de37a4014a39f43384fb53baa43f5bf4b8c4d29f1e00fd976724e6f480fe77f4901cfc5c620d639e4cd3b4e6a7f829679eccd3f78f16e0770625a07343724c95e79bad4becb39fea2bec8573a929851ec61f6b3f2e93903cacd5db82cefafb13fa04e84793853d85fe25958a11dfa2a32d5803aabbead36eca4de8acb7259cb12696db2a8e3cc39af8f5b33fd544f94e68df1e38a4651c28b4796a6677c4dbe6597e20243051f1e55c265792a39b159fbf4454d1fd6cb92df4b35ddb6f6bc45d1cd31642f1f933743aca3f4a9a634be0fdf7b9b930fd587f24a6e6dd36d3dce96daa80ded368edccb0b5f4ee3eb3f9384da687df7bb50452fd0531a91ecc8e36329aeb590d9afb7931527cf912a3b973af77ed6fd6bd72734d8ff27107b3a719997ac3e1bf30a6264e4ac47ba399e6257f5e3e45b60aafbba32fe12704d7fe8544360fe7b3788a1139c87bdd4e3a9ca9b2fa40177d5f6db2137e46096db3ae692ceb6e1f63cca3ffd141aed86709fa2eced3e493bdaa11b7b66a44406967b832c392af2f24a6d4a75642784d17ff99096017837707d8dc7607d664b551bfbad9dac2fc23cc492af9c229e651f7fb78d74929417ca118e331adb150d1b38187dfc07be906b9969c4aabf684e7c5315031342550e1d4c0dbb24a31217fea90f492db5f870dbf3daaa6e4fe1689f77def66bcafa5f765ef9f52cb983d17e4535323fb45be2cb6ec062bcfda9f4e4d1122da86c9f31ac7f2b6996921293cfcae605385f5b942835c99587c26a741bafc1144e0000d7f2c312d6d44ca34c8e99bcc1eb55ee9cc2f32bbfd86dc1f7e67d46dd1632654e864a6071f28dd48ed8c9c1499365adbf756047ec34aca15139afa6c5427f951efd65a7ecf51b1bb44042605615e1b6613d89f913a8d46deeb15634e39f0f791c384fee39deb9072e119bc57e530c5984ab9ce9f925d41697bd4c382562bf6570cc56d2e21bb91eb2a8e1630f1fa4dc831b1bb6004107ef38df39707dfdf9fc5b869c6e15f43e6dcb1fc7afeceed21596c2e26b0f32632ed2b1209df48fb52d7776bcdf3dfc531f108478585c9bea7bfa49f94eceb26796cee2618b0ce142069a52436286d74ba9cd2a889a4b1613d1b8d5b7c7b86b78da60a06f215bbfa138d55fcbd8261e39d5f22d2fe968e96e7110325bfa66ed8e26bda98d89266d4f5356514b4be65c4dff8fe509ab6d77758dac8748706988ba4d4076b5237dc2c1d02ffcf965a4e2a62ce28c2bfee78651d481fcefede55f3393e3fbc9157a87d6c969235dda9c8261e3e4350134004b4cd19fc06251d2bc49c1aeeaf40f4371ca2749b01192fa31996f7e7a3da42f6eb4c3d1b5badf52429e89271023a5780f7ba9be78bbf5d9d75920a2191fb4c6a6a13a2b22208f750fc36dfa546916c84de6795b15f8625c6a2fe26972073127d2a34bdfbb24ee8b718e17d3520ff8cf6cdfd9ddd969956edd77112ddd4ae27d1d1f7f43974fd32ce59be847e805d0a0542f47bf1150da0776466eedfc69a384033299a39af504ffa1112a1a2afb0ddd4bb913e41e32db541d488f8552264c868443a52b1e634c2e7a862bfbb397ec37ec7af620b6610e4aba3fdb13b7d5a07f8e143fd51da3a24f4a3165623cc59d591a6350c07402ff2c35f8608c356f0ab45489f20439bca31534cdf381009d336ddc5cc3fc59a3a745ca61395eb4d7d2a5b5393dfca7b60d48b2208429d563d00b96d2009b760914aef24e251578408734770a8e332e98a81a85cef7a0f9f31a4c87f8226c95a9c557a93338e03a7510b2b653f4440ea69552494c2ad2a03bd95a7d85edfee569001bec16fcf46bafc745a5503518b4cba81f6b3b61e49ab3390c99971f783314143e8481c8f627b75bbc9ada95de12ea419161a420aee2d32da2dbf1ad16e3db0db4da114624ef1c77b17ad92b5381d2e480b56a3c3058afc46549ea67f224326696dfbf3fed6e7617febf3b8fe084715a5cfd
4100a370229dc09942f1c214e92f2abc5a8c5b9c54558afd82e6a376d723243fba80dc2069456294406d113bafd0ed3ed877fb155927b79fcab039021310a1d2f905e25095c77fa1092c02d74b4e0788162cd134857b1fd49fbb4d2b4104f02ef77242d90f125bfe8494fd0147e48ad59526fdab2976e61a15fe77986ecef676d92bb7ca7fe845b5023d1f1bbbedff5d2349927d1f9cb3ef52329680b51ac013d83347644b5c992a0c4b530320a09aa57936048ba65af6279b3f0ac9e0c478a1e0224d59c34af72e5870cc6a5829634f5fab4ddfaf768c51a21f22bd593be482fdc45603f64bb69abb1bda154af8d408ab550c34edcf058ab134a7b2fc727c7e093cdbc74072ca71a3e0a78d0c14cf8d52bfdaddaac8ef386927fb373e870fb073f84406921048fcaa9e7e9d437fd95a206c40f1621e12d0707ef0df8cef51e3c15ebabdbfb692ca7a0e1dfbb85be113f26cd6c346afc1d4fb405a7a86ffa29491c92442e0ede05f49325ae43a3d24d1115ade305dff99c35422cdb8dad4b4a57cd9cdfa575885f3bbfa944b53b43f1ee8b4ccde33d843c89fd5a4b6dd4b7e3701e514932e8755cf1593c6397f76bc66737e35d6766dc09bebd5f37249dba24b41b6ff3b9cf0c5d8aefb12954c514bedbec00541e542cfc8ad77bca789dc6674a3ae5efc52f0c1a816e86146b1a79af6c11de518e5c83b6ee4babd611811ad5b8eb396c7e2e941bd32ab05dd8588b5dd1e9b346c65fd2084db6c536f45842a9e5d79c9a33e5234c5336b51706f9a2723a85de01c5822fd100487713cf3a5dfbdb845fe75e359316ed5c8477ab39903e31a0dc68890d1dbf39e2668729f468d288bbc6b410e81fac182eba8b98f546f85a78b4a15f87cd944a1e3df9fbbe345904227ca51264435d2279fd34d63a3740a3b2ec7077ccc4ef6f4eab0e478dd5f83f8e4d7302874a2b0847f83bd1011a21ebd7e7a02958d73ebfb4834973c4a37f837100d4061bd76726fbfe2d69d5bea8b149552a25855d5ad05861fed8fd6c6ab297360d746588262d6420da74fce68a4aba45f9c103d18ec836673d1985a8de55289aeeb9b6de02ddd952dfceee14f0b34a1846e23eaddebcca49e1588930a3d6c20ba65ac62e1ff0787fba8da65fb3a8d5c2b62bdbcaafbe8d0a65ccf909fa0d248f3a2be2291c083ab5843cbdf636a27a548b06d0b9da74a18353f498691d0f7a47288543032a2f9590a66df5ad161fa1eb3bb852956905b791e92ad9e1263f12dd9016d29f084bfc0a4af71805952a1c6fa2ac8c16191ff7eb22c55a189d363f6e77fc45d183403b353696981e461b35eab6201b8da24783e594e14ba78c58db99fd95b0ae8fd279a78b5af7a71fd00dcf3a9ac56ade0d989d4af0778454f55ede8ea81619757ca050afed8fee0a2c6d50d0febcc651efe22f165113f57a4bcf42a1a483e1178ad7de3da9741b87593eee3e944d93400abd135b3799f37ca464329f4aad77a62635f508535369f52e52723a9fcaa8771e4d12c949a3679d11aa15eeb7d12b6a9e6c18e9727b966f5a703ae1418ac87e050aa76ae45e46a9e4fc593e38d7945d134aab4e270668f0a94b2d96dd0fb4eebefc605f62aa8f7e830cedd980fbfae57952d1233135f930d20cfcc2cf0def7afafda91e6921be041ef971bfd29087c782b49070c2f0a41eda93caafef416d9538ccf3e587a41319e49e82b46a12b5e451eb49076aaa9c60734036759e3c92863578538565a7f9d08f85921d518a2d021831a9cb03bbe337de4d6645fe6398c1faa29b02c51a3c529bb63e6e9f76885bdfaa639a808463f8eb2a2aa59e27251d48537efdc3d15ae4836b282c11d7d060a8a1c164cbb1aef566af16c97a93370f8dc65f91914ddd679d81581cb08b2bf8772593c00509c0e5630811e54692c995c6e7fd3a825f4732e090146a70c0067e10f4e9e57836a27010210d597ec8d4d7e6e11e4d67cb8626371ea28224302b10a04bc58afcfe448b61ecf21977f8733c51bfe621e94c7f1b7c7b4ecd709dd4d69f1f08b37db6d60ea7cdec8f9436fb3b4fa63fa4cb9963a9c308aad297df4f3329fed63246f778ef9dd3eec9e0adda194fa8613dede76139dfec127df612f30e2b2225800a28429d36352b35011d7146a7859c15f4d3b4f5c43aa3701e2b64fd44bd55ba7c79bf86010f026b67dc14354eb95bd57aadf9c6e35b781fe9f483ba5159acbfa4dee5173496564e9ebd7f994dfcbdf029d1fba36d16d5bdefffc19e0f441f4e8327a49f25edf789531734fd34e51dfd5f917e6207cbb166adebd982089ac72ee9788632069fefd0aa35aeb51dbdc39a61672de9d624d23b9130ebc7442a7b17ef66a262530bea1851f2761b997885828eb492031a3bc688fd3b8c8a4d1284739565a9b7847e1a3b07766d1ccf9fe981d2667af0a24fa68578f00664b77c931a22954ce1eb6def09145e4f797a328b6fd1e1447356bad4aebc3d650abf329d909cb7735883d8
7727d4394d267b8fde490b392798a269e969b8f7d46237d724dbd27a4ee3eec08739a14ec8ce553627981b56cc59a7135dd6e2a760e706c2bf4a237660f36ee89883383689a04fe65de874dba639867b7a1823ce75f48e6b9482229f574ab3b4b3c4a6b5f5ed1def7526e41f727798e562caac6f67c2b45db3c4139aabccdee533d648f7e13c3fe3b0e5fd5a06344e6908ae15b5053e987177cecd1badb79aeb1eb36762b1ba219f273c75ae4337ca46df5b36257e7fbccdaa7bd15316ed5f64f35edd07fe1feef970b09576a0d90484e8a769615bcbf3ae25348a1c7caa43cb96edd7594294c5509b248261e73b4eb408e9d484e4604ab7cb1d34c1b5a5f3ac9998bf97327147982cfe7c42e737749c22a3fbb434a1c5e23bf9a52f8c9d1a51f376a746c8dfc6da80b5824a1140dfe98176665399644f5e885e9bc597668356ecbfc72bd1597ab128b3980d5936843f2f0ca91625f532d84de5a785a8209630de717267c18c74aaae1529af68f9937aaad664413d58f4c6c7c7cea1436f06516f17f172908da5662674baf90ca05e2fea96340287efb4ad192b7650acd1219d84350f37398d6f7171d08bc2f847b5297efc2a35e8eda55cd39627ed299e86919f439f3c85cfac9d24a4a22fac7b7a3d6bcf8f34a4c7abda9cab681ddc033a68d18e9aac0e0decc368e0398da5b435098319add2d6dfb3d22a26e70b0da089970e59692853eb9106db5d3ef38f5934d4123bacb06e295c25c8e9d82a2821ae01d51ab64e5f929432ba391b7fa379fd465349b69e74900c79942a7b79799b4c3134057fafe95165bef64c31c2b19f6242be5c3162988762a8cae3d66f75bfdefbe561bd7fcdf0c7254fbd407bc7654f4ad79d1f13a4511717d4a38da77e3ac57cb398fc8562a8442476c811b3a92ec964ef6a5ea5ba099d356af7b25c3595f3d4f9922f9563830ac96766bc5b7d613a8cb0339b905e50c44b1ba892658abd2d2c372f583b31fdd25a939bb0071e45735c32c50aaf0667b67f16806e8989296e05b01a0e81b99733ccbd7e10213c8b6fc9532d52383688937ad949609a8a79d2bbd878093e8432f524edb6c08cc2438cdbb211af43dc9bda7324628b1e0cb09364b2f745284792e9c2a6e9097960ede5e635bd725c2004d39fac4b9b5dee265b0c69d92dfe1f6ab304ddd25842d5bc44c984efe06f3aa89e367836c453881b7ae4aa076451854fd1786d3ac23b2b8dcbc26fd9a53645bc4fd5d4fa96506825fe27e00feff4af9034f490da0d9352b37d08fa974f30278d93daa9a4c2fed46c9b77fa653075e597bac0726ef7ce18e44028e4021476e89af6ad636932425c10448f1f1f53df931fe5b830f5b27b2d5b4a0fc9256a9e70829d6b47591fe647156aebc5508f76782dea7da721513750358d29e3679bbd2facebe520b173b53068cb3e7fd9c5527f45f78e47a5cdde2130b4b5a07fa88bb56da824979af648d051a7f5859b2444e10f1295be40c2a32e930e29ea312ede05788ebc51c2aef292eb4630798cc864b54aec793ec7d29e5acbef6a3fe0338024c53946558e64904b5d2efba4dca242d6d86e91446a8ff953dfd6f10aaba621698188b7459294e65d006b27bc2b6a30bd9f4c4cb1d60833b7c3f4cc0de7a0314fe1d084be2ac5a3966ab48e2c3776ea02712a473244f396511644bff7fc88cd85eb7e9dc58f8fcbfe1c2a4bd283e2091d185f111f5a34a6d357683bb5f6d36e1df12a53d42e124ef9b7d5412f13a036adab6fb773b5d42caa1c2fc8456fd91544d1bf8d664ae7f74626f7d63a7816ecb0668346ba7eb57d79a21b39b0e9abedb8edd8fab8044f029d1b4d1c9b83ef848b0b66d37cebb80268537dcee2543bd748cb9b75640a28ba60e7daf1a6dd4cf916f11c5c0f6ad6f5b2cc0d8af21d24eb7a1decfc4d3066bc200749e7b92353f6a66f2e49a695b2e55f9c637966ce5a4a8ceebb36429ebbd12b18c7866e84d1d453782fc79509e1176dd1413b9b8e3a7a003d5df7b4f91eddd3667b38d2736497866e06cd441535eb739e5d2f31fb6d75d1290863f98eee648eabb8179bf64983e62b150efb55b99e7cf484bad828b2f84b26f6e7f271ec2184230e4db93759bef5aedcd20bf7ea173222331e6a2c6984be9252d4994a72d9b00796d094b8097568c257af1c82323ae69ad264618e98e3d491c713d0d758e272723ce46e2325c37ae0be5e775a2351b80e733c85df6ab834b55796a64c33bd5df4c4aa7c46abf21db62adfe189a56c7ebed54cfdd448ebce86fa73f30db89f645a3a20fcec45dc9be55beaf76a8babd46e7a5b5a484ab7add95ffd0a4fbf25ef5f88d934420f59eecabeec742f3b01b1dae60e54b836d8c0486dfbe05788c32eff576c4bfdd2027da6d815d6ed4c7e3e3339a43febec397d44c3f6ad71b1d69cd0379246e01eca5449b677f6b63e7b2c3aeb37992de3ee94a96dab0ae663566fee28a9bd8f8352bb952808a19f05750eeb3bc26fd
c1d654e01c16ada954951d48c22418e649c65b67299f215f2d648b02e2fb0cc745e6774b9c33c697c3acc0bd497ec8acfb6f04c5b3f6bc1ee4c0f1af1da70cb781cd670d0523b353b71d93503db0e75e47f77676feab3a5c3df27963b9f4ff0d3cad6e701763461caf92ad4078c2711470b75123e1e013e110f935a46815bba62ad4e454d7f0a2360b99ba35a9ad36236bddbb4cd494d25d78b6f69a5a21a9e5e98ccc7dff9b4c9f0575fd2c426929a6183bf60e5e32fb684fd5b0e768c4e9b3b7359936fc94ff2a68efc6eeccc65e82eeb5c96f54f5397bb6924f4b787792ded7f37da75b41dbac3d4d56e6f8befe4e2890da6a936ebd22aee9af5da816d985f849fe25bf4773e9aec1d4daf1ca51c2a787cb0a3596675754838107f61647f50d81fe7bcb294aaab053b5c82b10715c9d5c8de5848f6448aa267480f65beacc56525eaa5e474c1d87dd20c19a204d5e8a0f681117f07a721715c49cfce1bdb6469a7a17f4daf43a6f79eb62b9249e8be35e4b0fd8775bd77b6b92a9225885fd162c6bbf62bb5012a53457d3bcc5d363d157b6ea142044821267a288a6cd1655dda5e5b343e3b8ab07f49d15f22ca574bb515449aa74e4425fdd27132badc5531f453d1658de54d70d92ba951f915065d806ae726c5e7a3897e9a349881f14710fdd246a478f0c7121e696353e4fc7184bcfdfd773e28f9a0cf87ed8bdf5952b204e66e8ad4648da7a2ff57227a6fc052cf97a28d785c36499b5299baae459883cadc7b9ac5bd24e16c3fed21153c5eb6330a664ff21497ce1954d1dfa5368494b86b99ab95ad354984f7f09ed74dcb709d5ec67b14f1f4a2ef78f49783dd2bee9becabef5fd66948ba8d325ce92f5f4d4ee47d895aa3a632ef217c2a0675251de67702bcf7383fe2987bd8d8e57662ef1af7081b8da8d532bb2bfb06af1509bfcef7e862360fc959f82dad50786b197e7bc48cf5b51473c7f7f38b863ab8a9725c8c083fc773ab8252cea5a945684b39df4f84f0fe4eccdb5d5770d1715cf0f97513923a77acc2bf55e57e73e443bd6c2c3a5a90498dbd29f41ddfb33fe1267a6cf610d988347e1dfb7a663a9e57f4f7247c4ef52ec23bb0ef60f7040dd5ee6b0d6b20fcacdf7bba846d34e1dd8c1a8dbab35bfcf08e7e1392f0eee21b6fe698989d838c02948cd3d7870827e94f3e361b9e1480fe0ad168d1a38fbdb4d4cfa705caba3d881830ec1cb6495a3d24dde916d3bf91f63914c34f924fc4a1a5a2a2562f2d0eef50e6e05358e5af2acc1850f6c5b69e65469143729b73639e5154f35a655e5b70cada545942defc3c26a7e33438a78ecb4b1af119eeb805bcaf789f2b4f2cd46ae507ef1e7e70b4e178cbe936ef6f9919c6de60afdb5e8f32a9759221192fa38cda51ec6e4445829d51f8c93910ea08b1ed87274f084f9846889fb18467d846728ed9235043263963db8867de1b287d05e648c29348c2c8f6bb9c31ddc1f43fe28c59769249bec172c696d1b53237ad5a4230bc317ca2e4fa6fd28c3a626bbd5e88f0d3984cf765bb159f7f8a145f7f8a5c6bbf502be42da82dc4fbd1fcbf6126aba2a885f73b0caa47039e6350adf3b130a8b6a0df67505dd058f64d285b4cf5601854a9893483ea9185415533d58a4175ff2f32a842e8eff0ef776550759c0c7b9e6550d1f55214e0d5dab186ffe65f6650ddef6050ddff7f9941757f41d40b1854f7ffcf32a876b9fe2bfca95b515df853b368fed4dd17f2a7caae7abd883f8535e66833bbaa0ef857f953d6b1c8fdfe15fe141dc3232606a47a317fcabab42c7f8aeea9eaaf3dc79f0a78317fca3a95a2717fc09fa2e38daa7c8e3f15f062fe54e773e733f239fed4c64efed416f41c7fea32e64f59851f66cd9f5af387fca932c7e7f8533a2bfe94d4f67d532acd9fcaaf7f3fc6a1a8fb7f843fc53cf1595ed739fc29a6d77993cb9f2af0c5bbeef82660fe14732f48b0e64fedb0e64f31e3dfb43fe64f31b1f45f8ef953ccfded6556fca98d7f993ff533c39f829a67faf337f6a7532e983f2561f9532d88bec7aca8f5e42e0b479697f0cf7064db42a8e47f9949b5fdd9009a3fc27ccd1ef0b22bfb962f95e5a68f2f50d37a3a8f6652312d30a98349c5983d6826d575999afe6efa1d2b26151d76e0bbdecd05fef43eacb31926d59eae4c2ae6edce449a4945a78afab34c2a26572f7730a9e4ccd7837f894925e9c2a462fa0d19cba4a2733af02d0e938a7966fb7098544cbf25c5bfedb14caa6e7fc0a422582695c38b98549b7deecda3f23b98543f6ff5609954d7ff4526d50f7f814935ed392655d47f8249157494786397ebf262afec309a471540f3a8e6d33caa369647e5c8e1513df8a7785413383caa299807c5acc476606ddf9fde9547b59ce151e5ec627854a7052fe251316dfcc48a47757d91358f8a717ffca73caaf57f8d4775ac0b8f6af69ff0a8fefe9fe051e5d03caad9ff091ed5b017f2a878899847b5333ce6
d1af3f5af3a878899d3c2a5ee23fc7a38a38f86ff3a8e8bea6def98f785469233a795403d83eb948ed956dea26dc526cbfde8778e3ec518b7de827ff511e159d3b6db77f994745872725ff3c8f6a13e651e5d6f794a8a9d52fe251e9d35fc8a3cabbd2ae4fb7e6516df3b1f0a82efeb8e777795438362b1ed5ea0e1e95eb531e05bd109560e1516dcdf1cee8f8b5d0eb775854cb318b6a8f76bef1b60dc3a16aeb69c5a14a7e4c73a8be28bd6db3c32ccc1ef66327836a0d694aff3306153d53c30c2ae684bbed94731706d5643cd35d38cec2a07af53906151d1e33a898f08510fe050caaeecf31a8ac6788c91b29372e832a8661505576615031e3d7df680615ad0bea5ffe8c43d439e393af855907cda0baf72f30a83ae329caa57a320caa42016650ad717e11836a338f195ff70e548c8095e4d0fba864dd9e2e0caa37ab0bfce8df6d3c5fc411b14a2f8372c70caaf2119841758f615031f3911fff7ae97d522957864175ef5f62505de8f81ab02809ca4f33a8eefd1506153de7f19771195456f1ad84f27530a8f4e8771854c7690695a3207001ac4e6ea91906152ff1abedde8f180655678cc971304fc10caa3e0c836ac7d74c4dd7767b9e41d5194abe0cf4e32f30a8b6a01731a8accaf31ed4cf1f31a898f9a0e8390655f98b19549d31a3484a66c5a0ca5a8719545a5f86415560db8392d9f7c066864175ef9f665075a614fa2ed5836150dd9b7b795d71fa4c0b83eaca6354982df84b0c2aabf69803cf771706d5bd99c582cd3e5bb79842d6dc7a017feafbcddf322d76eab7df7d369819fa4c863f859f8de7f95356e59961cd9f3af44ff0a7acf423f0dfe14fb1fd1d531f017f953ff58a357fca8ffe5df7d68bf85338f61236f6a2d7ff127f8a7e2e4f5f7f9e3f65a56faf71f9535bd05fe74f59b5ffb87f8e3f65d5bea35ec89f62ca39c5b31ceb8d4680b5a6aa40f17921c2bca6922efc29abf61bfe97f9533f75f2a7ca7d9fe34f95cde45715b0ebbcb3983bd5992acd9fa235d7f38c853f55883af9539db9f119fc62fed4ad28963f351ff3a7987577f871863fb505fd2e7faa0af3a7acf45d61b2b7f0a72ceb6e1feac5fc29abf6965bf3a72ce1cade4df8fa42229547a677c97f6f980361e6547edb3d5883ddef644ee9059839f5eb7dcc42c2ec11d1e3c0dbf83bf3e1e9aea72604e8ea0b42e9b700b6ee33e79dc45c8f1dd9bb2fba2f3ef40d668ce85b1e77bf904ef346e69222f7c5e2d0de5bdda753018d6867477ed06b63efcef3157ab68d9e77b293c1329a007fbabd08c7a8c9def4cda76a6ad363ab5045e385373b7923ccd95c9dae3ee6804b6d334f19ad992aec6e62831edbec2e65cac2fe1e7946f4d82bfd707a6bdca69b847f9626b5006a41d688144502fcbd87d8bd8c39010ca7e0468a50fe793df9a83bd554cfb89663d6083e2b59c89edb6ed09ce231ac02b666c67aa53b9de78fd298a564b260f9e63067eabd0a815eb40c9f8ddb7a1a594e5780d51f9ffab502e1932f2ce7bee213263acf02c1bf582826609f429ea96f5d2b3ee302eef9109ad7624cf163d24bfea6c5e8bea8f7f2c056cc08c0cc8019cdc18fe73c7ae7fe823b91b717d51cd43db8c81f2f425e5943d253a61001b96f48a652628988ffca4944519fdaf2315f27919c7cf6a260badd62f7d0e2f4425e6e5e610e09fd8b3b3ab7a1f8643a9f7a720515a6cb50717a7f58cf913ed1157b3433346c79bdbc729de8b7f64fe9d64b0ab2309814472448b14b82544569e36c117f6c3652d9ce36da2d8ecc6042d624315ca126238410d32126ec2204df456e2d4e5fa7520cdd45f89e74b397a042810451719f120372c5a18b32aa2c6b1bcd3cdf3d9b95924d3ec5dabd2aac4bde5bf139f2389c264f71a405e9cc3875c2cfc262b8b56ad2fb3bdfe77d70ebbd498b772ee62dc1ac86019833c5e8a5d62b57d560b7d862f67979a411efe7e811a5bf2ae427c9da42cad3a4df8bf886e63b48d9dc489fdcdd98d796dfb696da043651fbd0d87aa19a9249f9ae4594930956d515a837b6d9b61051eecd88ea750a51599588da20a67d2b47d7a0711ad021f583d2b85017f44a11fe428a5a7387875d898bd259cd3c9340dc8e6d857e948b09cd5f2d397d4cdff023d63c7c12af1ced582885bcb5b9d079ab16f1952b5d08e56f90b74419c1e6ad970b6158f809a272c5229c4a4b48ef7d94b38c30dca940a68de2764a2a166177fd111724a9573eac4014ea26a00ac5885a07799d4b121a757e397336725bc8831b1af5f5522ae80e6a0b696b69349a6e45b553eebf21d3d23dad296ad347a75a6b8d20378b5bef1a4d31bf81dd4da3c9966c097a963239c8f8afe558a7fedfcb71efe5c18f236f5b9eabc0d619cdcc737530e770d66f83b16eb4cdd4ff88b5a33cd9d0f40b4a19fc2047fa63233fb7c22b0befc44a096ac5d2906bfcd502bfbce5091adf0b3d85c21f72bad918b2f722622a1170689fa17e17f
24aaf3df169d900963b8a4e139e52d2c9ac1795f1f1fb40fc959494fc8eef96351a15934d3e8675be8421e431ba9ea3ece54b347c70cad690fd379f69ec9e8ff2e3105a2814249516a68f6e35f45c8788290da5867a5fe28713fa0201b13fdb7eaa2ac074826c4e0a51a5291fdf8332482b6da004ca15b5a84eabbff6943ff1b89b5084f36e87cff6d057fdc457deae43cb1384ea409950383b475f30010d5f8f771e7e19fdb0fa9df56dce63fd9593e7127ab113a2c2ab1c64a3f03901fe1159a2831bd64d3ab8213523c5979892f28670ea5823bedb9f2d99ea683444ad23765f1c56d68f2d77f261c2530fe59692506efc857f60134f2afa8e5fa81d858a45553e4ad96364a8fa09e1bc180a5ad1f2443f5229dcebb3bb34928d21f42053f625a56eda51adca75f761a67ab75409e9482e4a6174a2024964af5605ec30e2bd5b093fa17a6229be93a87b97daabed4b85ea1d91de32c2afcd9998ac0af0886c0b4972d6579244c3eae3ce2db995b99ab48f5cda720db54282821e294eee8a5edcb378b2fb1a1bb2a0a5d2b448faae0d4f5f45f20ca55308c3c1df90f2fd7ba825cfa051f30b57778735040f1948334f49fecc3364f8f3a5d784bc920da33653094f616d982af32ec0e1c76dc7312ae3ee21431384ce955ec371f279ca2613e24987e80d9a7a88e11e4f69730d6279c2d3570b7947371c2ea01aeb11de3779dc66fc7bb3f406c96bcc533655a25499be52c03b5ad098230d11f0127294b678df650dcfa0a985780a78fa99b6bc7e05473bf75d5e2fad06ffebe9723db98d941ffc8294da7ac230ba9ef0de365e54e613bb59b9229818dfed3bd58c0d06f3bbc830fa2932c47d8494ab4fa0f602e5387fc2d03dcae77281a1db4355aab372d4056418de44786fb8b6c130fa0641f81a56bfe343ad682294dd2eaaa423fa2025791829337ef6512e8576d794f8188c1791a1a40419b44f7c085feaf14524fdd20995ac371cae44a30a0c4baf104ab242a5975541c86ad528527a6524df501f8598fddca7adbd94775828957d89cee751734891b45a8634d952a7c7fc1ff2cee60993f323db9c4f6fa082bfb3933625a291788fe8685744e5dc15d8aba9f4bb02fccdb6b4a90939fa53820a8154ab86f235c1fceda8d673df9ccc01e997b50bb43a7fcab6ba4fae3f655fdd47e36f92553f2540cb7ed81f9c79ddb83f735529251323a9568b7e309a7a8bdb715ef02ed34921336449b2fcc85539a737ec88906a6bcd544035ca4d933a5de1cfcf95868c405b8b756a2aad8597abd667349aa9ac1684f75f4d0a3a9a218d9e8dbcb49b8a4dbdc4ed84dff95293084becebba51e8ffe93de99344d44feb719c2d8dee2e0f4a937c97d74f4b6d90f0a44f9af07ed2c4619ed45861c6bb49f72b70dc076e2e904fa3d6dc66eca735b9434e83d57ca9bf33126669fc0c19953e52d287a42657dbe9858ff86e5ae829ce55c2fc0609366d3354d623a5d315783eabd1b45c7a1ffd5c65c86da44f4c40bb2b744c2e56b7e0dda7335aa0cf9608f4898d78f7e98c6a01e6ae2a0fed455e05de9a91fbdc84642bfe424019b40ba54ca6320f23bd1889c76beff8185c6e23437d33626227d441465caf7a5283ced2f5aa97c188f01389705ab8fea1256c45c7a5d183d1c42ff4b26abe7dcafc5cfbb4147f13121dd727ba22530e5976dea8acd5a2dee56c0ed35be8bdab5bd07af88fcf0df128daada6dbac7234c23111ea14ff5387c1ce45dc7edeb82312a7d3e6dc72d8ba07293f6c7d8aed8c9adffacb51db0cfd651eb12346fa3d8c817df557457ce90f8d7ce98d277c436310323c0946d299957c65dc6fc46aa1b4ef48b45a240daa8219473391d27fb3ae3cf7ec6a7d9f51885adf4c18eaab884d2669601f1813a42a8f08bc8a958a403bdc870ca3dea9e946ad158b95e0c7f557aae789658687b50495d22d5e5f50890e643c5d4d9d2605e5eacafd98553eccf474db810dcad287c4d30dfa82d388aa2617ef88d248a4e006bae2951414e894e49490abafec83f4ed00e746bee1612561785845e8839da02e3c77510e2778741a79dd165bd2786b9f893cd1ae515f905d5fadc71a73a11edaf8119f3a51cf930a90c0758ba1200329ab1e22c3b6d5e8529ec1fb3b88b39a3054d5a1634618f95aa5a05386ea27089e7671058230e2f1e46d1fe53ae8ffe169b7f84ff14f31e2bc607365e9aa8b540fbe58595b4b383e38ada3f2cf84e29aa2b2c905fa822ac82b7581747635260551bdf6a215db0f6c3078ff4c3cd9f074f5699da391d29e11e0ef6edc32f12fa990e393a484cef1595228055d1c2f14aa0c21dd09e5badd48d9f77fd0fc1ce56b358432e636a1bc2a24566c1fb5d130ea67e2f88669a5d4e633884e378d9c87d33dad9b6d1c2f7ee443d7466e90d14d206ec5dfb628af68607e4a899b11c42f1e2f16aa94eb8510632d618999981c6154dec6e66454592aed26859cd59a9509ed68762d2e25e5
c017504838834e4d47be4cb7e5970f09832603e90b60c6f93d39bd2df882739bf383d5296a4d80f4c94768f6434a448ec1bf02a6fc8dea7b82470985b3f4d59e7c4d36fef66f2d496d59889475109f48142cdd5085a8d745a85ced7aca80edf4a211b8757744b7b90695c2cc69fd99f614f5d9524a7081f7749b32bd12190e3f2494ddaf809e98f9fa1b7d106e9b7fe8f4c17d61340e30e69f84fe71c31544b94ad03f745f14419fb9ee0a32b949dadb82dbde29d1ad574bfc31239ed29d9c62ef4fa59c9c426d104c913e69677aaa297a4d208c0295d0660756077c9a7f12fab38db711e52641a00781ebd5d33e93a8a975b791c9958951afab3713ea25466a768d0db55e2cc2cf8489242f60cd773219bafdacc23d9af2b59f0943469d8ad5ff094933472627e4acc86afbcc243c7199d1857a11a30b7faebd4b8c269b131718cdbff44f69b2a6b45393977f4339319a4c3ca0d69f1941b7700a3986d1a7589d53316871efbd4899f75095bbda50026d9e51a5c29a1cabf32c35e9ceb4625d4e1948f7d66f3689a438f7be98cf8904bdb72a679991458f95e1820e7d335437d231e25116b7a421a3da27c1684a3dd38c677ff83ba31f72938c6ee950a64efdcdaca7f5b7986cf451ae87f056fa9ba29e566aa5bf62467f0d89b01eda463a2bef804b2d94ab07df8672108ba89ec20152dd6db374ac2b4af123de342c2d8531b8db20a9735fa40f31f3a5054e48bf6205f22e005d3e0ea362f6e3fef82483c7fd34ea943bfa154f90a39f74d4693355b8a7df5becf908585b53023c0c413fc00c4ba51cf52bcc202a7df2f3a41b609c9827a24bab4c3fcc945678c52756176d54c63421cab17b9fcebc6c2da59f1489b87f972765f3429ee3a9ce6784ea21e6c5ea4c3d841734fe3ba25302898073ce6daee5ea61fbe04959cb3c292642f82d76c56e29816dae52ac73f3489887439f2b823e37edac7b5248a00c9f430a73f3f555f0147923989f475412b9a9d4a6bb282837c55f724ae35fe24cf811013b229242da9ca4c1246168fc0a1d776a585d992b4cfbc8b50566aa9e04731640e799e6bc45f8bd86d206f47ef49b043caf3ebd73e8933e6e90226596a5260209a79c58dd2bf884107a3fc6b256c2d34de467c64fc0f03cea5485088ff7f9db943d4f2243f035149463e8730e51bea324c592d7548519ae88d63675b5006bbed33665df934859790329375c00ed008d1b550612e6734157d0b063ca982c64ead5bd39e0d80c4b7a8d6e308f5f05fda4df3365cf2c9412087d669e2ba373ef42afe88f77f180b8c510b733c45d77a123cefc634c7f67d09df5b9b9da942fd97058b3226f7e31d683f3a5f8e96834621b8f63259a7179ab8a356ae87d7e928eca30e339cef0829b067cf28ce4a73623ee5140cf0e61795a77bd34c5ff7c29731fababc5ae01e761ee21417acd6a74e8b0a9b7a4dd6443ded2d70f44230fa5046efa167aba8dd0d3d94a10f443927af330e8ed4c8557da4d1249bbbe3e043dc9101e02bdd8b8f0a75a83492cdc1ff1a569fbc2cfa67dcd9c7fa1dfe0841c23cbd34ead959275e607b036863917da5d77368f507ff52b7e732025e1c979bb9a379a44c9527593f995ed5f9dd20beb0815f4a98705fa2a19b1e35689ec6cde5958d94c92fd90b73c316580dfc091db9505f56876ae63941bfe855c54ed63a8aea6df826f2a1f4fd6f8ccbf24f4f728c26f2554fe71726734bbd4b213c6d8f3f88b16fdcc463e355388705f804fb4a0cfcea05b2df45a7e64a1b68e3f5ebbd7a710d620301f945d41b08e0a4e72965e1111cbf36ed2dfbae2bc6852f5eb1ff31bf37686137ed773f197afb02253a7f89eed480dd7023e03049f0082cf5891217c1e4efdc7d1393807f9dfe447e9b5649977813ec417c5f684beb3c8c394a28ef8c9929be44b6e90dbafd4d4c57a744caddbee463af1f1ba59d9f336c265af28936cbf0bbdb393d9002b4bc2b7d138f8a410d6b4a3f93757e338bd65fac3cee8873c8fa8b5a2d89e49328f88a4e024277d908888ce9b246bcbfd215798dadeb321f754eea6e37145cec8b2d3022e8b47548a3a4976c8f8db00dc923b16e2771d42be34b3d92cad6ae4b7e5ea0b4622652249e8832af9d7735b5c70e90ccd32226580becf4844658b9d977f24edd6cd67dc367cb60cf56385e4d3e3b9f8ac6d9efd85b61069b79a8f09f5fac93a3f7c46329577d82517fae0c32e9beaf48db042f1a3d276a37c92b8884f6f96ecd3a94d054f5aa3ab52d4d26e8f3e5e3f595aa0c56f4990b01b7d9e10ffa1735bf00d67fc5d3236e7a651ce0f514a051ebd08ffa4106942fbc7c3d26638b5e47de43c3ff7ad5c2a47dc33aec805a52653af97d9eac9b256eacdf66ed8b794ff1d9445c93fa1c2bd849bb01ba2ce4709dcc423117e7b21d96e28e84628ab1fd2f3f61ff20c30cf399f4704367c232d90217d4222ccf2379dd7f951b9876d72fd3493a98cc336509e044b791
c49cb793744607e51e3ace2cc7d98c12f305c7582391225fa0db9912305ca8db548b9f02aa11c8360862c26fc8c7a59029da2f4ea05580d92c45a21a4aa4e99ac6f4c44a635c2fb2609f950dae482ce1e33d45712443df6fb7763ee455873c8a07dae7af2f17ba6b5627df505fe5ae1f55c8dfae6a66bb2f379b82e615e510ae38e71e23f3cc85ca366f2d81bb5a5d83e65f2f2632647d19d1de4a163267b515d2dcc93741760a652508724e798f61b5c24519b0aebda975cd3df9985661fb29c4783cf114df1871caaf5e2357c7ca2a8573a354788c60b927d18bd46fb5079d24c37a10059de135b87f94e88c350ef08117e27ed952ebadd193274172a278a88a291cb07d48a1e1c48ef77374bd31912095f949a7c074e8d6614333c899e38e57ba153c2f787dbd3e74fbe8c325ad95f7b9eb68550924634dccf2da8d247a8ab5b864f4f24fc6a4bd9b2b129d5bcb05cc99b714ad627615b8779f4c2303e055dc3148ca6bf9f89dd48e1d05e69de99bd7f61ce6685de1a6209b5c7b1402edd1b115b3efd176afc4e9f39a30f875718f72ee58f11e3f987bf52c8272c3960c24ed29b7a46b4d2fe0c7b174fd2dc3476cd2712bc289f65395df3b9254ce1d088b644281ce17fc0bcf29f02143de16e9ac2a911fd344de10cf7cb1532f8bfcafdeed76113ee6e993fa1f6eb08f7075f2f716ffc3ac1bd2d351912418b279acd9f01ae01867b9bcde9806f017b7ccce6fd80d3800f01f75566f32c90aac5d1514b23c2e511b1b131b11e3668aeafdfa420f57cb9dfd2f88858b9ffe2b0b885f26931e11188bd2ceeb40b7618278f5e1a1d2f4f088b8d873b08149bb02cfe85fe16c7c42c43366c3c094b17862d0d5f0c295b05c25a1a131b210f4f58b26c9cdc33cede06bd1e961017211fbad273e8b0518b57be2cf79bfe7a87099c516c44d45ccfa1c3c3e70ff28c1b2c9f68e5f3f75cece92c44fe9ef320cfc8c12fcbffd019c7e03204d1f5cd9593df7ab1bd457e37fd8fdd77bdf6c7ee455718b98f95475859c6caef587995957758f98895a892916256bab052ceca311cb30fc73c9995d359b98c952b5999c9ca35ac3cc23197b1f20c2bef70cc8f38e6668e195575358b396607d6ecc2ca411cf3508e790cc7ecc3314fe698a773ccb338e6508e7921c7bcacaa6b7daee4989339e64c8ed9e17b363e560ee2988772cc6338661f8e7932c73c9d639ec53187b272212b9339e64c56ae61e5e71cf3115696b1f22ac75cc331dfe1981f71cccd1c3362f55ccc4a39c73c88631eca318fe1987d38e6c91cf3748e7916c71cca312f64e532566672cc6b38e68d1c7364fc503996fed3e5f1114b96c5c486c54647c461fb6172c67d382b95ac1cc1ca91ac1cc5cad18c8ceb882f2e6c0574c7d0f145c7418f1c47d726b8b3f18675f80b8b8d4a5812b1343eeed5d888f884d8a5f215618b13222cfec32cfe873fe7dfda6364189bbf30367f616cfec2d8fc8559f2d711cfdb2fce1f1b4f1c1b4f1c1b4f1c1b4f9c259e31ac1ccbca614359398ca98f3196746676d6abfcf5b0c58b23626977365c3c1b2e9e09f77e446c0c76981c161bfe4a62742ce4105ba1d830b91cdbcf60aa282c3c3c36220ef21cb78cb17f3b3e8c7a4fbe2c861efdc01cc5daab17c72c085bdce910cfdacf5c181b11166e653f94b57f5166e10275a0dd411d681937f4d5c86538dd2e95f86a646cd89208ab6ca038365c577f0885b1e9f9272ca5e2a36396feb116807f369eb0e1bf17ae6b005007c6ff08568e64e528568e66cb31fc45f98b43716cf838367c1c1b3e8e0d1f67093f86956359396c282b8731f5c5c613cfc613cfc613cfc6b36f0c3bceb2b29995f2b1ecf3cacac9ac3c3d8e7d7e5973fba83f960ba099dfa3db83a997c8b084c5f1f2c888786aa19579714c58b8b57b5c3c4c9b687334a840142850f4d2b8f8d804baca69fb25d17161ec54af3332b882e5833c17270c86c9d638b9951f9c00d42e15b32202ab7458bcdc73e898c52b87c817472c1d87e7452f638d1e475bbe0cadc0d88587c5878d63e75f717444e33c8785bf1c09b1c5e33bb7b94397b013404e5abf970fba60ff5e465e947a97f43b2b0ffd777ef5dff955d7f9ce7fe7575dcdffb7cfafa09318af1cb6c4cfaa3b7a33215e1e13295f12b124267695bd0dd38dbc3dfbedd75553a732fe954b42acfcc7ad8a7b376225ac89293c1e87cb17ac92d3ab5acfc5e1f2c4e8f885782d8997a15dfaa317a41bb4342e61190ceef1b88f5a158763c3718c93870d9de88957b861c358391c4b0f4e8cffe225628480632db436f078ff66225617f9cf78e6ff659f2b4733edf9841d57a7b3e6dbe3197994959b2730f229eb6f086b064d41048007e0f785fa78ed732404880024400c90006c00b6003bf0630fe806e80e70406bf88e207b407829c89e209d00ce702f03b8f45d835c01bd006e0077406f401f405f801ce001e8075000fa033c0103000301830083015e
8097002f035e010c01bc0a180a1806180e50024600460246014603c600c602c601c60326002602bc01af017c20af2ac8e72490af83f405e907d21fcaaa064c060400a600de004c054c0304829f3701d3016f0166403dbc0d7226b805411cc1801030cf02cc06cc01f35cc03cc07cc03b60f72e201410065800a000e18008708f0444011602a2c16e11e03dc062c012c052400c6019b82f07c402e200f160970058014804ac84165e05781ff001e0434012e023874de86340322005900a4803a443180d2003a085f832013a4016201b9003c805ac06e401f05f3e60edd04d480f5807f7eb011b0005808d804d10d7664021600b602b601b603ba008b003b013fc7d02d805d80dd803d80bf6fb009f02f603fe06f80cf077c001c0ff003e077f5f000e020c802f0187c0fe30c0083802f80a7014500cf81a700c500228059401ca01c70127002701a700a70115803380b38073806f00df02ce032e002e02be035c025c067c0fb8d2174f8dd6c0f4620daa065c05fc003574adef26f423dcff04b80eb8016df033c81af07f13700b6002dc06d402ea00f5803b80bb807b805f00f7010f000f01bf021e011a00ff00fc06780c68043c0134019a012d80a780678056401ba01d60ee4b770033d8aee46d46f49fce9a67e27f57e69bcdb7008f00ed00bb77cce65e8041803180c9805980858095804c4011e0734009e05bc035c01dc01380e85db3d9193000300ae00f08064402560032003e8039805840166023600fe04bc071c077801ac043402bc026d46c76018c02a48799cdeb00fe703f00100c88047c04580dd802d80f3802380db8021801fe6f837c0c10c07d4f802f6026a03f201c100f380eb80ab8076805382f8034001300c1808580f701ab01fb016580ab80fb00310579063c0608c2cde63e20c70166002201a9808d80cf0127018f001700556cb8db009f08b33910f00e6001600d600fe021a05b24e40550c2ba79c37d157b8fdd4bc07c1950071812056503b9089001780ab001bb5e518cdf87567162390fecb30047001bc1ee6064a77b1dd88d8a60b08005b643c883d5a37eac54d012af58de4d8c0e87e981f744f948f98001722bab0913e56318dfb05859b600e6161e13e541aa1933df7d7be69bd3df1dfe3bf6c3de1d69190f3dfa052c85057774b87c59586c74fcaa7eacfd8a776171b5e2dd0509911e1303836042c35eafe2d45f9d1e1bb328828235fef0a1c3c6befa5ec4d2f0d855f111ef26c6c4be17b72c8c8ae8b07a252e3e6c6978d8e298a5701bfedeab8ba317bc1a1e1b0d8bb6b857f14f03432836de4130b3f10c1fccbcdca7f345bf09c1cb3b58a825ca3dfafdc7d28f8c8d88888d8f897b3506e6634be36357e14cf883e58c996fbe6d29e70be65d9d59b49a4fad9c1916f7de7f385ff110651c5b33092b67c62c9b1e1b1d831b07d7937c71745cfcd480b767be1bf0f6bb7ed3a6cf9c3d48be6ca56fc4e2b05511e1383353c17d6ec2cae971b101e1f3e583e58351bf95a0ab803eacf45fd5691ec002db6133cf0a4300a11f30580198c722f0fd4e33be9fc7bacfb1c2ba0fb14ec345f0787cb804ec25e45ca23fb9c87ff312ff2f5f2fa740df9fcbc0dbea1ee36172e7bd2fb8cd63dd83412e60ef3f0379f02fc03a5e2e4ea633d23d15fab174c66c01d76f95555c07b2baba796533725176a7dde7d97f9c3637fd399cf47f4a61ec9c53193fad6cfa8b21ed2f01ed0087d4cef4b1bf6556693e62efb1fdf3b37dcbd5fa5a5733dfa7abb9fdb580407ffa2e7a69242d035581b45c1ac6bc271b3a6cb872c4c851a3c78c0d5b408547445a4276daab26bdeeebe7df61cf4a39e71acab9e481618172ecef9580a591f817d35572265c5723026fb4f4f30ca725d48cb9d5dc6c7e64ae31979993cdc88cda512b6a468f500d2ab3ac55201ff2d68b277628524cafdcd87ee1f219c1f53d576dc253bfb97cef1f577a7806dc7e65d48837df3e70e13dcfd71a373ef9febd59c41eedb1b2906bbb53f7ec3c347c58fd15c13f4cc3bf1fbedcfdb2e2c7e10f03d6cc19ebbd53fcf6b3f620c7fca7635ddfa73c5f1f56f6d98425b76efc38ffbb27860f3f2bbafd75f887fedf5d0ff2967bf55f6050576f54bcb6fbeed8dd03ef7dbccfebb5b9b6ff9892b2fb81cb8da8a3ebc6fafdbdcf1717d7bdb6bcd4f0a96ed5c2922153e7de9c76b264ddfabab3af4d9b38f1c3a3a35eb23fded67a212cdd31f1dbbfdd7e2da87cdfae401f69d889ef5f8a70bd354b66bcd4f09a4df527cb2676f9d7fcdaf96fb9d7334edbb357b299b65f11b176ca94e4d79ca69edd5c20cfe2cd083de999307cd18c0b9b875f18346defedf9134fcd1f9fbd6ded9cc5332e4ccf6e1834d13e426697909291b0e2ece469915b9a674dff7ae6bc886f4fe777db73a0c8d5f4897f1f8df15efb46b735c5d7736fb46c7df6cb47fbdf7a7d246f34f1c5a8c5ca98d33fe7dc4d3eab3eaadfa76a9efeccf3bd8ae5ef954
ea8dc73d11cd077e5f1ef3b7452fe68eba1652fedfbfba9add15df554507ad66df5088eee22f9d725ae5c3be4b3c5ee393b7446f8bc5d11ef793babeb75564e7ff3ed8059208774719dfefaea2f60ae738c4191d53dc6742bf31eb82f61cd47409e64efbb1d84f9e35f8075bc5cf43fc4c88fc0df975f32660bb87ec758c5e55cdcd56d2d6bbe6c65eff2f51fa7cd4dff28277d6fd66e259bee3bacbc7214fa43482712107bb0337decefaa55fab3d8f4b13dddd87fd29f747a18842fae77b93cd3a1eb25e8b8580b157d112fb818f749f4c57bc1855de57fa44cff87ae5ddf9acd185f03b6b1083fd769c6f7db58f72d56d87301d6638f3afa4ee66516c1bed32298774c5e42f6fda8a0b38fc5575923fb5ef909fb7eb3a9abfbffd72f8295dd94e7ff36a2c792bafb3784c8f67f3547ffbdfe7bfdf7faeff5dfebff4fd71d765cb5c8668e143777952e1c398823c770e4648e9cc5910b397225476672e4468edcc7914738f20c475ee5c83b1cd9cc91e296aed285230771e4188e9ccc91b338722147aee4c84c8edcc891fb38f208479ee1c8ab1c7987239b3952fcb4ab74e1c8411c3986232773e42c8e5cc8912b393293233772e43e8e3cc2916738f22a47dee1c8668e143feb2a5d387210478ee1c8c91c398b231772e44a8ecce4c88d1cb98f238f70e4198ebcca917738b29923c5ad5da50b470ee2c8311c3999236771e4428e5cc991991cb99123f771e4118e3cc3915739f20e473673a4b8adab74e1c8411c3986232773e42c8e5cc8912b393293233772e43e8e3cc2916738f22a47dee1c8668e14b777952e1c398823c770e4648e9cc5910b397225476672e4468edcc7914738f20c475ee5c83b1cd9cc91627357e9c2918338720c474ee6c8591cb9902357726426476ee4c87d1c798423cf70e4558eb45c66f31fbdeffde7afff747cffa9ebf5ffcbddd77ec9b46b4961d7f675f995d5ab078cacf9e93f23cd5d2efc5e43f5f6eb01017f90c33f73672ecb7ba3ce9899f7463ecb6c3aca95fcc05bf6a711fdc5ebff019e470e55'
ISP_PROG = binascii.unhexlify(ISP_PROG)
ISP_PROG = zlib.decompress(ISP_PROG)
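# Note (descriptive, based on how ISP_PROG is used below): the hex blob above appears to be
# the zlib-compressed ISP stub binary. It is unhexlified and decompressed here, later written
# to the K210's SRAM at 0x80000000 via install_flash_bootloader(), and started with boot(),
# after which the flash-mode commands (0xD2-0xD7) become available.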
def printProgressBar (iteration, total, prefix = '', suffix = '', filename = '', decimals = 1, length = 100, fill = '='):
"""
Call in a loop to create terminal progress bar
@params:
iteration - Required : current iteration (Int)
total - Required : total iterations (Int)
prefix - Optional : prefix string (Str)
suffix - Optional : suffix string (Str)
decimals - Optional : positive number of decimals in percent complete (Int)
length - Optional : character length of bar (Int)
fill - Optional : bar fill character (Str)
"""
percent = ("{0:." + str(decimals) + "f}").format(100 * (iteration / float(total)))
filledLength = int(length * iteration // total)
bar = fill * filledLength + '-' * (length - filledLength)
KFlash.log('\r%s |%s| %s%% %s' % (prefix, bar, percent, suffix), end = '\r')
# Print New Line on Complete
if iteration == total:
KFlash.log()
if callback:
fileTypeStr = filename
if prefix == "Downloading ISP:":
fileTypeStr = "ISP"
elif prefix == "Programming BIN:" and fileTypeStr == "":
fileTypeStr = "BIN"
callback(fileTypeStr, iteration, total, suffix)
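# Illustrative usage (doctest-style sketch, not executed; values are made up):
#   printProgressBar(25, 100, prefix='Downloading ISP:', suffix='12kiB/s', length=50)
# would print a carriage-return-terminated line such as
#   Downloading ISP: |============--------------------------------------| 25.0% 12kiB/s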
def slip_reader(port):
partial_packet = None
in_escape = False
while True:
waiting = port.inWaiting()
read_bytes = port.read(1 if waiting == 0 else waiting)
if read_bytes == b'':
raise_exception( Exception("Timed out waiting for packet %s" % ("header" if partial_packet is None else "content")) )
for b in read_bytes:
if type(b) is int:
b = bytes([b]) # python 2/3 compat
if partial_packet is None: # waiting for packet header
if b == b'\xc0':
partial_packet = b""
else:
raise_exception( Exception('Invalid head of packet (%r)' % b) )
elif in_escape: # part-way through escape sequence
in_escape = False
if b == b'\xdc':
partial_packet += b'\xc0'
elif b == b'\xdd':
partial_packet += b'\xdb'
else:
raise_exception( Exception('Invalid SLIP escape (%r%r)' % (b'\xdb', b)) )
elif b == b'\xdb': # start of escape sequence
in_escape = True
elif b == b'\xc0': # end of packet
yield partial_packet
partial_packet = None
else: # normal byte in packet
partial_packet += b
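# SLIP framing recap (assumed standard RFC 1055 behaviour, which matches the code above):
# packets are delimited by 0xC0; a literal 0xC0 inside a packet is escaped as 0xDB 0xDC and a
# literal 0xDB as 0xDB 0xDD. For example, the raw frame
#   C0 01 DB DC 02 C0
# decodes to the three-byte payload 01 C0 02.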
class ISPResponse:
class ISPOperation(Enum):
ISP_ECHO = 0xC1
ISP_NOP = 0xC2
ISP_MEMORY_WRITE = 0xC3
ISP_MEMORY_READ = 0xC4
ISP_MEMORY_BOOT = 0xC5
ISP_DEBUG_INFO = 0xD1
ISP_CHANGE_BAUDRATE = 0xc6
class ErrorCode(Enum):
ISP_RET_DEFAULT = 0
ISP_RET_OK = 0xE0
ISP_RET_BAD_DATA_LEN = 0xE1
ISP_RET_BAD_DATA_CHECKSUM = 0xE2
ISP_RET_INVALID_COMMAND = 0xE3
@staticmethod
def parse(data):
# type: (bytes) -> (int, int, str)
op = 0
reason = 0
text = ''
if (sys.version_info > (3, 0)):
op = int(data[0])
reason = int(data[1])
else:
op = ord(data[0])
reason = ord(data[1])
try:
if ISPResponse.ISPOperation(op) == ISPResponse.ISPOperation.ISP_DEBUG_INFO:
text = data[2:].decode()
except ValueError:
KFlash.log('Warning: recv unknown op', op)
return (op, reason, text)
class FlashModeResponse:
class Operation(Enum):
ISP_DEBUG_INFO = 0xD1
ISP_NOP = 0xD2
ISP_FLASH_ERASE = 0xD3
ISP_FLASH_WRITE = 0xD4
ISP_REBOOT = 0xD5
ISP_UARTHS_BAUDRATE_SET = 0xD6
FLASHMODE_FLASH_INIT = 0xD7
class ErrorCode(Enum):
ISP_RET_DEFAULT = 0
ISP_RET_OK = 0xE0
ISP_RET_BAD_DATA_LEN = 0xE1
ISP_RET_BAD_DATA_CHECKSUM = 0xE2
ISP_RET_INVALID_COMMAND = 0xE3
ISP_RET_BAD_INITIALIZATION = 0xE4
@staticmethod
def parse(data):
# type: (bytes) -> (int, int, str)
op = 0
reason = 0
text = ''
if (sys.version_info > (3, 0)):
op = int(data[0])
reason = int(data[1])
else:
op = ord(data[0])
reason = ord(data[1])
if FlashModeResponse.Operation(op) == FlashModeResponse.Operation.ISP_DEBUG_INFO:
text = data[2:].decode()
return (op, reason, text)
def chunks(l, n):
"""Yield successive n-sized chunks from l."""
for i in range(0, len(l), n):
yield l[i:i + n]
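# Illustrative example (doctest-style, not executed):
#   >>> list(chunks(b'abcdef', 4))
#   [b'abcd', b'ef']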
class TerminalSize:
@staticmethod
def getTerminalSize():
import platform
current_os = platform.system()
tuple_xy=None
if current_os == 'Windows':
tuple_xy = TerminalSize._getTerminalSize_windows()
if tuple_xy is None:
tuple_xy = TerminalSize._getTerminalSize_tput()
# needed for Windows Python running in Cygwin's xterm!
if current_os == 'Linux' or current_os == 'Darwin' or current_os.startswith('CYGWIN'):
tuple_xy = TerminalSize._getTerminalSize_linux()
if tuple_xy is None:
# Use default value
tuple_xy = (80, 25) # default value
return tuple_xy
@staticmethod
def _getTerminalSize_windows():
res=None
try:
from ctypes import windll, create_string_buffer
# stdin handle is -10
# stdout handle is -11
# stderr handle is -12
h = windll.kernel32.GetStdHandle(-12)
csbi = create_string_buffer(22)
res = windll.kernel32.GetConsoleScreenBufferInfo(h, csbi)
except:
return None
if res:
import struct
(bufx, bufy, curx, cury, wattr,
left, top, right, bottom, maxx, maxy) = struct.unpack("hhhhHhhhhhh", csbi.raw)
sizex = right - left + 1
sizey = bottom - top + 1
return sizex, sizey
else:
return None
@staticmethod
def _getTerminalSize_tput():
# get terminal width
# src: http://stackoverflow.com/questions/263890/how-do-i-find-the-width-height-of-a-terminal-window
try:
import subprocess
proc=subprocess.Popen(["tput", "cols"],stdin=subprocess.PIPE,stdout=subprocess.PIPE)
output=proc.communicate(input=None)
cols=int(output[0])
proc=subprocess.Popen(["tput", "lines"],stdin=subprocess.PIPE,stdout=subprocess.PIPE)
output=proc.communicate(input=None)
rows=int(output[0])
return (cols,rows)
except:
return None
@staticmethod
def _getTerminalSize_linux():
def ioctl_GWINSZ(fd):
try:
import fcntl, termios, struct, os
cr = struct.unpack('hh', fcntl.ioctl(fd, termios.TIOCGWINSZ,'1234'))
except:
return None
return cr
cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2)
if not cr:
try:
fd = os.open(os.ctermid(), os.O_RDONLY)
cr = ioctl_GWINSZ(fd)
os.close(fd)
except:
pass
if not cr:
try:
cr = (os.environ['LINES'], os.environ['COLUMNS'])
except:
return None
return int(cr[1]), int(cr[0])
@staticmethod
def get_terminal_size(fallback=(100, 24), terminal = False):
try:
columns, rows = TerminalSize.getTerminalSize()
if not terminal:
if not terminal_auto_size:
columns, rows = terminal_size
except:
columns, rows = fallback
return columns, rows
class MAIXLoader:
def change_baudrate(self, baudrate):
KFlash.log(INFO_MSG,"Selected Baudrate: ", baudrate, BASH_TIPS['DEFAULT'])
out = struct.pack('III', 0, 4, baudrate)
crc32_checksum = struct.pack('I', binascii.crc32(out) & 0xFFFFFFFF)
out = struct.pack('HH', 0xd6, 0x00) + crc32_checksum + out
self.write(out)
time.sleep(0.05)
self._port.baudrate = baudrate
if args.Board == "goE":
if baudrate >= 4500000:
# OPENEC super baudrate
KFlash.log(INFO_MSG, "Enable OPENEC super baudrate!!!", BASH_TIPS['DEFAULT'])
if baudrate == 4500000:
self._port.baudrate = 300
if baudrate == 6000000:
self._port.baudrate = 250
if baudrate == 7500000:
self._port.baudrate = 350
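# Note (assumption): the 300/250/350 values above are not literal baudrates; they seem to be
# magic codes interpreted by the OPENEC/FT2232 adapter firmware to select its "super baudrate"
# modes, which is why they are only applied for the goE board.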
def change_baudrate_stage0(self, baudrate):
# Dangerous, here be dinosaurs!!!!!
# Don't touch this code unless you know what you are doing
# Stage0 baudrate is fixed
# Contributor: [@rgwan](https://github.com/rgwan)
# rgwan <dv.xw@qq.com>
baudrate = 1500000
if args.Board == "goE" or args.Board == "trainer":
KFlash.log(INFO_MSG,"Selected Stage0 Baudrate: ", baudrate, BASH_TIPS['DEFAULT'])
# This is for OPENEC boards that contain an FT2232 (goE and trainer)
KFlash.log(INFO_MSG,"FT2232 mode", BASH_TIPS['DEFAULT'])
baudrate_stage0 = int(baudrate * 38.6 / 38)
out = struct.pack('III', 0, 4, baudrate_stage0)
crc32_checksum = struct.pack('I', binascii.crc32(out) & 0xFFFFFFFF)
out = struct.pack('HH', 0xc6, 0x00) + crc32_checksum + out
self.write(out)
time.sleep(0.05)
self._port.baudrate = baudrate
retry_count = 0
while 1:
self.checkKillExit()
retry_count = retry_count + 1
if retry_count > 3:
err = (ERROR_MSG,'Fast mode failed, please use slow mode by adding the parameter ' + BASH_TIPS['GREEN'] + '--Slow', BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
try:
self.greeting()
break
except TimeoutError:
pass
elif args.Board == "dan" or args.Board == "bit" or args.Board == "kd233":
KFlash.log(INFO_MSG,"CH340 mode", BASH_TIPS['DEFAULT'])
# This is for CH340-based boards (dan, bit and kd233)
baudrate_stage0 = int(baudrate * 38.4 / 38)
# CH340 cannot use this method; tests failed. Use it at your own risk.
else:
# This is for unknown board
KFlash.log(WARN_MSG,"Unknown mode", BASH_TIPS['DEFAULT'])
def __init__(self, port='/dev/ttyUSB1', baudrate=115200):
# configure the serial connection (the parameters differ depending on the device you are connecting to)
self._port = serial.Serial(
port=port,
baudrate=baudrate,
parity=serial.PARITY_NONE,
stopbits=serial.STOPBITS_ONE,
bytesize=serial.EIGHTBITS,
timeout=0.1
)
KFlash.log(INFO_MSG, "Default baudrate is", baudrate, ", later it may be changed to the value you set.", BASH_TIPS['DEFAULT'])
self._port.isOpen()
self._slip_reader = slip_reader(self._port)
self._kill_process = False
""" Read a SLIP packet from the serial port """
def read(self):
return next(self._slip_reader)
""" Write bytes to the serial port while performing SLIP escaping """
def write(self, packet):
buf = b'\xc0' \
+ (packet.replace(b'\xdb', b'\xdb\xdd').replace(b'\xc0', b'\xdb\xdc')) \
+ b'\xc0'
#KFlash.log('[WRITE]', binascii.hexlify(buf))
return self._port.write(buf)
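# Example of the escaping performed above (illustrative only): writing the packet
#   b'\x01\xc0\x02\xdb'
# puts the following bytes on the wire:
#   b'\xc0\x01\xdb\xdc\x02\xdb\xdd\xc0'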
def read_loop(self):
#out = b''
# while self._port.inWaiting() > 0:
# out += self._port.read(1)
# KFlash.log(out)
while 1:
sys.stdout.write('[RECV] raw data: ')
sys.stdout.write(binascii.hexlify(self._port.read(1)).decode())
sys.stdout.flush()
def recv_one_return(self):
timeout_init = time.time()
data = b''
# find the start border (0xC0)
#sys.stdout.write('[RECV one return] raw data: ')
while 1:
if time.time() - timeout_init > ISP_RECEIVE_TIMEOUT:
raise TimeoutError
c = self._port.read(1)
#sys.stdout.write(binascii.hexlify(c).decode())
sys.stdout.flush()
if c == b'\xc0':
break
in_escape = False
while 1:
if time.time() - timeout_init > ISP_RECEIVE_TIMEOUT:
self.raise_exception( TimeoutError )
c = self._port.read(1)
#sys.stdout.write(binascii.hexlify(c).decode())
sys.stdout.flush()
if c == b'\xc0':
break
elif in_escape: # part-way through escape sequence
in_escape = False
if c == b'\xdc':
data += b'\xc0'
elif c == b'\xdd':
data += b'\xdb'
else:
self.raise_exception( Exception('Invalid SLIP escape (%r%r)' % (b'\xdb', c)) )
elif c == b'\xdb': # start of escape sequence
in_escape = True
data += c
#sys.stdout.write('\n')
return data
# kd233 or open-ec or new cmsis-dap
def reset_to_isp_kd233(self):
self._port.setDTR (False)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW, IO16 to HIGH --')
# Pull reset down and keep 10ms
self._port.setDTR (True)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- IO16 to LOW, RESET to HIGH --')
# Pull IO16 to low and release reset
self._port.setRTS (True)
self._port.setDTR (False)
time.sleep(0.1)
def reset_to_boot_kd233(self):
self._port.setDTR (False)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW --')
# Pull reset down and keep 10ms
self._port.setDTR (True)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to HIGH, BOOT --')
# Pull IO16 to low and release reset
self._port.setRTS (False)
self._port.setDTR (False)
time.sleep(0.1)
#dan dock
def reset_to_isp_dan(self):
self._port.setDTR (False)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW, IO16 to HIGH --')
# Pull reset down and keep 10ms
self._port.setDTR (False)
self._port.setRTS (True)
time.sleep(0.1)
#KFlash.log('-- IO16 to LOW, RESET to HIGH --')
# Pull IO16 to low and release reset
self._port.setRTS (False)
self._port.setDTR (True)
time.sleep(0.1)
def reset_to_boot_dan(self):
self._port.setDTR (False)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW --')
# Pull reset down and keep 10ms
self._port.setDTR (False)
self._port.setRTS (True)
time.sleep(0.1)
#KFlash.log('-- RESET to HIGH, BOOT --')
# Pull IO16 to low and release reset
self._port.setRTS (False)
self._port.setDTR (False)
time.sleep(0.1)
# maix goD for old cmsis-dap firmware
def reset_to_isp_goD(self):
self._port.setDTR (True) ## output 0
self._port.setRTS (True)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW --')
# Pull reset down and keep 10ms
self._port.setRTS (False)
self._port.setDTR (True)
time.sleep(0.1)
#KFlash.log('-- RESET to HIGH, BOOT --')
# Pull IO16 to low and release reset
self._port.setRTS (False)
self._port.setDTR (True)
time.sleep(0.1)
def reset_to_boot_goD(self):
self._port.setDTR (False)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW --')
# Pull reset down and keep 10ms
self._port.setRTS (False)
self._port.setDTR (True)
time.sleep(0.1)
#KFlash.log('-- RESET to HIGH, BOOT --')
# Pull IO16 to low and release reset
self._port.setRTS (True)
self._port.setDTR (True)
time.sleep(0.1)
# maix goE for openec or new cmsis-dap firmware
def reset_to_boot_maixgo(self):
self._port.setDTR (False)
self._port.setRTS (False)
time.sleep(0.1)
#KFlash.log('-- RESET to LOW --')
# Pull reset down and keep 10ms
self._port.setRTS (False)
self._port.setDTR (True)
time.sleep(0.1)
#KFlash.log('-- RESET to HIGH, BOOT --')
# Pull IO16 to low and release reset
self._port.setRTS (False)
self._port.setDTR (False)
time.sleep(0.1)
def greeting(self):
self._port.write(b'\xc0\xc2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0')
op, reason, text = ISPResponse.parse(self.recv_one_return())
#KFlash.log('MAIX return op:', ISPResponse.ISPOperation(op).name, 'reason:', ISPResponse.ErrorCode(reason).name)
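# The hard-coded frame above is a SLIP-framed ISP_NOP request (0xC0 delimiters around op 0xC2
# with zeroed checksum/argument fields). Receiving a parsable reply is what confirms the chip
# has entered ISP mode.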
def flash_greeting(self):
retry_count = 0
while 1:
self.checkKillExit()
self._port.write(b'\xc0\xd2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0')
retry_count = retry_count + 1
try:
op, reason, text = FlashModeResponse.parse(self.recv_one_return())
except IndexError:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to Connect to K210's Stub",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Index Error, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
except TimeoutError:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to Connect to K210's Stub",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Timeout Error, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
except:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to Connect to K210's Stub",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Unexcepted Error, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
# KFlash.log('MAIX return op:', FlashModeResponse.Operation(op).name, 'reason:',
# FlashModeResponse.ErrorCode(reason).name)
if FlashModeResponse.Operation(op) == FlashModeResponse.Operation.ISP_NOP and FlashModeResponse.ErrorCode(reason) == FlashModeResponse.ErrorCode.ISP_RET_OK:
KFlash.log(INFO_MSG,"Boot to Flashmode Successfully",BASH_TIPS['DEFAULT'])
self._port.flushInput()
self._port.flushOutput()
break
else:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to Connect to K210's Stub",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Unexcepted Return recevied, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
def boot(self, address=0x80000000):
KFlash.log(INFO_MSG,"Booting From " + hex(address),BASH_TIPS['DEFAULT'])
out = struct.pack('II', address, 0)
crc32_checksum = struct.pack('I', binascii.crc32(out) & 0xFFFFFFFF)
out = struct.pack('HH', 0xc5, 0x00) + crc32_checksum + out # op: ISP_MEMORY_BOOT: 0xc5
self.write(out)
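# Request layout used above (matching the other ISP commands in this file): a 2-byte op
# (0xC5 = ISP_MEMORY_BOOT) plus 2 reserved bytes, a CRC32 over the remaining fields, then the
# 4-byte boot address and a 4-byte zero length.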
def recv_debug(self):
op, reason, text = ISPResponse.parse(self.recv_one_return())
#KFlash.log('[RECV] op:', ISPResponse.ISPOperation(op).name, 'reason:', ISPResponse.ErrorCode(reason).name)
if text:
KFlash.log('-' * 30)
KFlash.log(text)
KFlash.log('-' * 30)
if ISPResponse.ErrorCode(reason) not in (ISPResponse.ErrorCode.ISP_RET_DEFAULT, ISPResponse.ErrorCode.ISP_RET_OK):
KFlash.log('Failed, retry, errcode=', hex(reason))
return False
return True
def flash_recv_debug(self):
op, reason, text = FlashModeResponse.parse(self.recv_one_return())
#KFlash.log('[Flash-RECV] op:', FlashModeResponse.Operation(op).name, 'reason:',
# FlashModeResponse.ErrorCode(reason).name)
if text:
KFlash.log('-' * 30)
KFlash.log(text)
KFlash.log('-' * 30)
if FlashModeResponse.ErrorCode(reason) not in (FlashModeResponse.ErrorCode.ISP_RET_DEFAULT, FlashModeResponse.ErrorCode.ISP_RET_OK):
KFlash.log('Failed, retry')
return False
return True
def init_flash(self, chip_type):
chip_type = int(chip_type)
KFlash.log(INFO_MSG,"Selected Flash: ",("In-Chip", "On-Board")[chip_type],BASH_TIPS['DEFAULT'])
out = struct.pack('II', chip_type, 0)
crc32_checksum = struct.pack('I', binascii.crc32(out) & 0xFFFFFFFF)
out = struct.pack('HH', 0xd7, 0x00) + crc32_checksum + out
'''Retry when an error occurs'''
retry_count = 0
while 1:
self.checkKillExit()
sent = self.write(out)
retry_count = retry_count + 1
try:
op, reason, text = FlashModeResponse.parse(self.recv_one_return())
except IndexError:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to initialize flash",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Index Error, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
except TimeoutError:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to initialize flash",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Timeout Error, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
except:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to initialize flash",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Unexcepted Error, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
# KFlash.log('MAIX return op:', FlashModeResponse.Operation(op).name, 'reason:',
# FlashModeResponse.ErrorCode(reason).name)
if FlashModeResponse.Operation(op) == FlashModeResponse.Operation.FLASHMODE_FLASH_INIT and FlashModeResponse.ErrorCode(reason) == FlashModeResponse.ErrorCode.ISP_RET_OK:
KFlash.log(INFO_MSG,"Initialization flash Successfully",BASH_TIPS['DEFAULT'])
break
else:
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Failed to initialize flash",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
KFlash.log(WARN_MSG,"Unexcepted Return recevied, retrying...",BASH_TIPS['DEFAULT'])
time.sleep(0.1)
continue
def flash_dataframe(self, data, address=0x80000000):
DATAFRAME_SIZE = 1024
data_chunks = chunks(data, DATAFRAME_SIZE)
#KFlash.log('[DEBUG] flash dataframe | data length:', len(data))
total_chunk = math.ceil(len(data)/DATAFRAME_SIZE)
time_start = time.time()
for n, chunk in enumerate(data_chunks):
self.checkKillExit()
while 1:
self.checkKillExit()
#KFlash.log('[INFO] sending chunk', i, '@address', hex(address), 'chunklen', len(chunk))
out = struct.pack('II', address, len(chunk))
crc32_checksum = struct.pack('I', binascii.crc32(out + chunk) & 0xFFFFFFFF)
out = struct.pack('HH', 0xc3, 0x00) + crc32_checksum + out + chunk # op: ISP_MEMORY_WRITE: 0xc3
sent = self.write(out)
#KFlash.log('[INFO]', 'sent', sent, 'bytes', 'checksum', binascii.hexlify(crc32_checksum).decode())
address += len(chunk)
if self.recv_debug():
break
columns, lines = TerminalSize.get_terminal_size((100, 24), terminal)
time_delta = time.time() - time_start
speed = ''
if (time_delta > 1):
speed = str(int((n + 1) * DATAFRAME_SIZE / 1024.0 / time_delta)) + 'kiB/s'
printProgressBar(n+1, total_chunk, prefix = 'Downloading ISP:', suffix = speed, length = columns - 35)
def dump_to_flash(self, data, address=0):
'''
typedef struct __attribute__((packed)) {
uint8_t op;
int32_t checksum; /* All the fields below are involved in the calculation of checksum */
uint32_t address;
uint32_t data_len;
uint8_t data_buf[1024];
} isp_request_t;
'''
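# The loop below builds that request by hand: op 0xD4 (ISP_FLASH_WRITE) packed as a 16-bit
# value plus two reserved bytes, a CRC32 over address + length + data, then the address, the
# chunk length and the data chunk itself.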
DATAFRAME_SIZE = ISP_FLASH_DATA_FRAME_SIZE
data_chunks = chunks(data, DATAFRAME_SIZE)
#KFlash.log('[DEBUG] flash dataframe | data length:', len(data))
for n, chunk in enumerate(data_chunks):
#KFlash.log('[INFO] sending chunk', i, '@address', hex(address))
out = struct.pack('II', address, len(chunk))
crc32_checksum = struct.pack('I', binascii.crc32(out + chunk) & 0xFFFFFFFF)
out = struct.pack('HH', 0xd4, 0x00) + crc32_checksum + out + chunk
#KFlash.log("[$$$$]", binascii.hexlify(out[:32]).decode())
retry_count = 0
while True:
try:
sent = self.write(out)
#KFlash.log('[INFO]', 'sent', sent, 'bytes', 'checksum', crc32_checksum)
self.flash_recv_debug()
except:
retry_count = retry_count + 1
if retry_count > MAX_RETRY_TIMES:
err = (ERROR_MSG,"Error Count Exceeded, Stop Trying",BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
continue
break
address += len(chunk)
def flash_erase(self):
#KFlash.log('[DEBUG] erasing spi flash.')
self._port.write(b'\xc0\xd3\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0')
op, reason, text = FlashModeResponse.parse(self.recv_one_return())
#KFlash.log('MAIX return op:', FlashModeResponse.Operation(op).name, 'reason:',
# FlashModeResponse.ErrorCode(reason).name)
def install_flash_bootloader(self, data):
# Download flash bootloader
self.flash_dataframe(data, address=0x80000000)
def load_elf_to_sram(self, f):
try:
from elftools.elf.elffile import ELFFile
from elftools.elf.descriptions import describe_p_type
except ImportError:
err = (ERROR_MSG,'pyelftools must be installed, run '+BASH_TIPS['GREEN']+'`' + ('pip', 'pip3')[sys.version_info > (3, 0)] + ' install pyelftools`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
self.raise_exception( Exception(err) )
elffile = ELFFile(f)
if elffile['e_entry'] != 0x80000000:
KFlash.log(WARN_MSG,"ELF entry is 0x%x instead of 0x80000000" % (elffile['e_entry']), BASH_TIPS['DEFAULT'])
for segment in elffile.iter_segments():
t = describe_p_type(segment['p_type'])
KFlash.log(INFO_MSG, ("Program Header: Size: %d, Virtual Address: 0x%x, Type: %s" % (segment['p_filesz'], segment['p_vaddr'], t)), BASH_TIPS['DEFAULT'])
if not (segment['p_vaddr'] & 0x80000000):
continue
if segment['p_filesz']==0 or segment['p_vaddr']==0:
KFlash.log("Skipped")
continue
self.flash_dataframe(segment.data(), segment['p_vaddr'])
def flash_firmware(self, firmware_bin, aes_key = None, address_offset = 0, sha256Prefix = True, filename = ""):
# type: (bytes, bytes, int, bool) -> None
# Don't remove above code!
#KFlash.log('[DEBUG] flash_firmware DEBUG: aeskey=', aes_key)
if sha256Prefix == True:
# Add header to the firmware
# Layout (as built below): AES_CIPHER_FLAG (1 byte) + firmware_size (4 bytes) + firmware_data, with the SHA256 of those bytes (32 bytes) appended at the end
aes_cipher_flag = b'\x01' if aes_key else b'\x00'
# Encryption
if aes_key:
enc = AES_128_CBC(aes_key, iv=b'\x00'*16).encrypt
padded = firmware_bin + b'\x00'*15 # zero pad
firmware_bin = b''.join([enc(padded[i*16:i*16+16]) for i in range(len(padded)//16)])
firmware_len = len(firmware_bin)
data = aes_cipher_flag + struct.pack('I', firmware_len) + firmware_bin
sha256_hash = hashlib.sha256(data).digest()
firmware_with_header = data + sha256_hash
total_chunk = math.ceil(len(firmware_with_header)/ISP_FLASH_DATA_FRAME_SIZE)
# Slice download firmware
data_chunks = chunks(firmware_with_header, ISP_FLASH_DATA_FRAME_SIZE) # 4kiB for a sector, 16kiB for dataframe
else:
total_chunk = math.ceil(len(firmware_bin)/ISP_FLASH_DATA_FRAME_SIZE)
data_chunks = chunks(firmware_bin, ISP_FLASH_DATA_FRAME_SIZE)
time_start = time.time()
for n, chunk in enumerate(data_chunks):
self.checkKillExit()
chunk = chunk.ljust(ISP_FLASH_DATA_FRAME_SIZE, b'\x00') # align by size of dataframe
# Download a dataframe
#KFlash.log('[INFO]', 'Write firmware data piece')
self.dump_to_flash(chunk, address= n * ISP_FLASH_DATA_FRAME_SIZE + address_offset)
columns, lines = TerminalSize.get_terminal_size((100, 24), terminal)
time_delta = time.time() - time_start
speed = ''
if (time_delta > 1):
speed = str(int((n + 1) * ISP_FLASH_DATA_FRAME_SIZE / 1024.0 / time_delta)) + 'kiB/s'
printProgressBar(n+1, total_chunk, prefix = 'Programming BIN:', filename=filename, suffix = speed, length = columns - 35)
def kill(self):
self._kill_process = True
def checkKillExit(self):
if self._kill_process:
self._port.close()
self._kill_process = False
raise Exception("Cancel")
def open_terminal(reset):
control_signal = '0' if reset else '1'
control_signal_b = not reset
import serial.tools.miniterm
# To use the terminal with MaixPy, the 'filter' option must be set to 'direct'
# because some control characters are emitted
sys.argv = [sys.argv[0], _port, '115200', '--dtr='+control_signal, '--rts='+control_signal, '--filter=direct']
serial.tools.miniterm.main(default_port=_port, default_baudrate=115200, default_dtr=control_signal_b, default_rts=control_signal_b)
sys.exit(0)
boards_choices = ["kd233", "dan", "bit", "bit_mic", "goE", "goD", "maixduino", "trainer"]
if terminal:
parser = argparse.ArgumentParser()
parser.add_argument("-p", "--port", help="COM Port", default="DEFAULT")
parser.add_argument("-f", "--flash", help="SPI Flash type, 0 for SPI3, 1 for SPI0", default=1)
parser.add_argument("-b", "--baudrate", type=int, help="UART baudrate for uploading firmware", default=115200)
parser.add_argument("-l", "--bootloader", help="Bootloader bin path", required=False, default=None)
parser.add_argument("-k", "--key", help="AES key in hex, if you need encrypt your firmware.", required=False, default=None)
parser.add_argument("-v", "--version", help="Print version.", action='version', version='0.8.3')
parser.add_argument("--verbose", help="Increase output verbosity", default=False, action="store_true")
parser.add_argument("-t", "--terminal", help="Start a terminal after finish (Python miniterm)", default=False, action="store_true")
parser.add_argument("-n", "--noansi", help="Do not use ANSI colors, recommended in Windows CMD", default=False, action="store_true")
parser.add_argument("-s", "--sram", help="Download firmware to SRAM and boot", default=False, action="store_true")
parser.add_argument("-B", "--Board",required=False, type=str, help="Select dev board", choices=boards_choices)
parser.add_argument("-S", "--Slow",required=False, help="Slow download mode", default=False)
parser.add_argument("firmware", help="firmware bin path")
args = parser.parse_args()
else:
args = argparse.Namespace()
setattr(args, "port", "DEFAULT")
setattr(args, "flash", 1)
setattr(args, "baudrate", 115200)
setattr(args, "bootloader", None)
setattr(args, "key", None)
setattr(args, "verbose", False)
setattr(args, "terminal", False)
setattr(args, "noansi", False)
setattr(args, "sram", False)
setattr(args, "Board", None)
setattr(args, "Slow", False)
# update args for non-terminal calls
if not terminal:
args.port = dev
args.baudrate = baudrate
args.noansi = noansi
args.sram = sram
args.Board = board
args.firmware = file
if args.Board == "maixduino" or args.Board == "bit_mic":
args.Board = "goE"
if (args.noansi == True):
BASH_TIPS = dict(NORMAL='',BOLD='',DIM='',UNDERLINE='',
DEFAULT='', RED='', YELLOW='', GREEN='',
BG_DEFAULT='', BG_WHITE='')
ERROR_MSG = BASH_TIPS['RED']+BASH_TIPS['BOLD']+'[ERROR]'+BASH_TIPS['NORMAL']
WARN_MSG = BASH_TIPS['YELLOW']+BASH_TIPS['BOLD']+'[WARN]'+BASH_TIPS['NORMAL']
INFO_MSG = BASH_TIPS['GREEN']+BASH_TIPS['BOLD']+'[INFO]'+BASH_TIPS['NORMAL']
KFlash.log(INFO_MSG,'ANSI colors not used',BASH_TIPS['DEFAULT'])
manually_set_the_board = False
if args.Board:
manually_set_the_board = True
if args.port == "DEFAULT":
if args.Board == "goE":
list_port_info = list(serial.tools.list_ports.grep("0403")) #Take the second one
if len(list_port_info) == 0:
err = (ERROR_MSG,"No vaild COM Port found in Auto Detect, Check Your Connection or Specify One by"+BASH_TIPS['GREEN']+'`--port/-p`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
list_port_info.sort()
if len(list_port_info) == 1:
_port = list_port_info[0].device
elif len(list_port_info) > 1:
_port = list_port_info[1].device
KFlash.log(INFO_MSG,"COM Port Auto Detected, Selected ", _port, BASH_TIPS['DEFAULT'])
elif args.Board == "trainer":
list_port_info = list(serial.tools.list_ports.grep("0403")) #Take the first one
if(len(list_port_info)==0):
err = (ERROR_MSG,"No vaild COM Port found in Auto Detect, Check Your Connection or Specify One by"+BASH_TIPS['GREEN']+'`--port/-p`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
list_port_info.sort()
_port = list_port_info[0].device
KFlash.log(INFO_MSG,"COM Port Auto Detected, Selected ", _port, BASH_TIPS['DEFAULT'])
else:
try:
list_port_info = next(serial.tools.list_ports.grep(VID_LIST_FOR_AUTO_LOOKUP)) #Take the first one within the list
_port = list_port_info.device
KFlash.log(INFO_MSG,"COM Port Auto Detected, Selected ", _port, BASH_TIPS['DEFAULT'])
except StopIteration:
err = (ERROR_MSG,"No vaild COM Port found in Auto Detect, Check Your Connection or Specify One by"+BASH_TIPS['GREEN']+'`--port/-p`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
else:
_port = args.port
KFlash.log(INFO_MSG,"COM Port Selected Manually: ", _port, BASH_TIPS['DEFAULT'])
self.loader = MAIXLoader(port=_port, baudrate=115200)
file_format = ProgramFileFormat.FMT_BINARY
# 0. Check firmware
try:
firmware_bin = open(args.firmware, 'rb')
except FileNotFoundError:
err = (ERROR_MSG,'Unable to find the firmware at ', args.firmware, BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
with open(args.firmware, 'rb') as f:
file_header = f.read(4)
#if file_header.startswith(bytes([0x50, 0x4B])):
if file_header.startswith(b'\x50\x4B'):
if ".kfpkg" != os.path.splitext(args.firmware)[1]:
KFlash.log(INFO_MSG, 'Found a zip file, but without the .kfpkg extension:', args.firmware, BASH_TIPS['DEFAULT'])
else:
file_format = ProgramFileFormat.FMT_KFPKG
#if file_header.startswith(bytes([0x7F, 0x45, 0x4C, 0x46])):
if file_header.startswith(b'\x7f\x45\x4c\x46'):
file_format = ProgramFileFormat.FMT_ELF
if args.sram:
KFlash.log(INFO_MSG, 'Found an ELF file:', args.firmware, BASH_TIPS['DEFAULT'])
else:
err = (ERROR_MSG, 'This is an ELF file and cannot be programmed to flash directly:', args.firmware, BASH_TIPS['DEFAULT'] , '\r\nPlease retry:', args.firmware + '.bin', BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
# 1. Greeting.
KFlash.log(INFO_MSG,"Trying to Enter the ISP Mode...",BASH_TIPS['DEFAULT'])
retry_count = 0
while 1:
self.checkKillExit()
try:
retry_count = retry_count + 1
if retry_count > 15:
err = (ERROR_MSG,"No vaild Kendryte K210 found in Auto Detect, Check Your Connection or Specify One by"+BASH_TIPS['GREEN']+'`-p '+('/dev/ttyUSB0', 'COM3')[sys.platform == 'win32']+'`',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
if args.Board == "dan" or args.Board == "bit" or args.Board == "trainer":
try:
KFlash.log('.', end='')
self.loader.reset_to_isp_dan()
self.loader.greeting()
break
except TimeoutError:
pass
elif args.Board == "kd233":
try:
KFlash.log('_', end='')
self.loader.reset_to_isp_kd233()
self.loader.greeting()
break
except TimeoutError:
pass
elif args.Board == "goE":
try:
KFlash.log('*', end='')
self.loader.reset_to_isp_kd233()
self.loader.greeting()
break
except TimeoutError:
pass
elif args.Board == "goD":
try:
KFlash.log('#', end='')
self.loader.reset_to_isp_goD()
self.loader.greeting()
break
except TimeoutError:
pass
else:
try:
KFlash.log('.', end='')
self.loader.reset_to_isp_dan()
self.loader.greeting()
args.Board = "dan"
KFlash.log()
KFlash.log(INFO_MSG,"Automatically detected dan/bit/trainer",BASH_TIPS['DEFAULT'])
break
except TimeoutError:
pass
try:
KFlash.log('_', end='')
self.loader.reset_to_isp_kd233()
self.loader.greeting()
args.Board = "kd233"
KFlash.log()
KFlash.log(INFO_MSG,"Automatically detected goE/kd233",BASH_TIPS['DEFAULT'])
break
except TimeoutError:
pass
try:
KFlash.log('.', end='')
self.loader.reset_to_isp_goD()
self.loader.greeting()
args.Board = "goD"
KFlash.log()
KFlash.log(INFO_MSG,"Automatically detected goD",BASH_TIPS['DEFAULT'])
break
except TimeoutError:
pass
try:
# Magic: just repeat the kd233 reset sequence, don't remove it; things may be unstable otherwise, reason unknown.
KFlash.log('_', end='')
self.loader.reset_to_isp_kd233()
self.loader.greeting()
args.Board = "kd233"
KFlash.log()
KFlash.log(INFO_MSG,"Automatically detected goE/kd233",BASH_TIPS['DEFAULT'])
break
except TimeoutError:
pass
except Exception as e:
KFlash.log()
raise_exception( Exception("Greeting fail, check serial port ("+str(e)+")" ) )
# Don't remove this line
# Dangerous, here be dinosaurs!!!!!
ISP_RECEIVE_TIMEOUT = 3
KFlash.log()
KFlash.log(INFO_MSG,"Greeting Message Detected, Start Downloading ISP",BASH_TIPS['DEFAULT'])
if manually_set_the_board and (not args.Slow):
if (args.baudrate >= 1500000) or args.sram:
self.loader.change_baudrate_stage0(args.baudrate)
# 2. download bootloader and firmware
if args.sram:
if file_format == ProgramFileFormat.FMT_KFPKG:
err = (ERROR_MSG, "Unable to load kfpkg to SRAM")
err = tuple2str(err)
raise_exception( Exception(err) )
elif file_format == ProgramFileFormat.FMT_ELF:
self.loader.load_elf_to_sram(firmware_bin)
else:
self.loader.install_flash_bootloader(firmware_bin.read())
else:
# install bootloader at 0x80000000
isp_loader = open(args.bootloader, 'rb').read() if args.bootloader else ISP_PROG
self.loader.install_flash_bootloader(isp_loader)
# Boot the code from SRAM
self.loader.boot()
if args.sram:
# Dangerous, here be dinosaurs!!!!!
# Don't touch this code unless you know what you are doing
self.loader._port.baudrate = args.baudrate
KFlash.log(INFO_MSG,"Boot user code from SRAM", BASH_TIPS['DEFAULT'])
if(args.terminal == True):
open_terminal(False)
msg = "Burn SRAM OK"
raise_exception( Exception(msg) )
# Dangerous, here be dinosaurs!!!!!
# Don't touch this code unless you know what you are doing
self.loader._port.baudrate = 115200
KFlash.log(INFO_MSG,"Wait For 0.1 second for ISP to Boot", BASH_TIPS['DEFAULT'])
time.sleep(0.1)
self.loader.flash_greeting()
if args.baudrate != 115200:
self.loader.change_baudrate(args.baudrate)
KFlash.log(INFO_MSG,"Baudrate changed, greeting with ISP again ... ", BASH_TIPS['DEFAULT'])
self.loader.flash_greeting()
self.loader.init_flash(args.flash)
if file_format == ProgramFileFormat.FMT_KFPKG:
KFlash.log(INFO_MSG,"Extracting KFPKG ... ", BASH_TIPS['DEFAULT'])
firmware_bin.close()
with tempfile.TemporaryDirectory() as tmpdir:
try:
with zipfile.ZipFile(args.firmware) as zf:
zf.extractall(tmpdir)
except zipfile.BadZipFile:
err = (ERROR_MSG,'Unable to Decompress the kfpkg, your file might be corrupted.',BASH_TIPS['DEFAULT'])
err = tuple2str(err)
raise_exception( Exception(err) )
fFlashList = open(os.path.join(tmpdir, 'flash-list.json'), "r")
sFlashList = re.sub(r'"address": (.*),', r'"address": "\1",', fFlashList.read()) #Pack the Hex Number in json into str
fFlashList.close()
jsonFlashList = json.loads(sFlashList)
for lBinFiles in jsonFlashList['files']:
self.checkKillExit()
KFlash.log(INFO_MSG,"Writing",lBinFiles['bin'],"into","0x%08x"%int(lBinFiles['address'], 0),BASH_TIPS['DEFAULT'])
with open(os.path.join(tmpdir, lBinFiles["bin"]), "rb") as firmware_bin:
self.loader.flash_firmware(firmware_bin.read(), None, int(lBinFiles['address'], 0), lBinFiles['sha256Prefix'], filename=lBinFiles['bin'])
else:
if args.key:
aes_key = binascii.a2b_hex(args.key)
if len(aes_key) != 16:
raise_exception( ValueError('AES key must be 16 bytes') )
self.loader.flash_firmware(firmware_bin.read(), aes_key=aes_key)
else:
self.loader.flash_firmware(firmware_bin.read())
# 3. boot
if args.Board == "dan" or args.Board == "bit" or args.Board == "trainer":
self.loader.reset_to_boot_dan()
elif args.Board == "kd233":
self.loader.reset_to_boot_kd233()
elif args.Board == "goE":
self.loader.reset_to_boot_maixgo()
elif args.Board == "goD":
self.loader.reset_to_boot_goD()
else:
KFlash.log(WARN_MSG,"Board unknown !! please press reset to boot!!")
KFlash.log(INFO_MSG,"Rebooting...", BASH_TIPS['DEFAULT'])
try:
self.loader._port.close()
except Exception:
pass
if(args.terminal == True):
open_terminal(True)
def kill(self):
if self.loader:
self.loader.kill()
self.killProcess = True
def checkKillExit(self):
if self.killProcess:
if self.loader:
self.loader._port.close()
raise Exception("Cancel")
def main():
kflash = KFlash()
try:
kflash.process()
except Exception as e:
if str(e) == "Burn SRAM OK":
sys.exit(0)
kflash.log(str(e))
sys.exit(1)
if __name__ == '__main__':
main()
| 125.408809 | 76,163 | 0.773185 | 10,472 | 182,219 | 13.359817 | 0.295646 | 0.006433 | 0.00654 | 0.003431 | 0.115436 | 0.100626 | 0.094286 | 0.086424 | 0.080284 | 0.07721 | 0 | 0.44699 | 0.168435 | 182,219 | 1,452 | 76,164 | 125.495179 | 0.476305 | 0.038201 | 0 | 0.488605 | 0 | 0.002735 | 0.465225 | 0.436909 | 0 | 1 | 0.190479 | 0 | 0 | 1 | 0.050137 | false | 0.011851 | 0.025524 | 0.001823 | 0.112124 | 0.009116 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d4ec7baa1912e41d558090b1b5417088a169815f | 39,322 | py | Python | cogs/announcement.py | mischievousdev/announcer | 0cfdcf22fdfe4ce9a1422ac22b77a46ba65ca3ca | ["MIT"] | null | null | null | cogs/announcement.py | mischievousdev/announcer | 0cfdcf22fdfe4ce9a1422ac22b77a46ba65ca3ca | ["MIT"] | null | null | null | cogs/announcement.py | mischievousdev/announcer | 0cfdcf22fdfe4ce9a1422ac22b77a46ba65ca3ca | ["MIT"] | null | null | null |
# -*- coding: utf-8 -*-
import re
import asyncio
from datetime import datetime
import discord
import pytz
from discord.ext import commands, tasks
from utils.utlities import generate_embed, check_allowed, generate_id
from utils.time import parse
class Announcement(commands.Cog):
"""Announcement commands with which you can make announcements!"""
def __init__(self, bot):
self.bot = bot
self.timed_announcements.start()
self.raw_timed_announcements.start()
self.bot.log.info("Timed announcements tasks started")
@tasks.loop(seconds=1)
async def timed_announcements(self):
announcements = self.bot.cache.all_timed_announcements
utc = pytz.UTC
now = datetime.utcnow().replace(tzinfo=utc)
if len(announcements) == 0:
return
for data in announcements:
if now >= data.expires.replace(tzinfo=utc):
channel = self.bot.get_channel(data.channel_id)
embed = discord.Embed.from_dict(data.embed_details)
await self.bot.pool.execute("DELETE FROM timed_announcements WHERE announcement_id = $1", data.announcement_id)
await self.bot.cache.cache_timed_announcements()
await self.bot.cache.list_timed_announcements()
return await channel.send(embed=embed)
@tasks.loop(seconds=1)
async def raw_timed_announcements(self):
announcements = self.bot.cache.all_raw_ta
utc = pytz.UTC
now = datetime.utcnow().replace(tzinfo=utc)
if len(announcements) == 0:
return
for data in announcements:
if now >= data.expires.replace(tzinfo=utc):
channel = self.bot.get_channel(data.channel_id)
await self.bot.pool.execute("DELETE FROM timed_raw_announcements WHERE announcement_id = $1", data.announcement_id)
await self.bot.cache.cache_timed_raw_announcements()
await self.bot.cache.list_timed_raw_announcements()
return await channel.send(data.content)
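# Note (descriptive): both loops above poll once per second, compare timezone-aware UTC
# timestamps, and delete the database row before sending, so an announcement whose send
# fails is not retried on the next tick; only one due announcement is handled per tick.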
@commands.group(invoke_without_command=True, aliases=["a"])
async def announcement(self, ctx):
"""Base group command for announcement category! Sends all sub-commands it has and those who have administrator permissions and those who have role which is in allowed role list can only make announcement!"""
return await ctx.send_help(ctx.command)
@announcement.command()
@commands.max_concurrency(1, commands.BucketType.guild)
async def quick(self, ctx, channel: discord.TextChannel):
"""Interactively creates an embed to suit your needs"""
allowed = await check_allowed(ctx)
if (
ctx.author == ctx.guild.owner
or ctx.author.guild_permissions.administrator
or allowed
):
# I'm too lazy to write these checks myself, so they are adapted from officialpiyush/modmail-plugins/announcement
def check(msg: discord.Message):
return ctx.author == msg.author and ctx.channel == msg.channel
def field_check(msg: discord.Message):
return (
ctx.author == msg.author
and ctx.channel == msg.channel
and (len(msg.content) < 256)
)
def description_check(msg: discord.Message):
return (
ctx.author == msg.author
and ctx.channel == msg.channel
and (len(msg.content) < 2048)
)
def footer_check(msg: discord.Message):
return (
ctx.author == msg.author
and ctx.channel == msg.channel
and (len(msg.content) < 2048)
)
def cancel_check(msg: discord.Message):
if msg.content == "cancel" or msg.content == f"{ctx.prefix}cancel":
return True
else:
return False
embed = discord.Embed()
title_msg = await ctx.send(
embed=generate_embed("Would the announcement embed have title? [y/n]")
)
try:
title = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(title):
await title_msg.delete()
return await ctx.send("Cancelled!")
elif not title.content.strip().lower() in ["y", "n"]:
await title_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif title.content.strip().lower() == "y":
await title_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the title of the embed?(Should be within 256 characters)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=field_check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
await msg.delete()
embed.title = answer.content
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif title.content.strip().lower() == "n":
await title_msg.delete()
except asyncio.TimeoutError:
await title_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
desc_msg = await ctx.send(
embed=generate_embed(
"Would the announcement embed have description? [y/n]"
)
)
try:
desc = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(desc):
await desc_msg.delete()
return await ctx.send("Cancelled!")
elif desc.content.strip().lower() not in ["y", "n"]:
await desc_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif desc.content.strip().lower() == "y":
await desc_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the descirption of the embed?(Should be within 2048 characters)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=description_check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
await msg.delete()
embed.description = answer.content
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif desc.content.strip().lower() == "n":
await desc_msg.delete()
except asyncio.TimeoutError:
await desc_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
thumb_msg = await ctx.send(
embed=generate_embed(
"Would the announcement embed have thumbnail? [y/n]"
)
)
try:
thumbnail = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(thumbnail):
await thumb_msg.delete()
return await ctx.send("Cancelled!")
elif thumbnail.content.strip().lower() not in ["y", "n"]:
await thumb_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif thumbnail.content.strip().lower() == "y":
await thumb_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would the thumbnail of the embed?(Please send a valid URL)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
match = re.match(
r"(?i)(https?:\/\/.*\.(?:png|jpg|gif|jpeg|JPG|JPEG|PNG|gif|gifv|webm))",
answer.content,
)
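# illustrative example only: a reply such as https://example.com/thumb.png (a made-up URL)
# would satisfy the image-URL pattern above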
if match:
await msg.delete()
embed.set_thumbnail(url=answer.content)
else:
await msg.delete()
await ctx.send("Invalid URL, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif thumbnail.content.strip().lower() == "n":
await thumb_msg.delete()
except asyncio.TimeoutError:
await thumb_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
img_msg = await ctx.send(
embed=generate_embed("Would the announcement embed have image? [y/n]")
)
try:
image = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(image):
await img_msg.delete()
return await ctx.send("Cancelled!")
elif image.content.strip().lower() not in ["y", "n"]:
await img_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif image.content.lower() == "y":
await img_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the URL of the image?(Should be a valid URL)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
match = re.match(
r"(?i)(https?:\/\/.*\.(?:png|jpg|gif|jpeg|JPG|JPEG|PNG|gif|gifv|webm))",
answer.content,
)
if match:
await msg.delete()
embed.set_image(url=answer.content)
else:
await msg.delete()
await ctx.send("Invalid URL, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif image.content.strip().lower() == "n":
await img_msg.delete()
except asyncio.TimeoutError:
await img_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
color_msg = await ctx.send(
embed=generate_embed("Would the embed have color? [y/n]")
)
try:
color = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(color):
await color_msg.delete()
return await ctx.send("Cancelled!")
elif color.content.strip().lower() not in ["y", "n"]:
await color_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif color.content.strip().lower() == "y":
await color_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the embed color?(Should be a valid hex color)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
match = re.match(r"^#(?:[0-9a-fA-F]{3}){1,2}$", answer.content)
if match:
color = answer.content.replace("#", "0x")
embed.color = int(color, 16)
await msg.delete()
else:
await msg.delete()
await ctx.send(
"Invalid Hex string, starting the command again.."
)
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif color.content.strip().lower() == "n":
await color_msg.delete()
except asyncio.TimeoutError:
await color_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
footer_msg = await ctx.send(embed=generate_embed('Would the announcement embed have a footer? [y/n]'))
try:
footer = await self.bot.wait_for('message', check=check, timeout=60.0)
if cancel_check(footer):
await footer_msg.delete()
return await ctx.send('Cancelled!')
elif footer.content.strip().lower() not in ["y", "n"]:
await footer_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif footer.content.strip().lower() == 'y':
await footer_msg.delete()
msg = await ctx.send(embed=generate_embed('What would be the footer text?(Must be within 2048 characters)'))
try:
answer = await self.bot.wait_for("message", check=footer_check, timeout=60.0)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
await msg.delete()
embed.set_footer(text=answer.content)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif footer.content.strip().lower() == 'n':
await footer_msg.delete()
except asyncio.TimeoutError:
await footer_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
announcement_id = generate_id()
embed_details = f'{embed.to_dict()}'.replace("'", '"')
await self.bot.pool.execute("INSERT INTO announcements(announcement_id, channel_id, embed_details) VALUES($1, $2, $3);", announcement_id, channel.id, embed_details)
await self.bot.cache.cache_announcements()
await ctx.reply(f":thumbsup: | Your announcement has been successfully posted! If you would like to restore this announcement, please use the following command `{ctx.prefix}restore quick {announcement_id}`.")
return await channel.send(embed=embed)
else:
return await ctx.send("You don't have permissions to use this command!")
@announcement.command()
@commands.max_concurrency(1, commands.BucketType.guild)
async def timed(self, ctx, channel: discord.TextChannel):
"""Interactively created a timed announcement to suit your needs!"""
allowed = await check_allowed(ctx)
if (
ctx.author == ctx.guild.owner
or ctx.author.guild_permissions.administrator
or allowed
):
# these checks are adapted from officialpiyush/modmail-plugins/announcement
def check(msg: discord.Message):
return ctx.author == msg.author and ctx.channel == msg.channel
def field_check(msg: discord.Message):
return (
ctx.author == msg.author
and ctx.channel == msg.channel
and (len(msg.content) < 256)
)
def description_check(msg: discord.Message):
return (
ctx.author == msg.author
and ctx.channel == msg.channel
and (len(msg.content) < 2048)
)
def footer_check(msg: discord.Message):
return (
ctx.author == msg.author
and ctx.channel == msg.channel
and (len(msg.content) < 2048)
)
def cancel_check(msg: discord.Message):
if msg.content == "cancel" or msg.content == f"{ctx.prefix}cancel":
return True
else:
return False
embed = discord.Embed()
title_msg = await ctx.send(
embed=generate_embed("Would the announcement embed have title? [y/n]")
)
try:
title = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(title):
await title_msg.delete()
return await ctx.send("Cancelled!")
elif title.content.strip().lower() not in ["y", "n"]:
await title_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif title.content.strip().lower() == "y":
await title_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the title of the embed?(Should be within 256 characters)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=field_check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
await msg.delete()
embed.title = answer.content
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif title.content.strip().lower() == "n":
await title_msg.delete()
except asyncio.TimeoutError:
await title_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
desc_msg = await ctx.send(
embed=generate_embed(
"Would the announcement embed have description? [y/n]"
)
)
try:
desc = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(desc):
await desc_msg.delete()
return await ctx.send("Cancelled!")
elif desc.content.strip().lower() not in ["y", "n"]:
await desc_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif desc.content.strip().lower() == "y":
await desc_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the descirption of the embed?(Should be within 2048 characters)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=description_check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
await msg.delete()
embed.description = answer.content
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif desc.content.strip().lower() == "n":
await desc_msg.delete()
except asyncio.TimeoutError:
await desc_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
thumb_msg = await ctx.send(
embed=generate_embed(
"Would the announcement embed have thumbnail? [y/n]"
)
)
try:
thumbnail = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(thumbnail):
await thumb_msg.delete()
return await ctx.send("Cancelled!")
elif thumbnail.content.strip().lower() not in ["y", "n"]:
await thumb_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif thumbnail.content.strip().lower() == "y":
await thumb_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would the thumbnail of the embed?(Please send a valid URL)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
match = re.match(
r"(?i)(https?:\/\/.*\.(?:png|jpg|gif|jpeg|JPG|JPEG|PNG|gif|gifv|webm))",
answer.content,
)
if match:
await msg.delete()
embed.set_thumbnail(url=answer.content)
else:
await msg.delete()
await ctx.send("Invalid URL, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif thumbnail.content.strip().lower() == "n":
await thumb_msg.delete()
except asyncio.TimeoutError:
await thumb_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
img_msg = await ctx.send(
embed=generate_embed("Would the announcement embed have image? [y/n]")
)
try:
image = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(image):
await img_msg.delete()
return await ctx.send("Cancelled!")
elif image.content.strip().lower() not in ["y", "n"]:
await img_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif image.content.lower() == "y":
await img_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the URL of the image?(Should be a valid URL)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
match = re.match(
r"(?i)(https?:\/\/.*\.(?:png|jpg|gif|jpeg|JPG|JPEG|PNG|gif|gifv|webm))",
answer.content,
)
if match:
await msg.delete()
embed.set_image(url=answer.content)
else:
await msg.delete()
await ctx.send("Invalid URL, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif image.content.strip().lower() == "n":
await img_msg.delete()
except asyncio.TimeoutError:
await img_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
color_msg = await ctx.send(
embed=generate_embed("Would the embed have color? [y/n]")
)
try:
color = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(color):
await color_msg.delete()
return await ctx.send("Cancelled!")
elif color.content.strip().lower() not in ["y", "n"]:
await color_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif color.content.strip().lower() == "y":
await color_msg.delete()
msg = await ctx.send(
embed=generate_embed(
"What would be the embed color?(Should be a valid hex color)"
)
)
try:
answer = await self.bot.wait_for(
"message", check=check, timeout=60.0
)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
match = re.match(r"^#(?:[0-9a-fA-F]{3}){1,2}$", answer.content)
if match:
color = answer.content.replace("#", "0x")
embed.color = int(color, 16)
await msg.delete()
else:
await msg.delete()
await ctx.send(
"Invalid Hex string, starting the command again.."
)
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif color.content.strip().lower() == "n":
await color_msg.delete()
except asyncio.TimeoutError:
await color_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
footer_msg = await ctx.send(embed=generate_embed('Would the announcement embed have a footer? [y/n]'))
try:
footer = await self.bot.wait_for('message', check=check, timeout=60.0)
if cancel_check(footer):
await footer_msg.delete()
return await ctx.send('Cancelled!')
elif footer.content.strip().lower() not in ["y", "n"]:
await footer_msg.delete()
await ctx.send("Invalid option, starting the command again..")
await asyncio.sleep(1)
return await ctx.invoke(ctx.command, channel)
elif footer.content.strip().lower() == 'y':
await footer_msg.delete()
msg = await ctx.send(embed=generate_embed('What would be the footer text?(Must be within 2048 characters)'))
try:
answer = await self.bot.wait_for("message", check=footer_check, timeout=60.0)
if cancel_check(answer):
await msg.delete()
return await ctx.send("Cancelled!")
await msg.delete()
embed.set_footer(text=answer.content)
except asyncio.TimeoutError:
await msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
elif footer.content.strip().lower() == 'n':
await footer_msg.delete()
except asyncio.TimeoutError:
await footer_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
time_msg = await ctx.send(embed=generate_embed('When should the announcement be posted? (Must be a valid human-readable time e.g. 10s/10m/10h, max 24h)'))
try:
time = await self.bot.wait_for("message", check=check, timeout=60.0)
if cancel_check(time):
await time_msg.delete()
return await ctx.send("Cancelled!")
await time_msg.delete()
try:
parsed_time = parse(time.content)
except Exception:
return await ctx.send(f":negative_squared_cross_mark: | The given time is invalid or the given time is more than max time(24hr)")
except asyncio.TimeoutError:
await time_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
announcement_id = generate_id()
embed_details = f'{embed.to_dict()}'.replace("'", '"')
await self.bot.pool.execute("INSERT INTO timed_announcements(announcement_id, channel_id, embed_details, expires) VALUES($1, $2, $3, $4);", announcement_id, channel.id, embed_details, parsed_time)
await self.bot.cache.cache_timed_announcements()
await self.bot.cache.list_timed_announcements()
await self.bot.pool.execute("INSERT INTO timed_announcement_backups(announcement_id, channel_id, embed_details, expires) VALUES($1, $2, $3, $4);", announcement_id, channel.id, embed_details, parsed_time)
await self.bot.cache.cache_backup_timed_announcements()
await ctx.reply(f":thumbsup: | Your announcement has been successfully added to the queue! If you would like to restore this announcement, please use the following command `{ctx.prefix}restore timed {announcement_id}`.")
else:
return await ctx.send("You don't have permissions to use this command!")
@announcement.command()
@commands.max_concurrency(1, commands.BucketType.guild)
async def timedRaw(self, ctx, channel: discord.TextChannel):
"""Interactively creates a timed raw announcement!"""
allowed = await check_allowed(ctx)
def check(msg: discord.Message):
return ctx.author == msg.author and ctx.channel == msg.channel
if (
ctx.author == ctx.guild.owner
or ctx.author.guild_permissions.administrator
or allowed
):
content_msg = await ctx.channel.send(embed=generate_embed('What would be the content of the announcement? (Must be within 2048 characters)'))
try:
content = await self.bot.wait_for("message", check=check, timeout=300.0)
if content.content.lower() == 'cancel' or content.content.lower() == f'{ctx.prefix}cancel':
return await ctx.send("Cancelled!")
elif len(content.content) >= 2048:
return await ctx.send("Content is too long, must be within 2048 characters!")
await content_msg.delete()
except asyncio.TimeoutError:
await content_msg.delete()
return await ctx.send("Cancelled as the session is inactive!")
time_msg = await ctx.send(embed=generate_embed('When should the announcement be posted? (Must be a valid human-readable time e.g. 10s/10m/10h, max 24h)'))
try:
time = await self.bot.wait_for("message", check=check, timeout=60.0)
if time.content.lower() == 'cancel' or time.content.lower() == f'{ctx.prefix}cancel':
await time_msg.delete()
return await ctx.send("Cancelled!")
await time_msg.delete()
try:
parsed_time = parse(time.content)
except Exception:
return await ctx.send(f":negative_squared_cross_mark: | The given time is invalid or the given time is more than max time(24hr)")
except asyncio.TimeoutError:
await time_msg.delete()
return await ctx.send("Cancelled the session as it's inactive!")
announcement_id = generate_id()
await self.bot.pool.execute("INSERT INTO timed_raw_announcements(announcement_id, channel_id, content, expires) VALUES($1, $2, $3, $4);", announcement_id, channel.id, content.content, parsed_time)
await self.bot.pool.execute("INSERT INTO timed_raw_announcement_backups(announcement_id, channel_id, content, expires) VALUES($1, $2, $3, $4);", announcement_id, channel.id, content.content, parsed_time)
await self.bot.cache.cache_timed_raw_announcements()
await self.bot.cache.list_timed_raw_announcements()
await self.bot.cache.cache_timed_raw_announcement_backups()
return await ctx.reply(f":thumbsup: | Your announcement has been successfully added to the queue! If you would like to restore this announcement, please use the following command `{ctx.prefix}restore timedRaw {announcement_id}`.")
else:
return await ctx.send("You don't have permissions to use this command!")
@announcement.command()
@commands.max_concurrency(1, commands.BucketType.guild)
async def raw(self, ctx, channel: discord.TextChannel):
"""Interactively creates a raw announcement!"""
allowed = await check_allowed(ctx)
def check(msg: discord.Message):
return ctx.author == msg.author and ctx.channel == msg.channel
if (
ctx.author == ctx.guild.owner
or ctx.author.guild_permissions.administrator
or allowed
):
content_msg = await ctx.channel.send(embed=generate_embed('What would be the content of the announcement? (Must be within 2048 characters)'))
try:
content = await self.bot.wait_for("message", check=check, timeout=300.0)
if content.content.lower() == 'cancel' or content.content.lower() == f'{ctx.prefix}cancel':
return await ctx.send("Cancelled!")
elif len(content.content) >= 2048:
return await ctx.send("Content is too long, must be within 2048 characters!")
await content_msg.delete()
announcement_id = generate_id()
await self.bot.pool.execute("INSERT INTO raw_announcements(announcement_id, channel_id, content) VALUES($1, $2, $3);", announcement_id, channel.id, content.content.strip())
await self.bot.cache.cache_raw_announcements()
await ctx.reply(f":thumbsup: | Your announcement has been successfully posted! If you would like to restore this announcement, please use the following command `{ctx.prefix}restore timedRaw {announcement_id}`.")
await channel.send(content.content)
except asyncio.TimeoutError:
await content_msg.delete()
return await ctx.send("Cancelled as the session is inactive!")
else:
return await ctx.send("You don't have permissions to use this command!")
def setup(bot):
bot.add_cog(Announcement(bot))
| 52.359521 | 242 | 0.506154 | 4,035 | 39,322 | 4.853779 | 0.059975 | 0.054327 | 0.066786 | 0.05974 | 0.952923 | 0.948583 | 0.937197 | 0.926883 | 0.907021 | 0.905131 | 0 | 0.010233 | 0.401048 | 39,322 | 750 | 243 | 52.429333 | 0.821332 | 0.006587 | 0 | 0.852941 | 1 | 0.022409 | 0.171396 | 0.016441 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019608 | false | 0 | 0.011204 | 0.014006 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d4f3f8df2f1080c4e6ff34a135c097aa88a362ba | 12,282 | py | Python | src/cuteSV/cuteSV_resolveINDEL.py | bnoyvert/cuteSV | 58ca0fa051f80f716ef69a39924102abdd4249a0 | [
"MIT"
] | null | null | null | src/cuteSV/cuteSV_resolveINDEL.py | bnoyvert/cuteSV | 58ca0fa051f80f716ef69a39924102abdd4249a0 | [
"MIT"
] | null | null | null | src/cuteSV/cuteSV_resolveINDEL.py | bnoyvert/cuteSV | 58ca0fa051f80f716ef69a39924102abdd4249a0 | [
"MIT"
] | null | null | null | import sys
import numpy as np
from collections import Counter
from cuteSV.cuteSV_genotype import cal_GL, cal_CIPOS, threshold_ref_count, count_coverage
import time
'''
*******************************************
TO DO LIST
*******************************************
1. Identify DP with samfile pointer;
2. Add CIPOS, CILEN and/or CIEND;
3. Determine (IM)PRECISE type.
*******************************************
'''
def resolution_DEL(path, chr, svtype, read_count, threshold_gloab, max_cluster_bias,
minimum_support_reads, bam_path, action, gt_round):
'''
cluster DEL
********************************************************************************************
path: DEL.sigs
chr: chromosome id
svtype: <DEL>
SEQTYPE read_count max_cluster_bias sv_size threshold_gloab threshold_local
--------------------------------------------------------------------------------------------
CCS 3 200 bp (<500 bp) 30 bp 0.4 0.5
CLR 5/10 200 bp (<500 bp) 50 bp 0.3 0.7
--------------------------------------------------------------------------------------------
Input file format
--------------------------------------------------------------------------------------------
column #1 #2 #3 #4 #5
DEL CHR BP LEN ID
#1 deletion type
#2 chromosome number
#3 breakpoint in each read
#4 DEL_len in each read
#5 read ID
********************************************************************************************
'''
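# Illustrative signature line this loop expects (tab-separated, columns as documented above);
# the coordinates and read name here are made up:
# DEL   chr1    10024   56    read_0001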
semi_del_cluster = list()
semi_del_cluster.append([0,0,''])
candidate_single_SV = list()
file = open(path, 'r')
for line in file:
seq = line.strip('\n').split('\t')
if seq[1] != chr:
continue
pos = int(seq[2])
indel_len = int(seq[3])
read_id = seq[4]
if pos - semi_del_cluster[-1][0] > max_cluster_bias:
if len(semi_del_cluster) >= read_count:
if semi_del_cluster[-1][0] == semi_del_cluster[-1][1] == 0:
pass
else:
generate_del_cluster(semi_del_cluster,
chr,
svtype,
read_count,
threshold_gloab,
# threshold_local,
minimum_support_reads,
candidate_single_SV,
bam_path,
max_cluster_bias,
action,
gt_round)
semi_del_cluster = []
semi_del_cluster.append([pos, indel_len, read_id])
else:
if semi_del_cluster[-1][0] == semi_del_cluster[-1][1] == 0:
semi_del_cluster = []
semi_del_cluster.append([pos, indel_len, read_id])
else:
semi_del_cluster.append([pos, indel_len, read_id])
if len(semi_del_cluster) >= read_count:
if semi_del_cluster[-1][0] == semi_del_cluster[-1][1] == 0:
pass
else:
generate_del_cluster(semi_del_cluster,
chr,
svtype,
read_count,
threshold_gloab,
# threshold_local,
minimum_support_reads,
candidate_single_SV,
bam_path,
max_cluster_bias,
action,
gt_round)
file.close()
return candidate_single_SV
def generate_del_cluster(semi_del_cluster, chr, svtype, read_count,
threshold_gloab, minimum_support_reads, candidate_single_SV,
bam_path, max_cluster_bias, action, gt_round):
'''
generate deletion
*************************************************************
threshold_gloab threshold_local minimum_support_reads
-------------------------------------------------------------
0.3 0.7 5 CLR
0.4 0.5 <=5 CCS
*************************************************************
'''
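# Worked example of the allele-splitting cut-off computed below (values illustrative):
# with threshold_gloab = 0.3 and a mean deletion length of 100 bp, two signatures whose
# lengths differ by more than 0.3 * 100 = 30 bp are assigned to separate alleles.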
# Remove duplicates
read_tag = dict()
for element in semi_del_cluster:
if element[2] not in read_tag:
read_tag[element[2]] = element
else:
if element[1] > read_tag[element[2]][1]:
read_tag[element[2]] = element
if len(read_tag) < read_count:
return
read_tag2SortedList = sorted(list(read_tag.values()), key = lambda x:x[1])
global_len = [i[1] for i in read_tag2SortedList]
DISCRETE_THRESHOLD_LEN_CLUSTER_DEL_TEMP = threshold_gloab * np.mean(global_len)
last_len = read_tag2SortedList[0][1]
allele_collect = list()
'''
*************************************************************
#1 #2 #3 #4
-------------------------------------------------------------
del-breakpoint del-len #support read-id
*************************************************************
'''
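# e.g. a finished allele entry might look like (illustrative values):
# [[10012, 10031], [54, 58], [2], ['read_0001', 'read_0007']]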
allele_collect.append([[read_tag2SortedList[0][0]],[read_tag2SortedList[0][1]],[],
[read_tag2SortedList[0][2]]])
for i in read_tag2SortedList[1:]:
if i[1] - last_len > DISCRETE_THRESHOLD_LEN_CLUSTER_DEL_TEMP:
allele_collect[-1][2].append(len(allele_collect[-1][0]))
allele_collect.append([[],[],[],[]])
allele_collect[-1][0].append(i[0])
allele_collect[-1][1].append(i[1])
allele_collect[-1][3].append(i[2])
last_len = i[1]
allele_collect[-1][2].append(len(allele_collect[-1][0]))
allele_sort = sorted(allele_collect, key = lambda x:x[2])
for allele in allele_sort:
if allele[2][0] >= minimum_support_reads:
breakpointStart = np.mean(allele[0])
search_threshold = np.min(allele[0])
CIPOS = cal_CIPOS(np.std(allele[0]), len(allele[0]))
signalLen = np.mean(allele[1])
signalLen_STD = np.std(allele[1])
CILEN = cal_CIPOS(np.std(allele[1]), len(allele[1]))
if action:
DV, DR, GT, GL, GQ, QUAL = call_gt(bam_path,
int(search_threshold),
chr,
allele[3],
max_cluster_bias,
gt_round)
else:
DR = '.'
GT = './.'
GL = '.,.,.'
GQ = "."
QUAL = "."
candidate_single_SV.append([chr,
svtype,
str(int(breakpointStart)),
str(int(-signalLen)),
str(allele[2][0]),
str(CIPOS),
str(CILEN),
str(DR),
str(GT),
str(GL),
str(GQ),
str(QUAL),
str(','.join(allele[3]))])
def resolution_INS(path, chr, svtype, read_count, threshold_gloab,
max_cluster_bias, minimum_support_reads, bam_path, action, gt_round):
'''
cluster INS
********************************************************************************************
path: INS.sigs
chr: chromosome id
svtype: <INS>
SEQTYPE read_count max_cluster_bias sv_size threshold_gloab threshold_local
--------------------------------------------------------------------------------------------
CCS 3 200 bp (<500 bp) 30 bp 0.65 0.7
CLR 5/10 100 bp (<500 bp) 50 bp 0.2 0.6
--------------------------------------------------------------------------------------------
Input file format
--------------------------------------------------------------------------------------------
column #1 #2 #3 #4 #5
INS CHR BP LEN ID
#1 insertion type
#2 chromosome number
#3 breakpoint in each read
#4 INS_len in each read
#5 read ID
********************************************************************************************
'''
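# Illustrative INS signature line (tab-separated, columns as documented above); an optional
# sixth column may carry the inserted sequence. The values below are made up:
# INS   chr1    10024   56    read_0001   ACGTACGT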
semi_ins_cluster = list()
semi_ins_cluster.append([0,0,'',''])
candidate_single_SV = list()
file = open(path, 'r')
for line in file:
seq = line.strip('\n').split('\t')
if seq[1] != chr:
continue
pos = int(seq[2])
indel_len = int(seq[3])
read_id = seq[4]
try:
ins_seq = seq[5]
except IndexError:
ins_seq = ''
if pos - semi_ins_cluster[-1][0] > max_cluster_bias:
if len(semi_ins_cluster) >= read_count:
if semi_ins_cluster[-1][0] == semi_ins_cluster[-1][1] == 0:
pass
else:
generate_ins_cluster(semi_ins_cluster,
chr,
svtype,
read_count,
threshold_gloab,
# threshold_local,
minimum_support_reads,
candidate_single_SV,
bam_path,
max_cluster_bias,
action,
gt_round)
semi_ins_cluster = []
semi_ins_cluster.append([pos, indel_len, read_id, ins_seq])
else:
if semi_ins_cluster[-1][0] == semi_ins_cluster[-1][1] == 0:
semi_ins_cluster = []
semi_ins_cluster.append([pos, indel_len, read_id, ins_seq])
else:
semi_ins_cluster.append([pos, indel_len, read_id, ins_seq])
if len(semi_ins_cluster) >= read_count:
if semi_ins_cluster[-1][0] == semi_ins_cluster[-1][1] == 0:
pass
else:
generate_ins_cluster(semi_ins_cluster,
chr,
svtype,
read_count,
threshold_gloab,
# threshold_local,
minimum_support_reads,
candidate_single_SV,
bam_path,
max_cluster_bias,
action,
gt_round)
file.close()
return candidate_single_SV
def generate_ins_cluster(semi_ins_cluster, chr, svtype, read_count,
threshold_gloab, minimum_support_reads, candidate_single_SV,
bam_path, max_cluster_bias, action, gt_round):
'''
generate insertion
*************************************************************
threshold_gloab threshold_local minimum_support_reads
-------------------------------------------------------------
0.2 0.6 5 CLR
0.65 0.7 <=5 CCS
*************************************************************
'''
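# Worked example (illustrative): with threshold_gloab = 0.65 and a mean insertion length
# of 100 bp, signatures whose lengths differ by more than 65 bp start a new allele.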
# Remove duplicates
read_tag = dict()
for element in semi_ins_cluster:
if element[2] not in read_tag:
read_tag[element[2]] = element
else:
if element[1] > read_tag[element[2]][1]:
read_tag[element[2]] = element
if len(read_tag) < read_count:
return
read_tag2SortedList = sorted(list(read_tag.values()), key = lambda x:x[1])
# start&end breakpoint
global_len = [i[1] for i in read_tag2SortedList]
DISCRETE_THRESHOLD_LEN_CLUSTER_INS_TEMP = threshold_gloab * np.mean(global_len)
last_len = read_tag2SortedList[0][1]
allele_collect = list()
allele_collect.append([[read_tag2SortedList[0][0]],
[read_tag2SortedList[0][1]],
[],
[read_tag2SortedList[0][2]],
[read_tag2SortedList[0][3]]])
for i in read_tag2SortedList[1:]:
if i[1] - last_len > DISCRETE_THRESHOLD_LEN_CLUSTER_INS_TEMP:
allele_collect[-1][2].append(len(allele_collect[-1][0]))
allele_collect.append([[],[],[],[],[]])
allele_collect[-1][0].append(i[0])
allele_collect[-1][1].append(i[1])
allele_collect[-1][3].append(i[2])
allele_collect[-1][4].append(i[3])
last_len = i[1]
allele_collect[-1][2].append(len(allele_collect[-1][0]))
allele_sort = sorted(allele_collect, key = lambda x:x[2])
for allele in allele_sort:
if allele[2][0] >= minimum_support_reads:
breakpointStart = np.mean(allele[0])
CIPOS = cal_CIPOS(np.std(allele[0]), len(allele[0]))
signalLen = np.mean(allele[1])
signalLen_STD = np.std(allele[1])
CILEN = cal_CIPOS(np.std(allele[1]), len(allele[1]))
ideal_ins_seq = '<INS>'
for i in allele[4]:
if len(i) >= int(signalLen):
ideal_ins_seq = i[0:int(signalLen)]
break
if ideal_ins_seq == '<INS>':
continue
if action:
DV, DR, GT, GL, GQ, QUAL = call_gt(bam_path,
int(breakpointStart),
chr,
allele[3],
# max_cluster_bias,
1000,
gt_round)
else:
DR = '.'
GT = './.'
GL = '.,.,.'
GQ = "."
QUAL = "."
candidate_single_SV.append([chr,
svtype,
str(int(breakpointStart)),
str(int(signalLen)),
str(allele[2][0]),
str(CIPOS),
str(CILEN),
str(DR),
str(GT),
str(GL),
str(GQ),
str(QUAL),
str(','.join(allele[3])),
ideal_ins_seq])
def run_del(args):
return resolution_DEL(*args)
def run_ins(args):
return resolution_INS(*args)
def call_gt(bam_path, search_threshold, chr, read_id_list, max_cluster_bias, gt_round):
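# Genotyping sketch: reads are collected in a window of +/- max_cluster_bias around
# search_threshold; reads covering the locus that are not in read_id_list count as
# reference-supporting (DR), DV is the number of supporting reads (len(read_id_list)),
# and cal_GL converts the two counts into GT/GL/GQ/QUAL.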
import pysam
querydata = set()
bamfile = pysam.AlignmentFile(bam_path)
search_start = max(int(search_threshold) - max_cluster_bias, 0)
search_end = min(int(search_threshold) + max_cluster_bias, bamfile.get_reference_length(chr))
up_bound = threshold_ref_count(len(read_id_list))
status = count_coverage(chr,
search_start,
search_end,
bamfile,
querydata,
up_bound,
gt_round)
bamfile.close()
if status == -1:
DR = '.'
GT = "./."
GL = ".,.,."
GQ = "."
QUAL = "."
# elif status == 1:
# pass
else:
DR = 0
for query in querydata:
if query not in read_id_list:
DR += 1
GT, GL, GQ, QUAL = cal_GL(DR, len(read_id_list))
return len(read_id_list), DR, GT, GL, GQ, QUAL
| 28.830986 | 94 | 0.544537 | 1,550 | 12,282 | 4.054839 | 0.108387 | 0.036595 | 0.044551 | 0.022912 | 0.814002 | 0.793158 | 0.76961 | 0.76961 | 0.76961 | 0.74813 | 0 | 0.029315 | 0.211203 | 12,282 | 425 | 95 | 28.898824 | 0.619426 | 0.277561 | 0 | 0.723776 | 0 | 0 | 0.006025 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024476 | false | 0.013986 | 0.020979 | 0.006993 | 0.06993 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
be179c5d4db633c386aacda74cd349e1ef5cac11 | 180 | py | Python | python/python_crash_course/project_data_visualization/dice.py | lmonsalve22/Learning-to-Code | 2e32eba3fbd0bd63cc539e1e6d372ca346b765c9 | [
"MIT"
] | null | null | null | python/python_crash_course/project_data_visualization/dice.py | lmonsalve22/Learning-to-Code | 2e32eba3fbd0bd63cc539e1e6d372ca346b765c9 | [
"MIT"
] | null | null | null | python/python_crash_course/project_data_visualization/dice.py | lmonsalve22/Learning-to-Code | 2e32eba3fbd0bd63cc539e1e6d372ca346b765c9 | [
"MIT"
] | null | null | null | from random import randint
class Dice:
def __init__(self, num_sides=6):
self.num_sides = num_sides
def roll(self):
return randint(1, self.num_sides)
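# Minimal usage sketch (default six-sided die):
# die = Dice()
# print(die.roll())  # an integer from 1 to num_sides inclusive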
| 18 | 41 | 0.65 | 26 | 180 | 4.192308 | 0.576923 | 0.293578 | 0.330275 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015152 | 0.266667 | 180 | 9 | 42 | 20 | 0.810606 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0.166667 | 0.833333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
077dad708e548c74f6a829d4d3b49055f3587643 | 1,475 | py | Python | Python_Codes/calculator3.py | arnelimperial/Code-Py | c48be58027e99f12a358644b45d502c8fcbd3b98 | [
"Zlib"
] | null | null | null | Python_Codes/calculator3.py | arnelimperial/Code-Py | c48be58027e99f12a358644b45d502c8fcbd3b98 | [
"Zlib"
] | null | null | null | Python_Codes/calculator3.py | arnelimperial/Code-Py | c48be58027e99f12a358644b45d502c8fcbd3b98 | [
"Zlib"
] | null | null | null | #!/usr/bin/env python3
print("Calculator")
s = int(input("Give the first number:"))
t = int(input("Give the second number:"))
print("\n\n(1) +\n(2) -\n(3) *\n(4) /\n"
"(5) Change numbers\n(6) Quit")
print("Current numbers:" ,s,t)
summer = True
while summer:
eventhorizon = int(input("Please select something (1-6):"))
if eventhorizon == 1:
print("The result is:",s + t)
print("\n\n(1) +\n(2) -\n(3) *\n(4) /\n"
"(5) Change numbers\n(6) Quit")
print("Current numbers:" ,s,t)
elif eventhorizon == 2:
print("The result is:",s - t)
print("\n\n(1) +\n(2) -\n(3) *\n(4) /\n"
"(5) Change numbers\n(6) Quit")
print("Current numbers:" ,s,t)
elif eventhorizon == 3:
print("The result is:",s * t)
print("\n\n(1) +\n(2) -\n(3) *\n(4) /\n"
"(5) Change numbers\n(6) Quit")
print("Current numbers:" ,s,t)
elif eventhorizon == 4:
if t == 0:
print("Cannot divide by zero!")
else:
print("The result is:",s / t)
print("\n\n(1) +\n(2) -\n(3) *\n(4) /\n"
"(5) Change numbers\n(6) Quit")
print("Current numbers:" ,s,t)
elif eventhorizon == 5:
s = int(input("Give the first number:"))
t = int(input("Give the second number:"))
print("\n\n(1) +\n(2) -\n(3) *\n(4) /\n"
"(5) Change numbers\n(6) Quit")
print("Current numbers:" ,s,t)
elif eventhorizon == 6:
print("Thank you!")
break
| 35.119048 | 63 | 0.495593 | 224 | 1,475 | 3.263393 | 0.178571 | 0.02736 | 0.057456 | 0.065663 | 0.818057 | 0.818057 | 0.818057 | 0.818057 | 0.818057 | 0.818057 | 0 | 0.042776 | 0.28678 | 1,475 | 42 | 64 | 35.119048 | 0.652091 | 0.014237 | 0 | 0.578947 | 0 | 0.157895 | 0.448418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.473684 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
07a3d32f6108e518b392253c1e1b882909a3ed27 | 85 | py | Python | lambdata_mali_tree_classifier/_init_.py | cartman12/lambdata_mali_tree_classifier | 62cf7d5c105e4da05e4467f0b0b73338d70d59c0 | [
"MIT"
] | null | null | null | lambdata_mali_tree_classifier/_init_.py | cartman12/lambdata_mali_tree_classifier | 62cf7d5c105e4da05e4467f0b0b73338d70d59c0 | [
"MIT"
] | null | null | null | lambdata_mali_tree_classifier/_init_.py | cartman12/lambdata_mali_tree_classifier | 62cf7d5c105e4da05e4467f0b0b73338d70d59c0 | [
"MIT"
] | null | null | null | from lambdata_mali_tree_classifier.lambdata_mali_tree_classifier import fit, predict
| 42.5 | 84 | 0.917647 | 12 | 85 | 6 | 0.666667 | 0.333333 | 0.444444 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 85 | 1 | 85 | 85 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
07fbb03bc39e6d6a059fef478943486d3da9af97 | 40 | py | Python | python/sub/test.py | robotlightsyou/test | 015f13943fc402d8ce86c5f6d2f5a7d032b3340a | [
"MIT"
] | 2 | 2019-05-26T15:09:34.000Z | 2021-09-12T08:01:23.000Z | python/sub/test.py | robotlightsyou/test | 015f13943fc402d8ce86c5f6d2f5a7d032b3340a | [
"MIT"
] | null | null | null | python/sub/test.py | robotlightsyou/test | 015f13943fc402d8ce86c5f6d2f5a7d032b3340a | [
"MIT"
] | 1 | 2021-04-11T20:28:21.000Z | 2021-04-11T20:28:21.000Z | import find_this
print(find_this.VALUE)
| 13.333333 | 22 | 0.85 | 7 | 40 | 4.571429 | 0.714286 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 40 | 2 | 23 | 20 | 0.864865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
ed4a10516a487dafbcb71a33b31d31516595442c | 124 | py | Python | test.py | parul6571/test-oct | 40b73b79e68f06d954f3ae344086b6ae7e581ab9 | [
"MIT"
] | null | null | null | test.py | parul6571/test-oct | 40b73b79e68f06d954f3ae344086b6ae7e581ab9 | [
"MIT"
] | null | null | null | test.py | parul6571/test-oct | 40b73b79e68f06d954f3ae344086b6ae7e581ab9 | [
"MIT"
] | 1 | 2021-10-03T15:47:30.000Z | 2021-10-03T15:47:30.000Z | print("ATGATAATGATAGATAGTAGT")
print("ATGATAATGATAGATAGTAGT")
print("ATGATAATGATAGATAGTAGT")
print("ATGATAATGATAGATAGTAGT")
| 24.8 | 30 | 0.83871 | 8 | 124 | 13 | 0.25 | 1 | 0.894231 | 1.5 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 124 | 4 | 31 | 31 | 0.866667 | 0 | 0 | 1 | 0 | 0 | 0.677419 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 13 |
ed89cf2b7e8a056b03b256f0d31cd2fb610c039d | 4,660 | py | Python | tests/molecular/molecules/molecule/fixtures/cage/metal_topologies/m12l24.py | andrewtarzia/stk | 1ac2ecbb5c9940fe49ce04cbf5603fd7538c475a | [
"MIT"
] | 21 | 2018-04-12T16:25:24.000Z | 2022-02-14T23:05:43.000Z | tests/molecular/molecules/molecule/fixtures/cage/metal_topologies/m12l24.py | JelfsMaterialsGroup/stk | 0d3e1b0207aa6fa4d4d5ee8dfe3a29561abb08a2 | [
"MIT"
] | 8 | 2019-03-19T12:36:36.000Z | 2020-11-11T12:46:00.000Z | tests/molecular/molecules/molecule/fixtures/cage/metal_topologies/m12l24.py | supramolecular-toolkit/stk | 0d3e1b0207aa6fa4d4d5ee8dfe3a29561abb08a2 | [
"MIT"
] | 5 | 2018-08-07T13:00:16.000Z | 2021-11-01T00:55:10.000Z | import pytest
import stk
from ...building_blocks import get_pd_atom, get_linker
from ....case_data import CaseData
@pytest.fixture(
scope='session',
params=(
lambda name: CaseData(
molecule=stk.ConstructedMolecule(
stk.cage.M12L24(
building_blocks=(
get_pd_atom(),
get_linker(),
),
reaction_factory=stk.DativeReactionFactory(
stk.GenericReactionFactory(
bond_orders={
frozenset({
stk.GenericFunctionalGroup,
stk.SingleAtom
}): 9
}
)
)
)
),
smiles=(
'[H]C1=C([H])C2=C([H])C(=C1[H])C1=C([H])C([H])=N(->['
'Pd+2]34<-N5=C([H])C([H])=C(C([H])=C5[H])C5=C([H])C('
'[H])=C([H])C(=C5[H])C5=C([H])C([H])=N(->[Pd+2]67<-N'
'8=C([H])C([H])=C(C([H])=C8[H])C8=C([H])C([H])=C([H]'
')C(=C8[H])C8=C([H])C([H])=N(->[Pd+2]9(<-N%10=C([H])'
'C([H])=C(C([H])=C%10[H])C%10=C([H])C([H])=C([H])C(='
'C%10[H])C%10=C([H])C([H])=N(->[Pd+2]%11%12<-N%13=C('
'[H])C([H])=C(C([H])=C%13[H])C%13=C([H])C(=C([H])C(['
'H])=C%13[H])C%13=C([H])C([H])=N(->[Pd+2]%14(<-N%15='
'C([H])C([H])=C(C([H])=C%15[H])C%15=C([H])C([H])=C(['
'H])C(=C%15[H])C%15=C([H])C([H])=N(->[Pd+2]%16(<-N%1'
'7=C([H])C([H])=C(C([H])=C%17[H])C%17=C([H])C([H])=C'
'([H])C(=C%17[H])C%17=C([H])C([H])=N(->[Pd+2]%18(<-N'
'%19=C([H])C([H])=C(C([H])=C%19[H])C%19=C([H])C(=C(['
'H])C([H])=C%19[H])C%19=C([H])C([H])=N->%14C([H])=C%'
'19[H])<-N%14=C([H])C([H])=C(C([H])=C%14[H])C%14=C(['
'H])C(=C([H])C([H])=C%14[H])C%14=C([H])C([H])=N(->[P'
'd+2]%19(<-N%20=C([H])C([H])=C(C([H])=C%20[H])C%20=C'
'([H])C([H])=C([H])C(=C%20[H])C%20=C([H])C([H])=N(->'
'[Pd+2]%21(<-N%22=C([H])C([H])=C(C([H])=C%22[H])C%22='
'C([H])C([H])=C([H])C(=C%22[H])C%22=C([H])C([H])=N->%'
'18C([H])=C%22[H])<-N%18=C([H])C([H])=C(C([H])=C%18[H'
'])C%18=C([H])C(=C([H])C([H])=C%18[H])C%18=C([H])C([H'
'])=N(->[Pd+2](<-N%22=C([H])C([H])=C(C([H])=C%22[H])C'
'%22=C([H])C([H])=C([H])C(=C%22[H])C%22=C([H])C([H])='
'N(->[Pd+2](<-N%23=C([H])C([H])=C(C([H])=C%23[H])C%2'
'3=C([H])C([H])=C([H])C(=C%23[H])C%23=C([H])C([H])=N'
'->%16C([H])=C%23[H])(<-N%16=C([H])C([H])=C(C([H])=C'
'%16[H])C%16=C([H])C([H])=C([H])C(=C%16[H])C%16=C([H'
'])C([H])=N->%21C([H])=C%16[H])<-N%16=C([H])C([H])=C'
'2C([H])=C%16[H])C([H])=C%22[H])(<-N2=C([H])C([H])=C'
'(C([H])=C2[H])C2=C([H])C([H])=C([H])C(=C2[H])C2=C(['
'H])C([H])=N->6C([H])=C2[H])<-N2=C([H])C([H])=C(C([H'
'])=C2[H])C2=C([H])C([H])=C([H])C(=C2[H])C2=C([H])C('
'[H])=N(->[Pd+2](<-N6=C([H])C([H])=C(C([H])=C6[H])C6'
'=C([H])C([H])=C([H])C(=C6[H])C6=C([H])C([H])=N->%11'
'C([H])=C6[H])(<-N6=C([H])C([H])=C(C([H])=C6[H])C6=C'
'([H])C(=C([H])C([H])=C6[H])C6=C([H])C([H])=N->%19C('
'[H])=C6[H])<-N6=C([H])C([H])=C(C([H])=C6[H])C6=C([H]'
')C(=C([H])C([H])=C6[H])C6=C([H])C([H])=N->7C([H])=C6'
'[H])C([H])=C2[H])C([H])=C%18[H])C([H])=C%20[H])<-N2='
'C([H])C([H])=C(C([H])=C2[H])C2=C([H])C([H])=C([H])C('
'=C2[H])C2=C([H])C([H])=N->%12C([H])=C2[H])C([H])=C%14'
'[H])C([H])=C%17[H])<-N2=C([H])C([H])=C(C([H])=C2[H])'
'C2=C([H])C(=C([H])C([H])=C2[H])C2=C([H])C([H])=N->3'
'C([H])=C2[H])C([H])=C%15[H])<-N2=C([H])C([H])=C(C(['
'H])=C2[H])C2=C([H])C([H])=C([H])C(=C2[H])C2=C([H])C'
'([H])=N->9C([H])=C2[H])C([H])=C%13[H])C([H])=C%10[H'
'])<-N2=C([H])C([H])=C(C([H])=C2[H])C2=C([H])C(=C([H'
'])C([H])=C2[H])C2=C([H])C([H])=N->4C([H])=C2[H])C([H'
'])=C8[H])C([H])=C5[H])C([H])=C1[H]'
),
name=name,
),
),
)
def metal_cage_m12l24(request) -> CaseData:
return request.param(
f'{request.fixturename}{request.param_index}',
)
| 51.208791 | 71 | 0.314163 | 905 | 4,660 | 1.60221 | 0.096133 | 0.281379 | 0.326897 | 0.237241 | 0.638621 | 0.573103 | 0.564828 | 0.471724 | 0.384828 | 0.308276 | 0 | 0.086271 | 0.306009 | 4,660 | 90 | 72 | 51.777778 | 0.36209 | 0 | 0 | 0.057471 | 0 | 0.586207 | 0.568026 | 0.566524 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011494 | false | 0 | 0.045977 | 0.011494 | 0.068966 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9c091b0b06abad07f033205d2aa9de0bd31fa3ba | 10,827 | py | Python | tests/test_pipeline_idamidseq.py | Multiscale-Genomics/mg-process-fastq | 50c7115c0c1a6af48dc34f275e469d1b9eb02999 | [
"Apache-2.0"
] | 2 | 2017-07-31T11:45:46.000Z | 2017-08-09T09:32:35.000Z | tests/test_pipeline_idamidseq.py | Multiscale-Genomics/mg-process-fastq | 50c7115c0c1a6af48dc34f275e469d1b9eb02999 | [
"Apache-2.0"
] | 28 | 2016-11-17T11:12:32.000Z | 2018-11-02T14:09:13.000Z | tests/test_pipeline_idamidseq.py | Multiscale-Genomics/mg-process-fastq | 50c7115c0c1a6af48dc34f275e469d1b9eb02999 | [
"Apache-2.0"
] | 4 | 2017-02-12T17:47:21.000Z | 2018-05-29T08:16:27.000Z | """
.. See the NOTICE file distributed with this work for additional information
regarding copyright ownership.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import print_function
import os.path
import pytest
from basic_modules.metadata import Metadata
from process_damidseq import process_damidseq
@pytest.mark.idamidseq
@pytest.mark.pipeline
def test_idamidseq_pipeline_00():
"""
Test case to ensure that the iDamID-seq pipeline code works.
Running the pipeline with the test data from the command line:
.. code-block:: none
runcompss \\
--lang=python \\
--library_path=${HOME}/bin \\
--pythonpath=/<pyenv_virtenv_dir>/lib/python2.7/site-packages/ \\
--log_level=debug \\
process_damidseq.py \\
--taxon_id 9606 \\
--genome /<dataset_dir>/Human.GCA_000001405.22.fasta \\
--assembly GRCh38 \\
--file /<dataset_dir>/DRR000150.22.fastq
"""
resource_path = os.path.join(os.path.dirname(__file__), "data/")
files = {
'genome': resource_path + 'idear.Human.GCA_000001405.22.fasta',
'index': resource_path + 'idear.Human.GCA_000001405.22.fasta.bwa.tar.gz',
'fastq_1': resource_path + 'idear.Human.SRR3714775.fastq',
'fastq_2': resource_path + 'idear.Human.SRR3714776.fastq',
'bg_fastq_1': resource_path + 'idear.Human.SRR3714777.fastq',
'bg_fastq_2': resource_path + 'idear.Human.SRR3714778.fastq',
}
metadata = {
"genome": Metadata(
"Assembly", "fasta", files['genome'], None,
{'assembly': 'GCA_000001405.22'}),
"index": Metadata(
"Index", "bwa_index", files['index'], files['genome'],
{'assembly': 'GCA_000001405.22', "tool": "bwa_indexer"}),
"fastq_1": Metadata(
"data_idamid_seq", "fastq", files['fastq_1'], None,
{'assembly': 'GCA_000001405.22'}
),
"fastq_2": Metadata(
"data_idamid_seq", "fastq", files['fastq_2'], None,
{'assembly': 'GCA_000001405.22'}
),
"bg_fastq_1": Metadata(
"data_idamid_seq", "fastq", files['bg_fastq_1'], None,
{'assembly': 'GCA_000001405.22'}
),
"bg_fastq_2": Metadata(
"data_idamid_seq", "fastq", files['bg_fastq_2'], None,
{'assembly': 'GCA_000001405.22'}
),
}
config_param = {
"idear_title": "Full genome sequences for Homo sapiens (GRCh38)",
"idear_description": "Full genome sequences for Homo sapiens (GRCh38)",
"idear_common_name": "Human",
"idear_organism": "Homo sapiens",
"idear_provider": "ENA",
"idear_release_date": "2013",
"idear_sample_param": "Nup98",
"idear_background_param": "GFP",
"execution": resource_path
}
files_out = {
"bam": [
files['fastq_1'].replace(".fastq", ".bam"),
files['fastq_2'].replace(".fastq", ".bam")
],
"bg_bam": [
files['bg_fastq_1'].replace(".fastq", ".bam"),
files['bg_fastq_2'].replace(".fastq", ".bam")
],
"bam_filtered": [
files['fastq_1'].replace(".fastq", ".filtered.bam"),
files['fastq_2'].replace(".fastq", ".filtered.bam")
],
"bg_bam_filtered": [
files['bg_fastq_1'].replace(".fastq", ".filtered.bam"),
files['bg_fastq_2'].replace(".fastq", ".filtered.bam")
],
"bsgenome": resource_path + "idear.Human.GCA_000001405.22.22.bsgenome.tar.gz",
"chrom_size": resource_path + "chrom.size",
"genome_2bit": resource_path + "idear.Human.GCA_000001405.22.2bit",
"seed_file": resource_path + "idear.Human.GCA_000001405.22.seed",
"bigwig": resource_path + "idear.Human.Nup98-GFP.bw"
}
damidseq_handle = process_damidseq(config_param)
damidseq_files, damidseq_meta = damidseq_handle.run(files, metadata, files_out) # pylint: disable=unused-variable
print(damidseq_files)
# Add tests for all files created
for f_out in damidseq_files:
print("iDamID-SEQ RESULTS FILE:", f_out)
# assert damidseq_files[f_out] == files_out[f_out]
if isinstance(damidseq_files[f_out], list):
for sub_file_out in damidseq_files[f_out]:
assert os.path.isfile(sub_file_out) is True
assert os.path.getsize(sub_file_out) > 0
try:
os.remove(sub_file_out)
except OSError as ose:
print("Error: %s - %s." % (ose.filename, ose.strerror))
else:
assert os.path.isfile(damidseq_files[f_out]) is True
assert os.path.getsize(damidseq_files[f_out]) > 0
try:
os.remove(damidseq_files[f_out])
except OSError as ose:
print("Error: %s - %s." % (ose.filename, ose.strerror))
@pytest.mark.idamidseq
@pytest.mark.pipeline
def test_idamidseq_pipeline_01():
"""
Test case to ensure that the iDamID-seq pipeline code works.
Running the pipeline with the test data from the command line:
.. code-block:: none
runcompss \\
--lang=python \\
--library_path=${HOME}/bin \\
--pythonpath=/<pyenv_virtenv_dir>/lib/python2.7/site-packages/ \\
--log_level=debug \\
process_damidseq.py \\
--taxon_id 9606 \\
--genome /<dataset_dir>/Human.GCA_000001405.22.fasta \\
--assembly GRCh38 \\
--file /<dataset_dir>/DRR000150.22.fastq
"""
resource_path = os.path.join(os.path.dirname(__file__), "data/")
files = {
'genome_public': resource_path + 'idear.Human.GCA_000001405.22.fasta',
'index_public': resource_path + 'idear.Human.GCA_000001405.22.fasta.bwa.tar.gz',
'fastq_1': resource_path + 'idear.Human.SRR3714775.fastq',
'fastq_2': resource_path + 'idear.Human.SRR3714776.fastq',
'bg_fastq_1': resource_path + 'idear.Human.SRR3714777.fastq',
'bg_fastq_2': resource_path + 'idear.Human.SRR3714778.fastq',
}
metadata = {
"genome_public": Metadata(
"Assembly", "fasta", files['genome_public'], None,
{'assembly': 'GCA_000001405.22'}),
"index_public": Metadata(
"Index", "bwa_index", files['index_public'], files['genome_public'],
{'assembly': 'GCA_000001405.22', "tool": "bwa_indexer"}),
"fastq_1": Metadata(
"data_idamid_seq", "fastq", files['fastq_1'], None,
{'assembly': 'GCA_000001405.22'}
),
"fastq_2": Metadata(
"data_idamid_seq", "fastq", files['fastq_2'], None,
{'assembly': 'GCA_000001405.22'}
),
"bg_fastq_1": Metadata(
"data_idamid_seq", "fastq", files['bg_fastq_1'], None,
{'assembly': 'GCA_000001405.22'}
),
"bg_fastq_2": Metadata(
"data_idamid_seq", "fastq", files['bg_fastq_2'], None,
{'assembly': 'GCA_000001405.22'}
),
}
config_param = {
"idear_title": "Full genome sequences for Homo sapiens (GRCh38)",
"idear_description": "Full genome sequences for Homo sapiens (GRCh38)",
"idear_common_name": "Human",
"idear_organism": "Homo sapiens",
"idear_provider": "ENA",
"idear_release_date": "2013",
"idear_sample_param": "Nup98",
"idear_background_param": "GFP",
}
files_out = {
"bam": [
files['fastq_1'].replace(".fastq", ".bam"),
files['fastq_2'].replace(".fastq", ".bam")
],
"bg_bam": [
files['bg_fastq_1'].replace(".fastq", ".bam"),
files['bg_fastq_2'].replace(".fastq", ".bam")
],
"bam_filtered": [
files['fastq_1'].replace(".fastq", ".filtered.bam"),
files['fastq_2'].replace(".fastq", ".filtered.bam")
],
"bg_bam_filtered": [
files['bg_fastq_1'].replace(".fastq", ".filtered.bam"),
files['bg_fastq_2'].replace(".fastq", ".filtered.bam")
],
"bsgenome": resource_path + "idear.Human.GCA_000001405.22.22.bsgenome.tar.gz",
"chrom_size": resource_path + "chrom.size",
"genome_2bit": resource_path + "idear.Human.GCA_000001405.22.2bit",
"seed_file": resource_path + "idear.Human.GCA_000001405.22.seed",
"bigwig": resource_path + "idear.Human.Nup98-GFP.bw"
}
damidseq_handle = process_damidseq(config_param)
damidseq_files, damidseq_meta = damidseq_handle.run(files, metadata, files_out) # pylint: disable=unused-variable
print(damidseq_files)
# Add tests for all files created
for f_out in damidseq_files:
print("iDamID-SEQ RESULTS FILE:", f_out)
# assert damidseq_files[f_out] == files_out[f_out]
if isinstance(damidseq_files[f_out], list):
for sub_file_out in damidseq_files[f_out]:
assert os.path.isfile(sub_file_out) is True
assert os.path.getsize(sub_file_out) > 0
try:
os.remove(sub_file_out)
except OSError as ose:
print("Error: %s - %s." % (ose.filename, ose.strerror))
else:
assert os.path.isfile(damidseq_files[f_out]) is True
assert os.path.getsize(damidseq_files[f_out]) > 0
try:
os.remove(damidseq_files[f_out])
except OSError as ose:
print("Error: %s - %s." % (ose.filename, ose.strerror))
| 40.399254 | 118 | 0.556941 | 1,196 | 10,827 | 4.80602 | 0.178094 | 0.052192 | 0.058455 | 0.076548 | 0.875783 | 0.864649 | 0.843076 | 0.843076 | 0.843076 | 0.824983 | 0 | 0.058382 | 0.310243 | 10,827 | 267 | 119 | 40.550562 | 0.711302 | 0.236538 | 0 | 0.847826 | 0 | 0 | 0.334032 | 0.086345 | 0 | 0 | 0 | 0 | 0.043478 | 1 | 0.01087 | false | 0 | 0.027174 | 0 | 0.038043 | 0.048913 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9c5312db77fd1313819b89ebd537a50d23d1186c | 7,755 | py | Python | onnxruntime/test/testdata/transform/fusion/bias_softmax_gen.py | jamill/onnxruntime | 0565fecf46c4dd711c01a4106641946963bf7ff0 | [
"MIT"
] | 669 | 2018-12-03T22:00:31.000Z | 2019-05-06T19:42:49.000Z | onnxruntime/test/testdata/transform/fusion/bias_softmax_gen.py | jamill/onnxruntime | 0565fecf46c4dd711c01a4106641946963bf7ff0 | [
"MIT"
] | 440 | 2018-12-03T21:09:56.000Z | 2019-05-06T20:47:23.000Z | onnxruntime/test/testdata/transform/fusion/bias_softmax_gen.py | jamill/onnxruntime | 0565fecf46c4dd711c01a4106641946963bf7ff0 | [
"MIT"
] | 140 | 2018-12-03T21:15:28.000Z | 2019-05-06T18:02:36.000Z | import onnx
from onnx import OperatorSetIdProto, TensorProto, helper
add = helper.make_node("Add", ["input", "bias"], ["add_out"], "add")
reverseadd = helper.make_node("Add", ["bias", "input"], ["add_out"], "add")
softmax1 = helper.make_node("Softmax", ["add_out"], ["output"], "softmax", axis=1)
softmax3 = helper.make_node("Softmax", ["add_out"], ["output"], "softmax", axis=3)
softmax6 = helper.make_node("Softmax", ["add_out"], ["output"], "softmax", axis=6)
softmax_no_axis = helper.make_node("Softmax", ["add_out"], ["output"], "softmax")
onnxdomain = OperatorSetIdProto()
onnxdomain.version = 13
# The empty string ("") or absence of this field implies the operator set that is defined as part of the ONNX specification.
onnxdomain.domain = ""
msdomain = OperatorSetIdProto()
msdomain.version = 1
msdomain.domain = "com.microsoft"
opsets = [onnxdomain, msdomain]
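# Each graph below feeds Add(input, bias) into Softmax with a particular axis and bias
# broadcast shape; the saved .onnx files are presumably consumed by the bias-softmax
# fusion transformer tests, covering both fusable and deliberately non-fusable layouts.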
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax_no_axis],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info("input", TensorProto.FLOAT, ["d_1", "d_2"]),
helper.make_tensor_value_info("bias", TensorProto.FLOAT, ["d_1", "d_2"]),
],
[
helper.make_tensor_value_info("output", TensorProto.FLOAT, ["d_1", "d_2"]),
],
[],
),
opset_imports=opsets,
),
r"bias_softmax_fusion_simple_no_axis_opset13.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax1],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info("input", TensorProto.BFLOAT16, ["d_1", "d_2"]),
helper.make_tensor_value_info("bias", TensorProto.BFLOAT16, ["d_1", "d_2"]),
],
[
helper.make_tensor_value_info("output", TensorProto.BFLOAT16, ["d_1", "d_2"]),
],
[],
),
opset_imports=opsets,
),
r"bias_softmax_fusion_bfloat16.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax1],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info("input", TensorProto.FLOAT, ["d_1", "d_2"]),
helper.make_tensor_value_info("bias", TensorProto.FLOAT, ["d_1", "d_2"]),
],
[
helper.make_tensor_value_info("output", TensorProto.FLOAT, ["d_1", "d_2"]),
],
[],
)
),
r"bias_softmax_fusion_simple.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax6],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info(
"input",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
helper.make_tensor_value_info(
"bias",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", 1, 1, 1, "d_6", "d_7", "d_8"],
),
],
[
helper.make_tensor_value_info(
"output",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
],
[],
)
),
r"bias_softmax_fusion_middleones.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[reverseadd, softmax6],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info(
"input",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
helper.make_tensor_value_info(
"bias",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", 1, 1, 1, "d_6", "d_7", "d_8"],
),
],
[
helper.make_tensor_value_info(
"output",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
],
[],
)
),
r"bias_softmax_fusion_middleones_reversed.onnx",
)
# should NOT fuse
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax3],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info(
"input",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
helper.make_tensor_value_info(
"bias",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", 1, 1, 1, "d_6", "d_7", "d_8"],
),
],
[
helper.make_tensor_value_info(
"output",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
],
[],
)
),
r"bias_softmax_fusion_middleones_badaxis.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax6],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info(
"input",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
helper.make_tensor_value_info("bias", TensorProto.FLOAT, [1, 1, 1, 1, 1, 1, "d_6", "d_7", "d_8"]),
],
[
helper.make_tensor_value_info(
"output",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
],
[],
)
),
r"bias_softmax_fusion_allleadingones.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax6],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info(
"input",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
helper.make_tensor_value_info("bias", TensorProto.FLOAT, [1, 1, "d_6", "d_7", "d_8"]),
],
[
helper.make_tensor_value_info(
"output",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
],
[],
)
),
r"bias_softmax_fusion_someleadingones.onnx",
)
onnx.save(
helper.make_model(
helper.make_graph(
[add, softmax6],
"Add_Softmax_Fusion",
[
helper.make_tensor_value_info(
"input",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
helper.make_tensor_value_info("bias", TensorProto.FLOAT, ["d_6", "d_7", "d_8"]),
],
[
helper.make_tensor_value_info(
"output",
TensorProto.FLOAT,
["d_0", "d_1", "d_2", "d_3", "d_4", "d_5", "d_6", "d_7", "d_8"],
),
],
[],
)
),
r"bias_softmax_fusion_noleadingones.onnx",
)
| 31.653061 | 124 | 0.446551 | 838 | 7,755 | 3.750597 | 0.093079 | 0.162265 | 0.137448 | 0.180401 | 0.820872 | 0.811009 | 0.808463 | 0.808463 | 0.795737 | 0.714286 | 0 | 0.042293 | 0.399355 | 7,755 | 244 | 125 | 31.782787 | 0.63246 | 0.017795 | 0 | 0.708696 | 0 | 0 | 0.166929 | 0.045968 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017391 | 0 | 0.017391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
92d01a4e65c4b69480e8f2c2376eef9f3f5e3b68 | 8,189 | py | Python | python/cugraph/tests/test_renumber.py | jwyles/cugraph | 1758d085e03d1d62ccd7064fda8cb0257011f50b | [
"Apache-2.0"
] | null | null | null | python/cugraph/tests/test_renumber.py | jwyles/cugraph | 1758d085e03d1d62ccd7064fda8cb0257011f50b | [
"Apache-2.0"
] | null | null | null | python/cugraph/tests/test_renumber.py | jwyles/cugraph | 1758d085e03d1d62ccd7064fda8cb0257011f50b | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2019, NVIDIA CORPORATION.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# This file test the Renumbering features
import gc
import pandas as pd
import pytest
import cudf
from cugraph.structure.number_map import NumberMap
from cugraph.tests import utils
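# NumberMap renumbers arbitrary external vertex ids (single- or multi-column,
# including negative values) to internal integer ids; the tests below assert
# that to_internal_vertex_id() followed by from_internal_vertex_id() round-trips
# the original values.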
def test_renumber_ips():
source_list = [
"192.168.1.1",
"172.217.5.238",
"216.228.121.209",
"192.16.31.23",
]
dest_list = [
"172.217.5.238",
"216.228.121.209",
"192.16.31.23",
"192.168.1.1",
]
pdf = pd.DataFrame({"source_list": source_list, "dest_list": dest_list})
gdf = cudf.from_pandas(pdf)
gdf["source_as_int"] = gdf["source_list"].str.ip2int()
gdf["dest_as_int"] = gdf["dest_list"].str.ip2int()
numbering = NumberMap()
numbering.from_series(gdf["source_as_int"], gdf["dest_as_int"])
src = numbering.to_internal_vertex_id(gdf["source_as_int"])
dst = numbering.to_internal_vertex_id(gdf["dest_as_int"])
check_src = numbering.from_internal_vertex_id(src)["0"]
check_dst = numbering.from_internal_vertex_id(dst)["0"]
assert check_src.equals(gdf["source_as_int"])
assert check_dst.equals(gdf["dest_as_int"])
def test_renumber_ips_cols():
source_list = [
"192.168.1.1",
"172.217.5.238",
"216.228.121.209",
"192.16.31.23",
]
dest_list = [
"172.217.5.238",
"216.228.121.209",
"192.16.31.23",
"192.168.1.1",
]
pdf = pd.DataFrame({"source_list": source_list, "dest_list": dest_list})
gdf = cudf.from_pandas(pdf)
gdf["source_as_int"] = gdf["source_list"].str.ip2int()
gdf["dest_as_int"] = gdf["dest_list"].str.ip2int()
numbering = NumberMap()
numbering.from_dataframe(gdf, ["source_as_int"], ["dest_as_int"])
src = numbering.to_internal_vertex_id(gdf["source_as_int"])
dst = numbering.to_internal_vertex_id(gdf["dest_as_int"])
check_src = numbering.from_internal_vertex_id(src)["0"]
check_dst = numbering.from_internal_vertex_id(dst)["0"]
assert check_src.equals(gdf["source_as_int"])
assert check_dst.equals(gdf["dest_as_int"])
@pytest.mark.skip(reason="temporarily dropped string support")
def test_renumber_ips_str_cols():
source_list = [
"192.168.1.1",
"172.217.5.238",
"216.228.121.209",
"192.16.31.23",
]
dest_list = [
"172.217.5.238",
"216.228.121.209",
"192.16.31.23",
"192.168.1.1",
]
pdf = pd.DataFrame({"source_list": source_list, "dest_list": dest_list})
gdf = cudf.from_pandas(pdf)
numbering = NumberMap()
numbering.from_dataframe(gdf, ["source_list"], ["dest_list"])
src = numbering.to_internal_vertex_id(gdf["source_list"])
dst = numbering.to_internal_vertex_id(gdf["dest_list"])
check_src = numbering.from_internal_vertex_id(src)["0"]
check_dst = numbering.from_internal_vertex_id(dst)["0"]
assert check_src.equals(gdf["source_list"])
assert check_dst.equals(gdf["dest_list"])
def test_renumber_negative():
source_list = [4, 6, 8, -20, 1]
dest_list = [1, 29, 35, 0, 77]
df = pd.DataFrame({"source_list": source_list, "dest_list": dest_list})
gdf = cudf.DataFrame.from_pandas(df[["source_list", "dest_list"]])
numbering = NumberMap()
numbering.from_dataframe(gdf, ["source_list"], ["dest_list"])
src = numbering.to_internal_vertex_id(gdf["source_list"])
dst = numbering.to_internal_vertex_id(gdf["dest_list"])
check_src = numbering.from_internal_vertex_id(src)["0"]
check_dst = numbering.from_internal_vertex_id(dst)["0"]
assert check_src.equals(gdf["source_list"])
assert check_dst.equals(gdf["dest_list"])
def test_renumber_negative_col():
source_list = [4, 6, 8, -20, 1]
dest_list = [1, 29, 35, 0, 77]
df = pd.DataFrame({"source_list": source_list, "dest_list": dest_list})
gdf = cudf.DataFrame.from_pandas(df[["source_list", "dest_list"]])
numbering = NumberMap()
numbering.from_dataframe(gdf, ["source_list"], ["dest_list"])
src = numbering.to_internal_vertex_id(gdf["source_list"])
dst = numbering.to_internal_vertex_id(gdf["dest_list"])
check_src = numbering.from_internal_vertex_id(src)["0"]
check_dst = numbering.from_internal_vertex_id(dst)["0"]
assert check_src.equals(gdf["source_list"])
assert check_dst.equals(gdf["dest_list"])
# Test all combinations of default/managed and pooled/non-pooled allocation
@pytest.mark.parametrize("graph_file", utils.DATASETS)
def test_renumber_files(graph_file):
gc.collect()
M = utils.read_csv_for_nx(graph_file)
sources = cudf.Series(M["0"])
destinations = cudf.Series(M["1"])
translate = 1000
df = cudf.DataFrame()
df["src"] = cudf.Series([x + translate for x in sources.
values_host])
df["dst"] = cudf.Series([x + translate for x in destinations.
values_host])
numbering = NumberMap()
numbering.from_series(df["src"], df["dst"])
renumbered_df = numbering.add_internal_vertex_id(
numbering.add_internal_vertex_id(df, "src_id", ["src"]),
"dst_id", ["dst"]
)
check_src = numbering.from_internal_vertex_id(renumbered_df, "src_id")
check_dst = numbering.from_internal_vertex_id(renumbered_df, "dst_id")
assert check_src["src"].equals(check_src["0"])
assert check_dst["dst"].equals(check_dst["0"])
# Test all combinations of default/managed and pooled/non-pooled allocation
@pytest.mark.parametrize("graph_file", utils.DATASETS)
def test_renumber_files_col(graph_file):
gc.collect()
M = utils.read_csv_for_nx(graph_file)
sources = cudf.Series(M["0"])
destinations = cudf.Series(M["1"])
translate = 1000
gdf = cudf.DataFrame()
gdf['src'] = cudf.Series([x + translate for x in sources.values_host])
    gdf['dst'] = cudf.Series([x + translate for x in destinations.values_host])
numbering = NumberMap()
numbering.from_dataframe(gdf, ["src"], ["dst"])
renumbered_df = numbering.add_internal_vertex_id(
numbering.add_internal_vertex_id(gdf, "src_id", ["src"]),
"dst_id", ["dst"]
)
check_src = numbering.from_internal_vertex_id(renumbered_df, "src_id")
check_dst = numbering.from_internal_vertex_id(renumbered_df, "dst_id")
assert check_src["src"].equals(check_src["0"])
assert check_dst["dst"].equals(check_dst["0"])
# Test all combinations of default/managed and pooled/non-pooled allocation
@pytest.mark.parametrize("graph_file", utils.DATASETS)
def test_renumber_files_multi_col(graph_file):
gc.collect()
M = utils.read_csv_for_nx(graph_file)
sources = cudf.Series(M["0"])
destinations = cudf.Series(M["1"])
translate = 1000
gdf = cudf.DataFrame()
gdf["src_old"] = sources
gdf["dst_old"] = destinations
gdf["src"] = sources + translate
gdf["dst"] = destinations + translate
numbering = NumberMap()
numbering.from_dataframe(gdf, ["src", "src_old"], ["dst", "dst_old"])
renumbered_df = numbering.add_internal_vertex_id(
numbering.add_internal_vertex_id(
gdf, "src_id", ["src", "src_old"]
),
"dst_id",
["dst", "dst_old"],
)
check_src = numbering.from_internal_vertex_id(renumbered_df, "src_id")
check_dst = numbering.from_internal_vertex_id(renumbered_df, "dst_id")
assert check_src["src"].equals(check_src["0"])
assert check_src["src_old"].equals(check_src["1"])
assert check_dst["dst"].equals(check_dst["0"])
assert check_dst["dst_old"].equals(check_dst["1"])
| 30.901887 | 76 | 0.668702 | 1,162 | 8,189 | 4.442341 | 0.139415 | 0.086788 | 0.099186 | 0.083688 | 0.824293 | 0.815382 | 0.815382 | 0.800077 | 0.793878 | 0.793878 | 0 | 0.046533 | 0.18647 | 8,189 | 264 | 77 | 31.018939 | 0.72831 | 0.100501 | 0 | 0.732955 | 0 | 0 | 0.15594 | 0 | 0 | 0 | 0 | 0 | 0.102273 | 1 | 0.045455 | false | 0 | 0.034091 | 0 | 0.079545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
13281b44e4a1d93fa3660c59d8274c5ec621ff18 | 156 | py | Python | pygluu/containerlib/meta/__init__.py | GluuFederation/pygluu-containerlib | 1f2e1cc46870cf1bfa0dad435201f0bfa695e24d | [
"Apache-2.0"
] | 1 | 2021-01-29T18:28:06.000Z | 2021-01-29T18:28:06.000Z | pygluu/containerlib/meta/__init__.py | GluuFederation/pygluu-containerlib | 1f2e1cc46870cf1bfa0dad435201f0bfa695e24d | [
"Apache-2.0"
] | 27 | 2019-07-22T21:05:10.000Z | 2022-01-15T09:33:33.000Z | pygluu/containerlib/meta/__init__.py | GluuFederation/pygluu-containerlib | 1f2e1cc46870cf1bfa0dad435201f0bfa695e24d | [
"Apache-2.0"
] | 3 | 2019-08-13T19:30:55.000Z | 2020-12-16T12:12:22.000Z | from pygluu.containerlib.meta.docker_meta import DockerMeta # noqa: F401
from pygluu.containerlib.meta.kubernetes_meta import KubernetesMeta # noqa: F401
| 52 | 81 | 0.833333 | 20 | 156 | 6.4 | 0.55 | 0.15625 | 0.34375 | 0.40625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042857 | 0.102564 | 156 | 2 | 82 | 78 | 0.871429 | 0.134615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
136c825699aedae2ba8fc3cfe77a03d197eca185 | 5,054 | py | Python | python_helpers/set_pixel.py | RandomBananazz/chip8mc | 0e184c392a523c82dbc945325aa2cb9e5487e5e7 | [
"MIT"
] | 3 | 2020-09-28T17:50:49.000Z | 2020-12-30T18:23:46.000Z | python_helpers/set_pixel.py | RandomBananazz/chip8mc | 0e184c392a523c82dbc945325aa2cb9e5487e5e7 | [
"MIT"
] | null | null | null | python_helpers/set_pixel.py | RandomBananazz/chip8mc | 0e184c392a523c82dbc945325aa2cb9e5487e5e7 | [
"MIT"
] | null | null | null | # 0x0 to 0xFFF (0-4095) accessible memory
# 4 switch-case per file
"""
for a in range(3):
c = 0
for n in range(4**a):
with open(f'..\\data\\renderer\\functions\\set_pixel\\{4**(3-a)}set_pixel_{(4**(3-a))*n}-{((4**(3-a))*n)+((4**(3-a))-1)}.mcfunction', 'w') as f:
if a != 2:
for i in range(4):
p = 4**(2-a)
q = c+(i*(4**(2-a)))
r = c+((i+1)*(4**(2-a)))-1
f.write(f'execute if score Global pixel matches {q}..{r} run function renderer:set_pixel/{p}set_pixel_{q}-{r}\n')
c += 4**(3-a)
else:
for i in range(4):
p = 4**(2-a)
q = c+(i*(4**(2-a)))
r = c+((i+1)*(4**(2-a)))-1
f.write(f'execute if score Global pixel matches {q} run function renderer:set_pixel/set_pixel_{q}\n')
c += 4**(3-a)
"""
for n in range(2):
with open(f'..\\data\\renderer\\functions\\set_pixel\\1024set_pixel_{1024*n}-{(1024*n)+1023}.mcfunction', 'w') as f:
p = n * 1024
q = p + 256
r = q + 256
s = r + 256
f.write(f'execute if score Global pixel matches {p}..{p+255} run function renderer:set_pixel/256set_pixel_{p}-{p+255}\n'
f'execute if score Global pixel matches {q}..{q+255} run function renderer:set_pixel/256set_pixel_{q}-{q+255}\n'
f'execute if score Global pixel matches {r}..{r+255} run function renderer:set_pixel/256set_pixel_{r}-{r+255}\n'
f'execute if score Global pixel matches {s}..{s+255} run function renderer:set_pixel/256set_pixel_{s}-{s+255}\n')
for n in range(8):
with open(f'..\\data\\renderer\\functions\\set_pixel\\256set_pixel_{256*n}-{(256*n)+255}.mcfunction', 'w') as f:
p = n * 256
q = p + 64
r = q + 64
s = r + 64
f.write(f'execute if score Global pixel matches {p}..{p+63} run function renderer:set_pixel/64set_pixel_{p}-{p+63}\n'
f'execute if score Global pixel matches {q}..{q+63} run function renderer:set_pixel/64set_pixel_{q}-{q+63}\n'
f'execute if score Global pixel matches {r}..{r+63} run function renderer:set_pixel/64set_pixel_{r}-{r+63}\n'
f'execute if score Global pixel matches {s}..{s+63} run function renderer:set_pixel/64set_pixel_{s}-{s+63}\n')
for n in range(32):
with open(f'..\\data\\renderer\\functions\\set_pixel\\64set_pixel_{64*n}-{(64*n)+63}.mcfunction', 'w') as f:
p = n * 64
q = p + 16
r = q + 16
s = r + 16
f.write(f'execute if score Global pixel matches {p}..{p+15} run function renderer:set_pixel/16set_pixel_{p}-{p+15}\n'
f'execute if score Global pixel matches {q}..{q+15} run function renderer:set_pixel/16set_pixel_{q}-{q+15}\n'
f'execute if score Global pixel matches {r}..{r+15} run function renderer:set_pixel/16set_pixel_{r}-{r+15}\n'
f'execute if score Global pixel matches {s}..{s+15} run function renderer:set_pixel/16set_pixel_{s}-{s+15}\n')
for n in range(128):
with open(f'..\\data\\renderer\\functions\\set_pixel\\16set_pixel_{16*n}-{(16*n)+15}.mcfunction', 'w') as f:
p = n * 16
q = p + 4
r = q + 4
s = r + 4
f.write(f'execute if score Global pixel matches {p}..{p+3} run function renderer:set_pixel/4set_pixel_{p}-{p+3}\n'
f'execute if score Global pixel matches {q}..{q+3} run function renderer:set_pixel/4set_pixel_{q}-{q+3}\n'
f'execute if score Global pixel matches {r}..{r+3} run function renderer:set_pixel/4set_pixel_{r}-{r+3}\n'
f'execute if score Global pixel matches {s}..{s+3} run function renderer:set_pixel/4set_pixel_{s}-{s+3}\n')
for n in range(512):
with open(f'..\\data\\renderer\\functions\\set_pixel\\4set_pixel_{4*n}-{(4*n)+3}.mcfunction', 'w') as f:
p = n * 4
q = p + 1
r = q + 1
s = r + 1
f.write(f'execute if score Global pixel matches {p} run function renderer:set_pixel/set_pixel_{p}\n'
f'execute if score Global pixel matches {q} run function renderer:set_pixel/set_pixel_{q}\n'
f'execute if score Global pixel matches {r} run function renderer:set_pixel/set_pixel_{r}\n'
f'execute if score Global pixel matches {s} run function renderer:set_pixel/set_pixel_{s}\n')
for i in range(2048):
with open(f'..\\data\\renderer\\functions\\set_pixel\\set_pixel_{i}.mcfunction', 'w') as f:
f.write(f'execute if score Global pixel_{i} matches 1 run scoreboard players set Global VF 1\n'
f'execute if score Global pixel_{i} matches 1 run scoreboard players set Global erased 1\n'
f'execute if score Global pixel_{i} matches 1 run scoreboard players set Global pixel_{i} 0\n'
f'execute unless score Global VF matches 1 if score Global pixel_{i} matches 0 run scoreboard players set Global pixel_{i} 1\n')
| 58.767442 | 152 | 0.583696 | 845 | 5,054 | 3.381065 | 0.081657 | 0.103605 | 0.118306 | 0.163808 | 0.850543 | 0.826741 | 0.779839 | 0.724886 | 0.411271 | 0.378369 | 0 | 0.066489 | 0.256035 | 5,054 | 85 | 153 | 59.458824 | 0.693351 | 0.184606 | 0 | 0 | 0 | 0.410714 | 0.714738 | 0.319854 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
137048ebe70dc62dd16f170db479f9f3e147f4eb | 28,706 | py | Python | strategy_block/one/tests/test_block.py | spectrum-dev/django-block-monolith | c17a1ef98ae813a4e94581e2e52a4a03f0e65769 | [
"MIT"
] | null | null | null | strategy_block/one/tests/test_block.py | spectrum-dev/django-block-monolith | c17a1ef98ae813a4e94581e2e52a4a03f0e65769 | [
"MIT"
] | null | null | null | strategy_block/one/tests/test_block.py | spectrum-dev/django-block-monolith | c17a1ef98ae813a4e94581e2e52a4a03f0e65769 | [
"MIT"
] | null | null | null | from django.test import TestCase
from blocks.event import event_ingestor
from strategy_block.one.exceptions import StrategyBlockOneInvalidInputPayloadException
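# Each test drives the backtest STRATEGY_BLOCK through event_ingestor with a
# payload of inputs (start_value, commission, impact, stop_loss, take_profit,
# trade_amount_value) plus DATA_BLOCK OHLCV candles and SIGNAL_BLOCK orders,
# then checks the returned portfolio values ("portVals") and executed trades.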
class BacktestBlockRunning(TestCase):
def setUp(self):
self.payload = {
"blockType": "STRATEGY_BLOCK",
"blockId": 1,
}
def test_backtest_block_buy_ok(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
],
"SIGNAL_BLOCK-1-1": [{"timestamp": "01/02/2020", "order": "BUY"}],
},
}
response = event_ingestor(payload)
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 10090.0, "timestamp": "01/03/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "BUY",
"cash_allocated": 1000.0,
"shares": 90,
"amount_invested": 990.0,
}
],
}
},
)
def test_backtest_block_buy_cast_to_float(self):
payload = {
**self.payload,
"inputs": {
"start_value": "10000.00",
"commission": "0.00",
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": "1000.00",
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
],
"SIGNAL_BLOCK-1-1": [{"timestamp": "01/02/2020", "order": "BUY"}],
},
}
response = event_ingestor(payload)
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 10090.0, "timestamp": "01/03/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "BUY",
"cash_allocated": 1000.0,
"shares": 90,
"amount_invested": 990.0,
}
],
}
},
)
def test_backtest_block_sell_ok(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
],
"SIGNAL_BLOCK-1-1": [{"timestamp": "01/02/2020", "order": "SELL"}],
},
}
response = event_ingestor(payload)
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 9910.0, "timestamp": "01/03/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "SELL",
"cash_allocated": 1000.0,
"shares": -90,
"amount_invested": -990.0,
}
],
}
},
)
# TODO: There is a bug here (or is there?)
def test_buy_sell_ok(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
{
"timestamp": "01/04/2020",
"timezone": "UTC/EST",
"open": "13.00",
"high": "13.00",
"low": "13.00",
"close": "13.00",
"volume": "13.00",
},
{
"timestamp": "01/05/2020",
"timezone": "UTC/EST",
"open": "14.00",
"high": "14.00",
"low": "14.00",
"close": "14.00",
"volume": "14.00",
},
],
"SIGNAL_BLOCK-1-1": [
{"timestamp": "01/02/2020", "order": "BUY"},
{"timestamp": "01/04/2020", "order": "SELL"},
],
},
}
response = event_ingestor(payload)
        # Bug -> It is due to the price of the stock going up, hence it does not sell all shares. Need to distinguish between a "closing out" order and a SELL
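        # For reference: the BUY on 01/02 takes 90 shares at 11.00 (990.0 of the
        # 1000.0 allocated), the portfolio gains 90 per 1.00 price step, and the
        # SELL on 01/04 closes the whole position at 13.00 (-90 shares, -1170.0),
        # after which the value holds at 10180.0.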
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 10090.0, "timestamp": "01/03/2020"},
{"value": 10180.0, "timestamp": "01/04/2020"},
{"value": 10180.0, "timestamp": "01/05/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "BUY",
"cash_allocated": 1000.0,
"shares": 90,
"amount_invested": 990.0,
},
{
"timestamp": "01/04/2020",
"order": "SELL_CLOSE",
"cash_allocated": 0.0,
"shares": -90,
"amount_invested": -1170.0,
},
],
}
},
)
def test_sell_buy_ok(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
{
"timestamp": "01/04/2020",
"timezone": "UTC/EST",
"open": "13.00",
"high": "13.00",
"low": "13.00",
"close": "13.00",
"volume": "13.00",
},
{
"timestamp": "01/05/2020",
"timezone": "UTC/EST",
"open": "14.00",
"high": "14.00",
"low": "14.00",
"close": "14.00",
"volume": "14.00",
},
],
"SIGNAL_BLOCK-1-1": [
{"timestamp": "01/02/2020", "order": "SELL"},
{"timestamp": "01/04/2020", "order": "BUY"},
],
},
}
response = event_ingestor(payload)
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 9910.0, "timestamp": "01/03/2020"},
{"value": 9820.0, "timestamp": "01/04/2020"},
{"value": 9820.0, "timestamp": "01/05/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "SELL",
"cash_allocated": 1000.0,
"shares": -90,
"amount_invested": -990.0,
},
{
"timestamp": "01/04/2020",
"order": "BUY_CLOSE",
"cash_allocated": 0.0,
"shares": 90,
"amount_invested": 1170.0,
},
],
}
},
)
def test_consecutive_sell_ok(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
{
"timestamp": "01/04/2020",
"timezone": "UTC/EST",
"open": "13.00",
"high": "13.00",
"low": "13.00",
"close": "13.00",
"volume": "13.00",
},
{
"timestamp": "01/05/2020",
"timezone": "UTC/EST",
"open": "14.00",
"high": "14.00",
"low": "14.00",
"close": "14.00",
"volume": "14.00",
},
],
"SIGNAL_BLOCK-1-1": [
{"timestamp": "01/02/2020", "order": "SELL"},
{"timestamp": "01/04/2020", "order": "SELL"},
],
},
}
response = event_ingestor(payload)
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 9910.0, "timestamp": "01/03/2020"},
{"value": 9820.0, "timestamp": "01/04/2020"},
{"value": 9654.0, "timestamp": "01/05/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "SELL",
"cash_allocated": 1000.0,
"shares": -90,
"amount_invested": -990.0,
},
{
"timestamp": "01/04/2020",
"order": "SELL",
"cash_allocated": 1000.0,
"shares": -76,
"amount_invested": -988.0,
},
],
}
},
)
def test_consecutive_buy_ok(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
{
"timestamp": "01/04/2020",
"timezone": "UTC/EST",
"open": "13.00",
"high": "13.00",
"low": "13.00",
"close": "13.00",
"volume": "13.00",
},
{
"timestamp": "01/05/2020",
"timezone": "UTC/EST",
"open": "14.00",
"high": "14.00",
"low": "14.00",
"close": "14.00",
"volume": "14.00",
},
],
"SIGNAL_BLOCK-1-1": [
{"timestamp": "01/02/2020", "order": "BUY"},
{"timestamp": "01/04/2020", "order": "BUY"},
],
},
}
response = event_ingestor(payload)
self.assertDictEqual(
response,
{
"response": {
"portVals": [
{"value": 10000.0, "timestamp": "01/01/2020"},
{"value": 10000.0, "timestamp": "01/02/2020"},
{"value": 10090.0, "timestamp": "01/03/2020"},
{"value": 10180.0, "timestamp": "01/04/2020"},
{"value": 10346.0, "timestamp": "01/05/2020"},
],
"trades": [
{
"timestamp": "01/02/2020",
"order": "BUY",
"cash_allocated": 1000.0,
"shares": 90,
"amount_invested": 990.0,
},
{
"timestamp": "01/04/2020",
"order": "BUY",
"cash_allocated": 1000.0,
"shares": 76,
"amount_invested": 988.0,
},
],
}
},
)
def test_orders_df_empty_core_function(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
],
"SIGNAL_BLOCK-1-1": [],
},
}
response = event_ingestor(payload)
self.assertDictEqual(response, {"response": {"portVals": [], "trades": []}})
def test_failure_missing_input_variable(self):
payload = {
**self.payload,
"inputs": {
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": 1000.00,
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
{
"timestamp": "01/04/2020",
"timezone": "UTC/EST",
"open": "13.00",
"high": "13.00",
"low": "13.00",
"close": "13.00",
"volume": "13.00",
},
{
"timestamp": "01/05/2020",
"timezone": "UTC/EST",
"open": "14.00",
"high": "14.00",
"low": "14.00",
"close": "14.00",
"volume": "14.00",
},
],
"SIGNAL_BLOCK-1-1": [
{"timestamp": "01/02/2020", "order": "BUY"},
{"timestamp": "01/04/2020", "order": "BUY"},
],
},
}
with self.assertRaises(StrategyBlockOneInvalidInputPayloadException):
event_ingestor(payload)
def test_failure_not_castable_to_float(self):
payload = {
**self.payload,
"inputs": {
"start_value": 10000.00,
"commission": 0.00,
"impact": 0.00,
"stop_loss": 0.0,
"take_profit": 0.0,
"trade_amount_value": "FOO",
},
"outputs": {
"DATA_BLOCK-1-1": [
{
"timestamp": "01/01/2020",
"timezone": "UTC/EST",
"open": "10.00",
"high": "10.00",
"low": "10.00",
"close": "10.00",
"volume": "10.00",
},
{
"timestamp": "01/02/2020",
"timezone": "UTC/EST",
"open": "11.00",
"high": "11.00",
"low": "11.00",
"close": "11.00",
"volume": "11.00",
},
{
"timestamp": "01/03/2020",
"timezone": "UTC/EST",
"open": "12.00",
"high": "12.00",
"low": "12.00",
"close": "12.00",
"volume": "12.00",
},
{
"timestamp": "01/04/2020",
"timezone": "UTC/EST",
"open": "13.00",
"high": "13.00",
"low": "13.00",
"close": "13.00",
"volume": "13.00",
},
{
"timestamp": "01/05/2020",
"timezone": "UTC/EST",
"open": "14.00",
"high": "14.00",
"low": "14.00",
"close": "14.00",
"volume": "14.00",
},
],
"SIGNAL_BLOCK-1-1": [
{"timestamp": "01/02/2020", "order": "BUY"},
{"timestamp": "01/04/2020", "order": "BUY"},
],
},
}
with self.assertRaises(StrategyBlockOneInvalidInputPayloadException):
event_ingestor(payload)
| 36.062814 | 154 | 0.275378 | 1,961 | 28,706 | 3.956655 | 0.064253 | 0.1361 | 0.079263 | 0.095115 | 0.934657 | 0.926408 | 0.925248 | 0.925248 | 0.925248 | 0.916097 | 0 | 0.176881 | 0.580506 | 28,706 | 795 | 155 | 36.108176 | 0.467447 | 0.006445 | 0 | 0.737188 | 0 | 0 | 0.225226 | 0 | 0 | 0 | 0 | 0.001258 | 0.013141 | 1 | 0.014455 | false | 0 | 0.003942 | 0 | 0.019711 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
137790adfb3c6b31d97668007ebd701755e9ae4e | 1,494 | py | Python | TrekBot2_WS/build/costmap_2d/catkin_generated/pkg.develspace.context.pc.py | Rafcin/RescueRoboticsLHMV | d3dc63e6c16a040b16170f143556ef358018b7da | [
"Unlicense"
] | 1 | 2018-10-04T14:37:00.000Z | 2018-10-04T14:37:00.000Z | TrekBot2_WS/build/costmap_2d/catkin_generated/pkg.develspace.context.pc.py | Rafcin/TrekBot | d3dc63e6c16a040b16170f143556ef358018b7da | [
"Unlicense"
] | null | null | null | TrekBot2_WS/build/costmap_2d/catkin_generated/pkg.develspace.context.pc.py | Rafcin/TrekBot | d3dc63e6c16a040b16170f143556ef358018b7da | [
"Unlicense"
] | null | null | null | # generated from catkin/cmake/template/pkg.context.pc.in
CATKIN_PACKAGE_PREFIX = ""
PROJECT_PKG_CONFIG_INCLUDE_DIRS = "/xavier_ssd/TrekBot/TrekBot2_WS/devel/.private/costmap_2d/include;/xavier_ssd/TrekBot/TrekBot2_WS/src/navigation/costmap_2d/include;/usr/include/eigen3;/usr/include".split(';') if "/xavier_ssd/TrekBot/TrekBot2_WS/devel/.private/costmap_2d/include;/xavier_ssd/TrekBot/TrekBot2_WS/src/navigation/costmap_2d/include;/usr/include/eigen3;/usr/include" != "" else []
PROJECT_CATKIN_DEPENDS = "dynamic_reconfigure;geometry_msgs;laser_geometry;map_msgs;message_filters;message_runtime;nav_msgs;pluginlib;roscpp;sensor_msgs;std_msgs;tf2_ros;visualization_msgs;voxel_grid".replace(';', ' ')
PKG_CONFIG_LIBRARIES_WITH_PREFIX = "-lcostmap_2d;-llayers;/usr/lib/aarch64-linux-gnu/libboost_system.so;/usr/lib/aarch64-linux-gnu/libboost_thread.so;/usr/lib/aarch64-linux-gnu/libboost_chrono.so;/usr/lib/aarch64-linux-gnu/libboost_date_time.so;/usr/lib/aarch64-linux-gnu/libboost_atomic.so;/usr/lib/aarch64-linux-gnu/libpthread.so".split(';') if "-lcostmap_2d;-llayers;/usr/lib/aarch64-linux-gnu/libboost_system.so;/usr/lib/aarch64-linux-gnu/libboost_thread.so;/usr/lib/aarch64-linux-gnu/libboost_chrono.so;/usr/lib/aarch64-linux-gnu/libboost_date_time.so;/usr/lib/aarch64-linux-gnu/libboost_atomic.so;/usr/lib/aarch64-linux-gnu/libpthread.so" != "" else []
PROJECT_NAME = "costmap_2d"
PROJECT_SPACE_DIR = "/xavier_ssd/TrekBot/TrekBot2_WS/devel/.private/costmap_2d"
PROJECT_VERSION = "1.16.2"
| 166 | 658 | 0.8166 | 231 | 1,494 | 5.030303 | 0.329004 | 0.061962 | 0.134251 | 0.185886 | 0.680723 | 0.680723 | 0.680723 | 0.680723 | 0.680723 | 0.640275 | 0 | 0.030241 | 0.026104 | 1,494 | 8 | 659 | 186.75 | 0.768385 | 0.036145 | 0 | 0 | 1 | 0.714286 | 0.812935 | 0.799026 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
13bed2633740eba19bca7b0cb1c7f6f9d7438ab0 | 5,122 | py | Python | Recipefy/recipefyApp/migrations/0001_initial.py | yasserkabbout/Recipefy | d780df619029d590303373d8371b292ac8123a6e | [
"MIT"
] | null | null | null | Recipefy/recipefyApp/migrations/0001_initial.py | yasserkabbout/Recipefy | d780df619029d590303373d8371b292ac8123a6e | [
"MIT"
] | 3 | 2020-02-11T23:35:28.000Z | 2021-06-10T21:04:39.000Z | Recipefy/recipefyApp/migrations/0001_initial.py | yasserkabbout/Recipefy | d780df619029d590303373d8371b292ac8123a6e | [
"MIT"
] | 1 | 2018-12-26T01:14:43.000Z | 2018-12-26T01:14:43.000Z | # Generated by Django 2.1.3 on 2018-12-23 13:05
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Login_logs',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(max_length=50)),
('datetime', models.DateTimeField(auto_now_add=True)),
],
),
migrations.CreateModel(
name='Recipes',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('title', models.CharField(max_length=50)),
('rating', models.FloatField()),
('apple', models.CharField(max_length=10)),
('rice', models.CharField(max_length=10)),
('tomato', models.CharField(max_length=10)),
('onion', models.CharField(max_length=10)),
('orange', models.CharField(max_length=10)),
('tea', models.CharField(max_length=10)),
('chocolate', models.CharField(max_length=10)),
('egg', models.CharField(max_length=10)),
('lentil', models.CharField(max_length=10)),
('potato', models.CharField(max_length=10)),
('lemon', models.CharField(max_length=10)),
('garlic', models.CharField(max_length=10)),
('starwberry', models.CharField(max_length=10)),
('vegan', models.CharField(max_length=10)),
('poulty', models.CharField(max_length=10)),
('organic', models.CharField(max_length=10)),
('no_cook', models.CharField(max_length=10)),
('no_sugar', models.CharField(max_length=10)),
('lunch', models.CharField(max_length=10)),
('high_fiber', models.CharField(max_length=10)),
('healthy', models.CharField(max_length=10)),
('grill', models.CharField(max_length=10)),
('dinner', models.CharField(max_length=10)),
('dairy_free', models.CharField(max_length=10)),
('meal_22_minutes', models.CharField(max_length=10)),
('appetizer', models.CharField(max_length=10)),
],
),
migrations.CreateModel(
name='User_Groceries',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(max_length=50)),
('apple', models.CharField(max_length=10)),
('rice', models.CharField(max_length=10)),
('tomato', models.CharField(max_length=10)),
('onion', models.CharField(max_length=10)),
('orange', models.CharField(max_length=10)),
('tea', models.CharField(max_length=10)),
('chocolate', models.CharField(max_length=10)),
('egg', models.CharField(max_length=10)),
('lentil', models.CharField(max_length=10)),
('potato', models.CharField(max_length=10)),
('lemon', models.CharField(max_length=10)),
('garlic', models.CharField(max_length=10)),
('starwberry', models.CharField(max_length=10)),
],
),
migrations.CreateModel(
name='User_Preferences',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(max_length=50)),
('vegan', models.CharField(max_length=10)),
('poulty', models.CharField(max_length=10)),
('organic', models.CharField(max_length=10)),
('no_cook', models.CharField(max_length=10)),
('no_sugar', models.CharField(max_length=10)),
('lunch', models.CharField(max_length=10)),
('high_fiber', models.CharField(max_length=10)),
('healthy', models.CharField(max_length=10)),
('grill', models.CharField(max_length=10)),
('dinner', models.CharField(max_length=10)),
('dairy_free', models.CharField(max_length=10)),
('meal_22_minutes', models.CharField(max_length=10)),
('appetizer', models.CharField(max_length=10)),
],
),
migrations.CreateModel(
name='Users',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(max_length=15, unique=True)),
('email', models.EmailField(max_length=70, unique=True)),
('password', models.CharField(max_length=50)),
],
),
]
| 48.320755 | 114 | 0.546271 | 504 | 5,122 | 5.371032 | 0.18254 | 0.196158 | 0.385667 | 0.514222 | 0.868859 | 0.849649 | 0.849649 | 0.849649 | 0.849649 | 0.837458 | 0 | 0.038268 | 0.301054 | 5,122 | 105 | 115 | 48.780952 | 0.717877 | 0.008786 | 0 | 0.816327 | 1 | 0 | 0.095369 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.010204 | 0.010204 | 0 | 0.05102 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
b926ad544000dc8af44fcf5a0a58d30825914c79 | 14,459 | py | Python | test/unit/spiderfoot/test_spiderfootevent.py | IronFireFA/spiderfoot | e75428e7584666de52a20b0d2f1fb80dffd6f39c | [
"MIT"
] | null | null | null | test/unit/spiderfoot/test_spiderfootevent.py | IronFireFA/spiderfoot | e75428e7584666de52a20b0d2f1fb80dffd6f39c | [
"MIT"
] | null | null | null | test/unit/spiderfoot/test_spiderfootevent.py | IronFireFA/spiderfoot | e75428e7584666de52a20b0d2f1fb80dffd6f39c | [
"MIT"
] | null | null | null | # test_spiderfootevent.py
import unittest
from spiderfoot import SpiderFootEvent
class TestSpiderFootEvent(unittest.TestCase):
"""
Test SpiderFootEvent
"""
def test_init_root_event_should_create_event(self):
"""
Test __init__(self, eventType, data, module, sourceEvent)
"""
event_data = 'example event data'
module = 'example module'
source_event = ''
event_type = 'ROOT'
evt = SpiderFootEvent(event_type, event_data, module, source_event)
self.assertIsInstance(evt, SpiderFootEvent)
def test_init_nonroot_event_with_root_sourceEvent_should_create_event(self):
"""
Test __init__(self, eventType, data, module, sourceEvent)
"""
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
source_event = SpiderFootEvent(event_type, event_data, module, source_event)
event_type = 'example non-root event type'
event_data = 'example event data'
module = 'example module'
evt = SpiderFootEvent(event_type, event_data, module, source_event)
self.assertIsInstance(evt, SpiderFootEvent)
def test_init_argument_eventType_of_invalid_type_should_raise_TypeError(self):
"""
Test __init__(self, eventType, data, module, sourceEvent)
"""
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
source_event = SpiderFootEvent(event_type, event_data, module, source_event)
module = 'example module'
invalid_types = [None, list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
SpiderFootEvent(invalid_type, event_data, module, source_event)
def test_init_argument_eventType_with_empty_value_should_raise_ValueError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
source_event = SpiderFootEvent(event_type, event_data, module, source_event)
event_type = ''
module = 'example module'
with self.assertRaises(ValueError):
SpiderFootEvent(event_type, event_data, module, source_event)
def test_init_argument_data_of_invalid_type_should_raise_TypeError(self):
"""
Test __init__(self, eventType, data, module, sourceEvent)
"""
event_type = 'ROOT'
module = ''
source_event = ''
invalid_types = [None, list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
SpiderFootEvent(event_type, invalid_type, module, source_event)
def test_init_argument_data_with_empty_value_should_raise_ValueError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
source_event = SpiderFootEvent(event_type, event_data, module, source_event)
event_type = 'example event type'
event_data = ''
module = 'example module'
with self.assertRaises(ValueError):
SpiderFootEvent(event_type, event_data, module, source_event)
def test_init_argument_module_of_invalid_type_should_raise_TypeError(self):
"""
Test __init__(self, eventType, data, module, sourceEvent)
"""
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = SpiderFootEvent(event_type, event_data, module, "ROOT")
event_type = 'example non-root event type'
invalid_types = [None, list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
SpiderFootEvent(event_type, event_data, invalid_type, source_event)
def test_init_argument_module_with_empty_value_should_raise_ValueError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
source_event = SpiderFootEvent(event_type, event_data, module, source_event)
event_type = 'example event type'
event_data = 'example event data'
module = ''
with self.assertRaises(ValueError):
SpiderFootEvent(event_type, event_data, module, source_event)
def test_init_argument_sourceEvent_of_invalid_type_should_raise_TypeError(self):
"""
Test __init__(self, eventType, data, module, sourceEvent)
"""
event_type = 'ROOT'
event_data = 'example event data'
module = ''
event_type = 'example non-root event type'
module = 'example module'
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
SpiderFootEvent(event_type, event_data, module, invalid_type)
def test_init_argument_confidence_of_invalid_type_should_raise_TypeError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.confidence = invalid_type
def test_init_argument_confidence_invalid_value_should_raise_ValueError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
invalid_values = [-1, 101]
for invalid_value in invalid_values:
with self.subTest(invalid_value=invalid_value):
with self.assertRaises(ValueError):
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.confidence = invalid_value
def test_init_argument_visibility_of_invalid_type_should_raise_TypeError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.visibility = invalid_type
def test_init_argument_visibility_invalid_value_should_raise_ValueError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
invalid_values = [-1, 101]
for invalid_value in invalid_values:
with self.subTest(invalid_value=invalid_value):
with self.assertRaises(ValueError):
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.visibility = invalid_value
def test_init_argument_risk_of_invalid_type_should_raise_TypeError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.risk = invalid_type
def test_init_argument_risk_invalid_value_should_raise_ValueError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
invalid_values = [-1, 101]
for invalid_value in invalid_values:
with self.subTest(invalid_value=invalid_value):
with self.assertRaises(ValueError):
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.risk = invalid_value
def test_confidence_attribute_should_return_confidence_as_integer(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
confidence = 100
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.confidence = confidence
self.assertEqual(confidence, evt.confidence)
def test_confidence_attribute_setter_invalid_type_should_raise_TypeError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
evt.confidence = invalid_type
def test_visibility_attribute_should_return_visibility_as_integer(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
visibility = 100
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.visibility = visibility
self.assertEqual(visibility, evt.visibility)
def test_visibility_attribute_setter_invalid_type_should_raise_TypeError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
evt.visibility = invalid_type
def test_risk_attribute_should_return_risk_as_integer(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
risk = 100
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt.risk = risk
self.assertEqual(risk, evt.risk)
def test_risk_attribute_setter_invalid_type_should_raise_TypeError(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
invalid_types = [None, "", list(), dict()]
for invalid_type in invalid_types:
with self.subTest(invalid_type=invalid_type):
with self.assertRaises(TypeError):
evt.risk = invalid_type
def test_actualSource_attribute_should_return_actual_source_as_string(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
actual_source = 'example actual source'
evt.actualSource = actual_source
self.assertEqual(actual_source, evt.actualSource)
def test_sourceEventHash_attribute_should_return_source_event_hash_as_string(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
self.assertEqual("ROOT", evt.sourceEventHash)
def test_moduleDataSource_attribute_should_return_module_data_source_as_string(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
module_data_source = 'example module data source'
evt.moduleDataSource = module_data_source
self.assertEqual(module_data_source, evt.moduleDataSource)
def test_asdict_root_event_should_return_event_as_a_dict(self):
"""
Test asDict(self)
"""
event_data = 'example event data'
module = 'example module data'
source_event = ''
event_type = 'ROOT'
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt_dict = evt.asDict()
self.assertIsInstance(evt_dict, dict)
self.assertEqual(evt_dict['type'], event_type)
self.assertEqual(evt_dict['data'], event_data)
self.assertEqual(evt_dict['module'], module)
def test_asdict_nonroot_event_should_return_event_as_a_dict(self):
"""
Test asDict(self)
"""
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
source_event = SpiderFootEvent(event_type, event_data, module, source_event)
event_type = 'example non-root event type'
event_data = 'example event data'
module = 'example_module'
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt_dict = evt.asDict()
self.assertIsInstance(evt_dict, dict)
self.assertEqual(evt_dict['type'], event_type)
self.assertEqual(evt_dict['data'], event_data)
self.assertEqual(evt_dict['module'], module)
def test_hash_attribute_root_event_should_return_root_as_a_string(self):
event_type = 'ROOT'
event_data = 'example event data'
module = ''
source_event = ''
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt_hash = evt.hash
self.assertEqual('ROOT', evt_hash)
def test_hash_attribute_nonroot_event_should_return_a_string(self):
event_type = 'ROOT'
event_data = 'example event data'
module = 'example module'
source_event = SpiderFootEvent(event_type, event_data, module, "ROOT")
event_type = 'not ROOT'
evt = SpiderFootEvent(event_type, event_data, module, source_event)
evt_hash = evt.hash
self.assertIsInstance(evt_hash, str)
| 36.97954 | 89 | 0.644166 | 1,575 | 14,459 | 5.551746 | 0.043175 | 0.100869 | 0.111505 | 0.129689 | 0.881176 | 0.866995 | 0.846638 | 0.827196 | 0.815874 | 0.815188 | 0 | 0.001996 | 0.272495 | 14,459 | 390 | 90 | 37.074359 | 0.829261 | 0.02967 | 0 | 0.789655 | 0 | 0 | 0.074187 | 0 | 0 | 0 | 0 | 0 | 0.117241 | 1 | 0.096552 | false | 0 | 0.006897 | 0 | 0.106897 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b942a31c7182d17be9f04823334a325a4e739ee3 | 1,468 | py | Python | test/acceptance.py | rodrigo-garcia-leon/todo-lists | 14c1270542427d15f96611cb85b7db3aec848a9a | [
"MIT"
] | null | null | null | test/acceptance.py | rodrigo-garcia-leon/todo-lists | 14c1270542427d15f96611cb85b7db3aec848a9a | [
"MIT"
] | null | null | null | test/acceptance.py | rodrigo-garcia-leon/todo-lists | 14c1270542427d15f96611cb85b7db3aec848a9a | [
"MIT"
] | null | null | null | """Acceptance test for the API"""
import requests
def test_acceptance():
"""Acceptance test for the API."""
response = requests.get('http://localhost:5000/todos')
assert response.status_code == 200
assert len(response.json()) == 0
response = requests.post('http://localhost:5000/todos',
json={'title': 'Buy milk'})
assert response.status_code == 201
assert response.json() == {
'title': 'Buy milk',
'done': False,
}
response = requests.get('http://localhost:5000/todos')
assert response.status_code == 200
assert response.json() == [{
'title': 'Buy milk',
'done': False,
}]
response = requests.patch('http://localhost:5000/todos',
json={'title': 'Buy milk', 'done': True})
assert response.status_code == 200
assert response.json() == {
'title': 'Buy milk',
'done': True,
}
response = requests.get('http://localhost:5000/todos')
assert response.status_code == 200
assert response.json() == [{
'title': 'Buy milk',
'done': True,
}]
response = requests.delete('http://localhost:5000/todos',
json={'title': 'Buy milk'})
assert response.status_code == 204
assert response.text == ''
response = requests.get('http://localhost:5000/todos')
assert response.status_code == 200
assert len(response.json()) == 0
| 29.36 | 71 | 0.571526 | 158 | 1,468 | 5.259494 | 0.208861 | 0.202166 | 0.143201 | 0.185319 | 0.89651 | 0.841155 | 0.831528 | 0.831528 | 0.7858 | 0.7858 | 0 | 0.047442 | 0.267711 | 1,468 | 49 | 72 | 29.959184 | 0.725581 | 0.038147 | 0 | 0.710526 | 0 | 0 | 0.214133 | 0 | 0 | 0 | 0 | 0 | 0.368421 | 1 | 0.026316 | false | 0 | 0.026316 | 0 | 0.052632 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
b98cab8ab6346f9bb9a26124d66d09dc84dee209 | 808,953 | py | Python | svelter_sv/svelter.py | mills-lab/svelter | d318b06d588483fe8a8ebcac8c8a6c7878f2c2b3 | [
"MIT"
] | 21 | 2015-11-02T06:31:52.000Z | 2021-12-20T03:14:04.000Z | svelter_sv/svelter.py | mills-lab/svelter | d318b06d588483fe8a8ebcac8c8a6c7878f2c2b3 | [
"MIT"
] | 14 | 2016-03-02T21:12:53.000Z | 2019-08-02T20:01:02.000Z | svelter_sv/svelter.py | mills-lab/svelter | d318b06d588483fe8a8ebcac8c8a6c7878f2c2b3 | [
"MIT"
] | 6 | 2015-08-19T18:33:02.000Z | 2017-05-16T03:42:57.000Z | #!/usr/bin/env python
#!Python
#Usage:
#SVelter.py [option] [Parametres]
#option:
#For debug use only
#command='SVelter.py SVPredict --deterministic-flag 1 --workdir /mnt/EXT/Mills-scratch2/Xuefang/NA12878.NGS --sample /mnt/EXT/Mills-scratch2/Xuefang/NA12878.NGS/alignment/NA12878_S1.chr10.bam'
#command='SVelter.py SVPredict --deterministic-flag 1 --workdir /scratch/remills_flux/xuefzhao/NA12878.NGS/hg19 --sample /scratch/remills_flux/xuefzhao/NA12878.NGS/hg19/alignment/NA12878_S1.chr10.bam --bp-file /scratch/remills_flux/xuefzhao/NA12878.NGS/hg19/bp_files.NA12878_S1.chr10.bam/NA12878_S1.chr10.txt'
#command='SVelter_Add_cram.03312016.py PredefinedBP --input-bed /mnt/EXT/Mills-scratch2/Xuefang/NA12878.NGS/het_RD10_INV.INV.bed --workdir /mnt/EXT/Mills-scratch2/Xuefang/NA12878.NGS/predefinedBP_Test/ --sample /mnt/EXT/Mills-scratch2/Xuefang/NA12878.NGS/alignment/NA12878_S1.bam'
#sys.argv=command.split()
from __future__ import print_function
import os,re,sys
from svelter_sv import readme
script_name=sys.argv[0]
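#Builds the global paths of the per-sample null-model statistic files
#(insert-length, read-depth, through-BP and split-length models) under NullPath.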
def stat_file_name(bamF_Name,genome_name):
global ILStat
ILStat=NullPath+'ILNull.'+bamF_Name+'.'+genome_name+'.Bimodal'
global IL_Null_Stat
IL_Null_Stat=NullPath+bamF_Name+'.'+genome_name+'.density.null'
global RDStat
RDStat=NullPath+'RDNull.'+bamF_Name+'.'+genome_name+'.NegativeBinomial'
global TBStat
TBStat=NullPath+'TBNull.'+bamF_Name+'.'+genome_name+'.Bimodal'
global SPLenStat
SPLenStat=NullPath+bamF_Name+'.'+genome_name+'.SplitLength'
global AllStat
AllStat=NullPath+bamF_Name+'.'+genome_name+'.Stats'
if len(sys.argv)<2:
readme.print_default_parameters()
else:
from svelter_sv.function import *
function_name=sys.argv[1]
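#Each sub-command below (Clean, Setup, NullModel, BPSearch, ...) parses its own getopt
#option list and falls back to the readme help text when called without arguments.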
if function_name=='Clean':
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','ref-index=','help=','batch=','prefix=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','segdup=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']: readme.print_default_parameters_clean()
else:
workdir=path_modify(dict_opts['--workdir'])
clean_svelter_set(workdir+'reference_SVelter/')
print('SVelter Cleared Up!')
if function_name=='Setup':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['support=','deterministic-flag=','ref-index=','help=','batch=','prefix=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','segdup=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
Code_path='/'.join(sys.argv[0].split('/')[:-1])+'/'
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']: readme.print_default_parameters_setup()
else:
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
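#final_regions_decide: merge the excluded/gap intervals for this chromosome with the
#intervals outside the N-free blocks (hash_Cor), fuse gaps closer than 1kb, and return
#the complementary (mappable) intervals via calculate_interval_region.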
def final_regions_decide(Gap_Hash_Ref1,hash_Cor,chrom):
GapHash=Gap_Hash_Ref1[chrom]
GapHash2=calculate_interval_region(hash_Cor,Chromo_Length,chrom)
GapHash+=GapHash2
temp_hash={}
for k1 in GapHash:
if not k1[0] in list(temp_hash.keys()): temp_hash[k1[0]]={}
if not k1[1] in list(temp_hash[k1[0]].keys()): temp_hash[k1[0]][k1[1]]=[]
temp_list=[]
for k1 in sorted(temp_hash.keys()):
for k2 in sorted(temp_hash[k1].keys()): temp_list.append([k1,k2])
temp2_list=[]
temp2_list.append(temp_list[0])
for k1 in temp_list[1:]:
if k1[0]-temp2_list[-1][1]<1000:
if k1[1]>temp2_list[-1][1]: temp2_list[-1][1]=k1[1]
else: temp2_list.append(k1)
temp3_list=[]
for k1 in temp2_list:
if not k1 in temp3_list: temp3_list.append(k1)
return calculate_interval_region(temp3_list,Chromo_Length,chrom)
def Gap_Hash_Ref_filter(Gap_Hash_Ref1,Chromo_Length):
for x in list(Gap_Hash_Ref1.keys()):
if len(Gap_Hash_Ref1[x])==1:
if Gap_Hash_Ref1[x][0][0]==0:
if Gap_Hash_Ref1[x][0][1]==Chromo_Length[x]: del Gap_Hash_Ref1[x]
def Gap_Hash_Ref1_read_in(Gap_Refs):
Gap_Hash_Ref1={}
for Gap_Ref1 in Gap_Refs:
fgap=open(Gap_Ref1)
for line in fgap:
pgap=line.strip().split()
if pgap[0] in chromos:
if not pgap[0] in list(Gap_Hash_Ref1.keys()): Gap_Hash_Ref1[pgap[0]]=[]
Gap_Hash_Ref1[pgap[0]].append(pgap[1:4])
fgap.close()
Gap_Hash_Ref2=bed_hash_short(Gap_Hash_Ref1,Chromo_Length)
for x in chromos:
if not x in list(Gap_Hash_Ref2.keys()): Gap_Hash_Ref2[x]=[[0,0]]
return Gap_Hash_Ref2
def Global_para_declear_setup():
global chromos
global Chromo_Length
global Gap_Hash
def write_ExcludeBed(ExcludeBed):
if not os.path.isfile(ExcludeBed):
fo=open(ExcludeBed,'w')
for chr_ex in chromos: print(' '.join([chr_ex,'0','0']), file=fo)
fo.close()
if not '--workdir' in list(dict_opts.keys()):
print('working directory not specified')
print('all temporary files will be written under the current directory')
workdir='./'
#print 'Error: please specify working directory using --workdir'
else:
workdir = path_modify(dict_opts['--workdir'])
path_mkdir(workdir)
ref_file=0
if not '--reference' in list(dict_opts.keys()):
print('Error: please specify reference genome using --reference')
else:
Global_para_declear_setup()
ref_file=dict_opts['--reference']
ref_path='/'.join(ref_file.split('/')[:-1])+'/'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index):
print('Error: reference genome not indexed')
print('Please index reference genome using samtools')
else:
#if not '--svelter-path' in dict_opts.keys():
# print 'Error: please specify path of SVelter scripts using --svelter-path'
#else:
time1=time.time()
ref_path=workdir+'reference_SVelter/'
if not ref_path=='/'.join(ref_file.split('/')[:-1])+'/':
ref_path=workdir+'reference_SVelter/'
path_mkdir(ref_path)
if not ref_file[0]=='/': print('Error: reference should be specified using an absolute path!')
os.symlink(ref_file,ref_path+'genome.fa')
os.symlink(ref_index,ref_path+'genome.fa.fai')
if '--ref-index' in list(dict_opts.keys()):
if os.path.isdir(dict_opts['--ref-index']):
ref_index_path=path_modify(dict_opts['--ref-index'])
for ref_index_file in os.listdir(ref_index_path):
if ref_index_file.split('.')[-1]=='GC_Content':
if ref_index_path[0]=='/': os.symlink(ref_index_path+ref_index_file,ref_path+'genome.GC_Content')
else: os.system(r'''cp %s %s'''%(ref_index_path+ref_index_file,ref_path+'genome.GC_Content'))
if ref_index_file.split('.')[-1]=='bed' and ref_index_file.split('.')[-2]=='Mappable':
if ref_index_path[0]=='/': os.symlink(ref_index_path+ref_index_file,ref_path+'genome.Mappable.bed')
else: os.system(r'''cp %s %s'''%(ref_index_path+ref_index_file,ref_path+'genome.Mappable.bed'))
if '--support' in list(dict_opts.keys()):
support_path=path_modify(dict_opts['--support'])
for k1 in os.listdir(support_path):
if 'SVelter' in k1 and k1.split('.')[-1]=='r':
if support_path[0]=='/': os.symlink(support_path+k1,ref_path+k1)
else: os.system(r'''cp %s %s'''%(support_path+k1,ref_path))
if 'CN2' in k1:
if not '--copyneutral' in list(dict_opts.keys()): dict_opts['--copyneutral']=support_path+k1
if 'Exclude' in k1:
if not '--exclude' in list(dict_opts.keys()): dict_opts['--exclude']=support_path+k1
if 'Segdup' in k1:
if not '--segdup' in list(dict_opts.keys()): dict_opts['--segdup']=support_path+k1
if '--copyneutral' in list(dict_opts.keys()):
if dict_opts['--copyneutral'][0]=='/': os.symlink(dict_opts['--copyneutral'],ref_path+'CN2.bed')
else: os.system(r'''cp %s %s'''%(dict_opts['--copyneutral'],ref_path+'CN2.bed'))
else:
[whole_genome,len_genome]=calculate_len_genome(ref_file)
random_produce_cn2_region(ref_path+'CN2.bed',whole_genome,len_genome,dict_opts)
if '--exclude' in list(dict_opts.keys()):
if dict_opts['--exclude'][0]=='/': os.symlink(dict_opts['--exclude'],ref_path+'Exclude.bed')
else: os.system(r'''cp %s %s'''%(dict_opts['--exclude'],ref_path+'Exclude.bed'))
else:
chromos=chromos_readin_list(ref_file)
random_produce_exclude_region(ref_path+'Exclude.bed',chromos)
if '--segdup' in list(dict_opts.keys()):
if dict_opts['--segdup'][0]=='/': os.symlink(dict_opts['--segdup'],ref_path+'Segdup.bed')
else: os.system(r'''cp %s %s'''%(dict_opts['--segdup'],ref_path+'Segdup.bed'))
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
ExcludeBed=ref_path+'Exclude.bed'
[chromos,Chromo_Length]=chromos_info_readin(ref_index)
write_ExcludeBed(ExcludeBed)
fout_Name='.'.join(ref_file.split('.')[:-1])+'.Mappable.bed'
fout_N2='.'.join(ref_file.split('.')[:-1])+'.GC_Content'
if not os.path.isfile(fout_Name):
Gap_Refs=[ExcludeBed]
Gap_Hash_Ref1=Gap_Hash_Ref1_read_in(Gap_Refs)
Gap_Hash_Ref_filter(Gap_Hash_Ref1,Chromo_Length)
Gap_Hash=Gap_Hash_Initiate(chromos)
file_initiate(fout_Name)
file_initiate(fout_N2)
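#For every chromosome: stream the sequence through samtools faidx, record the coordinates
#of N-free stretches, intersect them with the exclude list, and write mappable intervals
#(>1kb) to genome.Mappable.bed plus per-100bp GC fractions to genome.GC_Content.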
for chrom in chromos:
fref=os.popen(r'''samtools faidx %s %s:'''%(ref_file,chrom))
pref=fref.readline().strip().split()
while True:
pref=fref.readline().strip().split()
if not pref:break
Gap_Hash[chrom].append(pref[0])
fref.close()
fout=open(fout_Name,'a')
fout2=open(fout_N2,'a')
hash_key=chrom
if not Gap_Hash[hash_key]==[]:
hash_Cor=[]
hash_cal=0
if not ''.join(set(Gap_Hash[hash_key][0])) in ['N','n','Nn','nN']: hash_Cor.append([0])
for hts in Gap_Hash[hash_key][1:]:
hash_cal+=len(hts)
if ''.join(set(hts)) in ['N','n','Nn','nN']:
if len(hash_Cor)==0: continue
else:
if len(hash_Cor[-1])==1: hash_Cor[-1].append(hash_cal)
else:
if len(hash_Cor)==0: hash_Cor.append([hash_cal])
elif len(hash_Cor[-1])==2: hash_Cor.append([hash_cal])
hash_Cor=hash_Cor_modify(hash_Cor,Chromo_Length,hash_key,chrom)
hash_to_Seq=''.join(Gap_Hash[hash_key])
hakey2=hash_key
if chrom in list(Gap_Hash_Ref1.keys()):
Cor_Gap2=final_regions_decide(Gap_Hash_Ref1,hash_Cor,chrom)
for key1 in Cor_Gap2:
if key1[1]-key1[0]>1000:
print(' '.join([hakey2, str(key1[0]),str(key1[1])]), file=fout)
print(' '.join([hakey2, str(key1[0]),str(key1[1])]), file=fout2)
GC_out=[]
for key2 in [int(i) for i in range(int((key1[1]-key1[0])/100))]:
GC_region=hash_to_Seq[(key1[0]+key2*100):(key1[0]+(key2+1)*100)]
GC_out.append(str(float(GC_region.count('g')+GC_region.count('G')+GC_region.count('c')+GC_region.count('C'))/100.0))
GC_region=hash_to_Seq[(key1[0]+(key2+1)*100):key1[1]]
if len(GC_region)>0: GC_out.append(str(float(GC_region.count('g')+GC_region.count('G')+GC_region.count('c')+GC_region.count('C'))/float(len(GC_region))))
print(' '.join(GC_out), file=fout2)
fout.close()
fout2.close()
time2=time.time()
print('Support files completely set!')
print('Time consumed: '+str(time2-time1))
if function_name=='NullModel':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','out-path=','help=','long-insert=','batch=','prefix=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
def dict_opts_modify(dict_opts):
global model_comp
if not '--null-model' in list(dict_opts.keys()): model_comp='C'
else:
if dict_opts['--null-model'] in ['S','Simple']: model_comp='S'
else: model_comp='C'
global ReadLength
global ReadLength_Flag
if '--read-length' in list(dict_opts.keys()):
ReadLength=int(dict_opts['--read-length'])
ReadLength_Flag=1
else:
ReadLength=0
ReadLength_Flag=0
global QCAlign
if '--qc-align' in list(dict_opts.keys()): QCAlign=int(dict_opts['--qc-align'])
else: QCAlign=20
global QCSplit
if '--qc-split' in list(dict_opts.keys()): QCSplit=int(dict_opts['--qc-split'])
else: QCSplit=20
global NullSplitLen_perc
if '--split-min-len' in list(dict_opts.keys()): NullSplitLen_perc=int(dict_opts['--split-min-len'])
else: NullSplitLen_perc=0.9
global NullILCIs
if '--NullILCI' in list(dict_opts.keys()): NullILCIs=dict_opts['--NullILCI']
else: NullILCIs=[0.025,0.05,0.95,0.975]
global NullRDCIs
if '--NullRDCI' in list(dict_opts.keys()): NullRDCIs=dict_opts['--NullRDCI']
else: NullRDCIs=[0.025,0.05,0.95,0.975]
global NullTBCIs
if '--NullTBCI' in list(dict_opts.keys()): NullTBCIs=dict_opts['--NullTBCI']
else: NullTBCIs=[0.0001,0.0005,0.9999,0.9995]
global NullILCff
if '--NullILCff'in list(dict_opts.keys()): NullILCff=dict_opts['--NullILCff']
else: NullILCff=0.999
global NullSPCff
if '--NullSPCff' in list(dict_opts.keys()): NullSPCff=dict_opts['--NullSPCff']
else: NullSPCff=0.999
global NullDRCff
if '--NullDRCff' in list(dict_opts.keys()): NullDRCff=dict_opts['--NullDRCff']
else: NullDRCff=0.999
global KeepFile
if '--keep-temp-files' in list(dict_opts.keys()): KeepFile=dict_opts['--keep-temp-files']
else: KeepFile='No'
global KeepFigure
if '--keep-temp-figs' in list(dict_opts.keys()): KeepFigure=dict_opts['--keep-temp-figs']
else: KeepFigure='Yes'
dict_opts_modify(dict_opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters_nullmodel()
else:
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
def clean_files():
os.system('''rm %s'''%(InsertLenNullTemp))
os.system('''rm %s'''%(DRNullTemp))
os.system('''rm %s'''%(SplitNullTemp))
os.system('''rm %s'''%(ILNullTemp))
os.system('''rm %s'''%(TBNullTemp))
os.system('''rm %s'''%(RDNullTemp))
def global_para_declaration_nullmodel():
global bam_path
global bam_files
global bam_file_appdix
global cn2_file
global len_genome
global NullPath
global ref_path
global ref_file
global ref_index
global SamplingPercentage
global whole_genome
bam_path='/'.join(dict_opts['--sample'].split('/')[:-1])+'/'
bam_files=[dict_opts['--sample']]
bam_file_appdix=dict_opts['--sample'].split('.')[-1]
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
cn2_file=ref_path+'CN2.bed'
if not '--workdir' in list(dict_opts.keys()):
print('working directory not specified')
print('all temporary files will be written under the current directory')
workdir='./'
else: workdir=path_modify(dict_opts['--workdir'])
if not '--sample' in list(dict_opts.keys()): print('Error: please specify the input file using --sample')
else:
global_para_declaration_nullmodel()
if not os.path.isfile(ref_index): print('Error: reference genome not indexed')
else:
SamplingPercentage=SamplingPercentage_readin(dict_opts)
[whole_genome,len_genome]=calculate_len_genome(ref_file)
chromos=chromos_readin_list(ref_file)
if os.path.isfile(cn2_file):cn2_length=cn2_length_readin(dict_opts)
else: [cn2_length,SamplingPercentage,whole_genome,len_genome]=cn2_region_write(cn2_file,ref_file) #'ref' was undefined here; ref_file is presumably the intended argument
if not '--chromosome' in list(dict_opts.keys()): chr_flag=0
elif '--chromosome' in list(dict_opts.keys()):
chr_flag=1
chrom_single=dict_opts['--chromosome']
if not chrom_single in chromos:
print('Error: please make sure the chromosome defined by --chromosome is correct based on the reference genome')
chromos=[]
else: chromos=[chrom_single]
if not chromos==[]:
genome_name=genome_name_readin(dict_opts)
NullPath=NullPath_SetUp(workdir,dict_opts)
path_BP=PathBP_SetUp(NullPath)
print('temp files produced under: '+workdir)
Script_Path=workdir+'reference_SVelter/'
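#NullModel main loop: for each input BAM, sample reads from copy-neutral (CN2) regions to
#build empirical null distributions of insert length, read depth, split reads, aberrant
#read-pair direction and reads spanning a putative break point (ThroughBP).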
for bamF in bam_files:
time1=time.time()
if ReadLength_Flag==0: ReadLengthHash={}
#outputfile=NullPath+bamF.split('/')[-1].replace('.'+bam_file_appdix,'')+'.'+genome_name+'.null'
outputfile=NullPath+'.'.join(bamF.split('/')[-1].split('.')[:-1]) +'.'+genome_name+'.null'
fo=open(outputfile,'w')
print(' '.join(['position','GCContent','ReadDepth','SplitReads','AbnormalDirection','ThroughBP']), file=fo)
fo.close()
SplitLength={}
InsertLength={}
fcn2=open(cn2_file)
chr_cn2=[]
cn2_regions=[]
while True:
pcn2=fcn2.readline().strip().split()
if not pcn2: break
if not len(pcn2)==3: break
if pcn2[0] in chromos:
if not pcn2[0] in chr_cn2: chr_cn2.append(pcn2[0])
if (int(pcn2[2])-int(pcn2[1]))>cn2_length and (int(pcn2[2])-int(pcn2[1]))<10**6:
if not random.choice(list(range(100)))>SamplingPercentage*100: cn2_regions.append(pcn2)
fcn2.close()
if chr_cn2==[]:
whole_genome=chromos_read_in(ref_file)
cn2_regions=random_pick_cn2_region(cn2_file,whole_genome,chromos,len_genome,dict_opts)
chr_cn2=chromos
for pcn2 in cn2_regions:
freadpairs=os.popen('''samtools view -F 256 %s %s:%d-%d'''%(bamF,pcn2[0],int(pcn2[1])+100,int(pcn2[2])-100))
while True:
preadpair=freadpairs.readline().strip().split()
if not preadpair: break
if not int(preadpair[4])>QCAlign: continue
if preadpair[8]=='0': continue
if not abs(int(preadpair[8])) in list(InsertLength.keys()):InsertLength[abs(int(preadpair[8]))]=1
else: InsertLength[abs(int(preadpair[8]))]+=1
if 'S' in preadpair[5]:
SplitLen=cigar2splitlength(preadpair[5])
for s in SplitLen:
if not s in list(SplitLength.keys()): SplitLength[s]=1
else: SplitLength[s]+=1
freadpairs.close()
if not chr_cn2==[]:
SplitLenPNum=SplitLenPNum_Calculate(SplitLength,NullPath,bamF,bam_file_appdix,genome_name,NullSplitLen_perc)
TotalILNum=IL_Stat_Calculate(InsertLength)
NullILCI=NullILCI_Calculate(InsertLength,TotalILNum,NullILCIs)
Window_Size=int(float(NullILCI[0])/3)
cn2_length=max([cn2_length,NullILCI[2]])
cn2_max_len=max(cn2_length*100,10**6)
[ILNullDensity,RDNullDensity,DRNullDensity,TBNullDensity,SplitNullDensity,GC_Content]=[{} for i in range(6)]
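#Scan every sampled CN2 region in windows of Window_Size (one third of the lower
#insert-length CI bound) and tally, per window: read depth, aberrant insert lengths,
#qualifying split reads, aberrant pair orientations and pairs spanning the window.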
for pcn2 in cn2_regions:
if not len(pcn2)==3: break
if pcn2[0] in chromos:
pos=[int(pcn2[1])+i*Window_Size for i in range(int(float(int(pcn2[2])-int(pcn2[1]))/Window_Size))]
pos0=[pcn2[0]+'_'+str(i*Window_Size) for i in pos]
RDNull=[0 for i in pos]
SplitNull=[0 for i in pos]
DRNull=[0 for i in pos]
TBNull=[0 for i in pos]
ILNull=[0 for i in pos]
readInf=[]
freadpairs=os.popen('''samtools view -F 256 %s %s:%d-%d'''%(bamF,pcn2[0],int(pcn2[1]),int(pcn2[2])))
while True:
preadpair=freadpairs.readline().strip().split()
if not preadpair: break
if not int(preadpair[4])>QCAlign: continue
if not int(preadpair[3])>pos[0]: continue
block_num=int(max(int(preadpair[3])-pos[0],0)/Window_Size)
if not block_num<len(pos): continue
if ReadLength_Flag==0:
if not preadpair[9]=='*':
if not len(preadpair[9]) in list(ReadLengthHash.keys()): ReadLengthHash[len(preadpair[9])]=1
else: ReadLengthHash[len(preadpair[9])]+=1
RDNull[block_num]+=cigar2reaadlength(preadpair[5])
if not int(preadpair[8])<NullILCI[0] and not int(preadpair[8])>NullILCI[-1]:
ILNull[block_num]+=1
if not preadpair[5].find('S')==-1:
splitpos=[i+int(preadpair[3]) for i in cigar2split(preadpair[5])]
splitlent=cigar2splitlength(preadpair[5])
for j in range(len(splitpos)):
if not splitlent[j]<SplitLenPNum and splitpos[j] in pos: SplitNull[block_num]+=1
if preadpair[6]=='=':
if not Reads_Direction_Detect(preadpair)==['+', '-']:
abdrpos=min([int(preadpair[3]),int(preadpair[7])])-int(pcn2[1])-100
if abdrpos>-1 and abdrpos<(int(pcn2[2])-int(pcn2[1])-200): DRNull[int(abdrpos/Window_Size)]+=1
if not preadpair[0] in readInf:
for j in range(int(max((int(preadpair[3])-pos[0])/Window_Size,0)), int(min((int(preadpair[3])+int(preadpair[8])-pos[0])/Window_Size,len(pos)))): TBNull[int(j)]+=1
readInf.append(preadpair[0])
freadpairs.close()
if not sum(RDNull)==0:
fref=os.popen('''samtools faidx %s %s:%d-%d'''%(ref_file,pcn2[0],int(pcn2[1]),int(pcn2[1])+(int(pcn2[2])-int(pcn2[1]))/Window_Size*Window_Size))
tref=fref.readline().strip().split()
REFSEQUENCE=fref.readline().strip().split()
while True:
pref=fref.readline().strip().split()
if not pref: break
REFSEQUENCE=[''.join(REFSEQUENCE+pref)]
fref.close()
GCNull=[int(float(REFSEQUENCE[0][(Window_Size*i):(Window_Size*i+Window_Size)].count('G')+REFSEQUENCE[0][(Window_Size*i):(Window_Size*i+Window_Size)].count('C')+REFSEQUENCE[0][(Window_Size*i):(Window_Size*i+Window_Size)].count('g')+REFSEQUENCE[0][(Window_Size*i):(Window_Size*i+Window_Size)].count('c'))/float(Window_Size)*100) for i in range(int(len(REFSEQUENCE[0])/Window_Size))]
fo=open(outputfile,'a')
for k in range(len(pos)):
if not RDNull[k]==0: print(' '.join([str(pos0[k]),str(GCNull[k]),str(RDNull[k]),str(SplitNull[k]),str(DRNull[k]),str(TBNull[k])]), file=fo)
for i in range(len(RDNull)):
if not GCNull[i] in list(GC_Content.keys()): GC_Content[GCNull[i]]=[RDNull[i]]
if GCNull[i] in list(GC_Content.keys()): GC_Content[GCNull[i]].append(RDNull[i])
fo.close()
for k in range(len(pos)):
if not ILNull[k] in list(ILNullDensity.keys()): ILNullDensity[ILNull[k]]=1
elif ILNull[k] in list(ILNullDensity.keys()): ILNullDensity[ILNull[k]]+=1
if not RDNull[k] in list(RDNullDensity.keys()): RDNullDensity[RDNull[k]]=1
elif RDNull[k] in list(RDNullDensity.keys()): RDNullDensity[RDNull[k]]+=1
if not SplitNull[k] in list(SplitNullDensity.keys()): SplitNullDensity[SplitNull[k]]=1
elif SplitNull[k] in list(SplitNullDensity.keys()): SplitNullDensity[SplitNull[k]]+=1
if not DRNull[k] in list(DRNullDensity.keys()): DRNullDensity[DRNull[k]]=1
elif DRNull[k] in list(DRNullDensity.keys()): DRNullDensity[DRNull[k]]+=1
for k in range(len(pos))[1:]:
if not TBNull[k] in list(TBNullDensity.keys()): TBNullDensity[TBNull[k]]=1
elif TBNull[k] in list(TBNullDensity.keys()): TBNullDensity[TBNull[k]]+=1
if 0 in list(RDNullDensity.keys()):
del RDNullDensity[0]
if 0 in list(TBNullDensity.keys()):
del TBNullDensity[0]
[OverallRDDenominator,OverallRDNumeritor]=[0,0]
if not RDNullDensity=={}:
for key in list(RDNullDensity.keys()):
if not key==0:
OverallRDNumeritor+=key*RDNullDensity[key]
OverallRDDenominator+=RDNullDensity[key]
OverallRDNullDensity=float(OverallRDNumeritor)/float(OverallRDDenominator)
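#GC-bias correction: re-read the per-window table and rescale each read-depth value by
#(genome-wide mean depth / mean depth observed at that window's GC bin), accumulating the
#adjusted depths in RD_Af_Adj.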
fbRD=open(outputfile)
pbRD=fbRD.readline().strip().split()
RD_Af_Adj={}
for key in list(GC_Content.keys()):
GC_Content[key]=numpy.mean(GC_Content[key])
while True:
pbRD=fbRD.readline().strip().split()
if not pbRD: break
if not len(pbRD)==6 : break
if int(pbRD[1]) in list(GC_Content.keys()):
RDAfAdj=int(pbRD[2])*OverallRDNullDensity/GC_Content[int(pbRD[1])]
if not int(RDAfAdj) in list(RD_Af_Adj.keys()):
RD_Af_Adj[int(RDAfAdj)]=1
elif int(RDAfAdj) in list(RD_Af_Adj.keys()):
RD_Af_Adj[int(RDAfAdj)]+=1
fbRD.close()
RDMedian=numpy.median(list(RD_Af_Adj.keys()))
for key in list(RD_Af_Adj.keys()):
if key > RDMedian*10 or key==0:del RD_Af_Adj[key]
TotalRDNum=0
for key in list(RD_Af_Adj.keys()):TotalRDNum+=RD_Af_Adj[key]
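#Empirical confidence intervals: walk the sorted adjusted-depth histogram from both tails
#until the cumulative counts cross the quantiles in NullRDCIs (the same scheme is repeated
#below for the ThroughBP counts).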
[NullRDCILeft,NullRDCIRight,SubRDNumleft,SubRDNumright,NciLeft,NciRight]=[[],[],0,0,0,0]
for keyn in range(len(sorted(RD_Af_Adj.keys()))):
SubRDNumleft+=RD_Af_Adj[sorted(RD_Af_Adj.keys())[keyn]]
SubRDNumright+=RD_Af_Adj[sorted(RD_Af_Adj.keys())[-(keyn+1)]]
if NciLeft<len(NullRDCIs)/2:
if SubRDNumleft<NullRDCIs[NciLeft]*float(TotalRDNum): continue
if not SubRDNumleft<NullRDCIs[NciLeft]*float(TotalRDNum):
if len(NullRDCILeft)==NciLeft:
NullRDCILeft.append(sorted(RD_Af_Adj.keys())[keyn])
NciLeft+=1
if NciRight<(len(NullRDCIs)/2):
if SubRDNumright<NullRDCIs[NciRight]*float(TotalRDNum): continue
if not SubRDNumright<NullRDCIs[NciRight]*float(TotalRDNum):
if len(NullRDCIRight)==NciRight:
NullRDCIRight.append(sorted(RD_Af_Adj.keys())[-(keyn+1)])
NciRight+=1
if NciLeft==len(NullRDCIs)/2 and NciRight==len(NullRDCIs)/2: break
NullRDCI=NullRDCILeft+sorted(NullRDCIRight)
[TotalTBNum,TBNullDensity]=TBNullDensity_CleanUP(TBNullDensity)
[NullTBCILeft,NullTBCIRight,SubTBNumleft,SubTBNumright,NciLeft,NciRight]=[[],[],0,0,0,0]
for keyn in range(len(sorted(TBNullDensity.keys()))):
SubTBNumleft+=TBNullDensity[sorted(TBNullDensity.keys())[keyn]]
SubTBNumright+=TBNullDensity[sorted(TBNullDensity.keys())[-(keyn+1)]]
if NciLeft<len(NullTBCIs)/2:
if SubTBNumleft<NullTBCIs[NciLeft]*float(TotalTBNum): continue
if not SubTBNumleft<NullTBCIs[NciLeft]*float(TotalTBNum):
if len(NullTBCILeft)==NciLeft:
NullTBCILeft.append(sorted(TBNullDensity.keys())[keyn])
NciLeft+=1
if NciRight<(len(NullTBCIs)/2):
if SubTBNumright<NullTBCIs[NciRight]*float(TotalTBNum): continue
if not SubTBNumright<NullTBCIs[NciRight]*float(TotalTBNum):
if len(NullTBCIRight)==NciRight:
NullTBCIRight.append(sorted(TBNullDensity.keys())[-(keyn+1)])
NciRight+=1
if NciLeft==len(NullTBCIs)/2 and NciRight==len(NullTBCIs)/2: break
NullTBCI=NullTBCILeft+sorted(NullTBCIRight)
TotalILNum=0
for key in list(ILNullDensity.keys()): TotalILNum+=ILNullDensity[key]
ILITotal=0
for key in sorted(ILNullDensity.keys()):
ILITotal+=ILNullDensity[key]
if float(ILITotal)/float(TotalILNum)>NullILCff: break
if sorted(ILNullDensity.keys()).index(key)>4 or len(list(ILNullDensity.keys()))<4:
ILIPoint=sorted(ILNullDensity.keys())[:sorted(ILNullDensity.keys()).index(key)+1]
ILIPoint2=[float(ILNullDensity[i])/float(TotalILNum) for i in ILIPoint]
ILIPoint3=[]
for k in range(len(ILIPoint)): ILIPoint3.append(sum(ILIPoint2[:(k+1)]))
else:
ILIPoint=sorted(ILNullDensity.keys())[:4]
ILIPoint2=[float(ILNullDensity[i])/float(TotalILNum) for i in ILIPoint]
ILIPoint3=[]
for k in range(len(ILIPoint)): ILIPoint3.append(sum(ILIPoint2[:(k+1)]))
TotalSplitNum=0
for key in list(SplitNullDensity.keys()): TotalSplitNum+=SplitNullDensity[key]
SplitITotal=0
for key in sorted(SplitNullDensity.keys()):
SplitITotal+=SplitNullDensity[key]
if float(SplitITotal)/float(TotalSplitNum)>NullSPCff: break
if sorted(SplitNullDensity.keys()).index(key)>4 or len(list(SplitNullDensity.keys()))<4:
SplitIPoint=sorted(SplitNullDensity.keys())[:sorted(SplitNullDensity.keys()).index(key)+1]
SplitIPoint2=[float(SplitNullDensity[i])/float(TotalSplitNum) for i in SplitIPoint]
SplitIPoint3=[]
for k in range(len(SplitIPoint)): SplitIPoint3.append(sum(SplitIPoint2[:(k+1)]))
else:
SplitIPoint=sorted(SplitNullDensity.keys())[:4]
SplitIPoint2=[float(SplitNullDensity[i])/float(TotalSplitNum) for i in SplitIPoint]
SplitIPoint3=[]
for k in range(len(SplitIPoint)): SplitIPoint3.append(sum(SplitIPoint2[:(k+1)]))
TotalDRNum=0
for key in list(DRNullDensity.keys()): TotalDRNum+=DRNullDensity[key]
DRITotal=0
for key in sorted(DRNullDensity.keys()):
DRITotal+=DRNullDensity[key]
if float(DRITotal)/float(TotalDRNum)>NullDRCff: break
if sorted(DRNullDensity.keys()).index(key)>4 or len(list(DRNullDensity.keys()))<4:
DRIPoint=sorted(DRNullDensity.keys())[:(sorted(DRNullDensity.keys()).index(key)+1)]
DRIPoint2=[float(DRNullDensity[i])/float(TotalDRNum) for i in DRIPoint]
DRIPoint3=[]
for k in range(len(DRIPoint)): DRIPoint3.append(sum(DRIPoint2[:(k+1)])) #iterate over indices, mirroring the IL/Split blocks above; iterating over the raw keys looks unintended
else:
DRIPoint=sorted(DRNullDensity.keys())[:4]
DRIPoint2=[float(DRNullDensity[i])/float(TotalDRNum) for i in DRIPoint]
DRIPoint3=[]
for k in range(len(DRIPoint)): DRIPoint3.append(sum(DRIPoint2[:(k+1)])) #as above: iterate over indices rather than keys
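#Write the .Stats summary consumed later by BPSearch: the IL/RD/TB confidence intervals,
#cumulative frequencies for aberrant-insert, split-read and aberrant-direction counts, and
#(if not given on the command line) the most common read length.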
outputfileStat=NullPath+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.Stats'
fos=open(outputfileStat,'w')
print('Insert Length CIs', file=fos)
print(' '.join([str(i) for i in NullILCIs]), file=fos)
print(' '.join([str(i) for i in NullILCI]), file=fos)
print('Read Depth CIs', file=fos)
print(' '.join([str(i) for i in NullRDCIs]), file=fos)
print(' '.join([str(i) for i in NullRDCI]), file=fos)
print('Number of Reads Going Through a Break Point CIs', file=fos)
print(' '.join([str(i) for i in NullTBCIs]), file=fos)
print(' '.join([str(i) for i in NullTBCI]), file=fos)
print('Number of Read Pairs with Aberrant Insert Length', file=fos)
print(' '.join([str(i) for i in ILIPoint]), file=fos)
print(' '.join([str(i) for i in ILIPoint3]), file=fos)
print('Number of Split Reads', file=fos)
print(' '.join([str(i) for i in SplitIPoint]), file=fos)
print(' '.join([str(i) for i in SplitIPoint3]), file=fos)
print('Number of Read Pairs with Aberrant Direction', file=fos)
print(' '.join([str(i) for i in DRIPoint]), file=fos)
print(' '.join([str(i) for i in DRIPoint3]), file=fos)
if ReadLength_Flag==0:
ReadLengthTag=0
ReadLengthOut=0
for RLK1 in list(ReadLengthHash.keys()):
if ReadLengthHash[RLK1]>ReadLengthTag:
ReadLengthOut=RLK1
ReadLengthTag=ReadLengthHash[RLK1]
print('Read Length Of Reads'+':'+str(ReadLengthOut), file=fos)
fos.close()
outputfileIL=NullPath+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.density.null'
foIL=open(outputfileIL,'w')
print(' '.join(['InsertLength','Frequency']), file=foIL)
for l in sorted(InsertLength.keys()): print(' '.join([str(l),str(InsertLength[l])]), file=foIL)
print(' '.join(['ReadDepth','Frequency']), file=foIL)
for r in list(RD_Af_Adj.keys()): print(' '.join([str(r),str(RD_Af_Adj[r])]), file=foIL)
print(' '.join(['ThroughBreakPoint','Frequency']), file=foIL)
for t in list(TBNullDensity.keys()): print(' '.join([str(t),str(TBNullDensity[t])]), file=foIL)
print(' '.join(['BinPosition','GC_Content']), file=foIL)
for b in list(GC_Content.keys()):
if not b==0: print(' '.join([str(b), str(GC_Content[b])]), file=foIL)
foIL.close()
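#Fit and plot the null distributions with the bundled R scripts: Figure.a.r draws the raw
#count densities, while Figure.b.r / Figure.c.r fit read depth to a negative binomial and
#insert length / ThroughBP counts to a bimodal model; KeepFigure decides whether the pdf plots are kept.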
RFigureDRSplit=Script_Path+'SVelter1.NullModel.Figure.a.r'
InsertLenNullTemp=NullPath+'InsertLenNull.'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.temp'
fIL=open(InsertLenNullTemp,'w')
for dr in list(ILNullDensity.keys()): print(' '.join([str(dr),str(ILNullDensity[dr])]), file=fIL)
fIL.close()
InsertLenNullfigure1='.'.join(InsertLenNullTemp.split('.')[:-1]+['pdf'])
BoxPlotColor='blue'
InsertLenNullfigure2='.'.join(InsertLenNullTemp.split('.')[:-1])+'.2.pdf'
if KeepFigure in ['no','N','No','n']:
InsertLenNullfigure1=InsertLenNullfigure1.replace('.pdf','.na')
InsertLenNullfigure2=InsertLenNullfigure2.replace('.pdf','.na')
os.system('''Rscript %s %s %s %s %s'''%(RFigureDRSplit,InsertLenNullTemp,InsertLenNullfigure1,BoxPlotColor,InsertLenNullfigure2))
DRNullTemp=NullPath+'DirectionNull.'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.temp'
fDR=open(DRNullTemp,'w')
for dr in list(DRNullDensity.keys()): print(' '.join([str(dr),str(DRNullDensity[dr])]), file=fDR)
fDR.close()
DRNullfigure1='.'.join(DRNullTemp.split('.')[:-1]+['pdf'])
BoxPlotColor='blue'
DRNullfigure2='.'.join(DRNullTemp.split('.')[:-1])+'.2.pdf'
if KeepFigure in ['no','N','No','n']:
DRNullfigure1=DRNullfigure1.replace('.pdf','.na')
DRNullfigure2=DRNullfigure2.replace('.pdf','.na')
os.system('''Rscript %s %s %s %s %s'''%(RFigureDRSplit,DRNullTemp,DRNullfigure1,BoxPlotColor,DRNullfigure2))
SplitNullTemp=NullPath+'SplitNull.'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.temp'
fSP=open(SplitNullTemp,'w')
for sp in list(SplitNullDensity.keys()): print(' '.join([str(sp),str(SplitNullDensity[sp])]), file=fSP)
fSP.close()
SplitNullfigure1='.'.join(SplitNullTemp.split('.')[:-1]+['pdf'])
BoxPlotColor='blue'
SplitNullfigure2='.'.join(SplitNullTemp.split('.')[:-1])+'.2.pdf'
if KeepFigure in ['no','N','No','n']:
SplitNullfigure1=SplitNullfigure1.replace('.pdf','.na')
SplitNullfigure2=SplitNullfigure2.replace('.pdf','.na')
os.system('''Rscript %s %s %s %s %s'''%(RFigureDRSplit,SplitNullTemp,SplitNullfigure1,BoxPlotColor,SplitNullfigure2))
if model_comp=='C': RFigureDRSplit2=Script_Path+'SVelter1.NullModel.Figure.b.r'
else: RFigureDRSplit2=Script_Path+'SVelter1.NullModel.Figure.c.r'
RDNullTemp=NullPath+'RDNull.'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.temp'
fRD=open(RDNullTemp,'w')
for rd in list(RD_Af_Adj.keys()): print(' '.join([str(rd),str(RD_Af_Adj[rd])]), file=fRD)
fRD.close()
RDNullfigure1='.'.join(RDNullTemp.split('.')[:-1]+['pdf'])
BoxPlotColor='blue'
lineColor='red'
RDNullfigure2='.'.join(RDNullTemp.split('.')[:-1])+'.NegativeBinomial'
if KeepFigure in ['no','N','No','n']: RDNullfigure1=RDNullfigure1.replace('.pdf','.na')
os.system('''Rscript %s %s %s %s %s %s %d'''%(RFigureDRSplit2,RDNullTemp,RDNullfigure1,BoxPlotColor,lineColor,RDNullfigure2,Window_Size))
RDNullfigure2_Modify(RDNullfigure2,Window_Size)
ILNullTemp=NullPath+'ILNull.'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.temp'
fIL=open(ILNullTemp,'w')
for il in list(InsertLength.keys()): print(' '.join([str(il),str(InsertLength[il])]), file=fIL)
fIL.close()
ILNullfigure1='.'.join(ILNullTemp.split('.')[:-1]+['pdf'])
BoxPlotColor='blue'
lineColor='red'
ILNullfigure2='.'.join(ILNullTemp.split('.')[:-1])+'.Bimodal'
if KeepFigure in ['no','N','No','n']: ILNullfigure1=ILNullfigure1.replace('.pdf','.na')
os.system('''Rscript %s %s %s %s %s %s %d'''%(RFigureDRSplit2,ILNullTemp,ILNullfigure1,BoxPlotColor,lineColor,ILNullfigure2,Window_Size))
TBNullTemp=NullPath+'TBNull.'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'.temp'
fTB=open(TBNullTemp,'w')
for tb in list(TBNullDensity.keys()): print(' '.join([str(tb),str(TBNullDensity[tb])]), file=fTB)
fTB.close()
TBNullfigure1='.'.join(TBNullTemp.split('.')[:-1]+['pdf'])
BoxPlotColor='blue'
lineColor='red'
TBNullfigure2='.'.join(TBNullTemp.split('.')[:-1])+'.Bimodal'
if KeepFigure in ['no','N','No','n']: TBNullfigure1=TBNullfigure1.replace('.pdf','.na')
os.system('''Rscript %s %s %s %s %s %s %d'''%(RFigureDRSplit2,TBNullTemp,TBNullfigure1,BoxPlotColor,lineColor,TBNullfigure2,Window_Size))
clean_files()
Ref_Seq_File=ref_file
Mini_CN2_Region=int(cn2_length)
Length_Limit=int(cn2_length)
CN2_Region={} #key of hash CN2_Region is the name of each chromosome
for chrom in chromos:
CN2_Region[chrom]={} #key of CN2_Region[chrom] is GC_content
for con in range(101): CN2_Region[chrom][con]=[]
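#Second pass over the CN2 regions: compute GC content and coverage per Window_Size window
#and store coverage values stratified by chromosome and GC percentage; written below to
#RD_Stat/..._GC_Coverage_ReadLength for read-depth normalisation.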
#fcn2=open(cn2_file)
temp_Name='temp.Null1.'+bamF.split('/')[-1]
#while True:
for pcn2 in cn2_regions:
#pcn2=fcn2.readline().strip().split()
if not len(pcn2)==3: break
Chromosome=pcn2[0]
if Chromosome in list(CN2_Region.keys()):
#if int(pcn2[2])-int(pcn2[1])<Length_Limit: continue
#if not int(pcn2[2])-int(pcn2[1])<Length_Limit:
fasta_file=NullPath+temp_Name+'.fa'
os.system(r'''samtools faidx %s %s:%d-%d > %s'''%(Ref_Seq_File,str(pcn2[0]),int(pcn2[1]),int(pcn2[2]),fasta_file))
Seq1=Fasta_To_Sequence_nullmodel(fasta_file)
if Seq1=='ERROR!':continue
if not Seq1=='ERROR!':
sam_file=NullPath+temp_Name+'.sam'
os.system(r'''samtools view -F 256 %s %s:%d-%d > %s'''%(bamF,str(pcn2[0]),int(pcn2[1]),int(pcn2[2]),sam_file))
Number_Of_Windows=len(Seq1)/Window_Size
GC_Content={}
for i in range(int(len(Seq1)/Window_Size+1))[1:]:
Seq2=Seq1[(i-1)*Window_Size:i*Window_Size]
GC_Content[i]=GC_Content_Calculate(Seq2)
coverage=Region_Coverage_Calculate(sam_file,Number_Of_Windows,pcn2,Window_Size)
for j in list(GC_Content.keys()):
if j in list(coverage.keys()): CN2_Region[Chromosome][GC_Content[j][0]].append(coverage[j][-1])
#fcn2.close()
if os.path.isfile(NullPath+temp_Name+'.fa'): os.system(r'''rm %s'''%(NullPath+temp_Name+'.fa'))
if os.path.isfile(NullPath+temp_Name+'.sam'): os.system(r'''rm %s'''%(NullPath+temp_Name+'.sam'))
#Output_File=NullPath+'RD_Stat/'+bamF.split('/')[-1].replace('.'+bam_file_appdix,'')+'.'+genome_name+'_MP'+str(QCAlign)+'_GC_Coverage_ReadLength'
Output_File=NullPath+'RD_Stat/'+'.'.join(bamF.split('/')[-1].split('.')[:-1])+'.'+genome_name+'_MP'+str(QCAlign)+'_GC_Coverage_ReadLength'
Output_Path=NullPath+'RD_Stat/'
if not os.path.isdir(Output_Path): os.system(r'''mkdir %s'''%(Output_Path))
fo=open(Output_File,'w')
print(' '.join(chromos), file=fo)
print(' '.join([str(i) for i in range(101)]), file=fo)
for key_1 in chromos:
for key_2 in range(101): print(':'.join(['@',','.join(str(j) for j in CN2_Region[key_1][key_2])]), file=fo)
fo.close()
time2=time.time()
print('Null Model Built for '+bamF)
print('Time consumed: '+str(time2-time1))
if function_name=='BPSearch':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','out-path=','help=','long-insert=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration=','BPSPCff=','BPLNCff='])
dict_opts=dict(opts)
CN2_Region={}
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters_bpsearch()
else:
def Define_Default_BPSearching():
global ILCutoff
ILCutoff=0.95
global RDCutoff
RDCutoff=0.95
global TBCutoff
TBCutoff=0.9999
global SplitCutoff
SplitCutoff=0.999
global ABILCutoff
ABILCutoff=0.99
global DRCutoff
DRCutoff=0.99
global SPLCutoff
SPLCutoff=0.85
global Length_Limit
Length_Limit=2000
global model_comp
if not '--null-model' in list(dict_opts.keys()): model_comp='C'
else:
if dict_opts['--null-model'] in ['S','Simple']: model_comp='S'
else: model_comp='C'
global ToolMappingQ
global FileMappingQ
global align_QCflag
if '--qc-map-tool' in list(dict_opts.keys()) and '--qc-map-file' in list(dict_opts.keys()):
ToolMappingQ=dict_opts['--qc-map-tool']
FileMappingQ=dict_opts['--qc-map-file']
align_QCflag=1
else: align_QCflag=0
global BPAlignQC
if '--BPAlignQC' in list(dict_opts.keys()): BPAlignQC=float(dict_opts['--BPAlignQC'])
else: BPAlignQC=0.2
global QCAlign
if '--qc-align' in list(dict_opts.keys()): QCAlign=int(dict_opts['--qc-align'])
else: QCAlign=20
global QC_RDCalculate_Cff
QC_RDCalculate_Cff=10
global QCSplit
if '--qc-split' in list(dict_opts.keys()): QCSplit=int(dict_opts['--qc-split'])
else: QCSplit=20
global NullSplitLen_perc
if '--split-min-len' in list(dict_opts.keys()): NullSplitLen_perc=float(dict_opts['--split-min-len'])
else: NullSplitLen_perc=0.9
global BPAlignQCFlank
if '--BPAlignQCFlank' in list(dict_opts.keys()): BPAlignQCFlank=int(dict_opts['--BPAlignQCFlank'])
else: BPAlignQCFlank=500
def global_para_declaration():
global BPPath
global NullPath
global workdir
global bam_path
global bam_files
global bam_files_appdix
global ref_path
global ref_file
global ref_index
global Window_Size
global ReadLength
global sub_loc_size
workdir=path_modify(dict_opts['--workdir'])
bam_path='/'.join(dict_opts['--sample'].split('/')[:-1])+'/'
bam_files=[dict_opts['--sample']]
bam_files_appdix=dict_opts['--sample'].split('.')[-1]
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
Define_Default_BPSearching()
if not '--workdir' in list(dict_opts.keys()): print('Error: please specify working directory using --workdir')
else:
if not '--sample' in list(dict_opts.keys()): print('Error: please specify the input file using --sample')
else:
global_para_declaration()
if not os.path.isfile(ref_index): print('Error: reference genome not indexed')
else:
chromos=chromos_read_in(ref_file)
if '--chromosome' in list(dict_opts.keys()):
chrom_single=dict_opts['--chromosome']
if not chrom_single in chromos:
print('Error: please make sure the chromosome defined by --chromosome is correct based on the reference genome')
chromos=[]
else: chromos=[chrom_single]
if not chromos==[]:
genome_name=genome_name_readin(dict_opts)
print('temp files produced under: '+workdir)
if '--out-path' in list(dict_opts.keys()): BPPath=path_modify(dict_opts['--out-path'])
else: BPPath=workdir+'BreakPoints.'+dict_opts['--sample'].split('/')[-1]+'/'
if not os.path.isdir(BPPath): os.system(r'''mkdir %s'''%(BPPath))
NullPath='/'.join(BPPath.split('/')[:-2])+'/'+'.'.join(['NullModel']+BPPath.split('/')[-2].split('.')[1:])+'/'
for bamF in bam_files:
time1=time.time()
[Window_Size,ReadLength,sub_loc_size]=Null_Stats_Readin_One(NullPath,bamF,NullSplitLen_perc,genome_name,bam_files_appdix)
for chrF in chromos:
bamF_Name='.'.join(bamF.split('/')[-1].split('.')[:-1])
#bamF_Name=bamF.split('/')[-1].replace('.'+bam_files_appdix,'')
floc_Name=BPPath+bamF_Name+'.'+chrF
Refloc_name='.'.join(ref_file.split('.')[:-1])+'.Mappable.bed'
stat_file_name(bamF_Name,genome_name)
if os.path.isfile(Refloc_name):
[BamInput,ILStats,RDStats,TBStats,SPLCff]=[bamF,ILStats_readin(ILStat),RDStats_readin(RDStat),TBStats_readin(TBStat),SPLCff_Calculate(NullSplitLen_perc,SPLenStat,ReadLength)]
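#Parse the .Stats file written by NullModel: skip each header line, read the CI labels and
#values for insert length, read depth and ThroughBP counts, then use the cumulative
#frequency rows to derive minimum evidence cutoffs (InsertLenMin, SplitMin, DRMin).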
fAllS=open(AllStat)
pAllS=fAllS.readline().rstrip()
ILCIs={}
pAllS1=fAllS.readline().strip().split()
pAllS2=fAllS.readline().strip().split()
for i in range(len(pAllS1)): ILCIs[pAllS1[i]]=pAllS2[i]
pAllS=fAllS.readline().rstrip()
RDCIs={}
pAllS1=fAllS.readline().strip().split()
pAllS2=fAllS.readline().strip().split()
for i in range(len(pAllS1)): RDCIs[pAllS1[i]]=pAllS2[i]
pAllS=fAllS.readline().rstrip()
TBCIs={}
pAllS1=fAllS.readline().strip().split()
pAllS2=fAllS.readline().strip().split()
for i in range(len(pAllS1)): TBCIs[pAllS1[i]]=pAllS2[i]
pAllS=fAllS.readline().rstrip()
InsertLen={}
pAllS1=fAllS.readline().strip().split()
pAllS2=fAllS.readline().strip().split()
[InsertLenMin,SplitMin,DRMin,BPSPCff,BPLNCff,SPCluLen]=[5,5,5,3,3,5]
for i in range(len(pAllS1)): InsertLen[pAllS1[i]]=pAllS2[i]
for i in range(len(pAllS1)):
if float(pAllS2[i])>ABILCutoff:
InsertLenMin=int(pAllS1[i])-1
break
if InsertLenMin<5: InsertLenMin=5
pAllS=fAllS.readline().rstrip()
SplitReads={}
pAllS1=fAllS.readline().strip().split()
pAllS2=fAllS.readline().strip().split()
for i in range(len(pAllS1)): SplitReads[pAllS1[i]]=pAllS2[i]
for i in range(len(pAllS1)):
if float(pAllS2[i])>SplitCutoff:
SplitMin=int(pAllS1[i])-1
break
if SplitMin<5: SplitMin=5
pAllS=fAllS.readline().rstrip()
AbDirection={}
pAllS1=fAllS.readline().strip().split()
pAllS2=fAllS.readline().strip().split()
for i in range(len(pAllS1)):
AbDirection[pAllS1[i]]=pAllS2[i]
for i in range(len(pAllS1)):
if float(pAllS2[i])>DRCutoff:
DRMin=int(pAllS1[i])-1
break
if DRMin<5: DRMin=5
fbam=os.popen('''samtools view -H %s'''%(BamInput))
if '--BPSPCff' in list(dict_opts.keys()): BPSPCff=int(float(dict_opts['--BPSPCff']))
else: BPSPCff=int(round(2*float(RDStats['Median'])/float(10)))
if BPSPCff<3: BPSPCff=3
if '--BPLNCff' in list(dict_opts.keys()): BPLNCff=int(float(dict_opts['--BPLNCff']))
else: BPLNCff=int(round(2*float(TBStats['stat']['Median'])/float(10)))
if BPLNCff<3: BPLNCff=3
SPCluMin=BPSPCff
LnCluMin=BPLNCff
if '--SPCluLen' in list(dict_opts.keys()): SPCluLen=int(dict_opts['--SPCluLen'])
else: SPCluLen=int(round(2*float(RDStats['Median'])/float(10)))
if SPCluLen<5: SPCluLen=5
SubLnCluMin=max([LnCluMin,SPCluMin])
LinkCluMin=min([LnCluMin,SPCluMin])
ClusterLen=ClusterLen_Calculation(ILStats,model_comp,ReadLength)
ClusterLen2=int(ClusterLen/Window_Size+1)*Window_Size
Min_Distinguish_Len=Window_Size
subLnClusterLen=ClusterLen/2
if not '-S' in list(dict_opts.keys()): dict_opts['-S']=5
ILCffs=IL_CI_Decide(ILStats,int(dict_opts['-S']),model_comp)
BPOutputa=floc_Name+'.'+'.'.join(['SPCff'+str(SPCluMin),'CluCff'+str(LnCluMin),'AlignCff'+str(BPAlignQC)])+'.SPs'
file_initiate(BPOutputa)
BPOutputb=floc_Name+'.'+'.'.join(['SPCff'+str(SPCluMin),'CluCff'+str(LnCluMin),'AlignCff'+str(BPAlignQC)])+'.LNs'
file_initiate(BPOutputb)
BPOutputd=floc_Name+'.'+'.'.join(['SPCff'+str(SPCluMin),'CluCff'+str(LnCluMin),'AlignCff'+str(BPAlignQC)])+'.chromLNs'
file_initiate(BPOutputd)
BPOutpute='/'.join(BPOutputd.split('/')[:-1])+'/'+'.'.join(BPOutputd.split('/')[-1].split('.')[:-6]+BPOutputd.split('/')[-1].split('.')[-5:])
file_initiate(BPOutpute)
abtLink={}
floc=open(Refloc_name)
loc_rec={}
for line in floc:
ploc=line.strip().split()
if not ploc[0] in list(loc_rec.keys()): loc_rec[ploc[0]]=[]
loc_rec[ploc[0]].append([int(ploc2) for ploc2 in ploc[1:]])
floc.close()
if not loc_rec=={} and chrF in list(loc_rec.keys()):
[test_mul_RP,test_mul_SP,temp_IL_Rec,Link_IL_Rec]=[[],[],{},{}]
chrom=chrF
mini_fout_Name=BPPath+bamF_Name+'.mini.'+chrom+'.sam'
mini_fout_N2=BPPath+bamF_Name+'.mini.'+chrom+'.bam'
mini_fout_N3=BPPath+bamF_Name+'.mini.'+chrom+'.sorted'
mini_fout_N4=BPPath+bamF_Name+'.mini.'+chrom+'.sorted.bam'
if not os.path.isfile(mini_fout_N4):
os.system(r''' samtools view -H %s -o %s'''%(BamInput,mini_fout_Name))
RD_index_Path=NullPath+'RD_Stat/'
if not os.path.isdir(RD_index_Path): os.system(r'''mkdir %s'''%(RD_index_Path))
RD_index_File=RD_index_Path+bamF_Name+'.'+chrF+'.RD.index'
fRDind=open(RD_index_File,'w')
fRDind.close()
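#First pass over the mappable regions: write a per-window read-depth index for the
#chromosome and copy every read showing an abnormal signal (aberrant insert length,
#orientation, zero insert or clipping) into a mini BAM that is then sorted for break-point search.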
for loc in loc_rec[chrom]:
loc2=split_loc_to_subloc(loc,sub_loc_size,ClusterLen2)
for real_region in loc2:
fmini=open(mini_fout_Name,'a')
fRDind=open(RD_index_File,'a')
print(chrom+':'+str(real_region[0])+'-'+str(real_region[1]), file=fRDind)
RD_RealRegion=[0 for i in range(int((real_region[1]-real_region[0])/Window_Size)+1)]
fbam=os.popen('''samtools view -F 256 %s %s:%d-%d'''%(BamInput,chrom,real_region[0],real_region[1]))
while True:
pbam1=fbam.readline().strip()
if not pbam1: break
pbam=pbam1.split()
if not int(pbam[4])>int(QC_RDCalculate_Cff): continue #fail quality control, skip
if int(pbam[1])&4>0: continue #the read was not mapped, skip
DRtemp=Reads_Direction_Detect_flag(pbam[1])
ReadLen=cigar2reaadlength(pbam[5])
pos1=int(pbam[3])
pos2=int(pbam[3])+ReadLen
if pos2>real_region[0] and pos1<real_region[1]:
if pos1<real_region[0] and pos2>real_region[0]: pos1=real_region[0]
if pos1<real_region[1] and pos2>real_region[1]: pos2=real_region[1]
block1=int((pos1-real_region[0])/Window_Size)
RD_RealRegion[block1]+=ReadLen
if int(pbam[4])>int(QCAlign): #passes the mapping-quality filter; check for abnormal signals below
absIL=abs(int(pbam[8]))
QCFlag=0
link_flag=0
if absIL<int(ILCffs[0]) or absIL>int(ILCffs[1]): QCFlag+=1
if DRtemp==['+','-'] and int(pbam[8])<0: QCFlag+=1
if DRtemp==['+','+'] or DRtemp==['-','-']: QCFlag+=1
if int(pbam[8])==0: QCFlag+=1
if not pbam[5].find('S')==-1: QCFlag+=1
if not pbam[5].find('H')==-1: QCFlag+=1
if not QCFlag==0: print(pbam1, file=fmini)
fbam.close()
fmini.close()
for rfrr in range(len(RD_RealRegion[:-1])): RD_RealRegion[rfrr]=str(float(RD_RealRegion[rfrr])/Window_Size)
if real_region[1]-real_region[0]-(real_region[1]-real_region[0])//Window_Size*Window_Size ==0: del RD_RealRegion[-1] #integer division (//) keeps the original Python 2 remainder semantics
else: RD_RealRegion[-1]=str(float(RD_RealRegion[-1])/float(real_region[1]-real_region[0]-(real_region[1]-real_region[0])//Window_Size*Window_Size))
print(' '.join(RD_RealRegion), file=fRDind)
fRDind.close()
os.system(r'''samtools view -h -Sb -F 256 %s -o %s'''%(mini_fout_Name,mini_fout_N2))
samtools_sort_process(mini_fout_N2,mini_fout_N3,mini_fout_N4)
os.system(r'''rm %s'''%(mini_fout_N2))
os.system(r'''rm %s'''%(mini_fout_Name))
for loc in loc_rec[chrom]:
print(loc)
loc2=split_loc_to_loc2(loc,ClusterLen)
for real_region in loc2:
fbam=os.popen('''samtools view -F 256 %s %s:%d-%d'''%(mini_fout_N4,chrom,real_region[0],real_region[1]))
[abInfF,abInfR,abLink,abInf3,LinkFR,LinkFF,LinkRR,LinkRF,LinkNM,LinkSP,abLinkSP,test,LinkSPTemp]=[{},{},{},{},{},{},{},{},{},{},{},[],{}]
LinkNM['F']=[]
LinkNM['R']=[]
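#Classify each read pair from the mini BAM. abInfF/abInfR record, per forward/reverse
#position, counts of [aberrant insert length, aberrant orientation, unmapped or
#translocated mate, split read]; LinkFR/LinkFF/LinkRR/LinkRF link paired positions by
#orientation and LinkSPTemp tracks split-read alignments.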
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[4])<int(QCAlign): continue #fail quality control, skip
if int(pbam[1])&4>0: continue #the read was not mapped, skip
DRtemp=Reads_Direction_Detect_flag(pbam[1])
ReadLen=cigar2reaadlength(pbam[5])
absIL=abs(int(pbam[8]))
signIL=0
posF=[]
if absIL<int(ILCffs[0]) or absIL>int(ILCffs[1]): signIL+=1
if not pbam[5].find('S')==-1 or not pbam[5].find('H')==-1:
ClippedLen=cigar2splitlen(pbam[5])
ClippedPos=cigar2split(pbam[5])
ClippedQual=cigar2splitqual(pbam[5],pbam[10])
if ClippedQual>QCSplit:
ClipAbsPos=[]
for c in range(len(ClippedLen)):
if ClippedLen[c]>SPLCff:
pos1=int(pbam[3])+ClippedPos[c]
posF.append(pos1)
pos2=int(pbam[7])
if DRtemp[0]=='+':
if not pos1 in list(abInfF.keys()): abInfF[pos1]=[0,0,0,1]
else: abInfF[pos1][3]+=1
elif DRtemp[0]=='-':
if not pos1 in list(abInfR.keys()): abInfR[pos1]=[0,0,0,1]
else: abInfR[pos1][3]+=1
if not pbam[0] in list(LinkSPTemp.keys()): LinkSPTemp[pbam[0]]=[[]]
else: LinkSPTemp[pbam[0]]+=[[]]
for c in ClippedPos:
pos1=int(pbam[3])+c
LinkSPTemp[pbam[0]][-1]+=[pos1,DRtemp[0]]
LinkSPTemp[pbam[0]].append('S')
#else:
if posF==[]:
if DRtemp[0]=='+': pos1=int(pbam[3])+len(pbam[9])
elif DRtemp[0]=='-': pos1=int(pbam[3])
else: pos1=posF[0]
pos2=int(pbam[7])
if pbam[0] in list(LinkSPTemp.keys()): LinkSPTemp[pbam[0]]+=[[pos1,DRtemp[0]]]
else: LinkSPTemp[pbam[0]]=[[pos1,DRtemp[0]]]
if DRtemp==['+','-'] and int(pbam[8])>0:
if signIL>0:
if not pos1 in list(LinkFR.keys()): LinkFR[pos1]=[pos2]
else: LinkFR[pos1]+=[pos2]
if not pos1 in list(abInfF.keys()): abInfF[pos1]=[1,0,0,0]
else: abInfF[pos1][0]+=1
if not pos2 in list(abInfR.keys()): abInfR[pos2]=[1,0,0,0]
else: abInfR[pos2][0]+=1
elif DRtemp==['+','-'] and int(pbam[8])<0:
#pos1=int(pbam[3])+ReadLen
#pos2=int(pbam[7])
if not pos1 in list(LinkFR.keys()): LinkFR[pos1]=[pos2]
else: LinkFR[pos1]+=[pos2]
if not pos1 in list(abInfF.keys()): abInfF[pos1]=[0,1,0,0]
else: abInfF[pos1][1]+=1
if not pos2 in list(abInfR.keys()): abInfR[pos2]=[0,1,0,0]
else: abInfR[pos2][1]+=1
if signIL>0:
abInfF[pos1][0]+=1
abInfR[pos2][0]+=1
elif DRtemp==['+','+'] and not int(pbam[8])==0:
#pos1=int(pbam[3])+ReadLen
if not pos1 in list(abInfF.keys()): abInfF[pos1]=[0,1,0,0]
else: abInfF[pos1][1]+=1
if signIL>0: abInfF[pos1][0]+=1
if not pbam[0] in list(abtLink.keys()): abtLink[pbam[0]]=[pos1]
elif pbam[0] in list(abtLink.keys()):
abtLink[pbam[0]].append(pos1)
if not min(abtLink[pbam[0]]) in list(LinkFF.keys()): LinkFF[min(abtLink[pbam[0]])]=[max(abtLink[pbam[0]])]
else: LinkFF[min(abtLink[pbam[0]])]+=[max(abtLink[pbam[0]])]
del abtLink[pbam[0]]
elif DRtemp==['-','-'] and int(pbam[8])>0:
#pos1=int(pbam[3])
#pos2=int(pbam[7])
if not min(pos1,pos2) in list(LinkRR.keys()): LinkRR[min(pos1,pos2)]=[max(pos1,pos2)]
else: LinkRR[min(pos1,pos2)]+=[max(pos1,pos2)]
if not pos1 in list(abInfR.keys()): abInfR[pos1]=[0,1,0,0]
else: abInfR[pos1][1]+=1
if not pos2 in list(abInfR.keys()): abInfR[pos2]=[0,1,0,0]
else: abInfR[pos2][1]+=1
if signIL>0:
abInfR[pos1][0]+=1
abInfR[pos2][0]+=1
elif int(pbam[8])==0:
if int(pbam[1])&8>0:
if DRtemp[0]=='+':
#pos1=int(pbam[3])+ReadLen
LinkNM['F'].append(pos1)
if pos1>real_region[0] and pos1<real_region[1]:
if not pos1 in list(abInfF.keys()): abInfF[pos1]=[0,0,1,0]
else: abInfF[pos1][2]+=1
elif DRtemp[0]=='-':
#pos1=int(pbam[3])
LinkNM['R'].append(pos1)
if pos1>real_region[0] and pos1<real_region[1]:
if not pos1 in list(abInfR.keys()): abInfR[pos1]=[0,0,1,0]
else: abInfR[pos1][2]+=1
if not pbam[6]=='=':
if DRtemp[0]=='+':
#pos1=int(pbam[3])+ReadLen
if pos1>real_region[0] and pos1<real_region[1]:
if not pos1 in list(abLink.keys()): abLink[pos1]=['f',int(pbam[7]),pbam[6]+'_'+DRtemp[1]]
else: abLink[pos1]+=['f',int(pbam[7]),pbam[6]+'_'+DRtemp[1]]
if not pos1 in list(abInfF.keys()): abInfF[pos1]=[0,0,1,0]
else: abInfF[pos1][2]+=1
elif DRtemp[0]=='-':
#pos1=int(pbam[3])
if pos1>real_region[0] and pos1<real_region[1]:
if not pos1 in list(abLink.keys()): abLink[pos1]=['r',int(pbam[7]),pbam[6]+'_'+DRtemp[1]]
else: abLink[pos1]+=['r',int(pbam[7]),pbam[6]+'_'+DRtemp[1]]
if not pos1 in list(abInfR.keys()): abInfR[pos1]=[0,0,1,0]
else: abInfR[pos1][2]+=1
for k1 in list(LinkSPTemp.keys()):
if not 'S' in LinkSPTemp[k1]: del LinkSPTemp[k1]
else:
if len(LinkSPTemp[k1])==2: del LinkSPTemp[k1]
for k1 in list(LinkSPTemp.keys()):
tempk1=[]
for k2 in LinkSPTemp[k1]:
if not k2=='S': tempk1+=[k2]
for k2 in range(int(len(tempk1[0])/2)):
for k3 in range(int(len(tempk1[1])/2)):
if tempk1[0][2*k2]<tempk1[1][2*k3]:
tempk2=[tempk1[0][2*k2],tempk1[0][2*k2+1],tempk1[1][2*k3],tempk1[1][2*k3+1]]
if [tempk2[1],tempk2[3]]==['+','-']:
if not tempk2[0] in list(LinkFR.keys()): LinkFR[tempk2[0]]=[]
LinkFR[tempk2[0]].append(tempk2[2])
if [tempk2[1],tempk2[3]]==['+','+']:
if not tempk2[0] in list(LinkFF.keys()): LinkFF[tempk2[0]]=[]
LinkFF[tempk2[0]].append(tempk2[2])
if [tempk2[1],tempk2[3]]==['-','-']:
if not tempk2[0] in list(LinkRR.keys()): LinkRR[tempk2[0]]=[]
LinkRR[tempk2[0]].append(tempk2[2])
if [tempk2[1],tempk2[3]]==['-','+']:
if not tempk2[0] in list(LinkRF.keys()): LinkRF[tempk2[0]]=[]
LinkRF[tempk2[0]].append(tempk2[2])
for k1 in list(abInfF.keys()):
abInfF[k1][3]+=abInfF[k1][1]
if not abInfF[k1][-1]==0:
if not k1 in list(LinkSP.keys()): LinkSP[k1]=0
LinkSP[k1]+=abInfF[k1][-1]
for k1 in list(abInfR.keys()):
abInfR[k1][3]+=abInfR[k1][1]
if not abInfR[k1][-1]==0:
if not k1 in list(LinkSP.keys()): LinkSP[k1]=0
LinkSP[k1]+=abInfR[k1][-1]
for k1 in list(LinkFR.keys()):
for k2 in LinkFR[k1]:
if not k2 in list(LinkRF.keys()): LinkRF[k2]=[]
LinkRF[k2].append(k1)
[out_pair_bp,out_single_bp,SP4S]=[[],[],[]]
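#Break-point calling from the collected evidence: cluster split-read positions (clusterNums),
#drop clusters with poor flanking mappability when align_QCflag is set, then pair the
#surviving positions through their linked read-pair clusters into out_pair_bp / out_single_bp candidates.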
if not LinkSP=={}:
SP2S=[]
linkSP_First=clusterNums(sorted(LinkSP.keys()), SPCluLen, 'f')
for k1 in range(len(linkSP_First[0])):
if linkSP_First[1][k1]==1:
if linkSP_First[0][k1] in list(abInfF.keys()):
if abInfF[linkSP_First[0][k1]]==[0,0,0,1]: del abInfF[linkSP_First[0][k1]]
if linkSP_First[0][k1] in list(abInfR.keys()):
if abInfR[linkSP_First[0][k1]]==[0,0,0,1]: del abInfR[linkSP_First[0][k1]]
linkSPF=clusterNums(sorted(LinkSP.keys()), SPCluLen, 'f')[0]
linkSPR=clusterNums(sorted(LinkSP.keys()), SPCluLen, 'r')[0]
linkSPFR=clusterSupVis2(sorted(LinkSP.keys()),sorted(linkSPR),sorted(linkSPF),'left')
for k1 in list(linkSPFR.keys()):
qual_num=0
qual_rec=[]
for k2 in linkSPFR[k1]:
qual_num+=LinkSP[k2]
qual_rec.append(LinkSP[k2])
if qual_num<SPCluMin: del linkSPFR[k1]
else: SP2S.append(linkSPFR[k1][qual_rec.index(max(qual_rec))])
if not SP2S==[]:
SP3S=[]
key=[]
if align_QCflag==1:
for key1 in SP2S:
QCLNSP_flag=0
for aqb in [key1]:
if aqb>BPAlignQCFlank:
LNSPFR_aqb=os.popen(r'''%s %s %s %d %d 1'''%(ToolMappingQ,FileMappingQ,chrom,aqb-BPAlignQCFlank,aqb+BPAlignQCFlank))
LNSPFR_score=float(LNSPFR_aqb.read().strip())
if LNSPFR_score<float(BPAlignQC): QCLNSP_flag+=1
if not QCLNSP_flag==0: SP3S.append(key1)
SP4S=[]
for k1 in SP2S:
if not k1 in SP3S: SP4S.append(k1)
SP4S.sort()
for i in range(len(SP4S)-1):
if SP4S[i+1]-SP4S[i]<10: SP4S[i]=SP4S[i+1]
SP5S=[]
for i in SP4S:
if not i in SP5S: SP5S.append(i)
SP4S=SP5S
if not SP4S==[]:
LinkSPF2=clusterSupVis2(sorted(abInfF.keys()), [i-ClusterLen for i in sorted(SP4S)], [i+10 for i in sorted(SP4S)],'right')
for k1x in list(LinkSPF2.keys()):
key1=k1x-10
LinkSPF2[key1]=[i for i in LinkSPF2[k1x]]
del LinkSPF2[k1x]
for k1x in list(LinkSPF2.keys()):
test1=bp_subgroup(LinkSPF2[k1x],Min_Distinguish_Len)
if len(test1)>1:
for k2x in test1[:-1]:
new_core_ele=0
for k3x in k2x:
if k3x in list(abInfF.keys()): new_core_ele+=numpy.sum(abInfF[k3x])
if new_core_ele>BPLNCff:
#new_core.append(max(k2x))
if not max(k2x) in list(LinkSPF2.keys()): LinkSPF2[max(k2x)]=k2x
else: LinkSPF2[max(k2x)]+=k2x
if not max(k2x) in SP4S: SP4S.append(max(k2x))
LinkSPR2=clusterSupVis2(sorted(abInfR.keys()), [i-10 for i in sorted(SP4S)], [i+ClusterLen for i in sorted(SP4S)],'left')
for k1x in list(LinkSPR2.keys()):
key1=k1x+10
LinkSPR2[key1]=[i for i in LinkSPR2[k1x]]
del LinkSPR2[k1x]
for k1x in list(LinkSPR2.keys()):
test1=bp_subgroup(LinkSPR2[k1x],Min_Distinguish_Len)
if len(test1)>1:
for k2x in test1[1:]:
new_core_ele=0
for k3x in k2x:
if k3x in list(abInfR.keys()): new_core_ele+=numpy.sum(abInfR[k3x])
if new_core_ele>BPLNCff:
#new_core.append(min(k2x))
if not min(k2x) in list(LinkSPR2.keys()): LinkSPR2[min(k2x)]=k2x
else: LinkSPR2[min(k2x)]+=k2x
if not min(k2x) in SP4S: SP4S.append(min(k2x))
for k1x in list(LinkSPF2.keys()):
temp_rec=LinkSPF2[k1x]
LinkSPF2[k1x]=[LinkSPF2[k1x],[]]
for k2 in temp_rec:
if k2 in list(LinkFR.keys()): LinkSPF2[k1x][1]+=LinkFR[k2]
if k2 in list(LinkFF.keys()): LinkSPF2[k1x][1]+=LinkFF[k2]
for k1x in list(LinkSPR2.keys()):
temp_rec=LinkSPR2[k1x]
LinkSPR2[k1x]=[LinkSPR2[k1x],[]]
for k2 in temp_rec:
if k2 in list(LinkRR.keys()): LinkSPR2[k1x][1]+=LinkRR[k2]
if k2 in list(LinkRF.keys()): LinkSPR2[k1x][1]+=LinkRF[k2]
LinkSP_To_Link={}
for k1x in SP4S:
if k1x in list(LinkSPF2.keys()):
LinkSP_To_Link[k1x]=[[],[]]
LinkSP_To_Link[k1x][0]+=LinkSPF2[k1x][0]
LinkSP_To_Link[k1x][1]+=LinkSPF2[k1x][1]
if k1x in list(LinkSPR2.keys()):
if not k1x in list(LinkSP_To_Link.keys()): LinkSP_To_Link[k1x]=[[],[]]
LinkSP_To_Link[k1x][0]+=LinkSPR2[k1x][0]
LinkSP_To_Link[k1x][1]+=LinkSPR2[k1x][1]
if k1x in list(LinkSP_To_Link.keys()):
LinkSP_To_Link[k1x][0].sort()
LinkSP_To_Link[k1x][1].sort()
for k1x in sorted(LinkSP_To_Link.keys()):
for k2 in LinkSP_To_Link[k1x][1]:
if k2 in list(LinkSP_To_Link.keys()):
if not sorted([k1x,k2]) in out_pair_bp and not k1x==k2: out_pair_bp.append(sorted([k1x,k2]))
elif k1x==k2:
if not k1x in out_single_bp: out_single_bp.append(k1x)
for k1x in sorted(LinkSP_To_Link.keys()):
if not LinkSP_To_Link[k1x][1]==[]:
for k2 in sorted(LinkSP_To_Link.keys())[sorted(LinkSP_To_Link.keys()).index(k1x):]:
if not LinkSP_To_Link[k2][1]==[]:
if k2==k1x:
if not k1x in out_single_bp: out_single_bp.append(k1x)
else:
overlap_rec=[overlap_calcu(LinkSP_To_Link[k1x][1],LinkSP_To_Link[k2][0]),overlap_calcu(LinkSP_To_Link[k1x][0],LinkSP_To_Link[k2][1])]
if overlap_rec[0]+overlap_rec[1]>0:
if not sorted([k1x,k2]) in out_pair_bp and not k1x==k2: out_pair_bp.append(sorted([k1x,k2]))
elif k1x==k2: out_single_bp.append(k1x)
for k1x in sorted(LinkSP_To_Link.keys()):
if LinkSP_To_Link[k1x][1]==[]:
out_single_bp.append(k1x)
del LinkSP_To_Link[k1x]
for k1x in out_pair_bp:
for k2 in k1x:
if k2 in out_single_bp: del out_single_bp[out_single_bp.index(k2)]
if k2 in list(LinkSP_To_Link.keys()): del LinkSP_To_Link[k2]
for k1x in out_single_bp:
if k1x in list(LinkSP_To_Link.keys()): del LinkSP_To_Link[k1x]
SP4S.sort()
LinkSP_To_Link={}
tempIL={}
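# Insert-length based candidates: abnormal forward (abInfF) and reverse (abInfR) read
# positions are clustered below; for each candidate breakpoint 'record' sums the
# discordant-pair counts on the expected side and 'record2' the split-read counts
# (abInf*[pos][3]) within SPCluLen of the breakpoint. Candidates reaching LinkCluMin
# are kept in tempIL as [[record, record2, orientation], member positions]; when
# align_QCflag is set, candidates falling into low-mappability regions (queried
# through ToolMappingQ/FileMappingQ) are dropped again.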
if not abInfF=={}:
clu_a_F=clusterNums(list(abInfF.keys()), ClusterLen, 'f')[0]
if not clu_a_F==[]:
clu_b_F=clusterNums(list(abInfF.keys()), ClusterLen, 'r')[0]
clu_c_F=clusterSupVis2(sorted(abInfF.keys()), clu_b_F, [caf+10 for caf in clu_a_F],'right')
if not clu_c_F=={}:
for key2 in list(clu_c_F.keys()):
key2b=key2-10
record=0
record2=0
for key3 in clu_c_F[key2]:
if not key3 >key2b: record+=sum(abInfF[key3][:3])
if abs(key2b-key3)<SPCluLen: record2+=abInfF[key3][3]
if not record+record2<LinkCluMin: tempIL[key2b]=[[record,record2,'f'],clu_c_F[key2]]
del clu_c_F[key2]
if not abInfR=={}:
clu_a_R=clusterNums(list(abInfR.keys()), ClusterLen, 'r')[0]
if not clu_a_R==[]:
clu_b_R=clusterNums(list(abInfR.keys()), ClusterLen, 'f')[0]
clu_c_R=clusterSupVis2(sorted(abInfR.keys()), [car-10 for car in clu_a_R], clu_b_R,'left')
if not clu_c_R=={}:
for key2 in list(clu_c_R.keys()):
key2b=key2+10
record=0
record2=0
for key3 in clu_c_R[key2]:
if not key3<key2b: record+= sum(abInfR[key3][:3])
if abs(key3-key2b)<SPCluLen: record2+=abInfR[key3][3]
if not record+record2<LinkCluMin: tempIL[key2b]=[[record,record2,'r'],clu_c_R[key2]]
del clu_c_R[key2]
if not tempIL=={}:
for aqb in list(tempIL.keys()):
if aqb<BPAlignQCFlank:
del tempIL[aqb]
continue
if align_QCflag==1:
LNSPFR_aqb=os.popen(r'''%s %s %s %d %d 1'''%(ToolMappingQ,FileMappingQ,chrom,aqb-BPAlignQCFlank,aqb+BPAlignQCFlank))
tPairF_b=float(LNSPFR_aqb.read().strip())
if tPairF_b<float(BPAlignQC): del tempIL[aqb]
LinkIL={}
for k1 in list(tempIL.keys()):
temp_mate_F={}
temp_mate_R={}
info_mate=0
if tempIL[k1][0][2]=='f':
for k2 in tempIL[k1][1]:
if k2 in list(LinkFF.keys()):
for k3 in LinkFF[k2]:
if k3 in list(abInfF.keys()):temp_mate_F[k3]=sum(abInfF[k3])
if k2 in list(LinkFR.keys()):
for k3 in LinkFR[k2]:
if k3 in list(abInfR.keys()):temp_mate_R[k3]=sum(abInfR[k3])
elif tempIL[k1][0][2]=='r':
for k2 in tempIL[k1][1]:
if k2 in list(LinkRF.keys()):
for k3 in LinkRF[k2]:
if k3 in list(abInfF.keys()):temp_mate_F[k3]=sum(abInfF[k3])
if k2 in list(LinkRR.keys()):
for k3 in LinkRR[k2]:
if k3 in list(abInfR.keys()):temp_mate_R[k3]=sum(abInfR[k3])
for k1x in list(temp_mate_F.keys()): info_mate+=temp_mate_F[k1x]
for k1x in list(temp_mate_R.keys()): info_mate+=temp_mate_R[k1x]
if not info_mate<LinkCluMin:
LinkIL[k1]=[[],[]]
if not temp_mate_F=={}: LinkIL[k1][0]=clusterQC(clusterNums4(temp_mate_F, ClusterLen, 'f'),LinkCluMin)
if not temp_mate_R=={}: LinkIL[k1][1]=clusterQC(clusterNums4(temp_mate_R, ClusterLen, 'r'),LinkCluMin)
else:continue
for k1 in list(LinkIL.keys()):
for k2 in LinkIL[k1]:
for k3 in k2:
if k3>BPAlignQCFlank:
if align_QCflag==1:
tPairF_QC=0
tPairF_a=os.popen(r'''%s %s %s %d %d 1'''%(ToolMappingQ,FileMappingQ,chrom,k3-BPAlignQCFlank,k3+BPAlignQCFlank))
tPairF_b=float(tPairF_a.read().strip())
if not tPairF_b<float(BPAlignQC):
if not [min([k3,k1]),max([k3,k1])] in out_pair_bp and not k1==k3: out_pair_bp.append([min([k3,k1]),max([k3,k1])])
elif k1==k3: out_single_bp.append(k1)
else:
if not [min([k3,k1]),max([k3,k1])] in out_pair_bp and not k1==k3: out_pair_bp.append([min([k3,k1]),max([k3,k1])])
elif k1==k3: out_single_bp.append(k1)
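# Post-processing of the candidate pairs collected above: out_pair_bp_check() first
# cleans the pair list, each end is then snapped to the nearest split-read breakpoint
# in SP4S when it lies within ClusterLen (clusterSupVis3), and ends closer than 50 bp
# are merged through out_pair_modify before anything is written out.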
if not out_pair_bp==[]:
out_pair_bp_temp=out_pair_bp_check(out_pair_bp,3)
out_pair_bp=out_pair_bp_temp
temp_out_pair_bp=[]
out_BPmodify={}
for k1 in out_pair_bp:
for k2 in k1: out_BPmodify[k2]=[]
if not SP4S==[]:
LBSP_tempIL=clusterSupVis3(sorted(SP4S),sorted(out_BPmodify.keys()))
for k1 in out_pair_bp:
temp_k1=[]
for k2 in k1:
k3=LBSP_tempIL[k2]
if abs(k2-k3)<ClusterLen: temp_k1.append(k3)
else: temp_k1.append(k2)
temp_out_pair_bp.append(temp_k1)
out_pair_bp=temp_out_pair_bp
for k1 in out_pair_bp:
for k2 in k1:
if k2 in SP4S: del SP4S[SP4S.index(k2)]
out_pair_modify={}
for i in out_pair_bp:
if not i[0] in list(out_pair_modify.keys()): out_pair_modify[i[0]]=[]
if not i[1] in out_pair_modify[i[0]]: out_pair_modify[i[0]].append(i[1])
if len(out_pair_modify)>1:
while True:
if len(out_pair_modify)==1: break
out_pair_qc=[]
for i in range(len(sorted(out_pair_modify.keys()))-1): out_pair_qc.append(sorted(out_pair_modify.keys())[i+1]-sorted(out_pair_modify.keys())[i])
if min(out_pair_qc)>50:break
else:
out_pair_modify[sorted(out_pair_modify.keys())[out_pair_qc.index(min(out_pair_qc))+1]]+=out_pair_modify[sorted(out_pair_modify.keys())[out_pair_qc.index(min(out_pair_qc))]]
del out_pair_modify[sorted(out_pair_modify.keys())[out_pair_qc.index(min(out_pair_qc))]]
for k1 in list(out_pair_modify.keys()):
while True:
if len(out_pair_modify[k1])==1: break
out_pair_modify[k1].sort()
out_pair_qc=[]
for i in range(len(out_pair_modify[k1])-1): out_pair_qc.append(out_pair_modify[k1][i+1]-out_pair_modify[k1][i])
if min(out_pair_qc)>50:break
else: out_pair_modify[k1][out_pair_qc.index(min(out_pair_qc))+1]=out_pair_modify[k1][out_pair_qc.index(min(out_pair_qc))]
for i in out_pair_modify[k1]:
if out_pair_modify[k1].count(i)>1:
for j in range(out_pair_modify[k1].count(i)-1): del out_pair_modify[k1][out_pair_modify[k1].index(i)]
out_pair_numrec={}
for k1 in list(out_pair_modify.keys()):
out_pair_numrec[k1]=[]
for k2 in [k1]+out_pair_modify[k1]:
if k2 in list(tempIL.keys()) and not k2 in list(LinkSP.keys()): out_pair_numrec[k1].append(len(tempIL[k2][1]))
elif k2 in list(LinkSP.keys()) and not k2 in list(tempIL.keys()): out_pair_numrec[k1].append(LinkSP[k2])
elif k2 in list(LinkSP.keys()) and k2 in list(tempIL.keys()): out_pair_numrec[k1].append(LinkSP[k2]+len(tempIL[k2][1]))
else: out_pair_numrec[k1].append(0)
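# Output format (as printed below): BPOutputb gets one space-delimited line per
# retained pair -- chrom, left breakpoint, its support, right breakpoint, its support
# -- only when the combined support exceeds BPLNCff; BPOutputa gets single
# breakpoints as chrom, position, support (only when support > SPCluLen).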
fout=open(BPOutputb,'a')
out_pair_modify=out_pair_modify_check(out_pair_modify,3)
for i in sorted(out_pair_modify.keys()):
for j in out_pair_modify[i]:
if out_pair_numrec[i][0] + out_pair_numrec[i][out_pair_modify[i].index(j)+1] > BPLNCff: print(' '.join([chrom,str(i),str(out_pair_numrec[i][0]),str(j),str(out_pair_numrec[i][out_pair_modify[i].index(j)+1])]), file=fout)
if i in list(tempIL.keys()): del tempIL[i]
if j in list(tempIL.keys()): del tempIL[j]
fout.close()
fout=open(BPOutputa,'a')
for i in sorted(out_single_bp+list(LinkSP_To_Link.keys())):
num=0
if i in list(abInfF.keys()): num+=sum(abInfF[i])
if i in list(abInfR.keys()): num+=sum(abInfR[i])
if num>SPCluLen: print(' '.join([str(j) for j in [chrom,i,num]]), file=fout)
fout.close()
for i in list(tempIL.keys()):
temp_IL_Rec[i]=tempIL[i]
temp_mate_F=[]
temp_mate_R=[]
for k2 in tempIL[i][1]:
if k2 in list(LinkFF.keys()): temp_mate_F+=LinkFF[k2]
if k2 in list(LinkFR.keys()): temp_mate_R+=LinkFR[k2]
temp_IL_Rec[i].append(temp_mate_F)
temp_IL_Rec[i].append(temp_mate_R)
tempLNF=[]
tempLNR=[]
for key in list(abLink.keys()):
for key2 in range(int(len(abLink[key])/3)):
if abLink[key][3*key2]=='f': tempLNF.append(key)
else: tempLNR.append(key)
for key in list(abLinkSP.keys()):
for key2 in range(int(len(abLinkSP[key])/3)):
if abLinkSP[key][3*key2]=='f': tempLNF.append(key)
else: tempLNR.append(key)
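# Long-range / translocation-style links: positions recorded in abLink and abLinkSP
# (triplets of orientation, position, target 'chrom_strand' tag) are pooled by read
# orientation into tempLNF/tempLNR, clustered, and each cluster is grouped by the
# linked-to chromosome. Within a group the read strand and the mate strand are
# decided by a majority vote (a >5:1 ratio picks a single orientation, otherwise
# both extremes are kept); the resulting records go to FRtLNFRb and are written to
# BPOutputd.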
FtLNFR=clusterNums(tempLNF+tempLNR,ClusterLen,'f')[0]
RtLNFR=clusterNums(tempLNF+tempLNR,ClusterLen,'r')[0]
FRtLNFR=clusterSupVis2(sorted(tempLNF+tempLNR),RtLNFR,FtLNFR,'left')
for key1 in list(FRtLNFR.keys()):
t_LNFR=[]
for key2 in FRtLNFR[key1]:
if key2 in list(abLink.keys()): t_LNFR+=abLink[key2]
else:
if abs(key2-key1)<SPCluLen or abs(key2-max(FRtLNFR[key1]))<SPCluLen: t_LNFR+=abLinkSP[key2]
if len(t_LNFR)/3<LinkCluMin: del FRtLNFR[key1]
else:
if not t_LNFR in FRtLNFR[key1]: FRtLNFR[key1].append(t_LNFR)
FRtLNFRb=[]
for key1 in list(FRtLNFR.keys()):
[t1_LN,t2_LN,t3_LN,t4out]=[[],[],{},[]]
for key2 in FRtLNFR[key1][:-1]:
if key2 in list(abLink.keys()):
if not key2 in t1_LN:
for key3 in range(int(len(abLink[key2])/3)):
if not abLink[key2][3*key3:3*(key3+1)] in t2_LN:
t2_LN.append(abLink[key2][3*key3:3*(key3+1)])
t1_LN.append(key2)
for key2 in t2_LN:
if not key2[-1].split('_')[0] in list(t3_LN.keys()):
t3_LN[key2[-1].split('_')[0]]={}
t3_LN[key2[-1].split('_')[0]]['a']=[]
t3_LN[key2[-1].split('_')[0]]['b']=[]
t3_LN[key2[-1].split('_')[0]]['c']=[]
t3_LN[key2[-1].split('_')[0]]['d']=[]
t3_LN[key2[-1].split('_')[0]]['a'].append(key2[0])
t3_LN[key2[-1].split('_')[0]]['b'].append(key2[1])
t3_LN[key2[-1].split('_')[0]]['c'].append(key2[-1].split('_')[1])
t3_LN[key2[-1].split('_')[0]]['d'].append(key2)
for key2 in list(t3_LN.keys()):
if clusterQC(clusterNums(t3_LN[key2]['b'], ClusterLen, 'f'), LinkCluMin)==[]: del t3_LN[key2]
else:
t4LN=clusterSupVis2(t3_LN[key2]['b'],clusterQC(clusterNums(t3_LN[key2]['b'], ClusterLen, 'r'), LinkCluMin),clusterQC(clusterNums(t3_LN[key2]['b'], ClusterLen, 'f'), LinkCluMin), 'left')
for key5 in list(t4LN.keys()):
t4LNa=[]
t4LNb=[]
t4LNc=[]
t4LNd=[]
t4out=[]
for key6 in t4LN[key5]:
t4LNa.append(t3_LN[key2]['a'][t3_LN[key2]['b'].index(key6)])
t4LNc.append(t3_LN[key2]['c'][t3_LN[key2]['b'].index(key6)])
t4LNd.append(key6)
t4LNb.append(t1_LN[t2_LN.index(t3_LN[key2]['d'][t3_LN[key2]['b'].index(key6)])])
if not 'f' in t4LNa or float(t4LNa.count('r'))/float(t4LNa.count('f'))>5: t4out+=[chrom,'r',min(t4LNb)]
elif not 'r' in t4LNa or float(t4LNa.count('f'))/float(t4LNa.count('r'))>5: t4out+=[chrom,'f',max(t4LNb)]
else: t4out+=[chrom,min(t4LNb),max(t4LNb)]
if not '+' in t4LNc or float(t4LNc.count('-'))/float(t4LNc.count('+'))>5: t4out+=[key2,'-',min(t4LNd)]
elif not '-' in t4LNc or float(t4LNc.count('+'))/float(t4LNc.count('-'))>5: t4out+=[key2,'+',max(t4LNd)]
else: t4out+=[key2,min(t4LNd),max(t4LNd)]
if not t4out==[] and not t4out in FRtLNFRb: FRtLNFRb.append(t4out)
if not FRtLNFRb==[]:
fout=open(BPOutputd,'a')
for keyfrt in FRtLNFRb: print(' '.join([str(keyfr2) for keyfr2 in keyfrt]), file=fout)
fout.close()
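# Second pass over the surviving insert-length candidates: entries in temp_IL_Rec
# whose own plus mate support reaches LnCluMin get their mate positions clustered
# into Link_IL_Rec, optionally mappability-filtered, and are paired exactly like the
# split-read breakpoints above; whatever is still left in temp_IL_Rec at the end is
# written to BPOutputa as single breakpoints before LN_Filter() cleans up the two
# output files.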
Link_IL_Rec={}
for k1 in list(temp_IL_Rec.keys()):
temp_mate_F=temp_IL_Rec[k1][2]
temp_mate_R=temp_IL_Rec[k1][3]
if not len(temp_IL_Rec[k1][1])+len(temp_IL_Rec[k1][2])+len(temp_IL_Rec[k1][3])<LnCluMin:
Link_IL_Rec[k1]=[clusterQC(clusterNums(temp_mate_F, ClusterLen, 'f'),LinkCluMin),clusterQC(clusterNums(temp_mate_R, ClusterLen, 'r'),LinkCluMin)]
del temp_IL_Rec[k1]
else:continue
for k1 in list(Link_IL_Rec.keys()):
for k2 in Link_IL_Rec[k1]:
for k3 in k2:
if k3>BPAlignQCFlank:
if align_QCflag==1:
tPairF_QC=0
tPairF_a=os.popen(r'''%s %s %s %d %d 1'''%(ToolMappingQ,FileMappingQ,chrom,k3-BPAlignQCFlank,k3+BPAlignQCFlank))
tPairF_b=float(tPairF_a.read().strip())
if not tPairF_b<float(BPAlignQC):
if not [min([k3,k1]),max([k3,k1])] in out_pair_bp: out_pair_bp.append([min([k3,k1]),max([k3,k1])])
else:
if not [min([k3,k1]),max([k3,k1])] in out_pair_bp: out_pair_bp.append([min([k3,k1]),max([k3,k1])])
if not out_pair_bp==[]:
temp_out_pair_bp=[]
out_BPmodify={}
for k1 in out_pair_bp:
for k2 in k1: out_BPmodify[k2]=[]
if not SP4S==[]:
LBSP_tempIL=clusterSupVis3(sorted(SP4S),sorted(out_BPmodify.keys()))
for k1 in out_pair_bp:
temp_k1=[]
for k2 in k1:
k3=LBSP_tempIL[k2]
if abs(k2-k3)<ClusterLen: temp_k1.append(k3)
else: temp_k1.append(k2)
temp_out_pair_bp.append(temp_k1)
out_pair_bp=temp_out_pair_bp
for k1 in out_pair_bp:
for k2 in k1:
if k2 in SP4S: del SP4S[SP4S.index(k2)]
out_pair_modify={}
for i in out_pair_bp:
if not i[0] in list(out_pair_modify.keys()): out_pair_modify[i[0]]=[]
if not i[1] in out_pair_modify[i[0]]: out_pair_modify[i[0]].append(i[1])
if len(out_pair_modify)>1:
while True:
if len(out_pair_modify)==1: break
out_pair_qc=[]
for i in range(len(sorted(out_pair_modify.keys()))-1): out_pair_qc.append(sorted(out_pair_modify.keys())[i+1]-sorted(out_pair_modify.keys())[i])
if min(out_pair_qc)>50:break
else:
out_pair_modify[sorted(out_pair_modify.keys())[out_pair_qc.index(min(out_pair_qc))+1]]+=out_pair_modify[sorted(out_pair_modify.keys())[out_pair_qc.index(min(out_pair_qc))]]
del out_pair_modify[sorted(out_pair_modify.keys())[out_pair_qc.index(min(out_pair_qc))]]
for k1 in list(out_pair_modify.keys()):
while True:
if len(out_pair_modify[k1])==1: break
out_pair_modify[k1].sort()
out_pair_qc=[]
for i in range(len(out_pair_modify[k1])-1): out_pair_qc.append(out_pair_modify[k1][i+1]-out_pair_modify[k1][i])
if min(out_pair_qc)>50:break
else: out_pair_modify[k1][out_pair_qc.index(min(out_pair_qc))+1]=out_pair_modify[k1][out_pair_qc.index(min(out_pair_qc))]
for i in out_pair_modify[k1]:
if out_pair_modify[k1].count(i)>1:
for j in range(out_pair_modify[k1].count(i)-1): del out_pair_modify[k1][out_pair_modify[k1].index(i)]
out_pair_numrec={}
for k1 in list(out_pair_modify.keys()):
out_pair_numrec[k1]=[]
for k2 in [k1]+out_pair_modify[k1]:
if k2 in list(tempIL.keys()) and not k2 in list(LinkSP.keys()): out_pair_numrec[k1].append(len(tempIL[k2][1]))
elif k2 in list(LinkSP.keys()) and not k2 in list(tempIL.keys()): out_pair_numrec[k1].append(LinkSP[k2])
elif k2 in list(LinkSP.keys()) and k2 in list(tempIL.keys()): out_pair_numrec[k1].append(LinkSP[k2]+len(tempIL[k2][1]))
else: out_pair_numrec[k1].append(0)
fout=open(BPOutputb,'a')
for i in sorted(out_pair_modify.keys()):
for j in out_pair_modify[i]:
if out_pair_numrec[i][0] + out_pair_numrec[i][out_pair_modify[i].index(j)+1]> BPLNCff: print(' '.join([chrom,str(i),str(out_pair_numrec[i][0]),str(j),str(out_pair_numrec[i][out_pair_modify[i].index(j)+1])]), file=fout)
if i in list(tempIL.keys()): del tempIL[i]
if j in list(tempIL.keys()): del tempIL[j]
fout.close()
fout=open(BPOutputa,'a')
for i in sorted(temp_IL_Rec.keys()):
if sum(temp_IL_Rec[i][0][:2])>SPCluLen: print(' '.join([chrom,str(i),str(sum(temp_IL_Rec[i][0][:2]))]), file=fout)
fout.close()
time2=time.time()
LN_Filter(BPOutputb,BPOutputa,workdir)
os.system(r'''cat %s >> %s'''%(BPOutputd,BPOutpute))
os.system(r'''rm %s'''%(BPOutputd))
print('BPSearch Complete for '+bamF+'.'+chrF)
print('Time consumed: '+str(time2-time1))
if function_name=='BPSearch_Predefined':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','help=','input-bed=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters_predefinedbp()
else:
import time
import datetime
if not '--input-bed' in list(dict_opts.keys()):
print('Error: please specify predefined breakpoints using --input-bed')
else:
def Code_Files_Define():
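# Publishes (as globals) the paths and sub-command names this mode needs when it
# shells back out to the other SVelter stages (NullModel, BPSearch, BPIntegrate,
# SVPredict, SVIntegrate) and to the R plotting scripts kept under
# <workdir>/reference_SVelter/.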
global input_bed
input_bed=dict_opts['--input-bed']
global workdir
workdir=path_modify(dict_opts['--workdir'])
global Code_File
global Code0_Function
global Code1_Function
global Code2_Function
global Code2_Predefined_Function
global Code3_Function
global Code4_Function
global Code5_Function
global RCode_Path
global Code1a_file
global Code1d_file
global Code1d2_file
Code_File=script_name
Code0_Function='Setup'
Code1_Function='NullModel'
Code2_Function='BPSearch'
Code2_Predefined_Function='BPSearch_Predefined'
Code3_Function='BPIntegrate'
Code4_Function='SVPredict'
Code5_Function='SVIntegrate'
RCode_Path=workdir+'reference_SVelter/'
Code1a_file=RCode_Path+'SVelter1.NullModel.Figure.a.r'
Code1d_file=RCode_Path+'SVelter1.NullModel.Figure.b.r'
Code1d2_file=RCode_Path+'SVelter1.NullModel.Figure.c.r'
def Define_Default_AllInOne():
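# Reads the optional tuning flags, falling back to the defaults coded below:
# qc-align 20, qc-split 20, split-min-len 0.9, num-iteration 10000, ploidy 2,
# -S 3 (insert-size cutoff in standard deviations) and the simple null model 'S'.
# A hypothetical invocation of this mode (script and file names are placeholders):
#   python SVelter.py BPSearch_Predefined --workdir ./work --sample sample.bam \
#          --input-bed predefined_breakpoints.bed --ploidy 2 -S 3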
global deterministic_flag
deterministic_flag=0
if '--deterministic-flag' in list(dict_opts.keys()):
deterministic_flag=int(dict_opts['--deterministic-flag'])
if '--core' in list(dict_opts.keys()):
global pool
pool = Pool(processes=int(dict_opts['--core']))
global model_comp
if not '--null-model' in list(dict_opts.keys()):
model_comp='S'
else:
if dict_opts['--null-model'] in ['S','Simple']:
model_comp='S'
else:
model_comp='C'
global QCAlign
if '--qc-align' in list(dict_opts.keys()):
QCAlign=int(dict_opts['--qc-align'])
else:
QCAlign=20
global QCSplit
if '--qc-split' in list(dict_opts.keys()):
QCSplit=int(dict_opts['--qc-split'])
else:
QCSplit=20
global NullSplitLen_perc
if '--split-min-len' in list(dict_opts.keys()):
NullSplitLen_perc=int(dict_opts['--split-min-len'])
else:
NullSplitLen_perc=0.9
global KeepFile
if '--keep-temp-files' in list(dict_opts.keys()):
KeepFile=dict_opts['--keep-temp-files']
else:
KeepFile='No'
global KeepFigure
if '--keep-temp-figs' in list(dict_opts.keys()):
KeepFigure=dict_opts['--keep-temp-figs']
else:
KeepFigure='No'
global Trail_Number
if '--num-iteration' in list(dict_opts.keys()):
Trail_Number=int(dict_opts['--num-iteration'])
else:
Trail_Number=10000
global Local_Minumum_Number
Local_Minumum_Number=100
global Ploidy
if '--ploidy' in list(dict_opts.keys()):
Ploidy=int(dict_opts['--ploidy'])
else:
Ploidy=2
global ILCff_STD_Time
if '-S' in list(dict_opts.keys()):
ILCff_STD_Time=int(dict_opts['-S'])
else:
ILCff_STD_Time=3
def run_SVelter1_chrom_predefine(sin_bam_file):
os.system(r'''%s %s --keep-temp-files %s --keep-temp-figs %s --null-model %s --workdir %s --sample %s --out-path %s'''%(Code_File,Code1_Function,KeepFile,KeepFigure,model_comp,workdir,sin_bam_file,NullModel_out_folder))
def run_SVelter1_Single_chrom_predefine(sin_bam_file,chromos_single):
os.system(r'''%s %s --keep-temp-files %s --keep-temp-figs %s --null-model %s --workdir %s --sample %s --chromosome %s --out-path %s'''%(Code_File,Code1_Function,KeepFile,KeepFigure,model_comp,workdir,sin_bam_file,chromos_single,NullModel_out_folder))
def run_SVelter2_chrom_predefine(chrom_name,sin_bam_file,ILCff_STD_Time):
os.system(r'''%s %s --chromosome %s --workdir %s --sample %s --null-model %s -S %s --out-path %s'''%(Code_File,Code2_Predefined_Function,chrom_name,workdir,sin_bam_file,model_comp,ILCff_STD_Time,BPPredict_out_folder))
def run_SVelter3_chrom_predefine(sin_bam_file,out_folder):
os.system(r'''%s %s --batch %s --workdir %s --sample %s --bp-path %s'''%(Code_File,Code3_Function,dict_opts['--batch'],workdir,sin_bam_file,BPPredict_out_folder))
def run_SVelter4_chrom(txt_name,sin_bam_file):
os.system(r'''%s %s --workdir %s --bp-file %s --sample %s --num-iteration %s --ploidy %s --null-model %s --deterministic-flag %s'''%(Code_File,Code4_Function,workdir,txt_name,sin_bam_file,str(Trail_Number),str(Ploidy),model_comp,deterministic_flag))
print(txt_name+' done!')
def run_SVelter5_chrom(path2,out_vcf):
os.system(r'''%s %s --workdir %s --input-path %s --prefix %s'''%(Code_File,Code5_Function,workdir,path2,out_vcf))
def SamplingPercentage_read_in():
if '--null-copyneutral-perc' in list(dict_opts.keys()):
SamplingPercentage=float(dict_opts['--null-copyneutral-perc'])
else:
SamplingPercentage=0.001
return SamplingPercentage
def main():
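# Overall flow of this mode: resolve the bam file(s) from --sample, check that the
# reference under <workdir>/reference_SVelter/genome.fa is indexed and that its
# contig names match the bam header, create the copy-neutral and exclude region
# files if they are missing (random regions drawn proportionally to chromosome
# length), restrict to --chromosome when given, then set up the per-sample
# "predefinedBP" output folders and write out the breakpoints read from --input-bed.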
Code_Files_Define()
Define_Default_AllInOne()
if '--sample' in list(dict_opts.keys()):
bam_path='/'.join(dict_opts['--sample'].split('/')[:-1])+'/'
bam_files=[dict_opts['--sample']]
bam_files_appdix=dict_opts['--sample'].split('.')[-1]
else:
bam_path=path_modify(dict_opts['--samplePath'])
bam_files=[]
for file in os.listdir(bam_path):
if file.split('.')[-1]==bam_files_appdix:
bam_files.append(bam_path+file)
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index):
print('Error: reference genome not indexed ')
else:
global whole_genome
global len_genome
[whole_genome,len_genome]=calculate_len_genome(ref_file)
chromos=list(whole_genome.keys())
chr_name_check=0
fin=open(ref_index)
chr_ref_check=[]
for line in fin:
pin=line.strip().split()
chr_ref_check.append(pin[0])
fin.close()
for filein_bam in bam_files:
chr_bam_check=[]
fin=os.popen(r'''samtools view -H %s'''%(filein_bam))
for line in fin:
pin=line.strip().split()
if pin[0]=='@SQ':
chr_bam_check.append(pin[1].split(':')[1])
fin.close()
if not chr_ref_check==chr_bam_check:
print('Warning: please make sure the reference file matches the sample file')
chr_flag=0
if 'chr' in chr_ref_check[0]:
chr_flag=1
SamplingPercentage=float(SamplingPercentage_read_in())
cn2_file=cn2_file_read_in(dict_opts,workdir)
ex_file=ex_file_read_in(dict_opts,workdir)
cn2_length=int(cn2_length_readin(dict_opts))
Gap_Refs=[ex_file]
if not os.path.isfile(cn2_file):
cn2_path='/'.join(cn2_file.split('/')[:-1])+'/'
if not os.path.isdir(cn2_path):
os.system(r'''mkdir %s'''%(cn2_path))
if not '--null-random-length' in list(dict_opts.keys()):
dict_opts['--null-random-length']=5000
else:
dict_opts['--null-random-length']=int(dict_opts['--null-random-length'])
if not '--null-random-num' in list(dict_opts.keys()):
dict_opts['--null-random-num']=10000
else:
dict_opts['--null-random-num']=int(dict_opts['--null-random-num'])
cn2_length=dict_opts['--null-random-length']-100
fo=open(cn2_file,'w')
for i in sorted(whole_genome.keys()):
num_i=int(float(whole_genome[i][0])/float(len_genome)*dict_opts['--null-random-num'])
reg_i=[random.randint(1,whole_genome[i][0]-dict_opts['--null-random-length']) for j in range(num_i)]
for j in sorted(reg_i):
print(' '.join([i,str(j),str(j+dict_opts['--null-random-length']-1)]), file=fo)
fo.close()
SamplingPercentage=1
if not os.path.isfile(ex_file):
fo=open(ex_file,'w')
for chr_ex in chromos:
print(' '.join([chr_ex,'0','0']), file=fo)
fo.close()
#if '--prefix' in dict_opts.keys():
# out_vcf=dict_opts['--prefix']+'.vcf'
# out_svelter=dict_opts['--prefix']+'.svelter'
#else:
# out_vcf=workdir+dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'.vcf')
# out_svelter=workdir+dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'.svelter')
# print 'Warning: output file is not specified'
# print 'output file: '+out_vcf
# print 'output file: '+out_svelter
temp_inter_replace=0
if '--chromosome' in list(dict_opts.keys()):
chrom_single=dict_opts['--chromosome']
if not chrom_single in chromos:
print('Error: please make sure the chromosome defined by --chromosome is correct based on the reference genome')
chromos=[]
else:
chromos=[chrom_single]
for sin_bam_file in bam_files:
global NullModel_out_folder
global BPPredict_out_folder
global bp_files_out_folder
BPPredict_out_folder=workdir+'BreakPoints.'+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1])+'.predefinedBP.'+'.'.join(dict_opts['--input-bed'].split('/')[-1].split('.')[:-1])+'/'
NullModel_out_folder=workdir+'NullModel.'+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1])+'.predefinedBP.'+'.'.join(dict_opts['--input-bed'].split('/')[-1].split('.')[:-1])+'/'
bp_files_out_folder=workdir+'bp_files.'+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1])+'.predefinedBP.'+'.'.join(dict_opts['--input-bed'].split('/')[-1].split('.')[:-1])+'/'
running_time=[]
if os.path.isfile(input_bed):
bed_info=bed_readin(input_bed)
path_mkdir(BPPredict_out_folder)
bed_write(bed_info,BPPredict_out_folder,sin_bam_file.split('/')[-1],input_bed)
else:
print('Error: predefined breakpoints file does not exist!')
main()
if function_name=='BPIntegrate':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','help=','bp-path=','long-insert=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters_bpintegrate()
else:
def Define_Default_BPIntegrate():
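# Defaults for the BPIntegrate mode: read-length 101, qc-align 20, qc-split 20,
# BPalignQC 0.2, BPalignQCFlank 500 and split-min-len 0.1; mappability QC
# (align_QCflag) is only enabled when both --qc-map-tool and --qc-map-file are given.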
global ReadLength
if not '--read-length' in list(dict_opts.keys()):
ReadLength=101
else:
ReadLength=int(dict_opts['--read-length'])
global ToolMappingQ
global FileMappingQ
global align_QCflag
if '--qc-map-tool' in list(dict_opts.keys()) and '--qc-map-file' in list(dict_opts.keys()):
ToolMappingQ=dict_opts['--qc-map-tool']
FileMappingQ=dict_opts['--qc-map-file']
align_QCflag=1
else:
align_QCflag=0
global BPalignQC
if '--BPalignQC' in list(dict_opts.keys()):
BPalignQC=float(dict_opts['--BPalignQC'])
else:
BPalignQC=0.2
global QCAlign
if '--qc-align' in list(dict_opts.keys()):
QCAlign=int(dict_opts['--qc-align'])
else:
QCAlign=20
global QCSplit
if '--qc-split' in list(dict_opts.keys()):
QCSplit=int(dict_opts['--qc-split'])
else:
QCSplit=20
global Null_SplitLen_perc
if '--split-min-len' in list(dict_opts.keys()):
Null_SplitLen_perc=float(dict_opts['--split-min-len'])
else:
Null_SplitLen_perc=0.1
global BPalignQCFlank
if '--BPalignQCFlank' in list(dict_opts.keys()):
BPalignQCFlank=int(dict_opts['--BPalignQCFlank'])
else:
BPalignQCFlank=500
global para_filter
para_filter=[]
if '--BPSPCff' in list(dict_opts.keys()) and '--BPLNCff' in list(dict_opts.keys()) and '--BPalignQC' in list(dict_opts.keys()):
para_filter=['SPCff'+dict_opts['--BPSPCff']+'.CluCff'+dict_opts['--BPLNCff']+'.AlignCff'+dict_opts['--BPalignQC']]
def global_para_declaration():
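# Resolves the working directory and the folder holding the per-chromosome .LNs/.SPs
# breakpoint files; without --bp-path it falls back to
# <workdir>/BreakPoints.<sample filename>/ and relative paths are prefixed with './'.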
global workdir
workdir=path_modify(dict_opts['--workdir'])
print('temp files produced under: '+workdir)
global bps_in_path
if not '--bp-path' in list(dict_opts.keys()):
bps_in_path=workdir+'BreakPoints.'+dict_opts['--sample'].split('/')[-1]+'/'
else:
bps_in_path=path_modify(dict_opts['--bp-path'])
if not bps_in_path[0]=='/' and not bps_in_path[:2]=='./':
bps_in_path='./'+bps_in_path
global S_Sample
global chromo_name
global LN_list
global all_SPs
min_length=100
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
Define_Default_BPIntegrate()
if not '--workdir' in list(dict_opts.keys()):
print('Error: please specify working directory using: --workdir')
else:
global_para_declaration()
if not '--sample' in list(dict_opts.keys()):
print('Error: please specify the input file using --sample')
else:
bam_path='/'.join(dict_opts['--sample'].split('/')[:-1])+'/'
bam_files=[dict_opts['--sample']]
bam_files_appdix=dict_opts['--sample'].split('.')[-1]
#bam_names=[dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'')]
bam_names=['.'.join(dict_opts['--sample'].split('/')[-1].split('.')[:-1])]
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index):
print('Error: reference genome not indexed')
else:
chromos=chromos_read_in(ref_file)
allchromos=chromos
if '--chromosome' in list(dict_opts.keys()):
chrom_single=dict_opts['--chromosome']
if not chrom_single in chromos:
print('Error: please make sure the chromosome defined by --chromosome is correct based on the reference genome')
chromos=[]
else:
chromos=[chrom_single]
if not chromos==[]:
time1=time.time()
bps_hash={}
for i in bam_names:
bps_hash[i]={}
bps_folder='/'.join(bps_in_path.split('/')[:-2])+'/'+'.'.join(['bp_files']+bps_in_path.split('/')[-2].split('.')[1:])+'/'
path_mkdir(bps_folder)
for i in bam_names:
bps_hash[i]={}
for file1 in os.listdir(bps_in_path):
if file1.split('.')[-1]=='LNs':
key_bps_hash='.'.join(file1.split('.')[:-1])
if not key_bps_hash in list(bps_hash[i].keys()):
bps_hash[i][key_bps_hash]=[]
bps_hash[i][key_bps_hash].append(file1)
bps_hash[i][key_bps_hash].append(key_bps_hash+'.SPs')
for S_Sample in list(bps_hash.keys()):
[LN_list,all_SPs]=[{},{}]
for chromo_name in list(bps_hash[S_Sample].keys()):
[LN_list,all_SPs]=SP_LN_info_ReadIn(LN_list,all_SPs,bps_in_path+bps_hash[S_Sample][chromo_name][0],bps_in_path+bps_hash[S_Sample][chromo_name][1])
for chromo_name in list(LN_list.keys()):
if not LN_list[chromo_name]==[]:
if chromo_name in list(all_SPs.keys()):
unique_SPs=SP_Info_Merge(all_SPs[chromo_name])
else:
all_SPs[chromo_name]={}
unique_SPs=[]
modified_LNs=LN_Info_Correct(LN_list[chromo_name],unique_SPs)
multi_removed_LNs=multi_trans_detect(modified_LNs)
LN_LN_Merge_0=merge_LNs_into_LNs(multi_removed_LNs)
LN_LN_Merge=LN_Merge_Final_Check(LN_LN_Merge_0)
if not '--batch' in list(dict_opts.keys()):
write_bp_1a(LN_LN_Merge,bps_folder,chromo_name,S_Sample)
else:
if dict_opts['--batch']=='0':
write_bp_2a(LN_LN_Merge,bps_folder,chromo_name,S_Sample)
else:
file_length=int(dict_opts['--batch'])
file_index=write_bp_3a(LN_LN_Merge,bps_folder,file_length,chromo_name,S_Sample)
LN_bps_write(bps_hash,bps_folder,S_Sample,dict_opts,chromos,allchromos,bps_in_path)
time2=time.time()
print('BPIntegrate Complete!')
print('Time consumed: '+str(time2-time1))
if function_name=='SVPredict':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','help=','long-insert=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','input-bed=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']: readme.print_default_parameters_svpredict()
else:
import os
import sys
import getopt
import random
import scipy
import math
import numpy
import pickle
from math import sqrt,pi,exp
import scipy
from scipy.stats import norm
import time
import datetime
import itertools
def Af_Rearrange_Info_Collect_2(BP_para_dict,Letter_Candidates):
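# Scores every candidate rearranged haplotype pair (Af_Letter): breakpoints are
# rebuilt from the block lengths in Be_BP_Letter, reads are re-evaluated with
# Letter_Through_Rearrange_4, and four penalty terms are collected -- insert length
# (P_IL), GC-adjusted read depth plus a normal term for leftover coverage (P_RD),
# the squared read-direction penalty divided by the number of read pairs (P_DR) and
# the through-breakpoint term (P_TB). It returns the IL scores regularised by the
# direction penalty (times K_IL_new) and the RD scores plus the TB term, together
# with the letter and breakpoint candidates that survived.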
P_IL=[]
P_RD=[]
P_DR=[]
P_TB=[]
Letter_Rec=[]
BP_Rec=[]
for Af_Letter in Letter_Candidates:
Af_BP=[[BP_para_dict['original_bp_list'][0]],[BP_para_dict['original_bp_list'][0]]]
for i in Af_Letter[0]:
Af_BP[0].append(Af_BP[0][-1]+Be_BP_Letter[i])
for i in Af_Letter[1]:
Af_BP[1].append(Af_BP[1][-1]+Be_BP_Letter[i])
Af_Info_all=Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP)
if not Af_Info_all==0:
Letter_Rec.append(Af_Letter)
BP_Rec.append(Af_BP)
Af_IL_Penal=Af_Info_all[0]
Af_RD_Rec=Af_Info_all[1]
Af_DR_Penal=(Af_Info_all[2])**2
Af_TB_Penal_a=Af_Info_all[4]
Af_TB_Rec=Af_Info_all[3]
Af_TB_Penal=-float(Af_TB_Penal_a)/float(BP_para_dict['num_of_reads'])+float(Af_TB_Rec)
Af_RD_Penal=RD_Adj_Penal(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter)
for key in list(Af_Info_all[5].keys()):
Af_RD_Penal+=Prob_Norm(Af_Info_all[5][key],0,GC_para_dict['GC_Var_Coverage'][chrom_N]/2)
P_IL.append(Af_IL_Penal)
P_RD.append(Af_RD_Penal)
P_DR.append(Af_DR_Penal/num_of_read_pairs)
P_TB.append(Af_TB_Penal)
if P_IL==[]: return 'Error'
else:
Regu_IL=[P_IL[i]*(1+DR_Weight*P_DR[i]) for i in range(len(P_IL))]
Regu_IL=[i*K_IL_new for i in Regu_IL]
#Regu_RD=[P_RD[i]*(1-TB_Weight*P_TB[i]) for i in range(len(P_RD))]
Regu_RD=[P_RD[i]+P_TB[i] for i in range(len(P_RD))]
return [Regu_IL,Regu_RD,Letter_Rec,BP_Rec]
def All_Block_RD(Initial_block_RD,Af_GCRD_Adj,Af_block_RD,Af_Letter,flank):
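# Splits the observed depth of every block between the two candidate haplotypes in
# proportion to how many copies of the block each haplotype carries (CNm/CNp counted
# from Af_Letter, '^' marking inverted copies); returns [copy numbers, per-haplotype
# depths]. All_Block_RD_2 below does the same bookkeeping starting from per-letter
# depth hashes instead of positional lists.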
All_Letters=['left']+[chr(97+i) for i in range(len(Initial_block_RD)-1)]
CNm=[1]+[0 for j in range(len(Initial_block_RD)-1)]
CNp=[1]+[0 for j in range(len(Initial_block_RD)-1)]
k=Af_Letter[0]
for m in k:
CNm[ord(m[0])-96]+=1
k=Af_Letter[1]
for m in k:
CNp[ord(m[0])-96]+=1
RDm=[(Initial_block_RD[0]+left_RD_Calculate_2a(Through_GCRD_Adj,Af_GCRD_Adj[0],flank))/2]+[0 for j in (list(range(len(Initial_block_RD)-1)),Window_Size)]
RDp=[(Initial_block_RD[0]+left_RD_Calculate_2a(Through_GCRD_Adj,Af_GCRD_Adj[1],flank))/2]+[0 for j in (list(range(len(Initial_block_RD)-1)),Window_Size)]
RDs=[RDm,RDp]
for p in range(len(Af_Letter)):
for q in range(len(Af_Letter[p])):
RDs[p][ord(Af_Letter[p][q][0])-96]+=Af_block_RD[p][q]
for r in range(len(Initial_block_RD))[1:]:
if CNm[r]==CNp[r]:
RDs[0][r]+=Initial_block_RD[r]/2
RDs[1][r]+=Initial_block_RD[r]/2
elif CNm[r]==0 and not CNp[r]==0:
RDs[1][r]+=Initial_block_RD[r]
elif CNp[r]==0 and not CNm[r]==0:
RDs[0][r]+=Initial_block_RD[r]
else:
RDs[0][r]+=Initial_block_RD[r]*CNm[r]/(CNp[r]+CNm[r])
RDs[1][r]+=Initial_block_RD[r]*CNp[r]/(CNp[r]+CNm[r])
CNs=[CNm,CNp]
return [CNs,RDs]
def All_Block_RD_2(Initial_block_RD,Af_block_RD,Af_Letter,bps,flank):
RDs=[[],[]]
CNs=[[],[]]
for let in [chr(97+i) for i in range(len(bps)-1)]:
CNs[0].append(Af_Letter[0].count(let)+Af_Letter[0].count(let+'^'))
CNs[1].append(Af_Letter[1].count(let)+Af_Letter[1].count(let+'^'))
if not CNs[0][-1]+CNs[1][-1]==0:
RDs[0].append(Initial_block_RD[ord(let)-96]*CNs[0][-1]/(CNs[0][-1]+CNs[1][-1]))
RDs[1].append(Initial_block_RD[ord(let)-96]*CNs[1][-1]/(CNs[0][-1]+CNs[1][-1]))
if CNs[0][-1]+CNs[1][-1]==0:
RDs[0].append(0)
RDs[1].append(0)
for key in list(Af_block_RD[0].keys()):
if not key=='left' and not key=='right':
RDs[0][ord(key.split('_')[0])-97]+=float(Af_block_RD[0][key])/float(bps[ord(key.split('_')[0])-96]-bps[ord(key.split('_')[0])-97])*Window_Size
for key in list(Af_block_RD[1].keys()):
if not key=='left' and not key=='right':
RDs[1][ord(key.split('_')[0])-97]+=float(Af_block_RD[1][key])/float(bps[ord(key.split('_')[0])-96]-bps[ord(key.split('_')[0])-97])*Window_Size
CNs[0]=[1]+CNs[0]
CNs[1]=[1]+CNs[1]
RDs[0]=[Af_block_RD[0]['left']+Initial_block_RD[0]/2]+RDs[0]
RDs[1]=[Af_block_RD[1]['left']+Initial_block_RD[0]/2]+RDs[1]
return [CNs,RDs]
def Be_Info_1_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal):
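# Re-places each discordant read pair from Be_Info[0] onto the candidate haplotypes:
# for every combination of a block and its inverted copy ('^') present in
# temp_letter, the read coordinates are translated through Let_BP_Info; placements
# passing candidate_QC_Control go to Map_M, Map_P or Map_Both with weight
# 1/number-of-placements, while pairs that cannot be placed on either haplotype add
# their covered bases to Total_Cov_For_Pen and increase NoMapPenal.
# Be_Info_2_rearrange and Be_Info_3_rearrange below apply the same idea to split
# read pairs and to single split reads.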
be_info_1=Be_Info[0]
for j in be_info_1:
jMapPenam=0
j_m_new=[]
if j[0] in temp_letter[0] and j[3] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[3]]:
j_m_temp=[j[1]+ka[0],j[2]+ka[0],j[4]+kb[0],j[5]+kb[0]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[j[-1],j[-2]]
else:
j_m_temp+=[j[-2],j[-1]]
j_m_new.append(j_m_temp)
if j[0]+'^' in temp_letter[0] and j[3] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[3]]:
j_m_temp=[ka[1]-j[2],ka[1]-j[1],kb[0]+j[4],kb[0]+j[5]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[j[-1],complement(j[-2])]
else:
j_m_temp+=[complement(j[-2]),j[-1]]
j_m_new.append(j_m_temp)
if j[0] in temp_letter[0] and j[3]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[3]+'^']:
j_m_temp=[j[1]+ka[0],j[2]+ka[0],kb[1]-j[5],kb[1]-j[4]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[complement(j[-1]),j[-2]]
else:
j_m_temp+=[j[-2],complement(j[-1])]
j_m_new.append(j_m_temp)
if j[0]+'^' in temp_letter[0] and j[3]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[3]+'^']:
j_m_temp=[ka[1]-j[2],ka[1]-j[1],kb[1]-j[5],kb[1]-j[4]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[complement(j[-1]),complement(j[-2])]
else:
j_m_temp+=[complement(j[-2]),complement(j[-1])]
j_m_new.append(j_m_temp)
j_m_3a=candidate_QC_Control(j_m_new)
if j_m_3a==[]:
jMapPenam+=1
j_p_new=[]
if j[0] in temp_letter[1] and j[3] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[3]]:
j_p_temp=[j[1]+ka[0],j[2]+ka[0],j[4]+kb[0],j[5]+kb[0]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[j[-1],j[-2]]
else:
j_p_temp+=[j[-2],j[-1]]
j_p_new.append(j_p_temp)
if j[0]+'^' in temp_letter[1] and j[3] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[3]]:
j_p_temp=[ka[1]-j[2],ka[1]-j[1],kb[0]+j[4],kb[0]+j[5]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[j[-1],complement(j[-2])]
else:
j_p_temp+=[complement(j[-2]),j[-1]]
j_p_new.append(j_p_temp)
if j[0] in temp_letter[1] and j[3]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[3]+'^']:
j_p_temp=[j[1]+ka[0],j[2]+ka[0],kb[1]-j[5],kb[1]-j[4]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[complement(j[-1]),j[-2]]
else:
j_p_temp+=[j[-2],complement(j[-1])]
j_p_new.append(j_p_temp)
if j[0]+'^' in temp_letter[1] and j[3]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[3]+'^']:
j_p_temp=[ka[1]-j[2],ka[1]-j[1],kb[1]-j[5],kb[1]-j[4]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[complement(j[-1]),complement(j[-2])]
else:
j_p_temp+=[complement(j[-2]),complement(j[-1])]
j_p_new.append(j_p_temp)
j_p_3a=candidate_QC_Control(j_p_new)
if j_p_3a==[]:
jMapPenam+=1
if jMapPenam==2:
Total_Cov_For_Pen[j[0]]+=j[2]-j[1]
Total_Cov_For_Pen[j[3]]+=j[5]-j[4]
NoMapPenal+=2
elif jMapPenam==1:
if j_m_3a==[]:
Map_P+=[jp3+['p']+[float(1)/float(len(j_p_3a))] for jp3 in j_p_3a]
elif j_p_3a==[]:
Map_M+=[jp3+['m']+[float(1)/float(len(j_m_3a))] for jp3 in j_m_3a]
else:
j_mp_4a=candidate_QC_Control2(j_m_3a,j_p_3a)
if not j_mp_4a==[]:
Map_Both+=[j4+[float(1)/float(len(j_mp_4a))] for j4 in j_mp_4a]
else:
Total_Cov_For_Pen[j[0]]+=j[2]-j[1]
Total_Cov_For_Pen[j[3]]+=j[5]-j[4]
NoMapPenal+=2
return NoMapPenal
def Be_Info_2_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal):
be_info_2=Be_Info[1]
for j in be_info_2:
jMapPenam=0
j_m_new=[]
if j[0] in temp_letter[0] and j[2] in temp_letter[0] and j[4] in temp_letter[0] and j[6] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[2]]:
for kc in Let_BP_Info['m'][j[4]]:
for kd in Let_BP_Info['m'][j[6]]:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],j[-2]])
else:
j_m_new.append(j_info_new+[j[-2],j[-1]])
if j[0]+'^' in temp_letter[0] and j[2]+'^' in temp_letter[0] and j[4] in temp_letter[0] and j[6] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[2]+'^']:
for kc in Let_BP_Info['m'][j[4]]:
for kd in Let_BP_Info['m'][j[6]]:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],complement(j[-2])])
else:
j_m_new.append(j_info_new+[complement(j[-2]),j[-1]])
if j[0] in temp_letter[0] and j[2] in temp_letter[0] and j[4]+'^' in temp_letter[0] and j[6]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[2]]:
for kc in Let_BP_Info['m'][j[4]+'^']:
for kd in Let_BP_Info['m'][j[6]+'^']:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),j[-2]])
else:
j_m_new.append(j_info_new+[j[-2],complement(j[-1])])
if j[0]+'^' in temp_letter[0] and j[2]+'^' in temp_letter[0] and j[4]+'^' in temp_letter[0] and j[6]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[2]+'^']:
for kc in Let_BP_Info['m'][j[4]+'^']:
for kd in Let_BP_Info['m'][j[6]+'^']:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),complement(j[-2])])
else:
j_m_new.append(j_info_new+[complement(j[-2]),complement(j[-1])])
j_m_3a=candidate_QC_Control(j_m_new)
if j_m_3a==[]:
jMapPenam+=1
j_p_new=[]
if j[0] in temp_letter[1] and j[2] in temp_letter[1] and j[4] in temp_letter[1] and j[6] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[2]]:
for kc in Let_BP_Info['p'][j[4]]:
for kd in Let_BP_Info['p'][j[6]]:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],j[-2]])
else:
j_p_new.append(j_info_new+[j[-2],j[-1]])
if j[0]+'^' in temp_letter[1] and j[2]+'^' in temp_letter[1] and j[4] in temp_letter[1] and j[6] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[2]+'^']:
for kc in Let_BP_Info['p'][j[4]]:
for kd in Let_BP_Info['p'][j[6]]:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],complement(j[-2])])
else:
j_p_new.append(j_info_new+[complement(j[-2]),j[-1]])
if j[0] in temp_letter[1] and j[2] in temp_letter[1] and j[4]+'^' in temp_letter[1] and j[6]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[2]]:
for kc in Let_BP_Info['p'][j[4]+'^']:
for kd in Let_BP_Info['p'][j[6]+'^']:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),j[-2]])
else:
j_p_new.append(j_info_new+[j[-2],complement(j[-1])])
if j[0]+'^' in temp_letter[1] and j[2]+'^' in temp_letter[1] and j[4]+'^' in temp_letter[1] and j[6]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[2]+'^']:
for kc in Let_BP_Info['p'][j[4]+'^']:
for kd in Let_BP_Info['p'][j[6]+'^']:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),complement(j[-2])])
else:
j_p_new.append(j_info_new+[complement(j[-2]),complement(j[-1])])
j_p_3a=candidate_QC_Control(j_p_new)
if j_p_3a==[]:
jMapPenam+=1
if jMapPenam==2:
if j[0]==j[2]:
Total_Cov_For_Pen[j[0]]+=j[3]-j[1]
else:
Total_Cov_For_Pen[j[0]]+=Be_BP_Letter[j[0]]-j[1]
Total_Cov_For_Pen[j[2]]+=j[3]
if j[4]==j[6]:
Total_Cov_For_Pen[j[4]]+=j[7]-j[5]
else:
Total_Cov_For_Pen[j[4]]+=Be_BP_Letter[j[4]]-j[5]
Total_Cov_For_Pen[j[6]]+=j[7]
NoMapPenal+=2
elif jMapPenam==1:
if j_m_3a==[]:
Map_P+=[jp3+['p']+[float(1)/float(len(j_p_3a))] for jp3 in j_p_3a]
elif j_p_3a==[]:
Map_M+=[jp3+['m']+[float(1)/float(len(j_m_3a))] for jp3 in j_m_3a]
else:
j_mp_4a=candidate_QC_Control2(j_m_3a,j_p_3a)
if not j_mp_4a==[]:
Map_Both+=[j4+[float(1)/float(len(j_mp_4a))] for j4 in j_mp_4a]
else:
if j[0]==j[2]:
Total_Cov_For_Pen[j[0]]+=j[3]-j[1]
else:
Total_Cov_For_Pen[j[0]]+=Be_BP_Letter[j[0]]-j[1]
Total_Cov_For_Pen[j[2]]+=j[3]
if j[4]==j[6]:
Total_Cov_For_Pen[j[4]]+=j[7]-j[5]
else:
Total_Cov_For_Pen[j[4]]+=Be_BP_Letter[j[4]]-j[5]
Total_Cov_For_Pen[j[6]]+=j[7]
NoMapPenal+=2
return NoMapPenal
def Be_Info_3_rearrange(BP_para_dict,Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal):
be_info_3=Be_Info[2]
for j in be_info_3:
j_m_new=[]
if j[0] in temp_letter[0] and j[2] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[2]]:
temp_single=[ka[0]+j[1],kb[0]+j[3]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_m_new.append(temp_single)
if j[0]+'^' in temp_letter[0] and j[2]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[2]+'^']:
temp_single=[kb[1]-j[3],ka[1]-j[1]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_m_new.append(temp_single)
j_p_new=[]
if j[0] in temp_letter[1] and j[2] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[2]]:
temp_single=[ka[0]+j[1],kb[0]+j[3]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_p_new.append(temp_single)
if j[0]+'^' in temp_letter[1] and j[2]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[2]+'^']:
temp_single=[kb[1]-j[3],ka[1]-j[1]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_p_new.append(temp_single)
if not j_m_new+j_p_new==[]:
for j2 in j_m_new:
Map_Both.append(j2+['m',float(1)/float(len(j_m_new+j_p_new))])
for j2 in j_p_new:
Map_Both.append(j2+['p',float(1)/float(len(j_m_new+j_p_new))])
else:
Total_Cov_For_Pen[j[0]]=Be_BP_Letter[j[0]]-j[1]
Total_Cov_For_Pen[j[2]]=j[3]
NoMapPenal+=1
return NoMapPenal
def Block_Assign_To_Letters(bp_list,letter_list,flank):
#Eg of bp_list:[184569179, 184569775, 184571064, 184572009, 184572016]
#Eg of letter_list:['a', 'b', 'c', 'd']
#Eg of flank:446
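# A minimal usage sketch (hypothetical numbers; relies on the global Window_Size):
#   blocks_bp = Block_Assign_To_Letters([184569179, 184569775, 184571064], ['a','b'], 446)
#   # blocks_bp maps each 1-based Window_Size-wide window index to
#   # [window start, window end, block labels overlapping it ('left','a',...,'right')];
#   # keys 0 and number_of_blocks+1 hold flanking sentinel windows tagged '0'.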
number_of_blocks=int((numpy.max(bp_list)-numpy.min(bp_list)+2*flank)/Window_Size)+1  # cast to int so it works with range() and as a dict key in Python 3
blocks={}
bp_list_new=[bp_list[0]-flank]+bp_list+[bp_list[-1]+flank]
relative_bp_list=[i-numpy.min(bp_list_new) for i in bp_list_new]
bp_length=[(bp_list_new[i+1]-bp_list_new[i]) for i in range(len(bp_list_new)-1)]
letter_list_new=['left']+letter_list+['right']
bp_blocks=[[letter_list_new[j]]+list(range(int(relative_bp_list[j]/Window_Size),int(relative_bp_list[j+1]/Window_Size+1))) for j in range(len(relative_bp_list)-1)]
blocks_bp={}
for i in range(number_of_blocks):
blocks_bp[i+1]=[bp_list_new[0]+i*Window_Size,bp_list_new[0]+i*Window_Size+Window_Size-1]
for j in bp_blocks:
if i in j:
blocks_bp[i+1].append(j[0])
blocks_bp[0]=[blocks_bp[1][0]-Window_Size,blocks_bp[1][0]-1,'0']
blocks_bp[number_of_blocks+1]=[blocks_bp[number_of_blocks][1]+1,blocks_bp[number_of_blocks][1]+Window_Size,'0']
return blocks_bp
def block_Info_ReadIn(GC_para_dict,BP_para_dict,chr_letter_bp,blocks_read_in,Multi_Dup):
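# Streams reads over each letter's region with 'samtools view -F 256', skipping
# unmapped and duplicate reads and black-listing QC-fail read names. Reads below
# QCAlign only contribute to a low-quality depth hash (rd_low_qual); the rest are
# grouped by read name and classified as fully-within-one-block (added to block_rds),
# pairs spanning a breakpoint (Pair_ThroughBP), or reads individually crossing block
# boundaries (Double_Read_ThroughBP / Single_Read_ThroughBP). Returns
# [block_rds, block_rd2, Pair_ThroughBP, Double_Read_ThroughBP, Single_Read_ThroughBP].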
block_bps={}
block_rds={}
for k1 in list(chr_letter_bp.keys()):
block_bps[k1]={}
block_rds[k1]={}
for k2 in list(chr_letter_bp[k1].keys()):
if not k2 in Multi_Dup:
block_bps[k1][k2]=[min(chr_letter_bp[k1][k2]),max(chr_letter_bp[k1][k2])]
block_rds[k1][k2]=0
Pair_ThroughBP={}
Double_Read_ThroughBP={}
Single_Read_ThroughBP={}
total_rec={}
rd_low_qual={}
for k1 in list(chr_letter_bp.keys()):
Pair_ThroughBP[k1]=[]
Double_Read_ThroughBP[k1]=[]
Single_Read_ThroughBP[k1]=[]
rd_low_qual[k1]={}
for k2 in blocks_read_in[k1]:
multi_dup_flag=multi_dup_check(k2,Multi_Dup)
if multi_dup_flag==0:
k2a=[]
k2b=[]
for k3 in k2:
if type(k3)==type(1):
k2a.append(k3)
else:
k2b.append(k3)
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,k1,min(k2a)-BP_para_dict['flank'],max(k2a)+BP_para_dict['flank']))
blackList=[]
temp_rec={}
temp_rec_LowQual={}
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
#if not int(pbam[4])>QCAlign:continue
if pbam[0] in blackList: continue
if not int(pbam[4])>QCAlign:
if not pbam[0] in list(temp_rec_LowQual.keys()):
temp_rec_LowQual[pbam[0]]=[]
if not pbam[1:9] in temp_rec_LowQual[pbam[0]]:
temp_rec_LowQual[pbam[0]]+=[pbam[1:9]]
else:
if not pbam[0] in list(temp_rec.keys()):
temp_rec[pbam[0]]=[]
if not pbam[1:9] in temp_rec[pbam[0]]:
temp_rec[pbam[0]]+=[pbam[1:9]]
fbam.close()
flank_region=[]
for k3 in k2b:
flank_region+=block_bps[k1][k3]
flank_region=[min(flank_region),max(flank_region)]
for k3 in list(temp_rec_LowQual.keys()):
for k4 in temp_rec_LowQual[k3]:
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
if not read_pos[-1] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-1]]=0
rd_low_qual[k1][read_pos[-1]]+=(read_pos[1]-read_pos[0])
else:
if not read_pos[-2] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-2]]=0
if not read_pos[-1] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-1]]=0
rd_low_qual[k1][read_pos[-2]]+=block_bps[k1][read_pos[-2]][1]-read_pos[0]
rd_low_qual[k1][read_pos[-1]]+=-block_bps[k1][read_pos[-1]][0]+read_pos[1]
for k3 in list(temp_rec.keys()):
if len(temp_rec[k3])>2:
test_rec=[int(temp_rec[k3][0][7])]
test_rec2=[temp_rec[k3][0]]
test_let=0
for k4 in temp_rec[k3][1:]:
delflag=0
for k5 in test_rec:
if int(k4[7])+k5==0:
test_let+=1
k6=k3+chr(96+test_let)
temp_rec[k6]=[test_rec2[test_rec.index(k5)],k4]
del test_rec2[test_rec.index(k5)]
del test_rec[test_rec.index(k5)]
delflag+=1
if delflag==0:
test_rec.append(int(k4[7]))
test_rec2.append(k4)
temp_rec[k3]=test_rec2
for k3 in list(temp_rec.keys()):
if len(temp_rec[k3])==1:
del_flag=0
k4=temp_rec[k3][0]
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
mate_pos=[int(k4[6]),int(k4[6])+ReadLength]
if 'left' in k2b and mate_pos[1]<flank_region[0]:
del_flag+=1
elif 'right' in k2b and mate_pos[0]>flank_region[0]:
del_flag+=1
#elif not mate_pos[1]<flank_region[0] and not mate_pos[0]>flank_region[1]:
# del_flag+=1
if del_flag>0:
del temp_rec[k3]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
else:
Single_Read_ThroughBP[k1].append(read_pos)
else:
if not k3 in list(total_rec.keys()):
total_rec[k3]=[k4]
else:
total_rec[k3]+=[k4]
elif len(temp_rec[k3])==2:
if int(temp_rec[k3][0][7])==0 or int(temp_rec[k3][1][7])==0:
continue
if int(temp_rec[k3][0][7])+int(temp_rec[k3][1][7])==0 and int(temp_rec[k3][0][7])<0:
temp_rec[k3]=[temp_rec[k3][1],temp_rec[k3][0]]
read_pos=[int(temp_rec[k3][0][2]),int(temp_rec[k3][0][2])+cigar2reaadlength(temp_rec[k3][0][4]),int(temp_rec[k3][1][2]),int(temp_rec[k3][1][2])+cigar2reaadlength(temp_rec[k3][1][4])]+Reads_Direction_Detect_flag(temp_rec[k3][0][0])
#print temp_rec[k3]
#if k3 in test2:
# print read_pos
if read_pos[0]>read_pos[2]:
read_pos=read_pos[2:4]+read_pos[:2]+[read_pos[-1],read_pos[-2]]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[6]==read_pos[7]==read_pos[8]==read_pos[9]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
block_rds[k1][read_pos[-1]]+=read_pos[3]-read_pos[2]
elif read_pos[8]==read_pos[9] and read_pos[6]==read_pos[7]:
Pair_ThroughBP[k1].append(read_pos[:6]+[read_pos[6],read_pos[8]])
else:
Double_Read_ThroughBP[k1].append(read_pos)
del temp_rec[k3]
#if k3 in test2:
# print read_pos
for k3 in list(total_rec.keys()):
if len(total_rec[k3])==1:
del_flag=0
k4=total_rec[k3][0]
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
mate_pos=[int(k4[6]),int(k4[6])+ReadLength]
if 'left' in k2b and mate_pos[1]<flank_region[0]:
del_flag+=1
elif 'right' in k2b and mate_pos[0]>flank_region[0]:
del_flag+=1
elif not mate_pos[1]<flank_region[0] and not mate_pos[0]>flank_region[1]:
del_flag+=1
if del_flag>0:
del total_rec[k3]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
else:
Single_Read_ThroughBP[k1].append(read_pos)
elif len(total_rec[k3])==2:
read_pos=[int(total_rec[k3][0][2]),int(total_rec[k3][0][2])+cigar2reaadlength(total_rec[k3][0][4]),int(total_rec[k3][1][2]),int(total_rec[k3][1][2])+cigar2reaadlength(total_rec[k3][1][4])]+Reads_Direction_Detect_flag(total_rec[k3][0][0])
#print read_pos
if read_pos[0]>read_pos[2]:
read_pos=read_pos[2:4]+read_pos[:2]+[read_pos[-1],read_pos[-2]]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[6]==read_pos[7]==read_pos[8]==read_pos[9]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
block_rds[k1][read_pos[-1]]+=read_pos[3]-read_pos[2]
elif read_pos[8]==read_pos[9] and read_pos[6]==read_pos[7]:
Pair_ThroughBP[k1].append(read_pos[:6]+[read_pos[6],read_pos[8]])
else:
Double_Read_ThroughBP[k1].append(read_pos)
del total_rec[k3]
#print total_rec
direction_penal=0
block_rd2={}
block_bp2=block_bps
for k1 in list(block_rds.keys()):
block_rd2[k1]={}
for k2 in list(block_rds[k1].keys()):
block_rd2[k1][k2]=0
for i2 in list(Pair_ThroughBP.keys()):
for i in Pair_ThroughBP[i2]:
if not i[4:6]==['+','-']:
direction_penal+=1
block_rd2[i2][i[6]]+=i[1]-i[0]
block_rd2[i2][i[7]]+=i[3]-i[2]
for i2 in list(Double_Read_ThroughBP.keys()):
for i in Double_Read_ThroughBP[i2]:
if i[6]==i[7]:
block_rd2[i2][i[6]]+=i[1]-i[0]
block_rd2[i2][i[8]]+=-i[2]+block_bp2[i2][i[8]][1]
block_rd2[i2][i[9]]+=i[3]-block_bp2[i2][i[9]][0]
#if -i[2]+block_bp2[i2][i[8]][1]>200 and i[8]=='a':
#print i
#if i[3]-block_bp2[i2][i[9]][0]>200 and i[9]=='a':
#print i
elif i[8]==i[9]:
block_rd2[i2][i[8]]+=i[3]-i[2]
block_rd2[i2][i[6]]+=-i[0]+block_bp2[i2][i[6]][1]
block_rd2[i2][i[7]]+=i[1]-block_bp2[i2][i[7]][0]
#if -i[0]+block_bp2[i2][i[6]][1]>101:
#print i
#if i[1]-block_bp2[i2][i[7]][0]>101:
#print i
else:
block_rd2[i2][i[6]]+=-i[0]+block_bp2[i2][i[6]][1]
block_rd2[i2][i[7]]+=i[1]-block_bp2[i2][i[7]][0]
block_rd2[i2][i[8]]+=-i[2]+block_bp2[i2][i[8]][1]
block_rd2[i2][i[9]]+=i[3]-block_bp2[i2][i[9]][0]
for i2 in list(Single_Read_ThroughBP.keys()):
for i in Single_Read_ThroughBP[i2]:
block_rd2[i2][i[2]]+=-i[0]+block_bp2[i2][i[2]][1]
block_rd2[i2][i[3]]+=i[1]-block_bp2[i2][i[3]][0]
for k1 in list(rd_low_qual.keys()):
for k2 in list(rd_low_qual[k1].keys()):
block_rds[k1][k2]+=rd_low_qual[k1][k2]
return [block_rds,block_rd2,Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP]
total_rd_calcu(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],letter_RD2,letter_GC,chr_letter_bp,block_rd2)
def block_RD_Calculate_2a(Initial_GCRD_Adj,original_bp_list,flank):
allele_BP=[0]+[flank+j-original_bp_list[0] for j in original_bp_list]+[2*flank+original_bp_list[-1]-original_bp_list[0]]
allele_Letter=['left']+[chr(97+i) for i in range(len(original_bp_list)-1)]
allele_RD=[]
for k in range(len(allele_Letter)):
length=allele_BP[k+1]-allele_BP[k]
block=[allele_BP[k],allele_BP[k+1]]
temp=[]
if not block[0]==block[0]/Window_Size*Window_Size:
blf=float((block[0]/Window_Size+1)*Window_Size-block[0])/Window_Size*Initial_GCRD_Adj[block[0]/Window_Size+1][3]
temp.append(blf)
for m in range(int(block[0]/Window_Size+2),int(block[1]/Window_Size+1)):
temp.append(Initial_GCRD_Adj[m][3])
if not block[1]==block[1]/Window_Size*Window_Size:
brf=float(block[1]-block[1]/Window_Size*Window_Size)/Window_Size*Initial_GCRD_Adj[block[1]/Window_Size+1][3]
temp.append(brf)
allele_RD.append(numpy.sum(temp)/length*Window_Size)
elif block[0]==block[0]/Window_Size*Window_Size:
for m in range(int(block[0]/Window_Size+1),int(block[1]/Window_Size+1)):
temp.append(Initial_GCRD_Adj[m][3])
if not block[1]==block[1]/Window_Size*Window_Size:
brf=float(block[1]-block[1]/Window_Size*Window_Size)/Window_Size*Initial_GCRD_Adj[block[1]/Window_Size+1][3]
temp.append(brf)
allele_RD.append(numpy.sum(temp)/length*Window_Size)
return allele_RD
def copy_num_estimate_calcu(GC_para_dict,BP_para_dict,bps2):
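# Rough per-block copy-number guess: CN ~ 2 * adjusted depth / GC_Mean_Coverage of
# the chromosome, forced to -1 when the depth drops below a tenth of the chromosome
# mean; blocks with CN > 4 are returned separately in Copy_num_Check, and the
# function returns ['error','error'] when a chromosome has no coverage statistics.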
chr_letter_bp=letter_rearrange(BP_para_dict['flank'],bps2)
Initial_GCRD_Adj_pre=letter_RD_ReadIn(letter_RD_test_calcu(chr_letter_bp))
global Initial_GCRD_Adj
Initial_GCRD_Adj={}
for k1 in list(Initial_GCRD_Adj_pre.keys()):
for k2 in list(Initial_GCRD_Adj_pre[k1].keys()): Initial_GCRD_Adj[k2]=Initial_GCRD_Adj_pre[k1][k2]
for key_chr in bps2:
if not key_chr[0] in GC_para_dict['GC_Mean_Coverage'].keys():
return ['error','error']
Initial_GCRD_Adj['left']=numpy.mean([GC_para_dict['GC_Mean_Coverage'][key_chr[0]] for key_chr in bps2])
for key_chr in bps2:
if not key_chr[0] in GC_para_dict['GC_Mean_Coverage'].keys(): return ['error','error']
Initial_GCRD_Adj['right']=numpy.mean([GC_para_dict['GC_Mean_Coverage'][key_chr[0]] for key_chr in bps2])
Copy_num_estimate={}
for i in list(Initial_GCRD_Adj.keys()):
if not i in ['left','right']:
Copy_num_estimate[i]=int(Initial_GCRD_Adj[i]*2/GC_para_dict['GC_Mean_Coverage'][Chr])
if Initial_GCRD_Adj[i]<float(GC_para_dict['GC_Mean_Coverage'][Chr])/10.0:
Copy_num_estimate[i]=-1
Copy_num_Check=[]
for CNE in list(Copy_num_estimate.keys()):
if Copy_num_estimate[CNE]>4:
Copy_num_Check.append(CNE)
return [Copy_num_estimate,Copy_num_Check]
def calcu_chr_letter_bp_left(bps2):
out={}
for i in bps2:
if not i[0] in list(out.keys()):
out[i[0]]={}
out[i[0]]['a']=[i[1]-1000,i[1]]
return out
def calcu_chr_letter_bp_right(bps2):
out={}
for i in bps2:
if not i[0] in list(out.keys()):
out[i[0]]={}
out[i[0]]['a']=[i[-1],i[-1]+1000]
return out
def c_Coverage_Calculate_InfoList(Full_Info,Chromo,bp_MP,letter_MP,original_bp_list,flank):
bp_M=[i-original_bp_list[0] for i in bp_MP[0]]
bp_P=[i-original_bp_list[0] for i in bp_MP[1]]
M_New_bp=[bp_M[0]-flank]+bp_M+[bp_M[-1]+flank]
P_New_bp=[bp_P[0]-flank]+bp_P+[bp_P[-1]+flank]
M_coverage=Block_Assign_To_Letters(bp_MP[0],letter_MP[0],flank)
P_coverage=Block_Assign_To_Letters(bp_MP[1],letter_MP[1],flank)
for key in list(M_coverage.keys()):
M_coverage[key].append(0)
for key in list(P_coverage.keys()):
P_coverage[key].append(0)
for key in list(Half_Info.keys()):
Half=Half_Info[key]
if Half[0]<-flank-Window_Size: continue
else:
if Half[-1]=='M':
M_coverage[(Half[0]-(M_New_bp[0]))/Window_Size+1][-1]+=1
elif Half[-1]=='P':
P_coverage[(Half[0]-(P_New_bp[0]))/Window_Size+1][-1]+=1
return [M_coverage,P_coverage]
def c_GCContent_Calculate_InfoList(Ori_1_Seq,original_bp_list,flank):
region_length=original_bp_list[-1]-original_bp_list[0]+2*flank
region_length_new=(region_length/100+1)*100-2*flank
Number_Of_Blocks=len(Ori_1_Seq)/100
GC_Content={}
for i in range(Number_Of_Blocks):
GC_Content[i+1]=GC_Content_Calculate(Ori_1_Seq[i*100:(i+1)*100])[0]
return GC_Content
def c_Coverage_Calculate_2a(Letter_Single,Letter_Double,Chromo,original_bp_list,original_letters,flank):
letter_list=original_letters
bp_list=[i-original_bp_list[0] for i in original_bp_list]
bp_list_new=[bp_list[0]-flank]+bp_list+[bp_list[-1]+flank]
coverage=Block_Assign_To_Letters(bp_list,letter_list,flank)
for key in list(coverage.keys()):
coverage[key].append(0)
for key in list(Letter_Single.keys()):
for i in Letter_Single[key]:
keynumL=int((i[0]+flank)/Window_Size)+1
keynumR=int((i[1]+flank)/Window_Size)+1
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
else:
coverage[keynumR][-1]+=1
for key in list(Letter_Double.keys()):
for i in Letter_Double[key]:
keynumL=int((i[0]+flank)/Window_Size)+1
keynumR=int((i[1]+flank)/Window_Size)+1
if keynumL in list(coverage.keys()) and keynumR in list(coverage.keys()):
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
else:
coverage[keynumR][-1]+=1
keynumL=(i[2]+flank)/Window_Size+1
keynumR=(i[3]+flank)/Window_Size+1
if keynumL in list(coverage.keys()) and keynumR in list(coverage.keys()):
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
else:
coverage[keynumR][-1]+=1
return coverage
def c_Coverage_Calculate_2b(Letter_Through,Chromo,original_bp_list,original_letters,flank):
#Example entry of RD_Full_Info_of_Reads (a dict): 'HWI-ST177_136:2:1:7920:85270': [1202, 1302, 1443, 1543, '+', '-']
letter_list=original_letters
bp_list=[i-original_bp_list[0] for i in original_bp_list] # 'bp_MP' was undefined in this function; original_bp_list assumed, as in c_Coverage_Calculate_2a
bp_list_new=[bp_list[0]-flank]+bp_list+[bp_list[-1]+flank]
coverage=Block_Assign_To_Letters(bp_list,letter_list,flank)
for key in list(coverage.keys()):
coverage[key].append(0)
for key in list(Letter_Through.keys()):
i=Letter_Through[key]
keynumL=(i[0]+flank)/Window_Size+1
keynumR=(i[1]+flank)/Window_Size+1
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
elif lenL<lenR:
coverage[keynumR][-1]+=1
elif lenL==lenR:
coverage[keynumL][-1]+=0.5
coverage[keynumR][-1]+=0.5
keynumL=(i[2]+flank)/Window_Size+1
keynumR=(i[3]+flank)/Window_Size+1
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
elif lenL<lenR:
coverage[keynumR][-1]+=1
elif lenL==lenR:
coverage[keynumL][-1]+=0.5
coverage[keynumR][-1]+=0.5
return coverage
def c_Coverage_Calculate_2d(Full_Info,Chromo,bp_MP,letter_MP,original_bp_list,flank):
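# Haplotype-resolved window coverage: each read half is added to the M or P track; entries of length 8
# carry a weight in field 7 and contribute fractionally, all other entries contribute a full count.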
#Example entry of RD_Full_Info_of_Reads (a dict): 'HWI-ST177_136:2:1:7920:85270': [1202, 1302, 1443, 1543, '+', '-']
bp_M=[i-original_bp_list[0] for i in bp_MP[0]]
bp_P=[i-original_bp_list[0] for i in bp_MP[1]]
M_New_bp=[bp_M[0]-flank]+bp_M+[bp_M[-1]+flank]
P_New_bp=[bp_P[0]-flank]+bp_P+[bp_P[-1]+flank]
M_coverage=Block_Assign_To_Letters(bp_MP[0],letter_MP[0],flank)
P_coverage=Block_Assign_To_Letters(bp_MP[1],letter_MP[1],flank)
for key in list(M_coverage.keys()):
M_coverage[key].append(0)
for key in list(P_coverage.keys()):
P_coverage[key].append(0)
for key in list(Full_Info.keys()):
if not len(Full_Info[key])==8:
Halfa=Full_Info[key][:2]+[Full_Info[key][4]]+[Full_Info[key][6]]
Halfb=Full_Info[key][2:4]+[Full_Info[key][5]]+[Full_Info[key][6]]
for Half in [Halfa,Halfb]:
if Half[0]<-flank-Window_Size: continue
else:
if Half[-1]=='M':
M_coverage[(Half[0]-(M_New_bp[0]))/Window_Size+1][-1]+=1
elif Half[-1]=='P':
P_coverage[(Half[0]-(P_New_bp[0]))/Window_Size+1][-1]+=1
elif len(Full_Info[key])==8:
Halfa=Full_Info[key][:2]+[Full_Info[key][4]]+[Full_Info[key][6]]
Halfb=Full_Info[key][2:4]+[Full_Info[key][5]]+[Full_Info[key][6]]
for Half in [Halfa,Halfb]:
if Half[0]<-flank-Window_Size: continue
else:
if Half[-1]=='M':
M_coverage[(Half[0]-(M_New_bp[0]))/Window_Size+1][-1]+=float(Full_Info[key][7])
elif Half[-1]=='P':
P_coverage[(Half[0]-(P_New_bp[0]))/Window_Size+1][-1]+=float(Full_Info[key][7])
return [M_coverage,P_coverage]
def c_Coverage_Calculate_2e(Af_Info,Chromo,bp_MP,letter_MP,original_bp_list,flank):
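# Per-letter, per-window coverage on each haplotype for reads spanning breakpoints: each read end is
# distributed proportionally across the windows of the letter block it falls into; entries whose key is
# not in Letter_Through (suffixed duplicates) are additionally scaled by the weight in field 7.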
#Example entry of RD_Full_Info_of_Reads (a dict): 'HWI-ST177_136:2:1:7920:85270': [1202, 1302, 1443, 1543, '+', '-']
hashM={}
for i in letter_MP[0]:
if not i[0] in list(hashM.keys()):
hashM[i[0]]=[i[0]]
if (letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))>1:
hashM[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))[1:]]
hashP={}
for i in letter_MP[1]:
if not i[0] in list(hashP.keys()):
hashP[i[0]]=[i[0]]
if (letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))>1:
hashP[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))[1:]]
hashMPLetterBP={}
hashMPLetterBP['M']={}
hashMPLetterBP['P']={}
for j in range(len(letter_MP[0])):
hashMPLetterBP['M'][hashM[letter_MP[0][j][0]][0]]=[bp_MP[0][j],bp_MP[0][j+1]]
hashM[letter_MP[0][j][0]].remove(hashM[letter_MP[0][j][0]][0])
for j in range(len(letter_MP[1])):
hashMPLetterBP['P'][hashP[letter_MP[1][j][0]][0]]=[bp_MP[1][j],bp_MP[1][j+1]]
hashP[letter_MP[1][j][0]].remove(hashP[letter_MP[1][j][0]][0])
hashM={}
hashM['left']=['left']
hashM['right']=['right']
for i in letter_MP[0]:
if not i[0] in list(hashM.keys()):
hashM[i[0]]=[i[0]]
if (letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))>1:
hashM[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))[1:]]
hashP={}
hashP['left']=['left']
hashP['right']=['right']
for i in letter_MP[1]:
if not i[0] in list(hashP.keys()):
hashP[i[0]]=[i[0]]
if (letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))>1:
hashP[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))[1:]]
M_Coverage={}
M_Coverage['left']=0
for key_1 in list(hashMPLetterBP['M'].keys()):
M_Coverage[key_1]=[0 for i in range((hashMPLetterBP['M'][key_1][1]-hashMPLetterBP['M'][key_1][0])/Window_Size)]
if ((hashMPLetterBP['M'][key_1][1]-hashMPLetterBP['M'][key_1][0])-(hashMPLetterBP['M'][key_1][1]-hashMPLetterBP['M'][key_1][0])/Window_Size*Window_Size)>30:
M_Coverage[key_1].append(0)
P_Coverage={}
P_Coverage['left']=0
for key_1 in list(hashMPLetterBP['P'].keys()):
P_Coverage[key_1]=[0 for i in range((hashMPLetterBP['P'][key_1][1]-hashMPLetterBP['P'][key_1][0])/Window_Size)]
if ((hashMPLetterBP['P'][key_1][1]-hashMPLetterBP['P'][key_1][0])-(hashMPLetterBP['P'][key_1][1]-hashMPLetterBP['P'][key_1][0])/Window_Size*Window_Size)>30:
P_Coverage[key_1].append(0)
for key in list(Af_Info.keys()):
if Af_Info[key][0]==Af_Info[key][1]==Af_Info[key][2]==Af_Info[key][3]==(-flank/2):
M_Coverage['left']+=0.5
P_Coverage['left']+=0.5
else:
if key in list(Letter_Through.keys()):
if Af_Info[key][6]=='M':
lele=hashM[Letter_Through[key][6]]
rile=hashM[Letter_Through[key][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['M'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
M_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])
if lebl[1]/Window_Size<len(M_Coverage[lele1]):
M_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['M'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
M_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])
if ribl[1]/Window_Size<len(M_Coverage[rile1]):
M_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])
if Af_Info[key][6]=='P':
lele=hashP[Letter_Through[key][6]]
rile=hashP[Letter_Through[key][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['P'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
P_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])
if lebl[1]/Window_Size<len(P_Coverage[lele1]):
P_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['P'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
P_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])
if ribl[1]/Window_Size<len(P_Coverage[rile1]):
P_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])
if not key in list(Letter_Through.keys()):
key2='_'.join(key.split('_')[:-1])
if Af_Info[key][6]=='M':
lele=hashM[Letter_Through[key2][6]]
rile=hashM[Letter_Through[key2][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['M'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
M_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
if lebl[1]/Window_Size<len(M_Coverage[lele1]):
M_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['M'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
M_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
if ribl[1]/Window_Size<len(M_Coverage[rile1]):
M_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
if Af_Info[key][6]=='P':
lele=hashP[Letter_Through[key2][6]]
rile=hashP[Letter_Through[key2][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['P'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
P_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
if lebl[1]/Window_Size<len(P_Coverage[lele1]):
P_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['P'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
P_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
if ribl[1]/Window_Size<len(P_Coverage[rile1]):
P_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
return [M_Coverage,P_Coverage]
def candidate_QC_Control(Read_List):
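# QC filter for candidate read placements: keeps pairs whose two ends each span between 0 and
# ReadLength+min_resolution, prefers properly oriented ('+','-') pairs, and attaches (or selects the
# placement maximising) the insert-length likelihood from pdf_calculate.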
if Read_List==[]:
return []
else:
Qual_Filter_1=[]
for j in Read_List:
if not j[1]-j[0]>ReadLength+min_resolution and j[1]-j[0]>0 and not j[3]-j[2]>ReadLength+min_resolution and j[3]-j[2]>0:
Qual_Filter_1.append(j)
if not Qual_Filter_1==[]:
if len(Qual_Filter_1)==1:
Qual_Filter_1[0]+=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_1]
return Qual_Filter_1
else:
Qual_Filter_2=[]
for j2 in Qual_Filter_1:
if j2[-2:]==['+','-']:
Qual_Filter_2.append(j2)
if not Qual_Filter_2==[]:
if len(Qual_Filter_2)==1:
Qual_Filter_2[0]+=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
return Qual_Filter_2
else:
Qual_Filter_3=[]
Qual_IL=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_1] # Qual_Filter_2 is empty on this branch; Qual_Filter_1 assumed, matching the Qual_Filter_1[jq] lookups below
for jq in range(len(Qual_IL)):
if Qual_IL[jq]==max(Qual_IL) and not Qual_Filter_1[jq] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_1[jq]+[max(Qual_IL)])
return Qual_Filter_3
else:
Qual_Filter_2=Qual_Filter_1
if len(Qual_Filter_2)==1:
Qual_Filter_2[0]+=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
return Qual_Filter_2
else:
Qual_Filter_3=[]
Qual_IL=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
for jq in range(len(Qual_IL)):
if Qual_IL[jq]==max(Qual_IL) and not Qual_Filter_1[jq] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_1[jq]+[max(Qual_IL)])
return Qual_Filter_3
else:
return []
def candidate_QC_Control2(M_Read_List,P_Read_List):
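# Merges maternal and paternal candidate placements (tagged 'm'/'p'), keeps properly oriented pairs when
# any exist, and returns the placements with the highest insert-length likelihood.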
Qual_Filter_1=[]
for i in M_Read_List:
Qual_Filter_1.append(i+['m'])
for i in P_Read_List:
Qual_Filter_1.append(i+['p'])
Qual_Filter_2=[]
for i in Qual_Filter_1:
if i[-4:-2]==['+','-']:
Qual_Filter_2.append(i)
if not Qual_Filter_2==[]:
Qual_Filter_3=[]
IL_Qual=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
for j in range(len(IL_Qual)):
if IL_Qual[j]==max(IL_Qual) and not Qual_Filter_2[j] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_2[j])
else:
Qual_Filter_2=Qual_Filter_1
Qual_Filter_3=[]
IL_Qual=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
for j in range(len(IL_Qual)):
if IL_Qual[j]==max(IL_Qual) and not Qual_Filter_2[j] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_2[j])
return Qual_Filter_3
def Cov_Cal_Block(pos,bp,cov,perc):
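# Distributes the (perc-weighted) length of interval 'pos' across the breakpoint intervals in 'bp',
# accumulating weighted base counts into the corresponding slots of 'cov'.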
for j in range(len(bp)-2):
if not pos[0]<bp[j] and pos[0]<bp[j+1]:
if not pos[1]<bp[j] and pos[1]<bp[j+1]:
cov[j]+=(pos[1]-pos[0])*perc
elif not pos[1]<bp[j+1] and pos[1]<bp[j+2]:
cov[j]+=(bp[j+1]-pos[0])*perc
cov[j+1]+=(pos[1]-bp[j+1])*perc
elif not pos[1]<bp[j+2] and pos[1]<bp[j+3]: # 'temp_bp[0]' was undefined in this function; 'bp' assumed
cov[j]+=(bp[j+1]-pos[0])*perc
cov[j+1]+=(bp[j+2]-bp[j+1])*perc
cov[j+2]+=(pos[1]-bp[j+2])*perc
j=len(bp)-2
if not pos[0]<bp[j] and pos[0]<bp[j+1]:
if not pos[1]<bp[j] and pos[1]<bp[j+1]:
cov[j]+=(pos[1]-pos[0])*perc
else:
cov[j]+=(bp[j+1]-pos[0])*perc
def Define_Default_SVPredict():
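# Declares the global defaults for the SV-predict step and overrides them from the command-line options
# in dict_opts (ploidy, mapping-quality cutoff, null model, iteration number, weights, etc.).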
global tolerance_bp
tolerance_bp=10
global min_resolution
min_resolution=70
global Best_IL_Score
Best_IL_Score=0
global Best_RD_Score
Best_RD_Score=0
global deterministic_flag
deterministic_flag=0
if '--deterministic-flag' in list(dict_opts.keys()):
deterministic_flag=int(dict_opts['--deterministic-flag'])
global Penalty_For_InsertLengthZero
Penalty_For_InsertLengthZero=-20 # placeholder value, to be decided later
if not '/' in dict_opts['--bp-file']:
dict_opts['--bp-file']='./'+dict_opts['--bp-file']
global model_comp
if not '--null-model' in list(dict_opts.keys()):
model_comp='C'
else:
if dict_opts['--null-model'] in ['S','Simple']:
model_comp='S'
else:
model_comp='C'
global Ploidy
if '--ploidy' in list(dict_opts.keys()):
Ploidy=int(dict_opts['--ploidy'])
else:
Ploidy=2
global QCAlign
if '--qc-align' in list(dict_opts.keys()):
QCAlign=int(dict_opts['--qc-align'])
else:
QCAlign=20
global genome_name
if '--NullGenomeName' in list(dict_opts.keys()):
genome_name=dict_opts['--NullGenomeName']
else:
genome_name='genome'
global Trail_Number
if '--num-iteration' in list(dict_opts.keys()):
Trail_Number=int(dict_opts['--num-iteration'])
else:
Trail_Number=100000
global Local_Minumum_Number
Local_Minumum_Number=100
global IL_Weight
global DR_Weight
global TB_Weight
IL_Weight=1
DR_Weight=5
TB_Weight=5
def Full_Info_of_Reads_Product(Initial_Bam,bps,total_bps,total_letters,bamChr,flank,QCAlign,ReadLength,chr_link):
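# Streams reads over the breakpoint region with 'samtools view', applies flag/quality filters, and
# classifies each read or pair as: contained in one block (BlockCov), a pair spanning breakpoints
# (Pair_ThroughBP), a pair with a read crossing a breakpoint (Double_Read_ThroughBP), or a single read
# crossing a breakpoint (Single_Read_ThroughBP); direction and insert-length penalties are accumulated
# along the way.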
# letters=[chr(97+i) for i in range(len(bps)-1)]
temp_bp=total_bps
temp_let=total_letters
BlockCov={}
for j in temp_let:
BlockCov[j]=0
Letter_Double={}
Pair_ThroughBP=[]
Double_Read_ThroughBP=[]
Single_Read_ThroughBP=[]
blackList=[]
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,bamChr,bps[0]-flank,bps[-1]+flank))
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
if not int(pbam[4])>QCAlign:
continue
if pbam[0] in blackList: continue
if int(pbam[1])&8>0 or not pbam[6]=='=':
pos1=int(pbam[3])+low_qual_edge
pos2=int(pbam[3])+cigar2reaadlength(pbam[5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(pbam[5])
else:
rela_1=pos1-low_qual_edge-temp_bp[temp_let.index(block1)]
rela_2=pos2+low_qual_edge-temp_bp[temp_let.index(block2)]
Single_Read_ThroughBP.append([block1,rela_1,block2,rela_2,pbam[5]])
if not pbam[6]=='=':
if not pbam[0] in list(chr_link.keys()):
chr_link[pbam[0]]=[pbam[1:9]]
else:
chr_link[pbam[0]]+=[pbam[1:9]]
elif int(pbam[1])&8==0:
if pbam[6]=='=':
if not pbam[0] in list(Letter_Double.keys()):
Letter_Double[pbam[0]]=[pbam[:9]]
else:
if not pbam[:9] in Letter_Double[pbam[0]]:
Letter_Double[pbam[0]]+=[pbam[:9]]
if int(Letter_Double[pbam[0]][0][3])<int(Letter_Double[pbam[0]][1][3]):
pos1=int(Letter_Double[pbam[0]][0][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][1][3])+cigar2reaadlength(Letter_Double[pbam[0]][1][5])-low_qual_edge
else:
pos1=int(Letter_Double[pbam[0]][1][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][0][3])+cigar2reaadlength(Letter_Double[pbam[0]][0][5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(Letter_Double[pbam[0]][0][5])
BlockCov[block1]+=cigar2reaadlength(Letter_Double[pbam[0]][1][5])
del Letter_Double[pbam[0]]
blackList.append(pbam[0])
fbam.close()
for key in list(Letter_Double.keys()):
if key in blackList:
del Letter_Double[key]
continue
if len(Letter_Double[key])==2:
pos1=int(Letter_Double[key][0][3])
pos2=int(Letter_Double[key][1][3])
if not pos1>pos2:
pos1=int(Letter_Double[key][0][3])
pos1b=pos1+cigar2reaadlength(Letter_Double[key][0][5])
pos2=int(Letter_Double[key][1][3])
pos2b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][0][1])
elif pos1>pos2:
pos1=int(Letter_Double[key][1][3])
pos1b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
pos2=int(Letter_Double[key][0][3])
pos2b=pos1+cigar2reaadlength(Letter_Double[key][0][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][1][1])
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1+low_qual_edge)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2+low_qual_edge)
block1b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1b-low_qual_edge)
block2b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2b-low_qual_edge)
rela_1=pos1-temp_bp[temp_let.index(block1)]
rela_2=pos2-temp_bp[temp_let.index(block2)]
rela_1b=pos1b-temp_bp[temp_let.index(block1b)]
rela_2b=pos2b-temp_bp[temp_let.index(block2b)]
if block1==block1b and block2==block2b:
Pair_ThroughBP.append([block1,rela_1,rela_1b, block2,rela_2,rela_2b]+direct_temp)
else:
Double_Read_ThroughBP.append([block1,rela_1,block1b,rela_1b, block2,rela_2,block2b,rela_2b]+direct_temp)
del Letter_Double[key]
elif len(Letter_Double[key])==1:
if Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][7]))==0:
if Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))==Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3])+cigar2reaadlength(Letter_Double[key][0][5])):
BlockCov[Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))]+=cigar2reaadlength(Letter_Double[key][0][5])
del Letter_Double[key]
Initial_DR_Penal=0
for j in Pair_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Double_Read_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
Initial_Cov={}
for j in temp_let:
Initial_Cov[j]=0
for j in Pair_ThroughBP:
Initial_Cov[j[0]]+=j[2]-j[1]
Initial_Cov[j[3]]+=j[5]-j[4]
for j in Single_Read_ThroughBP:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
for j in Double_Read_ThroughBP:
if j[0]==j[2]:
Initial_Cov[j[0]]+=j[3]-j[1]
else:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
if j[4]==j[6]:
Initial_Cov[j[4]]+=j[7]-j[5]
else:
Initial_Cov[j[4]]+=temp_bp[temp_let.index(j[4])+1]-temp_bp[temp_let.index(j[4])]-j[5]
Initial_Cov[j[6]]+=j[7]
Initial_IL=[]
for j in Pair_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[3])]-temp_bp[temp_let.index(j[0])]-j[1]+j[5])
for j in Double_Read_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[6])]-temp_bp[temp_let.index(j[0])]-j[1]+j[7])
Initial_ILPenal=[]
for j in Initial_IL:
Initial_ILPenal+=[pdf_calculate(j,GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero)/len(Initial_IL)]
return [Initial_DR_Penal,Initial_ILPenal,Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,BlockCov,Initial_Cov,Letter_Double]
def Full_Info_of_Reads_Product_3(Initial_Bam,temp_bp,temp_let,bamChr,target_region,Chr_Link):
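# Variant of Full_Info_of_Reads_Product for a target region: same read classification, but it returns
# only the through-breakpoint read lists, the number of reads seen and the direction penalty.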
Letter_Double={}
Pair_ThroughBP=[]
Double_Read_ThroughBP=[]
Single_Read_ThroughBP=[]
blackList=[]
# BlockCov, Initial_Cov and Initial_IL are used below but were never initialized in this function; assumed to start empty/zero
BlockCov=dict.fromkeys(temp_let,0)
Initial_Cov=dict.fromkeys(temp_let,0)
Initial_IL=[]
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,bamChr,target_region[0]-flank,target_region[-1]+flank))
num_of_reads=0
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if not int(pbam[4])>QCAlign or int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
if pbam[0] in blackList: continue
num_of_reads+=1
if int(pbam[1])&8>0 or not pbam[6]=='=':
pos1=int(pbam[3])+low_qual_edge
pos2=int(pbam[3])+cigar2reaadlength(pbam[5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(pbam[5])
else:
reg1a=temp_bp[temp_let.index(block1)]
reg1b=temp_bp[temp_let.index(block1)+1]
reg2a=temp_bp[temp_let.index(block2)]
reg2b=temp_bp[temp_let.index(block2)+1]
rela_1=pos1-low_qual_edge-temp_bp[temp_let.index(block1)]
rela_2=pos2+low_qual_edge-temp_bp[temp_let.index(block2)]
Single_Read_ThroughBP.append([block1,rela_1,block2,rela_2,pbam[5]])
if not pbam[6]=='=':
if not pbam[0] in Chr_Link:
Chr_Link[pbam[0]]=[pbam[1:9]]
else:
Chr_Link[pbam[0]]+=[pbam[1:9]]
elif int(pbam[1])&8==0:
if pbam[6]=='=':
if not pbam[0] in list(Letter_Double.keys()):
Letter_Double[pbam[0]]=[pbam[:9]]
else:
if not pbam[:9] in Letter_Double[pbam[0]]:
Letter_Double[pbam[0]]+=[pbam[:9]]
if int(Letter_Double[pbam[0]][0][3])<int(Letter_Double[pbam[0]][1][3]):
pos1=int(Letter_Double[pbam[0]][0][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][1][3])+cigar2reaadlength(Letter_Double[pbam[0]][1][5])-low_qual_edge
else:
pos1=int(Letter_Double[pbam[0]][1][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][0][3])+cigar2reaadlength(Letter_Double[pbam[0]][0][5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(Letter_Double[pbam[0]][0][5])
del Letter_Double[pbam[0]]
blackList.append(pbam[0])
fbam.close()
for key in list(Letter_Double.keys()):
if key in blackList:
del Letter_Double[key]
continue
if len(Letter_Double[key])==2:
pos1=int(Letter_Double[key][0][3])
pos2=int(Letter_Double[key][1][3])
if not pos1>pos2:
pos1=int(Letter_Double[key][0][3])
pos1b=pos1+cigar2reaadlength(Letter_Double[key][0][5])
pos2=int(Letter_Double[key][1][3])
pos2b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][0][1])
elif pos1>pos2:
pos1=int(Letter_Double[key][1][3])
pos1b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
pos2=int(Letter_Double[key][0][3])
pos2b=pos1+cigar2reaadlength(Letter_Double[key][0][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][1][1])
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1+low_qual_edge)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2+low_qual_edge)
block1b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1b-low_qual_edge)
block2b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2b-low_qual_edge)
rela_1=pos1-temp_bp[temp_let.index(block1)]
rela_2=pos2-temp_bp[temp_let.index(block2)]
rela_1b=pos1b-temp_bp[temp_let.index(block1b)]
rela_2b=pos2b-temp_bp[temp_let.index(block2b)]
if block1==block1b and block2==block2b:
Pair_ThroughBP.append([block1,rela_1,rela_1b, block2,rela_2,rela_2b]+direct_temp)
else:
Double_Read_ThroughBP.append([block1,rela_1,block1b,rela_1b, block2,rela_2,block2b,rela_2b]+direct_temp)
del Letter_Double[key]
elif len(Letter_Double[key])==1:
if Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))==Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3])+cigar2reaadlength(Letter_Double[key][0][5])):
BlockCov[Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))]+=cigar2reaadlength(Letter_Double[key][0][5]) # duplicated 'flank' argument removed
del Letter_Double[key]
Initial_DR_Penal=0
for j in Pair_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Double_Read_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Pair_ThroughBP:
Initial_Cov[j[0]]+=j[2]-j[1]
Initial_Cov[j[3]]+=j[5]-j[4]
for j in Single_Read_ThroughBP:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
for j in Double_Read_ThroughBP:
if j[0]==j[2]:
Initial_Cov[j[0]]+=j[3]-j[1]
else:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
if j[4]==j[6]:
Initial_Cov[j[4]]+=j[7]-j[5]
else:
Initial_Cov[j[4]]+=temp_bp[temp_let.index(j[4])+1]-temp_bp[temp_let.index(j[4])]-j[5]
Initial_Cov[j[6]]+=j[7]
for j in Pair_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[3])]-temp_bp[temp_let.index(j[0])]-j[1]+j[5])
for j in Double_Read_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[6])]-temp_bp[temp_let.index(j[0])]-j[1]+j[7])
return [Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,num_of_reads,Initial_DR_Penal]
def Full_Info_of_Reads_Integrate(GC_para_dict,BP_para_dict,bps2):
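# Collects everything the scoring step needs for one breakpoint cluster: per-letter GC content and read
# depth (plus 5 kb flanking controls), through-breakpoint read pairs, and the initial read-depth,
# direction and insert-length penalties.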
bps2_left=[]
bps2_right=[]
for x in bps2:
bps2_left.append([x[0],x[1]-5000,x[1]])
bps2_right.append([x[0],x[-1],x[-1]+5000])
chr_letter_bp=letter_rearrange(BP_para_dict['flank'],bps2)
letter_GC=letter_GC_ReadIn(chr_letter_bp)
letter_RD_test=letter_RD_ReadIn(letter_RD_test_calcu(chr_letter_bp))
if len(bps2)==1 and len(bps2[0])==3 and letter_RD_test[bps2[0][0]]['a']>GC_para_dict['GC_Overall_Median_Coverage'][bps2[0][0]]*4:
return [letter_RD_test[bps2[0][0]],letter_RD_test[bps2[0][0]],0,0,[],[],[],letter_GC[bps2[0][0]]]+original_bp_let_produce(chr_letter_bp,bps2)
letter_RD=letter_RD_ReadIn(chr_letter_bp)
Multi_Dup=multi_dup_define(letter_RD,GC_para_dict['GC_Overall_Median_Coverage'])
global letter_RD_left_control
letter_RD_left_control=letter_RD_ReadIn(letter_rearrange(BP_para_dict['flank'],bps2_left))
global letter_RD_right_control
letter_RD_right_control=letter_RD_ReadIn(letter_rearrange(BP_para_dict['flank'],bps2_right))
letter_range_report(BP_para_dict['flank'],chr_letter_bp)
blocks_read_in=block_Read_From_Bam(chr_letter_bp)
read_info=block_Info_ReadIn(GC_para_dict,BP_para_dict,chr_letter_bp,blocks_read_in,Multi_Dup)
block_rds=read_info[0]
block_rd2=read_info[1]
letter_RD2={}
for k1 in list(letter_RD.keys()):
for k2 in list(letter_RD[k1].keys()):
if k2 in Multi_Dup:
letter_RD2[k2]=letter_RD[k1][k2]
if not k1 in list(block_rd2.keys()):
block_rd2[k1]={}
if not k2 in list(block_rd2[k1].keys()):
block_rd2[k1][k2]=0
else:
if len(chr_letter_bp[k1][k2])==4:
letter_RD2[k2]=letter_RD[k1][k2]*(chr_letter_bp[k1][k2][2]-chr_letter_bp[k1][k2][1])/(chr_letter_bp[k1][k2][3]-chr_letter_bp[k1][k2][0])
else:
letter_RD2[k2]=letter_RD[k1][k2]
for k1 in list(block_rds.keys()):
for k2 in list(block_rds[k1].keys()):
if not k2 in ['left','right']:
if not chr_letter_bp[k1][k2][-1]==chr_letter_bp[k1][k2][0]:
letter_RD2[k2]+=float(block_rds[k1][k2])/float(chr_letter_bp[k1][k2][-1]-chr_letter_bp[k1][k2][0])
Pair_ThroughBP=rela_Pair_ThroughBP(chr_letter_bp,read_info[2])
Double_Read_ThroughBP=rela_Pair_Double_Read_ThroughBP(chr_letter_bp,read_info[3])
Single_Read_ThroughBP=read_Pair_Single_Read_ThroughBP(chr_letter_bp,read_info[4])
Initial_RD=total_rd_calcu(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],letter_RD2,letter_GC,chr_letter_bp,block_rd2)
DR_Penal=DR_Penal_Calcu(read_info)
IL_Penal=IL_Penal_Calcu(read_info,GC_para_dict['IL_Statistics'],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero)
letter_GC_out={}
for k1 in list(letter_GC.keys()):
for k2 in list(letter_GC[k1].keys()):
letter_GC_out[k2]=letter_GC[k1][k2]
return [letter_RD2,Initial_RD,DR_Penal,numpy.mean(IL_Penal),Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,letter_GC_out]+original_bp_let_produce(chr_letter_bp,bps2)
def global_para_declaration():
global chrom_N
global chrom_X
global chrom_Y
global workdir
workdir=path_modify(dict_opts['--workdir'])
global bp_txt_Path
global BPPath
global NullPath
global ref_path
global ref_file
global ref_index
global ref_ppre
global ref_prefix
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
ref_ppre=ref_path
ref_prefix='.'.join(ref_file.split('.')[:-1])
global GC_hash
GC_hash=GC_Index_Readin(ref_prefix+'.GC_Content')
def letter_GC_ReadIn(chr_letter_bp):
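# Looks up pre-computed 100 bp GC-content values (from the '.GC_Content' index) overlapping each letter
# block and returns their mean per block; returns 'error' if the index is missing or a block stays empty.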
block_GC_temp={}
filein=ref_prefix+'.GC_Content'
block_range={}
GC_hash_temp={}
test_flag=0
for i in list(chr_letter_bp.keys()):
if not os.path.isfile(filein): test_flag+=1
if test_flag==0:
for i in list(chr_letter_bp.keys()):
GC_hash_temp[i]={}
block_range[i]=[]
for j in list(chr_letter_bp[i].keys()): block_range[i]+=chr_letter_bp[i][j]
block_range[i]=[min(block_range[i]),max(block_range[i])]
for xa in list(GC_hash[i].keys()):
for xb in list(GC_hash[i][xa].keys()):
if not xb<block_range[i][0] and not xa>block_range[i][1]: GC_hash_temp[i][str(xa)+'-'+str(xb)]=GC_hash[i][xa][xb]
for k1 in list(chr_letter_bp.keys()):
block_GC_temp[k1]={}
for k2 in list(GC_hash_temp[k1].keys()):
bl2=[int(k2.split('-')[0]),int(k2.split('-')[1])]
for k3 in list(chr_letter_bp[k1].keys()):
if min(chr_letter_bp[k1][k3])>bl2[0]-1 and max(chr_letter_bp[k1][k3])<bl2[1]+1: block_GC_temp[k1][k3]=GC_hash_temp[k1][k2][int((min(chr_letter_bp[k1][k3])-bl2[0])/100):int((max(chr_letter_bp[k1][k3])-bl2[0])/100)+1]
elif min(chr_letter_bp[k1][k3])>bl2[0]-1 and max(chr_letter_bp[k1][k3])>bl2[1]:
if not k3 in list(block_GC_temp[k1].keys()): block_GC_temp[k1][k3]=GC_hash_temp[k1][k2][int((min(chr_letter_bp[k1][k3])-bl2[0])/100):]
else: block_GC_temp[k1][k3]+=GC_hash_temp[k1][k2][int((min(chr_letter_bp[k1][k3])-bl2[0])/100):]
elif min(chr_letter_bp[k1][k3])<bl2[0] and max(chr_letter_bp[k1][k3])>bl2[0]-1:
if not k3 in list(block_GC_temp[k1].keys()): block_GC_temp[k1][k3]=GC_hash_temp[k1][k2][:int((max(chr_letter_bp[k1][k3])-bl2[0])/100+1)]
else: block_GC_temp[k1][k3]+=GC_hash_temp[k1][k2][:int((max(chr_letter_bp[k1][k3])-bl2[0])/100+1)]
elif min(chr_letter_bp[k1][k3])<bl2[0]+1 and max(chr_letter_bp[k1][k3])>bl2[1]-1:
if not k3 in list(block_GC_temp[k1].keys()): block_GC_temp[k1][k3]=GC_hash_temp[k1][k2]
else: block_GC_temp[k1][k3]+=GC_hash_temp[k1][k2]
for k1 in list(block_GC_temp.keys()):
for k2 in list(block_GC_temp[k1].keys()):
if not block_GC_temp[k1][k2]==[]: block_GC_temp[k1][k2]=numpy.mean([float(k3) for k3 in block_GC_temp[k1][k2]])
else: return 'error'
return block_GC_temp
else: return 'error'
def letter_RD_ReadIn(chr_letter_bp):
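# Reads per-window read-depth values from the per-chromosome '.RD.index' files for each letter block and
# returns the outlier-trimmed mean depth per block; returns 'error' if an index file is missing.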
test_flag=0
for k1 in list(chr_letter_bp.keys()):
filein=NullPath+'RD_Stat/'+BamN+'.'+k1+'.RD.index'
if not os.path.isfile(filein): test_flag+=1
if test_flag==0:
out={}
RD_hash={}
block_range={}
for i in list(chr_letter_bp.keys()):
RD_hash[i]={}
out[i]={}
block_range[i]=[]
for j in list(chr_letter_bp[i].keys()): block_range[i]+=chr_letter_bp[i][j]
block_range[i]=[min(block_range[i]),max(block_range[i])]
for k1 in list(chr_letter_bp.keys()):
filein=NullPath+'RD_Stat/'+BamN+'.'+k1+'.RD.index'
fin=open(filein)
while True:
pin=fin.readline().strip().split()
if not pin: break
pin2=fin.readline().strip().split()
bl2=[int(pin[0].split(':')[1].split('-')[0]),int(pin[0].split(':')[1].split('-')[1])]
if not bl2[1]<block_range[k1][0]+1 and not bl2[0]>block_range[k1][1]-1:
RD_hash[k1][str(bl2[0])+'-'+str(bl2[1])]=pin2
fin.close()
for k1 in list(chr_letter_bp.keys()):
for k2 in list(RD_hash[k1].keys()):
bl2=[int(k2.split('-')[0]),int(k2.split('-')[1])]
for j in sorted(chr_letter_bp[k1].keys()):
if not j in list(out[k1].keys()): out[k1][j]=[]
if len(chr_letter_bp[k1][j])==4:
bl1=chr_letter_bp[k1][j][1:-1]
if bl1[0]>bl2[0]-1 and bl1[1]<bl2[1]+1: out[k1][j]+=RD_hash[k1][k2][int((bl1[0]-bl2[0])/Window_Size):int((bl1[1]-bl2[0])/Window_Size)+1]
elif bl1[0]>bl2[0]-1 and bl1[1]>bl2[1]: out[k1][j]+=RD_hash[k1][k2][int((bl1[0]-bl2[0])/Window_Size):]
elif bl1[0]<bl2[0] and bl1[1]<bl2[1]+1: out[k1][j]+=RD_hash[k1][k2][:int((bl1[1]-bl2[0])/Window_Size)+1]
elif bl1[0]<bl2[0] and bl1[1]>bl2[1]: out[k1][j]+=RD_hash[k1][k2]
for k1 in list(out.keys()):
for k2 in list(out[k1].keys()):
if out[k1][k2]==[]: out[k1][k2]=0
else:
rd_test_tmp=reject_outliers([float(k3) for k3 in out[k1][k2]],10)
if rd_test_tmp==[]: rd_test_tmp= [float(k3) for k3 in out[k1][k2]]
out[k1][k2]=numpy.mean(rd_test_tmp)
return out
else: return 'error'
def reject_outliers(data, m):
out=[]
mean_1=numpy.mean(data)
median_1=numpy.median(data)
if len(data)>100:out=[i for i in data if i/median_1<m and i/median_1>1/m] # iterating 'out' (still empty) here would always return []; 'data' assumed
else:
for i in range(len(data)):
tmp=[data[j] for j in range(len(data)) if not j==i]
mean_2=numpy.mean(tmp)
if mean_2/mean_1 <m and mean_2/mean_1 >1/m: out.append(data[i])
return out
def letters_bps_produce(letters,bps,flank):
letters_bps={}
letters_relative_bps={}
letters_bps['left']=[bps[0]-flank,bps[0]]
letters_relative_bps['left']=[-flank,0]
for i in range(len(bps)-1):
letters_relative_bps[letters[i]]=[bps[i]-bps[0],bps[i+1]-bps[0]]
letters_bps[letters[i]]=[bps[i],bps[i+1]]
letters_bps['right']=[bps[-1],bps[-1]+flank]
letters_relative_bps['right']=[bps[-1]-bps[0],bps[-1]-bps[0]+flank]
return [letters_bps,letters_relative_bps]
def letter_rearrange(flank,bps2):
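# Assigns a letter (a, b, c, ...) to every interval between consecutive breakpoints; intervals longer
# than 10*flank are represented only by their flank-sized edges.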
chr_letter_bp={}
let_start=96
for i in bps2:
if not i[0] in list(chr_letter_bp.keys()):
chr_letter_bp[i[0]]={}
for j in range(len(i))[1:-1]:
chr_letter_bp[i[0]][chr(let_start+j)]=[]
if int(i[j+1])-int(i[j])<10*flank:
chr_letter_bp[i[0]][chr(let_start+j)]+=[int(i[j]),int(i[j+1])]
else:
chr_letter_bp[i[0]][chr(let_start+j)]+=[int(i[j]),int(i[j])+flank,int(i[j+1])-flank,int(i[j+1])]
let_start+=len(i)-2
return chr_letter_bp
def letter_RD_test_calcu(chr_letter_bp):
out={}
for x in list(chr_letter_bp.keys()):
out[x]={}
for y in list(chr_letter_bp[x].keys()):
if not y in ['left','right']:
if len(chr_letter_bp[x][y])==2:
out[x][y]=[chr_letter_bp[x][y][0]-500]+chr_letter_bp[x][y]+[chr_letter_bp[x][y][1]+500]
else:
out[x][y]=chr_letter_bp[x][y]
return out
def LetterList_Rearrange(Letter_List,Command,BP_List_origin):
if Command[-1]=='del' or Command[-1]=='delete':
return BPList_Delete_Letter(Letter_List,Command)
elif Command[-1]=='inv' or Command[-1]=='invert':
return BPList_Invert_Letter(Letter_List,Command)
elif Command[-1]=='ins' or Command[-1]=='insert':
return BPList_Insert_Letter(Letter_List,Command)
elif Command[-1]=='copy+paste' or Command[-1]=='CopyPaste':
return BPList_CopyPaste_Letter(Letter_List,Command)
elif Command[-1]=='cut+paste' or Command[-1]=='CutPaste':
return BPList_CutPaste_Letter(Letter_List,Command)
elif Command[-1]=='x' or Command[-1]=='X':
return BPList_X_Letter(Letter_List,Command)
def RD_Index_ReadIn(ppre_Path,BamN, chromo, region):
if not ppre_Path[-1]=='/':
ppre_Path+='/'
path_in=NullPath+'RD_Stat/'
file_in=BamN+'.'+chromo+'.RD.index'
fin=open(path_in+file_in)
pos1=int(region[0])
pos2=int(region[1])
while True:
pin1=fin.readline().strip().split()
if not pin1: break
pin2=fin.readline().strip().split()
reg1=int(pin1[0].split(':')[1].split('-')[0])
reg2=int(pin1[0].split(':')[1].split('-')[1])
if not pos1<reg1 and not pos2>reg2:
break
def read_Pair_Single_Read_ThroughBP(chr_letter_bp,Single_Read_ThroughBP):
out=[]
for k1 in list(Single_Read_ThroughBP.keys()):
for k2 in Single_Read_ThroughBP[k1]:
rela=[k2[2],k2[0]-chr_letter_bp[k1][k2[2]][0],
k2[3],k2[1]-chr_letter_bp[k1][k2[3]][0]]
out.append(rela)
return out
def rela_Pair_ThroughBP(chr_letter_bp,Pair_ThroughBP):
out=[]
for k1 in list(Pair_ThroughBP.keys()):
for k2 in Pair_ThroughBP[k1]:
rela=[k2[6],k2[0]-chr_letter_bp[k1][k2[6]][0],
k2[1]-chr_letter_bp[k1][k2[6]][0],
k2[7],k2[2]-chr_letter_bp[k1][k2[7]][0],
k2[3]-chr_letter_bp[k1][k2[7]][0],k2[4],k2[5]]
out.append(rela)
return out
def rela_Pair_Double_Read_ThroughBP(chr_letter_bp,Double_Read_ThroughBP):
out=[]
for k1 in list(Double_Read_ThroughBP.keys()):
for k2 in Double_Read_ThroughBP[k1]:
rela=[k2[6],k2[0]-chr_letter_bp[k1][k2[6]][0],
k2[7],k2[1]-chr_letter_bp[k1][k2[7]][0],
k2[8],k2[2]-chr_letter_bp[k1][k2[8]][0],
k2[9],k2[3]-chr_letter_bp[k1][k2[9]][0],k2[4],k2[5]]
out.append(rela)
return out
def Single_Rec_Read_Locate(BP_para_dict,Letter_Double_rec,temp_bp, temp_let):
Pair_ThroughBP=[]
Double_Read_ThroughBP=[]
Single_Read_ThroughBP=[]
Initial_IL=[]
BlockCov={}
Initial_Cov={}
Initial_DR_Penal=0
for j in temp_let:
BlockCov[j]=0
for key in list(Letter_Double_rec.keys()):
if len(Letter_Double_rec[key])==1:
pos1=int(Letter_Double_rec[key][0][3])
pos2=int(Letter_Double_rec[key][0][7])
bamChr=Letter_Double_rec[key][0][2]
fbamtemp=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,bamChr,pos2,pos2+ReadLength))
while True:
pbam=fbamtemp.readline().strip().split()
if not pbam: break
flag=0
if pbam[0]==key:
Letter_Double_rec[key]+=[pbam[:9]]
flag+=1
if flag==1:
break
fbamtemp.close()
for key in list(Letter_Double_rec.keys()):
if len(Letter_Double_rec[key])==2:
pos1=int(Letter_Double_rec[key][0][3])
pos2=int(Letter_Double_rec[key][1][3])
if not pos1>pos2:
pos1=int(Letter_Double_rec[key][0][3])
pos1b=pos1+cigar2reaadlength(Letter_Double_rec[key][0][5])
pos2=int(Letter_Double_rec[key][1][3])
pos2b=pos2+cigar2reaadlength(Letter_Double_rec[key][1][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double_rec[key][0][1])
elif pos1>pos2:
pos1=int(Letter_Double_rec[key][1][3])
pos1b=pos2+cigar2reaadlength(Letter_Double_rec[key][1][5])
pos2=int(Letter_Double_rec[key][0][3])
pos2b=pos1+cigar2reaadlength(Letter_Double_rec[key][0][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double_rec[key][1][1])
if not pos1<temp_bp[0]-BP_para_dict['flank']+1 and not pos2b>temp_bp[-1]+BP_para_dict['flank']-1:
block1=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos1+low_qual_edge)
block2=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos2+low_qual_edge)
block1b=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos1b-low_qual_edge)
block2b=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos2b-low_qual_edge)
rela_1=pos1-temp_bp[temp_let.index(block1)]
rela_2=pos2-temp_bp[temp_let.index(block2)]
rela_1b=pos1b-temp_bp[temp_let.index(block1b)]
rela_2b=pos2b-temp_bp[temp_let.index(block2b)]
if block1==block1b==block2==block2b: # last comparand corrected from 'block2' to 'block2b'
BlockCov[block1]+=cigar2reaadlength(Letter_Double_rec[key][0][5])
else:
if block1==block1b and block2==block2b:
Pair_ThroughBP.append([block1,rela_1,rela_1b, block2,rela_2,rela_2b]+direct_temp)
else:
Double_Read_ThroughBP.append([block1,rela_1,block1b,rela_1b, block2,rela_2,block2b,rela_2b]+direct_temp)
del Letter_Double_rec[key]
for j in Pair_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Double_Read_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in temp_let:
Initial_Cov[j]=0
for j in Pair_ThroughBP:
Initial_Cov[j[0]]+=j[2]-j[1]
Initial_Cov[j[3]]+=j[5]-j[4]
for j in Single_Read_ThroughBP:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
for j in Double_Read_ThroughBP:
if j[0]==j[2]:
Initial_Cov[j[0]]+=j[3]-j[1]
else:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
if j[4]==j[6]:
Initial_Cov[j[4]]+=j[7]-j[5]
else:
Initial_Cov[j[4]]+=temp_bp[temp_let.index(j[4])+1]-temp_bp[temp_let.index(j[4])]-j[5]
Initial_Cov[j[6]]+=j[7]
Initial_IL=[]
for j in Pair_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[3])]-temp_bp[temp_let.index(j[0])]-j[1]+j[5])
for j in Double_Read_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[6])]-temp_bp[temp_let.index(j[0])]-j[1]+j[7])
Initial_ILPenal=[]
for j in Initial_IL:
Initial_ILPenal+=[pdf_calculate(j,GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero)/len(Initial_IL)]
return [Initial_DR_Penal,Initial_ILPenal,Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,BlockCov,Initial_Cov]
def Single_Read_Assort_For_insert(Full_Info,bp_list,flank):
relative_bps=[i-bp_list[0] for i in bp_list]
letter_list=[chr(97+i) for i in range(len(bp_list)-1)]
Block_and_Reads={}
Block_and_Reads['left']=[]
Block_and_Reads['right']=[]
SingleR_Through=Full_Info[6]
Pair_Through=Full_Info[4]
Read_Through=Full_Info[5]
for block in letter_list:
Block_and_Reads[block]=[]
for j in Pair_Through:
Block_and_Reads[j[0]]=[j[1:3],j[3:]]
Block_and_Reads[j[3]]=[j[4:6],j[:3]+j[6:8]]
for j in Read_Through:
Block_and_Reads[j[0]]=[]
for key in list(Full_Info_of_Reads.keys()):
read_left=[int(i) for i in Full_Info_of_Reads[key][:2]]+[Full_Info_of_Reads[key][-2]]
read_right=[int(i) for i in Full_Info_of_Reads[key][2:4]]+[Full_Info_of_Reads[key][-1]]
assign_left=Reads_block_assignment_2(relative_bps,letter_list,read_left[0],read_left[1],flank)
assign_right=Reads_block_assignment_2(relative_bps,letter_list,read_right[0],read_right[1],flank)
New_Info=['_'.join([assign_left[0],str(int(co)-assign_left[1])]) for co in Full_Info_of_Reads[key][:2]]+['_'.join([assign_right[0],str(int(co)-assign_right[1])]) for co in Full_Info_of_Reads[key][2:4]]+Full_Info_of_Reads[key][4:]
Block_and_Reads[assign_left[0]][key]=New_Info
Block_and_Reads[assign_right[0]][key]=New_Info
return Block_and_Reads
def Insert_Seq_Pool_Prod_2(original_bp_list,ori_1_Seq,flank):
ini_letters=['left']+['I'+chr(97+i) for i in range(len(original_bp_list)-1)]+['right']+['I'+chr(97+i)+'^' for i in range(len(original_bp_list)-1)]
relative_bps=[0]+[j-original_bp_list[0]+flank for j in original_bp_list]+[original_bp_list[-1]+flank-original_bp_list[0]+flank]
Insert_Seq_Pool={}
for k in range(len(original_bp_list)+1):
Insert_Seq_Pool[ini_letters[k]]=ori_1_Seq[relative_bps[k]:relative_bps[k+1]]
for k in range(len(original_bp_list)+1,len(ini_letters)):
Insert_Seq_Pool[ini_letters[k]]=complementary(ori_1_Seq[relative_bps[k-len(original_bp_list)]:relative_bps[k+1-len(original_bp_list)]])
return Insert_Seq_Pool
def BPs_Coverage(Af_Letter,original_bp_list,original_letters,Letter_Through,Af_Info,flank):
blocklen={}
for i in range(len(original_bp_list)-1):
blocklen[original_letters[i]]=original_bp_list[i+1]-original_bp_list[i]
blocklen['left']=flank
blocklen['right']=flank
tempM=[blocklen[j[0]] for j in Af_Letter[0]]
tempP=[blocklen[j[0]] for j in Af_Letter[1]]
Af_BPs=[[-flank,0]+[sum(tempM[:(k+1)]) for k in range(len(tempM))],[-flank,0,]+[sum(tempP[:(k+1)]) for k in range(len(tempP))]]
Af_BPs=[Af_BPs[0]+[Af_BPs[0][-1]+flank],Af_BPs[1]+[Af_BPs[1][-1]+flank]]
Af_BP_Through=[[0 for i in range(len(Af_BPs[0]))],[0 for i in range(len(Af_BPs[1]))]]
for key in list(Af_Info.keys()):
if Af_Info[key][6]=='M':
tempbps=Af_BPs[0]
leftMost=numpy.min([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
rightMost=numpy.max([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
for m in range(len(tempbps)-1):
if tempbps[m+1]>leftMost and tempbps[m]<leftMost:
for n in range(m,len(tempbps)-1):
if tempbps[n+1]>rightMost and tempbps[n]<rightMost:
for p in range(m+1,n+1):
if len(Af_Info[key])==7:
Af_BP_Through[0][p]+=1
elif len(Af_Info[key])==8:
Af_BP_Through[0][p]+=float(Af_Info[key][7])
if Af_Info[key][6]=='P':
tempbps=Af_BPs[1]
leftMost=numpy.min([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
rightMost=numpy.max([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
for m in range(len(tempbps)-1):
if tempbps[m+1]>leftMost and tempbps[m]<leftMost:
for n in range(m,len(tempbps)-1):
if tempbps[n+1]>rightMost and tempbps[n]<rightMost:
for p in range(m+1,n+1):
if len(Af_Info[key])==7:
Af_BP_Through[1][p]+=1
elif len(Af_Info[key])==8:
Af_BP_Through[1][p]+=float(Af_Info[key][7])
return [Af_BP_Through[0][1:-1],Af_BP_Through[1][1:-1]]
def penal_calculate(GC_para_dict,BP_para_dict,Map_All,temp_bp,Af_Letter,Af_BP,letters_numbers,NoMapPenal):
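# Core scoring routine for a proposed structure: maps every read back onto the rearranged haplotypes,
# accumulating per-window read depth (out_rd), an averaged insert-length score (IL_Output), a
# wrong-orientation count (DR_Penal) and a through-breakpoint penalty (TB_Pena_2_out); read depth is
# GC-adjusted before being returned.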
out_rd=[[0 for i in temp_bp[0][:-1]],[0 for i in temp_bp[1][:-1]]]
IL_Rec={}
DR_Penal=0
out_tb=[[0 for i in temp_bp[0]],[0 for i in temp_bp[1]]]
for i in Map_All:
if len(i)>4:
if not i[6] in list(IL_Rec.keys()):
IL_Rec[i[6]]=i[8]
else:
IL_Rec[i[6]]+=i[8]
if not i[4:6]==['+','-']:
DR_Penal+=1
if i[7]=='m':
i_block=[]
for k in i[:4]:
if k<temp_bp[0][1]:
i_block.append(0)
elif k>temp_bp[0][-2]-1:
i_block.append(len(temp_bp[0])-2)
else:
for j in range(len(temp_bp[0])-1)[1:-1]:
if temp_bp[0][j]-1<k and temp_bp[0][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[0][i_block[2]]+=(i[3]-i[2])*i[-1]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif not i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
out_rd[0][i_block[2]]+=(i[3]-i[2])*i[-1]
out_tb[0][i_block[1]]+=i[8]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[0][i_block[2]]+=(temp_bp[0][i_block[2]+1]-i[2])*i[-1]
out_rd[0][i_block[3]]+=(i[3]-temp_bp[0][i_block[3]])*i[-1]
out_tb[0][i_block[3]]+=i[8]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif not i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
out_rd[0][i_block[2]]+=(temp_bp[0][i_block[2]+1]-i[2])*i[-1]
out_rd[0][i_block[3]]+=(i[3]-temp_bp[0][i_block[3]])*i[-1]
out_tb[0][i_block[1]]+=i[8]
out_tb[0][i_block[3]]+=i[8]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
if i[7]=='p':
i_block=[]
for k in i[:4]:
if k<temp_bp[1][1]:
i_block.append(0)
elif k>temp_bp[1][-2]-1:
i_block.append(len(temp_bp[1])-2)
else:
for j in range(len(temp_bp[1])-1)[1:-1]:
if temp_bp[1][j]-1<k and temp_bp[1][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[1][i_block[2]]+=(i[3]-i[2])*i[-1]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif not i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
out_rd[1][i_block[2]]+=(i[3]-i[2])*i[-1]
out_tb[1][i_block[1]]+=i[8]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[1][i_block[2]]+=(temp_bp[1][i_block[2]+1]-i[2])*i[-1]
out_rd[1][i_block[3]]+=(i[3]-temp_bp[1][i_block[3]])*i[-1]
out_tb[1][i_block[3]]+=i[8]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif not i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
out_rd[1][i_block[2]]+=(temp_bp[1][i_block[2]+1]-i[2])*i[-1]
out_rd[1][i_block[3]]+=(i[3]-temp_bp[1][i_block[3]])*i[-1]
out_tb[1][i_block[1]]+=i[8]
out_tb[1][i_block[3]]+=i[8]
if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
else:
if i[2]=='m':
i_block=[]
for k in i[:2]:
if k<temp_bp[0][1]:
i_block.append(0)
elif k>temp_bp[0][-2]-1:
i_block.append(len(temp_bp[0])-2)
else:
for j in range(len(temp_bp[0])-1)[1:-1]:
if temp_bp[0][j]-1<k and temp_bp[0][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
elif not i_block[0]==i_block[1]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
if i[2]=='p':
i_block=[]
for k in i[:2]:
if k<temp_bp[1][1]:
i_block.append(0)
elif k>temp_bp[1][-2]-1:
i_block.append(len(temp_bp[1])-2)
else:
for j in range(len(temp_bp[1])-1)[1:-1]:
if temp_bp[1][j]-1<k and temp_bp[1][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
elif not i_block[0]==i_block[1]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
block_bps_chr={}
block_bps_chr['m']={}
block_bps_chr['p']={}
if not Penalty_For_InsertLengthZero in list(IL_Rec.keys()):
IL_Rec[Penalty_For_InsertLengthZero]=NoMapPenal
else:
IL_Rec[Penalty_For_InsertLengthZero]+=NoMapPenal
IL_Penal=0
IL_Weight=0
for i in list(IL_Rec.keys()):
IL_Penal+=i*IL_Rec[i]
IL_Weight+=IL_Rec[i]
if not IL_Weight==0:
IL_Output=IL_Penal/IL_Weight
else:
IL_Output=0
Num_Read_TB=[out_tb[0][1:-1],out_tb[1][1:-1]]
TB_Pena_2_out=0
Num_total_TB=[]
for x in Num_Read_TB:
Num_total_TB+=x
TB_Pena_2_out=numpy.mean([pdf_calculate(single_pc*2.0,PC_Statistics[0][4],PC_Statistics[0][0],PC_Statistics[0][1],PC_Statistics[0][2],PC_Statistics[0][3],PC_Statistics[1][1]+3*PC_Statistics[1][2],PC_Statistics[1][1]-3*PC_Statistics[1][2],Penalty_For_InsertLengthZero) for single_pc in Num_total_TB ])
Af_Block_Len=[[BP_para_dict['flank']]+[Af_BP[0][i+1]-Af_BP[0][i] for i in range(len(Af_BP[0])-1)]+[BP_para_dict['flank']],[BP_para_dict['flank']]+[Af_BP[1][i+1]-Af_BP[1][i] for i in range(len(Af_BP[1])-1)]+[BP_para_dict['flank']]]
out_rd=[[out_rd[0][i]/Af_Block_Len[0][i] for i in range(len(out_rd[0]))],[out_rd[1][i]/Af_Block_Len[1][i] for i in range(len(out_rd[1]))]]
out_rd_new=[[(BP_para_dict['RD_within_B']['left']-out_rd[0][0]-out_rd[1][0])/2.0+out_rd[0][0],
(BP_para_dict['RD_within_B']['right']-out_rd[0][-1]-out_rd[1][-1])/2.0+out_rd[0][-1]],
[(BP_para_dict['RD_within_B']['left']-out_rd[0][0]-out_rd[1][0])/2.0+out_rd[1][0],
(BP_para_dict['RD_within_B']['right']-out_rd[0][-1]-out_rd[1][-1])/2.0+out_rd[1][-1]]]
out_rd=[[out_rd_new[0][0]]+out_rd[0][1:-1]+[out_rd_new[0][-1]],[out_rd_new[1][0]]+out_rd[1][1:-1]+[out_rd_new[1][-1]]]
out_rd_within=[[BP_para_dict['RD_within_B'][Af_Letter[0][i]]/letters_numbers[0][i] for i in range(len(Af_Letter[0]))],[BP_para_dict['RD_within_B'][Af_Letter[1][i]]/letters_numbers[1][i] for i in range(len(Af_Letter[1]))]]
out_rd_within[0]=[0]+out_rd_within[0]+[0]
out_rd_within[1]=[0]+out_rd_within[1]+[0]
cov_bp2=[[out_rd[0][i]+out_rd_within[0][i] for i in range(len(out_rd[0]))],[out_rd[1][i]+out_rd_within[1][i] for i in range(len(out_rd[1]))]]
Cov_GC=[[BP_para_dict['BlockGC2'][k] for k in Af_Letter[0]],[BP_para_dict['BlockGC2'][k] for k in Af_Letter[1]]]
adj_cov_bp=[GC_RD_Adj(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],chrom_N,Cov_GC[0],cov_bp2[0][1:-1]),GC_RD_Adj(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],chrom_N,Cov_GC[1],cov_bp2[1][1:-1])]
return [IL_Output,adj_cov_bp,DR_Penal,TB_Pena_2_out,Num_total_TB]
def Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP):
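# Re-maps the through-breakpoint reads in Be_Info onto a candidate structure (Af_Letter/Af_BP) and calls
# penal_calculate to score it; structures with unmapped reads, leftover block coverage above
# 2.58 standard deviations of the genome-wide coverage, or orientation penalties are flagged via
# best_structure_sign_flag. Returns 0 when no read could be mapped onto the candidate structure.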
Total_Cov_For_Pen={}
for key in list(BP_para_dict['RD_within_B'].keys()):
Total_Cov_For_Pen[key]=0
Map_M=[]
Map_P=[]
Map_Both=[]
Let_BP_Info={}
Let_BP_Info['m']={}
Let_BP_Info['p']={}
temp_letter=[['left']+Af_Letter[0]+['right'],['left']+Af_Letter[1]+['right']]
temp_bp=[[Af_BP[0][0]-BP_para_dict['flank']]+Af_BP[0]+[Af_BP[0][-1]+BP_para_dict['flank']],[Af_BP[1][0]-BP_para_dict['flank']]+Af_BP[1]+[Af_BP[1][-1]+BP_para_dict['flank']]]
for j1 in range(len(temp_letter[0])):
j=temp_letter[0][j1]
if not j in list(Let_BP_Info['m'].keys()):
Let_BP_Info['m'][j]=[[temp_bp[0][j1],temp_bp[0][j1+1]]]
else:
Let_BP_Info['m'][j]+=[[temp_bp[0][j1],temp_bp[0][j1+1]]]
for j1 in range(len(temp_letter[1])):
j=temp_letter[1][j1]
if not j in list(Let_BP_Info['p'].keys()):
Let_BP_Info['p'][j]=[[temp_bp[1][j1],temp_bp[1][j1+1]]]
else:
Let_BP_Info['p'][j]+=[[temp_bp[1][j1],temp_bp[1][j1+1]]]
letters_numbers=[[Af_Letter[0].count(i[0])+Af_Letter[1].count(i[0])+Af_Letter[0].count(i[0]+'^')+Af_Letter[1].count(i[0]+'^') for i in Af_Letter[0]],[Af_Letter[0].count(i[0])+Af_Letter[1].count(i[0])+Af_Letter[0].count(i[0]+'^')+Af_Letter[1].count(i[0]+'^') for i in Af_Letter[1]]]
NoMapPenal=0
IL_Rec={}
DR_Rec=0
cov_bp=[[0 for i in range(len(temp_letter[0]))],[0 for i in range(len(temp_letter[1]))]]
cov_bp2=[]
NoMapPenal=Be_Info_1_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal)
NoMapPenal=Be_Info_2_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal)
NoMapPenal=Be_Info_3_rearrange(BP_para_dict,Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal)
best_structure_sign_flag=0
for key in list(Total_Cov_For_Pen.keys()):
if Total_Cov_For_Pen[key]==0:
del Total_Cov_For_Pen[key]
else:
Total_Cov_For_Pen[key]/=float(Be_BP_Letter[key])
for key in list(BP_para_dict['RD_within_B'].keys()):
if not key[-1]=='^' and not key in ['left','right','left^', 'right^']:
if not key in Af_Letter[0]+Af_Letter[1] and not key+'^' in Af_Letter[0]+Af_Letter[1]:
if not key in list(Total_Cov_For_Pen.keys()):
Total_Cov_For_Pen[key]=0
Total_Cov_For_Pen[key]+=BP_para_dict['RD_within_B'][key]
if NoMapPenal>0:
best_structure_sign_flag+=1
for key1 in list(Total_Cov_For_Pen.keys()):
if Total_Cov_For_Pen[key1]>2.58*GC_para_dict['GC_Std_Coverage'][chrom_N]:
best_structure_sign_flag+=1
if not Map_M+Map_P+Map_Both==[]:
penals=penal_calculate(GC_para_dict,BP_para_dict,Map_M+Map_P+Map_Both,temp_bp,Af_Letter,Af_BP,letters_numbers,NoMapPenal)
if penals[2]>0:
best_structure_sign_flag+=1
return penals[:-1]+[NoMapPenal,Total_Cov_For_Pen,best_structure_sign_flag]+[penals[-1]]
else:
return 0
def write_best_letter(bps_all,Best_Letter_Rec,Best_Score_Rec,Score_rec_hash,original_letters):
fo=open(output_Score_File,'a')
time2=time.time()
Best_Letter_2=[]
if not Score_rec_hash=={}:
temp1=Best_Let_modify(original_letters,Best_Letter_Rec,Best_Score_Rec,Score_rec_hash)
Best_Letter_Rec=temp1[0]
Best_Score_Rec=temp1[1]
for bestletter in Best_Letter_Rec:
if not sorted(bestletter) in Best_Letter_2:
Best_Letter_2.append(sorted(bestletter))
bps3=[]
for bps in bps_all: bps3+=bps
for bestletter in Best_Letter_2:
if not '/'.join([''.join(original_letters),''.join(original_letters)])=='/'.join([''.join(bestletter[0]),''.join(bestletter[1])]):
if Uniparental_disomy_check(original_letters,bestletter)=='Pass':
print(' '.join([str(bp_ele) for bp_ele in bps3]), file=fo)
print('/'.join([''.join(bestletter[0]),''.join(bestletter[1])]), file=fo)
print('Theoretical Best Score: '+str(Best_IL_Score+Best_RD_Score+20), file=fo)
if Best_Score_Rec>80:
Best_Score_Rec=80
print('Current Best Score: '+str(Best_Score_Rec+20), file=fo)
print('Time Consuming:'+str(datetime.timedelta(seconds=(time2-time1))), file=fo)
else: print('Uniparental_disomy: '+' '.join([str(bp_ele) for bp_ele in bps3+[original_letters,bestletter]]))
fo.close()
def score_rec_hash_Modify_for_short_del(Score_rec_hash):
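#Keeps the top-scoring entry unchanged and shifts every lower score down by 1.1, so near-tied alternatives do not outrank a short deletion.
#eg of Score_rec_hash={10:[A],8:[B],5:[C]} ; returns {10:[A],6.9:[B],3.9:[C]}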
Score_rec_hash_new={}
for x in sorted(Score_rec_hash.keys())[::-1][:1]:
Score_rec_hash_new[x]=Score_rec_hash[x]
for x in sorted(Score_rec_hash.keys())[::-1][1:]:
Score_rec_hash_new[x-1.1]=Score_rec_hash[x]
return Score_rec_hash_new
def one_RD_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash):
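#Single-block case with at most ~1 copy per haplotype: scores candidate genotypes built from zero or one copy of 'a' (optionally inverted),
#restricted by the module-level Ploidy and inv_flag_overall, and records every scored structure in Score_rec_hash.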
#Letter_Candidates=[[[],[]],[['a'], []],[['a^'], []],[['a'], ['a']],[['a^'], ['a']],[['a^'], ['a^']],[['a','a^'], []],[['a^','a'], []],[['a^','a^'], []]]
Letter_Candidates=[[[],[]],[['a'], []],[['a^'], []],[['a'], ['a']],[['a^'], ['a']],[['a^'], ['a^']]]
if Ploidy==2: pass    #diploid: keep the full candidate set
elif Ploidy==1: Letter_Candidates=[i for i in Letter_Candidates if i[0]==i[1]]
elif Ploidy==0: Letter_Candidates=[i for i in Letter_Candidates if ['a'] in i]
if inv_flag_overall<0.1: Letter_Candidates=[i for i in Letter_Candidates if tag_inv(i)==0]
IL_RD_Temp_Info=Af_Rearrange_Info_Collect_2(BP_para_dict,Letter_Candidates)
if not IL_RD_Temp_Info=='Error':
[ILTemp,RDTemp,Letter_Rec,BP_Rec]=[IL_RD_Temp_Info[0],IL_RD_Temp_Info[1],IL_RD_Temp_Info[2],IL_RD_Temp_Info[3]]
if not ILTemp==[]:
DECISION_Score=Move_Decide_3(ILTemp,RDTemp,GC_para_dict['GC_Var_Coverage'])
Best_Letter_Rec=[Letter_Rec[DECISION_Score[0]]]
Best_Score_Rec=ILTemp[DECISION_Score[0]]+RDTemp[DECISION_Score[0]]
run_flag+=1
for x in range(len(Letter_Rec)):
xy=ILTemp[x]+RDTemp[x]
if not xy in list(Score_rec_hash.keys()):
Score_rec_hash[xy]=[]
Score_rec_hash[xy].append(Letter_Rec[x])
else:
Best_Letter_Rec=[]
Best_Score_Rec=100
run_flag+=1
Score_rec_hash2=score_rec_hash_Modify_for_short_del(Score_rec_hash)
return([Best_Letter_Rec,Best_Score_Rec,run_flag,Score_rec_hash2])
else:
return 'Error'
def two_RD_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash):
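#Single-block case with elevated coverage: enumerates structures carrying 2-5 total copies of 'a' across the two haplotypes
#(candidates with an empty haplotype are excluded) and scores them the same way as one_RD_Process.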
Letter_Candidates=struc_propose_single_block(2)+struc_propose_single_block(3)+struc_propose_single_block(4)+struc_propose_single_block(5)
Letter_Candidates=[i for i in Letter_Candidates if not [] in i]
if [[], ['a', 'a']] in Letter_Candidates: del Letter_Candidates[Letter_Candidates.index([[], ['a', 'a']])]
if [[], ['a^', 'a^']] in Letter_Candidates: del Letter_Candidates[Letter_Candidates.index([[], ['a^', 'a^']])]
if Ploidy==2: pass    #diploid: keep the full candidate set
elif Ploidy==1: Letter_Candidates=[i for i in Letter_Candidates if i[0]==i[1]]
elif Ploidy==0: Letter_Candidates=[i for i in Letter_Candidates if ['a'] in i]
if inv_flag_overall<0.1: Letter_Candidates=[i for i in Letter_Candidates if tag_inv(i)==0]
IL_RD_Temp_Info=Af_Rearrange_Info_Collect_2(BP_para_dict,Letter_Candidates)
if not IL_RD_Temp_Info=='Error':
[ILTemp,RDTemp,Letter_Rec,BP_Rec]=[IL_RD_Temp_Info[0],IL_RD_Temp_Info[1],IL_RD_Temp_Info[2],IL_RD_Temp_Info[3]]
if not ILTemp==[]:
DECISION_Score=Move_Decide_3(ILTemp,RDTemp,GC_para_dict['GC_Var_Coverage'])
Best_Letter_Rec=[Letter_Rec[DECISION_Score[0]]]
Best_Score_Rec=ILTemp[DECISION_Score[0]]+RDTemp[DECISION_Score[0]]
run_flag+=1
for x in range(len(Letter_Rec)):
xy=ILTemp[x]+RDTemp[x]
if not xy in list(Score_rec_hash.keys()):
Score_rec_hash[xy]=[]
Score_rec_hash[xy].append(Letter_Rec[x])
else:
Best_Letter_Rec=[]
Best_Score_Rec=100
run_flag+=1
return([Best_Letter_Rec,Best_Score_Rec,run_flag,Score_rec_hash])
else: return 'Error'
def few_RD_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash):
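#Single-block case with higher copy number: candidate structures are built around the module-level copy_num_a/copy_num_b estimates set by the caller.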
Letter_Candidates=struc_propose_single_block(copy_num_a)+struc_propose_single_block(copy_num_b)
Letter_Candidates=[i for i in Letter_Candidates if not [] in i]
if Ploidy==2: pass    #diploid: keep the full candidate set
elif Ploidy==1: Letter_Candidates=[i for i in Letter_Candidates if i[0]==i[1]]
elif Ploidy==0: Letter_Candidates=[i for i in Letter_Candidates if ['a'] in i]
if inv_flag_overall<0.1: Letter_Candidates=[i for i in Letter_Candidates if tag_inv(i)==0]
IL_RD_Temp_Info=Af_Rearrange_Info_Collect_2(BP_para_dict,Letter_Candidates)
if not IL_RD_Temp_Info=='Error':
[ILTemp,RDTemp,Letter_Rec,BP_Rec]=[IL_RD_Temp_Info[0],IL_RD_Temp_Info[1],IL_RD_Temp_Info[2],IL_RD_Temp_Info[3]]
if not ILTemp==[]:
DECISION_Score=Move_Decide_3(ILTemp,RDTemp,GC_para_dict['GC_Var_Coverage'])
Best_Letter_Rec=[Letter_Rec[DECISION_Score[0]]]
Best_Score_Rec=ILTemp[DECISION_Score[0]]+RDTemp[DECISION_Score[0]]
run_flag+=1
for x in range(len(Letter_Rec)):
xy=ILTemp[x]+RDTemp[x]
if not xy in list(Score_rec_hash.keys()):
Score_rec_hash[xy]=[]
Score_rec_hash[xy].append(Letter_Rec[x])
else:
Best_Letter_Rec=[]
Best_Score_Rec=100
run_flag+=1
return([Best_Letter_Rec,Best_Score_Rec,run_flag,Score_rec_hash])
else: return 'Error'
def two_block_RD_Process(GC_para_dict,BP_para_dict,run_flag):
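#Two-block case: candidate structures come from struc_produce_two_block(Copy_num_estimate). Unlike the single-block variants,
#this one does not return an updated Score_rec_hash.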
Letter_Candidates=struc_produce_two_block(Copy_num_estimate)
if Ploidy==2: pass    #diploid: keep the full candidate set
elif Ploidy==1: Letter_Candidates=[i for i in Letter_Candidates if i[0]==i[1]]
elif Ploidy==0: Letter_Candidates=[i for i in Letter_Candidates if ['a','b'] in i]
if inv_flag_overall<0.1: Letter_Candidates=[i for i in Letter_Candidates if tag_inv(i)==0]
IL_RD_Temp_Info=Af_Rearrange_Info_Collect_2(BP_para_dict,Letter_Candidates)
if not IL_RD_Temp_Info=='Error':
[ILTemp,RDTemp,Letter_Rec,BP_Rec]=[IL_RD_Temp_Info[0],IL_RD_Temp_Info[1],IL_RD_Temp_Info[2],IL_RD_Temp_Info[3]]
if not ILTemp==[]:
DECISION_Score=Move_Decide_3(ILTemp,RDTemp,GC_para_dict['GC_Var_Coverage'])
Best_Letter_Rec=[Letter_Rec[DECISION_Score[0]]]
Best_Score_Rec=ILTemp[DECISION_Score[0]]+RDTemp[DECISION_Score[0]]
run_flag+=1
else:
Best_Letter_Rec=[]
Best_Score_Rec=100
run_flag+=1
return([Best_Letter_Rec,Best_Score_Rec,run_flag])
else: return 'Error'
def null_model_global_para_setup(dict_opts):
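#Builds the per-sample BreakPoints/NullModel paths and loads the fitted null models into module-level globals:
#insert length (bimodal), read depth (negative binomial) and physical coverage (bimodal). flank/Cut_Lower/Cut_Upper are the
#0.95/0.005/0.995 quantiles of the insert-length distribution, and IL_max/PC_max/RD_max cache the log of each distribution's maximal density.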
global bam_files_appdix,BamN,Input_File,bp_txt_Path,BPPath,NullPath
bam_files_appdix=dict_opts['--sample'].split('.')[-1]
#BamN=dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'')
BamN='.'.join(dict_opts['--sample'].split('/')[-1].split('.')[:-1])
Input_File=dict_opts['--bp-file']
bp_txt_Path='/'.join(Input_File.split('/')[:-1])+'/'
BPPath=workdir +'.'.join(['BreakPoints']+[dict_opts['--sample'].split('/')[-1]])+'/'
NullPath=workdir+'.'.join(['NullModel']+[dict_opts['--sample'].split('/')[-1]])+'/'
global Insert_Len_Stat,Read_Depth_Stat,Physical_Cov_Stat,RD_Weight
Insert_Len_Stat=NullPath+'ILNull.'+BamN+'.'+genome_name+'.Bimodal' #Insert Length stat
Read_Depth_Stat=NullPath+'RDNull.'+BamN+'.'+genome_name+'.NegativeBinomial' #read coverage stat
Physical_Cov_Stat=NullPath+'TBNull.'+BamN+'.'+genome_name+'.Bimodal' #physical coverage stat
RD_Weight=Insert_len_stat_readin(Insert_Len_Stat)/RD_NB_stat_readin(Read_Depth_Stat)
#RD_Weight=1
global flank,Cut_Lower,Cut_Upper,IL_Stat_all,IL_Normal_Stat,IL_Statistics,PC_Statistics,RD_Statistics,IL_max,PC_max,RD_max
[flank,Cut_Lower,Cut_Upper]=[cdf_solver_application(Insert_Len_Stat,0.95,model_comp) ,cdf_solver_application(Insert_Len_Stat,0.005,model_comp) ,cdf_solver_application(Insert_Len_Stat,0.995,model_comp)]
IL_Stat_all=IL_Stat_readin(Insert_Len_Stat)
[IL_Statistics,IL_Normal_Stat]=IL_Stat_all
IL_max=numpy.log(find_max_bimodal(IL_Statistics)) #calculate max_pdf of insert length distribution
PC_Statistics=IL_Stat_readin(Physical_Cov_Stat) #readin physical coverage parameters
PC_max=numpy.log(find_max_bimodal(PC_Statistics[0])) #calculate max_pdf of physical coverage
RD_Statistics=RD_Stat_readin(Read_Depth_Stat)
RD_max=numpy.log(find_max_negative_binomial(RD_Statistics))
def inv_structure_predict(Full_Info):
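#Returns the fraction of pair-through plus read-through evidence whose mates share orientation ('+','+' or '-','-'),
#used as a quick inversion signal; returns 0 when there is no such evidence.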
[Pair_Through,Read_Through,SingleR_Through]=[Full_Info[4],Full_Info[5],Full_Info[6]]
inv_pairs_count=0
all_count=len(Pair_Through)+len(Read_Through)
for x in Pair_Through:
if x[-2:] in [['+','+'],['-','-']]: inv_pairs_count+=1
for x in Read_Through:
if x[-2:] in [['+','+'],['-','-']]: inv_pairs_count+=1
if all_count>0: return float(inv_pairs_count)/float(all_count)
else: return 0
def After_Letter_List_Produce_M(M_Move_Choices,Be_BP,Be_Letter,original_bp_list,Ploidy,Best_Score_Rec,Best_Letter_Rec,Block_CN_Upper):
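#Applies each proposed move to the maternal ('m') haplotype while the paternal haplotype is left unchanged, mirrors the result for
#Ploidy 1/0, then drops candidates that fail Af_Letter_QC, repeat an already recorded best structure, or exceed the per-block
#copy-number cap Block_CN_Upper. Returns [kept letters, kept breakpoints, count].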
[Af_BP_List,Af_Letter_List]=[[],[]]
for m in [['2m','1','1','1','X']]+M_Move_Choices:
p=[str(Chr)+'p','1','1','1','X']
Move_MP=[m,p]
Af_BP=[BPList_Rearrange(Be_BP[0],m,original_bp_list),BPList_Rearrange(Be_BP[1],p,original_bp_list)]
Af_Letter=[LetterList_Rearrange(Be_Letter[0],m,original_bp_list),LetterList_Rearrange(Be_Letter[1],p,original_bp_list)]
if Ploidy==1:
Af_Letter[1]=Af_Letter[0]
Af_BP[1]=Af_BP[0]
Af_BP_List.append(Af_BP)
Af_Letter_List.append(Af_Letter)
elif Ploidy==0:
Af_BP_List.append(Af_BP)
Af_BP_List.append([Af_BP[0],Af_BP[0]])
Af_Letter_List.append(Af_Letter)
Af_Letter_List.append([Af_Letter[0],Af_Letter[0]])
elif Ploidy==2:
Af_BP_List.append(Af_BP)
Af_Letter_List.append(Af_Letter)
out=[[],[],0]
for Af_Num in range(len(Af_Letter_List)):
Af_Letter=Af_Letter_List[Af_Num]
Af_BP=Af_BP_List[Af_Num]
if not Af_Letter_QC(Af_Letter,Copy_num_estimate)==0:continue
if not Best_Score_Rec==0 and Af_Letter in Best_Letter_Rec: continue
letter_num_flag=0
for key in list(Block_CN_Upper.keys()):
if (Af_Letter[0]+Af_Letter[1]).count(key)>Block_CN_Upper[key]:
letter_num_flag+=1
if not letter_num_flag==0: continue
out[0].append(Af_Letter)
out[1].append(Af_BP)
out[2]+=1
return out
def After_Letter_List_Produce_P(P_Move_Choices,Be_BP,Be_Letter,original_bp_list,Ploidy,Best_Score_Rec,Best_Letter_Rec,Block_CN_Upper):
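#Paternal counterpart of After_Letter_List_Produce_M: moves are applied to the 'p' haplotype while the maternal haplotype is kept fixed.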
[Af_BP_List,Af_Letter_List]=[[],[]]
for p in [['2p','1','1','1','X']]+P_Move_Choices:
m=[str(Chr)+'m','1','1','1','X']
Move_MP=[m,p]
Af_BP=[BPList_Rearrange(Be_BP[0],m,original_bp_list),BPList_Rearrange(Be_BP[1],p,original_bp_list)]
Af_Letter=[LetterList_Rearrange(Be_Letter[0],m,original_bp_list),LetterList_Rearrange(Be_Letter[1],p,original_bp_list)]
if Ploidy==1:
Af_Letter[0]=Af_Letter[1]
Af_BP[0]=Af_BP[1]
Af_BP_List.append(Af_BP)
Af_Letter_List.append(Af_Letter)
elif Ploidy==0:
Af_BP_List.append(Af_BP)
Af_BP_List.append([Af_BP[1],Af_BP[1]])
Af_Letter_List.append(Af_Letter)
Af_Letter_List.append([Af_Letter[1],Af_Letter[1]])
elif Ploidy==2:
Af_BP_List.append(Af_BP)
Af_Letter_List.append(Af_Letter)
out=[[],[],0]
for Af_Num in range(len(Af_Letter_List)):
Af_Letter=Af_Letter_List[Af_Num]
Af_BP=Af_BP_List[Af_Num]
if not Af_Letter_QC(Af_Letter,Copy_num_estimate)==0:continue
if not Best_Score_Rec==0 and Af_Letter in Best_Letter_Rec: continue
letter_num_flag=0
for key in list(Block_CN_Upper.keys()):
if (Af_Letter[0]+Af_Letter[1]).count(key)>Block_CN_Upper[key]:
letter_num_flag+=1
if not letter_num_flag==0: continue
out[0].append(Af_Letter)
out[1].append(Af_BP)
out[2]+=1
return out
def size_check(bps2, cff=1000000):
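#Returns >0 if any block record is malformed (fewer than 3 fields) or spans more than cff bp (1 Mb by default), so the caller can skip it.
#eg size_check([['chr1','1000','2000']]) returns 0 ; size_check([['chr1','1000','2000000']]) returns 1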
flag=0
for i in bps2:
if(len(i)<3): return 1
if int(i[2])-int(i[1])>cff: flag+=1
return flag
Define_Default_SVPredict()
if not '--workdir' in list(dict_opts.keys()): print('Error: please specify working directory using: --workdir')
else:
global_para_declaration()
if not '--bp-file' in list(dict_opts.keys()):print('Error: please specify input txt file using: --bp-file')
else:
if not '--out-path' in list(dict_opts.keys()): dict_opts['--out-path']='/'.join(dict_opts['--bp-file'].split('/')[:-1])
if not dict_opts['--out-path'][-1]=='/': dict_opts['--out-path']+='/'
if not os.path.isfile(ref_file): print('Error: wrong reference genome provided')
else:
if not os.path.isfile(ref_index): print('Error: reference genome not indexed')
else:
global chromos_all
chromos_all=chromos_readin_list(ref_file)
if not '--sample' in list(dict_opts.keys()): print('Error: please specify the input file using --sample')
else:
time1=time.time()
null_model_global_para_setup(dict_opts)
if not os.path.isfile(Insert_Len_Stat): print('Error: cannot access file: '+Insert_Len_Stat)
else:
ReadLenFin=NullPath+BamN+'.'+genome_name+'.Stats'
if not os.path.isfile(ReadLenFin): print('Error: cannot access file: '+ReadLenFin)
else:
fin=open(ReadLenFin)
pin=fin.readline().strip().split()
pin=fin.readline().strip().split()
pin=fin.readline().strip().split()
Window_Size=int(pin[0])//3    #integer division; Window_Size is later added to integer breakpoint coordinates
for line in fin: pin=line.strip().split()
fin.close()
ReadLength=int(pin[-1].split(':')[-1])
Initial_Bam_Name=BamN+'.'+bam_files_appdix
Initial_Bam=dict_opts['--sample']
fi_test=os.popen(r'''wc -l %s'''%(Input_File))
line_test=fi_test.readline().strip().split()
fi_test.close()
if not line_test[0]=='0':
IL_Estimate=IL_Statistics[0]*IL_Statistics[4]+IL_Statistics[1]*IL_Statistics[5]
IL_SD=((IL_Statistics[2]*IL_Statistics[4])**2+(IL_Statistics[3]*IL_Statistics[5])**2)**(0.5)
IL_Penal_Two_End_Limit=min([pdf_calculate(IL_Estimate-3*IL_SD,IL_Statistics[4],IL_Statistics[0],IL_Statistics[1],IL_Statistics[2],IL_Statistics[3],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero),pdf_calculate(IL_Estimate+3*IL_SD,IL_Statistics[4],IL_Statistics[0],IL_Statistics[1],IL_Statistics[2],IL_Statistics[3],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero)])
low_qual_edge=5
fi=open(Input_File)
bps_hash={}
bps_temp=[]
break_flag=0
for line in fi:
pi=line.strip().split()
if pi==[] or len(pi)<3:
if bps_temp==[]: continue
else:
bp_key=0
for l1 in bps_temp: bp_key+=len(l1)
if not bp_key in list(bps_hash.keys()): bps_hash[bp_key]=[]
bps_hash[bp_key].append(bps_temp)
bps_temp=[]
else: bps_temp.append(pi)
fi.close()
bps_hash_inter={}
for k1 in list(bps_hash.keys()):
bps_hash_inter[k1]=[]
for k2 in bps_hash[k1]:
if not k2 in bps_hash_inter[k1]: bps_hash_inter[k1].append(k2)
bps_hash=bps_hash_inter
output_Score_File=dict_opts['--out-path']+'_'.join(dict_opts['--bp-file'].split('/')[-1].split('.')[:-1])+'.coverge'
file_setup(output_Score_File)
for bpsk1 in sorted(bps_hash.keys()):
for bps2 in bps_hash[bpsk1]:
for i in bps2:
if len(i)<3: i.append(str(int(i[-1])+Window_Size))
GC_Stat_Path=NullPath+'RD_Stat'
Affix_GC_Stat='_MP'+str(QCAlign)+'_GC_Coverage_ReadLength'
[GC_Content_Coverage,Chromosome,Coverage_0]=GC_Stat_ReadIn(BamN,GC_Stat_Path,genome_name,Affix_GC_Stat)
[Coverage,GC_Overall_Median_Coverage,GC_Overall_Median_Num,GC_Median_Coverage,GC_Median_Num,GC_Mean_Coverage,GC_Std_Coverage,GC_Var_Coverage]=[[int(k) for k in Coverage_0],{},[],{},{},{},{},{}]
for a in Chromosome:
if a in list(GC_Content_Coverage.keys()):
GC_Overall_temp=[]
for b in Coverage:
if not b in list(GC_Content_Coverage[a].keys()): continue
if not b in list(GC_Median_Num.keys()): GC_Median_Num[b]=[]
if len(GC_Content_Coverage[a][b][0])==2: continue
elif len(GC_Content_Coverage[a][b][0])>2:
num_list=[float(c) for c in GC_Content_Coverage[a][b][0][2:].split(',')]
if not sum(num_list)==0:
GC_Median_Num[b]+=num_list
GC_Overall_Median_Num+=num_list
GC_Overall_temp=GC_Overall_temp+num_list
if not Median_Pick(num_list)==0.0:
if not a in list(GC_Median_Coverage.keys()): GC_Median_Coverage[a]={}
GC_Median_Coverage[a][b]=Median_Pick(num_list)
if len(GC_Overall_temp)==0: continue
if sum(GC_Overall_temp)==0.0: continue
elif len(GC_Overall_temp)>0:
GC_Overall_Median_Coverage[a]=Median_Pick(GC_Overall_temp)
GC_Mean_Coverage[a]=numpy.mean(GC_Overall_temp)
GC_Std_Coverage[a]=numpy.std(GC_Overall_temp)
GC_Var_Coverage[a]=(GC_Std_Coverage[a])**2
GC_Overall_Median_Num=Median_Pick([i for i in GC_Overall_Median_Num if not i==0])
for a in list(GC_Median_Num.keys()):
if GC_Median_Num[a]==[]: GC_Median_Num[a]=GC_Overall_Median_Num
else: GC_Median_Num[a]=Median_Pick(GC_Median_Num[a])
GC_Median_Num=GC_Median_Num_Correct(GC_Median_Num)
ChrN_Median_Coverage={}
for i in list(GC_Median_Coverage.keys()):
for j in list(GC_Median_Coverage[i].keys()):
if not j in list(ChrN_Median_Coverage.keys()): ChrN_Median_Coverage[j]=[GC_Median_Coverage[i][j]]
else: ChrN_Median_Coverage[j]+=[GC_Median_Coverage[i][j]]
[chrom_N,chrom_X,chrom_Y,GC_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage]=GC_RD_Info_Complete(ref_file,GC_Median_Coverage,ChrN_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage,Chromosome)
GC_para_dict={'IL_Statistics':IL_Statistics,'GC_Overall_Median_Coverage':GC_Overall_Median_Coverage,'GC_Overall_Median_Num':GC_Overall_Median_Num,'GC_Median_Coverage':GC_Median_Coverage,'GC_Median_Num':GC_Median_Num,'GC_Mean_Coverage':GC_Mean_Coverage,'GC_Std_Coverage':GC_Std_Coverage,'GC_Var_Coverage':GC_Var_Coverage,'Coverage':Coverage}
for bpsk1 in sorted(bps_hash.keys()):
if bpsk1>5: continue
for bps2_new in bps_hash[bpsk1]:
bps2_new_2=modify_bps2_new(bps2_new)
bps2=LN_bps2_Modify(bps2_new_2,chromos_all)
if size_check(bps2)>0: continue
if len(bps2)>0 and qual_check_bps2(bps2)=='right':
print(bps2)
Chromo=bps2[0][0]
if not str(Chromo) in list(GC_Std_Coverage.keys()): continue
if not str(Chromo) in list(GC_Mean_Coverage.keys()): continue
K_RD=GC_Std_Coverage[str(Chromo)]/GC_Mean_Coverage[str(Chromo)]
K_IL=IL_Normal_Stat[2]/IL_Normal_Stat[1]
K_RD_new=1
K_IL_new=(K_IL/K_RD)**2
IL_GS=Prob_Norm(IL_Normal_Stat[1],IL_Normal_Stat[1],IL_Normal_Stat[2]**2)
RD_GS=Prob_Norm(GC_Mean_Coverage[str(Chromo)],GC_Mean_Coverage[str(Chromo)],GC_Std_Coverage[str(Chromo)]**2)
for i in bps2:
temp2=[int(j) for j in i[1:]]
k=[i[0]]+sorted(temp2)
k2=k[:2]
for k3 in temp2:
if not k3 in k2 and k3-k2[-1]>10: k2.append(k3)
if len(k2)>2: bps2[bps2.index(i)]=k2
else: del bps2[bps2.index(i)]
if len(bps2)<1: continue
original_bps_all=[]
for obas in bps2: original_bps_all+=obas
original_structure=bp_to_let(original_bps_all,chromos_all)
chr_letter_tbp=letter_rearrange(flank,bps2)
letter_tGC=letter_GC_ReadIn(chr_letter_tbp)
if letter_tGC=='error': continue
letter_tRD=letter_RD_ReadIn(chr_letter_tbp)
if letter_tRD=='error': continue
[chr_letter_bp,letter_GC,letter_RD]=[{},{},{}]
for k1 in list(chr_letter_tbp.keys()):
chr_letter_bp[k1]={}
letter_GC[k1]={}
letter_RD[k1]={}
for k2 in list(chr_letter_tbp[k1].keys()):
if k2 in list(letter_tGC[k1].keys()) and k2 in list(letter_tRD[k1].keys()) and not math.isnan(letter_tRD[k1][k2]) and not math.isnan(letter_tGC[k1][k2]):
chr_letter_bp[k1][k2]=chr_letter_tbp[k1][k2]
letter_GC[k1][k2]=letter_tGC[k1][k2]
letter_RD[k1][k2]=letter_tRD[k1][k2]
left_keys=[]
for k1 in list(chr_letter_bp.keys()):
for k2 in list(chr_letter_bp[k1].keys()): left_keys.append(k2)
if not left_keys==[]:
bps3={}
for k1 in list(chr_letter_bp.keys()):
bps3[k1]={}
for k2 in list(chr_letter_bp[k1].keys()): bps3[k1][chr_letter_bp[k1][k2][0]]=[chr_letter_bp[k1][k2][0],chr_letter_bp[k1][k2][-1]]
bps4={}
for k1 in list(bps3.keys()):
if not bps3[k1]=={}:
bps4[k1]=[[k1]+bps3[k1][sorted(bps3[k1].keys())[0]]]
for k2 in range(len(list(bps3[k1].keys()))-1):
if bps3[k1][sorted(bps3[k1].keys())[k2+1]][0]==bps3[k1][sorted(bps3[k1].keys())[k2]][-1]: bps4[k1][-1]+=[bps3[k1][sorted(bps3[k1].keys())[k2+1]][-1]]
else: bps4[k1].append(bps3[k1][sorted(bps3[k1].keys())[k2+1]])
bps2=bps4_to_bps2(bps4)
Chr=bps2[0][0]
Flank_para_dict={'flank':flank,'Cut_Lower':Cut_Lower,'Cut_Upper':Cut_Upper,'ReadLength':ReadLength}
[Copy_num_estimate,Copy_num_Check]=copy_num_estimate_calcu(GC_para_dict,Flank_para_dict,bps2)
if Copy_num_estimate=='error': continue
if Copy_num_Check==[]:
Full_Info=Full_Info_of_Reads_Integrate(GC_para_dict,Flank_para_dict,bps2)
RD_within_B=RD_within_B_calcu(GC_Mean_Coverage,Full_Info,bps2)
global inv_flag_overall
inv_flag_overall=inv_structure_predict(Full_Info)
for j in range(Cut_Lower,Cut_Upper+1):
Single_ILScore=pdf_calculate(j,IL_Statistics[4],IL_Statistics[0],IL_Statistics[1],IL_Statistics[2],IL_Statistics[3],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero)
Best_IL_Score+=Single_ILScore*exp(Single_ILScore)
let_chr_rec={}
for i in list(chr_letter_bp.keys()):
for j in list(chr_letter_bp[i].keys()):
if j in left_keys: let_chr_rec[j]=i
for i in list(let_chr_rec.keys()):
Theo_RD=GC_Overall_Median_Coverage[str(let_chr_rec[i])]
Theo_Var=GC_Var_Coverage[str(let_chr_rec[i])]
for j in range(int(Theo_RD/2),int(Theo_RD/2*3+1)):
single_ProbNB=Prob_Norm(j,Theo_RD,Theo_Var)
Best_RD_Score+=single_ProbNB*exp(single_ProbNB)
Block_CN_Upper={}
#if Copy_num_Check==[]:
median_CN=GC_Overall_Median_Coverage[chrom_N]/2
for key in list(Initial_GCRD_Adj.keys()):
if not key in ['left','right']: Block_CN_Upper[key]=Initial_GCRD_Adj[key]/median_CN+2
[Initial_DR,Initial_IL,BlockGC,original_bp_list,original_letters]=[Full_Info[2],Full_Info[3],Full_Info[7],Full_Info[8],Full_Info[9]]
BlockGC['left']=0.476
BlockGC['right']=0.476
BlockGC2={}
for key_B_GC in list(BlockGC.keys()):
BlockGC2[key_B_GC]=BlockGC[key_B_GC]
BlockGC2[key_B_GC+'^']=BlockGC[key_B_GC]
Be_BP_Letter={}
for let_key in original_letters:
Be_BP_Letter[let_key]=original_bp_list[original_letters.index(let_key)+1]-original_bp_list[original_letters.index(let_key)]
ori_let2=[]
for i in original_letters: ori_let2.append(i)
for i in original_letters:
if Copy_num_estimate[i]<0: ori_let2.remove(i)
elif Copy_num_estimate[i]>3:
letter_copy=int(Copy_num_estimate[i]/2)
for j in range(letter_copy)[1:]: ori_let2.append(i)
ori_bp2=[original_bp_list[0]]
for i in ori_let2: ori_bp2.append(ori_bp2[-1]+Be_BP_Letter[i])
Initial_TB=0
[Pair_Through,Read_Through,SingleR_Through]=[Full_Info[4],Full_Info[5],Full_Info[6]]
bp_MP=[original_bp_list,original_bp_list]
letter_MP=[original_letters,original_letters]
Be_BP_Letter['left']=flank
Be_BP_Letter['right']=flank
for let_key in list(Be_BP_Letter.keys()): Be_BP_Letter[let_key+'^']=Be_BP_Letter[let_key]
num_of_read_pairs=1
for k1 in list(Be_BP_Letter.keys()):
if not k1[-1]=='^' and not k1 in ['left','right']: num_of_read_pairs+=Be_BP_Letter[k1]*RD_within_B[k1]/2/ReadLength
num_of_read_pairs+=len(Full_Info[4])+len(Full_Info[5])+len(Full_Info[6])
Be_Info=[Pair_Through,Read_Through,SingleR_Through]
Be_Letter=[ori_let_Modi(Be_Info,ori_let2,Copy_num_estimate),ori_let2]
Be_BP=ori_bp_Modi(Be_Letter,ori_bp2,Be_BP_Letter)
Best_Score=float("-inf")
[Move_Step,best_iterations,Best_Letter,Best_BPs,score_record,Best_Score_Rec,Score_rec_hash,break_Iteration_Flag,run_flag,Best_Letter_Rec]=[0,0,[],[],[],0,{},0,0,[]]
num_of_reads=(original_bp_list[-1]-original_bp_list[0])*GC_Mean_Coverage[Chr]/2/ReadLength
BP_para_dict={'flank':flank,'Cut_Lower':Cut_Lower,'Cut_Upper':Cut_Upper,'ReadLength':ReadLength,'Be_Letter':Be_Letter,'num_of_reads':num_of_reads,'original_letters':original_letters,'BlockGC2':BlockGC2,'BlockGC':BlockGC,'original_bp_list':original_bp_list,'RD_within_B':RD_within_B}
if len(Full_Info[9])==1:
if Full_Info[1]['a']<GC_Mean_Coverage[Chr]/4 and Full_Info[2]<3:
Run_Result=zero_RD_Process(original_bp_list,run_flag,Best_IL_Score,Best_RD_Score)
if Run_Result=='Error': continue
[Best_Letter_Rec,Best_Score_Rec,run_flag]=[Run_Result[0],Run_Result[1],Run_Result[2]]
Score_rec_hash[Best_Score_Rec]=Best_Letter_Rec
else:
if Full_Info[1]['a']<GC_Mean_Coverage[Chr]:
Run_Result=one_RD_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash)
if Run_Result=='Error': continue
[Best_Letter_Rec,Best_Score_Rec,run_flag]=[Run_Result[0],Run_Result[1],Run_Result[2]]
Score_rec_hash=Run_Result[3]
else:
if Full_Info[1]['a']<2*GC_Mean_Coverage[Chr]:
Run_Result=two_RD_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash)
if Run_Result=='Error': continue
[Best_Letter_Rec,Best_Score_Rec,run_flag]=[Run_Result[0],Run_Result[1],Run_Result[2]]
Score_rec_hash=Run_Result[3]
else:
copy_num_a=int(float(Full_Info[1]['a'])/(float(GC_Mean_Coverage[Chr])/2))
copy_num_b=int(float(Full_Info[1]['a'])/(float(GC_Mean_Coverage[Chr])/2))+1
if copy_num_b<4:
Run_Result=few_RD_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash)
if Run_Result=='Error': continue
[Best_Letter_Rec,Best_Score_Rec,run_flag]=[Run_Result[0],Run_Result[1],Run_Result[2]]
Score_rec_hash=Run_Result[3]
elif copy_num_b<50:
Run_Result=many_RD_Process(copy_num_a,run_flag)
if Run_Result=='Error': continue
[Best_Letter_Rec,Best_Score_Rec,run_flag]=[Run_Result[0],Run_Result[1],Run_Result[2]]
Score_rec_hash[Best_Score_Rec]=Best_Letter_Rec
else:
print(bps2)
print(copy_num_b)
elif len(Full_Info[9])==2 and deterministic_flag==0:
bl2_flag=0
for keyCNE in list(Copy_num_estimate.keys()):
if not Copy_num_estimate[keyCNE]<2: bl2_flag+=1
if bl2_flag==0:
Run_Result=two_block_RD_Process(GC_para_dict,BP_para_dict,run_flag)
if Run_Result=='Error': continue
[Best_Letter_Rec,Best_Score_Rec,run_flag]=[Run_Result[0],Run_Result[1],Run_Result[2]]
Score_rec_hash[Best_Score_Rec]=Best_Letter_Rec
if run_flag==0:
speed_test=10
t1_sptest=time.time()
while True:
if Move_Step>speed_test: break
Move_Step+=1
if inv_flag_overall<0.1: Move_Sample_Pool=['delete','insert']
else: Move_Sample_Pool=['delete','invert','insert']
Initial_Move_Prob=[float(1)/float(len(Move_Sample_Pool)) for i in range(len(Move_Sample_Pool))]
Move_M_P=Move_Choose(Move_Sample_Pool,Ploidy,Initial_Move_Prob)
M_Move_Choices=Move_Choice_procedure_2(Move_M_P[0],Be_Letter[0],original_letters,'2m')
P_Move_Choices=Move_Choice_procedure_2(Move_M_P[1],Be_Letter[1],original_letters,'2p')
if M_Move_Choices=='ERROR!' and P_Move_Choices=='ERROR!':
Move_Step-=1
continue
if not M_Move_Choices=='ERROR!' and not M_Move_Choices==[]:
[P_IL,P_RD,P_DR,P_TB,Letter_Rec,BP_Rec]=[[],[],[],[],[],[]]
Af_Letter_BP_List=After_Letter_List_Produce_M(M_Move_Choices,Be_BP,Be_Letter,original_bp_list,Ploidy,Best_Score_Rec,Best_Letter_Rec,Block_CN_Upper)
for Af_Info_Number in range(Af_Letter_BP_List[2]):
[Af_Letter,Af_BP]=[Af_Letter_BP_List[0][Af_Info_Number],Af_Letter_BP_List[1][Af_Info_Number]]
Af_Info_all=Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP)
if Af_Info_all==0:continue
Letter_Rec.append(Af_Letter)
BP_Rec.append(Af_BP)
Af_IL_Penal=Af_Info_all[0]
Af_RD_Rec=Af_Info_all[1]
Af_DR_Penal=(Af_Info_all[2])**2
Af_TB_Penal_a=Af_Info_all[4]
Af_TB_Rec=Af_Info_all[3]
Af_TB_Penal=float(Af_TB_Penal_a)/float(num_of_reads)+float(Af_TB_Rec)/float(len(Af_Letter[0]+Af_Letter[1])+2)
Af_RD_Penal=RD_Adj_Penal(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter)
for key in list(Af_Info_all[5].keys()): Af_RD_Penal+=Prob_Norm(Af_Info_all[5][key],0,GC_Var_Coverage[chrom_N]/2)
P_IL.append(Af_IL_Penal)
P_RD.append(Af_RD_Penal)
P_DR.append(Af_DR_Penal/num_of_read_pairs)
P_TB.append(Af_TB_Penal)
if len(P_IL)==0: continue
Regu_IL=[P_IL[i]*(1+DR_Weight*P_DR[i]) for i in range(len(P_IL))]
Regu_RD=[P_RD[i]+P_TB[i] for i in range(len(P_RD))]
Regu_IL=[(i-IL_GS)*K_IL_new for i in Regu_IL]
Regu_RD=[i-RD_GS for i in Regu_RD]
Regulator=1
ILTemp=[j/Regulator for j in Regu_IL]
RDTemp=[i for i in Regu_RD]
if deterministic_flag==0: DECISION_Score=Move_Decide_2(ILTemp,RDTemp,GC_Var_Coverage)
else: DECISION_Score=Move_Decide_deterministic(ILTemp,RDTemp,GC_Var_Coverage)
if DECISION_Score=='': continue
DECISION=DECISION_Score[0]
S_DECISION=Regu_IL[DECISION]+Regu_RD[DECISION]
Be_Letter=Letter_Rec[DECISION]
Be_BP=BP_Rec[DECISION]
if not S_DECISION in list(Score_rec_hash.keys()): Score_rec_hash[S_DECISION]=[]
Score_rec_hash[S_DECISION].append(Be_Letter)
if S_DECISION>Best_Score:
Best_Letter=[Be_Letter]
Best_BPs=[Be_BP]
Best_Score=S_DECISION
best_iterations=0
elif S_DECISION==Best_Score:
if not Be_Letter in Best_Letter:
Best_Letter+=[Be_Letter]
Best_BPs+=[Be_BP]
best_iterations+=1
else: best_iterations+=1
score_record.append(S_DECISION)
if not P_Move_Choices=='ERROR!' and not P_Move_Choices==[]:
[P_IL,P_RD,P_DR,P_TB,Letter_Rec,BP_Rec]=[[],[],[],[],[],[]]
Af_Letter_BP_List=After_Letter_List_Produce_P(P_Move_Choices,Be_BP,Be_Letter,original_bp_list,Ploidy,Best_Score_Rec,Best_Letter_Rec,Block_CN_Upper)
for Af_Info_Number in range(Af_Letter_BP_List[2]):
[Af_Letter,Af_BP]=[Af_Letter_BP_List[0][Af_Info_Number],Af_Letter_BP_List[1][Af_Info_Number]]
Af_Info_all=Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP)
if Af_Info_all==0: continue
Letter_Rec.append(Af_Letter)
BP_Rec.append(Af_BP)
Af_IL_Penal=Af_Info_all[0]
Af_RD_Rec=Af_Info_all[1]
Af_DR_Penal=(Af_Info_all[2])**2
Af_TB_Penal_a=Af_Info_all[4]
Af_TB_Rec=Af_Info_all[3]
Af_TB_Penal=float(Af_TB_Penal_a)/float(num_of_reads)+float(Af_TB_Rec)/float(len(Af_Letter[0]+Af_Letter[1])+2)
Af_RD_Penal=RD_Adj_Penal(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter)
for key in list(Af_Info_all[5].keys()): Af_RD_Penal+=Prob_Norm(Af_Info_all[5][key],0,GC_Var_Coverage[chrom_N])
P_IL.append(Af_IL_Penal)
P_RD.append(Af_RD_Penal)
P_DR.append(Af_DR_Penal/num_of_read_pairs)
P_TB.append(Af_TB_Penal)
if len(P_IL)==0: continue
Regu_IL=[P_IL[i]*(1+DR_Weight*P_DR[i]) for i in range(len(P_IL))]
Regu_RD=[P_RD[i]+P_TB[i] for i in range(len(P_RD))]
Regu_IL=[(i-IL_GS)*K_IL_new for i in Regu_IL]
Regu_RD=[i-RD_GS for i in Regu_RD]
Regulator=numpy.median(Regu_IL)/numpy.median(Regu_RD)
Regulator=1
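#note: the median-based Regulator computed above is immediately overridden, so the IL and RD terms are combined without rescaling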
ILTemp=[j/Regulator for j in Regu_IL]
RDTemp=[i for i in Regu_RD]
if deterministic_flag==0: DECISION_Score=Move_Decide_2(ILTemp,RDTemp,GC_Var_Coverage)
else: DECISION_Score=Move_Decide_deterministic(ILTemp,RDTemp,GC_Var_Coverage)
if DECISION_Score=='': continue
DECISION=DECISION_Score[0]
S_DECISION=Regu_IL[DECISION]+Regu_RD[DECISION]
Be_Letter=Letter_Rec[DECISION]
Be_BP=BP_Rec[DECISION]
if not S_DECISION in list(Score_rec_hash.keys()): Score_rec_hash[S_DECISION]=[]
Score_rec_hash[S_DECISION].append(Be_Letter)
if S_DECISION>Best_Score:
Best_Letter=[Be_Letter]
Best_BPs=[Be_BP]
Best_Score=S_DECISION
best_iterations=0
elif S_DECISION==Best_Score:
if not Be_Letter in Best_Letter:
Best_Letter+=[Be_Letter]
Best_BPs+=[Be_BP]
best_iterations+=1
else: best_iterations+=1
score_record.append(S_DECISION)
#best_score_rec.append(Best_Score)
t2_sptest=time.time()
if t2_sptest-t1_sptest<10 or bpsk1<4:
while True:
if Move_Step>Trail_Number: break
if best_iterations>Local_Minumum_Number:
if Best_Score_Rec==0:
best_iterations=0
Best_Score_Rec=Best_Score
Best_Letter_Rec=Best_Letter
Score_rec_hash[Best_Score_Rec]=Best_Letter_Rec
Best_BPs_Rec=Best_BPs
Be_Letter=Best_Letter[0]
Be_BP=Best_BPs[0]
Best_Score-=100
else:
if Best_Score<Best_Score_Rec: break_Iteration_Flag=1
elif Best_Score==Best_Score_Rec:
break_Iteration_Flag=1
for i in Best_Letter:
if not i in Best_Letter_Rec: Best_Letter_Rec.append(i)
else:
best_iterations=0
Best_Score_Rec=Best_Score
Best_Letter_Rec=Best_Letter
Best_BPs_Rec=Best_BPs
Be_Letter=Best_Letter[0]
Be_BP=Best_BPs[0]
Best_Score-=100
if break_Iteration_Flag>0: break
Move_Step+=1
Move_Sample_Pool=['delete','invert','insert']
Move_M_P=Move_Choose(Move_Sample_Pool,Ploidy,Initial_Move_Prob)
if Be_Letter[0]==[]: Move_M_P[0]='insert'
if Be_Letter[1]==[]: Move_M_P[1]='insert'
M_Move_Choices=Move_Choice_procedure_2(Move_M_P[0],Be_Letter[0],original_letters,'2m')
P_Move_Choices=Move_Choice_procedure_2(Move_M_P[1],Be_Letter[1],original_letters,'2p')
if M_Move_Choices=='ERROR!' and P_Move_Choices=='ERROR!':
Move_Step-=1
continue
if not M_Move_Choices=='ERROR!' and not M_Move_Choices==[]:
[P_IL,P_RD,P_DR,P_TB,Letter_Rec,BP_Rec]=[[],[],[],[],[],[]]
Af_Letter_BP_List=After_Letter_List_Produce_M(M_Move_Choices,Be_BP,Be_Letter,original_bp_list,Ploidy,Best_Score_Rec,Best_Letter_Rec,Block_CN_Upper)
for Af_Info_Number in range(Af_Letter_BP_List[2]):
[Af_Letter,Af_BP]=[Af_Letter_BP_List[0][Af_Info_Number],Af_Letter_BP_List[1][Af_Info_Number]]
Af_Info_all=Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP)
if Af_Info_all==0:continue
Letter_Rec.append(Af_Letter)
BP_Rec.append(Af_BP)
Af_IL_Penal=Af_Info_all[0]
Af_RD_Rec=Af_Info_all[1]
Af_DR_Penal=(Af_Info_all[2])**2
Af_TB_Penal_a=Af_Info_all[4]
Af_TB_Rec=Af_Info_all[3]
Af_TB_Penal=float(Af_TB_Penal_a)/float(num_of_reads)+float(Af_TB_Rec)/float(len(Af_Letter[0]+Af_Letter[1])+2)
Af_RD_Penal=RD_Adj_Penal(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter)
for key in list(Af_Info_all[5].keys()): Af_RD_Penal+=Prob_Norm(Af_Info_all[5][key],0,GC_Var_Coverage[chrom_N]/2)
P_IL.append(Af_IL_Penal)
P_RD.append(Af_RD_Penal)
P_DR.append(Af_DR_Penal/num_of_read_pairs)
P_TB.append(Af_TB_Penal)
if len(P_IL)==0: continue
Regu_IL=[P_IL[i]*(1+DR_Weight*P_DR[i]) for i in range(len(P_IL))]
Regu_RD=[P_RD[i]+P_TB[i] for i in range(len(P_RD))]
Regu_IL=[(i-IL_GS)*K_IL_new for i in Regu_IL]
Regu_RD=[i-RD_GS for i in Regu_RD]
Regulator=numpy.median(Regu_IL)/numpy.median(Regu_RD)
Regulator=1
ILTemp=[j/Regulator for j in Regu_IL]
RDTemp=[i for i in Regu_RD]
if deterministic_flag==0: DECISION_Score=Move_Decide_2(ILTemp,RDTemp,GC_Var_Coverage)
else: DECISION_Score=Move_Decide_deterministic(ILTemp,RDTemp,GC_Var_Coverage)
if DECISION_Score=='': continue
DECISION=DECISION_Score[0]
S_DECISION=Regu_IL[DECISION]+Regu_RD[DECISION]
Be_Letter=Letter_Rec[DECISION]
Be_BP=BP_Rec[DECISION]
if not S_DECISION in list(Score_rec_hash.keys()): Score_rec_hash[S_DECISION]=[Be_Letter]
else:
if not Be_Letter in Score_rec_hash[S_DECISION]: Score_rec_hash[S_DECISION].append(Be_Letter)
if S_DECISION>Best_Score:
Best_Letter=[Be_Letter]
Best_BPs=[Be_BP]
Best_Score=S_DECISION
best_iterations=0
elif S_DECISION==Best_Score:
if not Be_Letter in Best_Letter:
Best_Letter+=[Be_Letter]
Best_BPs+=[Be_BP]
best_iterations+=1
else: best_iterations+=1
score_record.append(S_DECISION)
#best_score_rec.append(Best_Score)
if not P_Move_Choices=='ERROR!' and not P_Move_Choices==[]:
[P_IL,P_RD,P_DR,P_TB,Letter_Rec,BP_Rec]=[[],[],[],[],[],[]]
Af_Letter_BP_List=After_Letter_List_Produce_P(P_Move_Choices,Be_BP,Be_Letter,original_bp_list,Ploidy,Best_Score_Rec,Best_Letter_Rec,Block_CN_Upper)
for Af_Info_Number in range(Af_Letter_BP_List[2]):
[Af_Letter,Af_BP]=[Af_Letter_BP_List[0][Af_Info_Number],Af_Letter_BP_List[1][Af_Info_Number]]
Af_Info_all=Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP)
if Af_Info_all==0: continue
Letter_Rec.append(Af_Letter)
BP_Rec.append(Af_BP)
Af_IL_Penal=Af_Info_all[0]
Af_RD_Rec=Af_Info_all[1]
Af_DR_Penal=(Af_Info_all[2])**2
Af_TB_Penal_a=Af_Info_all[4]
Af_TB_Rec=Af_Info_all[3]
Af_TB_Penal=float(Af_TB_Penal_a)/float(num_of_reads)+float(Af_TB_Rec)/float(len(Af_Letter[0]+Af_Letter[1])+2)
Af_RD_Penal=RD_Adj_Penal(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter)
for key in list(Af_Info_all[5].keys()): Af_RD_Penal+=Prob_Norm(Af_Info_all[5][key],0,GC_Var_Coverage[chrom_N])
P_IL.append(Af_IL_Penal)
P_RD.append(Af_RD_Penal)
P_DR.append(Af_DR_Penal/num_of_read_pairs)
P_TB.append(Af_TB_Penal)
if len(P_IL)==0: continue
Regu_IL=[P_IL[i]*(1+DR_Weight*P_DR[i]) for i in range(len(P_IL))]
Regu_RD=[P_RD[i]+P_TB[i] for i in range(len(P_RD))]
Regu_IL=[(i-IL_GS)*K_IL_new for i in Regu_IL]
Regu_RD=[i-RD_GS for i in Regu_RD]
Regulator=numpy.median(Regu_IL)/numpy.median(Regu_RD)
Regulator=1
ILTemp=[j/Regulator for j in Regu_IL]
RDTemp=[i for i in Regu_RD]
if deterministic_flag==0: DECISION_Score=Move_Decide_2(ILTemp,RDTemp,GC_Var_Coverage)
else: DECISION_Score=Move_Decide_deterministic(ILTemp,RDTemp,GC_Var_Coverage)
if DECISION_Score=='': continue
DECISION=DECISION_Score[0]
S_DECISION=Regu_IL[DECISION]+Regu_RD[DECISION]
Be_Letter=Letter_Rec[DECISION]
Be_BP=BP_Rec[DECISION]
if not S_DECISION in list(Score_rec_hash.keys()):
Score_rec_hash[S_DECISION]=[Be_Letter]
else:
if not Be_Letter in Score_rec_hash[S_DECISION]: Score_rec_hash[S_DECISION].append(Be_Letter)
if S_DECISION>Best_Score:
Best_Letter=[Be_Letter]
Best_BPs=[Be_BP]
Best_Score=S_DECISION
best_iterations=0
elif S_DECISION==Best_Score:
if not Be_Letter in Best_Letter:
Best_Letter+=[Be_Letter]
Best_BPs+=[Be_BP]
best_iterations+=1
else: best_iterations+=1
score_record.append(S_DECISION)
#best_score_rec.append(Best_Score)
else:
gaps=[]
bps2_new=[]
for k1 in bps2:
gaps.append([])
for k2 in range(len(k1)-2): gaps[-1].append(int(k1[k2+2])-int(k1[k2+1]))
for k1 in range(len(gaps)):
bps2_new.append([])
chr_rec=bps2[k1][0]
rec1=1
for k2 in range(len(gaps[k1])):
if gaps[k1][k2]==max(gaps[k1]):
bps2_new[-1].append([chr_rec]+bps2[k1][rec1:(k2+2)])
bps2_new[-1].append([chr_rec]+bps2[k1][(k2+1):(k2+3)])
rec1=k2+2
bps2_new[-1].append([chr_rec]+bps2[k1][rec1:])
for k1 in bps2_new:
for k2 in k1: bps_hash[max(bps_hash.keys())].append([k2])
Best_Letter_Rec=[]
Best_Score_Rec=100
struc_to_remove=[]
for bestletter in Best_Letter_Rec:
if '/'.join([''.join(bestletter[0]),''.join(bestletter[1])])==original_structure: struc_to_remove.append(bestletter)
Best_Letter_Rec=[i for i in Best_Letter_Rec if not i in struc_to_remove]
if Best_Letter_Rec==[] and Best_Score_Rec==100: continue
else: write_best_letter(bps2,Best_Letter_Rec,Best_Score_Rec,Score_rec_hash,original_letters)
else:
Score_rec_hash={}
bps_new={}
temp_Full_Info=original_bp_let_produce(chr_letter_bp,bps2)
original_letters=temp_Full_Info[1]
original_bp_list=temp_Full_Info[0]
for bl in Copy_num_Check:
for blk1 in list(chr_letter_bp.keys()):
for blk2 in sorted(chr_letter_bp[blk1].keys()):
if blk2==bl:
bps2_temp=[blk1]+[chr_letter_bp[blk1][blk2][0],chr_letter_bp[blk1][blk2][-1]]
copy_num_a=int(Copy_num_estimate[bl]/2)
if copy_num_a>50: continue
copy_num_b=Copy_num_estimate[bl]-copy_num_a
Best_Letter_Rec=[[['a' for i in range(copy_num_a)],['a' for i in range(copy_num_a)]]]
Best_Score_Rec=100
write_best_letter([bps2_temp],Best_Letter_Rec,Best_Score_Rec,Score_rec_hash,original_letters)
for blk1 in list(chr_letter_bp.keys()):
bps_new[blk1]=[]
for blk2 in sorted(chr_letter_bp[blk1].keys()):
if not blk2 in Copy_num_Check: bps_new[blk1].append([chr_letter_bp[blk1][blk2][0],chr_letter_bp[blk1][blk2][-1]])
bps_new_2=[]
for k1 in list(bps_new.keys()):
for k2 in bps_new[k1]:
if bps_new_2==[]: bps_new_2.append([k1]+k2)
else:
if k1==bps_new_2[-1][0] and k2[0]==bps_new_2[-1][-1]: bps_new_2[-1]+=k2[1:]
else: bps_new_2.append([k1]+k2)
for k1 in bps_new_2:
bps_hash[max(bps_hash.keys())].append([k1])
if function_name=='SVIntegrate':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','help=','long-insert=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']: readme.print_default_parameters_svintegrate()
else:
def all_sv_single_haploid_decide(k1_hap,k2_hap):
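#Tries each haploid classifier in turn (simple del, inv, tandem dup, dispersed dup, del-inv, dup-inv, del-dup-inv, del-dup, tra);
#a single block repeated more than 3 times falls back to a tandem-duplication call. Returns [blocks, sv_type] or ['FALSE','FALSE'].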
out='NA'
if not k1_hap==k2_hap:
hap_result=simple_del_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'del']
else:
hap_result=simple_inv_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'inv']
else:
hap_result=simple_tandup_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'tandup']
else:
hap_result=simple_disdup_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'disdup']
else:
hap_result=del_inv_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'del_inv']
else:
hap_result=dup_inv_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'dup_inv']
else:
hap_result=del_dup_inv_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'del_dup_inv']
else:
hap_result=del_dup_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'del_dup']
else:
hap_result=simple_tra_haploid_decide(k1_hap,k2_hap)
if not hap_result=='FALSE': out=[hap_result,'tra']
else:
if k1_hap=='a' and k2_hap.count('a')>3:
hap_result=[['a'],[k2_hap.count('a')]]
out=[hap_result,'tandup']
else:
out=['FALSE','FALSE']
return out
def block_modify(block,chromos):
#eg of block=['chr16', '34911339', '34913149', 'chr16', '34913149', '34913438']
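#returns [['chr16', '34911339', '34913438']] for the example above: adjacent blocks sharing a breakpoint are merged,
#since the shared position appears twice and is dropped.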
out=[]
for x in block:
if x in chromos:
if out==[]: out.append([x])
else:
if not x in out[-1]: out.append([x])
else: out[-1].append(x)
out_new=[]
for x in out:
out_new.append([])
for y in x:
if x.count(y)==1:
out_new[-1].append(y)
out_new_2=[]
for x in out_new:
if len(x)==3:
out_new_2.append(x)
else:
for y in range(int((len(x)-1)/2)):
out_new_2.append([x[0],x[2*y+1],x[2*y+2]])
return out_new_2
def bp_to_chr_hash(bps,chromos,flank_length=500):
#eg of bps=['chr16', '34910548', '34911339', '34913149', '34913438', '36181068', '36181482']
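#assigns consecutive letters 'a','b',... to each [chrom,start,end] interval and adds '-'/'+' flanking blocks of flank_length bp
#on either side of the event.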
temp1=[]
for i in bps:
if i in chromos:
temp1.append([i])
else:
temp1[-1].append(i)
out={}
rec=-1
for k1 in temp1:
for k2 in range(len(k1[2:])):
rec+=1
out[chr(97+rec)]=[k1[0],k1[k2+1],k1[k2+2]]
out['+']=[out[sorted(out.keys())[-1]][0],out[sorted(out.keys())[-1]][2],str(int(out[sorted(out.keys())[-1]][2])+flank_length)]
out['-']=[out['a'][0],str(int(out['a'][1])-flank_length),out['a'][1]]    #keep coordinates as strings, consistent with the '+' flank
return out
def chromos_readin(ref):
fin=open(ref+'.fai')
chromos=[]
for line in fin:
pin=line.strip().split()
chromos.append(pin[0])
fin.close()
return chromos
def complex_hash_unit_modify(complex_list):
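#Simple SV records (del/inv/disdup/tandup) pass through unchanged; otherwise records are regrouped by their trailing
#structure/breakpoint identifiers, and complex alleles (del_dup_inv, del_inv, del_dup) are rewritten into VCF-style rows whose
#info field lists the deleted, duplicated and inserted segments, while dup_inv records are kept as they are.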
simple_svs=['del','inv','disdup']
out=[]
simple_flag=0
for x in complex_list:
if x[3] in simple_svs: out.append(x)
elif x[3]=='tandup': out.append(x)
else: simple_flag+=1
if simple_flag==0: return out
else:
temp_hash_1={}
for k1 in complex_list:
if not k1[-1] in list(temp_hash_1.keys()): temp_hash_1[k1[-1]]={}
if not k1[-3] in list(temp_hash_1[k1[-1]].keys()): temp_hash_1[k1[-1]][k1[-3]]={}
if not k1[-2] in list(temp_hash_1[k1[-1]][k1[-3]].keys()): temp_hash_1[k1[-1]][k1[-3]][k1[-2]]=[]
temp_hash_1[k1[-1]][k1[-3]][k1[-2]].append(k1)
for k1 in list(temp_hash_1.keys()):
for k2 in list(temp_hash_1[k1].keys()):
for k3 in list(temp_hash_1[k1][k2].keys()):
allales_info={}
for k4 in temp_hash_1[k1][k2][k3]:
if not k4[4] in list(allales_info.keys()):allales_info[k4[4]]=[]
allales_info[k4[4]].append(k4)
for x in list(allales_info.keys()):
if allales_info[x][0][3]=='del_dup_inv':
info_column=[]
dup_inv_info=[]
ins_info=[]
for y in allales_info[x]:
if y[5]=='del_block=': info_column.append('del='+':'.join([y[0],'-'.join(y[1:3])]))
else:
if y[5]=='dup_inv_block=': dup_inv_info.append(y[:5])
elif y[5]=='insert_point=': ins_info.append([y[0],y[2]])
for y in range(len(dup_inv_info)):
vcf_single_rec=dup_inv_info[y]+[';'.join(info_column+['dup_inv='+':'.join([dup_inv_info[y][0],'-'.join(dup_inv_info[y][1:3])])]+['insert_point='+':'.join([str(i) for i in ins_info[y]])])]+[k2,k3,k1]
out.append(vcf_single_rec)
elif allales_info[x][0][3]=='del_inv':
blocks_pos=[]
for y in allales_info[x]:
if blocks_pos==[]: blocks_pos+=y[:3]
elif y[0]==blocks_pos[0]: blocks_pos+=y[1:3]
else: blocks_pos.append('Error')
if 'Error' in blocks_pos: continue
else:
blocks_pos=[blocks_pos[0],min([int(i) for i in blocks_pos[1:]]),max([int(i) for i in blocks_pos[1:]])]
[del_info,inv_info]=[[],[]]
for y in allales_info[x]:
if y[5]=='del':del_info.append('del='+':'.join([y[0],'-'.join(y[1:3])]))
elif y[5]=='inv':inv_info.append('inv='+':'.join([y[0],'-'.join(y[1:3])]))
vcf_single_rec=blocks_pos+['del_inv',x,';'.join(del_info+inv_info),k2,k3,k1]
out.append(vcf_single_rec)
elif allales_info[x][0][3]=='dup_inv':
for k4 in allales_info[x]:
out.append(k4)
elif allales_info[x][0][3]=='del_dup':
blocks_pos=[]
for y in allales_info[x]:
if blocks_pos==[]: blocks_pos+=y[:3]
elif y[0]==blocks_pos[0]: blocks_pos+=y[1:3]
else: blocks_pos.append('Error')
if 'Error' in blocks_pos: continue
else:
blocks_pos=[blocks_pos[0],min([int(i) for i in blocks_pos[1:]]),max([int(i) for i in blocks_pos[1:]])]
[del_info,dup_info]=[[],[]]
for y in allales_info[x]:
if y[5]=='del_block=':del_info.append('del='+':'.join([y[0],'-'.join(y[1:3])]))
elif y[5]=='dup_block=':dup_info.append('dup='+':'.join([y[0],'-'.join(y[1:3])]))
vcf_single_rec=blocks_pos+['del_dup',x,';'.join(del_info+dup_info),k2,k3,k1]
out.append(vcf_single_rec)
elif allales_info[x][0][3] in simple_svs+['tandup']: continue
else:
for k4 in allales_info[x]: out.append(k4)
return out
def Define_Default_SVIntegrate():
global score_Cff
if not '--qc-structure' in dict_opts:
score_Cff=0
else:
score_Cff=int(dict_opts['--qc-structure'])
def del_block_modify(del_block,chromos):
out=[]
for x in del_block:
out.append([])
for y in x:
out[-1]+=block_modify(y,chromos)
return out
def dup_block_modify(del_block,chromos):
out=[]
for x in del_block:
out.append([])
for y in x:
out[-1]+=block_modify(y,chromos)+[y[-1]]
return out
def dup_block_new_to_temp(dup_block_new):
#eg of dup_block_new=[['chr1', '246785645', '246785978'], 2, ['chr1', '246785645', '246786238'], 2]
temp=[[]]
for x in dup_block_new:
if type(x)==int:
temp[-1].append(x)
temp.append([])
else:
temp[-1].append(x)
return [i for i in temp if not i==[]]
def svelter_file_readin(svelter_file):
#eg of svelter_file='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/SVelter.version14/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.svelter'
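#skips the header line and returns {ref_structure: {alt_structure: [breakpoint lists]}} keyed on fields pin[4] and pin[5]
#of each record, with pin[3] split on ':' giving the breakpoint list.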
fin=open(svelter_file)
out_hash={}
pin=fin.readline().strip().split()
while True:
pin=fin.readline().strip().split()
if not pin: break
if not pin[4] in list(out_hash.keys()): out_hash[pin[4]]={}
if not pin[5] in list(out_hash[pin[4]].keys()): out_hash[pin[4]][pin[5]]=[]
if not pin[3].split(':') in out_hash[pin[4]][pin[5]]: out_hash[pin[4]][pin[5]].append(pin[3].split(':'))
fin.close()
return out_hash
def simple_del_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='a/a'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(simple_del_haploid_decide(k1_hap,x))
return out
def simple_del_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='ab' ; eg of k2_hap='b'
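#returns ['a'] for the example above: the deleted letter groups; 'FALSE' when the alt haplotype contains an inversion,
#a duplication, or reordered blocks.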
if k1_hap==k2_hap: return 'FALSE' #no alt
if k2_hap=='': return [k1_hap]
if '^' in k2_hap: return 'FALSE' #check if inv included
dup_test=[k2_hap.count(x) for x in k2_hap]
if max(dup_test)>1: return 'FALSE' #check if dup included
if len(k2_hap)==1 and len(k1_hap)>1: return letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap])) #del
pos_compare=[ord(k2_hap[i+1])-ord(k2_hap[i]) for i in range(len(k2_hap)-1)]
if min(pos_compare)<1: return 'FALSE'
return letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap]))
def simple_inv_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='ab^/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(simple_inv_haploid_decide(k1_hap,x))
return out
def simple_inv_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='ab' ; eg of k2_hap='b^a^'
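#returns ['ab'] for the example above: the inverted segment(s); 'FALSE' when a duplication is present or the
#re-oriented blocks do not reproduce k1_hap.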
if not '^' in k2_hap: return 'FALSE' #if not block inverted
if len(k2_hap.replace('^',''))==1 and len(k1_hap)==1: return [i for i in k1_hap]
dup_test=[k2_hap.count(i) for i in k2_hap if not i=='^']
if max(dup_test)>1: return 'FALSE'
inverted_sv_new=letter_subgroup(k2_hap)
if ''.join([i.replace('^','') for i in inverted_sv_new])==k1_hap: return [i[:-1] for i in inverted_sv_new if '^' in i]
else: return 'FALSE'
def simple_tandup_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='abb/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(simple_tandup_haploid_decide(k1_hap,x))
return out
def simple_tandup_haploid_decide(k1_hap,k2_hap):
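#eg of k1_hap='ab' ; eg of k2_hap='abb'
#returns [['b'], [2]] for that example: [duplicated segments, copy numbers]; 'FALSE' when an inversion or deletion is present
#or the collapsed haplotype does not reproduce k1_hap.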
if '^' in k2_hap: return 'FALSE'
dup_count=[k2_hap.count(i) for i in k1_hap]
if min(dup_count)<1 or max(dup_count)<2: return 'FALSE' #a block is deleted, or nothing is duplicated
out=[]
temp1=[]
for x in k2_hap:
if temp1==[]: temp1.append(x)
elif ord(x)-ord(temp1[-1][-1])==1: temp1[-1]+=x
else: temp1.append(x)
overlap_portion=[]
overlap_count=[]
for x in temp1:
if out==[]:
out.append(x)
else:
overlap=intersect(out[-1],x)
if not len(overlap) >len(out[-1]) and not len(overlap)>len(x):
if out[-1][-len(overlap):]==x[:len(overlap)]:
out[-1]+=x[len(overlap):]
if not overlap in overlap_portion:
overlap_portion.append(overlap)
overlap_count.append(2)
else:
overlap_count[overlap_portion.index(overlap)]+=1
else:
out.append(x)
else:
out.append(x)
if ''.join(out)==k1_hap:
return [overlap_portion,overlap_count]
return 'FALSE'
def simple_disdup_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='bab/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(simple_disdup_haploid_decide(k1_hap,x))
return out
def simple_disdup_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='abcd' ; eg of k2_hap='babdcd'
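#returns [dup_block_combined, insert_block] when removing one copy of each duplicated group restores k1_hap, where insert_block
#lists each inserted copy with its flanking letters; 'FALSE' when the duplication is tandem or the structure cannot be reduced to k1_hap.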
if not '^' in k2_hap:
if simple_tandup_haploid_decide(k1_hap,k2_hap)=='FALSE':
dup_dis=letter_subgroup(k2_hap)
overlap=[intersect(dup_dis[i],dup_dis[i+1]) for i in range(len(dup_dis)-1)]
if len(list_unify(overlap))==len(overlap):
dup_count=[k2_hap.count(i) for i in k1_hap]
if not min(dup_count)<1 and not max(dup_count)<2: #no deletion, and at least one block duplicated
dup_block=[k1_hap[i] for i in range(len(dup_count)) if dup_count[i]>1]
dup_block_combined=dup_block_combine(dup_block,k1_hap,k2_hap)
dis_dup_check=[]
no_dup_block=[]
for x in k2_hap:
if not x in dup_block:
no_dup_block.append(k2_hap.index(x))
for x in dup_block_combined:
dis_dup_check.append([])
for y in range(len(k2_hap)-len(x)+1):
if k2_hap[y:(y+len(x))]==x:
dis_dup_check[-1].append(y)
original_pos=[]
for x in itertools.product(*dis_dup_check):
x_modify_new=x_to_x_modify_new(x,dup_block_combined)
temp_structure=[k2_hap[i] for i in sorted(x_modify_new+no_dup_block)]
if ''.join(temp_structure)==k1_hap:
original_pos+=list(x)
if len(original_pos)>0:
insert_pos=[]
for i in dis_dup_check:
for j in i:
if not j in original_pos:
insert_pos.append(j)
k2_hap_new=['-']+[i for i in k2_hap]+['+']
insert_block=[]
pos_rec=-1
if len(insert_pos)==len(dup_block_combined):
for i in insert_pos:
pos_rec+=1
if len(dup_block_combined[pos_rec])==1:
insert_block.append([k2_hap_new[i],k2_hap_new[i+1],k2_hap_new[i+2]])
else:
insert_block.append([k2_hap_new[i]]+k2_hap_new[(i+1):(i+len(dup_block_combined[pos_rec])+2)])
#insert_block=[[k2_hap_new[i],k2_hap_new[i+1],k2_hap_new[i+2]] for i in insert_pos]
return [dup_block_combined,insert_block]
return 'FALSE'
def simple_tra_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='ba/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(simple_tra_haploid_decide(k1_hap,x))
return out
def simple_tra_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='abcd' ; eg of k2_hap='bacd'
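#returns [['-', 'b', 'a']] for the example above: each translocated block with its flanking neighbours
#('-' and '+' mark the outer flanks); 'FALSE' when an inversion, deletion or duplication is present.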
if not '^' in k2_hap:
if len(k2_hap)>1:
dup_test=[k2_hap.count(i) for i in k1_hap]
if min(dup_test)>0 and max(dup_test)<2: #no del no dup
letter_pos=[ord(i) for i in k2_hap]
letter_dis=[letter_pos[i+1]-letter_pos[i] for i in range(len(letter_pos)-1)]
tra_pos=[i for i in range(len(letter_dis)) if letter_dis[i]<0]
all_letter=['-']+[i for i in k2_hap]+['+']
tra_blocks=[[all_letter[i],all_letter[i+1],all_letter[i+2]] for i in tra_pos]
return tra_blocks
return 'FALSE'
def del_inv_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='abb/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(del_inv_haploid_decide(k1_hap,x))
return out
def del_inv_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='abcd' ; eg of k2_hap='ad^'
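#returns [['bc'], ['d^']] for the example above: [deleted groups, inverted groups]; 'FALSE' when a duplication
#or a reordering of the retained blocks is present.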
if len(k1_hap)>1: #del-inv cannot happen if only 1 block
if '^' in k2_hap: #inv in k2_hap
dup_test=[k2_hap.count(i) for i in k1_hap]
if max(dup_test)<2 and min(dup_test)<1: #no dup in k2_hap; del in k2_hap
if len(k2_hap.replace('^',''))==1:
return [letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap])),[k2_hap]]
else:
k2_new=letter_subgroup(k2_hap)
if len(k2_new)==1:
return [letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap])),k2_new]
else:
tra_test=[k1_hap.index(i[0]) for i in k2_new if not i=='^']
tra_dis=[tra_test[i+1]-tra_test[i] for i in range(len(tra_test)-1)]
if min(tra_dis)>0: #no tra in k2_hap
return [letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap])),[i for i in k2_new if '^' in i]]
return 'FALSE'
def dup_inv_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='abb/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(dup_inv_haploid_decide(k1_hap,x))
return out
def dup_inv_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='abcd' ; eg of k2_hap='ad^bcd'
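#returns [['d'], [['a', 'd^', 'bcd']]] for the example above: [duplicated blocks, each inverted duplication with its
#flanking blocks]; 'FALSE' when a deletion is present or the non-inverted skeleton differs from k1_hap.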
#if len(k1_hap)>1: #dup-inv cannot happen if only 1 block; only defined on multi-block event
if '^' in k2_hap: #inv in k2_hap
dup_test=[k2_hap.count(i) for i in k1_hap]
if max(dup_test)>1 and min(dup_test)>0: #no del in k2_hap; dup in k2_hap
dup_block=[k1_hap[i] for i in range(len(dup_test)) if dup_test[i]>1]
all_block=letter_subgroup(k2_hap)
if ''.join([i for i in all_block if not '^' in i])==k1_hap:
dup_inv_block=[i for i in all_block if '^' in i]
if dup_block==sorted([i for i in ''.join(dup_inv_block) if not i=='^']):
dup_pos=[i for i in range(len(all_block)) if all_block[i] in dup_inv_block]
all_block_with_flank=['-']+all_block+['+']
dup_neighber=[[all_block_with_flank[i],all_block_with_flank[i+1],all_block_with_flank[i+2]] for i in dup_pos]
return [dup_block,dup_neighber]
return 'FALSE'
def del_dup_inv_diploid_decide(k1,k2) :
#eg of k1='ab/ab' ; eg of k2='abb/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(del_dup_inv_haploid_decide(k1_hap,x))
return out
def del_dup_inv_haploid_decide(k1_hap,k2_hap) :
#eg of k1_hap='abcd' ; eg of k2_hap='ad^cd'
#out format: [[del_blocks],[dup_inv_blocks],[insert_points]]
if len(k1_hap)>1: #del-dup-inv only defined on multi-block events
if '^' in k2_hap: #inv in k2_hap
dup_test=[k2_hap.count(i) for i in k1_hap]
if max(dup_test)>1 and min(dup_test)<1: # del in k2_hap; dup in k2_hap
dup_block=[k1_hap[i] for i in range(len(dup_test)) if dup_test[i]>1]
all_block=letter_subgroup(k2_hap)
all_block_with_flank=['-']+all_block+['+']
pos_check=[ord(j[0]) for j in [i for i in all_block if not '^' in i]]
if len(pos_check)==1:
insert_point=[]
dup_inv_block=[i for i in all_block if '^' in i]
for x in dup_inv_block:
insert_point.append(all_block_with_flank[all_block_with_flank.index(x)-1])
return [letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap])),dup_inv_block,insert_point]
else:
if not interval_dis_calcu_min(pos_check)=='NA' and interval_dis_calcu_min(pos_check)>0:
dup_inv_block=[i for i in all_block if '^' in i]
if dup_block==sorted([i for i in ''.join(dup_inv_block) if not i=='^']):
insert_point=[]
for x in dup_inv_block:
insert_point.append(all_block_with_flank[all_block_with_flank.index(x)-1])
return [letter_subgroup(''.join([i for i in k1_hap if not i in k2_hap])),dup_inv_block,insert_point]
return 'FALSE'
def del_dup_diploid_decide(k1,k2):
#eg of k1='abc/abc' ; eg of k2='aac/ab'
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(del_dup_haploid_decide(k1_hap,x))
return out
def dup_block_combined_qc(all_combines):
#eg of all_combines=['a', 'b', 'c', 'd', 'ab', 'ac', 'ad', 'bc', 'bd', 'cd', 'abc', 'abd', 'acd', 'bcd', 'abcd']
out=[]
for x in all_combines:
if len(x)==1: out.append(x)
else:
temp=[ord(i) for i in x]
if interval_dis_calcu_max(temp)>1: continue
else: out.append(x)
return out
def dup_block_kept_qc(kept_dup):
#eg of kept_dup:
out=[]
if len(kept_dup)>0:
out.append(kept_dup[0])
for y in kept_dup[1:]:
flag_y=0
for z in out:
if y in z: flag_y+=1
if flag_y==0: out.append(y)
return out
def dup_block_combine(dup_block,k1_hap,k2_hap):
#eg of dup_block=['a', 'b']; k1_hap='abcd' ; k2_hap='abab'
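#returns ['ab'] for the example above: adjacent duplicated letters are merged into the longest runs that are themselves
#repeated in k2_hap.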
all_combines=[]
for x in range(len(dup_block)):
all_combines+=[''.join(list(i)) for i in list(itertools.combinations(dup_block,x+1))]
all_combines=dup_block_combined_qc(all_combines)
kept_dup=[]
for x in all_combines[::-1]:
if k2_hap.count(x)>1:
kept_dup.append(x)
return dup_block_kept_qc(kept_dup)[::-1]
def del_dup_haploid_decide(k1_hap,k2_hap):
#eg of k1_hap='abcd' ; eg of k2_hap='abb'
#out format: [[del_blocks],[dup_blocks]]
if len(k1_hap)>1: #del-dup only defined on multi-block events
if not '^' in k2_hap: #no inversion in k2_hap
dup_test=[k2_hap.count(i) for i in k1_hap]
if max(dup_test)>1 and min(dup_test)<1: # del in k2_hap; dup in k2_hap
dup_block=[k1_hap[i] for i in range(len(dup_test)) if dup_test[i]>1]
del_block=[i for i in k1_hap if not i in k2_hap]
#reorga_dup_block=[dup_block_combine([i for i in j],k1_hap,k2_hap) for j in letter_subgroup(''.join(dup_block))]
return [letter_subgroup(''.join(del_block)),dup_block_combine(dup_block,k1_hap,k2_hap)]
return 'FALSE'
def interval_dis_calcu_min(pos_check):
#eg of pos_check=[97,98]
if len(pos_check)>1:
out=[pos_check[i+1]-pos_check[i] for i in range(len(pos_check)-1)]
return min(out)
else:
return 'NA'
def interval_dis_calcu_max(pos_check):
#eg of pos_check=[97,98]
if len(pos_check)>1:
out=[pos_check[i+1]-pos_check[i] for i in range(len(pos_check)-1)]
return max(out)
else:
return 'NA'
def intersect(a, b):
return ''.join(sorted(list(set(a) & set(b))))
def letter_subgroup(k2_hap):
#eg of k2_hap='ac^b^'
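#returns ['a', 'bc^'] for the example above: consecutive letters are grouped into runs, and inverted runs are rewritten
#in forward order with a single trailing '^'.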
inverted_sv=[]
for x in k2_hap:
if not x=='^': inverted_sv.append(x)
else: inverted_sv[-1]+='^'
inverted_sv_2=[]
for x in inverted_sv:
if inverted_sv_2==[]: inverted_sv_2.append(x)
else:
if not '^' in inverted_sv_2[-1] and not '^' in x and ord(x)-ord(inverted_sv_2[-1][-1])==1: inverted_sv_2[-1]+=x
elif '^' in inverted_sv_2[-1] and '^' in x and ord(x[0])-ord(inverted_sv_2[-1][-2])==-1: inverted_sv_2[-1]+=x
else: inverted_sv_2.append(x)
inverted_sv_3=[]
for i in inverted_sv_2:
if not '^' in i: inverted_sv_3.append(i)
else:
inverted_sv_3.append(i.replace('^','')[::-1]+'^')
return inverted_sv_3
def let_to_block_info(let,let_hash):
#eg of let='ab'; eg of let_hash={'a': ['chrY', '10818935', '10819073'], 'b': ['chrY', '10819073', '10926507'], '+': ['chrY', '10926507', '10927007'], '-': ['chrY', '10818435', 10818935]}
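#translates a letter block into its [chr,start,end] coordinates via let_hash, skipping the inversion marker '^'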
out=[]
for i in let:
if not i=='^':
out+=let_hash[i]
return(block_modify(out,chromos))
def list_unify(list):
out=[]
for i in list:
if not i in out: out.append(i)
return out
def simple_multicopy_diploid_decide(k1,k2):
#eg of k1='ab/ab' ; eg of k2='aabaa/ab'
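#classifies each haplotype of k2 against the single reference haplotype of k1; 'NA' marks a haplotype identical to the reference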
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
out.append(simple_multicopy_haploid_decide(k1_hap,x))
return out
def svelter_to_vcf_new(svelter_hash):
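#classifies every ref/alt structure pair by trying increasingly complex models in order: simple del, inv, tandup, disdup, del+inv, dup+inv, del+dup+inv, del+dup and translocation, then per-allele classification; 'a/a' structures with >3 copies of 'a' are reported directly as high-copy tandups and anything unresolved as 'cannot_classify_for_now'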
vcf_info_out=[]
for k1 in list(svelter_hash.keys()):
for k2 in list(svelter_hash[k1].keys()):
if k1=='a/a' and k2.count('a')>3: #tandup
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
sv_info=simple_del_diploid_decide(k1,k2) #decide if simple del between k1 and k2
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
del_block=[]
for x in sv_info:
del_block.append([])
if not x=='NA':
for y in x:
del_block[-1].append([])
for z in y:
del_block[-1][-1]+=let_hash[z]
del_block_new=del_block_modify(del_block,chromos)
if del_block_new[0]==del_block_new[1]:
for x in del_block_new[0]:
vcf_info_out.append(x+['del','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
else:
for x in del_block_new[0]:
vcf_info_out.append(x+['del','1/0']+[k1,k2,':'.join([str(i) for i in k3])])
for x in del_block_new[1]:
vcf_info_out.append(x+['del','0/1']+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=simple_inv_diploid_decide(k1,k2) #decide if single inv between k1 and k2
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
del_block=[]
for x in sv_info:
del_block.append([])
if not x=='NA':
for y in x:
del_block[-1].append([])
for z in y:
del_block[-1][-1]+=let_hash[z]
del_block_new=del_block_modify(del_block,chromos)
if del_block_new[0]==del_block_new[1]:
for x in del_block_new[0]:
vcf_info_out.append(x+['inv','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
else:
for x in del_block_new[0]:
vcf_info_out.append(x+['inv','1/0']+[k1,k2,':'.join([str(i) for i in k3])])
for x in del_block_new[1]:
vcf_info_out.append(x+['inv','0/1']+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=simple_tandup_diploid_decide(k1,k2) #decide if single tandup between k1 and k2
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
del_block=[]
for x in sv_info:
del_block.append([])
if not x=='NA':
for y in x[0]:
del_block[-1].append([])
for z in y:
del_block[-1][-1]+=let_hash[z]
block_rec1=-1
for x in del_block:
block_rec1+=1
block_rec2=-1
for y in x:
block_rec2+=1
y+=[sv_info[block_rec1][1][block_rec2]]
del_block_new=dup_block_modify(del_block,chromos)
if del_block_new[0]==del_block_new[1]:
temp_dup=dup_block_new_to_temp(del_block_new[0])
for x in temp_dup:
for y in x[:-1]:
vcf_info_out.append(y+['tandup','./.','CN='+str(x[-1])]+[k1,k2,':'.join([str(i) for i in k3])])
else:
temp_dup=dup_block_new_to_temp(del_block_new[0])
for x in temp_dup:
for y in x[:-1]:
vcf_info_out.append(y+['tandup','1/0','CN='+str(x[-1])]+[k1,k2,':'.join([str(i) for i in k3])])
temp_dup=dup_block_new_to_temp(del_block_new[1])
for x in temp_dup:
for y in x[:-1]:
vcf_info_out.append(y+['tandup','0/1','CN='+str(x[-1])]+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=simple_disdup_diploid_decide(k1,k2) #decide if single disdup between k1 and k2
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
dup_block=[]
for x in sv_info:
dup_block.append([])
if not x=='NA':
for y in x[1]:
dup_block_temp=[]
for i in y[1:-1]:
dup_block_temp+=block_modify(let_hash[i],chromos)
ins_block_temp=[]
if let_hash[y[0]][2]==let_hash[y[-1]][1]:
ins_block_temp.append([let_hash[y[0]][0],let_hash[y[0]][2]])
if ins_block_temp==[]:
ins_block_temp=[['NA']]
dup_block[-1].append(dup_block_temp+ins_block_temp)
if dup_block[0]==dup_block[1]:
for y in dup_block[0]:
vcf_info_out.append(y[0]+['disdup','1/1']+['insert_point='+':'.join(y[1])]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for y in dup_block[0]:
vcf_info_out.append(y[0]+['disdup','1/0']+['insert_point='+':'.join(y[1])]+[k1,k2,':'.join([str(i) for i in k3])])
for y in dup_block[1]:
vcf_info_out.append(y[0]+['disdup','0/1']+['insert_point='+':'.join(y[1])]+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=del_inv_diploid_decide(k1,k2) #decide if del+inv
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
del_inv_block=[]
if sv_info[0]==sv_info[1]:
if not sv_info[0]=='NA':
x=sv_info[0]
del_block_temp=[]
for i in x[0]: del_block_temp+=let_to_block_info(i,let_hash)
inv_block_temp=[]
for i in x[1]: inv_block_temp+=let_to_block_info(i,let_hash)
for i in del_block_temp: vcf_info_out.append(i+['del_inv','1/1','del']+[k1,k2,':'.join([str(i) for i in k3])])
for i in inv_block_temp: vcf_info_out.append(i+['del_inv','1/1','inv']+[k1,k2,':'.join([str(i) for i in k3])])
else:
if not sv_info[0]=='NA':
x=sv_info[0]
del_block_temp=[]
for i in x[0]: del_block_temp+=let_to_block_info(i,let_hash)
inv_block_temp=[]
for i in x[1]: inv_block_temp+=let_to_block_info(i,let_hash)
for i in del_block_temp: vcf_info_out.append(i+['del_inv','1/0','del']+[k1,k2,':'.join([str(i) for i in k3])])
for i in inv_block_temp: vcf_info_out.append(i+['del_inv','1/0','inv']+[k1,k2,':'.join([str(i) for i in k3])])
if not sv_info[1]=='NA':
x=sv_info[1]
del_block_temp=[]
for i in x[0]: del_block_temp+=let_to_block_info(i,let_hash)
inv_block_temp=[]
for i in x[1]: inv_block_temp+=let_to_block_info(i,let_hash)
for i in del_block_temp: vcf_info_out.append(i+['del_inv','0/1','del']+[k1,k2,':'.join([str(i) for i in k3])])
for i in inv_block_temp: vcf_info_out.append(i+['del_inv','0/1','inv']+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=dup_inv_diploid_decide(k1,k2) #decide if dup+inv
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
dup_inv_block=[]
for x in sv_info:
dup_inv_block.append([])
if not x=='NA':
for y in x[1]:
dup_inv_let=[]
for z in y[1]:
if not z=='^':
dup_inv_let+=let_hash[z]
dup_inv_let=block_modify(dup_inv_let,chromos)
insert_point=[]
if let_hash[y[0].replace('^','')[-1]][2]==let_hash[y[-1].replace('^','')[0]][1]:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
else:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
#if insert_point==[]: insert_point=[['NA']]
dup_inv_block[-1].append([dup_inv_let,insert_point])
if sv_info[0]==sv_info[1]:
for x in dup_inv_block[0]:
y=x
for z in range(len(y[0])):
vcf_info_out.append(y[0][z]+['dup_inv','1/1','insert_point='+':'.join([str(i) for i in y[1][0]])]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for x in dup_inv_block[0]:
y=x
for z in range(len(y[0])):
vcf_info_out.append(y[0][z]+['dup_inv','1/0','insert_point='+':'.join([str(i) for i in y[1][0]])]+[k1,k2,':'.join([str(i) for i in k3])])
for x in dup_inv_block[1]:
y=x
for z in range(len(y[0])):
vcf_info_out.append(y[0][z]+['dup_inv','0/1','insert_point='+':'.join([str(i) for i in y[1][0]])]+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=del_dup_inv_diploid_decide(k1,k2) #decide if del+dup+inv
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
del_dup_inv_block=[]
let_hash=bp_to_chr_hash(k3,chromos)
for x in sv_info:
del_dup_inv_block.append([])
if not x=='NA':
del_block=[let_to_block_info(i,let_hash) for i in x[0]]
dup_inv_block=[let_to_block_info(i,let_hash) for i in x[1]]
ins_pos=[let_to_block_info(i,let_hash) for i in x[2]]
del_dup_inv_block[-1]+=[del_block,dup_inv_block,ins_pos]
if sv_info[0]==sv_info[1]:
for i1 in del_dup_inv_block[0][0]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_inv_block[0][1]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/1','dup_inv_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_inv_block[0][2]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/1','insert_point=']+[k1,k2,':'.join([str(i) for i in k3])])
else:
if not del_dup_inv_block[0]==[]:
for i1 in del_dup_inv_block[0][0]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/0','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_inv_block[0][1]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/0','dup_inv_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_inv_block[0][2]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/0','insert_point=']+[k1,k2,':'.join([str(i) for i in k3])])
if not del_dup_inv_block[1]==[]:
for i1 in del_dup_inv_block[1][0]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','0/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_inv_block[1][1]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','0/1','dup_inv_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_inv_block[1][2]:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','0/1','insert_point=']+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=del_dup_diploid_decide(k1,k2) #decide if del+dup
if not 'FALSE' in sv_info:
for k3 in svelter_hash[k1][k2]:
del_dup_block=[]
let_hash=bp_to_chr_hash(k3,chromos)
for x in sv_info:
del_dup_block.append([])
if not x=='NA':
del_block=[let_to_block_info(i,let_hash) for i in x[0]]
dup_inv_block=[let_to_block_info(i,let_hash) for i in x[1]]
del_dup_block[-1]+=[del_block,dup_inv_block]
if sv_info[0]==sv_info[1]:
for i1 in del_dup_block[0][0]:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_block[0][1]:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/1','dup_block=']+[k1,k2,':'.join([str(i) for i in k3])])
else:
if not del_dup_block[0]==[]:
for i1 in del_dup_block[0][0]:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/0','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_block[0][1]:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/0','dup_block=']+[k1,k2,':'.join([str(i) for i in k3])])
if not del_dup_block[1]==[]:
for i1 in del_dup_block[1][0]:
for j1 in i1: vcf_info_out.append(j1+['del_dup','0/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in del_dup_block[1][1]:
for j1 in i1: vcf_info_out.append(j1+['del_dup','0/1','dup_block=']+[k1,k2,':'.join([str(i) for i in k3])])
if 'FALSE' in sv_info:
sv_info=simple_tra_diploid_decide(k1,k2) #decide if simple translocation
if 'FALSE' in sv_info:
if k2.split('/')[0]==k2.split('/')[1]: #homo-alt
allele_sv_info=all_sv_single_haploid_decide(k1.split('/')[0],k2.split('/')[0])
if not allele_sv_info=='NA':
if not 'FALSE' in allele_sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
if allele_sv_info[1]=='del':
del_blocks=[let_to_block_info(i,let_hash) for i in allele_sv_info[0]]
for x in del_blocks:
for y in x: vcf_info_out.append(y+['del','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='inv':
del_blocks=[let_to_block_info(i,let_hash) for i in allele_sv_info[0]]
for x in del_blocks:
for y in x: vcf_info_out.append(y+['inv','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='tandup':
dup_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
block_rec=-1
for x in dup_block:
block_rec+=1
block_cn=allele_sv_info[0][1][block_rec]
for y in x:
vcf_info_out.append(y+['tandup','./.','CN='+str(block_cn)]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='disdup':
for x in allele_sv_info[0][1]:
letters_disdup=letter_subgroup(''.join(x[1:-1]))
dup_block=[let_to_block_info(i,let_hash) for i in letters_disdup]
if let_hash[x[0]][2]==let_hash[x[-1]][1]:
insert_point=[let_hash[x[0]][0],let_hash[x[0]][2]]
else:
insert_point=[let_hash[x[0]][0],let_hash[x[0]][2]]
#insert_point=['Not','Known']
for y in dup_block:
for z in y:
vcf_info_out.append(z+['disdup','1/1','insert_point='+':'.join([str(i) for i in insert_point])]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_inv':
x=allele_sv_info[0]
del_block_temp=[]
for i in x[0]: del_block_temp+=let_to_block_info(i,let_hash)
inv_block_temp=[]
for i in x[1]: inv_block_temp+=let_to_block_info(i,let_hash)
for i in del_block_temp: vcf_info_out.append(i+['del_inv','1/1','del']+[k1,k2,':'.join([str(i) for i in k3])])
for i in inv_block_temp: vcf_info_out.append(i+['del_inv','1/1','inv']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='dup_inv':
x=allele_sv_info[0]
for y in x[1]:
dup_inv_let=[]
for z in y[1]:
if not z=='^':
dup_inv_let+=let_hash[z]
dup_inv_let=block_modify(dup_inv_let,chromos)
insert_point=[]
if let_hash[y[0].replace('^','')[-1]][2]==let_hash[y[-1].replace('^','')[0]][1]:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
else:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
#if insert_point==[]: insert_point=[['Not','Kown']]
for z in dup_inv_let:
vcf_info_out.append(z+['dup_inv','1/1','insert_point='+':'.join([str(i) for i in insert_point[0]])]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_dup_inv':
del_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
dup_inv_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][1]]
ins_pos=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][2]]
for i1 in del_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in dup_inv_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/1','dup_inv_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in ins_pos:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/1','insert_point=']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_dup':
del_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
dup_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][1]]
for i1 in del_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in dup_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/1','dup_block=']+[k1,k2,':'.join([str(i) for i in k3])])
else:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif k1=='a/a' and k2.count('a')>3: #tandup, high copynumber
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif k1=='a/a' and k2.count('a')>3: #tandup, high copynumber
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
else:
allele_sv_info=all_sv_single_haploid_decide(k1.split('/')[0],k2.split('/')[0]) #allele_1
if not allele_sv_info=='NA':
if not 'FALSE' in allele_sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
if allele_sv_info[1]=='del':
del_blocks=[let_to_block_info(i,let_hash) for i in allele_sv_info[0]]
for x in del_blocks:
for y in x: vcf_info_out.append(y+['del','1/0']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='inv':
del_blocks=[let_to_block_info(i,let_hash) for i in allele_sv_info[0]]
for x in del_blocks:
for y in x: vcf_info_out.append(y+['inv','1/0']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='tandup':
dup_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
block_rec=-1
for x in dup_block:
block_rec+=1
block_cn=allele_sv_info[0][1][block_rec]
for y in x:
vcf_info_out.append(y+['tandup','1/0','CN='+str(block_cn)]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='disdup':
for x in allele_sv_info[0][1]:
letters_disdup=letter_subgroup(''.join(x[1:-1]))
dup_block=[let_to_block_info(i,let_hash) for i in letters_disdup]
if let_hash[x[0]][2]==let_hash[x[-1]][1]:
insert_point=[let_hash[x[0]][0],let_hash[x[0]][2]]
else:
insert_point=[let_hash[x[0]][0],let_hash[x[0]][2]]
#insert_point=['Not','Known']
for y in dup_block:
for z in y:
vcf_info_out.append(z+['disdup','1/0','insert_point='+':'.join([str(i) for i in insert_point])]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_inv':
x=allele_sv_info[0]
del_block_temp=[]
for i in x[0]: del_block_temp+=let_to_block_info(i,let_hash)
inv_block_temp=[]
for i in x[1]: inv_block_temp+=let_to_block_info(i,let_hash)
for i in del_block_temp: vcf_info_out.append(i+['del_inv','1/0','del']+[k1,k2,':'.join([str(i) for i in k3])])
for i in inv_block_temp: vcf_info_out.append(i+['del_inv','1/0','inv']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='dup_inv':
x=allele_sv_info[0]
for y in x[1]:
dup_inv_let=[]
for z in y[1]:
if not z=='^':
dup_inv_let+=let_hash[z]
dup_inv_let=block_modify(dup_inv_let,chromos)
insert_point=[]
if let_hash[y[0].replace('^','')[-1]][2]==let_hash[y[-1].replace('^','')[0]][1]:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
else:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
#if insert_point==[]: insert_point=[['Not','Kown']]
for z in dup_inv_let:
vcf_info_out.append(z+['dup_inv','1/0','insert_point='+':'.join([str(i) for i in insert_point[0]])]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_dup_inv':
del_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
dup_inv_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][1]]
ins_pos=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][2]]
for i1 in del_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/0','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in dup_inv_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/0','dup_inv_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in ins_pos:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','1/0','insert_point=']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_dup':
del_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
dup_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][1]]
for i1 in del_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/0','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in dup_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup','1/0','dup_block=']+[k1,k2,':'.join([str(i) for i in k3])])
else:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/0']+[k1,k2,':'.join([str(i) for i in k3])])
elif k1=='a/a' and k2.count('a')>3: #tandup, high copynumber
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif k1=='a/a' and k2.count('a')>3: #tandup, high copynumber
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
allele_sv_info=all_sv_single_haploid_decide(k1.split('/')[0],k2.split('/')[1]) #allele_2
if not allele_sv_info=='NA':
if not 'FALSE' in allele_sv_info:
for k3 in svelter_hash[k1][k2]:
let_hash=bp_to_chr_hash(k3,chromos)
if allele_sv_info[1]=='del':
del_blocks=[let_to_block_info(i,let_hash) for i in allele_sv_info[0]]
for x in del_blocks:
for y in x: vcf_info_out.append(y+['del','0/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='inv':
del_blocks=[let_to_block_info(i,let_hash) for i in allele_sv_info[0]]
for x in del_blocks:
for y in x: vcf_info_out.append(y+['inv','0/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='tandup':
dup_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
block_rec=-1
for x in dup_block:
block_rec+=1
block_cn=allele_sv_info[0][1][block_rec]
for y in x:
vcf_info_out.append(y+['tandup','0/1','CN='+str(block_cn)]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='disdup':
for x in allele_sv_info[0][1]:
letters_disdup=letter_subgroup(''.join(x[1:-1]))
dup_block=[let_to_block_info(i,let_hash) for i in letters_disdup]
if let_hash[x[0]][2]==let_hash[x[-1]][1]:
insert_point=[let_hash[x[0]][0],let_hash[x[0]][2]]
else:
insert_point=[let_hash[x[0]][0],let_hash[x[0]][2]]
#insert_point=['Not','Known']
for y in dup_block:
for z in y:
vcf_info_out.append(z+['disdup','0/1','insert_point='+':'.join([str(i) for i in insert_point])]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_inv':
x=allele_sv_info[0]
del_block_temp=[]
for i in x[0]: del_block_temp+=let_to_block_info(i,let_hash)
inv_block_temp=[]
for i in x[1]: inv_block_temp+=let_to_block_info(i,let_hash)
for i in del_block_temp: vcf_info_out.append(i+['del_inv','0/1','del']+[k1,k2,':'.join([str(i) for i in k3])])
for i in inv_block_temp: vcf_info_out.append(i+['del_inv','0/1','inv']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='dup_inv':
x=allele_sv_info[0]
for y in x[1]:
dup_inv_let=[]
for z in y[1]:
if not z=='^':
dup_inv_let+=let_hash[z]
dup_inv_let=block_modify(dup_inv_let,chromos)
insert_point=[]
if let_hash[y[0].replace('^','')[-1]][2]==let_hash[y[-1].replace('^','')[0]][1]:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
else:
insert_point.append([let_hash[y[0].replace('^','')[-1]][0],let_hash[y[0].replace('^','')[-1]][2]])
#if insert_point==[]: insert_point=[['Not','Kown']]
for z in dup_inv_let:
vcf_info_out.append(z+['dup_inv','0/1','insert_point='+':'.join([str(i) for i in insert_point[0]])]+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_dup_inv':
del_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
dup_inv_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][1]]
ins_pos=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][2]]
for i1 in del_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','0/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in dup_inv_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','0/1','dup_inv_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in ins_pos:
for j1 in i1: vcf_info_out.append(j1+['del_dup_inv','0/1','insert_point=']+[k1,k2,':'.join([str(i) for i in k3])])
elif allele_sv_info[1]=='del_dup':
del_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][0]]
dup_block=[let_to_block_info(i,let_hash) for i in allele_sv_info[0][1]]
for i1 in del_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup','0/1','del_block=']+[k1,k2,':'.join([str(i) for i in k3])])
for i1 in dup_block:
for j1 in i1: vcf_info_out.append(j1+['del_dup','0/1','dup_block=']+[k1,k2,':'.join([str(i) for i in k3])])
else:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','0/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif k1=='a/a' and k2.count('a')>3: #tandup, high copynumber
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
elif k1=='a/a' and k2.count('a')>3: #tandup, high copynumber
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append(k3+['tandup','./.','CN='+str(k2.count('a'))]+[k1,k2,':'.join([str(i) for i in k3])])
else:
for k3 in svelter_hash[k1][k2]:
vcf_info_out.append([k3[0],k3[1],k3[-1]]+['cannot_classify_for_now','1/1']+[k1,k2,':'.join([str(i) for i in k3])])
return vcf_info_out
def vcf_info_out_modify_1(vcf_list):
out=[]
complex_list=[]
for x in vcf_list:
if complex_list==[]: complex_list.append(x)
else:
if x[-3]==complex_list[-1][-3] and x[-2]==complex_list[-1][-2]: complex_list.append(x)
else:
out+=complex_hash_unit_modify(complex_list)
complex_list=[x]
if not complex_list==[]: out+=complex_hash_unit_modify(complex_list) #flush the final group collected in the loop
return order_vcf_list(out,chromos)
def order_vcf_list(vcf_list,chromos):
vcf_hash={}
for k1 in vcf_list:
if not k1[0] in list(vcf_hash.keys()): vcf_hash[k1[0]]={}
if not int(k1[1]) in list(vcf_hash[k1[0]].keys()): vcf_hash[k1[0]][int(k1[1])]={}
if not int(k1[2]) in list(vcf_hash[k1[0]][int(k1[1])].keys()): vcf_hash[k1[0]][int(k1[1])][int(k1[2])]=[]
if not k1 in vcf_hash[k1[0]][int(k1[1])][int(k1[2])]: vcf_hash[k1[0]][int(k1[1])][int(k1[2])].append(k1)
vcf_out=[]
for k1 in chromos:
if k1 in list(vcf_hash.keys()):
for k2 in sorted(vcf_hash[k1].keys()):
for k3 in sorted(vcf_hash[k1][k2].keys()):
for k4 in vcf_hash[k1][k2][k3]:
if not k4 in vcf_out: vcf_out.append(k4)
return vcf_out
def overlap_csv_diploid_decide(k1,k2):
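#tries the haploid classifiers in order (del, inv, tandup, disdup, del_inv, dup_inv, del_dup_inv, del_dup, tra) for each haplotype of k2 and records the first match plus its type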
k2_haps=k2.split('/')
k1_hap=k1.split('/')[0]
out=[]
out_type=[]
for x in k2_haps:
if x==k1_hap: out.append('NA')
else:
hap_result=simple_del_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('del')
else:
hap_result=simple_inv_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('inv')
else:
hap_result=simple_tandup_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('tandup')
else:
hap_result=simple_disdup_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('disdup')
else:
hap_result=del_inv_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('del_inv')
else:
hap_result=dup_inv_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('dup_inv')
else:
hap_result=del_dup_inv_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('del_dup_inv')
else:
hap_result=del_dup_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('del_dup')
else:
hap_result=simple_tra_haploid_decide(k1_hap,x)
if not hap_result=='FALSE':
out.append(hap_result)
out_type.append('tra')
else:
out.append('FALSE')
out_type.append('FALSE')
return out+out_type
def write_svelter_list(vcf_list_modi_1,fileout_prefix):
#write output in svelter format
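#columns: chr, start, end, bp_info (k1[-1]), ref and alt structures (k1[-3], k1[-2]), sv_class (k1[3]), genotype (k1[4]) and the remaining fields joined as other_info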
fo=open(fileout_prefix+'.svelter','w')
print('\t'.join(['chr','start','end','bp_info','ref','alt','sv_class','genotype','other_info']), file=fo) #header uses the same tab delimiter as the records
for k1 in vcf_list_modi_1:
print('\t'.join([str(i) for i in k1[:3]+[k1[-1],k1[-3],k1[-2]]+[k1[3],k1[4]]+['/'.join([str(i) for i in k1[5:-3]])]]), file=fo)
fo.close()
def write_vcf_header(sample_list,file_out):
fo=open(file_out,'w')
print('##fileformat=VCFv4.2', file=fo)
print('##fileDate='+time.strftime("%Y%m%d"), file=fo)
print('##reference='+ref, file=fo)
fref=open(ref+'.fai')
for line in fref:
pin=line.strip().split()
print(''.join(['##contig=<ID=',pin[0],',length=',pin[1],'>']), file=fo)
fref.close()
print('##INFO=<ID=END,Number=1,Type=Integer,Description="End position of the variant described in this record">', file=fo)
print('##INFO=<ID=SVTYPE,Number=1,Type=String,Description="Type of structural variant">', file=fo)
print('##INFO=<ID=insert_point,Number=1,Type=String,Description="insertion point">', file=fo)
print('##INFO=<ID=del,Number=1,Type=String,Description="position of deleted region">', file=fo)
print('##INFO=<ID=dup,Number=1,Type=String,Description="position of duplicated region">', file=fo)
print('##INFO=<ID=inv,Number=1,Type=String,Description="position of inverted region">', file=fo)
print('##INFO=<ID=dup_inv,Number=1,Type=String,Description="position of inverted duplicated region">', file=fo)
print('##INFO=<ID=CN,Number=1,Type=String,Description="copy number estimation of tandem duplications">', file=fo)
print('##INFO=<ID=Other,Number=1,Type=String,Description="breakpoints and predicted structures by SVelter">', file=fo)
print('##INFO=<ID=bps,Number=1,Type=String,Description="all breakpoints detected by SVelter in this structural variant">', file=fo)
print('##INFO=<ID=ref_structure,Number=1,Type=String,Description="reference structure used by SVelter, each letter stands for a genomic region within adjacent breakpoints">', file=fo)
print('##INFO=<ID=alt_structure,Number=1,Type=String,Description="alternative structure predicted by SVelter">', file=fo)
print('##FILTER=<ID=LowQual,Description="Score of final structure - theoretical score < -50">', file=fo)
print('##ALT=<ID=DEL,Description="Deletion">', file=fo)
print('##ALT=<ID=DUP,Description="Duplication">', file=fo)
print('##ALT=<ID=INV,Description="Inversion">', file=fo)
print('##ALT=<ID=TRA,Description="Translocation">', file=fo)
print('##ALT=<ID=INS,Description="Insertion">', file=fo)
print('##ALT=<ID=DEL_INV,Description="Deletion and Inversion">', file=fo)
print('##ALT=<ID=DUP_INV,Description="Duplication and Inversion">', file=fo)
print('##ALT=<ID=DEL_DUP_INV,Description="Deletion, Duplication and Inversion">', file=fo)
print('##ALT=<ID=DEL_DUP,Description="Deletion and Duplication">', file=fo)
print('##FORMAT=<ID=GT,Number=1,Type=String,Description="Genotype">', file=fo)
print('##FORMAT=<ID=GQ,Number=1,Type=Integer,Description="Genotype quality">', file=fo)
print('##FORMAT=<ID=GL,Number=G,Type=Float,Description="Genotype Likelihood, log10-scaled likelihoods of the data given the called genotype for each possible genotype generated from the reference and alternate alleles given the sample ploidy">', file=fo)
print('\t'.join(['#CHROM','POS','ID','REF','ALT','QUAL','FILTER','INFO','FORMAT']+sample_list), file=fo)
fo.close()
def write_vcf_list(vcf_list_modi_1,chromos,fileout_prefix,sample_name,qc_score_info):
#write output in vcf format
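#writes the VCF header first, then one record per classified event; QUAL comes from qc_score_info, FILTER is PASS or LowQual against score_Cff, and tandem duplications report a CN field instead of a GT genotype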
write_vcf_header([sample_name],fileout_prefix+'.vcf')
write_svelter_list(vcf_list_modi_1,fileout_prefix)
file_out_name=fileout_prefix+'.vcf'
fo=open(file_out_name,'a')
for k1 in vcf_list_modi_1:
other_info='/'.join([str(i) for i in k1[5:-3]])
if other_info=='':
if not k1[3]=='tandup':
print('\t'.join([str(i) for i in k1[:2]+[k1[-1],ref_base_readin(ref,k1[0],k1[1])]+['<'+k1[3].upper()+'>']+[qc_score_info[k1[-1]],('PASS','LowQual')[qc_score_info[k1[-1]]<score_Cff]]+[';'.join(['SVTYPE='+k1[3],'END='+str(k1[2]),'Other='+'_'.join(k1[-3:])])]+['GT',k1[4]]]), file=fo)
else:
print('\t'.join([str(i) for i in k1[:2]+[k1[-1],ref_base_readin(ref,k1[0],k1[1])]+['<'+k1[3].upper()+'>']+[qc_score_info[k1[-1]],('PASS','LowQual')[qc_score_info[k1[-1]]<score_Cff]]+[';'.join(['SVTYPE='+k1[3],'END='+str(k1[2])])]+['CN',other_info.split('=')[1]]]), file=fo)
else:
if not k1[3]=='tandup':
print('\t'.join([str(i) for i in k1[:2]+[k1[-1],ref_base_readin(ref,k1[0],k1[1])]+['<'+k1[3].upper()+'>']+[qc_score_info[k1[-1]],('PASS','LowQual')[qc_score_info[k1[-1]]<score_Cff]]+[';'.join(['SVTYPE='+k1[3],'END='+str(k1[2]),other_info,'Other='+'_'.join(k1[-3:])])]+['GT',k1[4]]]), file=fo)
else:
print('\t'.join([str(i) for i in k1[:2]+[k1[-1],ref_base_readin(ref,k1[0],k1[1])]+['<'+k1[3].upper()+'>']+[qc_score_info[k1[-1]],('PASS','LowQual')[qc_score_info[k1[-1]]<score_Cff]]+[';'.join(['SVTYPE='+k1[3],'END='+str(k1[2]),other_info])]+['CN',other_info.split('=')[1]]]), file=fo)
fo.close()
def read_in_structures(filein):
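#each .coverge record spans five lines: breakpoints, predicted structure, a 'Theoretical ...' score, a 'Current ...' score and a 'Time ...' line; the current-minus-theoretical score is stored with the breakpoints in sv_info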
fin=open(filein)
while True:
pin1=fin.readline().strip().split()
if not pin1: break
if pin1[0]=='Total': break
pin2=fin.readline().strip().split()
pin3=fin.readline().strip().split()
pin4=fin.readline().strip().split()
pin5=fin.readline().strip().split()
if pin3[0]=='Theoretical' and pin4[0]=='Current' and pin5[0]=='Time':
let1=bp_to_let([pin1],chromos)
if not let1==0:
let2='/'.join(sorted(pin2[0].split('/')))
if not let1 in list(sv_info.keys()): sv_info[let1]={}
if not let2 in list(sv_info[let1].keys()): sv_info[let1][let2]=[]
if not pin1 in sv_info[let1][let2]: sv_info[let1][let2].append(pin1+[float(pin4[-1])-float(pin3[-1])])
fin.close()
def ref_base_readin(ref,chr,pos):
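#queries a single reference base with 'samtools faidx'; the first line returned is the fasta header and the second holds the base ('N' if nothing is returned)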
fin=os.popen(r'''samtools faidx %s %s:%s-%s'''%(ref,chr,pos,pos))
pin=fin.readline().strip().split()
pin=fin.readline().strip().split()
fin.close()
if len(pin)>0: return pin[0]
else: return 'N'
def out_vcf_to_final_vcf(out_vcf):
#out_vcf=[vcf_list]
#eg of vcf_list=['chrY', '26655224', '26655397', 'inv', '1/0', 'a/a', 'a^/a', 'chrY:26655224:26655397', '0.982409865935_0.000862204201036_3.84571648609e-09', '0.0582740231527_0.0576300231209_0.0323373546742', '0.0730093529725_0.0726786922608_0.0931921291232', '0.999316104416_6.33905201496e-06_2.27094687442e-08', '0.0293728183823_0.0292298080937_0.0292058447476', '0.0400447285638_0.0396106870366_0.00810824074155', '0.00246938033609_0.979532645829_7.59260922793e-06', '0.000233917319996_0.171378471541_5.7269108597e-06', '0.00412257237951_0.00408229517668_0.00127621580899']
#out format=['#CHROM POS ID REF ALT QUAL FILTER INFO FORMAT SAMPLE']
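#assigns one ID per distinct breakpoint set, derives QUAL/FILTER from the genotype likelihoods and formats per-sample GT:GQ:GL fields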
ID_rec=0
current_event_rec=''
out=[]
for x in out_vcf:
x=[str(i) for i in x]
if not x[8]==current_event_rec: ID_rec+=1
current_event_rec=x[8] #remember the current event so records sharing breakpoints keep the same ID
if len(x)==18:
out+=[x[:2]+[ID_rec,ref_base_readin(ref,x[0],x[1]),x[3].upper()]+SV_qual_from_genotype_likelihood(x[-9:])+[';'.join(['END='+x[2],'SVTYPE='+x[3],x[5].replace('insert_point:chr','insert_point=chr'),'bps='+x[8],'ref_structure='+x[6],'alt_structure='+x[7]])]+['GT:GQ:GL']+[':'.join([str(i) for i in likelihood_to_gt_gq(i)]) for i in x[-len(sample_list):]]]
else:
out+=[x[:2]+[ID_rec,ref_base_readin(ref,x[0],x[1]),x[3].upper()]+SV_qual_from_genotype_likelihood(x[-9:])+[';'.join(['END='+x[2],'SVTYPE='+x[3],'bps='+x[7],'ref_structure='+x[5],'alt_structure='+x[6]])]+['GT:GQ:GL']+[':'.join([str(i) for i in likelihood_to_gt_gq(i)]) for i in x[-len(sample_list):]]]
return out
def x_to_x_modify_new(x,dup_block_combined):
x_modify=[[i] for i in list(x)]
block_rec=-1
for y in dup_block_combined:
block_rec+=1
if len(y)>1:
x_modify[block_rec]+=[x_modify[block_rec][0]+1+i for i in range(len(y)-1)]
x_modify_new=[]
for y in x_modify: x_modify_new+=y
return x_modify_new
def main():
#eg of svelter_file='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/SVelter.version14/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.svelter'
#eg of fileout_prefix='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/SVelter.version14/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.classified'
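#SVIntegrate: read every .coverge file under the input path, score and QC the predicted structures, classify them, and write the results as .vcf and .svelter files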
Define_Default_SVIntegrate()
if not '--workdir' in list(dict_opts.keys()): print('Error: please specify working directory using: --workdir')
else:
workdir=path_modify(dict_opts['--workdir'])
if not '--input-path' in list(dict_opts.keys()): print('Error: please specify the path of the input .coverge files using --input-path')
else:
if '--input-path' in list(dict_opts.keys()):
if not dict_opts['--input-path'][-1]=='/':dict_opts['--input-path']+='/'
InputPath=[dict_opts['--input-path']]
else:
InputPath=[]
if os.path.isdir(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1]):
InputPath.append(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1])
print('Reading Result from default path: '+workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1])
else: print('Error: please specify input path using --input-path')
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
if '--reference' in list(dict_opts.keys()):
ref_file=dict_opts['--reference']
ref_path='/'.join(ref_file.split('/')[:-1])+'/'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index): print('Error: reference genome not indexed')
else:
if not '--prefix' in list(dict_opts.keys()):
print('Warning: output file name not specified. output file: '+workdir+'Output.vcf')
output_file=workdir+'Output.vcf'
else: output_file=dict_opts['--prefix']+'.vcf'
time1=time.time()
global ref,chromos
ref=ref_file
chromos=chromos_readin(ref)
for path2 in InputPath:
path2=path_modify(path2)
global sv_info
sv_info={}
for k3 in os.listdir(path2):
print(k3)
if k3.split('.')[-1]=='coverge': read_in_structures(path2+k3)
[svelter_hash,qc_score_info]=sv_info_qc_score_extract(sv_info_score_modify(sv_info))
vcf_list=svelter_to_vcf_new(svelter_hash)
vcf_list_modi_1=vcf_info_out_modify_1(vcf_list)
write_vcf_list(vcf_list_modi_1,chromos,'.'.join(output_file.split('.')[:-1]),output_file.split('/')[-1],qc_score_info)
time2=time.time()
print('SVIntegrate Complete !')
print('Time Consuming: '+str(time2-time1))
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
main()
if function_name=='PredefinedBP':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','help=','input-bed=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters_predefinedbp()
else:
import time
import datetime
if not '--input-bed' in list(dict_opts.keys()):
print('Error: please specify predefined breakpoints using --input-bed')
else:
def Code_Files_Define():
global input_bed
input_bed=dict_opts['--input-bed']
global workdir
workdir=path_modify(dict_opts['--workdir'])
global Code_File
global Code0_Function
global Code1_Function
global Code2_Function
global Code2_Predefined_Function
global Code3_Function
global Code4_Function
global Code5_Function
global RCode_Path
global Code1a_file
global Code1d_file
global Code1d2_file
Code_File=script_name
Code0_Function='Setup'
Code1_Function='NullModel'
Code2_Function='BPSearch'
Code2_Predefined_Function='BPSearch_Predefined'
Code3_Function='BPIntegrate'
Code4_Function='SVPredict'
Code5_Function='SVIntegrate'
RCode_Path=workdir+'reference_SVelter/'
Code1a_file=RCode_Path+'SVelter1.NullModel.Figure.a.r'
Code1d_file=RCode_Path+'SVelter1.NullModel.Figure.b.r'
Code1d2_file=RCode_Path+'SVelter1.NullModel.Figure.c.r'
def Define_Default_AllInOne():
global deterministic_flag
deterministic_flag=0
if '--deterministic-flag' in list(dict_opts.keys()):
deterministic_flag=int(dict_opts['--deterministic-flag'])
if '--core' in list(dict_opts.keys()):
global pool
pool = Pool(processes=int(dict_opts['--core']))
global model_comp
if not '--null-model' in list(dict_opts.keys()):
model_comp='C'
else:
if dict_opts['--null-model'] in ['S','Simple']:
model_comp='S'
else:
model_comp='C'
global QCAlign
if '--qc-align' in list(dict_opts.keys()):
QCAlign=int(dict_opts['--qc-align'])
else:
QCAlign=20
global QCSplit
if '--qc-split' in list(dict_opts.keys()):
QCSplit=int(dict_opts['--qc-split'])
else:
QCSplit=20
global NullSplitLen_perc
if '--split-min-len' in list(dict_opts.keys()):
NullSplitLen_perc=int(dict_opts['--split-min-len'])
else:
NullSplitLen_perc=0.9
global KeepFile
if '--keep-temp-files' in list(dict_opts.keys()):
KeepFile=dict_opts['--keep-temp-files']
else:
KeepFile='No'
global KeepFigure
if '--keep-temp-figs' in list(dict_opts.keys()):
KeepFigure=dict_opts['--keep-temp-figs']
else:
KeepFigure='No'
global Trail_Number
if '--num-iteration' in list(dict_opts.keys()):
Trail_Number=int(dict_opts['--num-iteration'])
else:
Trail_Number=10000
global Local_Minumum_Number
Local_Minumum_Number=100
global Ploidy
if '--ploidy' in list(dict_opts.keys()):
Ploidy=int(dict_opts['--ploidy'])
else:
Ploidy=2
global ILCff_STD_Time
if '-S' in list(dict_opts.keys()):
ILCff_STD_Time=int(dict_opts['-S'])
else:
ILCff_STD_Time=3
def run_SVelter1_chrom_predefine(sin_bam_file):
os.system(r'''%s %s --keep-temp-files %s --keep-temp-figs %s --null-model %s --workdir %s --sample %s --out-path %s'''%(Code_File,Code1_Function,KeepFile,KeepFigure,model_comp,workdir,sin_bam_file,NullModel_out_folder))
def run_SVelter1_Single_chrom_predefine(sin_bam_file,chromos_single):
os.system(r'''%s %s --keep-temp-files %s --keep-temp-figs %s --null-model %s --workdir %s --sample %s --chromosome %s --out-path %s'''%(Code_File,Code1_Function,KeepFile,KeepFigure,model_comp,workdir,sin_bam_file,chromos_single,NullModel_out_folder))
def run_SVelter2_chrom_predefine(chrom_name,sin_bam_file,ILCff_STD_Time):
os.system(r'''%s %s --chromosome %s --workdir %s --sample %s --null-model %s -S %s --out-path %s'''%(Code_File,Code2_Predefined_Function,chrom_name,workdir,sin_bam_file,model_comp,ILCff_STD_Time,BPPredict_out_folder))
def run_SVelter3_chrom_predefine(sin_bam_file,out_folder):
os.system(r'''%s %s --batch %s --workdir %s --sample %s --bp-path %s'''%(Code_File,Code3_Function,dict_opts['--batch'],workdir,sin_bam_file,BPPredict_out_folder))
def run_SVelter4_chrom(txt_name,sin_bam_file):
os.system(r'''%s %s --workdir %s --bp-file %s --sample %s --num-iteration %s --ploidy %s --null-model %s --deterministic-flag %s'''%(Code_File,Code4_Function,workdir,txt_name,sin_bam_file,str(Trail_Number),str(Ploidy),model_comp,deterministic_flag))
print(txt_name+' done!')
def run_SVelter5_chrom(path2,out_vcf):
os.system(r'''%s %s --workdir %s --input-path %s --prefix %s'''%(Code_File,Code5_Function,workdir,path2,out_vcf))
def SamplingPercentage_read_in():
if '--null-copyneutral-perc' in list(dict_opts.keys()):
SamplingPercentage=float(dict_opts['--null-copyneutral-perc'])
else:
SamplingPercentage=0.001
return SamplingPercentage
def main():
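#PredefinedBP pipeline: build the null model (Step1), import the user-supplied breakpoints (Step2), integrate breakpoints (Step3), resolve structures (Step4) and write the final VCF (Step5)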
Code_Files_Define()
Define_Default_AllInOne()
if '--sample' in list(dict_opts.keys()):
bam_path='/'.join(dict_opts['--sample'].split('/')[:-1])+'/'
bam_files=[dict_opts['--sample']]
bam_files_appdix=dict_opts['--sample'].split('.')[-1]
else:
bam_path=path_modify(dict_opts['--samplePath'])
bam_files_appdix='bam' #assumed default extension when only --samplePath is given; bam_files_appdix is otherwise undefined on this branch
bam_files=[]
for file in os.listdir(bam_path):
if file.split('.')[-1]==bam_files_appdix:
bam_files.append(bam_path+file)
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index):
print('Error: reference genome not indexed ')
else:
global whole_genome
global len_genome
[whole_genome,len_genome]=calculate_len_genome(ref_file) #ref_file is defined above; 'ref' is not set on this branch
chromos=list(whole_genome.keys())
chr_name_check=0
fin=open(ref_index)
chr_ref_check=[]
for line in fin:
pin=line.strip().split()
chr_ref_check.append(pin[0])
fin.close()
for filein_bam in bam_files:
chr_bam_check=[]
fin=os.popen(r'''samtools view -H %s'''%(filein_bam))
for line in fin:
pin=line.strip().split()
if pin[0]=='@SQ':
chr_bam_check.append(pin[1].split(':')[1])
fin.close()
if not chr_ref_check==chr_bam_check:
print('Warning: please make sure the reference file matches the sample file')
chr_flag=0
if 'chr' in chr_ref_check[0]:
chr_flag=1
SamplingPercentage=float(SamplingPercentage_read_in())
cn2_file=cn2_file_read_in(dict_opts,workdir)
ex_file=ex_file_read_in(dict_opts,workdir)
cn2_length=int(cn2_length_readin(dict_opts))
Gap_Refs=[ex_file]
if not os.path.isfile(cn2_file):
cn2_path='/'.join(cn2_file.split('/')[:-1])+'/'
if not os.path.isdir(cn2_path):
os.system(r'''mkdir %s'''%(cn2_path))
if not '--null-random-length' in list(dict_opts.keys()):
dict_opts['--null-random-length']=5000
else:
dict_opts['--null-random-length']=int(dict_opts['--null-random-length'])
if not '--null-random-num' in list(dict_opts.keys()):
dict_opts['--null-random-num']=10000
else:
dict_opts['--null-random-num']=int(dict_opts['--null-random-num'])
cn2_length=dict_opts['--null-random-length']-100
fo=open(cn2_file,'w')
for i in sorted(whole_genome.keys()):
num_i=int(float(whole_genome[i][0])/float(len_genome)*dict_opts['--null-random-num'])
reg_i=[random.randint(1,whole_genome[i][0]-dict_opts['--null-random-length']) for j in range(num_i)]
for j in sorted(reg_i):
print(' '.join([i,str(j),str(j+dict_opts['--null-random-length']-1)]), file=fo)
fo.close()
SamplingPercentage=1
if not os.path.isfile(ex_file):
fo=open(ex_file,'w')
for chr_ex in chromos:
print(' '.join([chr_ex,'0','0']), file=fo)
fo.close()
if '--prefix' in list(dict_opts.keys()):
out_vcf=dict_opts['--prefix']+'.vcf'
out_svelter=dict_opts['--prefix']+'.svelter'
else:
#out_vcf=workdir+dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'.vcf')
#out_svelter=workdir+dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'.svelter')
out_vcf=workdir+'.'.join(dict_opts['--sample'].split('/')[-1].split('.')[:-1])+'.vcf'
out_svelter=workdir+'.'.join(dict_opts['--sample'].split('/')[-1].split('.')[:-1])+'.svelter'
print('Warning: output file is not specified')
print('output file: '+out_vcf)
print('output file: '+out_svelter)
temp_inter_replace=0
if '--chromosome' in list(dict_opts.keys()):
chrom_single=dict_opts['--chromosome']
if not chrom_single in chromos:
print('Error: please make sure the chromosome defined by --chromosome is correct based on the reference genome')
chromos=[]
else:
chromos=[chrom_single]
for sin_bam_file in bam_files:
global NullModel_out_folder
global BPPredict_out_folder
global bp_files_out_folder
BPPredict_out_folder=workdir+'BreakPoints.'+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1])+'.predefinedBP.'+'.'.join(dict_opts['--input-bed'].split('/')[-1].split('.')[:-1])+'/'
NullModel_out_folder=workdir+'NullModel.'+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1])+'.predefinedBP.'+'.'.join(dict_opts['--input-bed'].split('/')[-1].split('.')[:-1])+'/'
bp_files_out_folder=workdir+'bp_files.'+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1])+'.predefinedBP.'+'.'.join(dict_opts['--input-bed'].split('/')[-1].split('.')[:-1])+'/'
running_time=[]
print(' ')
print('Step1: Running null parameters for '+sin_bam_file+' ...')
time1=time.time()
if len(chromos)>1:
run_SVelter1_chrom_predefine(sin_bam_file)
elif len(chromos)==1:
run_SVelter1_Single_chrom_predefine(sin_bam_file,chromos[0])
time2=time.time()
running_time.append(time2-time1)
print('Null model built for '+'.'.join(sin_bam_file.split('/')[-1].split('.')[:-1]))
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step2: Integrating predefined breakpoints of sample '+sin_bam_file+' ...')
time1=time.time()
for x in chromos:
print(x)
run_SVelter2_chrom_predefine(x,sin_bam_file,ILCff_STD_Time)
if os.path.isfile(input_bed):
bed_info=bed_readin(input_bed)
path_mkdir(BPPredict_out_folder)
bed_write(bed_info,BPPredict_out_folder,sin_bam_file.split('/')[-1],input_bed)
else:
print('Error: predefined breakpoints file does not exist!')
time2=time.time()
running_time.append(time2-time1)
print('Breakpoints set for sample: '+sin_bam_file)
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step3: Integrating breakpoints ... ')
if not '--batch' in list(dict_opts.keys()):
dict_opts['--batch']='0'
time1=time.time()
run_SVelter3_chrom_predefine(sin_bam_file,BPPredict_out_folder)
time2=time.time()
running_time.append(time2-time1)
print('Breakpoint clustering done for sample: '+sin_bam_file)
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step4: Resolving structure ... ')
time1=time.time()
for k3 in os.listdir(bp_files_out_folder):
if k3.split('.')[-1]=='txt':
run_SVelter4_chrom(bp_files_out_folder+k3,sin_bam_file)
time2=time.time()
running_time.append(time2-time1)
print('Structure resolved !')
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step5: Integrating results in VCF file: '+out_vcf+' ... ')
time1=time.time()
run_SVelter5_chrom(bp_files_out_folder,'.'.join(out_vcf.split('.')[:-1])) #bp_files_out_folder already includes workdir
time2=time.time()
running_time.append(time2-time1)
if temp_inter_replace==0:
print(out_vcf+' completed! ')
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print('Total Running Time:'+' '.join([str(i) for i in running_time]))
if os.path.isfile(out_vcf):
os.system(r'''rm -r %s'''%(NullModel_out_folder))
os.system(r'''rm -r %s'''%(BPPredict_out_folder))
os.system(r'''rm -r %s'''%(bp_files_out_folder)) #assumed: remove the temporary bp_files folder (TXTPath is not defined on this branch)
main()
if function_name=='GenoTyper':
#command='svelter.py GenoTyper --workdir /scratch/remills_flux/xuefzhao/SV_discovery_index/download/ --seq-path /scratch/remills_flux/xuefzhao/SV_discovery_index/download/alignment/ -f /scratch/remills_flux/xuefzhao/SV_discovery_index/download/SVelter.version14/svelter/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.svelter'
import getopt
import random
import scipy
import math
import numpy
import pickle
from math import sqrt,pi,exp
from scipy.stats import norm
import time
import datetime
import itertools
import glob
def Af_Letter_2_Af_BP(BP_para_dict,Af_Letter,Be_BP_Letter):
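#rebuilds breakpoint coordinates for both haplotypes of Af_Letter by accumulating per-letter block lengths (Be_BP_Letter) from the leftmost original breakpoint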
Af_BP=[[BP_para_dict['original_bp_list'][0]],[BP_para_dict['original_bp_list'][0]]]
for i in Af_Letter[0]:
if not i=='^':
Af_BP[0].append(Af_BP[0][-1]+Be_BP_Letter[i[0]])
for i in Af_Letter[1]:
if not i=='^':
Af_BP[1].append(Af_BP[1][-1]+Be_BP_Letter[i[0]])
return Af_BP
def Af_Rearrange_Info_Collect(GC_para_dict,BP_para_dict,Be_BP_Letter,Be_Info,Letter_Candidates):
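#scores every candidate structure with insert-size (P_IL), read-depth (P_RD), read-direction (P_DR) and physical-coverage (P_TB) penalties; candidates rejected by Letter_Through_Rearrange_4 get a placeholder penalty of 1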
[P_IL,P_RD,P_DR,P_TB,Letter_Rec,BP_Rec]=[[],[],[],[],[],[]]
for Af_Letter in Letter_Candidates:
Af_BP=Af_Letter_2_Af_BP(BP_para_dict,Af_Letter,Be_BP_Letter)
Af_Info_all=Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP)
#print(Af_Info_all) #debugging output, disabled
if not Af_Info_all==0:
Letter_Rec.append(Af_Letter)
BP_Rec.append(Af_BP)
Af_IL_Penal=Af_Info_all[0]
Af_RD_Rec=Af_Info_all[1]
Af_DR_Penal=calcu_PO_Stat(Af_Info_all[2]*100/(4*flank+Af_BP[0][-1]-Af_BP[0][0]+Af_BP[1][-1]-Af_BP[1][0]),Pair_Orien_Info[0],Pair_Orien_Info[1])
Af_TB_Penal=calcu_PC_Norm(Af_Info_all[-1],Physical_Cov_Stat)
Af_RD_Penal=calcu_RD_Norm(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter)
for key in list(Af_Info_all[5].keys()):
theo_RD=GC_para_dict['GC_Overall_Median_Coverage'][str(Chr)]
Af_RD_Penal+=Prob_Norm(Af_Info_all[5][key]+theo_RD/2,theo_RD/2,GC_para_dict['GC_Var_Coverage'][chrom_N]/2)-Prob_Norm(theo_RD/2,theo_RD/2,GC_para_dict['GC_Var_Coverage'][chrom_N]/2)
P_IL.append(Af_IL_Penal)
P_RD.append(Af_RD_Penal)
P_DR.append(Af_DR_Penal)
P_TB.append(Af_TB_Penal)
else:
P_IL.append(1)
P_RD.append(1)
P_DR.append(1)
P_TB.append(1)
P_IL=P_list_modify(P_IL)
P_RD=P_list_modify(P_RD)
P_DR=P_list_modify(P_DR)
P_TB=P_list_modify(P_TB)
return [P_IL,P_DR,P_RD,P_TB]
def All_Block_RD(Initial_block_RD,Af_GCRD_Adj,Af_block_RD,Af_Letter,flank):
All_Letters=['left']+[chr(97+i) for i in range(len(Initial_block_RD)-1)]
CNm=[1]+[0 for j in range(len(Initial_block_RD)-1)]
CNp=[1]+[0 for j in range(len(Initial_block_RD)-1)]
k=Af_Letter[0]
for m in k:
CNm[ord(m[0])-96]+=1
k=Af_Letter[1]
for m in k:
CNp[ord(m[0])-96]+=1
RDm=[(Initial_block_RD[0]+left_RD_Calculate_2a(Through_GCRD_Adj,Af_GCRD_Adj[0],flank))/2]+[0 for j in range(len(Initial_block_RD)-1)] #one slot per block, matching CNm/CNp above
RDp=[(Initial_block_RD[0]+left_RD_Calculate_2a(Through_GCRD_Adj,Af_GCRD_Adj[1],flank))/2]+[0 for j in range(len(Initial_block_RD)-1)]
RDs=[RDm,RDp]
for p in range(len(Af_Letter)):
for q in range(len(Af_Letter[p])):
RDs[p][ord(Af_Letter[p][q][0])-96]+=Af_block_RD[p][q]
for r in range(len(Initial_block_RD))[1:]:
if CNm[r]==CNp[r]:
RDs[0][r]+=Initial_block_RD[r]/2
RDs[1][r]+=Initial_block_RD[r]/2
elif CNm[r]==0 and not CNp[r]==0:
RDs[1][r]+=Initial_block_RD[r]
elif CNp[r]==0 and not CNm[r]==0:
RDs[0][r]+=Initial_block_RD[r]
else:
RDs[0][r]+=Initial_block_RD[r]*CNm[r]/(CNp[r]+CNm[r])
RDs[1][r]+=Initial_block_RD[r]*CNp[r]/(CNp[r]+CNm[r])
CNs=[CNm,CNp]
return [CNs,RDs]
def All_Block_RD_2(Initial_block_RD,Af_block_RD,Af_Letter,bps,flank):
RDs=[[],[]]
CNs=[[],[]]
for let in [chr(97+i) for i in range(len(bps)-1)]:
CNs[0].append(Af_Letter[0].count(let)+Af_Letter[0].count(let+'^'))
CNs[1].append(Af_Letter[1].count(let)+Af_Letter[1].count(let+'^'))
if not CNs[0][-1]+CNs[1][-1]==0:
RDs[0].append(Initial_block_RD[ord(let)-96]*CNs[0][-1]/(CNs[0][-1]+CNs[1][-1]))
RDs[1].append(Initial_block_RD[ord(let)-96]*CNs[1][-1]/(CNs[0][-1]+CNs[1][-1]))
if CNs[0][-1]+CNs[1][-1]==0:
RDs[0].append(0)
RDs[1].append(0)
for key in list(Af_block_RD[0].keys()):
if not key=='left' and not key=='right':
RDs[0][ord(key.split('_')[0])-97]+=float(Af_block_RD[0][key])/float(bps[ord(key.split('_')[0])-96]-bps[ord(key.split('_')[0])-97])*Window_Size
for key in list(Af_block_RD[1].keys()):
if not key=='left' and not key=='right':
RDs[1][ord(key.split('_')[0])-97]+=float(Af_block_RD[1][key])/float(bps[ord(key.split('_')[0])-96]-bps[ord(key.split('_')[0])-97])*Window_Size
CNs[0]=[1]+CNs[0]
CNs[1]=[1]+CNs[1]
RDs[0]=[Af_block_RD[0]['left']+Initial_block_RD[0]/2]+RDs[0]
RDs[1]=[Af_block_RD[1]['left']+Initial_block_RD[0]/2]+RDs[1]
return [CNs,RDs]
def alt_allele_decide(ref_st,alt_st):
out=[]
for x in alt_st.split('/'):
if not x ==ref_st.split('/')[0]:
out.append(x)
return out
def alt_SV_genotype_prep(sv_info):
#eg of sv_info: [['chr1', '1207346', '1207761'], 'a/a', '/']
#output:[ref_ref,ref_alt,alt_alt]
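#builds the candidate genotypes (ref/ref plus het and hom for every alternative allele) as letter lists, folding each '^' into the preceding letter and removing duplicates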
out=[[[i for i in sv_info[1].split('/')[0]],[i for i in sv_info[1].split('/')[1]]]]
for alt_al in alt_allele_decide(sv_info[1],sv_info[2]):
homo_alt='/'.join([alt_al,alt_al])
het_alt='/'.join([alt_al,sv_info[1].split('/')[0]])
out_single=[[[i for i in het_alt.split('/')[0]],[i for i in het_alt.split('/')[1]]],[[i for i in homo_alt.split('/')[0]],[i for i in homo_alt.split('/')[1]]]]
out_modify=[]
for x in out_single:
out_modify.append([])
for y in x:
out_modify[-1].append([])
for z in y:
if not z=='^':
out_modify[-1][-1].append(z)
else:
out_modify[-1][-1][-1]+='^'
out+=out_modify
out2=[]
for x in out:
if not x in out2:
out2.append(x)
return out2
def Be_Info_1_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal):
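#remaps read-pair records (Be_Info[0]) onto both candidate haplotypes, trying forward and inverted placements of each block; pairs that fit neither haplotype add their covered length to Total_Cov_For_Pen and increase NoMapPenal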
be_info_1=Be_Info[0]
for j in be_info_1:
jMapPenam=0
j_m_new=[]
if j[0] in temp_letter[0] and j[3] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[3]]:
j_m_temp=[j[1]+ka[0],j[2]+ka[0],j[4]+kb[0],j[5]+kb[0]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[j[-1],j[-2]]
else:
j_m_temp+=[j[-2],j[-1]]
j_m_new.append(j_m_temp)
if j[0]+'^' in temp_letter[0] and j[3] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[3]]:
j_m_temp=[ka[1]-j[2],ka[1]-j[1],kb[0]+j[4],kb[0]+j[5]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[j[-1],complement(j[-2])]
else:
j_m_temp+=[complement(j[-2]),j[-1]]
j_m_new.append(j_m_temp)
if j[0] in temp_letter[0] and j[3]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[3]+'^']:
j_m_temp=[j[1]+ka[0],j[2]+ka[0],kb[1]-j[5],kb[1]-j[4]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[complement(j[-1]),j[-2]]
else:
j_m_temp+=[j[-2],complement(j[-1])]
j_m_new.append(j_m_temp)
if j[0]+'^' in temp_letter[0] and j[3]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[3]+'^']:
j_m_temp=[ka[1]-j[2],ka[1]-j[1],kb[1]-j[5],kb[1]-j[4]]
if j_m_temp[0]>j_m_temp[2]:
j_m_temp=j_m_temp[2:4]+j_m_temp[:2]+[complement(j[-1]),complement(j[-2])]
else:
j_m_temp+=[complement(j[-2]),complement(j[-1])]
j_m_new.append(j_m_temp)
j_m_3a=candidate_QC_Control(j_m_new)
if j_m_3a==[]:
jMapPenam+=1
j_p_new=[]
if j[0] in temp_letter[1] and j[3] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[3]]:
j_p_temp=[j[1]+ka[0],j[2]+ka[0],j[4]+kb[0],j[5]+kb[0]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[j[-1],j[-2]]
else:
j_p_temp+=[j[-2],j[-1]]
j_p_new.append(j_p_temp)
if j[0]+'^' in temp_letter[1] and j[3] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[3]]:
j_p_temp=[ka[1]-j[2],ka[1]-j[1],kb[0]+j[4],kb[0]+j[5]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[j[-1],complement(j[-2])]
else:
j_p_temp+=[complement(j[-2]),j[-1]]
j_p_new.append(j_p_temp)
if j[0] in temp_letter[1] and j[3]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[3]+'^']:
j_p_temp=[j[1]+ka[0],j[2]+ka[0],kb[1]-j[5],kb[1]-j[4]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[complement(j[-1]),j[-2]]
else:
j_p_temp+=[j[-2],complement(j[-1])]
j_p_new.append(j_p_temp)
if j[0]+'^' in temp_letter[1] and j[3]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[3]+'^']:
j_p_temp=[ka[1]-j[2],ka[1]-j[1],kb[1]-j[5],kb[1]-j[4]]
if j_p_temp[0]>j_p_temp[2]:
j_p_temp=j_p_temp[2:4]+j_p_temp[:2]+[complement(j[-1]),complement(j[-2])]
else:
j_p_temp+=[complement(j[-2]),complement(j[-1])]
j_p_new.append(j_p_temp)
j_p_3a=candidate_QC_Control(j_p_new)
if j_p_3a==[]:
jMapPenam+=1
if jMapPenam==2:
Total_Cov_For_Pen[j[0]]+=j[2]-j[1]
Total_Cov_For_Pen[j[3]]+=j[5]-j[4]
NoMapPenal+=2
elif jMapPenam==1:
if j_m_3a==[]:
Map_P+=[jp3+['p']+[float(1)/float(len(j_p_3a))] for jp3 in j_p_3a]
elif j_p_3a==[]:
Map_M+=[jp3+['m']+[float(1)/float(len(j_m_3a))] for jp3 in j_m_3a]
else:
j_mp_4a=candidate_QC_Control2(j_m_3a,j_p_3a)
if not j_mp_4a==[]:
Map_Both+=[j4+[float(1)/float(len(j_mp_4a))] for j4 in j_mp_4a]
else:
Total_Cov_For_Pen[j[0]]+=j[2]-j[1]
Total_Cov_For_Pen[j[3]]+=j[5]-j[4]
NoMapPenal+=2
return NoMapPenal
def Be_Info_2_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal):
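#same idea as Be_Info_1_rearrange, but for pairs in Be_Info[1] whose two ends each span a block boundary:
#four block labels (j[0],j[2],j[4],j[6]) are remapped together, and unplaceable pairs have their coverage
#split across the blocks involved before NoMapPenal is incremented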
be_info_2=Be_Info[1]
for j in be_info_2:
jMapPenam=0
j_m_new=[]
if j[0] in temp_letter[0] and j[2] in temp_letter[0] and j[4] in temp_letter[0] and j[6] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[2]]:
for kc in Let_BP_Info['m'][j[4]]:
for kd in Let_BP_Info['m'][j[6]]:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],j[-2]])
else:
j_m_new.append(j_info_new+[j[-2],j[-1]])
if j[0]+'^' in temp_letter[0] and j[2]+'^' in temp_letter[0] and j[4] in temp_letter[0] and j[6] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[2]+'^']:
for kc in Let_BP_Info['m'][j[4]]:
for kd in Let_BP_Info['m'][j[6]]:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],complement(j[-2])])
else:
j_m_new.append(j_info_new+[complement(j[-2]),j[-1]])
if j[0] in temp_letter[0] and j[2] in temp_letter[0] and j[4]+'^' in temp_letter[0] and j[6]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[2]]:
for kc in Let_BP_Info['m'][j[4]+'^']:
for kd in Let_BP_Info['m'][j[6]+'^']:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),j[-2]])
else:
j_m_new.append(j_info_new+[j[-2],complement(j[-1])])
if j[0]+'^' in temp_letter[0] and j[2]+'^' in temp_letter[0] and j[4]+'^' in temp_letter[0] and j[6]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[2]+'^']:
for kc in Let_BP_Info['m'][j[4]+'^']:
for kd in Let_BP_Info['m'][j[6]+'^']:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_m_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),complement(j[-2])])
else:
j_m_new.append(j_info_new+[complement(j[-2]),complement(j[-1])])
j_m_3a=candidate_QC_Control(j_m_new)
if j_m_3a==[]:
jMapPenam+=1
j_p_new=[]
if j[0] in temp_letter[1] and j[2] in temp_letter[1] and j[4] in temp_letter[1] and j[6] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[2]]:
for kc in Let_BP_Info['p'][j[4]]:
for kd in Let_BP_Info['p'][j[6]]:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],j[-2]])
else:
j_p_new.append(j_info_new+[j[-2],j[-1]])
if j[0]+'^' in temp_letter[1] and j[2]+'^' in temp_letter[1] and j[4] in temp_letter[1] and j[6] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[2]+'^']:
for kc in Let_BP_Info['p'][j[4]]:
for kd in Let_BP_Info['p'][j[6]]:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kc[0]+j[5],kd[0]+j[7]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[j[-1],complement(j[-2])])
else:
j_p_new.append(j_info_new+[complement(j[-2]),j[-1]])
if j[0] in temp_letter[1] and j[2] in temp_letter[1] and j[4]+'^' in temp_letter[1] and j[6]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[2]]:
for kc in Let_BP_Info['p'][j[4]+'^']:
for kd in Let_BP_Info['p'][j[6]+'^']:
j_info_new=[ka[0]+j[1],kb[0]+j[3],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),j[-2]])
else:
j_p_new.append(j_info_new+[j[-2],complement(j[-1])])
if j[0]+'^' in temp_letter[1] and j[2]+'^' in temp_letter[1] and j[4]+'^' in temp_letter[1] and j[6]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[2]+'^']:
for kc in Let_BP_Info['p'][j[4]+'^']:
for kd in Let_BP_Info['p'][j[6]+'^']:
j_info_new=[kb[1]-j[3],ka[1]-j[1],kd[1]-j[7],kc[1]-j[5]]
if j_info_new[0]>j_info_new[2]:
j_p_new.append(j_info_new[2:4]+j_info_new[:2]+[complement(j[-1]),complement(j[-2])])
else:
j_p_new.append(j_info_new+[complement(j[-2]),complement(j[-1])])
j_p_3a=candidate_QC_Control(j_p_new)
if j_p_3a==[]:
jMapPenam+=1
if jMapPenam==2:
if j[0]==j[2]:
Total_Cov_For_Pen[j[0]]+=j[3]-j[1]
else:
Total_Cov_For_Pen[j[0]]+=Be_BP_Letter[j[0]]-j[1]
Total_Cov_For_Pen[j[2]]+=j[3]
if j[4]==j[6]:
Total_Cov_For_Pen[j[4]]+=j[7]-j[5]
else:
Total_Cov_For_Pen[j[4]]+=Be_BP_Letter[j[4]]-j[5]
Total_Cov_For_Pen[j[6]]+=j[7]
NoMapPenal+=2
elif jMapPenam==1:
if j_m_3a==[]:
Map_P+=[jp3+['p']+[float(1)/float(len(j_p_3a))] for jp3 in j_p_3a]
elif j_p_3a==[]:
Map_M+=[jp3+['m']+[float(1)/float(len(j_m_3a))] for jp3 in j_m_3a]
else:
j_mp_4a=candidate_QC_Control2(j_m_3a,j_p_3a)
if not j_mp_4a==[]:
Map_Both+=[j4+[float(1)/float(len(j_mp_4a))] for j4 in j_mp_4a]
else:
if j[0]==j[2]:
Total_Cov_For_Pen[j[0]]+=j[3]-j[1]
else:
Total_Cov_For_Pen[j[0]]+=Be_BP_Letter[j[0]]-j[1]
Total_Cov_For_Pen[j[2]]+=j[3]
if j[4]==j[6]:
Total_Cov_For_Pen[j[4]]+=j[7]-j[5]
else:
Total_Cov_For_Pen[j[4]]+=Be_BP_Letter[j[4]]-j[5]
Total_Cov_For_Pen[j[6]]+=j[7]
NoMapPenal+=2
return NoMapPenal
def Be_Info_3_rearrange(BP_para_dict,Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal):
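#remaps single reads spanning one block boundary (Be_Info[2]); placements longer than 1.2*ReadLength are discarded,
#surviving placements are added to Map_Both with equal weights, and otherwise the read is penalized via Total_Cov_For_Pen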
be_info_3=Be_Info[2]
for j in be_info_3:
j_m_new=[]
if j[0] in temp_letter[0] and j[2] in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]]:
for kb in Let_BP_Info['m'][j[2]]:
temp_single=[ka[0]+j[1],kb[0]+j[3]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_m_new.append(temp_single)
if j[0]+'^' in temp_letter[0] and j[2]+'^' in temp_letter[0]:
for ka in Let_BP_Info['m'][j[0]+'^']:
for kb in Let_BP_Info['m'][j[2]+'^']:
temp_single=[kb[1]-j[3],ka[1]-j[1]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_m_new.append(temp_single)
j_p_new=[]
if j[0] in temp_letter[1] and j[2] in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]]:
for kb in Let_BP_Info['p'][j[2]]:
temp_single=[ka[0]+j[1],kb[0]+j[3]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_p_new.append(temp_single)
if j[0]+'^' in temp_letter[1] and j[2]+'^' in temp_letter[1]:
for ka in Let_BP_Info['p'][j[0]+'^']:
for kb in Let_BP_Info['p'][j[2]+'^']:
temp_single=[kb[1]-j[3],ka[1]-j[1]]
if not temp_single[1]-temp_single[0]>BP_para_dict['ReadLength']*1.2 and temp_single[1]-temp_single[0]>0:
j_p_new.append(temp_single)
if not j_m_new+j_p_new==[]:
for j2 in j_m_new:
Map_Both.append(j2+['m',float(1)/float(len(j_m_new+j_p_new))])
for j2 in j_p_new:
Map_Both.append(j2+['p',float(1)/float(len(j_m_new+j_p_new))])
else:
Total_Cov_For_Pen[j[0]]+=Be_BP_Letter[j[0]]-j[1]
Total_Cov_For_Pen[j[2]]+=j[3]
NoMapPenal+=1
return NoMapPenal
def Be_BP_Letter_modify(original_letters,flank,RD_within_B,ReadLength,Full_Info,original_bp_list):
global Be_BP_Letter
Be_BP_Letter={}
for let_key in original_letters:
Be_BP_Letter[let_key]=original_bp_list[original_letters.index(let_key)+1]-original_bp_list[original_letters.index(let_key)]
Be_BP_Letter['left']=flank
Be_BP_Letter['right']=flank
for let_key in list(Be_BP_Letter.keys()):
Be_BP_Letter[let_key+'^']=Be_BP_Letter[let_key]
num_of_read_pairs=1
for k1 in list(Be_BP_Letter.keys()):
if not k1[-1]=='^' and not k1 in ['left','right']:
num_of_read_pairs+=Be_BP_Letter[k1]*RD_within_B[k1]/2/ReadLength
num_of_read_pairs+=len(Full_Info[4])+len(Full_Info[5])+len(Full_Info[6])
return num_of_read_pairs
def BPs_Coverage(Af_Letter,original_bp_list,original_letters,Letter_Through,Af_Info,flank):
blocklen={}
for i in range(len(original_bp_list)-1):
blocklen[original_letters[i]]=original_bp_list[i+1]-original_bp_list[i]
blocklen['left']=flank
blocklen['right']=flank
tempM=[blocklen[j[0]] for j in Af_Letter[0]]
tempP=[blocklen[j[0]] for j in Af_Letter[1]]
Af_BPs=[[-flank,0]+[sum(tempM[:(k+1)]) for k in range(len(tempM))],[-flank,0,]+[sum(tempP[:(k+1)]) for k in range(len(tempP))]]
Af_BPs=[Af_BPs[0]+[Af_BPs[0][-1]+flank],Af_BPs[1]+[Af_BPs[1][-1]+flank]]
Af_BP_Through=[[0 for i in range(len(Af_BPs[0]))],[0 for i in range(len(Af_BPs[1]))]]
for key in list(Af_Info.keys()):
if Af_Info[key][6]=='M':
tempbps=Af_BPs[0]
leftMost=numpy.min([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
rightMost=numpy.max([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
for m in range(len(tempbps)-1):
if tempbps[m+1]>leftMost and tempbps[m]<leftMost:
for n in range(m,len(tempbps)-1):
if tempbps[n+1]>rightMost and tempbps[n]<rightMost:
for p in range(m+1,n+1):
if len(Af_Info[key])==7:
Af_BP_Through[0][p]+=1
elif len(Af_Info[key])==8:
Af_BP_Through[0][p]+=float(Af_Info[key][7])
if Af_Info[key][6]=='P':
tempbps=Af_BPs[1]
leftMost=numpy.min([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
rightMost=numpy.max([numpy.mean(Af_Info[key][:2]),numpy.mean(Af_Info[key][2:4])])
for m in range(len(tempbps)-1):
if tempbps[m+1]>leftMost and tempbps[m]<leftMost:
for n in range(m,len(tempbps)-1):
if tempbps[n+1]>rightMost and tempbps[n]<rightMost:
for p in range(m+1,n+1):
if len(Af_Info[key])==7:
Af_BP_Through[1][p]+=1
elif len(Af_Info[key])==8:
Af_BP_Through[1][p]+=float(Af_Info[key][7])
return [Af_BP_Through[0][1:-1],Af_BP_Through[1][1:-1]]
def Block_Assign_To_Letters(bp_list,letter_list,flank):
#Eg of bp_list:[184569179, 184569775, 184571064, 184572009, 184572016]
#Eg of letter_list:['a', 'b', 'c', 'd']
#Eg of flank:446
number_of_blocks=(numpy.max(bp_list)-numpy.min(bp_list)+2*flank)/Window_Size+1
blocks={}
bp_list_new=[bp_list[0]-flank]+bp_list+[bp_list[-1]+flank]
relative_bp_list=[i-numpy.min(bp_list_new) for i in bp_list_new]
bp_length=[(bp_list_new[i+1]-bp_list_new[i]) for i in range(len(bp_list_new)-1)]
letter_list_new=['left']+letter_list+['right']
bp_blocks=[[letter_list_new[j]]+list(range(relative_bp_list[j]/Window_Size,relative_bp_list[j+1]/Window_Size+1)) for j in range(len(relative_bp_list)-1)]
blocks_bp={}
for i in range(number_of_blocks):
blocks_bp[i+1]=[bp_list_new[0]+i*Window_Size,bp_list_new[0]+i*Window_Size+Window_Size-1]
for j in bp_blocks:
if i in j:
blocks_bp[i+1].append(j[0])
blocks_bp[0]=[blocks_bp[1][0]-Window_Size,blocks_bp[1][0]-1,'0']
blocks_bp[number_of_blocks+1]=[blocks_bp[number_of_blocks][1]+1,blocks_bp[number_of_blocks][1]+Window_Size,'0']
return blocks_bp
def bam_info_readin(bam_name,chrom,start,end,QCAlign):
#eg of bam_name:'/scratch/remills_flux/xuefzhao/SV_discovery_index/download/alignment/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram'
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(bam_name,chrom,start,end))
blackList=[]
temp_rec={}
temp_rec_LowQual={}
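#SAM flag bits checked below: 4 (0x4) unmapped, 1024 (0x400) PCR/optical duplicate, 512 (0x200) QC fail;
#secondary alignments are already excluded by 'samtools view -F 256'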
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
#if not int(pbam[4])>QCAlign:continue
if pbam[0] in blackList: continue
if not int(pbam[4])>QCAlign:
if not pbam[0] in list(temp_rec_LowQual.keys()):
temp_rec_LowQual[pbam[0]]=[]
if not pbam[1:9] in temp_rec_LowQual[pbam[0]]:
temp_rec_LowQual[pbam[0]]+=[pbam[1:9]]
else:
if not pbam[0] in list(temp_rec.keys()):
temp_rec[pbam[0]]=[]
if not pbam[1:9] in temp_rec[pbam[0]]:
temp_rec[pbam[0]]+=[pbam[1:9]]
fbam.close()
return [temp_rec,temp_rec_LowQual]
def block_RD_Calculate_2a(Initial_GCRD_Adj,original_bp_list,flank):
allele_BP=[0]+[flank+j-original_bp_list[0] for j in original_bp_list]+[2*flank+original_bp_list[-1]-original_bp_list[0]]
allele_Letter=['left']+[chr(97+i) for i in range(len(original_bp_list)-1)]
allele_RD=[]
for k in range(len(allele_Letter)):
length=allele_BP[k+1]-allele_BP[k]
block=[allele_BP[k],allele_BP[k+1]]
temp=[]
if not block[0]==block[0]/Window_Size*Window_Size:
blf=float((block[0]/Window_Size+1)*Window_Size-block[0])/Window_Size*Initial_GCRD_Adj[block[0]/Window_Size+1][3]
temp.append(blf)
for m in range(block[0]/Window_Size+2,block[1]/Window_Size+1):
temp.append(Initial_GCRD_Adj[m][3])
if not block[1]==block[1]/Window_Size*Window_Size:
brf=float(block[1]-block[1]/Window_Size*Window_Size)/Window_Size*Initial_GCRD_Adj[block[1]/Window_Size+1][3]
temp.append(brf)
allele_RD.append(numpy.sum(temp)/length*Window_Size)
elif block[0]==block[0]/Window_Size*Window_Size:
for m in range(block[0]/Window_Size+1,block[1]/Window_Size+1):
temp.append(Initial_GCRD_Adj[m][3])
if not block[1]==block[1]/Window_Size*Window_Size:
brf=float(block[1]-block[1]/Window_Size*Window_Size)/Window_Size*Initial_GCRD_Adj[block[1]/Window_Size+1][3]
temp.append(brf)
allele_RD.append(numpy.sum(temp)/length*Window_Size)
return allele_RD
def bp_list_to_hash(bp_list):
#eg of bp_list:['chr1', '101', '45703342', '45703361']
out={}
chromo_seq=[]
for x in bp_list:
if x in chromos_all:
if not x in list(out.keys()):
out[x]=[]
chromo_cur=x
chromo_seq.append(chromo_cur)
else:
if out[chromo_cur]==[]:
out[chromo_cur].append([x])
else:
out[chromo_cur][-1]+=[x]
out[chromo_cur].append([x])
rec=96
out_hash={}
for x in chromo_seq:
del out[x][-1]
out_hash[x]={}
for y in out[x]:
rec+=1
out_hash[x][chr(rec)]=[x]+y
return out_hash
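#eg of usage (illustrative, assuming 'chr1' is listed in chromos_all): bp_list_to_hash(['chr1','101','45703342','45703361'])
#returns {'chr1': {'a': ['chr1', '101', '45703342'], 'b': ['chr1', '45703342', '45703361']}}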
def block_info_modify(block_hash,flank):
#eg of block_hash:{'chr1': {'a': ['chr1', '101', '45703342'], 'b': ['chr1', '45703342', '45703361']}}
out={}
for k1 in list(block_hash.keys()):
out[k1]={}
out_keys=sorted(block_hash[k1].keys())
out[k1]['left']=[k1,max([int(block_hash[k1][out_keys[0]][1])-flank,0]),int(block_hash[k1][out_keys[0]][1])]
out[k1]['right']=[k1,int(block_hash[k1][out_keys[-1]][2]),int(block_hash[k1][out_keys[-1]][2])+flank]
for k2 in list(block_hash[k1].keys()):
out[k1][k2]=[block_hash[k1][k2][0]]+[int(i) for i in block_hash[k1][k2][1:]]
return out
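#eg of usage (illustrative): with the block_hash above and flank=446, out['chr1'] gains
#'left'=['chr1',0,101] (clipped at 0) and 'right'=['chr1',45703361,45703807], and the block coordinates become integers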
def block_info_disect(block_hash,max_len=5000):
out=[]
for k1 in list(block_hash.keys()):
for k2 in [i for i in sorted(block_hash[k1].keys()) if not i in ['left','right']]:
if block_hash[k1][k2][2]-block_hash[k1][k2][1]>max_len: #block too long
out.append(block_hash[k1][k2]) #assumed payload: record the over-long block
return out
def block_Info_ReadIn(GC_para_dict,BP_para_dict,chr_letter_bp,blocks_read_in,Multi_Dup):
block_bps={}
block_rds={}
for k1 in list(chr_letter_bp.keys()):
block_bps[k1]={}
block_rds[k1]={}
for k2 in list(chr_letter_bp[k1].keys()):
if not k2 in Multi_Dup:
block_bps[k1][k2]=[min(chr_letter_bp[k1][k2]),max(chr_letter_bp[k1][k2])]
block_rds[k1][k2]=0
[Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,total_rec,rd_low_qual]=[{},{},{},{},{}]
for k1 in list(chr_letter_bp.keys()):
Pair_ThroughBP[k1]=[]
Double_Read_ThroughBP[k1]=[]
Single_Read_ThroughBP[k1]=[]
rd_low_qual[k1]={}
for k2 in blocks_read_in[k1]:
multi_dup_flag=multi_dup_check(k2,Multi_Dup)
if multi_dup_flag==0:
k2a=[]
k2b=[]
for k3 in k2:
if type(k3)==type(1):
k2a.append(k3)
else:
k2b.append(k3)
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,k1,min(k2a)-BP_para_dict['flank'],max(k2a)+BP_para_dict['flank']))
blackList=[]
temp_rec={}
temp_rec_LowQual={}
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
#if not int(pbam[4])>QCAlign:continue
if pbam[0] in blackList: continue
if not int(pbam[4])>QCAlign:
if not pbam[0] in list(temp_rec_LowQual.keys()):
temp_rec_LowQual[pbam[0]]=[]
if not pbam[1:9] in temp_rec_LowQual[pbam[0]]:
temp_rec_LowQual[pbam[0]]+=[pbam[1:9]]
else:
if not pbam[0] in list(temp_rec.keys()):
temp_rec[pbam[0]]=[]
if not pbam[1:9] in temp_rec[pbam[0]]:
temp_rec[pbam[0]]+=[pbam[1:9]]
fbam.close()
flank_region=[]
for k3 in k2b:
flank_region+=block_bps[k1][k3]
flank_region=[min(flank_region),max(flank_region)]
for k3 in list(temp_rec_LowQual.keys()):
for k4 in temp_rec_LowQual[k3]:
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
if not read_pos[-1] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-1]]=0
rd_low_qual[k1][read_pos[-1]]+=(read_pos[1]-read_pos[0])
else:
if not read_pos[-2] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-2]]=0
if not read_pos[-1] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-1]]=0
rd_low_qual[k1][read_pos[-2]]+=block_bps[k1][read_pos[-2]][1]-read_pos[0]
rd_low_qual[k1][read_pos[-1]]+=-block_bps[k1][read_pos[-1]][0]+read_pos[1]
for k3 in list(temp_rec.keys()):
if len(temp_rec[k3])>2:
test_rec=[int(temp_rec[k3][0][7])]
test_rec2=[temp_rec[k3][0]]
test_let=0
for k4 in temp_rec[k3][1:]:
delflag=0
for k5 in test_rec:
if int(k4[7])+k5==0:
test_let+=1
k6=k3+chr(96+test_let)
temp_rec[k6]=[test_rec2[test_rec.index(k5)],k4]
del test_rec2[test_rec.index(k5)]
del test_rec[test_rec.index(k5)]
delflag+=1
if delflag==0:
test_rec.append(int(k4[7]))
test_rec2.append(k4)
temp_rec[k3]=test_rec2
for k3 in list(temp_rec.keys()):
if len(temp_rec[k3])==1:
del_flag=0
k4=temp_rec[k3][0]
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
mate_pos=[int(k4[6]),int(k4[6])+ReadLength]
if 'left' in k2b and mate_pos[1]<flank_region[0]:
del_flag+=1
elif 'right' in k2b and mate_pos[0]>flank_region[1]:
del_flag+=1
#elif not mate_pos[1]<flank_region[0] and not mate_pos[0]>flank_region[1]:
# del_flag+=1
if del_flag>0:
del temp_rec[k3]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
else:
Single_Read_ThroughBP[k1].append(read_pos)
else:
if not k3 in list(total_rec.keys()):
total_rec[k3]=[k4]
else:
total_rec[k3]+=[k4]
elif len(temp_rec[k3])==2:
if int(temp_rec[k3][0][7])==0 or int(temp_rec[k3][1][7])==0:
continue
if int(temp_rec[k3][0][7])+int(temp_rec[k3][1][7])==0 and int(temp_rec[k3][0][7])<0:
temp_rec[k3]=[temp_rec[k3][1],temp_rec[k3][0]]
read_pos=[int(temp_rec[k3][0][2]),int(temp_rec[k3][0][2])+cigar2reaadlength(temp_rec[k3][0][4]),int(temp_rec[k3][1][2]),int(temp_rec[k3][1][2])+cigar2reaadlength(temp_rec[k3][1][4])]+Reads_Direction_Detect_flag(temp_rec[k3][0][0])
#print temp_rec[k3]
#if k3 in test2:
# print read_pos
if read_pos[0]>read_pos[2]:
read_pos=read_pos[2:4]+read_pos[:2]+[read_pos[-1],read_pos[-2]]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[6]==read_pos[7]==read_pos[8]==read_pos[9]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
block_rds[k1][read_pos[-1]]+=read_pos[3]-read_pos[2]
elif read_pos[8]==read_pos[9] and read_pos[6]==read_pos[7]:
Pair_ThroughBP[k1].append(read_pos[:6]+[read_pos[6],read_pos[8]])
else:
Double_Read_ThroughBP[k1].append(read_pos)
del temp_rec[k3]
#if k3 in test2:
# print read_pos
for k3 in list(total_rec.keys()):
if len(total_rec[k3])==1:
del_flag=0
k4=total_rec[k3][0]
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
mate_pos=[int(k4[6]),int(k4[6])+ReadLength]
if 'left' in k2b and mate_pos[1]<flank_region[0]:
del_flag+=1
elif 'right' in k2b and mate_pos[0]>flank_region[1]:
del_flag+=1
elif not mate_pos[1]<flank_region[0] and not mate_pos[0]>flank_region[1]:
del_flag+=1
if del_flag>0:
del total_rec[k3]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
else:
Single_Read_ThroughBP[k1].append(read_pos)
elif len(total_rec[k3])==2:
read_pos=[int(total_rec[k3][0][2]),int(total_rec[k3][0][2])+cigar2reaadlength(total_rec[k3][0][4]),int(total_rec[k3][1][2]),int(total_rec[k3][1][2])+cigar2reaadlength(total_rec[k3][1][4])]+Reads_Direction_Detect_flag(total_rec[k3][0][0])
#print read_pos
if read_pos[0]>read_pos[2]:
read_pos=read_pos[2:4]+read_pos[:2]+[read_pos[-1],read_pos[-2]]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[6]==read_pos[7]==read_pos[8]==read_pos[9]:
block_rds[k1][read_pos[-1]]+=read_pos[1]-read_pos[0]
block_rds[k1][read_pos[-1]]+=read_pos[3]-read_pos[2]
elif read_pos[8]==read_pos[9] and read_pos[6]==read_pos[7]:
Pair_ThroughBP[k1].append(read_pos[:6]+[read_pos[6],read_pos[8]])
else:
Double_Read_ThroughBP[k1].append(read_pos)
del total_rec[k3]
#print total_rec
direction_penal=0
block_rd2={}
block_bp2=block_bps
for k1 in list(block_rds.keys()):
block_rd2[k1]={}
for k2 in list(block_rds[k1].keys()):
block_rd2[k1][k2]=0
for i2 in list(Pair_ThroughBP.keys()):
for i in Pair_ThroughBP[i2]:
if not i[4:6]==['+','-']:
direction_penal+=1
block_rd2[i2][i[6]]+=i[1]-i[0]
block_rd2[i2][i[7]]+=i[3]-i[2]
for i2 in list(Double_Read_ThroughBP.keys()):
for i in Double_Read_ThroughBP[i2]:
if i[6]==i[7]:
block_rd2[i2][i[6]]+=i[1]-i[0]
block_rd2[i2][i[8]]+=-i[2]+block_bp2[i2][i[8]][1]
block_rd2[i2][i[9]]+=i[3]-block_bp2[i2][i[9]][0]
#if -i[2]+block_bp2[i2][i[8]][1]>200 and i[8]=='a':
#print i
#if i[3]-block_bp2[i2][i[9]][0]>200 and i[9]=='a':
#print i
elif i[8]==i[9]:
block_rd2[i2][i[8]]+=i[3]-i[2]
block_rd2[i2][i[6]]+=-i[0]+block_bp2[i2][i[6]][1]
block_rd2[i2][i[7]]+=i[1]-block_bp2[i2][i[7]][0]
#if -i[0]+block_bp2[i2][i[6]][1]>101:
#print i
#if i[1]-block_bp2[i2][i[7]][0]>101:
#print i
else:
block_rd2[i2][i[6]]+=-i[0]+block_bp2[i2][i[6]][1]
block_rd2[i2][i[7]]+=i[1]-block_bp2[i2][i[7]][0]
block_rd2[i2][i[8]]+=-i[2]+block_bp2[i2][i[8]][1]
block_rd2[i2][i[9]]+=i[3]-block_bp2[i2][i[9]][0]
for i2 in list(Single_Read_ThroughBP.keys()):
for i in Single_Read_ThroughBP[i2]:
block_rd2[i2][i[2]]+=-i[0]+block_bp2[i2][i[2]][1]
block_rd2[i2][i[3]]+=i[1]-block_bp2[i2][i[3]][0]
for k1 in list(rd_low_qual.keys()):
for k2 in list(rd_low_qual[k1].keys()):
block_rds[k1][k2]+=rd_low_qual[k1][k2]
return [block_rds,block_rd2,Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP]
def bp_let_split(k2):
#eg of k2=[0, 101, 101, 847, 'a', 'left']
out=[[]]
for x in k2:
if out[-1]==[]:
out[-1].append(x)
else:
if type(x)==type(out[-1][-1]):
out[-1].append(x)
else:
out.append([x])
return out
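#eg of usage (illustrative): bp_let_split([0, 101, 101, 847, 'a', 'left']) returns [[0, 101, 101, 847], ['a', 'left']];
#consecutive elements of the same type are grouped together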
def Copy_num_Check_report(Copy_num_Check,Full_Info,chr_letter_bp):
out=[]
block_hash={}
for k1 in list(chr_letter_bp.keys()):
for k2 in list(chr_letter_bp[k1].keys()):
block_hash[k2]=[k1]+chr_letter_bp[k1][k2]
for x in Copy_num_Check:
out.append(block_hash[x[0]]+['CN='+str(int(float(Full_Info[1][x[0]])/float(GC_para_dict['GC_Mean_Coverage'][Chr]*2)))])
#out.append(block_hash[x[0]]+['CN='+str(int(x[1]))])
out_new=[]
for x in out:
if out_new==[]: out_new.append(x)
else:
if x[0]==out_new[-1][0] and x[1]==out_new[-1][2] and int(x[-1].split('=')[1])-int(out_new[-1][-1].split('=')[1])<2:
out_new[-1][2]=x[2]
else: out_new.append(x)
return out_new
def Cov_Cal_Block(pos,bp,cov,perc):
for j in range(len(bp)-2):
if not pos[0]<bp[j] and pos[0]<bp[j+1]:
if not pos[1]<bp[j] and pos[1]<bp[j+1]:
cov[j]+=(pos[1]-pos[0])*perc
elif not pos[1]<bp[j+1] and pos[1]<bp[j+2]:
cov[j]+=(bp[j+1]-pos[0])*perc
cov[j+1]+=(pos[1]-bp[j+1])*perc
elif not pos[1]<bp[j+2] and pos[1]<bp[j+3]:
cov[j]+=(bp[j+1]-pos[0])*perc
cov[j+1]+=(bp[j+2]-bp[j+1])*perc
cov[j+2]+=(pos[1]-bp[j+2])*perc
j=len(bp)-2
if not pos[0]<bp[j] and pos[0]<bp[j+1]:
if not pos[1]<bp[j] and pos[1]<bp[j+1]:
cov[j]+=(pos[1]-pos[0])*perc
else:
cov[j]+=(bp[j+1]-pos[0])*perc
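#eg of usage (illustrative): with bp=[0,100,200,300] and cov=[0,0,0], Cov_Cal_Block([50,150],bp,cov,1.0)
#updates cov in place to [50,50,0], splitting the 100bp read across the two bins it overlaps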
def calcu_chr_letter_bp_left(bps2):
out={}
for i in bps2:
if not i[0] in list(out.keys()):
out[i[0]]={}
out[i[0]]['a']=[i[1]-1000,i[1]]
return out
def calcu_chr_letter_bp_right(bps2):
out={}
for i in bps2:
if not i[0] in list(out.keys()):
out[i[0]]={}
out[i[0]]['a']=[i[-1],i[-1]+1000]
return out
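#eg of usage (illustrative): for bps2=[['chr1',2000,5000]], calcu_chr_letter_bp_left returns {'chr1': {'a': [1000, 2000]}}
#and calcu_chr_letter_bp_right returns {'chr1': {'a': [5000, 6000]}}: 1kb control windows flanking the event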
def calcu_k2_k3(k2):
#eg of k2:[2780427, 2780927, 2780927, 2782153, 2782153, 2782378, 2782378, 2782468, 2782468, 2782968, 'a', 'b', 'c', 'left', 'right']
k2a=[]
k2b=[]
for k3 in k2:
if type(k3)==type(1):
k2a.append(k3)
else:
k2b.append(k3)
return [k2a,k2b]
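#eg of usage (illustrative): with the k2 above, calcu_k2_k3(k2) returns the integer breakpoints and the block letters separately:
#[[2780427, 2780927, 2780927, 2782153, 2782153, 2782378, 2782378, 2782468, 2782468, 2782968], ['a', 'b', 'c', 'left', 'right']]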
def candidate_QC_Control(Read_List):
if Read_List==[]:
return []
else:
Qual_Filter_1=[]
for j in Read_List:
if not j[1]-j[0]>ReadLength+min_resolution and j[1]-j[0]>0 and not j[3]-j[2]>ReadLength+min_resolution and j[3]-j[2]>0:
Qual_Filter_1.append(j)
if not Qual_Filter_1==[]:
if len(Qual_Filter_1)==1:
Qual_Filter_1[0]+=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_1]
return Qual_Filter_1
else:
Qual_Filter_2=[]
for j2 in Qual_Filter_1:
if j2[-2:]==['+','-']:
Qual_Filter_2.append(j2)
if not Qual_Filter_2==[]:
if len(Qual_Filter_2)==1:
Qual_Filter_2[0]+=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
return Qual_Filter_2
else:
Qual_Filter_3=[]
Qual_IL=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
for jq in range(len(Qual_IL)):
if Qual_IL[jq]==max(Qual_IL) and not Qual_Filter_2[jq] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_2[jq]+[max(Qual_IL)])
return Qual_Filter_3
else:
Qual_Filter_2=Qual_Filter_1
if len(Qual_Filter_2)==1:
Qual_Filter_2[0]+=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
return Qual_Filter_2
else:
Qual_Filter_3=[]
Qual_IL=[pdf_calculate(max(j3[:4])-min(j3[:4]),GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero) for j3 in Qual_Filter_2]
for jq in range(len(Qual_IL)):
if Qual_IL[jq]==max(Qual_IL) and not Qual_Filter_1[jq] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_1[jq]+[max(Qual_IL)])
return Qual_Filter_3
else:
return []
def candidate_QC_Control2(M_Read_List,P_Read_List):
Qual_Filter_1=[]
for i in M_Read_List:
Qual_Filter_1.append(i+['m'])
for i in P_Read_List:
Qual_Filter_1.append(i+['p'])
Qual_Filter_2=[]
for i in Qual_Filter_1:
if i[-4:-2]==['+','-']:
Qual_Filter_2.append(i)
if not Qual_Filter_2==[]:
Qual_Filter_3=[]
IL_Qual=[abs(j3[3]-j3[0]-IL_Mean) for j3 in Qual_Filter_2]
for j in range(len(IL_Qual)):
if IL_Qual[j]==min(IL_Qual) and not Qual_Filter_2[j] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_2[j])
else:
Qual_Filter_2=Qual_Filter_1
Qual_Filter_3=[]
IL_Qual=[abs(j3[3]-j3[0]-IL_Mean) for j3 in Qual_Filter_2]
for j in range(len(IL_Qual)):
if IL_Qual[j]==min(IL_Qual) and not Qual_Filter_2[j] in Qual_Filter_3:
Qual_Filter_3.append(Qual_Filter_2[j])
return Qual_Filter_3
def calcu_IL_Norm(IL,file_in):
stat=readin_IL_Stat(file_in,model_comp='C')
return Prob_Norm(IL,stat[1],stat[2])
def calcu_RD_Norm(GC_para_dict,Initial_GCRD_Adj,Chr,Af_RD_Rec,Af_Letter):
Letters=[['left']+Af_Letter[0]+['right'],['left']+Af_Letter[1]+['right']]
Overall_Median_Coverage=float(GC_para_dict['GC_Overall_Median_Num'])
Theo_RD=GC_para_dict['GC_Overall_Median_Coverage'][str(Chr)]
Theo_Var=GC_para_dict['GC_Var_Coverage'][str(Chr)]
Prob_out=[]
if Af_Letter==[[], []]:
for i in list(Initial_GCRD_Adj.keys()):
if not i in ['left','right']:
Prob_out.append(Prob_Norm(Initial_GCRD_Adj[i]+Theo_RD/2,Theo_RD/2,Theo_Var))
else:
for i in Af_RD_Rec:
for j in i:
Prob_out.append(Prob_Norm(j,Theo_RD/2,Theo_Var/sqrt(2)))
return numpy.mean(Prob_out)
def calcu_PC_Norm(PC_list,PC_file):
#eg of PC_file='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/TBNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.Bimodal'
#return mean(log_pdf of physical coverages across all breakpoints)
PC_stat=readin_PC_Stat(PC_file,'C')
out=[]
for x in PC_list:
out.append(pdf_calculate(2.0*x,PC_stat[0],PC_stat[1],PC_stat[4],PC_stat[2],PC_stat[5],TB_Cut_Upper,TB_Cut_Lower,Penalty_For_InsertLengthZero))
return numpy.mean(out)
def calcu_PO_Stat(number_of_aberrant_pairs,slope,intercept):
#number_of_aberrant_pairs should be normalized to a per-100bp bin: number_of_aberrant_pairs_per_event/SV_length*100
#return log(p) of observing the current number of aberrantly oriented pairs
log_prob=slope*number_of_aberrant_pairs+intercept #log(p) of observing this many aberrantly oriented pairs per 100bp bin
return log_prob
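#eg of usage (illustrative): with a fitted slope=-0.5 and intercept=-1.0, calcu_PO_Stat(3.0,-0.5,-1.0) returns -2.5,
#the log-probability of observing 3 aberrantly oriented pairs per 100bp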
def chromos_readin_list(ref):
fin=open(ref+'.fai')
chromos=[]
for line in fin:
pin=line.strip().split()
chromos.append(pin[0])
fin.close()
return chromos
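#eg of usage (illustrative): chromos_readin_list('genome.fa') reads 'genome.fa.fai' (the samtools faidx index)
#and returns its first column, e.g. ['chr1', 'chr2', ...]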
def chr_letter_bp_modify(chr_letter_bp,flank=500):
#eg of chr_letter_bp:{'chr1': {'a': [101, 847, 45702596, 45703342], 'b': [45703342, 45703361]}}
for k1 in list(chr_letter_bp.keys()):
blocks_name=sorted([i for i in list(chr_letter_bp[k1].keys()) if not i in ['left','right']])
if not 'left' in list(chr_letter_bp[k1].keys()):
chr_letter_bp[k1]['left']=[max([chr_letter_bp[k1][blocks_name[0]][0]-flank,0]),chr_letter_bp[k1][blocks_name[0]][0]]
if not 'right' in list(chr_letter_bp[k1].keys()):
chr_letter_bp[k1]['right']=[chr_letter_bp[k1][blocks_name[-1]][1],chr_letter_bp[k1][blocks_name[-1]][1]+flank]
return chr_letter_bp
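#eg of usage (illustrative): with the chr_letter_bp above and the default flank=500, 'left' becomes [0, 101]
#(clipped at 0) and 'right' becomes [45703361, 45703861]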
def commandline_readin():
global workdir,seq_path,ref_path,ref_file,ref_index,ref_ppre,ref_prefix,GC_hash,genome_name,model_comp,Penalty_For_InsertLengthZero
workdir=dict_opts['--workdir']
seq_path=dict_opts['--seq-path']
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
ref_ppre=ref_path
ref_prefix='.'.join(ref_file.split('.')[:-1])
GC_hash=GC_Index_Readin(ref_prefix+'.GC_Content')
genome_name='genome'
model_comp='C'
Penalty_For_InsertLengthZero=-20
global QCAlign,tolerance_bp,min_resolution,Best_IL_Score,Best_RD_Score
[QCAlign,tolerance_bp,min_resolution,Best_IL_Score,Best_RD_Score]=[20,10,70,0,0]
def copy_num_estimate_calcu(GC_para_dict,BP_para_dict,bps2):
chr_letter_bp=letter_rearrange(BP_para_dict['flank'],bps2)
Initial_GCRD_Adj_pre=letter_RD_ReadIn(letter_RD_test_calcu(chr_letter_bp))
global Initial_GCRD_Adj
Initial_GCRD_Adj={}
for k1 in list(Initial_GCRD_Adj_pre.keys()):
for k2 in list(Initial_GCRD_Adj_pre[k1].keys()):
Initial_GCRD_Adj[k2]=Initial_GCRD_Adj_pre[k1][k2]
Initial_GCRD_Adj['left']=numpy.mean([GC_para_dict['GC_Mean_Coverage'][key_chr[0]] for key_chr in bps2])
Initial_GCRD_Adj['right']=numpy.mean([GC_para_dict['GC_Mean_Coverage'][key_chr[0]] for key_chr in bps2])
Copy_num_estimate={}
for i in list(Initial_GCRD_Adj.keys()):
if not i in ['left','right']:
Copy_num_estimate[i]=round(Initial_GCRD_Adj[i]*2/GC_para_dict['GC_Mean_Coverage'][Chr])
if Initial_GCRD_Adj[i]<float(GC_para_dict['GC_Mean_Coverage'][Chr])/10.0:
Copy_num_estimate[i]=-1
Copy_num_Check=[]
for CNE in list(Copy_num_estimate.keys()):
if Copy_num_estimate[CNE]>4:
Copy_num_Check.append([CNE,Copy_num_estimate[CNE]])
return [Copy_num_estimate,Copy_num_Check]
def c_Coverage_Calculate_InfoList(Full_Info,Chromo,bp_MP,letter_MP,original_bp_list,flank):
bp_M=[i-original_bp_list[0] for i in bp_MP[0]]
bp_P=[i-original_bp_list[0] for i in bp_MP[1]]
M_New_bp=[bp_M[0]-flank]+bp_M+[bp_M[-1]+flank]
P_New_bp=[bp_P[0]-flank]+bp_P+[bp_P[-1]+flank]
M_coverage=Block_Assign_To_Letters(bp_MP[0],letter_MP[0],flank)
P_coverage=Block_Assign_To_Letters(bp_MP[1],letter_MP[1],flank)
for key in list(M_coverage.keys()):
M_coverage[key].append(0)
for key in list(P_coverage.keys()):
P_coverage[key].append(0)
for key in list(Half_Info.keys()):
Half=Half_Info[key]
if Half[0]<-flank-Window_Size: continue
else:
if Half[-1]=='M':
M_coverage[(Half[0]-(M_New_bp[0]))/Window_Size+1][-1]+=1
elif Half[-1]=='P':
P_coverage[(Half[0]-(P_New_bp[0]))/Window_Size+1][-1]+=1
return [M_coverage,P_coverage]
def c_GCContent_Calculate_InfoList(Ori_1_Seq,original_bp_list,flank):
region_length=original_bp_list[-1]-original_bp_list[0]+2*flank
region_length_new=(region_length/100+1)*100-2*flank
Number_Of_Blocks=len(Ori_1_Seq)/100
GC_Content={}
for i in range(Number_Of_Blocks):
GC_Content[i+1]=GC_Content_Calculate(Ori_1_Seq[i*100:(i+1)*100])[0]
return GC_Content
def c_Coverage_Calculate_2a(Letter_Single,Letter_Double,Chromo,original_bp_list,original_letters,flank):
letter_list=original_letters
bp_list=[i-original_bp_list[0] for i in original_bp_list]
bp_list_new=[bp_list[0]-flank]+bp_list+[bp_list[-1]+flank]
coverage=Block_Assign_To_Letters(bp_list,letter_list,flank)
for key in list(coverage.keys()):
coverage[key].append(0)
for key in list(Letter_Single.keys()):
for i in Letter_Single[key]:
keynumL=(i[0]+flank)/Window_Size+1
keynumR=(i[1]+flank)/Window_Size+1
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
else:
coverage[keynumR][-1]+=1
for key in list(Letter_Double.keys()):
for i in Letter_Double[key]:
keynumL=(i[0]+flank)/Window_Size+1
keynumR=(i[1]+flank)/Window_Size+1
if keynumL in list(coverage.keys()) and keynumR in list(coverage.keys()):
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
else:
coverage[keynumR][-1]+=1
keynumL=(i[2]+flank)/Window_Size+1
keynumR=(i[3]+flank)/Window_Size+1
if keynumL in list(coverage.keys()) and keynumR in list(coverage.keys()):
lenL=coverage[keynumL][1]-i[2]
lenR=i[3]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
else:
coverage[keynumR][-1]+=1
return coverage
def c_Coverage_Calculate_2b(Letter_Through,Chromo,original_bp_list,original_letters,flank):
#Eg of RD_Full_Info_of_Reads (a hash list) elements: 'HWI-ST177_136:2:1:7920:85270': [1202, 1302, 1443, 1543, '+', '-']
letter_list=original_letters
bp_list=[i-original_bp_list[0] for i in original_bp_list]
bp_list_new=[bp_list[0]-flank]+bp_list+[bp_list[-1]+flank]
coverage=Block_Assign_To_Letters(bp_list,letter_list,flank)
for key in list(coverage.keys()):
coverage[key].append(0)
for key in list(Letter_Through.keys()):
i=Letter_Through[key]
keynumL=(i[0]+flank)/Window_Size+1
keynumR=(i[1]+flank)/Window_Size+1
lenL=coverage[keynumL][1]-i[0]
lenR=i[1]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
elif lenL<lenR:
coverage[keynumR][-1]+=1
elif lenL==lenR:
coverage[keynumL][-1]+=0.5
coverage[keynumR][-1]+=0.5
keynumL=(i[2]+flank)/Window_Size+1
keynumR=(i[3]+flank)/Window_Size+1
lenL=coverage[keynumL][1]-i[2]
lenR=i[3]-coverage[keynumR][0]+1
if lenL>lenR:
coverage[keynumL][-1]+=1
elif lenL<lenR:
coverage[keynumR][-1]+=1
elif lenL==lenR:
coverage[keynumL][-1]+=0.5
coverage[keynumR][-1]+=0.5
return coverage
def c_Coverage_Calculate_2d(Full_Info,Chromo,bp_MP,letter_MP,original_bp_list,flank):
#Eg of RD_Full_Info_of_Reads (a hash list) elements: 'HWI-ST177_136:2:1:7920:85270': [1202, 1302, 1443, 1543, '+', '-']
bp_M=[i-original_bp_list[0] for i in bp_MP[0]]
bp_P=[i-original_bp_list[0] for i in bp_MP[1]]
M_New_bp=[bp_M[0]-flank]+bp_M+[bp_M[-1]+flank]
P_New_bp=[bp_P[0]-flank]+bp_P+[bp_P[-1]+flank]
M_coverage=Block_Assign_To_Letters(bp_MP[0],letter_MP[0],flank)
P_coverage=Block_Assign_To_Letters(bp_MP[1],letter_MP[1],flank)
for key in list(M_coverage.keys()):
M_coverage[key].append(0)
for key in list(P_coverage.keys()):
P_coverage[key].append(0)
for key in list(Full_Info.keys()):
if not len(Full_Info[key])==8:
Halfa=Full_Info[key][:2]+[Full_Info[key][4]]+[Full_Info[key][6]]
Halfb=Full_Info[key][2:4]+[Full_Info[key][5]]+[Full_Info[key][6]]
for Half in [Halfa,Halfb]:
if Half[0]<-flank-Window_Size: continue
else:
if Half[-1]=='M':
M_coverage[(Half[0]-(M_New_bp[0]))/Window_Size+1][-1]+=1
elif Half[-1]=='P':
P_coverage[(Half[0]-(P_New_bp[0]))/Window_Size+1][-1]+=1
elif len(Full_Info[key])==8:
Halfa=Full_Info[key][:2]+[Full_Info[key][4]]+[Full_Info[key][6]]
Halfb=Full_Info[key][2:4]+[Full_Info[key][5]]+[Full_Info[key][6]]
for Half in [Halfa,Halfb]:
if Half[0]<-flank-Window_Size: continue
else:
if Half[-1]=='M':
M_coverage[(Half[0]-(M_New_bp[0]))/Window_Size+1][-1]+=float(Full_Info[key][7])
elif Half[-1]=='P':
P_coverage[(Half[0]-(P_New_bp[0]))/Window_Size+1][-1]+=float(Full_Info[key][7])
return [M_coverage,P_coverage]
def c_Coverage_Calculate_2e(Af_Info,Chromo,bp_MP,letter_MP,original_bp_list,flank):
#Eg of RD_Full_Info_of_Reads (a hash list) elements: 'HWI-ST177_136:2:1:7920:85270': [1202, 1302, 1443, 1543, '+', '-']
hashM={}
for i in letter_MP[0]:
if not i[0] in list(hashM.keys()):
hashM[i[0]]=[i[0]]
if (letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))>1:
hashM[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))[1:]]
hashP={}
for i in letter_MP[1]:
if not i[0] in list(hashP.keys()):
hashP[i[0]]=[i[0]]
if (letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))>1:
hashP[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))[1:]]
hashMPLetterBP={}
hashMPLetterBP['M']={}
hashMPLetterBP['P']={}
for j in range(len(letter_MP[0])):
hashMPLetterBP['M'][hashM[letter_MP[0][j][0]][0]]=[bp_MP[0][j],bp_MP[0][j+1]]
hashM[letter_MP[0][j][0]].remove(hashM[letter_MP[0][j][0]][0])
for j in range(len(letter_MP[1])):
hashMPLetterBP['P'][hashP[letter_MP[1][j][0]][0]]=[bp_MP[1][j],bp_MP[1][j+1]]
hashP[letter_MP[1][j][0]].remove(hashP[letter_MP[1][j][0]][0])
hashM={}
hashM['left']=['left']
hashM['right']=['right']
for i in letter_MP[0]:
if not i[0] in list(hashM.keys()):
hashM[i[0]]=[i[0]]
if (letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))>1:
hashM[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[0].count(i[0])+letter_MP[0].count(i[0]+'^'))[1:]]
hashP={}
hashP['left']=['left']
hashP['right']=['right']
for i in letter_MP[1]:
if not i[0] in list(hashP.keys()):
hashP[i[0]]=[i[0]]
if (letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))>1:
hashP[i[0]]+=[i[0]+'_'+str(j) for j in range(letter_MP[1].count(i[0])+letter_MP[1].count(i[0]+'^'))[1:]]
M_Coverage={}
M_Coverage['left']=0
for key_1 in list(hashMPLetterBP['M'].keys()):
M_Coverage[key_1]=[0 for i in range((hashMPLetterBP['M'][key_1][1]-hashMPLetterBP['M'][key_1][0])/Window_Size)]
if ((hashMPLetterBP['M'][key_1][1]-hashMPLetterBP['M'][key_1][0])-(hashMPLetterBP['M'][key_1][1]-hashMPLetterBP['M'][key_1][0])/Window_Size*Window_Size)>30:
M_Coverage[key_1].append(0)
P_Coverage={}
P_Coverage['left']=0
for key_1 in list(hashMPLetterBP['P'].keys()):
P_Coverage[key_1]=[0 for i in range((hashMPLetterBP['P'][key_1][1]-hashMPLetterBP['P'][key_1][0])/Window_Size)]
if ((hashMPLetterBP['P'][key_1][1]-hashMPLetterBP['P'][key_1][0])-(hashMPLetterBP['P'][key_1][1]-hashMPLetterBP['P'][key_1][0])/Window_Size*Window_Size)>30:
P_Coverage[key_1].append(0)
for key in list(Af_Info.keys()):
if Af_Info[key][0]==Af_Info[key][1]==Af_Info[key][2]==Af_Info[key][3]==(-flank/2):
M_Coverage['left']+=0.5
P_Coverage['left']+=0.5
else:
if key in list(Letter_Through.keys()):
if Af_Info[key][6]=='M':
lele=hashM[Letter_Through[key][6]]
rile=hashM[Letter_Through[key][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['M'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
M_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])
if lebl[1]/Window_Size<len(M_Coverage[lele1]):
M_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['M'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
M_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])
if ribl[1]/Window_Size<len(M_Coverage[rile1]):
M_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])
if Af_Info[key][6]=='P':
lele=hashP[Letter_Through[key][6]]
rile=hashP[Letter_Through[key][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['P'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
P_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])
if lebl[1]/Window_Size<len(P_Coverage[lele1]):
P_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['P'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
P_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])
if ribl[1]/Window_Size<len(P_Coverage[rile1]):
P_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])
if not key in list(Letter_Through.keys()):
key2='_'.join(key.split('_')[:-1])
if Af_Info[key][6]=='M':
lele=hashM[Letter_Through[key2][6]]
rile=hashM[Letter_Through[key2][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['M'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
M_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
if lebl[1]/Window_Size<len(M_Coverage[lele1]):
M_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['M'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
M_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
if ribl[1]/Window_Size<len(M_Coverage[rile1]):
M_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
if Af_Info[key][6]=='P':
lele=hashP[Letter_Through[key2][6]]
rile=hashP[Letter_Through[key2][9]]
lebl=Af_Info[key][:2]
ribl=Af_Info[key][2:4]
for lele1 in lele:
if lele1=='left' or lele1=='right': continue
block=[lele2-bps[0] for lele2 in hashMPLetterBP['P'][lele1]]
if numpy.min(lebl)+15>block[0] and numpy.max(lebl)-15<block[1]:
lebl=[k-block[0] for k in lebl]
P_Coverage[lele1][lebl[0]/Window_Size]+=float(lebl[0]/Window_Size*Window_Size+Window_Size-lebl[0])/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
if lebl[1]/Window_Size<len(P_Coverage[lele1]):
P_Coverage[lele1][lebl[1]/Window_Size]+=float(lebl[1]-lebl[1]/Window_Size*Window_Size)/float(lebl[1]-lebl[0])*float(Af_Info[key][7])
for rile1 in rile:
if rile1=='left' or rile1=='right':continue
block=[rile2-bps[0] for rile2 in hashMPLetterBP['P'][rile1]]
if numpy.min(ribl)+15>block[0] and numpy.max(ribl)-15<block[1]:
ribl=[k-block[0] for k in ribl]
P_Coverage[rile1][ribl[0]/Window_Size]+=float(ribl[0]/Window_Size*Window_Size+Window_Size-ribl[0])/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
if ribl[1]/Window_Size<len(P_Coverage[rile1]):
P_Coverage[rile1][ribl[1]/Window_Size]+=float(ribl[1]-ribl[1]/Window_Size*Window_Size)/float(ribl[1]-ribl[0])*float(Af_Info[key][7])
return [M_Coverage,P_Coverage]
def Define_Default_SVPredict(dict_opts):
global tolerance_bp
tolerance_bp=10
global min_resolution
min_resolution=70
global Best_IL_Score
Best_IL_Score=0
global Best_RD_Score
Best_RD_Score=0
deterministic_flag=0
if '--deterministic-flag' in list(dict_opts.keys()):
deterministic_flag=int(dict_opts['--deterministic-flag'])
global Penalty_For_InsertLengthZero
Penalty_For_InsertLengthZero=-20 #placeholder value; to be refined later
global model_comp
if not '--null-model' in list(dict_opts.keys()):
model_comp='C'
else:
if dict_opts['--null-model'] in ['S','Simple']:
model_comp='S'
else:
model_comp='C'
global Ploidy
if '--ploidy' in list(dict_opts.keys()):
Ploidy=int(dict_opts['--ploidy'])
else:
Ploidy=2
global QCAlign
if '--qc-align' in list(dict_opts.keys()):
QCAlign=int(dict_opts['--qc-align'])
else:
QCAlign=20
global genome_name
if '--NullGenomeName' in list(dict_opts.keys()):
genome_name=dict_opts['--NullGenomeName']
else:
genome_name='genome'
global Trail_Number
if '--num-iteration' in list(dict_opts.keys()):
Trail_Number=int(dict_opts['--num-iteration'])
else:
Trail_Number=100000
global Local_Minumum_Number
Local_Minumum_Number=100
global IL_Weight,DR_Weight,TB_Weight
[IL_Weight,RD_Weight,TB_Weight]=[1,5,5]
global chromos_all,single_file,seq_file_names
chromos_all=chromos_readin_list(ref_file)
single_file=dict_opts['-f']
seq_file_names=seq_file_name_readin(seq_path)
def Full_Info_of_Reads_Product(Initial_Bam,bps,total_bps,total_letters,bamChr,flank,QCAlign,ReadLength,chr_link):
# letters=[chr(97+i) for i in range(len(bps)-1)]
temp_bp=total_bps
temp_let=total_letters
BlockCov={}
for j in temp_let:
BlockCov[j]=0
Letter_Double={}
Pair_ThroughBP=[]
Double_Read_ThroughBP=[]
Single_Read_ThroughBP=[]
blackList=[]
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,bamChr,bps[0]-flank,bps[-1]+flank))
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
if not int(pbam[4])>QCAlign:
continue
if pbam[0] in blackList: continue
if int(pbam[1])&8>0 or not pbam[6]=='=':
pos1=int(pbam[3])+low_qual_edge
pos2=int(pbam[3])+cigar2reaadlength(pbam[5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(pbam[5])
else:
rela_1=pos1-low_qual_edge-temp_bp[temp_let.index(block1)]
rela_2=pos2+low_qual_edge-temp_bp[temp_let.index(block2)]
Single_Read_ThroughBP.append([block1,rela_1,block2,rela_2,pbam[5]])
if not pbam[6]=='=':
if not pbam[0] in list(chr_link.keys()):
chr_link[pbam[0]]=[pbam[1:9]]
else:
chr_link[pbam[0]]+=[pbam[1:9]]
elif int(pbam[1])&8==0:
if pbam[6]=='=':
if not pbam[0] in list(Letter_Double.keys()):
Letter_Double[pbam[0]]=[pbam[:9]]
else:
if not pbam[:9] in Letter_Double[pbam[0]]:
Letter_Double[pbam[0]]+=[pbam[:9]]
if int(Letter_Double[pbam[0]][0][3])<int(Letter_Double[pbam[0]][1][3]):
pos1=int(Letter_Double[pbam[0]][0][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][1][3])+cigar2reaadlength(Letter_Double[pbam[0]][1][5])-low_qual_edge
else:
pos1=int(Letter_Double[pbam[0]][1][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][0][3])+cigar2reaadlength(Letter_Double[pbam[0]][0][5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(Letter_Double[pbam[0]][0][5])
BlockCov[block1]+=cigar2reaadlength(Letter_Double[pbam[0]][1][5])
del Letter_Double[pbam[0]]
blackList.append(pbam[0])
fbam.close()
for key in list(Letter_Double.keys()):
if key in blackList:
del Letter_Double[key]
continue
if len(Letter_Double[key])==2:
pos1=int(Letter_Double[key][0][3])
pos2=int(Letter_Double[key][1][3])
if not pos1>pos2:
pos1=int(Letter_Double[key][0][3])
pos1b=pos1+cigar2reaadlength(Letter_Double[key][0][5])
pos2=int(Letter_Double[key][1][3])
pos2b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][0][1])
elif pos1>pos2:
pos1=int(Letter_Double[key][1][3])
pos1b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
pos2=int(Letter_Double[key][0][3])
pos2b=pos2+cigar2reaadlength(Letter_Double[key][0][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][1][1])
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1+low_qual_edge)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2+low_qual_edge)
block1b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1b-low_qual_edge)
block2b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2b-low_qual_edge)
rela_1=pos1-temp_bp[temp_let.index(block1)]
rela_2=pos2-temp_bp[temp_let.index(block2)]
rela_1b=pos1b-temp_bp[temp_let.index(block1b)]
rela_2b=pos2b-temp_bp[temp_let.index(block2b)]
if block1==block1b and block2==block2b:
Pair_ThroughBP.append([block1,rela_1,rela_1b, block2,rela_2,rela_2b]+direct_temp)
else:
Double_Read_ThroughBP.append([block1,rela_1,block1b,rela_1b, block2,rela_2,block2b,rela_2b]+direct_temp)
del Letter_Double[key]
elif len(Letter_Double[key])==1:
if Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][7]))==0:
if Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))==Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3])+cigar2reaadlength(Letter_Double[key][0][5])):
BlockCov[Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))]+=cigar2reaadlength(Letter_Double[key][0][5])
del Letter_Double[key]
Initial_DR_Penal=0
for j in Pair_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Double_Read_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
Initial_Cov={}
for j in temp_let:
Initial_Cov[j]=0
for j in Pair_ThroughBP:
Initial_Cov[j[0]]+=j[2]-j[1]
Initial_Cov[j[3]]+=j[5]-j[4]
for j in Single_Read_ThroughBP:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
for j in Double_Read_ThroughBP:
if j[0]==j[2]:
Initial_Cov[j[0]]+=j[3]-j[1]
else:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
if j[4]==j[6]:
Initial_Cov[j[4]]+=j[7]-j[5]
else:
Initial_Cov[j[4]]+=temp_bp[temp_let.index(j[4])+1]-temp_bp[temp_let.index(j[4])]-j[5]
Initial_Cov[j[6]]+=j[7]
Initial_IL=[]
for j in Pair_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[3])]-temp_bp[temp_let.index(j[0])]-j[1]+j[5])
for j in Double_Read_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[6])]-temp_bp[temp_let.index(j[0])]-j[1]+j[7])
Initial_ILPenal=[]
for j in Initial_IL:
Initial_ILPenal+=[pdf_calculate(j,GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero)/len(Initial_IL)]
return [Initial_DR_Penal,Initial_ILPenal,Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,BlockCov,Initial_Cov,Letter_Double]
def Full_Info_of_Reads_Product_3(Initial_Bam,temp_bp,temp_let,bamChr,target_region,Chr_Link):
Letter_Double={}
Pair_ThroughBP=[]
Double_Read_ThroughBP=[]
Single_Read_ThroughBP=[]
blackList=[]
#assumed initialization of the per-block accumulators used below, mirroring Full_Info_of_Reads_Product
BlockCov=dict([(j,0) for j in temp_let])
Initial_Cov=dict([(j,0) for j in temp_let])
Initial_IL=[]
fbam=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,bamChr,target_region[0]-flank,target_region[-1]+flank))
num_of_reads=0
while True:
pbam=fbam.readline().strip().split()
if not pbam: break
if int(pbam[1])&4>0: continue
if int(pbam[1])&1024>0:continue
if not int(pbam[4])>QCAlign or int(pbam[1])&512>0:
blackList.append(pbam[0])
continue
if pbam[0] in blackList: continue
num_of_reads+=1
if int(pbam[1])&8>0 or not pbam[6]=='=':
pos1=int(pbam[3])+low_qual_edge
pos2=int(pbam[3])+cigar2reaadlength(pbam[5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(pbam[5])
else:
reg1a=temp_bp[temp_let.index(block1)]
reg1b=temp_bp[temp_let.index(block1)+1]
reg2a=temp_bp[temp_let.index(block2)]
reg2b=temp_bp[temp_let.index(block2)+1]
rela_1=pos1-low_qual_edge-temp_bp[temp_let.index(block1)]
rela_2=pos2+low_qual_edge-temp_bp[temp_let.index(block2)]
Single_Read_ThroughBP.append([block1,rela_1,block2,rela_2,pbam[5]])
if not pbam[6]=='=':
if not pbam[0] in Chr_Link:
Chr_Link[pbam[0]]=[pbam[1:9]]
else:
Chr_Link[pbam[0]]+=[pbam[1:9]]
elif int(pbam[1])&8==0:
if pbam[6]=='=':
if not pbam[0] in list(Letter_Double.keys()):
Letter_Double[pbam[0]]=[pbam[:9]]
else:
if not pbam[:9] in Letter_Double[pbam[0]]:
Letter_Double[pbam[0]]+=[pbam[:9]]
if int(Letter_Double[pbam[0]][0][3])<int(Letter_Double[pbam[0]][1][3]):
pos1=int(Letter_Double[pbam[0]][0][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][1][3])+cigar2reaadlength(Letter_Double[pbam[0]][1][5])-low_qual_edge
else:
pos1=int(Letter_Double[pbam[0]][1][3])+low_qual_edge
pos2=int(Letter_Double[pbam[0]][0][3])+cigar2reaadlength(Letter_Double[pbam[0]][0][5])-low_qual_edge
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2)
if block1==block2:
BlockCov[block1]+=cigar2reaadlength(Letter_Double[pbam[0]][0][5])
del Letter_Double[pbam[0]]
blackList.append(pbam[0])
fbam.close()
for key in list(Letter_Double.keys()):
if key in blackList:
del Letter_Double[key]
continue
if len(Letter_Double[key])==2:
pos1=int(Letter_Double[key][0][3])
pos2=int(Letter_Double[key][1][3])
if not pos1>pos2:
pos1=int(Letter_Double[key][0][3])
pos1b=pos1+cigar2reaadlength(Letter_Double[key][0][5])
pos2=int(Letter_Double[key][1][3])
pos2b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][0][1])
elif pos1>pos2:
pos1=int(Letter_Double[key][1][3])
pos1b=pos2+cigar2reaadlength(Letter_Double[key][1][5])
pos2=int(Letter_Double[key][0][3])
pos2b=pos2+cigar2reaadlength(Letter_Double[key][0][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double[key][1][1])
block1=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1+low_qual_edge)
block2=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2+low_qual_edge)
block1b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos1b-low_qual_edge)
block2b=Reads_block_assignment_1(flank,temp_bp,temp_let,pos2b-low_qual_edge)
rela_1=pos1-temp_bp[temp_let.index(block1)]
rela_2=pos2-temp_bp[temp_let.index(block2)]
rela_1b=pos1b-temp_bp[temp_let.index(block1b)]
rela_2b=pos2b-temp_bp[temp_let.index(block2b)]
if block1==block1b and block2==block2b:
Pair_ThroughBP.append([block1,rela_1,rela_1b, block2,rela_2,rela_2b]+direct_temp)
else:
Double_Read_ThroughBP.append([block1,rela_1,block1b,rela_1b, block2,rela_2,block2b,rela_2b]+direct_temp)
del Letter_Double[key]
elif len(Letter_Double[key])==1:
if Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))==Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3])+cigar2reaadlength(Letter_Double[key][0][5])):
BlockCov[Reads_block_assignment_1(flank,temp_bp,temp_let,int(Letter_Double[key][0][3]))]+=cigar2reaadlength(Letter_Double[key][0][5])
del Letter_Double[key]
Initial_DR_Penal=0
for j in Pair_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Double_Read_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Pair_ThroughBP:
Initial_Cov[j[0]]+=j[2]-j[1]
Initial_Cov[j[3]]+=j[5]-j[4]
for j in Single_Read_ThroughBP:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
for j in Double_Read_ThroughBP:
if j[0]==j[2]:
Initial_Cov[j[0]]+=j[3]-j[1]
else:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
if j[4]==j[6]:
Initial_Cov[j[4]]+=j[7]-j[5]
else:
Initial_Cov[j[4]]+=temp_bp[temp_let.index(j[4])+1]-temp_bp[temp_let.index(j[4])]-j[5]
Initial_Cov[j[6]]+=j[7]
for j in Pair_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[3])]-temp_bp[temp_let.index(j[0])]-j[1]+j[5])
for j in Double_Read_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[6])]-temp_bp[temp_let.index(j[0])]-j[1]+j[7])
return [Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,num_of_reads,Initial_DR_Penal]
def find_file_under_path(NullPath,appdix):
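#return full paths of the files under NullPath whose extension matches appdix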
NullPath=path_modify(NullPath)
out=[]
for k1 in os.listdir(NullPath):
if os.path.isfile(NullPath+k1):
if k1.split('.')[-1]==appdix:
out.append(NullPath+k1)
return out
def file_straight_readin(file_in):
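#read a whitespace-delimited text file into a list of split lines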
info=[]
fin=open(file_in)
for line in fin:
pin=line.strip().split()
info.append(pin)
fin.close()
return info
def Full_Info_of_Reads_Integrate(GC_para_dict,BP_para_dict,bps2):
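#integrate per-block read depth, GC content and read-pair evidence around the candidate breakpoints in bps2; 5kb windows to the left and right of the event are used as read-depth controls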
bps2_left=[]
bps2_right=[]
for x in bps2:
bps2_left.append([x[0],x[1]-5000,x[1]])
bps2_right.append([x[0],x[-1],x[-1]+5000])
chr_letter_bp=letter_rearrange(BP_para_dict['flank'],bps2)
letter_GC=letter_GC_ReadIn(chr_letter_bp)
letter_RD_test=letter_RD_ReadIn(letter_RD_test_calcu(chr_letter_bp))
if len(bps2)==1 and len(bps2[0])==3 and letter_RD_test[bps2[0][0]]['a']>GC_para_dict['GC_Overall_Median_Coverage'][bps2[0][0]]*4:
return [letter_RD_test[bps2[0][0]],letter_RD_test[bps2[0][0]],0,0,[],[],[],letter_GC[bps2[0][0]]]+original_bp_let_produce(chr_letter_bp,bps2)
letter_RD=letter_RD_ReadIn(chr_letter_bp)
Multi_Dup=multi_dup_define(letter_RD,GC_para_dict['GC_Overall_Median_Coverage'])
global letter_RD_left_control
letter_RD_left_control=letter_RD_ReadIn(letter_rearrange(BP_para_dict['flank'],bps2_left))
global letter_RD_right_control
letter_RD_right_control=letter_RD_ReadIn(letter_rearrange(BP_para_dict['flank'],bps2_right))
letter_range_report(BP_para_dict['flank'],chr_letter_bp)
blocks_read_in=block_Read_From_Bam(chr_letter_bp)
read_info=block_Info_ReadIn(GC_para_dict,BP_para_dict,chr_letter_bp,blocks_read_in,Multi_Dup)
block_rds=read_info[0]
block_rd2=read_info[1]
letter_RD2={}
for k1 in list(letter_RD.keys()):
for k2 in list(letter_RD[k1].keys()):
if k2 in Multi_Dup:
letter_RD2[k2]=letter_RD[k1][k2]
if not k1 in list(block_rd2.keys()):
block_rd2[k1]={}
if not k2 in list(block_rd2[k1].keys()):
block_rd2[k1][k2]=0
else:
if len(chr_letter_bp[k1][k2])==4:
letter_RD2[k2]=letter_RD[k1][k2]*(chr_letter_bp[k1][k2][2]-chr_letter_bp[k1][k2][1])/(chr_letter_bp[k1][k2][3]-chr_letter_bp[k1][k2][0])
else:
letter_RD2[k2]=letter_RD[k1][k2]
for k1 in list(block_rds.keys()):
for k2 in list(block_rds[k1].keys()):
if not k2 in ['left','right']:
if not chr_letter_bp[k1][k2][-1]==chr_letter_bp[k1][k2][0]:
letter_RD2[k2]+=float(block_rds[k1][k2])/float(chr_letter_bp[k1][k2][-1]-chr_letter_bp[k1][k2][0])
Pair_ThroughBP=rela_Pair_ThroughBP(chr_letter_bp,read_info[2])
Double_Read_ThroughBP=rela_Pair_Double_Read_ThroughBP(chr_letter_bp,read_info[3])
Single_Read_ThroughBP=read_Pair_Single_Read_ThroughBP(chr_letter_bp,read_info[4])
Initial_RD=total_rd_calcu(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],letter_RD2,letter_GC,chr_letter_bp,block_rd2)
DR_Penal=DR_Penal_Calcu(read_info)
IL_Penal=IL_Penal_Calcu(read_info,GC_para_dict['IL_Statistics'],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero)
letter_GC_out={}
for k1 in list(letter_GC.keys()):
for k2 in list(letter_GC[k1].keys()):
letter_GC_out[k2]=letter_GC[k1][k2]
return [letter_RD2,Initial_RD,DR_Penal,numpy.mean(IL_Penal),Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,letter_GC_out]+original_bp_let_produce(chr_letter_bp,bps2)
def flank_region_calcu(k1,k2b):
#eg of k1='chr1'
#eg of k2b=['a', 'b', 'c', 'left', 'right']
flank_region=[]
for k3 in k2b:
flank_region+=block_bps[k1][k3]
flank_region=[min(flank_region),max(flank_region)]
return flank_region
def GC_RD_Prepare(ref_file,Chromosome,Coverage,GC_Content_Coverage):
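#summarize GC-stratified coverage into per-chromosome median/mean/std/variance globals (GC_*_Coverage) and per-GC-bin median counts (GC_Median_Num) used later for GC correction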
global GC_Overall_Median_Coverage
GC_Overall_Median_Coverage={}
global GC_Overall_Median_Num
GC_Overall_Median_Num=[]
global GC_Median_Coverage
GC_Median_Coverage={}
global GC_Median_Num
GC_Median_Num={}
global GC_Mean_Coverage
GC_Mean_Coverage={}
global GC_Std_Coverage
GC_Std_Coverage={}
global GC_Var_Coverage
GC_Var_Coverage={}
for a in Chromosome:
if a in list(GC_Content_Coverage.keys()):
GC_Overall_temp=[]
for b in Coverage:
if not b in list(GC_Content_Coverage[a].keys()): continue
if not b in list(GC_Median_Num.keys()):
GC_Median_Num[b]=[]
if len(GC_Content_Coverage[a][b][0])==2: continue
elif len(GC_Content_Coverage[a][b][0])>2:
num_list=[float(c) for c in GC_Content_Coverage[a][b][0][2:].split(',')]
if not sum(num_list)==0:
GC_Median_Num[b]+=num_list
GC_Overall_Median_Num+=num_list
GC_Overall_temp=GC_Overall_temp+num_list
if not Median_Pick(num_list)==0.0:
if not a in list(GC_Median_Coverage.keys()):
GC_Median_Coverage[a]={}
GC_Median_Coverage[a][b]=Median_Pick(num_list)
if len(GC_Overall_temp)==0: continue
if sum(GC_Overall_temp)==0.0: continue
elif len(GC_Overall_temp)>0:
GC_Overall_Median_Coverage[a]=Median_Pick(GC_Overall_temp)
GC_Mean_Coverage[a]=numpy.mean(GC_Overall_temp)
GC_Std_Coverage[a]=numpy.std(GC_Overall_temp)
GC_Var_Coverage[a]=(GC_Std_Coverage[a])**2
GC_Overall_Median_Num=Median_Pick([i for i in GC_Overall_Median_Num if not i==0])
for a in list(GC_Median_Num.keys()):
if GC_Median_Num[a]==[]:
GC_Median_Num[a]=GC_Overall_Median_Num
else:
GC_Median_Num[a]=Median_Pick(GC_Median_Num[a])
GC_Median_Num=GC_Median_Num_Correct(GC_Median_Num)
ChrN_Median_Coverage={}
for i in list(GC_Median_Coverage.keys()):
for j in list(GC_Median_Coverage[i].keys()):
if not j in list(ChrN_Median_Coverage.keys()):
ChrN_Median_Coverage[j]=[GC_Median_Coverage[i][j]]
else:
ChrN_Median_Coverage[j]+=[GC_Median_Coverage[i][j]]
[chrom_N,chrom_X,chrom_Y,GC_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage]=GC_RD_Info_Complete(ref_file,GC_Median_Coverage,ChrN_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage,Chromosome)
return [chrom_N,chrom_X,chrom_Y,GC_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage,GC_Median_Num]
def geno_Stat_Modify(P_IL,P_DR,P_RD,P_TB):
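#shift each log-likelihood component by its maximum (IL_max/RD_max/PC_max, or the list maximum for DR) so the best possible score is 0; the physical-coverage term is down-weighted by 0.2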
P_IL_new=[i-IL_max for i in P_IL]
P_RD_new=[i-RD_max for i in P_RD]
P_TB_new=[i-PC_max for i in P_TB]
P_TB_new=[i*0.2 for i in P_TB_new]#down-weight the physical-coverage term
P_DR_new=P_DR
P_DR_new=[i-max(P_DR_new) for i in P_DR_new]
return [P_IL_new,P_DR_new,P_RD_new,P_TB_new]
def geno_Stat_Integrate(P_IL_new,P_DR_new,P_RD_new,P_TB_new):
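#sum the shifted components per candidate structure and softmax-normalize them: prob[i]=exp(S_i-max(S))/sum_j exp(S_j-max(S))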
out=[sum([P_IL_new[i],P_DR_new[i],P_RD_new[i],P_TB_new[i]]) for i in range(len(P_IL_new))]
out=[i-max(out) for i in out]
prob_scale=[exp(i) for i in out]
prob_norm=[i/sum(prob_scale) for i in prob_scale]
return prob_norm
def genotype_SVs_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash,Be_BP_Letter,Be_Info,structure_candidates):
#Letter_Candidates=[[[],[]],[['a'], []],[['a^'], []],[['a'], ['a']],[['a^'], ['a']],[['a^'], ['a^']],[['a','a^'], []],[['a^','a'], []],[['a^','a^'], []]]
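#compute normalized genotype likelihoods for each candidate structure of a single SV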
Letter_Candidates=structure_candidates
[P_IL,P_DR,P_RD,P_TB]=Af_Rearrange_Info_Collect(GC_para_dict,BP_para_dict,Be_BP_Letter,Be_Info,Letter_Candidates)
[P_IL_new,P_DR_new,P_RD_new,P_TB_new]=geno_Stat_Modify(P_IL,P_DR,P_RD,P_TB)
prob_out=geno_Stat_Integrate(P_IL_new,P_DR_new,P_RD_new,P_TB_new)
return prob_out
def geno_likelihood_write(geno_likelihood_list,sv_rec_list,single_file,bam_file_name):
file_out_name='/'.join(single_file.split('/')[:-1])+'/'+'.'.join(single_file.split('/')[-1].split('.')[:-1])+'_Genotyped_in_'+bam_file_name.split('/')[-1].split('.')[0]+'.genotype.likelihood'
file_initiate(file_out_name)
fo=open(file_out_name,'a')
for k1 in sorted(sv_rec_list[individual_name].keys()):
if k1 in list(geno_likelihood_list[individual_name].keys()):
print(' '.join([str(i) for i in sv_rec_list[individual_name][k1][0]+geno_likelihood_list[individual_name][k1]]), file=fo)
fo.close()
def global_name_define_1(bam_file_name):
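#set per-sample globals: BreakPoints/NullModel working paths and the null-model statistics (insert length, read depth, physical coverage, pair orientation) for one bam/cram file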
global individual_name,bam_files_appdix,BamN,BPPath,NullPath,Insert_Len_Stat,Read_Depth_Stat,Physical_Cov_Stat,Pair_Orien_Stat,Pair_Orien_Info,RD_Weight,Initial_Bam_Name,Initial_Bam
individual_name='.'.join(bam_file_name.split('/')[-1].split('.')[:-1])
geno_likelihood_list[individual_name]={}
sv_rec_list[individual_name]={}
bam_files_appdix=bam_file_name.split('.')[-1]
#BamN=bam_file_name.split('/')[-1].replace('.'+bam_files_appdix,'')
BamN='.'.join(bam_file_name.split('/')[-1].split('.')[:-1])
#############
BPPath=workdir+'.'.join(['BreakPoints']+[bam_file_name.split('/')[-1]])+'/'
NullPath=workdir+'.'.join(['NullModel']+[bam_file_name.split('/')[-1]])+'/'
Insert_Len_Stat=NullPath+'ILNull.'+BamN+'.'+genome_name+'.Bimodal' #Insert Length stat
Read_Depth_Stat=NullPath+'RDNull.'+BamN+'.'+genome_name+'.NegativeBinomial' #read coverage stat
Physical_Cov_Stat=NullPath+'TBNull.'+BamN+'.'+genome_name+'.Bimodal' #physical coverage stat
Pair_Orien_Stat=NullPath+BamN+'.'+genome_name+'.null'
#############
Pair_Orien_Info=readin_PO_Stat(Pair_Orien_Stat)
RD_Weight=Insert_len_stat_readin(Insert_Len_Stat)/RD_NB_stat_readin(Read_Depth_Stat)
#RD_Weight=1
Initial_Bam_Name=BamN+'.'+bam_files_appdix
Initial_Bam=bam_file_name
global flank,Cut_Lower,Cut_Upper,IL_Stat_all,TB_Cut_Lower,TB_Cut_Upper,IL_Normal_Stat,IL_Statistics,PC_Statistics,RD_Statistics,IL_max,PC_max,RD_max
[flank,Cut_Lower,Cut_Upper,IL_Stat_all]=[cdf_solver_application(Insert_Len_Stat,0.95,model_comp) ,cdf_solver_application(Insert_Len_Stat,0.0001,model_comp) ,cdf_solver_application(Insert_Len_Stat,0.9999,model_comp) ,IL_Stat_readin(Insert_Len_Stat)]
[TB_Cut_Lower,TB_Cut_Upper]=[cdf_solver_application(Physical_Cov_Stat,0.0001,model_comp),cdf_solver_application(Physical_Cov_Stat,0.9995,model_comp)]
[IL_Statistics,IL_Normal_Stat]=IL_Stat_all
IL_max=numpy.log(find_max_bimodal(IL_Statistics)) #calculate max_pdf of insert length distribution
PC_Statistics=IL_Stat_readin(Physical_Cov_Stat) #readin physical coverage parameters
PC_max=numpy.log(find_max_bimodal(PC_Statistics[0])) #calculate max_pdf of physical coverage
RD_Statistics=RD_Stat_readin(Read_Depth_Stat)
RD_max=numpy.log(find_max_negative_binomial(RD_Statistics))
global tau_list,IL_Mean,IL_Estimate,IL_SD,IL_Penal_Two_End_Limit,low_qual_edge,GC_Stat_Path
tau_list=tau_calcu(Insert_Len_Stat,Physical_Cov_Stat,Read_Depth_Stat) #[IL,RD,TB]
IL_Mean=IL_Statistics[0]*IL_Statistics[4]+IL_Statistics[1]*IL_Statistics[5]
IL_Estimate=IL_Statistics[0]*IL_Statistics[4]+IL_Statistics[1]*IL_Statistics[5]
IL_SD=((IL_Statistics[2]*IL_Statistics[4])**2+(IL_Statistics[3]*IL_Statistics[5])**2)**(0.5)
IL_Penal_Two_End_Limit=min([pdf_calculate(IL_Estimate-3*IL_SD,IL_Statistics[4],IL_Statistics[0],IL_Statistics[1],IL_Statistics[2],IL_Statistics[3],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero),pdf_calculate(IL_Estimate+3*IL_SD,IL_Statistics[4],IL_Statistics[0],IL_Statistics[1],IL_Statistics[2],IL_Statistics[3],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero)])
[low_qual_edge,GC_Stat_Path]=[5,NullPath+'RD_Stat']
def global_para_declaration():
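#declare the shared globals and derive the reference file paths under workdir+'reference_SVelter/'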
global chrom_N
global chrom_X
global chrom_Y
global workdir
global bp_txt_Path
global BPPath
global NullPath
global ref_path
global ref_file
global ref_index
global ref_ppre
global ref_prefix
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
ref_ppre=ref_path
ref_prefix='.'.join(ref_file.split('.')[:-1])
def letter_rearrange(flank,bps2):
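#label the blocks between consecutive breakpoints with letters 'a','b',...; blocks longer than 10*flank keep only their flank-sized edges [start,start+flank,end-flank,end]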
chr_letter_bp={}
let_start=96
for i in bps2:
if not i[0] in list(chr_letter_bp.keys()):
chr_letter_bp[i[0]]={}
for j in range(len(i))[1:-1]:
chr_letter_bp[i[0]][chr(let_start+j)]=[]
if int(i[j+1])-int(i[j])<10*flank:
chr_letter_bp[i[0]][chr(let_start+j)]+=[int(i[j]),int(i[j+1])]
else:
chr_letter_bp[i[0]][chr(let_start+j)]+=[int(i[j]),int(i[j])+flank,int(i[j+1])-flank,int(i[j+1])]
let_start+=len(i)-2
return chr_letter_bp
def letter_GC_ReadIn(chr_letter_bp):
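#look up per-block GC content from the precomputed .GC_Content index (global GC_hash, 100bp bins); returns 'error' if the index file is missing or a block has no GC record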
block_GC_temp={}
filein=ref_prefix+'.GC_Content'
block_range={}
GC_hash_temp={}
test_flag=0
for i in list(chr_letter_bp.keys()):
if not os.path.isfile(filein):
test_flag+=1
if test_flag==0:
for i in list(chr_letter_bp.keys()):
GC_hash_temp[i]={}
block_range[i]=[]
for j in list(chr_letter_bp[i].keys()):
block_range[i]+=chr_letter_bp[i][j]
block_range[i]=[min(block_range[i]),max(block_range[i])]
for xa in list(GC_hash[i].keys()):
for xb in list(GC_hash[i][xa].keys()):
if not xb<block_range[i][0] and not xa>block_range[i][1]:
GC_hash_temp[i][str(xa)+'-'+str(xb)]=GC_hash[i][xa][xb]
for k1 in list(chr_letter_bp.keys()):
block_GC_temp[k1]={}
for k2 in list(GC_hash_temp[k1].keys()):
bl2=[int(k2.split('-')[0]),int(k2.split('-')[1])]
for k3 in list(chr_letter_bp[k1].keys()):
if min(chr_letter_bp[k1][k3])>bl2[0]-1 and max(chr_letter_bp[k1][k3])<bl2[1]+1:
block_GC_temp[k1][k3]=GC_hash_temp[k1][k2][(min(chr_letter_bp[k1][k3])-bl2[0])//100:(max(chr_letter_bp[k1][k3])-bl2[0])//100+1]
elif min(chr_letter_bp[k1][k3])>bl2[0]-1 and max(chr_letter_bp[k1][k3])>bl2[1]:
if not k3 in list(block_GC_temp[k1].keys()):
block_GC_temp[k1][k3]=GC_hash_temp[k1][k2][(min(chr_letter_bp[k1][k3])-bl2[0])//100:]
else:
block_GC_temp[k1][k3]+=GC_hash_temp[k1][k2][(min(chr_letter_bp[k1][k3])-bl2[0])//100:]
elif min(chr_letter_bp[k1][k3])<bl2[0] and max(chr_letter_bp[k1][k3])>bl2[0]-1:
if not k3 in list(block_GC_temp[k1].keys()):
block_GC_temp[k1][k3]=GC_hash_temp[k1][k2][:(max(chr_letter_bp[k1][k3])-bl2[0])//100+1]
else:
block_GC_temp[k1][k3]+=GC_hash_temp[k1][k2][:(max(chr_letter_bp[k1][k3])-bl2[0])//100+1]
elif min(chr_letter_bp[k1][k3])<bl2[0]+1 and max(chr_letter_bp[k1][k3])>bl2[1]-1:
if not k3 in list(block_GC_temp[k1].keys()):
block_GC_temp[k1][k3]=GC_hash_temp[k1][k2]
else:
block_GC_temp[k1][k3]+=GC_hash_temp[k1][k2]
for k1 in list(block_GC_temp.keys()):
for k2 in list(block_GC_temp[k1].keys()):
if not block_GC_temp[k1][k2]==[]:
block_GC_temp[k1][k2]=numpy.mean([float(k3) for k3 in block_GC_temp[k1][k2]])
else:
return 'error'
return block_GC_temp
else:
return 'error'
def letter_RD_ReadIn(chr_letter_bp):
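#read per-block read depth from the per-chromosome RD.index files under NullPath/RD_Stat (Window_Size bins); returns 'error' if an index file is missing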
test_flag=0
for k1 in list(chr_letter_bp.keys()):
filein=NullPath+'RD_Stat/'+BamN+'.'+k1+'.RD.index'
if not os.path.isfile(filein):
test_flag+=1
if test_flag==0:
out={}
RD_hash={}
block_range={}
for i in list(chr_letter_bp.keys()):
RD_hash[i]={}
out[i]={}
block_range[i]=[]
for j in list(chr_letter_bp[i].keys()):
block_range[i]+=chr_letter_bp[i][j]
block_range[i]=[min(block_range[i]),max(block_range[i])]
for k1 in list(chr_letter_bp.keys()):
filein=NullPath+'RD_Stat/'+BamN+'.'+k1+'.RD.index'
fin=open(filein)
while True:
pin=fin.readline().strip().split()
if not pin: break
pin2=fin.readline().strip().split()
bl2=[int(pin[0].split(':')[1].split('-')[0]),int(pin[0].split(':')[1].split('-')[1])]
if not bl2[1]<block_range[k1][0]+1 and not bl2[0]>block_range[k1][1]-1:
RD_hash[k1][str(bl2[0])+'-'+str(bl2[1])]=pin2
fin.close()
for k1 in list(chr_letter_bp.keys()):
for k2 in list(RD_hash[k1].keys()):
bl2=[int(k2.split('-')[0]),int(k2.split('-')[1])]
for j in sorted(chr_letter_bp[k1].keys()):
if not j in list(out[k1].keys()):
out[k1][j]=[]
if len(chr_letter_bp[k1][j])==4:
bl1=chr_letter_bp[k1][j][1:-1]
if bl1[0]>bl2[0]-1 and bl1[1]<bl2[1]+1:
out[k1][j]+=RD_hash[k1][k2][(bl1[0]-bl2[0])//Window_Size:(bl1[1]-bl2[0])//Window_Size+1]
elif bl1[0]>bl2[0]-1 and bl1[1]>bl2[1]:
out[k1][j]+=RD_hash[k1][k2][(bl1[0]-bl2[0])//Window_Size:]
elif bl1[0]<bl2[0] and bl1[1]<bl2[1]+1:
out[k1][j]+=RD_hash[k1][k2][:(bl1[1]-bl2[0])//Window_Size+1]
elif bl1[0]<bl2[0] and bl1[1]>bl2[1]:
out[k1][j]+=RD_hash[k1][k2]
for k1 in list(out.keys()):
for k2 in list(out[k1].keys()):
if out[k1][k2]==[]:
out[k1][k2]=0
else:
out[k1][k2]=numpy.mean([float(k3) for k3 in out[k1][k2]])
return out
else:
return 'error'
def letter_bp_GC_RD_Prep(chr_letter_tbp,letter_tRD,letter_tGC):
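#keep only blocks that have valid (non-NaN) GC and RD estimates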
chr_letter_bp={}
letter_GC={}
letter_RD={}
for k1 in list(chr_letter_tbp.keys()):
chr_letter_bp[k1]={}
letter_GC[k1]={}
letter_RD[k1]={}
for k2 in list(chr_letter_tbp[k1].keys()):
if k2 in list(letter_tGC[k1].keys()) and k2 in list(letter_tRD[k1].keys()) and not math.isnan(letter_tRD[k1][k2]) and not math.isnan(letter_tGC[k1][k2]):
chr_letter_bp[k1][k2]=chr_letter_tbp[k1][k2]
letter_GC[k1][k2]=letter_tGC[k1][k2]
letter_RD[k1][k2]=letter_tRD[k1][k2]
return [chr_letter_bp,letter_GC,letter_RD]
def left_keys_prep(chr_letter_bp):
left_keys=[]
for k1 in list(chr_letter_bp.keys()):
for k2 in list(chr_letter_bp[k1].keys()):
left_keys.append(k2)
return left_keys
def penal_calculate(GC_para_dict,BP_para_dict,Map_All,temp_bp, Af_Letter,Af_BP,letters_numbers,NoMapPenal):
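#score a proposed structure: accumulate per-block read depth (out_rd) and through-breakpoint physical coverage (out_tb) from reads mapped to both haplotypes ('m'/'p'), then derive the weighted insert-length score, discordant-direction count, a chi-square physical-coverage penalty and GC-adjusted block coverages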
out_rd=[[0 for i in temp_bp[0][:-1]],[0 for i in temp_bp[1][:-1]]]
IL_Rec={}
DR_Penal=0
out_tb=[[0 for i in temp_bp[0]],[0 for i in temp_bp[1]]]
for i in Map_All:
#print(out_tb)
if len(i)>4:
if not i[6] in list(IL_Rec.keys()):
IL_Rec[i[6]]=i[8]
else:
IL_Rec[i[6]]+=i[8]
if not i[4:6]==['+','-']:
DR_Penal+=1
if i[7]=='m':
i_block=[]
for k in i[:4]:
if k<temp_bp[0][1]:
i_block.append(0)
elif k>temp_bp[0][-2]-1:
i_block.append(len(temp_bp[0])-2)
else:
for j in range(len(temp_bp[0])-1)[1:-1]:
if temp_bp[0][j]-1<k and temp_bp[0][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[0][i_block[2]]+=(i[3]-i[2])*i[-1]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]/(i_block[2]-i_block[1])
elif not i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
out_rd[0][i_block[2]]+=(i[3]-i[2])*i[-1]
out_tb[0][i_block[1]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[0][i_block[2]]+=(temp_bp[0][i_block[2]+1]-i[2])*i[-1]
out_rd[0][i_block[3]]+=(i[3]-temp_bp[0][i_block[3]])*i[-1]
out_tb[0][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif not i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
out_rd[0][i_block[2]]+=(temp_bp[0][i_block[2]+1]-i[2])*i[-1]
out_rd[0][i_block[3]]+=(i[3]-temp_bp[0][i_block[3]])*i[-1]
out_tb[0][i_block[1]]+=i[8]
out_tb[0][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
if i[7]=='p':
i_block=[]
for k in i[:4]:
if k<temp_bp[1][1]:
i_block.append(0)
elif k>temp_bp[1][-2]-1:
i_block.append(len(temp_bp[1])-2)
else:
for j in range(len(temp_bp[1])-1)[1:-1]:
if temp_bp[1][j]-1<k and temp_bp[1][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[1][i_block[2]]+=(i[3]-i[2])*i[-1]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif not i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
out_rd[1][i_block[2]]+=(i[3]-i[2])*i[-1]
out_tb[1][i_block[1]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[1][i_block[2]]+=(temp_bp[1][i_block[2]+1]-i[2])*i[-1]
out_rd[1][i_block[3]]+=(i[3]-temp_bp[1][i_block[3]])*i[-1]
out_tb[1][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif not i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
out_rd[1][i_block[2]]+=(temp_bp[1][i_block[2]+1]-i[2])*i[-1]
out_rd[1][i_block[3]]+=(i[3]-temp_bp[1][i_block[3]])*i[-1]
out_tb[1][i_block[1]]+=i[8]
out_tb[1][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
else:
if i[2]=='m':
i_block=[]
for k in i[:2]:
if k<temp_bp[0][1]:
i_block.append(0)
elif k>temp_bp[0][-2]-1:
i_block.append(len(temp_bp[0])-2)
else:
for j in range(len(temp_bp[0])-1)[1:-1]:
if temp_bp[0][j]-1<k and temp_bp[0][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
elif not i_block[0]==i_block[1]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
if i[2]=='p':
i_block=[]
for k in i[:2]:
if k<temp_bp[1][1]:
i_block.append(0)
elif k>temp_bp[1][-2]-1:
i_block.append(len(temp_bp[1])-2)
else:
for j in range(len(temp_bp[1])-1)[1:-1]:
if temp_bp[1][j]-1<k and temp_bp[1][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
elif not i_block[0]==i_block[1]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
block_bps_chr={}
block_bps_chr['m']={}
block_bps_chr['p']={}
if not Penalty_For_InsertLengthZero in list(IL_Rec.keys()):
IL_Rec[Penalty_For_InsertLengthZero]=NoMapPenal
else:
IL_Rec[Penalty_For_InsertLengthZero]+=NoMapPenal
IL_Penal=0
IL_Weight=0
for i in list(IL_Rec.keys()):
IL_Penal+=i*IL_Rec[i]
IL_Weight+=IL_Rec[i]
if not IL_Weight==0:
IL_Output=float(IL_Penal)/float(IL_Weight)#output: IL_Output = mean(log(P_IL)) over all read pairs
else:
IL_Output=0
Num_Read_TB=[out_tb[0][1:-1],out_tb[1][1:-1]]
TB_Pena_2_out=0
Num_total_TB=[]
for x in Num_Read_TB:
Num_total_TB+=x
if numpy.sum(Num_total_TB)>0:
pvalue=scipy.stats.chisquare(Num_total_TB)[1]
else:
pvalue=0.0
if pvalue>0:
TB_Pena_2_out=numpy.log(pvalue)
else:
TB_Pena_2_out=-100000000
Af_Block_Len=[[BP_para_dict['flank']]+[Af_BP[0][i+1]-Af_BP[0][i] for i in range(len(Af_BP[0])-1)]+[BP_para_dict['flank']],[BP_para_dict['flank']]+[Af_BP[1][i+1]-Af_BP[1][i] for i in range(len(Af_BP[1])-1)]+[BP_para_dict['flank']]]
out_rd=[[out_rd[0][i]/Af_Block_Len[0][i] for i in range(len(out_rd[0]))],[out_rd[1][i]/Af_Block_Len[1][i] for i in range(len(out_rd[1]))]]
out_rd_new=[[(BP_para_dict['RD_within_B']['left']-out_rd[0][0]-out_rd[1][0])/2.0+out_rd[0][0],
(BP_para_dict['RD_within_B']['right']-out_rd[0][-1]-out_rd[1][-1])/2.0+out_rd[0][-1]],
[(BP_para_dict['RD_within_B']['left']-out_rd[0][0]-out_rd[1][0])/2.0+out_rd[1][0],
(BP_para_dict['RD_within_B']['right']-out_rd[0][-1]-out_rd[1][-1])/2.0+out_rd[1][-1]]]
out_rd=[[out_rd_new[0][0]]+out_rd[0][1:-1]+[out_rd_new[0][-1]],[out_rd_new[1][0]]+out_rd[1][1:-1]+[out_rd_new[1][-1]]]
out_rd_within=[[BP_para_dict['RD_within_B'][Af_Letter[0][i]]/letters_numbers[0][i] for i in range(len(Af_Letter[0]))],[BP_para_dict['RD_within_B'][Af_Letter[1][i]]/letters_numbers[1][i] for i in range(len(Af_Letter[1]))]]
out_rd_within[0]=[0]+out_rd_within[0]+[0]
out_rd_within[1]=[0]+out_rd_within[1]+[0]
cov_bp2=[[out_rd[0][i]+out_rd_within[0][i] for i in range(len(out_rd[0]))],[out_rd[1][i]+out_rd_within[1][i] for i in range(len(out_rd[1]))]]
Cov_GC=[[BP_para_dict['BlockGC2'][k] for k in Af_Letter[0]],[BP_para_dict['BlockGC2'][k] for k in Af_Letter[1]]]
adj_cov_bp=[GC_RD_Adj(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],chrom_N,Cov_GC[0],cov_bp2[0][1:-1]),GC_RD_Adj(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],chrom_N,Cov_GC[1],cov_bp2[1][1:-1])]
return [IL_Output,adj_cov_bp,DR_Penal,TB_Pena_2_out,Num_total_TB]
def readin_RD_Stat(file_in):
#readin the read depth stats calculated in NullModel build step
#eg of file_in='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/RDNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.NegativeBinomial'
info=file_straight_readin(file_in)
return [float(i) for i in info[-1]] #eg of output: [mean,median,std]
def readin_PC_Stat(file_in,model_comp='C'):
#readin the physical coverage stats calculated in NullModel build step
#eg of file_in='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/TBNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.Bimodal'
#model_comp:['C' for complex,'S' for simple]
info=file_straight_readin(file_in)
if model_comp=='S': return [float(i) for i in info[-1]] #eg of output: [1, mean,std]
elif model_comp=='C': return [float(i) for i in info[3]+info[5]] #eg of output:[alpha1,mean1,std1,alpha2,mean2,std2]
def readin_IL_Stat(file_in,model_comp='C'):
#readin the insert length stats calculated in NullModel build step
#eg of file_in='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/ILNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.Bimodal'
#model_comp:['C' for complex,'S' for simple]
info=file_straight_readin(file_in)
if model_comp=='S': return [float(i) for i in info[-1]] #eg of output: [1, mean,std]
elif model_comp=='C': return [float(i) for i in info[3]+info[5]] #eg of output:[alpha1,mean1,std1,alpha2,mean2,std2]
def Insert_Seq_Pool_Prod_2(original_bp_list,ori_1_Seq,flank):
ini_letters=['left']+['I'+chr(97+i) for i in range(len(original_bp_list)-1)]+['right']+['I'+chr(97+i)+'^' for i in range(len(original_bp_list)-1)]
relative_bps=[0]+[j-original_bp_list[0]+flank for j in original_bp_list]+[original_bp_list[-1]+flank-original_bp_list[0]+flank]
Insert_Seq_Pool={}
for k in range(len(original_bp_list)+1):
Insert_Seq_Pool[ini_letters[k]]=ori_1_Seq[relative_bps[k]:relative_bps[k+1]]
for k in range(len(original_bp_list)+1,len(ini_letters)):
Insert_Seq_Pool[ini_letters[k]]=complementary(ori_1_Seq[relative_bps[k-len(original_bp_list)]:relative_bps[k+1-len(original_bp_list)]])
return Insert_Seq_Pool
def letters_bps_produce(letters,bps,flank):
letters_bps={}
letters_relative_bps={}
letters_bps['left']=[bps[0]-flank,bps[0]]
letters_relative_bps['left']=[-flank,0]
for i in range(len(bps)-1):
letters_relative_bps[letters[i]]=[bps[i]-bps[0],bps[i+1]-bps[0]]
letters_bps[letters[i]]=[bps[i],bps[i+1]]
letters_bps['right']=[bps[-1],bps[-1]+flank]
letters_relative_bps['right']=[bps[-1]-bps[0],bps[-1]-bps[0]+flank]
return [letters_bps,letters_relative_bps]
def letter_rearrange(flank,bps2):
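#note: letter_rearrange is re-defined here; this later, identical definition is the one Python keeps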
chr_letter_bp={}
let_start=96
for i in bps2:
if not i[0] in list(chr_letter_bp.keys()):
chr_letter_bp[i[0]]={}
for j in range(len(i))[1:-1]:
chr_letter_bp[i[0]][chr(let_start+j)]=[]
if int(i[j+1])-int(i[j])<10*flank:
chr_letter_bp[i[0]][chr(let_start+j)]+=[int(i[j]),int(i[j+1])]
else:
chr_letter_bp[i[0]][chr(let_start+j)]+=[int(i[j]),int(i[j])+flank,int(i[j+1])-flank,int(i[j+1])]
let_start+=len(i)-2
return chr_letter_bp
def letter_RD_test_calcu(chr_letter_bp):
out={}
for x in list(chr_letter_bp.keys()):
out[x]={}
for y in list(chr_letter_bp[x].keys()):
if not y in ['left','right']:
if len(chr_letter_bp[x][y])==2:
out[x][y]=[chr_letter_bp[x][y][0]-500]+chr_letter_bp[x][y]+[chr_letter_bp[x][y][1]+500]
else:
out[x][y]=chr_letter_bp[x][y]
return out
def LetterList_Rearrange(Letter_List,Command,BP_List_origin):
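#dispatch a rearrangement command to the matching BPList_* operator (delete / invert / insert / copy+paste / cut+paste / x)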
if Command[-1]=='del' or Command[-1]=='delete':
return BPList_Delete_Letter(Letter_List,Command)
elif Command[-1]=='inv' or Command[-1]=='invert':
return BPList_Invert_Letter(Letter_List,Command)
elif Command[-1]=='ins' or Command[-1]=='insert':
return BPList_Insert_Letter(Letter_List,Command)
elif Command[-1]=='copy+paste' or Command[-1]=='CopyPaste':
return BPList_CopyPaste_Letter(Letter_List,Command)
elif Command[-1]=='cut+paste' or Command[-1]=='CutPaste':
return BPList_CutPaste_Letter(Letter_List,Command)
elif Command[-1]=='x' or Command[-1]=='X':
return BPList_X_Letter(Letter_List,Command)
def Letter_Through_Rearrange_4(GC_para_dict,BP_para_dict,Be_Info,Af_Letter,Af_BP):
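#remap the collected read evidence (Be_Info) onto a proposed structure (Af_Letter, Af_BP) and score it with penal_calculate; returns 0 when no reads map onto the proposal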
Total_Cov_For_Pen={}
for key in list(BP_para_dict['RD_within_B'].keys()):
Total_Cov_For_Pen[key]=0
Map_M=[]
Map_P=[]
Map_Both=[]
Let_BP_Info={}
Let_BP_Info['m']={}
Let_BP_Info['p']={}
temp_letter=[['left']+Af_Letter[0]+['right'],['left']+Af_Letter[1]+['right']]
temp_bp=[[Af_BP[0][0]-BP_para_dict['flank']]+Af_BP[0]+[Af_BP[0][-1]+BP_para_dict['flank']],[Af_BP[1][0]-BP_para_dict['flank']]+Af_BP[1]+[Af_BP[1][-1]+BP_para_dict['flank']]]
for j1 in range(len(temp_letter[0])):
j=temp_letter[0][j1]
if not j in list(Let_BP_Info['m'].keys()):
Let_BP_Info['m'][j]=[[temp_bp[0][j1],temp_bp[0][j1+1]]]
else:
Let_BP_Info['m'][j]+=[[temp_bp[0][j1],temp_bp[0][j1+1]]]
for j1 in range(len(temp_letter[1])):
j=temp_letter[1][j1]
if not j in list(Let_BP_Info['p'].keys()):
Let_BP_Info['p'][j]=[[temp_bp[1][j1],temp_bp[1][j1+1]]]
else:
Let_BP_Info['p'][j]+=[[temp_bp[1][j1],temp_bp[1][j1+1]]]
letters_numbers=[[Af_Letter[0].count(i[0])+Af_Letter[1].count(i[0])+Af_Letter[0].count(i[0]+'^')+Af_Letter[1].count(i[0]+'^') for i in Af_Letter[0]],[Af_Letter[0].count(i[0])+Af_Letter[1].count(i[0])+Af_Letter[0].count(i[0]+'^')+Af_Letter[1].count(i[0]+'^') for i in Af_Letter[1]]]
NoMapPenal=0
IL_Rec={}
DR_Rec=0
cov_bp=[[0 for i in range(len(temp_letter[0]))],[0 for i in range(len(temp_letter[1]))]]
cov_bp2=[]
NoMapPenal=Be_Info_1_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal)
NoMapPenal=Be_Info_2_rearrange(Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal)
NoMapPenal=Be_Info_3_rearrange(BP_para_dict,Be_Info,temp_letter,Let_BP_Info,Total_Cov_For_Pen,Map_M,Map_P,Map_Both,NoMapPenal)
best_structure_sign_flag=0
for key in list(Total_Cov_For_Pen.keys()):
if Total_Cov_For_Pen[key]==0:
del Total_Cov_For_Pen[key]
else:
Total_Cov_For_Pen[key]/=float(Be_BP_Letter[key])
for key in list(BP_para_dict['RD_within_B'].keys()):
if not key[-1]=='^' and not key in ['left','right','left^', 'right^']:
if not key in Af_Letter[0]+Af_Letter[1] and not key+'^' in Af_Letter[0]+Af_Letter[1]:
if not key in list(Total_Cov_For_Pen.keys()):
Total_Cov_For_Pen[key]=0
Total_Cov_For_Pen[key]+=BP_para_dict['RD_within_B'][key]
if NoMapPenal>0:
best_structure_sign_flag+=1
for key1 in list(Total_Cov_For_Pen.keys()):
if Total_Cov_For_Pen[key1]>2.58*GC_para_dict['GC_Std_Coverage'][chrom_N]:
best_structure_sign_flag+=1
if not Map_M+Map_P+Map_Both==[]:
penals=penal_calculate(GC_para_dict,BP_para_dict,Map_M+Map_P+Map_Both,temp_bp,Af_Letter,Af_BP,letters_numbers,NoMapPenal)
if penals[2]>0:
best_structure_sign_flag+=1
return penals[:-1]+[NoMapPenal,Total_Cov_For_Pen,best_structure_sign_flag]+[penals[-1]]
else:
return 0
def modify_bps1_new(bps2_new):
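#regroup a flat list of chromosome names and breakpoint positions into [chr,bp1,bp2,...] records, starting a new record at each chromosome name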
out=[]
for x in bps2_new:
for y in x:
if y in chromos_all:
out.append([y])
else:
out[-1].append(y)
return out
def P_list_modify(P_list):
for x in range(len(P_list)):
if P_list[x]==1:
P_list[x]=min(P_list)*100
return P_list
def penal_calculate(GC_para_dict,BP_para_dict,Map_All,temp_bp, Af_Letter,Af_BP,letters_numbers,NoMapPenal):
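#note: penal_calculate is re-defined here; as the later definition, this copy supersedes the near-identical one above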
out_rd=[[0 for i in temp_bp[0][:-1]],[0 for i in temp_bp[1][:-1]]]
IL_Rec={}
DR_Penal=0
out_tb=[[0 for i in temp_bp[0]],[0 for i in temp_bp[1]]]
for i in Map_All:
if len(i)>4:
if not i[6] in list(IL_Rec.keys()):
IL_Rec[i[6]]=i[8]
else:
IL_Rec[i[6]]+=i[8]
if not i[4:6]==['+','-']:
DR_Penal+=1
if i[7]=='m':
i_block=[]
for k in i[:4]:
if k<temp_bp[0][1]:
i_block.append(0)
elif k>temp_bp[0][-2]-1:
i_block.append(len(temp_bp[0])-2)
else:
for j in range(len(temp_bp[0])-1)[1:-1]:
if temp_bp[0][j]-1<k and temp_bp[0][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[0][i_block[2]]+=(i[3]-i[2])*i[-1]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]/(i_block[2]-i_block[1])
elif not i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
out_rd[0][i_block[2]]+=(i[3]-i[2])*i[-1]
out_tb[0][i_block[1]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[0][i_block[2]]+=(temp_bp[0][i_block[2]+1]-i[2])*i[-1]
out_rd[0][i_block[3]]+=(i[3]-temp_bp[0][i_block[3]])*i[-1]
out_tb[0][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
elif not i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
out_rd[0][i_block[2]]+=(temp_bp[0][i_block[2]+1]-i[2])*i[-1]
out_rd[0][i_block[3]]+=(i[3]-temp_bp[0][i_block[3]])*i[-1]
out_tb[0][i_block[1]]+=i[8]
out_tb[0][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[0][k2]+=i[8]
if i[7]=='p':
i_block=[]
for k in i[:4]:
if k<temp_bp[1][1]:
i_block.append(0)
elif k>temp_bp[1][-2]-1:
i_block.append(len(temp_bp[1])-2)
else:
for j in range(len(temp_bp[1])-1)[1:-1]:
if temp_bp[1][j]-1<k and temp_bp[1][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[1][i_block[2]]+=(i[3]-i[2])*i[-1]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif not i_block[0]==i_block[1] and i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
out_rd[1][i_block[2]]+=(i[3]-i[2])*i[-1]
out_tb[1][i_block[1]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
out_rd[1][i_block[2]]+=(temp_bp[1][i_block[2]+1]-i[2])*i[-1]
out_rd[1][i_block[3]]+=(i[3]-temp_bp[1][i_block[3]])*i[-1]
out_tb[1][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
elif not i_block[0]==i_block[1] and not i_block[2]==i_block[3]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
out_rd[1][i_block[2]]+=(temp_bp[1][i_block[2]+1]-i[2])*i[-1]
out_rd[1][i_block[3]]+=(i[3]-temp_bp[1][i_block[3]])*i[-1]
out_tb[1][i_block[1]]+=i[8]
out_tb[1][i_block[3]]+=i[8]
#if i[4:6]==['+', '-'] and i[6]>Penalty_For_InsertLengthZero:
if i[4:6]==['+', '-']:
for k2 in range(i_block[1]+1,i_block[2]+1):
out_tb[1][k2]+=i[8]
else:
if i[2]=='m':
i_block=[]
for k in i[:2]:
if k<temp_bp[0][1]:
i_block.append(0)
elif k>temp_bp[0][-2]-1:
i_block.append(len(temp_bp[0])-2)
else:
for j in range(len(temp_bp[0])-1)[1:-1]:
if temp_bp[0][j]-1<k and temp_bp[0][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1]:
out_rd[0][i_block[0]]+=(i[1]-i[0])*i[-1]
elif not i_block[0]==i_block[1]:
out_rd[0][i_block[0]]+=(temp_bp[0][i_block[0]+1]-i[0])*i[-1]
out_rd[0][i_block[1]]+=(i[1]-temp_bp[0][i_block[1]])*i[-1]
if i[2]=='p':
i_block=[]
for k in i[:2]:
if k<temp_bp[1][1]:
i_block.append(0)
elif k>temp_bp[1][-2]-1:
i_block.append(len(temp_bp[1])-2)
else:
for j in range(len(temp_bp[1])-1)[1:-1]:
if temp_bp[1][j]-1<k and temp_bp[1][j+1]>k:
i_block.append(j)
if i_block[0]==i_block[1]:
out_rd[1][i_block[0]]+=(i[1]-i[0])*i[-1]
elif not i_block[0]==i_block[1]:
out_rd[1][i_block[0]]+=(temp_bp[1][i_block[0]+1]-i[0])*i[-1]
out_rd[1][i_block[1]]+=(i[1]-temp_bp[1][i_block[1]])*i[-1]
block_bps_chr={}
block_bps_chr['m']={}
block_bps_chr['p']={}
if not Penalty_For_InsertLengthZero in list(IL_Rec.keys()):
IL_Rec[Penalty_For_InsertLengthZero]=NoMapPenal
else:
IL_Rec[Penalty_For_InsertLengthZero]+=NoMapPenal
IL_Penal=0
IL_Weight=0
for i in list(IL_Rec.keys()):
IL_Penal+=i*IL_Rec[i]
IL_Weight+=IL_Rec[i]
if not IL_Weight==0:
IL_Output=IL_Penal/IL_Weight
else:
IL_Output=0
Num_Read_TB=[out_tb[0][1:-1],out_tb[1][1:-1]]
TB_Pena_2_out=0
Num_total_TB=[]
for x in Num_Read_TB:
Num_total_TB+=x
if numpy.sum(Num_total_TB)>0:
pvalue=scipy.stats.chisquare(Num_total_TB)[1]
else:
pvalue=0.0
if pvalue>0:
TB_Pena_2_out=numpy.log(pvalue)
else:
TB_Pena_2_out=-100000000
Af_Block_Len=[[BP_para_dict['flank']]+[Af_BP[0][i+1]-Af_BP[0][i] for i in range(len(Af_BP[0])-1)]+[BP_para_dict['flank']],[BP_para_dict['flank']]+[Af_BP[1][i+1]-Af_BP[1][i] for i in range(len(Af_BP[1])-1)]+[BP_para_dict['flank']]]
out_rd=[[out_rd[0][i]/Af_Block_Len[0][i] for i in range(len(out_rd[0]))],[out_rd[1][i]/Af_Block_Len[1][i] for i in range(len(out_rd[1]))]]
out_rd_new=[[(BP_para_dict['RD_within_B']['left']-out_rd[0][0]-out_rd[1][0])/2.0+out_rd[0][0],
(BP_para_dict['RD_within_B']['right']-out_rd[0][-1]-out_rd[1][-1])/2.0+out_rd[0][-1]],
[(BP_para_dict['RD_within_B']['left']-out_rd[0][0]-out_rd[1][0])/2.0+out_rd[1][0],
(BP_para_dict['RD_within_B']['right']-out_rd[0][-1]-out_rd[1][-1])/2.0+out_rd[1][-1]]]
out_rd=[[out_rd_new[0][0]]+out_rd[0][1:-1]+[out_rd_new[0][-1]],[out_rd_new[1][0]]+out_rd[1][1:-1]+[out_rd_new[1][-1]]]
out_rd_within=[[BP_para_dict['RD_within_B'][Af_Letter[0][i]]/letters_numbers[0][i] for i in range(len(Af_Letter[0]))],[BP_para_dict['RD_within_B'][Af_Letter[1][i]]/letters_numbers[1][i] for i in range(len(Af_Letter[1]))]]
out_rd_within[0]=[0]+out_rd_within[0]+[0]
out_rd_within[1]=[0]+out_rd_within[1]+[0]
cov_bp2=[[out_rd[0][i]+out_rd_within[0][i] for i in range(len(out_rd[0]))],[out_rd[1][i]+out_rd_within[1][i] for i in range(len(out_rd[1]))]]
Cov_GC=[[BP_para_dict['BlockGC2'][k] for k in Af_Letter[0]],[BP_para_dict['BlockGC2'][k] for k in Af_Letter[1]]]
adj_cov_bp=[GC_RD_Adj(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],chrom_N,Cov_GC[0],cov_bp2[0][1:-1]),GC_RD_Adj(GC_para_dict['GC_Median_Num'],GC_para_dict['GC_Overall_Median_Num'],chrom_N,Cov_GC[1],cov_bp2[1][1:-1])]
return [IL_Output,adj_cov_bp,DR_Penal,TB_Pena_2_out,Num_total_TB]
def RD_Index_ReadIn(ppre_Path,BamN, chromo, region):
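#scan the per-chromosome RD.index file for the entry whose interval covers the requested region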
if not ppre_Path[-1]=='/':
ppre_Path+='/'
path_in=NullPath+'RD_Stat/'
file_in=BamN+'.'+chromo+'.RD.index'
fin=open(path_in+file_in)
pos1=int(region[0])
pos2=int(region[1])
while True:
pin1=fin.readline().strip().split()
if not pin1: break
pin2=fin.readline().strip().split()
reg1=int(pin1[0].split(':')[1].split('-')[0])
reg2=int(pin1[0].split(':')[1].split('-')[1])
if not pos1<reg1 and not pos2>reg2:
break
def Read_Through_modify(Pair_Through,Read_Through,Be_BP_Letter):
#eg of Read_Through:[['left', 236, 'left', 362, 'a', 190, 'right', 98, '+', '-'], ['left', 329, 'left', 455, 'a', 198, 'right', 116, '+', '-']]
#based on the assumption that breakpoints are of high quality, there should not be many reads spanning the breakpoints.
#if a read is not relatively evenly distributed between its two blocks (minor fraction < 1/3), we assign it entirely to the major block
out=[]
for x in Read_Through:
x_new_info=[]
if not x[0]==x[2]:
x_new=[Be_BP_Letter[x[0]]-x[1],x[3]] #[length of reads in both blocks]
if float(x_new[0])/float(sum(x_new))<1.0/3.0:
x_new_info.append(x[2])
x_new_info.append(0+1)
x_new_info.append(x[2])
x_new_info.append(x[3])
elif float(x_new[1])/float(sum(x_new))<1.0/3.0:
x_new_info.append(x[0])
x_new_info.append(x[1])
x_new_info.append(x[0])
x_new_info.append(Be_BP_Letter[x[0]]-1)
if x_new_info==[]:
x_new_info.append(x[0])
x_new_info.append(x[1])
x_new_info.append(x[2])
x_new_info.append(x[3])
if not x[4]==x[6]:
x_new=[Be_BP_Letter[x[4]]-x[5],x[7]] #[length of reads in both blocks]
if float(x_new[0])/float(sum(x_new))<1.0/3.0:
x_new_info.append(x[6])
x_new_info.append(0+1)
x_new_info.append(x[6])
x_new_info.append(x[7])
elif float(x_new[1])/float(sum(x_new))<1.0/3.0:
x_new_info.append(x[4])
x_new_info.append(x[5])
x_new_info.append(x[4])
x_new_info.append(Be_BP_Letter[x[4]]-1)
if len(x_new_info)==4:
x_new_info.append(x[4])
x_new_info.append(x[5])
x_new_info.append(x[6])
x_new_info.append(x[7])
x_new_info+=[x[8],x[9]]
if x_new_info[0]==x_new_info[2] and x_new_info[4]==x_new_info[6]:
Pair_Through.append([x_new_info[0],x_new_info[1],x_new_info[3],x_new_info[4],x_new_info[5],x_new_info[7],x_new_info[8],x_new_info[9]])
else:
out.append(x_new_info)
return [Pair_Through,out]
def ReadLenFin_info_readin(ReadLenFin):
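#parse the *.Stats null-model file: Window_Size is one third of the first field on its third line, ReadLength comes from the last field of its last line; then load the GC-stratified coverage table and set the GC/RD globals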
fin=open(ReadLenFin)
pin=fin.readline().strip().split()
pin=fin.readline().strip().split()
pin=fin.readline().strip().split()
global Window_Size
Window_Size=int(pin[0])//3
for line in fin:
pin=line.strip().split()
fin.close()
global ReadLength,chrom_N,chrom_X,chrom_Y,GC_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage,GC_Median_Num,GC_para_dict
ReadLength=int(pin[-1].split(':')[-1])
Affix_GC_Stat='_MP'+str(QCAlign)+'_GC_Coverage_ReadLength'
[GC_Content_Coverage,Chromosome,Coverage_0]=GC_Stat_ReadIn(BamN,GC_Stat_Path,genome_name,Affix_GC_Stat)
Coverage=[int(k) for k in Coverage_0]
[chrom_N,chrom_X,chrom_Y,GC_Median_Coverage,GC_Overall_Median_Coverage,GC_Var_Coverage,GC_Mean_Coverage,GC_Std_Coverage,GC_Median_Num]=GC_RD_Prepare(ref_file,Chromosome,Coverage,GC_Content_Coverage)
GC_para_dict={'IL_Statistics':IL_Statistics,'GC_Overall_Median_Coverage':GC_Overall_Median_Coverage,'GC_Overall_Median_Num':GC_Overall_Median_Num,'GC_Median_Coverage':GC_Median_Coverage,'GC_Median_Num':GC_Median_Num,'GC_Mean_Coverage':GC_Mean_Coverage,'GC_Std_Coverage':GC_Std_Coverage,'GC_Var_Coverage':GC_Var_Coverage,'Coverage':Coverage}
def readin_PO_Stat(file_in):
#fit an exponential model to the probability of observing aberrant pair orientations (linear regression of log-frequency on count)
#eg of file_in='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.null'
fin=open(file_in)
pin=fin.readline().strip().split()
info_pos=pin.index('AbnormalDirection')
info_hash={}
for line in fin:
pin=line.strip().split()
if not int(pin[info_pos]) in list(info_hash.keys()):
info_hash[int(pin[info_pos])]=0
info_hash[int(pin[info_pos])]+=1
fin.close()
PO_num=sorted(info_hash.keys())
region_num=[info_hash[i] for i in sorted(info_hash.keys())]
region_prob_log=[numpy.log(i) for i in [float(i)/float(sum(region_num)) for i in region_num]]
regression_para=scipy.stats.linregress(PO_num,region_prob_log)
return [regression_para.slope,regression_para.intercept] #eg of output:log(y)=ax+b, return [a,b]
def rela_Pair_ThroughBP(chr_letter_bp,Pair_ThroughBP):
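#convert the absolute coordinates of breakpoint-spanning read pairs into coordinates relative to the start of the block each end falls in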
out=[]
for k1 in list(Pair_ThroughBP.keys()):
for k2 in Pair_ThroughBP[k1]:
rela=[k2[6],k2[0]-chr_letter_bp[k1][k2[6]][0],
k2[1]-chr_letter_bp[k1][k2[6]][0],
k2[7],k2[2]-chr_letter_bp[k1][k2[7]][0],
k2[3]-chr_letter_bp[k1][k2[7]][0],k2[4],k2[5]]
out.append(rela)
return out
def read_Pair_Single_Read_ThroughBP(chr_letter_bp,Single_Read_ThroughBP):
out=[]
for k1 in list(Single_Read_ThroughBP.keys()):
for k2 in Single_Read_ThroughBP[k1]:
rela=[k2[2],k2[0]-chr_letter_bp[k1][k2[2]][0],
k2[3],k2[1]-chr_letter_bp[k1][k2[3]][0]]
out.append(rela)
return out
def rela_Pair_Double_Read_ThroughBP(chr_letter_bp,Double_Read_ThroughBP):
out=[]
for k1 in list(Double_Read_ThroughBP.keys()):
for k2 in Double_Read_ThroughBP[k1]:
rela=[k2[6],k2[0]-chr_letter_bp[k1][k2[6]][0],
k2[7],k2[1]-chr_letter_bp[k1][k2[7]][0],
k2[8],k2[2]-chr_letter_bp[k1][k2[8]][0],
k2[9],k2[3]-chr_letter_bp[k1][k2[9]][0],k2[4],k2[5]]
out.append(rela)
return out
def rd_low_qual_modify(rd_low_qual,block_bps,temp_rec_LowQual):
#eg of rd_low_qual={'chr1': {}}
#eg of block_bps={'chr1': {'a': [2780927, 2782153], 'c': [2782378, 2782468], 'b': [2782153, 2782378], 'right': [2782468, 2782968], 'left': [2780427, 2780927]}}
#eg of temp_rec_LowQual={'ERR894726.127038234': [['177', 'chr1', '2781655', '0', '101S20M5S', 'chrX', '83458898', '0']], 'ERR899712.53791925': [['163', 'chr1', '2782490', '18', '19M2I45M60S', '=', '2783112', '748']]}
for k3 in list(temp_rec_LowQual.keys()):
for k4 in temp_rec_LowQual[k3]:
read_pos=[int(k4[2]),int(k4[2])+cigar2reaadlength(k4[4])]
pos_block_assign(block_bps[k1],read_pos,tolerance_bp)
if read_pos[-1]==read_pos[-2]:
if not read_pos[-1] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-1]]=0
rd_low_qual[k1][read_pos[-1]]+=(read_pos[1]-read_pos[0])
else:
if not read_pos[-2] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-2]]=0
if not read_pos[-1] in list(rd_low_qual[k1].keys()):
rd_low_qual[k1][read_pos[-1]]=0
rd_low_qual[k1][read_pos[-2]]+=block_bps[k1][read_pos[-2]][1]-read_pos[0]
rd_low_qual[k1][read_pos[-1]]+=-block_bps[k1][read_pos[-1]][0]+read_pos[1]
return rd_low_qual
def SV_file_name_readin(file_path,file_key,file_appdix):
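#list files under file_path whose extension equals file_appdix and whose name contains file_key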
out=[]
for k1 in os.listdir(file_path):
if k1.split('.')[-1]==file_appdix:
if file_key in k1:
out.append(file_path+k1)
return out
def SV_readin_svelter(svelter_file):
#eg of svelter_file: /scratch/remills_flux/xuefzhao/SV_discovery_index/download/SVelter.version10/svelter/HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.svelter
fin=open(svelter_file)
pin=fin.readline().strip().split()
out=[]
for line in fin:
pin=line.strip().split()
bp_info=pin[3].split(':')
ref_sv=pin[4]
alt_sv=pin[5]
out.append([bp_info]+[ref_sv,alt_sv])
return out
def seq_file_name_readin(seq_path):
#we support bam and cram as input
out=[]
seq_path=path_modify(seq_path)
for k1 in os.listdir(seq_path):
if k1.split('.')[-1] in ['bam','cram']:
out.append(seq_path+k1)
return out
def Single_Rec_Read_Locate(BP_para_dict,Letter_Double_rec,temp_bp, temp_let):
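#for pairs with only one mate recorded, fetch the missing mate with 'samtools view', then classify complete pairs as within-block, pair-through-breakpoint or double-read-through-breakpoint and accumulate block coverage, insert sizes and direction penalties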
Pair_ThroughBP=[]
Double_Read_ThroughBP=[]
Single_Read_ThroughBP=[]
Initial_IL=[]
BlockCov={}
Initial_Cov={}
Initial_DR_Penal=0
for j in temp_let:
BlockCov[j]=0
for key in list(Letter_Double_rec.keys()):
if len(Letter_Double_rec[key])==1:
pos1=int(Letter_Double_rec[key][0][3])
pos2=int(Letter_Double_rec[key][0][7])
bamChr=Letter_Double_rec[key][0][2]
fbamtemp=os.popen(r'''samtools view -F 256 %s %s:%d-%d'''%(Initial_Bam,bamChr,pos2,pos2+ReadLength))
while True:
pbam=fbamtemp.readline().strip().split()
if not pbam: break
flag=0
if pbam[0]==key:
Letter_Double_rec[key]+=[pbam[:9]]
flag+=1
if flag==1:
break
fbamtemp.close()
for key in list(Letter_Double_rec.keys()):
if len(Letter_Double_rec[key])==2:
pos1=int(Letter_Double_rec[key][0][3])
pos2=int(Letter_Double_rec[key][1][3])
if not pos1>pos2:
pos1=int(Letter_Double_rec[key][0][3])
pos1b=pos1+cigar2reaadlength(Letter_Double_rec[key][0][5])
pos2=int(Letter_Double_rec[key][1][3])
pos2b=pos2+cigar2reaadlength(Letter_Double_rec[key][1][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double_rec[key][0][1])
elif pos1>pos2:
pos1=int(Letter_Double_rec[key][1][3])
pos1b=pos2+cigar2reaadlength(Letter_Double_rec[key][1][5])
pos2=int(Letter_Double_rec[key][0][3])
pos2b=pos1+cigar2reaadlength(Letter_Double_rec[key][0][5])
direct_temp=Reads_Direction_Detect_flag(Letter_Double_rec[key][1][1])
if not pos1<temp_bp[0]-BP_para_dict['flank']+1 and not pos2b>temp_bp[-1]+BP_para_dict['flank']-1:
block1=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos1+low_qual_edge)
block2=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos2+low_qual_edge)
block1b=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos1b-low_qual_edge)
block2b=Reads_block_assignment_1(BP_para_dict['flank'],temp_bp,temp_let,pos2b-low_qual_edge)
rela_1=pos1-temp_bp[temp_let.index(block1)]
rela_2=pos2-temp_bp[temp_let.index(block2)]
rela_1b=pos1b-temp_bp[temp_let.index(block1b)]
rela_2b=pos2b-temp_bp[temp_let.index(block2b)]
if block1==block1b==block2==block2b:
BlockCov[block1]+=cigar2reaadlength(Letter_Double_rec[key][0][5])
else:
if block1==block1b and block2==block2b:
Pair_ThroughBP.append([block1,rela_1,rela_1b, block2,rela_2,rela_2b]+direct_temp)
else:
Double_Read_ThroughBP.append([block1,rela_1,block1b,rela_1b, block2,rela_2,block2b,rela_2b]+direct_temp)
del Letter_Double_rec[key]
for j in Pair_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in Double_Read_ThroughBP:
if not j[-2:]==['+', '-']:
Initial_DR_Penal+=1
for j in temp_let:
Initial_Cov[j]=0
for j in Pair_ThroughBP:
Initial_Cov[j[0]]+=j[2]-j[1]
Initial_Cov[j[3]]+=j[5]-j[4]
for j in Single_Read_ThroughBP:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
for j in Double_Read_ThroughBP:
if j[0]==j[2]:
Initial_Cov[j[0]]+=j[3]-j[1]
else:
Initial_Cov[j[0]]+=temp_bp[temp_let.index(j[0])+1]-temp_bp[temp_let.index(j[0])]-j[1]
Initial_Cov[j[2]]+=j[3]
if j[4]==j[6]:
Initial_Cov[j[4]]+=j[7]-j[5]
else:
Initial_Cov[j[4]]+=temp_bp[temp_let.index(j[4])+1]-temp_bp[temp_let.index(j[4])]-j[5]
Initial_Cov[j[6]]+=j[7]
Initial_IL=[]
for j in Pair_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[3])]-temp_bp[temp_let.index(j[0])]-j[1]+j[5])
for j in Double_Read_ThroughBP:
Initial_IL.append(temp_bp[temp_let.index(j[6])]-temp_bp[temp_let.index(j[0])]-j[1]+j[7])
Initial_ILPenal=[]
for j in Initial_IL:
Initial_ILPenal+=[pdf_calculate(j,GC_para_dict['IL_Statistics'][4],GC_para_dict['IL_Statistics'][0],GC_para_dict['IL_Statistics'][1],GC_para_dict['IL_Statistics'][2],GC_para_dict['IL_Statistics'][3],BP_para_dict['Cut_Upper'],BP_para_dict['Cut_Lower'],Penalty_For_InsertLengthZero)/len(Initial_IL)]
return [Initial_DR_Penal,Initial_ILPenal,Pair_ThroughBP,Double_Read_ThroughBP,Single_Read_ThroughBP,BlockCov,Initial_Cov]
def Single_Read_Assort_For_insert(Full_Info,bp_list,flank):
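#assign each read end (from the global Full_Info_of_Reads) to the block it falls in and record its block-relative coordinates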
relative_bps=[i-bp_list[0] for i in bp_list]
letter_list=[chr(97+i) for i in range(len(bp_list)-1)]
Block_and_Reads={}
Block_and_Reads['left']=[]
Block_and_Reads['right']=[]
SingleR_Through=Full_Info[6]
Pair_Through=Full_Info[4]
Read_Through=Full_Info[5]
for block in letter_list:
Block_and_Reads[block]=[]
for j in Pair_Through:
Block_and_Reads[j[0]]=[j[1:3],j[3:]]
Block_and_Reads[j[3]]=[j[4:6],j[:3]+j[6:8]]
for j in Read_Through:
Block_and_Reads[j[0]]=[]
for key in list(Full_Info_of_Reads.keys()):
read_left=[int(i) for i in Full_Info_of_Reads[key][:2]]+[Full_Info_of_Reads[key][-2]]
read_right=[int(i) for i in Full_Info_of_Reads[key][2:4]]+[Full_Info_of_Reads[key][-1]]
assign_left=Reads_block_assignment_2(relative_bps,letter_list,read_left[0],read_left[1],flank)
assign_right=Reads_block_assignment_2(relative_bps,letter_list,read_right[0],read_right[1],flank)
New_Info=['_'.join([assign_left[0],str(int(co)-assign_left[1])]) for co in Full_Info_of_Reads[key][:2]]+['_'.join([assign_right[0],str(int(co)-assign_right[1])]) for co in Full_Info_of_Reads[key][2:4]]+Full_Info_of_Reads[key][4:]
Block_and_Reads[assign_left[0]][key]=New_Info
Block_and_Reads[assign_right[0]][key]=New_Info
return Block_and_Reads
def tau_calcu(Insert_Len_Stat,Physical_Cov_Stat,Read_Depth_Stat):
#eg of Insert_Len_Stat='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/ILNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.Bimodal'
#eg of Physical_Cov_Stat='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/TBNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.Bimodal'
#eg of Read_Depth_Stat='/scratch/remills_flux/xuefzhao/SV_discovery_index/download/NullModel.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.cram/RDNull.HG00512.alt_bwamem_GRCh38DH.20150715.CHS.high_coverage.genome.NegativeBinomial'
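#returns the null-model log-likelihoods evaluated at the means of the fitted distributions: [log_P_IL,log_P_RD,log_P_TB]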
IL_Stat=readin_IL_Stat(Insert_Len_Stat,'C')
log_P_IL=pdf_calculate(IL_Stat[0]*IL_Stat[1]+IL_Stat[3]*IL_Stat[4],IL_Stat[0],IL_Stat[1],IL_Stat[4],IL_Stat[2],IL_Stat[5],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero)
TB_Stat=readin_PC_Stat(Physical_Cov_Stat,'C')
log_P_TB=pdf_calculate(TB_Stat[0]*TB_Stat[1]+TB_Stat[3]*TB_Stat[4],TB_Stat[0],TB_Stat[1],TB_Stat[4],TB_Stat[2],TB_Stat[5],TB_Cut_Upper,TB_Cut_Lower,Penalty_For_InsertLengthZero)
RD_Stat=readin_RD_Stat(Read_Depth_Stat)
log_P_RD=Prob_NB(RD_Stat[0],RD_Stat[0],RD_Stat[2])
return [log_P_IL,log_P_RD,log_P_TB]
def main():
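#genotyping entry point: parse command-line options, read the SVelter records from single_file, and for each bam/cram in seq_file_names compute and write per-SV genotype likelihoods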
opts,args=getopt.getopt(sys.argv[2:],'f:',['file-sample=','seq-path=','workdir=','file-type=','seq-type=','seq-path=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','input-bed=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
global dict_opts
dict_opts=dict(opts)
global Window_Size
Window_Size=100
if dict_opts=={}:
readme.print_default_parameters_genotyper()
else:
commandline_readin()
Define_Default_SVPredict(dict_opts)
sv_info_list=SV_readin_svelter(single_file)
global geno_likelihood_list,sv_rec_list
[geno_likelihood_list,sv_rec_list]=[{},{}]
for bam_file_name in seq_file_names:
print(bam_file_name)
global_name_define_1(bam_file_name)
if not os.path.isfile(Insert_Len_Stat):
print('Error: cannot access file: '+Insert_Len_Stat)
else:
ReadLenFin=NullPath+BamN+'.'+genome_name+'.Stats'
if not os.path.isfile(ReadLenFin):
print('Error: cannot access file: '+ReadLenFin)
else:
ReadLenFin_info_readin(ReadLenFin)
rec=0
for sv_info in sv_info_list:
rec+=1
sv_rec_list[individual_name][rec]=sv_info
bps2_new=[sv_info[0]]
bps2_new=modify_bps1_new(bps2_new)
bps2_new_2=modify_bps2_new(bps2_new)
bps2=LN_bps2_Modify(bps2_new_2,chromos_all)
if len(bps2)>0 and qual_check_bps2(bps2)=='right':
Chromo=bps2[0][0]
if str(Chromo) in list(GC_Std_Coverage.keys()) and str(Chromo) in list(GC_Mean_Coverage.keys()):
K_RD=GC_Std_Coverage[str(Chromo)]/GC_Mean_Coverage[str(Chromo)]
K_IL=IL_Normal_Stat[2]/IL_Normal_Stat[1]
K_RD_new=1
K_IL_new=(K_IL/K_RD)**2
IL_GS=Prob_Norm(IL_Normal_Stat[1],IL_Normal_Stat[1],IL_Normal_Stat[2]**2)
RD_GS=Prob_Norm(GC_Mean_Coverage[str(Chromo)],GC_Mean_Coverage[str(Chromo)],GC_Std_Coverage[str(Chromo)]**2)
for i in bps2:
temp2=[int(j) for j in i[1:]]
k=[i[0]]+sorted(temp2)
k2=k[:2]
for k3 in temp2:
if not k3 in k2 and k3-k2[-1]>10:
k2.append(k3)
if len(k2)>2:
bps2[bps2.index(i)]=k2
else:
del bps2[bps2.index(i)]
if not len(bps2)<1:
original_bps_all=[]
for obas in bps2:
original_bps_all+=obas
original_structure=bp_to_let([original_bps_all],chromos_all)
chr_letter_tbp=letter_rearrange(flank,bps2)
letter_tGC=letter_GC_ReadIn(chr_letter_tbp)
if letter_tGC=='error': continue
letter_tRD=letter_RD_ReadIn(chr_letter_tbp)
if letter_tRD=='error': continue
[chr_letter_bp,letter_GC,letter_RD]=letter_bp_GC_RD_Prep(chr_letter_tbp,letter_tRD,letter_tGC)
left_keys=left_keys_prep(chr_letter_bp)
#chr_letter_bp=chr_letter_bp_modify(chr_letter_bp)
if not left_keys==[]:
bps3={}
for k1 in list(chr_letter_bp.keys()):
bps3[k1]={}
for k2 in list(chr_letter_bp[k1].keys()):
bps3[k1][chr_letter_bp[k1][k2][0]]=[chr_letter_bp[k1][k2][0],chr_letter_bp[k1][k2][-1]]
bps4={}
for k1 in list(bps3.keys()):
if not bps3[k1]=={}:
bps4[k1]=[[k1]+bps3[k1][sorted(bps3[k1].keys())[0]]]
for k2 in range(len(list(bps3[k1].keys()))-1):
if bps3[k1][sorted(bps3[k1].keys())[k2+1]][0]==bps3[k1][sorted(bps3[k1].keys())[k2]][-1]:
bps4[k1][-1]+=[bps3[k1][sorted(bps3[k1].keys())[k2+1]][-1]]
else:
bps4[k1].append(bps3[k1][sorted(bps3[k1].keys())[k2+1]])
bps2=bps4_to_bps2(bps4)
global Chr
Chr=bps2[0][0]
Flank_para_dict={'flank':flank,'Cut_Lower':Cut_Lower,'Cut_Upper':Cut_Upper,'ReadLength':ReadLength}
[Copy_num_estimate,Copy_num_Check]=copy_num_estimate_calcu(GC_para_dict,Flank_para_dict,bps2)
dup_CN_check=[sv_info[-1].count(i) for i in sv_info[-2].split('/')[0]]
high_CN_block=[i for i in sv_info[-2].split('/')[0] if dup_CN_check[sv_info[-2].split('/')[0].index(i)]>3]
if not high_CN_block==[]:
#Full_Info=Full_Info_of_Reads_Integrate(GC_para_dict,Flank_para_dict,bps2)
for high_CN_let in high_CN_block:
#geno_likelihood_list[individual_name][rec]=sv_info[1:]+['tan_dup',':'.join(sv_info[0]+['CN='+str(int(Full_Info[1][high_CN_let]/GC_para_dict['GC_Mean_Coverage'][Chr]*2))])]
geno_likelihood_list[individual_name][rec]=sv_info[1:]+['tan_dup',':'.join(sv_info[0])]
else:
#if Copy_num_Check==[]:
Full_Info=Full_Info_of_Reads_Integrate(GC_para_dict,Flank_para_dict,bps2)
RD_within_B=RD_within_B_calcu(GC_Mean_Coverage,Full_Info,bps2)
for j in range(Cut_Lower,Cut_Upper+1):
Single_ILScore=pdf_calculate(j,IL_Statistics[4],IL_Statistics[0],IL_Statistics[1],IL_Statistics[2],IL_Statistics[3],Cut_Upper,Cut_Lower,Penalty_For_InsertLengthZero)
let_chr_rec={}
for i in list(chr_letter_bp.keys()):
for j in list(chr_letter_bp[i].keys()):
if j in left_keys:
let_chr_rec[j]=i
for i in list(let_chr_rec.keys()):
Theo_RD=GC_Overall_Median_Coverage[str(let_chr_rec[i])]
Theo_Var=GC_Var_Coverage[str(let_chr_rec[i])]
for j in range(int(Theo_RD/2),int(Theo_RD/2*3+1)):
single_ProbNB=Prob_Norm(j,Theo_RD,Theo_Var)
Block_CN_Upper={}
median_CN=GC_Overall_Median_Coverage[chrom_N]/2
for key in list(Initial_GCRD_Adj.keys()):
if not key in ['left','right']:
Block_CN_Upper[key]=Initial_GCRD_Adj[key]/median_CN+2
[Initial_DR,Initial_IL,BlockGC]=[Full_Info[2],Full_Info[3],Full_Info[7]]
BlockGC['left']=0.476
BlockGC['right']=0.476
BlockGC2={}
for key_B_GC in list(BlockGC.keys()):
BlockGC2[key_B_GC]=BlockGC[key_B_GC]
BlockGC2[key_B_GC+'^']=BlockGC[key_B_GC]
original_letters=Full_Info[9]
original_bp_list=Full_Info[8]
num_of_read_pairs=Be_BP_Letter_modify(original_letters,flank,RD_within_B,ReadLength,Full_Info,original_bp_list)
Initial_TB=0
Initial_Move_Prob=[1.0/3,1.0/3,1.0/3]
[Pair_Through,Read_Through]=Read_Through_modify(Full_Info[4],Full_Info[5],Be_BP_Letter)
SingleR_Through=Full_Info[6]
bp_MP=[original_bp_list,original_bp_list]
letter_MP=[original_letters,original_letters]
Be_BP=[original_bp_list,original_bp_list]
Be_Info=[Pair_Through,Read_Through,SingleR_Through]
Be_Letter=[[i for i in original_structure.split('/')[0]] for j in range(2)]
Best_Score=float("-inf")
Best_Letter=[]
Best_BPs=[]
score_record=[]
#best_score_rec=[]
num_of_reads=(original_bp_list[-1]-original_bp_list[0])*GC_Mean_Coverage[Chr]/2/ReadLength
Best_Score_Rec=0
Score_rec_hash={}
break_Iteration_Flag=0
run_flag=0
Best_Letter_Rec=[]
global BP_para_dict
BP_para_dict={'flank':flank,'Cut_Lower':Cut_Lower,'Cut_Upper':Cut_Upper,'ReadLength':ReadLength,'Be_Letter':Be_Letter,'num_of_reads':num_of_reads,'original_letters':original_letters,'BlockGC2':BlockGC2,'BlockGC':BlockGC,'original_bp_list':original_bp_list,'RD_within_B':RD_within_B}
structure_candidates=alt_SV_genotype_prep(sv_info)
geno_prob=genotype_SVs_Process(GC_para_dict,BP_para_dict,run_flag,Score_rec_hash,Be_BP_Letter,Be_Info,structure_candidates)
geno_likelihood_list[individual_name][rec]=['/'.join([''.join(i[0]),''.join(i[1])]) for i in structure_candidates]+geno_prob
#else:
# Full_Info=Full_Info_of_Reads_Integrate(GC_para_dict,Flank_para_dict,bps2)
# geno_likelihood_list[individual_name][rec]=sv_info[1:]+['tan_dup']+[':'.join([str(j) for j in i]) for i in Copy_num_Check_report(Copy_num_Check,Full_Info,chr_letter_bp)]
else: geno_likelihood_list[individual_name][rec]=sv_info[1:]+['none']
else: geno_likelihood_list[individual_name][rec]=sv_info[1:]+['none']
else: geno_likelihood_list[individual_name][rec]=sv_info[1:]+['none']
print(geno_likelihood_list[individual_name][rec])
if rec%100==0: # write out and clear the in-memory records every 100 SVs
geno_likelihood_write(geno_likelihood_list,sv_rec_list,single_file,bam_file_name)
for test in range(rec-100,rec):
del geno_likelihood_list[individual_name][test+1]
del sv_rec_list[individual_name][test+1]
geno_likelihood_write(geno_likelihood_list,sv_rec_list,single_file,bam_file_name)
main()
if function_name=='SVIntegrate_vcf4.1':
import glob
import getopt
opts,args=getopt.getopt(sys.argv[2:],'o:h:S:',['deterministic-flag=','help=','long-insert=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters_svintegrate()
else:
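# add_csv_info: route one rearranged haplotype (flag_sex 1 = first allele, 2 = second) to the simple
# DEL/DUP/INV/TRA handlers when the change is simple; otherwise record its deletion, inversion and
# duplication components (csv1[0]=deleted letters, csv1[1]=inverted, csv1[2]=duplicated, csv1[3]=tra flag).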
def add_csv_info(csv1,flag_sex,k1,k2):
if flag_sex==1:
del_let=[csv1[0],[]]
inv_let=[csv1[1],[]]
dup_let=[csv1[2],[]]
elif flag_sex==2:
del_let=[[],csv1[0]]
inv_let=[[],csv1[1]]
dup_let=[[],csv1[2]]
if simple_DEL_decide(k1,k2)=='Simple_DEL':
for k3 in sv_info[k1][k2]:
del_info_add(k3,del_let)
elif simple_DUP_decide(k1,k2)=='Simple_DUP':
dup_subtype=simple_DUP_type(k1,k2)
dis_dup_let=[[],[]]
tan_dup_let=[[],[]]
for x in range(2):
if not dup_subtype[x]==[]:
for y in dup_subtype[x]:
if y.split('_')[1]=='Disperse':
dis_dup_let[x].append([y.split('_')[0],k2.split('/')[x].count(y.split('_')[0])])
else:
tan_dup_let[x].append([y.split('_')[0],k2.split('/')[x].count(y.split('_')[0])])
if not tan_dup_let==[[],[]]:
for k3 in sv_info[k1][k2]:
dup_info_2_add(k3,tan_dup_let)
if not dis_dup_let==[[],[]]:
for k3 in sv_info[k1][k2]:
disperse_dup_info_2_add(k3,dis_dup_let)
elif simple_INV_decide(k1,k2)=='Simple_INV':
for k3 in sv_info[k1][k2]:
inv_info_add(k3,inv_let)
elif simple_TRA_decide(k1,k2)=='simple_TRA':
tra_info_add(k1,k2)
else:
dup_csv_subtype=dup_type_decide(dup_let,flag_sex,k1,k2)
for k3 in sv_info[k1][k2]:
del_csv_info_add(k3,del_let)
inv_csv_info_add(k3,inv_let)
dup_csv_info_add(k3,dup_let,dup_csv_subtype)
if csv1[3]==1:
tra_csv_info_add(k1,k2)
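# comp_info_reorganize: decompose a complex alt structure k2 against the reference junction string k1;
# letters missing from an allele are recorded as deletions, letters occurring more than once as
# duplications, and '^'-suffixed clusters as inversions, via the *_info_add helpers.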
def comp_info_reorganize(k1,k2):
del_let=[[],[]]
dup_let=[[],[]]
inv_let=[[],[]]
tra_let=[[],[]]
k2a=k2.split('/')[0]
k2b=k2.split('/')[1]
k2c=[]
k2d=[]
for k3 in k2a:
if not k3=='^':
k2c.append(k3)
else:
k2c[-1]+=k3
for k3 in k2b:
if not k3=='^':
k2d.append(k3)
else:
k2d[-1]+=k3
for k3 in k1.split('/')[0]:
if k2a.count(k3)==0:
del_let[0].append(k3)
if k2b.count(k3)==0:
del_let[1].append(k3)
if k2a.count(k3)>1:
dup_let[0].append(k3)
if k2b.count(k3)>1:
dup_let[1].append(k3)
k2e=let_reclust(k2c)
k2f=let_reclust(k2d)
k2g=dup_let_recombind(dup_let[0])
k2h=dup_let_recombind(dup_let[1])
k2i=[]
k2j=[]
for k3 in k2g:
flag1=0
for k4 in k2e:
if k3 in k4:
flag1+=1
if flag1>1:
k2i.append(k3)
for k3 in dup_let[0]:
if k2e.count(k3[0])+k2e.count(k3[0]+'^')>0:
if not k3[0] in k2i:
k2i.append(k3[0])
for k3 in k2h:
flag1=0
for k4 in k2e:
if k3 in k4:
flag1+=1
if flag1>1:
k2j.append(k3)
for k3 in dup_let[1]:
if k2e.count(k3[0])+k2e.count(k3[0]+'^')>0:
if not k3[0] in k2j:
k2j.append(k3[0])
k2m=[]
for k3 in k2e:
if k3[-1]=='^':
k2m.append(k3)
k2n=[]
for k3 in k2f:
if k3[-1]=='^':
k2n.append(k3)
for k3 in sv_info[k1][k2]:
del_info_add(k3,del_let)
dup_info_add(k3,[k2i,k2j])
inv_info_add(k3,[k2m,k2n])
def del_info_reorganize(k1,k2):
del_let=[[],[]]
for k3 in k1.split('/')[0]:
if not k3 in k2.split('/')[0]:
del_let[0].append(k3)
for k3 in k1.split('/')[1]:
if not k3 in k2.split('/')[1]:
del_let[1].append(k3)
for k3 in sv_info[k1][k2]:
del_bp=[]
if not del_let[0]==[]:
del_bp.append(bp_to_hash(k3,del_let[0],chromos))
else:
del_bp.append([])
if not del_let[1]==[]:
del_bp.append(bp_to_hash(k3,del_let[1],chromos))
else:
del_bp.append([])
if del_bp[0]==del_bp[1]:
for k4 in del_bp[0]:
if not k4[0] in list(del1.keys()):
del1[k4[0]]=[]
if not [int(k4[1]),int(k4[-1]),'hom'] in del1[k4[0]]:
del1[k4[0]].append([int(k4[1]),int(k4[-1]),'hom'])
else:
for k5 in del_bp:
for k4 in k5:
if not k4[0] in list(del1.keys()):
del1[k4[0]]=[]
if not [int(k4[1]),int(k4[-1]),'het'] in del1[k4[0]]:
del1[k4[0]].append([int(k4[1]),int(k4[-1]),'het'])
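# del_info_add: convert the deleted letter blocks of each haplotype into genomic intervals (bp_to_hash)
# and append them to del1 per chromosome, labelled 'hom' when shared by both haplotypes, else 'heta'/'hetb'.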
def del_info_add(k3,del_let):
tempa=bp_to_hash(k3[:-1],del_let[0],chromos)
tempb=bp_to_hash(k3[:-1],del_let[1],chromos)
for k1 in tempa:
if k1 in tempb:
tempc='hom'
tempb.remove(k1)
else:
tempc='heta'
if not k1[0] in list(del1.keys()):
del1[k1[0]]=[]
del1[k1[0]].append(k1[1:]+[tempc,k3[-1],';'.join(k3[:-1]+['S'])])
for k1 in tempb:
if not k1[0] in list(del1.keys()):
del1[k1[0]]=[]
del1[k1[0]].append(k1[1:]+['hetb',k3[-1],';'.join(k3[:-1]+['S'])])
def del_csv_info_add(k3,del_let):
tempa=bp_to_hash(k3[:-1],del_let[0],chromos)
tempb=bp_to_hash(k3[:-1],del_let[1],chromos)
for k1 in tempa:
if k1 in tempb:
tempc='hom'
tempb.remove(k1)
else:
tempc='heta'
if not k1[0] in list(del1.keys()):
del1[k1[0]]=[]
del1[k1[0]].append(k1[1:]+[tempc,k3[-1],';'.join(k3[:-1]+['C'])])
for k1 in tempb:
if not k1[0] in list(del1.keys()):
del1[k1[0]]=[]
del1[k1[0]].append(k1[1:]+['hetb',k3[-1],';'.join(k3[:-1]+['C'])])
def dup_csv_info_add(k3,dup_let,dup_csv_subtype):
temprec=-1
dup_index_1=-1
for k2x in dup_let:
temprec+=1
hetx=['heta','hetb'][temprec]
dup_index_1+=1
dup_index_2=-1
for k4 in k2x:
dup_index_2+=1
dup_subtype_current=dup_csv_subtype[dup_index_1][dup_index_2]
if dup_subtype_current=='Tandem':
temp=bp_to_hash(k3[:-1],[i for i in k4[0]],chromos)
for k5 in temp:
if not k5[0] in list(dup1.keys()): # tandem duplications are collected in dup1 (reported as <DUP:TANDEM>)
dup1[k5[0]]=[]
if k4[1]>1:
dup1[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['S']),k4[1]])
elif dup_subtype_current=='Disperse':
temp=bp_to_hash(k3[:-1],[i for i in k4[0]],chromos)
for k5 in temp:
if not k5[0] in list(disperse_dup.keys()):
disperse_dup[k5[0]]=[]
if k4[1]>1:
disperse_dup[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['S']),k4[1]])
def dup_info_add(k3,dup_let):
for k2x in dup_let:
for k4 in k2x:
temp=bp_to_hash(k3[:-1],[i for i in k4],chromos)
for k5 in temp:
if not k5[0] in list(dup1.keys()):
dup1[k5[0]]=[]
dup1[k5[0]].append(k5[1:]+[k3[-1],'_'.join(k3[:-1])])
def dup_info_2_add(k3,dup_let):
temprec=-1
for k2x in dup_let:
temprec+=1
hetx=['heta','hetb'][temprec]
for k4 in k2x:
temp=bp_to_hash(k3[:-1],[i for i in k4[0]],chromos)
for k5 in temp:
if not k5[0] in list(dup1.keys()):
dup1[k5[0]]=[]
if k4[1]>1:
dup1[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['S']),k4[1]])
def Define_Default_SVIntegrate():
global score_Cff
if not '--qc-structure' in dict_opts:
score_Cff=0
else:
score_Cff=int(dict_opts['--qc-structure'])
def disperse_dup_info_2_add(k3,dup_let):
temprec=-1
for k2x in dup_let:
temprec+=1
hetx=['heta','hetb'][temprec]
for k4 in k2x:
temp=bp_to_hash(k3[:-1],[i for i in k4[0]],chromos)
for k5 in temp:
if not k5[0] in list(disperse_dup.keys()):
disperse_dup[k5[0]]=[]
if k4[1]>1:
disperse_dup[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['S']),k4[1]])
def dup_csv_info_2_add(k3,dup_let):
temprec=-1
for k2x in dup_let:
temprec+=1
hetx=['heta','hetb'][temprec]
for k4 in k2x:
temp=bp_to_hash(k3[:-1],[i for i in k4[0]],chromos)
for k5 in temp:
if not k5[0] in list(dup1.keys()):
dup1[k5[0]]=[]
if k4[1]>1:
dup1[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['C']),k4[1]])
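# hash_collaps: merge duplicate sv_out records that differ only in genotype; a 0|1 and a 1|0 call of the
# same variant collapse into a single 1|1 (copy numbers are summed when present).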
def hash_collaps():
for k1 in list(sv_out.keys()):
for k2 in list(sv_out[k1].keys()):
if len(sv_out[k1][k2])>1:
temp=[]
temp2=[]
for k3 in sv_out[k1][k2]:
if not k3[:-1] in temp:
temp.append(k3[:-1])
temp2.append([k3[-1]])
else:
temp2[temp.index(k3[:-1])].append(k3[-1])
for k3 in range(len(temp2)):
if len(temp2[k3])>1:
if sorted([temp2[k3][0].split(':')[0],temp2[k3][1].split(':')[0]])==['0|1', '1|0']:
if not ':' in temp2[k3][0]:
temp2[k3]=['1|1']
else:
temp2[k3]=['1|1:'+str(int(temp2[k3][0].split(':')[1])+int(temp2[k3][1].split(':')[1]))]
temp3=[]
for k3 in range(len(temp2)):
temp3.append(temp[k3]+temp2[k3])
sv_out[k1][k2]=temp3
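# hash_collaps2: when several records share the same start coordinate and all of them are <DUP>,
# keep only the call with the largest end coordinate and drop the rest from sv_out.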
def hash_collaps2():
temp={}
for k1 in list(sv_out.keys()):
temp[k1]={}
for k2 in list(sv_out[k1].keys()):
for k3 in sv_out[k1][k2]:
pos=end_cordi_calcu(k3)
if not pos[1] in list(temp[k1].keys()):
temp[k1][pos[1]]={}
if not pos[2] in list(temp[k1][pos[1]].keys()):
temp[k1][pos[1]][pos[2]]=[]
temp[k1][pos[1]][pos[2]].append([k1,k2,k3])
out={}
for k1 in list(temp.keys()):
out[k1]={}
for k2 in list(temp[k1].keys()):
if len(temp[k1][k2])>1:
flag=1
for k3 in list(temp[k1][k2].keys()):
for k4 in temp[k1][k2][k3]:
if not k4[2][4]=='<DUP>':
flag=0
if flag==1:
for k4 in list(temp[k1][k2].keys()):
if not k4==max(temp[k1][k2].keys()):
for k5 in temp[k1][k2][k4]:
del sv_out[k1][k5[2][2]][sv_out[k1][k5[2][2]].index(k5[2])]
if sv_out[k1][k5[2][2]]==[]:
del sv_out[k1][k5[2][2]]
def hash_collaps3():
for k1 in list(sv_out.keys()):
for k2 in list(sv_out[k1].keys()):
if len(sv_out[k1][k2])>1:
temp1=[]
temp2=[]
for k3 in range(len(sv_out[k1][k2])):
if not sv_out[k1][k2][k3][:5]+sv_out[k1][k2][k3][6:-1] in temp1:
temp1.append(sv_out[k1][k2][k3][:5]+sv_out[k1][k2][k3][6:-1])
temp2.append(sv_out[k1][k2][k3])
else:
continue
sv_out[k1][k2]=temp2
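# hash_reorder: turn the per-type hashes (del1, inv1, dup1, disperse_dup, tra1) into VCF-style rows in
# sv_out: DEL/INV records carry GT, duplications carry GT:CN (<DUP:TANDEM> from dup1, <DUP> from
# disperse_dup), translocations become breakend-notation mates; scores below score_Cff are flagged LowQual.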
def hash_reorder():
for ka1 in list(del1.keys()):
if not ka1 in list(sv_out.keys()):
sv_out[ka1]={}
for ka2 in del1[ka1]:
REF_AL='N'
Pass_Sign='PASS'
if ka2[3]<score_Cff:
Pass_Sign='LowQual'
if ka2[2]=='heta':
GenoType='1|0'
elif ka2[2]=='hetb':
GenoType='0|1'
elif ka2[2]=='hom': # del_info_add stores homozygous deletions as 'hom'
GenoType='1|1'
ka_new=[ka1,ka2[0],ka2[-1],REF_AL,'<DEL>',ka2[3],Pass_Sign,'SVTYPE=DEL;END='+str(ka2[1]),'GT',GenoType]
if not ka2[-1] in list(sv_out[ka1].keys()):
sv_out[ka1][ka2[-1]]=[]
if not ka_new in sv_out[ka1][ka2[-1]]:
sv_out[ka1][ka2[-1]].append(ka_new)
for ka1 in list(inv1.keys()):
if not ka1 in list(sv_out.keys()):
sv_out[ka1]={}
for ka2 in inv1[ka1]:
REF_AL='N'
Pass_Sign='PASS'
if ka2[3]<score_Cff:
Pass_Sign='LowQual'
if ka2[2]=='heta':
GenoType='1|0'
elif ka2[2]=='hetb':
GenoType='0|1'
elif ka2[2]=='homo':
GenoType='1|1'
ka_new=[ka1,ka2[0],ka2[-1],REF_AL,'<INV>',ka2[3],Pass_Sign,'SVTYPE=INV;END='+str(ka2[1]),'GT',GenoType]
if not ka2[-1] in list(sv_out[ka1].keys()):
sv_out[ka1][ka2[-1]]=[]
if not ka_new in sv_out[ka1][ka2[-1]]:
sv_out[ka1][ka2[-1]].append(ka_new)
for ka1 in list(dup1.keys()):
if not ka1 in list(sv_out.keys()):
sv_out[ka1]={}
for ka2 in dup1[ka1]:
REF_AL='N'
CopyNumber=str(ka2[-1])
Pass_Sign='PASS'
if ka2[3]<score_Cff:
Pass_Sign='LowQual'
if ka2[2]=='heta':
GenoType='1|0'
elif ka2[2]=='hetb':
GenoType='0|1'
elif ka2[2]=='homo':
GenoType='1|1'
ka_new=[ka1,ka2[0],ka2[-2],REF_AL,'<DUP:TANDEM>',ka2[3],Pass_Sign,'SVTYPE=DUP;END='+str(ka2[1]),'GT:CN',GenoType+':'+CopyNumber]
if not ka2[-2] in list(sv_out[ka1].keys()):
sv_out[ka1][ka2[-2]]=[]
if not ka_new in sv_out[ka1][ka2[-2]]:
sv_out[ka1][ka2[-2]].append(ka_new)
for ka1 in list(disperse_dup.keys()):
if not ka1 in list(sv_out.keys()):
sv_out[ka1]={}
for ka2 in disperse_dup[ka1]:
REF_AL='N'
CopyNumber=str(ka2[-1])
Pass_Sign='PASS'
if ka2[3]<score_Cff:
Pass_Sign='LowQual'
if ka2[2]=='heta':
GenoType='1|0'
elif ka2[2]=='hetb':
GenoType='0|1'
elif ka2[2]=='homo':
GenoType='1|1'
ka_new=[ka1,ka2[0],ka2[-2],REF_AL,'<DUP>',ka2[3],Pass_Sign,'SVTYPE=DUP;END='+str(ka2[1]),'GT:CN',GenoType+':'+CopyNumber]
if not ka2[-2] in list(sv_out[ka1].keys()):
sv_out[ka1][ka2[-2]]=[]
if not ka_new in sv_out[ka1][ka2[-2]]:
sv_out[ka1][ka2[-2]].append(ka_new)
for ka1 in list(tra1.keys()):
ks1=ka1.split(';')[0]
ks2=';'.join(ka1.split(';')[:-2]+[ka1.split(';')[-1]])
SV_Score=float(ka1.split(';')[-2])
Pass_Sign='PASS'
if SV_Score<score_Cff:
Pass_Sign='LowQual'
if not ks1 in list(sv_out.keys()):
sv_out[ks1]={}
if not ks2 in list(sv_out[ks1].keys()):
sv_out[ks1][ks2]=[]
for ka2 in list(tra1[ka1].keys()):
hetx='het'+ka2
if ka2=='a':
GenoType='1|0'
elif ka2=='b':
GenoType='0|1'
for ka3 in tra1[ka1][ka2]:
ka_new=ka3[:2]+[ks2,ka3[2]]+ka3[3:]+[SV_Score,Pass_Sign,'SVTYPE=TRA','GT',GenoType]
if not ka_new in sv_out[ks1][ks2]:
sv_out[ks1][ks2].append(ka_new)
def inv_csv_info_add(k3,inv_let):
temprec=-1
for k2x in inv_let:
temprec+=1
hetx=['heta','hetb'][temprec]
for k4 in k2x:
temp=bp_to_hash(k3[:-1],[i for i in k4],chromos)
for k5 in temp:
if not k5[0] in list(inv1.keys()):
inv1[k5[0]]=[]
inv1[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['C'])])
def inv_info_add(k3,inv_let):
temprec=-1
for k2x in inv_let:
temprec+=1
hetx=['heta','hetb'][temprec]
for k4 in k2x:
temp=bp_to_hash(k3[:-1],[i for i in k4],chromos)
for k5 in temp:
if not k5[0] in list(inv1.keys()):
inv1[k5[0]]=[]
inv1[k5[0]].append(k5[1:]+[hetx,k3[-1],';'.join(k3[:-1]+['S'])])
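# MissedSV_Produce_files: read two interval files (chrom start end) and return, per chromosome, the
# reference intervals that have no sample interval overlapping them by more than 50% of the longer interval.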
def MissedSV_Produce_files(ref_file,samp_file):
ref_hash={}
samp_hash={}
out={}
for i in chromos:
ref_hash[i]=[]
samp_hash[i]=[]
fin=open(ref_file)
for line in fin:
pin=line.strip().split()
ref_hash[pin[0]].append([int(i) for i in pin[1:3]])
fin.close()
fin=open(samp_file)
for line in fin:
pin=line.strip().split()
samp_hash[pin[0]].append([int(i) for i in pin[1:3]])
fin.close()
for k1 in chromos:
flag1=0
if not ref_hash[k1]==[]:
out[k1]=[]
for k2 in ref_hash[k1]:
flag2=0
for k3 in samp_hash[k1]:
if k3[1]<k2[0]: continue
elif k3[0]>k2[1]: continue
else:
if float(sorted(k2+k3)[2]-sorted(k2+k3)[1])/float(max(k2[1]-k2[0],k3[1]-k3[0]))>0.5:
flag2+=1
if flag2>0:
flag1+=1
else:
out[k1].append(k2)
return out
def MissSV_writing(filename,hash):
fo=open(filename,'w')
for k1 in list(hash.keys()):
for k2 in list(hash[k1].keys()):
for k3 in chromos:
if k3 in list(hash[k1][k2].keys()):
for k4 in hash[k1][k2][k3]:
print(' '.join([str(i) for i in [k3]+k4+[k1,k2]]), file=fo)
fo.close()
def MissSV_Compare(File1,File2):
hash1={}
hash2={}
for k1 in chromos:
hash1[k1]={}
hash2[k1]={}
fin=open(File1)
for line in fin:
pin=line.strip().split()
if not pin[3] in list(hash1[pin[0]].keys()):
hash1[pin[0]][pin[3]]={}
if not pin[4].upper() in list(hash1[pin[0]][pin[3]].keys()):
hash1[pin[0]][pin[3]][pin[4].upper()]=[]
hash1[pin[0]][pin[3]][pin[4].upper()].append([pin[1],pin[2]])
fin.close()
fin=open(File2)
for line in fin:
pin=line.strip().split()
if not pin[3] in list(hash2[pin[0]].keys()):
hash2[pin[0]][pin[3]]={}
if not pin[4].upper() in list(hash2[pin[0]][pin[3]].keys()):
hash2[pin[0]][pin[3]][pin[4].upper()]=[]
hash2[pin[0]][pin[3]][pin[4].upper()].append([pin[1],pin[2]])
fin.close()
hash3={}
for k1 in list(hash1.keys()):
hash3[k1]={}
for k2 in list(hash1[k1].keys()):
hash3[k1][k2]={}
if k2 in list(hash2[k1].keys()):
for k3 in list(hash1[k1][k2].keys()):
hash3[k1][k2][k3]=[]
if k3 in list(hash2[k1][k2].keys()):
for k4 in hash1[k1][k2][k3]:
if not k4 in hash2[k1][k2][k3]:
hash3[k1][k2][k3].append(k4)
else:
hash3[k1][k2][k3]=hash1[k1][k2][k3]
else:
hash3[k1][k2]=hash1[k1][k2]
fo=open(File1+'.vs.'+File2.split('/')[-1],'w')
for k1 in chromos:
if k1 in list(hash3.keys()):
for k2 in list(hash3[k1].keys()):
for k3 in list(hash3[k1][k2].keys()):
for k4 in hash3[k1][k2][k3]:
print(' '.join([str(i) for i in [k1]+k4+[k2,k3]]), file=fo)
fo.close()
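# ROC_produce_files: same >50% overlap test as MissedSV_Produce_files, but returns per-chromosome counts:
# [matched reference intervals, total reference intervals, total sample intervals, matched fraction].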
def ROC_produce_files(ref_file,samp_file):
ref_hash={}
samp_hash={}
out={}
for i in chromos:
ref_hash[i]=[]
samp_hash[i]=[]
fin=open(ref_file)
for line in fin:
pin=line.strip().split()
ref_hash[pin[0]].append([int(i) for i in pin[1:3]])
fin.close()
fin=open(samp_file)
for line in fin:
pin=line.strip().split()
samp_hash[pin[0]].append([int(i) for i in pin[1:3]])
fin.close()
for k1 in chromos:
flag1=0
if not ref_hash[k1]==[]:
out[k1]=[]
for k2 in ref_hash[k1]:
flag2=0
for k3 in samp_hash[k1]:
if k3[1]<k2[0]: continue
elif k3[0]>k2[1]: continue
else:
if float(sorted(k2+k3)[2]-sorted(k2+k3)[1])/float(max(k2[1]-k2[0],k3[1]-k3[0]))>0.5:
flag2+=1
if flag2>0:
flag1+=1
out[k1]=[flag1,len(ref_hash[k1]),len(samp_hash[k1]),float(flag1)/float(len(ref_hash[k1]))]
return out
def ROC_writing(filename,hash):
fo=open(filename,'w')
for k1 in list(hash.keys()):
for k2 in list(hash[k1].keys()):
for k3 in chromos:
if k3 in list(hash[k1][k2].keys()):
print(' '.join([str(i) for i in [k1,k2,k3]+hash[k1][k2][k3]]), file=fo)
fo.close()
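# read_in_structures: parse a .coverge file in five-line blocks (breakpoints, resolved structure,
# 'Theoretical' score, 'Current' score, 'Time'); each event is stored under
# sv_info[reference structure][alt structure] together with the score difference (Current - Theoretical).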
def read_in_structures(filein):
fin=open(filein)
while True:
pin1=fin.readline().strip().split()
if not pin1: break
if pin1[0]=='Total': break
pin2=fin.readline().strip().split()
pin3=fin.readline().strip().split()
pin4=fin.readline().strip().split()
pin5=fin.readline().strip().split()
if pin3[0]=='Theoretical' and pin4[0]=='Current' and pin5[0]=='Time':
let1=bp_to_let([pin1],chromos)
if not let1==0:
let2='/'.join(sorted(pin2[0].split('/')))
if not let1 in list(sv_info.keys()):
sv_info[let1]={}
if not let2 in list(sv_info[let1].keys()):
sv_info[let1][let2]=[]
if not pin1 in sv_info[let1][let2]:
sv_info[let1][let2].append(pin1+[float(pin4[-1])-float(pin3[-1])])
fin.close()
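# SV_Info_Write_svelter: sort all resolved events by chromosome/start/end, classify each with svc.classify,
# write the tab-delimited .svelter file (chr start end bp_info ref alt alt_type score) and return the
# SV types encountered; records whose alt string is >20x longer than the ref string are skipped.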
def SV_Info_Write_svelter(sv_info):
temp1={}
sv_type_record={}
for k1 in list(sv_info.keys()):
for k2 in list(sv_info[k1].keys()):
for k3 in sv_info[k1][k2]:
if not k3[0] in list(temp1.keys()):
temp1[k3[0]]={}
if not int(k3[1]) in list(temp1[k3[0]].keys()):
temp1[k3[0]][int(k3[1])]={}
if not int(k3[-2]) in list(temp1[k3[0]][int(k3[1])].keys()):
temp1[k3[0]][int(k3[1])][int(k3[-2])]=[]
temp1[k3[0]][int(k3[1])][int(k3[-2])].append(k3+[k1,k2])
fo=open(output_file.replace('.vcf','.svelter'),'w')
print('\t'.join(['chr','start','end','bp_info','ref','alt','alt_type','score']), file=fo)
for k1 in chromos:
if k1 in list(temp1.keys()):
for k2 in sorted(temp1[k1].keys()):
for k3 in sorted(temp1[k1][k2].keys()):
for k4 in temp1[k1][k2][k3]:
if len(k4[-1])/len(k4[-2])>20: continue
chrom_svelter=k1
bp_start_svelter=k2
bp_end_svelter=k3
bps_info_svelter=':'.join(k4[:-3])
struc_ref_svelter=k4[-2]
struc_alt_svelter=k4[-1]
score_svelter=k4[-3]
output_old=[str(i) for i in [chrom_svelter,bp_start_svelter,bp_end_svelter,bps_info_svelter,struc_ref_svelter,struc_alt_svelter,score_svelter]]
output_new=svc.classify(output_old)
output_new2=output_new[:-2]+['/'.join(output_new[-2:])]
if not output_new[3] in list(sv_type_record.keys()):
sv_type_record[output_new[3]]=[output_new2[-1]]
print('\t'.join(output_new2), file=fo)
fo.close()
return sv_type_record
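# sv_rec_2: for every reference/alt structure pair, compare each alt haplotype with the reference
# haplotype and, where they differ, classify the change with simple_flag_SA and record it via add_csv_info.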
def sv_rec_2(sv_info):
for k1ab in list(sv_info.keys()):
for k2ab in list(sv_info[k1ab].keys()):
if not k2ab==k1ab:
k1aba=k1ab.split('/')[0]
k2aba=k2ab.split('/')[0]
k2abb=k2ab.split('/')[1]
flaga=[]
flagb=[]
test=[[],[]]
if flaga==[] and not k1aba==k2aba:
if k2aba=='':
csv1=[[i for i in k1aba],[],[],0]
else:
csv1=simple_flag_SA(k1aba,k2aba)
add_csv_info(csv1,1,k1ab,k2ab)
if flagb==[] and not k1aba==k2abb:
if k2abb=='':
csv1=[[i for i in k1aba],[],[],0]
else:
csv1=simple_flag_SA(k1aba,k2abb)
add_csv_info(csv1,2,k1ab,k2ab)
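# sv_rec: a more granular variant of sv_rec_2 that first tests each haplotype for simple DEL/DUP/INV/TRA
# signatures before falling back to add_csv_info (the SVIntegrate block below invokes sv_rec_2 instead).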
def sv_rec(sv_info):
for k1ab in list(sv_info.keys()):
for k2ab in list(sv_info[k1ab].keys()):
if not k2ab==k1ab:
if del_flag(k1ab,k2ab)==1:
delM=[]
delP=[]
for i in k1ab.split('/')[0]:
if not i in k2ab.split('/')[0]:
delM.append(i)
if not i in k2ab.split('/')[1]:
delP.append(i)
for k3 in sv_info[k1ab][k2ab]:
del_info_add(k3,[delM,delP])
else:
if inv_flag(k1ab,k2ab)+dup_flag(k1ab,k2ab)==0:
tra_info_add(k1ab,k2ab)
if del_flag_SA(k1ab.split('/')[0],k2ab.split('/')[0])==1:
delM=[]
delP=[]
for i in k1ab.split('/')[0]:
if not i in k2ab.split('/')[0]:
delM.append(i)
for k3 in sv_info[k1ab][k2ab]:
del_info_add(k3,[delM,delP])
if del_flag_SA(k1ab.split('/')[1],k2ab.split('/')[1])==1:
delM=[]
delP=[]
for i in k1ab.split('/')[0]:
if not i in k2ab.split('/')[1]:
delP.append(i)
for k3 in sv_info[k1ab][k2ab]:
del_info_add(k3,[delM,delP])
else:
k1aba=k1ab.split('/')[0]
k2aba=k2ab.split('/')[0]
k2abb=k2ab.split('/')[1]
flaga=[]
flagb=[]
if del_flag_SA(k1aba,k2aba)==1:#simple del on one allele
delM=[]
delP=[]
for i in k1ab.split('/')[0]:
if not i in k2ab.split('/')[0]:
delM.append(i)
for k3 in sv_info[k1ab][k2ab]:
del_info_add(k3,[delM,delP])
flaga.append('del')
if del_flag_SA(k1aba,k2abb)==1:#simple del on one allele
delM=[]
delP=[]
for i in k1ab.split('/')[0]:
if not i in k2ab.split('/')[1]:
delP.append(i)
for k3 in sv_info[k1ab][k2ab]:
del_info_add(k3,[delM,delP])
flagb.append('del')
if dup_flag_SA(k1aba,k2aba)==1:#simple dup on one allele
dupM=[]
dupP=[]
for i in k1aba:
if k2aba.count(i)>1:
dupM.append(i)
for k3 in sv_info[k1ab][k2ab]:
dup_info_add(k3,[dupM,dupP])
flaga.append('dup')
if dup_flag_SA(k1aba,k2abb)==1:#simple dup on one allele
dupM=[]
dupP=[]
for i in k1aba:
if k2abb.count(i)>1:
dupP.append(i)
for k3 in sv_info[k1ab][k2ab]:
dup_info_add(k3,[dupM,dupP])
flagb.append('dup')
if inv_flag_SA(k1aba,k2aba)==1:#simple inv on one allele
invM=[]
invP=[]
for i in range(len(k2aba)):
if k2aba[i]=='^':
invM.append(k2aba[i-1])
for k3 in sv_info[k1ab][k2ab]:
inv_info_add(k3,[invM,invP])
flaga.append('inv')
if inv_flag_SA(k1aba,k2abb)==1:#simple inv on one allele
invM=[]
invP=[]
for i in range(len(k2abb)):
if k2abb[i]=='^':
invP.append(k2abb[i-1])
for k3 in sv_info[k1ab][k2ab]:
inv_info_add(k3,[invM,invP])
flagb.append('inv')
if flaga==[] and not k1aba==k2aba:
csv1=simple_flag_SA(k1aba,k2aba)
add_csv_info(csv1,1,k1ab,k2ab)
if flagb==[] and not k1aba==k2abb:
csv1=simple_flag_SA(k1aba,k2abb)
add_csv_info(csv1,2,k1ab,k2ab)
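# tra_csv_info_add: for every breakpoint record of a complex event, store per-haplotype translocation
# entries in tra1 under a ';'-joined ID ending in 'C'; haplotypes that only lost blocks get DEL rows,
# otherwise each novel junction is emitted as a pair of VCF breakend ('['/']' bracket notation) mates.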
def tra_csv_info_add(k1,k2):
for k3 in sv_info[k1][k2]:
SV_ID=';'.join([str(i) for i in k3]+['C'])
if not SV_ID in list(tra1.keys()):
tra1[SV_ID]={}
k2a=k2.split('/')[0]
k2b=k2.split('/')[1]
bp_hash={}
block_rec=0
block_hash=[]
for a3 in k3[:-1]:
if a3 in chromos or not a3.isdigit():
block_hash.append([a3])
else:
block_hash[-1].append(a3)
for a3 in block_hash:
for a4 in range(len(a3)-2):
bp_hash[chr(97+block_rec)]=[a3[0],a3[a4+1],a3[a4+2]]
block_rec+=1
for a3 in list(bp_hash.keys()):
temp=[]
for a4 in bp_hash[a3][1:]:
temp.append(int(a4)-1)
temp.append(int(a4))
bp_hash[a3][1:]=temp
bp_hash['left']=[bp_hash[k1[0]][0],bp_hash[k1[0]][1],bp_hash[k1[0]][2]]
bp_hash['right']=[bp_hash[k1[-1]][0],bp_hash[k1[-1]][3],bp_hash[k1[-1]][4]]
ref_allele={}
for a3 in list(bp_hash.keys()):
ref_allele[a3]=[bp_hash[a3][0]]
for a4 in bp_hash[a3][1:]:
ref_allele[a3].append(ref_base_readin(ref,bp_hash[a3][0],a4))
if not k2a==k1.split('/')[0] and del_flag_SA(k1.split('/')[0],k2a)==0:
flag1=0#flag1==0:w/o inversion in the alt structure
if '^' in k2a:
flag1+=1
flag2=0#flag2==0:w/o duplication in the alt structure
for j in k2a:
if k2a.count(j)>1:
flag2+=1
flag3=0 #flag3==0: w/o translocation
if len(k2a)>1:
for i in range(len(k2a)-1):
if not ord(k2a[i+1])>ord(k2a[i]):
flag3+=1
if flag1+flag2+flag3==0:
heta_Del_block=[]
for a1 in k1.split('/')[0]:
if not a1 in k2a:
heta_Del_block.append(a1)
if not 'a' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['a']=[]
block_hash=[]
del_hash={}
block_rec=0
for a3 in k3[:-1]: # assumption: iterate over the breakpoint record k3[:-1] as in the block_hash loop above; 'a2' is undefined here
if a3 in chromos:
block_hash.append([a3])
else:
block_hash[-1].append(a3)
for a3 in block_hash:
for a4 in range(len(a3)-2):
del_hash[chr(97+block_rec)]=[a3[0],a3[a4+1],a3[a4+2]]
block_rec+=1
if not heta_Del_block==[]:
a_heta=0
heta_Del_new=[heta_Del_block[0]]
while True:
a_heta+=1
if a_heta==len(heta_Del_block):break
if ord(heta_Del_block[a_heta])-ord(heta_Del_block[a_heta-1])==1 and del_hash[heta_Del_block[a_heta]][0]==del_hash[heta_Del_block[a_heta-1]][0]:
heta_Del_new[-1]+=heta_Del_block[a_heta]
else:
heta_Del_new.append(heta_Del_block[a_heta])
for a3 in heta_Del_new:
a4=a3[0]
tra1[SV_ID]['a'].append(['DEL',del_hash[a4][0],int(del_hash[a4][1]),ref_allele[a4][2]])
a4=a3[-1]
tra1[SV_ID]['a'][-1].append(int(del_hash[a4][2])-1)
else:
if not 'a' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['a']=[]
t1=[]
for a3 in k2a:
if not a3=='^':
t1.append(a3)
else:
t1[-1]+=a3
t2=[t1[0]]
for a3 in t1[1:]:
if not '^' in a3 and not '^' in t2[-1] and ord(a3)-ord(t2[-1][-1])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-1]][0]:
t2[-1]+=a3
elif '^' in a3 and '^' in t2[-1] and ord(t2[-1][-2])-ord(a3[0])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-2]][0]:
t2[-1]+=a3
else:
t2.append(a3)
a3='left'
a4=t2[0]
l_chr=bp_hash[a3][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a4:
if not a4[0]==k1[0]:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3][1])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+'['+r_chr+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a4:
tra1[SV_ID]['a'].append([r_chr, bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3][1])+']'])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
for t3 in range(len(t2)-1):
a3=t2[t3]
a4=t2[t3+1]
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a3 and not '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a3 and not '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4[0]][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['+ref_allele[a3[-2]][2]])
elif not '^' in a3 and '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
elif '^' in a3 and '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2], ']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'+ref_allele[a3[-2]][2]])
if len(t2)>1:
a3=t2[t3+1]
else:
a3=t2[0]
a4='right'
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4][0]
if not '^' in a3:
if not a3[-1]==k1[-1]:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['])
if '^' in a3:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['+ref_allele[a3[-2]][2]])
if not k2b==k1.split('/')[1] and del_flag_SA(k1.split('/')[1],k2b)==0:
flag1=0#flag1==0:w/o inversion in the alt structure
if '^' in k2b:
flag1+=1
flag2=0#flag2==0:w/o duplication in the alt structure
for j in k2b:
if k2b.count(j)>1:
flag2+=1
flag3=0 #flag3==0: w/o translocation
if len(k2b)>1:
for i in range(len(k2b)-1):
if not ord(k2b[i+1])>ord(k2b[i]):
flag3+=1
if flag1+flag2+flag3==0:
heta_Del_block=[]
for a1 in k1.split('/')[1]:
if not a1 in k2b:
heta_Del_block.append(a1)
if not 'b' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['b']=[]
block_hash=[]
del_hash={}
block_rec=0
for a3 in k3[:-1]: # assumption: iterate over the breakpoint record k3[:-1] as in the block_hash loop above; 'a2' is undefined here
if a3 in chromos:
block_hash.append([a3])
else:
block_hash[-1].append(a3)
for a3 in block_hash:
for a4 in range(len(a3)-2):
del_hash[chr(97+block_rec)]=[a3[0],a3[a4+1],a3[a4+2]]
block_rec+=1
if not heta_Del_block==[]:
a_heta=0
heta_Del_new=[heta_Del_block[0]]
while True:
a_heta+=1
if a_heta==len(heta_Del_block):break
if ord(heta_Del_block[a_heta])-ord(heta_Del_block[a_heta-1])==1 and del_hash[heta_Del_block[a_heta]][0]==del_hash[heta_Del_block[a_heta-1]][0]:
heta_Del_new[-1]+=heta_Del_block[a_heta]
else:
heta_Del_new.append(heta_Del_block[a_heta])
for a3 in heta_Del_new:
a4=a3[0]
tra1[SV_ID]['b'].append(['DEL',del_hash[a4][0],int(del_hash[a4][1]),ref_allele[a4][2]])
a4=a3[-1]
tra1[SV_ID]['b'][-1].append(int(del_hash[a4][2])-1)
else:
if not 'b' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['b']=[]
t1=[]
for a3 in k2b:
if not a3=='^':
t1.append(a3)
else:
t1[-1]+=a3
t2=[t1[0]]
for a3 in t1[1:]:
if not '^' in a3 and not '^' in t2[-1] and ord(a3)-ord(t2[-1][-1])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-1]][0]:
t2[-1]+=a3
elif '^' in a3 and '^' in t2[-1] and ord(t2[-1][-2])-ord(a3[0])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-2]][0]:
t2[-1]+=a3
else:
t2.append(a3)
a3='left'
a4=t2[0]
l_chr=bp_hash[a3][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a4:
if not a4[0]==k1[0]:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3][1])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+'['+r_chr+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a4:
tra1[SV_ID]['b'].append([r_chr, bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3][1])+']'])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
for t3 in range(len(t2)-1):
a3=t2[t3]
a4=t2[t3+1]
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a3 and not '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a3 and not '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4[0]][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['+ref_allele[a3[-2]][2]])
elif not '^' in a3 and '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
elif '^' in a3 and '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2], ']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'+ref_allele[a3[-2]][2]])
if len(t2)>1:
a3=t2[t3+1]
else:
a3=t2[0]
a4='right'
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4][0]
if not '^' in a3:
if not a3[-1]==k1[-1]:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['])
if '^' in a3:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['+ref_allele[a3[-2]][2]])
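# tra_info_add: same junction-to-breakend conversion as tra_csv_info_add, but for simple events; the
# record ID is suffixed 'S' instead of 'C'.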
def tra_info_add(k1,k2):
for k3 in sv_info[k1][k2]:
SV_ID=';'.join([str(i) for i in k3]+['S'])
if not SV_ID in list(tra1.keys()):
tra1[SV_ID]={}
k2a=k2.split('/')[0]
k2b=k2.split('/')[1]
bp_hash={}
block_rec=0
block_hash=[]
for a3 in k3[:-1]:
if a3 in chromos or not a3.isdigit():
block_hash.append([a3])
else:
block_hash[-1].append(a3)
for a3 in block_hash:
for a4 in range(len(a3)-2):
bp_hash[chr(97+block_rec)]=[a3[0],a3[a4+1],a3[a4+2]]
block_rec+=1
for a3 in list(bp_hash.keys()):
temp=[]
for a4 in bp_hash[a3][1:]:
temp.append(int(a4)-1)
temp.append(int(a4))
bp_hash[a3][1:]=temp
bp_hash['left']=[bp_hash[k1[0]][0],bp_hash[k1[0]][1],bp_hash[k1[0]][2]]
bp_hash['right']=[bp_hash[k1[-1]][0],bp_hash[k1[-1]][3],bp_hash[k1[-1]][4]]
ref_allele={}
for a3 in list(bp_hash.keys()):
ref_allele[a3]=[bp_hash[a3][0]]
for a4 in bp_hash[a3][1:]:
ref_allele[a3].append(ref_base_readin(ref,bp_hash[a3][0],a4))
if not k2a==k1.split('/')[0] and del_flag_SA(k1.split('/')[0],k2a)==0:
flag1=0#flag1==0:w/o inversion in the alt structure
if '^' in k2a:
flag1+=1
flag2=0#flag2==0:w/o duplication in the alt structure
for j in k2a:
if k2a.count(j)>1:
flag2+=1
flag3=0 #flag3==0: w/o translocation
if len(k2a)>1:
for i in range(len(k2a)-1):
if not ord(k2a[i+1])>ord(k2a[i]):
flag3+=1
if flag1+flag2+flag3==0:
heta_Del_block=[]
for a1 in k1.split('/')[0]:
if not a1 in k2a:
heta_Del_block.append(a1)
if not 'a' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['a']=[]
block_hash=[]
del_hash={}
block_rec=0
for a3 in k3[:-1]: # assumption: iterate over the breakpoint record k3[:-1] as in the block_hash loop above; 'a2' is undefined here
if a3 in chromos:
block_hash.append([a3])
else:
block_hash[-1].append(a3)
for a3 in block_hash:
for a4 in range(len(a3)-2):
del_hash[chr(97+block_rec)]=[a3[0],a3[a4+1],a3[a4+2]]
block_rec+=1
if not heta_Del_block==[]:
a_heta=0
heta_Del_new=[heta_Del_block[0]]
while True:
a_heta+=1
if a_heta==len(heta_Del_block):break
if ord(heta_Del_block[a_heta])-ord(heta_Del_block[a_heta-1])==1 and del_hash[heta_Del_block[a_heta]][0]==del_hash[heta_Del_block[a_heta-1]][0]:
heta_Del_new[-1]+=heta_Del_block[a_heta]
else:
heta_Del_new.append(heta_Del_block[a_heta])
for a3 in heta_Del_new:
a4=a3[0]
tra1[SV_ID]['a'].append(['DEL',del_hash[a4][0],int(del_hash[a4][1]),ref_allele[a4][2]])
a4=a3[-1]
tra1[SV_ID]['a'][-1].append(int(del_hash[a4][2])-1)
else:
if not 'a' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['a']=[]
t1=[]
for a3 in k2a:
if not a3=='^':
t1.append(a3)
else:
t1[-1]+=a3
t2=[t1[0]]
for a3 in t1[1:]:
if not '^' in a3 and not '^' in t2[-1] and ord(a3)-ord(t2[-1][-1])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-1]][0]:
t2[-1]+=a3
elif '^' in a3 and '^' in t2[-1] and ord(t2[-1][-2])-ord(a3[0])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-2]][0]:
t2[-1]+=a3
else:
t2.append(a3)
a3='left'
a4=t2[0]
l_chr=bp_hash[a3][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a4:
if not a4[0]==k1[0]:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3][1])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+'['+r_chr+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a4:
tra1[SV_ID]['a'].append([r_chr, bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3][1])+']'])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
for t3 in range(len(t2)-1):
a3=t2[t3]
a4=t2[t3+1]
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a3 and not '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a3 and not '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4[0]][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['+ref_allele[a3[-2]][2]])
elif not '^' in a3 and '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
elif '^' in a3 and '^' in a4:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2], ']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'+ref_allele[a3[-2]][2]])
if len(t2)>1:
a3=t2[t3+1]
else:
a3=t2[0]
a4='right'
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4][0]
if not '^' in a3:
if not a3[-1]==k1[-1]:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['])
if '^' in a3:
tra1[SV_ID]['a'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4][2]])
tra1[SV_ID]['a'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['+ref_allele[a3[-2]][2]])
if not k2b==k1.split('/')[1] and del_flag_SA(k1.split('/')[1],k2b)==0:
flag1=0#flag1==0:w/o inversion in the alt structure
if '^' in k2b:
flag1+=1
flag2=0#flag2==0:w/o duplication in the alt structure
for j in k2b:
if k2b.count(j)>1:
flag2+=1
flag3=0 #flag3==0: w/o translocation
if len(k2b)>1:
for i in range(len(k2b)-1):
if not ord(k2b[i+1])>ord(k2b[i]):
flag3+=1
if flag1+flag2+flag3==0:
heta_Del_block=[]
for a1 in k1.split('/')[1]:
if not a1 in k2b:
heta_Del_block.append(a1)
if not 'b' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['b']=[]
block_hash=[]
del_hash={}
block_rec=0
for a3 in k3[:-1]: # assumption: iterate over the breakpoint record k3[:-1] as in the block_hash loop above; 'a2' is undefined here
if a3 in chromos:
block_hash.append([a3])
else:
block_hash[-1].append(a3)
for a3 in block_hash:
for a4 in range(len(a3)-2):
del_hash[chr(97+block_rec)]=[a3[0],a3[a4+1],a3[a4+2]]
block_rec+=1
if not heta_Del_block==[]:
a_heta=0
heta_Del_new=[heta_Del_block[0]]
while True:
a_heta+=1
if a_heta==len(heta_Del_block):break
if ord(heta_Del_block[a_heta])-ord(heta_Del_block[a_heta-1])==1 and del_hash[heta_Del_block[a_heta]][0]==del_hash[heta_Del_block[a_heta-1]][0]:
heta_Del_new[-1]+=heta_Del_block[a_heta]
else:
heta_Del_new.append(heta_Del_block[a_heta])
for a3 in heta_Del_new:
a4=a3[0]
tra1[SV_ID]['b'].append(['DEL',del_hash[a4][0],int(del_hash[a4][1]),ref_allele[a4][2]])
a4=a3[-1]
tra1[SV_ID]['b'][-1].append(int(del_hash[a4][2])-1)
else:
if not 'b' in list(tra1[SV_ID].keys()):
tra1[SV_ID]['b']=[]
t1=[]
for a3 in k2b:
if not a3=='^':
t1.append(a3)
else:
t1[-1]+=a3
t2=[t1[0]]
for a3 in t1[1:]:
if not '^' in a3 and not '^' in t2[-1] and ord(a3)-ord(t2[-1][-1])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-1]][0]:
t2[-1]+=a3
elif '^' in a3 and '^' in t2[-1] and ord(t2[-1][-2])-ord(a3[0])==1 and bp_hash[a3[0]][0]==bp_hash[t2[-1][-2]][0]:
t2[-1]+=a3
else:
t2.append(a3)
a3='left'
a4=t2[0]
l_chr=bp_hash[a3][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a4:
if not a4[0]==k1[0]:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3][1])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+'['+r_chr+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a4:
tra1[SV_ID]['b'].append([r_chr, bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3][1])+']'])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3][1],ref_allele[a3][1],ref_allele[a3][1]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
for t3 in range(len(t2)-1):
a3=t2[t3]
a4=t2[t3+1]
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4[0]][0]
if not '^' in a3 and not '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4[0]][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['])
elif '^' in a3 and not '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][2],ref_allele[a4[0]][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4[0]][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4[0]][0]+':'+str(bp_hash[a4[0]][2])+'['+ref_allele[a3[-2]][2]])
elif not '^' in a3 and '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'])
elif '^' in a3 and '^' in a4:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4[0]][3],ref_allele[a4[0]][3],ref_allele[a4[0]][3]+'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2], ']'+r_chr+':'+str(bp_hash[a4[0]][3])+']'+ref_allele[a3[-2]][2]])
if len(t2)>1:
a3=t2[t3+1]
else:
a3=t2[0]
a4='right'
l_chr=bp_hash[a3[0]][0]
r_chr=bp_hash[a4][0]
if not '^' in a3:
if not a3[-1]==k1[-1]:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],']'+l_chr+':'+str(bp_hash[a3[-1]][3])+']'+ref_allele[a4][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-1]][3],ref_allele[a3[-1]][3],ref_allele[a3[-1]][3]+'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['])
if '^' in a3:
tra1[SV_ID]['b'].append([r_chr,bp_hash[a4][2],ref_allele[a4][2],'['+l_chr+':'+str(bp_hash[a3[-2]][2])+'['+ref_allele[a4][2]])
tra1[SV_ID]['b'].append([l_chr,bp_hash[a3[-2]][2],ref_allele[a3[-2]][2],'['+bp_hash[a4][0]+':'+str(bp_hash[a4][2])+'['+ref_allele[a3[-2]][2]])
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
Define_Default_SVIntegrate()
if not '--workdir' in list(dict_opts.keys()):
print('Error: please specify working directory using: --workdir')
else:
workdir=path_modify(dict_opts['--workdir'])
if not '--input-path' in list(dict_opts.keys()):
print('Error: please specify path of input .coverge files using --input-path')
else:
if '--input-path' in list(dict_opts.keys()):
if not dict_opts['--input-path'][-1]=='/':
dict_opts['--input-path']+='/'
InputPath=[dict_opts['--input-path']]
else:
InputPath=[]
if os.path.isdir(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1]):
InputPath.append(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1])
print('Reading Result from default path: '+workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1])
else:
print('Error: please specify input path using --input-path')
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
if '--reference' in list(dict_opts.keys()):
ref_file=dict_opts['--reference']
ref_path='/'.join(ref_file.split('/')[:-1])+'/'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index):
print('Error: reference genome not indexed')
else:
if not '--prefix' in list(dict_opts.keys()):
print('Warning: output file name not specified. output file: '+workdir+'Output.vcf')
output_file=workdir+'Output.vcf'
else:
output_file=dict_opts['--prefix']+'.vcf'
time1=time.time()
ref=ref_file
chromos=[]
fin=open(ref_index)
for line in fin:
pin=line.strip().split()
chromos.append(pin[0])
fin.close()
for path2 in InputPath:
sv_info={}
for k3 in os.listdir(path2):
if k3.split('.')[-1]=='coverge':
read_in_structures(path2+k3)
sv_info=sv_info_score_modify(sv_info)
sv_type_record=SV_Info_Write_svelter(sv_info)
dup1={}
disperse_dup={}
inv1={}
del1={}
tra1={}
sv_rec_2(sv_info)
dup1=dup_collaps(dup1)
sv_out={}
hash_reorder()
hash_collaps()
hash_collaps2()
hash_collaps3()
write_VCF_header(output_file,time,workdir)
write_VCF_main(output_file,sv_out,chromos,ref,sv_type_record)
time2=time.time()
print('SVIntegrate Complete !')
print('Time Consuming: '+str(time2-time1))
if not function_name in ['BPSearch_Predefined','PredefinedBP','Setup','NullModel','BPSearch','BPIntegrate','SVPredict','SVIntegrate','SVIntegrate_vcf4.1','Clean','GenoTyper']:
import glob
import getopt
opts,args=getopt.getopt(sys.argv[1:],'o:h:S:',['deterministic-flag=','help=','long-insert=','prefix=','batch=','sample=','workdir=','reference=','chromosome=','exclude=','copyneutral=','ploidy=','svelter-path=','input-path=','null-model=','null-copyneutral-length=','null-copyneutral-perc=','null-random-length=','null-random-num=','null-random-length=','null-random-num=','qc-align=','qc-split=','qc-structure=','qc-map-tool=','qc-map-file=','split-min-len=','read-length=','keep-temp-files=','keep-temp-figs=','bp-file=','num-iteration=','keep-interval-files='])
dict_opts=dict(opts)
if dict_opts=={} or list(dict_opts.keys())==['-h'] or list(dict_opts.keys())==['--help']:
readme.print_default_parameters()
else:
def Code_Files_Define():
global Code_File
global Code0_Function
global Code1_Function
global Code2_Function
global Code3_Function
global Code4_Function
global Code5_Function
global RCode_Path
global Code1a_file
global Code1d_file
global Code1d2_file
Code_File=script_name
Code0_Function='Setup'
Code1_Function='NullModel'
Code2_Function='BPSearch'
Code3_Function='BPIntegrate'
Code4_Function='SVPredict'
Code5_Function='SVIntegrate'
RCode_Path=workdir+'reference_SVelter/'
Code1a_file=RCode_Path+'SVelter1.NullModel.Figure.a.r'
Code1d_file=RCode_Path+'SVelter1.NullModel.Figure.b.r'
Code1d2_file=RCode_Path+'SVelter1.NullModel.Figure.c.r'
def check_scripts(Code_path):
flag=0
out=[]
Code0_file=Code_path+'SVelter0.Ref.Setup.py'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter1.NullModel.py'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter1.NullModel.Figure.a.r'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter1.NullModel.Figure.b.r'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter1.NullModel.Figure.c.r'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter2.BP.Searching.py'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter3.BPIntegrate.py'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter4.StructureResolvation.py'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
Code0_file=Code_path+'SVelter5.result.integrate.py'
if not os.path.isfile(Code0_file):
flag+=1
out.append(Code0_file)
return out
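# Define_Default_AllInOne: read the optional command-line flags (deterministic-flag, core, null-model,
# qc-align, qc-split, split-min-len, keep-temp-files/figs, num-iteration, ploidy, -S) into globals,
# falling back to built-in defaults when a flag is absent.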
def Define_Default_AllInOne():
global deterministic_flag
deterministic_flag=0
if '--deterministic-flag' in list(dict_opts.keys()):
deterministic_flag=int(dict_opts['--deterministic-flag'])
if '--core' in list(dict_opts.keys()):
global pool
pool = Pool(processes=int(dict_opts['--core']))
global model_comp
if not '--null-model' in list(dict_opts.keys()):
model_comp='C'
else:
if dict_opts['--null-model'] in ['S','Simple']:
model_comp='S'
else:
model_comp='C'
global QCAlign
if '--qc-align' in list(dict_opts.keys()):
QCAlign=int(dict_opts['--qc-align'])
else:
QCAlign=20
global QCSplit
if '--qc-split' in list(dict_opts.keys()):
QCSplit=int(dict_opts['--qc-split'])
else:
QCSplit=20
global NullSplitLen_perc
if '--split-min-len' in list(dict_opts.keys()):
NullSplitLen_perc=int(dict_opts['--split-min-len'])
else:
NullSplitLen_perc=0.9
global KeepFile
if '--keep-temp-files' in list(dict_opts.keys()):
KeepFile=dict_opts['--keep-temp-files']
else:
KeepFile='No'
global KeepFigure
if '--keep-temp-figs' in list(dict_opts.keys()):
KeepFigure=dict_opts['--keep-temp-figs']
else:
KeepFigure='No'
global Trail_Number
if '--num-iteration' in list(dict_opts.keys()):
Trail_Number=int(dict_opts['--num-iteration'])
else:
Trail_Number=10000
global Ploidy
if '--ploidy' in list(dict_opts.keys()):
Ploidy=int(dict_opts['--ploidy'])
else:
Ploidy=2
global ILCff_STD_Time
if '-S' in list(dict_opts.keys()):
ILCff_STD_Time=int(dict_opts['-S'])
else:
ILCff_STD_Time=3
def run_SVelter0_chrom(chrom_name):
os.system(r'''%s --workdir %s --ref %s --ex %s --sample %s --chr %s'''%(Code0_file,workdir,ref_file,ex_file,sin_bam_file,chrom_name))
def run_SVelter1_chrom(sin_bam_file):
os.system(r'''%s %s --keep-temp-files %s --keep-temp-figs %s --null-model %s --workdir %s --sample %s'''%(Code_File,Code1_Function,KeepFile,KeepFigure,model_comp,workdir,sin_bam_file))
def run_SVelter1_Single_chrom(sin_bam_file,chromos_single):
os.system(r'''%s %s --keep-temp-files %s --keep-temp-figs %s --null-model %s --workdir %s --sample %s --chromosome %s'''%(Code_File,Code1_Function,KeepFile,KeepFigure,model_comp,workdir,sin_bam_file,chromos_single))
def run_SVelter2_chrom(chrom_name,sin_bam_file,ILCff_STD_Time):
os.system(r'''%s %s --chromosome %s --workdir %s --sample %s --null-model %s -S %s'''%(Code_File,Code2_Function,chrom_name,workdir,sin_bam_file,model_comp,ILCff_STD_Time))
print(chrom_name+' done!')
def run_SVelter3_chrom(sin_bam_file):
os.system(r'''%s %s --batch %s --workdir %s --sample %s'''%(Code_File,Code3_Function,dict_opts['--batch'],workdir,sin_bam_file))
def run_SVelter4_chrom(txt_name,sin_bam_file):
os.system(r'''%s %s --workdir %s --bp-file %s --sample %s --num-iteration %s --ploidy %s --null-model %s --deterministic-flag %s'''%(Code_File,Code4_Function,workdir,txt_name,sin_bam_file,str(Trail_Number),str(Ploidy),model_comp,deterministic_flag))
print(txt_name+' done!')
def run_SVelter5_chrom(path2,out_vcf):
os.system(r'''%s %s --workdir %s --input-path %s --prefix %s'''%(Code_File,Code5_Function,workdir,path2,out_vcf))
def SamplingPercentage_read_in():
global SamplingPercentage
if '--null-copyneutral-perc' in list(dict_opts.keys()):
SamplingPercentage=float(dict_opts['--null-copyneutral-perc'])
else:
SamplingPercentage=0.001
def clean_path(path):
if os.path.isdir(path):
os.system(r'''rm -r %s'''%(path))
def global_para_declaration_all():
global whole_genome
global len_genome
import numpy
import scipy
import math
from math import sqrt,pi,exp
from scipy.stats import norm
import random
import pickle
import time
import datetime
import itertools
Define_Default_AllInOne()
global_para_declaration_all()
if not '--workdir' in list(dict_opts.keys()):
print('Error: please specify working directory using: --workdir')
else:
workdir=path_modify(dict_opts['--workdir'])
if not os.path.isdir(workdir):
print('Error: working directory does not exist!')
Code_Files_Define()
if not '--sample' in list(dict_opts.keys()) and not '--samplePath' in list(dict_opts.keys()):
print('Error: please specify input file using --sample')
else:
if '--sample' in list(dict_opts.keys()):
bam_path='/'.join(dict_opts['--sample'].split('/')[:-1])+'/'
bam_files=[dict_opts['--sample']]
bam_files_appdix=dict_opts['--sample'].split('.')[-1]
else:
bam_path=path_modify(dict_opts['--samplePath'])
bam_files=[]
for file in os.listdir(bam_path):
if file.split('.')[-1]==bam_files_appdix:
bam_files.append(bam_path+file)
ref_path=workdir+'reference_SVelter/'
ref_file=ref_path+'genome.fa'
ref_index=ref_file+'.fai'
if not os.path.isfile(ref_index):
print('Error: reference genome not indexed ')
else:
[whole_genome,len_genome]=calculate_len_genome(ref_file)
chromos=list(whole_genome.keys())
chr_name_check=0
fin=open(ref_index)
chr_ref_check=[]
for line in fin:
pin=line.strip().split()
chr_ref_check.append(pin[0])
fin.close()
for filein_bam in bam_files:
chr_bam_check=[]
fin=os.popen(r'''samtools view -H %s'''%(filein_bam))
for line in fin:
pin=line.strip().split()
if pin[0]=='@SQ':
chr_bam_check.append(pin[1].split(':')[1])
fin.close()
if not chr_ref_check==chr_bam_check:
print('Warning: please make sure the reference file matches the sample file')
chr_flag=0
if 'chr' in chr_ref_check[0]:
chr_flag=1
SamplingPercentage_read_in()
cn2_file=cn2_file_read_in(dict_opts,workdir)
ex_file=ex_file_read_in(dict_opts,workdir)
cn2_length=int(cn2_length_readin(dict_opts))
Gap_Refs=[ex_file]
if not os.path.isfile(cn2_file):
print('Error: CN2 file not correctly setup!')
if not os.path.isfile(ex_file):
random_produce_exclude_region(ex_file,chromos)
if '--prefix' in list(dict_opts.keys()):
out_vcf=dict_opts['--prefix']+'.vcf'
out_svelter=dict_opts['--prefix']+'.svelter'
else:
#out_vcf=workdir+dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'.vcf')
#out_svelter=workdir+dict_opts['--sample'].split('/')[-1].replace('.'+bam_files_appdix,'.svelter')
out_vcf=workdir+'.'.join(dict_opts['--sample'].split('/')[-1].split('.')[:-1])+'.vcf'
out_svelter=workdir+'.'.join(dict_opts['--sample'].split('/')[-1].split('.')[:-1])+'.svelter'
print('Warning: output file is not specified')
print('output file: '+out_vcf)
print('output file: '+out_svelter)
temp_inter_replace=0
if '--chromosome' in list(dict_opts.keys()):
chrom_single=dict_opts['--chromosome']
if not chrom_single in chromos:
print('Error: please make sure the chromosome defined by --chromosome is correct based on the reference genome')
chromos=[]
else:
chromos=[chrom_single]
for sin_bam_file in bam_files:
running_time=[]
print(' ')
print('Step1: Building the null model for '+sin_bam_file+' ...')
time1=time.time()
if len(chromos)>1:
run_SVelter1_chrom(sin_bam_file)
elif len(chromos)==1:
run_SVelter1_Single_chrom(sin_bam_file,chromos[0])
time2=time.time()
running_time.append(time2-time1)
print('Null model built for '+sin_bam_file)
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step2: Searching for BreakPoints of sample '+sin_bam_file+' ...')
time1=time.time()
for x in chromos:
print(x)
run_SVelter2_chrom(x,sin_bam_file,ILCff_STD_Time)
time2=time.time()
running_time.append(time2-time1)
print('Breakpoint search done for sample: '+sin_bam_file)
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step3: Integrating breakpoints ... ')
if not '--batch' in list(dict_opts.keys()):
dict_opts['--batch']='0'
time1=time.time()
run_SVelter3_chrom(sin_bam_file)
time2=time.time()
running_time.append(time2-time1)
print('Breakpoint clustering done for sample: '+sin_bam_file)
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
print('Step4: Resolving structure ... ')
time1=time.time()
for k1 in os.listdir(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1]+'/'):
if k1.split('.')[-1]=='txt':
run_SVelter4_chrom(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1]+'/'+k1,sin_bam_file)
time2=time.time()
running_time.append(time2-time1)
print('Structure resolved !')
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print(' ')
time1=time.time()
run_SVelter5_chrom(workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1]+'/','.'.join(out_vcf.split('.')[:-1]))
time2=time.time()
running_time.append(time2-time1)
if temp_inter_replace==0:
print(out_vcf+' completed! ')
print('Time Consuming: '+str(datetime.timedelta(seconds=(time2-time1))))
print('Total Running Time:'+' '.join([str(i) for i in running_time]))
#if os.path.isfile(out_vcf):
NullPath=workdir+'NullModel.'+dict_opts['--sample'].split('/')[-1]
BPPath=workdir+'BreakPoints.'+dict_opts['--sample'].split('/')[-1]
TXTPath=workdir+'bp_files.'+dict_opts['--sample'].split('/')[-1]
if not '--keep-interval-files' in list(dict_opts.keys()):
clean_path(NullPath)
clean_path(BPPath)
clean_path(TXTPath)
elif dict_opts['--keep-interval-files']=='FALSE':
clean_path(NullPath)
clean_path(BPPath)
clean_path(TXTPath)
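
# clean_path() is defined earlier in the SVelter script; the following is only a hedged
# stand-in (hence the different name) that is consistent with how it is used above,
# i.e. removing an intermediate working directory unless --keep-interval-files is set.
import shutil

def clean_path_sketch(path):
    """Remove an intermediate working directory if it exists (illustrative stand-in only)."""
    if os.path.isdir(path):
        shutil.rmtree(path)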
| 73.521131 | 592 | 0.407851 | 89,731 | 808,953 | 3.433228 | 0.01607 | 0.012676 | 0.01116 | 0.00693 | 0.864692 | 0.836149 | 0.811421 | 0.793895 | 0.7796 | 0.763266 | 0 | 0.049146 | 0.470859 | 808,953 | 11,002 | 593 | 73.527813 | 0.670553 | 0.021899 | 0 | 0.7458 | 0 | 0.002506 | 0.040969 | 0.004221 | 0.001114 | 0 | 0 | 0 | 0 | 1 | 0.024594 | false | 0.001856 | 0.011694 | 0.000093 | 0.055777 | 0.023295 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b99e9a8b90194336284ecd79970cdb127cbc6039 | 2,003 | py | Python | python/check_list_items_similar.py | codevscolor/codevscolor | 35ef9042bdc86f45ef87795c35963b75fb64d5d7 | [
"Apache-2.0"
] | 6 | 2019-04-26T03:11:54.000Z | 2021-05-07T21:48:29.000Z | python/check_list_items_similar.py | akojif/codevscolor | 56db3dffeac8f8d76ff8fcf5656770f33765941f | [
"Apache-2.0"
] | null | null | null | python/check_list_items_similar.py | akojif/codevscolor | 56db3dffeac8f8d76ff8fcf5656770f33765941f | [
"Apache-2.0"
] | 26 | 2019-02-23T14:50:46.000Z | 2022-02-04T23:44:24.000Z | #example 1:
def is_all_items_same(input_list):
    first_element = input_list[0]
    for element in input_list:
        if element != first_element:
            return False
    return True

first_list = [1,1,1,1,1,1,1,1,2,1,1,1,1]
second_list = ["one","one","one","one","one","one","one","one","one"]

if is_all_items_same(first_list):
    print("all first_list items are the same")
else:
    print("first_list items are not all the same")

if is_all_items_same(second_list):
    print("all second_list items are the same")
else:
    print("second_list items are not all the same")

#example 2:
def is_all_items_same(input_list):
    return input_list.count(input_list[0]) == len(input_list)

first_list = [1,1,1,1,1,1,1,1,2,1,1,1,1]
second_list = ["one","one","one","one","one","one","one","one","one"]

if is_all_items_same(first_list):
    print("all first_list items are the same")
else:
    print("first_list items are not all the same")

if is_all_items_same(second_list):
    print("all second_list items are the same")
else:
    print("second_list items are not all the same")

#example 3:
def is_all_items_same(input_list):
    return len(set(input_list)) == 1

first_list = [1,1,1,1,1,1,1,1,2,1,1,1,1]
second_list = ["one","one","one","one","one","one","one","one","one"]

if is_all_items_same(first_list):
    print("all first_list items are the same")
else:
    print("first_list items are not all the same")

if is_all_items_same(second_list):
    print("all second_list items are the same")
else:
    print("second_list items are not all the same")

#example 4:
def is_all_items_same(input_list):
    return all(value == input_list[0] for value in input_list)

first_list = [1,1,1,1,1,1,1,1,2,1,1,1,1]
second_list = ["one","one","one","one","one","one","one","one","one"]

if is_all_items_same(first_list):
    print("all first_list items are the same")
else:
    print("first_list items are not all the same")

if is_all_items_same(second_list):
    print("all second_list items are the same")
else:
    print("second_list items are not all the same")
| 32.836066 | 69 | 0.691962 | 353 | 2,003 | 3.694051 | 0.087819 | 0.06135 | 0.07362 | 0.07362 | 0.871933 | 0.871933 | 0.871933 | 0.85046 | 0.772239 | 0.772239 | 0 | 0.03517 | 0.148278 | 2,003 | 60 | 70 | 33.383333 | 0.729191 | 0.01997 | 0 | 0.846154 | 0 | 0 | 0.296069 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0 | 0.057692 | 0.173077 | 0.307692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b9a091b936573cbdbd3198d96a834527edd7892f | 18,988 | py | Python | test/test_car_statemachine.py | khelsabeck/easy_patterns | c4a9b2409f94599ee6a2960b32bc52cdb55712fa | [
"MIT"
] | null | null | null | test/test_car_statemachine.py | khelsabeck/easy_patterns | c4a9b2409f94599ee6a2960b32bc52cdb55712fa | [
"MIT"
] | null | null | null | test/test_car_statemachine.py | khelsabeck/easy_patterns | c4a9b2409f94599ee6a2960b32bc52cdb55712fa | [
"MIT"
] | null | null | null | import pytest
from src.car_statemachine import CarState, Braking, Driving, Coasting, Car_ErrorState, Car
from src.trafficlight_statemachine import Green, Red, Yellow, ErrorState, TrafficLight
def test_base_state_car():
'''This should test that the base class is abstract and trying to instantiate it yields an error with known message.'''
with pytest.raises(Exception) as exc_info:
state = CarState() # This should raise an exception
exception_raised = exc_info.value
assert type(TypeError()) == type(exception_raised)
assert "Can't instantiate abstract class CarState with abstract methods on_event" in str(exc_info.__dict__)
def test_base_state_abstractmethod_car():
'''This should test that the instantiating a child of the base class without an abstract method fails.'''
with pytest.raises(Exception) as exc_info:
class TestState(CarState):
pass
state = TestState() # This should raise an exception because there is no implementation of the on_event method
exception_raised = exc_info.value
assert type(TypeError()) == type(exception_raised)
assert "Can't instantiate abstract class TestState with abstract methods on_event" in str(exc_info.__dict__)
def test_braking_state_type():
'''This tests the braking state. Expectation: It should be a State--Braking type with str "Braking.'''
state = Braking()
assert type(Braking()) == type(state)
assert "Braking" == str(state)
assert "Braking" == repr(state)
def test_driving_state_type():
'''This tests the driving state. Expectation: It should be a State--Driving type and its str should be "Driving".'''
state = Driving()
assert type(Driving()) == type(state)
assert "Driving" == str(state)
assert "Driving" == repr(state)
def test_coasting_state_type():
'''This tests the coasting state. Expectation: It should be a State--Coasting type with an str value of "Coasting".'''
state = Coasting()
assert type(Coasting()) == type(state)
assert "Coasting" == str(state)
assert "Coasting" == repr(state)
def test_error_state_type():
'''This tests the error state. Expectation: It should be a State--Car_ErrorState type and the str should be "Car_ErrorState.'''
state = Car_ErrorState()
assert type(Car_ErrorState()) == type(state)
assert "Car_ErrorState" == str(state)
assert "Car_ErrorState" == repr(state)
def test_braking_transition_green():
'''This tests the braking transition logic with a valid light state as a parameter (green-->driving).'''
state = Braking()
light = TrafficLight()
light.on_event("change") # red to green
new_state = state.on_event(light)
assert "Driving" == str(new_state)
def test_braking_transition_yellow():
'''This tests the braking transition logic with a valid light state as a parameter (yellow-->coasting).'''
state = Braking()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
new_state = state.on_event(light)
assert "Coasting" == str(new_state)
def test_braking_transition_red():
'''This tests the braking transition logic with a valid light state as a parameter (red-->braking).'''
state = Braking()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("change") # to red
new_state = state.on_event(light)
assert "Braking" == str(new_state)
def test_braking_transition_error():
'''This tests the braking transition logic with a light in the error state (ErrorState-->Car_ErrorState).'''
state = Braking()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("bad input") # to ErrorState
new_state = state.on_event(light)
assert "Car_ErrorState" == str(new_state)
def test_braking_transition_bad_input():
'''This tests the braking transition logic with a light in the error state (ErrorState-->Car_ErrorState).'''
state = Braking()
light = TrafficLight()
new_state = state.on_event("bad data to car")
assert "Car_ErrorState" == str(new_state)
def test_coasting_transition_green():
'''This tests the coasting transition logic with a valid light state as a parameter (green-->driving).'''
state = Coasting()
light = TrafficLight()
light.on_event("change") # red to green
new_state = state.on_event(light)
assert "Driving" == str(new_state)
def test_coasting_transition_yellow():
'''This tests the coasting transition logic with a valid light state as a parameter (yellow-->coasting).'''
state = Coasting()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
new_state = state.on_event(light)
assert "Coasting" == str(new_state)
def test_coasting_transition_red():
'''This tests the coasting transition logic with a valid light state as a parameter (red-->braking).'''
state = Coasting()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("change") # to red
new_state = state.on_event(light)
assert "Braking" == str(new_state)
def test_coasting_transition_error():
'''This tests the coasting transition logic with a light in the error state (ErrorState-->Car_ErrorState).'''
state = Coasting()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("bad input") # to ErrorState
new_state = state.on_event(light)
assert "Car_ErrorState" == str(new_state)
def test_coasting_transition_bad_input():
'''This tests the coasting transition logic with invalid data.'''
state = Coasting()
new_state = state.on_event("invalid data to car")
assert "Car_ErrorState" == str(new_state)
def test_driving_transition_green():
'''This tests the driving transition logic with a valid light state as a parameter (green-->driving).'''
state = Driving()
light = TrafficLight()
light.on_event("change") # red to green
new_state = state.on_event(light)
assert "Driving" == str(new_state)
def test_driving_transition_yellow():
'''This tests the driving transition logic with a valid light state as a parameter (yellow-->coasting).'''
state = Driving()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
new_state = state.on_event(light)
assert "Coasting" == str(new_state)
def test_driving_transition_red():
'''This tests the driving transition logic with a valid light state as a parameter (red-->braking).'''
state = Driving()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("change") # to red
new_state = state.on_event(light)
assert "Braking" == str(new_state)
def test_driving_transition_error():
'''This tests the driving transition logic with a light in the error state (ErrorState-->Car_ErrorState).'''
state = Driving()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("bad input") # to ErrorState
new_state = state.on_event(light)
assert "Car_ErrorState" == str(new_state)
def test_driving_transition_bad_input():
'''This tests the driving transition logic with invalid data.'''
state = Driving()
new_state = state.on_event("invalid data to car")
assert "Car_ErrorState" == str(new_state)
def test_error_transition_green():
'''This tests the Car_ErrorState transition logic with a valid light state as a parameter (green-->driving).'''
state = Car_ErrorState()
light = TrafficLight()
light.on_event("change") # red to green
new_state = state.on_event(light)
assert "Driving" == str(new_state)
def test_error_transition_yellow():
'''This tests the Car_ErrorState transition logic with a valid light state as a parameter (yellow-->coasting).'''
state = Car_ErrorState()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
new_state = state.on_event(light)
assert "Coasting" == str(new_state)
def test_error_transition_red():
'''This tests the Car_ErrorState transition logic with a valid light state as a parameter (red-->braking).'''
state = Car_ErrorState()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("change") # to red
new_state = state.on_event(light)
assert "Braking" == str(new_state)
def test_error_transition_error():
'''This tests the coasting transition logic with a light in the error state (ErrorState-->Car_ErrorState).'''
state = Car_ErrorState()
light = TrafficLight()
light.on_event("change") # red to green
light.on_event("change") # to yellow
light.on_event("bad input") # to ErrorState
new_state = state.on_event(light)
assert "Car_ErrorState" == str(new_state)
def test_error_transition_bad_input():
'''This tests the coasting transition logic with invalid data.'''
state = Car_ErrorState()
new_state = state.on_event("invalid data to car")
assert "Car_ErrorState" == str(new_state)
def test_transition_car_braking_to_driving():
'''This tests the transitions with a car from braking to driving.'''
car = Car()
light = TrafficLight()
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
light.on_event("change")
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
def test_transition_car_braking_to_coasting():
'''This tests the transitions with a car from braking to coasting.'''
car = Car()
light = TrafficLight()
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
light.on_event("change")
light.on_event("change")
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
def test_transition_car_braking_to_braking():
'''This tests the transitions with a car from braking to braking.'''
car = Car()
light = TrafficLight()
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
car.on_event(light)
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
def test_transition_car_braking_to_error_bad_light_state():
'''This tests the transitions with a car from braking to Car_ErrorState due to a light in ErrorState.'''
car = Car()
light = TrafficLight()
light.on_event("bad input for light")
car.on_event(light)
assert type(Car_ErrorState()) == type(car.state)
assert type(ErrorState()) == type(light.state)
def test_transition_car_braking_to_error_bad_car_state():
'''This tests the transitions with a car from braking to Car_ErrorState due to invalid input for the car.'''
car = Car()
light = TrafficLight()
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
car.on_event(light) #car is braking
assert type(Braking()) == type(car.state)
car.on_event("bad data creating error")
assert type(Car_ErrorState()) == type(car.state)
def test_transition_car_driving_to_driving():
'''This tests the transitions with a car from driving to driving.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
def test_transition_car_driving_to_coasting():
'''This tests the transitions with a car from driving to coasting.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
def test_transition_car_driving_to_braking():
'''This tests the transitions with a car from driving to braking.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
light.on_event("change") # yellow
light.on_event("change") # red
car.on_event(light)
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
def test_transition_car_driving_to_error_bad_light_state():
'''This tests the transitions with a car from driving to Car_ErrorState due to a light in ErrorState.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state) #now driving
assert type(Green()) == type(light.state)
light.on_event("bad input for light")
car.on_event(light)
assert type(Car_ErrorState()) == type(car.state)
assert type(ErrorState()) == type(light.state)
def test_transition_car_driving_to_error_bad_car_state():
'''This tests the transitions with a car from driving to Car_ErrorState due to invalid input for the car.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state) #now driving
assert type(Green()) == type(light.state)
car.on_event("bad data creating error")
assert type(Car_ErrorState()) == type(car.state)
def test_transition_car_coasting_to_driving():
'''This tests the transitions with a car from coasting to driving.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
light.on_event("change") # red
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
def test_transition_car_coasting_to_coasting():
'''This tests the transitions with a car from coasting to coasting.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
def test_transition_car_coasting_to_braking():
'''This tests the transitions with a car from coasting to braking.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
light.on_event("change") # red
car.on_event(light)
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
def test_transition_car_coasting_to_error_bad_light_state():
'''This tests the transitions with a car from coasting to Car_ErrorState due to a light in ErrorState.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
light.on_event("bad input for light")
car.on_event(light)
assert type(Car_ErrorState()) == type(car.state)
assert type(ErrorState()) == type(light.state)
def test_transition_car_coasting_to_error_bad_car_state():
'''This tests the transitions with a car from coasting to Car_ErrorState due to invalid input for the car.'''
car = Car()
light = TrafficLight()
light.on_event("change") # green
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
car.on_event("bad data creating error")
assert type(Car_ErrorState()) == type(car.state)
def test_transition_car_error_to_driving():
'''This tests the transitions with a car from error to driving.'''
car = Car()
light = TrafficLight()
car.on_event("throwing error")
light.on_event("change") # green
car.on_event(light)
assert type(Driving()) == type(car.state)
assert type(Green()) == type(light.state)
def test_transition_car_error_to_coasting():
'''This tests the transitions with a car from error to coasting.'''
car = Car()
light = TrafficLight()
car.on_event("throwing error")
light.on_event("change") # green
light.on_event("change") # yellow
car.on_event(light)
assert type(Coasting()) == type(car.state)
assert type(Yellow()) == type(light.state)
def test_transition_car_error_to_braking():
'''This tests the transitions with a car from error to braking.'''
car = Car()
light = TrafficLight()
car.on_event("throwing error")
car.on_event(light)
assert type(Braking()) == type(car.state)
assert type(Red()) == type(light.state)
def test_transition_car_error_to_error_bad_light_state():
'''This tests the transitions with a car from error to Car_ErrorState due to a light in ErrorState.'''
car = Car()
light = TrafficLight()
car.on_event("throwing error")
assert type(Car_ErrorState()) == type(car.state)
light.on_event("bad input for light")
car.on_event(light)
assert type(Car_ErrorState()) == type(car.state)
assert type(ErrorState()) == type(light.state)
def test_transition_car_error_to_error_bad_car_state():
'''This tests the transitions with a car from error to Car_ErrorState due to invalid input for the car.'''
car = Car()
light = TrafficLight()
car.on_event("throwing error")
assert type(Car_ErrorState()) == type(car.state)
car.on_event("bad data creating error")
assert type(Car_ErrorState()) == type(car.state)
##############################################################################################################################
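# A hedged sketch (NOT the actual src.car_statemachine module) that mirrors the transition
# behaviour the tests above assert: the car starts in Braking, car.on_event(traffic_light)
# maps the light's state onto Driving/Coasting/Braking, and any unrecognised input lands in
# the error state. Names carry a Sketch suffix to make clear this is illustrative only.
from abc import ABC, abstractmethod

class CarStateSketch(ABC):
    @abstractmethod
    def on_event(self, event):
        """Return the next car state for the given event."""

    def __repr__(self):
        return type(self).__name__

    __str__ = __repr__

def _next_state_for(light):
    # Map the light's current state name onto the car state the tests expect; bad input
    # (no .state attribute, or a light in ErrorState) falls through to the error state.
    mapping = {'Green': DrivingSketch, 'Yellow': CoastingSketch, 'Red': BrakingSketch}
    return mapping.get(str(getattr(light, 'state', None)), CarErrorStateSketch)()

class BrakingSketch(CarStateSketch):
    def on_event(self, event):
        return _next_state_for(event)

class DrivingSketch(CarStateSketch):
    def on_event(self, event):
        return _next_state_for(event)

class CoastingSketch(CarStateSketch):
    def on_event(self, event):
        return _next_state_for(event)

class CarErrorStateSketch(CarStateSketch):
    def on_event(self, event):
        return _next_state_for(event)

class CarSketch:
    def __init__(self):
        self.state = BrakingSketch()

    def on_event(self, event):
        self.state = self.state.on_event(event)
        return self.state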
| 41.18872 | 131 | 0.674057 | 2,554 | 18,988 | 4.839468 | 0.036022 | 0.071359 | 0.065049 | 0.085922 | 0.922816 | 0.894984 | 0.882767 | 0.844579 | 0.843042 | 0.834951 | 0 | 0 | 0.198862 | 18,988 | 460 | 132 | 41.278261 | 0.812516 | 0.2579 | 0 | 0.824176 | 0 | 0 | 0.081718 | 0 | 0 | 0 | 0 | 0 | 0.282967 | 1 | 0.126374 | false | 0.002747 | 0.008242 | 0 | 0.137363 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b9d692d2729f44056cf86e811470cab0771fed49 | 6,125 | py | Python | vgio/quake2/__init__.py | joshuaskelly/game-tools | e71bcf4ef6553adf0b51f4379f72bc5a82a60176 | [
"MIT"
] | 22 | 2017-11-30T22:13:50.000Z | 2019-12-19T17:56:40.000Z | vgio/quake2/__init__.py | joshuaskelly/vgio | e71bcf4ef6553adf0b51f4379f72bc5a82a60176 | [
"MIT"
] | 22 | 2019-08-11T05:07:26.000Z | 2020-12-30T16:07:04.000Z | vgio/quake2/__init__.py | joshuaskelly/game-tools | e71bcf4ef6553adf0b51f4379f72bc5a82a60176 | [
"MIT"
] | 4 | 2018-06-24T14:04:36.000Z | 2019-05-14T06:01:51.000Z | __version__ = '0.2.1'
anorms = (
(-0.525731, 0.000000, 0.850651),
(-0.442863, 0.238856, 0.864188),
(-0.295242, 0.000000, 0.955423),
(-0.309017, 0.500000, 0.809017),
(-0.162460, 0.262866, 0.951056),
(0.000000, 0.000000, 1.000000),
(0.000000, 0.850651, 0.525731),
(-0.147621, 0.716567, 0.681718),
(0.147621, 0.716567, 0.681718),
(0.000000, 0.525731, 0.850651),
(0.309017, 0.500000, 0.809017),
(0.525731, 0.000000, 0.850651),
(0.295242, 0.000000, 0.955423),
(0.442863, 0.238856, 0.864188),
(0.162460, 0.262866, 0.951056),
(-0.681718, 0.147621, 0.716567),
(-0.809017, 0.309017, 0.500000),
(-0.587785, 0.425325, 0.688191),
(-0.850651, 0.525731, 0.000000),
(-0.864188, 0.442863, 0.238856),
(-0.716567, 0.681718, 0.147621),
(-0.688191, 0.587785, 0.425325),
(-0.500000, 0.809017, 0.309017),
(-0.238856, 0.864188, 0.442863),
(-0.425325, 0.688191, 0.587785),
(-0.716567, 0.681718, -0.147621),
(-0.500000, 0.809017, -0.309017),
(-0.525731, 0.850651, 0.000000),
(0.000000, 0.850651, -0.525731),
(-0.238856, 0.864188, -0.442863),
(0.000000, 0.955423, -0.295242),
(-0.262866, 0.951056, -0.162460),
(0.000000, 1.000000, 0.000000),
(0.000000, 0.955423, 0.295242),
(-0.262866, 0.951056, 0.162460),
(0.238856, 0.864188, 0.442863),
(0.262866, 0.951056, 0.162460),
(0.500000, 0.809017, 0.309017),
(0.238856, 0.864188, -0.442863),
(0.262866, 0.951056, -0.162460),
(0.500000, 0.809017, -0.309017),
(0.850651, 0.525731, 0.000000),
(0.716567, 0.681718, 0.147621),
(0.716567, 0.681718, -0.147621),
(0.525731, 0.850651, 0.000000),
(0.425325, 0.688191, 0.587785),
(0.864188, 0.442863, 0.238856),
(0.688191, 0.587785, 0.425325),
(0.809017, 0.309017, 0.500000),
(0.681718, 0.147621, 0.716567),
(0.587785, 0.425325, 0.688191),
(0.955423, 0.295242, 0.000000),
(1.000000, 0.000000, 0.000000),
(0.951056, 0.162460, 0.262866),
(0.850651, -0.525731, 0.000000),
(0.955423, -0.295242, 0.000000),
(0.864188, -0.442863, 0.238856),
(0.951056, -0.162460, 0.262866),
(0.809017, -0.309017, 0.500000),
(0.681718, -0.147621, 0.716567),
(0.850651, 0.000000, 0.525731),
(0.864188, 0.442863, -0.238856),
(0.809017, 0.309017, -0.500000),
(0.951056, 0.162460, -0.262866),
(0.525731, 0.000000, -0.850651),
(0.681718, 0.147621, -0.716567),
(0.681718, -0.147621, -0.716567),
(0.850651, 0.000000, -0.525731),
(0.809017, -0.309017, -0.500000),
(0.864188, -0.442863, -0.238856),
(0.951056, -0.162460, -0.262866),
(0.147621, 0.716567, -0.681718),
(0.309017, 0.500000, -0.809017),
(0.425325, 0.688191, -0.587785),
(0.442863, 0.238856, -0.864188),
(0.587785, 0.425325, -0.688191),
(0.688191, 0.587785, -0.425325),
(-0.147621, 0.716567, -0.681718),
(-0.309017, 0.500000, -0.809017),
(0.000000, 0.525731, -0.850651),
(-0.525731, 0.000000, -0.850651),
(-0.442863, 0.238856, -0.864188),
(-0.295242, 0.000000, -0.955423),
(-0.162460, 0.262866, -0.951056),
(0.000000, 0.000000, -1.000000),
(0.295242, 0.000000, -0.955423),
(0.162460, 0.262866, -0.951056),
(-0.442863, -0.238856, -0.864188),
(-0.309017, -0.500000, -0.809017),
(-0.162460, -0.262866, -0.951056),
(0.000000, -0.850651, -0.525731),
(-0.147621, -0.716567, -0.681718),
(0.147621, -0.716567, -0.681718),
(0.000000, -0.525731, -0.850651),
(0.309017, -0.500000, -0.809017),
(0.442863, -0.238856, -0.864188),
(0.162460, -0.262866, -0.951056),
(0.238856, -0.864188, -0.442863),
(0.500000, -0.809017, -0.309017),
(0.425325, -0.688191, -0.587785),
(0.716567, -0.681718, -0.147621),
(0.688191, -0.587785, -0.425325),
(0.587785, -0.425325, -0.688191),
(0.000000, -0.955423, -0.295242),
(0.000000, -1.000000, 0.000000),
(0.262866, -0.951056, -0.162460),
(0.000000, -0.850651, 0.525731),
(0.000000, -0.955423, 0.295242),
(0.238856, -0.864188, 0.442863),
(0.262866, -0.951056, 0.162460),
(0.500000, -0.809017, 0.309017),
(0.716567, -0.681718, 0.147621),
(0.525731, -0.850651, 0.000000),
(-0.238856, -0.864188, -0.442863),
(-0.500000, -0.809017, -0.309017),
(-0.262866, -0.951056, -0.162460),
(-0.850651, -0.525731, 0.000000),
(-0.716567, -0.681718, -0.147621),
(-0.716567, -0.681718, 0.147621),
(-0.525731, -0.850651, 0.000000),
(-0.500000, -0.809017, 0.309017),
(-0.238856, -0.864188, 0.442863),
(-0.262866, -0.951056, 0.162460),
(-0.864188, -0.442863, 0.238856),
(-0.809017, -0.309017, 0.500000),
(-0.688191, -0.587785, 0.425325),
(-0.681718, -0.147621, 0.716567),
(-0.442863, -0.238856, 0.864188),
(-0.587785, -0.425325, 0.688191),
(-0.309017, -0.500000, 0.809017),
(-0.147621, -0.716567, 0.681718),
(-0.425325, -0.688191, 0.587785),
(-0.162460, -0.262866, 0.951056),
(0.442863, -0.238856, 0.864188),
(0.162460, -0.262866, 0.951056),
(0.309017, -0.500000, 0.809017),
(0.147621, -0.716567, 0.681718),
(0.000000, -0.525731, 0.850651),
(0.425325, -0.688191, 0.587785),
(0.587785, -0.425325, 0.688191),
(0.688191, -0.587785, 0.425325),
(-0.955423, 0.295242, 0.000000),
(-0.951056, 0.162460, 0.262866),
(-1.000000, 0.000000, 0.000000),
(-0.850651, 0.000000, 0.525731),
(-0.955423, -0.295242, 0.000000),
(-0.951056, -0.162460, 0.262866),
(-0.864188, 0.442863, -0.238856),
(-0.951056, 0.162460, -0.262866),
(-0.809017, 0.309017, -0.500000),
(-0.864188, -0.442863, -0.238856),
(-0.951056, -0.162460, -0.262866),
(-0.809017, -0.309017, -0.500000),
(-0.681718, 0.147621, -0.716567),
(-0.681718, -0.147621, -0.716567),
(-0.850651, 0.000000, -0.525731),
(-0.688191, 0.587785, -0.425325),
(-0.587785, 0.425325, -0.688191),
(-0.425325, 0.688191, -0.587785),
(-0.425325, -0.688191, -0.587785),
(-0.587785, -0.425325, -0.688191),
(-0.688191, -0.587785, -0.425325)
)
"""Table of pre-calculated normals."""
| 36.458333 | 38 | 0.563102 | 982 | 6,125 | 3.508147 | 0.03055 | 0.099565 | 0.099855 | 0.073149 | 0.987518 | 0.987518 | 0.987518 | 0.848766 | 0.83106 | 0.807547 | 0 | 0.683872 | 0.187102 | 6,125 | 167 | 39 | 36.676647 | 0.008034 | 0 | 0 | 0 | 0 | 0 | 0.000821 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b9db90291f1e74286a1b92d5256c846c14ffec09 | 4,122 | py | Python | AssetMaintainer/ValueAndCostCalculator/ValueAndCostOfBoughtPositionCalculator.py | HallBlazzar/BackTester-BuyAndHold | 1890a5a3f0af46140d9b537ae40a62b7fef65813 | [
"Apache-2.0"
] | null | null | null | AssetMaintainer/ValueAndCostCalculator/ValueAndCostOfBoughtPositionCalculator.py | HallBlazzar/BackTester-BuyAndHold | 1890a5a3f0af46140d9b537ae40a62b7fef65813 | [
"Apache-2.0"
] | null | null | null | AssetMaintainer/ValueAndCostCalculator/ValueAndCostOfBoughtPositionCalculator.py | HallBlazzar/BackTester-BuyAndHold | 1890a5a3f0af46140d9b537ae40a62b7fef65813 | [
"Apache-2.0"
] | 2 | 2021-01-20T14:22:57.000Z | 2022-03-22T06:12:25.000Z | import pandas as pd
import numpy as np


class ValueAndCostOfBoughtPositionCalculator:
    def __init__(self, calculation_source: pd.DataFrame):
        self.__calculation_source = calculation_source

    def append_value_and_cost(self) -> pd.DataFrame:
        self.__calculation_source = OriginalValueCalculator(self.__calculation_source.copy()).append_original_value()

        original_value_greater_or_equal_to_maintenance_margin_condition = \
            self.__get_original_value_greater_or_equal_to_maintenance_margin_condition()

        self.__calculation_source = ValueCalculator(
            self.__calculation_source.copy(), original_value_greater_or_equal_to_maintenance_margin_condition
        ).append_value()

        self.__calculation_source = CostCalculator(
            self.__calculation_source.copy(), original_value_greater_or_equal_to_maintenance_margin_condition
        ).append_cost()

        self.__calculation_source = self.__calculation_source.drop(['original_value'], axis=1)

        return self.__calculation_source

    def __get_original_value_greater_or_equal_to_maintenance_margin_condition(self):
        return self.__calculation_source['original_value'] >= self.__calculation_source['maintenance_margin']


class OriginalValueCalculator:
    def __init__(self, calculation_source):
        self.__calculation_source = calculation_source

    def append_original_value(self):
        self.__calculation_source.loc[:, 'original_value'] = self.__calculation_source['initial_margin'] + \
            (
                self.__calculation_source['close_price'] - self.__calculation_source['open_price']
            ) * self.__calculation_source['leverage'] * self.__calculation_source['unit']

        return self.__calculation_source


class ValueCalculator:
    def __init__(self, calculation_source, original_value_greater_or_equal_to_maintenance_margin_condition):
        self.__calculation_source = calculation_source
        self.__original_value_greater_or_equal_to_maintenance_margin_condition = \
            original_value_greater_or_equal_to_maintenance_margin_condition

    def append_value(self):
        self.__calculation_source.loc[:, 'value'] = np.where(
            self.__original_value_greater_or_equal_to_maintenance_margin_condition,
            self.__get_value_when_original_value_greater_or_equal_to_maintenance_margin(),
            self.__get_value_when_original_value_less_than_maintenance_margin()
        )

        return self.__calculation_source

    def __get_value_when_original_value_greater_or_equal_to_maintenance_margin(self):
        return self.__calculation_source['original_value']

    def __get_value_when_original_value_less_than_maintenance_margin(self):
        return self.__calculation_source['initial_margin']


class CostCalculator:
    def __init__(self, calculation_source, original_value_greater_or_equal_to_maintenance_margin_condition):
        self.__calculation_source = calculation_source
        self.__original_value_greater_or_equal_to_maintenance_margin_condition = \
            original_value_greater_or_equal_to_maintenance_margin_condition

    def append_cost(self):
        self.__calculation_source.loc[:, 'cost'] = np.where(
            self.__original_value_greater_or_equal_to_maintenance_margin_condition,
            self.__get_cost_when_original_value_greater_or_equal_to_maintenance_margin(),
            self.__get_cost_when_original_value_less_than_maintenance_margin()
        )

        return self.__calculation_source

    def __get_cost_when_original_value_greater_or_equal_to_maintenance_margin(self):
        return self.__calculation_source['initial_margin'] + self.__calculation_source['fee']

    def __get_cost_when_original_value_less_than_maintenance_margin(self):
        return self.__calculation_source['initial_margin'] + \
            (
                self.__calculation_source['initial_margin'] - self.__calculation_source['original_value']
            ) + self.__calculation_source['fee']
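
# A hedged usage sketch (not part of the module): the calculator expects a DataFrame with
# the columns referenced above; the numbers below are invented purely to show the call
# shape, not a real margin model.
if __name__ == '__main__':
    positions = pd.DataFrame({
        'open_price': [100.0, 100.0],
        'close_price': [110.0, 80.0],
        'initial_margin': [50.0, 50.0],
        'maintenance_margin': [30.0, 30.0],
        'leverage': [2.0, 2.0],
        'unit': [1.0, 1.0],
        'fee': [0.5, 0.5],
    })
    result = ValueAndCostOfBoughtPositionCalculator(positions).append_value_and_cost()
    # The first row keeps its marked-to-market value (70.0); the second falls below the
    # maintenance margin, so its value resets to the initial margin and the shortfall
    # is added to its cost.
    print(result[['value', 'cost']])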
| 47.37931 | 118 | 0.752547 | 451 | 4,122 | 6.088692 | 0.104213 | 0.260015 | 0.290605 | 0.136198 | 0.828842 | 0.785506 | 0.725419 | 0.684268 | 0.619446 | 0.610706 | 0 | 0.000297 | 0.182921 | 4,122 | 86 | 119 | 47.930233 | 0.815024 | 0 | 0 | 0.285714 | 0 | 0 | 0.051053 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.206349 | false | 0 | 0.031746 | 0.079365 | 0.444444 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6a01771d0ba2190243da4a95008ab9d00bb80e22 | 24,627 | py | Python | cryptoapis/api/tokens_api.py | xan187/Crypto_APIs_2.0_SDK_Python | a56c75df54ef037b39be1315ed6e54de35bed55b | [
"MIT"
] | null | null | null | cryptoapis/api/tokens_api.py | xan187/Crypto_APIs_2.0_SDK_Python | a56c75df54ef037b39be1315ed6e54de35bed55b | [
"MIT"
] | null | null | null | cryptoapis/api/tokens_api.py | xan187/Crypto_APIs_2.0_SDK_Python | a56c75df54ef037b39be1315ed6e54de35bed55b | [
"MIT"
] | 1 | 2021-07-21T03:35:18.000Z | 2021-07-21T03:35:18.000Z | """
CryptoAPIs
Crypto APIs 2.0 is a complex and innovative infrastructure layer that radically simplifies the development of any Blockchain and Crypto related applications. Organized around REST, Crypto APIs 2.0 can assist both novice Bitcoin/Ethereum enthusiasts and crypto experts with the development of their blockchain applications. Crypto APIs 2.0 provides unified endpoints and data, raw data, automatic tokens and coins forwardings, callback functionalities, and much more. # noqa: E501
The version of the OpenAPI document: 2.0.0
Contact: developers@cryptoapis.io
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from cryptoapis.api_client import ApiClient, Endpoint as _Endpoint
from cryptoapis.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from cryptoapis.model.feature_mainnets_not_allowed_for_plan import FeatureMainnetsNotAllowedForPlan
from cryptoapis.model.insufficient_credits import InsufficientCredits
from cryptoapis.model.invalid_api_key import InvalidApiKey
from cryptoapis.model.invalid_data import InvalidData
from cryptoapis.model.invalid_pagination import InvalidPagination
from cryptoapis.model.invalid_request_body_structure import InvalidRequestBodyStructure
from cryptoapis.model.list_tokens_by_address_response import ListTokensByAddressResponse
from cryptoapis.model.list_tokens_transfers_by_address_response import ListTokensTransfersByAddressResponse
from cryptoapis.model.list_tokens_transfers_by_transaction_hash_response import ListTokensTransfersByTransactionHashResponse
from cryptoapis.model.request_limit_reached import RequestLimitReached
from cryptoapis.model.unexpected_server_error import UnexpectedServerError
from cryptoapis.model.unsupported_media_type import UnsupportedMediaType
class TokensApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def __list_tokens_by_address(
self,
network,
address,
blockchain="ethereum",
**kwargs
):
"""List Tokens By Address # noqa: E501
Through this endpoint customers can obtain token data by providing an attribute - `address`. The information that can be returned can include the contract address, the token symbol, type and balance. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_tokens_by_address(network, address, blockchain="ethereum", async_req=True)
>>> result = thread.get()
Args:
network (str): Represents the name of the blockchain network used; blockchain networks are usually identical as technology and software, but they differ in data, e.g. - \"mainnet\" is the live network with actual data while networks like \"ropsten\", \"rinkeby\" are test networks.
address (str): Represents the public address, which is a compressed and shortened form of a public key.
blockchain (str): Represents the specific blockchain protocol name, e.g. Ethereum, Ethereum Classic, etc.. defaults to "ethereum", must be one of ["ethereum"]
Keyword Args:
context (str): In batch situations the user can use the context to correlate responses with requests. This property is present regardless of whether the response was successful or returned as an error. `context` is specified by the user.. [optional]
limit (int): Defines how many items should be returned in the response per page basis.. [optional] if omitted the server will use the default value of 50
offset (int): The starting index of the response items, i.e. where the response should start listing the returned items.. [optional] if omitted the server will use the default value of 0
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
ListTokensByAddressResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['blockchain'] = \
blockchain
kwargs['network'] = \
network
kwargs['address'] = \
address
return self.call_with_http_info(**kwargs)
self.list_tokens_by_address = _Endpoint(
settings={
'response_type': (ListTokensByAddressResponse,),
'auth': [
'ApiKey'
],
'endpoint_path': '/blockchain-data/{blockchain}/{network}/addresses/{address}/tokens',
'operation_id': 'list_tokens_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'blockchain',
'network',
'address',
'context',
'limit',
'offset',
],
'required': [
'blockchain',
'network',
'address',
],
'nullable': [
],
'enum': [
'blockchain',
'network',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('blockchain',): {
"ETHEREUM": "ethereum"
},
('network',): {
"MAINNET": "mainnet",
"ROPSTEN": "ropsten",
"RINKEBY": "rinkeby"
},
},
'openapi_types': {
'blockchain':
(str,),
'network':
(str,),
'address':
(str,),
'context':
(str,),
'limit':
(int,),
'offset':
(int,),
},
'attribute_map': {
'blockchain': 'blockchain',
'network': 'network',
'address': 'address',
'context': 'context',
'limit': 'limit',
'offset': 'offset',
},
'location_map': {
'blockchain': 'path',
'network': 'path',
'address': 'path',
'context': 'query',
'limit': 'query',
'offset': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__list_tokens_by_address
)
def __list_tokens_transfers_by_address(
self,
network,
address,
blockchain="ethereum",
**kwargs
):
"""List Tokens Transfers By Address # noqa: E501
Through this endpoint customers can obtain a list with token transfers by the `address` attribute. Token transfers may include information such as addresses of the sender and recipient, token name, token symbol, etc. {note}This refers only to transfers done for **tokens** not coins.{/note} # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_tokens_transfers_by_address(network, address, blockchain="ethereum", async_req=True)
>>> result = thread.get()
Args:
network (str): Represents the name of the blockchain network used; blockchain networks are usually identical as technology and software, but they differ in data, e.g. - \"mainnet\" is the live network with actual data while networks like \"ropsten\", \"rinkeby\" are test networks.
address (str): Represents the public address, which is a compressed and shortened form of a public key.
blockchain (str): Represents the specific blockchain protocol name, e.g. Ethereum, Ethereum Classic, etc.. defaults to "ethereum", must be one of ["ethereum"]
Keyword Args:
context (str): In batch situations the user can use the context to correlate responses with requests. This property is present regardless of whether the response was successful or returned as an error. `context` is specified by the user.. [optional]
limit (int): Defines how many items should be returned in the response per page basis.. [optional] if omitted the server will use the default value of 50
offset (int): The starting index of the response items, i.e. where the response should start listing the returned items.. [optional] if omitted the server will use the default value of 0
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
ListTokensTransfersByAddressResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['blockchain'] = \
blockchain
kwargs['network'] = \
network
kwargs['address'] = \
address
return self.call_with_http_info(**kwargs)
self.list_tokens_transfers_by_address = _Endpoint(
settings={
'response_type': (ListTokensTransfersByAddressResponse,),
'auth': [
'ApiKey'
],
'endpoint_path': '/blockchain-data/{blockchain}/{network}/addresses/{address}/tokens-transfers',
'operation_id': 'list_tokens_transfers_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'blockchain',
'network',
'address',
'context',
'limit',
'offset',
],
'required': [
'blockchain',
'network',
'address',
],
'nullable': [
],
'enum': [
'blockchain',
'network',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('blockchain',): {
"ETHEREUM": "ethereum"
},
('network',): {
"MAINNET": "mainnet",
"ROPSTEN": "ropsten",
"RINKEBY": "rinkeby"
},
},
'openapi_types': {
'blockchain':
(str,),
'network':
(str,),
'address':
(str,),
'context':
(str,),
'limit':
(int,),
'offset':
(int,),
},
'attribute_map': {
'blockchain': 'blockchain',
'network': 'network',
'address': 'address',
'context': 'context',
'limit': 'limit',
'offset': 'offset',
},
'location_map': {
'blockchain': 'path',
'network': 'path',
'address': 'path',
'context': 'query',
'limit': 'query',
'offset': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__list_tokens_transfers_by_address
)
def __list_tokens_transfers_by_transaction_hash(
self,
network,
transaction_hash,
blockchain="ethereum",
**kwargs
):
"""List Tokens Transfers By Transaction Hash # noqa: E501
Through this endpoint customers can obtain a list with token transfers by the `transactionHash` attribute. Token transfers may include information such as addresses of the sender and recipient, token name, token symbol, etc. {note}This refers only to transfers done for **tokens** not coins.{/note} # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_tokens_transfers_by_transaction_hash(network, transaction_hash, blockchain="ethereum", async_req=True)
>>> result = thread.get()
Args:
network (str): Represents the name of the blockchain network used; blockchain networks are usually identical as technology and software, but they differ in data, e.g. - \"mainnet\" is the live network with actual data while networks like \"ropsten\", \"rinkeby\" are test networks.
transaction_hash (str): Represents the hash of the transaction, which is its unique identifier. It represents a cryptographic digital fingerprint made by hashing the block header twice through the SHA256 algorithm.
blockchain (str): Represents the specific blockchain protocol name, e.g. Ethereum, Ethereum Classic, etc.. defaults to "ethereum", must be one of ["ethereum"]
Keyword Args:
context (str): In batch situations the user can use the context to correlate responses with requests. This property is present regardless of whether the response was successful or returned as an error. `context` is specified by the user.. [optional]
limit (int): Defines how many items should be returned in the response per page basis.. [optional] if omitted the server will use the default value of 50
offset (int): The starting index of the response items, i.e. where the response should start listing the returned items.. [optional] if omitted the server will use the default value of 0
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
ListTokensTransfersByTransactionHashResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['blockchain'] = \
blockchain
kwargs['network'] = \
network
kwargs['transaction_hash'] = \
transaction_hash
return self.call_with_http_info(**kwargs)
self.list_tokens_transfers_by_transaction_hash = _Endpoint(
settings={
'response_type': (ListTokensTransfersByTransactionHashResponse,),
'auth': [
'ApiKey'
],
'endpoint_path': '/blockchain-data/{blockchain}/{network}/transactions/{transactionHash}/tokens-transfers',
'operation_id': 'list_tokens_transfers_by_transaction_hash',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'blockchain',
'network',
'transaction_hash',
'context',
'limit',
'offset',
],
'required': [
'blockchain',
'network',
'transaction_hash',
],
'nullable': [
],
'enum': [
'blockchain',
'network',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('blockchain',): {
"ETHEREUM": "ethereum"
},
('network',): {
"MAINNET": "mainnet",
"ROPSTEN": "ropsten",
"RINKEBY": "rinkeby"
},
},
'openapi_types': {
'blockchain':
(str,),
'network':
(str,),
'transaction_hash':
(str,),
'context':
(str,),
'limit':
(int,),
'offset':
(int,),
},
'attribute_map': {
'blockchain': 'blockchain',
'network': 'network',
'transaction_hash': 'transactionHash',
'context': 'context',
'limit': 'limit',
'offset': 'offset',
},
'location_map': {
'blockchain': 'path',
'network': 'path',
'transaction_hash': 'path',
'context': 'query',
'limit': 'query',
'offset': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__list_tokens_transfers_by_transaction_hash
)
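
# A hedged usage sketch for the endpoint wired up above. TokensApi() builds a default
# ApiClient when none is passed (see __init__); a real call additionally needs the
# 'ApiKey' credential configured on that client, which is omitted here because the
# configuration module lives outside this file. The address value is a placeholder.
if __name__ == '__main__':
    api = TokensApi()
    result = api.list_tokens_by_address(
        network='ropsten',                                       # mainnet, ropsten or rinkeby
        address='0x0000000000000000000000000000000000000000',    # placeholder address
        blockchain='ethereum',
        limit=10,
        offset=0,
    )
    print(result)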
| 44.213645 | 484 | 0.513298 | 2,216 | 24,627 | 5.544224 | 0.149819 | 0.017093 | 0.021651 | 0.02393 | 0.803679 | 0.797168 | 0.779261 | 0.763308 | 0.751994 | 0.751994 | 0 | 0.003696 | 0.406749 | 24,627 | 556 | 485 | 44.293165 | 0.837235 | 0.386243 | 0 | 0.713924 | 0 | 0 | 0.208409 | 0.037886 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010127 | false | 0 | 0.040506 | 0 | 0.060759 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6a40bc34f8de2164cc007f254593aa7b4f0153e8 | 3,818 | py | Python | tools/packtag.py | lantti/RestessbarKSreporter | 43cf8bf568a986bdfafed895949cbc06e16d1d26 | [
"MIT"
] | 48 | 2016-02-13T14:55:42.000Z | 2021-04-19T21:03:34.000Z | tools/packtag.py | lantti/RestessbarKSreporter | 43cf8bf568a986bdfafed895949cbc06e16d1d26 | [
"MIT"
] | null | null | null | tools/packtag.py | lantti/RestessbarKSreporter | 43cf8bf568a986bdfafed895949cbc06e16d1d26 | [
"MIT"
] | 2 | 2016-12-20T20:22:02.000Z | 2019-04-28T12:08:59.000Z | #!/usr/bin/env python
import sys
from struct import pack
suffix = '\x04\x00\x00\x00\n\x00\x00\x00D\x00e\x00m\x00o\x00\x00\x00\x02\x00\x00\x00\x04\x00\x00\x00\xff\xff\xff\xff\x03\x00\x00\x00\x04\x00\x00\x00\x01\x00\x00\x00\x05\x00\x00\x00\x04\x00\x00\x00\x00\x00\x01\x00\x16\x00\x00\x00\x04\x00\x00\x00\x01\x00\x00\x00\x06\x00\x00\x00\x18\x00\x00\x00\xde\x07\x00\x00\x03\x00\x00\x00\x1c\x00\x00\x00\x0f\x00\x00\x00(\x00\x00\x00\x00\x00\x00\x00\x07\x00\x00\x00\x18\x00\x00\x00\xde\x07\x00\x00\x05\x00\x00\x00\x1c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0f\x00\x00\x00\x04\x00\x00\x00\x00\x04\x00\x00\x10\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x11\x00\x00\x00\x04\x00\x00\x00\x00\x00\xc4\t!\x00\x00\x00\x04\x00\x00\x00\x06\x00\x00\x00#\x00\x00\x00\x04\x00\x00\x00\x01\x00\x00\x00"\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x18\x00\x00\x00\x04\x00\x00\x00\x01\x00\x00\x002\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00/\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x001\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x1c\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00*\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00,\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00-\x00\x00\x00\x04\x00\x00\x00\xff\xff\xff\xff.\x00\x00\x00\x02\x00\x00\x00\x00\x00\x01\x00\x00\x00\x1c\x00\x00\x00M\x00e\x00d\x00i\x00a\x00T\x00e\x00k\x00 \x00I\x00n\x00c\x00.\x00\x00\x00%\x00\x00\x00\x04\x00\x00\x00\x01\x00\x00\x003\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x12\x00\x00\x00\n\x00\x00\x001234567890\x17\x00\x00\x00\x10\x00\x00\x00c\x00o\x00n\x00t\x00e\x00n\x00t\x00\x00\x00\x19\x00\x00\x00>\x00\x00\x00\x01\x00\x00\x00\n\x00\x00\x00D\x00e\x00m\x00o\x00\x00\x00\x02\x00\x00\x00\n\x00\x00\x00D\x00e\x00m\x00o\x00\x00\x00\x03\x00\x00\x00\n\x00\x00\x00D\x00e\x00m\x00o\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x13\x00\x00\x00\xd0\x00\x00\x00\x88\x13\x00\x00\x01\x00\x00\x00\x89\x13\x00\x00\x01\x00\x00\x00\x8a\x13\x00\x00\x01\x00\x00\x00\x8b\x13\x00\x00\x01\x00\x00\x00\x8c\x13\x00\x00\x01\x00\x00\x00\x8d\x13\x00\x00\x01\x00\x00\x00\x8e\x13\x00\x00\x01\x00\x00\x00\x8f\x13\x00\x00\x01\x00\x00\x00\x90\x13\x00\x00\x01\x00\x00\x00\x91\x13\x00\x00\x01\x00\x00\x00\x92\x13\x00\x00\x01\x00\x00\x00\x93\x13\x00\x00\x01\x00\x00\x00\x94\x13\x00\x00\x01\x00\x00\x00\x95\x13\x00\x00\x01\x00\x00\x00\x96\x13\x00\x00\x01\x00\x00\x00\x97\x13\x00\x00\x01\x00\x00\x00\x98\x13\x00\x00\x01\x00\x00\x00\x99\x13\x00\x00\x01\x00\x00\x00\x9a\x13\x00\x00\x01\x00\x00\x00\x9b\x13\x00\x00\x01\x00\x00\x00\x9c\x13\x00\x00\x01\x00\x00\x00\x9d\x13\x00\x00\x01\x00\x00\x00\x9e\x13\x00\x00\x01\x00\x00\x00\x9f\x13\x00\x00\x01\x00\x00\x00\xa0\x13\x00\x00\x01\x00\x00\x00\xa1\x13\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xb4VDE10\x01\x00\x00\x00LV)\xfb\xed\xfe\xb6\xd0\x9e\xa6\xe0\xcb\xb3\x122\xa6\xff8\xdd\xf5\xfc\xb2i2X\xe1\x10\x9dw}\x19\xdd;0V*\x92\x9bo\xf8\x0f\xf0\xa0 "\xd9\x12$\x01f\xe3\x0f\xc1\n\xff\xa5\xae\x9a\xeb\xae4\x81\xed\xbb'
if __name__ == '__main__':
    if len(sys.argv) < 2:
        print('Usage: %s input.elf output.vxp' % sys.argv[0])
        sys.exit(-1)
    elfname = sys.argv[1]
    vxpname = elfname + '.vxp'
    if len(sys.argv) > 2:
        vxpname = sys.argv[2]
    elf = open(elfname, 'rb')
    if not elf:
        print('Cannot open %s' % elfname)
        sys.exit(-2)
    vxp = open(vxpname, 'wb')
    if not vxp:
        print('Cannot open %s' % vxpname)
        sys.exit(-3)
    # copy the ELF image verbatim into the output .vxp
    vxp.write(elf.read())
    elf.close()
    # append 0xff terminator (note: the plain str writes assume Python 2, where str is bytes)
    vxp.write('\xff')
    size = vxp.tell()
    # pad with '0' (0x30) until the size is 4-byte aligned
    while size & 0x3:
        vxp.write('\x30')
        size += 1
    vxp.write(suffix)
    # add elf file length information as a packed 'qi' record at the very end
    lengthinfo = pack('qi', size, 0)
    vxp.write(lengthinfo)
    vxp.close()
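
# A hedged companion sketch (not part of the original tool): it re-reads the trailing
# length record written by pack('qi', size, 0) above to confirm the stored ELF size.
# The function and argument names are illustrative; only the 'qi' layout comes from the script.
def read_tag_length(vxp_path):
    from struct import calcsize, unpack
    record_size = calcsize('qi')
    with open(vxp_path, 'rb') as tagged:
        tagged.seek(-record_size, 2)            # 2 = seek relative to the end of the file
        elf_size, reserved = unpack('qi', tagged.read(record_size))
    return elf_size, reserved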
| 79.541667 | 2,899 | 0.690938 | 828 | 3,818 | 3.176329 | 0.179952 | 0.752852 | 0.684411 | 0.497338 | 0.690494 | 0.661597 | 0.646388 | 0.407985 | 0.398859 | 0.357795 | 0 | 0.383119 | 0.084599 | 3,818 | 47 | 2,900 | 81.234043 | 0.369385 | 0.020953 | 0 | 0 | 0 | 0.033333 | 0.796465 | 0.772898 | 0 | 1 | 0.000803 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.066667 | 0.1 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
6a48cb6632bc656ae7776fd2516c43d98464a161 | 10,679 | py | Python | tests/tests_test_workflow/test_integ_workflow/integ_test/test_service_opensearch_dashboards.py | naveenpajjuri/opensearch-build | 855f0296b36ba32b18cf4fc40b096659b5b3f1f0 | [
"Apache-2.0"
] | null | null | null | tests/tests_test_workflow/test_integ_workflow/integ_test/test_service_opensearch_dashboards.py | naveenpajjuri/opensearch-build | 855f0296b36ba32b18cf4fc40b096659b5b3f1f0 | [
"Apache-2.0"
] | null | null | null | tests/tests_test_workflow/test_integ_workflow/integ_test/test_service_opensearch_dashboards.py | naveenpajjuri/opensearch-build | 855f0296b36ba32b18cf4fc40b096659b5b3f1f0 | [
"Apache-2.0"
] | null | null | null | # SPDX-License-Identifier: Apache-2.0
#
# The OpenSearch Contributors require contributions made to
# this file be licensed under the Apache-2.0 license or a
# compatible open source license.
import os
import unittest
from unittest.mock import MagicMock, PropertyMock, call, mock_open, patch
from test_workflow.integ_test.service_opensearch_dashboards import ServiceOpenSearchDashboards
class ServiceOpenSearchDashboardsTests(unittest.TestCase):
def setUp(self):
self.version = "1.1.0"
self.work_dir = "test_work_dir"
self.additional_config = {"script.context.field.max_compilations_rate": "1000/1m"}
self.dependency_installer = ""
@patch("test_workflow.integ_test.service.Process.start")
@patch('test_workflow.integ_test.service.Process.pid', new_callable=PropertyMock, return_value=12345)
@patch("builtins.open", new_callable=mock_open)
@patch("yaml.dump")
@patch("tarfile.open")
def test_start(self, mock_tarfile_open, mock_dump, mock_file, mock_pid, mock_process):
mock_dependency_installer = MagicMock()
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
True,
mock_dependency_installer,
self.work_dir
)
bundle_full_name = "test_bundle_name"
mock_dependency_installer.download_dist.return_value = bundle_full_name
mock_bundle_tar = MagicMock()
mock_tarfile_open.return_value.__enter__.return_value = mock_bundle_tar
mock_dump_result = MagicMock()
mock_dump.return_value = mock_dump_result
# call the target test function
service.start()
mock_dependency_installer.download_dist.called_once_with(self.work_dir)
mock_tarfile_open.assert_called_once_with(bundle_full_name, "r")
mock_bundle_tar.extractall.assert_called_once_with(self.work_dir)
mock_file.assert_called_once_with(os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "config", "opensearch_dashboards.yml"), "a")
mock_dump.assert_called_once_with(
{
"script.context.field.max_compilations_rate": "1000/1m",
"logging.dest": os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "logs", "opensearch_dashboards.log")
}
)
mock_file.return_value.write.assert_called_once_with(mock_dump_result)
mock_process.assert_called_once_with("./opensearch-dashboards", os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "bin"))
self.assertEqual(mock_pid.call_count, 1)
@patch("os.path.isdir")
@patch("subprocess.check_call")
@patch("test_workflow.integ_test.service.Process.start")
@patch('test_workflow.integ_test.service.Process.pid', new_callable=PropertyMock, return_value=12345)
@patch("builtins.open", new_callable=mock_open)
@patch("yaml.dump")
@patch("tarfile.open")
def test_start_without_security(self, mock_tarfile_open, mock_dump, mock_file, mock_pid, mock_process, mock_check_call, mock_os_isdir):
mock_dependency_installer = MagicMock()
service = ServiceOpenSearchDashboards(
self.version,
{},
False,
mock_dependency_installer,
self.work_dir
)
bundle_full_name = "test_bundle_name"
mock_dependency_installer.download_dist.return_value = bundle_full_name
mock_bundle_tar = MagicMock()
mock_tarfile_open.return_value.__enter__.return_value = mock_bundle_tar
mock_file_handler_for_security = mock_open().return_value
mock_file_handler_for_additional_config = mock_open().return_value
        # open() will be called twice: once for disabling security, once for additional_config
mock_file.side_effect = [mock_file_handler_for_security, mock_file_handler_for_additional_config]
mock_dump_result = MagicMock()
mock_dump.return_value = mock_dump_result
mock_os_isdir.return_value = True
# call the target test function
service.start()
mock_file.assert_has_calls(
[call(os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "config", "opensearch_dashboards.yml"), "w")],
[call(os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "config", "opensearch_dashboards.yml"), "a")],
)
mock_check_call.assert_called_once_with(
"./opensearch-dashboards-plugin remove securityDashboards",
cwd=os.path.join("test_work_dir", "opensearch-dashboards-1.1.0", "bin"),
shell=True
)
mock_dump.assert_called_once_with({"logging.dest": os.path.join(
self.work_dir, "opensearch-dashboards-1.1.0", "logs", "opensearch_dashboards.log")})
mock_file_handler_for_security.close.assert_called_once()
mock_file_handler_for_additional_config.write.assert_called_once_with(mock_dump_result)
@patch("os.path.isdir")
@patch("subprocess.check_call")
@patch("test_workflow.integ_test.service.Process.start")
@patch('test_workflow.integ_test.service.Process.pid', new_callable=PropertyMock, return_value=12345)
@patch("builtins.open", new_callable=mock_open)
@patch("yaml.dump")
@patch("tarfile.open")
def test_start_without_security_and_not_installed(self, mock_tarfile_open, mock_dump, mock_file, mock_pid, mock_process, mock_check_call, mock_os_isdir):
mock_dependency_installer = MagicMock()
service = ServiceOpenSearchDashboards(
self.version,
{},
False,
mock_dependency_installer,
self.work_dir
)
bundle_full_name = "test_bundle_name"
mock_dependency_installer.download_dist.return_value = bundle_full_name
mock_bundle_tar = MagicMock()
mock_tarfile_open.return_value.__enter__.return_value = mock_bundle_tar
mock_file_handler_for_security = mock_open().return_value
mock_file_handler_for_additional_config = mock_open().return_value
        # open() will be called twice: once for disabling security, once for additional_config
mock_file.side_effect = [mock_file_handler_for_security, mock_file_handler_for_additional_config]
mock_dump_result = MagicMock()
mock_dump.return_value = mock_dump_result
mock_os_isdir.side_effect = [False, True]
# call the target test function
service.start()
mock_check_call.assert_not_called()
mock_file.assert_has_calls(
[call(os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "config", "opensearch_dashboards.yml"), "w")],
[call(os.path.join(self.work_dir, "opensearch-dashboards-1.1.0", "config", "opensearch_dashboards.yml"), "a")],
)
mock_dump.assert_called_once_with({"logging.dest": os.path.join(
self.work_dir, "opensearch-dashboards-1.1.0", "logs", "opensearch_dashboards.log")})
mock_file_handler_for_security.close.assert_called_once()
mock_file_handler_for_additional_config.write.assert_called_once_with(mock_dump_result)
def test_endpoint_port_url(self):
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
True,
self.dependency_installer,
self.work_dir
)
self.assertEqual(service.endpoint(), "localhost")
self.assertEqual(service.port(), 5601)
self.assertEqual(service.url(), "http://localhost:5601")
@patch("requests.get")
@patch.object(ServiceOpenSearchDashboards, "url")
def test_get_service_response_with_security(self, mock_url, mock_requests_get):
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
True,
self.dependency_installer,
self.work_dir
)
mock_url_result = MagicMock()
mock_url.return_value = mock_url_result
service.get_service_response()
mock_url.assert_called_once_with("/api/status")
mock_requests_get.assert_called_once_with(mock_url_result, verify=False, auth=("kibanaserver", "kibanaserver"))
@patch("requests.get")
@patch.object(ServiceOpenSearchDashboards, "url")
def test_get_service_response_without_security(self, mock_url, mock_requests_get):
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
False,
self.dependency_installer,
self.work_dir
)
mock_url_result = MagicMock()
mock_url.return_value = mock_url_result
service.get_service_response()
mock_url.assert_called_once_with("/api/status")
mock_requests_get.assert_called_once_with(mock_url_result, auth=None, verify=False)
@patch.object(ServiceOpenSearchDashboards, "get_service_response")
def test_service_alive_green_available(self, mock_get_service_response):
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
True,
self.dependency_installer,
self.work_dir
)
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.text = '"state":"green"'
mock_get_service_response.return_value = mock_response
self.assertTrue(service.service_alive())
@patch.object(ServiceOpenSearchDashboards, "get_service_response")
def test_service_alive_yellow_available(self, mock_get_service_response):
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
True,
self.dependency_installer,
self.work_dir
)
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.text = '"state":"yellow"'
mock_get_service_response.return_value = mock_response
self.assertTrue(service.service_alive())
@patch.object(ServiceOpenSearchDashboards, "get_service_response")
def test_service_alive_red_unavailable(self, mock_get_service_response):
service = ServiceOpenSearchDashboards(
self.version,
self.additional_config,
True,
self.dependency_installer,
self.work_dir
)
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.text = '"state":"red"'
mock_get_service_response.return_value = mock_response
self.assertFalse(service.service_alive())
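

# Editor's sketch (assumption-based, not the actual ServiceOpenSearchDashboards code):
# the three service_alive tests above imply that the service counts as alive when
# GET <url>/api/status returns HTTP 200 with "state":"green" or "state":"yellow",
# and as not alive when it reports "state":"red".  A standalone check along those
# lines could look like this:
def dashboards_alive(base_url="http://localhost:5601", auth=None):
    import requests
    response = requests.get(base_url + "/api/status", verify=False, auth=auth)
    if response.status_code != 200:
        return False
    return '"state":"green"' in response.text or '"state":"yellow"' in response.text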
| 37.868794 | 157 | 0.691451 | 1,251 | 10,679 | 5.520384 | 0.123901 | 0.041413 | 0.033449 | 0.04344 | 0.855633 | 0.847089 | 0.835505 | 0.827107 | 0.783811 | 0.764263 | 0 | 0.009543 | 0.215001 | 10,679 | 281 | 158 | 38.003559 | 0.814267 | 0.041483 | 0 | 0.704434 | 0 | 0 | 0.143374 | 0.089878 | 0 | 0 | 0 | 0 | 0.133005 | 1 | 0.049261 | false | 0 | 0.019704 | 0 | 0.073892 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6a49ababdc57c09044f60fe34f249348e3cc0ec0 | 11,809 | py | Python | discover_protocol_command.py | mtasic85/routingtable | 03c581ab3a29b90b780fb1dec0dfcfe7fc77d5d4 | [
"MIT"
] | 5 | 2016-01-25T19:14:48.000Z | 2020-01-22T14:46:36.000Z | discover_protocol_command.py | mtasic85/routingtable | 03c581ab3a29b90b780fb1dec0dfcfe7fc77d5d4 | [
"MIT"
] | null | null | null | discover_protocol_command.py | mtasic85/routingtable | 03c581ab3a29b90b780fb1dec0dfcfe7fc77d5d4 | [
"MIT"
] | 1 | 2020-12-30T11:35:46.000Z | 2020-12-30T11:35:46.000Z | __all__ = ['DiscoverProtocolCommand']
import time
import random
from print_colors import PrintColors
from contact import Contact
from protocol_command import ProtocolCommand
class DiscoverProtocolCommand(ProtocolCommand):
def start(self):
self.req()
def stop(self):
raise NotImplementedError
def req(self):
# request
c = self.node.rt.contacts.random(without_id=self.node.id)
if not c or c.id is None:
self.node.loop.call_later(5.0 + random.random() * 5.0, self.req)
return
# print('discover_nodes:', c)
node_id = self.node.id
node_local_host = self.node.listen_host
node_local_port = self.node.listen_port
args = ()
kwargs = {
'id': node_id,
'local_host': node_local_host,
'local_port': node_local_port,
}
res = (args, kwargs)
# build message
message_data = self.node.build_message(
self.protocol_major_version,
self.protocol_minor_version,
self.PROTOCOL_REQ,
self.protocol_command_code,
res,
)
# force del
del args
del kwargs
del res
# send message
self.node.send_message(message_data, c.remote_host, c.remote_port)
# schedule next discover
self.node.loop.call_later(0.0 + random.random() * 10.0, self.req)
def on_req(self, remote_host, remote_port, *args, **kwargs):
node_id = kwargs['id']
local_host = kwargs['local_host']
local_port = kwargs['local_port']
bootstrap = kwargs.get('bootstrap', False)
# update contact's `last_seen`, or add contact
c = self.node.rt.contacts.get(node_id)
if c:
c.id = node_id
c.last_seen = time.time()
else:
c = self.node.rt.contacts.get((remote_host, remote_port))
if c:
c.id = node_id
c.last_seen = time.time()
else:
# add_contact
c = self.node.rt.add_contacts.get(node_id)
if c:
self.node.rt.add_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY REQ]:', self.node, c, PrintColors.END)
else:
c = self.node.rt.add_contacts.get((remote_host, remote_port))
if c:
self.node.rt.add_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY REQ]:', self.node, c, PrintColors.END)
else:
# remove_contact
c = self.node.rt.remove_contacts.get(node_id)
if c:
self.node.rt.remove_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY REQ]:', self.node, c, PrintColors.END)
else:
c = self.node.rt.remove_contacts.get((remote_host, remote_port))
if c:
self.node.rt.remove_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY REQ]:', self.node, c, PrintColors.END)
else:
c = Contact(
id = node_id,
local_host = local_host,
local_port = local_port,
remote_host = remote_host,
remote_port = remote_port,
bootstrap = bootstrap,
)
# because `c` is requesting to discover nodes
# put it into known active contacts
c.last_seen = time.time()
self.node.rt.contacts.add(c)
print(PrintColors.GREEN + 'new contact [DISCOVERY REQ]:', self.node, c, PrintColors.END)
# forward to res_discover_nodes
self.res(remote_host, remote_port, *args, **kwargs)
def res(self, remote_host, remote_port, *args, **kwargs):
# response
node_id = self.node.id
local_host = self.node.listen_host
local_port = self.node.listen_port
contacts = [c.__getstate__() for c in self.node.rt.contacts]
res = {
'id': node_id,
'local_host': local_host,
'local_port': local_port,
'contacts': contacts,
}
# build message
message_data = self.node.build_message(
self.protocol_major_version,
self.protocol_minor_version,
self.PROTOCOL_RES,
self.protocol_command_code,
res,
)
# force del
del contacts
del res
# send message
self.node.send_message(message_data, remote_host, remote_port)
def on_res(self, remote_host, remote_port, res):
node_id = res['id']
local_host = res['local_host']
local_port = res['local_port']
contacts = res['contacts']
bootstrap = res.get('bootstrap', False)
# update contact's `last_seen`, or add contact
c = self.node.rt.contacts.get(node_id)
if c:
c.id = node_id
c.last_seen = time.time()
else:
c = self.node.rt.contacts.get((remote_host, remote_port))
if c:
c.id = node_id
c.last_seen = time.time()
else:
# add_contact
c = self.node.rt.add_contacts.get(node_id)
if c:
self.node.rt.add_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY ON RES]:', self.node, c, PrintColors.END)
else:
c = self.node.rt.add_contacts.get((remote_host, remote_port))
if c:
self.node.rt.add_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY ON RES]:', self.node, c, PrintColors.END)
else:
# remove_contact
c = self.node.rt.remove_contacts.get(node_id)
if c:
self.node.rt.remove_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY ON RES]:', self.node, c, PrintColors.END)
else:
c = self.node.rt.remove_contacts.get((remote_host, remote_port))
if c:
self.node.rt.remove_contacts.remove(c)
self.node.rt.contacts.add(c)
c.id = node_id
c.last_seen = time.time()
print(PrintColors.GREEN + 'new contact [DISCOVERY ON RES]:', self.node, c, PrintColors.END)
else:
c = Contact(
id = node_id,
local_host = local_host,
local_port = local_port,
remote_host = remote_host,
remote_port = remote_port,
bootstrap = bootstrap,
)
# because `c` is requesting to discover nodes
# put it into known active contacts
c.last_seen = time.time()
self.node.rt.contacts.add(c)
print(PrintColors.GREEN + 'new contact [DISCOVERY ON RES]:', self.node, c, PrintColors.END)
# update discovered nodes/contacts
for cd in contacts:
node_id = cd['id']
local_host = cd['local_host']
local_port = cd['local_port']
remote_host = cd['remote_host']
remote_port = cd['remote_port']
bootstrap = cd.get('bootstrap', False)
# update contact's `last_seen`, or add contact
c = self.node.rt.contacts.get(node_id)
if c:
c.id = node_id
else:
c = self.node.rt.contacts.get((remote_host, remote_port))
if c:
c.id = node_id
else:
# add_contact
c = self.node.rt.add_contacts.get(node_id)
if c:
c.id = node_id
else:
c = self.node.rt.add_contacts.get((remote_host, remote_port))
if c:
c.id = node_id
else:
# remove_contact
c = self.node.rt.remove_contacts.get(node_id)
if c:
self.node.rt.remove_contacts.remove(c)
self.node.rt.add_contacts.add(c)
c.id = node_id
else:
c = self.node.rt.remove_contacts.get((remote_host, remote_port))
if c:
self.node.rt.remove_contacts.remove(c)
self.node.rt.add_contacts.add(c)
c.id = node_id
else:
c = Contact(
id = node_id,
local_host = local_host,
local_port = local_port,
remote_host = remote_host,
remote_port = remote_port,
bootstrap = bootstrap,
)
# because `c` is requesting to discover nodes
# put it into known active contacts
c.last_seen = time.time()
self.node.rt.add_contacts.add(c)
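

# Editor's note: the req/on_req/on_res handlers above repeat the same bookkeeping --
# look a contact up by id or by (remote_host, remote_port), promote it out of the
# add_contacts/remove_contacts staging lists into the active contacts list, and
# refresh last_seen.  A hypothetical helper along these lines could centralise most
# of it (a sketch only; on_res applies slightly different rules to contacts learned
# from the discovered-contacts list, so a real refactor would need to keep those
# differences):
def promote_contact(rt, node_id, remote_host, remote_port, make_contact):
    import time
    for pool in (rt.contacts, rt.add_contacts, rt.remove_contacts):
        c = pool.get(node_id) or pool.get((remote_host, remote_port))
        if c:
            if pool is not rt.contacts:
                pool.remove(c)
                rt.contacts.add(c)
            c.id = node_id
            c.last_seen = time.time()
            return c
    # unknown peer: build it via the caller-supplied factory, e.g.
    #   lambda: Contact(id=node_id, local_host=..., local_port=...,
    #                   remote_host=remote_host, remote_port=remote_port)
    c = make_contact()
    c.last_seen = time.time()
    rt.contacts.add(c)
    return c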
| 39.760943 | 123 | 0.437548 | 1,171 | 11,809 | 4.22801 | 0.078565 | 0.106645 | 0.086851 | 0.086649 | 0.806908 | 0.78469 | 0.750757 | 0.733993 | 0.719047 | 0.719047 | 0 | 0.001468 | 0.480989 | 11,809 | 296 | 124 | 39.89527 | 0.806331 | 0.055381 | 0 | 0.707207 | 0 | 0 | 0.044307 | 0.002067 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0 | 0.022523 | 0 | 0.058559 | 0.04955 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6a4cbd6d42bb8653c1b24037f47f32e22e221820 | 21,360 | py | Python | tests/builders/test_resume.py | frenzymadness/module-build | a6adae2c0799c5987eda5bec57c768acd16c2226 | [
"MIT"
] | null | null | null | tests/builders/test_resume.py | frenzymadness/module-build | a6adae2c0799c5987eda5bec57c768acd16c2226 | [
"MIT"
] | null | null | null | tests/builders/test_resume.py | frenzymadness/module-build | a6adae2c0799c5987eda5bec57c768acd16c2226 | [
"MIT"
] | null | null | null | import os
import shutil
from unittest.mock import patch
import pytest
from module_build.builders.mock_builder import MockBuilder
from module_build.stream import ModuleStream
from tests import (fake_buildroot_run, fake_get_artifacts, get_full_data_path,
mock_mmdv3_and_version)
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_failed_first_component(mock_config, tmpdir):
""" We test to resume the module build from the first failed component """
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl component
def die_on_perl(self):
return fake_buildroot_run(self, component_to_fail="perl")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl):
with pytest.raises(Exception) as e:
builder.build(module_stream, resume=False)
err_msg = e.value.args[0]
assert "Build of component 'perl' failed!!" == err_msg
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
build_batches_path = cwd + "/" + cntx_names[0] + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
assert len(build_batches_dir) == 2
assert 'batch_1' in build_batches_dir
assert 'batch_2' not in build_batches_dir
assert 'repodata' in build_batches_dir
batch_path = build_batches_path + "/batch_1"
batch_dir = os.listdir(batch_path)
assert len(batch_dir) == 1
assert "perl" in batch_dir
perl_comp_path = batch_path + "/perl"
perl_comp_dir = os.listdir(perl_comp_path)
assert len(perl_comp_dir) == 1
assert "finished" not in perl_comp_dir
assert "perl_mock.cfg" in perl_comp_dir
assert "perl-0:1.0-1.module_fc35+f26devel.x86_64.rpm" not in perl_comp_dir
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
perl_comp_dir = os.listdir(perl_comp_path)
assert "finished" in perl_comp_dir
assert "perl-0:1.0-1.module_fc35+f26devel.x86_64.rpm" in perl_comp_dir
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_failed_not_first_component(mock_config, tmpdir):
""" We test to resume the module build from a failed component in the 4th batch """
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl-Digest
# component
def die_on_perl_digest(self):
return fake_buildroot_run(self, component_to_fail="perl-Digest")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl_digest):
with pytest.raises(Exception) as e:
builder.build(module_stream, resume=False)
err_msg = e.value.args[0]
assert "Build of component 'perl-Digest' failed!!" == err_msg
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
build_batches_path = cwd + "/" + cntx_names[0] + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
assert len(build_batches_dir) == 5
assert 'batch_1' in build_batches_dir
assert 'batch_2' in build_batches_dir
assert 'batch_3' in build_batches_dir
assert 'batch_4' in build_batches_dir
assert 'batch_5' not in build_batches_dir
assert 'repodata' in build_batches_dir
batch_path = build_batches_path + "/batch_4"
batch_dir = os.listdir(batch_path)
assert len(batch_dir) == 19
assert "perl-Digest" in batch_dir
perl_digest_comp_path = batch_path + "/perl-Digest"
perl_digest_comp_dir = os.listdir(perl_digest_comp_path)
assert len(perl_digest_comp_dir) == 1
assert "finished" not in perl_digest_comp_dir
assert "perl-Digest_mock.cfg" in perl_digest_comp_dir
assert "perl-Digest-0:1.0-1.module_fc35+f26devel.x86_64.rpm" not in perl_digest_comp_dir
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
perl_digest_comp_dir = os.listdir(perl_digest_comp_path)
assert "perl-Digest-0:1.0-1.module_fc35+f26devel.x86_64.rpm" in perl_digest_comp_dir
assert "finished" in perl_digest_comp_dir
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_failed_to_create_batch_yaml_file(mock_config, tmpdir):
""" We test to resume the module build on a failed batch closure, where only the yaml file is
missing """
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl-generators
# component
def die_on_perl_generators(self):
return fake_buildroot_run(self, component_to_fail="perl-generators")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl_generators):
with pytest.raises(Exception):
builder.build(module_stream, resume=False)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
build_batches_path = cwd + "/" + cntx_names[0] + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
assert len(build_batches_dir) == 4
assert 'batch_1' in build_batches_dir
assert 'batch_2' in build_batches_dir
assert 'batch_3' in build_batches_dir
assert 'repodata' in build_batches_dir
# we prepare the directories to the state we want to resume from.
batch_3_path = build_batches_path + "/batch_3"
shutil.rmtree(batch_3_path)
batch_2_path = build_batches_path + "/batch_2"
finished_file_path = batch_2_path + '/finished'
os.remove(finished_file_path)
# the version on a batch yaml file is dynamic, so we have to search for it.
for file_name in os.listdir(batch_2_path):
if file_name.endswith("yaml"):
yaml_file_path = batch_2_path + "/" + file_name
assert yaml_file_path
os.remove(yaml_file_path)
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
# the version on a batch yaml file is dynamic, so we have to search for it.
for file_name in os.listdir(batch_2_path):
if file_name.endswith("yaml"):
yaml_file_path = batch_2_path + "/" + file_name
assert yaml_file_path
assert os.path.isfile(yaml_file_path)
assert os.path.isfile(finished_file_path)
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_continue_with_new_batch(mock_config, tmpdir):
""" We test to resume module build when a new batch directory has failed to create. """
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl-generators
# component
def die_on_perl_generators(self):
return fake_buildroot_run(self, component_to_fail="perl-generators")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl_generators):
with pytest.raises(Exception):
builder.build(module_stream, resume=False)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
build_batches_path = cwd + "/" + cntx_names[0] + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
assert len(build_batches_dir) == 4
assert 'batch_1' in build_batches_dir
assert 'batch_2' in build_batches_dir
assert 'batch_3' in build_batches_dir
assert 'repodata' in build_batches_dir
# we prepare the directories to the state we want to resume from.
batch_3_path = build_batches_path + "/batch_3"
shutil.rmtree(batch_3_path)
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
assert os.path.isdir(batch_3_path)
finished_file_path = batch_3_path + "/finished"
# the version on a batch yaml file is dynamic, so we have to search for it.
for file_name in os.listdir(batch_3_path):
if file_name.endswith("yaml"):
yaml_file_path = batch_3_path + "/" + file_name
assert yaml_file_path
assert os.path.isfile(yaml_file_path)
assert os.path.isfile(finished_file_path)
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_continue_with_next_context(mock_config, tmpdir):
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl-generators
# component
def die_on_perl_second_context(self):
return fake_buildroot_run(self, component_to_fail="perl", context="f27devel")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl_second_context):
with pytest.raises(Exception):
builder.build(module_stream, resume=False)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
if "f27devel" in name:
second_context_path = cwd + "/" + name
shutil.rmtree(second_context_path)
if "f26devel" in name:
first_context_path = cwd + "/" + name
first_context_dir = os.listdir(first_context_path)
assert "finished" in first_context_dir
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
@pytest.mark.parametrize("context", ["f26devel", "f27devel"])
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_do_not_continue_with_next_context_when_context_specified(mock_config,
context,
tmpdir):
""" We test that the resume function will only resumes the specified context and does not build
anything else """
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl-generators
# component
def die_on_perl_generators(self):
return fake_buildroot_run(self, component_to_fail="perl-generators")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl_generators):
with pytest.raises(Exception):
builder.build(module_stream, resume=False, context_to_build=context)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
build_batches_path = cwd + "/" + cntx_names[0] + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
assert len(build_batches_dir) == 4
assert 'batch_1' in build_batches_dir
assert 'batch_2' in build_batches_dir
assert 'batch_3' in build_batches_dir
assert 'repodata' in build_batches_dir
# we prepare the directories to the state we want to resume from.
batch_3_path = build_batches_path + "/batch_3"
shutil.rmtree(batch_3_path)
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True, context_to_build=context)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
assert context in cntx_names[0]
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
@pytest.mark.parametrize("context", ["f26devel", "f27devel"])
@patch("module_build.builders.mock_builder.MockBuilder.get_artifacts_nevra", new=fake_get_artifacts)
@patch("module_build.builders.mock_builder.mockbuild.config.load_config",
return_value={"target_arch": "x86_64", "dist": "fc35"})
def test_resume_module_build_first_specify_context_and_resume_without(mock_config, context, tmpdir):
cwd = tmpdir.mkdir("workdir").strpath
rootdir = None
mock_cfg_path = get_full_data_path("mock_cfg/fedora-35-x86_64.cfg")
external_repos = []
builder = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
mmd, version = mock_mmdv3_and_version()
module_stream = ModuleStream(mmd, version)
# wrapper function which sets the `fake_buildroot_run` function to fail on the perl-generators
# component
def die_on_perl_generators(self):
return fake_buildroot_run(self, component_to_fail="perl-generators")
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=die_on_perl_generators):
with pytest.raises(Exception):
builder.build(module_stream, resume=False, context_to_build=context)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 1
build_batches_path = cwd + "/" + cntx_names[0] + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
assert len(build_batches_dir) == 4
assert 'batch_1' in build_batches_dir
assert 'batch_2' in build_batches_dir
assert 'batch_3' in build_batches_dir
assert 'repodata' in build_batches_dir
# we prepare the directories to the state we want to resume from.
batch_3_path = build_batches_path + "/batch_3"
shutil.rmtree(batch_3_path)
# we run the build again on the same working directory with the resume option on
builder_resumed = MockBuilder(mock_cfg_path, cwd, external_repos, rootdir)
with patch("module_build.builders.mock_builder.MockBuildroot.run",
new=fake_buildroot_run):
builder_resumed.build(module_stream, resume=True)
cntx_names = os.listdir(cwd)
assert len(cntx_names) == 2
for name in cntx_names:
context_path = cwd + "/" + name
context_dir = os.listdir(context_path)
build_batches_path = context_path + "/build_batches"
build_batches_dir = os.listdir(build_batches_path)
for i in range(12):
batch_name = "batch_{position}".format(position=i + 1)
assert batch_name in build_batches_dir
assert "repodata" in build_batches_dir
assert "finished" in context_dir
assert "final_repo" in context_dir
| 41.800391 | 100 | 0.712032 | 2,988 | 21,360 | 4.773092 | 0.055556 | 0.087505 | 0.061001 | 0.046487 | 0.925607 | 0.908288 | 0.903029 | 0.900926 | 0.893213 | 0.888164 | 0 | 0.014867 | 0.20014 | 21,360 | 510 | 101 | 41.882353 | 0.819901 | 0.101545 | 0 | 0.81383 | 0 | 0 | 0.174527 | 0.105825 | 0 | 0 | 0 | 0 | 0.268617 | 1 | 0.037234 | false | 0 | 0.018617 | 0.018617 | 0.074468 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
dbf47c5958ecf93141858eb1299700ce3082ea8a | 2,015 | py | Python | Assigments/Assigment10/Tests/test_sk_save_restore_additional.py | mevljas/Quality_and_testing | 6a39610084b1538eae270682a6842270e8971b7f | [
"MIT"
] | null | null | null | Assigments/Assigment10/Tests/test_sk_save_restore_additional.py | mevljas/Quality_and_testing | 6a39610084b1538eae270682a6842270e8971b7f | [
"MIT"
] | null | null | null | Assigments/Assigment10/Tests/test_sk_save_restore_additional.py | mevljas/Quality_and_testing | 6a39610084b1538eae270682a6842270e8971b7f | [
"MIT"
] | null | null | null | import pexpect
def test_bst_save_restore():
baza = pexpect.pexpect()
try:
baza.expect("Enter command: ")
baza.send("use sk")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("add Janez Levak 012345678")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("add Andrej Novak 013456789")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("add Janez Novak 014567890")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("print")
baza.expect("Novak, Janez - 014567890, Novak, Andrej - 013456789, Levak, Janez - 012345678")
baza.expect("Novak, Janez - 014567890, Novak, Andrej - 013456789, Levak, Janez - 012345678")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("count")
baza.expect("3")
baza.expect("Enter command: ")
baza.send("save test.bin")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("reset")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("print")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("count")
baza.expect("0")
baza.expect("Enter command: ")
baza.send("restore test.bin")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("print")
baza.expect("Novak, Janez - 014567890, Novak, Andrej - 013456789, Levak, Janez - 012345678")
baza.expect("Novak, Janez - 014567890, Novak, Andrej - 013456789, Levak, Janez - 012345678")
baza.expect("OK")
baza.expect("Enter command: ")
baza.send("count")
baza.expect("3")
baza.expect("Enter command: ")
print "PASSED\ttest_bst_save_restore"
except:
print "FAILED\ttest_bst_save_restore"
finally:
baza.kill()
if __name__ == "__main__":
test_bst_save_restore()
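

# Editor's note: the pexpect.pexpect() object used above looks like a course-provided
# wrapper around the tested program; the upstream pexpect library exposes
# pexpect.spawn instead.  A rough equivalent of the first few steps with the real
# library (the executable name "./baza" is a placeholder assumption) would be:
def pexpect_equivalent_sketch(program="./baza"):
    import pexpect
    child = pexpect.spawn(program)
    child.expect("Enter command: ")
    child.sendline("use sk")
    child.expect("OK")
    child.expect("Enter command: ")
    child.sendline("count")
    child.expect("0")
    child.close(force=True)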
| 25.506329 | 100 | 0.572208 | 226 | 2,015 | 5.013274 | 0.172566 | 0.27361 | 0.185349 | 0.271845 | 0.801412 | 0.801412 | 0.741395 | 0.741395 | 0.741395 | 0.741395 | 0 | 0.09537 | 0.281886 | 2,015 | 78 | 101 | 25.833333 | 0.68763 | 0 | 0 | 0.654545 | 0 | 0 | 0.373883 | 0.028798 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.018182 | 0.018182 | null | null | 0.090909 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
e03909486587800bc822be6755f3762b6bfb7f60 | 937 | py | Python | src/sumo/wrapper/_request_error.py | equinor/sumo-wrapper-python | f6e61145e3334965555764d24e66babc01133905 | [
"Apache-2.0"
] | null | null | null | src/sumo/wrapper/_request_error.py | equinor/sumo-wrapper-python | f6e61145e3334965555764d24e66babc01133905 | [
"Apache-2.0"
] | 1 | 2022-01-13T13:52:47.000Z | 2022-01-13T13:52:47.000Z | src/sumo/wrapper/_request_error.py | equinor/sumo-wrapper-python | f6e61145e3334965555764d24e66babc01133905 | [
"Apache-2.0"
] | null | null | null | class RequestError(Exception):
def __init__(self, code, message):
self.code = code
self.message = message
def __str__(self):
return f'Request Error with status code {self.code} and text {self.message}'
class AuthenticationError(RequestError):
def __init__(self, code, message):
super().__init__(code, message)
def __str__(self):
return f'Authentication failed with status code {self.code} and text {self.message}.'
class TransientError(RequestError):
def __init__(self, code, message):
super().__init__(code, message)
def __str__(self):
return f'Transient Error with status code {self.code} and text {self.message}.'
class PermanentError(RequestError):
def __init__(self, code, message):
super().__init__(code, message)
def __str__(self):
return f'Fatal Request Error with status code {self.code} and text {self.message}.'
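

# Editor's sketch: a hypothetical dispatcher showing how the hierarchy above could be
# used to turn HTTP status codes into typed errors.  The concrete mapping below is an
# illustration only, not necessarily what the Sumo wrapper itself does.
def raise_for_status_code(code, message):
    if code in (401, 403):
        raise AuthenticationError(code, message)
    if code in (429, 500, 502, 503, 504):
        raise TransientError(code, message)
    if 400 <= code < 600:
        raise PermanentError(code, message)

# Hypothetical usage:
#   try:
#       raise_for_status_code(503, "service unavailable")
#   except TransientError as err:
#       print(err)  # Transient Error with status code 503 and text service unavailable.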
| 29.28125 | 93 | 0.679829 | 115 | 937 | 5.156522 | 0.217391 | 0.121417 | 0.074199 | 0.10118 | 0.780776 | 0.743676 | 0.703204 | 0.703204 | 0.703204 | 0.703204 | 0 | 0 | 0.21238 | 937 | 31 | 94 | 30.225806 | 0.803523 | 0 | 0 | 0.52381 | 0 | 0 | 0.302028 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.380952 | false | 0 | 0 | 0.190476 | 0.761905 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
0ec1201c06d24564753c8606acc23c19b8ca94a5 | 55,198 | py | Python | tests/datasource/data_connector/test_configured_asset_azure_data_connector.py | OmriBromberg/great_expectations | 60eb81ebfb08fef5d37d55c316dc962928beb165 | [
"Apache-2.0"
] | 1 | 2021-04-11T20:54:23.000Z | 2021-04-11T20:54:23.000Z | tests/datasource/data_connector/test_configured_asset_azure_data_connector.py | OmriBromberg/great_expectations | 60eb81ebfb08fef5d37d55c316dc962928beb165 | [
"Apache-2.0"
] | 53 | 2021-10-02T02:26:51.000Z | 2021-12-28T20:49:25.000Z | tests/datasource/data_connector/test_configured_asset_azure_data_connector.py | OmriBromberg/great_expectations | 60eb81ebfb08fef5d37d55c316dc962928beb165 | [
"Apache-2.0"
] | 1 | 2022-03-03T16:47:32.000Z | 2022-03-03T16:47:32.000Z | from unittest import mock
import pytest
from ruamel.yaml import YAML
import great_expectations.exceptions as ge_exceptions
from great_expectations import DataContext
from great_expectations.core import IDDict
from great_expectations.core.batch import (
BatchDefinition,
BatchRequest,
BatchRequestBase,
)
from great_expectations.data_context.util import instantiate_class_from_config
from great_expectations.datasource.data_connector import (
ConfiguredAssetAzureDataConnector,
)
from great_expectations.execution_engine import PandasExecutionEngine
yaml = YAML()
@pytest.fixture
def expected_config_dict():
"""Used to validate `self_check()` and `test_yaml_config()` outputs."""
config = {
"class_name": "ConfiguredAssetAzureDataConnector",
"data_asset_count": 1,
"example_data_asset_names": [
"alpha",
],
"data_assets": {
"alpha": {
"example_data_references": [
"alpha-1.csv",
"alpha-2.csv",
"alpha-3.csv",
],
"batch_definition_count": 3,
},
},
"example_unmatched_data_references": [],
"unmatched_data_reference_count": 0,
}
return config
@pytest.fixture
def expected_batch_definitions_unsorted():
"""
Used to validate `get_batch_definition_list_from_batch_request()` outputs.
    Input and output should maintain the same order (hence "unsorted").
"""
expected = [
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "alex", "timestamp": "20200809", "price": "1000"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "eugene", "timestamp": "20200809", "price": "1500"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "james", "timestamp": "20200811", "price": "1009"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "abe", "timestamp": "20200809", "price": "1040"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "will", "timestamp": "20200809", "price": "1002"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "james", "timestamp": "20200713", "price": "1567"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "eugene", "timestamp": "20201129", "price": "1900"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "will", "timestamp": "20200810", "price": "1001"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "james", "timestamp": "20200810", "price": "1003"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "alex", "timestamp": "20200819", "price": "1300"}
),
),
]
return expected
@pytest.fixture
def expected_batch_definitions_sorted():
"""
Used to validate `get_batch_definition_list_from_batch_request()` outputs.
Input should be sorted based on some criteria, resulting in some change
between input and output.
"""
expected = [
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "abe", "timestamp": "20200809", "price": "1040"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "alex", "timestamp": "20200819", "price": "1300"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "alex", "timestamp": "20200809", "price": "1000"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "eugene", "timestamp": "20201129", "price": "1900"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "eugene", "timestamp": "20200809", "price": "1500"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "james", "timestamp": "20200811", "price": "1009"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "james", "timestamp": "20200810", "price": "1003"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "james", "timestamp": "20200713", "price": "1567"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "will", "timestamp": "20200810", "price": "1001"}
),
),
BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
{"name": "will", "timestamp": "20200809", "price": "1002"}
),
),
]
return expected
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
return_value=["alpha-1.csv", "alpha-2.csv", "alpha-3.csv"],
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_account_url_and_credential(
mock_azure_conn, mock_list_keys, expected_config_dict
):
my_data_connector = ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={
"account_url": "my_account_url.blob.core.windows.net",
"credential": "my_credential",
},
)
assert my_data_connector.self_check() == expected_config_dict
my_data_connector._refresh_data_references_cache()
assert my_data_connector.get_data_reference_list_count() == 3
assert my_data_connector.get_unmatched_data_references() == []
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
return_value=["alpha-1.csv", "alpha-2.csv", "alpha-3.csv"],
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_conn_str_and_credential(
mock_azure_conn, mock_list_keys, expected_config_dict
):
my_data_connector = ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={ # Representative of format noted in official docs
"conn_str": "DefaultEndpointsProtocol=https;AccountName=storagesample;AccountKey=my_account_key",
"credential": "my_credential",
},
)
assert my_data_connector.self_check() == expected_config_dict
my_data_connector._refresh_data_references_cache()
assert my_data_connector.get_data_reference_list_count() == 3
assert my_data_connector.get_unmatched_data_references() == []
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_valid_account_url_assigns_account_name(mock_azure_conn):
my_data_connector = ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={
"account_url": "my_account_url.blob.core.windows.net",
"credential": "my_credential",
},
)
assert my_data_connector._account_name == "my_account_url"
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_valid_conn_str_assigns_account_name(mock_azure_conn):
my_data_connector = ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={ # Representative of format noted in official docs
"conn_str": "DefaultEndpointsProtocol=https;AccountName=storagesample;AccountKey=my_account_key",
"credential": "my_credential",
},
)
assert my_data_connector._account_name == "storagesample"
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_multiple_auth_methods_raises_error(
mock_azure_conn,
):
# Raises error in DataContext's schema validation due to having both `account_url` and `conn_str`
with pytest.raises(AssertionError):
ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={
"account_url": "account.blob.core.windows.net",
"conn_str": "DefaultEndpointsProtocol=https;AccountName=storagesample;AccountKey=my_account_key",
"credential": "my_credential",
},
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_improperly_formatted_auth_keys_in_azure_options_raises_error(
mock_azure_conn,
):
# Raises error in ConfiguredAssetAzureDataConnector's constructor due to `account_url` not conforming to the expected format
# Format: <ACCOUNT>.blob.core.windows.net
with pytest.raises(ImportError):
ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={"account_url": "not_a_valid_url"},
)
# Raises error in ConfiguredAssetAzureDataConnector's constructor due to `conn_str` not conforming to the expected format
# Format: Must be a variable length, semicolon-delimited string containing "AccountName=<ACCOUNT>"
with pytest.raises(ImportError):
ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={"conn_str": "not_a_valid_conn_str"},
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
return_value=["alpha-1.csv", "alpha-2.csv", "alpha-3.csv"],
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_test_yaml_config(
mock_azure_conn,
mock_list_keys,
mock_emit,
empty_data_context_stats_enabled,
expected_config_dict,
):
context: DataContext = empty_data_context_stats_enabled
report_object = context.test_yaml_config(
f"""
module_name: great_expectations.datasource.data_connector
class_name: ConfiguredAssetAzureDataConnector
datasource_name: FAKE_DATASOURCE
name: TEST_DATA_CONNECTOR
default_regex:
pattern: alpha-(.*)\\.csv
group_names:
- index
container: my_container
name_starts_with: ""
assets:
alpha:
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
runtime_environment={
"execution_engine": PandasExecutionEngine(),
},
return_mode="report_object",
)
assert report_object == expected_config_dict
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
return_value=["alpha-1.csv", "alpha-2.csv", "alpha-3.csv"],
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_with_test_yaml_config_emits_proper_payload(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
context: DataContext = empty_data_context_stats_enabled
context.test_yaml_config(
f"""
module_name: great_expectations.datasource.data_connector
class_name: ConfiguredAssetAzureDataConnector
datasource_name: FAKE_DATASOURCE
name: TEST_DATA_CONNECTOR
default_regex:
pattern: alpha-(.*)\\.csv
group_names:
- index
container: my_container
name_starts_with: ""
assets:
alpha:
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
runtime_environment={
"execution_engine": PandasExecutionEngine(),
},
return_mode="report_object",
)
assert mock_emit.call_count == 1
anonymized_name = mock_emit.call_args_list[0][0][0]["event_payload"][
"anonymized_name"
]
expected_call_args_list = [
mock.call(
{
"event": "data_context.test_yaml_config",
"event_payload": {
"anonymized_name": anonymized_name,
"parent_class": "ConfiguredAssetAzureDataConnector",
},
"success": True,
}
),
]
assert mock_emit.call_args_list == expected_call_args_list
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
return_value=["alpha-1.csv", "alpha-2.csv", "alpha-3.csv"],
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_instantiation_from_a_config_with_nonmatching_regex_creates_unmatched_references(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
context: DataContext = empty_data_context_stats_enabled
report_object = context.test_yaml_config(
f"""
module_name: great_expectations.datasource.data_connector
class_name: ConfiguredAssetAzureDataConnector
datasource_name: FAKE_DATASOURCE
name: TEST_DATA_CONNECTOR
default_regex:
pattern: beta-(.*)\\.csv
group_names:
- index
container: my_container
name_starts_with: ""
assets:
alpha:
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
runtime_environment={
"execution_engine": PandasExecutionEngine(),
},
return_mode="report_object",
)
assert report_object == {
"class_name": "ConfiguredAssetAzureDataConnector",
"data_asset_count": 1,
"example_data_asset_names": [
"alpha",
],
"data_assets": {
"alpha": {"example_data_references": [], "batch_definition_count": 0},
},
"example_unmatched_data_references": [
"alpha-1.csv",
"alpha-2.csv",
"alpha-3.csv",
],
"unmatched_data_reference_count": 3,
}
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
return_value=["alpha-1.csv", "alpha-2.csv", "alpha-3.csv"],
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_get_batch_definition_list_from_batch_request_with_nonexistent_datasource_name_raises_error(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
my_data_connector = ConfiguredAssetAzureDataConnector(
name="my_data_connector",
datasource_name="FAKE_DATASOURCE_NAME",
execution_engine=PandasExecutionEngine(),
default_regex={
"pattern": "alpha-(.*)\\.csv",
"group_names": ["index"],
},
container="my_container",
name_starts_with="",
assets={"alpha": {}},
azure_options={
"account_url": "my_account_url.blob.core.windows.net",
"credential": "my_credential",
},
)
# Raises error in `DataConnector._validate_batch_request()` due to `datasource_name` in BatchRequest not matching DataConnector `datasource_name`
with pytest.raises(ValueError):
my_data_connector.get_batch_definition_list_from_batch_request(
BatchRequest(
datasource_name="something",
data_connector_name="my_data_connector",
data_asset_name="something",
)
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_get_definition_list_from_batch_request_with_empty_args_raises_error(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = (
[
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
],
)
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
# Raises error in `FilePathDataConnector.get_batch_definition_list_from_batch_request()` due to missing a `batch_request` arg
with pytest.raises(TypeError):
# noinspection PyArgumentList
my_data_connector.get_batch_definition_list_from_batch_request()
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_get_definition_list_from_batch_request_with_unnamed_data_asset_name_raises_error(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
# Raises error in `Batch._validate_init_parameters()` due to `data_asset_name` being `NoneType` and not the required `str`
with pytest.raises(TypeError):
my_data_connector.get_batch_definition_list_from_batch_request(
BatchRequest(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="",
)
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_return_all_batch_definitions_unsorted_without_named_data_asset_name(
mock_azure_conn,
mock_list_keys,
mock_emit,
empty_data_context_stats_enabled,
expected_batch_definitions_unsorted,
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
# In an actual production environment, Azure Blob Storage will automatically sort these blobs by path (alphabetic order).
# Source: https://docs.microsoft.com/en-us/rest/api/storageservices/List-Blobs?redirectedfrom=MSDN
#
# The expected behavior is that our `unsorted_batch_definition_list` preserves the order in which `list_azure_keys()` returns keys (hence "unsorted").
# When using an actual `BlobServiceClient` (and not a mock), the output of `list_azure_keys()` would already be sorted by virtue of how the service orders blobs.
# Although this is a minor deviation from production behavior, it is immaterial because we still end up testing the desired behavior.
unsorted_batch_definition_list = (
my_data_connector._get_batch_definition_list_from_batch_request(
BatchRequestBase(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="",
)
)
)
assert unsorted_batch_definition_list == expected_batch_definitions_unsorted
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_return_all_batch_definitions_unsorted_with_named_data_asset_name(
mock_azure_conn,
mock_list_keys,
mock_emit,
empty_data_context_stats_enabled,
expected_batch_definitions_unsorted,
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
# In an actual production environment, Azure Blob Storage will automatically sort these blobs by path (alphabetic order).
# Source: https://docs.microsoft.com/en-us/rest/api/storageservices/List-Blobs?redirectedfrom=MSDN
#
# The expected behavior is that our `unsorted_batch_definition_list` preserves the order in which `list_azure_keys()` returns keys (hence "unsorted").
# When using an actual `BlobServiceClient` (and not a mock), the output of `list_azure_keys()` would already be sorted by virtue of how the service orders blobs.
# Although this is a minor deviation from production behavior, it is immaterial because we still end up testing the desired behavior.
unsorted_batch_definition_list = (
my_data_connector.get_batch_definition_list_from_batch_request(
BatchRequest(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
)
)
)
assert unsorted_batch_definition_list == expected_batch_definitions_unsorted
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_return_all_batch_definitions_basic_sorted(
mock_azure_conn,
mock_list_keys,
mock_emit,
empty_data_context_stats_enabled,
expected_batch_definitions_sorted,
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
sorters:
- orderby: asc
class_name: LexicographicSorter
name: name
- datetime_format: "%Y%m%d"
orderby: desc
class_name: DateTimeSorter
name: timestamp
- orderby: desc
class_name: NumericSorter
name: price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
self_check_report = my_data_connector.self_check()
assert self_check_report["class_name"] == "ConfiguredAssetAzureDataConnector"
assert self_check_report["data_asset_count"] == 1
assert self_check_report["data_assets"]["TestFiles"]["batch_definition_count"] == 10
assert self_check_report["unmatched_data_reference_count"] == 0
sorted_batch_definition_list = (
my_data_connector.get_batch_definition_list_from_batch_request(
BatchRequest(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
)
)
)
assert sorted_batch_definition_list == expected_batch_definitions_sorted
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
def test_return_all_batch_definitions_returns_specified_partition(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
sorters:
- orderby: asc
class_name: LexicographicSorter
name: name
- datetime_format: "%Y%m%d"
orderby: desc
class_name: DateTimeSorter
name: timestamp
- orderby: desc
class_name: NumericSorter
name: price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
self_check_report = my_data_connector.self_check()
assert self_check_report["class_name"] == "ConfiguredAssetAzureDataConnector"
assert self_check_report["data_asset_count"] == 1
assert self_check_report["data_assets"]["TestFiles"]["batch_definition_count"] == 10
assert self_check_report["unmatched_data_reference_count"] == 0
my_batch_request: BatchRequest = BatchRequest(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
data_connector_query=IDDict(
**{
"batch_filter_parameters": {
"name": "james",
"timestamp": "20200713",
"price": "1567",
}
}
),
)
my_batch_definition_list = (
my_data_connector.get_batch_definition_list_from_batch_request(
batch_request=my_batch_request
)
)
assert len(my_batch_definition_list) == 1
my_batch_definition = my_batch_definition_list[0]
expected_batch_definition: BatchDefinition = BatchDefinition(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
batch_identifiers=IDDict(
**{
"name": "james",
"timestamp": "20200713",
"price": "1567",
}
),
)
assert my_batch_definition == expected_batch_definition
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
def test_return_all_batch_definitions_sorted_without_data_connector_query(
mock_azure_conn,
mock_list_keys,
mock_emit,
empty_data_context_stats_enabled,
expected_batch_definitions_sorted,
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
sorters:
- orderby: asc
class_name: LexicographicSorter
name: name
- datetime_format: "%Y%m%d"
orderby: desc
class_name: DateTimeSorter
name: timestamp
- orderby: desc
class_name: NumericSorter
name: price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
)
self_check_report = my_data_connector.self_check()
assert self_check_report["class_name"] == "ConfiguredAssetAzureDataConnector"
assert self_check_report["data_asset_count"] == 1
assert self_check_report["data_assets"]["TestFiles"]["batch_definition_count"] == 10
assert self_check_report["unmatched_data_reference_count"] == 0
sorted_batch_definition_list = (
my_data_connector.get_batch_definition_list_from_batch_request(
BatchRequest(
datasource_name="test_environment",
data_connector_name="general_azure_data_connector",
data_asset_name="TestFiles",
)
)
)
assert sorted_batch_definition_list == expected_batch_definitions_sorted
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
def test_return_all_batch_definitions_raises_error_due_to_sorter_that_does_not_match_group(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
assets:
TestFiles:
pattern: (.+)_(.+)_(.+)\\.csv
group_names:
- name
- timestamp
- price
default_regex:
pattern: (.+)_.+_.+\\.csv
group_names:
- name
sorters:
- orderby: asc
class_name: LexicographicSorter
name: name
- datetime_format: "%Y%m%d"
orderby: desc
class_name: DateTimeSorter
name: timestamp
- orderby: desc
class_name: NumericSorter
name: for_me_Me_Me
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
# Raises error due to the sorter `for_me_Me_Me` not matching any group_name in `FilePathDataConnector._validate_sorters_configuration()`
with pytest.raises(ge_exceptions.DataConnectorError):
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys"
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
def test_return_all_batch_definitions_too_many_sorters(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
my_data_connector_yaml = yaml.load(
f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: test_environment
container: my_container
name_starts_with: ""
assets:
TestFiles:
default_regex:
pattern: (.+)_.+_.+\\.csv
group_names:
- name
sorters:
- orderby: asc
class_name: LexicographicSorter
name: name
- datetime_format: "%Y%m%d"
orderby: desc
class_name: DateTimeSorter
name: timestamp
- orderby: desc
class_name: NumericSorter
name: price
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
""",
)
mock_list_keys.return_value = [
"alex_20200809_1000.csv",
"eugene_20200809_1500.csv",
"james_20200811_1009.csv",
"abe_20200809_1040.csv",
"will_20200809_1002.csv",
"james_20200713_1567.csv",
"eugene_20201129_1900.csv",
"will_20200810_1001.csv",
"james_20200810_1003.csv",
"alex_20200819_1300.csv",
]
# Raises error due to sorters (`timestamp`, `price`) referencing group names that do not exist in the regex, caught in `FilePathDataConnector._validate_sorters_configuration()`
with pytest.raises(ge_exceptions.DataConnectorError):
instantiate_class_from_config(
config=my_data_connector_yaml,
runtime_environment={
"name": "general_azure_data_connector",
"execution_engine": PandasExecutionEngine(),
},
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
def test_example_with_explicit_data_asset_names(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
yaml_string = f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: FAKE_DATASOURCE_NAME
container: my_container
name_starts_with: my_base_directory/
default_regex:
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.(csv|txt)$
group_names:
- data_asset_name
- year_dir
- month_dir
assets:
alpha:
name_starts_with: my_base_directory/alpha/files/go/here/
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.csv$
beta:
name_starts_with: my_base_directory/beta_here/
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.txt$
gamma:
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.csv$
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
"""
config = yaml.load(yaml_string)
mock_list_keys.return_value = [ # Initial return value during instantiation
"my_base_directory/alpha/files/go/here/alpha-202001.csv",
"my_base_directory/alpha/files/go/here/alpha-202002.csv",
"my_base_directory/alpha/files/go/here/alpha-202003.csv",
"my_base_directory/beta_here/beta-202001.txt",
"my_base_directory/beta_here/beta-202002.txt",
"my_base_directory/beta_here/beta-202003.txt",
"my_base_directory/beta_here/beta-202004.txt",
"my_base_directory/gamma-202001.csv",
"my_base_directory/gamma-202002.csv",
"my_base_directory/gamma-202003.csv",
"my_base_directory/gamma-202004.csv",
"my_base_directory/gamma-202005.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config,
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
runtime_environment={
"name": "my_data_connector",
"execution_engine": PandasExecutionEngine(),
},
)
)
# Since we are using mocks, we need to redefine the output of subsequent calls to `list_azure_keys()`
# Our patched object provides the ability to define a "side_effect", an iterable containing return
# values for subsequent calls. Since `_refresh_data_references_cache()` makes multiple calls to
# this method (once per asset), we define our expected behavior below.
#
# Source: https://stackoverflow.com/questions/24897145/python-mock-multiple-return-values
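# A minimal, generic illustration of `side_effect` with an iterable (not specific to this connector):
#   m = mock.Mock(side_effect=[["a"], ["b", "c"]])
#   m()  # first call  -> ["a"]
#   m()  # second call -> ["b", "c"]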
mock_list_keys.side_effect = [
[ # Asset alpha
"my_base_directory/alpha/files/go/here/alpha-202001.csv",
"my_base_directory/alpha/files/go/here/alpha-202002.csv",
"my_base_directory/alpha/files/go/here/alpha-202003.csv",
],
[ # Asset beta
"my_base_directory/beta_here/beta-202001.txt",
"my_base_directory/beta_here/beta-202002.txt",
"my_base_directory/beta_here/beta-202003.txt",
"my_base_directory/beta_here/beta-202004.txt",
],
[ # Asset gamma
"my_base_directory/gamma-202001.csv",
"my_base_directory/gamma-202002.csv",
"my_base_directory/gamma-202003.csv",
"my_base_directory/gamma-202004.csv",
"my_base_directory/gamma-202005.csv",
],
]
my_data_connector._refresh_data_references_cache()
assert len(my_data_connector.get_unmatched_data_references()) == 0
assert (
len(
my_data_connector.get_batch_definition_list_from_batch_request(
batch_request=BatchRequest(
datasource_name="FAKE_DATASOURCE_NAME",
data_connector_name="my_data_connector",
data_asset_name="alpha",
)
)
)
== 3
)
assert (
len(
my_data_connector.get_batch_definition_list_from_batch_request(
batch_request=BatchRequest(
datasource_name="FAKE_DATASOURCE_NAME",
data_connector_name="my_data_connector",
data_asset_name="beta",
)
)
)
== 4
)
assert (
len(
my_data_connector.get_batch_definition_list_from_batch_request(
batch_request=BatchRequest(
datasource_name="FAKE_DATASOURCE_NAME",
data_connector_name="my_data_connector",
data_asset_name="gamma",
)
)
)
== 5
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.BlobServiceClient"
)
@mock.patch(
"great_expectations.datasource.data_connector.configured_asset_azure_data_connector.list_azure_keys",
)
@mock.patch(
"great_expectations.core.usage_statistics.usage_statistics.UsageStatisticsHandler.emit"
)
def test_get_full_file_path(
mock_azure_conn, mock_list_keys, mock_emit, empty_data_context_stats_enabled
):
yaml_string = f"""
class_name: ConfiguredAssetAzureDataConnector
datasource_name: FAKE_DATASOURCE_NAME
container: my_container
name_starts_with: my_base_directory/
default_regex:
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.(csv|txt)$
group_names:
- data_asset_name
- year_dir
- month_dir
assets:
alpha:
prefix: my_base_directory/alpha/files/go/here/
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.csv$
beta:
prefix: my_base_directory/beta_here/
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.txt$
gamma:
pattern: ^(.+)-(\\d{{4}})(\\d{{2}})\\.csv$
azure_options:
account_url: my_account_url.blob.core.windows.net
credential: my_credential
"""
config = yaml.load(yaml_string)
mock_list_keys.return_value = [
"my_base_directory/alpha/files/go/here/alpha-202001.csv",
"my_base_directory/alpha/files/go/here/alpha-202002.csv",
"my_base_directory/alpha/files/go/here/alpha-202003.csv",
"my_base_directory/beta_here/beta-202001.txt",
"my_base_directory/beta_here/beta-202002.txt",
"my_base_directory/beta_here/beta-202003.txt",
"my_base_directory/beta_here/beta-202004.txt",
"my_base_directory/gamma-202001.csv",
"my_base_directory/gamma-202002.csv",
"my_base_directory/gamma-202003.csv",
"my_base_directory/gamma-202004.csv",
"my_base_directory/gamma-202005.csv",
]
my_data_connector: ConfiguredAssetAzureDataConnector = (
instantiate_class_from_config(
config,
config_defaults={
"module_name": "great_expectations.datasource.data_connector"
},
runtime_environment={
"name": "my_data_connector",
"execution_engine": PandasExecutionEngine(),
},
)
)
assert (
my_data_connector._get_full_file_path(
"my_base_directory/alpha/files/go/here/alpha-202001.csv", "alpha"
)
== "my_account_url.blob.core.windows.net/my_container/my_base_directory/alpha/files/go/here/alpha-202001.csv"
)
assert (
my_data_connector._get_full_file_path(
"my_base_directory/beta_here/beta-202002.txt", "beta"
)
== "my_account_url.blob.core.windows.net/my_container/my_base_directory/beta_here/beta-202002.txt"
)
assert (
my_data_connector._get_full_file_path(
"my_base_directory/gamma-202005.csv", "gamma"
)
== "my_account_url.blob.core.windows.net/my_container/my_base_directory/gamma-202005.csv"
)
| 35.338028 | 159 | 0.643447 | 5,551 | 55,198 | 5.973158 | 0.063952 | 0.093314 | 0.033929 | 0.049552 | 0.912718 | 0.906626 | 0.901258 | 0.894381 | 0.88651 | 0.884489 | 0 | 0.038105 | 0.261169 | 55,198 | 1,561 | 160 | 35.360666 | 0.774925 | 0.063082 | 0 | 0.78156 | 0 | 0.001418 | 0.46818 | 0.248363 | 0 | 0 | 0 | 0 | 0.02695 | 1 | 0.017021 | false | 0 | 0.008511 | 0 | 0.02766 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
16342d9c58631826d51cb8511f522e6d6d236803 | 33,217 | py | Python | model/model_cd.py | MingSun-Tse/Collaborative-Distillation | 915712674af82ff91d926d922c14988cce0430f3 | [
"MIT"
] | 172 | 2020-03-20T00:50:34.000Z | 2022-03-29T08:49:30.000Z | model/model_cd.py | MingSun-Tse/Collaborative-Distillation | 915712674af82ff91d926d922c14988cce0430f3 | [
"MIT"
] | 22 | 2020-06-25T03:20:49.000Z | 2022-03-12T00:37:59.000Z | model/model_cd.py | MingSun-Tse/Collaborative-Distillation | 915712674af82ff91d926d922c14988cce0430f3 | [
"MIT"
] | 24 | 2020-03-20T10:11:56.000Z | 2021-06-01T06:42:22.000Z | import numpy as np
import os
import torch.nn as nn
import torch
from torch.utils.serialization import load_lua
from utils import load_param_from_t7 as load_param
from model.model_kd2sd import SmallDecoder1_16x_aux, SmallDecoder2_16x_aux, SmallDecoder3_16x_aux, SmallDecoder4_16x_aux, SmallDecoder5_16x_aux
import pickle
pjoin = os.path.join
# Calculate style distances as in the CVPR paper.
# Since the 5-stage style distances are reported separately, there is no need to normalize by num_channel.
# ref https://pytorch.org/tutorials/advanced/neural_style_tutorial.html
def gram_matrix(input):
a, b, c, d = input.size() # [N, C, H, W]
batch_feat = input.view(a, b, c*d) # [N, C, HW]
batch_gram = torch.stack([torch.mm(feat, feat.t()) for feat in batch_feat])
batch_gram = batch_gram.div(a*b*c*d)
return batch_gram # shape: [N, C, C]
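# Illustrative usage sketch (assumed names, not necessarily how the training script calls it):
# given two feature maps `feat_a` and `feat_b` of shape [N, C, H, W], a Gram-based style loss could be
#   style_loss = nn.MSELoss()(gram_matrix(feat_a), gram_matrix(feat_b))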
# ref: AdaIN implementation (https://github.com/naoto0804/pytorch-AdaIN/blob/master/function.py)
def calc_mean_std(feat, eps=1e-5):
# eps is a small value added to the variance to avoid divide-by-zero.
size = feat.size()
assert (len(size) == 4)
N, C = size[:2]
feat_var = feat.view(N, C, -1).var(dim=2) + eps
feat_std = feat_var.sqrt().view(N, C, 1, 1)
feat_mean = feat.view(N, C, -1).mean(dim=2).view(N, C, 1, 1)
return feat_mean, feat_std
def adaptive_instance_normalization(content_feat, style_feat):
assert (content_feat.size()[:2] == style_feat.size()[:2])
size = content_feat.size()
style_mean, style_std = calc_mean_std(style_feat)
content_mean, content_std = calc_mean_std(content_feat)
normalized_feat = (content_feat - content_mean.expand(
size)) / content_std.expand(size)
return normalized_feat * style_std.expand(size) + style_mean.expand(size)
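# In formula form: AdaIN(x, y) = sigma(y) * (x - mu(x)) / sigma(x) + mu(y),
# where x is the content feature, y is the style feature, and mu/sigma are the
# per-channel mean and std computed by `calc_mean_std` above.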
# calculate average style distance, which needs normalization by num_channel.
def gram_matrix_ave(input):
a, b, c, d = input.size()
batch_feat = input.view(a, b, c*d)
batch_gram = torch.stack([torch.mm(feat, feat.t()).div(b*c*d) for feat in batch_feat])
return batch_gram # shape: [batch_size, channel, channel]
# Load param from model1 to model2
# For each layer of model2, if model1 has the same layer, then copy the params.
def load_param2(model1_path, model2):
dict_param1 = torch.load(model1_path) # model1_path: .pth model path
dict_param2 = model2.state_dict()
for name2 in dict_param2:
if name2 in dict_param1:
# print("tensor '%s' found in both models, so copy it from model 1 to model 2" % name2)
dict_param2[name2].data.copy_(dict_param1[name2].data)
model2.load_state_dict(dict_param2)
return model2
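# Hypothetical usage (the path and class below are placeholders, not taken from the repo's scripts):
#   small_dec = load_param2("path/to/pretrained_decoder.pth", SmallDecoder5_16x())
# Only tensors whose names appear in both state dicts are copied; all other params are left untouched.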
# -----------------------------------------------
class SmallDecoder1_16x(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallDecoder1_16x, self).__init__()
self.fixed = fixed
self.conv11 = nn.Conv2d(24,3,3,1,0, dilation=1)
self.relu = nn.ReLU(inplace=True)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.relu(self.conv11(self.pad(y)))
return y
def forward_pwct(self, input):
out11 = self.conv11(self.pad(input))
return out11
class SmallDecoder2_16x(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallDecoder2_16x, self).__init__()
self.fixed = fixed
self.conv21 = nn.Conv2d( 32, 16,3,1,0)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv11 = nn.Conv2d( 16, 3,3,1,0, dilation=1)
self.relu = nn.ReLU(inplace=True)
self.unpool = nn.UpsamplingNearest2d(scale_factor=2)
self.unpool_pwct = nn.MaxUnpool2d(kernel_size=2, stride=2)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.relu(self.conv21(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv12(self.pad(y)))
y = self.relu(self.conv11(self.pad(y)))
return y
def forward_pwct(self, x, pool1_idx=None, pool1_size=None, pool2_idx=None, pool2_size=None, pool3_idx=None, pool3_size=None):
out21 = self.relu(self.conv21(self.pad(x)))
out21 = self.unpool_pwct(out21, pool1_idx, output_size=pool1_size)
out12 = self.relu(self.conv12(self.pad(out21)))
out11 = self.conv11(self.pad(out12))
return out11
class SmallDecoder3_16x(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallDecoder3_16x, self).__init__()
self.fixed = fixed
self.conv31 = nn.Conv2d( 64, 32,3,1,0)
self.conv22 = nn.Conv2d( 32, 32,3,1,0)
self.conv21 = nn.Conv2d( 32, 16,3,1,0)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv11 = nn.Conv2d( 16, 3,3,1,0, dilation=1)
self.relu = nn.ReLU(inplace=True)
self.unpool = nn.UpsamplingNearest2d(scale_factor=2)
self.unpool_pwct = nn.MaxUnpool2d(kernel_size=2, stride=2)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.relu(self.conv31(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv22(self.pad(y)))
y = self.relu(self.conv21(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv12(self.pad(y)))
y = self.relu(self.conv11(self.pad(y)))
return y
def forward_pwct(self, x, pool1_idx=None, pool1_size=None, pool2_idx=None, pool2_size=None, pool3_idx=None, pool3_size=None):
out31 = self.relu(self.conv31(self.pad(x)))
out31 = self.unpool_pwct(out31, pool2_idx, output_size=pool2_size)
out22 = self.relu(self.conv22(self.pad(out31)))
out21 = self.relu(self.conv21(self.pad(out22)))
out21 = self.unpool_pwct(out21, pool1_idx, output_size=pool1_size)
out12 = self.relu(self.conv12(self.pad(out21)))
out11 = self.conv11(self.pad(out12))
return out11
class SmallDecoder4_16x(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallDecoder4_16x, self).__init__()
self.fixed = fixed
self.conv41 = nn.Conv2d(128, 64,3,1,0)
self.conv34 = nn.Conv2d( 64, 64,3,1,0)
self.conv33 = nn.Conv2d( 64, 64,3,1,0)
self.conv32 = nn.Conv2d( 64, 64,3,1,0)
self.conv31 = nn.Conv2d( 64, 32,3,1,0)
self.conv22 = nn.Conv2d( 32, 32,3,1,0, dilation=1)
self.conv21 = nn.Conv2d( 32, 16,3,1,0, dilation=1)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv11 = nn.Conv2d( 16, 3,3,1,0, dilation=1)
self.relu = nn.ReLU(inplace=True)
self.unpool = nn.UpsamplingNearest2d(scale_factor=2)
self.unpool_pwct = nn.MaxUnpool2d(kernel_size=2, stride=2)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.relu(self.conv41(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv34(self.pad(y)))
y = self.relu(self.conv33(self.pad(y)))
y = self.relu(self.conv32(self.pad(y)))
y = self.relu(self.conv31(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv22(self.pad(y)))
y = self.relu(self.conv21(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv12(self.pad(y)))
y = self.relu(self.conv11(self.pad(y)))
return y
def forward_pwct(self, x, pool1_idx=None, pool1_size=None, pool2_idx=None, pool2_size=None, pool3_idx=None, pool3_size=None):
out41 = self.relu(self.conv41(self.pad(x)))
out41 = self.unpool_pwct(out41, pool3_idx, output_size=pool3_size)
out34 = self.relu(self.conv34(self.pad(out41)))
out33 = self.relu(self.conv33(self.pad(out34)))
out32 = self.relu(self.conv32(self.pad(out33)))
out31 = self.relu(self.conv31(self.pad(out32)))
out31 = self.unpool_pwct(out31, pool2_idx, output_size=pool2_size)
out22 = self.relu(self.conv22(self.pad(out31)))
out21 = self.relu(self.conv21(self.pad(out22)))
out21 = self.unpool_pwct(out21, pool1_idx, output_size=pool1_size)
out12 = self.relu(self.conv12(self.pad(out21)))
out11 = self.conv11(self.pad(out12))
return out11
class SmallDecoder5_16x(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallDecoder5_16x, self).__init__()
self.fixed = fixed
self.conv51 = nn.Conv2d(128,128,3,1,0)
self.conv44 = nn.Conv2d(128,128,3,1,0)
self.conv43 = nn.Conv2d(128,128,3,1,0)
self.conv42 = nn.Conv2d(128,128,3,1,0)
self.conv41 = nn.Conv2d(128, 64,3,1,0)
self.conv34 = nn.Conv2d( 64, 64,3,1,0, dilation=1)
self.conv33 = nn.Conv2d( 64, 64,3,1,0, dilation=1)
self.conv32 = nn.Conv2d( 64, 64,3,1,0, dilation=1)
self.conv31 = nn.Conv2d( 64, 32,3,1,0, dilation=1)
self.conv22 = nn.Conv2d( 32, 32,3,1,0, dilation=1)
self.conv21 = nn.Conv2d( 32, 16,3,1,0, dilation=1)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv11 = nn.Conv2d( 16, 3,3,1,0, dilation=1)
self.relu = nn.ReLU(inplace=True)
self.unpool = nn.UpsamplingNearest2d(scale_factor=2)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.relu(self.conv51(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv44(self.pad(y)))
y = self.relu(self.conv43(self.pad(y)))
y = self.relu(self.conv42(self.pad(y)))
y = self.relu(self.conv41(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv34(self.pad(y)))
y = self.relu(self.conv33(self.pad(y)))
y = self.relu(self.conv32(self.pad(y)))
y = self.relu(self.conv31(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv22(self.pad(y)))
y = self.relu(self.conv21(self.pad(y)))
y = self.unpool(y)
y = self.relu(self.conv12(self.pad(y)))
y = self.relu(self.conv11(self.pad(y))) # self.conv11(self.pad(y))
return y
def forward_branch(self, input):
out51 = self.relu(self.conv51(self.pad(input)))
out51 = self.unpool(out51)
out44 = self.relu(self.conv44(self.pad(out51)))
out43 = self.relu(self.conv43(self.pad(out44)))
out42 = self.relu(self.conv42(self.pad(out43)))
out41 = self.relu(self.conv41(self.pad(out42)))
out41 = self.unpool(out41)
out34 = self.relu(self.conv34(self.pad(out41)))
out33 = self.relu(self.conv33(self.pad(out34)))
out32 = self.relu(self.conv32(self.pad(out33)))
out31 = self.relu(self.conv31(self.pad(out32)))
out31 = self.unpool(out31)
out22 = self.relu(self.conv22(self.pad(out31)))
out21 = self.relu(self.conv21(self.pad(out22)))
out21 = self.unpool(out21)
out12 = self.relu(self.conv12(self.pad(out21)))
out11 = self.relu(self.conv11(self.pad(out12)))
return out11
# bridge the channel-dimension mismatch using a 1x1 conv layer
class SmallEncoder1_16x_aux(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallEncoder1_16x_aux, self).__init__()
self.fixed = fixed
self.conv0 = nn.Conv2d(3,3,1,1,0)
self.conv0.requires_grad = False
self.conv11 = nn.Conv2d( 3, 24, 3, 1, 0, dilation=1)
self.conv11_aux = nn.Conv2d( 24, 64, 1, 1, 0)
self.relu = nn.ReLU(inplace=True)
self.pool = nn.MaxPool2d(kernel_size=2, stride=2, return_indices=False)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
# "forward" only outputs the final output
# "forward_branch" outputs all the middle branch ouputs
# "forward_aux" outputs all the middle auxiliary mapping layers
def forward(self, y):
y = self.conv0(y)
y = self.relu(self.conv11(self.pad(y)))
return y
def forward_branch(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
return out11,
def forward_aux(self, input, relu=True):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
if relu:
out11_aux = self.relu(self.conv11_aux(out11))
else:
out11_aux = self.conv11_aux(out11)
return out11_aux,
def forward_aux2(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out11_aux = self.relu(self.conv11_aux(out11))
return out11_aux, out11 # used for feature loss and style loss
class SmallEncoder2_16x_aux(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallEncoder2_16x_aux, self).__init__()
self.fixed = fixed
self.conv0 = nn.Conv2d(3,3,1,1,0)
self.conv0.requires_grad = False
self.conv11 = nn.Conv2d( 3, 16,3,1,0, dilation=1)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv21 = nn.Conv2d( 16, 32,3,1,0)
self.conv11_aux = nn.Conv2d( 16, 64,1,1,0)
self.conv21_aux = nn.Conv2d( 32,128,1,1,0)
self.relu = nn.ReLU(inplace=True)
self.pool = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=False)
self.pool2 = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=True)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.conv0(y)
y = self.relu(self.conv11(self.pad(y)))
y = self.relu(self.conv12(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv21(self.pad(y)))
return y
def forward_branch(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
return out11, out21
def forward_aux(self, input, relu=True):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
if relu:
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
else:
out11_aux = self.conv11_aux(out11)
out21_aux = self.conv21_aux(out21)
return out11_aux, out21_aux
def forward_aux2(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
return out11_aux, out21_aux, out21 # used for feature loss and style loss
def forward_pwct(self, input): # for function in photo WCT
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
pool12, out12_ix = self.pool2(out12)
out21 = self.relu(self.conv21(self.pad(pool12)))
return out21, out12_ix, out12.size()
class SmallEncoder3_16x_aux(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallEncoder3_16x_aux, self).__init__()
self.fixed = fixed
self.conv0 = nn.Conv2d(3,3,1,1,0)
self.conv0.requires_grad = False
self.conv11 = nn.Conv2d( 3, 16,3,1,0, dilation=1)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv21 = nn.Conv2d( 16, 32,3,1,0)
self.conv22 = nn.Conv2d( 32, 32,3,1,0)
self.conv31 = nn.Conv2d( 32, 64,3,1,0)
self.conv11_aux = nn.Conv2d( 16, 64,1,1,0)
self.conv21_aux = nn.Conv2d( 32,128,1,1,0)
self.conv31_aux = nn.Conv2d( 64,256,1,1,0)
self.relu = nn.ReLU(inplace=True)
self.pool = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=False)
self.pool2 = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=True)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.conv0(y)
y = self.relu(self.conv11(self.pad(y)))
y = self.relu(self.conv12(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv21(self.pad(y)))
y = self.relu(self.conv22(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv31(self.pad(y)))
return y
def forward_branch(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
return out11, out21, out31
def forward_aux(self, input, relu=True):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
if relu:
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
out31_aux = self.relu(self.conv31_aux(out31))
else:
out11_aux = self.conv11_aux(out11)
out21_aux = self.conv21_aux(out21)
out31_aux = self.conv31_aux(out31)
return out11_aux, out21_aux, out31_aux
def forward_aux2(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
out31_aux = self.relu(self.conv31_aux(out31))
return out11_aux, out21_aux, out31_aux, out31 # used for feature loss and style loss
def forward_pwct(self, input): # for function in photo WCT
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
pool12, out12_ix = self.pool2(out12)
out21 = self.relu(self.conv21(self.pad(pool12)))
out22 = self.relu(self.conv22(self.pad(out21)))
pool22, out22_ix = self.pool2(out22)
out31 = self.relu(self.conv31(self.pad(pool22)))
return out31, out12_ix, out12.size(), out22_ix, out22.size()
class SmallEncoder4_16x_aux(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallEncoder4_16x_aux, self).__init__()
self.fixed = fixed
self.conv0 = nn.Conv2d(3,3,1,1,0)
self.conv0.requires_grad = False
self.conv11 = nn.Conv2d( 3, 16,3,1,0, dilation=1)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv21 = nn.Conv2d( 16, 32,3,1,0, dilation=1)
self.conv22 = nn.Conv2d( 32, 32,3,1,0, dilation=1)
self.conv31 = nn.Conv2d( 32, 64,3,1,0)
self.conv32 = nn.Conv2d( 64, 64,3,1,0)
self.conv33 = nn.Conv2d( 64, 64,3,1,0)
self.conv34 = nn.Conv2d( 64, 64,3,1,0)
self.conv41 = nn.Conv2d( 64,128,3,1,0)
self.conv11_aux = nn.Conv2d( 16, 64,1,1,0)
self.conv21_aux = nn.Conv2d( 32,128,1,1,0)
self.conv31_aux = nn.Conv2d( 64,256,1,1,0)
self.conv41_aux = nn.Conv2d(128,512,1,1,0)
self.relu = nn.ReLU(inplace=True)
self.pool = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=False)
self.pool2 = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=True)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.conv0(y)
y = self.relu(self.conv11(self.pad(y)))
y = self.relu(self.conv12(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv21(self.pad(y)))
y = self.relu(self.conv22(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv31(self.pad(y)))
y = self.relu(self.conv32(self.pad(y)))
y = self.relu(self.conv33(self.pad(y)))
y = self.relu(self.conv34(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv41(self.pad(y)))
return y
def forward_branch(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
return out11, out21, out31, out41
def forward_pwct(self, input): # for function in photo WCT
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
pool12, out12_ix = self.pool2(out12)
out21 = self.relu(self.conv21(self.pad(pool12)))
out22 = self.relu(self.conv22(self.pad(out21)))
pool22, out22_ix = self.pool2(out22)
out31 = self.relu(self.conv31(self.pad(pool22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
pool34, out34_ix = self.pool2(out34)
out41 = self.relu(self.conv41(self.pad(pool34)))
return out41, out12_ix, out12.size(), out22_ix, out22.size(), out34_ix, out34.size()
def forward_aux(self, input, relu=True):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
if relu:
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
out31_aux = self.relu(self.conv31_aux(out31))
out41_aux = self.relu(self.conv41_aux(out41))
else:
out11_aux = self.conv11_aux(out11)
out21_aux = self.conv21_aux(out21)
out31_aux = self.conv31_aux(out31)
out41_aux = self.conv41_aux(out41)
return out11_aux, out21_aux, out31_aux, out41_aux
def forward_aux2(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
out31_aux = self.relu(self.conv31_aux(out31))
out41_aux = self.relu(self.conv41_aux(out41))
return out11_aux, out21_aux, out31_aux, out41_aux, out41 # used for feature loss and style loss
class SmallEncoder5_16x_aux(nn.Module):
def __init__(self, model=None, fixed=False):
super(SmallEncoder5_16x_aux, self).__init__()
self.fixed = fixed
self.conv0 = nn.Conv2d(3,3,1,1,0)
self.conv0.requires_grad = False
self.conv11 = nn.Conv2d( 3, 16,3,1,0, dilation=1)
self.conv12 = nn.Conv2d( 16, 16,3,1,0, dilation=1)
self.conv21 = nn.Conv2d( 16, 32,3,1,0, dilation=1)
self.conv22 = nn.Conv2d( 32, 32,3,1,0, dilation=1)
self.conv31 = nn.Conv2d( 32, 64,3,1,0, dilation=1)
self.conv32 = nn.Conv2d( 64, 64,3,1,0, dilation=1)
self.conv33 = nn.Conv2d( 64, 64,3,1,0, dilation=1)
self.conv34 = nn.Conv2d( 64, 64,3,1,0, dilation=1)
self.conv41 = nn.Conv2d( 64,128,3,1,0)
self.conv42 = nn.Conv2d(128,128,3,1,0)
self.conv43 = nn.Conv2d(128,128,3,1,0)
self.conv44 = nn.Conv2d(128,128,3,1,0)
self.conv51 = nn.Conv2d(128,128,3,1,0)
self.conv11_aux = nn.Conv2d( 16, 64,1,1,0)
self.conv21_aux = nn.Conv2d( 32,128,1,1,0)
self.conv31_aux = nn.Conv2d( 64,256,1,1,0)
self.conv41_aux = nn.Conv2d(128,512,1,1,0)
self.conv51_aux = nn.Conv2d(128,512,1,1,0)
self.relu = nn.ReLU(inplace=True)
self.pool = nn.MaxPool2d(kernel_size=2,stride=2,return_indices=False)
self.pad = nn.ReflectionPad2d((1,1,1,1))
if model:
weights = torch.load(model, map_location=lambda storage, location: storage)
if "model" in weights:
self.load_state_dict(weights["model"])
else:
self.load_state_dict(weights)
print("load model '%s' successfully" % model)
if fixed:
for param in self.parameters():
param.requires_grad = False
def forward(self, y):
y = self.conv0(y)
y = self.relu(self.conv11(self.pad(y)))
y = self.relu(self.conv12(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv21(self.pad(y)))
y = self.relu(self.conv22(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv31(self.pad(y)))
y = self.relu(self.conv32(self.pad(y)))
y = self.relu(self.conv33(self.pad(y)))
y = self.relu(self.conv34(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv41(self.pad(y)))
y = self.relu(self.conv42(self.pad(y)))
y = self.relu(self.conv43(self.pad(y)))
y = self.relu(self.conv44(self.pad(y)))
y = self.pool(y)
y = self.relu(self.conv51(self.pad(y)))
return y
def forward_branch(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
out42 = self.relu(self.conv42(self.pad(out41)))
out43 = self.relu(self.conv43(self.pad(out42)))
out44 = self.relu(self.conv44(self.pad(out43)))
out44 = self.pool(out44)
out51 = self.relu(self.conv51(self.pad(out44)))
return out11, out21, out31, out41, out51
def forward_aux(self, input, relu=True):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
out42 = self.relu(self.conv42(self.pad(out41)))
out43 = self.relu(self.conv43(self.pad(out42)))
out44 = self.relu(self.conv44(self.pad(out43)))
out44 = self.pool(out44)
out51 = self.relu(self.conv51(self.pad(out44)))
if relu:
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
out31_aux = self.relu(self.conv31_aux(out31))
out41_aux = self.relu(self.conv41_aux(out41))
out51_aux = self.relu(self.conv51_aux(out51))
else:
out11_aux = self.conv11_aux(out11)
out21_aux = self.conv21_aux(out21)
out31_aux = self.conv31_aux(out31)
out41_aux = self.conv41_aux(out41)
out51_aux = self.conv51_aux(out51)
return out11_aux, out21_aux, out31_aux, out41_aux, out51_aux
def forward_aux2(self, input):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
out42 = self.relu(self.conv42(self.pad(out41)))
out43 = self.relu(self.conv43(self.pad(out42)))
out44 = self.relu(self.conv44(self.pad(out43)))
out44 = self.pool(out44)
out51 = self.relu(self.conv51(self.pad(out44)))
out11_aux = self.relu(self.conv11_aux(out11))
out21_aux = self.relu(self.conv21_aux(out21))
out31_aux = self.relu(self.conv31_aux(out31))
out41_aux = self.relu(self.conv41_aux(out41))
out51_aux = self.relu(self.conv51_aux(out51))
return out11_aux, out21_aux, out31_aux, out41_aux, out51_aux, out51 # output out51
def forward_aux3(self, input, relu=False):
out0 = self.conv0(input)
out11 = self.relu(self.conv11(self.pad(out0)))
out12 = self.relu(self.conv12(self.pad(out11)))
out12 = self.pool(out12)
out21 = self.relu(self.conv21(self.pad(out12)))
out22 = self.relu(self.conv22(self.pad(out21)))
out22 = self.pool(out22)
out31 = self.relu(self.conv31(self.pad(out22)))
out32 = self.relu(self.conv32(self.pad(out31)))
out33 = self.relu(self.conv33(self.pad(out32)))
out34 = self.relu(self.conv34(self.pad(out33)))
out34 = self.pool(out34)
out41 = self.relu(self.conv41(self.pad(out34)))
out42 = self.relu(self.conv42(self.pad(out41)))
out43 = self.relu(self.conv43(self.pad(out42)))
out44 = self.relu(self.conv44(self.pad(out43)))
out44 = self.pool(out44)
out51 = self.relu(self.conv51(self.pad(out44)))
if relu:
out51_aux = self.relu(self.conv51_aux(out51))
else:
out51_aux = self.conv51_aux(out51)
return out11, out21, out31, out41, out51, out51_aux | 39.170991 | 143 | 0.656862 | 5,219 | 33,217 | 4.084499 | 0.05001 | 0.094948 | 0.136792 | 0.029085 | 0.886522 | 0.879439 | 0.877046 | 0.8527 | 0.839002 | 0.830699 | 0 | 0.105277 | 0.188157 | 33,217 | 848 | 144 | 39.170991 | 0.685208 | 0.038203 | 0 | 0.864065 | 0 | 0 | 0.011905 | 0 | 0 | 0 | 0 | 0 | 0.002692 | 1 | 0.065949 | false | 0 | 0.010767 | 0 | 0.142665 | 0.013459 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
165ad261dea1c02130c6007e689e18cc670276b3 | 39,182 | py | Python | source/gradient-descent/simulation.py | consideRatio/jupyter-math | 6952cddbb9e72b700e1216c3097adc5c3366f52e | [
"MIT"
] | 3 | 2018-05-02T00:27:36.000Z | 2021-09-27T00:51:37.000Z | source/gradient-descent/simulation.py | consideRatio/jupyter-math | 6952cddbb9e72b700e1216c3097adc5c3366f52e | [
"MIT"
] | null | null | null | source/gradient-descent/simulation.py | consideRatio/jupyter-math | 6952cddbb9e72b700e1216c3097adc5c3366f52e | [
"MIT"
] | 1 | 2019-11-23T18:31:00.000Z | 2019-11-23T18:31:00.000Z |
import textwrap
import numpy as np
from math import isclose
from IPython.display import display, HTML, Markdown
from visualization import Vis
fail_base64 = '<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJQAAABACAYAAAD8k17tAAAABmJLR0QA/wD/AP+gvaeTAAAACXBIWXMAAC4jAAAuIwF4pT92AAAAB3RJTUUH4QoTDxQrkJ7rfwAAIABJREFUeNrtvXe0b9dV3/tZZfdfO/2e23WvqiVZliV3I3disMGATTEMHi3wgISWBDCQ8BjvhYQEQgjJCPB4EEyIg2MwxmDANriAsY1VLEuy2pVu0W3nnvpru672/lAkmni89gdn5H7H2GP8zm+tvdae8/c9a88591xziyvve39PPfTA68srG7vpYHhdiGPb7u3Jfq8fEO78fHuGXl9N1eXLo3hpNKh3tqfRYMkJ2xQyeB9kEoQ3pkmTKJrXdas5kxw63DUPfn6pOHz0WqkEZnfcIrwxTqp4aVFuF9GHjn/DN82vfOB3+ukDD9+QeL8SEpntbO1dGeV5HFrn5YlDy+VjZx/ThxZqYcTzuyC6xHQmGhWldGLF7k1KL9GRksoKeWF2+NCZ5KGHjxZLiyeNbarg6VScpCrPMztp2/n64L7eytpB8fiTyzi0a+a1jOPIWS81vpp09iwLhZBWrBem7rleL1Y7k9JlyVxl+aosUi0vb5bBOOtjUZRtu+mPHdvMJvPnR963PosHkYqaVkYitGVXVLZtu1KErC+rrcun47X1vog4FGeFL1tTFsLJsDMVXVcho9j6rd1tuzAycn1N5ucvLFmlBvSH0857GS5cbtXbvuzymc1L41Fw3c3f+wOTCz/8w69MQzjA3k7ldNzKtopI810T6yW1tWP9aCQnx695ODJdlp67lDjpUYkq5LhOiptOrMzOnD1TmtAmJ683Po9t7+JG0u5uLUutB865LbmyiHjybO4PHuqrenqeY8edu3TJ0QYdDiwX/vS5pL+6quqL521y+OCgKptp7+S1n9L2Tz/x7fZd7/rJpG1pI0nfeFoRsG1HtbhCWk8h66E2NymFw2V9CmPZiyM0kkI6up0JLC0gncBm2tPLg5xVqu0siffIeg46ojt0iG5jg6V+b+/Rn//lcjlSPTmbjVocyWTO4MQJwrlzmKNH0e+9RLo4QjYttXfI1mJuvoH8sVMYW+NkgmwbnNaYo8eQ493JovXDajJGIInTmMZ5EikgyQm9rGmXF9Lk848SVpYRdc28MgwjiU80wyC86/dpxhMZ5yldVaOEgtEAMynxwpF5iW1K0q4l6Q+oEtlEtUkFYKSAOKZuagqdQjXHFBmqaRmkRYiUEE3VYFJF7CVEAt90MBiBirDjLYbOe5umQsxb0b7m5a744AcVdUsdJ+y97UveePz6G6+Y0twC/Jp/8OFf0/d86phrDUJqpIQ6ShDBMT96DUuPPYpf7E+ysou98rGOEgioOZLyd2uiLGexmdtMR64CK/M8zr2IwnSMiyJUnODahtYG9PFDTn34o8LNZ14UQ+GVUGI8xuYZ+XQOWYzIUirxxi+Uup5HqqlROEZ7M+bOMqgb0hBItq4g8oJkb49UBvTyKlIrImcpnEUKT1k2tN/73cjZjKbcI53VUu/sqnwyIdQ1XS8jDgHjPHpxGSklZjhYWL7l5sNxvxilwRNagwieoBV9Z4k3r2CdZb63R6UUg+mMeNQjmpbMuxbhJEljkQjM2grxW75s2t/eGVbTPYxQkGpmowU84LyjbWvSra10cP+D1FrRzUvCZE517BBzINme0K6vy/FkKnPhMVvb1FKQ7ewy7mr6t99GZi1hvouwHZ1UKBtIqi7NpmPkfE4sFKZzDFqHnOyhGkdvbw8Xx5THD4uubkB41GSGth1yUhF50Ls7yPEuK9bREqQaz4SopojF1e0uyLaWgf5sxtDxg741/04u5ubR//JbX6qb6cX5vCG3DlGV2OGAdLKHnc8ZPP8mqkgR74yHrmszMatUuzdRzWxGv2tJypJ0extdNXqyvZvoti3E7jRKrlyknZeEXp+4nBLtjckmO4hTTyo73pNx1Wo/3VF+OiNUU6K9PcJCgaoa+rszorr16h0veOEr43vve32zvIRsKmIvUKZGJhljJeD5L0Ds7EBVkbcNCEVkDI11RO/4YeI/+D04dw7ReWolSYMlkRplDLUIRM6RG0PTy3FnzoE1ZPM5cnMDOZ0yns6IAgQcuqrwXYMVEplqEuvIZhXN8UMk05Js6woGiRwUJCEAgdA0RHf/WaKNQ+qYcn2NnnUo06FjzbTzEDx5qmnTjGw6o+xlCOfJmobUGLa8RR09xvKjj1IZS5RmqLpBxxHeGOSTjxOMpRaK+O1fg/nUp9De0WUp+sYbqHYnZPMZUVfj1g/QtIa4nzE5cT0LGxfxSGoJiYCk7TABDAIfwJsOrr0B18vh8HHmC31aD3FRpNF8GsXWCGUs9WhxYzuPRzd+5dv/yfKtN26Y//TOf81kHI37ffTyMnZ3TKEFUVPjHjtFHEXYsiReXsBUFTngrEPjqADlPXGsMTpCtx3SWJwWyCQl294EG7BFhg4CnyR0TUdLIE8LfJqSz6eYJMJNG1SkaQXY66/7z5JUBx8JUusxd72KyBhanVIKyZKxJB//Y1Tep15bZ9pbZJ4klMOMiQwEofFRTFFVxKZhIBVZY2B1GasFqYiIe31qqZFVTfp1bycZjog7h6lrRN0hR31CrGkdeC+wwwHuwDqydqgspS1S3FMbWAHbR44Sa0nXNOz6jqTrkF7ggmCqFJiWvGloraNzjqisWQuOFImcN4S9Ma43YKlqSOKEQdvgtaaQkvjSZSejiERIfH+ID57KOeIgIMlorSFeW8P/t3djX3gHk5e/lHDyJM2VLXxwdGlM0AnJdAKRpprVhHpCPVpE7E4QkwlM5zgZMB7EW7+MxjSEYoA6/Rj+4mW6++5mdP+DxDfeiP+Tj2m/uSPSyQzZlsSReuLgwWO/JoSot7//h367ffCBrKtK4q0t1Nu+EtU2eONxKiGKIlpniG67Dbe1h8577K0sEwlPLTRFnBGpmFZJJND5p4nWSY2wDi81k7VlugAigDUd/V5BlKfYpsQWOY0JGAdF29DMZyTOka6s9NX3v+COY9lDn3+rEYF04wrloEdWV3jraIwnlo4QIMEznkwonCHxEDuP+OSn0RKaziCFQvRyoqrB7W7jVcTMOFIEXgWUE4THHiW1DV2aM481ubFEZQsCCh+Im4o5EG9uEfczvNaYsiFTAVtWJMHhyxaZKNrSoJ0hJqCFgFjhooR4totqG2rnia2jw2Pals55QlGg2oq0CWwHj1pZI+zuoNMEL4QwSNGmMbqak3cGJzyil1BOKtRggNqdIFxA1jWJEPQfeYS0NVTBEroOGQnaskYbS5ZF6M0dxtYh0ohmuIhvWxJjibQge+Qxes7jBbRthRIJvVjjhCc+d55Ya1RTYYQgiTP8fH5w8bu/+we+9sTx/6x+9mdf228byAoS1+E+80nizuCFJkQSo2PCeEy+PacRHZGzGB9AalItMF2HxODiPsZZCm9oO0ukIqRp6KRDf9t3ED/0INK0CClxsxmx
jOg0JOMJ1hi0UsiioGtqxGiIvf7aX9Mi+G53YUixs0fVdcSHDtOkMeYrvvqXote++r1XfvcDbxmpdKdRIRaHl9v20SeOOJW0tWl6ubdVE2Qil5f07MqVqcpSxnnmiqbtqqCTSHm17Y1WrUgyaofIdSWCyF7zuj88+Q++/fee+t7v/ubiyKGtEGS2df/nj+fVdLUpisi1bd20ViOcVHEsytY0UmXBtWNFnjnbkReJDM3aQafq9ny9cf6AkzLyTobe8pKbbG1H5KnX1mofpZ2PVdT1iwf09Tc8KnZ3r7185uyJ7MjBevvMmXbZPP+AsK00klg7562IQ91ZbbJEKulpJq2Tq/1Zmeenc89tbmcrMXm+22u9mhw9kmrvldWx0JotXvv6M+NP3P3yXrPXjVVK7+YbHml2x3l64viFa//xP/l4N9kYnv72d3xn38/KoLO+nU5sf2WU6L3GeTq5HWKRveS2z3WnLy26OKyHsxdT2VVVZ7q4CsJO/4+f/6lrv+cH/5eNs6fvnXz+8VtRhHl/IJk2XepqjVDeOeuTEKIQK7nnfSvjPDZSuPyLvuiXd9EyfujzbxaxtXZnvOH25i+hHLtdvAlCS6M1C4cPn+mm9ciePWf8TTcfjALCvejOPxHTvX5c1otia+dA2bV39D7xJ0ddHFF6hxyMCJ2HNjRsfN8/+l8v9Hphur4e9g6shitZHq4cPhRCCNdwFX8NIYS/rV38Px3vmeNvGutvaHvmEH/x72fO++/HM5+f8/r/Sr+/Ns8z7X917tPf+Z3/Yrw0CNuDhbB97HjYXlwI8wNr4cr3fdcXa2FskwvYbluSuqEfxUz6fSaQXKXPX4cQ4m9rD/9/jffMWM/V5y9899f6/JVrCH/TfH/btf5N7XJj4xpdeUTk8TvbxFKxl0YwKysZDfIsSIU2HUFLGhVQexPcH33w5FX6XMVzrYyinicoQcgzImOZSoGoa3oiiqWpKt1kKX0HqVJIC3J3h73/8u47rqrvKp5rgUp6o8wLgakt21lM6hyFt3TSCzmPk13dtISVBczx6/G2IxsOkc87GV/V3VU8x23Qd1rPNoc5od9nFCQ2TRBFj64sO5lJdYlen7o/DO2Lbm+tD4T5jMypi1fVdxXP5SiYbp5lxITZlNga2JvSSQ/FwljWTflw2evTv3BR7N13XxJJgSLgppW+qr6reC5O9UPU9Y3Bra4yy1NcnmAOHic7fvQ6ObThddGl87TGMlzs09x+O15q5Itu2buqu6t4Lu+yW13aboMnEaCtx2Up8jWv9uUD9zwhfZZdKHSMF4G1j3yS4r7P0VhH+V/ffzUOdRXPecuTFy6vNatrsL2NNx2DrkP82q+03cETM1mbbm+e52TGgJR0t91CHKeoxeJqHOoqnpNT3foB19+6RGwtsQ8Ea/G9hWD6kZOR0C+Pd7cRWcautAwffRRTznHz2u8H6c7+83/1/04rZ87/f9Pq6VP/l+0b7/stHvmGb/2/Mc4T+45R8dZWGM87HAGR5zRZH5Fl+VrUu1PHWRS3OqGRgoVsQKkhGgxoNHY/CHf8n/4gZ//tz7243ThzW3zd9d3obW/9s4XRwqMhhBRYO/ejP/Yq8iRxxgQXpI2ztD7+A9//7lBIADb/3c9/yXz33JFQuUaurqhweavpr68M/ckT4/byls/vvH3HPPjA0nhW/cZN3/c9HcCV3/ltxInrCCEsb/7Uv3ilG6735Quf9/jwBXd6HnvCTe7/9NEDX/bl7wPYeM97vtan6cUDr3zJtJLZ5Mp/+Devi5pQFtddL6LXvO594siBcr8Ryo/3QnZgHTEd4+oJ5dIa+fUnUZsXOy1mcyuakvpld1HcfzexV8zxpMnQ7RcB9an7fzL+uV+4iyJn9jP/gce+8A3u8de8RhWbOyw9/ghBxph+jixrJosD7BOP3S9XDz1mHn349otf+pb3r545Q90foI1jlmimTctCa5nEEpFnpCKusu/5jk8ATwGsfclb2P3cZ7/wkeNHPjjcm5ICFZLxscOUFy7Qu/7GzwHvA3Dv/51fFu/7rWQzdLSDFXr1lLrq6A6uceHuN78VeO9+s8uD9zod71BaQ3bkGpLbbiOcOs3esfXL2sepiNIcfd+naZYPMN+6TH3wMIf/zY99gF/4D/tCwqblwqgo8EKRbTxF7+I5VXcW/YqXOS6eVw2e1DvqLCGrG8pHHz8CPDa9/8HlePZ0tqMQCmcssYoYtSVewcEkohMxKJel2/P1ZwgFEAe3tLo7RQVQztJXGnfmDMpY2lF//izZjx/Zztr6kBCCdD4jSEGsFX5vzMp8LvejDeWS3M+vO4l65GFs3eG/+IvHdnn1s7GWkQxRdEU5R0hSyo2L9KKY0fY25a+/73n7RcJskGltLSrTlCHC5Dl+oY94+UuauqmRccLUOBprcOvrZKPRKoCY7V2XzGbMAiAhFhplW86vjJBpjotibDlFxKmIjq4d+0uT2rBb5ikuUvhhj9l1R3BJRqxjxMrKsyGX3XMXLvvhEt36OiJP8Z3FaEVdDIItBrv7MXIgndHJhUvkOkNMtnAf/chI/9iPvsa9772xVOO92khBESSZVMi2pSzndBtbS/tFwsZ087btmNx6O/GrvoCmqvFlAz/9M0WSFSgp6AVLkWb4jcuU2zsVwPS+h17phWTUVE8nBi4Oab1gfVZRaUkVPD3r8btj2Nn7y4Qa5r2urLBdhzGO0aNPQaIJwSPOnj/8TLdCyxXrHH46wXUtMZJCS8KgJ+TNNwz3YRwqxEUxV11FiCWtkcS/+Zvoy5eJszyTod+7UdtA09bEC0OqvGAYF6hLZ4/tGyNxc7efDnvkG1vwqT8JwhkWZIAQYRd6gKBbWMBLRVv0KZYXHUB38fytiYLpykGaLKdtG4o0Bi0pnCcvO2yWYLwjqHDjX5xTNWG75wPFQg9Zt4Q8gdkcMeoT4uTZHJF49WBJWyKtQHtJlUdU1z+PptenefCRyX6MG0xWV8+raUVd12glSF7zGvy8JszmRpvd7T19x/Nx938en8REu1MiWzMtTb1v3NhMo4NiVs4ZvP4N9fwPPpT32g6lof4H3/1O/fBDv7BniUexqPUNN8SdkJ8B6Jdufao1xcULhF5Oly+gPGzFMaPJBBXHSGchTbAPPPyVwLeESxcQBw/T1dUBmcSYqoVIE5IYpTR7rUPPx9GzxJtsxol1SFfTCUWU9ukunGdUVvCG1x3bj4RKTl5nG60YAB5o772XTArc4YORjlfXFu1/eQ9ZElGP55hBjpwY4ihq94uAXetdGUkOmJr5Bz+Yr5U1VRoRq4jZRz4c3/R7H/zUc564msg0itFLy2wniqXNDaokJv/CN+Lf99soZWh7fdqqQl3e6AOIg4efMSR2qRu6LCHqDOO77qT3iY+zYj1XbGiemaIqzYg0Q9qATBXddEYSJcxGPdKy9PuRUE60j4jRArMsJtsaI2dTVH8Bc+FSJTsfHo8GGeOXvwIXJ8RtQBV9atOo/SJgLmRhkGx39ulk/eCQjSGfTSlInvNHe+Bd735p+OgnFsRsRikcK1tjnNd0q2v
4k9ds6TylzDK89yzFEnH2DCGE48+c33ppRZaggkAkEn1xg7pu2FxaRY/628/0i2bjNmpbujtfgD92nF7rIE/JPb5dGJzdlytU4w9Gkwn57oRGg7bgvXOliraklNGqCQL94EMk1RQXQVlXiDtetG9WKLu08JTynmXrMUWOThNEP8cBm2uL3XOds3L2zA16PMEWfZjMmC4OGCcCeeR4Obzhph/fCYHEdLg8w09KknnJmZ/8yS9/5vyezvK6aUl8QDlJd+E06hV3MTi8SjRaXn2mn+n3MxsrzJVNup095uuLmEhi5pUMVbsvMzri1h5HOKxzVAHqYZ9GBJUeWF6VbZH27B0vJJ3uooYj8rpDFgXxaDTeN57H7l6e1iWtN6RVx25jiI2HLOHoyWt2AUIIB0IIyyGEHCB7/avP2UEPXnQHLC5jX/JydNkwM+3m9NFHb+8rQXfgYJVee/24XRqi24ZY69ufdQSWe11y5BAzAS6SiLYj2dtGPfQAyekn/9yG2hvvhSDhyiZ6bwd2JsjKIPLY++0rZt8FoUIQ7o//8GDoLbCTRORBk3QN2jjS0o5kUjV74f4HsTKGK5s478jGU+TeXrRvCHXsWF1bh1hapgmGIk8phcPagP33v/CPr7z2NeHc7bdffuro4a0H3/ZV95z+uf/I/Jd+5a1J1VKbllxL8g9/kAUtWT60/nDk/NnYQjybZv7CuUEwHpxFnn7i4DO7QGKSqpzMSKSCOGE4XCB57HFaA12RP2suiAMHN3Rbk6AQEvTaAYQWBKmEHK3uxwVK0B9GTZqwNJ1jXI1xgWAN5e7lTamTZKakpnMN9cIQ0Ta4PGNy372L+0XCKEnO170+YXObVMfIKCbvHFHXogTkn76HhcceJxkskp5+cvb4z/0i1ZOnDop+j8Hdn6PeuYLv95gurSC9eCq67tg2kULNKqFncxkijT10FGHq3jM7QfSRgw/lx44Tuw5CwO/uMCE8HTHv2mftNr22utNkBeobv2YqTEDNS3ySkJWtFzsb3X6MlFdRSrG9DVqSO0GiNV2WoZ2PZW3M3T3tMDIi7QJ2aYFmPmf1uhML+0XC+VNn7kwjjbKGqqkI85JCKXyRQlvRZZoYT3jyFIu9YiW+5kZGTXsiLktkWyKjnHmUIuOInSx5yF7e/ZBZWEHaBjedgBCgBOEznxvuPPBQDjD91N1vsltbtAtDnLHUL3s54uhRKh/QJ048/qwHeubUnUoK7HU3/Z5bWgRTouYlxjQqitLD+zKwubT4uTp4hNKERGA7gwzB+5tvLaWP5CvNZMYQkPMp2hhiwJy7uG+egi/0i7SaVzRFio4iEJJJWyGcRxqLsjBfXaf7oR+Bb/2WT7z2t/8rsi37E+uRaUERDIM8ZdZakqqb1/N5iQxeDnLiYgidoZvMSDY2DjS/8ssDgHSoIx887prriKRA338/oW65sr6MIHo2hpfs7s1T46n+5U+8TSYJMsmRSYQ9sOrMaHFf5u1r60a5UFQhUKMwSUKU51RFhiwac1lJjTIdItJUTlKkGU2q940Hsl366UgBFrQQ6F6GX1yGxuBXl9k5dLBtbr91tnvDyZ8d7046gHrWLhaRpFSe0gvay5dZDq1Pjx7e6jY3bGeda71i5+hBIg+RkJjRcMSh9YMArUxP0bYkWxuEyYy8c6QblzmxtYuNGP65B7qcGzxre7vaNzWubGl1zLTp5gs/8I/O70dCiWHReRzy4EGUc+hyhppMZHr3PUjbS+8zB9abuVBIrYlSDV2DLO2+eSyw+u3f8psiyZGdQTQNyXQPPZ0/XRbottvfq7/8S5fc4aNrR259wfeNXv2S/xkgWl4u5gsLeC9QaYGuK1Tn/JbvxqPXv6n1vcGOEwp96kmcFESuw3z1V7374D/4jscA+jfe9On82DWopy7jBhlCWnSU4KxBjMe9Z73Bsu4a57G9AaHtCNqToFgKIrr8Bx/Nw/nNfUeoariw47xAbFwmSlPao0cojSOOop5UjmuT7c1UHT6AjWOiuqaWEdHycN94eaM7XvjpdtAjcR211gSZsHvrTWX1vBu896Y89uP/sjz273+6Xr75eX75BS91IQTphYvzjU3S4KmFJU9zyjjxq8Z+2eQPfuftbS9PdTWh8B2ybRAu4Kzvn73nkfS/2xK+U2HLS1AWyhPX4q3BBkdy6crgWYfh0BGfdS3MpgyqEi00814BmDy7909vEEf2n6c3ePvXfKrOE6K6IzaG5Mo2IU+hGAjJxsU72t6AZGMT0da0/YIujtmsJ/smsHn2x3/iy7PtTeoj19AtLRCamuzsU4U5c1panegQAkL8+b7V6kMffmtxz/3oYY/KGgaTkk4qQl3Gya++6weWfv3d/3Hh0UdGncpROoWs4IqAwVu+pHA72y9+Zhx5w/P+cG46PAFxcYPZaEBnHEbKZx8Oy2uOPxhHMVHbUDWG6bUn6Z0+g/bCy4X1/Zi+gvvld35pNp7QxZq2bhlbSyZFaBVTGVoXyd0drJJAQO3MyNqGwdRk+8ZIVH7QmICcbNHfmWCzjPzECVatw4e/XvBh/iefeIXzHY2OyYyjSWNcHJO+7W1UTYVdXyGaT9Ftje3m+Be9mKLrKD/wgUvXvuWLfv/ie39XAIilBa2OHaO684WEN76RdG+C1pqgZfoseZ984vk+eMYra9g04uD73/dv616Ga5pGH1zaDuPZvgts+gtPnZgsreCiCC9huWvoOtfafjGR8prjgsUlpLXIECFGPcygIOqndr8IKZACrcGBcoaqlxF/5m662hCdOHzmr1Yv6e777JfM1g8TbEWTRpRpQlFVlP/t3YTlJXxdM/3ar9tJQkA2UBcRaQhkv/CLb9/8Vz/x1Ye+4s1PVzw5fOCnB7OZiR85hf7jj+PXDhBZcCE8G4dSF855l8WIssa1jlPf+M3f2xMCVeR5/bGPXS9G/b/8g423/s7re0sIm9UzYhUxcI5uOIRUGBlnnewub9g9bxEhsGtqRGuQ0ymRLvr7hVCyrlcL0yGCIzhPMS3p0gQxymkff+rIX/kPk+rSUyeKo0eQL/4CZN2RzmpsUzLSMXZxmZBn6EuXl5QIkCiSj30K2Stw413U6bM/9SxZHnzke8u2iaztkAGi3S3GaytIwbMhF5emvSA0sQrofs6hT3xCdFVNrWOi/uhLznzbt37ThX/1r9+6/eEPvn7nc/euiNHK3/k41OpNN57qN47UdtQHn/aCfdn14/s/d0CLurWjrR1sJBlqBU2DzXvkX/Tqj/MzP7kvCBVfuHhkZ3GRyHqU86iFAfG8w2d9soj4aRvqz3PeXJwFf+89oo1i8jghSzU+kphBYdMfeccXdFFya/iJf/2DsdYnXRyTRIZo3tEsLDD7b79++KmHTr3yyM3XPnj6hhu/eqVsSETAllO0jBnsbCEPHHh2hfKLCzvmjheu6/sfwJdzmkNHqHZ2WPt3//bM/Cu/6luTSH+rGy3iraF+21tfCvydX6KMdbtz7RFKMb18CbG0yGonqLVOpehno7FWRF7QRAk6zYimM2Yf/+Rr98sKVdYTv7A7JqpmdGlGmJ
ZMlEdsbxHKmfkrRbaa2S3Pm4jg0Uogi5Q9qZ6uyjud2Xpjc374K77iF0MUn3eRoi0KPDG1DOijx0lPXo/71Edfbsbja/vzkk4IRBRRaE3UtYhen43h4Mln2Vs2VXjqAntdi+4VyL0xIwXt13ztNSqOcUJC1SLajsGRw6/fD/r2p564SUcpqbGsaM1a2WAD2IhGJo1rFrqW0LYkCrpmjoxTpvPJvtlGlfSKtBOgfCDpDFMBvc7Ruo7xyupf2lmy84H339m7cH4kgiSPEnxryPdKAo55MXRr3/aNT28wOH5ko5lMKcopflLB9/+IvfTqVwfRy5C/96EbZo8+eEiO93Ay0InAuLVs33oLYTKnf+DA5Wcdhro92j//FEvGUs7mRJFEukBEwGmNimLSeob1Fnfp8t/5DI8Qgoi6ZilpGoTpUMbStC2hnxMEBY9PAAAPvklEQVR0YaQf9oSUihBHJFWHFopxkVA4t28eXLpWqhSPFAr70hczyAuaOEYOBvhp9WyNyXD+NBu//5ET5vgxqhuup60burKkjoCqIb7lpstJOrgEII1t1aBPi6J69Ssxh1Y/s/Do41Z/4k8R1fTaWOseUpGaQB4cSZQyuHABpwLt+ade+Oy1Sa+Es5Dm6FtvxYynTIFWa1zbYpXG5AmmN4Is2xdJjVYobL+PDB6jJPQH0FpEV/bl7nw6tVmMFJJLSuJ9oHjFK9jd3G32DaFuOv7B8eFjzfzWWydu4/LerpZWE1zqWpdL8ewtTxw5QXzq87f1P/ZJ641pwpGjU7+6XDHsWZdmdNefPP9MNoFfWnigXV1l9xV3ufiHfvDHqqz/62KyF9Vf/80Pt3UzTEaLk90sRyXauyizTazduG2dVolJ4/RZh6ZWcesGC6ZeWpypCxfbcOSAy+LIsbrkfL/vXKx8ZKwrggtm1NsXsb/oymaYlBXz4aKXKgLbIcoSsX6or4dJejEYjwqBRGlsHuP/9JOk3/TN+6ZYxpF3/OjP8o4f/WWeLlBaP/rjP/bi6qnLuk4THR84ej689zcRQhCePMv5d/7Kv++OnPuD8MoXba9+w3dsXHzw7uHkZ3/1SJRJ7VaPXHpWaW9487vHKyv3xm956+UDd7zg8fDgA9npN7zuoRM/9MN/fOZd77pVLR96svi6r3rFrDNZ63yXBauipCd2OlPL1774PB/+0NM21Bte/QXVm//eaPVr/6cLl9/5zoP2zz676OqpZGWJsLXjomFfbU1mvjh6tDrwg//0Ad7xz/7ux6KGmeiriO7OO+fRH/3hwDqDPXAIMd6ZsP093/PW8frBsLm8Ei4Nh2GW52EzGYStP/rdV3EVV/EcNtTZ1736/ZtZHsq8F04fOBB28yzsjEZh6+1fc5eMJY0JT7+ColeWmKKPGeakL3xpd1V9V/GcTpCxLvaOWVfT+6EfJgSBimNYX09kK/VtfjwhKIdfXMC2NaEpqT/+sSNXVXcVzxXb7PIiJ85hcZHJY48SL4yYKY1N+5W0dTt0WmMah2g60JLMeZq7733+Vd1dxXPd9Uxbtt52yNmcZPMKs7Ki7z223LM6TPfmaZygmpJWxygv8Uog6s5d1d1VPLdbHQJKI1aW6N97LyKLkVVNvDfPZLy2NvRdg0fijKcc9AgW/KCv/rb3mlzF/5CQSZLlsXT4i0/R/f2/j7QSsgivg5V+XpaRVIQ8px8cWdUQ5wXSdWf+tveaXMX/gAaUEC50zXlfeUSSoX/pl5g1Jc5YgvdKdp2JDQJtWnwUYbxDtA3NuTOrV9V3Fc+JW25/olzIqQ8fRs2nJFLQZT3ipCelGoxEEIJWKBoJqXV4FdA33Xa18P1VPCfacnyzNI7lpy4QtZ4kzQmJpi1nrbTjce0D2ODBOOha9GxOvjycXFXdVfw1Fy8Eoc5cGOZxhl8YUvqOuq5I5iVoqaWkE5m0aBXjjx56uiaSTijvfSC9qr6reA4bKqR01s932dnefXqLfRIRhEIKpA7XXi92X/kanO8YPfAw0geqlQWine0fOf91X/9d3dLoXLk7dUtHDlwwT166Lu6m61Vnx+s/979/U5BhsvPTP/Mv/cWLN0WmlIioawaDheWF4X1b29syvfZGE9ZXy/aTn9yUeXKtOXbdudHJo58yjzzyxWZW+TDZW8/3ptdjG984vxeNFhftyqHzaiHak2nxsIu1K44du980rpcqG++85wNf0+R6NdvcxQ37UhrfDL/hq3/q8j333rFyZuOlTTVdkkq5JItaF8UqbM0iu9IX0fbU1Ncde2T0xje9Z/FVd90DyCe/6x/+s5FIjs23L42TF75wWn7+4ef3nR3Ou+5K5KJId/MoHDwQd9tbTZL3+05iZFtPXZwktm5kL9Aaosjh5qKfDf3eZK4Gw35Q8TxKEuFWhp8xj5663UuZTIMv1z295iUveVI/8fALkqMnP2EWhpPms/evaaGe7+aTeeqFDlpHbZ6qpKy6uprN4mMn1lSQdyeve/Xn5x/8w8PtbHJjatuBiiLrUVGnVRnffMuFhRe/9L17pz5/46Fv+KZfBTY3fuKn3x8++5mkjFTc9+T+0KH7ope+pO7uue9wOHdquYsSr5QupFTT+fZOmxVFT0tlTXA6SbLYN/PSCadEsRDv1qVZ7PeHumxnjfAuqVvfVtMX9Ypl+t0MFylEWyMzjepaI678p1952fzuP/tk/MRZ0s98EmcsUaSpOkfO06+uH9clBwK0cURwFqtzukGBlJqmq1meTEAq9mJNhCBtauZCUmhF6wRNGjGalZhYM18YIVAU0ylSeqK2Q0jNXEnyOKGcjIkXl+jqCm0NOigMHpWm6K6jso4sjxFCoWzHVGiKriV4R/CBSmvk4eNk55/AFwuk1YSpUugkxxqLXVqk5x3R1hbg2VMRyeGjxBfOI6oanyaQxtiyRCCQSYaQkunLX2l6H/lwpKUi5DG1Vwxw7DnDsHMI09KureMiRXrpMt2gh5AR0d4OIc3QpqNKC7K6ohqNEE2FHy0gd3fJsoJOAlqhuw6nNHHd4GyH7gzjQY90OicdDAldQ6NjZFOjVhbxV3axQSD6Kf7oYdrWoi9dIPeCEEmoW9okwc3m9LSitI7oxAnyyxvMyhnFYEDtAz1nqTqLyiOal72K7KMfQcSKqGmplSLOC0pn6TctiMD08CHSzV1s8MRKQT+nedOb3yDdn/3pa/u/+msUn/4TlFLoXg+lIrwCLSSxdawVI6o4Je0MKilwiSaZzfDlLovzKU5IbK9gZB1tANnrs5D1EMYSRCCyhvrwEZxpyb7+G4kEJM7irUElCS44DDB9we2IW28mG4/ppz3U29/OrOhRpJouBFrv6Q1yZN6nixOUTIgShYkVMslpi5xekZP3CurRCm2WItCkvR5Yh11aJH3VXajW4ooCtbbK0HkILSIpaK47iUlivO2IoxiZF3SJRgrP6JN/rONhH4/DCQ3SY9sW7cAsLVJ+wV2oW24hDBaxUUwcIJ1O0XkfEaeMR4ukEpoiRriONkD+D79z2lUtXVuiJxOipsWMp6TVnHmsESHgej1iAhQ5ndTQdVSmww4HuN0x0
fEjXBmmkGSEsxdY2tkhQeKsYyYVpt+nVpJh0cMnOVmRIreu0AZHlBWY/ohEeLokJctTRGfRn/8sylvmUYwRwMLw6WQ66Zldc4y9okc+noM1RM6htaKx4ITyMhmOJpGKcD5A2TIzBt3WJELSaU2TRuz0M5pEU/d7NEpQlyWCwKCFyUtehl8YodsG7wKqV1BWDfO2pj1xAq0li0GRX7lET0Rw5klSKemCIwqaykvaJCUOgej8U8gnz9GmCWo+xfzGb6DbGcFJCmOJ8xSXDwhrBxBJwlwLsqolBEmVJSQqRRCoz5wmC44Ix2xlGVW1iF6K1Qr/8Y/StTVSa3bqFqM0yZnz1PUE6SECRFAgBJU15NbgdYbt5cI0Hd3CMsymxI2hlYKeCMjpFPXQg/i2MWNhiLRGCMHO619ndnsZHs9ACIJxJCoh9YpRGtP92D8fFKMBNikQeU6bpOg8p5QRCdA4gRWCxCmkcSRtSUgzRN5D7E2QxtJt77A2bdDTMQPTMt/bwwZBhGelrIitY4g0xnV4BTbOkE2HSmJCcNRvfhNbSyt411EVOSytEu/NmI76yOuuo0p76KbDOEcYLGH+3heTti3CWmQUPb2a1y2xCjhj5nKyM0nnMiCWlhAhsCglIYrxQhFMB50h39miXzWUSUY77JHjiYxhFinc6bN00ynBwFQGerM5WkKmNf6ps+TG4YJjPBrSFRn5h36ffG+XLkAkPJEA3VkSGZFeOIfwhjJ46qUFdJSSiZgujzF5Sjh2lBkOV01p5iXq2puo0oi8P6I0HWKpj3cK/YoXkdYl/d0pzckTiCJGzxtWL15AdR5lDHY6pfDQnTxB+/de20aHj5ow3QPrUauLjNcP0U8UPukxO7xaupWVMk4K/JveiE0zlLP0lCYgCVLhTSB54HPR2uNPkLSGzaYlevDhqD+d4uuWui1xeUIwHuktYV6jgVDPEabBKI12hsh2JKbDzua4W2+gXF6B0OKVosxSjNTock6mJdpbdN0SK0/QiqAiqkiTKYGOYmZIZjfdhP3Gb7jU6QhvHdJZnBIIH1BJhHr8YbLgERZyL5nPdqmf9zy08eQPPkhPgg4QBYWe7KEuXyLpDEZITN4DCTYSBB0RHT3QU//4dXe9KTx66gv8rCRogUszPCCVwoyGOCSkCVWakVlD2XWorCD0epg0xaYJaEUTa0zeQwpBNezDddfNUqJkYhtc0UMlT2+m9KsHd6bHjubSBazz1FKQnTjBbHFhSxW9QuKR1lOZFhklWCwuSlHOUluHdQE1KzFaQzkjLC5RBk8eAqGzhMUBzgtmu2NMP0NdvEjTWtpeRpv1sM7gsxyTpsiqpJ2V1Epr74wqen2ma6sIFRFvb0FnmTmDDCFW8ypuqynq4cdpDqyS9Hs0aUIXRZimAWNxCESkabSgAHRVUq4fQCYxjZRkbcfcGjoRIIqZLS3SjIaIokc5nqAtmFRjDh2atsuriRSCMJkRWUsTBGrQR1YVqRJ0KsYUOfbQOpOFFeJen5DGyKYlGMs8i1Cdw04nRGdOj2y/R130MLMZ2noEYJxH7E6oqxqFZ9pUxMNF/OY2+ubr6fIeDkfXddg8xy8tPl0Hq5qTHFqnvvYGurIkblustbgTN7xTdJ/9bK7a7voQnBdJlPoggiqyyNadlYm2dKaHFHUIkVNS9EJkCm+jRuKdE64nZ/WGt64XhKxVGgthTexF3ImD/QfDbtkXnbuOhd5BFyVPRvMutnsb9/zkK15lv//UqevCzuWjuj+SJNEDl/+3n9hc+67vPSQjc22wWjpaJUWujZmVSb6QmfnEiCBigZ06ZC6QnTA+ZSC3VW/ZMBunIYtOCCE/p/qLV7pLG6+W3k38fCbkwiAOxk9Fli505dzLpulkFBmvQo5X22FhtBNlWV86+YTIIldtjZ+vm/mqcKYxjU9ULGLm7Uxlwrl8MdJx+mey38/8dHvVerEexluZzBIT5m2n2tZ0Mu7lJw9lZnfPtXuzrv+CG+6tzm3dEtqyiCvTeh3HPo2EXl0qVSo+hRiO5o/d/zKqrs1Wl0O7W34ou/n6SCVqsbz7s69Qa4NdM+l60TWHWn9p41Ds/I6RtJXvoqUX3/nh6tROZ3y92F9fWSsfeOQW1ZXjQBT5JEhUMo+PHzvlui5RxrV+d+d1oWr3yFPnuyZTvcVt21TP01V7qu3KIukP8w67m64t/X6ydlLuPfbgjbENJ4WpZ+ltzzfj++5f18Ym+vqbnsyG+T1ub/Ki+QMPnGAwqEd33fWe/xOx7gWuuoRAOAAAAABJRU5ErkJggg==" />'
pass_base64 = '<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJQAAABACAYAAAD8k17tAAAABmJLR0QA/wD/AP+gvaeTAAAACXBIWXMAAC4jAAAuIwF4pT92AAAAB3RJTUUH4QoTDxQWyPanbgAAIABJREFUeNrsvXn0bVdV5/tZ3W5P82tv36TvE0MgJhAS+jaKgiWKhYKiaFmlgGWD8squAFHklYjNk5JO5AnSxIBYghAB04AEEkJCbnLT3L79dafb3ereHyGURSXqG2+M5/3jfsc444yz9ppzrT3nd++19txrzSP+6sTbl6bJ/ud2Poogmk7MijmTxuNNjPuEqy9Oy7kl3WT7Vmb7j8+n29ZGG1Nx9taLk/0rd+3ubS42V5MuRCf8YJgWzdSNi37ef3Blz1cvzp9WB1M9rTEz3/hJ08m6K+VgrvXNV5KuWCBxftaNLSLYmqpb0LsutGHWyehn1snpmGMj3RZBBCHSXlaWSbFTU/ZK0y8fnt55zzXDVz5w38nPCR/dpTEV/SY/ebJkcbtoeieI9YEw1WOzIJ880Uda4/spxBBsJ9OwFEidqWcbU5P00k6PZiEKozqprW4rREyX9a7+LMzaVEs/no6P5Xm21Vp/bFLPmrlBsclPTT/LStf5jTCS04czekUeymzaTDdMgVzQS8tCmu0Hu/sOD5JhNrMbVVtxMkm0LbNB1m40m9NBGmN0sszms/F0pe7JLc3x+tD+Qd+c50UIhiKtx7PJlsHurUfs/cf66Xx6bLp/XCbDTJKouWSh11Sdz9RAdXrcbrTHq0E+SHSby7EfrfqIcyFMB3owX623djBfhilrc32z2DMxzRq9dqLyXZLGTJZ6oE/MDtWJLo3zjTfGhDQTetKOaiPzRFoR1Kx/JE0yuc7KoKDX6zfbTkzTo27FHopLC/O7XS382eqaB8X7jrzmlz6n3/0WgUHgEUohmoT5uCOcSB+WorOoWKBNQGCqIK1I9EA39cg4YZHjElE6bDnCrM/joqNQPaLBStma1fGM/maFbT2FMAST+NigQtahiBA0dAIfIsE7bO4o2xKVKXyIEe0IMYo2TjHk+GghQm4Hk2CcbHVVuuDIqj4LcjerzRFQwg56eX2yOzowJqLaAjFoObO5lr3m84RZSj8v8dHigsdHi/cRlQeihy46Sp9TZDkr1RiTCHI3YComJJmktS25KkgpsbYhSCgoWbXrXrdaDQdzrMWD5HFAEBV1IxkWeazqEId5X45HDXrgCKqhGwkyUSJ7DtWWbRQhnZpVqAJlMc9spcHMO1Ap/W6ZSh/HJS1FN2AcRqQoglJ47emLAVUzoZ8vUY8n0HMdLibYFJOJWNtGxLwhTjKynmbaVuQiI8w0qvR0coaIkm6sMImkrh3FUKEAQdmorFP1uDNaJSQ29+QxBjq6ZKbdLHLW9Nq71Ate/4QrDot7XpCKki6ZQhT4tuHK4vtOHi5uLX1VIouGVtZ42ZgYO9OESnkXED1LEIFYdCgpiL0J1jvaUOH6I9W1nkQrsnoeZyqKuDmGEIQTG8InNaESkAZClSDKiEtqEqGxyhJFpBJrAt0I30jmw7lU3TpSKkRhsXmThlomsY2kuWDKiJk5iTc1pm/VarWSpvORFguJpY0tojZ0+QhHi9VTOueJaUMQHcIIdAq2hrwPbeOZhjHSKkyqcKpFd5JhshU38TjV0Zh1utAhXcpGt4JUSJEGbGiJpkOkAuESbFphaYRMgnChxffHbLeXsj6awvwE6xusbrF6prtsA1qD1Ia2q0nmPE2oSfLAfNiJbwKN3sBmjkiL1n1suk7iU8ZrU0xmqNyUTlfQalXRQrC0qhIoj3OWJIcQPEmSQBDU2QgSR/Qaowyi36IMMKhxXSApwIdOe5xKjMHqKb5oZOumsktrGapAZnoMwuYTcq7ddnM6ncMmE9RsgOw5bOL4Ch/ZVG8Igm+wrkMiiYnHCoimQfUj7Tii84joNHacQkjQpScpJUpIpIa0yHiCvp44SVBVhk1H6EKBS5ExJQSP6jV421AywDpHFzusD+hM080ki2YHk95eYm+CkJ5oDaEL2GyKLB2uUZR5gRSCTPaoGk/aN6hxST/OYZoepcw5MbiP1jdIJ2CtIDWa6AShTlBRkdo+mBbbSUTPIbXA9yu6ZEYUYIuKiT2JnxtjfIH2KbvDE2nyFQyKUiVIGYk2kIVFtowvQfk+iVComCKDJiQtSbOImpQsDZaQaUBohTGKWAuCE8gkoNIaU2gaGlJR0FWRI+ZOJuVBhAr0ZIpSCokkd30m4450YIi9EVG2GGXwSU0vNxgzpMcCQgqiCrgAsVPEzlCLDWItUF2CFAqnZzSrEmREzDK8iNhRglE5Xnu2j5+EiJG2cZhMI6oEnSgqNhBZp2SXT3bXc+sEB7N2SrCRrKeZ2HWklFyWPA/pCzCWSERNC0SXoZwkNJKMHCEdOmnBeLApQkkijkCkDutYHenNFawM9wiPE7VvwVtcNmWpuxDZlqiQYZo+oZIMWSYTGbJJyUzBCXcI6yzpbBNBWAZigdBB0vUgGIoso44NWmU0oeIC8WScs3R6TAwCK2cwGTKYbiUJObpICIlFT+cp3AKhZ4k+sDpdR4ocrR1GaqKVmJCSkCGtYefGFUzaMXFWsK29iE3VZYzGI3TUSKOZ5lPaYEEoZtWISTPClivYOqISh7Ue1yqsqDihHmbDH0ZUOSKzCJ+QlYbMGOgkzkm0EwzsDjCRTf58BFBbT/Qwbh2m7mPHLhaTs8aDBY3sJFWlEQV4bynMPFU9Ycltbxebc6P1NUb0CKKhZoqVYxIK8jkBXYaQkbaO6KGHqQET2SJ34YsZeIEUgfvF54lKIJXDW0nMGnAGjcF2CvXcn7/sJQfU7c/SKHrzCdOjLamao/FjdJZyTHwDi0UJiZGKoEA6iRMtikhmt7ChjxKzwLDaQas3iEi8h6gdunSsjU4ySo8iQsSNA0JrdA5GJEzjKqKIFN0ix8QDyNzSb7dA0NR6FScjKvGkScqkGxGnGhtbkn6gti00BlmXML9ON3EkheFQ2ItKJT44BnYzo+QkhUpxsiMKh9cNVawxhaMWUxIl8CbQ1Y6d+kyOTlaRwZF0O/BqTNSB0ARW/SppqtBCMSoPsy4OMh93sG6PYfMpaTNApDW1a1BFpLNjghJ479AiwciCijWE18h+RVN7yCukz7BNBRLaKcj5BhE9nbLM2jFIiVcW13mKXk6IETv26FwipRSb4k510O2VMoXg3SN3PTJU0Hgik/So8mYiWllhrSNTOVIqKtsRG0HdtGjjUMKQJRnrJyfIvkUmGjHTzOIYjKOyHWlu8F5hhMGgKbtttGaKMxVb7Nlr4v0rv/jGv/O//4bcDKiTdXTQeBs5r33GvXj95USYTkpdTtaa2XJ/+f4zecrRPfHWHVV18jyZmyiNT/M0M+vT0XTZnv31qpttTYZsCcJ7p2f9atJOyUnSTltDL/TZvrGm9gxLs8N1cZSXfm5j4k94hO8XeV6sTDaC8jHkYouOxVh4QjeU22dj91Cy
W159pBe3f2Gt3feEE9kDT3K6qokxiT4EL1B57LVhdXDf4lI+f7S5Z+ekdeGs4pIDF6lnffar7m+efnz08Hn5fLl6QfKkb+xr7r4uT+aOudaKaXqy553NY6u7Tcm59zXxuI6Z3dI07WKUvu35+WzenX33mcMLv36o27t9VT947Sgc9cEl4KQM6dj2mCti1X9ASdFJoXb20vljFyZP/eCd9saXHGj3n5unKcZI1Rd9d7I6rnU3H1q1Hgdis43CZqR+ow6zQRnKbBKrNjellNEGNVs2ZWrWJ83xpCpmSSqlLZqd+wo597UNd+RJjV6/ODjRiYGN0tKEKE2f4VrX6G73YJf/2uhLOxKRaF1ic5EaMRtulFlx/4ZfPTvo9aQNzBsz7Np6XYgmM6ZIJ0UvSN/oXmMnlTWtyWNPRSncstu9MepmY6n9FfcNP3VN1pak1TLrc3uQocfO2cX3inc/+PO/9JniHW/pZz18OsOuG3wX+M7s373lZ7e+75c5DWKMCCEe9/c3y4QQIv6/0fM4dYQQIj5a91/SG2MUQAQerf+4fXssXd+U55s6vnU8xiiFEOGfHn9U7y31x17w59VP/U3VNQQZMSpineUinne/toPxrfNqkVk1ARNJ8hSbtfRPbj54mkqP4Nsd9Fik+JfI9Hhyj6fn0br/kt5vP/7P9e2xdH1bWfwn5eHxZA5Nv7bfR0VaCGKdYk2DFS1ndFd+TS6E7c+pJw22C8TGEPUMy4zNxTnVaSqdxmPeFUUUBM+orfju4peYHq+Ykwts95fepNfa1dCmFbkuqf0YUedoAamcXzttutN4LMylW01mEoSb56bx++gtFdRdzd7JHRfKQGtLXVLVE5RICGlFEvs8vH775adNdxqPhXZaH5vrzqSRM1q3jgiCoDvEXHOFLNO5+4mGpFCIIJBtj0qvki7o/LTpTuOxMMgXzjrpHiYhRw4iy805yJiwWh8IemKPrcWipXOOREBCSSZ7HJk+7E+b7jQeC9IoFX3EVQ7lepRmgVIsMCy2HNN9vfWM4AWZKnHRUTcVUdbsGu4Up013Go+FE7N9x6VNEbEhCseKegDfWHari78iD4z33oeVSBOhFcispdzYyg9ufsv7TpvuNB5zyNNL/QW/gzJNmekVxuUK1cIa682RWmYmW9epcbSgjMLWEnIJ4E6b7jQeCx112xVjYtKRmJSkylCVIustnCVl6ROdem1Vi0kNg6Kgy47z4UO/8eOnygl8eM/b/j/rWI8rXPPpf339W/ff8Jjl9zW3/M94zDejyDHGRz//qmnCA/4L/1QHD0zu0LdPPm0eGt+n7125Wz967GTYc0oSqlRz/TV3hOm4RmlodUV0OXuaTzxTL1dnn3Ff8neEUc7W/DxW/DGCFzQ0q6fKCXz/Bf+Zj+/54xetuId3UthGm0xnqtB9v+XIjBNnzPyoDULXl+VPP3CFeM4+sSwe/HYdf3nfL3DLc+ETh97xynmWH9ob7rrUhbZVbVohRfqj57zxFsATyTjKYbFdbBxze9miz/1f9JyfXUOMccsf3v+6n/r1e19w+TsP/txDv3n0up09uSXMupF+55H/cHR7svsfrl96/YcAvtZ8iu/Invct+SNhP9vkbgA+ePytP/e2I9dfcYiHzwxe6YV06K7gZRu3Hfrkh67e/sKPCSGmAPvbe9mdXnjKEOp49/BGr1tmI2mRXhGsJyYjfMSLjx77nZf/rX7b+zu1wc7mavantyKj4XvX3vQzLz73dX9wqpzEW+5/2Vf3LHzyCVJHXOeonGXYS4g64E4mZGVBGyYkdY+d9kmHr9n04l/dxWUfPHfwnd+K+B+a3vviNxy/7mNWTyEoUm1wwwoVBfLYEj6foYJkmziP/7rr1i1CiOP/tA8xRvnR47/7C/8Q//tbVsU+sgHENiUTC7iups0mxJnGCcdZ9VUrz9386pdfW/zQp779XD65+ic/+NnpH7x7fXhfHpQiCYaoQLcZ01FF6oYs9Ja5TDzvPWcV3/mJ1Kc3fm3yxfCq3W88JXxxw8pbL/8f/nfuMFlgVM14RvoT3D/9Mpuqiz4rp8mxC7puhmxTTqR3IaSkJ4fsrb4yfyrdZpd6mzf5pCaGiFGGRZbZufZUQgumF3mSfCkknm6wwcHlm7f/Wfead90ebngjwN8dfTcAt4w/uatV65iBYKm/TGIkPb+EmCbEzev08x5hbsqBwZ184MCvPe/b+/COPa9629+Ld7xlo3gYZRLcWgbTgtaOqf2Ys6trQQd6WcHG8IGlG0e/+7FDkwc2AazbdQA+8PBv/eLfht/6i8nSgTxxSxhb4CuBix2dqhGZRyxOODG8h9vEX/zoR2e//rFb3Id/91W730iM8ZTwRZOMWyUSBu25pDJBrQ7j95k3fOqK4rsKOa2qJkRBFxqsrFBtghMd5WBu66lEqGPmG/NhYjDGQND4pGGPuYkkZATd8o/x/ZRxAS8hOsgLzaem//11nz70p294ztYf++Y86u7v7A8H5BRU8iQurZjORqRpxkK7g3F2DL8h0VODVnn5aNsH7T18Yf8Hn33fppteW7sxc6OLGNoldGYQUSA6AyElDX2khthqnLAccfcWn5z8t3fEGJN5M89tB2/Ycmv57l+uk1WaaYeNEwQdXkeUjggnKXs5qSzJq01403JyeA9H+dprYozqX/Ny+f8PzImlnpSRk8U9BBM56u8RH7S/9bwvVx9Ssp8OemUpIQOURuSWyaTGM10/lQiVtOVq0gu0qiZ6S9AdWanYNrkKawNNOqG2Y6QDkVjazhJ7U/6xvuHfPaqjbrUMzHC6ph1ruha0glqPGRdHEV6iBgJvE85QVz70qNxOczFfkn/9Pe24ZlHuZjK8j6lZwYcWPz+iH7ewFLfykPsyUgg63xJsQA0cX9QfeOmtJz/+vQBfiX9zuTMrc0QwKiEqTzCBTOVRrs8z8NsJawnjdsxiOkcmS2STckzdL288+vbXniq+KMIwbWOLnymiVxxcvIWNhTvIy8GqdEH8HT7FfDNSEBAYIdmRXjo+lQh1Kc/7onUWuybxZYf0hszOoZSJT6y//47Lpj+wVyhJEeawrSK6iKg1D8/u2XZgdc9ijFGNkgPnBqcxtkQbhVaakmV2dJcTQsBoQxCRYpjy8OyzZ/zT9tf8Q2cnC5ZBsomllcspq+0oqTmjvmr//7njLvF08TOvboo1QlDo1NHKlqLuE5TjeHPvNQBSyqfUTcfm6aXgEnRTEltF6BA/MfyTK693v/bMLh0H5QUr61PW7CEYl4QQsKHbfar4YirX1ks5R64VPaVZ91P8WLCgzrhLVvJE6BoPQaMxRKsQeWCjWi1PJUIt9HbeaHxKpjPCWJNpRZOssa93MyoOf+3lvV//jnmx45gLj6wH16lG6YR0czR77E35algbrDWHt1ofyHwfq2aokNG6lsP6LryL6JUF5vwmNk8vZ19553cdjmvy0faFjJsqX/OQvI2Tw7tZ6+1DCE9pl+8BeGLvJV/UQZLJFGk00VQkOidKy0PxnhJABDmnc8XJ4l6yQtGPS5RxkS3m/O6W1Q9+41nbXrp
3qT7P95ptdPk6qimQRYfwCVFunDKPeRKRiqgw3RKiHjJIU5Ik40H7ue+R83HbObqEWIMXAhEiIalp9PopRagQu8o1ChsbdBIZ1y3pZBktiK6ruXn8ISGs9IlJUUbhKxBJTSVrWbdR3fjwW/tuab3f13OIUNIzcyzHc5gzW8Om6YWxaJdoihUmcp0D+otU6ckXfvLgr6hRfGQVj0vcMHFDCj8gBEidoWs143plG0CwbW/adVR2QrVuES6hci0iBozAANRi/BThJK2pqNsZoGnFmEl9vMvpIYQ41C83u3F5BGFaZN9BZtFJJPNn3nzqOENH5RWHxQPMsuN0oqPJNkjzwunKj0JUHb60COcRRpJSMIvT9lQilPTqAl9MKcUQGy1ROwbtZuykEbZrTW/Qi2jXilmK1IaoZiR+SIw2AsFLq03QasQ6PpvhVGBjdpDtvQtb75U52b9HuzqSSkNaKmbtVF636ceWhmLhKEAhimRdOqwKmEQhGoNKO6bt+uUA873hbOf4IkoxT9QSWUOt1yibRS5Krv0MvJ8u1CdFGgiVoJQltRzTUbOQl72fXHhn+1P8d3b5i46ty8NnDu08o/UaUTYk0pAbuelU8UUaerKW6yz5TTRig1RoEtFn1R33ehuXfW7otjet259FpZCNYpZOGKqBOpUI9ZkT7/kNs9UQa4UvpmxWZ7Bj+iSO1l/3W9VFBzcX50Smxs7iEXzesNjtxjcRJWaoIOOg7C3Ppl1uhpGqCaQYZqLCdlWTh/64dXZzqXOEkKyv1+ws0/jJlTev/Gi8gfeIF+Nk91UX6l3Sp0SbIFJLHVuq4Z286+jP39bL5p8MPM5j2CMh+p3zu752X3TPN71IFkrW2xNoqTlk9/KmB7537aZj733GM7e88qxH13N/M/aVCSGa3+ZVfOLQ28V373jNv3ns4Ki/ZwOt6fQ6OmT4TmLtjLP9jkx+ef9NenHlYqWsxnYdKnMoBJ54KvEJN7fmMRGbNRRxnpXmGN9IP4U2Rpw9OH/w0OQuKZVZyPoJopKc2z3jH7qmI7ggpPQ+lcU1vrchzGgz/bTEmwZPy7HqyIGL0+u/aKxhajtCA0kWiIj6x7a/a9t7xIsB2FSdd0T1IhJBl41IYk5iFAv+XP5BvfPqtxz/rpv/7vBf/DsAeeP/2vcYH1kJNJqt3ZILhWhKVvxhVFB0qkU1ir3Lnxl8kF++5UNH3/A3j5IJQAjRAEzjCqcCmQBSNlW5zjsZFYlKkSpBFJZpuublubt3X/tgebOh7NDSEESCzhK8t/WpRChFAo0hjBVdnJEmKZeEZ3BWeIbbiCfP2b927/eO3ZFl7yO1alk2258cehM6ZbG+ClO5ukNKRcgbutCRiSFlkbFFn7U6kQdGscvJMMio2KYu4ETzQPFX+976fwCMw0l266e8/cwjz17fMrvsTmV7zJJV1CxhEg6htGBP/ulrPpC++sO/eeTZsz//jjdfA/C3x/4bPlqEUPh4hOfkv3h7Otk1GtY72j7DYJOaXlwgG2T4RtCq9ezjyW+94F0n/tP73nf0l54RYzQPTL8MQE8snRpBTRdZGx2upqtu5EVg0ozwcoPoU9IubeXEzc6xxTpExfnxWmKskJ3hjOLSB08lQk1Gk8b7ljBsSVRJ11ruTG7k8+Xbk79Ofvv379v2tx/o+mtY1dI3BZ9Rf6RD3tFL8li4891affhs4SQ2mYGOBPXId6LUkV619f5oKnSR0skZm+zZj+RyyKZXASgR+J7tP3n/s7OfPOe1Z/7Bdd/nfv3LRbMpuKwmtCnedohpgsfy0OAfipuzP7n5w4d+/b3P3/I6XvGF5UcuCLGNc+cuOXqJ+57fn6mVNGorMmUAaMOMoumjs5Tc5Nxs3vsjnxZ/8Hd/8MCr33ZO78pTaqTItMDqJs8GajlWmqxUBAFaBjontMx0FogCpRV11+ByR8uU44er0SkV2OznAzmE3BhmGw1tVyEy6BWpaMJGEjtHV0mEiUhpkHlH5yLDdod79rkv2jAyn9VdR+wCbd1iW2ibhixs2hBZNkuSBOEEJIrb049h0oRx+8ijeik2A3DVjuvXHjjx0LA9lr3+fPusPzQyRfcCQQdkLjGZInSKTm3w18lbXvH2h37s1vdft549MuwF/vTA63nVrjf/6nXiR9/VTqSorGehPg8pHGbRQgikYQ7d5ewOV6j7N3/6Z9548AWf/+rsxl2nih9ijHgzdtFjHY7oJLLKaGaRtGyCbPW6zVWOcBKTZPgxZGIBv3jiyacSoXp2+SBW45xHL0SyuMjidDdjO0OlgpB2pIUAPAiHD55ht8gzkh97nRDC12r67Gf61/AD4Xe//MLkZydSC5S2bPbnr55rrnhAdQkTO6G0PUpdUqgeD5vbdAxxx6N9uO3Ix7lq83cf+v4rX33T63a+72efVb/mI861lNmAXhziWx5JOIFFZ7C39/dP/v19P/YpgI+svo0f3/UWzv80/Pstb/rxV6rf+/H56Tnsy26lWdNUtSUmns32XCbtmGn5EBN3nIcWb7ruE7O33wSwp/3Cv7kfhBAMOQvltcgyQ6pLwmBKlI7oTCJNLJMmNNSzjkPFbUihCSNPrP3kVCKUznWM3qOtQYbIotnG4fQe0lQShSNESTt1uC5Q15b+6hn7L5u89D8+d/kn/xygieNNejT4+GzcXLl39PW8sw0harScO3D+3FW3GzukND2cDMzqGlE0rOqj3Hb0w5c+2ocnb3vR/1wxcPQd4mVb/+v3X9Re/0dqY4htJSYaZKuRCcSYM1FHuGtw43W3H/8fT//+pV8A4L7nwpdOfIxrN738Xb8+/9fbLho//6v9okQVDn/SsIdbybOMME1JZZ8z1p7OYX/f2X979I//8wXpdaeELzbaoyo6lAgwrsZoW1BmOdoWXoakHRdhC98pXnokVAnaZ5AovmP5hbeeQrdZNQ4HtyutaIVFIZgmR+iFTUQrwzNmr/vIM+uf/q9XjH/4r55Z/8RbX5X9wSu+a/gr57/6nLf9EcCejTtefLT9OmcunNfsST7Fw8MvaI8gdX2et/CKA8BK4ntM1ho8DYkxNF2Dyh1f858487H6dP3Wn4kAr9/5sf94TfrDb6nsiFB2GCPRXY9m0mAySbCeL8WPveyfyl616SWPRP/7u4++bOmNVz0//fn/JEclUTuW7E6K0Vba4gRylrI/vY26PMoax5dPFX88afNzLqmzkWhkjUEyrWsa32FlJbR2g8VeM8e82fJZG6ofFqlFmJZjozu3nUqB8mk7FTIKtJG03qOVo7YV2WQYzlBP+Y2n7rr+7v9tIh9r+iJn7+xzF/sy8in3jpeO5h+iV2+mVH0mYY0PjV9//fr64at7WVkPZJZ7LEIHZNdHhoBN1QWP5gs40h048/D+uy6/8tzvuuHR+cS+k3eZ42rvjdvMBa88XN+xxTFlQZ6JSyuilbTJjNVq/3PXq/VylB+KN+5780+Hvl/I7dyRzeHiB24+9nG7qh6Svzl/61PeufraP3lo+PeXLoozENUWgmrwiSfrhtxZ//UrVm39jkWTH/63dMQd8Tg3Hf
2prydJilSLNOkqC2ETs7gK0mgdGnH/elzhC7z75bHXMBe3sDI9ypj1S08VNgkh4q8evo41r4giolLJqB6TJJpZM4ulHMjHkuuLR7YW3jO+7cLJ0hpNtsFsxdJbhO3jF7Cx+CBf6v7stU537B5dSUhbPJAoQdfURCUYy/1nPNTdwdjvmbv5ns98tNfTP2njOJXo+PD4Vg6s73/r08//odcc3zj5pDe3Tz60ZlfYKA8gXIae9un1JMeTPeUXpx8sDx3fu3PP4FNvaXVUrmtIezpmuhdtnch7xdN2v+HMT7zwZw7veGianDRuOMa1Bh9aAoHjvbuXbz36f20F/k0JdaGHO8cXhCPFg3aU7Dc9NU8lRqRJQs4wys5PRyLtqPS60F3G2K9jSsmKO3JKRTbH9YiuttjY4WTDUC6Rd/MM1ULcVZ79z66M6JfeX8AsAAAVS0lEQVS9zVF0pEoxXCqoZoGeGd4t27nY98v02iWsqZC5BBMJUZFmPaLp2O4uvSOrGwbqgo1BMf/xLitO7F39+rk33/WZZ581vKZ72vLL3gSweW75cH+0C1FM2B4vRPRq2vQEvpMkKkP5GONgtLuppXLM0IXEtlGM/EwmfcVac6hRQhw6Wz3luFeOOFZkMcHEhKgcys2j6uLffONIpjdz6fITvNJqrdQFzgWidsiYUtv6qBwWy0s+NBRJwVXi39PaCu8Mm+J5G6fQHEpoJSl6GqlTkrZgajcgDVTmqP7Qibec/8/JlvZMaWTG6iGLTCy5yLgr/etLNtxRsTy7ECNS1vU+5OqQnf5Coop43yKrjKcmP3LT9rmncOdDX7quFba3RS6fd9HSNXc/7Tte9De3rH3ghe/c+Ln7H21rXm/bH4Jkw64SrKDMC7ya0qylduXBqVsf1UoNHbLTECGVBSaNiCphIVtSAKIrjTMNMdE44RFeY4wGuvCC8159SmTEOSaP7B7rY5tjUdMN1slUwYwRc7r/oJza1V2zqsH6jlvse0hVSZzAxB/cdioNeU9Q3/s+7yRh6okiUJiS6bgiUQU+t4/73nHUrZW3J+/fFemY3ymwLhJ7FuNy+mIxjsQxqnqKdZ5WTTHdENUoQhCgLJ+dveu/AFx+1lVf8NadNapr+6juu8b/8PqvihuGMcY5gEHZP5QmGa0ZQwg45widQgw2xK4zdtJLizxsaM6X12Ktw4uGEBQh76j9I5tdnGtXo7dkJOg2h8QhxyVkrbxt49PbTwV/LKabl4WULFcXk9kezSyipaBpm1I63Y168z26CkIQaGGIZeCKwYu+cioNeVHbcxCBVPeRkx5TZqi+RcYsblKbHnfICxhRloXACWwHNI+spZqxARbx5h23/cibtnzxWc/zr/kdeo6T08MsNDsJBNCG/eltTznoH9gK8NwLfuglOxZ33wzwmbU/e/VXso9c29MLCCE2qmgv+Wr7t9fYrsUkAi8tWhWoTKJlala7k4mQ4b5sjlh3LVrnGJkiOs3MrVMaed3Jb8R8Mv/g2cKljP2Iq/wrPhm9RvUtzliq2cH0VPDF4XbvQ8JqzmyvoosBX44hgo3qEqlIjwUsiZEoLfA4ZBfYM/lScgoNeXpvuPWqNrYEOcUPpyRe4Rvw+UyO28nmx5Jr43HuG98kJ+GkMSZDkhAJ2KmmNDnL/ryRIfm/N6uzb9owJ46qLmHLcBsr6WEQHSF05H4pv3v82RcdWN+/5XMP/OXPHlvf/7rffeinf+XG8F9+Pwwqpr2j9ueOXvDlXzx60dc3zH5ULAm1wqiMWb2BShxZN7/+ojN/ZmN3vHYtzvJ6P19Fmpq2jejFFpE53r/yqx9889KFk2Pm/tRXkiRLudt84nqhI23nWT52Wf2s7a+681Twx4OzOydJpvkKHyHRBuNziBKlWZfaqQVVZ5A7sjQnqxYRRlCrjctOpbjmrFvRvfFOpJunaztkDokoiF2Mw2zr9DHfiovN7A+37y5Uf6dQArKOIi3opKZTDaUYrgghPMDErp6tu5wH1O1YMQMJmelxwN7DHHm3a373seOrJ9xt4v2/c2fxnjelMk2DdVTtmplmx5/UlsdY7M4GYQl5jZ0GkBFfpZwTn/17Qoj2+m0/unfn6IlfKYqE2BQoAdn+MxGdxBYj1tR+VXcVeVriJoGxOE7sOrbUF41ftemt1wkh7KngjCuH331BF2tm2Qm6rqZrW0IXaFkfSFWoe730hLHGi5a2XEOZBGHFxilEqE4lCQvl5tFl0+cfEwH8THOuvTb4Lrrzeer9jydosvyZJ/w+fLCkPmPiJpw7vppdsyvZai7+6v8kX/KAzWfM221stZcxDNuIdPj5EXsne34S4Aeu+k9/ZGL/HcYYVuQRVEgwIqOlpoktpgRpBLNRS57nJD7ngu45f/UT29/8x4+288vn3fD0Le2l3xAy4AuL3Dxhzu9GFg6NQStBW6yQDTVSRYJUnBWvvOGi4dPvuHX1L0+Nq5syt51HKQVoEpMjE03TBSfjGodKkxCjQPiEED2MFa8+5/c+fCrNoequ7Sq70puUB5dVzFyRa2dNW+nZIIaen3s8uVTkNok92Cjdxpr1i3GLr/XIVWo1nhg9+K3JvCJrkyZtrXO+koebuqptHha7xdG5zZnpVXc8Wu+nN/3xr1w1esU9apzZ0H4zQb4ziAgn3H5s5xnMD6KcJM3TZq9/5y9t/+iLb974y2+FYP585TfCa9sbrr5w9pKPsSbtSX+YSm0QNxKsavEug0mKrwLJeLN7avXKD7x65x++8oH66zxl8aWnxtXt10dJDp2wRGlJc0HEkvdEqrvYzTkCZAEnLCK1hE6zMT16yqTzEUKEP933huumq5PcyvVwdfbDqjoRQ6+H3l5cGurZ2uMutdGrg1vPr7/raXO9NAafKjNWTMIxv9xd7pflrjX4KACX19/zwXPUM28+wJ3LRRweTYhz6epce+3yvz84P1xc+cLxG5DRIXIxBS75yMHf/Y6j/OPlTRUu9M5dpay2rapU0mV3ylb8jxckr//GxTuvPAJw7fwPfKs/37/0OjIxnADfd/PGX1741eZvnhFEe1UIcfO0Wo2Z7s/nuv+Verbx+WvMK752zfaX3AdwTn5qxJk34kFuPPF79+bF0sz5umw8pCeX6RUJZjz/sHjf6i/+4K3uvX/RmZYkQN7tZKpPcvHqC1//ugvf89ucxmNfpXGdRDz25uq7m89zSfa0f1HHvm4PZyQX/DPOW2dOzJ9y5/650Xuf8hfuF27ZHC/icLwdn3raieeF8g0fla6bbhYmYoSgjg3DuERIpoQkDk7T5vGR/DOO/teQCfhnyQSckmQCWA+HK289x8U3EFqT2zn6gx4H1+7wUifDJUtNJwOoyIPmVtTaHIXsx9O0OY3HwrI6Y1lpg9AghKJznsbVsBgvlYnMjmy2F+P8lDDTlHqeC9XTmWfenjbdaTwWxuH4WmgjbWzwzqKlIpEpuODk0mx3+x/ku864fv23X3uOe+rtok027pvcjgqL09OmO43HwqI8e7uSEiNShJBYPaa1DXlYGOnn7Hjlu79Z7+3A2z9z8s9+/pNzv/XWc
XdInTbdaTzOHGql9RYqjy4FUSgGdgGd6CPfSr/3/gOv/4sqqe6/vfnQ5ZPeEabdydOT8tN4TJRJv8hFyTicRDiD0J4uGaPHMmqA9z74m7/998nv/GAsBXk1R9ANUcTTSVtP4zFxZLrnoIiGop8zG3foLBJMx0p8+GIJMC4eeHJMBG5qabM16pGnV59932nTnca3o4nHOLF64qTB1FVbkaQCYzSpknihawlwbu/qn9u+drWPjYQ6QUvFtoXtp8MGp/G/wSMo5zY5lcZxWmZoSqL22A5kpY389Oy9vKD/07f/UO/NL79EPndfKyuU0fQpwmnzncb/Nn8Sm9lpztzVdO3m0EQ6O0XFHKEk5LX91vu6/etfH/SKzZf8l5NX37IhDvEE9717czH4vMZccFw/uCBi0uvJ+VZGa/w427Ozf/5dUvYfPO7uevmsrnbFuVGbVH29M73wBpRp9k3vfJHQkMUUodL7NtojT8ziYnhC+czfP+FPLp4Id79sFtdVNKEVNrH93nBuOl2vVapDtCE01pk8kbITXZRC+3K0yZe9LU0Tj87N5GQ1SjsQMSFQt9En/RRWZJ76thtpX+ukbb3cmZy3r8gX90Y9fvr+7p40S7NmUtdqLl3cpDp50Dt/R5vUTxEx6FTN36+02L3RHJkZM1zuwswP5KATs+IbT1x+yWfH7cP9I919T1oLhy/v6aXPFWHBrlSHnqzyuLalOe9ru+YuP/pAfcsT1u1KuWt41r6Ts6NX+3JjPnrVdrQxiBBGzTT0siRNYo7rgkeavtfTlcHkvK9cv+0Hb/jA6hvf1POLR64cvvgjS2rX6J727594uL37GQO36R/LZHFxP/fOdojdckd29jc+s/KBHwmFrTNRLHaNa3vFMO/ibOS7tk71kkmjlKNuPZmLOw8s9LbvOza997n9sPPTV+x8xl3HNw6cE1sz89Kujdn70jqzZ3Ttxt09dp7YlZ2x7/zi+XfcOf3YuXfbT/9wdIlFhlCGgWgqcfjs8vLPtP0TP3ej+s3LVJBILfFBMB+2Mpyed7cAWI372HdsT7q4csXwpk2/9ZkvqHdeEpJatG1ACkmuCzoxZbE+i9V4jKxQtKJGjPNH/gldOrRM8DiCU4jwyD+D1nKdsGFIh5KutaQmp5l68qEk8SmzZJ2e3YLzHpuuIkjobIsMCQJHbFJELyCjpKk60lwS04j2huY4pFsCzgEhoLXE2D5trEkSwdRX0CoSA9500CmiUKA8QoC2ySNlacCLDo0heIESksSW2LLCuJQqjBBRI1REKE/sDKLoCEHSthYdUvLU0LYWlVu8VxTNAnW2hheBVGmazpHLFCRIMnysqduGMg5p1IxMJKhQUqkRqTGEVoMHmTtc9Hjr0FphO49RBustRVLQyYroJLnsM51WqMIipMaNHMNyK0pGNjhKqDRyaNFe4euU6AOi7CBKpBSETmOpMMqAl/TVIlNzAqtbjEvxVoB2dCNJmSfEzGFCSasq5FShTE5UE85ce8HXJcC0Oko37S1+sfenv/zwsb3Lmd3k3UxTsoSpS2LrKEdnMDUbkFiqtkVJRcwdvosYkRBVhxEZWkWMLJBBUtplkqEm75bRbYY2CVnSQ7qEFo9xJYvtOQzdHNFr4syguwyCQqAfeXqwHtWknK2eivI5QzlH6udJFkAKQ5QOISNaKNraElWLc4pBkjFf9lFGIZqcM5NLOKu5GoOhiEPq9QilQylJURQM2p1IGUkySRQCFRTeVEQhUUaTpBoTFvDBcUn1EmSTkMsMowQxRFJRktJHOslCsoNU5kgrcd6RkeMVtKJ5JDeAl+RpjvMBV0VsCx0TdJMQZQsyYopH1s57OpK2R9sGtskLwIDpSbrYIIRBSk3SDok4sjAgOE9aljg9ptZTVMwos3lCI6Hrkc5FpBHoWJLEjLQZolNFqYeoRCGkZmxOoEWCv69PHBuSxJCEnLSv0KJEdSnWNygriVETREXlBXmeBAmwq7ha3yb/8Ee/LD/8Ewe3fWaLlatap4ogGtSCR3Z9VK+jMxWp1mSpRlmDigKVRVziIKaEtEVqQSgbnGwRSUcaMmb5cWQB1tXosgIhuL79FdvGisPpV1grTiClRCaCJDMkIsG34CIkucBpxwl1J8G030zHPIMYkV6SiZS0mqeLEZM5aBNcaGnryGTcIj1EG1hrNwitRMcCGRXFJokSkugUwUVcMgVtaKtAUBacRnUlxhiUjtiRxM4sVnTs7i5/SFqDFJqoPSGApSEKT2J6HBrdj6hyEpNwwcbzEXXCue2V5NU8sfU8kv0p0FdLaGUQRhB7lpAGWvfI5ogYNCIqaDOCcfTSkjzME6xH+wzfRsJUMLDbvbex6pV9vIwIbx7ZEobCiRaRRpwZkeeGrfJM/EyzQ16Ms5ZO1SilIXZ42RBCxGSexJfETnLmtvMoyzlcZ5HCYJLAUjgLgkRJjdQSmXt8cPSNYLu5aI8E+O2933fTl9IPv7H1sbxQPIMz/ZUUYR6Zgq8ibjgjOEvcCGBzYpSYmOFbQ5rp/6eEM9mR6yoA6LnTG6uqu6rb7tHGaewMUiwiESkWQqxAsGTH17Bhj8SPZBOQEEKgsLKEiFAM2CFOwMSO2+65q169d98dWfg3ztE5SGcQ8s3qTyRBGt68wrPMeDHQpA2klNRsYl9rZK+Yst2FURNlQncV9jwhbPGm2pWCYjVnLm4yLAWQSF4RXKTsd0jZUTWKoAbsOhILi1GZwSV+XvzycaU0QgnqhcFdVCQTGdWKi/pLsukI0UGCGBNIR+oEtrpio2xIKiBLh7QV67Uj54y3AV+MmJlnUs34Q/2bI9VGhCuhL9GiRBdvRGlQHbPZhCQH1msPUhHxdLFDKEiVQ0owuWJVvmJazGlUzf2rnyHqiB6mHPQfYE1HHDVKBCDhwsg36TN0nSFqimnCS8dUTvN+fi9F5ZAKyJlKN3R5iXaaihKVSoIPrNQZ1nS8Uv9El4KNsM2yOsWbEcaS0reMy4zUERMrnveP0UlzNP4AFy3l9R5n+t+YCnIO+CigCAgNQ/Y8j48O1cMXv9v7W/ztr1LwlVGSzq5xcSAmR2sXLMzhUnZl2bsBhKQUNYvhLgTFRM2RXY1PAyJratESgyIHQVYR6UpEkMz9W7ZSrb4eTzBGklLBv9yfahNKNJp5scdM3qRfrwkp0uYbHJXf5/XwDFMK9HpKXbXcdh/9vq2KhR9lk5wkjFClCS5EVFYEOXDTvTt7sX6qMxl3ndCVwIgSPwaSjESv0FISOkVpWtwwkoGJ2mJYB1QoGEbHZrvFYX4bkya5kJWo+k0UNckFnBsJIVDbDeb6VtwQC2mHSJumFHGDc3+MSIbt8oAgRqxZMZhTXIwk62nDLVq/GKYsjI09yUvO7RlZWkTMNMMuZqxIyiIKwWY+oBAb1xvVtuuHvrRXHtCYVHDRncu+OC+cH4nuDYvOo0GrQEKTnCZYiUgQc0CHgoxiKrfo/YBzDoVCCMngepo0JweYpl183ROi453845Pj8rN2zJbG7bK0p1RihnCChX2LOt3AR8vW9Z1rAfBteHTLo257xlCk
vEq6/Q5hfV6J2eWeufcU4Jn/6/ulnC48KZ8N/32YZMpb+khPmH+4licroi3RWkAyMcrs8ygmuh7HXOx/V9//5OvVo4Vq/PciUuucCckNPgdaWTgpbx534fOXlTh4sFK9Nq48mRdzhtzvL9Ol2xa7JhEeHpj7wyfHvxbvTh/slM1sd7CX22ix1Jla6+ms9z2t2Xri0+puzEkqclilVa+CiIU2EyMnRcr23Ia+qdXmZNDDV9n7Q5W8WFR3X1/k50qH5vbaXfW70ztPJqm1Lo/dWf+/ncX0ThqsvbMSz2sRy9jJtZyLxfHb1YNHL/3jWRTxnZTEsqGevxif3dtuFpc5tH+cqw3zKn/53uVwfVjqaSwUTRMnF6/s13/ea4+EMJvb1+7bHw5p6HMMRVO2aksdfkqWdnSr+2uxNJvF4uWBev8rgCf9p7/AVM3Knl5M9aLwOQYbrpCmbpMf3GBHa0olNuuFv+wvyv3mXl65Vee5mGdZVj50cavcPz4o7v79m+UXP1mKk0mSZOHS6SjHiRZl1GN5+sGNH/3nH5d/+UhXSl/118IYkRq17UIamhhdyEqYrWrniybvfH42PN288uc//XD7wcf/B9ybhk78ePw+AAAAAElFTkSuQmCC" />'
class Sim:
"""A gradient descent simulation"""
def __init__(self):
self.vis = Vis()
self.f = None
self.df_dx, self.df_dy = None, None
self.gradient_descent_step = None
self.gd_trails = []
def test(self, actual_data, correct_data, mismatch_messages):
try:
for i in range(len(actual_data)):
assert np.allclose(actual_data[i], correct_data[i], atol=0), mismatch_messages[i]
self.test_passed()
except AssertionError as e:
self.test_failed(*e.args)
def test_passed(self):
md = """
|Result|Reason|
|:-|:-|
|{}|{}|
""".format(pass_base64, 'All tests passed.')
md = textwrap.dedent(md).strip()
display(Markdown(md))
def test_failed(self, reason):
md = """
|Result|Reason|
|:-|:-|
|{}|{}|
""".format(fail_base64, reason)
md = textwrap.dedent(md).strip()
display(Markdown(md))
# Simulation setup
def setup_f(self, f):
"""Pass the function specified in the instructions (a function that we will investigate using GD)."""
self.f = f
# Test cases
actual_data = [[f(4,4), f(0,0), f(0,5), f(5,0), f(5,5)]]
correct_data = [[0.0, 0.53455254198027291, -0.24808622997157845, -0.85934218863936607, 0.4794146557024872]]
mismatch_messages = ["f(x) did not yield the correct values."]
self.test(actual_data, correct_data, mismatch_messages)
def setup_grad_f(self, df_dx, df_dy):
"""Pass the two functions specified in the instructions (representing the numerical derivatives of the function f)."""
self.df_dx = df_dx
self.df_dy = df_dy
# Test cases
actual_data = [[
df_dx(4,4), df_dx(0,0), df_dx(0,5), df_dx(5,0), df_dx(5,5),
df_dy(4,4), df_dy(0,0), df_dy(0,5), df_dy(5,0), df_dy(5,5)
]]
correct_data = [[
-4.57865284353384e-09, -0.67528633233632229, 1.1072220512945845, -0.4642336087448129, 0.435550702873988,
4.6625163358499513e-09, 1.8222564213395964, -0.92753851759706796, 1.3567830224835431, 0.44203310716930955
]]
mismatch_messages = [
'df_dx(x,y) did not yield the correct values.',
'df_dy(x,y) did not yield the correct values.',
]
self.test(actual_data, correct_data, mismatch_messages)
def setup_gds(self, gds):
"""Pass a function specified in the instructions that makes a gradient descent step. It should take the parameters x, y and return the values new_x, new_y and step_length."""
self.gradient_descent_step = gds
# Test cases
actual_data = [
[gds(4,4, alpha=0.2)[0], gds(0,0, alpha=0.2)[0], gds(0,5, alpha=0.2)[0], gds(5,0, alpha=0.2)[0], gds(5,5, alpha=0.2)[0]],
[gds(4,4, alpha=0.2)[1], gds(0,0, alpha=0.2)[1], gds(0,5, alpha=0.2)[1], gds(5,0, alpha=0.2)[1], gds(5,5, alpha=0.2)[1]],
[gds(4,4, alpha=0.2)[2], gds(0,0, alpha=0.2)[2], gds(0,5, alpha=0.2)[2], gds(5,0, alpha=0.2)[2], gds(5,5, alpha=0.2)[2]],
]
correct_data = [
[4.0000000009157306, 0.13505726646726446, -0.22144441025891692, 5.0928467217489626, 4.9128898594252028],
[3.9999999990674966, -0.36445128426791928, 5.1855077035194137, -0.27135660449670862, 4.9115933785661383],
[1.3069526165403576e-09, 0.38867107408468843, 0.28887840850428093, 0.28680118644021058, 0.12411247843916039],
]
mismatch_messages = [
'new_x was not always set to the correct value.',
'new_y was not always set to the correct value.',
'step_length was not always set to the correct value.',
]
self.test(actual_data, correct_data, mismatch_messages)
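# Minimal usage sketch (assumed workflow; the concrete f, df_dx, df_dy and
# gradient_descent_step callables are the ones written following the notebook
# instructions and are not defined in this module):
#
#   sim = Sim()
#   sim.setup_f(f)
#   sim.setup_grad_f(df_dx, df_dy)
#   sim.setup_gds(gradient_descent_step)
#   sim.run(show_2d=True, show_3d=True)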
def run(self, show_2d=True, show_3d=True):
"""Runs the simulation."""
self.vis.run(self.f, self.df_dx, self.df_dy, self.gd_trails, show_2d=show_2d, show_3d=show_3d) | 349.839286 | 18,412 | 0.921112 | 1,857 | 39,182 | 19.391492 | 0.727518 | 0.002499 | 0.002916 | 0.001333 | 0.026631 | 0.023827 | 0.016523 | 0.016523 | 0.010775 | 0.009442 | 0 | 0.168515 | 0.029044 | 39,182 | 112 | 18,413 | 349.839286 | 0.77802 | 0.012225 | 0 | 0.240506 | 0 | 0.025316 | 0.913743 | 0.901978 | 0 | 1 | 0.000078 | 0 | 0.025316 | 1 | 0.101266 | false | 0.050633 | 0.063291 | 0 | 0.177215 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
16703470645daf6533dfa7d916cb2a260655a978 | 20,467 | py | Python | v6.0.5/switch_controller/test_fortios_switch_controller_managed_switch.py | fortinet-solutions-cse/ansible_fgt_modules | c45fba49258d7c9705e7a8fd9c2a09ea4c8a4719 | [
"Apache-2.0"
] | 14 | 2018-09-25T20:35:25.000Z | 2021-07-14T04:30:54.000Z | v6.0.6/switch_controller/test_fortios_switch_controller_managed_switch.py | fortinet-solutions-cse/ansible_fgt_modules | c45fba49258d7c9705e7a8fd9c2a09ea4c8a4719 | [
"Apache-2.0"
] | 32 | 2018-10-09T04:13:42.000Z | 2020-05-11T07:20:28.000Z | v6.0.5/switch_controller/test_fortios_switch_controller_managed_switch.py | fortinet-solutions-cse/ansible_fgt_modules | c45fba49258d7c9705e7a8fd9c2a09ea4c8a4719 | [
"Apache-2.0"
] | 11 | 2018-10-09T00:14:53.000Z | 2021-11-03T10:54:09.000Z |
# Copyright 2019 Fortinet, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <https://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import json
import pytest
from mock import ANY
from ansible.module_utils.network.fortios.fortios import FortiOSHandler
try:
from ansible.modules.network.fortios import fortios_switch_controller_managed_switch
except ImportError:
pytest.skip("Could not load required modules for testing", allow_module_level=True)
@pytest.fixture(autouse=True)
def connection_mock(mocker):
connection_class_mock = mocker.patch('ansible.modules.network.fortios.fortios_switch_controller_managed_switch.Connection')
return connection_class_mock
fos_instance = FortiOSHandler(connection_mock)
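# Note: the module-level handler above is built with the fixture function itself
# rather than a mocked connection instance; this appears to be harmless only
# because every test patches FortiOSHandler.set/.delete, so the stored connection
# is never used directly.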
def test_switch_controller_managed_switch_creation(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'switch_controller_managed_switch': {'delayed_restart_trigger': '3',
'description': 'test_value_4',
'directly_connected': '5',
'dynamic_capability': '6',
'dynamically_discovered': '7',
'fsw_wan1_admin': 'discovered',
'fsw_wan1_peer': 'test_value_9',
'fsw_wan2_admin': 'discovered',
'fsw_wan2_peer': 'test_value_11',
'max_allowed_trunk_members': '12',
'name': 'default_name_13',
'owner_vdom': 'test_value_14',
'poe_detection_type': '15',
'poe_pre_standard_detection': 'enable',
'pre_provisioned': '17',
'staged_image_version': 'test_value_18',
'switch_device_tag': 'test_value_19',
'switch_id': 'test_value_20',
'switch_profile': 'test_value_21',
'type': 'virtual',
'version': '23'
},
'vdom': 'root'}
is_error, changed, response = fortios_switch_controller_managed_switch.fortios_switch_controller(input_data, fos_instance)
expected_data = {'delayed-restart-trigger': '3',
'description': 'test_value_4',
'directly-connected': '5',
'dynamic-capability': '6',
'dynamically-discovered': '7',
'fsw-wan1-admin': 'discovered',
'fsw-wan1-peer': 'test_value_9',
'fsw-wan2-admin': 'discovered',
'fsw-wan2-peer': 'test_value_11',
'max-allowed-trunk-members': '12',
'name': 'default_name_13',
'owner-vdom': 'test_value_14',
'poe-detection-type': '15',
'poe-pre-standard-detection': 'enable',
'pre-provisioned': '17',
'staged-image-version': 'test_value_18',
'switch-device-tag': 'test_value_19',
'switch-id': 'test_value_20',
'switch-profile': 'test_value_21',
'type': 'virtual',
'version': '23'
}
set_method_mock.assert_called_with('switch-controller', 'managed-switch', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200
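# The remaining tests repeat the same pattern: build input_data with the module's
# underscore-separated keys and assert that FortiOSHandler receives the payload
# keyed by the hyphenated FortiOS API field names (e.g. 'delayed_restart_trigger'
# becomes 'delayed-restart-trigger') under vdom 'root'.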
def test_switch_controller_managed_switch_creation_fails(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'switch_controller_managed_switch': {'delayed_restart_trigger': '3',
'description': 'test_value_4',
'directly_connected': '5',
'dynamic_capability': '6',
'dynamically_discovered': '7',
'fsw_wan1_admin': 'discovered',
'fsw_wan1_peer': 'test_value_9',
'fsw_wan2_admin': 'discovered',
'fsw_wan2_peer': 'test_value_11',
'max_allowed_trunk_members': '12',
'name': 'default_name_13',
'owner_vdom': 'test_value_14',
'poe_detection_type': '15',
'poe_pre_standard_detection': 'enable',
'pre_provisioned': '17',
'staged_image_version': 'test_value_18',
'switch_device_tag': 'test_value_19',
'switch_id': 'test_value_20',
'switch_profile': 'test_value_21',
'type': 'virtual',
'version': '23'
},
'vdom': 'root'}
is_error, changed, response = fortios_switch_controller_managed_switch.fortios_switch_controller(input_data, fos_instance)
expected_data = {'delayed-restart-trigger': '3',
'description': 'test_value_4',
'directly-connected': '5',
'dynamic-capability': '6',
'dynamically-discovered': '7',
'fsw-wan1-admin': 'discovered',
'fsw-wan1-peer': 'test_value_9',
'fsw-wan2-admin': 'discovered',
'fsw-wan2-peer': 'test_value_11',
'max-allowed-trunk-members': '12',
'name': 'default_name_13',
'owner-vdom': 'test_value_14',
'poe-detection-type': '15',
'poe-pre-standard-detection': 'enable',
'pre-provisioned': '17',
'staged-image-version': 'test_value_18',
'switch-device-tag': 'test_value_19',
'switch-id': 'test_value_20',
'switch-profile': 'test_value_21',
'type': 'virtual',
'version': '23'
}
set_method_mock.assert_called_with('switch-controller', 'managed-switch', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 500
def test_switch_controller_managed_switch_removal(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
delete_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
delete_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.delete', return_value=delete_method_result)
input_data = {
'username': 'admin',
'state': 'absent',
'switch_controller_managed_switch': {'delayed_restart_trigger': '3',
'description': 'test_value_4',
'directly_connected': '5',
'dynamic_capability': '6',
'dynamically_discovered': '7',
'fsw_wan1_admin': 'discovered',
'fsw_wan1_peer': 'test_value_9',
'fsw_wan2_admin': 'discovered',
'fsw_wan2_peer': 'test_value_11',
'max_allowed_trunk_members': '12',
'name': 'default_name_13',
'owner_vdom': 'test_value_14',
'poe_detection_type': '15',
'poe_pre_standard_detection': 'enable',
'pre_provisioned': '17',
'staged_image_version': 'test_value_18',
'switch_device_tag': 'test_value_19',
'switch_id': 'test_value_20',
'switch_profile': 'test_value_21',
'type': 'virtual',
'version': '23'
},
'vdom': 'root'}
is_error, changed, response = fortios_switch_controller_managed_switch.fortios_switch_controller(input_data, fos_instance)
delete_method_mock.assert_called_with('switch-controller', 'managed-switch', mkey=ANY, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200
def test_switch_controller_managed_switch_deletion_fails(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
delete_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
delete_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.delete', return_value=delete_method_result)
input_data = {
'username': 'admin',
'state': 'absent',
'switch_controller_managed_switch': {'delayed_restart_trigger': '3',
'description': 'test_value_4',
'directly_connected': '5',
'dynamic_capability': '6',
'dynamically_discovered': '7',
'fsw_wan1_admin': 'discovered',
'fsw_wan1_peer': 'test_value_9',
'fsw_wan2_admin': 'discovered',
'fsw_wan2_peer': 'test_value_11',
'max_allowed_trunk_members': '12',
'name': 'default_name_13',
'owner_vdom': 'test_value_14',
'poe_detection_type': '15',
'poe_pre_standard_detection': 'enable',
'pre_provisioned': '17',
'staged_image_version': 'test_value_18',
'switch_device_tag': 'test_value_19',
'switch_id': 'test_value_20',
'switch_profile': 'test_value_21',
'type': 'virtual',
'version': '23'
},
'vdom': 'root'}
is_error, changed, response = fortios_switch_controller_managed_switch.fortios_switch_controller(input_data, fos_instance)
delete_method_mock.assert_called_with('switch-controller', 'managed-switch', mkey=ANY, vdom='root')
schema_method_mock.assert_not_called()
assert is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 500
def test_switch_controller_managed_switch_idempotent(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'error', 'http_method': 'DELETE', 'http_status': 404}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'switch_controller_managed_switch': {'delayed_restart_trigger': '3',
'description': 'test_value_4',
'directly_connected': '5',
'dynamic_capability': '6',
'dynamically_discovered': '7',
'fsw_wan1_admin': 'discovered',
'fsw_wan1_peer': 'test_value_9',
'fsw_wan2_admin': 'discovered',
'fsw_wan2_peer': 'test_value_11',
'max_allowed_trunk_members': '12',
'name': 'default_name_13',
'owner_vdom': 'test_value_14',
'poe_detection_type': '15',
'poe_pre_standard_detection': 'enable',
'pre_provisioned': '17',
'staged_image_version': 'test_value_18',
'switch_device_tag': 'test_value_19',
'switch_id': 'test_value_20',
'switch_profile': 'test_value_21',
'type': 'virtual',
'version': '23'
},
'vdom': 'root'}
is_error, changed, response = fortios_switch_controller_managed_switch.fortios_switch_controller(input_data, fos_instance)
expected_data = {'delayed-restart-trigger': '3',
'description': 'test_value_4',
'directly-connected': '5',
'dynamic-capability': '6',
'dynamically-discovered': '7',
'fsw-wan1-admin': 'discovered',
'fsw-wan1-peer': 'test_value_9',
'fsw-wan2-admin': 'discovered',
'fsw-wan2-peer': 'test_value_11',
'max-allowed-trunk-members': '12',
'name': 'default_name_13',
'owner-vdom': 'test_value_14',
'poe-detection-type': '15',
'poe-pre-standard-detection': 'enable',
'pre-provisioned': '17',
'staged-image-version': 'test_value_18',
'switch-device-tag': 'test_value_19',
'switch-id': 'test_value_20',
'switch-profile': 'test_value_21',
'type': 'virtual',
'version': '23'
}
set_method_mock.assert_called_with('switch-controller', 'managed-switch', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 404
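# The next test verifies that attributes unknown to the module schema (here
# 'random_attribute_not_valid') are filtered out of the payload before it is sent.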
def test_switch_controller_managed_switch_filter_foreign_attributes(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'switch_controller_managed_switch': {
'random_attribute_not_valid': 'tag', 'delayed_restart_trigger': '3',
'description': 'test_value_4',
'directly_connected': '5',
'dynamic_capability': '6',
'dynamically_discovered': '7',
'fsw_wan1_admin': 'discovered',
'fsw_wan1_peer': 'test_value_9',
'fsw_wan2_admin': 'discovered',
'fsw_wan2_peer': 'test_value_11',
'max_allowed_trunk_members': '12',
'name': 'default_name_13',
'owner_vdom': 'test_value_14',
'poe_detection_type': '15',
'poe_pre_standard_detection': 'enable',
'pre_provisioned': '17',
'staged_image_version': 'test_value_18',
'switch_device_tag': 'test_value_19',
'switch_id': 'test_value_20',
'switch_profile': 'test_value_21',
'type': 'virtual',
'version': '23'
},
'vdom': 'root'}
is_error, changed, response = fortios_switch_controller_managed_switch.fortios_switch_controller(input_data, fos_instance)
expected_data = {'delayed-restart-trigger': '3',
'description': 'test_value_4',
'directly-connected': '5',
'dynamic-capability': '6',
'dynamically-discovered': '7',
'fsw-wan1-admin': 'discovered',
'fsw-wan1-peer': 'test_value_9',
'fsw-wan2-admin': 'discovered',
'fsw-wan2-peer': 'test_value_11',
'max-allowed-trunk-members': '12',
'name': 'default_name_13',
'owner-vdom': 'test_value_14',
'poe-detection-type': '15',
'poe-pre-standard-detection': 'enable',
'pre-provisioned': '17',
'staged-image-version': 'test_value_18',
'switch-device-tag': 'test_value_19',
'switch-id': 'test_value_20',
'switch-profile': 'test_value_21',
'type': 'virtual',
'version': '23'
}
set_method_mock.assert_called_with('switch-controller', 'managed-switch', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200
| 52.479487 | 142 | 0.494308 | 1,790 | 20,467 | 5.292737 | 0.116201 | 0.075997 | 0.06312 | 0.079586 | 0.896137 | 0.885793 | 0.871965 | 0.866477 | 0.866477 | 0.866477 | 0 | 0.02952 | 0.40084 | 20,467 | 389 | 143 | 52.614396 | 0.743048 | 0.032442 | 0 | 0.880734 | 0 | 0 | 0.337141 | 0.103149 | 0 | 0 | 0 | 0 | 0.110092 | 1 | 0.021407 | false | 0 | 0.024465 | 0 | 0.04893 | 0.003058 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
167831d53963297faf5b55c742c9b5c63e6e972d | 83,790 | py | Python | sdk/python/pulumi_azure_nextgen/costmanagement/_inputs.py | pulumi/pulumi-azure-nextgen | 452736b0a1cf584c2d4c04666e017af6e9b2c15c | ["Apache-2.0"] | 31 | 2020-09-21T09:41:01.000Z | 2021-02-26T13:21:59.000Z | sdk/python/pulumi_azure_nextgen/costmanagement/_inputs.py | pulumi/pulumi-azure-nextgen | 452736b0a1cf584c2d4c04666e017af6e9b2c15c | ["Apache-2.0"] | 231 | 2020-09-21T09:38:45.000Z | 2021-03-01T11:16:03.000Z | sdk/python/pulumi_azure_nextgen/costmanagement/_inputs.py | pulumi/pulumi-azure-nextgen | 452736b0a1cf584c2d4c04666e017af6e9b2c15c | ["Apache-2.0"] | 4 | 2020-09-29T14:14:59.000Z | 2021-02-10T20:38:16.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union
from .. import _utilities, _tables
from ._enums import *
__all__ = [
'BudgetTimePeriodArgs',
'CostAllocationProportionArgs',
'CostAllocationRuleDetailsArgs',
'CostAllocationRulePropertiesArgs',
'ExportDatasetArgs',
'ExportDatasetConfigurationArgs',
'ExportDefinitionArgs',
'ExportDeliveryDestinationArgs',
'ExportDeliveryInfoArgs',
'ExportRecurrencePeriodArgs',
'ExportScheduleArgs',
'ExportTimePeriodArgs',
'KpiPropertiesArgs',
'NotificationArgs',
'PivotPropertiesArgs',
'ReportAggregationArgs',
'ReportComparisonExpressionArgs',
'ReportConfigAggregationArgs',
'ReportConfigComparisonExpressionArgs',
'ReportConfigDatasetArgs',
'ReportConfigDatasetConfigurationArgs',
'ReportConfigDefinitionArgs',
'ReportConfigDeliveryDestinationArgs',
'ReportConfigDeliveryInfoArgs',
'ReportConfigFilterArgs',
'ReportConfigGroupingArgs',
'ReportConfigRecurrencePeriodArgs',
'ReportConfigScheduleArgs',
'ReportConfigSortingArgs',
'ReportConfigTimePeriodArgs',
'ReportDatasetArgs',
'ReportDatasetConfigurationArgs',
'ReportDefinitionArgs',
'ReportDeliveryDestinationArgs',
'ReportDeliveryInfoArgs',
'ReportFilterArgs',
'ReportGroupingArgs',
'ReportRecurrencePeriodArgs',
'ReportScheduleArgs',
'ReportTimePeriodArgs',
'SourceCostAllocationResourceArgs',
'TargetCostAllocationResourceArgs',
]
@pulumi.input_type
class BudgetTimePeriodArgs:
def __init__(__self__, *,
start_date: pulumi.Input[str],
end_date: Optional[pulumi.Input[str]] = None):
"""
The start and end date for a budget.
:param pulumi.Input[str] start_date: The start date for the budget.
:param pulumi.Input[str] end_date: The end date for the budget. If not provided, we default this to 10 years from the start date.
"""
pulumi.set(__self__, "start_date", start_date)
if end_date is not None:
pulumi.set(__self__, "end_date", end_date)
@property
@pulumi.getter(name="startDate")
def start_date(self) -> pulumi.Input[str]:
"""
The start date for the budget.
"""
return pulumi.get(self, "start_date")
@start_date.setter
def start_date(self, value: pulumi.Input[str]):
pulumi.set(self, "start_date", value)
@property
@pulumi.getter(name="endDate")
def end_date(self) -> Optional[pulumi.Input[str]]:
"""
The end date for the budget. If not provided, we default this to 10 years from the start date.
"""
return pulumi.get(self, "end_date")
@end_date.setter
def end_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "end_date", value)
@pulumi.input_type
class CostAllocationProportionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
percentage: pulumi.Input[float]):
"""
Target resources and allocation
:param pulumi.Input[str] name: Target resource for cost allocation
:param pulumi.Input[float] percentage: Percentage of source cost to allocate to this resource. This value can be specified to two decimal places and the total percentage of all resources in this rule must sum to 100.00.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "percentage", percentage)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Target resource for cost allocation
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def percentage(self) -> pulumi.Input[float]:
"""
Percentage of source cost to allocate to this resource. This value can be specified to two decimal places and the total percentage of all resources in this rule must sum to 100.00.
"""
return pulumi.get(self, "percentage")
@percentage.setter
def percentage(self, value: pulumi.Input[float]):
pulumi.set(self, "percentage", value)
@pulumi.input_type
class CostAllocationRuleDetailsArgs:
def __init__(__self__, *,
source_resources: Optional[pulumi.Input[Sequence[pulumi.Input['SourceCostAllocationResourceArgs']]]] = None,
target_resources: Optional[pulumi.Input[Sequence[pulumi.Input['TargetCostAllocationResourceArgs']]]] = None):
"""
Resource details of the cost allocation rule
:param pulumi.Input[Sequence[pulumi.Input['SourceCostAllocationResourceArgs']]] source_resources: Source resources for cost allocation. At this time, this list can contain no more than one element.
:param pulumi.Input[Sequence[pulumi.Input['TargetCostAllocationResourceArgs']]] target_resources: Target resources for cost allocation. At this time, this list can contain no more than one element.
"""
if source_resources is not None:
pulumi.set(__self__, "source_resources", source_resources)
if target_resources is not None:
pulumi.set(__self__, "target_resources", target_resources)
@property
@pulumi.getter(name="sourceResources")
def source_resources(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['SourceCostAllocationResourceArgs']]]]:
"""
Source resources for cost allocation. At this time, this list can contain no more than one element.
"""
return pulumi.get(self, "source_resources")
@source_resources.setter
def source_resources(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['SourceCostAllocationResourceArgs']]]]):
pulumi.set(self, "source_resources", value)
@property
@pulumi.getter(name="targetResources")
def target_resources(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['TargetCostAllocationResourceArgs']]]]:
"""
Target resources for cost allocation. At this time, this list can contain no more than one element.
"""
return pulumi.get(self, "target_resources")
@target_resources.setter
def target_resources(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['TargetCostAllocationResourceArgs']]]]):
pulumi.set(self, "target_resources", value)
@pulumi.input_type
class CostAllocationRulePropertiesArgs:
def __init__(__self__, *,
details: pulumi.Input['CostAllocationRuleDetailsArgs'],
status: pulumi.Input[Union[str, 'RuleStatus']],
description: Optional[pulumi.Input[str]] = None):
"""
The properties of a cost allocation rule
:param pulumi.Input['CostAllocationRuleDetailsArgs'] details: Resource information for the cost allocation rule
:param pulumi.Input[Union[str, 'RuleStatus']] status: Status of the rule
:param pulumi.Input[str] description: Description of a cost allocation rule.
"""
pulumi.set(__self__, "details", details)
pulumi.set(__self__, "status", status)
if description is not None:
pulumi.set(__self__, "description", description)
@property
@pulumi.getter
def details(self) -> pulumi.Input['CostAllocationRuleDetailsArgs']:
"""
Resource information for the cost allocation rule
"""
return pulumi.get(self, "details")
@details.setter
def details(self, value: pulumi.Input['CostAllocationRuleDetailsArgs']):
pulumi.set(self, "details", value)
@property
@pulumi.getter
def status(self) -> pulumi.Input[Union[str, 'RuleStatus']]:
"""
Status of the rule
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: pulumi.Input[Union[str, 'RuleStatus']]):
pulumi.set(self, "status", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of a cost allocation rule.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@pulumi.input_type
class ExportDatasetArgs:
def __init__(__self__, *,
configuration: Optional[pulumi.Input['ExportDatasetConfigurationArgs']] = None,
granularity: Optional[pulumi.Input[Union[str, 'GranularityType']]] = None):
"""
The definition for data in the export.
:param pulumi.Input['ExportDatasetConfigurationArgs'] configuration: The export dataset configuration.
:param pulumi.Input[Union[str, 'GranularityType']] granularity: The granularity of rows in the export. Currently only 'Daily' is supported.
"""
if configuration is not None:
pulumi.set(__self__, "configuration", configuration)
if granularity is not None:
pulumi.set(__self__, "granularity", granularity)
@property
@pulumi.getter
def configuration(self) -> Optional[pulumi.Input['ExportDatasetConfigurationArgs']]:
"""
The export dataset configuration.
"""
return pulumi.get(self, "configuration")
@configuration.setter
def configuration(self, value: Optional[pulumi.Input['ExportDatasetConfigurationArgs']]):
pulumi.set(self, "configuration", value)
@property
@pulumi.getter
def granularity(self) -> Optional[pulumi.Input[Union[str, 'GranularityType']]]:
"""
The granularity of rows in the export. Currently only 'Daily' is supported.
"""
return pulumi.get(self, "granularity")
@granularity.setter
def granularity(self, value: Optional[pulumi.Input[Union[str, 'GranularityType']]]):
pulumi.set(self, "granularity", value)
@pulumi.input_type
class ExportDatasetConfigurationArgs:
def __init__(__self__, *,
columns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The export dataset configuration. Allows columns to be selected for the export. If not provided then the export will include all available columns.
:param pulumi.Input[Sequence[pulumi.Input[str]]] columns: Array of column names to be included in the export. If not provided then the export will include all available columns. The available columns can vary by customer channel (see examples).
"""
if columns is not None:
pulumi.set(__self__, "columns", columns)
@property
@pulumi.getter
def columns(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Array of column names to be included in the export. If not provided then the export will include all available columns. The available columns can vary by customer channel (see examples).
"""
return pulumi.get(self, "columns")
@columns.setter
def columns(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "columns", value)
@pulumi.input_type
class ExportDefinitionArgs:
def __init__(__self__, *,
timeframe: pulumi.Input[Union[str, 'TimeframeType']],
type: pulumi.Input[Union[str, 'ExportType']],
data_set: Optional[pulumi.Input['ExportDatasetArgs']] = None,
time_period: Optional[pulumi.Input['ExportTimePeriodArgs']] = None):
"""
The definition of an export.
:param pulumi.Input[Union[str, 'TimeframeType']] timeframe: The time frame for pulling data for the export. If custom, then a specific time period must be provided.
:param pulumi.Input[Union[str, 'ExportType']] type: The type of the export. Note that 'Usage' is equivalent to 'ActualCost' and is applicable to exports that do not yet provide data for charges or amortization for service reservations.
:param pulumi.Input['ExportDatasetArgs'] data_set: The definition for data in the export.
:param pulumi.Input['ExportTimePeriodArgs'] time_period: Has time period for pulling data for the export.
"""
pulumi.set(__self__, "timeframe", timeframe)
pulumi.set(__self__, "type", type)
if data_set is not None:
pulumi.set(__self__, "data_set", data_set)
if time_period is not None:
pulumi.set(__self__, "time_period", time_period)
@property
@pulumi.getter
def timeframe(self) -> pulumi.Input[Union[str, 'TimeframeType']]:
"""
The time frame for pulling data for the export. If custom, then a specific time period must be provided.
"""
return pulumi.get(self, "timeframe")
@timeframe.setter
def timeframe(self, value: pulumi.Input[Union[str, 'TimeframeType']]):
pulumi.set(self, "timeframe", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[Union[str, 'ExportType']]:
"""
The type of the export. Note that 'Usage' is equivalent to 'ActualCost' and is applicable to exports that do not yet provide data for charges or amortization for service reservations.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[Union[str, 'ExportType']]):
pulumi.set(self, "type", value)
@property
@pulumi.getter(name="dataSet")
def data_set(self) -> Optional[pulumi.Input['ExportDatasetArgs']]:
"""
The definition for data in the export.
"""
return pulumi.get(self, "data_set")
@data_set.setter
def data_set(self, value: Optional[pulumi.Input['ExportDatasetArgs']]):
pulumi.set(self, "data_set", value)
@property
@pulumi.getter(name="timePeriod")
def time_period(self) -> Optional[pulumi.Input['ExportTimePeriodArgs']]:
"""
Has time period for pulling data for the export.
"""
return pulumi.get(self, "time_period")
@time_period.setter
def time_period(self, value: Optional[pulumi.Input['ExportTimePeriodArgs']]):
pulumi.set(self, "time_period", value)
@pulumi.input_type
class ExportDeliveryDestinationArgs:
def __init__(__self__, *,
container: pulumi.Input[str],
resource_id: pulumi.Input[str],
root_folder_path: Optional[pulumi.Input[str]] = None):
"""
The destination information for the delivery of the export. To allow access to a storage account, you must register the account's subscription with the Microsoft.CostManagementExports resource provider. This is required once per subscription. When creating an export in the Azure portal, this registration is performed automatically; however, API users need to register the subscription themselves. For more information, see https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-supported-services .
:param pulumi.Input[str] container: The name of the container where exports will be uploaded.
:param pulumi.Input[str] resource_id: The resource id of the storage account where exports will be delivered.
:param pulumi.Input[str] root_folder_path: The name of the directory where exports will be uploaded.
"""
pulumi.set(__self__, "container", container)
pulumi.set(__self__, "resource_id", resource_id)
if root_folder_path is not None:
pulumi.set(__self__, "root_folder_path", root_folder_path)
@property
@pulumi.getter
def container(self) -> pulumi.Input[str]:
"""
The name of the container where exports will be uploaded.
"""
return pulumi.get(self, "container")
@container.setter
def container(self, value: pulumi.Input[str]):
pulumi.set(self, "container", value)
@property
@pulumi.getter(name="resourceId")
def resource_id(self) -> pulumi.Input[str]:
"""
The resource id of the storage account where exports will be delivered.
"""
return pulumi.get(self, "resource_id")
@resource_id.setter
def resource_id(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_id", value)
@property
@pulumi.getter(name="rootFolderPath")
def root_folder_path(self) -> Optional[pulumi.Input[str]]:
"""
The name of the directory where exports will be uploaded.
"""
return pulumi.get(self, "root_folder_path")
@root_folder_path.setter
def root_folder_path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "root_folder_path", value)
@pulumi.input_type
class ExportDeliveryInfoArgs:
def __init__(__self__, *,
destination: pulumi.Input['ExportDeliveryDestinationArgs']):
"""
The delivery information associated with an export.
:param pulumi.Input['ExportDeliveryDestinationArgs'] destination: Has destination for the export being delivered.
"""
pulumi.set(__self__, "destination", destination)
@property
@pulumi.getter
def destination(self) -> pulumi.Input['ExportDeliveryDestinationArgs']:
"""
Has destination for the export being delivered.
"""
return pulumi.get(self, "destination")
@destination.setter
def destination(self, value: pulumi.Input['ExportDeliveryDestinationArgs']):
pulumi.set(self, "destination", value)
@pulumi.input_type
class ExportRecurrencePeriodArgs:
def __init__(__self__, *,
from_: pulumi.Input[str],
to: Optional[pulumi.Input[str]] = None):
"""
The start and end date for recurrence schedule.
:param pulumi.Input[str] from_: The start date of recurrence.
:param pulumi.Input[str] to: The end date of recurrence.
"""
pulumi.set(__self__, "from_", from_)
if to is not None:
pulumi.set(__self__, "to", to)
@property
@pulumi.getter(name="from")
def from_(self) -> pulumi.Input[str]:
"""
The start date of recurrence.
"""
return pulumi.get(self, "from_")
@from_.setter
def from_(self, value: pulumi.Input[str]):
pulumi.set(self, "from_", value)
@property
@pulumi.getter
def to(self) -> Optional[pulumi.Input[str]]:
"""
The end date of recurrence.
"""
return pulumi.get(self, "to")
@to.setter
def to(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "to", value)
@pulumi.input_type
class ExportScheduleArgs:
def __init__(__self__, *,
recurrence: Optional[pulumi.Input[Union[str, 'RecurrenceType']]] = None,
recurrence_period: Optional[pulumi.Input['ExportRecurrencePeriodArgs']] = None,
status: Optional[pulumi.Input[Union[str, 'StatusType']]] = None):
"""
The schedule associated with the export.
:param pulumi.Input[Union[str, 'RecurrenceType']] recurrence: The schedule recurrence.
:param pulumi.Input['ExportRecurrencePeriodArgs'] recurrence_period: Has start and end date of the recurrence. The start date must be in the future. If present, the end date must be greater than the start date.
:param pulumi.Input[Union[str, 'StatusType']] status: The status of the export's schedule. If 'Inactive', the export's schedule is paused.
"""
if recurrence is not None:
pulumi.set(__self__, "recurrence", recurrence)
if recurrence_period is not None:
pulumi.set(__self__, "recurrence_period", recurrence_period)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def recurrence(self) -> Optional[pulumi.Input[Union[str, 'RecurrenceType']]]:
"""
The schedule recurrence.
"""
return pulumi.get(self, "recurrence")
@recurrence.setter
def recurrence(self, value: Optional[pulumi.Input[Union[str, 'RecurrenceType']]]):
pulumi.set(self, "recurrence", value)
@property
@pulumi.getter(name="recurrencePeriod")
def recurrence_period(self) -> Optional[pulumi.Input['ExportRecurrencePeriodArgs']]:
"""
Has start and end date of the recurrence. The start date must be in the future. If present, the end date must be greater than the start date.
"""
return pulumi.get(self, "recurrence_period")
@recurrence_period.setter
def recurrence_period(self, value: Optional[pulumi.Input['ExportRecurrencePeriodArgs']]):
pulumi.set(self, "recurrence_period", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[Union[str, 'StatusType']]]:
"""
The status of the export's schedule. If 'Inactive', the export's schedule is paused.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[Union[str, 'StatusType']]]):
pulumi.set(self, "status", value)
@pulumi.input_type
class ExportTimePeriodArgs:
def __init__(__self__, *,
from_: pulumi.Input[str],
to: pulumi.Input[str]):
"""
The date range for data in the export. This should only be specified with timeFrame set to 'Custom'. The maximum date range is 3 months.
:param pulumi.Input[str] from_: The start date for export data.
:param pulumi.Input[str] to: The end date for export data.
"""
pulumi.set(__self__, "from_", from_)
pulumi.set(__self__, "to", to)
@property
@pulumi.getter(name="from")
def from_(self) -> pulumi.Input[str]:
"""
The start date for export data.
"""
return pulumi.get(self, "from_")
@from_.setter
def from_(self, value: pulumi.Input[str]):
pulumi.set(self, "from_", value)
@property
@pulumi.getter
def to(self) -> pulumi.Input[str]:
"""
The end date for export data.
"""
return pulumi.get(self, "to")
@to.setter
def to(self, value: pulumi.Input[str]):
pulumi.set(self, "to", value)
@pulumi.input_type
class KpiPropertiesArgs:
def __init__(__self__, *,
enabled: Optional[pulumi.Input[bool]] = None,
id: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[Union[str, 'KpiTypeType']]] = None):
"""
Each KPI must contain a 'type' and 'enabled' key.
:param pulumi.Input[bool] enabled: Whether to show the KPI in the UI.
:param pulumi.Input[str] id: ID of resource related to metric (budget).
:param pulumi.Input[Union[str, 'KpiTypeType']] type: KPI type (Forecast, Budget).
"""
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if id is not None:
pulumi.set(__self__, "id", id)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to show the KPI in the UI.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter
def id(self) -> Optional[pulumi.Input[str]]:
"""
ID of resource related to metric (budget).
"""
return pulumi.get(self, "id")
@id.setter
def id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "id", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[Union[str, 'KpiTypeType']]]:
"""
KPI type (Forecast, Budget).
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[Union[str, 'KpiTypeType']]]):
pulumi.set(self, "type", value)
@pulumi.input_type
class NotificationArgs:
def __init__(__self__, *,
contact_emails: pulumi.Input[Sequence[pulumi.Input[str]]],
enabled: pulumi.Input[bool],
operator: pulumi.Input[Union[str, 'NotificationOperatorType']],
threshold: pulumi.Input[float],
contact_groups: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
contact_roles: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The notification associated with a budget.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_emails: Email addresses to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[bool] enabled: Whether the notification is enabled.
:param pulumi.Input[Union[str, 'NotificationOperatorType']] operator: The comparison operator.
:param pulumi.Input[float] threshold: Threshold value associated with a notification. Notification is sent when the cost exceeds the threshold. It is always a percentage and must be between 0 and 1000.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_groups: Action groups to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_roles: Contact roles to send the budget notification to when the threshold is exceeded.
"""
pulumi.set(__self__, "contact_emails", contact_emails)
pulumi.set(__self__, "enabled", enabled)
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "threshold", threshold)
if contact_groups is not None:
pulumi.set(__self__, "contact_groups", contact_groups)
if contact_roles is not None:
pulumi.set(__self__, "contact_roles", contact_roles)
@property
@pulumi.getter(name="contactEmails")
def contact_emails(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Email addresses to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_emails")
@contact_emails.setter
def contact_emails(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "contact_emails", value)
@property
@pulumi.getter
def enabled(self) -> pulumi.Input[bool]:
"""
Whether the notification is enabled.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: pulumi.Input[bool]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter
def operator(self) -> pulumi.Input[Union[str, 'NotificationOperatorType']]:
"""
The comparison operator.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: pulumi.Input[Union[str, 'NotificationOperatorType']]):
pulumi.set(self, "operator", value)
@property
@pulumi.getter
def threshold(self) -> pulumi.Input[float]:
"""
Threshold value associated with a notification. Notification is sent when the cost exceeds the threshold. It is always a percentage and must be between 0 and 1000.
"""
return pulumi.get(self, "threshold")
@threshold.setter
def threshold(self, value: pulumi.Input[float]):
pulumi.set(self, "threshold", value)
@property
@pulumi.getter(name="contactGroups")
def contact_groups(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Action groups to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_groups")
@contact_groups.setter
def contact_groups(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_groups", value)
@property
@pulumi.getter(name="contactRoles")
def contact_roles(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Contact roles to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_roles")
@contact_roles.setter
def contact_roles(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_roles", value)
@pulumi.input_type
class PivotPropertiesArgs:
def __init__(__self__, *,
name: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[Union[str, 'PivotTypeType']]] = None):
"""
Each pivot must contain a 'type' and 'name'.
:param pulumi.Input[str] name: Data field to show in view.
:param pulumi.Input[Union[str, 'PivotTypeType']] type: Data type to show in view.
"""
if name is not None:
pulumi.set(__self__, "name", name)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Data field to show in view.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[Union[str, 'PivotTypeType']]]:
"""
Data type to show in view.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[Union[str, 'PivotTypeType']]]):
pulumi.set(self, "type", value)
@pulumi.input_type
class ReportAggregationArgs:
def __init__(__self__, *,
function: pulumi.Input[Union[str, 'FunctionType']],
name: pulumi.Input[str]):
"""
The aggregation expression to be used in the report.
:param pulumi.Input[Union[str, 'FunctionType']] function: The name of the aggregation function to use.
:param pulumi.Input[str] name: The name of the column to aggregate.
"""
pulumi.set(__self__, "function", function)
pulumi.set(__self__, "name", name)
@property
@pulumi.getter
def function(self) -> pulumi.Input[Union[str, 'FunctionType']]:
"""
The name of the aggregation function to use.
"""
return pulumi.get(self, "function")
@function.setter
def function(self, value: pulumi.Input[Union[str, 'FunctionType']]):
pulumi.set(self, "function", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to aggregate.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@pulumi.input_type
class ReportComparisonExpressionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
operator: pulumi.Input[Union[str, 'OperatorType']],
values: pulumi.Input[Sequence[pulumi.Input[str]]]):
"""
The comparison expression to be used in the report.
:param pulumi.Input[str] name: The name of the column to use in comparison.
:param pulumi.Input[Union[str, 'OperatorType']] operator: The operator to use for comparison.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Array of values to use for comparison
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "values", values)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to use in comparison.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def operator(self) -> pulumi.Input[Union[str, 'OperatorType']]:
"""
The operator to use for comparison.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: pulumi.Input[Union[str, 'OperatorType']]):
pulumi.set(self, "operator", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Array of values to use for comparison
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@pulumi.input_type
class ReportConfigAggregationArgs:
def __init__(__self__, *,
function: pulumi.Input[Union[str, 'FunctionType']],
name: pulumi.Input[str]):
"""
The aggregation expression to be used in the report.
:param pulumi.Input[Union[str, 'FunctionType']] function: The name of the aggregation function to use.
:param pulumi.Input[str] name: The name of the column to aggregate.
"""
pulumi.set(__self__, "function", function)
pulumi.set(__self__, "name", name)
@property
@pulumi.getter
def function(self) -> pulumi.Input[Union[str, 'FunctionType']]:
"""
The name of the aggregation function to use.
"""
return pulumi.get(self, "function")
@function.setter
def function(self, value: pulumi.Input[Union[str, 'FunctionType']]):
pulumi.set(self, "function", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to aggregate.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@pulumi.input_type
class ReportConfigComparisonExpressionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
operator: pulumi.Input[Union[str, 'OperatorType']],
values: pulumi.Input[Sequence[pulumi.Input[str]]]):
"""
The comparison expression to be used in the report.
:param pulumi.Input[str] name: The name of the column to use in comparison.
:param pulumi.Input[Union[str, 'OperatorType']] operator: The operator to use for comparison.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Array of values to use for comparison
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "values", values)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to use in comparison.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def operator(self) -> pulumi.Input[Union[str, 'OperatorType']]:
"""
The operator to use for comparison.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: pulumi.Input[Union[str, 'OperatorType']]):
pulumi.set(self, "operator", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Array of values to use for comparison
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@pulumi.input_type
class ReportConfigDatasetArgs:
def __init__(__self__, *,
aggregation: Optional[pulumi.Input[Mapping[str, pulumi.Input['ReportConfigAggregationArgs']]]] = None,
configuration: Optional[pulumi.Input['ReportConfigDatasetConfigurationArgs']] = None,
filter: Optional[pulumi.Input['ReportConfigFilterArgs']] = None,
granularity: Optional[pulumi.Input[Union[str, 'ReportGranularityType']]] = None,
grouping: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigGroupingArgs']]]] = None,
sorting: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigSortingArgs']]]] = None):
"""
The definition of data present in the report.
:param pulumi.Input[Mapping[str, pulumi.Input['ReportConfigAggregationArgs']]] aggregation: Dictionary of aggregation expression to use in the report. The key of each item in the dictionary is the alias for the aggregated column. Report can have up to 2 aggregation clauses.
:param pulumi.Input['ReportConfigDatasetConfigurationArgs'] configuration: Has configuration information for the data in the report. The configuration will be ignored if aggregation and grouping are provided.
:param pulumi.Input['ReportConfigFilterArgs'] filter: Has filter expression to use in the report.
:param pulumi.Input[Union[str, 'ReportGranularityType']] granularity: The granularity of rows in the report.
:param pulumi.Input[Sequence[pulumi.Input['ReportConfigGroupingArgs']]] grouping: Array of group by expression to use in the report. Report can have up to 2 group by clauses.
:param pulumi.Input[Sequence[pulumi.Input['ReportConfigSortingArgs']]] sorting: Array of order by expression to use in the report.
"""
if aggregation is not None:
pulumi.set(__self__, "aggregation", aggregation)
if configuration is not None:
pulumi.set(__self__, "configuration", configuration)
if filter is not None:
pulumi.set(__self__, "filter", filter)
if granularity is not None:
pulumi.set(__self__, "granularity", granularity)
if grouping is not None:
pulumi.set(__self__, "grouping", grouping)
if sorting is not None:
pulumi.set(__self__, "sorting", sorting)
@property
@pulumi.getter
def aggregation(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input['ReportConfigAggregationArgs']]]]:
"""
Dictionary of aggregation expression to use in the report. The key of each item in the dictionary is the alias for the aggregated column. Report can have up to 2 aggregation clauses.
"""
return pulumi.get(self, "aggregation")
@aggregation.setter
def aggregation(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input['ReportConfigAggregationArgs']]]]):
pulumi.set(self, "aggregation", value)
@property
@pulumi.getter
def configuration(self) -> Optional[pulumi.Input['ReportConfigDatasetConfigurationArgs']]:
"""
Has configuration information for the data in the report. The configuration will be ignored if aggregation and grouping are provided.
"""
return pulumi.get(self, "configuration")
@configuration.setter
def configuration(self, value: Optional[pulumi.Input['ReportConfigDatasetConfigurationArgs']]):
pulumi.set(self, "configuration", value)
@property
@pulumi.getter
def filter(self) -> Optional[pulumi.Input['ReportConfigFilterArgs']]:
"""
Has filter expression to use in the report.
"""
return pulumi.get(self, "filter")
@filter.setter
def filter(self, value: Optional[pulumi.Input['ReportConfigFilterArgs']]):
pulumi.set(self, "filter", value)
@property
@pulumi.getter
def granularity(self) -> Optional[pulumi.Input[Union[str, 'ReportGranularityType']]]:
"""
The granularity of rows in the report.
"""
return pulumi.get(self, "granularity")
@granularity.setter
def granularity(self, value: Optional[pulumi.Input[Union[str, 'ReportGranularityType']]]):
pulumi.set(self, "granularity", value)
@property
@pulumi.getter
def grouping(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigGroupingArgs']]]]:
"""
Array of group by expression to use in the report. Report can have up to 2 group by clauses.
"""
return pulumi.get(self, "grouping")
@grouping.setter
def grouping(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigGroupingArgs']]]]):
pulumi.set(self, "grouping", value)
@property
@pulumi.getter
def sorting(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigSortingArgs']]]]:
"""
Array of order by expression to use in the report.
"""
return pulumi.get(self, "sorting")
@sorting.setter
def sorting(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigSortingArgs']]]]):
pulumi.set(self, "sorting", value)
@pulumi.input_type
class ReportConfigDatasetConfigurationArgs:
def __init__(__self__, *,
columns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The configuration of dataset in the report.
:param pulumi.Input[Sequence[pulumi.Input[str]]] columns: Array of column names to be included in the report. Any valid report column name is allowed. If not provided, then report includes all columns.
"""
if columns is not None:
pulumi.set(__self__, "columns", columns)
@property
@pulumi.getter
def columns(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Array of column names to be included in the report. Any valid report column name is allowed. If not provided, then report includes all columns.
"""
return pulumi.get(self, "columns")
@columns.setter
def columns(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "columns", value)
@pulumi.input_type
class ReportConfigDefinitionArgs:
def __init__(__self__, *,
timeframe: pulumi.Input[Union[str, 'TimeframeType']],
type: pulumi.Input[Union[str, 'ReportType']],
dataset: Optional[pulumi.Input['ReportConfigDatasetArgs']] = None,
time_period: Optional[pulumi.Input['ReportConfigTimePeriodArgs']] = None):
"""
The definition of a report config.
:param pulumi.Input[Union[str, 'TimeframeType']] timeframe: The time frame for pulling data for the report. If custom, then a specific time period must be provided.
:param pulumi.Input[Union[str, 'ReportType']] type: The type of the report.
:param pulumi.Input['ReportConfigDatasetArgs'] dataset: Has definition for data in this report config.
:param pulumi.Input['ReportConfigTimePeriodArgs'] time_period: Has time period for pulling data for the report.
"""
pulumi.set(__self__, "timeframe", timeframe)
pulumi.set(__self__, "type", type)
if dataset is not None:
pulumi.set(__self__, "dataset", dataset)
if time_period is not None:
pulumi.set(__self__, "time_period", time_period)
@property
@pulumi.getter
def timeframe(self) -> pulumi.Input[Union[str, 'TimeframeType']]:
"""
The time frame for pulling data for the report. If custom, then a specific time period must be provided.
"""
return pulumi.get(self, "timeframe")
@timeframe.setter
def timeframe(self, value: pulumi.Input[Union[str, 'TimeframeType']]):
pulumi.set(self, "timeframe", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[Union[str, 'ReportType']]:
"""
The type of the report.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[Union[str, 'ReportType']]):
pulumi.set(self, "type", value)
@property
@pulumi.getter
def dataset(self) -> Optional[pulumi.Input['ReportConfigDatasetArgs']]:
"""
Has definition for data in this report config.
"""
return pulumi.get(self, "dataset")
@dataset.setter
def dataset(self, value: Optional[pulumi.Input['ReportConfigDatasetArgs']]):
pulumi.set(self, "dataset", value)
@property
@pulumi.getter(name="timePeriod")
def time_period(self) -> Optional[pulumi.Input['ReportConfigTimePeriodArgs']]:
"""
Has time period for pulling data for the report.
"""
return pulumi.get(self, "time_period")
@time_period.setter
def time_period(self, value: Optional[pulumi.Input['ReportConfigTimePeriodArgs']]):
pulumi.set(self, "time_period", value)
@pulumi.input_type
class ReportConfigDeliveryDestinationArgs:
def __init__(__self__, *,
container: pulumi.Input[str],
resource_id: pulumi.Input[str],
root_folder_path: Optional[pulumi.Input[str]] = None):
"""
The destination information for the delivery of the report.
:param pulumi.Input[str] container: The name of the container where reports will be uploaded.
:param pulumi.Input[str] resource_id: The resource id of the storage account where reports will be delivered.
:param pulumi.Input[str] root_folder_path: The name of the directory where reports will be uploaded.
"""
pulumi.set(__self__, "container", container)
pulumi.set(__self__, "resource_id", resource_id)
if root_folder_path is not None:
pulumi.set(__self__, "root_folder_path", root_folder_path)
@property
@pulumi.getter
def container(self) -> pulumi.Input[str]:
"""
The name of the container where reports will be uploaded.
"""
return pulumi.get(self, "container")
@container.setter
def container(self, value: pulumi.Input[str]):
pulumi.set(self, "container", value)
@property
@pulumi.getter(name="resourceId")
def resource_id(self) -> pulumi.Input[str]:
"""
The resource id of the storage account where reports will be delivered.
"""
return pulumi.get(self, "resource_id")
@resource_id.setter
def resource_id(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_id", value)
@property
@pulumi.getter(name="rootFolderPath")
def root_folder_path(self) -> Optional[pulumi.Input[str]]:
"""
The name of the directory where reports will be uploaded.
"""
return pulumi.get(self, "root_folder_path")
@root_folder_path.setter
def root_folder_path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "root_folder_path", value)
@pulumi.input_type
class ReportConfigDeliveryInfoArgs:
def __init__(__self__, *,
destination: pulumi.Input['ReportConfigDeliveryDestinationArgs']):
"""
The delivery information associated with a report config.
:param pulumi.Input['ReportConfigDeliveryDestinationArgs'] destination: Has destination for the report being delivered.
"""
pulumi.set(__self__, "destination", destination)
@property
@pulumi.getter
def destination(self) -> pulumi.Input['ReportConfigDeliveryDestinationArgs']:
"""
Has destination for the report being delivered.
"""
return pulumi.get(self, "destination")
@destination.setter
def destination(self, value: pulumi.Input['ReportConfigDeliveryDestinationArgs']):
pulumi.set(self, "destination", value)
@pulumi.input_type
class ReportConfigFilterArgs:
def __init__(__self__, *,
and_: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]]] = None,
dimension: Optional[pulumi.Input['ReportConfigComparisonExpressionArgs']] = None,
not_: Optional[pulumi.Input['ReportConfigFilterArgs']] = None,
or_: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]]] = None,
tag: Optional[pulumi.Input['ReportConfigComparisonExpressionArgs']] = None):
"""
The filter expression to be used in the report.
:param pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]] and_: The logical "AND" expression. Must have at least 2 items.
:param pulumi.Input['ReportConfigComparisonExpressionArgs'] dimension: Has comparison expression for a dimension
:param pulumi.Input['ReportConfigFilterArgs'] not_: The logical "NOT" expression.
:param pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]] or_: The logical "OR" expression. Must have at least 2 items.
:param pulumi.Input['ReportConfigComparisonExpressionArgs'] tag: Has comparison expression for a tag
"""
if and_ is not None:
pulumi.set(__self__, "and_", and_)
if dimension is not None:
pulumi.set(__self__, "dimension", dimension)
if not_ is not None:
pulumi.set(__self__, "not_", not_)
if or_ is not None:
pulumi.set(__self__, "or_", or_)
if tag is not None:
pulumi.set(__self__, "tag", tag)
@property
@pulumi.getter(name="and")
def and_(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]]]:
"""
The logical "AND" expression. Must have at least 2 items.
"""
return pulumi.get(self, "and_")
@and_.setter
def and_(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]]]):
pulumi.set(self, "and_", value)
@property
@pulumi.getter
def dimension(self) -> Optional[pulumi.Input['ReportConfigComparisonExpressionArgs']]:
"""
Has comparison expression for a dimension
"""
return pulumi.get(self, "dimension")
@dimension.setter
def dimension(self, value: Optional[pulumi.Input['ReportConfigComparisonExpressionArgs']]):
pulumi.set(self, "dimension", value)
@property
@pulumi.getter(name="not")
def not_(self) -> Optional[pulumi.Input['ReportConfigFilterArgs']]:
"""
The logical "NOT" expression.
"""
return pulumi.get(self, "not_")
@not_.setter
def not_(self, value: Optional[pulumi.Input['ReportConfigFilterArgs']]):
pulumi.set(self, "not_", value)
@property
@pulumi.getter(name="or")
def or_(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]]]:
"""
The logical "OR" expression. Must have at least 2 items.
"""
return pulumi.get(self, "or_")
@or_.setter
def or_(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportConfigFilterArgs']]]]):
pulumi.set(self, "or_", value)
@property
@pulumi.getter
def tag(self) -> Optional[pulumi.Input['ReportConfigComparisonExpressionArgs']]:
"""
Has comparison expression for a tag
"""
return pulumi.get(self, "tag")
@tag.setter
def tag(self, value: Optional[pulumi.Input['ReportConfigComparisonExpressionArgs']]):
pulumi.set(self, "tag", value)
@pulumi.input_type
class ReportConfigGroupingArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
type: pulumi.Input[Union[str, 'ReportConfigColumnType']]):
"""
The group by expression to be used in the report.
:param pulumi.Input[str] name: The name of the column to group. This version supports subscription lowest possible grain.
:param pulumi.Input[Union[str, 'ReportConfigColumnType']] type: Has type of the column to group.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "type", type)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to group. This version supports subscription lowest possible grain.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[Union[str, 'ReportConfigColumnType']]:
"""
Has type of the column to group.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[Union[str, 'ReportConfigColumnType']]):
pulumi.set(self, "type", value)
@pulumi.input_type
class ReportConfigRecurrencePeriodArgs:
def __init__(__self__, *,
from_: pulumi.Input[str],
to: Optional[pulumi.Input[str]] = None):
"""
The start and end date for recurrence schedule.
:param pulumi.Input[str] from_: The start date of recurrence.
:param pulumi.Input[str] to: The end date of recurrence. If not provided, we default this to 10 years from the start date.
"""
pulumi.set(__self__, "from_", from_)
if to is not None:
pulumi.set(__self__, "to", to)
@property
@pulumi.getter(name="from")
def from_(self) -> pulumi.Input[str]:
"""
The start date of recurrence.
"""
return pulumi.get(self, "from_")
@from_.setter
def from_(self, value: pulumi.Input[str]):
pulumi.set(self, "from_", value)
@property
@pulumi.getter
def to(self) -> Optional[pulumi.Input[str]]:
"""
The end date of recurrence. If not provided, we default this to 10 years from the start date.
"""
return pulumi.get(self, "to")
@to.setter
def to(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "to", value)
@pulumi.input_type
class ReportConfigScheduleArgs:
def __init__(__self__, *,
recurrence: pulumi.Input[Union[str, 'RecurrenceType']],
recurrence_period: pulumi.Input['ReportConfigRecurrencePeriodArgs'],
status: Optional[pulumi.Input[Union[str, 'StatusType']]] = None):
"""
The schedule associated with a report config.
:param pulumi.Input[Union[str, 'RecurrenceType']] recurrence: The schedule recurrence.
:param pulumi.Input['ReportConfigRecurrencePeriodArgs'] recurrence_period: Has start and end date of the recurrence. The start date must be in the future. If present, the end date must be greater than the start date.
:param pulumi.Input[Union[str, 'StatusType']] status: The status of the schedule (active or inactive). If inactive, the report's scheduled execution is paused.
"""
pulumi.set(__self__, "recurrence", recurrence)
pulumi.set(__self__, "recurrence_period", recurrence_period)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def recurrence(self) -> pulumi.Input[Union[str, 'RecurrenceType']]:
"""
The schedule recurrence.
"""
return pulumi.get(self, "recurrence")
@recurrence.setter
def recurrence(self, value: pulumi.Input[Union[str, 'RecurrenceType']]):
pulumi.set(self, "recurrence", value)
@property
@pulumi.getter(name="recurrencePeriod")
def recurrence_period(self) -> pulumi.Input['ReportConfigRecurrencePeriodArgs']:
"""
Has start and end date of the recurrence. The start date must be in the future. If present, the end date must be greater than the start date.
"""
return pulumi.get(self, "recurrence_period")
@recurrence_period.setter
def recurrence_period(self, value: pulumi.Input['ReportConfigRecurrencePeriodArgs']):
pulumi.set(self, "recurrence_period", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[Union[str, 'StatusType']]]:
"""
The status of the schedule (active or inactive). If inactive, the report's scheduled execution is paused.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[Union[str, 'StatusType']]]):
pulumi.set(self, "status", value)
@pulumi.input_type
class ReportConfigSortingArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
direction: Optional[pulumi.Input[str]] = None):
"""
The order by expression to be used in the report.
:param pulumi.Input[str] name: The name of the column to sort.
:param pulumi.Input[str] direction: Direction of sort.
"""
pulumi.set(__self__, "name", name)
if direction is not None:
pulumi.set(__self__, "direction", direction)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to sort.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def direction(self) -> Optional[pulumi.Input[str]]:
"""
Direction of sort.
"""
return pulumi.get(self, "direction")
@direction.setter
def direction(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "direction", value)
@pulumi.input_type
class ReportConfigTimePeriodArgs:
def __init__(__self__, *,
from_: pulumi.Input[str],
to: pulumi.Input[str]):
"""
The start and end date for pulling data for the report.
:param pulumi.Input[str] from_: The start date to pull data from.
:param pulumi.Input[str] to: The end date to pull data to.
"""
pulumi.set(__self__, "from_", from_)
pulumi.set(__self__, "to", to)
@property
@pulumi.getter(name="from")
def from_(self) -> pulumi.Input[str]:
"""
The start date to pull data from.
"""
return pulumi.get(self, "from_")
@from_.setter
def from_(self, value: pulumi.Input[str]):
pulumi.set(self, "from_", value)
@property
@pulumi.getter
def to(self) -> pulumi.Input[str]:
"""
The end date to pull data to.
"""
return pulumi.get(self, "to")
@to.setter
def to(self, value: pulumi.Input[str]):
pulumi.set(self, "to", value)
@pulumi.input_type
class ReportDatasetArgs:
def __init__(__self__, *,
aggregation: Optional[pulumi.Input[Mapping[str, pulumi.Input['ReportAggregationArgs']]]] = None,
configuration: Optional[pulumi.Input['ReportDatasetConfigurationArgs']] = None,
filter: Optional[pulumi.Input['ReportFilterArgs']] = None,
granularity: Optional[pulumi.Input[Union[str, 'GranularityType']]] = None,
grouping: Optional[pulumi.Input[Sequence[pulumi.Input['ReportGroupingArgs']]]] = None):
"""
The definition of data present in the report.
:param pulumi.Input[Mapping[str, pulumi.Input['ReportAggregationArgs']]] aggregation: Dictionary of aggregation expression to use in the report. The key of each item in the dictionary is the alias for the aggregated column. Report can have up to 2 aggregation clauses.
:param pulumi.Input['ReportDatasetConfigurationArgs'] configuration: Has configuration information for the data in the report. The configuration will be ignored if aggregation and grouping are provided.
:param pulumi.Input['ReportFilterArgs'] filter: Has filter expression to use in the report.
:param pulumi.Input[Union[str, 'GranularityType']] granularity: The granularity of rows in the report.
:param pulumi.Input[Sequence[pulumi.Input['ReportGroupingArgs']]] grouping: Array of group by expression to use in the report. Report can have up to 2 group by clauses.
"""
if aggregation is not None:
pulumi.set(__self__, "aggregation", aggregation)
if configuration is not None:
pulumi.set(__self__, "configuration", configuration)
if filter is not None:
pulumi.set(__self__, "filter", filter)
if granularity is not None:
pulumi.set(__self__, "granularity", granularity)
if grouping is not None:
pulumi.set(__self__, "grouping", grouping)
@property
@pulumi.getter
def aggregation(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input['ReportAggregationArgs']]]]:
"""
Dictionary of aggregation expression to use in the report. The key of each item in the dictionary is the alias for the aggregated column. Report can have up to 2 aggregation clauses.
"""
return pulumi.get(self, "aggregation")
@aggregation.setter
def aggregation(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input['ReportAggregationArgs']]]]):
pulumi.set(self, "aggregation", value)
@property
@pulumi.getter
def configuration(self) -> Optional[pulumi.Input['ReportDatasetConfigurationArgs']]:
"""
Has configuration information for the data in the report. The configuration will be ignored if aggregation and grouping are provided.
"""
return pulumi.get(self, "configuration")
@configuration.setter
def configuration(self, value: Optional[pulumi.Input['ReportDatasetConfigurationArgs']]):
pulumi.set(self, "configuration", value)
@property
@pulumi.getter
def filter(self) -> Optional[pulumi.Input['ReportFilterArgs']]:
"""
Has filter expression to use in the report.
"""
return pulumi.get(self, "filter")
@filter.setter
def filter(self, value: Optional[pulumi.Input['ReportFilterArgs']]):
pulumi.set(self, "filter", value)
@property
@pulumi.getter
def granularity(self) -> Optional[pulumi.Input[Union[str, 'GranularityType']]]:
"""
The granularity of rows in the report.
"""
return pulumi.get(self, "granularity")
@granularity.setter
def granularity(self, value: Optional[pulumi.Input[Union[str, 'GranularityType']]]):
pulumi.set(self, "granularity", value)
@property
@pulumi.getter
def grouping(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportGroupingArgs']]]]:
"""
Array of group by expression to use in the report. Report can have up to 2 group by clauses.
"""
return pulumi.get(self, "grouping")
@grouping.setter
def grouping(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportGroupingArgs']]]]):
pulumi.set(self, "grouping", value)
@pulumi.input_type
class ReportDatasetConfigurationArgs:
def __init__(__self__, *,
columns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The configuration of dataset in the report.
:param pulumi.Input[Sequence[pulumi.Input[str]]] columns: Array of column names to be included in the report. Any valid report column name is allowed. If not provided, then report includes all columns.
"""
if columns is not None:
pulumi.set(__self__, "columns", columns)
@property
@pulumi.getter
def columns(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Array of column names to be included in the report. Any valid report column name is allowed. If not provided, then report includes all columns.
"""
return pulumi.get(self, "columns")
@columns.setter
def columns(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "columns", value)
@pulumi.input_type
class ReportDefinitionArgs:
def __init__(__self__, *,
timeframe: pulumi.Input[Union[str, 'TimeframeType']],
type: pulumi.Input[Union[str, 'ReportType']],
dataset: Optional[pulumi.Input['ReportDatasetArgs']] = None,
time_period: Optional[pulumi.Input['ReportTimePeriodArgs']] = None):
"""
The definition of a report.
:param pulumi.Input[Union[str, 'TimeframeType']] timeframe: The time frame for pulling data for the report. If custom, then a specific time period must be provided.
:param pulumi.Input[Union[str, 'ReportType']] type: The type of the report.
:param pulumi.Input['ReportDatasetArgs'] dataset: Has definition for data in this report.
:param pulumi.Input['ReportTimePeriodArgs'] time_period: Has time period for pulling data for the report.
"""
pulumi.set(__self__, "timeframe", timeframe)
pulumi.set(__self__, "type", type)
if dataset is not None:
pulumi.set(__self__, "dataset", dataset)
if time_period is not None:
pulumi.set(__self__, "time_period", time_period)
@property
@pulumi.getter
def timeframe(self) -> pulumi.Input[Union[str, 'TimeframeType']]:
"""
The time frame for pulling data for the report. If custom, then a specific time period must be provided.
"""
return pulumi.get(self, "timeframe")
@timeframe.setter
def timeframe(self, value: pulumi.Input[Union[str, 'TimeframeType']]):
pulumi.set(self, "timeframe", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[Union[str, 'ReportType']]:
"""
The type of the report.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[Union[str, 'ReportType']]):
pulumi.set(self, "type", value)
@property
@pulumi.getter
def dataset(self) -> Optional[pulumi.Input['ReportDatasetArgs']]:
"""
Has definition for data in this report.
"""
return pulumi.get(self, "dataset")
@dataset.setter
def dataset(self, value: Optional[pulumi.Input['ReportDatasetArgs']]):
pulumi.set(self, "dataset", value)
@property
@pulumi.getter(name="timePeriod")
def time_period(self) -> Optional[pulumi.Input['ReportTimePeriodArgs']]:
"""
Has time period for pulling data for the report.
"""
return pulumi.get(self, "time_period")
@time_period.setter
def time_period(self, value: Optional[pulumi.Input['ReportTimePeriodArgs']]):
pulumi.set(self, "time_period", value)
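# A hypothetical usage sketch (not part of the generated SDK): per the docstring
# above, a "Custom" timeframe must be paired with an explicit time period. The
# enum literals and dates below are illustrative assumptions.
#
#     report_definition = ReportDefinitionArgs(
#         timeframe="Custom",
#         type="Usage",
#         time_period=ReportTimePeriodArgs(
#             from_="2020-01-01T00:00:00Z",
#             to="2020-01-31T00:00:00Z",
#         ),
#     )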
@pulumi.input_type
class ReportDeliveryDestinationArgs:
def __init__(__self__, *,
container: pulumi.Input[str],
resource_id: pulumi.Input[str],
root_folder_path: Optional[pulumi.Input[str]] = None):
"""
The destination information for the delivery of the report.
:param pulumi.Input[str] container: The name of the container where reports will be uploaded.
:param pulumi.Input[str] resource_id: The resource id of the storage account where reports will be delivered.
:param pulumi.Input[str] root_folder_path: The name of the directory where reports will be uploaded.
"""
pulumi.set(__self__, "container", container)
pulumi.set(__self__, "resource_id", resource_id)
if root_folder_path is not None:
pulumi.set(__self__, "root_folder_path", root_folder_path)
@property
@pulumi.getter
def container(self) -> pulumi.Input[str]:
"""
The name of the container where reports will be uploaded.
"""
return pulumi.get(self, "container")
@container.setter
def container(self, value: pulumi.Input[str]):
pulumi.set(self, "container", value)
@property
@pulumi.getter(name="resourceId")
def resource_id(self) -> pulumi.Input[str]:
"""
The resource id of the storage account where reports will be delivered.
"""
return pulumi.get(self, "resource_id")
@resource_id.setter
def resource_id(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_id", value)
@property
@pulumi.getter(name="rootFolderPath")
def root_folder_path(self) -> Optional[pulumi.Input[str]]:
"""
The name of the directory where reports will be uploaded.
"""
return pulumi.get(self, "root_folder_path")
@root_folder_path.setter
def root_folder_path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "root_folder_path", value)
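# A hypothetical usage sketch (not part of the generated SDK): reports are
# uploaded to a blob container inside the given storage account. The resource id
# and names below are placeholders, not real values.
#
#     destination = ReportDeliveryDestinationArgs(
#         container="reports",
#         resource_id="/subscriptions/00000000-0000-0000-0000-000000000000"
#                     "/resourceGroups/example-rg/providers/Microsoft.Storage"
#                     "/storageAccounts/examplestorage",
#         root_folder_path="monthly",
#     )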

@pulumi.input_type
class ReportDeliveryInfoArgs:
    def __init__(__self__, *,
                 destination: pulumi.Input['ReportDeliveryDestinationArgs']):
        """
        The delivery information associated with a report.
        :param pulumi.Input['ReportDeliveryDestinationArgs'] destination: Has destination for the report being delivered.
        """
        pulumi.set(__self__, "destination", destination)

    @property
    @pulumi.getter
    def destination(self) -> pulumi.Input['ReportDeliveryDestinationArgs']:
        """
        Has destination for the report being delivered.
        """
        return pulumi.get(self, "destination")

    @destination.setter
    def destination(self, value: pulumi.Input['ReportDeliveryDestinationArgs']):
        pulumi.set(self, "destination", value)
@pulumi.input_type
class ReportFilterArgs:
def __init__(__self__, *,
and_: Optional[pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]]] = None,
dimension: Optional[pulumi.Input['ReportComparisonExpressionArgs']] = None,
not_: Optional[pulumi.Input['ReportFilterArgs']] = None,
or_: Optional[pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]]] = None,
tag: Optional[pulumi.Input['ReportComparisonExpressionArgs']] = None):
"""
The filter expression to be used in the report.
:param pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]] and_: The logical "AND" expression. Must have at least 2 items.
:param pulumi.Input['ReportComparisonExpressionArgs'] dimension: Has comparison expression for a dimension
:param pulumi.Input['ReportFilterArgs'] not_: The logical "NOT" expression.
:param pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]] or_: The logical "OR" expression. Must have at least 2 items.
:param pulumi.Input['ReportComparisonExpressionArgs'] tag: Has comparison expression for a tag
"""
if and_ is not None:
pulumi.set(__self__, "and_", and_)
if dimension is not None:
pulumi.set(__self__, "dimension", dimension)
if not_ is not None:
pulumi.set(__self__, "not_", not_)
if or_ is not None:
pulumi.set(__self__, "or_", or_)
if tag is not None:
pulumi.set(__self__, "tag", tag)
@property
@pulumi.getter(name="and")
def and_(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]]]:
"""
The logical "AND" expression. Must have at least 2 items.
"""
return pulumi.get(self, "and_")
@and_.setter
def and_(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]]]):
pulumi.set(self, "and_", value)
@property
@pulumi.getter
def dimension(self) -> Optional[pulumi.Input['ReportComparisonExpressionArgs']]:
"""
Has comparison expression for a dimension
"""
return pulumi.get(self, "dimension")
@dimension.setter
def dimension(self, value: Optional[pulumi.Input['ReportComparisonExpressionArgs']]):
pulumi.set(self, "dimension", value)
@property
@pulumi.getter(name="not")
def not_(self) -> Optional[pulumi.Input['ReportFilterArgs']]:
"""
The logical "NOT" expression.
"""
return pulumi.get(self, "not_")
@not_.setter
def not_(self, value: Optional[pulumi.Input['ReportFilterArgs']]):
pulumi.set(self, "not_", value)
@property
@pulumi.getter(name="or")
def or_(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]]]:
"""
The logical "OR" expression. Must have at least 2 items.
"""
return pulumi.get(self, "or_")
@or_.setter
def or_(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ReportFilterArgs']]]]):
pulumi.set(self, "or_", value)
@property
@pulumi.getter
def tag(self) -> Optional[pulumi.Input['ReportComparisonExpressionArgs']]:
"""
Has comparison expression for a tag
"""
return pulumi.get(self, "tag")
@tag.setter
def tag(self, value: Optional[pulumi.Input['ReportComparisonExpressionArgs']]):
pulumi.set(self, "tag", value)
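# A hypothetical usage sketch (not part of the generated SDK): a logical "AND"
# filter must carry at least two sub-filters, per the docstring above. `dim_expr`
# and `tag_expr` stand in for ReportComparisonExpressionArgs values built
# elsewhere; they are assumptions for illustration only.
#
#     combined_filter = ReportFilterArgs(
#         and_=[
#             ReportFilterArgs(dimension=dim_expr),
#             ReportFilterArgs(tag=tag_expr),
#         ],
#     )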

@pulumi.input_type
class ReportGroupingArgs:
    def __init__(__self__, *,
                 name: pulumi.Input[str],
                 type: pulumi.Input[Union[str, 'ReportColumnType']]):
        """
        The group by expression to be used in the report.
        :param pulumi.Input[str] name: The name of the column to group.
        :param pulumi.Input[Union[str, 'ReportColumnType']] type: Has type of the column to group.
        """
        pulumi.set(__self__, "name", name)
        pulumi.set(__self__, "type", type)

    @property
    @pulumi.getter
    def name(self) -> pulumi.Input[str]:
        """
        The name of the column to group.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: pulumi.Input[str]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter
    def type(self) -> pulumi.Input[Union[str, 'ReportColumnType']]:
        """
        Has type of the column to group.
        """
        return pulumi.get(self, "type")

    @type.setter
    def type(self, value: pulumi.Input[Union[str, 'ReportColumnType']]):
        pulumi.set(self, "type", value)
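# A hypothetical usage sketch (not part of the generated SDK): a report may
# carry at most two group-by clauses (see the `grouping` docstring earlier in
# this module). The column names and ReportColumnType literals are illustrative
# assumptions.
#
#     groupings = [
#         ReportGroupingArgs(name="ResourceGroup", type="Dimension"),
#         ReportGroupingArgs(name="environment", type="Tag"),
#     ]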

@pulumi.input_type
class ReportRecurrencePeriodArgs:
    def __init__(__self__, *,
                 from_: pulumi.Input[str],
                 to: Optional[pulumi.Input[str]] = None):
        """
        The start and end date for recurrence schedule.
        :param pulumi.Input[str] from_: The start date of recurrence.
        :param pulumi.Input[str] to: The end date of recurrence.
        """
        pulumi.set(__self__, "from_", from_)
        if to is not None:
            pulumi.set(__self__, "to", to)

    @property
    @pulumi.getter(name="from")
    def from_(self) -> pulumi.Input[str]:
        """
        The start date of recurrence.
        """
        return pulumi.get(self, "from_")

    @from_.setter
    def from_(self, value: pulumi.Input[str]):
        pulumi.set(self, "from_", value)

    @property
    @pulumi.getter
    def to(self) -> Optional[pulumi.Input[str]]:
        """
        The end date of recurrence.
        """
        return pulumi.get(self, "to")

    @to.setter
    def to(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "to", value)
@pulumi.input_type
class ReportScheduleArgs:
def __init__(__self__, *,
recurrence: pulumi.Input[Union[str, 'RecurrenceType']],
recurrence_period: Optional[pulumi.Input['ReportRecurrencePeriodArgs']] = None,
status: Optional[pulumi.Input[Union[str, 'StatusType']]] = None):
"""
The schedule associated with a report.
:param pulumi.Input[Union[str, 'RecurrenceType']] recurrence: The schedule recurrence.
:param pulumi.Input['ReportRecurrencePeriodArgs'] recurrence_period: Has start and end date of the recurrence. The start date must be in future. If present, the end date must be greater than start date.
:param pulumi.Input[Union[str, 'StatusType']] status: The status of the schedule. Whether active or not. If inactive, the report's scheduled execution is paused.
"""
pulumi.set(__self__, "recurrence", recurrence)
if recurrence_period is not None:
pulumi.set(__self__, "recurrence_period", recurrence_period)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def recurrence(self) -> pulumi.Input[Union[str, 'RecurrenceType']]:
"""
The schedule recurrence.
"""
return pulumi.get(self, "recurrence")
@recurrence.setter
def recurrence(self, value: pulumi.Input[Union[str, 'RecurrenceType']]):
pulumi.set(self, "recurrence", value)
@property
@pulumi.getter(name="recurrencePeriod")
def recurrence_period(self) -> Optional[pulumi.Input['ReportRecurrencePeriodArgs']]:
"""
Has start and end date of the recurrence. The start date must be in future. If present, the end date must be greater than start date.
"""
return pulumi.get(self, "recurrence_period")
@recurrence_period.setter
def recurrence_period(self, value: Optional[pulumi.Input['ReportRecurrencePeriodArgs']]):
pulumi.set(self, "recurrence_period", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[Union[str, 'StatusType']]]:
"""
The status of the schedule. Whether active or not. If inactive, the report's scheduled execution is paused.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[Union[str, 'StatusType']]]):
pulumi.set(self, "status", value)
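# A hypothetical usage sketch (not part of the generated SDK): the recurrence
# period's start must lie in the future and its end must come after the start,
# per the docstring above. The enum literals and dates are illustrative
# assumptions.
#
#     schedule = ReportScheduleArgs(
#         recurrence="Monthly",
#         recurrence_period=ReportRecurrencePeriodArgs(
#             from_="2030-01-01T00:00:00Z",
#             to="2030-12-31T00:00:00Z",
#         ),
#         status="Active",
#     )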

@pulumi.input_type
class ReportTimePeriodArgs:
    def __init__(__self__, *,
                 from_: pulumi.Input[str],
                 to: pulumi.Input[str]):
        """
        The start and end date for pulling data for the report.
        :param pulumi.Input[str] from_: The start date to pull data from.
        :param pulumi.Input[str] to: The end date to pull data to.
        """
        pulumi.set(__self__, "from_", from_)
        pulumi.set(__self__, "to", to)

    @property
    @pulumi.getter(name="from")
    def from_(self) -> pulumi.Input[str]:
        """
        The start date to pull data from.
        """
        return pulumi.get(self, "from_")

    @from_.setter
    def from_(self, value: pulumi.Input[str]):
        pulumi.set(self, "from_", value)

    @property
    @pulumi.getter
    def to(self) -> pulumi.Input[str]:
        """
        The end date to pull data to.
        """
        return pulumi.get(self, "to")

    @to.setter
    def to(self, value: pulumi.Input[str]):
        pulumi.set(self, "to", value)
@pulumi.input_type
class SourceCostAllocationResourceArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
resource_type: pulumi.Input[Union[str, 'CostAllocationResourceType']],
values: pulumi.Input[Sequence[pulumi.Input[str]]]):
"""
Source resources for cost allocation
:param pulumi.Input[str] name: If resource type is dimension, this must be either ResourceGroupName or SubscriptionId. If resource type is tag, this must be a valid Azure tag
:param pulumi.Input[Union[str, 'CostAllocationResourceType']] resource_type: Type of resources contained in this cost allocation rule
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Source Resources for cost allocation. This list cannot contain more than 25 values.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "resource_type", resource_type)
pulumi.set(__self__, "values", values)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
If resource type is dimension, this must be either ResourceGroupName or SubscriptionId. If resource type is tag, this must be a valid Azure tag
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="resourceType")
def resource_type(self) -> pulumi.Input[Union[str, 'CostAllocationResourceType']]:
"""
Type of resources contained in this cost allocation rule
"""
return pulumi.get(self, "resource_type")
@resource_type.setter
def resource_type(self, value: pulumi.Input[Union[str, 'CostAllocationResourceType']]):
pulumi.set(self, "resource_type", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Source Resources for cost allocation. This list cannot contain more than 25 values.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@pulumi.input_type
class TargetCostAllocationResourceArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
policy_type: pulumi.Input[Union[str, 'CostAllocationPolicyType']],
resource_type: pulumi.Input[Union[str, 'CostAllocationResourceType']],
values: pulumi.Input[Sequence[pulumi.Input['CostAllocationProportionArgs']]]):
"""
Target resources for cost allocation.
:param pulumi.Input[str] name: If resource type is dimension, this must be either ResourceGroupName or SubscriptionId. If resource type is tag, this must be a valid Azure tag
:param pulumi.Input[Union[str, 'CostAllocationPolicyType']] policy_type: Method of cost allocation for the rule
:param pulumi.Input[Union[str, 'CostAllocationResourceType']] resource_type: Type of resources contained in this cost allocation rule
:param pulumi.Input[Sequence[pulumi.Input['CostAllocationProportionArgs']]] values: Target resources for cost allocation. This list cannot contain more than 25 values.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "policy_type", policy_type)
pulumi.set(__self__, "resource_type", resource_type)
pulumi.set(__self__, "values", values)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
If resource type is dimension, this must be either ResourceGroupName or SubscriptionId. If resource type is tag, this must be a valid Azure tag
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="policyType")
def policy_type(self) -> pulumi.Input[Union[str, 'CostAllocationPolicyType']]:
"""
Method of cost allocation for the rule
"""
return pulumi.get(self, "policy_type")
@policy_type.setter
def policy_type(self, value: pulumi.Input[Union[str, 'CostAllocationPolicyType']]):
pulumi.set(self, "policy_type", value)
@property
@pulumi.getter(name="resourceType")
def resource_type(self) -> pulumi.Input[Union[str, 'CostAllocationResourceType']]:
"""
Type of resources contained in this cost allocation rule
"""
return pulumi.get(self, "resource_type")
@resource_type.setter
def resource_type(self, value: pulumi.Input[Union[str, 'CostAllocationResourceType']]):
pulumi.set(self, "resource_type", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input['CostAllocationProportionArgs']]]:
"""
Target resources for cost allocation. This list cannot contain more than 25 values.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input['CostAllocationProportionArgs']]]):
pulumi.set(self, "values", value)
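# A hypothetical usage sketch (not part of the generated SDK): reallocating the
# cost of a shared resource group to two target groups. The argument names on
# CostAllocationProportionArgs (defined elsewhere in this module) and the enum
# literals are assumptions for illustration only.
#
#     source = SourceCostAllocationResourceArgs(
#         name="ResourceGroupName",
#         resource_type="Dimension",
#         values=["shared-infra-rg"],
#     )
#     target = TargetCostAllocationResourceArgs(
#         name="ResourceGroupName",
#         policy_type="FixedProportion",
#         resource_type="Dimension",
#         values=[
#             CostAllocationProportionArgs(name="team-a-rg", percentage=60),
#             CostAllocationProportionArgs(name="team-b-rg", percentage=40),
#         ],
#     )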
| 38.809634 | 499 | 0.64775 | 9,490 | 83,790 | 5.592097 | 0.038988 | 0.120635 | 0.055852 | 0.040815 | 0.87081 | 0.817615 | 0.789614 | 0.760877 | 0.730464 | 0.705063 | 0 | 0.000845 | 0.237499 | 83,790 | 2,158 | 500 | 38.827618 | 0.829786 | 0.280153 | 0 | 0.716216 | 1 | 0 | 0.143431 | 0.062407 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214626 | false | 0 | 0.004769 | 0 | 0.343402 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
1682269a2d240a07b9ff06dea1111b7250771a1d | 35,441 | py | Python | solum/tests/worker/handlers/test_shell.py | dimtruck/solum | 7ec547039ab255052b954a102b9765e068a0f871 | ["Apache-2.0"] | null | null | null | solum/tests/worker/handlers/test_shell.py | dimtruck/solum | 7ec547039ab255052b954a102b9765e068a0f871 | ["Apache-2.0"] | null | null | null | solum/tests/worker/handlers/test_shell.py | dimtruck/solum | 7ec547039ab255052b954a102b9765e068a0f871 | ["Apache-2.0"] | null | null | null | # Copyright 2014 - Rackspace Hosting
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import base64
import json
import os.path
import uuid
import mock
from oslo_config import cfg
from solum.openstack.common.gettextutils import _
from solum.tests import base
from solum.tests import fakes
from solum.tests import utils
from solum.worker.handlers import shell as shell_handler


def mock_environment():
    return {
        'PATH': '/bin',
        'SOLUM_TASK_DIR': '/dev/null',
        'BUILD_ID': 'abcd',
        'PROJECT_ID': 1,
    }


def mock_git_info():
    return {
        'source_url': 'git://example.com/foo',
        'repo_token': '8765',
        'status_url': 'https://api.github.com/repos/u/r/statuses/SHA'
    }


def mock_request_hdr(token):
    return {'Authorization': 'token ' + token,
            'Content-Type': 'application/json'}


def mock_req_pending_body(log_url):
    data = {'state': 'pending',
            'description': 'Solum says: Testing in progress',
            'target_url': log_url}
    return json.dumps(data)


def mock_req_success_body(log_url):
    data = {'state': 'success',
            'description': 'Solum says: Tests passed',
            'target_url': log_url}
    return json.dumps(data)


def mock_req_failure_body(log_url):
    data = {'state': 'failure',
            'description': 'Solum says: Tests failed',
            'target_url': log_url}
    return json.dumps(data)


def mock_http_response():
    return {'status': '401'}, ''
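# The helpers above fabricate the environment, git metadata and GitHub
# status-API payloads used by the handler tests below. A sketch of how the
# request helpers fit together (illustrative only; the handler under test
# builds the real request):
#
#     headers = mock_request_hdr('8765')
#     body = mock_req_pending_body('http://logs.example.com/build/5')
#     # ...which would back a POST to git_info['status_url'].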
class HandlerTest(base.BaseTestCase):
scenarios = [
('auto_lp_id',
dict(base_image_id='auto',
expected_img_id='auto',
img_name='')),
('lp_id',
dict(base_image_id='1-2-3-4',
expected_img_id='TempUrl',
img_name='tenant-name-ts-commit'))]
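# Each dict in `scenarios` is expanded into a separate run of every test method
# in this class, with the dict's keys (base_image_id, expected_img_id, img_name)
# bound as attributes on self. This relies on testscenarios-style expansion
# performed by the base test class -- an assumption here, since that wiring
# lives outside this file.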
def setUp(self):
super(HandlerTest, self).setUp()
self.ctx = utils.dummy_context()
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.deployer.api.API.deploy')
@mock.patch('subprocess.Popen')
def test_build(self, mock_popen, mock_deploy, mock_b_update, mock_uas,
mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
fake_glance_id = str(uuid.uuid4())
fake_image_name = 'tenant-name-ts-commit'
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
mock_popen.return_value.communicate.return_value = [
'foo\ncreated_image_id=%s\ndocker_image_name=%s' %
(fake_glance_id, fake_image_name), None]
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.build(self.ctx, build_id=5, git_info=git_info,
name='new_app', base_image_id=self.base_image_id,
source_format='heroku', image_format='docker',
assembly_id=44, run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-app')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'new_app', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env,
stdout=-1)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'READY', 'built successfully',
fake_glance_id, fake_image_name, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(44, {'status': 'BUILDING'}),
mock.call(44, {'status': 'BUILT'})]
self.assertEqual(expected, mock_uas.call_args_list)
assert not mock_deploy.called
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.deployer.api.API.deploy')
@mock.patch('subprocess.Popen')
def test_build_swft(self, mock_popen, mock_deploy, mock_b_update, mock_uas,
mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
fake_glance_id = str(uuid.uuid4())
fake_image_name = 'tenant-name-ts-commit'
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_by_uuid.return_value = fake_image
mock_registry.Image.get_lp_by_name_or_uuid = fake_image
cfg.CONF.set_override('image_storage', 'swift',
group='worker')
mock_popen.return_value.communicate.return_value = [
'foo\ncreated_image_id=%s\ndocker_image_name=%s' %
(fake_glance_id, fake_image_name)]
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.build(self.ctx, build_id=5, git_info=git_info, name='new_app',
base_image_id=fake_image.base_image_id,
source_format='heroku', image_format='docker',
assembly_id=44, run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-app')
expected_loc = fake_image.external_ref
expected_tag = fake_image.docker_image_name
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'new_app', self.ctx.tenant,
expected_loc, expected_tag],
env=test_env,
stdout=-1)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'READY', 'built successfully',
fake_glance_id, fake_image_name, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(44, {'status': 'BUILDING'}),
mock.call(44, {'status': 'BUILT'})]
self.assertEqual(expected, mock_uas.call_args_list)
assert not mock_deploy.called
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('solum.deployer.api.API.deploy')
@mock.patch('subprocess.Popen')
@mock.patch('ast.literal_eval')
def test_build_with_private_github_repo(
self, mock_ast, mock_popen, mock_deploy, mock_uas, mock_b_update,
mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
fake_glance_id = str(uuid.uuid4())
fake_image_name = 'tenant-name-ts-commit'
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
handler._update_assembly_status = mock.MagicMock()
mock_popen.return_value.communicate.return_value = [
'foo\ncreated_image_id=%s\ndocker_image_name=%s' %
(fake_glance_id, fake_image_name), None]
test_env = mock_environment()
mock_get_env.return_value = test_env
mock_ast.return_value = [{'source_url': 'git://example.com/foo',
'private_key': 'some-private-key'}]
git_info = mock_git_info()
handler.launch_workflow(
self.ctx, build_id=5, git_info=git_info,
workflow=['unittest', 'build', 'deploy'], ports=[80],
name='new_app', base_image_id=self.base_image_id,
source_format='heroku', image_format='docker', assembly_id=44,
test_cmd=None, run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-app')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'new_app', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env, stdout=-1)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'READY', 'built successfully',
fake_glance_id, fake_image_name, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(44, {'status': 'BUILDING'}),
mock.call(44, {'status': 'BUILT'})]
self.assertEqual(expected, mock_uas.call_args_list)
expected = [mock.call(assembly_id=44, image_loc=fake_glance_id,
image_name=fake_image_name,
ports=[80])]
self.assertEqual(expected, mock_deploy.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('solum.deployer.api.API.deploy')
@mock.patch('subprocess.Popen')
@mock.patch('shelve.open')
@mock.patch('ast.literal_eval')
def test_build_with_private_github_repo_with_shelve(
self, mock_ast, mock_shelve, mock_popen,
mock_deploy, mock_uas, mock_b_update, mock_registry,
mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
fake_glance_id = str(uuid.uuid4())
fake_image_name = 'tenant-name-ts-commit'
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
handler._update_assembly_status = mock.MagicMock()
mock_popen.return_value.communicate.return_value = [
'foo\ncreated_image_id=%s\ndocker_image_name=%s' %
(fake_glance_id, fake_image_name), None]
test_env = mock_environment()
mock_get_env.return_value = test_env
cfg.CONF.set_override('system_param_store', 'local_file',
group='api')
cfg.CONF.set_override('system_param_file', 'some_file_path',
group='api')
mock_shelve.return_value = mock.MagicMock()
base64.b64decode = mock.MagicMock()
mock_ast.return_value = [{'source_url': 'git://example.com/foo',
'private_key': 'some-private-key'}]
git_info = mock_git_info()
handler.launch_workflow(
self.ctx, build_id=5, git_info=git_info,
workflow=['unittest', 'build', 'deploy'], ports=[80],
name='new_app', base_image_id=self.base_image_id,
source_format='heroku', image_format='docker', assembly_id=44,
test_cmd=None, run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-app')
# TODO(datsun180b): Determine if this commented line should be removed
# since I can't seem to find anywhere in shell.py that writes to
# shelve.
# self.assertTrue(mock_shelve.call().__setitem__.called)
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'new_app', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env, stdout=-1)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'READY', 'built successfully',
fake_glance_id, fake_image_name, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(44, {'status': 'BUILDING'}),
mock.call(44, {'status': 'BUILT'})]
self.assertEqual(expected, mock_uas.call_args_list)
expected = [mock.call(assembly_id=44, image_loc=fake_glance_id,
image_name=fake_image_name,
ports=[80])]
self.assertEqual(expected, mock_deploy.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('subprocess.Popen')
def test_build_fail(self, mock_popen, mock_uas, mock_b_update,
mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
mock_popen.return_value.communicate.return_value = [
'foo\ncreated_image_id=\n', None]
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.build(self.ctx, build_id=5, git_info=git_info, name='new_app',
base_image_id=self.base_image_id, source_format='heroku',
image_format='docker', assembly_id=44, run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-app')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'new_app', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env, stdout=-1)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'ERROR', 'image not created', None, None, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(44, {'status': 'BUILDING'}),
mock.call(44, {'status': 'ERROR'})]
self.assertEqual(expected, mock_uas.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('subprocess.Popen')
def test_build_error(self, mock_popen, mock_uas, mock_b_update,
mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
mock_popen.call.return_value = ValueError
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.build(self.ctx, build_id=5, git_info=git_info, name='new_app',
base_image_id=self.base_image_id, source_format='heroku',
image_format='docker', assembly_id=44, run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-app')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'new_app', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env, stdout=-1)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'ERROR', 'image not created', None, None, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(44, {'status': 'BUILDING'}),
mock.call(44, {'status': 'ERROR'})]
self.assertEqual(expected, mock_uas.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('subprocess.Popen')
@mock.patch('solum.worker.handlers.shell.update_assembly_status')
def test_unittest(self, mock_a_update, mock_popen, mock_registry,
mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
test_env = mock_environment()
mock_get_env.return_value = test_env
mock_popen.return_value.wait.return_value = 0
git_info = mock_git_info()
handler.unittest(self.ctx, build_id=5, name='new_app',
base_image_id=self.base_image_id,
source_format='chef', image_format='docker',
assembly_id=fake_assembly.id,
git_info=git_info, test_cmd='tox')
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir,
'contrib/lp-chef/docker/unittest-app')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env, stdout=-1)
expected = [mock.call(self.ctx, 8, 'UNIT_TESTING'),
mock.call(self.ctx, 8, 'UNIT_TESTING_PASSED')]
self.assertEqual(expected, mock_a_update.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('subprocess.Popen')
@mock.patch('solum.worker.handlers.shell.update_assembly_status')
def test_unittest_failure(self, mock_a_update, mock_popen,
mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
test_env = mock_environment()
mock_get_env.return_value = test_env
mock_popen.return_value.wait.return_value = 1
git_info = mock_git_info()
handler.unittest(self.ctx, build_id=5, name='new_app',
assembly_id=fake_assembly.id,
base_image_id=self.base_image_id,
source_format='chef',
image_format='docker',
git_info=git_info, test_cmd='tox')
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir,
'contrib/lp-chef/docker/unittest-app')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'', self.ctx.tenant,
self.expected_img_id,
self.img_name],
env=test_env, stdout=-1)
expected = [mock.call(self.ctx, 8, 'UNIT_TESTING'),
mock.call(self.ctx, 8, 'UNIT_TESTING_FAILED')]
self.assertEqual(expected, mock_a_update.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('subprocess.Popen')
@mock.patch('solum.conductor.api.API.build_job_update')
@mock.patch('solum.worker.handlers.shell.update_assembly_status')
@mock.patch('solum.deployer.api.API.deploy')
def test_unittest_build_deploy(self, mock_deploy, mock_a_update,
mock_b_update, mock_popen, mock_registry,
mock_get_env):
handler = shell_handler.Handler()
fake_assembly = fakes.FakeAssembly()
fake_glance_id = str(uuid.uuid4())
fake_image_name = 'tenant-name-ts-commit'
mock_registry.Assembly.get_by_id.return_value = fake_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
mock_popen.return_value.wait.return_value = 0
mock_popen.return_value.communicate.return_value = [
'foo\ncreated_image_id=%s\ndocker_image_name=%s' %
(fake_glance_id, fake_image_name), None]
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.launch_workflow(
self.ctx, build_id=5, git_info=git_info,
workflow=['unittest', 'build', 'deploy'], ports=[80],
name='new_app', base_image_id=self.base_image_id,
source_format='heroku', image_format='docker', assembly_id=44,
test_cmd='faketests', run_cmd=None)
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
util_dir = os.path.join(proj_dir, 'contrib', 'lp-cedarish', 'docker')
u_script = os.path.join(util_dir, 'unittest-app')
b_script = os.path.join(util_dir, 'build-app')
expected = [
mock.call([u_script, 'git://example.com/foo', '',
self.ctx.tenant, self.expected_img_id,
self.img_name], env=test_env,
stdout=-1),
mock.call([b_script, 'git://example.com/foo', 'new_app',
self.ctx.tenant, self.expected_img_id,
self.img_name], env=test_env,
stdout=-1)]
self.assertEqual(expected, mock_popen.call_args_list)
expected = [mock.call(5, 'BUILDING', 'Starting the image build',
None, None, 44),
mock.call(5, 'READY', 'built successfully',
fake_glance_id, fake_image_name, 44)]
self.assertEqual(expected, mock_b_update.call_args_list)
expected = [mock.call(self.ctx, 44, 'UNIT_TESTING'),
mock.call(self.ctx, 44, 'UNIT_TESTING_PASSED'),
mock.call(self.ctx, 44, 'BUILDING'),
mock.call(self.ctx, 44, 'BUILT')]
self.assertEqual(expected, mock_a_update.call_args_list)
expected = [mock.call(assembly_id=44, image_loc=fake_glance_id,
image_name=fake_image_name,
ports=[80])]
self.assertEqual(expected, mock_deploy.call_args_list)
@mock.patch('solum.worker.handlers.shell.Handler._do_build')
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('subprocess.Popen')
@mock.patch('solum.worker.handlers.shell.update_assembly_status')
@mock.patch('solum.objects.registry')
def test_unittest_no_build(self, mock_registry, mock_a_update, mock_popen,
mock_get_env, mock_do_build):
handler = shell_handler.Handler()
mock_assembly = mock.MagicMock()
mock_registry.Assembly.get_by_id.return_value = mock_assembly
fake_image = fakes.FakeImage()
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
mock_popen.return_value.wait.return_value = 1
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.launch_workflow(
self.ctx, build_id=5, git_info=git_info, name='new_app',
base_image_id=self.base_image_id, source_format='chef',
image_format='docker', assembly_id=44, ports=[80],
test_cmd='faketests', run_cmd=None, workflow=['unittest', 'build'])
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
util_dir = os.path.join(proj_dir, 'contrib', 'lp-chef', 'docker')
u_script = os.path.join(util_dir, 'unittest-app')
expected = [
mock.call([u_script, 'git://example.com/foo', '',
self.ctx.tenant, self.expected_img_id,
self.img_name], env=test_env,
stdout=-1)]
self.assertEqual(expected, mock_popen.call_args_list)
expected = [mock.call(self.ctx, 44, 'UNIT_TESTING'),
mock.call(self.ctx, 44, 'UNIT_TESTING_FAILED')]
self.assertEqual(expected, mock_a_update.call_args_list)
assert not mock_do_build.called
class HandlerUtilityTest(base.BaseTestCase):
def setUp(self):
super(HandlerUtilityTest, self).setUp()
self.ctx = utils.dummy_context()
@mock.patch('solum.worker.handlers.shell.LOG')
def test_echo(self, fake_LOG):
shell_handler.Handler().echo({}, 'foo')
fake_LOG.debug.assert_called_once_with(_('%s') % 'foo')
@mock.patch('solum.worker.handlers.shell.get_parameter_by_assem_id')
@mock.patch('six.moves.builtins.open')
@mock.patch('os.makedirs')
def test_get_parameter_files(self, mock_mkdirs, mock_open, mock_param):
mock_param.return_value = fakes.FakeParameter()
fake_build_id = '1-2-3-4'
cfg.CONF.set_override('param_file_path', '/tmp/test', group='worker')
path = '/tmp/test/' + fake_build_id
handler = shell_handler.Handler()
handler._get_parameter_env(self.ctx, 'git://example.com/foo',
8, fake_build_id)
mock_mkdirs.assert_called_once_with(path, 0o700)
expected = [mock.call(path + '/user_params', 'w'),
mock.call(path + '/solum_params', 'w')]
self.assertEqual(expected, mock_open.call_args_list)
mock_file = mock_open.return_value.__enter__.return_value
expected_params = [mock.call('#!/bin/bash\n'),
mock.call('export key="ab\\"cd"\n'),
mock.call('#!/bin/bash\n')]
self.assertEqual(expected_params, mock_file.write.call_args_list)
class TestNotifications(base.BaseTestCase):
def setUp(self):
super(TestNotifications, self).setUp()
self.ctx = utils.dummy_context()
self.db = self.useFixture(utils.Database())
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('solum.objects.registry')
def test_update_assembly_status(self, mock_registry, mock_uas):
mock_assembly = mock.MagicMock()
mock_registry.Assembly.get_by_id.return_value = mock_assembly
shell_handler.update_assembly_status(self.ctx, '1234',
'BUILDING')
self.assertEqual(mock_registry.Assembly.get_by_id.call_count, 0)
self.assertEqual(mock_registry.save.call_count, 0)
self.assertEqual(mock_uas.call_count, 1)
@mock.patch('solum.conductor.api.API.update_assembly')
@mock.patch('solum.objects.registry')
def test_update_assembly_status_pass(self, mock_registry, mock_uas):
shell_handler.update_assembly_status(self.ctx, None,
'BUILDING')
self.assertEqual(mock_registry.call_count, 0)
class TestBuildCommand(base.BaseTestCase):
scenarios = [
('docker',
dict(source_format='heroku', image_format='docker',
base_image_id='auto', artifact_type=None,
expect_b='lp-cedarish/docker/build-app',
expect_u='lp-cedarish/docker/unittest-app')),
('dockerfile',
dict(source_format='dockerfile', image_format='docker',
base_image_id='auto', artifact_type=None,
expect_b='lp-dockerfile/docker/build-app',
expect_u='lp-dockerfile/docker/unittest-app')),
('chef',
dict(source_format='chef', image_format='docker',
base_image_id='xyz', artifact_type=None,
expect_b='lp-chef/docker/build-app',
expect_u='lp-chef/docker/unittest-app'))]
def test_build_cmd(self):
ctx = utils.dummy_context()
handler = shell_handler.Handler()
cmd = handler._get_build_command(ctx,
'build',
'http://example.com/a.git',
'testa',
self.base_image_id,
self.source_format,
self.image_format, '',
self.artifact_type)
self.assertIn(self.expect_b, cmd[0])
self.assertEqual('http://example.com/a.git', cmd[1])
self.assertEqual('testa', cmd[2])
self.assertEqual(ctx.tenant, cmd[3])
if self.base_image_id == 'auto' and self.image_format == 'qcow2':
self.assertEqual('cedarish', cmd[4])
else:
self.assertEqual(self.base_image_id, cmd[4])
def test_unittest_cmd(self):
ctx = utils.dummy_context()
handler = shell_handler.Handler()
cmd = handler._get_build_command(ctx,
'unittest',
'http://example.com/a.git',
'testa',
self.base_image_id,
self.source_format,
self.image_format, 'asdf',
self.artifact_type)
self.assertIn(self.expect_u, cmd[0])
self.assertEqual('http://example.com/a.git', cmd[1])
self.assertEqual('asdf', cmd[2])
self.assertEqual(ctx.tenant, cmd[3])
class TestLanguagePackBuildCommand(base.BaseTestCase):
def setUp(self):
super(TestLanguagePackBuildCommand, self).setUp()
self.ctx = utils.dummy_context()
def test_languagepack_build_cmd(self):
ctx = utils.dummy_context()
handler = shell_handler.Handler()
cmd = handler._get_build_command(ctx,
'build',
'http://example.com/a.git',
'testa',
'auto',
'heroku',
'docker', '',
'language_pack')
self.assertIn('lp-cedarish/docker/build-lp', cmd[0])
self.assertEqual('http://example.com/a.git', cmd[1])
self.assertEqual('testa', cmd[2])
self.assertEqual(ctx.tenant, cmd[3])
@mock.patch('solum.worker.handlers.shell.Handler._get_environment')
@mock.patch('solum.objects.registry')
@mock.patch('solum.conductor.api.API.update_image')
@mock.patch('subprocess.Popen')
def test_build_lp(self, mock_popen, mock_ui, mock_registry, mock_get_env):
handler = shell_handler.Handler()
fake_image = fakes.FakeImage()
fake_glance_id = str(uuid.uuid4())
fake_image_name = 'tenant-name-ts-commit'
mock_registry.Image.get_lp_by_name_or_uuid.return_value = fake_image
mock_popen.return_value.communicate.return_value = [
'foo\nimage_external_ref=%s\ndocker_image_name=%s\n' %
(fake_glance_id, fake_image_name), None]
test_env = mock_environment()
mock_get_env.return_value = test_env
git_info = mock_git_info()
handler.build_lp(self.ctx, image_id=5, git_info=git_info,
name='lp_name', source_format='heroku',
image_format='docker', artifact_type='language_pack')
proj_dir = os.path.abspath(os.path.join(os.path.dirname(__file__),
'..', '..', '..', '..'))
script = os.path.join(proj_dir, 'contrib/lp-cedarish/docker/build-lp')
mock_popen.assert_called_once_with([script, 'git://example.com/foo',
'lp_name', self.ctx.tenant],
env=test_env,
stdout=-1)
expected = [mock.call(5, 'BUILDING', None, None),
mock.call(5, 'READY', fake_glance_id, fake_image_name)]
self.assertEqual(expected, mock_ui.call_args_list)
| 47.635753 | 79 | 0.582828 | 4,131 | 35,441 | 4.700557 | 0.081094 | 0.031517 | 0.037491 | 0.033371 | 0.824081 | 0.80482 | 0.770059 | 0.753631 | 0.751777 | 0.747657 | 0 | 0.008611 | 0.298778 | 35,441 | 743 | 80 | 47.699865 | 0.772735 | 0.021218 | 0 | 0.714511 | 0 | 0 | 0.166162 | 0.094921 | 0 | 0 | 0 | 0.001346 | 0.089905 | 1 | 0.045741 | false | 0.006309 | 0.01735 | 0.006309 | 0.085174 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1684506a40c0dbd817f1b13156b88199c7b8bfb2 | 16,784 | py | Python | py.py | TDHTTTT/MadGraph_Windows | 8a0be5befed650b6adcb9825c1b57af907c0167a | ["NCSA"] | null | null | null | py.py | TDHTTTT/MadGraph_Windows | 8a0be5befed650b6adcb9825c1b57af907c0167a | ["NCSA"] | null | null | null | py.py | TDHTTTT/MadGraph_Windows | 8a0be5befed650b6adcb9825c1b57af907c0167a | ["NCSA"] | null | null | null | 
# py.py
# This file is automatically generated. Do not edit.
_tabversion = '3.2'
_lr_method = 'LALR'
_lr_signature = 'c44e5f4dc282722122d9cb6aabd9e53c'
_lr_action_items = {'NUMBER':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,]),'COMPLEX':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,]),'COND':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,]),'LOGICAL':([1,7,15,21,29,30,33,34,35,36,37,38,40,41,42,43,44,45,51,54,59,60,62,63,64,65,66,67,69,75,77,88,89,90,92,93,97,98,101,],[-17,-12,-40,-18,-28,-35,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,76,-10,-19,-3,-5,-4,-6,-2,76,76,-13,-20,-21,-23,-24,-14,-22,-15,-16,]),'ASEC':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,]),'CONJ':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,]),'RECMS':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,6,]),'REGLOG':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,22,]),')':([1,7,15,21,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,60,61,62,63,64,65,66,67,74,75,77,81,83,84,85,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,-28,-35,60,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,-10,77,-19,-3,-5,-4,-6,-2,85,60,-13,90,-8,92,-11,-7,93,-20,-21,-23,-24,-14,97,98,-22,-15,101,-16,]),'(':([0,2,3,4,5,6,8,9,10,11,13,14,16,17,18,19,20,22,23,24,26,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[8,27,28,8,8,31,8,8,8,8,8,8,8,39,8,8,8,8,8,8,8,8,8,58,8,8,8,8,8,8,8,58,58,8,8,58,8,8,8,8,8,8,8,8,]),'REGLOGM':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,26,]),'*':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,48,-28,-35,48,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,48,48,48,-10,48,-19,48,-5,48,-6,48,48,48,-13,48,48,48,48,48,48,48,-23,-24,-14,48,48,-22,-15,48,-16,]),'-':([0,1,7,8,9,15,21,25,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,58,59,60,61,62,63,64,65,66,67,69,70,71,72,73,75,76,77,78,79,80,81,82,84,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,],[9,-17,-12,9,9,-40,-18,49,9,9,-28,-35,9,49,-9,-29,-26,-37,-33,-27,9,-34,-25,-30,-36,-32,-31,9,9,9,9,9,-39,9,9,-38,49,49,9,49,-10,49,-19,-3,-5,-4,-6,49,49,9,9,9,9,49,9,-13,9,9,9,49,49,49,49,49,49,49,-23,9,-24,-14,9,49,49,-22,-15,9,49,-16,]),',':([1,7,15,21,29,30,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,57,60,61,62,63,64,65,66,67,77,82,83,85,86,87,88,89,90,92,93,96,97,98,101,],[-17,-12,-40,-18,-28,-35,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,70,71,73,-10,78,-19,-3,-5,-4,-6,-2,-13,91,-8,-11,-7,94,-20,-21,-23,-24,-14,99,-22,-15,-16,]),'/':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,50,-28,-35,50,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,50,50,50,-10,50,-19,50,-5,50,-6,50,50,50,-13,50,50,50,50,50,50,50,-23,-24,-14,50,50,-2
2,-15,50,-16,]),'RE':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,10,]),'SEC':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,11,]),'REGLOGP':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,13,]),'TAN':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,14,]),'PI':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,15,]),'=':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,52,-28,-35,52,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,52,52,52,-10,52,-19,-3,-5,-4,-6,-2,52,52,-13,52,52,52,52,52,52,52,-23,-24,-14,52,52,-22,-15,52,-16,]),'ACSC':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,]),'$end':([1,7,12,15,21,25,29,30,33,34,35,36,37,38,40,41,42,43,44,45,51,54,60,62,63,64,65,66,67,77,88,89,90,92,93,97,98,101,],[-17,-12,0,-40,-18,-1,-28,-35,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,-10,-19,-3,-5,-4,-6,-2,-13,-20,-21,-23,-24,-14,-22,-15,-16,]),'FUNCTION':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,17,]),'ATAN':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,18,]),'CSC':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,19,]),'ELSE':([1,7,15,21,29,30,33,34,35,36,37,38,40,41,42,43,44,45,51,54,60,62,63,64,65,66,67,68,69,77,83,85,86,88,89,90,92,93,97,98,101,],[-17,-12,-40,-18,-28,-35,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,-10,-19,-3,-5,-4,-6,-2,79,80,-13,-8,-11,-7,-20,-21,-23,-24,-14,-22,-15,-16,]),'IM':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,20,]),'VARIABLE':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,21,]),'IF':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,53,-28,-35,53,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,53,53,53,-10,53,-19,-3,-5,-4,-6,-2,53,53,-13,53,53,53,53,53,-20,-21,-23,-24,-14,53,53,-22,-15,53,-16,]),'LOGICALCOMB':([1,7,15,21,29,30,33,34,35,36,37,38,40,41,42,43,44,45,51,54,57,60,62,63,64,65,66,67,68,74,77,83,85,86,88,89,90,92,93,97,98,101,],[-17,-12,-40,-18,-28,-35,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,72,-10,-19,-3,-5,-4,-6,-2,72,72,-13,72,-11,-7,-20,-21,-23,-24,-14,-22,-15,-16,]),'POWER':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,
81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,46,-28,-35,46,46,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,46,46,46,-10,46,-19,46,46,46,46,46,46,46,-13,46,46,46,46,46,46,46,-23,-24,-14,46,46,-22,-15,46,-16,]),'RE2':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,51,-28,-35,51,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,51,51,51,-10,51,-19,51,51,51,51,51,51,51,-13,51,51,51,51,51,51,51,-23,-24,-14,51,51,-22,-15,51,-16,]),'SQRT':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,23,]),'ARG':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,24,]),'+':([1,7,15,21,25,29,30,32,33,34,35,36,37,38,40,41,42,43,44,45,51,54,55,56,59,60,61,62,63,64,65,66,67,69,75,77,81,82,84,86,87,88,89,90,92,93,95,96,97,98,100,101,],[-17,-12,-40,-18,47,-28,-35,47,-9,-29,-26,-37,-33,-27,-34,-25,-30,-36,-32,-31,-39,-38,47,47,47,-10,47,-19,-3,-5,-4,-6,47,47,47,-13,47,47,47,47,47,47,47,-23,-24,-14,47,47,-22,-15,47,-16,]),}
_lr_action = { }
for _k, _v in _lr_action_items.items():
   for _x, _y in zip(_v[0], _v[1]):
      if not _x in _lr_action: _lr_action[_x] = { }
      _lr_action[_x][_k] = _y
del _lr_action_items
_lr_goto_items = {'boolexpression':([31,53,58,72,],[57,68,74,83,]),'group':([0,4,5,8,9,10,11,13,14,16,18,19,20,22,23,24,26,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[7,29,30,7,7,34,35,36,37,38,40,41,42,43,44,45,54,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,]),'expression':([0,8,9,27,28,31,39,46,47,48,49,50,52,53,58,70,71,72,73,76,78,79,80,91,94,99,],[25,32,33,55,56,59,61,62,63,64,65,66,67,69,75,81,82,59,84,86,87,88,89,95,96,100,]),'statement':([0,],[12,]),}
_lr_goto = { }
for _k, _v in _lr_goto_items.items():
   for _x, _y in zip(_v[0], _v[1]):
      if not _x in _lr_goto: _lr_goto[_x] = { }
      _lr_goto[_x][_k] = _y
del _lr_goto_items
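# Explanatory note (an assumption about the PLY runtime, not part of the
# generated table file): ply.yacc consults _lr_action[state][token] while
# parsing -- a positive entry shifts to that state, a negative entry reduces by
# the production at that index in _lr_productions, and 0 accepts -- while
# _lr_goto[state][nonterminal] gives the state to enter after a reduction.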
_lr_productions = [
("S' -> statement","S'",1,None,None,None),
('statement -> expression','statement',1,'p_statement_expr','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',210),
('expression -> expression = expression','expression',3,'p_expression_binop','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',214),
('expression -> expression + expression','expression',3,'p_expression_binop','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',215),
('expression -> expression - expression','expression',3,'p_expression_binop','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',216),
('expression -> expression * expression','expression',3,'p_expression_binop','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',217),
('expression -> expression / expression','expression',3,'p_expression_binop','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',218),
('boolexpression -> expression LOGICAL expression','boolexpression',3,'p_expression_logical','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',222),
('boolexpression -> boolexpression LOGICALCOMB boolexpression','boolexpression',3,'p_expression_logicalcomb','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',229),
('expression -> - expression','expression',2,'p_expression_uminus','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',236),
('group -> ( expression )','group',3,'p_group_parentheses','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',240),
('boolexpression -> ( boolexpression )','boolexpression',3,'p_group_parentheses_boolexpr','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',245),
('expression -> group','expression',1,'p_expression_group','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',249),
('expression -> FUNCTION ( expression )','expression',4,'p_expression_function1','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',253),
('expression -> FUNCTION ( expression , expression )','expression',6,'p_expression_function2','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',261),
('expression -> FUNCTION ( expression , expression , expression )','expression',8,'p_expression_function3','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',269),
('expression -> FUNCTION ( expression , expression , expression , expression )','expression',10,'p_expression_function4','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',277),
('expression -> NUMBER','expression',1,'p_expression_number','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',732),
('expression -> VARIABLE','expression',1,'p_expression_variable','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',736),
('expression -> expression POWER expression','expression',3,'p_expression_power','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',740),
('expression -> expression IF boolexpression ELSE expression','expression',5,'p_expression_if','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',744),
('expression -> expression IF expression ELSE expression','expression',5,'p_expression_ifimplicit','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',755),
('expression -> COND ( expression , expression , expression )','expression',8,'p_expression_cond','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',766),
('expression -> COMPLEX ( expression , expression )','expression',6,'p_expression_complex','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',772),
('expression -> RECMS ( boolexpression , expression )','expression',6,'p_expression_recms','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',776),
('expression -> CSC group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',780),
('expression -> SEC group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',781),
('expression -> ACSC group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',782),
('expression -> ASEC group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',783),
('expression -> RE group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',784),
('expression -> IM group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',785),
('expression -> ARG group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',786),
('expression -> SQRT group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',787),
('expression -> TAN group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',788),
('expression -> ATAN group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',789),
('expression -> CONJ group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',790),
('expression -> REGLOG group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',791),
('expression -> REGLOGP group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',792),
('expression -> REGLOGM group','expression',2,'p_expression_func','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',793),
('expression -> expression RE2','expression',2,'p_expression_real','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',810),
('expression -> PI','expression',1,'p_expression_pi','/mnt/c/Users/tdhttt/workspace/madgraph5/MG5_aMC_v2_6_2/madgraph/iolibs/ufo_expression_parsers.py',814),
]
| 239.771429 | 8,392 | 0.680887 | 3,908 | 16,784 | 2.825998 | 0.058598 | 0.008693 | 0.032597 | 0.054328 | 0.792557 | 0.775534 | 0.742575 | 0.720844 | 0.712514 | 0.706175 | 0 | 0.331174 | 0.024905 | 16,784 | 69 | 8,393 | 243.246377 | 0.343639 | 0.003337 | 0 | 0.033333 | 1 | 0.666667 | 0.394702 | 0.242526 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
16c161fa860545a670b9ab76b6b82120bfd81972 | 6,482 | py | Python | equipmentdb/migrations/0003_auto_20210826_2223.py | Jongmassey/ubuc-dev | e38d1ec606792e8feed16a00bcdd56f84ce3feda | [
"MIT"
] | null | null | null | equipmentdb/migrations/0003_auto_20210826_2223.py | Jongmassey/ubuc-dev | e38d1ec606792e8feed16a00bcdd56f84ce3feda | [
"MIT"
] | 7 | 2021-09-02T21:12:23.000Z | 2021-11-15T10:01:05.000Z | equipmentdb/migrations/0003_auto_20210826_2223.py | Jongmassey/ubuc-dev | e38d1ec606792e8feed16a00bcdd56f84ce3feda | [
"MIT"
] | 1 | 2021-11-11T17:32:54.000Z | 2021-11-11T17:32:54.000Z | # Generated by Django 3.2.6 on 2021-08-26 22:23
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('equipmentdb', '0002_equipmenttype_unique_name'),
]
operations = [
migrations.AlterField(
model_name='equipment',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipment_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipment',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipment_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmentnote',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmentnote_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmentnote',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmentnote_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmentservice',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmentservice_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmentservice',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmentservice_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttest',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttest_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttest',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttest_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttype',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttype_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttype',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttype_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttypeserviceschedule',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttypeserviceschedule_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttypeserviceschedule',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttypeserviceschedule_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttypetestschedule',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttypetestschedule_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='equipmenttypetestschedule',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmenttypetestschedule_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='service',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='service_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='service',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='service_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='test',
name='created_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='test_created_by', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='test',
name='updated_by',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='test_updated_by', to=settings.AUTH_USER_MODEL),
),
migrations.CreateModel(
name='EquipmentFault',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_on', models.DateTimeField(auto_now_add=True)),
('updated_on', models.DateTimeField(auto_now=True)),
('notes', models.TextField()),
('created_by', models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmentfault_created_by', to=settings.AUTH_USER_MODEL)),
('equipment', models.ForeignKey(on_delete=django.db.models.deletion.RESTRICT, to='equipmentdb.equipment')),
('updated_by', models.ForeignKey(editable=False, on_delete=django.db.models.deletion.RESTRICT, related_name='equipmentfault_updated_by', to=settings.AUTH_USER_MODEL)),
],
options={
'abstract': False,
},
),
]
| 53.131148 | 183 | 0.674329 | 685 | 6,482 | 6.128467 | 0.10365 | 0.04383 | 0.073368 | 0.115293 | 0.879943 | 0.866603 | 0.855407 | 0.83111 | 0.821105 | 0.821105 | 0 | 0.003728 | 0.213823 | 6,482 | 121 | 184 | 53.570248 | 0.820055 | 0.006942 | 0 | 0.634783 | 1 | 0 | 0.168143 | 0.086247 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.026087 | 0 | 0.052174 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bc418c63effbcc0f398cafd914745c5817b24b3c | 281 | py | Python | indentation.py | mansour20-meet/meet2018y1lab4 | d1885947e9a78d40d14803b367ae9134a2505c32 | [
"MIT"
] | null | null | null | indentation.py | mansour20-meet/meet2018y1lab4 | d1885947e9a78d40d14803b367ae9134a2505c32 | [
"MIT"
] | null | null | null | indentation.py | mansour20-meet/meet2018y1lab4 | d1885947e9a78d40d14803b367ae9134a2505c32 | [
"MIT"
] | null | null | null | indentation = False
if indentation:
print('chocolate')
print('Indentations are cool!')
indentation = True
if indentation:
print('chocolate')
print('Indentations are cool!')
indentation = False
if indentation:
print('chocolate')
print('Indentations are cool!')
| 15.611111 | 35 | 0.711744 | 30 | 281 | 6.666667 | 0.3 | 0.195 | 0.27 | 0.405 | 0.98 | 0.98 | 0.98 | 0.98 | 0.98 | 0.67 | 0 | 0 | 0.170819 | 281 | 17 | 36 | 16.529412 | 0.858369 | 0 | 0 | 0.916667 | 0 | 0 | 0.330961 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
bcbc11cd1b90370f7caf05b6c2f995871375d496 | 1,159 | py | Python | tests/test_recipes_hooks.py | Inconnu08/bocadillo | 87daa5f47099da932396cd29fe0375bb3704913c | [
"MIT"
] | null | null | null | tests/test_recipes_hooks.py | Inconnu08/bocadillo | 87daa5f47099da932396cd29fe0375bb3704913c | [
"MIT"
] | null | null | null | tests/test_recipes_hooks.py | Inconnu08/bocadillo | 87daa5f47099da932396cd29fe0375bb3704913c | [
"MIT"
] | null | null | null | from bocadillo import API, Recipe
from .utils import async_function_hooks
def test_on_async_function_view(api: API):
numbers = Recipe("numbers")
with async_function_hooks() as (before, after):
@numbers.before(before)
@numbers.after(after)
@numbers.route("/real")
async def real_numbers(req, res):
pass
api.recipe(numbers)
api.client.get("/numbers/real")
def test_on_sync_function_view(api: API):
numbers = Recipe("numbers")
with async_function_hooks() as (before, after):
@numbers.before(before)
@numbers.after(after)
@numbers.route("/real")
def real_numbers(req, res):
pass
api.recipe(numbers)
api.client.get("/numbers/real")
def test_on_class_based_view(api: API):
numbers = Recipe("numbers")
with async_function_hooks() as (before, after):
@numbers.before(before)
@numbers.route("/real")
class RealNumbers:
@numbers.after(after)
async def get(self, req, res):
pass
api.recipe(numbers)
api.client.get("/numbers/real")
| 23.653061 | 51 | 0.610009 | 138 | 1,159 | 4.963768 | 0.217391 | 0.113869 | 0.105109 | 0.074453 | 0.779562 | 0.779562 | 0.779562 | 0.779562 | 0.779562 | 0.779562 | 0 | 0 | 0.27006 | 1,159 | 48 | 52 | 24.145833 | 0.809693 | 0 | 0 | 0.727273 | 0 | 0 | 0.064711 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.121212 | false | 0.090909 | 0.060606 | 0 | 0.212121 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
4c34c1755b1fcf74ba7deb90a344310a651a6efe | 47,383 | py | Python | clinica/pipelines/machine_learning/algorithm.py | alexandreroutier/clinica | 66625c65e74962db7d5cea267d1a0e51d774bf91 | [
"MIT"
] | null | null | null | clinica/pipelines/machine_learning/algorithm.py | alexandreroutier/clinica | 66625c65e74962db7d5cea267d1a0e51d774bf91 | [
"MIT"
] | null | null | null | clinica/pipelines/machine_learning/algorithm.py | alexandreroutier/clinica | 66625c65e74962db7d5cea267d1a0e51d774bf91 | [
"MIT"
] | null | null | null | # coding: utf8
from os import path
import json
from multiprocessing.pool import ThreadPool
import datetime
import numpy as np
import pandas as pd
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score
import itertools
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from clinica.pipelines.machine_learning import base
import clinica.pipelines.machine_learning.ml_utils as utils
class DualSVMAlgorithm(base.MLAlgorithm):
def _launch_svc(self, kernel_train, x_test, y_train, y_test, c):
if self._algorithm_params['balanced']:
svc = SVC(C=c, kernel='precomputed', probability=True, tol=1e-6, class_weight='balanced')
else:
svc = SVC(C=c, kernel='precomputed', probability=True, tol=1e-6)
svc.fit(kernel_train, y_train)
y_hat_train = svc.predict(kernel_train)
y_hat = svc.predict(x_test)
proba_test = svc.predict_proba(x_test)[:, 1]
auc = roc_auc_score(y_test, proba_test)
return svc, y_hat, auc, y_hat_train
def _grid_search(self, kernel_train, x_test, y_train, y_test, c):
_, y_hat, _, _ = self._launch_svc(kernel_train, x_test, y_train, y_test, c)
res = utils.evaluate_prediction(y_test, y_hat)
return res['balanced_accuracy']
def _select_best_parameter(self, async_result):
c_values = []
accuracies = []
for fold in async_result.keys():
best_c = -1
best_acc = -1
for c, async_acc in async_result[fold].items():
acc = async_acc.get()
if acc > best_acc:
best_c = c
best_acc = acc
c_values.append(best_c)
accuracies.append(best_acc)
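        # Aggregate across grid-search folds: arithmetic mean of the balanced
        # accuracies, geometric mean of the selected C values (mean of log10, then back to linear scale).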
best_acc = np.mean(accuracies)
best_c = np.power(10, np.mean(np.log10(c_values)))
return {'c': best_c, 'balanced_accuracy': best_acc}
def evaluate(self, train_index, test_index):
inner_pool = ThreadPool(self._algorithm_params['n_threads'])
async_result = {}
for i in range(self._algorithm_params['grid_search_folds']):
async_result[i] = {}
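        # Inner cross-validation on the training set only: each candidate C is
        # evaluated asynchronously in the thread pool via _grid_search.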
outer_kernel = self._kernel[train_index, :][:, train_index]
y_train = self._y[train_index]
skf = StratifiedKFold(n_splits=self._algorithm_params['grid_search_folds'], shuffle=True)
inner_cv = list(skf.split(np.zeros(len(y_train)), y_train))
for i in range(len(inner_cv)):
inner_train_index, inner_test_index = inner_cv[i]
inner_kernel = outer_kernel[inner_train_index, :][:, inner_train_index]
x_test_inner = outer_kernel[inner_test_index, :][:, inner_train_index]
y_train_inner, y_test_inner = y_train[inner_train_index], y_train[inner_test_index]
for c in self._algorithm_params['c_range']:
async_result[i][c] = inner_pool.apply_async(self._grid_search,
(inner_kernel, x_test_inner,
y_train_inner, y_test_inner, c))
inner_pool.close()
inner_pool.join()
best_parameter = self._select_best_parameter(async_result)
x_test = self._kernel[test_index, :][:, train_index]
y_train, y_test = self._y[train_index], self._y[test_index]
_, y_hat, auc, y_hat_train = self._launch_svc(outer_kernel, x_test, y_train, y_test, best_parameter['c'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction(y_test, y_hat)
result['evaluation_train'] = utils.evaluate_prediction(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
result['auc'] = auc
return result
def apply_best_parameters(self, results_list):
best_c_list = []
bal_acc_list = []
for result in results_list:
best_c_list.append(result['best_parameter']['c'])
bal_acc_list.append(result['best_parameter']['balanced_accuracy'])
# 10^(mean of log10 of best Cs of each fold) is selected
best_c = np.power(10, np.mean(np.log10(best_c_list)))
# Mean balanced accuracy
mean_bal_acc = np.mean(bal_acc_list)
if self._algorithm_params['balanced']:
svc = SVC(C=best_c, kernel='precomputed', probability=True, tol=1e-6, class_weight='balanced')
else:
svc = SVC(C=best_c, kernel='precomputed', probability=True, tol=1e-6)
svc.fit(self._kernel, self._y)
return svc, {'c': best_c, 'balanced_accuracy': mean_bal_acc}
def save_classifier(self, classifier, output_dir):
np.savetxt(path.join(output_dir, 'dual_coefficients.txt'), classifier.dual_coef_)
np.savetxt(path.join(output_dir, 'support_vectors_indices.txt'), classifier.support_)
np.savetxt(path.join(output_dir, 'intersect.txt'), classifier.intercept_)
def save_weights(self, classifier, x, output_dir):
dual_coefficients = classifier.dual_coef_
sv_indices = classifier.support_
weighted_sv = dual_coefficients.transpose() * x[sv_indices]
weights = np.sum(weighted_sv, 0)
np.savetxt(path.join(output_dir, 'weights.txt'), weights)
return weights
def save_parameters(self, parameters_dict, output_dir):
with open(path.join(output_dir, 'best_parameters.json'), 'w') as f:
json.dump(parameters_dict, f)
@staticmethod
def uses_kernel():
return True
@staticmethod
def get_default_parameters():
parameters_dict = {'balanced': True,
'grid_search_folds': 10,
'c_range': np.logspace(-6, 2, 17),
'n_threads': 15}
return parameters_dict
class LogisticReg(base.MLAlgorithm):
def _launch_logistic_reg(self, x_train, x_test, y_train, y_test, c):
if self._algorithm_params['balanced']:
classifier = LogisticRegression(penalty=self._algorithm_params['penalty'], tol=1e-6, C=c,
class_weight='balanced')
else:
classifier = LogisticRegression(penalty=self._algorithm_params['penalty'], tol=1e-6, C=c)
classifier.fit(x_train, y_train)
y_hat_train = classifier.predict(x_train)
y_hat = classifier.predict(x_test)
proba_test = classifier.predict_proba(x_test)[:, 1]
auc = roc_auc_score(y_test, proba_test)
return classifier, y_hat, auc, y_hat_train
def _grid_search(self, x_train, x_test, y_train, y_test, c):
_, y_hat, _, _ = self._launch_logistic_reg(x_train, x_test, y_train, y_test, c)
res = utils.evaluate_prediction(y_test, y_hat)
return res['balanced_accuracy']
def _select_best_parameter(self, async_result):
c_values = []
accuracies = []
for fold in async_result.keys():
best_c = -1
best_acc = -1
for c, async_acc in async_result[fold].items():
acc = async_acc.get()
if acc > best_acc:
best_c = c
best_acc = acc
c_values.append(best_c)
accuracies.append(best_acc)
best_acc = np.mean(accuracies)
best_c = np.power(10, np.mean(np.log10(c_values)))
return {'c': best_c, 'balanced_accuracy': best_acc}
def evaluate(self, train_index, test_index):
inner_pool = ThreadPool(self._algorithm_params['n_threads'])
async_result = {}
for i in range(self._algorithm_params['grid_search_folds']):
async_result[i] = {}
x_train = self._x[train_index]
y_train = self._y[train_index]
skf = StratifiedKFold(n_splits=self._algorithm_params['grid_search_folds'], shuffle=True)
inner_cv = list(skf.split(np.zeros(len(y_train)), y_train))
for i in range(len(inner_cv)):
inner_train_index, inner_test_index = inner_cv[i]
x_train_inner = x_train[inner_train_index]
x_test_inner = x_train[inner_test_index]
y_train_inner = y_train[inner_train_index]
y_test_inner = y_train[inner_test_index]
for c in self._algorithm_params['c_range']:
async_result[i][c] = inner_pool.apply_async(self._grid_search,
(x_train_inner, x_test_inner,
y_train_inner, y_test_inner, c))
inner_pool.close()
inner_pool.join()
best_parameter = self._select_best_parameter(async_result)
x_test = self._x[test_index]
y_test = self._y[test_index]
_, y_hat, auc, y_hat_train = self._launch_logistic_reg(x_train, x_test, y_train, y_test, best_parameter['c'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction(y_test, y_hat)
result['evaluation_train'] = utils.evaluate_prediction(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
result['auc'] = auc
return result
def apply_best_parameters(self, results_list):
best_c_list = []
bal_acc_list = []
for result in results_list:
best_c_list.append(result['best_parameter']['c'])
bal_acc_list.append(result['best_parameter']['balanced_accuracy'])
# 10^(mean of log10 of best Cs of each fold) is selected
best_c = np.power(10, np.mean(np.log10(best_c_list)))
# Mean balanced accuracy
mean_bal_acc = np.mean(bal_acc_list)
if self._algorithm_params['balanced']:
classifier = LogisticRegression(C=best_c, penalty=self._algorithm_params['penalty'], tol=1e-6,
class_weight='balanced')
else:
classifier = LogisticRegression(C=best_c, penalty=self._algorithm_params['penalty'], tol=1e-6)
classifier.fit(self._x, self._y)
return classifier, {'c': best_c, 'balanced_accuracy': mean_bal_acc}
def save_classifier(self, classifier, output_dir):
np.savetxt(path.join(output_dir, 'weights.txt'), classifier.coef_.transpose())
np.savetxt(path.join(output_dir, 'intercept.txt'), classifier.intercept_)
def save_weights(self, classifier, x, output_dir):
np.savetxt(path.join(output_dir, 'weights.txt'), classifier.coef_.transpose())
return classifier.coef_.transpose()
def save_parameters(self, parameters_dict, output_dir):
with open(path.join(output_dir, 'best_parameters.json'), 'w') as f:
json.dump(parameters_dict, f)
@staticmethod
def _centered_normalised_data(features):
std = np.std(features, axis=0)
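        # Replace a zero standard deviation by 1 to avoid dividing by zero for constant features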
std[np.where(std == 0)[0]] = 1.
mean = np.mean(features, axis=0)
features_bis = (features - mean)/std
return features_bis, mean, std
@staticmethod
def uses_kernel():
return False
@staticmethod
def get_default_parameters():
parameters_dict = {'penalty': 'l2',
'balanced': False,
'grid_search_folds': 10,
'c_range': np.logspace(-6, 2, 17),
'n_threads': 15}
return parameters_dict
class RandomForest(base.MLAlgorithm):
def _launch_random_forest(self, x_train, x_test, y_train, y_test, n_estimators, max_depth, min_samples_split,
max_features):
if self._algorithm_params['balanced']:
classifier = RandomForestClassifier(n_estimators=n_estimators, max_depth=max_depth,
min_samples_split=min_samples_split, max_features=max_features,
class_weight='balanced', n_jobs=self._algorithm_params['n_threads'])
else:
classifier = RandomForestClassifier(n_estimators=n_estimators, max_depth=max_depth,
min_samples_split=min_samples_split, max_features=max_features,
n_jobs=self._algorithm_params['n_threads'])
classifier.fit(x_train, y_train)
y_hat_train = classifier.predict(x_train)
y_hat = classifier.predict(x_test)
proba_test = classifier.predict_proba(x_test)[:, 1]
auc = roc_auc_score(y_test, proba_test)
return classifier, y_hat, auc, y_hat_train
def _grid_search(self, x_train, x_test, y_train, y_test, n_estimators, max_depth, min_samples_split, max_features):
_, y_hat, _, _ = self._launch_random_forest(x_train, x_test, y_train, y_test,
n_estimators, max_depth,
min_samples_split, max_features)
res = utils.evaluate_prediction(y_test, y_hat)
return res['balanced_accuracy']
def _select_best_parameter(self, async_result):
params_list = []
accuracies = []
all_params_acc = []
for fold in async_result.keys():
best_params = None
best_acc = -1
for params, async_acc in async_result[fold].items():
acc = async_acc.get()
if acc > best_acc:
best_params = params
best_acc = acc
all_params_acc.append(pd.DataFrame({'n_estimators': params[0],
'max_depth': params[1],
'min_samples_split': params[2],
'max_features': params[3],
'balanced_accuracy': acc}, index=['i', ]))
params_list.append(best_params)
accuracies.append(best_acc)
best_acc = np.mean(accuracies)
best_n_estimators = int(round(np.mean([x[0] for x in params_list])))
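        # max_depth=None (unbounded trees) is treated as 50 when averaging across folds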
best_max_depth = int(round(np.mean([x[1] if x[1] is not None else 50 for x in params_list])))
best_min_samples_split = int(round(np.mean([x[2] for x in params_list])))
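        # max_features can be an int, a float or one of 'auto'/'sqrt'/'log2';
        # normalise it to a fraction of the number of features so it can be averaged across folds.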
def max_feature_to_float(m):
if type(m) is float:
return m
if type(m) is int:
return float(m) / float(self._x.shape[1])
if m == 'auto' or m == 'sqrt':
return np.sqrt(self._x.shape[1]) / float(self._x.shape[1])
if m == 'log2':
return np.log2(self._x.shape[1]) / float(self._x.shape[1])
raise ValueError('Not valid value for max_feature: %s' % m)
float_max_feat = [max_feature_to_float(x[3]) for x in params_list]
best_max_features = np.mean(float_max_feat)
return {'n_estimators': best_n_estimators,
'max_depth': best_max_depth,
'min_samples_split': best_min_samples_split,
'max_features': best_max_features,
'balanced_accuracy': best_acc}
def evaluate(self, train_index, test_index):
inner_pool = ThreadPool(self._algorithm_params['n_threads'])
async_result = {}
for i in range(self._algorithm_params['grid_search_folds']):
async_result[i] = {}
x_train = self._x[train_index]
y_train = self._y[train_index]
skf = StratifiedKFold(n_splits=self._algorithm_params['grid_search_folds'], shuffle=True)
inner_cv = list(skf.split(np.zeros(len(y_train)), y_train))
parameters_combinations = list(itertools.product(self._algorithm_params['n_estimators_range'],
self._algorithm_params['max_depth_range'],
self._algorithm_params['min_samples_split_range'],
self._algorithm_params['max_features_range']))
for i in range(len(inner_cv)):
inner_train_index, inner_test_index = inner_cv[i]
x_train_inner = x_train[inner_train_index]
x_test_inner = x_train[inner_test_index]
y_train_inner = y_train[inner_train_index]
y_test_inner = y_train[inner_test_index]
for parameters in parameters_combinations:
async_result[i][parameters] = inner_pool.apply_async(self._grid_search,
(x_train_inner, x_test_inner,
y_train_inner, y_test_inner,
parameters[0], parameters[1],
parameters[2], parameters[3]))
inner_pool.close()
inner_pool.join()
best_parameter = self._select_best_parameter(async_result)
x_test = self._x[test_index]
y_test = self._y[test_index]
_, y_hat, auc, y_hat_train = self._launch_random_forest(x_train, x_test, y_train, y_test,
best_parameter['n_estimators'],
best_parameter['max_depth'],
best_parameter['min_samples_split'],
best_parameter['max_features'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction(y_test, y_hat)
result['evaluation_train'] = utils.evaluate_prediction(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
result['auc'] = auc
return result
def evaluate_no_cv(self, train_index, test_index):
x_train = self._x[train_index]
y_train = self._y[train_index]
x_test = self._x[test_index]
y_test = self._y[test_index]
best_parameter = dict()
best_parameter['n_estimators'] = self._algorithm_params['n_estimators_range']
best_parameter['max_depth'] = self._algorithm_params['max_depth_range']
best_parameter['min_samples_split'] = self._algorithm_params['min_samples_split_range']
best_parameter['max_features'] = self._algorithm_params['max_features_range']
_, y_hat, auc, y_hat_train = self._launch_random_forest(x_train, x_test, y_train, y_test,
self._algorithm_params['n_estimators_range'],
self._algorithm_params['max_depth_range'],
self._algorithm_params['min_samples_split_range'],
self._algorithm_params['max_features_range'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction(y_test, y_hat)
best_parameter['balanced_accuracy'] = result['evaluation']['balanced_accuracy']
result['evaluation_train'] = utils.evaluate_prediction(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
result['auc'] = auc
return result
def apply_best_parameters(self, results_list):
mean_bal_acc = np.mean([result['best_parameter']['balanced_accuracy'] for result in results_list])
best_n_estimators = int(round(np.mean([result['best_parameter']['n_estimators'] for result in results_list])))
best_max_depth = int(round(np.mean([result['best_parameter']['max_depth']
if result['best_parameter']['max_depth'] is not None
else 50 for result in results_list])))
best_min_samples_split = int(round(np.mean([result['best_parameter']['min_samples_split']
for result in results_list])))
max_feat = []
n_features = self._x.shape[1]
for result in results_list:
result_feat = result['best_parameter']['max_features']
if result_feat is None:
max_features = 1.0
elif result_feat in ["auto", "sqrt"]:
max_features = np.sqrt(n_features) / n_features
elif result_feat == "log2":
max_features = np.log2(n_features) / n_features
elif isinstance(result_feat, int):
max_features = float(result_feat) / n_features
elif isinstance(result_feat, float):
max_features = result_feat
else:
                raise ValueError("Unknown max_features type")
max_feat.append(max_features)
best_max_features = np.mean(max_feat)
if self._algorithm_params['balanced']:
classifier = RandomForestClassifier(n_estimators=best_n_estimators, max_depth=best_max_depth,
min_samples_split=best_min_samples_split,
max_features=best_max_features,
class_weight='balanced', n_jobs=self._algorithm_params['n_threads'])
else:
classifier = RandomForestClassifier(n_estimators=best_n_estimators, max_depth=best_max_depth,
min_samples_split=best_min_samples_split,
max_features=best_max_features,
n_jobs=self._algorithm_params['n_threads'])
classifier.fit(self._x, self._y)
return classifier, {'n_estimators': best_n_estimators,
'max_depth': best_max_depth,
'min_samples_split': best_min_samples_split,
'max_features': best_max_features,
'balanced_accuracy': mean_bal_acc}
def save_classifier(self, classifier, output_dir):
np.savetxt(path.join(output_dir, 'feature_importances.txt'), classifier.feature_importances_)
# print classifier.estimators_
# np.savetxt(path.join(output_dir, 'estimators.txt'), str(classifier.estimators_))
def save_weights(self, classifier, x, output_dir):
np.savetxt(path.join(output_dir, 'weights.txt'), classifier.feature_importances_)
return classifier.feature_importances_
def save_parameters(self, parameters_dict, output_dir):
with open(path.join(output_dir, 'best_parameters.json'), 'w') as f:
json.dump(parameters_dict, f)
@staticmethod
def uses_kernel():
return False
@staticmethod
def get_default_parameters():
parameters_dict = {'balanced': False,
'grid_search_folds': 10,
'n_estimators_range': (10, 25, 50, 100, 150, 200, 500),
'max_depth_range': (None, 6, 8, 10, 12),
'min_samples_split_range': (2, 4, 6, 8),
'max_features_range': ('auto', 0.1, 0.2, 0.3, 0.4, 0.5),
'n_threads': 15}
return parameters_dict
class XGBoost(base.MLAlgorithm):
def _launch_xgboost(self, x_train, x_test, y_train, y_test, max_depth, learning_rate, n_estimators,
colsample_bytree):
if self._algorithm_params['balanced']:
# set scale_pos_weight
# http://xgboost.readthedocs.io/en/latest//how_to/param_tuning.html
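            # with binary 0/1 labels this is (number of negative samples) / (number of positive samples)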
            scale_pos_weight = float((len(self._y) - sum(self._y)) / sum(self._y))
classifier = XGBClassifier(max_depth=max_depth, learning_rate=learning_rate, n_estimators=n_estimators,
n_jobs=self._algorithm_params['n_threads'], colsample_bytree=colsample_bytree,
reg_alpha=self._algorithm_params['reg_alpha'],
reg_lambda=self._algorithm_params['reg_lambda'],
scale_pos_weight=scale_pos_weight)
else:
classifier = XGBClassifier(max_depth=max_depth, learning_rate=learning_rate, n_estimators=n_estimators,
n_jobs=self._algorithm_params['n_threads'], colsample_bytree=colsample_bytree,
reg_alpha=self._algorithm_params['reg_alpha'],
reg_lambda=self._algorithm_params['reg_lambda'])
classifier.fit(x_train, y_train)
y_hat_train = classifier.predict(x_train)
y_hat = classifier.predict(x_test)
proba_test = classifier.predict_proba(x_test)[:, 1]
auc = roc_auc_score(y_test, proba_test)
return classifier, y_hat, auc, y_hat_train
def _grid_search(self, x_train, x_test, y_train, y_test, max_depth, learning_rate, n_estimators, colsample_bytree):
_, y_hat, _, _ = self._launch_xgboost(x_train, x_test, y_train, y_test, max_depth, learning_rate, n_estimators,
colsample_bytree)
res = utils.evaluate_prediction(y_test, y_hat)
return res['balanced_accuracy']
def _select_best_parameter(self, async_result):
params_list = []
accuracies = []
all_params_acc = []
for fold in async_result.keys():
best_params = None
best_acc = -1
for params, async_acc in async_result[fold].items():
acc = async_acc.get()
if acc > best_acc:
best_params = params
best_acc = acc
all_params_acc.append(pd.DataFrame({'max_depth': params[0],
'learning_rate': params[1],
'n_estimators': params[2],
'colsample_bytree': params[3],
'balanced_accuracy': acc}, index=['i', ]))
params_list.append(best_params)
accuracies.append(best_acc)
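        # Aggregate across grid-search folds: round the integer hyperparameters
        # (max_depth, n_estimators) to the mean, average the float ones.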
best_acc = np.mean(accuracies)
best_max_depth = int(round(np.mean([x[0] for x in params_list])))
best_learning_rate = np.mean([x[1] for x in params_list])
best_n_estimators = int(round(np.mean([x[2] for x in params_list])))
best_colsample_bytree = np.mean([x[3] for x in params_list])
return {'max_depth': best_max_depth,
'learning_rate': best_learning_rate,
'n_estimators': best_n_estimators,
'colsample_bytree': best_colsample_bytree,
'balanced_accuracy': best_acc}
def evaluate(self, train_index, test_index):
inner_pool = ThreadPool(self._algorithm_params['n_threads'])
async_result = {}
for i in range(self._algorithm_params['grid_search_folds']):
async_result[i] = {}
x_train = self._x[train_index]
y_train = self._y[train_index]
skf = StratifiedKFold(n_splits=self._algorithm_params['grid_search_folds'], shuffle=True)
inner_cv = list(skf.split(np.zeros(len(y_train)), y_train))
parameters_combinations = list(itertools.product(self._algorithm_params['max_depth_range'],
self._algorithm_params['learning_rate_range'],
self._algorithm_params['n_estimators_range'],
self._algorithm_params['colsample_bytree_range']))
for i in range(len(inner_cv)):
inner_train_index, inner_test_index = inner_cv[i]
x_train_inner = x_train[inner_train_index]
x_test_inner = x_train[inner_test_index]
y_train_inner = y_train[inner_train_index]
y_test_inner = y_train[inner_test_index]
for parameters in parameters_combinations:
async_result[i][parameters] = inner_pool.apply_async(self._grid_search,
(x_train_inner, x_test_inner,
y_train_inner, y_test_inner,
parameters[0], parameters[1],
parameters[2], parameters[3]))
inner_pool.close()
inner_pool.join()
best_parameter = self._select_best_parameter(async_result)
x_test = self._x[test_index]
y_test = self._y[test_index]
_, y_hat, auc, y_hat_train = self._launch_xgboost(x_train, x_test, y_train, y_test,
best_parameter['max_depth'],
best_parameter['learning_rate'],
best_parameter['n_estimators'],
best_parameter['colsample_bytree'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction(y_test, y_hat)
result['evaluation_train'] = utils.evaluate_prediction(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
result['auc'] = auc
return result
def evaluate_no_cv(self, train_index, test_index):
x_train = self._x[train_index]
y_train = self._y[train_index]
x_test = self._x[test_index]
y_test = self._y[test_index]
best_parameter = dict()
best_parameter['max_depth'] = self._algorithm_params['max_depth_range']
best_parameter['learning_rate'] = self._algorithm_params['learning_rate_range']
best_parameter['n_estimators'] = self._algorithm_params['n_estimators_range']
best_parameter['colsample_bytree'] = self._algorithm_params['colsample_bytree_range']
_, y_hat, auc, y_hat_train = self._launch_xgboost(x_train, x_test, y_train, y_test,
self._algorithm_params['max_depth_range'],
self._algorithm_params['learning_rate_range'],
self._algorithm_params['n_estimators_range'],
self._algorithm_params['colsample_bytree_range'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction(y_test, y_hat)
best_parameter['balanced_accuracy'] = result['evaluation']['balanced_accuracy']
result['evaluation_train'] = utils.evaluate_prediction(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
result['auc'] = auc
return result
def apply_best_parameters(self, results_list):
mean_bal_acc = np.mean([result['best_parameter']['balanced_accuracy'] for result in results_list])
best_max_depth = int(round(np.mean([result['best_parameter']['max_depth'] for result in results_list])))
best_learning_rate = np.mean([result['best_parameter']['learning_rate'] for result in results_list])
best_n_estimators = int(round(np.mean([result['best_parameter']['n_estimators'] for result in results_list])))
best_colsample_bytree = np.mean([result['best_parameter']['colsample_bytree'] for result in results_list])
if self._algorithm_params['balanced']:
            scale_pos_weight = float((len(self._y) - sum(self._y)) / sum(self._y))
classifier = XGBClassifier(max_depth=best_max_depth, learning_rate=best_learning_rate,
n_estimators=best_n_estimators, n_jobs=self._algorithm_params['n_threads'],
colsample_bytree=best_colsample_bytree,
reg_alpha=self._algorithm_params['reg_alpha'],
reg_lambda=self._algorithm_params['reg_lambda'],
scale_pos_weight=scale_pos_weight)
else:
classifier = XGBClassifier(max_depth=best_max_depth, learning_rate=best_learning_rate,
n_estimators=best_n_estimators, n_jobs=self._algorithm_params['n_threads'],
colsample_bytree=best_colsample_bytree,
reg_alpha=self._algorithm_params['reg_alpha'],
reg_lambda=self._algorithm_params['reg_lambda'])
classifier.fit(self._x, self._y)
return classifier, {'max_depth': best_max_depth,
'learning_rate': best_learning_rate,
'n_estimators': best_n_estimators,
'colsample_bytree': best_colsample_bytree,
'balanced_accuracy': mean_bal_acc}
def save_classifier(self, classifier, output_dir):
np.savetxt(path.join(output_dir, 'feature_importances.txt'), classifier.feature_importances_)
# print classifier.estimators_
# np.savetxt(path.join(output_dir, 'estimators.txt'), str(classifier.estimators_))
def save_weights(self, classifier, x, output_dir):
np.savetxt(path.join(output_dir, 'weights.txt'), classifier.feature_importances_)
return classifier.feature_importances_
def save_parameters(self, parameters_dict, output_dir):
with open(path.join(output_dir, 'best_parameters.json'), 'w') as f:
json.dump(parameters_dict, f)
@staticmethod
def uses_kernel():
return False
@staticmethod
def get_default_parameters():
parameters_dict = {'balanced': False,
'grid_search_folds': 10,
'max_depth_range': (0, 6),
'learning_rate_range': (0.1, 0.3),
'n_estimators_range': (100, 200),
'colsample_bytree_range': (0.5, 1),
'reg_alpha': 0,
'reg_lambda': 1,
'n_threads': 15}
return parameters_dict
class OneVsOneSVM(base.MLAlgorithm):
def _launch_svc(self, kernel_train, x_test, y_train, y_test, c):
if self._algorithm_params['balanced']:
svc = OneVsOneClassifier(SVC(C=c, kernel='precomputed', probability=True, tol=1e-6,
class_weight='balanced'))
else:
svc = OneVsOneClassifier(SVC(C=c, kernel='precomputed', probability=True, tol=1e-6))
svc.fit(kernel_train, y_train)
y_hat_train = svc.predict(kernel_train)
y_hat = svc.predict(x_test)
proba_test = svc.predict_proba(x_test)[:, 1]
return svc, y_hat, y_hat_train
def _grid_search(self, kernel_train, x_test, y_train, y_test, c):
# y_hat is the value predicted
_, y_hat, _ = self._launch_svc(kernel_train, x_test, y_train, y_test, c)
res = utils.evaluate_prediction_multiclass(y_test, y_hat)
return res['balanced_accuracy']
def _select_best_parameter(self, async_result):
c_values = []
accuracies = []
for fold in async_result.keys():
best_c = -1
best_acc = -1
for c, async_acc in async_result[fold].items():
acc = async_acc.get()
if acc > best_acc:
best_c = c
best_acc = acc
c_values.append(best_c)
accuracies.append(best_acc)
best_acc = np.mean(accuracies)
best_c = np.power(10, np.mean(np.log10(c_values)))
return {'c': best_c, 'balanced_accuracy': best_acc}
def evaluate(self, train_index, test_index):
inner_pool = ThreadPool(self._algorithm_params['n_threads'])
async_result = {}
for i in range(self._algorithm_params['grid_search_folds']):
async_result[i] = {}
outer_kernel = self._kernel[train_index, :][:, train_index]
y_train = self._y[train_index]
skf = StratifiedKFold(n_splits=self._algorithm_params['grid_search_folds'], shuffle=True)
inner_cv = list(skf.split(np.zeros(len(y_train)), y_train))
for i in range(len(inner_cv)):
inner_train_index, inner_test_index = inner_cv[i]
inner_kernel = outer_kernel[inner_train_index, :][:, inner_train_index]
x_test_inner = outer_kernel[inner_test_index, :][:, inner_train_index]
y_train_inner, y_test_inner = y_train[inner_train_index], y_train[inner_test_index]
for c in self._algorithm_params['c_range']:
async_result[i][c] = inner_pool.apply_async(self._grid_search,
(inner_kernel, x_test_inner,
y_train_inner, y_test_inner, c))
inner_pool.close()
inner_pool.join()
best_parameter = self._select_best_parameter(async_result)
x_test = self._kernel[test_index, :][:, train_index]
y_train, y_test = self._y[train_index], self._y[test_index]
_, y_hat, y_hat_train = self._launch_svc(outer_kernel, x_test, y_train, y_test, best_parameter['c'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction_multiclass(y_test, y_hat)
result['evaluation_train'] = utils.evaluate_prediction_multiclass(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
return result
def apply_best_parameters(self, results_list):
best_c_list = []
bal_acc_list = []
for result in results_list:
best_c_list.append(result['best_parameter']['c'])
bal_acc_list.append(result['best_parameter']['balanced_accuracy'])
# 10^(mean of log10 of best Cs of each fold) is selected
best_c = np.power(10, np.mean(np.log10(best_c_list)))
# Mean balanced accuracy
mean_bal_acc = np.mean(bal_acc_list)
if self._algorithm_params['balanced']:
svc = OneVsOneClassifier(SVC(C=best_c, kernel='precomputed', probability=True, tol=1e-6,
class_weight='balanced'))
else:
svc = OneVsOneClassifier(SVC(C=best_c, kernel='precomputed', probability=True, tol=1e-6))
svc.fit(self._kernel, self._y)
return svc, {'c': best_c, 'balanced_accuracy': mean_bal_acc}
def save_classifier(self, classifier, output_dir):
np.savetxt(path.join(output_dir, 'support_vectors_indices.txt'), classifier.support_)
np.savetxt(path.join(output_dir, 'intersect.txt'), classifier.intercept_)
def save_weights(self, classifier, x, output_dir):
dual_coefficients = classifier.dual_coef_
sv_indices = classifier.support_
weighted_sv = dual_coefficients.transpose() * x[sv_indices]
weights = np.sum(weighted_sv, 0)
np.savetxt(path.join(output_dir, 'weights.txt'), weights)
return weights
def save_parameters(self, parameters_dict, output_dir):
with open(path.join(output_dir, 'best_parameters.json'), 'w') as f:
json.dump(parameters_dict, f)
@staticmethod
def uses_kernel():
return True
@staticmethod
def get_default_parameters():
parameters_dict = {'balanced': True,
'grid_search_folds': 10,
'c_range': np.logspace(-6, 2, 17),
'n_threads': 15}
return parameters_dict
class OneVsRestSVM(base.MLAlgorithm):
def _launch_svc(self, kernel_train, x_test, y_train, y_test, c):
if self._algorithm_params['balanced']:
svc = OneVsRestClassifier(SVC(C=c, kernel='precomputed', probability=True, tol=1e-6,
class_weight='balanced'))
else:
svc = OneVsRestClassifier(SVC(C=c, kernel='precomputed', probability=True, tol=1e-6))
svc.fit(kernel_train, y_train)
y_hat_train = svc.predict(kernel_train)
y_hat = svc.predict(x_test)
proba_test = svc.predict_proba(x_test)[:, 1]
return svc, y_hat, y_hat_train
def _grid_search(self, kernel_train, x_test, y_train, y_test, c):
# y_hat is the value predicted
_, y_hat, _ = self._launch_svc(kernel_train, x_test, y_train, y_test, c)
res = utils.evaluate_prediction_multiclass(y_test, y_hat)
return res['balanced_accuracy']
def _select_best_parameter(self, async_result):
c_values = []
accuracies = []
for fold in async_result.keys():
best_c = -1
best_acc = -1
for c, async_acc in async_result[fold].items():
acc = async_acc.get()
if acc > best_acc:
best_c = c
best_acc = acc
c_values.append(best_c)
accuracies.append(best_acc)
best_acc = np.mean(accuracies)
best_c = np.power(10, np.mean(np.log10(c_values)))
return {'c': best_c, 'balanced_accuracy': best_acc}
def evaluate(self, train_index, test_index):
inner_pool = ThreadPool(self._algorithm_params['n_threads'])
async_result = {}
for i in range(self._algorithm_params['grid_search_folds']):
async_result[i] = {}
outer_kernel = self._kernel[train_index, :][:, train_index]
y_train = self._y[train_index]
skf = StratifiedKFold(n_splits=self._algorithm_params['grid_search_folds'], shuffle=True)
inner_cv = list(skf.split(np.zeros(len(y_train)), y_train))
for i in range(len(inner_cv)):
inner_train_index, inner_test_index = inner_cv[i]
inner_kernel = outer_kernel[inner_train_index, :][:, inner_train_index]
x_test_inner = outer_kernel[inner_test_index, :][:, inner_train_index]
y_train_inner, y_test_inner = y_train[inner_train_index], y_train[inner_test_index]
for c in self._algorithm_params['c_range']:
async_result[i][c] = inner_pool.apply_async(self._grid_search,
(inner_kernel, x_test_inner,
y_train_inner, y_test_inner, c))
inner_pool.close()
inner_pool.join()
best_parameter = self._select_best_parameter(async_result)
x_test = self._kernel[test_index, :][:, train_index]
y_train, y_test = self._y[train_index], self._y[test_index]
_, y_hat, y_hat_train = self._launch_svc(outer_kernel, x_test, y_train, y_test, best_parameter['c'])
result = dict()
result['best_parameter'] = best_parameter
result['evaluation'] = utils.evaluate_prediction_multiclass(y_test, y_hat)
result['evaluation_train'] = utils.evaluate_prediction_multiclass(y_train, y_hat_train)
result['y_hat'] = y_hat
result['y_hat_train'] = y_hat_train
result['y'] = y_test
result['y_train'] = y_train
result['y_index'] = test_index
result['x_index'] = train_index
return result
def apply_best_parameters(self, results_list):
best_c_list = []
bal_acc_list = []
for result in results_list:
best_c_list.append(result['best_parameter']['c'])
bal_acc_list.append(result['best_parameter']['balanced_accuracy'])
# 10^(mean of log10 of best Cs of each fold) is selected
best_c = np.power(10, np.mean(np.log10(best_c_list)))
# Mean balanced accuracy
mean_bal_acc = np.mean(bal_acc_list)
if self._algorithm_params['balanced']:
            svc = OneVsRestClassifier(SVC(C=best_c, kernel='precomputed', probability=True, tol=1e-6,
class_weight='balanced'))
else:
            svc = OneVsRestClassifier(SVC(C=best_c, kernel='precomputed', probability=True, tol=1e-6))
svc.fit(self._kernel, self._y)
return svc, {'c': best_c, 'balanced_accuracy': mean_bal_acc}
def save_classifier(self, classifier, output_dir):
np.savetxt(path.join(output_dir, 'support_vectors_indices.txt'), classifier.support_)
np.savetxt(path.join(output_dir, 'intersect.txt'), classifier.intercept_)
def save_weights(self, classifier, x, output_dir):
dual_coefficients = classifier.dual_coef_
sv_indices = classifier.support_
weighted_sv = dual_coefficients.transpose() * x[sv_indices]
weights = np.sum(weighted_sv, 0)
np.savetxt(path.join(output_dir, 'weights.txt'), weights)
return weights
def save_parameters(self, parameters_dict, output_dir):
with open(path.join(output_dir, 'best_parameters.json'), 'w') as f:
json.dump(parameters_dict, f)
@staticmethod
def uses_kernel():
return True
@staticmethod
def get_default_parameters():
parameters_dict = {'balanced': True,
'grid_search_folds': 10,
'c_range': np.logspace(-6, 2, 17),
'n_threads': 15}
return parameters_dict
| 42.006206 | 119 | 0.589937 | 5,631 | 47,383 | 4.584621 | 0.043509 | 0.026495 | 0.057406 | 0.012357 | 0.924814 | 0.913271 | 0.892199 | 0.883832 | 0.877944 | 0.870894 | 0 | 0.007412 | 0.310913 | 47,383 | 1,127 | 120 | 42.043478 | 0.783253 | 0.014541 | 0 | 0.826456 | 0 | 0 | 0.089511 | 0.007027 | 0 | 0 | 0 | 0 | 0 | 1 | 0.07767 | false | 0 | 0.026699 | 0.007282 | 0.178398 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d5d479b70ffda4162349c98cea80cedb12900df8 | 1,085 | py | Python | data/typing/numpy.lib.arraypad.py | pydata-apis/python-api-record | 684cffbbb6dc6e81f9de4e02619c8b0ebc557b2b | [
"MIT"
] | 67 | 2020-08-17T11:53:26.000Z | 2021-11-08T20:16:06.000Z | data/typing/numpy.lib.arraypad.py | data-apis/python-record-api | 684cffbbb6dc6e81f9de4e02619c8b0ebc557b2b | [
"MIT"
] | 36 | 2020-08-17T11:09:51.000Z | 2021-12-15T18:09:47.000Z | data/typing/numpy.lib.arraypad.py | pydata-apis/python-api-record | 684cffbbb6dc6e81f9de4e02619c8b0ebc557b2b | [
"MIT"
] | 7 | 2020-08-19T05:06:47.000Z | 2020-11-04T05:10:38.000Z | from typing import *
@overload
def _as_pairs(x: Tuple[Tuple[int, int], Tuple[int, int]], ndim: int, as_index: bool):
"""
usage.skimage: 1
"""
...
@overload
def _as_pairs(
x: Tuple[Tuple[int, int], Tuple[int, int], Tuple[int, int]],
ndim: int,
as_index: bool,
):
"""
usage.skimage: 1
"""
...
@overload
def _as_pairs(x: int, ndim: int, as_index: bool):
"""
usage.skimage: 1
"""
...
@overload
def _as_pairs(x: List[Tuple[int, int]], ndim: int, as_index: bool):
"""
usage.skimage: 1
"""
...
@overload
def _as_pairs(x: List[Tuple[numpy.int64, numpy.int64]], ndim: int, as_index: bool):
"""
usage.skimage: 1
"""
...
@overload
def _as_pairs(x: Tuple[int, int], ndim: int, as_index: bool):
"""
usage.skimage: 1
"""
...
def _as_pairs(
x: Union[
Tuple[Union[Tuple[int, int], int], ...],
int,
List[Tuple[Union[int, numpy.int64], Union[int, numpy.int64]]],
],
ndim: int,
as_index: bool,
):
"""
usage.skimage: 6
"""
...
| 15.724638 | 85 | 0.533641 | 142 | 1,085 | 3.929577 | 0.147887 | 0.107527 | 0.157706 | 0.137993 | 0.811828 | 0.811828 | 0.811828 | 0.811828 | 0.811828 | 0.722222 | 0 | 0.019036 | 0.273733 | 1,085 | 68 | 86 | 15.955882 | 0.689086 | 0.108756 | 0 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.212121 | false | 0 | 0.030303 | 0 | 0.242424 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d5db814ef19fdf2f8fbf2f0067af765609657027 | 27,989 | py | Python | tests/test_geofileops_gpd.py | theroggy/geofileops | e48a0a69e5a927d003919ba556727bfd72ed226d | [
"BSD-3-Clause"
] | 1 | 2021-02-01T20:01:12.000Z | 2021-02-01T20:01:12.000Z | tests/test_geofileops_gpd.py | theroggy/geofileops | e48a0a69e5a927d003919ba556727bfd72ed226d | [
"BSD-3-Clause"
] | 18 | 2020-06-12T13:46:30.000Z | 2021-07-30T15:24:09.000Z | tests/test_geofileops_gpd.py | theroggy/geofileops | e48a0a69e5a927d003919ba556727bfd72ed226d | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Tests for operations using GeoPandas.
"""
from pathlib import Path
import sys
import geopandas as gpd
# Add path so the local geofileops packages are found
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))
from geofileops import geofile
from geofileops.geofile import GeometryType
from geofileops.util import geofileops_gpd
from geofileops.util.geofileops_gpd import ParallelizationConfig
from geofileops.util.geometry_util import SimplifyAlgorithm
import test_helper
def get_nb_parallel() -> int:
# The number of parallel processes to use for these tests.
return 2
def get_parallelization_config() -> ParallelizationConfig:
#default_config = ParallelizationConfig()
test_config = ParallelizationConfig(
#bytes_basefootprint: int = 50*1024*1024,
#bytes_per_row: int = 100,
min_avg_rows_per_batch=1,
max_avg_rows_per_batch=5,
#bytes_min_per_process=None,
#bytes_usable=None
)
return test_config
def test_buffer_gpkg(tmpdir):
# Buffer polygon source to test dir
input_path = test_helper.TestFiles.polygons_parcels_gpkg
output_path = Path(tmpdir) / 'polygons_parcels-output.gpkg'
basetest_buffer(input_path, output_path, GeometryType.MULTIPOLYGON)
# Buffer point source to test dir
input_path = test_helper.TestFiles.points_gpkg
output_path = Path(tmpdir) / 'points-output.gpkg'
basetest_buffer(input_path, output_path, GeometryType.MULTIPOINT)
# Buffer line source to test dir
input_path = test_helper.TestFiles.linestrings_rows_of_trees_gpkg
output_path = Path(tmpdir) / 'linestrings_rows_of_trees-output.gpkg'
basetest_buffer(input_path, output_path, GeometryType.MULTILINESTRING)
def test_buffer_shp(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_shp
output_path = Path(tmpdir) / 'polygons_parcels-output.shp'
basetest_buffer(input_path, output_path, GeometryType.MULTIPOLYGON)
def basetest_buffer(
input_path: Path,
output_path: Path,
input_geometry_type: GeometryType):
layerinfo_input = geofile.get_layerinfo(input_path)
### Test positive buffer ###
geofileops_gpd.buffer(
input_path=input_path,
output_path=output_path,
distance=1,
nb_parallel=get_nb_parallel())
# Now check if the output file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_input.featurecount == layerinfo_output.featurecount
assert len(layerinfo_output.columns) == len(layerinfo_input.columns)
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Read result for some more detailed checks
output_gdf = geofile.read_file(output_path)
assert output_gdf['geometry'][0] is not None
### Test negative buffer ###
output_path = output_path.parent / f"{output_path.stem}_m10m{output_path.suffix}"
geofileops_gpd.buffer(
input_path=input_path,
output_path=output_path,
distance=-10,
nb_parallel=get_nb_parallel())
# Now check if the output file is correctly created
if input_geometry_type in [GeometryType.MULTIPOINT, GeometryType.MULTILINESTRING]:
# A Negative buffer of points or linestrings doesn't give a result.
assert output_path.exists() == False
else:
# A Negative buffer of polygons gives a result for large polygons.
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert len(layerinfo_output.columns) == len(layerinfo_input.columns)
# 7 polygons disappear because of the negative buffer
assert layerinfo_output.featurecount == layerinfo_input.featurecount - 7
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Read result for some more detailed checks
output_gdf = geofile.read_file(output_path)
assert output_gdf['geometry'][0] is not None
### Test negative buffer with explodecollections ###
output_path = output_path.parent / f"{output_path.stem}_m10m_explode{output_path.suffix}"
geofileops_gpd.buffer(
input_path=input_path,
output_path=output_path,
distance=-10,
explodecollections=True,
nb_parallel=get_nb_parallel())
# Now check if the output file is correctly created
if input_geometry_type in [GeometryType.MULTIPOINT, GeometryType.MULTILINESTRING]:
# A Negative buffer of points or linestrings doesn't give a result.
assert output_path.exists() == False
else:
# A Negative buffer of polygons gives a result for large polygons
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert len(layerinfo_output.columns) == len(layerinfo_input.columns)
# 6 polygons disappear because of the negative buffer, 3 polygons are
# split in 2 because of the negative buffer and/or explodecollections=True.
assert layerinfo_output.featurecount == layerinfo_input.featurecount - 7 + 3
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Read result for some more detailed checks
output_gdf = geofile.read_file(output_path)
assert output_gdf['geometry'][0] is not None
def test_buffer_various_options_gpkg(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_gpkg
output_path = Path(tmpdir) / 'polygons_parcels-output.gpkg'
basetest_buffer_various_options(input_path, output_path)
def test_buffer_various_options_shp(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_shp
output_path = Path(tmpdir) / 'polygons_parcels-output.shp'
basetest_buffer_various_options(input_path, output_path)
def basetest_buffer_various_options(input_path, output_path):
### Check if columns parameter works (case insensitive) ###
columns = ['OIDN', 'uidn', 'HFDTLT', 'lblhfdtlt', 'GEWASGROEP', 'lengte', 'OPPERVL']
geofileops_gpd.buffer(
input_path=input_path,
columns=columns,
output_path=output_path,
distance=1,
nb_parallel=get_nb_parallel())
# Now check if the tmp file is correctly created
layerinfo_orig = geofile.get_layerinfo(input_path)
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_orig.featurecount == layerinfo_output.featurecount
assert 'OIDN' in layerinfo_output.columns
assert 'UIDN' in layerinfo_output.columns
assert len(layerinfo_output.columns) == len(columns)
# Read result for some more detailed checks
output_gdf = geofile.read_file(output_path)
assert output_gdf['geometry'][0] is not None
### Check if ... parameter works ###
# TODO: increase test coverage of other options...
def test_convexhull_gpkg(tmpdir):
# Select some data from input to output file
input_path = test_helper.TestFiles.polygons_parcels_gpkg
output_path = Path(tmpdir) / 'polygons_parcels-output.gpkg'
basetest_convexhull(input_path, output_path)
def test_convexhull_shp(tmpdir):
# Select some data from input to output file
input_path = test_helper.TestFiles.polygons_parcels_shp
output_path = Path(tmpdir) / 'polygons_parcels-output.shp'
basetest_convexhull(input_path, output_path)
def basetest_convexhull(input_path, output_path):
layerinfo_orig = geofile.get_layerinfo(input_path)
geofileops_gpd.convexhull(
input_path=input_path,
output_path=output_path,
nb_parallel=get_nb_parallel())
# Now check if the output file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_orig.featurecount == layerinfo_output.featurecount
assert len(layerinfo_orig.columns) == len(layerinfo_output.columns)
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Read result for some more detailed checks
output_gdf = geofile.read_file(output_path)
assert output_gdf['geometry'][0] is not None
def test_dissolve_linestrings_nogroupby_gpkg(tmpdir):
# Apply operation
input_path = test_helper.TestFiles.linestrings_watercourses_gpkg
output_path = Path(tmpdir) / 'linestrings_watercourses-output.gpkg'
basetest_dissolve_linestrings_nogroupby(input_path, output_path)
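# Helper: dissolve the input linestrings without groupby columns, once with and
# once without explodecollections, and check the resulting files.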
def basetest_dissolve_linestrings_nogroupby(input_path, output_basepath):
# Apply dissolve with explodecollections
output_path = (output_basepath.parent /
f"{output_basepath.stem}_expl{output_basepath.suffix}")
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
explodecollections=True,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config())
# Check if the result file is correctly created
assert output_path.exists() == True
layerinfo_orig = geofile.get_layerinfo(input_path)
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 85
assert layerinfo_output.geometrytype is GeometryType.LINESTRING
assert len(layerinfo_output.columns) >= 0
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
# Apply dissolve without explodecollections
output_path = (output_basepath.parent /
f"{output_basepath.stem}_noexpl{output_basepath.suffix}")
# explodecollections=False only supported if
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
explodecollections=False,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config())
# Check if the result file is correctly created
assert output_path.exists() == True
layerinfo_orig = geofile.get_layerinfo(input_path)
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 1
assert layerinfo_output.geometrytype is layerinfo_orig.geometrytype
assert len(layerinfo_output.columns) >= 0
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
def test_dissolve_polygons_groupby_gpkg(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_gpkg
output_path = Path(tmpdir) / 'polygons_parcels-output.gpkg'
basetest_dissolve_polygons_groupby(input_path, output_path)
def test_dissolve_polygons_groupby_shp(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_shp
output_path = Path(tmpdir) / 'polygons_parcels-output.shp'
basetest_dissolve_polygons_groupby(input_path, output_path)
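# Helper: dissolve the input polygons grouped on GEWASGROEP, with different
# combinations of explodecollections, columns and output_layer.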
def basetest_dissolve_polygons_groupby(
input_path: Path,
output_basepath: Path):
# Init
layerinfo_input = geofile.get_layerinfo(input_path)
### Test dissolve polygons with groupby + without explodecollections ###
output_path = output_basepath.parent / f"{output_basepath.stem}_group{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
groupby_columns=['GEWASGROEP'],
explodecollections=False,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config())
# Now check if the tmp file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 6
assert len(layerinfo_output.columns) == 1
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test dissolve polygons with explodecollections ###
output_path = output_basepath.parent / f"{output_basepath.stem}_group_explode{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
groupby_columns=['GEWASGROEP'],
explodecollections=True,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config())
# Now check if the tmp file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 25
assert len(layerinfo_output.columns) == 1
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test dissolve polygons with explodecollections + all columns ###
output_path = output_basepath.parent / f"{output_basepath.stem}_group_explode_allcolumns{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
groupby_columns=['GEWASGROEP'],
columns=None,
explodecollections=True,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config())
# Now check if the tmp file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 25
assert len(layerinfo_output.columns) == len(layerinfo_input.columns)
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test dissolve polygons with specified output layer ###
# A different output layer is not supported for shapefile!!!
try:
output_path = output_basepath.parent / f"{output_basepath.stem}_group_outputlayer{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
groupby_columns=['GEWASGROEP'],
output_layer='banana',
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config())
except Exception as ex:
# A different output_layer is not supported for shapefile, so normal
# that an exception is thrown!
assert output_path.suffix.lower() == '.shp'
# Now check if the tmp file is correctly created
if output_path.suffix.lower() != '.shp':
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 25
assert len(layerinfo_output.columns) == 1
assert layerinfo_output.name == 'banana'
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
def test_dissolve_polygons_nogroupby_gpkg(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_gpkg
output_basepath = Path(tmpdir) / 'polygons_parcels-output.gpkg'
basetest_dissolve_polygons_nogroupby(input_path, output_basepath)
def test_dissolve_polygons_nogroupby_shp(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_shp
output_basepath = Path(tmpdir) / 'polygons_parcels-output.shp'
basetest_dissolve_polygons_nogroupby(input_path, output_basepath)
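# Helper: dissolve the input polygons without groupby columns, with
# explodecollections True/False and with a specific output_layer.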
def basetest_dissolve_polygons_nogroupby(
input_path: Path,
output_basepath: Path):
# Init
layerinfo_input = geofile.get_layerinfo(input_path)
### Test dissolve polygons with explodecollections=True (= default) ###
output_path = output_basepath.parent / f"{output_basepath.stem}_defaults{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config(),
force=True)
# Now check if the result file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 23
if output_basepath.suffix == '.shp':
# Shapefile always has an FID field
# TODO: think about whether this should also be the case for geopackage???
assert len(layerinfo_output.columns) == 1
else:
assert len(layerinfo_output.columns) == 0
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test dissolve polygons with explodecollections=False ###
output_path = output_basepath.parent / f"{output_basepath.stem}_noexpl{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
explodecollections=False,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config(),
force=True)
# Now check if the result file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 1
if output_basepath.suffix == '.shp':
# Shapefile always has an FID field
# TODO: think about whether this should also be the case for geopackage???
assert len(layerinfo_output.columns) == 1
else:
assert len(layerinfo_output.columns) == 0
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test dissolve polygons, with output_layer ###
# A different output layer is not supported for shapefile!!!
try:
output_path = output_basepath.parent / f"{output_basepath.stem}_outputlayer{output_basepath.suffix}"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
output_layer='banana',
explodecollections=True,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config(),
force=True)
except Exception as ex:
# A different output_layer is not supported for shapefile, so normal
# that an exception is thrown!
assert output_path.suffix.lower() == '.shp'
# Now check if the result file is correctly created
if output_path.suffix.lower() != '.shp':
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 23
assert len(layerinfo_output.columns) == 0
if output_basepath.suffix == '.shp':
# Shapefile doesn't support specifying an output_layer
assert layerinfo_output.name == output_path.stem
else:
assert layerinfo_output.name == 'banana'
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
def test_dissolve_multisinglepolygons_gpkg(tmpdir):
# Test to check if it is handled well that a file that results in single
# and multipolygons during dissolve is treated correctly, as geopackage
# doesn't support single and multi-polygons in one layer.
# Init
tmpdir = Path(tmpdir)
# Create test data
input_gdf = gpd.GeoDataFrame(geometry=[test_helper.TestData.polygon, test_helper.TestData.multipolygon])
input_path = tmpdir / 'test_polygon_input.gpkg'
geofile.to_file(input_gdf, input_path)
output_path = tmpdir / f"{input_path.stem}_diss.gpkg"
geofileops_gpd.dissolve(
input_path=input_path,
output_path=output_path,
explodecollections=True,
nb_squarish_tiles=2,
nb_parallel=get_nb_parallel(),
parallelization_config=get_parallelization_config(),
force=True)
# Now check if the result file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_output.featurecount == 3
assert len(layerinfo_output.columns) == 0
# Check geometry type
assert layerinfo_output.geometrytype == GeometryType.MULTIPOLYGON
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
def test_simplify_gpkg(tmpdir):
# Simplify polygon source to test dir
input_path = test_helper.TestFiles.polygons_parcels_gpkg
output_path = Path(tmpdir) / 'polygons_parcels-output.gpkg'
basetest_simplify(input_path, output_path, GeometryType.MULTIPOLYGON)
# Simplify point source to test dir
input_path = test_helper.TestFiles.points_gpkg
output_path = Path(tmpdir) / 'points-output.gpkg'
basetest_simplify(input_path, output_path, GeometryType.MULTIPOINT)
# Simplify line source to test dir
input_path = test_helper.TestFiles.linestrings_rows_of_trees_gpkg
output_path = Path(tmpdir) / 'linestrings_rows_of_trees-output.gpkg'
basetest_simplify(input_path, output_path, GeometryType.MULTILINESTRING)
def test_simplify_shp(tmpdir):
# Buffer to test dir
input_path = test_helper.TestFiles.polygons_parcels_shp
output_path = Path(tmpdir) / 'polygons_parcels-output.shp'
basetest_simplify(input_path, output_path, GeometryType.MULTIPOLYGON)
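# Helper: simplify the input file with the rdp (default), visvalingam-whyatt and
# lang algorithms and check that featurecount, columns and geometry type are kept.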
def basetest_simplify(
input_path: Path,
output_path: Path,
expected_output_geometrytype: GeometryType):
### Test default algorithm, rdp ###
layerinfo_orig = geofile.get_layerinfo(input_path)
geofileops_gpd.simplify(
input_path=input_path,
output_path=output_path,
tolerance=5,
nb_parallel=get_nb_parallel())
# Now check if the tmp file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_orig.featurecount == layerinfo_output.featurecount
assert len(layerinfo_orig.columns) == len(layerinfo_output.columns)
# Check geometry type
assert layerinfo_output.geometrytype == expected_output_geometrytype
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test vw (visvalingam-whyatt) algorithm ###
layerinfo_orig = geofile.get_layerinfo(input_path)
geofileops_gpd.simplify(
input_path=input_path,
output_path=output_path,
tolerance=5,
algorithm=SimplifyAlgorithm.VISVALINGAM_WHYATT,
nb_parallel=get_nb_parallel())
# Now check if the tmp file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_orig.featurecount == layerinfo_output.featurecount
assert len(layerinfo_orig.columns) == len(layerinfo_output.columns)
# Check geometry type
assert layerinfo_output.geometrytype == expected_output_geometrytype
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
### Test lang algorithm ###
layerinfo_orig = geofile.get_layerinfo(input_path)
geofileops_gpd.simplify(
input_path=input_path,
output_path=output_path,
tolerance=5,
algorithm=SimplifyAlgorithm.LANG,
lookahead=8,
nb_parallel=get_nb_parallel())
# Now check if the tmp file is correctly created
assert output_path.exists() == True
layerinfo_output = geofile.get_layerinfo(output_path)
assert layerinfo_orig.featurecount == layerinfo_output.featurecount
assert len(layerinfo_orig.columns) == len(layerinfo_output.columns)
# Check geometry type
assert layerinfo_output.geometrytype == expected_output_geometrytype
# Now check the contents of the result file
input_gdf = geofile.read_file(input_path)
output_gdf = geofile.read_file(output_path)
assert input_gdf.crs == output_gdf.crs
assert len(output_gdf) == layerinfo_output.featurecount
assert output_gdf['geometry'][0] is not None
if __name__ == '__main__':
# Init
tmpdir = test_helper.init_test_for_debug(Path(__file__).stem)
# Run
#test_buffer_gpkg(tmpdir)
#test_buffer_various_options_gpkg(tmpdir)
#test_dissolve_linestrings_nogroupby_gpkg(tmpdir)
#test_dissolve_linestrings_nogroupby_shp(tmpdir)
test_dissolve_polygons_groupby_gpkg(tmpdir)
#test_dissolve_polygons_groupby_shp(tmpdir)
#test_dissolve_polygons_nogroupby_gpkg(tmpdir)
#test_dissolve_polygons_nogroupby_shp(tmpdir)
#test_dissolve_multisinglepolygons_gpkg(tmpdir)
#test_simplify_gpkg(tmpdir)
| 41.58841 | 117 | 0.721105 | 3,412 | 27,989 | 5.637456 | 0.072392 | 0.076943 | 0.041487 | 0.034572 | 0.888017 | 0.876215 | 0.834676 | 0.814453 | 0.793813 | 0.760853 | 0 | 0.003816 | 0.204187 | 27,989 | 672 | 118 | 41.650298 | 0.859747 | 0.184573 | 0 | 0.800459 | 0 | 0 | 0.063556 | 0.048784 | 0 | 0 | 0 | 0.001488 | 0.286697 | 1 | 0.052752 | false | 0 | 0.020642 | 0.002294 | 0.077982 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
913167234d37181b5677548e5fba35a40becf5ba | 19,522 | py | Python | openstack_dashboard/dashboards/project/images/tests.py | Hodorable/0602 | 3b1e4cb7458e4f456bfebc52fc2902205c36cc15 | [
"Apache-2.0"
] | null | null | null | openstack_dashboard/dashboards/project/images/tests.py | Hodorable/0602 | 3b1e4cb7458e4f456bfebc52fc2902205c36cc15 | [
"Apache-2.0"
] | null | null | null | openstack_dashboard/dashboards/project/images/tests.py | Hodorable/0602 | 3b1e4cb7458e4f456bfebc52fc2902205c36cc15 | [
"Apache-2.0"
] | null | null | null | # Copyright 2012 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Copyright 2012 Nebula, Inc.
# Copyright 2012 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from socket import timeout as socket_timeout # noqa
from django.core.urlresolvers import reverse
from django import http
from mox import IsA # noqa
from horizon import exceptions
from openstack_dashboard import api
from openstack_dashboard.dashboards.project.images import utils
from openstack_dashboard.test import helpers as test
INDEX_URL = reverse('horizon:project:images:index')
CREATE_URL = reverse('horizon:project:images:images:create')
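# Tests for the images index view: the rendered table and the row actions that
# are available depending on image status and ownership.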
class ImagesAndSnapshotsTests(test.TestCase):
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_index(self):
images = self.images.list()
api.glance.image_list_detailed(IsA(http.HttpRequest),
marker=None).AndReturn([images,
False, False])
self.mox.ReplayAll()
res = self.client.get(INDEX_URL)
self.assertTemplateUsed(res, 'project/images/index.html')
self.assertContains(res, 'help_text="Deleted images'
' are not recoverable."')
self.assertIn('images_table', res.context)
images_table = res.context['images_table']
images = images_table.data
self.assertEqual(len(images), 3)
row_actions = images_table.get_row_actions(images[0])
self.assertEqual(len(row_actions), 3)
row_actions = images_table.get_row_actions(images[1])
self.assertEqual(len(row_actions), 2)
self.assertTrue('delete_image' not in
[a.name for a in row_actions])
row_actions = images_table.get_row_actions(images[2])
self.assertEqual(len(row_actions), 3)
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_index_no_images(self):
api.glance.image_list_detailed(IsA(http.HttpRequest),
marker=None).AndReturn([(),
False, False])
self.mox.ReplayAll()
res = self.client.get(INDEX_URL)
self.assertTemplateUsed(res, 'project/images/index.html')
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_index_error(self):
api.glance.image_list_detailed(IsA(http.HttpRequest),
marker=None) \
.AndRaise(self.exceptions.glance)
self.mox.ReplayAll()
res = self.client.get(INDEX_URL)
self.assertTemplateUsed(res, 'project/images/index.html')
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_snapshot_actions(self):
snapshots = self.snapshots.list()
api.glance.image_list_detailed(IsA(http.HttpRequest), marker=None) \
.AndReturn([snapshots, False, False])
self.mox.ReplayAll()
res = self.client.get(INDEX_URL)
self.assertTemplateUsed(res, 'project/images/index.html')
self.assertIn('images_table', res.context)
snaps = res.context['images_table']
self.assertEqual(len(snaps.get_rows()), 3)
row_actions = snaps.get_row_actions(snaps.data[0])
# first instance - status active, owned
self.assertEqual(len(row_actions), 4)
self.assertEqual(row_actions[0].verbose_name, u"Launch Instance")
self.assertEqual(row_actions[1].verbose_name, u"Create Volume")
self.assertEqual(row_actions[2].verbose_name, u"Edit Image")
self.assertEqual(row_actions[3].verbose_name, u"Delete Image")
row_actions = snaps.get_row_actions(snaps.data[1])
# second instance - status active, not owned
self.assertEqual(len(row_actions), 2)
self.assertEqual(row_actions[0].verbose_name, u"Launch Instance")
self.assertEqual(row_actions[1].verbose_name, u"Create Volume")
row_actions = snaps.get_row_actions(snaps.data[2])
# third instance - status queued, only delete is available
self.assertEqual(len(row_actions), 1)
self.assertEqual(unicode(row_actions[0].verbose_name),
u"Delete Image")
self.assertEqual(str(row_actions[0]), "<DeleteImage: delete>")
class ImagesAndSnapshotsUtilsTests(test.TestCase):
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_list_image(self):
public_images = [image for image in self.images.list()
if image.status == 'active' and image.is_public]
private_images = [image for image in self.images.list()
if (image.status == 'active' and
not image.is_public)]
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'is_public': True, 'status': 'active'}) \
.AndReturn([public_images, False, False])
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'property-owner_id': self.tenant.id,
'status': 'active'}) \
.AndReturn([private_images, False, False])
self.mox.ReplayAll()
ret = utils.get_available_images(self.request, self.tenant.id)
expected_images = [image for image in self.images.list()
if (image.status == 'active' and
image.container_format not in ('ami', 'aki'))]
self.assertEqual(len(expected_images), len(ret))
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_list_image_using_cache(self):
public_images = [image for image in self.images.list()
if image.status == 'active' and image.is_public]
private_images = [image for image in self.images.list()
if (image.status == 'active' and
not image.is_public)]
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'is_public': True, 'status': 'active'}) \
.AndReturn([public_images, False, False])
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'property-owner_id': self.tenant.id,
'status': 'active'}) \
.AndReturn([private_images, False, False])
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'property-owner_id': 'other-tenant',
'status': 'active'}) \
.AndReturn([private_images, False, False])
self.mox.ReplayAll()
expected_images = [image for image in self.images.list()
if (image.status == 'active' and
image.container_format not in ('ari', 'aki'))]
images_cache = {}
ret = utils.get_available_images(self.request, self.tenant.id,
images_cache)
self.assertEqual(len(expected_images), len(ret))
self.assertEqual(
len(public_images),
len(images_cache['public_images']))
self.assertEqual(1, len(images_cache['images_by_project']))
self.assertEqual(
len(private_images),
len(images_cache['images_by_project'][self.tenant.id]))
ret = utils.get_available_images(self.request, self.tenant.id,
images_cache)
self.assertEqual(len(expected_images), len(ret))
# image list for other-tenant
ret = utils.get_available_images(self.request, 'other-tenant',
images_cache)
self.assertEqual(len(expected_images), len(ret))
self.assertEqual(
len(public_images),
len(images_cache['public_images']))
self.assertEqual(2, len(images_cache['images_by_project']))
self.assertEqual(
len(private_images),
len(images_cache['images_by_project']['other-tenant']))
@test.create_stubs({api.glance: ('image_list_detailed',),
exceptions: ('handle',)})
def test_list_image_error_public_image_list(self):
public_images = [image for image in self.images.list()
if image.status == 'active' and image.is_public]
private_images = [image for image in self.images.list()
if (image.status == 'active' and
not image.is_public)]
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'is_public': True, 'status': 'active'}) \
.AndRaise(self.exceptions.glance)
exceptions.handle(IsA(http.HttpRequest),
"Unable to retrieve public images.")
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'property-owner_id': self.tenant.id,
'status': 'active'}) \
.AndReturn([private_images, False, False])
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'is_public': True, 'status': 'active'}) \
.AndReturn([public_images, False, False])
self.mox.ReplayAll()
images_cache = {}
ret = utils.get_available_images(self.request, self.tenant.id,
images_cache)
expected_images = [image for image in private_images
if image.container_format not in ('ami', 'aki')]
self.assertEqual(len(expected_images), len(ret))
self.assertNotIn('public_images', images_cache)
self.assertEqual(1, len(images_cache['images_by_project']))
self.assertEqual(
len(private_images),
len(images_cache['images_by_project'][self.tenant.id]))
ret = utils.get_available_images(self.request, self.tenant.id,
images_cache)
expected_images = [image for image in self.images.list()
if image.container_format not in ('ami', 'aki')]
self.assertEqual(len(expected_images), len(ret))
self.assertEqual(
len(public_images),
len(images_cache['public_images']))
self.assertEqual(1, len(images_cache['images_by_project']))
self.assertEqual(
len(private_images),
len(images_cache['images_by_project'][self.tenant.id]))
@test.create_stubs({api.glance: ('image_list_detailed',),
exceptions: ('handle',)})
def test_list_image_error_private_image_list(self):
public_images = [image for image in self.images.list()
if image.status == 'active' and image.is_public]
private_images = [image for image in self.images.list()
if (image.status == 'active' and
not image.is_public)]
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'is_public': True, 'status': 'active'}) \
.AndReturn([public_images, False, False])
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'property-owner_id': self.tenant.id,
'status': 'active'}) \
.AndRaise(self.exceptions.glance)
exceptions.handle(IsA(http.HttpRequest),
"Unable to retrieve images for the current project.")
api.glance.image_list_detailed(
IsA(http.HttpRequest),
filters={'property-owner_id': self.tenant.id,
'status': 'active'}) \
.AndReturn([private_images, False, False])
self.mox.ReplayAll()
images_cache = {}
ret = utils.get_available_images(self.request, self.tenant.id,
images_cache)
expected_images = [image for image in public_images
if image.container_format not in ('ami', 'aki')]
self.assertEqual(len(expected_images), len(ret))
self.assertEqual(
len(public_images),
len(images_cache['public_images']))
self.assertFalse(len(images_cache['images_by_project']))
ret = utils.get_available_images(self.request, self.tenant.id,
images_cache)
expected_images = [image for image in self.images.list()
if image.container_format not in ('ami', 'aki')]
self.assertEqual(len(expected_images), len(ret))
self.assertEqual(
len(public_images),
len(images_cache['public_images']))
self.assertEqual(1, len(images_cache['images_by_project']))
self.assertEqual(
len(private_images),
len(images_cache['images_by_project'][self.tenant.id]))
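# Selenium tests: check that the disk format dropdown switches to ISO when an
# image URL or file path ending in .iso is entered in the create image form.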
class SeleniumTests(test.SeleniumTestCase):
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_modal_create_image_from_url(self):
driver = self.selenium
images = self.images.list()
api.glance.image_list_detailed(IsA(http.HttpRequest),
marker=None).AndReturn([images,
False, False])
filters = {'disk_format': 'aki'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
filters = {'disk_format': 'ari'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
self.mox.ReplayAll()
driver.get("%s%s" % (self.live_server_url, INDEX_URL))
# Open the modal menu
driver.find_element_by_id("images__action_create").send_keys("\n")
wait = self.ui.WebDriverWait(self.selenium, 10,
ignored_exceptions=[socket_timeout])
wait.until(lambda x: driver.find_element_by_id("id_disk_format"))
srctypes = self.ui.Select(driver.find_element_by_id("id_source_type"))
srctypes.select_by_value("url")
copyfrom = driver.find_element_by_id("id_image_url")
copyfrom.send_keys("http://www.test.com/test.iso")
formats = self.ui.Select(driver.find_element_by_id("id_disk_format"))
body = formats.first_selected_option
self.assertTrue("ISO" in body.text,
"ISO should be selected when the extension is *.iso")
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_modal_create_image_from_file(self):
driver = self.selenium
images = self.images.list()
api.glance.image_list_detailed(IsA(http.HttpRequest),
marker=None).AndReturn([images,
False, False])
filters = {'disk_format': 'aki'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
filters = {'disk_format': 'ari'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
self.mox.ReplayAll()
driver.get("%s%s" % (self.live_server_url, INDEX_URL))
# Open the modal menu
driver.find_element_by_id("images__action_create").send_keys("\n")
wait = self.ui.WebDriverWait(driver, 10,
ignored_exceptions=[socket_timeout])
wait.until(lambda x: driver.find_element_by_id("id_disk_format"))
srctypes = self.ui.Select(driver.find_element_by_id("id_source_type"))
srctypes.select_by_value("file")
driver.find_element_by_id("id_image_file").send_keys("/tmp/test.iso")
formats = self.ui.Select(driver.find_element_by_id("id_disk_format"))
body = formats.first_selected_option
self.assertTrue("ISO" in body.text,
"ISO should be selected when the extension is *.iso")
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_create_image_from_url(self):
driver = self.selenium
filters = {'disk_format': 'aki'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
filters = {'disk_format': 'ari'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
self.mox.ReplayAll()
driver.get("%s%s" % (self.live_server_url, CREATE_URL))
wait = self.ui.WebDriverWait(driver, 10,
ignored_exceptions=[socket_timeout])
wait.until(lambda x: driver.find_element_by_id("id_disk_format"))
srctypes = self.ui.Select(driver.find_element_by_id("id_source_type"))
srctypes.select_by_value("url")
copyfrom = driver.find_element_by_id("id_image_url")
copyfrom.send_keys("http://www.test.com/test.iso")
formats = self.ui.Select(driver.find_element_by_id("id_disk_format"))
body = formats.first_selected_option
self.assertTrue("ISO" in body.text,
"ISO should be selected when the extension is *.iso")
@test.create_stubs({api.glance: ('image_list_detailed',)})
def test_create_image_from_file(self):
driver = self.selenium
filters = {'disk_format': 'aki'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
filters = {'disk_format': 'ari'}
api.glance.image_list_detailed(
IsA(http.HttpRequest), filters=filters).AndReturn(
[self.images.list(), False, False])
self.mox.ReplayAll()
driver.get("%s%s" % (self.live_server_url, CREATE_URL))
wait = self.ui.WebDriverWait(driver, 10,
ignored_exceptions=[socket_timeout])
wait.until(lambda x: driver.find_element_by_id("id_disk_format"))
srctypes = self.ui.Select(driver.find_element_by_id("id_source_type"))
srctypes.select_by_value("file")
driver.find_element_by_id("id_image_file").send_keys("/tmp/test.iso")
formats = self.ui.Select(driver.find_element_by_id("id_disk_format"))
body = formats.first_selected_option
self.assertTrue("ISO" in body.text,
"ISO should be selected when the extension is *.iso")
| 45.08545 | 79 | 0.605471 | 2,228 | 19,522 | 5.092908 | 0.105925 | 0.031726 | 0.045651 | 0.058694 | 0.846039 | 0.832379 | 0.808848 | 0.805587 | 0.791046 | 0.78382 | 0 | 0.003629 | 0.280043 | 19,522 | 432 | 80 | 45.189815 | 0.8037 | 0.0502 | 0 | 0.804035 | 0 | 0 | 0.112395 | 0.011126 | 0 | 0 | 0 | 0 | 0.152738 | 1 | 0.034582 | false | 0 | 0.023055 | 0 | 0.066282 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
914c7280473ea824eaa14ea1b825175c3ff6af56 | 9,846 | py | Python | dashboard/settings.py | Don-Joel/MyDash | 8c556c451752c860426a061c230f524e77afcb6f | [
"MIT"
] | null | null | null | dashboard/settings.py | Don-Joel/MyDash | 8c556c451752c860426a061c230f524e77afcb6f | [
"MIT"
] | null | null | null | dashboard/settings.py | Don-Joel/MyDash | 8c556c451752c860426a061c230f524e77afcb6f | [
"MIT"
] | null | null | null | import os
if 'TRAVIS' in os.environ:
# os.environ values are strings, so compare against the string 'True'
if os.environ.get('IS_HEROKU') == 'True':
import django_heroku
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.0/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = 'xy4(+@z$0sea7g=i#%w+^u5c3dlk2m7!e3h0dm5nj!=y!tpsio'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = [
"todo.apps.TodoConfig",
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'allauth', # new
'allauth.account', # new
'allauth.socialaccount', # new
'allauth.socialaccount.providers.google',
'django.contrib.sites',
'pages',
'users',
'main',
'social_django',
'weather',
'calendarApp.apps.CalendarappConfig',
'gpa.apps.GpaConfig',
'bootstrap4', # for gpa module
]
AUTHENTICATION_BACKENDS = (
"django.contrib.auth.backends.ModelBackend",
"allauth.account.auth_backends.AuthenticationBackend",
'social_core.backends.twitter.TwitterOAuth',
'social_core.backends.facebook.FacebookOAuth2',
)
SITE_ID = 1
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_USERNAME_REQUIRED = False
AUTH_USER_MODEL = 'users.CustomUser'
LOGIN_REDIRECT_URL = 'home'
LOGOUT_REDIRECT_URL = 'home'
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
'social_django.middleware.SocialAuthExceptionMiddleware',
]
ROOT_URLCONF = 'dashboard.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS':[os.path.join(BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
'social_django.context_processors.backends',
'social_django.context_processors.login_redirect',
],
},
},
]
WSGI_APPLICATION = 'dashboard.wsgi.application'
# Database
# https://docs.djangoproject.com/en/3.0/ref/settings/#databases
if os.environ.get('IS_HEROKU') == 'True':
import dj_database_url
DATABASES = {'default': dj_database_url.config(conn_max_age=600, ssl_require=True)}
else:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
# Password validation
# https://docs.djangoproject.com/en/3.0/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/3.0/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
#Facebook API LOGIN OATH
SOCIAL_AUTH_FACEBOOK_KEY = '259905795014698' # App ID
SOCIAL_AUTH_FACEBOOK_SECRET = '2c511f8bf96aa396f45a16b3c0823467' # App Secret
#Twitter API Login Oath
SOCIAL_AUTH_TWITTER_KEY = 'URi9HvMcXFx5kyhUOHKZIaEaX'
SOCIAL_AUTH_TWITTER_SECRET = 'v70G2LuSYT7092V4ZS9Uu6iaTZB9RXRLfSFsC9Hf2FS99VG2Y8'
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.0/howto/static-files/
STATIC_URL = '/static/'
STATICFILES_DIRS = [
os.path.join(BASE_DIR, "static")
]
STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")
# Activate Django-Heroku.
if os.environ.get('IS_HEROKU') == 'True':
django_heroku.settings(locals())
else:
import django_heroku
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.0/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
# Moved to DB section
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = [
"todo.apps.TodoConfig",
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'allauth', # new
'allauth.account', # new
'allauth.socialaccount', # new
'allauth.socialaccount.providers.google',
'django.contrib.sites',
'pages',
'users',
'main',
'social_django',
'weather',
'calendarApp.apps.CalendarappConfig',
'gpa.apps.GpaConfig',
'bootstrap4', # for gpa module
]
AUTHENTICATION_BACKENDS = (
"django.contrib.auth.backends.ModelBackend",
"allauth.account.auth_backends.AuthenticationBackend",
'social_core.backends.twitter.TwitterOAuth',
'social_core.backends.facebook.FacebookOAuth2',
)
SITE_ID = 1
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_USERNAME_REQUIRED = False
AUTH_USER_MODEL = 'users.CustomUser'
LOGIN_REDIRECT_URL = 'home'
LOGOUT_REDIRECT_URL = 'home'
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
'social_django.middleware.SocialAuthExceptionMiddleware',
]
ROOT_URLCONF = 'dashboard.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS':[os.path.join(BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
'social_django.context_processors.backends',
'social_django.context_processors.login_redirect',
],
},
},
]
WSGI_APPLICATION = 'dashboard.wsgi.application'
# Database
# https://docs.djangoproject.com/en/3.0/ref/settings/#databases
if os.environ.get('IS_HEROKU') == 'True':
SECRET_KEY = os.environ["SECRET_KEY"]
import dj_database_url
DATABASES = {'default': dj_database_url.config(conn_max_age=600, ssl_require=True)}
else:
SECRET_KEY = 'xy4(+@z$0sea7g=i#%w+^u5c3dlk2m7!e3h0dm5nj!=y!tpsio'
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
# Password validation
# https://docs.djangoproject.com/en/3.0/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/3.0/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.0/howto/static-files/
STATIC_URL = '/static/'
STATICFILES_DIRS = [
os.path.join(BASE_DIR, "static")
]
STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")
# Activate Django-Heroku.
django_heroku.settings(locals())
| 30.482972 | 95 | 0.62858 | 952 | 9,846 | 6.343487 | 0.211134 | 0.073191 | 0.045041 | 0.023183 | 0.942871 | 0.935585 | 0.935585 | 0.93128 | 0.926975 | 0.926975 | 0 | 0.016905 | 0.255027 | 9,846 | 322 | 96 | 30.57764 | 0.806408 | 0.15844 | 0 | 0.791667 | 0 | 0 | 0.451809 | 0.36186 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.046296 | 0.023148 | 0 | 0.023148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e6a100d52476bac5728974493795b0bb8700447b | 186 | py | Python | garbage/admin.py | tes002/Garbage-Collector-Trading-Platform | a2689d419556562e9e32efbe834965b965631f24 | [
"MIT"
] | 1 | 2019-01-18T23:10:12.000Z | 2019-01-18T23:10:12.000Z | garbage/admin.py | tes002/Garbage-Collector-Trading-Platform | a2689d419556562e9e32efbe834965b965631f24 | [
"MIT"
] | null | null | null | garbage/admin.py | tes002/Garbage-Collector-Trading-Platform | a2689d419556562e9e32efbe834965b965631f24 | [
"MIT"
] | null | null | null | from django.contrib import admin
from garbage.models import Garbage
from garbage.models import Watch
# Register your models here.
admin.site.register(Garbage)
admin.site.register(Watch) | 26.571429 | 34 | 0.827957 | 27 | 186 | 5.703704 | 0.444444 | 0.142857 | 0.220779 | 0.298701 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102151 | 186 | 7 | 35 | 26.571429 | 0.922156 | 0.139785 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
fc05d1db3dbf461ec6d344e6f53085db29e1d6af | 612 | py | Python | sdk/python/pulumi_aws/wafregional/__init__.py | Charliekenney23/pulumi-aws | 55bd0390160d27350b297834026fee52114a2d41 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_aws/wafregional/__init__.py | Charliekenney23/pulumi-aws | 55bd0390160d27350b297834026fee52114a2d41 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_aws/wafregional/__init__.py | Charliekenney23/pulumi-aws | 55bd0390160d27350b297834026fee52114a2d41 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2021-03-08T15:05:29.000Z | 2021-03-08T15:05:29.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
# Export this package's modules as members:
from .byte_match_set import *
from .geo_match_set import *
from .ip_set import *
from .rate_based_rule import *
from .regex_match_set import *
from .regex_pattern_set import *
from .rule import *
from .rule_group import *
from .size_constraint_set import *
from .sql_injection_match_set import *
from .web_acl import *
from .web_acl_association import *
from .xss_match_set import *
| 32.210526 | 87 | 0.761438 | 98 | 612 | 4.530612 | 0.55102 | 0.27027 | 0.204955 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001931 | 0.153595 | 612 | 18 | 88 | 34 | 0.855212 | 0.357843 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
fc2b4bc958dcdf10914e3851b55f2b942bc3278b | 24,493 | py | Python | tests/test_naive_bayes.py | m-martin-j/pomegranate | d79b5464e8d2a3678de33d2323d75f0bc4168e19 | [
"MIT"
] | 2 | 2021-05-19T00:44:38.000Z | 2022-03-28T16:56:51.000Z | tests/test_naive_bayes.py | m-martin-j/pomegranate | d79b5464e8d2a3678de33d2323d75f0bc4168e19 | [
"MIT"
] | null | null | null | tests/test_naive_bayes.py | m-martin-j/pomegranate | d79b5464e8d2a3678de33d2323d75f0bc4168e19 | [
"MIT"
] | null | null | null | from __future__ import (division)
from pomegranate import *
from nose.tools import with_setup
from nose.tools import assert_almost_equal
from nose.tools import assert_equal
from nose.tools import assert_not_equal
from nose.tools import assert_less_equal
from nose.tools import assert_raises
from nose.tools import assert_true
from numpy.testing import assert_array_equal
from numpy.testing import assert_array_almost_equal
import random
import pickle
import numpy
import numpy as np
nan = numpy.nan
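# Setup helpers: each builds a small NaiveBayes model plus matching test data as
# module-level globals; nose's with_setup runs one of them before each test.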
def setup_univariate_mixed():
normal = NormalDistribution(5, 2)
uniform = UniformDistribution(0, 10)
global model
model = NaiveBayes([normal, uniform])
global X
X = numpy.array([[5], [3], [1], [-1]])
def setup_multivariate_gaussian():
d11 = NormalDistribution(0.0, 1)
d12 = NormalDistribution(0.5, 1)
d13 = NormalDistribution(0.3, 1)
d1 = IndependentComponentsDistribution([d11, d12, d13])
d21 = NormalDistribution(1.0, 1)
d22 = NormalDistribution(1.2, 1)
d23 = NormalDistribution(1.5, 1)
d2 = IndependentComponentsDistribution([d21, d22, d23])
global model
model = NaiveBayes([d1, d2])
global X
X = numpy.array([[0.3, 0.5, 0.1],
[0.8, 1.4, 0.5],
[1.4, 2.6, 1.8],
[4.2, 3.3, 3.7],
[2.6, 3.6, 3.3]])
global y
y = [0, 0, 0, 1, 1]
global X_nan
X_nan = numpy.array([[0.3, nan, 0.1],
[nan, 1.4, nan],
[1.4, 2.6, nan],
[nan, nan, nan],
[nan, 3.6, 3.3]])
def setup_multivariate_mixed():
d11 = ExponentialDistribution(5)
d12 = LogNormalDistribution(0.5, 0.78)
d13 = PoissonDistribution(4)
d1 = IndependentComponentsDistribution([d11, d12, d13])
d21 = ExponentialDistribution(35)
d22 = LogNormalDistribution(1.8, 1.33)
d23 = PoissonDistribution(6)
d2 = IndependentComponentsDistribution([d21, d22, d23])
global model
model = NaiveBayes([d1, d2])
global X
X = numpy.array([[0.3, 0.5, 0.1],
[0.8, 1.4, 0.5],
[1.4, 2.6, 1.8],
[4.2, 3.3, 3.7],
[2.6, 3.6, 3.3]])
global y
y = [0, 0, 0, 1, 1]
global X_nan
X_nan = numpy.array([[0.3, nan, 0.1],
[nan, 1.4, nan],
[1.4, 2.6, nan],
[nan, nan, nan],
[nan, 3.6, 3.3]])
def teardown():
pass
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_initialization():
assert_equal(model.d, 1)
assert_equal(model.n, 2)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_initialization():
assert_equal(model.d, 3)
assert_equal(model.n, 2)
def test_nb_univariate_constructors():
d1 = NormalDistribution(0.5, 1)
d2 = MultivariateGaussianDistribution([0, 0], [[1, 0], [0, 1]])
d3 = IndependentComponentsDistribution([NormalDistribution(0, 1),
NormalDistribution(2, 1), NormalDistribution(3, 1)])
assert_raises(TypeError, NaiveBayes, [d1, d2])
assert_raises(TypeError, NaiveBayes, [d1, d3])
assert_raises(ValueError, NaiveBayes, [NormalDistribution])
def test_nb_multivariate_constructors():
d1 = MultivariateGaussianDistribution([0, 0], [[1, 0], [0, 1]])
d2 = IndependentComponentsDistribution([NormalDistribution(0, 1),
NormalDistribution(2, 1), NormalDistribution(3, 1)])
d3 = IndependentComponentsDistribution([NormalDistribution(0, 1),
NormalDistribution(2, 1)])
NaiveBayes([d1, d3])
assert_raises(TypeError, NaiveBayes, [d2, d3])
assert_raises(TypeError, NaiveBayes, [d2, d1])
assert_raises(ValueError, NaiveBayes, [MultivariateGaussianDistribution])
assert_raises(ValueError, NaiveBayes, [IndependentComponentsDistribution])
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_predict_log_proba():
y_hat = model.predict_log_proba(X)
y = [[-0.4063484, -1.096847],
[-0.6024268, -0.792926],
[-1.5484819, -0.238981],
[ 0.0, float('-inf')]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_predict_log_proba():
y_hat = model.predict_log_proba(X)
y = [[ -2.194303e-01, -1.624430e+00],
[ -8.00891133e-01, -5.95891133e-01],
[ -3.24475797e+00, -3.97579742e-02],
[ -8.77515454e+00, -1.54536960e-04],
[ -6.90600226e+00, -1.00225665e-03]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_predict_log_proba():
y_hat = model.predict_log_proba(X)
y = [[ -3.96979060e-05, -1.01342320e+01],
[ -1.43325352e-11, -2.49684574e+01],
[ 0.00000000e+00, -4.18889545e+01],
[ 0.00000000e+00, -1.24795606e+02],
[ 0.00000000e+00, -7.68246547e+01]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_nan_predict_log_proba():
y_hat = model.predict_log_proba(X_nan)
y = [[-0.27268481, -1.43268481],
[-0.90406199, -0.51906199],
[-2.23782228, -0.11282228],
[-0.69314718, -0.69314718],
[-4.81315536, -0.00815536]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_nan_predict_log_proba():
y_hat = model.predict_log_proba(X_nan)
y = [[ -1.21742279e-04, -9.01366508e+00],
[ -2.83092062e-01, -1.40019217e+00],
[ 0.00000000e+00, -4.06187917e+01],
[ -6.93147181e-01, -6.93147181e-01],
[ -3.80319311e-01, -1.15088421e+00]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_predict_log_proba_parallel():
y_hat = model.predict_log_proba(X, n_jobs=2)
y = [[-0.4063484, -1.096847],
[-0.6024268, -0.792926],
[-1.5484819, -0.238981],
[ 0.0, float('-inf')]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_predict_log_proba_parallel():
y_hat = model.predict_log_proba(X, n_jobs=2)
y = [[ -2.194303e-01, -1.624430e+00],
[ -8.00891133e-01, -5.95891133e-01],
[ -3.24475797e+00, -3.97579742e-02],
[ -8.77515454e+00, -1.54536960e-04],
[ -6.90600226e+00, -1.00225665e-03]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_predict_log_proba_parallel():
y_hat = model.predict_log_proba(X, n_jobs=2)
y = [[ -3.96979060e-05, -1.01342320e+01],
[ -1.43325352e-11, -2.49684574e+01],
[ 0.00000000e+00, -4.18889545e+01],
[ 0.00000000e+00, -1.24795606e+02],
[ 0.00000000e+00, -7.68246547e+01]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_predict_proba():
y_hat = model.predict_proba(X)
y = [[ 0.66607801, 0.33392199],
[ 0.54748134, 0.45251866],
[ 0.21257042, 0.78742958],
[ 1., 0. ]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_predict_proba():
y_hat = model.predict_proba(X)
y = [[ 8.02976114e-01, 1.97023886e-01],
[ 4.48928731e-01, 5.51071269e-01],
[ 3.89779969e-02, 9.61022003e-01],
[ 1.54525019e-04, 9.99845475e-01],
[ 1.00175456e-03, 9.98998245e-01]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_predict_proba():
y_hat = model.predict_proba(X)
y = [[ 9.99960303e-01, 3.96971181e-05],
[ 1.00000000e+00, 1.43329876e-11],
[ 1.00000000e+00, 6.42477904e-19],
[ 1.00000000e+00, 6.33806932e-55],
[ 1.00000000e+00, 4.31992661e-34]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_nan_predict_proba():
y_hat = model.predict_proba(X_nan)
y = [[ 0.76133271, 0.23866729],
[ 0.40492153, 0.59507847],
[ 0.10669059, 0.89330941],
[ 0.5, 0.5 ],
[ 0.00812219, 0.99187781]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_nan_predict_proba():
y_hat = model.predict_proba(X_nan)
y = [[ 9.99878265e-01, 1.21734869e-04],
[ 7.53450421e-01, 2.46549579e-01],
[ 1.00000000e+00, 2.28814158e-18],
[ 5.00000000e-01, 5.00000000e-01],
[ 6.83643080e-01, 3.16356920e-01]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_predict_proba_parallel():
y_hat = model.predict_proba(X, n_jobs=2)
y = [[ 0.66607801, 0.33392199],
[ 0.54748134, 0.45251866],
[ 0.21257042, 0.78742958],
[ 1., 0. ]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_predict_proba_parallel():
y_hat = model.predict_proba(X, n_jobs=2)
y = [[ 8.02976114e-01, 1.97023886e-01],
[ 4.48928731e-01, 5.51071269e-01],
[ 3.89779969e-02, 9.61022003e-01],
[ 1.54525019e-04, 9.99845475e-01],
[ 1.00175456e-03, 9.98998245e-01]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_predict_proba_parallel():
y_hat = model.predict_proba(X, n_jobs=2)
y = [[ 9.99960303e-01, 3.96971181e-05,],
[ 1.00000000e+00, 1.43329876e-11,],
[ 1.00000000e+00, 6.42477904e-19,],
[ 1.00000000e+00, 6.33806932e-55,],
[ 1.00000000e+00, 4.31992661e-34,]]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_predict():
y_hat = model.predict(X)
y = [0, 0, 1, 0]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_predict():
y_hat = model.predict(X)
y = [0, 1, 1, 1, 1]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_predict():
y_hat = model.predict(X)
y = [0, 0, 0, 0, 0]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_nan_predict():
y_hat = model.predict(X_nan)
y = [0, 1, 1, 0, 1]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_nan_predict():
y_hat = model.predict(X_nan)
y = [0, 0, 0, 0, 0]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_predict_parallel():
y_hat = model.predict(X, n_jobs=2)
y = [0, 0, 1, 0]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_predict_parallel():
y_hat = model.predict(X, n_jobs=2)
y = [0, 1, 1, 1, 1]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_predict_parallel():
y_hat = model.predict(X, n_jobs=2)
y = [0, 0, 0, 0, 0]
assert_array_almost_equal(y, y_hat)
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_fit():
X = np.array([5, 4, 5, 4, 6, 5, 6, 5, 4, 6, 5, 4, 0, 0,
1, 9, 8, 2, 0, 1, 1, 8, 10, 0]).reshape(-1, 1)
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1])
model.fit(X, y)
d1 = model.distributions[0]
d2 = model.distributions[1]
assert_array_almost_equal(d1.parameters, [4.916666666666667, 0.7592027982620252])
assert_array_almost_equal(d2.parameters, [0.0, 10.0])
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_fit():
model.fit(X, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [0.8333333333333334, 0.4496912521077347])
assert_array_almost_equal(d12.parameters, [1.5, 0.8602325267042628])
assert_array_almost_equal(d13.parameters, [0.7999999999999999, 0.725718035235908])
assert_array_almost_equal(d21.parameters, [3.4000000000000004, 0.7999999999999993])
assert_array_almost_equal(d22.parameters, [3.45, 0.1499999999999969])
assert_array_almost_equal(d23.parameters, [3.5, 0.19999999999999787])
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_fit():
model.fit(X, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [1.1999999920000004])
assert_array_almost_equal(d12.parameters, [0.199612167029568, 0.6799837375101412])
assert_array_almost_equal(d13.parameters, [0.7999999999999999])
assert_array_almost_equal(d21.parameters, [0.2941176574394461])
assert_array_almost_equal(d22.parameters, [1.2374281569672494, 0.04350568849481522])
assert_array_almost_equal(d23.parameters, [3.5])
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_nan_fit():
model.fit(X_nan, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [0.85, 0.55])
assert_array_almost_equal(d12.parameters, [2.0, 0.6000000000000003])
assert_array_almost_equal(d13.parameters, [0.1, 0.0])
assert_array_almost_equal(d21.parameters, [1.0, 1.0])
assert_array_almost_equal(d22.parameters, [3.6, 0.0])
assert_array_almost_equal(d23.parameters, [3.3, 0.0])
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_nan_fit():
model.fit(X_nan, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [1.1764705778546718])
assert_array_almost_equal(d12.parameters, [0.645991, 0.3095196])
assert_array_almost_equal(d13.parameters, [0.1])
assert_array_almost_equal(d21.parameters, [35.0])
assert_array_almost_equal(d22.parameters, [1.2809338454620642, 0.0])
assert_array_almost_equal(d23.parameters, [3.3])
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_fit_parallel():
X = np.array([5, 4, 5, 4, 6, 5, 6, 5, 4, 6, 5, 4, 0, 0,
1, 9, 8, 2, 0, 1, 1, 8, 10, 0]).reshape(-1, 1)
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1])
model.fit(X, y, n_jobs=2)
d1 = model.distributions[0]
d2 = model.distributions[1]
assert_array_almost_equal(d1.parameters, [4.916666666666667, 0.7592027982620252])
assert_array_almost_equal(d2.parameters, [0.0, 10.0])
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_fit_parallel():
model.fit(X, y, n_jobs=2)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [0.8333333333333334, 0.4496912521077347])
assert_array_almost_equal(d12.parameters, [1.5, 0.8602325267042628])
assert_array_almost_equal(d13.parameters, [0.7999999999999999, 0.725718035235908])
assert_array_almost_equal(d21.parameters, [3.4000000000000004, 0.7999999999999993])
assert_array_almost_equal(d22.parameters, [3.45, 0.1499999999999969])
assert_array_almost_equal(d23.parameters, [3.5, 0.19999999999999787])
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_fit_parallel():
model.fit(X, y, n_jobs=2)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [1.1999999920000004])
assert_array_almost_equal(d12.parameters, [0.199612167029568, 0.6799837375101412])
assert_array_almost_equal(d13.parameters, [0.7999999999999999])
assert_array_almost_equal(d21.parameters, [0.2941176574394461])
assert_array_almost_equal(d22.parameters, [1.2374281569672494, 0.04350568849481522])
assert_array_almost_equal(d23.parameters, [3.5])
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_from_samples():
X = np.array([5, 4, 5, 4, 6, 5, 6, 5, 4, 6, 5, 4, 0, 0,
1, 9, 8, 2, 0, 1, 1, 8, 10, 0]).reshape(-1, 1)
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1])
model = NaiveBayes.from_samples([NormalDistribution, UniformDistribution],
X, y)
d1 = model.distributions[0]
d2 = model.distributions[1]
assert_array_almost_equal(d1.parameters, [4.916666666666667, 0.7592027982620252])
assert_array_almost_equal(d2.parameters, [0.0, 10.0])
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_from_samples():
model = NaiveBayes.from_samples(NormalDistribution, X, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [0.8333333333333334, 0.4496912521077347])
assert_array_almost_equal(d12.parameters, [1.5, 0.8602325267042628])
assert_array_almost_equal(d13.parameters, [0.7999999999999999, 0.725718035235908])
assert_array_almost_equal(d21.parameters, [3.4000000000000004, 0.7999999999999993])
assert_array_almost_equal(d22.parameters, [3.45, 0.1499999999999969])
assert_array_almost_equal(d23.parameters, [3.5, 0.19999999999999787])
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_from_samples():
d = [ExponentialDistribution, LogNormalDistribution, PoissonDistribution]
model = NaiveBayes.from_samples(d, X, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [1.1999999920000004])
assert_array_almost_equal(d12.parameters, [0.199612167029568, 0.6799837375101412])
assert_array_almost_equal(d13.parameters, [0.7999999999999999])
assert_array_almost_equal(d21.parameters, [0.2941176574394461])
assert_array_almost_equal(d22.parameters, [1.2374281569672494, 0.04350568849481522])
assert_array_almost_equal(d23.parameters, [3.5])
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_nan_from_samples():
model = NaiveBayes.from_samples(NormalDistribution, X_nan, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [0.85, 0.55])
assert_array_almost_equal(d12.parameters, [2.0, 0.6000000000000003])
assert_array_almost_equal(d13.parameters, [0.1, 0.0])
assert_array_almost_equal(d21.parameters, [0.0, 1.0])
assert_array_almost_equal(d22.parameters, [3.6, 0.0])
assert_array_almost_equal(d23.parameters, [3.3, 0.0])
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_nan_from_samples():
d = [ExponentialDistribution, LogNormalDistribution, PoissonDistribution]
model = NaiveBayes.from_samples(d, X_nan, y)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [1.1764705778546718])
assert_array_almost_equal(d12.parameters, [0.645991, 0.3095196])
assert_array_almost_equal(d13.parameters, [0.1])
assert_array_almost_equal(d21.parameters, [1])
assert_array_almost_equal(d22.parameters, [1.2809338454620642, 0.0])
assert_array_almost_equal(d23.parameters, [3.3])
@with_setup(setup_univariate_mixed, teardown)
def test_nb_univariate_mixed_from_samples_parallel():
X = np.array([5, 4, 5, 4, 6, 5, 6, 5, 4, 6, 5, 4, 0, 0,
1, 9, 8, 2, 0, 1, 1, 8, 10, 0]).reshape(-1, 1)
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1])
model = NaiveBayes.from_samples([NormalDistribution, UniformDistribution],
X, y, n_jobs=2)
d1 = model.distributions[0]
d2 = model.distributions[1]
assert_array_almost_equal(d1.parameters, [4.916666666666667, 0.7592027982620252])
assert_array_almost_equal(d2.parameters, [0.0, 10.0])
@with_setup(setup_multivariate_gaussian, teardown)
def test_nb_multivariate_gaussian_from_samples_parallel():
model = NaiveBayes.from_samples(NormalDistribution, X, y, n_jobs=2)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [0.8333333333333334, 0.4496912521077347])
assert_array_almost_equal(d12.parameters, [1.5, 0.8602325267042628])
assert_array_almost_equal(d13.parameters, [0.7999999999999999, 0.725718035235908])
assert_array_almost_equal(d21.parameters, [3.4000000000000004, 0.7999999999999993])
assert_array_almost_equal(d22.parameters, [3.45, 0.1499999999999969])
assert_array_almost_equal(d23.parameters, [3.5, 0.19999999999999787])
@with_setup(setup_multivariate_mixed, teardown)
def test_nb_multivariate_mixed_from_samples_parallel():
d = [ExponentialDistribution, LogNormalDistribution, PoissonDistribution]
model = NaiveBayes.from_samples(d, X, y, n_jobs=2)
d11 = model.distributions[0].distributions[0]
d12 = model.distributions[0].distributions[1]
d13 = model.distributions[0].distributions[2]
d21 = model.distributions[1].distributions[0]
d22 = model.distributions[1].distributions[1]
d23 = model.distributions[1].distributions[2]
assert_array_almost_equal(d11.parameters, [1.1999999920000004])
assert_array_almost_equal(d12.parameters, [0.199612167029568, 0.6799837375101412])
assert_array_almost_equal(d13.parameters, [0.7999999999999999])
assert_array_almost_equal(d21.parameters, [0.2941176574394461])
assert_array_almost_equal(d22.parameters, [1.2374281569672494, 0.04350568849481522])
assert_array_almost_equal(d23.parameters, [3.5])
@with_setup(setup_univariate_mixed, teardown)
def test_raise_errors():
    # check that no errors are raised when converting the input values
model.predict_log_proba([[5]])
model.predict_log_proba([[4.5]])
model.predict_log_proba([[5], [6]])
model.predict_log_proba(np.array([[5], [6]]) )
model.predict_proba([[5]])
model.predict_proba([[4.5]])
model.predict_proba([[5], [6]])
model.predict_proba(np.array([[5], [6]]))
model.predict([[5]])
model.predict([[4.5]])
model.predict([[5], [6]])
model.predict(np.array([[5], [6]]))
@with_setup(setup_univariate_mixed, teardown)
def test_pickling():
j_univ = pickle.dumps(model)
new_univ = pickle.loads(j_univ)
assert_true(isinstance(new_univ.distributions[0], NormalDistribution))
assert_true(isinstance(new_univ.distributions[1], UniformDistribution))
assert_true(isinstance(new_univ, NaiveBayes))
numpy.testing.assert_array_equal(model.weights, new_univ.weights)
@with_setup(setup_univariate_mixed, teardown)
def test_json():
j_univ = model.to_json()
new_univ = model.from_json(j_univ)
assert_true(isinstance(new_univ.distributions[0], NormalDistribution))
assert_true(isinstance(new_univ.distributions[1], UniformDistribution))
assert_true(isinstance(new_univ, NaiveBayes))
numpy.testing.assert_array_equal( model.weights, new_univ.weights)
| 34.208101 | 85 | 0.744049 | 3,587 | 24,493 | 4.839978 | 0.061054 | 0.068429 | 0.102817 | 0.133057 | 0.911699 | 0.88981 | 0.869593 | 0.85957 | 0.847762 | 0.842002 | 0 | 0.180791 | 0.114523 | 24,493 | 715 | 86 | 34.255944 | 0.619698 | 0.001837 | 0 | 0.709259 | 0 | 0 | 0.000327 | 0 | 0 | 0 | 0 | 0 | 0.242593 | 1 | 0.094444 | false | 0.001852 | 0.025926 | 0 | 0.12037 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
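A quick way to sanity-check the expected values in the tests above: each predict_proba expectation is simply the exponential of the corresponding predict_log_proba expectation. A minimal check using the univariate mixed-model values quoted above:

import numpy as np

# values copied from test_nb_univariate_mixed_predict_log_proba_parallel and
# test_nb_univariate_mixed_predict_proba above (first sample only)
log_proba = np.array([-0.4063484, -1.096847])
proba = np.array([0.66607801, 0.33392199])
np.testing.assert_array_almost_equal(np.exp(log_proba), proba, decimal=5)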
5d9d8ffa0fef691bd027ed5b287d561e326e2384 | 1,243 | py | Python | data_helper/CIFAR.py | acmi-lab/PU_learning | a9174bda92c7411906056c789011cfa41749ee5f | [
"Apache-2.0"
] | 18 | 2021-11-04T02:26:47.000Z | 2022-03-15T04:41:18.000Z | data_helper/CIFAR.py | acmi-lab/PU_learning | a9174bda92c7411906056c789011cfa41749ee5f | [
"Apache-2.0"
] | null | null | null | data_helper/CIFAR.py | acmi-lab/PU_learning | a9174bda92c7411906056c789011cfa41749ee5f | [
"Apache-2.0"
] | 1 | 2022-01-14T03:22:37.000Z | 2022-01-14T03:22:37.000Z | import torchvision
import numpy as np
class BinarizedCifarData(torchvision.datasets.CIFAR10):
def __init__(self, root, train=True, transform=None, target_transform=None,
download=True):
super().__init__( root, train, transform, target_transform,
download)
targets = np.array(self.targets)
data = np.array(self.data)
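        # split CIFAR-10 into a positive pool (labels 0-4) and a negative pool (labels 5-9)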
p_data_idx = np.where(targets<=4)[0]
self.p_data = data[p_data_idx]
n_data_idx = np.where(targets>4)[0]
self.n_data = data[n_data_idx]
def __len__(self):
return len(self.n_data) + len(self.p_data)
class DogCatData(torchvision.datasets.CIFAR10):
def __init__(self, root, train=True, transform=None, target_transform=None,
download=True):
super().__init__( root, train, transform, target_transform,
download)
targets = np.array(self.targets)
data = np.array(self.data)
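        # positive pool: label 3 (cat); negative pool: label 5 (dog)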
p_data_idx = np.where(targets==3)[0]
self.p_data = data[p_data_idx]
n_data_idx = np.where(targets==5)[0]
self.n_data = data[n_data_idx]
def __len__(self):
return len(self.n_data) + len(self.p_data)
| 29.595238 | 79 | 0.609815 | 162 | 1,243 | 4.358025 | 0.209877 | 0.056657 | 0.062323 | 0.067989 | 0.898017 | 0.898017 | 0.898017 | 0.898017 | 0.895184 | 0.895184 | 0 | 0.013378 | 0.278359 | 1,243 | 41 | 80 | 30.317073 | 0.77369 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.071429 | 0.071429 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
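The two wrappers above only partition CIFAR-10 into positive (p_data) and negative (n_data) pools and leave sampling to downstream PU-learning code. A minimal usage sketch, assuming the module is importable as data_helper.CIFAR and that ./data is a writable download location (both illustrative):

from data_helper.CIFAR import DogCatData

dataset = DogCatData(root="./data", train=True, download=True)
# CIFAR-10 training split: 5,000 cat images (label 3) and 5,000 dog images (label 5)
print(len(dataset.p_data), len(dataset.n_data), len(dataset))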
5dbf87a9d2d9f56fe1766c71dc7f76077fa26a5f | 172 | py | Python | src/researchhub_document/models.py | ResearchHub/ResearchHub-Backend-Open | d36dca33afae2d442690694bb2ab17180d84bcd3 | [
"MIT"
] | 18 | 2021-05-20T13:20:16.000Z | 2022-02-11T02:40:18.000Z | src/researchhub_document/models.py | ResearchHub/ResearchHub-Backend-Open | d36dca33afae2d442690694bb2ab17180d84bcd3 | [
"MIT"
] | 109 | 2021-05-21T20:14:23.000Z | 2022-03-31T20:56:10.000Z | src/researchhub_document/models.py | ResearchHub/ResearchHub-Backend-Open | d36dca33afae2d442690694bb2ab17180d84bcd3 | [
"MIT"
] | 4 | 2021-05-17T13:47:53.000Z | 2022-02-12T10:48:21.000Z | # flake8: noqa
from .related_models.researchhub_post_model import ResearchhubPost
from .related_models.researchhub_unified_document_model import ResearchhubUnifiedDocument
| 43 | 89 | 0.901163 | 19 | 172 | 7.789474 | 0.684211 | 0.148649 | 0.22973 | 0.378378 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006211 | 0.063953 | 172 | 3 | 90 | 57.333333 | 0.913043 | 0.069767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5dcf58a4fed5999b0d8bf21f2f21349f330ae9da | 18,668 | py | Python | test/programytest/clients/test_config.py | RonKhondji/program-y | 422c5dea440e569469d7512d80315a731c5e35d6 | [
"MIT"
] | 345 | 2016-11-23T22:37:04.000Z | 2022-03-30T20:44:44.000Z | test/programytest/clients/test_config.py | sofi2305/Nik | e8bb4a6614c16c334cd0df3a16b30a9daac0070d | [
"MIT"
] | 275 | 2016-12-07T10:30:28.000Z | 2022-02-08T21:28:33.000Z | test/programytest/clients/test_config.py | sofi2305/Nik | e8bb4a6614c16c334cd0df3a16b30a9daac0070d | [
"MIT"
] | 159 | 2016-11-28T18:59:30.000Z | 2022-03-20T18:02:44.000Z | import unittest
from programy.clients.config import ClientConfigurationData
from programy.clients.events.console.config import ConsoleConfiguration
from programy.config.file.yaml_file import YamlConfigurationFile
from programytest.config.bot.test_bot import BotConfigurationTests
from programytest.utils.email.test_config import EmailConfigurationTests
from programytest.triggers.test_config import TriggersConfigurationTests
from programytest.clients.ping.test_config import PingResponderConfigurationTests
from programytest.storage.test_config import StorageConfigurationTests
from programytest.scheduling.test_config import SchedulerConfigurationTests
class ClientConfigurationDataTests(unittest.TestCase):
def test_with_data_single_bot(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
prompt: ">>>"
renderer: programy.clients.render.text.TextRenderer
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
bot_selector: programy.clients.botfactory.DefaultBotSelector
bots:
bot1:
prompt: ">>>"
initial_question: Hi, how can I help you today?
initial_question_srai: YINITIALQUESTION
default_response: Sorry, I don't have an answer for that!
default_response_srai: YDEFAULTRESPONSE
empty_string: YEMPTY
exit_response: So long, and thanks for the fish!
exit_response_srai: YEXITRESPONSE
override_properties: true
max_question_recursion: 1000
max_question_timeout: 60
max_search_depth: 100
max_search_timeout: 60
spelling:
load: true
classname: programy.spelling.norvig.NorvigSpellingChecker
alphabet: 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
check_before: true
check_and_retry: true
splitter:
classname: programy.dialog.splitter.regex.RegexSentenceSplitter
joiner:
classname: programy.dialog.joiner.SentenceJoiner
conversations:
save: true
load: false
max_histories: 100
restore_last_topic: false
initial_topic: TOPIC1
empty_on_start: false
from_translator:
classname: programy.nlp.translate.textblob_translator.TextBlobTranslator
from: fr
to: en
to_translator:
classname: programy.nlp.translate.textblob_translator.TextBlobTranslator
from: en
to: fr
sentiment:
classname: programy.nlp.sentiment.textblob_sentiment.TextBlobSentimentAnalyser
scores: programy.nlp.sentiment.scores.SentimentScores
brain_selector: programy.bot.DefaultBrainSelector
brains:
brain1:
# Overrides
overrides:
allow_system_aiml: true
allow_learn_aiml: true
allow_learnf_aiml: true
# Defaults
defaults:
default_get: unknown
default_property: unknown
default_map: unknown
learnf-path: file
# Binary
binaries:
save_binary: true
load_binary: true
load_aiml_on_binary_fail: true
# Braintree
braintree:
create: true
security:
authentication:
classname: programy.security.authenticate.passthrough.BasicPassThroughAuthenticationService
denied_srai: AUTHENTICATION_FAILED
authorisation:
classname: programy.security.authorise.usergroupsauthorisor.BasicUserGroupAuthorisationService
denied_srai: AUTHORISATION_FAILED
usergroups:
storage: file
dynamic:
variables:
gettime: programy.dynamic.variables.datetime.GetTime
sets:
numeric: programy.dynamic.sets.numeric.IsNumeric
roman: programy.dynamic.sets.roman.IsRomanNumeral
maps:
romantodec: programy.dynamic.maps.roman.MapRomanToDecimal
dectoroman: programy.dynamic.maps.roman.MapDecimalToRoman
""", ConsoleConfiguration(), ".")
client_config = ClientConfigurationData("console")
client_config.load_configuration(yaml, ".")
self.assertEqual(1, len(client_config.configurations))
self.assertEqual("programy.clients.botfactory.DefaultBotSelector", client_config.bot_selector)
self.assertIsNotNone(client_config.scheduler)
self.assertEqual("Scheduler1", client_config.scheduler.name)
self.assertEqual(0, client_config.scheduler.debug_level)
self.assertTrue(client_config.scheduler.add_listeners)
self.assertTrue(client_config.scheduler.remove_all_jobs)
self.assertEqual("programy.clients.render.text.TextRenderer", client_config.renderer)
def test_with_data_multiple_bots(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
prompt: ">>>"
renderer: programy.clients.render.text.TextRenderer
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
bot_selector: programy.clients.botfactory.DefaultBotSelector
bots:
bot1:
prompt: ">>>"
initial_question: Hi, how can I help you today?
initial_question_srai: YINITIALQUESTION
default_response: Sorry, I don't have an answer for that!
default_response_srai: YDEFAULTRESPONSE
empty_string: YEMPTY
exit_response: So long, and thanks for the fish!
exit_response_srai: YEXITRESPONSE
override_properties: true
max_question_recursion: 1000
max_question_timeout: 60
max_search_depth: 100
max_search_timeout: 60
spelling:
load: true
classname: programy.spelling.norvig.NorvigSpellingChecker
alphabet: 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
check_before: true
check_and_retry: true
splitter:
classname: programy.dialog.splitter.regex.RegexSentenceSplitter
joiner:
classname: programy.dialog.joiner.SentenceJoiner
conversations:
save: true
load: false
max_histories: 100
restore_last_topic: false
initial_topic: TOPIC1
empty_on_start: false
from_translator:
classname: programy.nlp.translate.textblob_translator.TextBlobTranslator
from: fr
to: en
to_translator:
classname: programy.nlp.translate.textblob_translator.TextBlobTranslator
from: en
to: fr
sentiment:
classname: programy.nlp.sentiment.textblob_sentiment.TextBlobSentimentAnalyser
scores: programy.nlp.sentiment.scores.SentimentScores
brain_selector: programy.bot.DefaultBrainSelector
brains:
brain1:
# Overrides
overrides:
allow_system_aiml: true
allow_learn_aiml: true
allow_learnf_aiml: true
# Defaults
defaults:
default_get: unknown
default_property: unknown
default_map: unknown
learnf-path: file
# Binary
binaries:
save_binary: true
load_binary: true
load_aiml_on_binary_fail: true
# Braintree
braintree:
create: true
security:
authentication:
classname: programy.security.authenticate.passthrough.BasicPassThroughAuthenticationService
denied_srai: AUTHENTICATION_FAILED
authorisation:
classname: programy.security.authorise.usergroupsauthorisor.BasicUserGroupAuthorisationService
denied_srai: AUTHORISATION_FAILED
usergroups:
storage: file
dynamic:
variables:
gettime: programy.dynamic.variables.datetime.GetTime
sets:
numeric: programy.dynamic.sets.numeric.IsNumeric
roman: programy.dynamic.sets.roman.IsRomanNumeral
maps:
romantodec: programy.dynamic.maps.roman.MapRomanToDecimal
dectoroman: programy.dynamic.maps.roman.MapDecimalToRoman
bot2:
prompt: ">>>"
initial_question: Hi, how can I help you today?
initial_question_srai: YINITIALQUESTION
default_response: Sorry, I don't have an answer for that!
default_response_srai: YDEFAULTRESPONSE
empty_string: YEMPTY
exit_response: So long, and thanks for the fish!
exit_response_srai: YEXITRESPONSE
override_properties: true
max_question_recursion: 1000
max_question_timeout: 60
max_search_depth: 100
max_search_timeout: 60
spelling:
load: true
classname: programy.spelling.norvig.NorvigSpellingChecker
alphabet: 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
check_before: true
check_and_retry: true
splitter:
classname: programy.dialog.splitter.regex.RegexSentenceSplitter
joiner:
classname: programy.dialog.joiner.SentenceJoiner
conversations:
save: true
load: false
max_histories: 100
restore_last_topic: false
initial_topic: TOPIC1
empty_on_start: false
from_translator:
classname: programy.nlp.translate.textblob_translator.TextBlobTranslator
from: fr
to: en
to_translator:
classname: programy.nlp.translate.textblob_translator.TextBlobTranslator
from: en
to: fr
sentiment:
classname: programy.nlp.sentiment.textblob_sentiment.TextBlobSentimentAnalyser
scores: programy.nlp.sentiment.scores.SentimentScores
brain_selector: programy.bot.DefaultBrainSelector
brains:
brain1:
# Overrides
overrides:
allow_system_aiml: true
allow_learn_aiml: true
allow_learnf_aiml: true
# Defaults
defaults:
default_get: unknown
default_property: unknown
default_map: unknown
learnf-path: file
# Binary
binaries:
save_binary: true
load_binary: true
load_aiml_on_binary_fail: true
# Braintree
braintree:
create: true
security:
authentication:
classname: programy.security.authenticate.passthrough.BasicPassThroughAuthenticationService
denied_srai: AUTHENTICATION_FAILED
authorisation:
classname: programy.security.authorise.usergroupsauthorisor.BasicUserGroupAuthorisationService
denied_srai: AUTHORISATION_FAILED
usergroups:
storage: file
dynamic:
variables:
gettime: programy.dynamic.variables.datetime.GetTime
sets:
numeric: programy.dynamic.sets.numeric.IsNumeric
roman: programy.dynamic.sets.roman.IsRomanNumeral
maps:
romantodec: programy.dynamic.maps.roman.MapRomanToDecimal
dectoroman: programy.dynamic.maps.roman.MapDecimalToRoman
""", ConsoleConfiguration(), ".")
client_config = ClientConfigurationData("console")
client_config.load_configuration(yaml, ".")
self.assertEqual(2, len(client_config.configurations))
self.assertEqual("programy.clients.botfactory.DefaultBotSelector", client_config.bot_selector)
self.assertIsNotNone(client_config.scheduler)
self.assertEqual("Scheduler1", client_config.scheduler.name)
self.assertEqual(0, client_config.scheduler.debug_level)
self.assertTrue(client_config.scheduler.add_listeners)
self.assertTrue(client_config.scheduler.remove_all_jobs)
self.assertEqual("programy.clients.render.text.TextRenderer", client_config.renderer)
def test_without_data(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
""", ConsoleConfiguration(), ".")
client_config = ClientConfigurationData("console")
client_config.load_configuration(yaml, ".")
self.assertIsNotNone(client_config.bot_selector)
self.assertIsNotNone(client_config.scheduler)
self.assertEqual(None, client_config.scheduler.name)
self.assertEqual(0, client_config.scheduler.debug_level)
self.assertFalse(client_config.scheduler.add_listeners)
self.assertFalse(client_config.scheduler.remove_all_jobs)
self.assertIsNotNone(client_config.renderer)
def test_with_no_data(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
other:
""", ConsoleConfiguration(), ".")
client_config = ClientConfigurationData("console")
client_config.load_configuration(yaml, ".")
self.assertIsNotNone(client_config.bot_selector)
self.assertIsNotNone(client_config.scheduler)
self.assertEqual(None, client_config.scheduler.name)
self.assertEqual(0, client_config.scheduler.debug_level)
self.assertFalse(client_config.scheduler.add_listeners)
self.assertFalse(client_config.scheduler.remove_all_jobs)
self.assertIsNotNone(client_config.renderer)
def test_defaults(self):
client_config = ClientConfigurationData("console")
data = {}
client_config.to_yaml(data, True)
ClientConfigurationDataTests.assert_defaults(self, data)
@staticmethod
def assert_defaults(test, data):
test.assertEqual(data['description'], 'ProgramY AIML2.0 Client')
test.assertEqual(data['renderer'], "programy.clients.render.text.TextRenderer")
test.assertTrue('scheduler' in data)
SchedulerConfigurationTests.assert_defaults(test, data['scheduler'])
test.assertTrue('email' in data)
EmailConfigurationTests.assert_defaults(test, data['email'])
test.assertTrue('triggers' in data)
TriggersConfigurationTests.assert_defaults(test, data['triggers'])
test.assertTrue('responder' in data)
PingResponderConfigurationTests.assert_defaults(test, data['responder'])
test.assertTrue('storage' in data)
StorageConfigurationTests.assert_defaults(test, data['storage'])
test.assertTrue('bots' in data)
test.assertTrue('bot' in data['bots'])
BotConfigurationTests.assert_defaults(test, data['bots']['bot'])
test.assertEqual(data['bot_selector'], "programy.clients.botfactory.DefaultBotSelector")
| 42.427273 | 126 | 0.532355 | 1,374 | 18,668 | 7.034935 | 0.151383 | 0.049659 | 0.043451 | 0.025657 | 0.84761 | 0.84761 | 0.83654 | 0.83654 | 0.83654 | 0.83654 | 0 | 0.005935 | 0.413328 | 18,668 | 439 | 127 | 42.523918 | 0.876644 | 0 | 0 | 0.886111 | 0 | 0 | 0.748875 | 0.174523 | 0 | 0 | 0 | 0 | 0.144444 | 1 | 0.016667 | false | 0.008333 | 0.027778 | 0 | 0.047222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
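Condensed from the tests above, the loading pattern they all repeat — build a YamlConfigurationFile from text, then hand it to ClientConfigurationData — is sketched below; the YAML snippet is a trimmed version of the one used in the tests:

from programy.clients.config import ClientConfigurationData
from programy.clients.events.console.config import ConsoleConfiguration
from programy.config.file.yaml_file import YamlConfigurationFile

yaml = YamlConfigurationFile()
yaml.load_from_text("""
console:
    prompt: ">>>"
""", ConsoleConfiguration(), ".")

client_config = ClientConfigurationData("console")
client_config.load_configuration(yaml, ".")

# sections that are absent fall back to defaults, as in test_without_data above
print(client_config.renderer)                # default renderer (see assert_defaults)
print(client_config.scheduler.debug_level)   # 0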
b90a8fad8f713827e2e94e2740e130a9f70ac944 | 14,158 | py | Python | lwm2mthon/resources/device.py | Tanganelli/lwm2mthon | bbd4079e14572e3e0094b798e0d0e7c1bc7f0f51 | [
"Apache-2.0"
] | 1 | 2019-01-25T02:39:18.000Z | 2019-01-25T02:39:18.000Z | lwm2mthon/resources/device.py | Tanganelli/lwm2mthon | bbd4079e14572e3e0094b798e0d0e7c1bc7f0f51 | [
"Apache-2.0"
] | null | null | null | lwm2mthon/resources/device.py | Tanganelli/lwm2mthon | bbd4079e14572e3e0094b798e0d0e7c1bc7f0f51 | [
"Apache-2.0"
] | null | null | null | from datetime import date, datetime
from coapthon.resources.resource import Resource
import time
from lwm2mthon import defines
from lwm2mthon.defines import LWM2MResourceType
from lwm2mthon.utils import TreeVisit
__author__ = 'giacomo'
class Device(Resource):
def __init__(self, name="3"):
super(Device, self).__init__(name)
class DeviceInstance(Resource):
def __init__(self, name, children, coap_server):
super(DeviceInstance, self).__init__(name, coap_server=coap_server)
self.children = children
def set_children(self, children):
self.children = children
def render_DELETE(self, request):
return True
def render_PUT(self, request):
ret = TreeVisit.decode(request.payload, request.uri_path, request.content_type)
for r in ret:
c = str(r[1]) # identifier
v = r[2]
assert isinstance(self.children[c], Resource)
method = getattr(self.children[c], 'set_value', None)
if method is not None:
self.children[c].set_value(v)
return self
def render_GET(self, request):
# Object Instance
resources = TreeVisit.get_children(self.children)
# encode as TLV
self.payload = TreeVisit.encode(resources, request.uri_path,
defines.Content_types["application/vnd.oma.lwm2m+tlv"])
return self
class Manufacturer(Resource):
def __init__(self, name="0", value="", resource_id="0", coap_server=None):
super(Manufacturer, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class ModelNumber(Resource):
def __init__(self, name="1", value="", resource_id="1", coap_server=None):
super(ModelNumber, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class SerialNumber(Resource):
def __init__(self, name="2", value="", resource_id="2", coap_server=None):
super(SerialNumber, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class FirmwareVersion(Resource):
def __init__(self, name="3", value="", resource_id="3", coap_server=None):
super(FirmwareVersion, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class Reboot(Resource):
def __init__(self, name="4", resource_id="4", lwm2mclient=None):
super(Reboot, self).__init__(name)
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
self.lwm2mclient = lwm2mclient
def render_POST(self, request):
self.lwm2mclient.reboot()
return self
class FactoryReboot(Resource):
def __init__(self, name="5", resource_id="5", lwm2mclient=None):
super(FactoryReboot, self).__init__(name)
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
self.lwm2mclient = lwm2mclient
def render_POST(self, request):
self.lwm2mclient.factoryreboot()
return self
class AvailablePowerSource(Resource):
def __init__(self, name="6", children=None, resource_id="6", coap_server=None):
super(AvailablePowerSource, self).__init__(name, coap_server=coap_server)
self.resource_id = resource_id
self.children = children
self.lwm2m_type = LWM2MResourceType.INTEGER
def set_children(self, children):
self.children = children
def render_GET(self, request):
resource = {self.resource_id: (None, self.path, self.lwm2m_type)}
resources = TreeVisit.get_children(self.children, resource)
self.payload = TreeVisit.encode(resources, request.uri_path,
defines.Content_types["application/vnd.oma.lwm2m+tlv"])
return self
def get_value(self):
return TreeVisit.get_children(self.children)
class AvailablePowerSourceItem(Resource):
def __init__(self, name, value, resource_id, coap_server):
super(AvailablePowerSourceItem, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = int(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class PowerSourceVoltage(Resource):
def __init__(self, name="7", children=None, resource_id="7", coap_server=None):
super(PowerSourceVoltage, self).__init__(name, coap_server=coap_server)
self.resource_id = resource_id
self.children = children
self.lwm2m_type = LWM2MResourceType.INTEGER
def set_children(self, children):
self.children = children
def render_GET(self, request):
resource = {self.resource_id: (None, self.path, self.lwm2m_type)}
resources = TreeVisit.get_children(self.children, resource)
self.payload = TreeVisit.encode(resources, request.uri_path,
defines.Content_types["application/vnd.oma.lwm2m+tlv"])
return self
def get_value(self):
return TreeVisit.get_children(self.children)
class PowerSourceVoltageItem(Resource):
def __init__(self, name, value, resource_id, coap_server):
super(PowerSourceVoltageItem, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = int(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class PowerSourceCurrent(Resource):
def __init__(self, name="8", children=None, resource_id="8", coap_server=None):
super(PowerSourceCurrent, self).__init__(name, coap_server=coap_server)
self.resource_id = resource_id
self.children = children
self.lwm2m_type = LWM2MResourceType.INTEGER
def set_children(self, children):
self.children = children
def render_GET(self, request):
resource = {self.resource_id: (None, self.path, self.lwm2m_type)}
resources = TreeVisit.get_children(self.children, resource)
self.payload = TreeVisit.encode(resources, request.uri_path,
defines.Content_types["application/vnd.oma.lwm2m+tlv"])
return self
def get_value(self):
return TreeVisit.get_children(self.children)
class PowerSourceCurrentItem(Resource):
def __init__(self, name, value, resource_id, coap_server):
super(PowerSourceCurrentItem, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = int(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class BatteryLevel(Resource):
def __init__(self, name="9", value="", resource_id="9", coap_server=None):
super(BatteryLevel, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = int(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class MemoryFree(Resource):
def __init__(self, name="10", value="", resource_id="10", coap_server=None):
super(MemoryFree, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = int(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class ErrorCode(Resource):
def __init__(self, name="11", children=None, resource_id="11", coap_server=None):
super(ErrorCode, self).__init__(name, coap_server=coap_server)
self.resource_id = resource_id
self.children = children
self.lwm2m_type = LWM2MResourceType.INTEGER
def set_children(self, children):
self.children = children
def render_GET(self, request):
resource = {self.resource_id: (None, self.path, self.lwm2m_type)}
resources = TreeVisit.get_children(self.children, resource)
self.payload = TreeVisit.encode(resources, request.uri_path,
defines.Content_types["application/vnd.oma.lwm2m+tlv"])
return self
def get_value(self):
return TreeVisit.get_children(self.children)
class ErrorCodeItem(Resource):
def __init__(self, name, value, resource_id, coap_server):
super(ErrorCodeItem, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = int(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class ResetErrorCode(Resource):
def __init__(self, name="12", resource_id="12", coap_server=None):
super(ResetErrorCode, self).__init__(name, coap_server=coap_server)
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_POST(self, request):
pass
class CurrentTime(Resource):
def __init__(self, name="13", value=0, resource_id="13", coap_server=None):
super(CurrentTime, self).__init__(name, coap_server=coap_server)
self.start_time = int(time.time())
if value == 0:
self.value = self.start_time
else:
self.value = int(value)
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.INTEGER
def render_GET(self, request):
self.payload = str(self.get_value())
return self
def render_PUT(self, request):
self.set_value(request.payload)
return self
def set_value(self, value):
self.start_time = int(time.time())
self.value = int(value)
self._coap_server.notify(self)
    def get_value(self):
        # advance the cached value by the time elapsed since the last read and
        # move start_time forward so the same interval is not counted twice
        now = int(time.time())
        dif = now - self.start_time
        self.start_time = now
        self.value += dif
        return self.value
class UTCOffset(Resource):
def __init__(self, name="14", value="", resource_id="14", coap_server=None):
super(UTCOffset, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.get_value())
return self
def render_PUT(self, request):
self.set_value(request.payload)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class Timezone(Resource):
def __init__(self, name="15", value="", resource_id="15", coap_server=None):
super(Timezone, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.value)
return self
def render_PUT(self, request):
self.set_value(request.payload)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
class Binding(Resource):
def __init__(self, name="16", value="", resource_id="16", coap_server=None):
super(Binding, self).__init__(name, coap_server=coap_server)
self.value = value
self.resource_id = resource_id
self.lwm2m_type = LWM2MResourceType.STRING
def render_GET(self, request):
self.payload = str(self.value)
return self
def set_value(self, value):
self.value = str(value)
self._coap_server.notify(self)
def get_value(self):
return self.value
| 29.932347 | 95 | 0.658073 | 1,714 | 14,158 | 5.165111 | 0.068261 | 0.076245 | 0.042697 | 0.049362 | 0.787304 | 0.727663 | 0.713092 | 0.713092 | 0.704959 | 0.704959 | 0 | 0.010793 | 0.240853 | 14,158 | 472 | 96 | 29.995763 | 0.812895 | 0.002825 | 0 | 0.725373 | 0 | 0 | 0.014879 | 0.010273 | 0 | 0 | 0 | 0 | 0.002985 | 1 | 0.259701 | false | 0.002985 | 0.01791 | 0.053731 | 0.477612 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
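A minimal sketch of how the resource tree above might be assembled. `server` stands in for a CoAPthon server instance (hypothetical here), which the resources use to notify observers whenever set_value is called:

from lwm2mthon.resources.device import DeviceInstance, Manufacturer, ModelNumber

# `server` is assumed to be an already-created CoAPthon server (not shown)
manufacturer = Manufacturer(value="ACME", coap_server=server)
model_number = ModelNumber(value="node-01", coap_server=server)
device = DeviceInstance("0", {"0": manufacturer, "1": model_number}, server)

print(manufacturer.get_value())        # "ACME"
manufacturer.set_value("ACME Corp")    # updates the value and notifies observers via server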
b90d82e28f2a8bac4d8ea44fdd5789780e87b987 | 166 | py | Python | test2.py | jmschwei/pyneta | 119c242f6b8737cfcd2bddfd69569a39410dbd44 | [
"Apache-2.0"
] | null | null | null | test2.py | jmschwei/pyneta | 119c242f6b8737cfcd2bddfd69569a39410dbd44 | [
"Apache-2.0"
] | null | null | null | test2.py | jmschwei/pyneta | 119c242f6b8737cfcd2bddfd69569a39410dbd44 | [
"Apache-2.0"
] | null | null | null | print("hello")
print("hello")
print("hello")
print("hello")
print("hello")
print("hello")
print("hello")
print("hello")
print("hello")
for x in range(100):
print(x)
| 13.833333 | 20 | 0.662651 | 25 | 166 | 4.4 | 0.28 | 0.818182 | 1.090909 | 1.454545 | 0.818182 | 0.818182 | 0.818182 | 0.818182 | 0.818182 | 0.818182 | 0 | 0.019868 | 0.090361 | 166 | 11 | 21 | 15.090909 | 0.708609 | 0 | 0 | 0.818182 | 0 | 0 | 0.271084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.909091 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 12 |
5d061c892afa5e8a02d69cdb1a55c5e4dbc2ccdf | 30,505 | py | Python | generated_code_examples/python/regression/svm.py | Symmetry-International/m2cgen | 3157e0cbd5bd1ee7e044a992223c60224e2b7709 | [
"MIT"
] | 2,161 | 2019-01-13T02:37:56.000Z | 2022-03-30T13:24:09.000Z | generated_code_examples/python/regression/svm.py | Symmetry-International/m2cgen | 3157e0cbd5bd1ee7e044a992223c60224e2b7709 | [
"MIT"
] | 380 | 2019-01-17T15:59:29.000Z | 2022-03-31T20:59:20.000Z | generated_code_examples/python/regression/svm.py | Symmetry-International/m2cgen | 3157e0cbd5bd1ee7e044a992223c60224e2b7709 | [
"MIT"
] | 201 | 2019-02-13T19:06:44.000Z | 2022-03-12T09:45:46.000Z | import math
def score(input):
return (((((((((((((((((((((((((((((((((((((((((((((((((((25.346480984077544) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((16.8118) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.7) - (input[4]), 2.0))) + (math.pow((5.277) - (input[5]), 2.0))) + (math.pow((98.1) - (input[6]), 2.0))) + (math.pow((1.4261) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((30.81) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((38.3518) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.453) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.4896) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((30.59) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.84054) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((5.599) - (input[5]), 2.0))) + (math.pow((85.7) - (input[6]), 2.0))) + (math.pow((4.4546) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((303.42) - (input[11]), 2.0))) + (math.pow((16.51) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.15172) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((5.701) - (input[5]), 2.0))) + (math.pow((95.0) - (input[6]), 2.0))) + (math.pow((3.7872) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((358.77) - (input[11]), 2.0))) + (math.pow((18.35) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((24.8017) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.349) - (input[5]), 2.0))) + (math.pow((96.0) - (input[6]), 2.0))) + (math.pow((1.7028) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((19.77) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((41.5292) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.531) - (input[5]), 2.0))) + (math.pow((85.4) - (input[6]), 2.0))) + (math.pow((1.6074) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 
2.0))) + (math.pow((329.46) - (input[11]), 2.0))) + (math.pow((27.38) - (input[12]), 2.0))))) * (-0.3490103966325617))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.38735) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((25.65) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.581) - (input[4]), 2.0))) + (math.pow((5.613) - (input[5]), 2.0))) + (math.pow((95.6) - (input[6]), 2.0))) + (math.pow((1.7572) - (input[7]), 2.0))) + (math.pow((2.0) - (input[8]), 2.0))) + (math.pow((188.0) - (input[9]), 2.0))) + (math.pow((19.1) - (input[10]), 2.0))) + (math.pow((359.29) - (input[11]), 2.0))) + (math.pow((27.26) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.05602) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((2.46) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.488) - (input[4]), 2.0))) + (math.pow((7.831) - (input[5]), 2.0))) + (math.pow((53.6) - (input[6]), 2.0))) + (math.pow((3.1992) - (input[7]), 2.0))) + (math.pow((3.0) - (input[8]), 2.0))) + (math.pow((193.0) - (input[9]), 2.0))) + (math.pow((17.8) - (input[10]), 2.0))) + (math.pow((392.63) - (input[11]), 2.0))) + (math.pow((4.45) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((25.0461) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.987) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.5888) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((26.77) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((8.26725) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.668) - (input[4]), 2.0))) + (math.pow((5.875) - (input[5]), 2.0))) + (math.pow((89.6) - (input[6]), 2.0))) + (math.pow((1.1296) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((347.88) - (input[11]), 2.0))) + (math.pow((8.88) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((5.66998) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.631) - (input[4]), 2.0))) + (math.pow((6.683) - (input[5]), 2.0))) + (math.pow((96.8) - (input[6]), 2.0))) + (math.pow((1.3567) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((375.33) - (input[11]), 2.0))) + (math.pow((3.73) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.51902) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.605) - (input[4]), 2.0))) + (math.pow((8.375) - (input[5]), 2.0))) + (math.pow((93.9) - (input[6]), 2.0))) + (math.pow((2.162) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) 
+ (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((388.45) - (input[11]), 2.0))) + (math.pow((3.32) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.29819) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((6.2) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.504) - (input[4]), 2.0))) + (math.pow((7.686) - (input[5]), 2.0))) + (math.pow((17.0) - (input[6]), 2.0))) + (math.pow((3.3751) - (input[7]), 2.0))) + (math.pow((8.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((17.4) - (input[10]), 2.0))) + (math.pow((377.51) - (input[11]), 2.0))) + (math.pow((3.92) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((3.32105) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.871) - (input[4]), 2.0))) + (math.pow((5.403) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.3216) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((26.82) - (input[12]), 2.0))))) * (-0.400989603367655))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.61154) - (input[0]), 2.0)) + (math.pow((20.0) - (input[1]), 2.0))) + (math.pow((3.97) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.647) - (input[4]), 2.0))) + (math.pow((8.704) - (input[5]), 2.0))) + (math.pow((86.9) - (input[6]), 2.0))) + (math.pow((1.801) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((264.0) - (input[9]), 2.0))) + (math.pow((13.0) - (input[10]), 2.0))) + (math.pow((389.7) - (input[11]), 2.0))) + (math.pow((5.12) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.02009) - (input[0]), 2.0)) + (math.pow((95.0) - (input[1]), 2.0))) + (math.pow((2.68) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.4161) - (input[4]), 2.0))) + (math.pow((8.034) - (input[5]), 2.0))) + (math.pow((31.9) - (input[6]), 2.0))) + (math.pow((5.118) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((224.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((390.55) - (input[11]), 2.0))) + (math.pow((2.88) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.08187) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((2.89) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.445) - (input[4]), 2.0))) + (math.pow((7.82) - (input[5]), 2.0))) + (math.pow((36.9) - (input[6]), 2.0))) + (math.pow((3.4952) - (input[7]), 2.0))) + (math.pow((2.0) - (input[8]), 2.0))) + (math.pow((276.0) - (input[9]), 2.0))) + (math.pow((18.0) - (input[10]), 2.0))) + (math.pow((393.53) - (input[11]), 2.0))) + (math.pow((3.57) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.57834) - (input[0]), 2.0)) + (math.pow((20.0) - (input[1]), 2.0))) + (math.pow((3.97) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.575) - (input[4]), 2.0))) + (math.pow((8.297) - (input[5]), 2.0))) + (math.pow((67.0) - (input[6]), 2.0))) + (math.pow((2.4216) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + 
(math.pow((264.0) - (input[9]), 2.0))) + (math.pow((13.0) - (input[10]), 2.0))) + (math.pow((384.54) - (input[11]), 2.0))) + (math.pow((7.44) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.35472) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((6.072) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((4.175) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((376.73) - (input[11]), 2.0))) + (math.pow((13.04) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.52693) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((6.2) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.504) - (input[4]), 2.0))) + (math.pow((8.725) - (input[5]), 2.0))) + (math.pow((83.0) - (input[6]), 2.0))) + (math.pow((2.8944) - (input[7]), 2.0))) + (math.pow((8.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((17.4) - (input[10]), 2.0))) + (math.pow((382.0) - (input[11]), 2.0))) + (math.pow((4.63) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.33147) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((6.2) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.507) - (input[4]), 2.0))) + (math.pow((8.247) - (input[5]), 2.0))) + (math.pow((70.4) - (input[6]), 2.0))) + (math.pow((3.6519) - (input[7]), 2.0))) + (math.pow((8.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((17.4) - (input[10]), 2.0))) + (math.pow((378.95) - (input[11]), 2.0))) + (math.pow((3.95) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.13081) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((5.713) - (input[5]), 2.0))) + (math.pow((94.1) - (input[6]), 2.0))) + (math.pow((4.233) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((360.17) - (input[11]), 2.0))) + (math.pow((22.6) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((4.89822) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.631) - (input[4]), 2.0))) + (math.pow((4.97) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.3325) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((375.52) - (input[11]), 2.0))) + (math.pow((3.26) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.25179) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((5.57) - (input[5]), 2.0))) + (math.pow((98.1) - (input[6]), 2.0))) + (math.pow((3.7979) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 
2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((376.57) - (input[11]), 2.0))) + (math.pow((21.02) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.06129) - (input[0]), 2.0)) + (math.pow((20.0) - (input[1]), 2.0))) + (math.pow((3.33) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.4429) - (input[4]), 2.0))) + (math.pow((7.645) - (input[5]), 2.0))) + (math.pow((49.7) - (input[6]), 2.0))) + (math.pow((5.2119) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((216.0) - (input[9]), 2.0))) + (math.pow((14.9) - (input[10]), 2.0))) + (math.pow((377.07) - (input[11]), 2.0))) + (math.pow((3.01) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((9.2323) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.631) - (input[4]), 2.0))) + (math.pow((6.216) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.1691) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((366.15) - (input[11]), 2.0))) + (math.pow((9.53) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((2.77974) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.871) - (input[4]), 2.0))) + (math.pow((4.903) - (input[5]), 2.0))) + (math.pow((97.8) - (input[6]), 2.0))) + (math.pow((1.3459) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((29.29) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.01381) - (input[0]), 2.0)) + (math.pow((80.0) - (input[1]), 2.0))) + (math.pow((0.46) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.422) - (input[4]), 2.0))) + (math.pow((7.875) - (input[5]), 2.0))) + (math.pow((32.0) - (input[6]), 2.0))) + (math.pow((5.6484) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((255.0) - (input[9]), 2.0))) + (math.pow((14.4) - (input[10]), 2.0))) + (math.pow((394.23) - (input[11]), 2.0))) + (math.pow((2.97) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.01538) - (input[0]), 2.0)) + (math.pow((90.0) - (input[1]), 2.0))) + (math.pow((3.75) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.394) - (input[4]), 2.0))) + (math.pow((7.454) - (input[5]), 2.0))) + (math.pow((34.2) - (input[6]), 2.0))) + (math.pow((6.3361) - (input[7]), 2.0))) + (math.pow((3.0) - (input[8]), 2.0))) + (math.pow((244.0) - (input[9]), 2.0))) + (math.pow((15.9) - (input[10]), 2.0))) + (math.pow((386.34) - (input[11]), 2.0))) + (math.pow((3.11) - (input[12]), 2.0))))) * (0.7500000000002167))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.38799) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((5.95) - (input[5]), 2.0))) + (math.pow((82.0) - (input[6]), 2.0))) + (math.pow((3.99) - (input[7]), 2.0))) + 
(math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((232.6) - (input[11]), 2.0))) + (math.pow((27.71) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.83377) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.605) - (input[4]), 2.0))) + (math.pow((7.802) - (input[5]), 2.0))) + (math.pow((98.2) - (input[6]), 2.0))) + (math.pow((2.0407) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((389.61) - (input[11]), 2.0))) + (math.pow((1.92) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.31533) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((6.2) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.504) - (input[4]), 2.0))) + (math.pow((8.266) - (input[5]), 2.0))) + (math.pow((78.3) - (input[6]), 2.0))) + (math.pow((2.8944) - (input[7]), 2.0))) + (math.pow((8.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((17.4) - (input[10]), 2.0))) + (math.pow((385.05) - (input[11]), 2.0))) + (math.pow((4.14) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((9.91655) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.852) - (input[5]), 2.0))) + (math.pow((77.8) - (input[6]), 2.0))) + (math.pow((1.5004) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((338.16) - (input[11]), 2.0))) + (math.pow((29.97) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.01501) - (input[0]), 2.0)) + (math.pow((90.0) - (input[1]), 2.0))) + (math.pow((1.21) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.401) - (input[4]), 2.0))) + (math.pow((7.923) - (input[5]), 2.0))) + (math.pow((24.8) - (input[6]), 2.0))) + (math.pow((5.885) - (input[7]), 2.0))) + (math.pow((1.0) - (input[8]), 2.0))) + (math.pow((198.0) - (input[9]), 2.0))) + (math.pow((13.6) - (input[10]), 2.0))) + (math.pow((395.52) - (input[11]), 2.0))) + (math.pow((3.16) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.25387) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((6.91) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.448) - (input[4]), 2.0))) + (math.pow((5.399) - (input[5]), 2.0))) + (math.pow((95.3) - (input[6]), 2.0))) + (math.pow((5.87) - (input[7]), 2.0))) + (math.pow((3.0) - (input[8]), 2.0))) + (math.pow((233.0) - (input[9]), 2.0))) + (math.pow((17.9) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((30.81) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((14.2362) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((6.343) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.5741) - 
(input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((20.32) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((22.5971) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.7) - (input[4]), 2.0))) + (math.pow((5.0) - (input[5]), 2.0))) + (math.pow((89.5) - (input[6]), 2.0))) + (math.pow((1.5184) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((396.9) - (input[11]), 2.0))) + (math.pow((31.99) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((67.9208) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.683) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.4254) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((384.97) - (input[11]), 2.0))) + (math.pow((22.98) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.61282) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((6.096) - (input[5]), 2.0))) + (math.pow((96.9) - (input[6]), 2.0))) + (math.pow((3.7598) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((248.31) - (input[11]), 2.0))) + (math.pow((20.34) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.46336) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.605) - (input[4]), 2.0))) + (math.pow((7.489) - (input[5]), 2.0))) + (math.pow((90.8) - (input[6]), 2.0))) + (math.pow((1.9709) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((374.43) - (input[11]), 2.0))) + (math.pow((1.73) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((7.67202) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((5.747) - (input[5]), 2.0))) + (math.pow((98.9) - (input[6]), 2.0))) + (math.pow((1.6334) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((393.1) - (input[11]), 2.0))) + (math.pow((19.92) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((2.01019) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.605) - (input[4]), 2.0))) + (math.pow((7.929) - (input[5]), 2.0))) + (math.pow((96.2) - (input[6]), 2.0))) + 
(math.pow((2.0459) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((369.3) - (input[11]), 2.0))) + (math.pow((3.7) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((45.7461) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.693) - (input[4]), 2.0))) + (math.pow((4.519) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.6582) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((88.27) - (input[11]), 2.0))) + (math.pow((36.98) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.03578) - (input[0]), 2.0)) + (math.pow((20.0) - (input[1]), 2.0))) + (math.pow((3.33) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.4429) - (input[4]), 2.0))) + (math.pow((7.82) - (input[5]), 2.0))) + (math.pow((64.5) - (input[6]), 2.0))) + (math.pow((4.6947) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((216.0) - (input[9]), 2.0))) + (math.pow((14.9) - (input[10]), 2.0))) + (math.pow((387.31) - (input[11]), 2.0))) + (math.pow((3.76) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.18337) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((27.74) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.609) - (input[4]), 2.0))) + (math.pow((5.414) - (input[5]), 2.0))) + (math.pow((98.3) - (input[6]), 2.0))) + (math.pow((1.7554) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((711.0) - (input[9]), 2.0))) + (math.pow((20.1) - (input[10]), 2.0))) + (math.pow((344.05) - (input[11]), 2.0))) + (math.pow((23.97) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((6.53876) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((1.0) - (input[3]), 2.0))) + (math.pow((0.631) - (input[4]), 2.0))) + (math.pow((7.016) - (input[5]), 2.0))) + (math.pow((97.5) - (input[6]), 2.0))) + (math.pow((1.2024) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((392.05) - (input[11]), 2.0))) + (math.pow((2.96) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((1.22358) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((19.58) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.605) - (input[4]), 2.0))) + (math.pow((6.943) - (input[5]), 2.0))) + (math.pow((97.4) - (input[6]), 2.0))) + (math.pow((1.8773) - (input[7]), 2.0))) + (math.pow((5.0) - (input[8]), 2.0))) + (math.pow((403.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((363.43) - (input[11]), 2.0))) + (math.pow((4.59) - (input[12]), 2.0))))) * (1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((10.8342) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.679) - (input[4]), 2.0))) + (math.pow((6.782) - (input[5]), 2.0))) + (math.pow((90.8) - 
(input[6]), 2.0))) + (math.pow((1.8195) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((21.57) - (input[11]), 2.0))) + (math.pow((25.79) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.98843) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((8.14) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.538) - (input[4]), 2.0))) + (math.pow((5.813) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((4.0952) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((307.0) - (input[9]), 2.0))) + (math.pow((21.0) - (input[10]), 2.0))) + (math.pow((394.54) - (input[11]), 2.0))) + (math.pow((19.88) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((18.0846) - (input[0]), 2.0)) + (math.pow((0.0) - (input[1]), 2.0))) + (math.pow((18.1) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.679) - (input[4]), 2.0))) + (math.pow((6.434) - (input[5]), 2.0))) + (math.pow((100.0) - (input[6]), 2.0))) + (math.pow((1.8347) - (input[7]), 2.0))) + (math.pow((24.0) - (input[8]), 2.0))) + (math.pow((666.0) - (input[9]), 2.0))) + (math.pow((20.2) - (input[10]), 2.0))) + (math.pow((27.25) - (input[11]), 2.0))) + (math.pow((29.05) - (input[12]), 2.0))))) * (-1.0))) + ((math.exp((-0.0000036459736698188483) * (((((((((((((math.pow((0.0351) - (input[0]), 2.0)) + (math.pow((95.0) - (input[1]), 2.0))) + (math.pow((2.68) - (input[2]), 2.0))) + (math.pow((0.0) - (input[3]), 2.0))) + (math.pow((0.4161) - (input[4]), 2.0))) + (math.pow((7.853) - (input[5]), 2.0))) + (math.pow((33.2) - (input[6]), 2.0))) + (math.pow((5.118) - (input[7]), 2.0))) + (math.pow((4.0) - (input[8]), 2.0))) + (math.pow((224.0) - (input[9]), 2.0))) + (math.pow((14.7) - (input[10]), 2.0))) + (math.pow((392.78) - (input[11]), 2.0))) + (math.pow((3.81) - (input[12]), 2.0))))) * (1.0))
| 7,626.25 | 30,474 | 0.465366 | 5,618 | 30,505 | 2.526878 | 0.054646 | 0.093688 | 0.258664 | 0.387997 | 0.932587 | 0.927797 | 0.897788 | 0.780502 | 0.747394 | 0.747394 | 0 | 0.217008 | 0.115686 | 30,505 | 3 | 30,475 | 10,168.333333 | 0.309238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 17 |
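The row above carries a machine-generated Python file whose whole body is one unrolled expression: every term takes the squared distance between the 13-element `input` vector and one stored reference point, applies exp(-0.0000036459736698188483 * distance), and multiplies by a signed weight, with all terms summed. That is an RBF-kernel decision function written out term by term. A compact equivalent, shown only as an illustration (the names `support_vectors`, `weights` and `bias` are not taken from the file, and the actual constants would have to be extracted from it), is:

import math

def rbf_decision(input, support_vectors, weights, gamma=0.0000036459736698188483, bias=0.0):
    # sum_i  weights[i] * exp(-gamma * ||input - support_vectors[i]||^2)  + bias
    total = bias
    for sv, w in zip(support_vectors, weights):
        sq_dist = sum(math.pow(s - x, 2.0) for s, x in zip(sv, input))
        total += w * math.exp(-gamma * sq_dist)
    return total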
5d1929dee6f60610270b86b075ccab20e67122cc | 30,095 | py | Python | great_international/migrations/0123_add_expandable_trigger_help_text.py | uktrade/directory-cms | 8c8d13ce29ea74ddce7a40f3dd29c8847145d549 | [
"MIT"
] | 6 | 2018-03-20T11:19:07.000Z | 2021-10-05T07:53:11.000Z | great_international/migrations/0123_add_expandable_trigger_help_text.py | uktrade/directory-cms | 8c8d13ce29ea74ddce7a40f3dd29c8847145d549 | [
"MIT"
] | 802 | 2018-02-05T14:16:13.000Z | 2022-02-10T10:59:21.000Z | great_international/migrations/0123_add_expandable_trigger_help_text.py | uktrade/directory-cms | 8c8d13ce29ea74ddce7a40f3dd29c8847145d549 | [
"MIT"
] | 6 | 2019-01-22T13:19:37.000Z | 2019-07-01T10:35:26.000Z | # Generated by Django 2.2.24 on 2021-09-23 11:30
from django.db import migrations
import great_international.blocks.great_international
import wagtail.core.blocks
import wagtail.core.fields
import wagtail.images.blocks
class Migration(migrations.Migration):
dependencies = [
('great_international', '0122_merge_20210921_1449'),
]
operations = [
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_ar',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_de',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_en_gb',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_es',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_fr',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_ja',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_pt',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='internationalinvestmentsectorpage',
name='downpage_content_zh_hans',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_ar',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_de',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_en_gb',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_es',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_fr',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_ja',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_pt',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
migrations.AlterField(
model_name='investmentgeneralcontentpage',
name='main_content_zh_hans',
field=wagtail.core.fields.StreamField([('content_section', wagtail.core.blocks.StructBlock([('content', wagtail.core.blocks.StreamBlock([('header', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('nested_content', wagtail.core.blocks.StreamBlock([('text', wagtail.core.blocks.StructBlock([('text', great_international.blocks.great_international.MarkdownBlock(required=False)), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock(required=False)), ('image_alt', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('caption', wagtail.core.blocks.CharBlock(max_length=255, required=False))], required=False)), ('cta', wagtail.core.blocks.StructBlock([('label', wagtail.core.blocks.CharBlock(max_length=255, required=False)), ('link', wagtail.core.blocks.StructBlock([('internal_link', wagtail.core.blocks.PageChooserBlock(label='Internal link', required=False)), ('external_link', wagtail.core.blocks.CharBlock(label='External link', max_length=255, required=False))], required=False))], help_text='Set text for the CTA and either an internal or an external URL for its destination', required=False))], help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker to the text, add a horizontal rule (--- with a blank line before and after it) where the 'More' button should be.")), ('columns', wagtail.core.blocks.StreamBlock([('text', great_international.blocks.great_international.MarkdownBlock())]))], min_num=1))], required=False)), ('block_slug', wagtail.core.blocks.CharBlock(help_text="Only needed if special styling is involved: check with a developer. If in doubt, it's not needed", max_length=255, required=False))]))], blank=True, null=True),
),
]
| 278.657407 | 1,758 | 0.751055 | 3,892 | 30,095 | 5.710689 | 0.033402 | 0.121254 | 0.172861 | 0.105282 | 0.989472 | 0.987582 | 0.987582 | 0.987582 | 0.984253 | 0.984253 | 0 | 0.013719 | 0.094135 | 30,095 | 107 | 1,759 | 281.261682 | 0.801555 | 0.001528 | 0 | 0.712871 | 1 | 0.356436 | 0.302193 | 0.020601 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.049505 | 0 | 0.079208 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
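The migration in the row above repeats one deeply nested StreamField definition for every language-suffixed field, which makes the changed help_text hard to read. Factored into named blocks purely for readability (the nesting and option values come from the migration itself, but this layout is an illustrative sketch, not how the project declares the field), the downpage_content structure is:

from wagtail.core import blocks, fields
from wagtail.images.blocks import ImageChooserBlock
from great_international.blocks.great_international import MarkdownBlock

image_block = blocks.StructBlock([
    ('image', ImageChooserBlock(required=False)),
    ('image_alt', blocks.CharBlock(max_length=255, required=False)),
    ('caption', blocks.CharBlock(max_length=255, required=False)),
], required=False)

text_block = blocks.StructBlock(
    [('text', MarkdownBlock(required=False)), ('image', image_block)],
    help_text="Use H3 headers or lower, not H2 or H1. To add an expandable/folding marker "
              "to the text, add a horizontal rule (--- with a blank line before and after "
              "it) where the 'More' button should be.",
)

content_section = blocks.StructBlock([
    ('content', blocks.StreamBlock([
        ('header', blocks.CharBlock(max_length=255, required=False)),
        ('nested_content', blocks.StreamBlock([
            ('text', text_block),
            ('columns', blocks.StreamBlock([('text', MarkdownBlock())])),
        ], min_num=1)),
    ], required=False)),
    ('block_slug', blocks.CharBlock(
        help_text="Only needed if special styling is involved: check with a developer. "
                  "If in doubt, it's not needed",
        max_length=255, required=False)),
])

downpage_content = fields.StreamField([('content_section', content_section)], blank=True, null=True)

The main_content* fields on investmentgeneralcontentpage differ only in adding a 'cta' StructBlock (a 'label' CharBlock plus a 'link' StructBlock holding an internal PageChooserBlock or an external CharBlock) alongside the image inside the text block.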
5d22dd23b2b6d7771ea1785e6bf81d0fef6c47be | 197 | py | Python | brian_global_config.py | achilleas-k/brian-scripts | 4d2d8c9a53e7202b60c78716e8b1a9d521293c54 | [
"Apache-2.0"
] | null | null | null | brian_global_config.py | achilleas-k/brian-scripts | 4d2d8c9a53e7202b60c78716e8b1a9d521293c54 | [
"Apache-2.0"
] | null | null | null | brian_global_config.py | achilleas-k/brian-scripts | 4d2d8c9a53e7202b60c78716e8b1a9d521293c54 | [
"Apache-2.0"
] | null | null | null | from brian.globalprefs import *
set_global_preferences(useweave=True)
#set_global_preferences(usecodegen=True)
#set_global_preferences(usenewpropagate=True)
#set_global_preferences(usecstdp=True)
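# The three commented lines above are optional Brian optimisations that this
# config leaves disabled; only useweave is switched on. Other preferences are
# enabled the same way, e.g. (illustration only, not part of this config):
#     set_global_preferences(useweave=True, usecodegen=True)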
| 28.142857 | 45 | 0.86802 | 24 | 197 | 6.791667 | 0.5 | 0.220859 | 0.490798 | 0.441718 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045685 | 197 | 6 | 46 | 32.833333 | 0.867021 | 0.609137 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
5d3c020ec61833703caeb937bc66c657165eeb73 | 21 | py | Python | test.py | jaspinder21/ibsarnew1 | c0d840c03344ae4d5dd15c2b9144a92adf722cbb | [
"Apache-2.0"
] | null | null | null | test.py | jaspinder21/ibsarnew1 | c0d840c03344ae4d5dd15c2b9144a92adf722cbb | [
"Apache-2.0"
] | null | null | null | test.py | jaspinder21/ibsarnew1 | c0d840c03344ae4d5dd15c2b9144a92adf722cbb | [
"Apache-2.0"
] | null | null | null | a=[1,3,5,6]
print(a)
| 7 | 11 | 0.52381 | 7 | 21 | 1.571429 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 0.095238 | 21 | 2 | 12 | 10.5 | 0.368421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
5d6b0db093a7937136f6f66ac6343d9072cfaf2c | 10,481 | py | Python | soil/agents/ModelM2.py | vishalbelsare/soil | e860bdb922a22da2987fba07dffb81351c0272e5 | [
"Apache-2.0"
] | null | null | null | soil/agents/ModelM2.py | vishalbelsare/soil | e860bdb922a22da2987fba07dffb81351c0272e5 | [
"Apache-2.0"
] | null | null | null | soil/agents/ModelM2.py | vishalbelsare/soil | e860bdb922a22da2987fba07dffb81351c0272e5 | [
"Apache-2.0"
] | null | null | null | import random
import numpy as np
from . import BaseAgent
class SpreadModelM2(BaseAgent):
"""
Settings:
prob_neutral_making_denier
prob_infect
prob_cured_healing_infected
prob_cured_vaccinate_neutral
prob_vaccinated_healing_infected
prob_vaccinated_vaccinate_neutral
        prob_generate_anti_rumor
        standard_variance
"""
def __init__(self, environment=None, agent_id=0, state=()):
super().__init__(environment=environment, agent_id=agent_id, state=state)
self.prob_neutral_making_denier = np.random.normal(environment.environment_params['prob_neutral_making_denier'],
environment.environment_params['standard_variance'])
self.prob_infect = np.random.normal(environment.environment_params['prob_infect'],
environment.environment_params['standard_variance'])
self.prob_cured_healing_infected = np.random.normal(environment.environment_params['prob_cured_healing_infected'],
environment.environment_params['standard_variance'])
self.prob_cured_vaccinate_neutral = np.random.normal(environment.environment_params['prob_cured_vaccinate_neutral'],
environment.environment_params['standard_variance'])
self.prob_vaccinated_healing_infected = np.random.normal(environment.environment_params['prob_vaccinated_healing_infected'],
environment.environment_params['standard_variance'])
self.prob_vaccinated_vaccinate_neutral = np.random.normal(environment.environment_params['prob_vaccinated_vaccinate_neutral'],
environment.environment_params['standard_variance'])
self.prob_generate_anti_rumor = np.random.normal(environment.environment_params['prob_generate_anti_rumor'],
environment.environment_params['standard_variance'])
def step(self):
if self.state['id'] == 0: # Neutral
self.neutral_behaviour()
elif self.state['id'] == 1: # Infected
self.infected_behaviour()
elif self.state['id'] == 2: # Cured
self.cured_behaviour()
elif self.state['id'] == 3: # Vaccinated
self.vaccinated_behaviour()
def neutral_behaviour(self):
# Infected
infected_neighbors = self.get_neighboring_agents(state_id=1)
if len(infected_neighbors) > 0:
if random.random() < self.prob_neutral_making_denier:
self.state['id'] = 3 # Vaccinated making denier
def infected_behaviour(self):
# Neutral
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_infect:
neighbor.state['id'] = 1 # Infected
def cured_behaviour(self):
# Vaccinate
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_cured_vaccinate_neutral:
neighbor.state['id'] = 3 # Vaccinated
# Cure
infected_neighbors = self.get_neighboring_agents(state_id=1)
for neighbor in infected_neighbors:
if random.random() < self.prob_cured_healing_infected:
neighbor.state['id'] = 2 # Cured
def vaccinated_behaviour(self):
# Cure
infected_neighbors = self.get_neighboring_agents(state_id=1)
for neighbor in infected_neighbors:
if random.random() < self.prob_cured_healing_infected:
neighbor.state['id'] = 2 # Cured
# Vaccinate
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_cured_vaccinate_neutral:
neighbor.state['id'] = 3 # Vaccinated
# Generate anti-rumor
infected_neighbors_2 = self.get_neighboring_agents(state_id=1)
for neighbor in infected_neighbors_2:
if random.random() < self.prob_generate_anti_rumor:
neighbor.state['id'] = 2 # Cured
class ControlModelM2(BaseAgent):
"""
Settings:
prob_neutral_making_denier
prob_infect
prob_cured_healing_infected
prob_cured_vaccinate_neutral
prob_vaccinated_healing_infected
prob_vaccinated_vaccinate_neutral
        prob_generate_anti_rumor
        standard_variance
"""
def __init__(self, environment=None, agent_id=0, state=()):
super().__init__(environment=environment, agent_id=agent_id, state=state)
self.prob_neutral_making_denier = np.random.normal(environment.environment_params['prob_neutral_making_denier'],
environment.environment_params['standard_variance'])
self.prob_infect = np.random.normal(environment.environment_params['prob_infect'],
environment.environment_params['standard_variance'])
self.prob_cured_healing_infected = np.random.normal(environment.environment_params['prob_cured_healing_infected'],
environment.environment_params['standard_variance'])
self.prob_cured_vaccinate_neutral = np.random.normal(environment.environment_params['prob_cured_vaccinate_neutral'],
environment.environment_params['standard_variance'])
self.prob_vaccinated_healing_infected = np.random.normal(environment.environment_params['prob_vaccinated_healing_infected'],
environment.environment_params['standard_variance'])
self.prob_vaccinated_vaccinate_neutral = np.random.normal(environment.environment_params['prob_vaccinated_vaccinate_neutral'],
environment.environment_params['standard_variance'])
self.prob_generate_anti_rumor = np.random.normal(environment.environment_params['prob_generate_anti_rumor'],
environment.environment_params['standard_variance'])
def step(self):
if self.state['id'] == 0: # Neutral
self.neutral_behaviour()
elif self.state['id'] == 1: # Infected
self.infected_behaviour()
elif self.state['id'] == 2: # Cured
self.cured_behaviour()
elif self.state['id'] == 3: # Vaccinated
self.vaccinated_behaviour()
elif self.state['id'] == 4: # Beacon-off
self.beacon_off_behaviour()
elif self.state['id'] == 5: # Beacon-on
self.beacon_on_behaviour()
def neutral_behaviour(self):
self.state['visible'] = False
# Infected
infected_neighbors = self.get_neighboring_agents(state_id=1)
if len(infected_neighbors) > 0:
if random.random() < self.prob_neutral_making_denier:
self.state['id'] = 3 # Vaccinated making denier
def infected_behaviour(self):
# Neutral
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_infect:
neighbor.state['id'] = 1 # Infected
self.state['visible'] = False
def cured_behaviour(self):
self.state['visible'] = True
# Vaccinate
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_cured_vaccinate_neutral:
neighbor.state['id'] = 3 # Vaccinated
# Cure
infected_neighbors = self.get_neighboring_agents(state_id=1)
for neighbor in infected_neighbors:
if random.random() < self.prob_cured_healing_infected:
neighbor.state['id'] = 2 # Cured
def vaccinated_behaviour(self):
self.state['visible'] = True
# Cure
infected_neighbors = self.get_neighboring_agents(state_id=1)
for neighbor in infected_neighbors:
if random.random() < self.prob_cured_healing_infected:
neighbor.state['id'] = 2 # Cured
# Vaccinate
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_cured_vaccinate_neutral:
neighbor.state['id'] = 3 # Vaccinated
# Generate anti-rumor
infected_neighbors_2 = self.get_neighboring_agents(state_id=1)
for neighbor in infected_neighbors_2:
if random.random() < self.prob_generate_anti_rumor:
neighbor.state['id'] = 2 # Cured
def beacon_off_behaviour(self):
self.state['visible'] = False
infected_neighbors = self.get_neighboring_agents(state_id=1)
if len(infected_neighbors) > 0:
            self.state['id'] = 5  # Beacon on
def beacon_on_behaviour(self):
self.state['visible'] = False
# Cure (M2 feature added)
infected_neighbors = self.get_neighboring_agents(state_id=1)
        for infected in infected_neighbors:
            if random.random() < self.prob_generate_anti_rumor:
                infected.state['id'] = 2  # Cured
            neutral_neighbors_infected = infected.get_neighboring_agents(state_id=0)
            for neighbor in neutral_neighbors_infected:
                if random.random() < self.prob_generate_anti_rumor:
                    neighbor.state['id'] = 3  # Vaccinated
            infected_neighbors_infected = infected.get_neighboring_agents(state_id=1)
            for neighbor in infected_neighbors_infected:
                if random.random() < self.prob_generate_anti_rumor:
                    neighbor.state['id'] = 2  # Cured
# Vaccinate
neutral_neighbors = self.get_neighboring_agents(state_id=0)
for neighbor in neutral_neighbors:
if random.random() < self.prob_cured_vaccinate_neutral:
neighbor.state['id'] = 3 # Vaccinated
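# --- Illustrative sketch (editor addition, not part of the original model code) ---
# Both agent classes above draw their per-agent probabilities from np.random.normal using keys
# read from environment.environment_params, so the hosting environment must supply every key
# listed in the Settings docstring plus 'standard_variance'. The numeric values below are
# placeholders chosen only to show the expected shape of the dictionary.
def _example_environment_params():
    return {
        'prob_neutral_making_denier': 0.03,
        'prob_infect': 0.07,
        'prob_cured_healing_infected': 0.03,
        'prob_cured_vaccinate_neutral': 0.03,
        'prob_vaccinated_healing_infected': 0.03,
        'prob_vaccinated_vaccinate_neutral': 0.03,
        'prob_generate_anti_rumor': 0.03,
        'standard_variance': 0.05,
    }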
| 43.131687 | 134 | 0.624177 | 1,082 | 10,481 | 5.718115 | 0.060074 | 0.054307 | 0.126717 | 0.076774 | 0.964926 | 0.951188 | 0.930176 | 0.919185 | 0.911104 | 0.911104 | 0 | 0.008084 | 0.291861 | 10,481 | 242 | 135 | 43.309917 | 0.825519 | 0.089018 | 0 | 0.893333 | 0 | 0 | 0.074492 | 0.036182 | 0 | 0 | 0 | 0 | 0 | 1 | 0.093333 | false | 0 | 0.02 | 0 | 0.126667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5d704f385f22ae262d02a422045ef626add6108b | 9,221 | py | Python | examples/task2/plotting/data/data_1/leaf_1.py | pieter-hendriks/contiki-ng | a2c360659aef57b917b2d97eccde06240391e97d | [
"BSD-3-Clause"
] | null | null | null | examples/task2/plotting/data/data_1/leaf_1.py | pieter-hendriks/contiki-ng | a2c360659aef57b917b2d97eccde06240391e97d | [
"BSD-3-Clause"
] | null | null | null | examples/task2/plotting/data/data_1/leaf_1.py | pieter-hendriks/contiki-ng | a2c360659aef57b917b2d97eccde06240391e97d | [
"BSD-3-Clause"
] | null | null | null | using saved target 'zoul'
rlwrap ../../tools/serial-io/serialdump -b115200 /dev/ttyUSB1
connecting to /dev/ttyUSB1 [OK]
[INFO: Main ] Starting Contiki-NG-v1.0-131-gfed8f5d5b-dirty
[INFO: Main ] - Routing: nullrouting
[INFO: Main ] - Net: nullnet
[INFO: Main ] - MAC: TSCH
[INFO: Main ] - 802.15.4 PANID: 0xabcd
[INFO: Main ] - 802.15.4 TSCH default hopping sequence length: 1
[INFO: Main ] Node ID: 58505
[INFO: Main ] Link-layer address: 0012.4b00.1932.e489
[INFO: Zoul ] Zolertia RE-Mote revision B platform
[INFO: SENSORNETS] Leaf started with channel hopping sequence size: 1
[INFO: SENSORNETS] With EB period = 128 / 128 seconds
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: SENSORNETS] First iteration send_callback. Not recording data.
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 27269
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 780
[INFO: HELPERS ] lpm = 24706
[INFO: HELPERS ] deep = 1918
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 24888
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29163
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 756
[INFO: HELPERS ] lpm = 26630
[INFO: HELPERS ] deep = 1911
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26790
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 33745
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 853
[INFO: HELPERS ] lpm = 31140
[INFO: HELPERS ] deep = 1887
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 31396
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29162
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 748
[INFO: HELPERS ] lpm = 26638
[INFO: HELPERS ] deep = 1910
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26790
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 26865
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 710
[INFO: HELPERS ] lpm = 24372
[INFO: HELPERS ] deep = 1918
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 24486
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 31455
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 771
[INFO: HELPERS ] lpm = 28919
[INFO: HELPERS ] deep = 1900
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 29093
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29160
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 741
[INFO: HELPERS ] lpm = 26645
[INFO: HELPERS ] deep = 1908
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26789
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 31455
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 769
[INFO: HELPERS ] lpm = 28921
[INFO: HELPERS ] deep = 1899
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 29093
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 26865
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 711
[INFO: HELPERS ] lpm = 24371
[INFO: HELPERS ] deep = 1918
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 24485
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 26868
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 745
[INFO: HELPERS ] lpm = 24336
[INFO: HELPERS ] deep = 1922
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 24485
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29158
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 745
[INFO: HELPERS ] lpm = 26641
[INFO: HELPERS ] deep = 1907
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26789
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29161
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 758
[INFO: HELPERS ] lpm = 26628
[INFO: HELPERS ] deep = 1910
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26789
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 33749
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 797
[INFO: HELPERS ] lpm = 31197
[INFO: HELPERS ] deep = 1889
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 31396
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 24572
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 702
[INFO: HELPERS ] lpm = 22076
[INFO: HELPERS ] deep = 1929
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 22181
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 26867
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 726
[INFO: HELPERS ] lpm = 24356
[INFO: HELPERS ] deep = 1919
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 24484
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29160
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 746
[INFO: HELPERS ] lpm = 26640
[INFO: HELPERS ] deep = 1909
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26788
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29160
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 734
[INFO: HELPERS ] lpm = 26652
[INFO: HELPERS ] deep = 1909
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26788
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 36042
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 937
[INFO: HELPERS ] lpm = 31176
[INFO: HELPERS ] deep = 4063
[INFO: HELPERS ] tx = 90
[INFO: HELPERS ] rx = 31441
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 24571
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 693
[INFO: HELPERS ] lpm = 22215
[INFO: HELPERS ] deep = 1798
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 22310
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29162
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 746
[INFO: HELPERS ] lpm = 26639
[INFO: HELPERS ] deep = 1911
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26787
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29160
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 768
[INFO: HELPERS ] lpm = 26618
[INFO: HELPERS ] deep = 1908
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26788
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 31456
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 766
[INFO: HELPERS ] lpm = 28924
[INFO: HELPERS ] deep = 1900
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 29093
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 26865
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 709
[INFO: HELPERS ] lpm = 24373
[INFO: HELPERS ] deep = 1917
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 24485
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 29161
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 736
[INFO: HELPERS ] lpm = 26649
[INFO: HELPERS ] deep = 1911
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 26788
[INFO: SENSORNETS] Leaf loop start!
[INFO: SENSORNETS] Leaf waiting for association!
[INFO: SENSORNETS] Leaf packet sent!
[INFO: HELPERS ] ticks = 31454
[INFO: HELPERS ] seconds = 1
[INFO: HELPERS ] cpu = 760
[INFO: HELPERS ] lpm = 28928
[INFO: HELPERS ] deep = 1900
[INFO: HELPERS ] tx = 38
[INFO: HELPERS ] rx = 29092
| 34.27881 | 69 | 0.668149 | 1,222 | 9,221 | 5.040917 | 0.151391 | 0.3125 | 0.230844 | 0.092857 | 0.791558 | 0.787013 | 0.787013 | 0.719805 | 0.719805 | 0.719805 | 0 | 0.093815 | 0.212775 | 9,221 | 268 | 70 | 34.406716 | 0.754787 | 0 | 0 | 0.656716 | 0 | 0 | 0.000434 | 0 | 0 | 0 | 0.000651 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
53d1c8504b7c192ebf80ea77e8af3165a83075c8 | 186,195 | py | Python | CarParkArcGisApi/CarParkArcGisApi/env/Lib/site-packages/arcgis/raster/functions/gbl.py | moazzamwaheed2017/carparkapi | e52ae1b2aed47321ce9d22ba6cd0b85fa60a417a | [
"MIT"
] | null | null | null | CarParkArcGisApi/CarParkArcGisApi/env/Lib/site-packages/arcgis/raster/functions/gbl.py | moazzamwaheed2017/carparkapi | e52ae1b2aed47321ce9d22ba6cd0b85fa60a417a | [
"MIT"
] | 9 | 2020-02-03T15:50:10.000Z | 2022-03-02T07:11:34.000Z | CarParkArcGisApi/CarParkArcGisApi/env/Lib/site-packages/arcgis/raster/functions/gbl.py | moazzamwaheed2017/carparkapi | e52ae1b2aed47321ce9d22ba6cd0b85fa60a417a | [
"MIT"
] | null | null | null | """
Global Raster functions.
These functions are applied to the raster data to create a
processed product on disk, using ImageryLayer.save() method or arcgis.raster.analytics.generate_raster().
Global functions cannot be used for visualization using dynamic image processing. They cannot be applied to layers that
are added to a map for on-the-fly image processing or visualized inline within the Jupyter notebook.
Functions can be applied to various rasters (or images), including the following:
* Imagery layers
* Rasters within imagery layers
"""
from arcgis.raster._layer import ImageryLayer
from arcgis.features import FeatureLayer
from arcgis.gis import Item
import copy
import numbers
from arcgis.raster.functions.utility import _raster_input, _get_raster, _replace_raster_url, _get_raster_url, _get_raster_ra
from arcgis.geoprocessing._support import _layer_input,_feature_input
import string as _string
import random as _random
import arcgis as _arcgis
def _create_output_image_service(gis, output_name, task):
ok = gis.content.is_service_name_available(output_name, "Image Service")
if not ok:
raise RuntimeError("An Image Service by this name already exists: " + output_name)
create_parameters = {
"name": output_name,
"description": "",
"capabilities": "Image",
"properties": {
"path": "@",
"description": "",
"copyright": ""
}
}
output_service = gis.content.create_service(output_name, create_params=create_parameters,
service_type="imageService")
description = "Image Service generated from running the " + task + " tool."
item_properties = {
"description": description,
"tags": "Analysis Result, " + task,
"snippet": "Analysis Image Service generated from " + task
}
output_service.update(item_properties)
return output_service
def _id_generator(size=6, chars=_string.ascii_uppercase + _string.digits):
return ''.join(_random.choice(chars) for _ in range(size))
def _gbl_clone_layer(layer, function_chain, function_chain_ra,**kwargs):
if isinstance(layer, Item):
layer = layer.layers[0]
newlyr = ImageryLayer(layer._url, layer._gis)
newlyr._lazy_properties = layer.properties
newlyr._hydrated = True
newlyr._lazy_token = layer._token
# if layer._fn is not None: # chain the functions
# old_chain = layer._fn
# newlyr._fn = function_chain
# newlyr._fn['rasterFunctionArguments']['Raster'] = old_chain
# else:
newlyr._fn = function_chain_ra
newlyr._fnra = function_chain_ra
newlyr._where_clause = layer._where_clause
newlyr._spatial_filter = layer._spatial_filter
newlyr._temporal_filter = layer._temporal_filter
newlyr._mosaic_rule = layer._mosaic_rule
newlyr._filtered = layer._filtered
newlyr._extent = layer._extent
newlyr._uses_gbl_function = True
for key in kwargs:
newlyr._other_outputs.update({key:kwargs[key]})
return newlyr
def _feature_gbl_clone_layer(layer, function_chain, function_chain_ra,**kwargs):
if isinstance(layer, Item):
layer = layer.layers[0]
newlyr = ImageryLayer(layer._url, layer._gis)
newlyr._fn = function_chain
newlyr._fnra = function_chain_ra
newlyr._storage = layer._storage
newlyr._dynamic_layer = layer._dynamic_layer
newlyr._uses_gbl_function = True
for key in kwargs:
newlyr._other_outputs.update({key:kwargs[key]})
return newlyr
def euclidean_distance(in_source_data,
cell_size=None,
max_distance=None,
distance_method="PLANAR",
in_barrier_data=None):
"""
Calculates, for each cell, the Euclidean distance to the closest source.
For more information, see
http://pro.arcgis.com/en/pro-app/help/data/imagery/euclidean-distance-global-function.htm
Parameters
----------
:param in_source_data: raster; The input raster that identifies the pixels or locations to
which the Euclidean distance for every output pixel location is calculated.
The input type can be an integer or a floating-point value.
:param cell_size: The pixel size at which the output raster will be created. If the cell
size was explicitly set in Environments, that will be the default cell size.
If Environments was not set, the output cell size will be the same as the
Source Raster
:param max_distance: The threshold that the accumulative distance values cannot exceed. If an
accumulative Euclidean distance exceeds this value, the output value for
the pixel location will be NoData. The default distance is to the edge
of the output raster
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Planar - Planar measurements use 2D Cartesian mathematics to calculate
length and area. The option is only available when measuring in a
projected coordinate system and the 2D plane of that coordinate system
will be used as the basis for the measurements. This is the default.
Geodesic - The shortest line between two points on the earth's surface
on a spheroid (ellipsoid). Therefore, regardless of input or output
projection, the results do not change.
.. note::
One use for a geodesic line is when you want to determine the shortest
distance between two cities for an airplane's flight path. This is also
known as a great circle line if based on a sphere rather than an ellipsoid.
:param in_barrier_data: Optional barrier raster.
:return: output raster with function applied
"""
layer, in_source_data, raster_ra = _raster_input(in_source_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "EucDistance_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_distance_raster",
"in_source_data": in_source_data,
}
}
if in_barrier_data is not None:
layer2, in_barrier_data, raster_ra2 = _raster_input(in_barrier_data)
template_dict["rasterFunctionArguments"]["in_barrier_data"] = in_barrier_data
if cell_size is not None:
template_dict["rasterFunctionArguments"]["cell_size"] = cell_size
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
distance_method_list = ["PLANAR","GEODESIC"]
if distance_method is not None:
if distance_method.upper() not in distance_method_list:
raise RuntimeError('distance_method should be one of the following '+ str(distance_method_list))
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_source_data"] = raster_ra
if in_barrier_data is not None:
function_chain_ra["rasterFunctionArguments"]["in_barrier_data"] = raster_ra2
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
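# --- Illustrative usage sketch (editor addition, not part of the original module) ---
# Assumes `source_layer` is an existing ImageryLayer whose valid cells mark the sources; the
# cell size and distance cap below are placeholder values. As the module docstring notes,
# persisting the result requires ImageryLayer.save() or arcgis.raster.analytics.generate_raster().
def _example_euclidean_distance(source_layer):
    return euclidean_distance(in_source_data=source_layer,
                              cell_size=30,
                              max_distance=5000,
                              distance_method="PLANAR")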
def euclidean_allocation(in_source_data,
in_value_raster=None,
max_distance=None,
cell_size=None,
source_field=None,
distance_method="PLANAR",
in_barrier_data=None):
"""
Calculates, for each cell, the nearest source based on Euclidean distance.
For more information, see
http://pro.arcgis.com/en/pro-app/help/data/imagery/euclidean-allocation-global-function.htm
Parameters
----------
:param in_source_data: raster; The input raster that identifies the pixels or locations to which
the Euclidean distance for every output pixel location is calculated.
The input type can be an integer or a floating-point value.
If the input Source Raster is floating point, the Value Raster must be set,
and it must be an integer. The Value Raster will take precedence over any
setting of the Source Field.
:param in_value_raster: The input integer raster that identifies the zone values that should be
used for each input source location. For each source location pixel, the
value defined by the Value Raster will be assigned to all pixels allocated
to the source location for the computation. The Value Raster will take
precedence over any setting for the Source Field .
:param max_distance: The threshold that the accumulative distance values cannot exceed. If an
accumulative Euclidean distance exceeds this value, the output value for
the pixel location will be NoData. The default distance is to the edge
of the output raster
:param cell_size: The pixel size at which the output raster will be created. If the cell size
was explicitly set in Environments, that will be the default cell size.
If Environments was not set, the output cell size will be the same as the
Source Raster
:param source_field: The field used to assign values to the source locations. It must be an
integer type. If the Value Raster has been set, the values in that input
will take precedence over any setting for the Source Field.
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Planar - Planar measurements use 2D Cartesian mathematics to calculate
length and area. The option is only available when measuring in a
projected coordinate system and the 2D plane of that coordinate system
will be used as the basis for the measurements. This is the default.
Geodesic - The shortest line between two points on the earth's surface
on a spheroid (ellipsoid). Therefore, regardless of input or output
projection, the results do not change.
.. note::
One use for a geodesic line is when you want to determine the shortest
distance between two cities for an airplane's flight path. This is also
known as a great circle line if based on a sphere rather than an ellipsoid.
:param in_barrier_data: Optional barrier raster.
:return: output raster with function applied
"""
layer1, in_source_data, raster_ra1 = _raster_input(in_source_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "EucAllocation_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_allocation_raster",
"in_source_data": in_source_data
}
}
if in_value_raster is not None:
layer2, in_value_raster, raster_ra2 = _raster_input(in_value_raster)
template_dict["rasterFunctionArguments"]["in_value_raster"] = in_value_raster
if in_barrier_data is not None:
layer3, in_barrier_data, raster_ra3 = _raster_input(in_barrier_data)
template_dict["rasterFunctionArguments"]["in_barrier_data"] = in_barrier_data
if cell_size is not None:
template_dict["rasterFunctionArguments"]["cell_size"] = cell_size
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
if source_field is not None:
template_dict["rasterFunctionArguments"]["source_field"] = source_field
distance_method_list = ["PLANAR","GEODESIC"]
if distance_method is not None:
if distance_method.upper() not in distance_method_list:
raise RuntimeError('distance_method should be one of the following '+ str(distance_method_list))
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_value_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_value_raster"] = raster_ra2
if in_barrier_data is not None:
function_chain_ra["rasterFunctionArguments"]["in_barrier_data"] = raster_ra3
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def cost_distance(in_source_data,
in_cost_raster,
max_distance=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Calculates the least accumulative cost distance for each cell from or to the least-cost
source over a cost surface.
For more information, see
http://pro.arcgis.com/en/pro-app/help/data/imagery/cost-distance-global-function.htm
Parameters
----------
:param in_source_data: The input raster that identifies the pixels or locations to which the
least accumulated cost distance for every output pixel location is
calculated. The Source Raster can be an integer or a floating-point value.
:param in_cost_raster: A raster defining the cost or impedance to move planimetrically through each pixel.
The value at each pixel location represents the cost-per-unit distance for moving
through it. Each pixel location value is multiplied by the pixel resolution, while
also compensating for diagonal movement to obtain the total cost of passing through
the pixel.
:param max_distance: The threshold that the accumulative cost values cannot exceed. If an accumulative cost
distance exceeds this value, the output value for the pixel location will be NoData.
The maximum distance defines the extent for which the accumulative cost distances are
calculated. The default distance is to the edge of the output raster.
    :param source_cost_multiplier: This parameter allows for control of the mode of travel or the magnitude at
                                   a source. The greater the multiplier, the greater the cost to move through
                                   each cell. The default value is 1. The values must be greater than 0.
                                   A numeric (double) value or a field from the Source Raster can be used
                                   for this parameter.
:param source_start_cost: The starting cost from which to begin the cost calculations. This parameter allows
for the specification of the fixed cost associated with a source. Instead of starting
at a cost of 0, the cost algorithm will begin with the value set here.
The default is 0. The value must be 0 or greater. A numeric (double) value or a field
from the Source Raster can be used for this parameter.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs as the
accumulative cost increases. It is used to model fatigue of the traveler. The growing
accumulative cost to reach a pixel is multiplied by the resistance rate and added to
the cost to move into the subsequent pixel.
It is a modified version of a compound interest rate formula that is used to calculate
the apparent cost of moving through a pixel. As the value of the resistance rate increases,
it increases the cost of the pixels that are visited later. The greater the resistance rate,
the higher the cost to reach the next pixel, which is compounded for each subsequent movement.
Since the resistance rate is similar to a compound rate and generally the accumulative cost
values are very large, small resistance rates are suggested, such as 0.005 or even smaller,
depending on the accumulative cost values.
The default is 0. The values must be 0 or greater. A numeric (double) value or a field from
the Source Raster can be used for this parameter.
:param source_capacity: Defines the cost capacity for the traveler for a source. The cost calculations continue for
each source until the specified capacity is reached.
The default capacity is to the edge of the output raster. The values must be greater than 0.
A double numeric value or a field from the Source Raster can be used for this parameter.
:param source_direction: Defines the direction of the traveler when applying the source resistance rate and the source
starting cost.
FROM_SOURCE - The source resistance rate and source starting cost will be applied beginning
at the input source and moving out to the nonsource cells. This is the default.
TO_SOURCE - The source resistance rate and source starting cost will be applied beginning at
each nonsource cell and moving back to the input source.
Either specify the From Source or To Source keyword, which will be applied to all sources,
or specify a field in the Source Raster that contains the keywords to identify the direction
of travel for each source. That field must contain the string From Source or To Source.
:return: output raster with function applied
"""
layer1, in_source_data, raster_ra1 = _raster_input(in_source_data)
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "CostDistance_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_distance_raster",
"in_source_data": in_source_data,
"in_cost_raster": in_cost_raster
}
}
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction is not None:
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list))
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_source_data"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_cost_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
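# --- Illustrative usage sketch (editor addition, not part of the original module) ---
# Assumes `source_layer` marks the source cells and `cost_layer` is a per-cell cost surface with
# positive values; the keyword values below are placeholders chosen to show the parameter types.
def _example_cost_distance(source_layer, cost_layer):
    return cost_distance(in_source_data=source_layer,
                         in_cost_raster=cost_layer,
                         max_distance=25000,
                         source_start_cost=0,
                         source_direction="FROM_SOURCE")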
def cost_allocation(in_source_data,
in_cost_raster,
in_value_raster=None,
max_distance=None,
source_field=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Calculates, for each cell, its least-cost source based on the least accumulative cost over a cost surface.
For more information, see
http://pro.arcgis.com/en/pro-app/help/data/imagery/cost-allocation-global-function.htm
Parameters
----------
:param in_source_data: The input raster that identifies the pixels or locations to which the
least accumulated cost distance for every output pixel location is
calculated. The Source Raster can be an integer or a floating-point value.
If the input Source Raster is floating point, the Value Raster must be set,
and it must be an integer. The Value Raster will take precedence over any
setting of the Source Field.
:param in_cost_raster: A raster defining the cost or impedance to move planimetrically through each pixel.
The value at each pixel location represents the cost-per-unit distance for moving
through it. Each pixel location value is multiplied by the pixel resolution, while
also compensating for diagonal movement to obtain the total cost of passing through
the pixel.
The values of the Cost Raster can be integer or floating point, but they cannot be
negative or zero.
:param in_value_raster: The input integer raster that identifies the zone values that should be used for
each input source location. For each source location pixel, the value defined by
the Value Raster will be assigned to all pixels allocated to the source location
for the computation. The Value Raster will take precedence over any setting for
the Source Field.
:param max_distance: The threshold that the accumulative cost values cannot exceed. If an accumulative cost
distance exceeds this value, the output value for the pixel location will be NoData.
The maximum distance defines the extent for which the accumulative cost distances are
calculated. The default distance is to the edge of the output raster.
:param source_field: The field used to assign values to the source locations. It must be an integer type.
If the Value Raster has been set, the values in that input will take precedence over
any setting for the Source Field.
:param source_cost_multiplier: This parameter allows for control of the mode of travel or the magnitude at
a source. The greater the multiplier, the greater the cost to move through each cell.
The default value is 1. The values must be greater than 0. A numeric (double) value or
a field from the Source Raster can be used for this parameter.
:param source_start_cost: The starting cost from which to begin the cost calculations. This parameter allows
for the specification of the fixed cost associated with a source. Instead of starting
at a cost of 0, the cost algorithm will begin with the value set here.
The default is 0. The value must be 0 or greater. A numeric (double) value or a field
from the Source Raster can be used for this parameter.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs as the
accumulative cost increases. It is used to model fatigue of the traveler. The growing
accumulative cost to reach a pixel is multiplied by the resistance rate and added to
the cost to move into the subsequent pixel.
It is a modified version of a compound interest rate formula that is used to calculate
the apparent cost of moving through a pixel. As the value of the resistance rate increases,
it increases the cost of the pixels that are visited later. The greater the resistance rate,
the higher the cost to reach the next pixel, which is compounded for each subsequent movement.
Since the resistance rate is similar to a compound rate and generally the accumulative cost
values are very large, small resistance rates are suggested, such as 0.005 or even smaller,
depending on the accumulative cost values.
The default is 0. The values must be 0 or greater. A numeric (double) value or a field from
the Source Raster can be used for this parameter.
:param source_capacity: Defines the cost capacity for the traveler for a source. The cost calculations continue for
each source until the specified capacity is reached.
The default capacity is to the edge of the output raster. The values must be greater than 0.
A double numeric value or a field from the Source Raster can be used for this parameter.
    :param source_direction: Defines the direction of the traveler when applying the source resistance rate and the source
starting cost.
FROM_SOURCE - The source resistance rate and source starting cost will be applied beginning
at the input source and moving out to the nonsource cells. This is the default.
TO_SOURCE - The source resistance rate and source starting cost will be applied beginning at
each nonsource cell and moving back to the input source.
Either specify the From Source or To Source keyword, which will be applied to all sources,
or specify a field in the Source Raster that contains the keywords to identify the direction
of travel for each source. That field must contain the string From Source or To Source.
:return: output raster with function applied
"""
layer1, in_source_data, raster_ra1 = _raster_input(in_source_data)
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "CostAllocation_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_allocation_raster",
"in_source_data": in_source_data,
"in_cost_raster": in_cost_raster
}
}
if in_value_raster is not None:
layer3, in_value_raster, raster_ra3 = _raster_input(in_value_raster)
template_dict["rasterFunctionArguments"]["in_value_raster"] = in_value_raster
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
if source_field is not None:
template_dict["rasterFunctionArguments"]["source_field"] = source_field
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction is not None:
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list))
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_source_data"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_cost_raster"] = raster_ra2
if in_value_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_value_raster"] = raster_ra3
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
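# --- Illustrative usage sketch (editor addition, not part of the original module) ---
# Assumes `source_layer`, `cost_layer`, and `zone_value_layer` are existing ImageryLayers. Per the
# docstring above, an integer `in_value_raster` takes precedence over `source_field`, so only the
# value raster is supplied here.
def _example_cost_allocation(source_layer, cost_layer, zone_value_layer):
    return cost_allocation(in_source_data=source_layer,
                           in_cost_raster=cost_layer,
                           in_value_raster=zone_value_layer,
                           max_distance=25000)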
def zonal_statistics(in_zone_data,
zone_field,
in_value_raster,
ignore_nodata=True,
statistics_type='MEAN',
process_as_multidimensional=None):
""""
Calculates statistics on values of a raster within the zones of another dataset.
For more information,
http://pro.arcgis.com/en/pro-app/help/data/imagery/zonal-statistics-global-function.htm
Parameters
----------
:param in_zone_data: Required ImageryLayer. Dataset that defines the zones. The zones can be defined by an integer raster
:param zone_field: Required str. Field that holds the values that define each zone. It can be an integer or a
string field of the zone raster.
:param in_value_raster: Required ImageryLayer. Raster that contains the values on which to calculate a statistic.
    :param ignore_nodata: Optional bool. Denotes whether NoData values in the Value Raster will influence the results
of the zone that they fall within.
True - Within any particular zone, only pixels that have a value in the Value
Raster will be used in determining the output value for that zone. NoData
pixels in the Value Raster will be ignored in the statistic calculation.
This is the default.
False - Within any particular zone, if any NoData pixels exist in the Value
Raster, it is deemed that there is insufficient information to perform
statistical calculations for all the pixels in that zone; therefore, the
entire zone will receive the NoData value on the output raster.
:param statistics_type: Optional str. Statistic type to be calculated. Default is MEAN
MEAN-Calculates the average of all pixels in the Value Raster that belong to
the same zone as the output pixel.
MAJORITY-Determines the value that occurs most often of all pixels in the
Value Raster that belong to the same zone as the output pixel.
MAXIMUM-Determines the largest value of all pixels in the Value Raster
that belong to the same zone as the output pixel.
MEDIAN-Determines the median value of all pixels in the Value Raster
that belong to the same zone as the output pixel.
MINIMUM-Determines the smallest value of all pixels in the Value Raster
that belong to the same zone as the output pixel.
MINORITY-Determines the value that occurs least often of all pixels in
the Value Raster that belong to the same zone as the output pixel.
RANGE-Calculates the difference between the largest and smallest value
of all pixels in the Value Raster that belong to the same zone as the
output pixel.
STD-Calculates the standard deviation of all pixels in
                            the Value Raster that belong to the same zone as the output pixel.
SUM-Calculates the total value of all pixels in the Value Raster that
belong to the same zone as the output pixel.
VARIETY-Calculates the number of unique values for all pixels in the
Value Raster that belong to the same zone as the output pixel.
:param process_as_multidimensional: Optional bool, Process as multidimensional if set to True. (If the input is multidimensional raster.)
:return: output raster with function applied
"""
layer1, in_zone_data, raster_ra1 = _raster_input(in_zone_data)
layer2, in_value_raster, raster_ra2 = _raster_input(in_value_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "ZonalStatistics_sa",
"PrimaryInputParameterName" : "in_value_raster",
"OutputRasterParameterName" : "out_raster",
"in_zone_data" : in_zone_data,
"zone_field" : zone_field,
"in_value_raster" : in_value_raster
}
}
if ignore_nodata is not None:
if not isinstance(ignore_nodata,bool):
raise RuntimeError('ignore_nodata should be a boolean value')
if ignore_nodata is True:
ignore_nodata = "DATA"
elif ignore_nodata is False:
ignore_nodata = "NODATA"
template_dict["rasterFunctionArguments"]["ignore_nodata"] = ignore_nodata
statistics_type_list = ["MEAN","MAJORITY","MAXIMUM","MEDIAN","MINIMUM","MINORITY","RANGE","STD","SUM","VARIETY"]
if statistics_type is not None:
if statistics_type.upper() not in statistics_type_list:
raise RuntimeError('statistics_type should be one of the following '+ str(statistics_type_list))
template_dict["rasterFunctionArguments"]["statistics_type"] = statistics_type
if process_as_multidimensional is not None:
if isinstance(process_as_multidimensional, bool):
if process_as_multidimensional==True:
template_dict["rasterFunctionArguments"]["process_as_multidimensional"]="ALL_SLICES"
else:
template_dict["rasterFunctionArguments"]["process_as_multidimensional"]="CURRENT_SLICE"
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_zone_data"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_value_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
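# --- Illustrative usage sketch (editor addition, not part of the original module) ---
# Assumes `zones_layer` is an integer zone raster whose zone identifiers live in a 'Value' field
# (the field name is a placeholder) and `value_layer` holds the measurements; 'MEAN' is one of the
# statistics types enumerated in the docstring above.
def _example_zonal_statistics(zones_layer, value_layer):
    return zonal_statistics(in_zone_data=zones_layer,
                            zone_field='Value',
                            in_value_raster=value_layer,
                            ignore_nodata=True,
                            statistics_type='MEAN')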
def least_cost_path(in_source_data,
in_cost_raster,
in_destination_data,
destination_field=None,
path_type="EACH_CELL",
max_distance=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Calculates the least-cost path from a source to a destination. The least accumulative cost distance
is calculated for each pixel over a cost surface, to the nearest source. This produces an output
raster that records the least-cost path, or paths, from selected locations to the closest source
pixels defined within the accumulative cost surface, in terms of cost distance.
For more information, see
http://pro.arcgis.com/en/pro-app/help/data/imagery/least-cost-path-global-function.htm
Parameters
----------
:param in_source_data: The input raster that identifies the pixels or locations to which the
least accumulated cost distance for every output pixel location is
calculated. The Source Raster can be an integer or a floating-point value.
If the input Source Raster is floating point, the Value Raster must be set,
and it must be an integer. The Value Raster will take precedence over any
setting of the Source Field.
:param in_cost_raster: A raster defining the cost or impedance to move planimetrically through each pixel.
The value at each pixel location represents the cost-per-unit distance for moving
through it. Each pixel location value is multiplied by the pixel resolution, while
also compensating for diagonal movement to obtain the total cost of passing through
the pixel.
The values of the Cost Raster can be integer or floating point, but they cannot be
negative or zero.
:param in_destination_data: A raster dataset that identifies the pixels from which the least-cost path is
determined to the least costly source. This input consists of pixels that have valid
values, and the remaining pixels must be assigned NoData. Values of 0 are valid.
:param destination_field: The field used to obtain values for the destination locations.
:param path_type: A keyword defining the manner in which the values and zones on the input destination
data will be interpreted in the cost path calculations:
EACH_CELL-A least-cost path is determined for each pixel with valid values on the
input destination data, and saved on the output raster. Each cell of the input
destination data is treated separately, and a least-cost path is determined for each from cell.
EACH_ZONE-A least-cost path is determined for each zone on the input destination data and
saved on the output raster. The least-cost path for each zone begins at the pixel with the
lowest cost distance weighting in the zone.
BEST_SINGLE-For all pixels on the input destination data, the least-cost path is derived
from the pixel with the minimum of the least-cost paths to source cells.
:param max_distance: The threshold that the accumulative cost values cannot exceed. If an accumulative cost
distance exceeds this value, the output value for the pixel location will be NoData.
The maximum distance defines the extent for which the accumulative cost distances are
calculated. The default distance is to the edge of the output raster.
:param source_field: The field used to assign values to the source locations. It must be an integer type.
If the Value Raster has been set, the values in that input will take precedence over
any setting for the Source Field.
    :param source_cost_multiplier: This parameter allows for control of the mode of travel or the magnitude at
                                   a source. The greater the multiplier, the greater the cost to move through
                                   each cell. The default value is 1. The values must be greater than 0.
                                   A numeric (double) value or a field from the Source Raster can be used
                                   for this parameter.
:param source_start_cost: The starting cost from which to begin the cost calculations. This parameter allows
for the specification of the fixed cost associated with a source. Instead of starting
at a cost of 0, the cost algorithm will begin with the value set here.
The default is 0. The value must be 0 or greater. A numeric (double) value or a field
from the Source Raster can be used for this parameter.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs as the
accumulative cost increases. It is used to model fatigue of the traveler. The growing
accumulative cost to reach a pixel is multiplied by the resistance rate and added to
the cost to move into the subsequent pixel.
It is a modified version of a compound interest rate formula that is used to calculate
the apparent cost of moving through a pixel. As the value of the resistance rate increases,
it increases the cost of the pixels that are visited later. The greater the resistance rate,
the higher the cost to reach the next pixel, which is compounded for each subsequent movement.
Since the resistance rate is similar to a compound rate and generally the accumulative cost
values are very large, small resistance rates are suggested, such as 0.005 or even smaller,
depending on the accumulative cost values.
The default is 0. The values must be 0 or greater. A numeric (double) value or a field from
the Source Raster can be used for this parameter.
:param source_capacity: Defines the cost capacity for the traveler for a source. The cost calculations continue for
each source until the specified capacity is reached.
The default capacity is to the edge of the output raster. The values must be greater than 0.
A double numeric value or a field from the Source Raster can be used for this parameter.
:param source_direction: Defines the direction of the traveler when applying the source resistance rate and the source
starting cost.
FROM_SOURCE - The source resistance rate and source starting cost will be applied beginning
at the input source and moving out to the nonsource cells. This is the default.
TO_SOURCE-The source resistance rate and source starting cost will be applied beginning at
each nonsource cell and moving back to the input source.
Either specify the From Source or To Source keyword, which will be applied to all sources,
or specify a field in the Source Raster that contains the keywords to identify the direction
of travel for each source. That field must contain the string From Source or To Source.
:return: output raster with function applied
"""
layer1, in_source_data, raster_ra1 = _raster_input(in_source_data)
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
layer3, in_destination_data, raster_ra3 = _raster_input(in_destination_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "ShortestPath",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_path_raster",
"in_source_data" : in_source_data,
"in_cost_raster" : in_cost_raster,
"in_destination_data" : in_destination_data
}
}
if destination_field is not None:
template_dict["rasterFunctionArguments"]["destination_field"] = destination_field
if path_type is not None:
path_type_list = ["EACH_CELL", "EACH_ZONE", "BEST_SINGLE"]
if path_type.upper() not in path_type_list:
raise RuntimeError('path_type should be one of the following '+ str(path_type_list))
template_dict["rasterFunctionArguments"]["path_type"] = path_type
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction is not None:
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_cost_raster"] = raster_ra2
function_chain_ra["rasterFunctionArguments"]["in_destination_data"] = raster_ra3
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
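# --- Illustrative usage sketch (editor addition, not part of the original module) ---
# Assumes the three inputs are existing ImageryLayers: the sources, a positive cost surface, and
# the destination cells from which paths are traced back; "BEST_SINGLE" is one of the path_type
# keywords described in the docstring above.
def _example_least_cost_path(source_layer, cost_layer, destination_layer):
    return least_cost_path(in_source_data=source_layer,
                           in_cost_raster=cost_layer,
                           in_destination_data=destination_layer,
                           path_type="BEST_SINGLE")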
def flow_distance(input_stream_raster,
input_surface_raster,
input_flow_direction_raster=None,
distance_type="VERTICAL",
flow_direction_type= "D8",
statistics_type="MINIMUM"):
"""
This function computes, for each cell, the minimum downslope
horizontal or vertical distance to cell(s) on a stream or
river into which they flow. If an optional flow direction
raster is provided, the down slope direction(s) will be
limited to those defined by the input flow direction raster.
Parameters
----------
:param input_stream_raster: An input raster that represents a linear stream network
:param input_surface_raster: The input raster representing a continuous surface.
:param input_flow_direction_raster: The input raster that shows the direction of flow out of each cell.
:param distance_type: VERTICAL or HORIZONTAL distance to compute; if not
specified, VERTICAL distance is computed.
:param flow_direction_type: Optional String; Defines the type of the input flow direction raster.
D8 - The input flow direction raster is of type D8. This is the default.
MFD - The input flow direction raster is of type Multi Flow Direction (MFD).
Dinf - The input flow direction raster is of type D-Infinity (DINF).
:param statistics_type: Optional String; Determines the statistics type used to compute flow distance
over multiple flow paths.
If there is only a single flow path from each cell to a cell on the stream,
all statistics types produce the same result.
MINIMUM - Where multiple flow paths exist, minimum flow distance in computed.
This is the default.
WEIGHTED_MEAN - Where multiple flow paths exist, a weighted mean of flow distance
is computed. Flow proportion from a cell to its downstream neighboring cells are
used as weights for computing weighted mean.
MAXIMUM - When multiple flow paths exist, maximum flow distance is computed.
:return: output raster with function applied
"""
layer1, input_stream_raster, raster_ra1 = _raster_input(input_stream_raster)
layer2, input_surface_raster, raster_ra2 = _raster_input(input_surface_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "FlowDistance_sa",
"PrimaryInputParameterName" : "in_stream_raster",
"OutputRasterParameterName" : "out_raster",
"in_stream_raster" : input_stream_raster,
"in_surface_raster" : input_surface_raster,
}
}
if input_flow_direction_raster is not None:
layer3, input_flow_direction_raster, raster_ra3 = _raster_input(input_flow_direction_raster)
template_dict["rasterFunctionArguments"]["in_flow_direction_raster"] = input_flow_direction_raster
distance_type_list = ["VERTICAL","HORIZONTAL"]
if distance_type is not None:
if distance_type.upper() not in distance_type_list:
raise RuntimeError('distance_type should be one of the following '+ str(distance_type_list))
template_dict["rasterFunctionArguments"]["distance_type"] = distance_type
flow_direction_type_list = ["D8","MFD","DINF"]
if flow_direction_type is not None:
if flow_direction_type.upper() not in flow_direction_type_list:
raise RuntimeError('flow_direction_type should be one of the following D8, MFD, Dinf')
template_dict["rasterFunctionArguments"]["flow_direction_type"] = flow_direction_type
statistics_type_allowed_values = ["MINIMUM","WEIGHTED_MEAN","MAXIMUM"]
    if statistics_type.lower() not in [element.lower() for element in statistics_type_allowed_values]:
        raise RuntimeError('statistics_type can only be one of the following: ' + str(statistics_type_allowed_values))
for element in statistics_type_allowed_values:
if statistics_type.lower() == element.lower():
template_dict["rasterFunctionArguments"]["statistics_type"] = statistics_type
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_stream_raster"] = raster_ra1
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra2
if input_flow_direction_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra3
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
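# --- Illustrative usage sketch (editor addition, not part of the original module) ---
# Assumes `stream_layer` and `surface_layer` are existing ImageryLayers (a stream network and a
# continuous surface). The optional flow direction raster is omitted, so downslope directions are
# derived from the surface alone, as described in the docstring above.
def _example_flow_distance(stream_layer, surface_layer):
    return flow_distance(input_stream_raster=stream_layer,
                         input_surface_raster=surface_layer,
                         distance_type="VERTICAL",
                         statistics_type="MINIMUM")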
def flow_accumulation(input_flow_direction_raster,
input_weight_raster=None,
data_type="FLOAT",
flow_direction_type= "D8"):
""""
Replaces cells of a raster corresponding to a mask
with the values of the nearest neighbors.
:param input_flow_direction_raster: The input raster that shows the direction of flow out of each cell.
:param input_weight_raster: An optional input raster for applying a weight to each cell.
:param data_type: Optional string. The output data type: INTEGER, FLOAT, or DOUBLE. The default is FLOAT.
:param flow_direction_type: Optional string. The type of the input flow direction raster: D8 (default), MFD, or DINF.
:return: output raster with function applied
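A minimal usage sketch follows; ``flow_dir_lyr`` and ``weight_lyr`` are hypothetical imagery layers supplied by the caller.
.. code-block:: python
# Usage Example: accumulate weighted flow along a D8 flow direction raster.
flow_acc_output = flow_accumulation(input_flow_direction_raster=flow_dir_lyr,
input_weight_raster=weight_lyr,
data_type="FLOAT",
flow_direction_type="D8")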
"""
layer1, input_flow_direction_raster, raster_ra1 = _raster_input(input_flow_direction_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "FlowAccumulation_sa",
"PrimaryInputParameterName" : "in_flow_direction_raster",
"OutputRasterParameterName" : "out_accumulation_raster",
"in_flow_direction_raster" : input_flow_direction_raster
}
}
if input_weight_raster is not None:
layer2, input_weight_raster, raster_ra2 = _raster_input(input_weight_raster)
template_dict["rasterFunctionArguments"]["in_weight_raster"] = input_weight_raster
data_type_list=["FLOAT","INTEGER","DOUBLE"]
if data_type is not None:
if data_type.upper() not in data_type_list:
raise RuntimeError('data_type should be one of the following '+ str(data_type_list))
template_dict["rasterFunctionArguments"]["data_type"] = data_type
flow_direction_type_list = ["D8","MFD","DINF"]
if flow_direction_type is not None:
if flow_direction_type.upper() not in flow_direction_type_list:
raise RuntimeError('flow_direction_type should be one of the following '+ str(flow_direction_type_list))
template_dict["rasterFunctionArguments"]["flow_direction_type"] = flow_direction_type
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra1
if input_weight_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_weight_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def flow_direction(input_surface_raster,
force_flow="NORMAL",
flow_direction_type="D8",
generate_out_drop_raster=False):
"""
.. image:: _static/images/flow_direction/flow_direction.png
The ``flow_direction`` task creates a raster of flow direction from each cell to its steepest downslope neighbor.
This task supports three flow modeling algorithms. Those are D8, Multi Flow Direction (MFD), and D-Infinity (DINF).
**D8 flow modeling algorithm**
The D8 flow method models flow direction from each cell to its steepest downslope neighbor.
The output of the FlowDirection task run with the D8 flow direction type is an integer
raster whose values range from 1-255. The values for each direction from the center are the following:
.. image:: _static/images/flow_direction/D8.gif
For example, if the direction of steepest drop was to the left of the current
processing cell, its flow direction would be coded as 16.
The following are additional considerations for using the D8 flow method:
* If a cell is lower than its eight neighbors, that cell is given the value
of its lowest neighbor, and flow is defined toward this cell. If multiple
neighbors have the lowest value, the cell is still given this value, but
flow is defined with one of the two methods explained below. This is used
to filter out one-cell sinks, which are considered noise.
* If a cell has the same change in z-value in multiple directions and that
cell is part of a sink, the flow direction is referred to as undefined. In
such cases, the value for that cell in the output flow direction raster will
be the sum of those directions. For example, if the change in z-value is the
same both to the right (flow direction = 1) and down (flow direction = 4),
the flow direction for that cell is 5.
* If a cell has the same change in z-value in multiple directions and is not
part of a sink, the flow direction is assigned with a lookup table defining
the most likely direction. See Greenlee (1987).
* The output drop raster is calculated as the difference in z-value divided by
the path length between the cell centers, expressed in percentages. For adjacent
cells, this is analogous to the percent slope between cells. Across a flat area,
the distance becomes the distance to the nearest cell of lower elevation.
The result is a map of percent rise in the path of steepest descent from
each cell.
* When calculating a drop raster in flat areas, the distance to diagonally
adjacent cells (1.41421 * cell size) is approximated by 1.5 * cell
size for improved performance.
* With the force_flow parameter set to the default value NORMAL, a cell
at the edge of the surface raster will flow toward the inner cell
with the steepest drop in z-value. If the drop is less than or equal to zero,
the cell will flow out of the surface raster.
**MFD flow modeling algorithm**
The MFD algorithm, described by Qin et al. (2007), partitions flow from a cell to all downslope neighbors.
A flow-partition exponent is created from an adaptive approach based on local terrain conditions and is used
to determine the fraction of flow draining to all downslope neighbors.
When the MFD flow direction output is added to a map, it only displays the D8 flow direction.
As MFD flow directions have potentially multiple values tied to each cell (each value corresponds
to proportion of flow to each downslope neighbor), it is not easily visualized. However, an MFD
flow direction output raster is an input recognized by the FlowAccumulation task that would utilize
the MFD flow directions to proportion and accumulate flow from each cell to all downslope neighbors.
**DINF flow modeling algorithm**
The DINF flow method, described by Tarboton (1997), determines flow direction as the steepest
downward slope on eight triangular facets formed in a 3x3 cell window centered on the cell of
interest. The flow direction output is a floating-point raster represented as a single angle in
degrees going counter-clockwise from 0 (due east) to 360 (also due east).
================================ ====================================================================
**Argument** **Description**
-------------------------------- --------------------------------------------------------------------
input_surface_raster Required. The input raster representing a continuous surface.
This parameter can be specified as a Portal Item ID, a URL to a raster image service layer,
a cloud raster dataset, or a shared raster dataset.
-------------------------------- --------------------------------------------------------------------
force_flow Optional string. Specifies if edge cells will always flow outward or follow normal flow rules.
Choice list: ['NORMAL', 'FORCE']
The default value is 'NORMAL'.
-------------------------------- --------------------------------------------------------------------
flow_direction_type Optional string. Specifies the flow direction type to use.
Choice list: ['D8', 'MFD', 'DINF']
* ``D8`` is for the D8 flow direction type. This is the default.
* ``MFD`` is for the Multi Flow Direction type.
* ``DINF`` is for the D-Infinity type.
The default value is 'D8'.
-------------------------------- --------------------------------------------------------------------
generate_out_drop_raster Optional boolean. Determines whether the output drop raster should be generated.
Set this parameter to True in order to generate the out_drop_raster.
If set to True, the output will be a named tuple with fields
output_flow_direction_service and output_drop_service.
================================ ====================================================================
:returns: output raster with function applied
.. code-block:: python
# Usage Example: To compute flow direction and the output drop raster from an input surface raster.
flow_direction_output = flow_direction(input_surface_raster=in_raster,
force_flow="NORMAL",
flow_direction_type="D8",
generate_out_drop_raster=True)
out_var = flow_direction_output.save()
out_var.output_flow_direction_service # gives you the output flow direction imagery layer item
out_var.output_drop_service # gives you the output drop raster imagery layer item
"""
layer, input_surface_raster, raster_ra = _raster_input(input_surface_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "FlowDirection_sa",
"PrimaryInputParameterName" : "in_surface_raster",
"OutputRasterParameterName" : "out_flow_direction_raster",
"in_surface_raster" : input_surface_raster
}
}
force_flow_list = ["NORMAL","FORCE"]
if force_flow is not None:
if force_flow.upper() not in force_flow_list:
raise RuntimeError('force_flow should be one of the following '+ str(force_flow_list))
template_dict["rasterFunctionArguments"]["force_flow"] = force_flow
flow_direction_type_list = ["D8","MFD","DINF"]
if flow_direction_type is not None:
if flow_direction_type.upper() not in flow_direction_type_list:
raise RuntimeError('flow_direction_type should be one of the following '+ str(flow_direction_type_list))
template_dict["rasterFunctionArguments"]["flow_direction_type"] = flow_direction_type
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_surface_raster"] = raster_ra
if generate_out_drop_raster is True:
return _gbl_clone_layer(layer, template_dict, function_chain_ra, out_drop_raster = generate_out_drop_raster, use_ra=True)
return _gbl_clone_layer(layer, template_dict, function_chain_ra, out_drop_raster = generate_out_drop_raster)
def fill(input_surface_raster,
zlimit=None):
"""
Fills sinks in a surface raster to remove small imperfections in the data
Parameters
----------
:param input_surface_raster: The input raster representing a continuous surface.
:param zlimit: Data type - Double. Maximum elevation difference between a sink and
its pour point to be filled.
If the difference in z-values between a sink and its pour point is greater than the z_limit, that sink will not be filled.
The value for z-limit must be greater than zero.
Unless a value is specified for this parameter, all sinks will be filled, regardless of depth.
:return: output raster with function applied
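A minimal usage sketch follows; ``surface_lyr`` is a hypothetical imagery layer supplied by the caller.
.. code-block:: python
# Usage Example: fill all sinks shallower than 10 z-units; deeper sinks are left unfilled.
filled_output = fill(input_surface_raster=surface_lyr, zlimit=10)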
"""
layer, input_surface_raster, raster_ra = _raster_input(input_surface_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Fill_sa",
"PrimaryInputParameterName" : "in_surface_raster",
"OutputRasterParameterName" : "out_surface_raster",
"in_surface_raster" : input_surface_raster
}
}
if zlimit is not None:
template_dict["rasterFunctionArguments"]["z_limit"] = zlimit
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_surface_raster"] = raster_ra
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
def nibble(input_raster,
input_mask_raster,
nibble_values= "ALL_VALUES",
nibble_no_data= "PRESERVE_NODATA",
input_zone_raster=None):
"""
Replaces cells of a raster corresponding to a mask
with the values of the nearest neighbors.
Parameters
----------
:param input_raster: The input raster to nibble.
The input raster can be either integer or floating point type.
:param input_mask_raster: The input raster to use as the mask.
:param nibble_values: Optional string. Possible options are "ALL_VALUES" and "DATA_ONLY".
Default is "ALL_VALUES".
:param nibble_no_data: Optional string. Possible values are PRESERVE_NODATA and PROCESS_NODATA.
Default is PRESERVE_NODATA.
:param input_zone_raster: The input raster that defines the zones to use as the mask.
:return: output raster with function applied
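A minimal usage sketch follows; ``value_lyr`` and ``mask_lyr`` are hypothetical imagery layers supplied by the caller.
.. code-block:: python
# Usage Example: replace masked cells with nearest-neighbor values, preserving NoData cells.
nibble_output = nibble(input_raster=value_lyr,
input_mask_raster=mask_lyr,
nibble_values="ALL_VALUES",
nibble_no_data="PRESERVE_NODATA")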
"""
layer1, input_raster, raster_ra1 = _raster_input(input_raster)
layer2, input_mask_raster, raster_ra2 = _raster_input(input_mask_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Nibble_sa",
"PrimaryInputParameterName" : "in_raster",
"OutputRasterParameterName" : "out_raster",
"in_raster" : input_raster,
"in_mask_raster" : input_mask_raster,
"nibble_values" : nibble_values,
"nibble_nodata" : nibble_no_data
}
}
nibble_values_list = ["ALL_VALUES","DATA_ONLY"]
if nibble_values is not None:
if nibble_values.upper() not in nibble_values_list:
raise RuntimeError('nibble_values should be one of the following '+ str(nibble_values_list))
template_dict["rasterFunctionArguments"]["nibble_values"] = nibble_values
nibble_no_data_list = ["PRESERVE_NODATA","PROCESS_NODATA"]
if nibble_no_data is not None:
if nibble_no_data.upper() not in nibble_no_data_list:
raise RuntimeError('nibble_nodata should be one of the following '+ str(nibble_no_data_list))
template_dict["rasterFunctionArguments"]["nibble_nodata"] = nibble_no_data
if input_zone_raster is not None:
layer3, input_zone_raster, raster_ra3 = _raster_input(input_zone_raster)
template_dict["rasterFunctionArguments"]["in_zone_raster"] = input_zone_raster
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_raster"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_mask_raster"] = raster_ra2
if input_zone_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_zone_raster"] = raster_ra3
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def stream_link(input_raster,
input_flow_direction_raster):
"""
Assigns unique values to sections of a raster linear network between intersections
Parameters
----------
:param input_raster: An input raster that represents a linear stream network.
:param input_flow_direction_raster: The input raster that shows the direction of flow out of each cell
:return: output raster with function applied
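A minimal usage sketch follows; ``stream_lyr`` and ``flow_dir_lyr`` are hypothetical imagery layers supplied by the caller.
.. code-block:: python
# Usage Example: assign unique link IDs to stream segments between junctions.
stream_link_output = stream_link(input_raster=stream_lyr,
input_flow_direction_raster=flow_dir_lyr)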
"""
layer1, input_raster, raster_ra1 = _raster_input(input_raster)
layer2, input_flow_direction_raster, raster_ra2 = _raster_input(input_flow_direction_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "StreamLink_sa",
"PrimaryInputParameterName" : "in_stream_raster",
"OutputRasterParameterName" : "out_raster",
"in_stream_raster" : input_raster,
"in_flow_direction_raster" : input_flow_direction_raster
}
}
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_stream_raster"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def watershed(input_flow_direction_raster,
input_pour_point_data,
pour_point_field=None):
"""
Determines the contributing area above a set of cells in a raster.
Parameters
----------
:param input_flow_direction_raster: The input raster that shows the direction of flow out of each cell.
:param input_pour_point_data: The input pour point locations. For a raster, this represents cells above
which the contributing area, or catchment, will be determined. All cells that
are not NoData will be used as source cells.
For a point feature dataset, this represents locations above which the contributing
area, or catchment, will be determined.
:param pour_point_field: Field used to assign values to the pour point locations. If the pour point dataset is a
raster, use Value.
If the pour point dataset is a feature, use a numeric field. If the field contains
floating-point values, they will be truncated into integers.
:return: output raster with function applied
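A minimal usage sketch follows; ``flow_dir_lyr`` and ``pour_points_lyr`` are hypothetical layers supplied by the caller.
.. code-block:: python
# Usage Example: delineate the catchments above the pour point locations.
watershed_output = watershed(input_flow_direction_raster=flow_dir_lyr,
input_pour_point_data=pour_points_lyr,
pour_point_field="Value")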
"""
layer1, input_flow_direction_raster, raster_ra1 = _raster_input(input_flow_direction_raster)
layer2, input_pour_point_data, raster_ra2 = _raster_input(input_pour_point_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Watershed_sa",
"PrimaryInputParameterName" : "in_flow_direction_raster",
"OutputRasterParameterName" : "out_raster",
"in_flow_direction_raster" : input_flow_direction_raster,
"in_pour_point_data" : input_pour_point_data
}
}
if pour_point_field is not None:
template_dict["rasterFunctionArguments"]["pour_point_field"] = pour_point_field
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_pour_point_data"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def calculate_travel_cost(in_source_data,
in_cost_raster=None,
in_surface_raster=None,
in_horizontal_raster=None,
in_vertical_raster=None,
horizontal_factor="BINARY",
vertical_factor="BINARY",
maximum_distance=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None,
allocation_field=None,
generate_out_allocation_raster=False,
generate_out_backlink_raster=False):
"""
Parameters
----------
:param in_source_data: The layer that defines the sources to calculate the distance to. The layer
can be raster or feature.
:param in_cost_raster: A raster defining the impedance or cost to move planimetrically through each cell.
:param in_surface_raster: A raster defining the elevation values at each cell location.
:param in_horizontal_raster: A raster defining the horizontal direction at each cell.
:param in_vertical_raster: A raster defining the vertical (z) value for each cell.
:param horizontal_factor: The Horizontal Factor defines the relationship between the horizontal cost
factor and the horizontal relative moving angle.
Possible values are: "BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"
:param vertical_factor: The Vertical Factor defines the relationship between the vertical cost factor and
the vertical relative moving angle (VRMA)
Possible values are: "BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"
:param maximum_distance: The maximum distance to calculate out to. If no distance is provided, a default will
be calculated that is based on the locations of the input sources.
:param source_cost_multiplier: Multiplier to apply to the cost values.
:param source_start_cost: The starting cost from which to begin the cost calculations.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs
as the accumulative cost increases.
:param source_capacity: Defines the cost capacity for the traveler for a source.
:param source_direction: Defines the direction of the traveler when applying horizontal and vertical factors,
the source resistance rate, and the source starting cost.
Possible values: FROM_SOURCE, TO_SOURCE
:param allocation_field: A field on the input source data (raster or features) layer that holds the values that define each source.
:param generate_out_backlink_raster: Boolean, determines whether out_backlink_raster should be generated or not.
Set this parameter to True in order to generate the out_backlink_raster.
If set to True, the output will be a named tuple with fields
output_distance_service and output_backlink_service.
eg,
out_layer = calculate_travel_cost(in_source_data,
generate_out_backlink_raster=True)
out_var = out_layer.save()
then,
out_var.output_distance_service -> gives you the output distance imagery layer item
out_var.output_backlink_service -> gives you the output backlink raster imagery layer item
:param generate_out_allocation_raster: Boolean, determines whether out_allocation_raster should be generated or not.
Set this parameter to True in order to generate the out_allocation_raster.
If set to True, the output will be a named tuple with fields
output_distance_service and output_allocation_service.
eg,
out_layer = calculate_travel_cost(in_source_data,
generate_out_allocation_raster=True)
out_var = out_layer.save()
then,
out_var.output_distance_service -> gives you the output distance imagery layer item
out_var.output_allocation_service -> gives you the output allocation raster imagery layer item
:return: output raster with function applied
"""
if isinstance (in_source_data, ImageryLayer):
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
else:
raster_ra1 = _layer_input(in_source_data)
input_source_data = raster_ra1
layer1=raster_ra1
if in_cost_raster is not None:
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
if in_surface_raster is not None:
layer3, in_surface_raster, raster_ra3 = _raster_input(in_surface_raster)
if in_horizontal_raster is not None:
layer4, in_horizontal_raster, raster_ra4 = _raster_input(in_horizontal_raster)
if in_vertical_raster is not None:
layer5, in_vertical_raster, raster_ra5 = _raster_input(in_vertical_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "CalculateTravelCost_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_distance_raster",
"in_source_data" : input_source_data
}
}
if in_cost_raster is not None:
template_dict["rasterFunctionArguments"]["in_cost_raster"] = in_cost_raster
if in_surface_raster is not None:
template_dict["rasterFunctionArguments"]["in_surface_raster"] = in_surface_raster
if in_horizontal_raster is not None:
template_dict["rasterFunctionArguments"]["in_horizontal_raster"] = in_horizontal_raster
if in_vertical_raster is not None:
template_dict["rasterFunctionArguments"]["in_vertical_raster"] = in_vertical_raster
horizontal_factor_list = ["BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"]
if horizontal_factor.upper() not in horizontal_factor_list:
raise RuntimeError('horizontal_factor should be one of the following '+ str(horizontal_factor_list))
template_dict["rasterFunctionArguments"]["horizontal_factor"] = horizontal_factor
vertical_factor_list = ["BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"]
if vertical_factor.upper() not in vertical_factor_list:
raise RuntimeError('vertical_factor should be one of the following '+ str(vertical_factor_list))
template_dict["rasterFunctionArguments"]["vertical_factor"] = vertical_factor
if maximum_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = maximum_distance
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
if allocation_field is not None:
template_dict["rasterFunctionArguments"]["allocation_field"] = allocation_field
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_cost_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_cost_raster"] = raster_ra2
if in_surface_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra3
if in_horizontal_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_horizontal_raster"] = raster_ra4
if in_vertical_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_vertical_raster"] = raster_ra5
if isinstance(in_source_data, ImageryLayer):
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra, out_allocation_raster = generate_out_allocation_raster, out_backlink_raster = generate_out_backlink_raster, use_ra=True)
else:
return _feature_gbl_clone_layer(in_source_data, template_dict, function_chain_ra, out_allocation_raster = generate_out_allocation_raster, out_backlink_raster = generate_out_backlink_raster, use_ra=True)
def kernel_density(in_features,
population_field,
cell_size=None,
search_radius=None,
area_unit_scale_factor="SQUARE_MAP_UNITS",
out_cell_values="DENSITIES",
method="PLANAR"):
"""
Calculates a magnitude-per-unit area from point or polyline features using a kernel function to
fit a smoothly tapered surface to each point or polyline.
For more information, see
http://pro.arcgis.com/en/pro-app/help/data/imagery/kernel-density-global-function.htm
Parameters
----------
:param in_features: The input point or line features for which to calculate the density
:param population_field: Field denoting population values for each feature. The Population
Field is the count or quantity to be spread across the landscape to
create a continuous surface. Values in the population field may be
integer or floating point.
:param cell_size: The pixel size for the output raster dataset. If the Cellsize has
been set in the geoprocessing Environments it will be the default.
:param search_radius: The search radius within which to calculate density. Units are
based on the linear unit of the projection.
:param area_unit_scale_factor: The desired area units of the output density values.
SQUARE_MAP_UNITS - For the square of the linear units of the output spatial reference.
SQUARE_MILES - For (U.S.) miles.
SQUARE_KILOMETERS - For kilometers.
ACRES - For (U.S.) acres.
HECTARES - For hectares.
SQUARE_METERS - For meters.
SQUARE_YARDS - For (U.S.) yards.
SQUARE_FEET - For (U.S.) feet.
SQUARE_INCHES - For (U.S.) inches.
SQUARE_CENTIMETERS - For centimeters.
SQUARE_MILLIMETERS - For millimeters.
:param out_cell_values: Determines what the values in the output raster represent.
DENSITIES - The output values represent the predicted density value. This is the default.
EXPECTED_COUNTS - The output values represent the predicted amount of the phenomenon within each
pixel. Since the pixel value is linked to the specified Cellsize, the resulting raster cannot be
resampled to a different pixel size and still represent the amount of the phenomenon.
:param method: Determines whether to use a shortest path on a spheroid (geodesic) or a flat earth (planar) method.
PLANAR - Uses planar distances between the features. This is the default.
GEODESIC - Uses geodesic distances between features. This method takes into account the curvature
of the spheroid and correctly deals with data near the poles and the international date line.
:return: output raster
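A minimal usage sketch follows; ``incidents_lyr`` is a hypothetical point feature layer and ``POP`` a hypothetical numeric field supplied by the caller.
.. code-block:: python
# Usage Example: compute a planar density surface in events per square kilometer.
density_output = kernel_density(in_features=incidents_lyr,
population_field="POP",
search_radius=5000,
area_unit_scale_factor="SQUARE_KILOMETERS",
out_cell_values="DENSITIES",
method="PLANAR")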
"""
input_features = _layer_input(in_features)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "KernelDensity_sa",
"PrimaryInputParameterName":"in_features",
"OutputRasterParameterName":"out_raster",
"in_features": input_features,
"population_field":population_field,
"RasterInfo":{"blockWidth" : 2048,
"blockHeight":256,
"bandCount":1,
"pixelType":9,
"firstPyramidLevel":1,
"maximumPyramidLevel":30,
"pixelSizeX":0,
"pixelSizeY" :0,
"type":"RasterInfo"}
}
}
if search_radius is not None:
template_dict["rasterFunctionArguments"]["search_radius"] = search_radius
if cell_size is not None:
template_dict["rasterFunctionArguments"]["cell_size"] = cell_size
if area_unit_scale_factor is not None:
area_unit_scale_factor_list = ["SQUARE_MAP_UNITS","SQUARE_MILES", "SQUARE_KILOMETERS", "ACRES","HECTARES","SQUARE_METERS","SQUARE_YARDS",
"SQUARE_FEET","SQUARE_INCHES", "SQUARE_CENTIMETERS","SQUARE_MILLIMETERS"]
if area_unit_scale_factor.upper() not in area_unit_scale_factor_list:
raise RuntimeError('area_unit_scale_factor should be one of the following '+ str(area_unit_scale_factor_list))
template_dict["rasterFunctionArguments"]["area_unit_scale_factor"] = area_unit_scale_factor
out_cell_values_list = ["DENSITIES", "EXPECTED_COUNTS"]
if out_cell_values.upper() not in out_cell_values_list:
raise RuntimeError('out_cell_values should be one of the following '+ str(out_cell_values_list))
template_dict["rasterFunctionArguments"]["out_cell_values"] = out_cell_values
method_list = ["PLANAR", "GEODESIC"]
if method.upper() not in method_list:
raise RuntimeError('method should be one of the following '+ str(method_list))
template_dict["rasterFunctionArguments"]["method"] = method
if isinstance(in_features, Item):
in_features = in_features.layers[0]
newlyr = ImageryLayer(in_features._url, in_features._gis)
newlyr._fn = template_dict
newlyr._fnra = template_dict
newlyr._uses_gbl_function = True
return newlyr
def cost_path(in_destination_data,
in_cost_distance_raster,
in_cost_backlink_raster,
path_type="EACH_CELL",
destination_field=None,
force_flow_direction_convention=None,
):
"""
Calculates the least-cost path from a source to a destination.
Parameters
----------
:param in_destination_data: A raster or feature dataset that identifies those cells from which the least-cost
path is determined to the least costly source. If the input is a raster, the input
consists of cells that have valid values (zero is a valid value), and the remaining
cells must be assigned NoData.
:param in_cost_distance_raster: The name of a cost distance raster to be used to determine the least-cost path from
the destination locations to a source. The cost distance raster is usually created
with the Cost Distance, Cost Allocation or Cost Back Link tools. The cost distance
raster stores, for each cell, the minimum accumulative cost distance over a cost
surface from each cell to a set of source cells.
:param in_cost_backlink_raster: The name of a cost back link raster used to determine the path to return to a source
via the least-cost path. For each cell in the back link raster, a value identifies
the neighbor that is the next cell on the least accumulative cost path from the cell
to a single source cell or set of source cells.
:param path_type: A keyword defining the manner in which the values and zones on the input destination
data will be interpreted in the cost path calculations.
EACH_CELL - For each cell with valid values on the input destination data, a least-cost
path is determined and saved on the output raster. With this option, each cell of the
input destination data is treated separately, and a least-cost path is determined for
each from cell.
EACH_ZONE - For each zone on the input destination data, a least-cost path is determined
and saved on the output raster. With this option, the least-cost path for each zone
begins at the cell with the lowest cost distance weighting in the zone.
BEST_SINGLE - For all cells on the input destination data, the least-cost path is derived
from the cell with the minimum of the least-cost paths to source cells.
:param destination_field: The field used to obtain values for the destination locations. Input feature data must
contain at least one valid field.
:param force_flow_direction_convention: Optional boolean. Set to True to force flow direction convention for backlink raster
:return: output raster with function applied
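A minimal usage sketch follows; ``dest_lyr``, ``cost_dist_lyr``, and ``cost_backlink_lyr`` are hypothetical imagery layers supplied by the caller (for example, outputs of cost distance and cost backlink functions).
.. code-block:: python
# Usage Example: trace the single best least-cost path back to the source.
cost_path_output = cost_path(in_destination_data=dest_lyr,
in_cost_distance_raster=cost_dist_lyr,
in_cost_backlink_raster=cost_backlink_lyr,
path_type="BEST_SINGLE")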
"""
layer1, in_destination_data, raster_ra1 = _raster_input(in_destination_data)
layer2, in_cost_distance_raster, raster_ra2 = _raster_input(in_cost_distance_raster)
layer3, in_cost_backlink_raster, raster_ra3 = _raster_input(in_cost_backlink_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "CostPath_sa",
"PrimaryInputParameterName":"in_destination_data",
"OutputRasterParameterName":"out_raster",
"in_destination_data": in_destination_data,
"in_cost_distance_raster": in_cost_distance_raster,
"in_cost_backlink_raster": in_cost_backlink_raster
}
}
if path_type is not None:
path_type_list = ["EACH_CELL", "EACH_ZONE", "BEST_SINGLE"]
if path_type.upper() not in path_type_list:
raise RuntimeError('path_type should be one of the following '+ str(path_type_list))
template_dict["rasterFunctionArguments"]["path_type"] = path_type
if destination_field is not None:
template_dict["rasterFunctionArguments"]["destination_field"] = destination_field
if force_flow_direction_convention is not None:
template_dict["rasterFunctionArguments"]["force_flow_direction_convention"] = force_flow_direction_convention
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_destination_data"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_cost_distance_raster"] = raster_ra2
function_chain_ra["rasterFunctionArguments"]["in_cost_backlink_raster"] = raster_ra3
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def euclidean_direction(in_source_data,
cell_size=None,
max_distance=None,
distance_method="PLANAR",
in_barrier_data=None):
"""
Calculates, for each cell, the direction, in degrees, to the nearest source.
Parameters
----------
:param in_source_data: The input source locations. This is a raster or feature dataset that
identifies the cells or locations to which the Euclidean distance for
every output cell location is calculated. For rasters, the input type
can be integer or floating point.
:param cell_size: The cell size at which the output raster will be created. This will be the
value in the environment if it is explicitly set. If it is not set in the
environment, the default cell size will depend on if the input source data
is a raster or a feature, as follows: If the source is raster, the output
will have that same cell size. If the source is feature, the output will
have a cell size determined by the shorter of the width or height of the
extent of input feature, in the input spatial reference, divided by 250.
:param max_distance: Defines the threshold that the accumulative distance values cannot
exceed. If an accumulative Euclidean distance value exceeds this
value, the output value for the cell location will be NoData. The default
distance is to the edge of the output raster.
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Planar - Planar measurements use 2D Cartesian mathematics to calculate
length and area. The option is only available when measuring in a
projected coordinate system and the 2D plane of that coordinate system
will be used as the basis for the measurements. This is the default.
Geodesic - The shortest line between two points on the earth's surface
on a spheroid (ellipsoid). Therefore, regardless of input or output
projection, the results do not change.
.. note::
One use for a geodesic line is when you want to determine the shortest
distance between two cities for an airplane's flight path. This is also
known as a great circle line if based on a sphere rather than an ellipsoid.
:param in_barrier_data: Optional barrier raster.
:return: output raster with function applied
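A minimal usage sketch follows; ``source_lyr`` and ``barrier_lyr`` are hypothetical imagery layers supplied by the caller.
.. code-block:: python
# Usage Example: compute the direction to the nearest source, out to 10000 map units.
euc_dir_output = euclidean_direction(in_source_data=source_lyr,
max_distance=10000,
distance_method="PLANAR",
in_barrier_data=barrier_lyr)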
"""
layer, in_source_data, raster_ra = _raster_input(in_source_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "EucDirection_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_direction_raster",
"in_source_data": in_source_data,
}
}
if in_barrier_data is not None:
layer2, in_barrier_data, raster_ra2 = _raster_input(in_barrier_data)
template_dict["rasterFunctionArguments"]["in_barrier_data"] = in_barrier_data
if cell_size is not None:
template_dict["rasterFunctionArguments"]["cell_size"] = cell_size
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
distance_method_list = ["PLANAR","GEODESIC"]
if distance_method is not None:
if distance_method.upper() not in distance_method_list:
raise RuntimeError('distance_method should be one of the following '+ str(distance_method_list))
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_source_data"] = raster_ra
if in_barrier_data is not None:
function_chain_ra["rasterFunctionArguments"]["in_barrier_data"] = raster_ra2
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
def cost_backlink(in_source_data,
in_cost_raster,
max_distance=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Defines, for each cell, the neighbor that is the next cell on the least accumulative
cost path to the nearest source over a cost surface.
Parameters
----------
:param in_source_data: The input raster that identifies the pixels or locations to which the
least accumulated cost distance for every output pixel location is
calculated. The Source Raster can be an integer or a floating-point value.
:param in_cost_raster: A raster defining the cost or impedance to move planimetrically through each pixel.
The value at each pixel location represents the cost-per-unit distance for moving
through it. Each pixel location value is multiplied by the pixel resolution, while
also compensating for diagonal movement to obtain the total cost of passing through
the pixel.
:param max_distance: The threshold that the accumulative cost values cannot exceed. If an accumulative cost
distance exceeds this value, the output value for the pixel location will be NoData.
The maximum distance defines the extent for which the accumulative cost distances are
calculated. The default distance is to the edge of the output raster.
:param source_cost_multiplier: Multiplier to apply to the cost values. Allows for control of the mode of
travel or the magnitude at a source. The greater the multiplier, the greater
the cost to move through each pixel. The values must be greater than zero.
The default is 1.
:param source_start_cost: The starting cost from which to begin the cost calculations. This parameter allows
for the specification of the fixed cost associated with a source. Instead of starting
at a cost of 0, the cost algorithm will begin with the value set here.
The default is 0. The value must be 0 or greater. A numeric (double) value or a field
from the Source Raster can be used for this parameter.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs as the
accumulative cost increases. It is used to model fatigue of the traveler. The growing
accumulative cost to reach a pixel is multiplied by the resistance rate and added to
the cost to move into the subsequent pixel.
It is a modified version of a compound interest rate formula that is used to calculate
the apparent cost of moving through a pixel. As the value of the resistance rate increases,
it increases the cost of the pixels that are visited later. The greater the resistance rate,
the higher the cost to reach the next pixel, which is compounded for each subsequent movement.
Since the resistance rate is similar to a compound rate and generally the accumulative cost
values are very large, small resistance rates are suggested, such as 0.005 or even smaller,
depending on the accumulative cost values.
The default is 0. The values must be 0 or greater. A numeric (double) value or a field from
the Source Raster can be used for this parameter.
:param source_capacity: Defines the cost capacity for the traveler for a source. The cost calculations continue for
each source until the specified capacity is reached.
The default capacity is to the edge of the output raster. The values must be greater than 0.
A double numeric value or a field from the Source Raster can be used for this parameter.
:param source_direction: Defines the direction of the traveler when applying the source resistance rate and the source
starting cost.
FROM_SOURCE - The source resistance rate and source starting cost will be applied beginning
at the input source and moving out to the nonsource cells. This is the default.
TO_SOURCE - The source resistance rate and source starting cost will be applied beginning at
each nonsource cell and moving back to the input source.
Either specify the From Source or To Source keyword, which will be applied to all sources,
or specify a field in the Source Raster that contains the keywords to identify the direction
of travel for each source. That field must contain the string From Source or To Source.
:return: output raster with function applied
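A minimal usage sketch follows; ``source_lyr`` and ``cost_lyr`` are hypothetical imagery layers supplied by the caller.
.. code-block:: python
# Usage Example: compute a backlink raster over a cost surface, traveling away from the sources.
cost_backlink_output = cost_backlink(in_source_data=source_lyr,
in_cost_raster=cost_lyr,
source_direction="FROM_SOURCE")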
"""
layer1, in_source_data, raster_ra1 = _raster_input(in_source_data)
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "CostBackLink_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_backlink_raster",
"in_source_data": in_source_data,
"in_cost_raster": in_cost_raster
}
}
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_source_data"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_cost_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def region_group(in_raster,
number_of_neighbor_cells="FOUR",
zone_connectivity="WITHIN",
add_link = "ADD_LINK",
excluded_value = 0):
"""
For each cell in the output, the identity of the connected region to which that cell
belongs is recorded. A unique number is assigned to each region.
Parameters
----------
:param in_raster: Required, the input raster whose unique connected regions will be identified.
It must be of integer type.
:param number_of_neighbor_cells: Optional. The number of neighboring cells to use in evaluating connectivity between cells.
Possible values - FOUR, EIGHT. Default is FOUR
:param zone_connectivity: Optional. Defines which cell values should be considered when testing for connectivity.
Possible values - WITHIN, CROSS. Default is WITHIN
:param add_link: Optional, Specifies whether a link field is added to the table of the output.
Possible values - ADD_LINK, NO_LINK. Default is ADD_LINK
:param excluded_value: Identifies a value such that if a cell location contains the value, no spatial
connectivity will be evaluated regardless of how the number of neighbors is specified (FOUR or EIGHT).
Cells with the excluded value will be treated as NoData and are eliminated from calculations.
Cell locations that contain the excluded value will receive 0 on the output raster.
The excluded value is similar to the concept of a background value,
or setting a mask in the environment for a single run of the tool.
A value must be specified for this parameter if the CROSS keyword is specified
:return: output raster with function applied
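A minimal usage sketch follows; ``zones_lyr`` is a hypothetical integer imagery layer supplied by the caller.
.. code-block:: python
# Usage Example: identify connected regions using eight-way connectivity and keep the link field.
region_group_output = region_group(in_raster=zones_lyr,
number_of_neighbor_cells="EIGHT",
zone_connectivity="WITHIN",
add_link="ADD_LINK")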
"""
layer, in_raster, raster_ra = _raster_input(in_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "RegionGroup_sa",
"PrimaryInputParameterName":"in_raster",
"OutputRasterParameterName":"out_raster",
"in_raster": in_raster,
}
}
if number_of_neighbor_cells is not None:
if number_of_neighbor_cells.upper() == "EIGHT" or number_of_neighbor_cells.upper() == "FOUR":
template_dict["rasterFunctionArguments"]["number_neighbors"] = number_of_neighbor_cells.upper()
else:
raise RuntimeError("number_of_neighbor_cells should either be 'EIGHT' or 'FOUR' ")
if zone_connectivity is not None:
if zone_connectivity.upper() == "WITHIN" or zone_connectivity.upper() == "CROSS":
template_dict["rasterFunctionArguments"]["zone_connectivity"] = zone_connectivity.upper()
else:
raise RuntimeError("zone_connectivity should either be 'WITHIN' or 'CROSS' ")
if add_link is not None:
if add_link.upper() == "ADD_LINK" or add_link.upper() == "NO_LINK":
template_dict["rasterFunctionArguments"]["add_link"] = add_link.upper()
else:
raise RuntimeError("add_link should either be 'ADD_LINK' or 'NO_LINK' ")
if excluded_value is not None:
template_dict["rasterFunctionArguments"]["excluded_value"] = excluded_value
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_raster"] = raster_ra
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
def corridor(in_distance_raster1,
in_distance_raster2):
"""
Calculates the sum of accumulative costs for two input accumulative cost rasters.
Parameters
----------
:param in_distance_raster1: The first input distance raster.
It should be an accumulated cost distance output from a distance function
such as cost_distance or path_distance.
:param in_distance_raster2: The second input distance raster.
It should be an accumulated cost distance output from a distance function
such as cost_distance or path_distance.
:return: output raster with function applied
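A minimal usage sketch follows; ``cost_dist_lyr1`` and ``cost_dist_lyr2`` are hypothetical accumulated cost distance layers supplied by the caller.
.. code-block:: python
# Usage Example: sum two accumulative cost surfaces to identify a corridor between their sources.
corridor_output = corridor(in_distance_raster1=cost_dist_lyr1,
in_distance_raster2=cost_dist_lyr2)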
"""
layer1, in_distance_raster1, raster_ra1 = _raster_input(in_distance_raster1)
layer2, in_distance_raster2, raster_ra2 = _raster_input(in_distance_raster2)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Corridor_sa",
"PrimaryInputParameterName":"in_distance_raster1",
"OutputRasterParameterName":"out_raster",
"in_distance_raster1": in_distance_raster1,
"in_distance_raster2": in_distance_raster2
}
}
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_distance_raster1"] = raster_ra1
function_chain_ra["rasterFunctionArguments"]["in_distance_raster2"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
def path_distance(in_source_data,
in_cost_raster=None,
in_surface_raster=None,
in_horizontal_raster=None,
in_vertical_raster=None,
horizontal_factor="BINARY",
vertical_factor="BINARY",
maximum_distance=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Calculates, for each cell, the least accumulative cost distance from or to the least-cost source,
while accounting for surface distance along with horizontal and vertical cost factors
Parameters
----------
:param in_source_data: The input source locations.
This is a raster that identifies the cells or locations from or to which the
least accumulated cost distance for every output cell location is calculated.
The raster input type can be integer or floating point.
:param in_cost_raster: A raster defining the impedance or cost to move planimetrically through each cell.
The value at each cell location represents the cost-per-unit distance for moving through the cell.
Each cell location value is multiplied by the cell resolution while also
compensating for diagonal movement to obtain the total cost of passing through the cell.
The values of the cost raster can be integer or floating point,
but they cannot be negative or zero (you cannot have a negative or zero cost).
:param in_surface_raster: A raster defining the elevation values at each cell location.
The values are used to calculate the actual surface distance covered when
passing between cells.
:param in_horizontal_raster: A raster defining the horizontal direction at each cell.
The values on the raster must be integers ranging from 0 to 360, with 0 degrees being north,
or toward the top of the screen, and increasing clockwise. Flat areas should be given a value of -1.
The values at each location will be used in conjunction with the {horizontal_factor} to determine
the horizontal cost incurred when moving from a cell to its neighbors.
:param in_vertical_raster: A raster defining the vertical (z) value for each cell. The values are used for calculating the slope
used to identify the vertical factor incurred when moving from one cell to another.
:param horizontal_factor: The Horizontal Factor defines the relationship between the horizontal cost
factor and the horizontal relative moving angle.
Possible values are: "BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"
:param vertical_factor: The Vertical Factor defines the relationship between the vertical cost factor and
the vertical relative moving angle (VRMA)
Possible values are: "BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"
:param maximum_distance: Defines the threshold that the accumulative cost values cannot exceed.
If an accumulative cost distance value exceeds this value, the output value for the cell
location will be NoData. The maximum distance defines the extent for which the accumulative cost distances are calculated.
The default distance is to the edge of the output raster.
:param source_cost_multiplier: Multiplier to apply to the cost values.
:param source_start_cost: The starting cost from which to begin the cost calculations.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs
as the accumulative cost increases. It is used to model fatigue of the traveler.
The growing accumulative cost to reach a cell is multiplied by the resistance rate
and added to the cost to move into the subsequent cell.
:param source_capacity: Defines the cost capacity for the traveler for a source.
The cost calculations continue for each source until the specified capacity is reached.
The values must be greater than zero. The default capacity is to the edge of the output raster.
:param source_direction: Defines the direction of the traveler when applying horizontal and vertical factors,
the source resistance rate, and the source starting cost.
Possible values: FROM_SOURCE, TO_SOURCE
:return: output raster with function applied
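A minimal usage sketch follows; ``source_lyr``, ``cost_lyr``, and ``elev_lyr`` are hypothetical imagery layers supplied by the caller.
.. code-block:: python
# Usage Example: accumulate path distance over a cost surface while accounting for surface distance.
path_dist_output = path_distance(in_source_data=source_lyr,
in_cost_raster=cost_lyr,
in_surface_raster=elev_lyr,
horizontal_factor="BINARY",
vertical_factor="BINARY")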
"""
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
if in_cost_raster is not None:
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
if in_surface_raster is not None:
layer3, in_surface_raster, raster_ra3 = _raster_input(in_surface_raster)
if in_horizontal_raster is not None:
layer4, in_horizontal_raster, raster_ra4 = _raster_input(in_horizontal_raster)
if in_vertical_raster is not None:
layer5, in_vertical_raster, raster_ra5 = _raster_input(in_vertical_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "PathDistance_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_distance_raster",
"in_source_data" : input_source_data
}
}
if in_cost_raster is not None:
template_dict["rasterFunctionArguments"]["in_cost_raster"] = in_cost_raster
if in_surface_raster is not None:
template_dict["rasterFunctionArguments"]["in_surface_raster"] = in_surface_raster
if in_horizontal_raster is not None:
template_dict["rasterFunctionArguments"]["in_horizontal_raster"] = in_horizontal_raster
if in_vertical_raster is not None:
template_dict["rasterFunctionArguments"]["in_vertical_raster"] = in_vertical_raster
horizontal_factor_list = ["BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"]
if horizontal_factor is not None:
if horizontal_factor.upper() not in horizontal_factor_list:
raise RuntimeError('horizontal_factor should be one of the following '+ str(horizontal_factor_list))
template_dict["rasterFunctionArguments"]["horizontal_factor"] = horizontal_factor
vertical_factor_list = ["BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"]
if vertical_factor is not None:
if vertical_factor.upper() not in vertical_factor_list:
raise RuntimeError('vertical_factor should be one of the following '+ str(vertical_factor_list))
template_dict["rasterFunctionArguments"]["vertical_factor"] = vertical_factor
if maximum_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = maximum_distance
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_cost_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_cost_raster"] = raster_ra2
if in_surface_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra3
if in_horizontal_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_horizontal_raster"] = raster_ra4
if in_vertical_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_vertical_raster"] = raster_ra5
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra)
def path_distance_allocation(in_source_data,
in_cost_raster=None,
in_surface_raster=None,
in_horizontal_raster=None,
in_vertical_raster=None,
horizontal_factor="BINARY",
vertical_factor="BINARY",
maximum_distance=None,
in_value_raster=None,
source_field = None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Calculates the least-cost source for each cell based on the least accumulative cost over a cost surface,
while accounting for surface distance along with horizontal and vertical cost factors.
Parameters
----------
:param in_source_data: The input source locations.
This is a raster or feature dataset that identifies the cells or locations from or to which
the least accumulated cost distance for every output cell location is calculated.
For rasters, the input type can be integer or floating point.
If the input source raster is floating point, the {in_value_raster} must be set, and it must be of integer type.
The value raster will take precedence over any setting of the {source_field}.
:param in_cost_raster: A raster defining the impedance or cost to move planimetrically through each cell.
:param in_surface_raster: A raster defining the elevation values at each cell location.
:param in_horizontal_raster: A raster defining the horizontal direction at each cell.
:param in_vertical_raster: A raster defining the vertical (z) value for each cell.
:param horizontal_factor: The Horizontal Factor defines the relationship between the horizontal cost
factor and the horizontal relative moving angle.
Possible values are: "BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"
:param vertical_factor: The Vertical Factor defines the relationship between the vertical cost factor and
the vertical relative moving angle (VRMA)
Possible values are: "BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"
:param maximum_distance: Defines the threshold that the accumulative cost values cannot exceed.
:param in_value_raster: The input integer raster that identifies the zone values that should be
used for each input source location.
For each source location (cell or feature), the value defined by the {in_value_raster} will be
assigned to all cells allocated to the source location for the computation.
The value raster will take precedence over any setting for the {source_field}.
:param source_field: The field used to assign values to the source locations. It must be of integer type.
If the {in_value_raster} has been set, the values in that input will have precedence over any setting for the {source_field}.
:param source_cost_multiplier: Multiplier to apply to the cost values.
Allows for control of the mode of travel or the magnitude at a source. The greater the multiplier,
the greater the cost to move through each cell.
The values must be greater than zero. The default is 1.
:param source_start_cost: The starting cost from which to begin the cost calculations.
Allows for the specification of the fixed cost associated with a source. Instead of starting at a cost of zero,
the cost algorithm will begin with the value set by source_start_cost.
The values must be zero or greater. The default is 0.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs as the accumulative cost increases.
It is used to model fatigue of the traveler. The growing accumulative cost to reach a cell is multiplied by
the resistance rate and added to the cost to move into the subsequent cell.
:param source_capacity: Defines the cost capacity for the traveler for a source.
The cost calculations continue for each source until the specified capacity is reached.
The values must be greater than zero. The default capacity is to the edge of the output raster.
:param source_direction: Defines the direction of the traveler when applying horizontal and vertical factors,
the source resistance rate, and the source starting cost.
Possible values: FROM_SOURCE, TO_SOURCE
:return: output raster with function applied
"""
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
if in_cost_raster is not None:
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
if in_surface_raster is not None:
layer3, in_surface_raster, raster_ra3 = _raster_input(in_surface_raster)
if in_horizontal_raster is not None:
layer4, in_horizontal_raster, raster_ra4 = _raster_input(in_horizontal_raster)
if in_vertical_raster is not None:
layer5, in_vertical_raster, raster_ra5 = _raster_input(in_vertical_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "PathAllocation_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_allocation_raster",
"in_source_data" : input_source_data
}
}
if in_cost_raster is not None:
template_dict["rasterFunctionArguments"]["in_cost_raster"] = in_cost_raster
if in_surface_raster is not None:
template_dict["rasterFunctionArguments"]["in_surface_raster"] = in_surface_raster
if in_horizontal_raster is not None:
template_dict["rasterFunctionArguments"]["in_horizontal_raster"] = in_horizontal_raster
if in_vertical_raster is not None:
template_dict["rasterFunctionArguments"]["in_vertical_raster"] = in_vertical_raster
horizontal_factor_list = ["BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"]
if horizontal_factor is not None:
if horizontal_factor.upper() not in horizontal_factor_list:
raise RuntimeError('horizontal_factor should be one of the following '+ str(horizontal_factor_list))
template_dict["rasterFunctionArguments"]["horizontal_factor"] = horizontal_factor
vertical_factor_list = ["BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"]
if vertical_factor is not None:
if vertical_factor.upper() not in vertical_factor_list:
raise RuntimeError('vertical_factor should be one of the following '+ str(vertical_factor_list))
template_dict["rasterFunctionArguments"]["vertical_factor"] = vertical_factor
if maximum_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = maximum_distance
if in_value_raster is not None:
layer6, in_value_raster, raster_ra6 = _raster_input(in_value_raster)
template_dict["rasterFunctionArguments"]["in_value_raster"] = in_value_raster
if source_field is not None:
template_dict["rasterFunctionArguments"]["source_field"] = source_field
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_cost_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_cost_raster"] = raster_ra2
if in_surface_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra3
if in_horizontal_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_horizontal_raster"] = raster_ra4
if in_vertical_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_vertical_raster"] = raster_ra5
if in_value_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_value_raster"] = raster_ra6
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra)
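# Illustrative usage sketch (not a definitive workflow): the portal URL, credentials, item ids
# and layer variables below are hypothetical, and an authenticated GIS session is assumed.
#
#   from arcgis.gis import GIS
#   gis = GIS("https://example.org/portal", "user", "password")        # hypothetical portal
#   source_lyr = gis.content.get("<source item id>").layers[0]         # input source locations
#   cost_lyr = gis.content.get("<cost raster item id>").layers[0]      # cost surface
#   alloc_lyr = path_distance_allocation(source_lyr,
#                                        in_cost_raster=cost_lyr,
#                                        horizontal_factor="BINARY",
#                                        vertical_factor="BINARY",
#                                        source_direction="FROM_SOURCE")
#   alloc_item = alloc_lyr.save("path_distance_allocation_output")     # runs the analysis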
def path_distance_back_link(in_source_data,
in_cost_raster=None,
in_surface_raster=None,
in_horizontal_raster=None,
in_vertical_raster=None,
horizontal_factor="BINARY",
vertical_factor="BINARY",
maximum_distance=None,
source_cost_multiplier=None,
source_start_cost=None,
source_resistance_rate=None,
source_capacity=None,
source_direction=None):
"""
Defines the neighbor that is the next cell on the least accumulative cost path to the least-cost source,
while accounting for surface distance along with horizontal and vertical cost factors.
Parameters
----------
:param in_source_data: The input source locations.
This is a raster that identifies the cells or locations from
or to which the least accumulated cost distance for every output cell location is calculated.
For rasters, the input type can be integer or floating point.
:param in_cost_raster: A raster defining the impedance or cost to move planimetrically through each cell.
The value at each cell location represents the cost-per-unit distance for moving through the cell.
Each cell location value is multiplied by the cell resolution while also compensating for diagonal
movement to obtain the total cost of passing through the cell.
The values of the cost raster can be integer or floating point, but they cannot be negative or
zero (you cannot have a negative or zero cost).
:param in_surface_raster: A raster defining the elevation values at each cell location. The values are used to calculate the actual
surface distance covered when passing between cells.
:param in_horizontal_raster: A raster defining the horizontal direction at each cell.
The values on the raster must be integers ranging from 0 to 360, with 0 degrees being north, or toward
the top of the screen, and increasing clockwise. Flat areas should be given a value of -1.
The values at each location will be used in conjunction with the {horizontal_factor} to determine the
horizontal cost incurred when moving from a cell to its neighbors.
:param in_vertical_raster: A raster defining the vertical (z) value for each cell. The values are used for calculating the slope
used to identify the vertical factor incurred when moving from one cell to another.
:param horizontal_factor: The Horizontal Factor defines the relationship between the horizontal cost
factor and the horizontal relative moving angle.
Possible values are: "BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"
:param vertical_factor: The Vertical Factor defines the relationship between the vertical cost factor and
the vertical relative moving angle (VRMA)
Possible values are: "BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"
:param maximum_distance: Defines the threshold that the accumulative cost values cannot exceed. If an accumulative cost distance
value exceeds this value, the output value for the cell location will be NoData. The maximum distance
defines the extent for which the accumulative cost distances are calculated.
The default distance is to the edge of the output raster.
:param source_cost_multiplier: Multiplier to apply to the cost values. Allows for control of the mode of travel or the magnitude at a source.
The greater the multiplier, the greater the cost to move through each cell. The values must be greater than zero.
The default is 1.
:param source_start_cost: The starting cost from which to begin the cost calculations. Allows for the specification of the fixed cost associated with a source.
Instead of starting at a cost of zero, the cost algorithm will begin with the value set by source_start_cost.
The values must be zero or greater. The default is 0.
:param source_resistance_rate: This parameter simulates the increase in the effort to overcome costs as the accumulative cost increases.
It is used to model fatigue of the traveler. The growing accumulative cost to reach a cell is multiplied
by the resistance rate and added to the cost to move into the subsequent cell.
:param source_capacity: Defines the cost capacity for the traveler for a source.
The cost calculations continue for each source until the specified capacity is reached.
The values must be greater than zero. The default capacity is to the edge of the output raster.
:param source_direction: Defines the direction of the traveler when applying horizontal and vertical factors,
the source resistance rate, and the source starting cost.
Possible values: FROM_SOURCE, TO_SOURCE
:return: output raster with function applied
"""
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
if in_cost_raster is not None:
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
if in_surface_raster is not None:
layer3, in_surface_raster, raster_ra3 = _raster_input(in_surface_raster)
if in_horizontal_raster is not None:
layer4, in_horizontal_raster, raster_ra4 = _raster_input(in_horizontal_raster)
if in_vertical_raster is not None:
layer5, in_vertical_raster, raster_ra5 = _raster_input(in_vertical_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "PathBackLink_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_backlink_raster",
"in_source_data" : input_source_data
}
}
if in_cost_raster is not None:
template_dict["rasterFunctionArguments"]["in_cost_raster"] = in_cost_raster
if in_surface_raster is not None:
template_dict["rasterFunctionArguments"]["in_surface_raster"] = in_surface_raster
if in_horizontal_raster is not None:
template_dict["rasterFunctionArguments"]["in_horizontal_raster"] = in_horizontal_raster
if in_vertical_raster is not None:
template_dict["rasterFunctionArguments"]["in_vertical_raster"] = in_vertical_raster
horizontal_factor_list = ["BINARY", "LINEAR", "FORWARD", "INVERSE_LINEAR"]
if horizontal_factor is not None:
if horizontal_factor.upper() not in horizontal_factor_list:
raise RuntimeError('horizontal_factor should be one of the following '+ str(horizontal_factor_list))
template_dict["rasterFunctionArguments"]["horizontal_factor"] = horizontal_factor
vertical_factor_list = ["BINARY", "LINEAR", "SYMMETRIC_LINEAR", "INVERSE_LINEAR",
"SYMMETRIC_INVERSE_LINEAR", "COS", "SEC", "COS_SEC", "SEC_COS"]
if vertical_factor is not None:
if vertical_factor.upper() not in vertical_factor_list:
raise RuntimeError('vertical_factor should be one of the following '+ str(vertical_factor_list))
template_dict["rasterFunctionArguments"]["vertical_factor"] = vertical_factor
if maximum_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = maximum_distance
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_start_cost is not None:
template_dict["rasterFunctionArguments"]["source_start_cost"] = source_start_cost
if source_resistance_rate is not None:
template_dict["rasterFunctionArguments"]["source_resistance_rate"] = source_resistance_rate
if source_capacity is not None:
template_dict["rasterFunctionArguments"]["source_capacity"] = source_capacity
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_cost_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_cost_raster"] = raster_ra2
if in_surface_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra3
if in_horizontal_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_horizontal_raster"] = raster_ra4
if in_vertical_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_vertical_raster"] = raster_ra5
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra)
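# path_distance_back_link() follows the same calling pattern; a minimal hypothetical sketch,
# reusing the source_lyr and cost_lyr variables from the sketch above:
#
#   backlink_lyr = path_distance_back_link(source_lyr,
#                                          in_cost_raster=cost_lyr,
#                                          maximum_distance=5000)
#   backlink_item = backlink_lyr.save("path_distance_back_link_output")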
def calculate_distance(in_source_data,
maximum_distance=None,
output_cell_size=None,
allocation_field=None,
generate_out_allocation_raster=False,
generate_out_direction_raster=False,
generate_out_back_direction_raster=False,
in_barrier_data=None,
distance_method='PLANAR'):
"""
Calculates the Euclidean distance, direction, and allocation from a single source or set of sources.
Parameters
----------
:param in_source_data: The layer that defines the sources to calculate the distance to.
The layer can be raster or feature. To use a raster input, it must
be of integer type.
:param maximum_distance: Defines the threshold that the accumulative distance values
cannot exceed. If an accumulative Euclidean distance value exceeds
this value, the output value for the cell location will be NoData.
The default distance is to the edge of the output raster.
Supported units: Meters | Kilometers | Feet | Miles
Example:
{"distance":"60","units":"Meters"}
:param output_cell_size: Specify the cell size to use for the output raster.
Supported units: Meters | Kilometers | Feet | Miles
Example:
{"distance":"60","units":"Meters"}
:param allocation_field: A field on the input_source_data layer that holds the values that
defines each source.
It can be an integer or a string field of the source dataset.
The default for this parameter is 'Value'.
:param generate_out_direction_raster: Boolean, determines whether out_direction_raster should be generated or not.
Set this parameter to True, in order to generate the out_direction_raster.
If set to true, the output will be a named tuple with name values being
output_distance_service and output_direction_service.
eg,
out_layer = calculate_distance(in_source_data,
generate_out_direction_raster=True)
out_var = out_layer.save()
then,
out_var.output_distance_service -> gives you the output distance imagery layer item
out_var.output_direction_service -> gives you the output direction raster imagery layer item
The output direction raster is in degrees, and indicates the
direction to return to the closest source from each cell center.
The values on the direction raster are based on compass directions,
with 0 degrees reserved for the source cells. Thus, a value of 90
means 90 degrees to the East, 180 is to the South, 270 is to the West,
and 360 is to the North.
:param generate_out_allocation_raster: Boolean, determines whether out_allocation_raster should be generated or not.
Set this parameter to True, in order to generate the out_allocation_raster.
If set to true, the output will be a named tuple with name values being
output_distance_service and output_allocation_service.
eg,
out_layer = calculate_distance(in_source_data,
generate_out_allocation_raster=True)
out_var = out_layer.save()
then,
out_var.output_distance_service -> gives you the output distance imagery layer item
out_var.output_allocation_service -> gives you the output allocation raster imagery layer item
This parameter calculates, for each cell, the nearest source based
on Euclidean distance.
:param generate_out_back_direction_raster: Boolean, determines whether out_back_direction_raster should be generated or not.
Set this parameter to True, in order to generate the out_back_direction_raster.
:param in_barrier_data: Optional barrier raster.
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Possible values: PLANAR, GEODESIC. The default is PLANAR.
:return: output raster with function applied
"""
if isinstance (in_source_data, ImageryLayer):
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
else:
raster_ra1 = _layer_input(in_source_data)
input_source_data = raster_ra1
layer1=raster_ra1
if in_barrier_data is not None:
layer2, in_barrier_data, raster_ra2 = _raster_input(in_barrier_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "CalculateDistance_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_distance_raster",
"in_source_data" : input_source_data
}
}
if maximum_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = maximum_distance
if output_cell_size is not None:
template_dict["rasterFunctionArguments"]["output_cell_size"] = output_cell_size
if allocation_field is not None:
template_dict["rasterFunctionArguments"]["allocation_field"] = allocation_field
if distance_method is not None:
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_barrier_data is not None:
function_chain_ra['rasterFunctionArguments']["in_barrier_data"] = raster_ra2
if isinstance(in_source_data, ImageryLayer):
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra, out_allocation_raster = generate_out_allocation_raster, out_direction_raster = generate_out_direction_raster, out_back_direction_raster=generate_out_back_direction_raster, use_ra=True)
else:
return _feature_gbl_clone_layer(in_source_data, template_dict, function_chain_ra, out_allocation_raster = generate_out_allocation_raster, out_direction_raster = generate_out_direction_raster, out_back_direction_raster=generate_out_back_direction_raster, use_ra=True)
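# Illustrative sketch of calculate_distance() with the optional outputs enabled, following the
# named-tuple pattern described in the docstring above; source_lyr is a hypothetical layer.
#
#   dist_lyr = calculate_distance(source_lyr,
#                                 maximum_distance={"distance": "60", "units": "Meters"},
#                                 generate_out_direction_raster=True,
#                                 generate_out_allocation_raster=True)
#   result = dist_lyr.save()
#   distance_item = result.output_distance_service
#   direction_item = result.output_direction_service
#   allocation_item = result.output_allocation_service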
def euclidean_back_direction(in_source_data,
cell_size=None,
max_distance=None,
distance_method="PLANAR",
in_barrier_data=None):
"""
Calculates, for each cell, the direction, in degrees, to the neighboring cell along
the shortest path back to the closest source while avoiding barriers.
The direction is calculated from each cell center to the center of the source cell
that's nearest to it.
The range of values is from 0 degrees to 360 degrees, with 0 reserved for the source cells.
Due east (right) is 90 and the values increase clockwise (180 is south, 270 is west,
and 360 is north).
For more information, see
https://pro.arcgis.com/en/pro-app/help/data/imagery/euclidean-back-direction-function.htm
Parameters
----------
:param in_source_data: raster; The input raster that identifies the pixels or locations to
which the Euclidean direction for every output cell location is calculated.
The input type can be an integer or a floating-point value.
:param cell_size: The pixel size at which the output raster will be created. If the cell
size was explicitly set in Environments, that will be the default cell size.
If Environments was not set, the output cell size will be the same as the
Source Raster
:param max_distance: The threshold that the accumulative distance values cannot exceed. If an
accumulative Euclidean distance exceeds this value, the output value for
the pixel location will be NoData. The default distance is to the edge
of the output raster
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Planar - Planar measurements use 2D Cartesian mathematics to calculate
length and area. The option is only available when measuring in a
projected coordinate system and the 2D plane of that coordinate system
will be used as the basis for the measurements. This is the default.
Geodesic - The shortest line between two points on the earth's surface
on a spheroid (ellipsoid). Therefore, regardless of input or output
projection, the results do not change.
.. note::
One use for a geodesic line is when you want to determine the shortest
distance between two cities for an airplane's flight path. This is also
known as a great circle line if based on a sphere rather than an ellipsoid.
:param in_barrier_data: Optional barrier raster.
:return: output raster with function applied
"""
layer, in_source_data, raster_ra = _raster_input(in_source_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "EucBackDirection_sa",
"PrimaryInputParameterName":"in_source_data",
"OutputRasterParameterName":"out_back_direction_raster",
"in_source_data": in_source_data,
}
}
if in_barrier_data is not None:
layer2, in_barrier_data, raster_ra2 = _raster_input(in_barrier_data)
template_dict["rasterFunctionArguments"]["in_barrier_data"] = in_barrier_data
if cell_size is not None:
template_dict["rasterFunctionArguments"]["cell_size"] = cell_size
if max_distance is not None:
template_dict["rasterFunctionArguments"]["maximum_distance"] = max_distance
distance_method_list = ["PLANAR","GEODESIC"]
if distance_method is not None:
if distance_method.upper() not in distance_method_list:
raise RuntimeError('distance_method should be one of the following '+ str(distance_method_list))
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_source_data"] = raster_ra
if in_barrier_data is not None:
function_chain_ra["rasterFunctionArguments"]["in_barrier_data"] = raster_ra2
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
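# Illustrative sketch of euclidean_back_direction() with a barrier raster and geodesic
# distances; source_lyr and barrier_lyr are hypothetical ImageryLayer objects.
#
#   back_dir_lyr = euclidean_back_direction(source_lyr,
#                                           in_barrier_data=barrier_lyr,
#                                           distance_method="GEODESIC")
#   back_dir_item = back_dir_lyr.save("euclidean_back_direction_output")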
def flow_length(input_flow_direction_raster,
direction_measurement="DOWNSTREAM",
input_weight_raster=None):
"""
Creates a raster layer of upstream or downstream distance, or weighted distance,
along the flow path for each cell.
A primary use of the Flow Length function is to calculate the length of the longest
flow path within a given basin. This measure is often used to calculate the time of
concentration of a basin. This would be done using the Upstream option.
The function can also be used to create distance-area diagrams of hypothetical
rainfall and runoff events using the weight raster as an impedance to movement downslope.
For more information, see
https://pro.arcgis.com/en/pro-app/help/data/imagery/flow-length-function.htm
Parameters
----------
:param input_flow_direction_raster: The input raster that shows the direction of flow out of each cell.
The flow direction raster can be created by running the Flow Direction function.
:param direction_measurement: String. The direction of measurement along the flow path.
DOWNSTREAM - Calculates the downslope distance along the flow path,
from each cell to a sink or outlet on the edge of the raster. This is the default.
UPSTREAM - Calculates the longest upslope distance along the flow path,
from each cell to the top of the drainage divide.
:param input_weight_raster: An optional input raster for applying a weight to each cell.
If no weight raster is specified, a default weight of 1 will
be applied to each cell.
:return: output raster with function applied
"""
layer1, input_flow_direction_raster, raster_ra1 = _raster_input(input_flow_direction_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "FlowLength_sa",
"PrimaryInputParameterName" : "in_flow_direction_raster",
"OutputRasterParameterName" : "out_raster",
"in_flow_direction_raster" : input_flow_direction_raster
}
}
if input_weight_raster is not None:
layer2, input_weight_raster, raster_ra2 = _raster_input(input_weight_raster)
template_dict["rasterFunctionArguments"]["in_weight_raster"] = input_weight_raster
direction_measurement_list = ["DOWNSTREAM","UPSTREAM"]
if direction_measurement is not None:
if direction_measurement.upper() not in direction_measurement_list:
raise RuntimeError('direction_measurement should be one of the following '+ str(direction_measurement_list))
template_dict["rasterFunctionArguments"]["direction_measurement"] = direction_measurement
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra1
if input_weight_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_weight_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
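# Illustrative sketch of flow_length(); flow_dir_lyr is a hypothetical flow direction layer,
# assumed to come from a prior Flow Direction run as the docstring suggests.
#
#   length_lyr = flow_length(flow_dir_lyr, direction_measurement="UPSTREAM")
#   length_item = length_lyr.save("flow_length_output")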
def sink(input_flow_direction_raster):
"""
Creates a raster layer identifying all sinks or areas of internal drainage.
The value type for the Sink function output raster layer is floating point.
For more information, see
https://pro.arcgis.com/en/pro-app/help/data/imagery/sink-function.htm
Parameters
----------
:param input_flow_direction_raster: The input raster that shows the direction
of flow out of each cell.
The flow direction raster can be created by
running the Flow Direction function.
:return: output raster with function applied
"""
layer, input_flow_direction_raster, raster_ra = _raster_input(input_flow_direction_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Sink_sa",
"PrimaryInputParameterName" : "in_flow_direction_raster",
"OutputRasterParameterName" : "out_raster",
"in_flow_direction_raster" : input_flow_direction_raster
}
}
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
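# Minimal hypothetical sketch of sink(), reusing the flow_dir_lyr variable from above:
#
#   sink_lyr = sink(flow_dir_lyr)
#   sink_item = sink_lyr.save("sink_output")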
def snap_pour_point(in_pour_point_data,
in_accumulation_raster=None,
snap_distance=0,
pour_point_field=None):
"""
Snaps pour points to the cell of highest flow accumulation within a specified distance.
For more information, see
https://pro.arcgis.com/en/pro-app/help/data/imagery/snap-pour-point-function.htm
Parameters
----------
:param in_pour_point_data: The input pour point locations that are to be snapped.
For an input raster layer, all cells that are not
NoData (that is, have a value) will be considered
pour points and will be snapped.
:param in_accumulation_raster: optional raster; The input flow accumulation raster layer.
:param snap_distance: Maximum distance, in map units, to search for a cell of higher
accumulated flow. Default is 0
:param pour_point_field: Field used to assign values to the pour point locations.
:return: output raster with function applied
"""
layer, in_pour_point_data, raster_ra = _raster_input(in_pour_point_data)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "SnapPourPoint_sa",
"PrimaryInputParameterName" : "in_pour_point_data",
"OutputRasterParameterName" : "out_raster",
"in_pour_point_data" : in_pour_point_data
}
}
if in_accumulation_raster is not None:
layer2, in_accumulation_raster, raster_ra2 = _raster_input(in_accumulation_raster)
template_dict["rasterFunctionArguments"]["in_accumulation_raster"] = in_accumulation_raster
if snap_distance is not None:
template_dict["rasterFunctionArguments"]["snap_distance"] = snap_distance
if pour_point_field is not None:
template_dict["rasterFunctionArguments"]["pour_point_field"] = pour_point_field
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_pour_point_data"] = raster_ra
if in_accumulation_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_accumulation_raster"] = raster_ra2
return _gbl_clone_layer(layer, template_dict, function_chain_ra)
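# Illustrative sketch of snap_pour_point() with a flow accumulation layer and a 100 map-unit
# search distance; pour_point_lyr and flow_acc_lyr are hypothetical layers.
#
#   snapped_lyr = snap_pour_point(pour_point_lyr,
#                                 in_accumulation_raster=flow_acc_lyr,
#                                 snap_distance=100)
#   snapped_item = snapped_lyr.save("snap_pour_point_output")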
def stream_order(input_stream_raster,
input_flow_direction_raster=None,
order_method="STRAHLER"):
"""
Creates a raster layer that assigns a numeric order to segments
of a raster representing branches of a linear network.
For more information, see
https://pro.arcgis.com/en/pro-app/help/data/imagery/stream-order-function.htm
Parameters
----------
:param input_stream_raster: An input stream raster that represents a linear stream network.
:param input_flow_direction_raster: The input raster that shows the direction of flow out of each cell.
The flow direction raster can be created by running the Flow
Direction function.
:param order_method: The method used for assigning stream order.
STRAHLER - The method of stream ordering proposed by Strahler in 1952.
Stream order only increases when streams of the same order intersect.
Therefore, the intersection of a first-order and second-order link will
remain a second-order link, rather than creating a third-order link.
This is the default.
SHREVE - The method of stream ordering by magnitude, proposed by Shreve
in 1967. All links with no tributaries are assigned a magnitude (order)
of one. Magnitudes are additive downslope. When two links intersect,
their magnitudes are added and assigned to the downslope link.
:return: output raster with function applied
"""
layer1, input_stream_raster, raster_ra1 = _raster_input(input_stream_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "StreamOrder_sa",
"PrimaryInputParameterName" : "in_stream_raster",
"OutputRasterParameterName" : "out_raster",
"in_stream_raster" : input_stream_raster
}
}
if input_flow_direction_raster is not None:
layer2, input_flow_direction_raster, raster_ra2 = _raster_input(input_flow_direction_raster)
template_dict["rasterFunctionArguments"]["in_flow_direction_raster"] = input_flow_direction_raster
order_method_list = ["STRAHLER","SHREVE"]
if order_method is not None:
if order_method.upper() not in order_method_list:
raise RuntimeError('order_method should be one of the following '+ str(order_method_list))
template_dict["rasterFunctionArguments"]["order_method"] = order_method
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_stream_raster"] = raster_ra1
if input_flow_direction_raster is not None:
function_chain_ra["rasterFunctionArguments"]["in_flow_direction_raster"] = raster_ra2
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
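# Illustrative sketch of stream_order() using the Shreve ordering method; stream_lyr and
# flow_dir_lyr are hypothetical layers.
#
#   order_lyr = stream_order(stream_lyr,
#                            input_flow_direction_raster=flow_dir_lyr,
#                            order_method="SHREVE")
#   order_item = order_lyr.save("stream_order_output")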
def expand(input_raster,
number_of_cells,
zone_values):
"""
Expands specified zones of a raster by a specified number of cells.
https://pro.arcgis.com/en/pro-app/help/data/imagery/expand-function.htm
Parameters
----------
:param input_raster: The input raster for which the identified zones are to
be expanded.
It must be of integer type.
:param number_of_cells: The number of cells to expand by.
The value must be integer, and can be 1 or greater.
:param zone_values: The list of zones to expand. The zone values
must be integer, and they can be in any order.
zone_values can be specified as a list or as a string.
If specified as a string and if it is required to specify multiple zones,
use a ; to separate the zone values.
:return: output raster with function applied
"""
layer1, input_raster, raster_ra1 = _raster_input(input_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Expand_sa",
"PrimaryInputParameterName" : "in_raster",
"OutputRasterParameterName" : "out_raster",
"in_raster" : input_raster
}
}
if number_of_cells is not None:
template_dict["rasterFunctionArguments"]["number_cells"] = number_of_cells
zone_values_str = zone_values
if isinstance(zone_values, list):
zone_values_str = ";".join(str(zone) for zone in zone_values)
if zone_values_str is not None:
template_dict["rasterFunctionArguments"]["zone_values"] = zone_values_str
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_raster"] = raster_ra1
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
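# Illustrative sketch of expand(); zone_values may be given as a list or as a
# semicolon-separated string, as described above. landcover_lyr is a hypothetical layer.
#
#   expanded_lyr = expand(landcover_lyr, number_of_cells=2, zone_values=[5, 7])
#   # equivalent string form: expand(landcover_lyr, 2, "5;7")
#   expanded_item = expanded_lyr.save("expand_output")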
def shrink(input_raster,
number_of_cells,
zone_values):
"""
Shrinks the selected zones by a specified number of cells by replacing them with
the value of the cell that is most frequent in its neighborhood.
https://pro.arcgis.com/en/pro-app/help/data/imagery/shrink-function.htm
Parameters
----------
:param input_raster: The input raster for which the identified zones are to be shrunk.
It must be of integer type.
:param number_of_cells: The number of cells by which to shrink each specified zone.
The value must be integer, and can be 1 or greater.
:param zone_values: The list of zones to shrink. The zone values must be integer, and they can be in any order.
zone_values can be specified as a list or as a string.
If specified as a string and if it is required to specify multiple zones,
use a ; to separate the zone values.
:return: output raster with function applied
"""
layer1, input_raster, raster_ra1 = _raster_input(input_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "Shrink_sa",
"PrimaryInputParameterName" : "in_raster",
"OutputRasterParameterName" : "out_raster",
"in_raster" : input_raster
}
}
if number_of_cells is not None:
template_dict["rasterFunctionArguments"]["number_cells"] = number_of_cells
zone_values_str = zone_values
if isinstance(zone_values, list):
zone_values_str = ";".join(str(zone) for zone in zone_values)
if zone_values_str is not None:
template_dict["rasterFunctionArguments"]["zone_values"] = zone_values_str
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra["rasterFunctionArguments"]["in_raster"] = raster_ra1
return _gbl_clone_layer(layer1, template_dict, function_chain_ra)
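# shrink() mirrors the expand() calling pattern; a minimal hypothetical sketch:
#
#   shrunk_lyr = shrink(landcover_lyr, number_of_cells=1, zone_values="5;7")
#   shrunk_item = shrunk_lyr.save("shrink_output")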
def distance_accumulation(in_source_data,
in_barrier_data=None,
in_surface_raster=None,
in_cost_raster=None,
in_vertical_raster=None,
vertical_factor="BINARY 1 -30 30",
in_horizontal_raster=None,
horizontal_factor="BINARY 1 45",
generate_back_direction_band=False,
source_initial_accumulation=None,
source_maximum_accumulation=None,
source_cost_multiplier=None,
source_direction=None,
distance_method="PLANAR"):
"""
Calculates the least accumulative cost distance for each cell from or to the
least-cost source over a cost surface, while preserving the Euclidean distance metric.
Parameters
----------
:param in_source_data: The input source locations.
This is a raster that identifies the cells or locations from
or to which the least accumulated cost distance for every output cell location is calculated.
For rasters, the input type can be integer or floating point.
:param in_barrier_data: Optional barrier raster.
:param in_surface_raster: A raster defining the elevation values at each cell location. The values are used to calculate the actual
surface distance covered when passing between cells.
:param in_cost_raster: A raster defining the impedance or cost to move planimetrically through each cell.
The value at each cell location represents the cost-per-unit distance for moving through the cell.
Each cell location value is multiplied by the cell resolution while also compensating for diagonal
movement to obtain the total cost of passing through the cell.
The values of the cost raster can be integer or floating point, but they cannot be negative or
zero (you cannot have a negative or zero cost).
:param in_horizontal_raster: A raster defining the horizontal direction at each cell.
The values on the raster must be integers ranging from 0 to 360, with 0 degrees being north, or toward
the top of the screen, and increasing clockwise. Flat areas should be given a value of -1.
The values at each location will be used in conjunction with the {horizontal_factor} to determine the
horizontal cost incurred when moving from a cell to its neighbors.
:param in_vertical_raster: A raster defining the vertical (z) value for each cell. The values are used for calculating the slope
used to identify the vertical factor incurred when moving from one cell to another.
:param horizontal_factor: The Horizontal Factor defines the relationship between the horizontal cost
factor and the horizontal relative moving angle.
:param vertical_factor: The Vertical Factor defines the relationship between the vertical cost factor and
the vertical relative moving angle (VRMA)
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Planar - Planar measurements use 2D Cartesian mathematics to calculate
length and area. The option is only available when measuring in a
projected coordinate system and the 2D plane of that coordinate system
will be used as the basis for the measurements. This is the default.
Geodesic - The shortest line between two points on the earth's surface
on a spheroid (ellipsoid). Therefore, regardless of input or output
projection, the results do not change.
.. note::
One use for a geodesic line is when you want to determine the shortest
distance between two cities for an airplane's flight path. This is also
known as a great circle line if based on a sphere rather than an ellipsoid.
:param generate_back_direction_band: Optional bool, Default is False. If set to True, function generates back direction as additional band
in the output raster
:param source_initial_accumulation: The starting cost from which to begin the cost calculations.
Allows for the specification of the fixed
cost associated with a source. Instead of starting at a cost of zero, the cost algorithm will begin with
the value set by source_initial_accumulation.
The values must be zero or greater. The default is 0.
:param source_maximum_accumulation: The cost capacity for the traveler for a source.
The cost calculations continue for each source until the specified capacity is reached.
The values must be greater than zero. The default capacity is to the edge of the output raster.
:param source_cost_multiplier: Multiplier to apply to the cost values. Allows for control of the mode of travel or the magnitude at a source.
The greater the multiplier, the greater the cost to move through each cell. The values must be greater than zero.
The default is 1.
:param source_direction: Defines the direction of the traveler when applying horizontal and vertical factors,
the source resistance rate, and the source starting cost.
Possible values: FROM_SOURCE, TO_SOURCE
:return: output raster with function applied
"""
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
if in_cost_raster is not None:
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
if in_barrier_data is not None:
layer3, in_barrier_data, raster_ra3 = _raster_input(in_barrier_data)
if in_surface_raster is not None:
layer4, in_surface_raster, raster_ra4 = _raster_input(in_surface_raster)
if in_horizontal_raster is not None:
layer5, in_horizontal_raster, raster_ra5 = _raster_input(in_horizontal_raster)
if in_vertical_raster is not None:
layer6, in_vertical_raster, raster_ra6 = _raster_input(in_vertical_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "DistanceAccumulation_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_distance_accumulation_raster",
"in_source_data" : input_source_data
}
}
if in_cost_raster is not None:
template_dict["rasterFunctionArguments"]["in_cost_raster"] = in_cost_raster
if in_barrier_data is not None:
template_dict["rasterFunctionArguments"]["in_barrier_data"] = in_barrier_data
if in_surface_raster is not None:
template_dict["rasterFunctionArguments"]["in_surface_raster"] = in_surface_raster
if in_horizontal_raster is not None:
template_dict["rasterFunctionArguments"]["in_horizontal_raster"] = in_horizontal_raster
if in_vertical_raster is not None:
template_dict["rasterFunctionArguments"]["in_vertical_raster"] = in_vertical_raster
if horizontal_factor is not None:
template_dict["rasterFunctionArguments"]["horizontal_factor"] = horizontal_factor
if vertical_factor is not None:
template_dict["rasterFunctionArguments"]["vertical_factor"] = vertical_factor
if generate_back_direction_band is not None:
if isinstance(generate_back_direction_band, bool):
template_dict["rasterFunctionArguments"]["in_back_direction_band"] = generate_back_direction_band
else:
raise RuntimeError("generate_back_direction_band should be of type bool")
if distance_method is not None:
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
if source_initial_accumulation is not None:
template_dict["rasterFunctionArguments"]["source_initial_accumulation"] = source_initial_accumulation
if source_maximum_accumulation is not None:
template_dict["rasterFunctionArguments"]["source_maximum_accumulation"] = source_maximum_accumulation
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_cost_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_cost_raster"] = raster_ra2
if in_barrier_data is not None:
function_chain_ra['rasterFunctionArguments']["in_barrier_data"] = raster_ra3
if in_surface_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra4
if in_horizontal_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_horizontal_raster"] = raster_ra5
if in_vertical_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_vertical_raster"] = raster_ra6
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra)
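# Illustrative sketch of distance_accumulation() with a barrier raster and the optional back
# direction band enabled; all input layer variables are hypothetical.
#
#   accum_lyr = distance_accumulation(source_lyr,
#                                     in_barrier_data=barrier_lyr,
#                                     in_cost_raster=cost_lyr,
#                                     generate_back_direction_band=True,
#                                     distance_method="PLANAR")
#   accum_item = accum_lyr.save("distance_accumulation_output")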
def distance_allocation(in_source_data,
in_barrier_data=None,
in_surface_raster=None,
in_cost_raster=None,
in_vertical_raster=None,
vertical_factor="BINARY 1 -30 30",
in_horizontal_raster=None,
horizontal_factor="BINARY 1 45",
generate_source_row_column_bands=False,
source_field=None,
source_initial_accumulation=None,
source_maximum_accumulation=None,
source_cost_multiplier=None,
source_direction=None,
distance_method="PLANAR"):
"""
Calculates, for each cell, its least-cost source based on the least accumulative cost over a cost surface,
avoiding network distance distortion.
Parameters
----------
:param in_source_data: The input source locations.
This is a raster that identifies the cells or locations from
or to which the least accumulated cost distance for every output cell location is calculated.
For rasters, the input type can be integer or floating point.
:param in_barrier_data: Optional barrier raster.
:param in_surface_raster: A raster defining the elevation values at each cell location. The values are used to calculate the actual
surface distance covered when passing between cells.
:param in_cost_raster: A raster defining the impedance or cost to move planimetrically through each cell.
The value at each cell location represents the cost-per-unit distance for moving through the cell.
Each cell location value is multiplied by the cell resolution while also compensating for diagonal
movement to obtain the total cost of passing through the cell.
The values of the cost raster can be integer or floating point, but they cannot be negative or
zero (you cannot have a negative or zero cost).
:param in_vertical_raster: A raster defining the vertical (z) value for each cell. The values are used for calculating the slope
used to identify the vertical factor incurred when moving from one cell to another.
:param vertical_factor: The Vertical Factor defines the relationship between the vertical cost factor and
the vertical relative moving angle (VRMA)
:param in_horizontal_raster: A raster defining the horizontal direction at each cell.
The values on the raster must be integers ranging from 0 to 360, with 0 degrees being north, or toward
the top of the screen, and increasing clockwise. Flat areas should be given a value of -1.
The values at each location will be used in conjunction with the {horizontal_factor} to determine the
horizontal cost incurred when moving from a cell to its neighbors.
:param horizontal_factor: The Horizontal Factor defines the relationship between the horizontal cost
factor and the horizontal relative moving angle.
:param source_field: The field used to assign values to the source locations. It must be an
integer type. If the Value Raster has been set, the values in that input
will take precedence over any setting for the source field.
:param generate_source_row_column_bands: Optional bool, Default is False. If set to True, function generates source row and column as additional bands
in the output raster
:param source_initial_accumulation: The starting cost from which to begin the cost calculations.
Allows for the specification of the fixed
cost associated with a source. Instead of starting at a cost of zero, the cost algorithm will begin with
the value set by source_initial_accumulation.
The values must be zero or greater. The default is 0.
:param source_maximum_accumulation: The cost capacity for the traveler for a source.
The cost calculations continue for each source until the specified capacity is reached.
The values must be greater than zero. The default capacity is to the edge of the output raster.
:param source_cost_multiplier: Multiplier to apply to the cost values. Allows for control of the mode of travel or the magnitude at a source.
The greater the multiplier, the greater the cost to move through each cell. The values must be greater than zero.
The default is 1.
:param source_direction: Defines the direction of the traveler when applying horizontal and vertical factors,
the source resistance rate, and the source starting cost.
Possible values: FROM_SOURCE, TO_SOURCE
:param distance_method: Optional String; Determines whether to calculate the distance using a planar (flat earth)
or a geodesic (ellipsoid) method.
Planar - Planar measurements use 2D Cartesian mathematics to calculate
length and area. The option is only available when measuring in a
projected coordinate system and the 2D plane of that coordinate system
will be used as the basis for the measurements. This is the default.
Geodesic - The shortest line between two points on the earth's surface
on a spheroid (ellipsoid). Therefore, regardless of input or output
projection, the results do not change.
.. note::
One use for a geodesic line is when you want to determine the shortest
distance between two cities for an airplane's flight path. This is also
known as a great circle line if based on a sphere rather than an ellipsoid.
:return: output raster with function applied
"""
layer1, input_source_data, raster_ra1 = _raster_input(in_source_data)
if in_cost_raster is not None:
layer2, in_cost_raster, raster_ra2 = _raster_input(in_cost_raster)
if in_barrier_data is not None:
layer3, in_barrier_data, raster_ra3 = _raster_input(in_barrier_data)
if in_surface_raster is not None:
layer4, in_surface_raster, raster_ra4 = _raster_input(in_surface_raster)
if in_horizontal_raster is not None:
layer5, in_horizontal_raster, raster_ra5 = _raster_input(in_horizontal_raster)
if in_vertical_raster is not None:
layer6, in_vertical_raster, raster_ra6 = _raster_input(in_vertical_raster)
template_dict = {
"rasterFunction" : "GPAdapter",
"rasterFunctionArguments" : {
"toolName" : "DistanceAllocation_sa",
"PrimaryInputParameterName" : "in_source_data",
"OutputRasterParameterName":"out_distance_allocation_raster",
"in_source_data" : input_source_data
}
}
if in_cost_raster is not None:
template_dict["rasterFunctionArguments"]["in_cost_raster"] = in_cost_raster
if in_barrier_data is not None:
template_dict["rasterFunctionArguments"]["in_barrier_data"] = in_barrier_data
if in_surface_raster is not None:
template_dict["rasterFunctionArguments"]["in_surface_raster"] = in_surface_raster
if in_horizontal_raster is not None:
template_dict["rasterFunctionArguments"]["in_horizontal_raster"] = in_horizontal_raster
if in_vertical_raster is not None:
template_dict["rasterFunctionArguments"]["in_vertical_raster"] = in_vertical_raster
if horizontal_factor is not None:
template_dict["rasterFunctionArguments"]["horizontal_factor"] = horizontal_factor
if vertical_factor is not None:
template_dict["rasterFunctionArguments"]["vertical_factor"] = vertical_factor
if source_field is not None:
template_dict["rasterFunctionArguments"]["source_field"] = source_field
if generate_source_row_column_bands is not None:
if isinstance(generate_source_row_column_bands, bool):
template_dict["rasterFunctionArguments"]["in_source_location_bands"] = generate_source_row_column_bands
else:
raise RuntimeError("generate_source_row_column_bands should be of type bool")
if distance_method is not None:
template_dict["rasterFunctionArguments"]["distance_method"] = distance_method
if source_initial_accumulation is not None:
template_dict["rasterFunctionArguments"]["source_initial_accumulation"] = source_initial_accumulation
if source_maximum_accumulation is not None:
template_dict["rasterFunctionArguments"]["source_maximum_accumulation"] = source_maximum_accumulation
if source_cost_multiplier is not None:
template_dict["rasterFunctionArguments"]["source_cost_multiplier"] = source_cost_multiplier
if source_direction is not None:
source_direction_list = ["FROM_SOURCE","TO_SOURCE"]
if source_direction.upper() not in source_direction_list:
raise RuntimeError('source_direction should be one of the following '+ str(source_direction_list) )
template_dict["rasterFunctionArguments"]["source_direction"] = source_direction
function_chain_ra = copy.deepcopy(template_dict)
function_chain_ra['rasterFunctionArguments']["in_source_data"] = raster_ra1
if in_cost_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_cost_raster"] = raster_ra2
if in_barrier_data is not None:
function_chain_ra['rasterFunctionArguments']["in_barrier_data"] = raster_ra3
if in_surface_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_surface_raster"] = raster_ra4
if in_horizontal_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_horizontal_raster"] = raster_ra5
if in_vertical_raster is not None:
function_chain_ra['rasterFunctionArguments']["in_vertical_raster"] = raster_ra6
return _gbl_clone_layer(in_source_data, template_dict, function_chain_ra)
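# Illustrative sketch of distance_allocation(), requesting the additional source row/column
# bands in the output; the input layer variables are hypothetical.
#
#   alloc_lyr = distance_allocation(source_lyr,
#                                   in_cost_raster=cost_lyr,
#                                   generate_source_row_column_bands=True,
#                                   source_direction="FROM_SOURCE")
#   alloc_item = alloc_lyr.save("distance_allocation_output")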
| 54.379381 | 274 | 0.650549 | 22,187 | 186,195 | 5.253437 | 0.042187 | 0.0279 | 0.017914 | 0.015898 | 0.819806 | 0.791306 | 0.77449 | 0.75625 | 0.742823 | 0.732811 | 0 | 0.003815 | 0.294756 | 186,195 | 3,423 | 275 | 54.395267 | 0.88382 | 0.539064 | 0 | 0.695378 | 1 | 0 | 0.246444 | 0.123606 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02591 | false | 0 | 0.007003 | 0.0007 | 0.060924 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
54ceb8a975159a9e5102c814272b2d1e93ee3244 | 12,098 | py | Python | tests/event_from_string_test.py | untitaker/khal | b2e89f3451e84520e99ba098816e67e51d7bb508 | [
"Unlicense",
"MIT"
] | 2 | 2015-08-01T15:18:01.000Z | 2015-08-31T13:41:57.000Z | tests/event_from_string_test.py | untitaker/khal | b2e89f3451e84520e99ba098816e67e51d7bb508 | [
"Unlicense",
"MIT"
] | null | null | null | tests/event_from_string_test.py | untitaker/khal | b2e89f3451e84520e99ba098816e67e51d7bb508 | [
"Unlicense",
"MIT"
] | null | null | null | # vim: set fileencoding=utf-8:
from datetime import date, datetime, timedelta
import random
import pytz
from khal.aux import construct_event
def _now():
return datetime(2014, 2, 16, 12, 0, 0, 0)
today = date.today()
tomorrow = today + timedelta(days=1)
today_s = '{0:02}{1:02}{2:02}'.format(*today.timetuple()[0:3])
tomorrow_s = '{0:02}{1:02}{2:02}'.format(*tomorrow.timetuple()[0:3])
this_year_s = str(today.year)
test_set_format_de = [
# all-day-events
# one day only
('25.10.2013 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:20131025',
'DTEND;VALUE=DATE:20131026',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# 2 day
('15.08.2014 16.08. Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:20140815',
'DTEND;VALUE=DATE:20140817', # XXX
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# end date in next year and not specified
('29.12.2014 03.01. Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:20141229',
'DTEND;VALUE=DATE:20150104',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# end date in next year
('29.12.2014 03.01.2015 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:20141229',
'DTEND;VALUE=DATE:20150104',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# datetime events
# start and end date same, no explicit end date given
('25.10.2013 18:00 20:00 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:20131025T180000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:20131025T200000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# start and end date same, explicit end date (but no year) given
#('25.10.2013 18:00 26.10. 20:00 Äwesöme Event', # XXX FIXME: if no explicit year is given for the end, this_year is used
#'\r\n'.join(['BEGIN:VEVENT',
#'SUMMARY:Äwesöme Event',
#'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:20131025T180000',
#'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:20131026T200000',
#'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
#'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
#'END:VEVENT',
#''])),
# date ends next day, but end date not given
('25.10.2013 23:00 0:30 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:20131025T230000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:20131026T003000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# only start datetime given
('25.10.2013 06:00 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:20131025T060000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:20131025T070000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
# timezone given
('25.10.2013 06:00 America/New_York Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=America/New_York;VALUE=DATE-TIME:20131025T060000',
'DTEND;TZID=America/New_York;VALUE=DATE-TIME:20131025T070000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
]
def test_construct_event_format_de():
timeformat = '%H:%M'
dateformat = '%d.%m.'
longdateformat = '%d.%m.%Y'
datetimeformat = '%d.%m. %H:%M'
longdatetimeformat = '%d.%m.%Y %H:%M'
DEFAULTTZ = pytz.timezone('Europe/Berlin')
for data_list, vevent in test_set_format_de:
random.seed(1)
event = construct_event(data_list.split(),
timeformat=timeformat,
dateformat=dateformat,
longdateformat=longdateformat,
datetimeformat=datetimeformat,
longdatetimeformat=longdatetimeformat,
defaulttz=DEFAULTTZ,
_now=_now).to_ical()
assert event == vevent
test_set_format_us = [
('12/31/1999 06:00 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=America/New_York;VALUE=DATE-TIME:19991231T060000',
'DTEND;TZID=America/New_York;VALUE=DATE-TIME:19991231T070000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
''])),
('12/18 12/20 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:{0}1218',
'DTEND;VALUE=DATE:{0}1221',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
'']).format(this_year_s)),
]
def test_construct_event_format_us():
timeformat = '%H:%M'
dateformat = '%m/%d'
longdateformat = '%m/%d/%Y'
datetimeformat = '%m/%d %H:%M'
longdatetimeformat = '%m/%d/%Y %H:%M'
DEFAULTTZ = pytz.timezone('America/New_York')
for data_list, vevent in test_set_format_us:
random.seed(1)
event = construct_event(data_list.split(),
timeformat=timeformat,
dateformat=dateformat,
longdateformat=longdateformat,
datetimeformat=datetimeformat,
longdatetimeformat=longdatetimeformat,
defaulttz=DEFAULTTZ,
_now=_now).to_ical()
assert event == vevent
test_set_format_de_complexer = [
# now events where the start date has to be inferred, too
# today
('8:00 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:{0}T080000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:{0}T090000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
'']).format(today_s)),
# today until tomorrow
('22:00 1:00 Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:{0}T220000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:{1}T010000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
'']).format(today_s, tomorrow_s)),
('15.06. Äwesöme Event',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:{0}0615',
'DTEND;VALUE=DATE:{0}0616',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'END:VEVENT',
'']).format(this_year_s)),
]
def test_construct_event_format_de_complexer():
timeformat = '%H:%M'
dateformat = '%d.%m.'
longdateformat = '%d.%m.%Y'
datetimeformat = '%d.%m. %H:%M'
longdatetimeformat = '%d.%m.%Y %H:%M'
DEFAULTTZ = pytz.timezone('Europe/Berlin')
for data_list, vevent in test_set_format_de_complexer:
random.seed(1)
event = construct_event(data_list.split(),
timeformat=timeformat,
dateformat=dateformat,
longdateformat=longdateformat,
datetimeformat=datetimeformat,
longdatetimeformat=longdatetimeformat,
defaulttz=DEFAULTTZ,
_now=_now).to_ical()
assert event == vevent
test_set_description = [
# now events where the start date has to be inferred, too
# today
('8:00 Äwesöme Event :: this is going to be awesome',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:{0}T080000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:{0}T090000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'DESCRIPTION:this is going to be awesome',
'END:VEVENT',
'']).format(today_s)),
# today until tomorrow
('22:00 1:00 Äwesöme Event :: Will be even better',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;TZID=Europe/Berlin;VALUE=DATE-TIME:{0}T220000',
'DTEND;TZID=Europe/Berlin;VALUE=DATE-TIME:{1}T010000',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'DESCRIPTION:Will be even better',
'END:VEVENT',
'']).format(today_s, tomorrow_s)),
('15.06. Äwesöme Event :: and again',
'\r\n'.join(['BEGIN:VEVENT',
'SUMMARY:Äwesöme Event',
'DTSTART;VALUE=DATE:{0}0615',
'DTEND;VALUE=DATE:{0}0616',
'DTSTAMP;VALUE=DATE-TIME:20140216T120000Z',
'UID:E41JRQX2DB4P1AQZI86BAT7NHPBHPRIIHQKA',
'DESCRIPTION:and again',
'END:VEVENT',
'']).format(this_year_s)),
]
def test_description():
timeformat = '%H:%M'
dateformat = '%d.%m.'
longdateformat = '%d.%m.%Y'
datetimeformat = '%d.%m. %H:%M'
longdatetimeformat = '%d.%m.%Y %H:%M'
DEFAULTTZ = pytz.timezone('Europe/Berlin')
for data_list, vevent in test_set_description:
random.seed(1)
event = construct_event(data_list.split(),
timeformat=timeformat,
dateformat=dateformat,
longdateformat=longdateformat,
datetimeformat=datetimeformat,
longdatetimeformat=longdatetimeformat,
defaulttz=DEFAULTTZ,
_now=_now).to_ical()
assert event == vevent
| 42.006944 | 127 | 0.530418 | 1,161 | 12,098 | 5.452196 | 0.136951 | 0.072512 | 0.075987 | 0.029542 | 0.852291 | 0.839336 | 0.808847 | 0.800158 | 0.770616 | 0.704265 | 0 | 0.113668 | 0.342619 | 12,098 | 287 | 128 | 42.15331 | 0.682258 | 0.074971 | 0 | 0.718487 | 0 | 0 | 0.382168 | 0.236201 | 0 | 0 | 0 | 0.003484 | 0.016807 | 1 | 0.021008 | false | 0 | 0.016807 | 0.004202 | 0.042017 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
070a14fb6ccc8c7d5a24bf2f802add0a28fa0167 | 65,246 | py | Python | test/apitest.py | liushilongbuaa/sonic-restapi | 93484dc58c683aefad510f6fd8acb624f07d18da | [
"Apache-2.0"
] | null | null | null | test/apitest.py | liushilongbuaa/sonic-restapi | 93484dc58c683aefad510f6fd8acb624f07d18da | [
"Apache-2.0"
] | null | null | null | test/apitest.py | liushilongbuaa/sonic-restapi | 93484dc58c683aefad510f6fd8acb624f07d18da | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
import datetime
import json
import requests
import time
import unittest
import logging
import redis
TEST_HOST = 'http://localhost:8090/'
logging.basicConfig(filename='test.log', filemode='w', level=logging.INFO)
l = logging.getLogger('rest_api_test')
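# The assertions below use unittest's Python 2 era assertItemsEqual; Python 3
# renamed it to assertCountEqual, so alias it here (a small compatibility shim)
# to keep the existing assertions working under the python3 shebang.
if not hasattr(unittest.TestCase, 'assertItemsEqual'):
    unittest.TestCase.assertItemsEqual = unittest.TestCase.assertCountEqual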
# DB Names
VXLAN_TUNNEL_TB = "VXLAN_TUNNEL"
VNET_TB = "VNET"
VLAN_TB = "VLAN"
VLAN_INTF_TB = "VLAN_INTERFACE"
VLAN_MEMB_TB = "VLAN_MEMBER"
VLAN_NEIGH_TB = "NEIGH"
ROUTE_TUN_TB = "_VNET_ROUTE_TUNNEL_TABLE"
LOCAL_ROUTE_TB = "_VNET_ROUTE_TABLE"
CFG_ROUTE_TUN_TB = "VNET_ROUTE_TUNNEL"
CFG_LOCAL_ROUTE_TB = "VNET_ROUTE"
# DB Helper constants
VNET_NAME_PREF = "Vnet"
VLAN_NAME_PREF = "Vlan"
RESRC_EXISTS = 0
DEP_MISSING = 1
DELETE_DEP = 2
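# Base test client: wraps the REST endpoints of the service under test in small
# helper methods and checks side effects directly in redis (APP DB / CONFIG DB);
# the positive and negative suites below subclass it.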
class rest_api_client(unittest.TestCase):
maxDiff = None
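    # Generic HTTP verb helpers; every request and response is logged to
    # test.log so failing assertions can be traced back to the raw API call.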
def post(self, url, body = []):
if body == None:
data = None
else:
data = json.dumps(body)
l.info("Request POST: %s" % url)
l.info("JSON Body: %s" % data)
r = requests.post(TEST_HOST + url, data=data, headers={'Content-Type': 'application/json'})
l.info('Response Code: %s' % r.status_code)
l.info('Response Body: %s' % r.text)
return r
def patch(self, url, body = []):
if body == None:
data = None
else:
data = json.dumps(body)
l.info("Request PATCH: %s" % url)
l.info("JSON Body: %s" % data)
r = requests.patch(TEST_HOST + url, data=data, headers={'Content-Type': 'application/json'})
l.info('Response Code: %s' % r.status_code)
l.info('Response Body: %s' % r.text)
return r
def get(self, url, body = [], params = {}):
if body == None:
data = None
else:
data = json.dumps(body)
l.info("Request GET: %s" % url)
l.info("JSON Body: %s" % data)
r = requests.get(TEST_HOST + url, data=data, params=params, headers={'Content-Type': 'application/json'})
l.info('Response Code: %s' % r.status_code)
l.info('Response Body: %s' % r.text)
return r
def delete(self, url, body = [], params = {}):
if body == None:
data = None
else:
data = json.dumps(body)
l.info("Request DELETE: %s" % url)
l.info("JSON Body: %s" % data)
r = requests.delete(TEST_HOST + url, data=data, params=params, headers={'Content-Type': 'application/json'})
l.info('Response Code: %s' % r.status_code)
l.info('Response Body: %s' % r.text)
return r
def get_config_reset_status(self):
return self.get('v1/config/resetstatus')
def post_config_reset_status(self, value):
return self.post('v1/config/resetstatus', value)
# VRF/VNET
def post_config_vrouter_vrf_id(self, vrf_id, value):
return self.post('v1/config/vrouter/{vrf_id}'.format(vrf_id=vrf_id), value)
def get_config_vrouter_vrf_id(self, vrf_id):
return self.get('v1/config/vrouter/{vrf_id}'.format(vrf_id=vrf_id))
def delete_config_vrouter_vrf_id(self, vrf_id):
return self.delete('v1/config/vrouter/{vrf_id}'.format(vrf_id=vrf_id))
# Encap
def post_config_tunnel_encap_vxlan_vnid(self, vnid, value):
return self.post('v1/config/tunnel/encap/vxlan/{vnid}'.format(vnid=vnid), value)
def delete_config_tunnel_encap_vxlan_vnid(self, vnid):
return self.delete('v1/config/tunnel/encap/vxlan/{vnid}'.format(vnid=vnid))
def get_config_tunnel_encap_vxlan_vnid(self, vnid):
return self.get('v1/config/tunnel/encap/vxlan/{vnid}'.format(vnid=vnid))
# Decap
def post_config_tunnel_decap_tunnel_type(self, tunnel_type, value):
return self.post('v1/config/tunnel/decap/{tunnel_type}'.format(tunnel_type=tunnel_type), value)
def get_config_tunnel_decap_tunnel_type(self, tunnel_type):
return self.get('v1/config/tunnel/decap/{tunnel_type}'.format(tunnel_type=tunnel_type))
def delete_config_tunnel_decap_tunnel_type(self, tunnel_type):
return self.delete('v1/config/tunnel/decap/{tunnel_type}'.format(tunnel_type=tunnel_type))
# Vlan
def post_config_vlan(self, vlan_id, value):
return self.post('v1/config/interface/vlan/{vlan_id}'.format(vlan_id=vlan_id), value)
def get_config_vlan(self, vlan_id):
return self.get('v1/config/interface/vlan/{vlan_id}'.format(vlan_id=vlan_id))
def delete_config_vlan(self, vlan_id):
return self.delete('v1/config/interface/vlan/{vlan_id}'.format(vlan_id=vlan_id))
def get_config_interface_vlans(self, vnet_id=None):
params = {}
if vnet_id != None:
params['vnet_id'] = vnet_id
return self.get('v1/config/interface/vlans',params=params)
def get_config_vlans_all(self):
return self.get('v1/config/interface/vlans/all')
# Vlan Member
def post_config_vlan_member(self, vlan_id, if_name, value):
return self.post('v1/config/interface/vlan/{vlan_id}/member/{if_name}'.format(vlan_id=vlan_id, if_name=if_name), value)
def get_config_vlan_member(self, vlan_id, if_name):
return self.get('v1/config/interface/vlan/{vlan_id}/member/{if_name}'.format(vlan_id=vlan_id, if_name=if_name))
def delete_config_vlan_member(self, vlan_id, if_name):
return self.delete('v1/config/interface/vlan/{vlan_id}/member/{if_name}'.format(vlan_id=vlan_id, if_name=if_name))
def get_config_interface_vlan_members(self, vlan_id):
return self.get('v1/config/interface/vlan/{vlan_id}/members'.format(vlan_id=vlan_id))
# Vlan Neighbor
def post_config_vlan_neighbor(self, vlan_id, ip_addr):
return self.post('v1/config/interface/vlan/{vlan_id}/neighbor/{ip_addr}'.format(vlan_id=vlan_id, ip_addr=ip_addr))
def get_config_vlan_neighbor(self, vlan_id, ip_addr):
return self.get('v1/config/interface/vlan/{vlan_id}/neighbor/{ip_addr}'.format(vlan_id=vlan_id, ip_addr=ip_addr))
def delete_config_vlan_neighbor(self, vlan_id, ip_addr):
return self.delete('v1/config/interface/vlan/{vlan_id}/neighbor/{ip_addr}'.format(vlan_id=vlan_id, ip_addr=ip_addr))
def get_config_interface_vlan_neighbors(self, vlan_id):
return self.get('v1/config/interface/vlan/{vlan_id}/neighbors'.format(vlan_id=vlan_id))
# Routes
def patch_config_vrouter_vrf_id_routes(self, vrf_id, value):
return self.patch('v1/config/vrouter/{vrf_id}/routes'.format(vrf_id=vrf_id), value)
def patch_config_vrf_vrf_id_routes(self, vrf_id, value):
return self.patch('v1/config/vrf/{vrf_id}/routes'.format(vrf_id=vrf_id), value)
def delete_config_vrouter_vrf_id_routes(self, vrf_id, vnid=None, value=None):
params = {}
if vnid != None:
params['vnid'] = vnid
return self.delete('v1/config/vrouter/{vrf_id}/routes'.format(vrf_id=vrf_id), value, params=params)
def get_config_vrouter_vrf_id_routes(self, vrf_id, vnid=None, ip_prefix=None):
params = {}
if vnid != None:
params['vnid'] = vnid
if ip_prefix != None:
params['ip_prefix'] = ip_prefix
return self.get('v1/config/vrouter/{vrf_id}/routes'.format(vrf_id=vrf_id), params=params)
def get_config_vrf_vrf_id_routes(self, vrf_id, ip_prefix=None):
params = {}
if ip_prefix != None:
params['ip_prefix'] = ip_prefix
return self.get('v1/config/vrf/{vrf_id}/routes'.format(vrf_id=vrf_id), params=params)
# In memory DB restart
def post_config_restart_in_mem_db(self):
return self.post('v1/config/restartdb')
# Operations
# Ping
def post_ping(self, value):
return self.post('v1/operations/ping', value)
# Helper functions
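    # The post_generic_* helpers seed the minimal configuration (decap tunnel,
    # vnet, vlan) that most of the test cases below build on.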
def post_generic_vxlan_tunnel(self):
rv = self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '34.53.1.0'
})
self.assertEqual(rv.status_code, 204)
def post_generic_vrouter_and_deps(self):
self.post_generic_vxlan_tunnel()
rv = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(rv.status_code, 204)
def post_generic_vrouter_and_deps_duplicate(self):
self.post_generic_vxlan_tunnel()
rv = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(rv.status_code, 204)
rv = self.post_config_vrouter_vrf_id("vnet-guid-10", {
'vnid': 1001
})
self.assertEqual(rv.status_code, 409)
rv = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(rv.status_code, 409)
rv = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 2001
})
self.assertEqual(rv.status_code, 409)
def post_generic_vlan_and_deps(self):
self.post_generic_vrouter_and_deps()
rv = self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(rv.status_code, 204)
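    # Route assertion helpers: they read the route tunnel / local route tables
    # straight from the application DB and compare them against the expected
    # entries.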
def check_routes_exist_in_tun_tb(self, vnet_num_mapped, routes_arr):
for route in routes_arr:
route_table = self.db.hgetall(ROUTE_TUN_TB + ':' + VNET_NAME_PREF +str(vnet_num_mapped)+':'+route['ip_prefix'])
self.assertEqual(route_table, {
b'endpoint' : route['nexthop'],
b'mac_address' : route['mac_address'],
b'vni' : str(route['vnid'])
})
def check_routes_dont_exist_in_tun_tb(self, vnet_num_mapped, routes_arr):
for route in routes_arr:
route_table = self.db.hgetall(ROUTE_TUN_TB + ':' + VNET_NAME_PREF +str(vnet_num_mapped)+':'+route['ip_prefix'])
self.assertEqual(route_table, {})
def check_routes_exist_in_loc_route_tb(self, vnet_num_mapped, routes_arr):
for route in routes_arr:
route_table = self.db.hgetall(LOCAL_ROUTE_TB + ':' + VNET_NAME_PREF +str(vnet_num_mapped)+':'+route['ip_prefix'])
self.assertEqual(route_table, {
b'nexthop' : route['nexthop'],
b'ifname' : route['ifname']
})
def check_routes_dont_exist_in_loc_route_tb(self, vnet_num_mapped, routes_arr):
for route in routes_arr:
route_table = self.db.hgetall(LOCAL_ROUTE_TB + ':' + VNET_NAME_PREF +str(vnet_num_mapped)+':'+route['ip_prefix'])
self.assertEqual(route_table, {})
# Test setup
def setUp(self):
l.info('============================================================')
l.info("Running: {0}".format(self._testMethodName))
l.info('------------------------------------------------------------')
# Clear DBs - reach known state
self.db = redis.StrictRedis('localhost', 6379, 0)
self.db.flushdb()
self.cache = redis.StrictRedis('localhost', 6379, 7)
self.cache.flushdb()
self.configdb = redis.StrictRedis('localhost', 6379, 4)
self.configdb.flushdb()
# Sanity check
keys = self.db.keys()
self.assertEqual(keys, [])
keys = self.cache.keys()
self.assertEqual(keys, [])
keys = self.configdb.keys()
self.assertEqual(keys, [])
self.post_config_restart_in_mem_db()
@classmethod
def setUpClass(cls):
l.info('============================================================')
l.info("Starting: {0} - {1}".format(cls.__name__, cls.__doc__))
l.info('------------------------------------------------------------')
class ra_client_positive_tests(rest_api_client):
"""Normal behaviour tests"""
# Helper func
def check_vrouter_exists(self, vnet_id, vnid):
r = self.get_config_vrouter_vrf_id(vnet_id)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vnet_id': vnet_id,
'attr': {
'vnid': vnid
}
})
def helper_get_config_tunnel_decap_tunnel_type(self):
self.post_generic_vxlan_tunnel()
r = self.get_config_tunnel_decap_tunnel_type('vxlan')
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'tunnel_type': 'vxlan',
'attr': {
'ip_addr': '34.53.1.0'
}
})
# Config reset status
def test_config_status_reset_get(self):
r = self.get_config_reset_status()
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'reset_status': 'true'
})
def test_config_status_reset_post(self):
r = self.post_config_reset_status({'reset_status': 'false'})
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'reset_status': 'false'
})
r = self.post_config_reset_status({'reset_status': 'boolean'})
self.assertEqual(r.status_code, 400)
# Decap
def test_post_config_tunnel_decap_tunnel_type(self):
r = self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '34.53.1.0'
})
self.assertEqual(r.status_code, 204)
        # Once the decap tunnel is configured, a second POST is rejected with 409
        # and leaves the existing entry untouched
r = self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '74.32.6.0'
})
self.assertEqual(r.status_code, 409)
tunnel_table = self.configdb.hgetall(VXLAN_TUNNEL_TB + '|default_vxlan_tunnel')
self.assertEqual(tunnel_table, {b'src_ip': b'34.53.1.0'})
l.info("Tunnel table is %s", tunnel_table)
def test_delete_config_tunnel_decap_tunnel_type(self):
self.post_generic_vxlan_tunnel()
r = self.delete_config_tunnel_decap_tunnel_type('vxlan')
self.assertEqual(r.status_code, 204)
# The delete is a no-op and should return 204, moreover the tunnel should not be deleted
tunnel_table = self.configdb.hgetall(VXLAN_TUNNEL_TB + '|default_vxlan_tunnel')
self.assertEqual(tunnel_table, {b'src_ip': b'34.53.1.0'})
# Encap
def test_post_encap(self):
r = self.post_config_tunnel_encap_vxlan_vnid(101, None)
self.assertEqual(r.status_code, 204)
keys = self.configdb.keys()
self.assertEqual(keys, [])
def test_get_encap(self):
r = self.get_config_tunnel_encap_vxlan_vnid(101)
self.assertEqual(r.status_code, 204)
def test_delete_encap(self):
r = self.delete_config_tunnel_encap_vxlan_vnid(101)
self.assertEqual(r.status_code, 204)
# Vrouter
def test_post_vrouter(self):
self.post_generic_vxlan_tunnel()
r = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(r.status_code, 204)
vrouter_table = self.configdb.hgetall(VNET_TB + '|' + VNET_NAME_PREF + '1')
self.assertEqual(vrouter_table, {
b'vxlan_tunnel': b'default_vxlan_tunnel',
b'vni': b'1001',
b'guid': b'vnet-guid-1'
})
def test_get_vrouter(self):
self.post_generic_vrouter_and_deps()
self.check_vrouter_exists("vnet-guid-1",1001)
def test_duplicate_vni(self):
self.post_generic_vrouter_and_deps_duplicate()
self.check_vrouter_exists("vnet-guid-1",1001)
def test_delete_vrouter(self):
self.post_generic_vrouter_and_deps()
r = self.delete_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 204)
vrouter_table = self.configdb.hgetall(VNET_TB + '|' + VNET_NAME_PREF + '1')
self.assertEqual(vrouter_table, {})
def test_guid_persistence(self):
self.post_generic_vrouter_and_deps()
r = self.post_config_vrouter_vrf_id("vnet-guid-2", { 'vnid': 1002 })
self.assertEqual(r.status_code, 204)
r = self.post_config_vrouter_vrf_id("vnet-guid-3", { 'vnid': 1003 })
self.assertEqual(r.status_code, 204)
self.post_config_restart_in_mem_db()
self.check_vrouter_exists("vnet-guid-1",1001)
self.check_vrouter_exists("vnet-guid-2",1002)
self.check_vrouter_exists("vnet-guid-3",1003)
def test_vnet_name_mapping_logic(self):
self.post_generic_vxlan_tunnel()
for i in range (1,4):
r = self.post_config_vrouter_vrf_id("vnet-guid-"+str(i), {'vnid': 1000+i})
self.assertEqual(r.status_code, 204)
self.check_vrouter_exists("vnet-guid-"+str(i), 1000+i)
vrouter_table = self.configdb.hgetall(VNET_TB + '|' + VNET_NAME_PREF +str(i))
self.assertEqual(vrouter_table, {
b'vxlan_tunnel': b'default_vxlan_tunnel',
b'vni': b'100'+str(i),
b'guid': b'vnet-guid-'+str(i)
})
for i in range (1,4):
r = self.delete_config_vrouter_vrf_id("vnet-guid-"+str(i))
self.assertEqual(r.status_code, 204)
r = self.post_config_vrouter_vrf_id("vnet-guid-"+str(i+3), {'vnid': 1003+i})
self.assertEqual(r.status_code, 204)
self.check_vrouter_exists("vnet-guid-"+str(i+3), 1003+i)
vrouter_table = self.configdb.hgetall(VNET_TB + '|' + VNET_NAME_PREF +str(i))
self.assertEqual(vrouter_table, {
b'vxlan_tunnel': b'default_vxlan_tunnel',
b'vni': b'100'+str(i+3),
b'guid': b'vnet-guid-'+str(i+3)
})
r = self.post_config_vrouter_vrf_id("vnet-guid-"+str(i+6), {'vnid': 1006+i})
self.assertEqual(r.status_code, 204)
self.check_vrouter_exists("vnet-guid-"+str(i+6), 1006+i)
vrouter_table = self.configdb.hgetall(VNET_TB + '|' + VNET_NAME_PREF +str(i+3))
self.assertEqual(vrouter_table, {
b'vxlan_tunnel': b'default_vxlan_tunnel',
b'vni': b'100'+str(i+6),
b'guid': b'vnet-guid-'+str(i+6)
})
# Vlan
def test_vlan_wo_ippref_vnetid_all_verbs(self):
# post
r = self.post_config_vlan(2, {})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan(2)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': 2,
'attr': {}
})
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
        self.assertEqual(vlan_table, {b'host_ifname': b'MonVlan2', b'vlanid': b'2'})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_intf_table, {})
# delete
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_table, {})
def test_vlan_with_vnetid_all_verbs(self):
# post
self.post_generic_vrouter_and_deps()
r = self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1'})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan(2)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': 2,
'attr': {'vnet_id':'vnet-guid-1'}
})
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
        self.assertEqual(vlan_table, {b'host_ifname': b'MonVlan2', b'vlanid': b'2'})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_intf_table, {b'proxy_arp': b'enabled', b'vnet_name': VNET_NAME_PREF + '1'})
# delete
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_table, {})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_intf_table, {})
def test_vlan_with_ippref_all_verbs(self):
# post
self.post_generic_vrouter_and_deps()
r = self.post_config_vlan(2, {'ip_prefix':'10.0.1.1/24'})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan(2)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': 2,
'attr': {'ip_prefix':'10.0.1.1/24'}
})
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
        self.assertEqual(vlan_table, {b'host_ifname': b'MonVlan2', b'vlanid': b'2'})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2|10.0.1.1/24')
self.assertEqual(vlan_intf_table, {b'':b''})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_intf_table, {})
# delete
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_table, {})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2|10.0.1.1/24')
self.assertEqual(vlan_intf_table, {})
def test_vlan_all_args_all_verbs(self):
# post
self.post_generic_vrouter_and_deps()
r = self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan(2)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': 2,
'attr': {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'}
})
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
        self.assertEqual(vlan_table, {b'host_ifname': b'MonVlan2', b'vlanid': b'2'})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2|10.0.1.1/24')
self.assertEqual(vlan_intf_table, {b'':b''})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_intf_table, {b'proxy_arp': b'enabled', b'vnet_name': VNET_NAME_PREF+'1'})
# delete
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
vlan_table = self.configdb.hgetall(VLAN_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_table, {})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2')
self.assertEqual(vlan_intf_table, {})
vlan_intf_table = self.configdb.hgetall(VLAN_INTF_TB + '|' + VLAN_NAME_PREF + '2|10.0.1.1/24')
self.assertEqual(vlan_intf_table, {})
def test_get_vlans_per_vnetid_1digitvlans(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
self.post_config_vrouter_vrf_id('vnet-guid-2', {'vnid': 2001})
#create vlan interfaces
self.post_config_vlan(3, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.4.1/24'})
self.post_config_vlan(4, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.3.1/24'})
self.post_config_vlan(5, {'vnet_id' : 'vnet-guid-2', 'ip_prefix':'10.2.4.1/24'})
self.post_config_vlan(6, {'vnet_id' : 'vnet-guid-2', 'ip_prefix':'10.2.3.1/24'})
        # get the vlans configured under vnet-guid-1 and vnet-guid-2
r_vnet1 = self.get_config_interface_vlans('vnet-guid-1')
r_vnet2 = self.get_config_interface_vlans('vnet-guid-2')
j_vnet1 = json.loads(r_vnet1.text)
j_vnet2 = json.loads(r_vnet2.text)
k_vnet1 = {"vnet_id":"vnet-guid-1","attr":[{"vlan_id":3,"ip_prefix":"10.0.4.1/24"},{"vlan_id":4,"ip_prefix":"10.0.3.1/24"}]}
k_vnet2 = {"vnet_id":"vnet-guid-2","attr":[{"vlan_id":5,"ip_prefix":"10.2.4.1/24"},{"vlan_id":6,"ip_prefix":"10.2.3.1/24"}]}
        # 'attr' carries the per-vnet vlan list; order is not guaranteed, so
        # compare it as a multiset and every other key exactly.
        for key, value in j_vnet1.items():
            if not isinstance(value, list):
                self.assertEqual(k_vnet1[key], j_vnet1[key])
            else:
                self.assertItemsEqual(value, k_vnet1['attr'])
        for key, value in j_vnet2.items():
            if not isinstance(value, list):
                self.assertEqual(k_vnet2[key], j_vnet2[key])
            else:
                self.assertItemsEqual(value, k_vnet2['attr'])
def test_get_vlans_per_vnetid_4digitvlans(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
self.post_config_vrouter_vrf_id('vnet-guid-2', {'vnid': 2002})
#create vlan interfaces
self.post_config_vlan(1111, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.post_config_vlan(2222, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.2.1/24'})
self.post_config_vlan(3000, {'vnet_id' : 'vnet-guid-2'})
self.post_config_vlan(4000, {'vnet_id' : 'vnet-guid-2', 'ip_prefix':'10.2.2.1/24'})
        # get the vlans configured under vnet-guid-1 and vnet-guid-2
r = self.get_config_interface_vlans('vnet-guid-1')
j = json.loads(r.text)
r2 = self.get_config_interface_vlans('vnet-guid-2')
j2 = json.loads(r2.text)
k = {"vnet_id":"vnet-guid-1","attr":[{"vlan_id":1111,"ip_prefix":"10.0.1.1/24"},{"vlan_id":2222,"ip_prefix":"10.0.2.1/24"}]}
k2 = {"vnet_id":"vnet-guid-2","attr":[{"vlan_id":3000},{"vlan_id":4000,"ip_prefix":"10.2.2.1/24"}]}
        for key, value in j.items():
            if not isinstance(value, list):
                self.assertEqual(k[key], j[key])
            else:
                self.assertItemsEqual(value, k['attr'])
        for key, value in j2.items():
            if not isinstance(value, list):
                self.assertEqual(k2[key], j2[key])
            else:
                self.assertItemsEqual(value, k2['attr'])
# Vlan Get
def test_get_all_vlans(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
#create vlan interfaces
self.post_config_vlan(3000, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.post_config_vlan(3001, {'vnet_id' : 'vnet-guid-1'})
# get all vlans
r = self.get_config_vlans_all()
j = json.loads(r.text)
k = {"attr":[{"vlan_id":3000,"ip_prefix":"10.0.1.1/24","vnet_id":"vnet-guid-1"},{"vlan_id":3001,"vnet_id":"vnet-guid-1"}]}
        for key, value in j.items():
            if not isinstance(value, list):
                self.assertEqual(k[key], j[key])
                return
            for item in k[key]:
                if item not in value:
                    self.fail('expected vlan entry missing from response: {}'.format(item))
# Vlan Member
def test_vlan_member_tagged_untagged_interop(self):
vlan0 = 2
vlans = [3,4]
members = ["Ethernet2", "Ethernet3", "Ethernet4"]
self.post_generic_vrouter_and_deps()
r = self.post_config_vlan(vlan0, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.assertEqual(r.status_code, 204)
for member in members:
r = self.post_config_vlan_member(vlan0, member, {'tagging_mode' : 'untagged'})
self.assertEqual(r.status_code, 204)
r = self.get_config_vlan_member(vlan0, member)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': vlan0,
'if_name': member,
'attr': {'tagging_mode' : 'untagged'}
})
for vlan in vlans:
r = self.post_config_vlan(vlan, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.assertEqual(r.status_code, 204)
# post
for member in members:
r = self.post_config_vlan_member(vlan, member, {'tagging_mode' : 'tagged'})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan_member(vlan, member)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': vlan,
'if_name': member,
'attr': {'tagging_mode' : 'tagged'}
})
def test_vlan_member_tagging_all_verbs(self):
vlans = [2,3]
members = ['Ethernet2', 'Ethernet3']
self.post_generic_vrouter_and_deps()
for vlan in vlans:
r = self.post_config_vlan(vlan, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.assertEqual(r.status_code, 204)
# post
for member in members:
r = self.post_config_vlan_member(vlan, member, {'tagging_mode' : 'tagged'})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan_member(vlan, member)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': vlan,
'if_name': member,
'attr': {'tagging_mode' : 'tagged'}
})
vlan_mem_table = self.configdb.hgetall(VLAN_MEMB_TB + '|' + VLAN_NAME_PREF +str(vlan)+'|'+member)
self.assertEqual(vlan_mem_table, {b'tagging_mode':b'tagged'})
# delete
r = self.delete_config_vlan_member(vlan, member)
self.assertEqual(r.status_code, 204)
vlan_mem_table = self.configdb.hgetall(VLAN_MEMB_TB + '|' + VLAN_NAME_PREF + str(vlan) + "|" + member)
self.assertEqual(vlan_mem_table, {})
def test_vlan_member_notagging_all_verbs(self):
# post
self.post_generic_vlan_and_deps()
r = self.post_config_vlan_member(2, "Ethernet2", {})
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan_member(2, "Ethernet2")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': 2,
'if_name': 'Ethernet2',
'attr': {'tagging_mode' : 'untagged'}
})
vlan_mem_table = self.configdb.hgetall(VLAN_MEMB_TB + '|' + VLAN_NAME_PREF + '2|Ethernet2')
self.assertEqual(vlan_mem_table, {b'tagging_mode':b'untagged'})
def test_get_members_per_vlan(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
#create vlan interface 2
self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
members = ["Ethernet2", "Ethernet3", "Ethernet4"]
for member in members:
self.post_config_vlan_member(2, member, {'tagging_mode' : 'untagged'})
r = self.get_config_interface_vlan_members(2)
j = json.loads(r.text)
self.assertItemsEqual( j,
{"vlan_id":2,"attr":[{"if_name":"Ethernet2","tagging_mode":"untagged"},{"if_name":"Ethernet3","tagging_mode":"untagged"},{"if_name":"Ethernet4","tagging_mode":"untagged"}]}
)
# Vlan Neighbor
def test_vlan_neighbor_all_verbs(self):
# post
self.post_generic_vlan_and_deps()
r = self.post_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 204)
# get
r = self.get_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, {
'vlan_id': 2,
'ip_addr': '10.10.10.10'
})
vlan_neigh_table = self.configdb.hgetall(VLAN_NEIGH_TB + '|' + VLAN_NAME_PREF + '2|10.10.10.10')
self.assertEqual(vlan_neigh_table, {b'family':b'IPv4'})
# delete
r = self.delete_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 204)
vlan_neigh_table = self.configdb.hgetall(VLAN_NEIGH_TB + '|' + VLAN_NAME_PREF + '2|10.10.10.10')
self.assertEqual(vlan_neigh_table, {})
def test_get_neighbors_per_vlan(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
        # create vlan interface 3
self.post_config_vlan(3, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.2.1/24'})
self.post_config_vlan_neighbor(3, "10.10.20.10")
self.post_config_vlan_neighbor(3, "10.10.30.10")
        # get the neighbors configured on vlan 3
r = self.get_config_interface_vlan_neighbors(3)
j = json.loads(r.text)
self.assertItemsEqual( j,
{"vlan_id":3,"attr":[{"ip_addr":"10.10.20.10"},{"ip_addr":"10.10.30.10"}]}
)
# Routes
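    # The route tests PATCH lists of {cmd: add|delete, ...} entries and verify
    # both the API view (GET) and the resulting APP DB route tables.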
def test_patch_update_routes_with_optional_args(self):
self.post_generic_vlan_and_deps()
# No optional args
route = {
'cmd':'add',
'ip_prefix':'10.2.1.0/24',
'nexthop':'192.168.2.1'
}
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", [route])
self.assertEqual(r.status_code, 204)
route_table = self.db.hgetall(ROUTE_TUN_TB + ':' + VNET_NAME_PREF +str(1)+':'+route['ip_prefix'])
self.assertEqual(route_table, {b'endpoint' : route['nexthop']})
del route['cmd']
routes = list()
routes.append(route)
routes.append({'nexthop': '', 'ip_prefix': '10.1.1.0/24', 'ifname': 'Vlan2'})
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes)
# Vnid Optional arg
route['vnid'] = 5000
route['cmd'] = 'add'
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", [route])
self.assertEqual(r.status_code, 204)
route_table = self.db.hgetall(ROUTE_TUN_TB + ':' + VNET_NAME_PREF +str(1)+':'+route['ip_prefix'])
self.assertEqual(route_table, {b'endpoint' : route['nexthop'],
b'vni' : str(route['vnid'])
})
del route['cmd']
routes = list()
routes.append(route)
routes.append({'nexthop': '', 'ip_prefix': '10.1.1.0/24', 'ifname': 'Vlan2'})
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes)
# Mac address Optional arg
del route['vnid']
route['mac_address'] = '00:08:aa:bb:cd:ef'
route['cmd'] = 'add'
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", [route])
self.assertEqual(r.status_code, 204)
route_table = self.db.hgetall(ROUTE_TUN_TB + ':' + VNET_NAME_PREF +str(1)+':'+route['ip_prefix'])
self.assertEqual(route_table, {b'endpoint' : route['nexthop'],
b'mac_address' : route['mac_address']
})
del route['cmd']
routes = list()
routes.append(route)
routes.append({'nexthop': '', 'ip_prefix': '10.1.1.0/24', 'ifname': 'Vlan2'})
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes)
def test_patch_routes_drop_bm_routes_tunnel(self):
cidr = [24,30,32]
self.post_generic_vrouter_and_deps()
rv = self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(rv.status_code, 204)
r = self.post_config_vlan(3, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.1.5.0/24'})
self.assertEqual(r.status_code, 204)
routes = []
for i in range (1,7):
for ci in cidr:
routes.append({'cmd':'add',
'ip_prefix':'10.1.'+str(i)+'.1/'+str(ci),
'nexthop':'34.53.'+str(i)+'.0',
'vnid': 1 + i%5,
'mac_address':'00:08:aa:bb:cd:'+hex(15+i)[2:]})
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes)
self.assertEqual(r.status_code, 204)
routes_bm = []
routes_not_bm = []
for route in routes:
del route['cmd']
if route['nexthop'] == '34.53.1.0':
routes_bm.append(route)
else:
routes_not_bm.append(route)
self.check_routes_exist_in_tun_tb(1, routes_not_bm)
self.check_routes_dont_exist_in_tun_tb(1, routes_bm)
routes_not_bm.append({'nexthop': '', 'ip_prefix': '10.1.1.0/24', 'ifname': 'Vlan2'})
routes_not_bm.append({'nexthop': '', 'ip_prefix': '10.1.5.0/24', 'ifname': 'Vlan3'})
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes_not_bm)
for route in routes_bm:
route['cmd'] = 'delete'
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes_bm)
self.assertEqual(r.status_code, 204)
def test_patch_routes_drop_bm_routes_local(self):
cidr = [24,30,32]
self.post_generic_vrouter_and_deps()
rv = self.post_config_vlan(2, {'vnet_id':'vnet-guid-1', 'ip_prefix':'10.1.1.0/24'})
self.assertEqual(rv.status_code, 204)
r = self.post_config_vlan(3, {'vnet_id':'vnet-guid-1', 'ip_prefix':'10.1.5.0/24'})
self.assertEqual(r.status_code, 204)
routes = []
for i in range (1,7):
for ci in cidr:
routes.append({'cmd':'add',
'ip_prefix':'10.1.'+str(i)+'.1/'+str(ci),
'nexthop':'34.53.'+str(i)+'.0',
'ifname': 'Vlan3005'})
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes)
self.assertEqual(r.status_code, 204)
routes_bm = []
routes_not_bm = []
for route in routes:
del route['cmd']
if route['nexthop'] == '34.53.1.0':
routes_bm.append(route)
else:
routes_not_bm.append(route)
self.check_routes_exist_in_loc_route_tb(1, routes_not_bm)
self.check_routes_dont_exist_in_loc_route_tb(1, routes_bm)
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
j.remove({'nexthop': '', 'ifname': 'Vlan3', 'ip_prefix': '10.1.5.0/24'})
j.remove({'nexthop': '', 'ifname': 'Vlan2', 'ip_prefix': '10.1.1.0/24'})
self.assertItemsEqual(j, routes_not_bm)
for route in routes_bm:
route['cmd'] = 'delete'
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes_bm)
self.assertEqual(r.status_code, 204)
def test_routes_all_verbs(self):
self.post_generic_vlan_and_deps()
routes = []
rv = self.post_config_vrouter_vrf_id("vnet-guid-2", {
'vnid': 1002
})
self.assertEqual(rv.status_code, 204)
for i in range (1,100):
routes.append({'cmd':'add',
'ip_prefix':'10.2.'+str(i)+'.0/24',
'nexthop':'192.168.2.'+str(i),
'vnid': 1 + i%5,
'mac_address':'00:08:aa:bb:cd:'+hex(15+i)[2:]})
# Patch add
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes)
self.assertEqual(r.status_code, 204)
self.check_routes_exist_in_tun_tb(1, routes)
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-2", routes)
self.assertEqual(r.status_code, 204)
self.check_routes_exist_in_tun_tb(2, routes)
# Patch delete
# Get all
for route in routes:
del route['cmd']
routes.append({'nexthop': '', 'ip_prefix': '10.1.1.0/24', 'ifname': 'Vlan2'})
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes)
# Get filtered by vnid
routes_vnid = []
routes_not_vnid = []
route_pref = {}
i = 0
for route in routes:
if i == 70:
route_pref = route
if 'vnid' in route and route['vnid'] == 5:
routes_vnid.append(route)
else:
routes_not_vnid.append(route)
i += 1
routes_vnid.append({'nexthop': '', 'ifname': 'Vlan2', 'ip_prefix': '10.1.1.0/24'})
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1", vnid=5)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes_vnid)
routes_vnid.remove({'nexthop': '', 'ifname': 'Vlan2', 'ip_prefix': '10.1.1.0/24'})
# Get filtered by ip_prefix
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1", ip_prefix=route_pref['ip_prefix'])
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, [route_pref])
# Get filtered by both ip_prefix and vnid
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1", vnid=2, ip_prefix=route_pref['ip_prefix'])
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, [route_pref])
# Delete filtered by vnid
r = self.delete_config_vrouter_vrf_id_routes("vnet-guid-1", vnid=5)
self.assertEqual(r.status_code, 204)
routes_not_vnid_cleaned = routes_not_vnid
for route in routes_not_vnid:
if "mac_address" not in route:
routes_not_vnid_cleaned.remove(route)
self.check_routes_exist_in_tun_tb(1, routes_not_vnid_cleaned)
self.check_routes_dont_exist_in_tun_tb(1, routes_vnid)
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1", vnid=5)
j = json.loads(r.text)
self.assertEqual(j, [])
# Patch combo add and delete
routes_cleaned = []
for route in routes:
if len(route["nexthop"]) > 1:
if "vnid" in route and route['vnid'] == 5:
route['cmd'] = 'add'
else:
route['cmd'] = 'delete'
routes_cleaned.append(route)
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes_cleaned)
self.assertEqual(r.status_code, 204)
self.check_routes_exist_in_tun_tb(1, routes_vnid)
self.check_routes_dont_exist_in_tun_tb(1, routes_not_vnid)
for route in routes_cleaned:
del route['cmd']
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes_vnid)
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1", vnid=4)
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertEqual(j, [])
# Delete all routes
r = self.delete_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 204)
self.check_routes_dont_exist_in_tun_tb(1, routes)
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
j = json.loads(r.text)
self.assertEqual(j, [])
# Test that routes in other Vnet are untouched
self.check_routes_exist_in_tun_tb(2, routes_cleaned)
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-2")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes_cleaned)
def test_vrf_routes_all_verbs(self):
routes = []
routes.append({'cmd':'add',
'ip_prefix':'20.1.2.0/24',
'nexthop':'192.168.2.200'})
routes.append({'cmd':'add',
'ip_prefix':'30.1.2.0/24',
'nexthop':'192.168.2.200,192.168.2.201'})
routes.append({'cmd':'add',
'ip_prefix':'40.1.2.0/24',
'nexthop':'192.168.2.200,192.168.2.201,192.168.2.202',
'ifname':'Ethernet0,Ethernet4,Ethernet8'})
routes.append({'cmd':'add',
'ip_prefix':'50.1.2.0/24',
'ifname':'Ethernet0,Ethernet4'})
routes.append({'cmd':'add',
'ip_prefix':'60.1.2.0/24',
'ifname':'Ethernet8'})
# Patch add
r = self.patch_config_vrf_vrf_id_routes("default", routes)
self.assertEqual(r.status_code, 204)
for route in routes:
del route['cmd']
if 'nexthop' not in route:
route['nexthop'] = ''
r = self.get_config_vrf_vrf_id_routes("default")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes)
# Patch del
for route in routes:
route['cmd'] = 'delete'
if route['nexthop'] == '':
del route['nexthop']
r = self.patch_config_vrf_vrf_id_routes("default", routes)
self.assertEqual(r.status_code, 204)
r = self.get_config_vrf_vrf_id_routes("default")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
routes = []
self.assertItemsEqual(j, routes)
# Test modify
routes.append({'cmd':'add',
'ip_prefix':'40.1.2.0/24',
'nexthop':'192.168.2.200,192.168.2.201',
'ifname':'Ethernet0,Ethernet4'})
r = self.patch_config_vrf_vrf_id_routes("default", routes)
self.assertEqual(r.status_code, 204)
for route in routes:
route['nexthop'] = '192.168.2.200,192.168.2.201,10.1.1.1'
route['ifname'] = 'Ethernet0,Ethernet4,Vlan1000'
r = self.patch_config_vrf_vrf_id_routes("default", routes)
self.assertEqual(r.status_code, 204)
for route in routes:
del route['cmd']
r = self.get_config_vrf_vrf_id_routes("default")
self.assertEqual(r.status_code, 200)
j = json.loads(r.text)
self.assertItemsEqual(j, routes)
def test_local_subnet_route_addition(self):
self.post_generic_vlan_and_deps()
local_route_table = self.db.hgetall(LOCAL_ROUTE_TB + ':' + VNET_NAME_PREF +str(1)+':10.1.1.0/24')
self.assertEqual(local_route_table, {b'ifname' : VLAN_NAME_PREF + '2'})
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
local_route_table = self.db.hgetall(LOCAL_ROUTE_TB + ':' + VNET_NAME_PREF +str(1)+':10.1.1.0/24')
self.assertEqual(local_route_table, {})
# Operations
# PingVRF
def test_post_ping(self):
vlan0 = 2
self.post_generic_vrouter_and_deps()
# Ping loss but response 200
r = self.post_ping({'vnet_id' : 'vnet-guid-1', 'count' : '2', 'ip_addr' : '8.8.8.8'})
self.assertEqual(r.status_code, 200)
# Ping success and response 200
r = self.post_ping({"count" : "2", "ip_addr" : "8.8.8.8"})
self.assertEqual(r.status_code, 200)
# Ping success and response 200
r = self.post_ping({"ip_addr" : "8.8.8.8"})
self.assertEqual(r.status_code, 200)
class ra_client_negative_tests(rest_api_client):
"""Invalid input tests"""
# Decap:
def test_delete_config_tunnel_decap_tunnel_type_not_vxlan(self):
r = self.delete_config_tunnel_decap_tunnel_type('not_vxlan')
self.assertEqual(r.status_code, 400)
j = json.loads(r.text)
self.assertListEqual(['tunnel_type'], j['error']['fields'])
def test_get_config_tunnel_decap_tunnel_not_created(self):
r = self.get_config_tunnel_decap_tunnel_type('vxlan')
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['tunnel_type'], j['error']['fields'])
# Vrouter:
def test_delete_vrouter_with_dependencies(self):
# Init
self.post_generic_vrouter_and_deps()
r = self.post_config_vrouter_vrf_id("vnet-guid-2", { 'vnid': 1002 })
self.assertEqual(r.status_code, 204)
# Vlan Dependency
rv = self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(rv.status_code, 204)
rv = self.post_config_vlan(3, {'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(rv.status_code, 204)
r = self.delete_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(DELETE_DEP, j['error']['sub-code'])
rv = self.delete_config_vlan(2)
self.assertEqual(rv.status_code, 204)
r = self.delete_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 204)
# Routes Dependency
rv = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(rv.status_code, 204)
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", [{'cmd':'add', 'ip_prefix':'10.1.2.0/24', 'nexthop':'192.168.2.1'}])
self.assertEqual(r.status_code, 204)
r = self.delete_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(DELETE_DEP, j['error']['sub-code'])
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", [{'cmd':'delete', 'ip_prefix':'10.1.2.0/24', 'nexthop':'192.168.2.1'}])
self.assertEqual(r.status_code, 204)
r = self.delete_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 204)
def test_vrouter_not_created_all_verbs(self):
# Get
r = self.get_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 404)
# Delete
r = self.delete_config_vrouter_vrf_id("vnet-guid-1")
self.assertEqual(r.status_code, 404)
# Vrouter Routes
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", [{'cmd':'add', 'ip_prefix':'10.1.2.0/24', 'nexthop':'192.168.2.1'}])
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vnet-guid-1'], j['error']['fields'])
r = self.get_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vnet-guid-1'], j['error']['fields'])
r = self.delete_config_vrouter_vrf_id_routes("vnet-guid-1")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vnet-guid-1'], j['error']['fields'])
def test_post_vrouter_without_vtep(self):
r = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertListEqual(['tunnel'], j['error']['fields'])
self.assertEqual(DEP_MISSING, j['error']['sub-code'])
def test_post_vrouter_which_exists(self):
self.post_generic_vrouter_and_deps()
r = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': 1001
})
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(RESRC_EXISTS, j['error']['sub-code'])
def test_post_vrouter_malformed_arg(self):
self.post_generic_vrouter_and_deps()
r = self.post_config_vrouter_vrf_id("vnet-guid-1", {
'vnid': "this is malformed"
})
self.assertEqual(r.status_code, 400)
j = json.loads(r.text)
self.assertListEqual(['vnid'], j['error']['fields'])
# Vlan
def test_post_vlan_which_exists(self):
self.post_generic_vlan_and_deps()
r = self.post_config_vlan(2, {})
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(RESRC_EXISTS, j['error']['sub-code'])
def test_vlan_not_created_all_verbs(self):
# Get
r = self.get_config_vlan(2)
self.assertEqual(r.status_code, 404)
r = self.get_config_vlan_member(2, "ethernet2")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vlan_id'], j['error']['fields'])
r = self.get_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vlan_id'], j['error']['fields'])
# Delete
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 404)
r = self.delete_config_vlan_member(2, "ethernet2")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vlan_id'], j['error']['fields'])
r = self.delete_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vlan_id'], j['error']['fields'])
# Post
r = self.post_config_vlan_member(2, "ethernet2", {})
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vlan_id'], j['error']['fields'])
r = self.post_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['vlan_id'], j['error']['fields'])
def test_vlan_out_of_range(self):
vlan_ids = [1,4095]
member = "Ethernet1"
ip_addr = "10.10.1.1"
for vlan_id in vlan_ids:
r = self.post_config_vlan(vlan_id, {})
self.assertEqual(r.status_code, 400)
r = self.get_config_vlan(vlan_id)
self.assertEqual(r.status_code, 400)
r = self.delete_config_vlan(vlan_id)
self.assertEqual(r.status_code, 400)
r = self.post_config_vlan_member(vlan_id, member, {})
self.assertEqual(r.status_code, 400)
r = self.get_config_vlan_member(vlan_id, member)
self.assertEqual(r.status_code, 400)
r = self.delete_config_vlan_member(vlan_id, member)
self.assertEqual(r.status_code, 400)
            r = self.post_config_vlan_neighbor(vlan_id, ip_addr)
self.assertEqual(r.status_code, 400)
r = self.get_config_vlan_neighbor(vlan_id, ip_addr)
self.assertEqual(r.status_code, 400)
r = self.delete_config_vlan_neighbor(vlan_id, ip_addr)
self.assertEqual(r.status_code, 400)
def test_delete_vlan_with_dependencies(self):
# Init generic config
self.post_generic_vlan_and_deps()
rv = self.post_config_vlan(3, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(rv.status_code, 204)
rv = self.post_config_vlan_member(3, "Ethernet2", {})
self.assertEqual(rv.status_code, 204)
# Dependency Vlan Member
rv = self.post_config_vlan_member(2, "Ethernet1", {})
self.assertEqual(rv.status_code, 204)
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(DELETE_DEP, j['error']['sub-code'])
rv = self.delete_config_vlan_member(2, "Ethernet1")
self.assertEqual(rv.status_code, 204)
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
# Dependency Vlan Neighbor
rv = self.post_config_vlan(2, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
rv = self.post_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(rv.status_code, 204)
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(DELETE_DEP, j['error']['sub-code'])
rv = self.delete_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(rv.status_code, 204)
r = self.delete_config_vlan(2)
self.assertEqual(r.status_code, 204)
def test_get_vlans_per_vnetid_invalid_vlan(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
#create invalid vlan interfaces
self.post_config_vlan(5555, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.post_config_vlan(4096, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.2.1/24'})
# get vlans for vnet-guid-1
r = self.get_config_interface_vlans('vnet-guid-1')
j = json.loads(r.text)
self.assertEqual(j,{u'attr': None, u'vnet_id': u'vnet-guid-1'})
def test_get_vlans_per_vnetid_invalid_vnet(self):
# create vxlan tunnel
self.post_config_tunnel_decap_tunnel_type('vxlan', {
'ip_addr': '6.6.6.6'
})
# create vnet_id/vrf
self.post_config_vrouter_vrf_id('vnet-guid-1', {'vnid': 1001})
#create vlan interfaces
self.post_config_vlan(555, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.1.1/24'})
self.post_config_vlan(409, {'vnet_id' : 'vnet-guid-1', 'ip_prefix':'10.0.2.1/24'})
        # request the vlans for an empty (invalid) vnet id
r = self.get_config_interface_vlans('')
j = json.loads(r.text)
self.assertEqual(r.status_code,404)
# Vlan Member
def test_post_vlan_mem_which_exists_tagged(self):
self.post_generic_vlan_and_deps()
r = self.post_config_vlan(3, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(r.status_code, 204)
attr = {'tagging_mode' : 'tagged'}
r = self.post_config_vlan_member(2, "Ethernet1", attr)
self.assertEqual(r.status_code, 204)
r = self.post_config_vlan_member(2, "Ethernet1", attr)
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(RESRC_EXISTS, j['error']['sub-code'])
r = self.post_config_vlan_member(3, "Ethernet1", attr)
self.assertEqual(r.status_code, 204)
def test_post_vlan_mem_which_exists_untagged(self):
self.post_generic_vlan_and_deps()
r = self.post_config_vlan(3, {'vnet_id' : 'vnet-guid-1', 'ip_prefix' : '10.1.1.0/24'})
self.assertEqual(r.status_code, 204)
attr = {'tagging_mode' : 'untagged'}
r = self.post_config_vlan_member(2, "Ethernet1", attr)
self.assertEqual(r.status_code, 204)
r = self.post_config_vlan_member(2, "Ethernet1", attr)
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(RESRC_EXISTS, j['error']['sub-code'])
r = self.post_config_vlan_member(3, "Ethernet1", attr)
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(RESRC_EXISTS, j['error']['sub-code'])
r = self.delete_config_vlan_member(2, "Ethernet1")
self.assertEqual(r.status_code, 204)
r = self.post_config_vlan_member(3, "Ethernet1", attr)
self.assertEqual(r.status_code, 204)
attr = {'tagging_mode' : 'tagged'}
r = self.post_config_vlan_member(2, "Ethernet1", attr)
self.assertEqual(r.status_code, 204)
def test_get_vlan_member_not_created(self):
self.post_generic_vlan_and_deps()
r = self.get_config_vlan_member(2, "ethernet2")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['if_name'], j['error']['fields'])
def test_delete_vlan_member_not_created(self):
self.post_generic_vlan_and_deps()
r = self.delete_config_vlan_member(2, "ethernet2")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['if_name'], j['error']['fields'])
# Vlan Neighbor
def test_post_vlan_neighbor_which_exists(self):
self.post_generic_vlan_and_deps()
r = self.post_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 204)
r = self.post_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 409)
j = json.loads(r.text)
self.assertEqual(RESRC_EXISTS, j['error']['sub-code'])
def test_get_vlan_neighbor_not_created(self):
self.post_generic_vlan_and_deps()
r = self.get_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['ip_addr'], j['error']['fields'])
def test_delete_vlan_neighbor_not_created(self):
self.post_generic_vlan_and_deps()
r = self.delete_config_vlan_neighbor(2, "10.10.10.10")
self.assertEqual(r.status_code, 404)
j = json.loads(r.text)
self.assertListEqual(['ip_addr'], j['error']['fields'])
def test_vlan_neighbor_not_valid_ip(self):
self.post_generic_vlan_and_deps()
# post
r = self.post_config_vlan_neighbor(2, "a.b.c.d")
self.assertEqual(r.status_code, 400)
j = json.loads(r.text)
self.assertListEqual(['ip_addr'], j['error']['fields'])
# get
r = self.get_config_vlan_neighbor(2, "a.b.c.d")
self.assertEqual(r.status_code, 400)
j = json.loads(r.text)
self.assertListEqual(['ip_addr'], j['error']['fields'])
# delete
r = self.delete_config_vlan_neighbor(2, "a.b.c.d")
self.assertEqual(r.status_code, 400)
j = json.loads(r.text)
self.assertListEqual(['ip_addr'], j['error']['fields'])
# Routes
def test_patch_delete_routes_not_created(self):
self.post_generic_vlan_and_deps()
routes = []
for i in range (1,100):
routes.append({'cmd':'delete',
'ip_prefix':'10.2.'+str(i)+'.0/24',
'nexthop':'192.168.2.'+str(i),
'vnid': 1 + i%5,
'mac_address':'00:08:aa:bb:cd:'+hex(15+i)[2:]})
# Patch
r = self.patch_config_vrouter_vrf_id_routes("vnet-guid-1", routes)
self.assertEqual(r.status_code, 207)
j = json.loads(r.text)
for route in routes:
route['error_code'] = 404
route['error_msg'] = 'Not found'
self.assertItemsEqual(routes, j['failed'])
self.check_routes_dont_exist_in_tun_tb(1, routes)
# Operations
# PingVRF
def test_post_ping_invalid(self):
vlan0 = 2
self.post_generic_vrouter_and_deps()
# Invalid count scenario
r = self.post_ping({"count" : "abc", "ip_addr" : "8.8.8.8"})
self.assertEqual(r.status_code, 400)
# Invalid ip_addr scenario
r = self.post_ping({"ip_addr" : "8.8.8.888"})
self.assertEqual(r.status_code, 400)
# vnet_id not found 404 error
r = self.post_ping({'vnet_id' : 'vnet-1', 'ip_addr' : '8.8.8.8'})
self.assertEqual(r.status_code, 404)
if __name__ == '__main__':
    # run both suites only when the file is executed directly, not on import
    suite = unittest.TestLoader().loadTestsFromTestCase(ra_client_positive_tests)
    unittest.TextTestRunner(verbosity=2).run(suite)
    suite = unittest.TestLoader().loadTestsFromTestCase(ra_client_negative_tests)
    unittest.TextTestRunner(verbosity=2).run(suite)
| 41.164669 | 184 | 0.598811 | 8,953 | 65,246 | 4.095834 | 0.04021 | 0.0994 | 0.045296 | 0.088192 | 0.87622 | 0.850914 | 0.811372 | 0.770766 | 0.742351 | 0.712135 | 0 | 0.044551 | 0.256215 | 65,246 | 1,584 | 185 | 41.190657 | 0.71108 | 0.032538 | 0 | 0.646481 | 0 | 0.001637 | 0.143034 | 0.024438 | 0 | 0 | 0 | 0 | 0.233224 | 0 | null | null | 0 | 0.006547 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
07298e7f19254efcd88bf4beafcea9bee1e77a5f | 487 | py | Python | ipymarkup/__init__.py | natasha/ipymarkup | 6279cdc8f896a2dbfaa5c18924c11fe7d57bcf4b | [
"MIT"
] | 108 | 2018-07-13T03:46:30.000Z | 2022-03-08T03:05:45.000Z | ipymarkup/__init__.py | natasha/ipymarkup | 6279cdc8f896a2dbfaa5c18924c11fe7d57bcf4b | [
"MIT"
] | 5 | 2019-05-20T13:54:58.000Z | 2020-05-26T07:29:18.000Z | ipymarkup/__init__.py | natasha/ipymarkup | 6279cdc8f896a2dbfaa5c18924c11fe7d57bcf4b | [
"MIT"
] | 24 | 2018-10-12T15:21:07.000Z | 2021-11-11T20:33:54.000Z |
from .span import format_span_box_markup, show_span_box_markup # noqa
from .span import format_span_line_markup, show_span_line_markup # noqa
from .span import format_span_ascii_markup, show_span_ascii_markup # noqa
from .dep import format_dep_markup, show_dep_markup # noqa
from .dep import format_dep_ascii_markup, show_dep_ascii_markup # noqa
# legacy
show_box_markup = show_span_box_markup
show_line_markup = show_span_line_markup
show_ascii_markup = show_span_ascii_markup
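
# Illustrative usage (not part of the original module; the span format is assumed to be
# (start, stop) or (start, stop, type) tuples):
#   from ipymarkup import show_span_box_markup
#   show_span_box_markup('Natasha lives in Moscow', [(0, 7, 'PER'), (17, 23, 'LOC')])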
| 34.785714 | 74 | 0.848049 | 80 | 487 | 4.625 | 0.15 | 0.27027 | 0.227027 | 0.162162 | 0.827027 | 0.762162 | 0.356757 | 0 | 0 | 0 | 0 | 0 | 0.112936 | 487 | 13 | 75 | 37.461538 | 0.856481 | 0.063655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.625 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0736a283797be128b1ac9bca1ee227aae2fa6125 | 1,468 | py | Python | tests/test_utils_system.py | Muflhi01/videoflow | c49d3fe6c814574bcda1a4e907ce52ea86e1617c | [
"MIT"
] | 1,022 | 2019-05-24T21:27:49.000Z | 2022-03-30T04:08:35.000Z | tests/test_utils_system.py | Muflhi01/videoflow | c49d3fe6c814574bcda1a4e907ce52ea86e1617c | [
"MIT"
] | 57 | 2019-05-25T06:48:44.000Z | 2021-06-23T17:17:51.000Z | tests/test_utils_system.py | Muflhi01/videoflow | c49d3fe6c814574bcda1a4e907ce52ea86e1617c | [
"MIT"
] | 88 | 2019-05-23T14:24:14.000Z | 2022-03-28T05:06:33.000Z | import pytest
import videoflow.utils.system as system
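
# No mocking of system.get_system_gpus here: with no GPUs reported by the test machine,
# no GPUs should be available to the process regardless of how CUDA_VISIBLE_DEVICES is set.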
def test_gpus_available_1(monkeypatch):
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', '')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 0
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', '0')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 0
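
# With exactly one system GPU (id 0) mocked in, an empty CUDA_VISIBLE_DEVICES hides it,
# while CUDA_VISIBLE_DEVICES='0' exposes it to the process.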
def test_gpus_available_2(monkeypatch):
    def get_system_gpus_mock():
        return set([0])
    monkeypatch.setattr(system, 'get_system_gpus', get_system_gpus_mock)
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', '')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 0
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', '0')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 1
    assert 0 in gpus
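
# With two system GPUs (ids 0 and 1) mocked in, only the IDs listed in CUDA_VISIBLE_DEVICES
# that also exist on the system are available; non-numeric entries are ignored.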
def test_gpus_available_3(monkeypatch):
    def get_system_gpus_mock():
        return set([0, 1])
    monkeypatch.setattr(system, 'get_system_gpus', get_system_gpus_mock)
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', '1, 2')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 1
    assert 1 in gpus
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', '2, 3')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 0
    monkeypatch.setenv('CUDA_VISIBLE_DEVICES', 'asdfa, 1, 0, asdf')
    gpus = system.get_gpus_available_to_process()
    assert len(gpus) == 2

if __name__ == "__main__":
    pytest.main([__file__])
| 29.959184 | 72 | 0.711853 | 199 | 1,468 | 4.854271 | 0.175879 | 0.134576 | 0.152174 | 0.202899 | 0.818841 | 0.782609 | 0.782609 | 0.782609 | 0.782609 | 0.697723 | 0 | 0.018977 | 0.174387 | 1,468 | 48 | 73 | 30.583333 | 0.778053 | 0 | 0 | 0.583333 | 0 | 0 | 0.139646 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.138889 | false | 0 | 0.055556 | 0.055556 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
073f3b2e6550ec8a6619560f34e1fcc486e8994f | 24,651 | py | Python | nova/tests/unit/objects/test_migration.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/unit/objects/test_migration.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/unit/objects/test_migration.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2013 IBM Corp.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'mock'
newline|'\n'
name|'from'
name|'oslo_utils'
name|'import'
name|'timeutils'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
name|'import'
name|'context'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'db'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'objects'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'migration'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
name|'import'
name|'fake_instance'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
op|'.'
name|'objects'
name|'import'
name|'test_objects'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
name|'import'
name|'uuidsentinel'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|variable|NOW
name|'NOW'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
op|'.'
name|'replace'
op|'('
name|'microsecond'
op|'='
number|'0'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|fake_db_migration
name|'def'
name|'fake_db_migration'
op|'('
op|'**'
name|'updates'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_instance'
op|'='
op|'{'
nl|'\n'
string|"'created_at'"
op|':'
name|'NOW'
op|','
nl|'\n'
string|"'updated_at'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'deleted_at'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'deleted'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'id'"
op|':'
number|'123'
op|','
nl|'\n'
string|"'source_compute'"
op|':'
string|"'compute-source'"
op|','
nl|'\n'
string|"'dest_compute'"
op|':'
string|"'compute-dest'"
op|','
nl|'\n'
string|"'source_node'"
op|':'
string|"'node-source'"
op|','
nl|'\n'
string|"'dest_node'"
op|':'
string|"'node-dest'"
op|','
nl|'\n'
string|"'dest_host'"
op|':'
string|"'host-dest'"
op|','
nl|'\n'
string|"'old_instance_type_id'"
op|':'
number|'42'
op|','
nl|'\n'
string|"'new_instance_type_id'"
op|':'
number|'84'
op|','
nl|'\n'
string|"'instance_uuid'"
op|':'
string|"'fake-uuid'"
op|','
nl|'\n'
string|"'status'"
op|':'
string|"'migrating'"
op|','
nl|'\n'
string|"'migration_type'"
op|':'
string|"'resize'"
op|','
nl|'\n'
string|"'hidden'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'memory_total'"
op|':'
number|'123456'
op|','
nl|'\n'
string|"'memory_processed'"
op|':'
number|'12345'
op|','
nl|'\n'
string|"'memory_remaining'"
op|':'
number|'120000'
op|','
nl|'\n'
string|"'disk_total'"
op|':'
number|'234567'
op|','
nl|'\n'
string|"'disk_processed'"
op|':'
number|'23456'
op|','
nl|'\n'
string|"'disk_remaining'"
op|':'
number|'230000'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
name|'if'
name|'updates'
op|':'
newline|'\n'
indent|' '
name|'db_instance'
op|'.'
name|'update'
op|'('
name|'updates'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'db_instance'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|_TestMigrationObject
dedent|''
name|'class'
name|'_TestMigrationObject'
op|'('
name|'object'
op|')'
op|':'
newline|'\n'
DECL|member|test_get_by_id
indent|' '
name|'def'
name|'test_get_by_id'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'migration_get'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_get'
op|'('
name|'ctxt'
op|','
name|'fake_migration'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_migration'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'.'
name|'get_by_id'
op|'('
name|'ctxt'
op|','
name|'fake_migration'
op|'['
string|"'id'"
op|']'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'mig'
op|','
name|'fake_migration'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_by_instance_and_status
dedent|''
name|'def'
name|'test_get_by_instance_and_status'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'migration_get_by_instance_and_status'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_get_by_instance_and_status'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'fake_migration'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
string|"'migrating'"
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_migration'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'.'
name|'get_by_instance_and_status'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'fake_migration'
op|'['
string|"'id'"
op|']'
op|','
string|"'migrating'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'mig'
op|','
name|'fake_migration'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.db.migration_get_in_progress_by_instance'"
op|')'
newline|'\n'
DECL|member|test_get_in_progress_by_instance
name|'def'
name|'test_get_in_progress_by_instance'
op|'('
name|'self'
op|','
name|'m_get_mig'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'db_migrations'
op|'='
op|'['
name|'fake_migration'
op|','
name|'dict'
op|'('
name|'fake_migration'
op|','
name|'id'
op|'='
number|'456'
op|')'
op|']'
newline|'\n'
nl|'\n'
name|'m_get_mig'
op|'.'
name|'return_value'
op|'='
name|'db_migrations'
newline|'\n'
name|'migrations'
op|'='
name|'migration'
op|'.'
name|'MigrationList'
op|'.'
name|'get_in_progress_by_instance'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'fake_migration'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'2'
op|','
name|'len'
op|'('
name|'migrations'
op|')'
op|')'
newline|'\n'
name|'for'
name|'index'
op|','
name|'db_migration'
name|'in'
name|'enumerate'
op|'('
name|'db_migrations'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'migrations'
op|'['
name|'index'
op|']'
op|','
name|'db_migration'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create
dedent|''
dedent|''
name|'def'
name|'test_create'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'migration_create'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_create'
op|'('
name|'ctxt'
op|','
op|'{'
string|"'source_compute'"
op|':'
string|"'foo'"
op|','
nl|'\n'
string|"'migration_type'"
op|':'
string|"'resize'"
op|'}'
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_migration'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'('
name|'context'
op|'='
name|'ctxt'
op|')'
newline|'\n'
name|'mig'
op|'.'
name|'source_compute'
op|'='
string|"'foo'"
newline|'\n'
name|'mig'
op|'.'
name|'migration_type'
op|'='
string|"'resize'"
newline|'\n'
name|'mig'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'fake_migration'
op|'['
string|"'dest_compute'"
op|']'
op|','
name|'mig'
op|'.'
name|'dest_compute'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_recreate_fails
dedent|''
name|'def'
name|'test_recreate_fails'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'migration_create'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_create'
op|'('
name|'ctxt'
op|','
op|'{'
string|"'source_compute'"
op|':'
string|"'foo'"
op|','
nl|'\n'
string|"'migration_type'"
op|':'
string|"'resize'"
op|'}'
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_migration'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'('
name|'context'
op|'='
name|'ctxt'
op|')'
newline|'\n'
name|'mig'
op|'.'
name|'source_compute'
op|'='
string|"'foo'"
newline|'\n'
name|'mig'
op|'.'
name|'migration_type'
op|'='
string|"'resize'"
newline|'\n'
name|'mig'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'ObjectActionError'
op|','
name|'mig'
op|'.'
name|'create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_fails_migration_type
dedent|''
name|'def'
name|'test_create_fails_migration_type'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'migration_create'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'('
name|'context'
op|'='
name|'ctxt'
op|','
nl|'\n'
name|'old_instance_type_id'
op|'='
number|'42'
op|','
nl|'\n'
name|'new_instance_type_id'
op|'='
number|'84'
op|')'
newline|'\n'
name|'mig'
op|'.'
name|'source_compute'
op|'='
string|"'foo'"
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'ObjectActionError'
op|','
name|'mig'
op|'.'
name|'create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_save
dedent|''
name|'def'
name|'test_save'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'migration_update'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_update'
op|'('
name|'ctxt'
op|','
number|'123'
op|','
op|'{'
string|"'source_compute'"
op|':'
string|"'foo'"
op|'}'
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_migration'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'('
name|'context'
op|'='
name|'ctxt'
op|')'
newline|'\n'
name|'mig'
op|'.'
name|'id'
op|'='
number|'123'
newline|'\n'
name|'mig'
op|'.'
name|'source_compute'
op|'='
string|"'foo'"
newline|'\n'
name|'mig'
op|'.'
name|'save'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'fake_migration'
op|'['
string|"'dest_compute'"
op|']'
op|','
name|'mig'
op|'.'
name|'dest_compute'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_instance
dedent|''
name|'def'
name|'test_instance'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'fake_inst'
op|'='
name|'fake_instance'
op|'.'
name|'fake_db_instance'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'db'
op|','
string|"'instance_get_by_uuid'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'instance_get_by_uuid'
op|'('
name|'ctxt'
op|','
name|'fake_migration'
op|'['
string|"'instance_uuid'"
op|']'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
op|'['
string|"'info_cache'"
op|','
nl|'\n'
string|"'security_groups'"
op|']'
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_inst'
op|')'
newline|'\n'
name|'mig'
op|'='
name|'migration'
op|'.'
name|'Migration'
op|'.'
name|'_from_db_object'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'migration'
op|'.'
name|'Migration'
op|'('
op|')'
op|','
nl|'\n'
name|'fake_migration'
op|')'
newline|'\n'
name|'mig'
op|'.'
name|'_context'
op|'='
name|'ctxt'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'mig'
op|'.'
name|'instance'
op|'.'
name|'host'
op|','
name|'fake_inst'
op|'['
string|"'host'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_instance_setter
dedent|''
name|'def'
name|'test_instance_setter'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'migration'
op|'='
name|'objects'
op|'.'
name|'Migration'
op|'('
name|'instance_uuid'
op|'='
name|'uuidsentinel'
op|'.'
name|'instance'
op|')'
newline|'\n'
name|'inst'
op|'='
name|'objects'
op|'.'
name|'Instance'
op|'('
name|'uuid'
op|'='
name|'uuidsentinel'
op|'.'
name|'instance'
op|')'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.objects.Instance.get_by_uuid'"
op|')'
name|'as'
name|'mock_get'
op|':'
newline|'\n'
indent|' '
name|'migration'
op|'.'
name|'instance'
op|'='
name|'inst'
newline|'\n'
name|'migration'
op|'.'
name|'instance'
newline|'\n'
name|'self'
op|'.'
name|'assertFalse'
op|'('
name|'mock_get'
op|'.'
name|'called'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'inst'
op|','
name|'migration'
op|'.'
name|'_cached_instance'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'inst'
op|','
name|'migration'
op|'.'
name|'instance'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_unconfirmed_by_dest_compute
dedent|''
name|'def'
name|'test_get_unconfirmed_by_dest_compute'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'db_migrations'
op|'='
op|'['
name|'fake_migration'
op|','
name|'dict'
op|'('
name|'fake_migration'
op|','
name|'id'
op|'='
number|'456'
op|')'
op|']'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
nl|'\n'
name|'db'
op|','
string|"'migration_get_unconfirmed_by_dest_compute'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_get_unconfirmed_by_dest_compute'
op|'('
nl|'\n'
name|'ctxt'
op|','
string|"'window'"
op|','
string|"'foo'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'db_migrations'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'migrations'
op|'='
op|'('
nl|'\n'
name|'migration'
op|'.'
name|'MigrationList'
op|'.'
name|'get_unconfirmed_by_dest_compute'
op|'('
nl|'\n'
name|'ctxt'
op|','
string|"'window'"
op|','
string|"'foo'"
op|','
name|'use_slave'
op|'='
name|'False'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'2'
op|','
name|'len'
op|'('
name|'migrations'
op|')'
op|')'
newline|'\n'
name|'for'
name|'index'
op|','
name|'db_migration'
name|'in'
name|'enumerate'
op|'('
name|'db_migrations'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'migrations'
op|'['
name|'index'
op|']'
op|','
name|'db_migration'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_in_progress_by_host_and_node
dedent|''
dedent|''
name|'def'
name|'test_get_in_progress_by_host_and_node'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'db_migrations'
op|'='
op|'['
name|'fake_migration'
op|','
name|'dict'
op|'('
name|'fake_migration'
op|','
name|'id'
op|'='
number|'456'
op|')'
op|']'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
nl|'\n'
name|'db'
op|','
string|"'migration_get_in_progress_by_host_and_node'"
op|')'
newline|'\n'
name|'db'
op|'.'
name|'migration_get_in_progress_by_host_and_node'
op|'('
nl|'\n'
name|'ctxt'
op|','
string|"'host'"
op|','
string|"'node'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'db_migrations'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'migrations'
op|'='
op|'('
nl|'\n'
name|'migration'
op|'.'
name|'MigrationList'
op|'.'
name|'get_in_progress_by_host_and_node'
op|'('
nl|'\n'
name|'ctxt'
op|','
string|"'host'"
op|','
string|"'node'"
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'2'
op|','
name|'len'
op|'('
name|'migrations'
op|')'
op|')'
newline|'\n'
name|'for'
name|'index'
op|','
name|'db_migration'
name|'in'
name|'enumerate'
op|'('
name|'db_migrations'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'migrations'
op|'['
name|'index'
op|']'
op|','
name|'db_migration'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_by_filters
dedent|''
dedent|''
name|'def'
name|'test_get_by_filters'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'db_migrations'
op|'='
op|'['
name|'fake_migration'
op|','
name|'dict'
op|'('
name|'fake_migration'
op|','
name|'id'
op|'='
number|'456'
op|')'
op|']'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
nl|'\n'
name|'db'
op|','
string|"'migration_get_all_by_filters'"
op|')'
newline|'\n'
name|'filters'
op|'='
op|'{'
string|"'foo'"
op|':'
string|"'bar'"
op|'}'
newline|'\n'
name|'db'
op|'.'
name|'migration_get_all_by_filters'
op|'('
name|'ctxt'
op|','
name|'filters'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'db_migrations'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
name|'migrations'
op|'='
name|'migration'
op|'.'
name|'MigrationList'
op|'.'
name|'get_by_filters'
op|'('
name|'ctxt'
op|','
name|'filters'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'2'
op|','
name|'len'
op|'('
name|'migrations'
op|')'
op|')'
newline|'\n'
name|'for'
name|'index'
op|','
name|'db_migration'
name|'in'
name|'enumerate'
op|'('
name|'db_migrations'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'migrations'
op|'['
name|'index'
op|']'
op|','
name|'db_migration'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_migrate_old_resize_record
dedent|''
dedent|''
name|'def'
name|'test_migrate_old_resize_record'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_migration'
op|'='
name|'dict'
op|'('
name|'fake_db_migration'
op|'('
op|')'
op|','
name|'migration_type'
op|'='
name|'None'
op|')'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.db.migration_get'"
op|')'
name|'as'
name|'fake_get'
op|':'
newline|'\n'
indent|' '
name|'fake_get'
op|'.'
name|'return_value'
op|'='
name|'db_migration'
newline|'\n'
name|'mig'
op|'='
name|'objects'
op|'.'
name|'Migration'
op|'.'
name|'get_by_id'
op|'('
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
op|','
number|'1'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'assertTrue'
op|'('
name|'mig'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'migration_type'"
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'resize'"
op|','
name|'mig'
op|'.'
name|'migration_type'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_migrate_old_migration_record
dedent|''
name|'def'
name|'test_migrate_old_migration_record'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_migration'
op|'='
name|'dict'
op|'('
nl|'\n'
name|'fake_db_migration'
op|'('
op|')'
op|','
name|'migration_type'
op|'='
name|'None'
op|','
nl|'\n'
name|'old_instance_type_id'
op|'='
number|'1'
op|','
name|'new_instance_type_id'
op|'='
number|'1'
op|')'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.db.migration_get'"
op|')'
name|'as'
name|'fake_get'
op|':'
newline|'\n'
indent|' '
name|'fake_get'
op|'.'
name|'return_value'
op|'='
name|'db_migration'
newline|'\n'
name|'mig'
op|'='
name|'objects'
op|'.'
name|'Migration'
op|'.'
name|'get_by_id'
op|'('
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
op|','
number|'1'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'assertTrue'
op|'('
name|'mig'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'migration_type'"
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'migration'"
op|','
name|'mig'
op|'.'
name|'migration_type'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_migrate_unset_type_resize
dedent|''
name|'def'
name|'test_migrate_unset_type_resize'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'mig'
op|'='
name|'objects'
op|'.'
name|'Migration'
op|'('
name|'old_instance_type_id'
op|'='
number|'1'
op|','
nl|'\n'
name|'new_instance_type_id'
op|'='
number|'2'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'resize'"
op|','
name|'mig'
op|'.'
name|'migration_type'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertTrue'
op|'('
name|'mig'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'migration_type'"
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_migrate_unset_type_migration
dedent|''
name|'def'
name|'test_migrate_unset_type_migration'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'mig'
op|'='
name|'objects'
op|'.'
name|'Migration'
op|'('
name|'old_instance_type_id'
op|'='
number|'1'
op|','
nl|'\n'
name|'new_instance_type_id'
op|'='
number|'1'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'migration'"
op|','
name|'mig'
op|'.'
name|'migration_type'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertTrue'
op|'('
name|'mig'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'migration_type'"
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.db.migration_get_by_id_and_instance'"
op|')'
newline|'\n'
DECL|member|test_get_by_id_and_instance
name|'def'
name|'test_get_by_id_and_instance'
op|'('
name|'self'
op|','
name|'fake_get'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'='
name|'fake_db_migration'
op|'('
op|')'
newline|'\n'
name|'fake_get'
op|'.'
name|'return_value'
op|'='
name|'fake_migration'
newline|'\n'
name|'migration'
op|'='
name|'objects'
op|'.'
name|'Migration'
op|'.'
name|'get_by_id_and_instance'
op|'('
name|'ctxt'
op|','
string|"'1'"
op|','
string|"'1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'compare_obj'
op|'('
name|'migration'
op|','
name|'fake_migration'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
name|'class'
name|'TestMigrationObject'
op|'('
name|'test_objects'
op|'.'
name|'_LocalTest'
op|','
nl|'\n'
DECL|class|TestMigrationObject
name|'_TestMigrationObject'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
name|'class'
name|'TestRemoteMigrationObject'
op|'('
name|'test_objects'
op|'.'
name|'_RemoteTest'
op|','
nl|'\n'
DECL|class|TestRemoteMigrationObject
name|'_TestMigrationObject'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
dedent|''
endmarker|''
end_unit
| 12.55782 | 88 | 0.614255 | 3,636 | 24,651 | 4.031903 | 0.055281 | 0.163302 | 0.096862 | 0.082128 | 0.884379 | 0.839222 | 0.790723 | 0.745839 | 0.714256 | 0.663506 | 0 | 0.003852 | 0.094236 | 24,651 | 1,962 | 89 | 12.56422 | 0.652723 | 0 | 0 | 0.931702 | 0 | 0 | 0.375036 | 0.041459 | 0 | 0 | 0 | 0 | 0.010194 | 0 | null | null | 0.001019 | 0.005097 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ab1649c4ffa4433f8f48c3450863febf16a0fd1f | 23,522 | py | Python | test/test_radius.py | gizmoguy/chewie | 7791e0961f5045e3883de0d8a99f0cc95db166e4 | [
"Apache-2.0"
] | null | null | null | test/test_radius.py | gizmoguy/chewie | 7791e0961f5045e3883de0d8a99f0cc95db166e4 | [
"Apache-2.0"
] | null | null | null | test/test_radius.py | gizmoguy/chewie | 7791e0961f5045e3883de0d8a99f0cc95db166e4 | [
"Apache-2.0"
] | null | null | null | import unittest
from netils import build_byte_string
from chewie.radius import *
from chewie.radius_attributes import UserName, ServiceType, FramedMTU, CalledStationId, AcctSessionId, NASPortType, \
ConnectInfo, EAPMessage, MessageAuthenticator, State, VendorSpecific, CallingStationId
class RadiusTestCase(unittest.TestCase):
def test_radius_access_request_parses(self):
packed_message = build_byte_string("010000a3982a0ba06d3557f0dbc8ba6e823822f1010b686f737431757365721e1434342d34342d34342d34342d34342d34343a3d06000000130606000000021f1330302d30302d30302d31312d31312d30314d17434f4e4e45435420304d627073203830322e3131622c12433634383030344139433930353537390c06000005784f100201000e01686f73743175736572501273f82750f6f261a95a7cc7d318b9f573")
message = Radius.parse(packed_message, secret="SECRET")
self.assertEqual(message.packet_id, 0)
self.assertEqual(message.authenticator, "982a0ba06d3557f0dbc8ba6e823822f1")
msg_attr = message.attributes
self.assertEqual(len(msg_attr.attributes), 10)
self.assertEqual(msg_attr.find(UserName.DESCRIPTION).data_type.data, 'host1user')
self.assertEqual(msg_attr.find(CalledStationId.DESCRIPTION).data_type.data, "44-44-44-44-44-44:")
self.assertEqual(msg_attr.find(NASPortType.DESCRIPTION).data_type.data, 19)
self.assertEqual(msg_attr.find(ServiceType.DESCRIPTION).data_type.data, 2)
self.assertEqual(msg_attr.find(ConnectInfo.DESCRIPTION).data_type.data, "CONNECT 0Mbps 802.11b")
self.assertEqual(msg_attr.find(AcctSessionId.DESCRIPTION).data_type.data, "C648004A9C905579")
self.assertEqual(msg_attr.find(FramedMTU.DESCRIPTION).data_type.data, 1400)
self.assertEqual(msg_attr.find(EAPMessage.DESCRIPTION).data_type.data.hex(), "0201000e01686f73743175736572")
self.assertEqual(msg_attr.find(MessageAuthenticator.DESCRIPTION).data_type.data.hex(), "73f82750f6f261a95a7cc7d318b9f573")
def test_radius_access_accept_parses(self):
packed_message = build_byte_string("0201004602970aff2ef0700780f70848e90d24101a0f00003039010973747564656e744f06030200045012d7ec84e8864dd6cd00916c1d5a3cf41b010b686f73743175736572")
message = Radius.parse(packed_message, secret="SECRET", request_authenticator="a0b4ace0b367114b1a16d76e2bfed5d8")
self.assertEqual(message.packet_id, 1)
self.assertEqual(message.authenticator, "02970aff2ef0700780f70848e90d2410")
msg_attr = message.attributes
self.assertEqual(len(msg_attr.attributes), 4)
self.assertEqual(msg_attr.find(EAPMessage.DESCRIPTION).data_type.data.hex(), "03020004")
self.assertEqual(msg_attr.find(MessageAuthenticator.DESCRIPTION).data_type.data.hex(), "d7ec84e8864dd6cd00916c1d5a3cf41b")
self.assertEqual(msg_attr.find(UserName.DESCRIPTION).data_type.data, 'host1user')
def test_radius_access_accept_packs(self):
expected_packed_message = build_byte_string("02010046"
"02970aff2ef0700780f70848e90d2410"
"1a0f00003039010973747564656e74"
"4f0603020004"
"5012d7ec84e8864dd6cd00916c1d5a3cf41b"
"010b686f73743175736572")
attr_list = list()
attr_list.append(VendorSpecific.parse(bytes.fromhex("00003039010973747564656e74")))
attr_list.append(EAPMessage.parse(bytes.fromhex("03020004")))
attr_list.append(MessageAuthenticator.parse(bytes.fromhex("d7ec84e8864dd6cd00916c1d5a3cf41b")))
attr_list.append(UserName.parse("host1user".encode()))
attributes = RadiusAttributesList(attr_list)
access_accept = RadiusAccessAccept(1, "02970aff2ef0700780f70848e90d2410", attributes)
packed_message = access_accept.pack()
self.assertEqual(len(expected_packed_message), len(packed_message))
self.assertEqual(expected_packed_message, packed_message)
def test_radius_access_challenge_parses(self):
packed_message = build_byte_string(
"0b00005056d9280d3e4fed327eb31cf1823f8c244f1801020016041074d3db089b727d9cc5774599e4a32a295012ecc840b316217c851bd6708afb554b24181219ddf6d119dff272fa2fe16c34990c7d")
message = Radius.parse(packed_message, secret="SECRET", request_authenticator="982a0ba06d3557f0dbc8ba6e823822f1")
self.assertEqual(message.packet_id, 0)
self.assertEqual(message.authenticator, "56d9280d3e4fed327eb31cf1823f8c24")
msg_attr = message.attributes
self.assertEqual(len(msg_attr.attributes), 3)
self.assertEqual(msg_attr.find(EAPMessage.DESCRIPTION).data_type.data.hex(), "01020016041074d3db089b727d9cc5774599e4a32a29")
self.assertEqual(msg_attr.find(MessageAuthenticator.DESCRIPTION).data_type.data.hex(), "ecc840b316217c851bd6708afb554b24")
self.assertEqual(msg_attr.find(State.DESCRIPTION).data_type.data.hex(), "19ddf6d119dff272fa2fe16c34990c7d")
def test_radius_access_challenge_ttls_parses(self):
packed_message = build_byte_string(
"0b06042c54dbc73332c00c0347fc4b462d1811a74fff016a03ec15c000000a76160303003e0200003a0303114aa9dae3f9d452ca12535b03aee03cd4dabe3ca7639929dd3b645d1f86ad6500c030000012ff01000100000b000403000102000f00010116030308d30b0008cf0008cc0003de308203da308202c2a003020102020101300d06092a864886f70d01010b0500308193310b3009060355040613024652310f300d06035504080c065261646975733112301006035504070c09536f6d65776865726531153013060355040a0c0c4578616d706c6520496e632e3120301e06092a864886f70d010901161161646d696e406578616d706c652e6f72673126302406035504030c1d4578616d706c6520434fff6572746966696361746520417574686f72697479301e170d3138303630353033353134345a170d3138303830343033353134345a307c310b3009060355040613024652310f300d06035504080c0652616469757331153013060355040a0c0c4578616d706c6520496e632e3123302106035504030c1a4578616d706c65205365727665722043657274696669636174653120301e06092a864886f70d010901161161646d696e406578616d706c652e6f726730820122300d06092a864886f70d01010105000382010f003082010a0282010100cf5456d7e6142383101cf79275f6396e2c9b3f7cb2878d35e5ecc6f47ee11ef20bc8a8b3217a89351c554fff856e5cd5eed2d10037c9bcce89fbdf927e4cc4f069863acbac4accee7e80f2105ad80d837fa50a931c5b41d03c993f5e338cfd8e69e23818360053501c34c08132ec3d6e14df89ff29c5cec5c7a87d48c4afdcf9d3f8290050be5b903ba6a2a5ce2eb79c922cae70869618c75923059f9a8d62144e8ecdaf0a9f02886afa0e73e3d68037ea9fdca2bdd0f0785e05f5ac88031010c105575dbb09eb4f307547622120ee384ab454376de8e14e0afea02f1211801b6c932324ef6dba7abf3f48f8e3e84716c40b59041ec936cb273d684b22aa1c9d24e10203010001a34f304d30130603551d25040c300a06082b0601050507030130360603551d1f042f4ff7302d302ba029a0278625687474703a2f2f7777772e6578616d706c652e636f6d2f6578616d706c655f63612e63726c300d06092a864886f70d01010b0500038201010054fdcdabdc3a153dc167d6b210d1b324ecfac0e3b8d385704463a7f8ebf46e2e6952f249f4436ec66760868860e5ed50b519ec14628179472c312f507bc9349971d21f8f2b7d6b329b02fab448bd90fd4ce4dfbc78f23a8c4eed74d5589f4c3bd11b552535b8ab8a1a6ab9d1dfda21f247a93354702c12fdde1113cb8dd0e46e2a3a94547c9871df2a88943751d8276dc43f7f6aed921f43f6a33f9beba804c3d2b5781d754abe36ba58461798be8585b8b2501226e219fc875fd78976eb2b9b475b14881812c1591073c33305b4fa8bd26dd27eafd9")
message = Radius.parse(packed_message, secret="SECRET", request_authenticator="0d64ffb8bc76d457d337e5f5692534aa")
self.assertEqual(message.packet_id, 6)
self.assertEqual(message.authenticator, "54dbc73332c00c0347fc4b462d1811a7")
msg_attr = message.attributes
self.assertEqual(len(msg_attr.attributes), 3)
self.assertEqual(msg_attr.find(EAPMessage.DESCRIPTION).data_type.data.hex(), "016a03ec15c000000a76160303003e0200003a0303114aa9dae3f9d452ca12535b03aee03cd4dabe3ca7639929dd3b645d1f86ad6500c030000012ff01000100000b000403000102000f00010116030308d30b0008cf0008cc0003de308203da308202c2a003020102020101300d06092a864886f70d01010b0500308193310b3009060355040613024652310f300d06035504080c065261646975733112301006035504070c09536f6d65776865726531153013060355040a0c0c4578616d706c6520496e632e3120301e06092a864886f70d010901161161646d696e406578616d706c652e6f72673126302406035504030c1d4578616d706c652043"
"6572746966696361746520417574686f72697479301e170d3138303630353033353134345a170d3138303830343033353134345a307c310b3009060355040613024652310f300d06035504080c0652616469757331153013060355040a0c0c4578616d706c6520496e632e3123302106035504030c1a4578616d706c65205365727665722043657274696669636174653120301e06092a864886f70d010901161161646d696e406578616d706c652e6f726730820122300d06092a864886f70d01010105000382010f003082010a0282010100cf5456d7e6142383101cf79275f6396e2c9b3f7cb2878d35e5ecc6f47ee11ef20bc8a8b3217a89351c55"
"856e5cd5eed2d10037c9bcce89fbdf927e4cc4f069863acbac4accee7e80f2105ad80d837fa50a931c5b41d03c993f5e338cfd8e69e23818360053501c34c08132ec3d6e14df89ff29c5cec5c7a87d48c4afdcf9d3f8290050be5b903ba6a2a5ce2eb79c922cae70869618c75923059f9a8d62144e8ecdaf0a9f02886afa0e73e3d68037ea9fdca2bdd0f0785e05f5ac88031010c105575dbb09eb4f307547622120ee384ab454376de8e14e0afea02f1211801b6c932324ef6dba7abf3f48f8e3e84716c40b59041ec936cb273d684b22aa1c9d24e10203010001a34f304d30130603551d25040c300a06082b0601050507030130360603551d1f042f"
"302d302ba029a0278625687474703a2f2f7777772e6578616d706c652e636f6d2f6578616d706c655f63612e63726c300d06092a864886f70d01010b0500038201010054fdcdabdc3a153dc167d6b210d1b324ecfac0e3b8d385704463a7f8ebf46e2e6952f249f4436ec66760868860e5ed50b519ec14628179472c312f507bc9349971d21f8f2b7d6b329b02fab448bd90fd4ce4dfbc78f23a8c4eed74d5589f4c3bd11b552535b8ab8a1a6ab9d1dfda21f247a93354702c12fdde1113cb8dd0e46e2a3a94547c9871df2a88943751d8276dc43f7f6aed921f43f6a33f9beba804c3d2b5781d754abe36ba58461798be8585b8b2")
self.assertEqual(msg_attr.find(MessageAuthenticator.DESCRIPTION).data_type.data.hex(), "26e219fc875fd78976eb2b9b475b1488")
self.assertEqual(msg_attr.find(State.DESCRIPTION).data_type.data.hex(), "c1591073c33305b4fa8bd26dd27eafd9")
def test_radius_access_challenge_packs(self):
expected_packed_message = build_byte_string("0b06042c"
"54dbc73332c00c0347fc4b462d1811a74fff016a03ec15c000000a76160303003e0200003a0303114aa9dae3f9d452ca12535b03aee03cd4dabe3ca7639929dd3b645d1f86ad6500c030000012ff01000100000b000403000102000f00010116030308d30b0008cf0008cc0003de308203da308202c2a003020102020101300d06092a864886f70d01010b0500308193310b3009060355040613024652310f300d06035504080c065261646975733112301006035504070c09536f6d65776865726531153013060355040a0c0c4578616d706c6520496e632e3120301e06092a864886f70d010901161161646d696e406578616d706c652e6f72673126302406035504030c1d4578616d706c6520434fff6572746966696361746520417574686f72697479301e170d3138303630353033353134345a170d3138303830343033353134345a307c310b3009060355040613024652310f300d06035504080c0652616469757331153013060355040a0c0c4578616d706c6520496e632e3123302106035504030c1a4578616d706c65205365727665722043657274696669636174653120301e06092a864886f70d010901161161646d696e406578616d706c652e6f726730820122300d06092a864886f70d01010105000382010f003082010a0282010100cf5456d7e6142383101cf79275f6396e2c9b3f7cb2878d35e5ecc6f47ee11ef20bc8a8b3217a89351c554fff856e5cd5eed2d10037c9bcce89fbdf927e4cc4f069863acbac4accee7e80f2105ad80d837fa50a931c5b41d03c993f5e338cfd8e69e23818360053501c34c08132ec3d6e14df89ff29c5cec5c7a87d48c4afdcf9d3f8290050be5b903ba6a2a5ce2eb79c922cae70869618c75923059f9a8d62144e8ecdaf0a9f02886afa0e73e3d68037ea9fdca2bdd0f0785e05f5ac88031010c105575dbb09eb4f307547622120ee384ab454376de8e14e0afea02f1211801b6c932324ef6dba7abf3f48f8e3e84716c40b59041ec936cb273d684b22aa1c9d24e10203010001a34f304d30130603551d25040c300a06082b0601050507030130360603551d1f042f4ff7302d302ba029a0278625687474703a2f2f7777772e6578616d706c652e636f6d2f6578616d706c655f63612e63726c300d06092a864886f70d01010b0500038201010054fdcdabdc3a153dc167d6b210d1b324ecfac0e3b8d385704463a7f8ebf46e2e6952f249f4436ec66760868860e5ed50b519ec14628179472c312f507bc9349971d21f8f2b7d6b329b02fab448bd90fd4ce4dfbc78f23a8c4eed74d5589f4c3bd11b552535b8ab8a1a6ab9d1dfda21f247a93354702c12fdde1113cb8dd0e46e2a3a94547c9871df2a88943751d8276dc43f7f6aed921f43f6a33f9beba804c3d2b5781d754abe36ba58461798be8585b8b2"
"501226e219fc875fd78976eb2b9b475b1488"
"1812c1591073c33305b4fa8bd26dd27eafd9")
attr_list = list()
attr_list.append(EAPMessage.parse(bytes.fromhex("016a03ec15c000000a76160303003e0200003a0303114aa9dae3f9d452ca12535b03aee03cd4dabe3ca7639929dd3b645d1f86ad6500c030000012ff01000100000b000403000102000f00010116030308d30b0008cf0008cc0003de308203da308202c2a003020102020101300d06092a864886f70d01010b0500308193310b3009060355040613024652310f300d06035504080c065261646975733112301006035504070c09536f6d65776865726531153013060355040a0c0c4578616d706c6520496e632e3120301e06092a864886f70d010901161161646d696e406578616d706c652e6f72673126302406035504030c1d4578616d706c652043")))
attr_list.append(EAPMessage.parse(bytes.fromhex("6572746966696361746520417574686f72697479301e170d3138303630353033353134345a170d3138303830343033353134345a307c310b3009060355040613024652310f300d06035504080c0652616469757331153013060355040a0c0c4578616d706c6520496e632e3123302106035504030c1a4578616d706c65205365727665722043657274696669636174653120301e06092a864886f70d010901161161646d696e406578616d706c652e6f726730820122300d06092a864886f70d01010105000382010f003082010a0282010100cf5456d7e6142383101cf79275f6396e2c9b3f7cb2878d35e5ecc6f47ee11ef20bc8a8b3217a89351c55")))
attr_list.append(EAPMessage.parse(bytes.fromhex("856e5cd5eed2d10037c9bcce89fbdf927e4cc4f069863acbac4accee7e80f2105ad80d837fa50a931c5b41d03c993f5e338cfd8e69e23818360053501c34c08132ec3d6e14df89ff29c5cec5c7a87d48c4afdcf9d3f8290050be5b903ba6a2a5ce2eb79c922cae70869618c75923059f9a8d62144e8ecdaf0a9f02886afa0e73e3d68037ea9fdca2bdd0f0785e05f5ac88031010c105575dbb09eb4f307547622120ee384ab454376de8e14e0afea02f1211801b6c932324ef6dba7abf3f48f8e3e84716c40b59041ec936cb273d684b22aa1c9d24e10203010001a34f304d30130603551d25040c300a06082b0601050507030130360603551d1f042f")))
attr_list.append(EAPMessage.parse(bytes.fromhex("302d302ba029a0278625687474703a2f2f7777772e6578616d706c652e636f6d2f6578616d706c655f63612e63726c300d06092a864886f70d01010b0500038201010054fdcdabdc3a153dc167d6b210d1b324ecfac0e3b8d385704463a7f8ebf46e2e6952f249f4436ec66760868860e5ed50b519ec14628179472c312f507bc9349971d21f8f2b7d6b329b02fab448bd90fd4ce4dfbc78f23a8c4eed74d5589f4c3bd11b552535b8ab8a1a6ab9d1dfda21f247a93354702c12fdde1113cb8dd0e46e2a3a94547c9871df2a88943751d8276dc43f7f6aed921f43f6a33f9beba804c3d2b5781d754abe36ba58461798be8585b8b2")))
attr_list.append(MessageAuthenticator.parse(bytes.fromhex("26e219fc875fd78976eb2b9b475b1488")))
attr_list.append(State.parse(bytes.fromhex("c1591073c33305b4fa8bd26dd27eafd9")))
attributes = RadiusAttributesList(attr_list)
access_challenge = RadiusAccessChallenge(6, "54dbc73332c00c0347fc4b462d1811a7", attributes)
packed_message = access_challenge.pack()
self.assertEqual(len(expected_packed_message), len(packed_message))
self.assertEqual(expected_packed_message, packed_message)
def test_radius_access_challenge_packs2(self):
expected_packed_message = build_byte_string("0b06042c"
"54dbc73332c00c0347fc4b462d1811a74fff016a03ec15c000000a76160303003e0200003a0303114aa9dae3f9d452ca12535b03aee03cd4dabe3ca7639929dd3b645d1f86ad6500c030000012ff01000100000b000403000102000f00010116030308d30b0008cf0008cc0003de308203da308202c2a003020102020101300d06092a864886f70d01010b0500308193310b3009060355040613024652310f300d06035504080c065261646975733112301006035504070c09536f6d65776865726531153013060355040a0c0c4578616d706c6520496e632e3120301e06092a864886f70d010901161161646d696e406578616d706c652e6f72673126302406035504030c1d4578616d706c6520434fff6572746966696361746520417574686f72697479301e170d3138303630353033353134345a170d3138303830343033353134345a307c310b3009060355040613024652310f300d06035504080c0652616469757331153013060355040a0c0c4578616d706c6520496e632e3123302106035504030c1a4578616d706c65205365727665722043657274696669636174653120301e06092a864886f70d010901161161646d696e406578616d706c652e6f726730820122300d06092a864886f70d01010105000382010f003082010a0282010100cf5456d7e6142383101cf79275f6396e2c9b3f7cb2878d35e5ecc6f47ee11ef20bc8a8b3217a89351c554fff856e5cd5eed2d10037c9bcce89fbdf927e4cc4f069863acbac4accee7e80f2105ad80d837fa50a931c5b41d03c993f5e338cfd8e69e23818360053501c34c08132ec3d6e14df89ff29c5cec5c7a87d48c4afdcf9d3f8290050be5b903ba6a2a5ce2eb79c922cae70869618c75923059f9a8d62144e8ecdaf0a9f02886afa0e73e3d68037ea9fdca2bdd0f0785e05f5ac88031010c105575dbb09eb4f307547622120ee384ab454376de8e14e0afea02f1211801b6c932324ef6dba7abf3f48f8e3e84716c40b59041ec936cb273d684b22aa1c9d24e10203010001a34f304d30130603551d25040c300a06082b0601050507030130360603551d1f042f4ff7302d302ba029a0278625687474703a2f2f7777772e6578616d706c652e636f6d2f6578616d706c655f63612e63726c300d06092a864886f70d01010b0500038201010054fdcdabdc3a153dc167d6b210d1b324ecfac0e3b8d385704463a7f8ebf46e2e6952f249f4436ec66760868860e5ed50b519ec14628179472c312f507bc9349971d21f8f2b7d6b329b02fab448bd90fd4ce4dfbc78f23a8c4eed74d5589f4c3bd11b552535b8ab8a1a6ab9d1dfda21f247a93354702c12fdde1113cb8dd0e46e2a3a94547c9871df2a88943751d8276dc43f7f6aed921f43f6a33f9beba804c3d2b5781d754abe36ba58461798be8585b8b2"
"501226e219fc875fd78976eb2b9b475b1488"
"1812c1591073c33305b4fa8bd26dd27eafd9")
attr_list = list()
attr_list.append(EAPMessage.parse(bytes.fromhex(
"016a03ec15c000000a76160303003e0200003a0303114aa9dae3f9d452ca12535b03aee03cd4dabe3ca7639929dd3b645d1f86ad6500c030000012ff01000100000b000403000102000f00010116030308d30b0008cf0008cc0003de308203da308202c2a003020102020101300d06092a864886f70d01010b0500308193310b3009060355040613024652310f300d06035504080c065261646975733112301006035504070c09536f6d65776865726531153013060355040a0c0c4578616d706c6520496e632e3120301e06092a864886f70d010901161161646d696e406578616d706c652e6f72673126302406035504030c1d4578616d706c652043"
"6572746966696361746520417574686f72697479301e170d3138303630353033353134345a170d3138303830343033353134345a307c310b3009060355040613024652310f300d06035504080c0652616469757331153013060355040a0c0c4578616d706c6520496e632e3123302106035504030c1a4578616d706c65205365727665722043657274696669636174653120301e06092a864886f70d010901161161646d696e406578616d706c652e6f726730820122300d06092a864886f70d01010105000382010f003082010a0282010100cf5456d7e6142383101cf79275f6396e2c9b3f7cb2878d35e5ecc6f47ee11ef20bc8a8b3217a89351c55"
"856e5cd5eed2d10037c9bcce89fbdf927e4cc4f069863acbac4accee7e80f2105ad80d837fa50a931c5b41d03c993f5e338cfd8e69e23818360053501c34c08132ec3d6e14df89ff29c5cec5c7a87d48c4afdcf9d3f8290050be5b903ba6a2a5ce2eb79c922cae70869618c75923059f9a8d62144e8ecdaf0a9f02886afa0e73e3d68037ea9fdca2bdd0f0785e05f5ac88031010c105575dbb09eb4f307547622120ee384ab454376de8e14e0afea02f1211801b6c932324ef6dba7abf3f48f8e3e84716c40b59041ec936cb273d684b22aa1c9d24e10203010001a34f304d30130603551d25040c300a06082b0601050507030130360603551d1f042f"
"302d302ba029a0278625687474703a2f2f7777772e6578616d706c652e636f6d2f6578616d706c655f63612e63726c300d06092a864886f70d01010b0500038201010054fdcdabdc3a153dc167d6b210d1b324ecfac0e3b8d385704463a7f8ebf46e2e6952f249f4436ec66760868860e5ed50b519ec14628179472c312f507bc9349971d21f8f2b7d6b329b02fab448bd90fd4ce4dfbc78f23a8c4eed74d5589f4c3bd11b552535b8ab8a1a6ab9d1dfda21f247a93354702c12fdde1113cb8dd0e46e2a3a94547c9871df2a88943751d8276dc43f7f6aed921f43f6a33f9beba804c3d2b5781d754abe36ba58461798be8585b8b2")))
attr_list.append(
MessageAuthenticator.parse(bytes.fromhex("26e219fc875fd78976eb2b9b475b1488")))
attr_list.append(State.parse(bytes.fromhex("c1591073c33305b4fa8bd26dd27eafd9")))
attributes = RadiusAttributesList(attr_list)
access_challenge = RadiusAccessChallenge(6, "54dbc73332c00c0347fc4b462d1811a7",
attributes)
packed_message = access_challenge.pack()
self.assertEqual(len(expected_packed_message), len(packed_message))
self.assertEqual(expected_packed_message, packed_message)
def test_radius_access_request_packs(self):
expected_packed_message = build_byte_string("010e01dc688d6504db3c757243f995d5f0d32e50010b686f737431757365721e1434342d34342d34342d34342d34342d34343a3d06000000130606000000021f1330302d30302d30302d31312d31312d30314d17434f4e4e45435420304d627073203830322e3131622c12433634383030344139433930353537390c06000005784fff02250133150016030101280100012403032c36dbf8ee16b94b28efdb8c5603e07823f9b716557b5ef2624b026daea115760000aac030c02cc028c024c014c00a00a500a300a1009f006b006a0069006800390038003700360088008700860085c032c02ec02ac026c00fc005009d003d00350084c02fc02bc027c023c013c00900a400a200a0009e00670040003f003e0033003200310030009a0099009800970045004400430042c031c02dc029c025c00ec004009c003c002f00960041c011c007c00cc00200050004c012c008001600130010000dc00dc003000a00ff01000051000b000403000102000a001c001a00170019001c001b0018001a004f3816000e000d000b000c0009000a000d0020001e060106020603050105020503040104020403030103020303020102020203000f0001011812cefe6083cfdb75dd64722c274ec353725012ab67ed568931f12d258f9ffda931159e")
attr_list = list()
attr_list.append(UserName.parse("host1user".encode()))
attr_list.append(CalledStationId.parse("44-44-44-44-44-44:".encode()))
attr_list.append(NASPortType.parse(bytes.fromhex("00000013")))
attr_list.append(ServiceType.parse(bytes.fromhex("00000002")))
attr_list.append(CallingStationId.parse("00-00-00-11-11-01".encode()))
attr_list.append(ConnectInfo.parse("CONNECT 0Mbps 802.11b".encode()))
attr_list.append(AcctSessionId.parse("C648004A9C905579".encode()))
attr_list.append(FramedMTU.parse(bytes.fromhex("00000578")))
attr_list.append(EAPMessage.parse(
bytes.fromhex("02250133150016030101280100012403032c36dbf8ee16b94b28efdb8c5603e07823f9b716557b5ef2624b026daea115760000aac030c02cc028c024c014c00a00a500a300a1009f006b006a0069006800390038003700360088008700860085c032c02ec02ac026c00fc005009d003d00350084c02fc02bc027c023c013c00900a400a200a0009e00670040003f003e0033003200310030009a0099009800970045004400430042c031c02dc029c025c00ec004009c003c002f00960041c011c007c00cc00200050004c012c008001600130010000dc00dc003000a00ff01000051000b000403000102000a001c001a00170019001c001b0018001a0016000e000d000b000c0009000a000d0020001e060106020603050105020503040104020403030103020303020102020203000f000101")))
attr_list.append(State.parse(bytes.fromhex("cefe6083cfdb75dd64722c274ec35372")))
attr_list.append(MessageAuthenticator.parse(bytes.fromhex("00000000000000000000000000000000")))
attributes = RadiusAttributesList(attr_list)
access_request = RadiusAccessRequest(14, "688d6504db3c757243f995d5f0d32e50", attributes)
packed_message = access_request.build("SECRET")
self.assertEqual(len(expected_packed_message), len(packed_message))
self.assertEqual(expected_packed_message, packed_message) | 161.109589 | 2,151 | 0.876244 | 836 | 23,522 | 24.430622 | 0.148325 | 0.027908 | 0.016451 | 0.019389 | 0.494908 | 0.473218 | 0.470182 | 0.439728 | 0.435076 | 0.425823 | 0 | 0.557473 | 0.08609 | 23,522 | 146 | 2,152 | 161.109589 | 0.392613 | 0 | 0 | 0.351145 | 0 | 0 | 0.674361 | 0.662926 | 0 | 1 | 0 | 0 | 0.290076 | 1 | 0.061069 | false | 0 | 0.030534 | 0 | 0.099237 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
db84033a5bc08fc36982a4b4a8b46342dce86178 | 3,721 | py | Python | GwasJP/analysisPipeline.py | 2waybene/GwasJP | ddd54b276655baa79556b5f10d7959099a2e3a0b | [
"BSD-3-Clause"
] | null | null | null | GwasJP/analysisPipeline.py | 2waybene/GwasJP | ddd54b276655baa79556b5f10d7959099a2e3a0b | [
"BSD-3-Clause"
] | null | null | null | GwasJP/analysisPipeline.py | 2waybene/GwasJP | ddd54b276655baa79556b5f10d7959099a2e3a0b | [
"BSD-3-Clause"
] | null | null | null | import sys
import shlex
import subprocess as sp
from .utils import statFittings
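
# Note: statFittings is imported here but not referenced by the launch functions below;
# it is presumably used elsewhere in the pipeline (an assumption based on this module alone).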
def launchModelStep1(filepath, phenotype="pheno_data.txt"):
    print("****** Begin JOB: '" + str(filepath) + "'")
    # for path in filepath:
    print('*************************************')
    print('This is the working path entered from the user:', str(filepath))
    ## Create system command
    ## On the NCSU cluster server
    cmd = "sbatch -p standard -o " + filepath + "/model_setup_step1.out ./bin/model_setup_step1.sh " + filepath + " " + str(phenotype)
    ## On the bioinformatics SLURM system:
    ## cmd = "srun --partition=bioinfo --cpus-per-task=8 -o " + filepath + "/model_setup_step1.out ./bin/model_setup_step1.sh " + filepath + " " + str(phenotype)
    print(cmd)
    sp.call(cmd, shell=True)
    print("Launching model setup step 1: " + cmd)
    print("Check the job status with command: squeue")
    # Original shell equivalent of the downstream step:
    # echo; echo "Create complete cases phenotype data (bin/pheno_data_step1.r)"
    # R --slave --vanilla --file=bin/pheno_data_step1.r --args $p $2

def launchModelStep2(filepath):
    print("****** Begin JOB: '" + str(filepath) + "'")
    # for path in filepath:
    print('*************************************')
    print('This is the working path entered from the user:', str(filepath))
    ## Create system command
    ## On the NCSU cluster server
    cmd = 'sbatch -p standard -o ' + filepath + '/model_setup_step2.out ./bin/model_setup_step2.sh ' + filepath
    ## On the bioinformatics SLURM system:
    ## cmd = "srun --partition=bioinfo --cpus-per-task=8 -o " + filepath + "/model_setup_step2.out ./bin/model_setup_step2.sh " + filepath
    print(cmd)
    sp.call(cmd, shell=True)
    print("Launching model setup step 2: " + cmd)
    print("Check the job status with command: squeue")

def launchHeritability(filepath):
    print("****** Begin JOB: '" + str(filepath) + "'")
    # for path in filepath:
    print('*************************************')
    print('This is the working path entered from the user:', str(filepath))
    ## Create system command
    ## On the NCSU cluster server
    cmd = 'sbatch -p standard -o ' + filepath + '/sbatch_logs/gcta.out ./bin/run_gcta.sh ' + filepath
    ## On the bioinformatics SLURM system:
    ## cmd = "srun --partition=bioinfo --cpus-per-task=8 -o " + filepath + "/sbatch_logs/gcta.out ./bin/run_gcta.sh " + filepath
    print(cmd)
    sp.call(cmd, shell=True)
    print("Launching heritability estimation (GCTA), step 1 of 3: " + cmd)
    print("Check the job status with command: squeue")

def genoCommondVarAnalysis(filepath):
    print("****** Begin JOB: '" + str(filepath) + "'")
    # for path in filepath:
    print('*************************************')
    print('This is the working path entered from the user:', str(filepath))
    ## Create system command (placeholder; the real sbatch command is not implemented yet)
    # cmd = 'sbatch -p standard -o ' + filepath + '/sbatch_logs/gcta.out ./bin/run_gcta.sh ' + filepath
    cmd = "place holder"
    print(cmd)
    sp.call(cmd, shell=True)
    print("Launching genotype common variant analysis, step 2 of 3: " + cmd)
    print("Check the job status with command: squeue")

def imputeCommondVarAnalysis(filepath):
    print("****** Begin JOB: '" + str(filepath) + "'")
    # for path in filepath:
    print('*************************************')
    print('This is the working path entered from the user:', str(filepath))
    ## Create system command (placeholder; the real sbatch command is not implemented yet)
    # cmd = 'sbatch -p standard -o ' + filepath + '/sbatch_logs/gcta.out ./bin/run_gcta.sh ' + filepath
    cmd = "place holder"
    print(cmd)
    sp.call(cmd, shell=True)
    print("Launching impute common variant analysis, step 3 of 3: " + cmd)
    print("Check the job status with command: squeue")
| 34.453704 | 165 | 0.607095 | 469 | 3,721 | 4.754797 | 0.198294 | 0.064126 | 0.029148 | 0.035874 | 0.810762 | 0.794619 | 0.794619 | 0.794619 | 0.794619 | 0.775785 | 0 | 0.008486 | 0.208277 | 3,721 | 107 | 166 | 34.775701 | 0.748473 | 0.292932 | 0 | 0.653061 | 0 | 0 | 0.457418 | 0.116378 | 0 | 0 | 0 | 0 | 0 | 1 | 0.102041 | false | 0 | 0.081633 | 0 | 0.183673 | 0.612245 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
dbbd3c6f1d60f8e31be48ce220c277b8aeafc03b | 19,176 | py | Python | monitoring/prober/scd/test_operation_simple_heavy_traffic.py | Orbitalize/InterUSS-Platform | a1d60ec928dc5c63f9dcddd195bfeda7c4c1c84b | [
"Apache-2.0"
] | 58 | 2019-10-03T19:15:47.000Z | 2022-03-09T16:50:47.000Z | monitoring/prober/scd/test_operation_simple_heavy_traffic.py | Orbitalize/InterUSS-Platform | a1d60ec928dc5c63f9dcddd195bfeda7c4c1c84b | [
"Apache-2.0"
] | 283 | 2019-09-30T18:35:02.000Z | 2022-03-29T13:36:53.000Z | monitoring/prober/scd/test_operation_simple_heavy_traffic.py | Orbitalize/InterUSS-Platform | a1d60ec928dc5c63f9dcddd195bfeda7c4c1c84b | [
"Apache-2.0"
] | 51 | 2019-10-08T18:47:36.000Z | 2022-03-23T08:44:06.000Z | """Basic Operation tests with a batch of operations created SEQUENTIALLY in the SAME area:
- make sure operations do not exist with get or query
- create 20 operations sequentially (one per OP_TYPES entry), each covering a non-overlapping area close to the others
- get by IDs
- search with earliest_time and latest_time
- mutate
- delete
- confirm deletion by get and query
"""
import datetime
from monitoring.monitorlib import scd
from monitoring.monitorlib.scd import SCOPE_SC
from monitoring.monitorlib.infrastructure import default_scope
from monitoring.monitorlib.testing import assert_datetimes_are_equal
from monitoring.prober.infrastructure import depends_on, for_api_versions, register_resource_type
BASE_URL = 'https://example.com/uss'
OP_TYPES = [register_resource_type(10 + i, 'Operational intent {}'.format(i)) for i in range(20)]
ovn_map = {}
# Generate a request with volumes that cover a circular area initially centered at (-56, 178).
# The circle's center latitude shifts south by 0.001 degrees (about 111 meters) for each increment of idx.
def _make_op_request(idx):
time_start = datetime.datetime.utcnow() + datetime.timedelta(minutes=20)
time_end = time_start + datetime.timedelta(minutes=60)
lat = -56 - 0.001 * idx
return {
'extents': [scd.make_vol4(time_start, time_end, 0, 120, scd.make_circle(lat, 178, 50))],
'old_version': 0,
'state': 'Accepted',
'uss_base_url': BASE_URL,
'new_subscription': {
'uss_base_url': BASE_URL,
'notify_for_constraints': False
}
}
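# Illustrative expectation (comment only, not executed): _make_op_request(3) builds a volume
# whose circle is centered at latitude -56.003, i.e. roughly 3 * 111 m south of the idx=0 circle,
# with radius 50 (presumably meters), an altitude range of 0-120, and a time window 20-80 minutes from now.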
def _intersection(list1, list2):
return list(set(list1) & set(list2))
@for_api_versions(scd.API_0_3_5)
def test_ensure_clean_workspace_v5(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operation_references/{}'.format(op_id), scope=SCOPE_SC)
if resp.status_code == 200:
resp = scd_session.delete('/operation_references/{}'.format(op_id), scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
elif resp.status_code == 404:
# As expected.
pass
else:
assert False, resp.content
@for_api_versions(scd.API_0_3_17)
def test_ensure_clean_workspace_v15(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operational_intent_references/{}'.format(op_id), scope=SCOPE_SC)
if resp.status_code == 200:
resp = scd_session.delete('/operational_intent_references/{}'.format(op_id), scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
elif resp.status_code == 404:
# As expected.
pass
else:
assert False, resp.content
# Preconditions: None
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_ops_do_not_exist_get_v5(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operation_references/{}'.format(op_id))
assert resp.status_code == 404, resp.content
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_ops_do_not_exist_get_v15(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operational_intent_references/{}'.format(op_id))
assert resp.status_code == 404, resp.content
# Preconditions: None
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_ops_do_not_exist_query_v5(ids, scd_api, scd_session):
time_now = datetime.datetime.utcnow()
end_time = time_now + datetime.timedelta(hours=1)
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(time_now, end_time, 0, 5000, scd.make_circle(-56, 178, 12000))
}, scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_ops_do_not_exist_query_v15(ids, scd_api, scd_session):
time_now = datetime.datetime.utcnow()
end_time = time_now + datetime.timedelta(hours=1)
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(time_now, end_time, 0, 5000, scd.make_circle(-56, 178, 12000))
}, scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
# Preconditions: None
# Mutations: Operations with ids in OP_IDS created by scd_session user
@for_api_versions(scd.API_0_3_5)
def test_create_ops_v5(ids, scd_api, scd_session):
assert len(ovn_map) == 0
for idx, op_id in enumerate(map(ids, OP_TYPES)):
req = _make_op_request(idx)
req['key'] = list(ovn_map.values())
resp = scd_session.put('/operation_references/{}'.format(op_id), json=req, scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
data = resp.json()
op = data['operation_reference']
assert op['id'] == op_id
assert op['uss_base_url'] == BASE_URL
assert_datetimes_are_equal(op['time_start']['value'], req['extents'][0]['time_start']['value'])
assert_datetimes_are_equal(op['time_end']['value'], req['extents'][0]['time_end']['value'])
assert op['version'] == 1
assert op['ovn']
assert 'subscription_id' in op
assert 'state' not in op
ovn_map[op_id] = op['ovn']
assert len(ovn_map) == len(OP_TYPES)
@for_api_versions(scd.API_0_3_17)
def test_create_ops_v15(ids, scd_api, scd_session):
assert len(ovn_map) == 0
for idx, op_id in enumerate(map(ids, OP_TYPES)):
req = _make_op_request(idx)
req['key'] = list(ovn_map.values())
resp = scd_session.put(
'/operational_intent_references/{}'.format(op_id), json=req, scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
data = resp.json()
op = data['operational_intent_reference']
assert op['id'] == op_id
assert op['uss_base_url'] == BASE_URL
assert op['uss_availability'] == "Unknown"
assert_datetimes_are_equal(op['time_start']['value'], req['extents'][0]['time_start']['value'])
assert_datetimes_are_equal(op['time_end']['value'], req['extents'][0]['time_end']['value'])
assert op['version'] == 1
assert op['ovn']
assert 'subscription_id' in op
ovn_map[op_id] = op['ovn']
assert len(ovn_map) == len(OP_TYPES)
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: None
@for_api_versions(scd.API_0_3_5)
def test_get_ops_by_ids_v5(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operation_references/{}'.format(op_id), scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
data = resp.json()
op = data['operation_reference']
assert op['id'] == op_id
assert op['uss_base_url'] == BASE_URL
assert op['version'] == 1
assert 'state' not in op
@for_api_versions(scd.API_0_3_17)
def test_get_ops_by_ids_v15(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operational_intent_references/{}'.format(op_id), scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
data = resp.json()
op = data['operational_intent_reference']
assert op['id'] == op_id
assert op['uss_base_url'] == BASE_URL
assert op['version'] == 1
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_v5(ids, scd_api, scd_session):
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(None, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert len(_intersection(map(ids, OP_TYPES), found_ids)) == len(OP_TYPES)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_v15(ids, scd_api, scd_session):
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(None, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
print(found_ids)
assert len(_intersection(map(ids, OP_TYPES), found_ids)) == len(OP_TYPES)
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_earliest_time_included_v5(ids, scd_api, scd_session):
earliest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=59)
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(earliest_time, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert len(_intersection(map(ids, OP_TYPES), found_ids)) == len(OP_TYPES)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_earliest_time_included_v15(ids, scd_api, scd_session):
earliest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=59)
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(earliest_time, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
assert len(_intersection(map(ids, OP_TYPES), found_ids)) == len(OP_TYPES)
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_earliest_time_excluded_v5(ids, scd_api, scd_session):
earliest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=81)
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(earliest_time, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_earliest_time_excluded_v15(ids, scd_api, scd_session):
earliest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=81)
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(earliest_time, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_latest_time_included_v5(ids, scd_api, scd_session):
latest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=20)
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(None, latest_time, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert len(_intersection(map(ids, OP_TYPES), found_ids)) == len(OP_TYPES)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_latest_time_included_v15(ids, scd_api, scd_session):
latest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=20)
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(None, latest_time, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
assert len(_intersection(map(ids, OP_TYPES), found_ids)) == len(OP_TYPES)
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_latest_time_excluded_v5(ids, scd_api, scd_session):
latest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=1)
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(None, latest_time, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_ops_by_search_latest_time_excluded_v15(ids, scd_api, scd_session):
latest_time = datetime.datetime.utcnow() + datetime.timedelta(minutes=1)
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(None, latest_time, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
# Preconditions: Operations with ids in OP_IDS created by scd_session user
# Mutations: Operations with ids in OP_IDS mutated to second version
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_mutate_ops_v5(ids, scd_api, scd_session):
for idx, op_id in enumerate(map(ids, OP_TYPES)):
# GET current op
resp = scd_session.get('/operation_references/{}'.format(op_id))
assert resp.status_code == 200, resp.content
existing_op = resp.json().get('operation_reference', None)
assert existing_op is not None
req = _make_op_request(idx)
# QUERY ops in the area and get their ovns
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': req['extents'][0]
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
ovns = [ovn_map[id] for id in found_ids]
# UPDATE operation
req = {
'key': ovns,
'extents': req['extents'],
'old_version': existing_op['version'],
'state': 'Activated',
'uss_base_url': 'https://example.com/uss2',
'subscription_id': existing_op['subscription_id']
}
resp = scd_session.put('/operation_references/{}'.format(op_id), json=req, scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
data = resp.json()
op = data['operation_reference']
assert op['id'] == op_id
assert op['uss_base_url'] == 'https://example.com/uss2'
assert op['version'] == 2
assert op['subscription_id'] == existing_op['subscription_id']
assert 'state' not in op
ovn_map[op_id] = op['ovn']
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_mutate_ops_v17(ids, scd_api, scd_session):
for idx, op_id in enumerate(map(ids, OP_TYPES)):
# GET current op
resp = scd_session.get('/operational_intent_references/{}'.format(op_id))
assert resp.status_code == 200, resp.content
existing_op = resp.json().get('operational_intent_reference', None)
assert existing_op is not None
req = _make_op_request(idx)
# QUERY ops in the area and get their ovns
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': req['extents'][0]
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
ovns = [ovn_map[id] for id in found_ids]
# UPDATE operation
req = {
'key': ovns,
'extents': req['extents'],
'old_version': existing_op['version'],
'state': 'Activated',
'uss_base_url': 'https://example.com/uss2',
'subscription_id': existing_op['subscription_id']
}
resp = scd_session.put('/operational_intent_references/{}/{}'.format(op_id, existing_op['ovn']), json=req, scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
data = resp.json()
op = data['operational_intent_reference']
assert op['id'] == op_id
assert op['uss_base_url'] == 'https://example.com/uss2'
assert op['uss_availability'] == "Unknown"
assert op['version'] != existing_op['version']
assert op['subscription_id'] == existing_op['subscription_id']
ovn_map[op_id] = op['ovn']
# Preconditions: Operations with ids in OP_IDS mutated to second version
# Mutations: Operations with ids in OP_IDS deleted
@for_api_versions(scd.API_0_3_5)
def test_delete_op_v5(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.delete('/operation_references/{}'.format(op_id), scope=SCOPE_SC)
assert resp.status_code == 200, resp.content
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_delete_op_v15(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.delete('/operational_intent_references/{}/{}'.format(op_id, ovn_map[op_id]))
assert resp.status_code == 200, resp.content
# Preconditions: Operations with ids in OP_IDS deleted
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_deleted_ops_by_ids_v5(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operation_references/{}'.format(op_id))
assert resp.status_code == 404, resp.content
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_deleted_ops_by_ids_v15(ids, scd_api, scd_session):
for op_id in map(ids, OP_TYPES):
resp = scd_session.get('/operational_intent_references/{}'.format(op_id))
assert resp.status_code == 404, resp.content
# Preconditions: Operations with ids in OP_IDS deleted
# Mutations: None
@for_api_versions(scd.API_0_3_5)
@default_scope(SCOPE_SC)
def test_get_deleted_ops_by_search_v5(ids, scd_api, scd_session):
resp = scd_session.post('/operation_references/query', json={
'area_of_interest': scd.make_vol4(None, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operation_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
@for_api_versions(scd.API_0_3_17)
@default_scope(SCOPE_SC)
def test_get_deleted_ops_by_search_v15(ids, scd_api, scd_session):
resp = scd_session.post('/operational_intent_references/query', json={
'area_of_interest': scd.make_vol4(None, None, 0, 5000, scd.make_circle(-56, 178, 12000))
})
assert resp.status_code == 200, resp.content
found_ids = [op['id'] for op in resp.json().get('operational_intent_references', [])]
assert not _intersection(map(ids, OP_TYPES), found_ids)
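# To exercise this module on its own (illustrative; the exact invocation depends on how the
# monitoring/prober harness provides the `ids`, `scd_api` and `scd_session` fixtures):
#   pytest monitoring/prober/scd/test_operation_simple_heavy_traffic.py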
| 39.214724 | 126 | 0.728045 | 2,958 | 19,176 | 4.406356 | 0.067613 | 0.053706 | 0.038668 | 0.049102 | 0.911462 | 0.902256 | 0.89489 | 0.885914 | 0.877474 | 0.875403 | 0 | 0.033275 | 0.139602 | 19,176 | 488 | 127 | 39.295082 | 0.756713 | 0.093711 | 0 | 0.819484 | 0 | 0 | 0.159268 | 0.07949 | 0 | 0 | 0 | 0 | 0.252149 | 1 | 0.08596 | false | 0.005731 | 0.017192 | 0.002865 | 0.108883 | 0.002865 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
dbc4aaba4f1c84d5f0f98ab2034c12025af858be | 3,271 | py | Python | pyapr/io/io_api.py | mosaic-group/PyLibAPR | 4b5af50c26b4770c460460f9491bd840af2537da | [
"Apache-2.0"
] | 7 | 2021-07-02T11:08:30.000Z | 2022-03-07T20:54:33.000Z | pyapr/io/io_api.py | mosaic-group/PyLibAPR | 4b5af50c26b4770c460460f9491bd840af2537da | [
"Apache-2.0"
] | 19 | 2020-12-17T09:32:09.000Z | 2022-01-08T20:22:16.000Z | pyapr/io/io_api.py | mosaic-group/PyLibAPR | 4b5af50c26b4770c460460f9491bd840af2537da | [
"Apache-2.0"
] | 1 | 2021-01-19T14:23:36.000Z | 2021-01-19T14:23:36.000Z | import pyapr
def read(fpath, apr, parts, t=0, channel_name_apr='t', channel_name_parts='particles'):
# Initialize APRFile for I/O
aprfile = pyapr.io.APRFile()
aprfile.set_read_write_tree(True)
# Read APR and particles from file
aprfile.open(fpath, 'READ')
aprfile.read_apr(apr, t=t, channel_name=channel_name_apr)
aprfile.read_particles(apr, channel_name_parts, parts)
aprfile.close()
def write(fpath, apr, parts, t=0, channel_name_apr='t', channel_name_parts='particles'):
if not fpath:
print('Empty path given. Ignoring call to pyapr.io.write')
return
# Initialize APRFile for I/O
aprfile = pyapr.io.APRFile()
aprfile.set_read_write_tree(True)
# Write APR and particles to file
aprfile.open(fpath, 'WRITE')
aprfile.write_apr(apr, t=t, channel_name=channel_name_apr)
aprfile.write_particles(channel_name_parts, parts, t=t)
aprfile.close()
def write_multichannel(fpath, apr, parts_list, t=0, channel_name_apr='t', channel_names_parts=None):
if not fpath:
print('Empty path given. Ignoring call to pyapr.io.write_multichannel')
return
if isinstance(parts_list, (tuple, list)):
for p in parts_list:
if not isinstance(p, (pyapr.ShortParticles, pyapr.FloatParticles)):
raise AssertionError(
'argument \'parts_list\' to pyapr.io.write_multichannel must be a \
tuple or list of pyapr.XParticles objects'
)
else:
raise AssertionError(
'argument \'parts_list\' to pyapr.io.write_multichannel must be a tuple or list of pyapr.XParticles objects'
)
if channel_names_parts is None:
channel_names_parts = ['particles' + str(i) for i in range(len(parts_list))]
# Initialize APRFile for I/O
aprfile = pyapr.io.APRFile()
aprfile.set_read_write_tree(True)
# Write APR and particles to file
aprfile.open(fpath, 'WRITE')
aprfile.write_apr(apr, t=t, channel_name=channel_name_apr)
for i in range(len(parts_list)):
aprfile.write_particles(channel_names_parts[i], parts_list[i], t=t)
aprfile.close()
def read_multichannel(fpath, apr, parts_list, t=0, channel_name_apr='t', channel_names_parts=None):
if isinstance(parts_list, (tuple, list)):
for p in parts_list:
if not isinstance(p, (pyapr.ShortParticles, pyapr.FloatParticles)):
raise AssertionError(
'argument \'parts_list\' to pyapr.io.read_multichannel must be a \
tuple or list of pyapr.XParticles objects')
else:
raise AssertionError(
'argument \'parts_list\' to pyapr.io.read_multichannel must be a tuple or list of pyapr.XParticles objects')
if channel_names_parts is None:
channel_names_parts = ['particles' + str(i) for i in range(len(parts_list))]
# Initialize APRFile for I/O
aprfile = pyapr.io.APRFile()
aprfile.set_read_write_tree(True)
# Read APR and particles from file
aprfile.open(fpath, 'READ')
aprfile.read_apr(apr, t=t, channel_name=channel_name_apr)
for i in range(len(parts_list)):
aprfile.read_particles(apr, channel_names_parts[i], parts_list[i], t=t)
aprfile.close()
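# Illustrative round trip (comment only; the paths are placeholders and the particle type
# depends on what your APR file actually stores):
#
#   import pyapr
#   apr = pyapr.APR()
#   parts = pyapr.ShortParticles()
#   read('/path/to/input.apr', apr, parts)    # load APR structure and particles
#   write('/tmp/output.apr', apr, parts)      # write them back out unchanged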
| 35.554348 | 120 | 0.672577 | 460 | 3,271 | 4.595652 | 0.134783 | 0.083254 | 0.05298 | 0.024598 | 0.937086 | 0.909177 | 0.909177 | 0.909177 | 0.909177 | 0.909177 | 0 | 0.001581 | 0.226536 | 3,271 | 91 | 121 | 35.945055 | 0.833992 | 0.072149 | 0 | 0.721311 | 0 | 0 | 0.119299 | 0.017515 | 0 | 0 | 0 | 0 | 0.065574 | 1 | 0.065574 | false | 0 | 0.016393 | 0 | 0.114754 | 0.032787 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
dbd431b7aac805c777609ec5bbeb0982afa598c6 | 1,613 | py | Python | src/2021/may/26/sudoku_quadrant_checker.py | xaverrd/braingu-toy-problems | 608030ab83d6f3161d70d782157f759677cc9d3e | [
"MIT"
] | null | null | null | src/2021/may/26/sudoku_quadrant_checker.py | xaverrd/braingu-toy-problems | 608030ab83d6f3161d70d782157f759677cc9d3e | [
"MIT"
] | null | null | null | src/2021/may/26/sudoku_quadrant_checker.py | xaverrd/braingu-toy-problems | 608030ab83d6f3161d70d782157f759677cc9d3e | [
"MIT"
] | 2 | 2021-05-27T14:23:04.000Z | 2021-05-28T14:18:35.000Z | # Have the function SudokuQuadrantChecker(strArr) read the strArr parameter being passed which will represent a 9x9 Sudoku board of integers ranging from 1 to 9. The rules of Sudoku are to place each of the 9 integers integer in every row and column and not have any integers repeat in the respective row, column, or 3x3 sub-grid. The input strArr will represent a Sudoku board and it will be structured in the following format: ["(N,N,N,N,N,x,x,x,x)","(...)","(...)",...)] where N stands for an integer between 1 and 9 and x will stand for an empty cell. Your program will determine if the board is legal; the board also does not necessarily have to be finished. If the board is legal, your program should return the string legal but if it isn't legal, it should return the 3x3 quadrants (separated by commas) where the errors exist. The 3x3 quadrants are numbered from 1 to 9 starting from top-left going to bottom-right.
# For example, if strArr is: ["(1,2,3,4,5,6,7,8,1)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(1,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)"] then your program should return 1,3,4 since the errors are in quadrants 1, 3 and 4 because of the repeating integer 1.
# Another example, if strArr is: ["(1,2,3,4,5,6,7,8,9)","(x,x,x,x,x,x,x,x,x)","(6,x,5,x,3,x,x,4,x)","(2,x,1,1,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,x)","(x,x,x,x,x,x,x,x,9)"] then your program should return 3,4,5,9.
def sudoku_quadrant_checker(str_arr):
    # code goes here
    return str_arr
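# One way to fill in the stub above (a sketch, not an official solution): scan every row,
# column and 3x3 sub-grid, and whenever a digit repeats, record the quadrant (numbered 1-9,
# left-to-right, top-to-bottom) of every cell involved in the repeat.
def sudoku_quadrant_checker_sketch(str_arr):
    # Parse "(N,...,N)" strings into a 9x9 grid of single-character strings.
    board = [row.strip('()').split(',') for row in str_arr]
    bad_quadrants = set()

    def quadrant(r, c):
        # Quadrants 1-9 run left-to-right, top-to-bottom in 3x3 blocks.
        return (r // 3) * 3 + (c // 3) + 1

    def check(cells):
        # cells is the list of (row, col) positions forming one row, column or sub-grid.
        positions_by_value = {}
        for r, c in cells:
            value = board[r][c]
            if value != 'x':
                positions_by_value.setdefault(value, []).append((r, c))
        for positions in positions_by_value.values():
            if len(positions) > 1:
                bad_quadrants.update(quadrant(r, c) for r, c in positions)

    for i in range(9):
        check([(i, c) for c in range(9)])                      # row i
        check([(r, i) for r in range(9)])                      # column i
        top, left = (i // 3) * 3, (i % 3) * 3                  # sub-grid i
        check([(top + dr, left + dc) for dr in range(3) for dc in range(3)])

    return ','.join(str(q) for q in sorted(bad_quadrants)) if bad_quadrants else 'legal'
# On the first sample board in the comments above this returns '1,3,4'; on the second, '3,4,5,9'.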
| 201.625 | 923 | 0.676379 | 384 | 1,613 | 2.835938 | 0.276042 | 0.236915 | 0.338843 | 0.433425 | 0.247934 | 0.167126 | 0.167126 | 0.167126 | 0.167126 | 0.158861 | 0 | 0.03783 | 0.131432 | 1,613 | 7 | 924 | 230.428571 | 0.739472 | 0.963422 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 8 |
9189ad7666bb19decccfb17025720577bee911c8 | 1,342 | py | Python | pythonTools/pandasConnector.py | ZacharyCalabrese/PythonTools | 31efc13b2bd6346c6ec02dd8307e14f9390404fc | [
"Apache-2.0"
] | null | null | null | pythonTools/pandasConnector.py | ZacharyCalabrese/PythonTools | 31efc13b2bd6346c6ec02dd8307e14f9390404fc | [
"Apache-2.0"
] | null | null | null | pythonTools/pandasConnector.py | ZacharyCalabrese/PythonTools | 31efc13b2bd6346c6ec02dd8307e14f9390404fc | [
"Apache-2.0"
] | null | null | null | import pandas
from pythonTools.convertToDictionaries import *
def csv_to_data_frame(path_and_file_name):
"""
Take in a path and file name for a csv file
and returns a pandas dataframe
Args:
path_and_file_name (string): absolute path for
file to be converted to a pandas dataframe
Returns:
dataframe (pandas Dataframe): A dataframe created
based on a list of dictionaries
None: If there is an error or the file is empty
"""
list_of_dictionaries = csv_to_list_of_dictionaries(path_and_file_name)
dataframe = pandas.DataFrame(list_of_dictionaries)
if dataframe is None:
return None
return dataframe
def excel_to_data_frame(path_and_file_name):
"""
Take in a path and file name for an excel file
and returns a pandas dataframe
Args:
path_and_file_name (string): absolute path for
file to be converted to a pandas dataframe
Returns:
dataframe (pandas Dataframe): A dataframe created
based on a list of dictionaries
None: If there is an error or the file is empty
"""
list_of_dictionaries = excel_to_list_of_dictionaries(path_and_file_name)
dataframe = pandas.DataFrame(list_of_dictionaries)
if dataframe is None:
return None
return dataframe
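# Illustrative usage (comment only; the path below is a placeholder, not a file shipped with
# this package):
#
#   df = csv_to_data_frame('/data/example.csv')
#   if df is not None:
#       print(df.head())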
| 27.387755 | 76 | 0.695231 | 190 | 1,342 | 4.7 | 0.221053 | 0.06271 | 0.098544 | 0.134379 | 0.902576 | 0.902576 | 0.902576 | 0.902576 | 0.902576 | 0.902576 | 0 | 0 | 0.264531 | 1,342 | 48 | 77 | 27.958333 | 0.904762 | 0.508942 | 0 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
533b617e0e5f937c8cc1c758b3a6ddfc9c22f7ce | 23,641 | py | Python | moztrap/model/library/migrations/0002_auto__del_unique_suitecase_case_suite__del_unique_caseversion_case_pro.py | yifanjiang/moztrap | 2130c7101b7596b19a2697ab5f1c745e93e7c95b | [
"BSD-2-Clause"
] | 1 | 2015-02-10T15:09:42.000Z | 2015-02-10T15:09:42.000Z | moztrap/model/library/migrations/0002_auto__del_unique_suitecase_case_suite__del_unique_caseversion_case_pro.py | yifanjiang/moztrap | 2130c7101b7596b19a2697ab5f1c745e93e7c95b | [
"BSD-2-Clause"
] | null | null | null | moztrap/model/library/migrations/0002_auto__del_unique_suitecase_case_suite__del_unique_caseversion_case_pro.py | yifanjiang/moztrap | 2130c7101b7596b19a2697ab5f1c745e93e7c95b | [
"BSD-2-Clause"
] | null | null | null | # encoding: utf-8
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
# Removing unique constraint on 'CaseStep', fields ['caseversion', 'number']
db.delete_unique('library_casestep', ['caseversion_id', 'number'])
# Removing unique constraint on 'CaseVersion', fields ['case', 'productversion']
db.delete_unique('library_caseversion', ['case_id', 'productversion_id'])
# Removing unique constraint on 'SuiteCase', fields ['case', 'suite']
db.delete_unique('library_suitecase', ['case_id', 'suite_id'])
def backwards(self, orm):
# Adding unique constraint on 'SuiteCase', fields ['case', 'suite']
db.create_unique('library_suitecase', ['case_id', 'suite_id'])
# Adding unique constraint on 'CaseVersion', fields ['case', 'productversion']
db.create_unique('library_caseversion', ['case_id', 'productversion_id'])
# Adding unique constraint on 'CaseStep', fields ['caseversion', 'number']
db.create_unique('library_casestep', ['caseversion_id', 'number'])
models = {
'auth.group': {
'Meta': {'object_name': 'Group'},
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '80'}),
'permissions': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.Permission']", 'symmetrical': 'False', 'blank': 'True'})
},
'auth.permission': {
'Meta': {'ordering': "('content_type__app_label', 'content_type__model', 'codename')", 'unique_together': "(('content_type', 'codename'),)", 'object_name': 'Permission'},
'codename': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'content_type': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['contenttypes.ContentType']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '50'})
},
'auth.user': {
'Meta': {'object_name': 'User'},
'date_joined': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
'email': ('django.db.models.fields.EmailField', [], {'max_length': '75', 'blank': 'True'}),
'first_name': ('django.db.models.fields.CharField', [], {'max_length': '30', 'blank': 'True'}),
'groups': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.Group']", 'symmetrical': 'False', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'is_active': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'is_staff': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'is_superuser': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'last_login': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
'last_name': ('django.db.models.fields.CharField', [], {'max_length': '30', 'blank': 'True'}),
'password': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
'user_permissions': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.Permission']", 'symmetrical': 'False', 'blank': 'True'}),
'username': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '30'})
},
'contenttypes.contenttype': {
'Meta': {'ordering': "('name',)", 'unique_together': "(('app_label', 'model'),)", 'object_name': 'ContentType', 'db_table': "'django_content_type'"},
'app_label': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'model': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
},
'core.product': {
'Meta': {'object_name': 'Product'},
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 571453)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
'has_team': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 571641)'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'own_team': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.User']", 'symmetrical': 'False', 'blank': 'True'})
},
'core.productversion': {
'Meta': {'ordering': "['product', 'order']", 'object_name': 'ProductVersion'},
'codename': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 569302)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'environments': ('django.db.models.fields.related.ManyToManyField', [], {'related_name': "'productversion'", 'symmetrical': 'False', 'to': "orm['environments.Environment']"}),
'has_team': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'latest': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 569491)'}),
'order': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'own_team': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.User']", 'symmetrical': 'False', 'blank': 'True'}),
'product': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'versions'", 'to': "orm['core.Product']"}),
'version': ('django.db.models.fields.CharField', [], {'max_length': '100'})
},
'environments.category': {
'Meta': {'object_name': 'Category'},
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 575347)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 575551)'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200'})
},
'environments.element': {
'Meta': {'object_name': 'Element'},
'category': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'elements'", 'to': "orm['environments.Category']"}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 580000)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 580199)'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200'})
},
'environments.environment': {
'Meta': {'object_name': 'Environment'},
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 572458)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'elements': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['environments.Element']", 'symmetrical': 'False'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 572638)'}),
'profile': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'environments'", 'null': 'True', 'to': "orm['environments.Profile']"})
},
'environments.profile': {
'Meta': {'object_name': 'Profile'},
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 579139)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 579360)'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200'})
},
'library.case': {
'Meta': {'object_name': 'Case'},
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 570673)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 570856)'}),
'product': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'cases'", 'to': "orm['core.Product']"})
},
'library.caseattachment': {
'Meta': {'object_name': 'CaseAttachment'},
'attachment': ('django.db.models.fields.files.FileField', [], {'max_length': '100'}),
'caseversion': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'attachments'", 'to': "orm['library.CaseVersion']"}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 564457)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 564680)'})
},
'library.casestep': {
'Meta': {'ordering': "['caseversion', 'number']", 'object_name': 'CaseStep'},
'caseversion': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'steps'", 'to': "orm['library.CaseVersion']"}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 565387)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'expected': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'instruction': ('django.db.models.fields.TextField', [], {}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 565583)'}),
'number': ('django.db.models.fields.IntegerField', [], {})
},
'library.caseversion': {
'Meta': {'ordering': "['case', 'productversion__order']", 'object_name': 'CaseVersion'},
'case': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'versions'", 'to': "orm['library.Case']"}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 573738)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
'environments': ('django.db.models.fields.related.ManyToManyField', [], {'related_name': "'caseversion'", 'symmetrical': 'False', 'to': "orm['environments.Environment']"}),
'envs_narrowed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'latest': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 573925)'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
'productversion': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'caseversions'", 'to': "orm['core.ProductVersion']"}),
'status': ('django.db.models.fields.CharField', [], {'default': "'draft'", 'max_length': '30', 'db_index': 'True'}),
'tags': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['tags.Tag']", 'symmetrical': 'False', 'blank': 'True'})
},
'library.suite': {
'Meta': {'object_name': 'Suite'},
'cases': ('django.db.models.fields.related.ManyToManyField', [], {'related_name': "'suites'", 'symmetrical': 'False', 'through': "orm['library.SuiteCase']", 'to': "orm['library.Case']"}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 566422)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 566606)'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
'product': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'suites'", 'to': "orm['core.Product']"}),
'status': ('django.db.models.fields.CharField', [], {'default': "'draft'", 'max_length': '30', 'db_index': 'True'})
},
'library.suitecase': {
'Meta': {'ordering': "['order']", 'object_name': 'SuiteCase'},
'case': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'suitecases'", 'to': "orm['library.Case']"}),
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 576178)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 576378)'}),
'order': ('django.db.models.fields.IntegerField', [], {'default': '0', 'db_index': 'True'}),
'suite': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'suitecases'", 'to': "orm['library.Suite']"})
},
'tags.tag': {
'Meta': {'object_name': 'Tag'},
'created_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'created_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 568081)'}),
'deleted_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'deleted_on': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True', 'null': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified_by': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'+'", 'null': 'True', 'to': "orm['auth.User']"}),
'modified_on': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime(2012, 1, 31, 22, 30, 11, 568264)'}),
'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '100'}),
'product': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['core.Product']", 'null': 'True', 'blank': 'True'})
}
}
complete_apps = ['library']
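# For reference, this migration corresponds (roughly) to dropping Meta.unique_together on the
# affected models in moztrap/model/library/models.py. The snippet below is only an illustration
# of that Django-side change, not the literal upstream model code:
#
#   class SuiteCase(...):
#       class Meta:
#           # previously: unique_together = [("case", "suite")]
#           ordering = ["order"]
#
# CaseVersion loses ("case", "productversion") and CaseStep loses ("caseversion", "number") in
# the same way.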
| 95.712551 | 199 | 0.567954 | 2,487 | 23,641 | 5.295135 | 0.068758 | 0.097198 | 0.169033 | 0.241476 | 0.849191 | 0.843648 | 0.827929 | 0.805528 | 0.741059 | 0.678184 | 0 | 0.02859 | 0.172962 | 23,641 | 246 | 200 | 96.101626 | 0.644947 | 0.019162 | 0 | 0.358407 | 0 | 0 | 0.623986 | 0.310828 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00885 | false | 0.004425 | 0.017699 | 0 | 0.039823 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
727c3d392a1e9ce03f7849f7ff39ec4e5937103c | 43,250 | py | Python | clients/hydra/python/ory_hydra_client/api/public_api.py | russelg/sdk | 2515b35981784319bd7d58fcf0b5ab85b501b62f | [
"Apache-2.0"
] | null | null | null | clients/hydra/python/ory_hydra_client/api/public_api.py | russelg/sdk | 2515b35981784319bd7d58fcf0b5ab85b501b62f | [
"Apache-2.0"
] | null | null | null | clients/hydra/python/ory_hydra_client/api/public_api.py | russelg/sdk | 2515b35981784319bd7d58fcf0b5ab85b501b62f | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
ORY Hydra
Welcome to the ORY Hydra HTTP API documentation. You will find documentation for all HTTP APIs here. # noqa: E501
The version of the OpenAPI document: v1.10.6
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from ory_hydra_client.api_client import ApiClient
from ory_hydra_client.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class PublicApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def disconnect_user(self, **kwargs): # noqa: E501
"""OpenID Connect Front-Backchannel Enabled Logout # noqa: E501
This endpoint initiates and completes user logout at ORY Hydra and initiates OpenID Connect Front-/Back-channel logout: https://openid.net/specs/openid-connect-frontchannel-1_0.html https://openid.net/specs/openid-connect-backchannel-1_0.html # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disconnect_user(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.disconnect_user_with_http_info(**kwargs) # noqa: E501
def disconnect_user_with_http_info(self, **kwargs): # noqa: E501
"""OpenID Connect Front-Backchannel Enabled Logout # noqa: E501
This endpoint initiates and completes user logout at ORY Hydra and initiates OpenID Connect Front-/Back-channel logout: https://openid.net/specs/openid-connect-frontchannel-1_0.html https://openid.net/specs/openid-connect-backchannel-1_0.html # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disconnect_user_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method disconnect_user" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/oauth2/sessions/logout', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
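# Illustrative usage of this client (comment only; the host is a placeholder and the exact
# wiring depends on your ORY Hydra deployment):
#
#   import ory_hydra_client
#   configuration = ory_hydra_client.Configuration(host="https://hydra.example.org")
#   with ory_hydra_client.ApiClient(configuration) as api_client:
#       public_api = ory_hydra_client.api.public_api.PublicApi(api_client)
#       public_api.disconnect_user()  # triggers OpenID Connect front-/back-channel logout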
def discover_open_id_configuration(self, **kwargs): # noqa: E501
"""OpenID Connect Discovery # noqa: E501
The well-known endpoint can be used to retrieve information for OpenID Connect clients. We encourage you not to roll your own OpenID Connect client but to use an OpenID Connect client library instead. You can learn more on this flow at https://openid.net/specs/openid-connect-discovery-1_0.html . Popular libraries for OpenID Connect clients include oidc-client-js (JavaScript), go-oidc (Golang), and others. For a full list of clients go here: https://openid.net/developers/certified/ # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.discover_open_id_configuration(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: WellKnown
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.discover_open_id_configuration_with_http_info(**kwargs) # noqa: E501
def discover_open_id_configuration_with_http_info(self, **kwargs): # noqa: E501
"""OpenID Connect Discovery # noqa: E501
The well-known endpoint can be used to retrieve information for OpenID Connect clients. We encourage you not to roll your own OpenID Connect client but to use an OpenID Connect client library instead. You can learn more on this flow at https://openid.net/specs/openid-connect-discovery-1_0.html . Popular libraries for OpenID Connect clients include oidc-client-js (JavaScript), go-oidc (Golang), and others. For a full list of clients go here: https://openid.net/developers/certified/ # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.discover_open_id_configuration_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(WellKnown, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method discover_open_id_configuration" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/.well-known/openid-configuration', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WellKnown', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
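# Hedged usage sketch (illustrative only, not part of the generated client).
# It shows a plain synchronous call plus the (connection, read) timeout tuple
# described in the docstring above. The package/module names and the PublicApi
# class name are assumptions based on the usual swagger-codegen layout.
from hydra_client import ApiClient, Configuration        # hypothetical package name
from hydra_client.api.public_api import PublicApi        # hypothetical module path and class name

config = Configuration()
config.host = "https://hydra.example.com"                 # hypothetical public endpoint
api = PublicApi(ApiClient(config))

well_known = api.discover_open_id_configuration(_request_timeout=(3.05, 27))
print(well_known)   # WellKnown model with the provider's OpenID Connect metadata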
def is_instance_ready(self, **kwargs): # noqa: E501
"""Check Readiness Status # noqa: E501
This endpoint returns a 200 status code when the HTTP server is up and running and the environment dependencies (e.g. the database) are responsive as well. If the service supports TLS Edge Termination, this endpoint does not require the `X-Forwarded-Proto` header to be set. Be aware that if you are running multiple nodes of this service, the health status will never refer to the cluster state, only to a single instance. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.is_instance_ready(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: HealthStatus
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.is_instance_ready_with_http_info(**kwargs) # noqa: E501
def is_instance_ready_with_http_info(self, **kwargs): # noqa: E501
"""Check Readiness Status # noqa: E501
This endpoint returns a 200 status code when the HTTP server is up and running and the environment dependencies (e.g. the database) are responsive as well. If the service supports TLS Edge Termination, this endpoint does not require the `X-Forwarded-Proto` header to be set. Be aware that if you are running multiple nodes of this service, the health status will never refer to the cluster state, only to a single instance. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.is_instance_ready_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(HealthStatus, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method is_instance_ready" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/health/ready', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='HealthStatus', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
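# Hedged sketch of the readiness probe (reuses the hypothetical `api` object
# from the sketch above). The *_with_http_info variant returns a
# (data, status_code, headers) tuple; _preload_content=False skips
# deserialization and hands back the raw HTTP response object instead.
health, status_code, headers = api.is_instance_ready_with_http_info()
print(status_code, health)          # expect 200 and a HealthStatus model

raw = api.is_instance_ready(_preload_content=False)
print(raw.status, raw.data)         # raw urllib3-style response fields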
def oauth2_token(self, grant_type, **kwargs): # noqa: E501
"""The OAuth 2.0 Token Endpoint # noqa: E501
The client makes a request to the token endpoint by sending the following parameters using the \"application/x-www-form-urlencoded\" HTTP request entity-body. > Do not implement a client for this endpoint yourself. Use a library. There are many libraries > available for any programming language. You can find a list of libraries here: https://oauth.net/code/ > > Do note that Hydra SDK does not implement this endpoint properly. Use one of the libraries listed above! # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.oauth2_token(grant_type, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str grant_type: (required)
:param str code:
:param str refresh_token:
:param str redirect_uri:
:param str client_id:
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Oauth2TokenResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.oauth2_token_with_http_info(grant_type, **kwargs) # noqa: E501
def oauth2_token_with_http_info(self, grant_type, **kwargs): # noqa: E501
"""The OAuth 2.0 Token Endpoint # noqa: E501
The client makes a request to the token endpoint by sending the following parameters using the \"application/x-www-form-urlencoded\" HTTP request entity-body. > Do not implement a client for this endpoint yourself. Use a library. There are many libraries > available for any programming language. You can find a list of libraries here: https://oauth.net/code/ > > Do note that Hydra SDK does not implement this endpoint properly. Use one of the libraries listed above! # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.oauth2_token_with_http_info(grant_type, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str grant_type: (required)
:param str code:
:param str refresh_token:
:param str redirect_uri:
:param str client_id:
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Oauth2TokenResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'grant_type',
'code',
'refresh_token',
'redirect_uri',
'client_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method oauth2_token" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'grant_type' is set
if self.api_client.client_side_validation and ('grant_type' not in local_var_params or # noqa: E501
local_var_params['grant_type'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `grant_type` when calling `oauth2_token`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'grant_type' in local_var_params:
form_params.append(('grant_type', local_var_params['grant_type'])) # noqa: E501
if 'code' in local_var_params:
form_params.append(('code', local_var_params['code'])) # noqa: E501
if 'refresh_token' in local_var_params:
form_params.append(('refresh_token', local_var_params['refresh_token'])) # noqa: E501
if 'redirect_uri' in local_var_params:
form_params.append(('redirect_uri', local_var_params['redirect_uri'])) # noqa: E501
if 'client_id' in local_var_params:
form_params.append(('client_id', local_var_params['client_id'])) # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['basic', 'oauth2'] # noqa: E501
return self.api_client.call_api(
'/oauth2/token', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Oauth2TokenResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
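# Hedged sketch of a refresh_token grant through this wrapper. The docstring
# above explicitly recommends a dedicated OAuth2 library for real use; this
# only shows how the form parameters map onto the method. The username and
# password attributes used for HTTP basic auth follow common swagger-codegen
# Configuration output and are assumptions for this particular client.
config.username = "my-client-id"               # hypothetical client credentials
config.password = "my-client-secret"
token = api.oauth2_token(
    grant_type="refresh_token",
    refresh_token="<previously issued refresh token>",
    client_id="my-client-id",
)
print(token)    # Oauth2TokenResponse model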
def oauth_auth(self, **kwargs): # noqa: E501
"""The OAuth 2.0 Authorize Endpoint # noqa: E501
This endpoint is not documented here because you should never use your own implementation to perform OAuth2 flows. OAuth2 is a very popular protocol and a library for your programming language exists. To learn more about this flow, please refer to the specification: https://tools.ietf.org/html/rfc6749 # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.oauth_auth(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.oauth_auth_with_http_info(**kwargs) # noqa: E501
def oauth_auth_with_http_info(self, **kwargs): # noqa: E501
"""The OAuth 2.0 Authorize Endpoint # noqa: E501
This endpoint is not documented here because you should never use your own implementation to perform OAuth2 flows. OAuth2 is a very popular protocol and a library for your programming language exists. To learn more about this flow, please refer to the specification: https://tools.ietf.org/html/rfc6749 # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.oauth_auth_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method oauth_auth" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/oauth2/auth', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def revoke_o_auth2_token(self, token, **kwargs): # noqa: E501
"""Revoke OAuth2 Tokens # noqa: E501
Revoking a token (both access and refresh) means that the tokens will be invalid. A revoked access token can no longer be used to make access requests, and a revoked refresh token can no longer be used to refresh an access token. Revoking a refresh token also invalidates the access token that was created with it. A token may only be revoked by the client the token was generated for. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.revoke_o_auth2_token(token, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str token: (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.revoke_o_auth2_token_with_http_info(token, **kwargs) # noqa: E501
def revoke_o_auth2_token_with_http_info(self, token, **kwargs): # noqa: E501
"""Revoke OAuth2 Tokens # noqa: E501
Revoking a token (both access and refresh) means that the tokens will be invalid. A revoked access token can no longer be used to make access requests, and a revoked refresh token can no longer be used to refresh an access token. Revoking a refresh token also invalidates the access token that was created with it. A token may only be revoked by the client the token was generated for. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.revoke_o_auth2_token_with_http_info(token, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str token: (required)
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'token'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method revoke_o_auth2_token" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'token' is set
if self.api_client.client_side_validation and ('token' not in local_var_params or # noqa: E501
local_var_params['token'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `token` when calling `revoke_o_auth2_token`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'token' in local_var_params:
form_params.append(('token', local_var_params['token'])) # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['basic', 'oauth2'] # noqa: E501
return self.api_client.call_api(
'/oauth2/revoke', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
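# Hedged sketch: revoking a token. As described above, only the client the
# token was issued to may revoke it, so the basic-auth credentials on the
# assumed Configuration object are what authorize this call. Returns None.
api.revoke_o_auth2_token(token="<access or refresh token to revoke>")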
def userinfo(self, **kwargs): # noqa: E501
"""OpenID Connect Userinfo # noqa: E501
This endpoint returns the payload of the ID Token, including the idTokenExtra values, of the provided OAuth 2.0 Access Token. For more information please [refer to the spec](http://openid.net/specs/openid-connect-core-1_0.html#UserInfo). In the case of authentication error, a WWW-Authenticate header might be set in the response with more information about the error. See [the spec](https://datatracker.ietf.org/doc/html/rfc6750#section-3) for more details about header format. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.userinfo(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: UserinfoResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.userinfo_with_http_info(**kwargs) # noqa: E501
def userinfo_with_http_info(self, **kwargs): # noqa: E501
"""OpenID Connect Userinfo # noqa: E501
This endpoint returns the payload of the ID Token, including the idTokenExtra values, of the provided OAuth 2.0 Access Token. For more information please [refer to the spec](http://openid.net/specs/openid-connect-core-1_0.html#UserInfo). In the case of authentication error, a WWW-Authenticate header might be set in the response with more information about the error. See [the spec](https://datatracker.ietf.org/doc/html/rfc6750#section-3) for more details about header format. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.userinfo_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(UserinfoResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method userinfo" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/userinfo', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserinfoResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
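# Hedged sketch: fetching userinfo with a bearer access token. The 'oauth2'
# auth setting reads the token from the configuration; the access_token
# attribute name matches typical swagger-codegen output but is an assumption
# for this particular client.
config.access_token = "<OAuth 2.0 access token>"
claims = api.userinfo()
print(claims)   # UserinfoResponse with the ID Token payload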
def well_known(self, **kwargs): # noqa: E501
"""JSON Web Keys Discovery # noqa: E501
This endpoint returns JSON Web Keys to be used as public keys for verifying OpenID Connect ID Tokens and, if enabled, OAuth 2.0 JWT Access Tokens. This endpoint can be used with client libraries like [node-jwks-rsa](https://github.com/auth0/node-jwks-rsa) among others. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.well_known(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: JSONWebKeySet
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.well_known_with_http_info(**kwargs) # noqa: E501
def well_known_with_http_info(self, **kwargs): # noqa: E501
"""JSON Web Keys Discovery # noqa: E501
This endpoint returns JSON Web Keys to be used as public keys for verifying OpenID Connect ID Tokens and, if enabled, OAuth 2.0 JWT Access Tokens. This endpoint can be used with client libraries like [node-jwks-rsa](https://github.com/auth0/node-jwks-rsa) among others. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.well_known_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: if True, return only the response data,
without the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(JSONWebKeySet, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method well_known" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/.well-known/jwks.json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JSONWebKeySet', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
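# Hedged sketch mirroring the docstring's async pattern: async_req=True
# returns a thread-like result whose .get() yields the JSONWebKeySet model.
# Reuses the hypothetical `api` object from the earlier sketches.
thread = api.well_known(async_req=True)
jwks = thread.get()
print(jwks)     # JSONWebKeySet; these public keys verify ID Token signatures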
| 47.01087 | 507 | 0.606197 | 5,005 | 43,250 | 5.032368 | 0.072328 | 0.033033 | 0.044467 | 0.028586 | 0.943622 | 0.935761 | 0.934569 | 0.916108 | 0.911343 | 0.908326 | 0 | 0.014678 | 0.324231 | 43,250 | 919 | 508 | 47.062024 | 0.847093 | 0.536486 | 0 | 0.70922 | 0 | 0 | 0.154778 | 0.041581 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040189 | false | 0 | 0.01182 | 0 | 0.092199 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
72d2230e76c0435093ee838f9a19d7cf5a0fef0e | 31,983 | py | Python | kimai_python/api/customer_api.py | kbancerz/kimai-python | c5401acca8fe8cfa7db486dee5a215bd7daea95b | [
"MIT"
] | 6 | 2019-12-19T16:01:58.000Z | 2022-01-19T18:10:16.000Z | kimai_python/api/customer_api.py | kbancerz/kimai-python | c5401acca8fe8cfa7db486dee5a215bd7daea95b | [
"MIT"
] | 4 | 2020-05-16T23:33:15.000Z | 2021-07-06T20:53:32.000Z | kimai_python/api/customer_api.py | kbancerz/kimai-python | c5401acca8fe8cfa7db486dee5a215bd7daea95b | [
"MIT"
] | 3 | 2020-05-16T23:14:13.000Z | 2021-06-30T08:53:11.000Z | # coding: utf-8
"""
Kimai 2 - API Docs
JSON API for the Kimai 2 time-tracking software. Read more about its usage in the [API documentation](https://www.kimai.org/documentation/rest-api.html) and then download a [Swagger file](doc.json) for import e.g. in Postman. Be aware: it is not yet considered stable and BC breaks might happen, especially when using code generation. The order of JSON attributes is not guaranteed. # noqa: E501
OpenAPI spec version: 0.6
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from kimai_python.api_client import ApiClient
class CustomerApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def api_customers_get(self, **kwargs): # noqa: E501
"""Returns a collection of customers # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str visible: Visibility status to filter customers (1=visible, 2=hidden, 3=both)
:param str order: The result order. Allowed values: ASC, DESC (default: ASC)
:param str order_by: The field by which results will be ordered. Allowed values: id, name (default: name)
:param str term: Free search term
:return: list[CustomerCollection]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_get_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.api_customers_get_with_http_info(**kwargs) # noqa: E501
return data
def api_customers_get_with_http_info(self, **kwargs): # noqa: E501
"""Returns a collection of customers # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str visible: Visibility status to filter customers (1=visible, 2=hidden, 3=both)
:param str order: The result order. Allowed values: ASC, DESC (default: ASC)
:param str order_by: The field by which results will be ordered. Allowed values: id, name (default: name)
:param str term: Free search term
:return: list[CustomerCollection]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['visible', 'order', 'order_by', 'term'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_get" % key
)
params[key] = val
del params['kwargs']
if 'visible' in params and not re.match(r'\d+', params['visible']): # noqa: E501
raise ValueError("Invalid value for parameter `visible` when calling `api_customers_get`, must conform to the pattern `/\\d+/`") # noqa: E501
if 'order' in params and not re.match(r'ASC|DESC', params['order']): # noqa: E501
raise ValueError("Invalid value for parameter `order` when calling `api_customers_get`, must conform to the pattern `/ASC|DESC/`") # noqa: E501
if 'order_by' in params and not re.match(r'id|name', params['order_by']): # noqa: E501
raise ValueError("Invalid value for parameter `order_by` when calling `api_customers_get`, must conform to the pattern `/id|name/`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'visible' in params:
query_params.append(('visible', params['visible'])) # noqa: E501
if 'order' in params:
query_params.append(('order', params['order'])) # noqa: E501
if 'order_by' in params:
query_params.append(('orderBy', params['order_by'])) # noqa: E501
if 'term' in params:
query_params.append(('term', params['term'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[CustomerCollection]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
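# Hedged usage sketch for this collection endpoint (illustrative only). The
# imports mirror this module's own layout; the host URL and the X-AUTH-*
# header names are assumptions about how a Kimai instance is reached and
# authenticated, so adjust them for a real deployment.
from kimai_python.api_client import ApiClient
from kimai_python.api.customer_api import CustomerApi

client = ApiClient()
client.configuration.host = "https://kimai.example.com"      # hypothetical instance
client.set_default_header('X-AUTH-USER', 'api_user')         # assumed auth headers
client.set_default_header('X-AUTH-TOKEN', 'api_token_value')
customers = CustomerApi(client)

# Visible customers only, ordered by name descending, filtered by a search term.
for c in customers.api_customers_get(visible='1', order='DESC', order_by='name', term='acme'):
    print(c)   # CustomerCollection items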
def api_customers_id_get(self, id, **kwargs): # noqa: E501
"""Returns one customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_get(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_id_get_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.api_customers_id_get_with_http_info(id, **kwargs) # noqa: E501
return data
def api_customers_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Returns one customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_id_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `api_customers_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CustomerEntity', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def api_customers_id_meta_patch(self, id, **kwargs): # noqa: E501
"""Sets the value of a meta-field for an existing customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_meta_patch(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: Customer record ID to set the meta-field value for (required)
:param Body1 body:
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_id_meta_patch_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.api_customers_id_meta_patch_with_http_info(id, **kwargs) # noqa: E501
return data
def api_customers_id_meta_patch_with_http_info(self, id, **kwargs): # noqa: E501
"""Sets the value of a meta-field for an existing customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_meta_patch_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: Customer record ID to set the meta-field value for (required)
:param Body1 body:
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_id_meta_patch" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `api_customers_id_meta_patch`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers/{id}/meta', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CustomerEntity', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def api_customers_id_patch(self, body, id, **kwargs): # noqa: E501
"""Update an existing customer # noqa: E501
Update an existing customer; you can pass all attributes or just a subset of them # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_patch(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CustomerEditForm body: (required)
:param int id: Customer ID to update (required)
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_id_patch_with_http_info(body, id, **kwargs) # noqa: E501
else:
(data) = self.api_customers_id_patch_with_http_info(body, id, **kwargs) # noqa: E501
return data
def api_customers_id_patch_with_http_info(self, body, id, **kwargs): # noqa: E501
"""Update an existing customer # noqa: E501
Update an existing customer; you can pass all attributes or just a subset of them # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_patch_with_http_info(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CustomerEditForm body: (required)
:param int id: Customer ID to update (required)
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_id_patch" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `api_customers_id_patch`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `api_customers_id_patch`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CustomerEntity', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
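# Hedged sketch of a partial update (as the docstring above notes, a subset of
# attributes is allowed). The dict stands in for the generated CustomerEditForm
# model, so its field names are assumptions; `customers` is the CustomerApi
# instance from the earlier sketch.
updated = customers.api_customers_id_patch(
    body={'name': 'ACME Corp (renamed)', 'visible': True},   # assumed form fields
    id=42,                                                    # hypothetical customer ID
)
print(updated)   # CustomerEntity carrying the new values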
def api_customers_id_rates_get(self, id, **kwargs): # noqa: E501
"""Returns a collection of all rates for one customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_rates_get(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: The customer whose rates will be returned (required)
:return: list[CustomerRate]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_id_rates_get_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.api_customers_id_rates_get_with_http_info(id, **kwargs) # noqa: E501
return data
def api_customers_id_rates_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Returns a collection of all rates for one customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_rates_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: The customer whose rates will be returned (required)
:return: list[CustomerRate]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_id_rates_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `api_customers_id_rates_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers/{id}/rates', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[CustomerRate]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def api_customers_id_rates_post(self, id, body, **kwargs): # noqa: E501
"""Adds a new rate to a customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_rates_post(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: The customer to add the rate for (required)
:param CustomerRateForm body: (required)
:return: CustomerRate
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_id_rates_post_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.api_customers_id_rates_post_with_http_info(id, body, **kwargs) # noqa: E501
return data
def api_customers_id_rates_post_with_http_info(self, id, body, **kwargs): # noqa: E501
"""Adds a new rate to a customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_rates_post_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: The customer to add the rate for (required)
:param CustomerRateForm body: (required)
:return: CustomerRate
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_id_rates_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `api_customers_id_rates_post`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `api_customers_id_rates_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers/{id}/rates', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CustomerRate', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
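# Hedged sketch: adding a rate to a customer. The dict stands in for the
# generated CustomerRateForm model; its field names are assumptions about what
# a rate usually needs (assigned user and an hourly rate value).
new_rate = customers.api_customers_id_rates_post(
    id=42,                                        # hypothetical customer ID
    body={'user': None, 'rate': 95.0},            # assumed form fields
)
print(new_rate)   # CustomerRate entity as returned by the API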
def api_customers_id_rates_rate_id_delete(self, id, rate_id, **kwargs): # noqa: E501
"""Deletes one rate for an customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_rates_rate_id_delete(id, rate_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: The customer whose rate will be removed (required)
:param int rate_id: The rate to remove (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_id_rates_rate_id_delete_with_http_info(id, rate_id, **kwargs) # noqa: E501
else:
(data) = self.api_customers_id_rates_rate_id_delete_with_http_info(id, rate_id, **kwargs) # noqa: E501
return data
def api_customers_id_rates_rate_id_delete_with_http_info(self, id, rate_id, **kwargs): # noqa: E501
"""Deletes one rate for an customer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_id_rates_rate_id_delete_with_http_info(id, rate_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: The customer whose rate will be removed (required)
:param int rate_id: The rate to remove (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'rate_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_id_rates_rate_id_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `api_customers_id_rates_rate_id_delete`") # noqa: E501
# verify the required parameter 'rate_id' is set
if ('rate_id' not in params or
params['rate_id'] is None):
raise ValueError("Missing the required parameter `rate_id` when calling `api_customers_id_rates_rate_id_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
if 'rate_id' in params:
path_params['rateId'] = params['rate_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers/{id}/rates/{rateId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def api_customers_post(self, body, **kwargs): # noqa: E501
"""Creates a new customer # noqa: E501
Creates a new customer and returns it afterwards # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_post(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CustomerEditForm body: (required)
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.api_customers_post_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.api_customers_post_with_http_info(body, **kwargs) # noqa: E501
return data
def api_customers_post_with_http_info(self, body, **kwargs): # noqa: E501
"""Creates a new customer # noqa: E501
Creates a new customer and returns it afterwards # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_customers_post_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CustomerEditForm body: (required)
:return: CustomerEntity
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method api_customers_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `api_customers_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['apiToken', 'apiUser'] # noqa: E501
return self.api_client.call_api(
'/api/customers', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CustomerEntity', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
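# Hedged sketch: creating a customer and reading it back. The body dict is a
# stand-in for the generated CustomerEditForm; the field names and the `.id`
# attribute on the returned CustomerEntity are assumptions.
created = customers.api_customers_post(
    body={'name': 'ACME Corp', 'country': 'DE', 'currency': 'EUR', 'timezone': 'Europe/Berlin'}
)
fetched = customers.api_customers_id_get(str(created.id))   # id is passed as a string path param
print(fetched)   # CustomerEntity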
| 40.331652 | 401 | 0.60923 | 3,848 | 31,983 | 4.820686 | 0.06289 | 0.044852 | 0.043019 | 0.031051 | 0.948302 | 0.939623 | 0.931806 | 0.919137 | 0.905337 | 0.893477 | 0 | 0.014745 | 0.300222 | 31,983 | 792 | 402 | 40.382576 | 0.814083 | 0.334959 | 0 | 0.767816 | 0 | 0.004598 | 0.195653 | 0.054181 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03908 | false | 0 | 0.009195 | 0 | 0.105747 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
72f93165e621d44269a2d011826958efe4eb5194 | 73 | py | Python | SimpleHTTPSAuthUploadServer/__main__.py | KenichiTanino/SimpleHTTPSAuthUploadServer | db36f002cc17739cd4f2b5002f03caa3d4153bbd | [
"MIT"
] | null | null | null | SimpleHTTPSAuthUploadServer/__main__.py | KenichiTanino/SimpleHTTPSAuthUploadServer | db36f002cc17739cd4f2b5002f03caa3d4153bbd | [
"MIT"
] | null | null | null | SimpleHTTPSAuthUploadServer/__main__.py | KenichiTanino/SimpleHTTPSAuthUploadServer | db36f002cc17739cd4f2b5002f03caa3d4153bbd | [
"MIT"
] | null | null | null | from . import simple_https_auth_upload
simple_https_auth_upload.main()
| 14.6 | 38 | 0.849315 | 11 | 73 | 5.090909 | 0.636364 | 0.392857 | 0.535714 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09589 | 73 | 4 | 39 | 18.25 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
f448d33436f235e6a316f25fc0bdc3cb42b69ac5 | 45 | py | Python | test/__init__.py | shakefu/simon-etsy | 5ac57f40317ca1f060038c2f0fd42d794bc41322 | [
"Apache-2.0"
] | null | null | null | test/__init__.py | shakefu/simon-etsy | 5ac57f40317ca1f060038c2f0fd42d794bc41322 | [
"Apache-2.0"
] | null | null | null | test/__init__.py | shakefu/simon-etsy | 5ac57f40317ca1f060038c2f0fd42d794bc41322 | [
"Apache-2.0"
] | 3 | 2019-08-06T21:08:24.000Z | 2021-09-05T21:53:23.000Z | def test_it_imports():
import simon_etsy
| 15 | 22 | 0.755556 | 7 | 45 | 4.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177778 | 45 | 2 | 23 | 22.5 | 0.837838 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 1 | 0 | 1.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f44bf379c046935551be575ad1adb0dd71eb88f4 | 38,224 | py | Python | Cogs/Hw.py | RehanPlayz/CorpBot.py | 336a36d202bb7eda88b6ca1f8b6778a9222ed50b | [
"MIT"
] | 368 | 2016-10-17T21:21:12.000Z | 2022-03-18T09:22:56.000Z | Cogs/Hw.py | RehanPlayz/CorpBot.py | 336a36d202bb7eda88b6ca1f8b6778a9222ed50b | [
"MIT"
] | 60 | 2017-01-01T01:35:10.000Z | 2022-01-19T18:43:00.000Z | Cogs/Hw.py | RehanPlayz/CorpBot.py | 336a36d202bb7eda88b6ca1f8b6778a9222ed50b | [
"MIT"
] | 189 | 2016-10-10T20:38:11.000Z | 2022-03-26T12:23:49.000Z | import discord, time
from discord.ext import commands
from Cogs import Utils, PCPP, DisplayName, Message, PickList
def setup(bot):
# Add the bot and deps
settings = bot.get_cog("Settings")
bot.add_cog(Hw(bot, settings))
# This is the Hardware module. It handles hardware build profiles and pcpartpicker links
class Hw(commands.Cog):
# Init with the bot reference, and a reference to the settings var
def __init__(self, bot, settings):
self.bot = bot
self.settings = settings
self.hwactive = {}
self.charset = "0123456789"
global Utils, DisplayName
Utils = self.bot.get_cog("Utils")
DisplayName = self.bot.get_cog("DisplayName")
def gen_id(self):
# Just use the current time as that shouldn't ever be the same (unless a user
# manages to do this twice in < 1 second)
return str(time.time())
@commands.command(pass_context=True)
async def cancelhw(self, ctx):
"""Cancels a current hardware session."""
if str(ctx.author.id) in self.hwactive:
self._stop_hw(ctx.author)
await ctx.send("You've left your current hardware session!".format(ctx.prefix))
return
await ctx.send("You're not in a current hardware session.")
def _stop_hw(self, author):
if str(author.id) in self.hwactive:
del self.hwactive[str(author.id)]
@commands.command(pass_context=True)
async def sethwchannel(self, ctx, *, channel: discord.TextChannel = None):
"""Sets the channel for hardware (admin only)."""
if not await Utils.is_admin_reply(ctx): return
if channel == None:
self.settings.setServerStat(ctx.guild, "HardwareChannel", "")
msg = 'Hardware works *only* in pm now.'
return await ctx.send(msg)
# If we made it this far - then we can add it
self.settings.setServerStat(ctx.guild, "HardwareChannel", channel.id)
msg = 'Hardware channel set to **{}**.'.format(channel.name)
await ctx.send(Utils.suppressed(ctx,msg))
@sethwchannel.error
	async def sethwchannel_error(self, ctx, error):
# do stuff
msg = 'sethwchannel Error: {}'.format(error)
await ctx.send(msg)
@commands.command(pass_context=True)
async def pcpp(self, ctx, url = None, style = None, escape = None):
"""Convert a pcpartpicker.com link into markdown parts. Available styles: normal, md, mdblock, bold, and bolditalic."""
usage = "Usage: `{}pcpp [url] [style=normal, md, mdblock, bold, bolditalic] [escape=yes/no (optional)]`".format(ctx.prefix)
if not style:
style = 'normal'
if not url:
return await ctx.send(usage)
if escape == None:
escape = 'no'
escape = escape.lower() in ["yes","true","on","enable","enabled"]
output = await PCPP.getMarkdown(url, style, escape)
if not output:
msg = 'Something went wrong! Make sure you use a valid pcpartpicker link.'
return await ctx.send(msg)
if len(output) > 2000:
msg = "That's an *impressive* list of parts - but the max length allowed for messages in Discord is 2000 characters, and you're at *{}*.".format(len(output))
msg += '\nMaybe see if you can prune up that list a bit and try again?'
return await ctx.send(msg)
await ctx.send(Utils.suppressed(ctx,output))
@commands.command(pass_context=True)
async def mainhw(self, ctx, *, build = None):
"""Sets a new main build from your build list."""
if not build:
return await ctx.send("Usage: `{}mainhw [build name or number]`".format(ctx.prefix))
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
mainBuild = None
# Get build by name first - then by number
for b in buildList:
if b['Name'].lower() == build.lower():
# Found it
mainBuild = b
break
if mainBuild:
# Found it!
for b in buildList:
if b is mainBuild:
b['Main'] = True
else:
b['Main'] = False
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
msg = "{} set as main!".format(mainBuild['Name'])
return await ctx.send(Utils.suppressed(ctx,msg))
try:
build = int(build)-1
if build >= 0 and build < len(buildList):
mainBuild = buildList[build]
except:
pass
if mainBuild:
# Found it!
for b in buildList:
if b is mainBuild:
b['Main'] = True
else:
b['Main'] = False
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
msg = "{} set as main!".format(mainBuild['Name'])
return await ctx.send(Utils.suppressed(ctx,msg))
msg = "I couldn't find that build or number."
await ctx.send(msg)
@commands.command(pass_context=True)
async def delhw(self, ctx, *, build = None):
"""Removes a build from your build list."""
if not build:
return await ctx.send("Usage: `{}delhw [build name or number]`".format(ctx.prefix))
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
# Get build by name first - then by number
for b in buildList:
if b['Name'].lower() == build.lower():
# Found it
buildList.remove(b)
if b['Main'] and len(buildList):
buildList[0]['Main'] = True
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
msg = "{} removed!".format(b['Name'])
return await ctx.send(Utils.suppressed(ctx,msg))
try:
build = int(build)-1
if build >= 0 and build < len(buildList):
b = buildList.pop(build)
if b['Main'] and len(buildList):
buildList[0]['Main'] = True
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
msg = "{} removed!".format(b['Name'])
return await ctx.send(Utils.suppressed(ctx,msg))
except:
pass
msg = "I couldn't find that build or number."
await ctx.send(msg)
@commands.command(pass_context=True)
async def edithw(self, ctx, *, build = None):
"""Edits a build from your build list."""
hwChannel = None
if ctx.guild:
# Not a pm
hwChannel = self.settings.getServerStat(ctx.guild, "HardwareChannel")
if not (not hwChannel or hwChannel == ""):
# We need the channel id
if not str(hwChannel) == str(ctx.channel.id):
msg = 'This isn\'t the channel for that...'
for chan in ctx.guild.channels:
if str(chan.id) == str(hwChannel):
msg = 'This isn\'t the channel for that. Take the hardware talk to the **{}** channel.'.format(chan.name)
break
return await ctx.send(Utils.suppressed(ctx,msg))
else:
hwChannel = self.bot.get_channel(hwChannel)
if not hwChannel:
# Nothing set - pm
hwChannel = ctx.author
# Make sure we're not already in a parts transaction
if str(ctx.author.id) in self.hwactive:
return await ctx.send("You're already in a hardware session! You can leave with `{}cancelhw`".format(ctx.prefix))
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
if not len(buildList):
# No parts!
msg = 'You have no builds on file! You can add some with the `{}newhw` command.'.format(ctx.prefix)
return await ctx.send(msg)
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
mainBuild = None
# Get build by name first - then by number
if build is not None:
for b in buildList:
if b['Name'].lower() == build.lower():
# Found it
mainBuild = b
break
if not mainBuild:
try:
build = int(build)-1
if build >= 0 and build < len(buildList):
mainBuild = buildList[build]
except:
pass
else:
# No build passed - get the main if it exists
for b in buildList:
if b['Main']:
mainBuild = b
break
if not mainBuild:
msg = "I couldn't find that build or number."
return await ctx.send(msg)
# Set our HWActive flag
hw_id = self.gen_id()
self.hwactive[str(ctx.author.id)] = hw_id
# Here, we have a build
bname = Utils.suppressed(ctx,mainBuild['Name'])
bparts = Utils.suppressed(ctx,mainBuild['Hardware'])
msg = '"{}"\'s current parts:'.format(bname)
try:
await hwChannel.send(msg)
except:
# Can't send to the destination
self._stop_hw(ctx.author)
if hwChannel == ctx.author:
# Must not accept pms
await ctx.send("It looks like you don't accept pms. Please enable them and try again.")
return
if hwChannel == ctx.author and ctx.channel != ctx.author.dm_channel:
await ctx.message.add_reaction("📬")
await hwChannel.send(bparts)
msg = 'Alright, *{}*, what parts does "{}" have now? (Please include *all* parts for this build - you can add new lines with *shift + enter*)\n'.format(DisplayName.name(ctx.author), bname)
msg += 'You can also pass pcpartpicker links to have them formatted automagically - I can also format them using different styles.\n'
msg += 'For example: '
msg += '```https://pcpartpicker.com/list/123456 mdblock``` would format with the markdown block style.\n'
msg += 'Markdown styles available are *normal, md, mdblock, bold, bolditalic*'
while True:
parts = await self.prompt(hw_id, ctx, msg, hwChannel, DisplayName.name(ctx.author))
if not parts:
self._stop_hw(ctx.author)
return
if 'pcpartpicker.com' in parts.content.lower():
# Possibly a pc partpicker link?
msg = 'It looks like you sent a pc part picker link - did you want me to try and format that? (y/n/stop)'
test = await self.confirm(hw_id, ctx, parts, hwChannel, msg)
if test == None:
self._stop_hw(ctx.author)
return
elif test == True:
partList = parts.content.split()
if len(partList) == 1:
partList.append(None)
output = None
try:
output = await PCPP.getMarkdown(partList[0], partList[1], False)
except:
pass
if not output:
msg = 'Something went wrong! Make sure you use a valid pcpartpicker link.'
await hwChannel.send(msg)
self._stop_hw(ctx.author)
return
if len(output) > 2000:
msg = "That's an *impressive* list of parts - but the max length allowed for messages in Discord is 2000 characters, and you're at *{}*.".format(len(output))
msg += '\nMaybe see if you can prune up that list a bit and try again?'
await hwChannel.send(msg)
self._stop_hw(ctx.author)
return
# Make sure
conf = await self.confirm(hw_id, ctx, output, hwChannel, None, ctx.author)
if conf == None:
# Timed out
self._stop_hw(ctx.author)
return
elif conf == False:
# Didn't get our answer
msg = 'Alright, *{}*, what parts does "{}" have now? (Please include *all* parts for this build - you can add new lines with *shift + enter*)'.format(DisplayName.name(ctx.author), bname)
continue
m = '{} set to:\n{}'.format(bname, output)
await hwChannel.send(m)
mainBuild['Hardware'] = output
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
break
mainBuild['Hardware'] = parts.content
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
break
msg = '*{}*, {} was edited successfully!'.format(DisplayName.name(ctx.author), bname)
self._stop_hw(ctx.author)
await hwChannel.send(msg)
@commands.command(pass_context=True)
async def renhw(self, ctx, *, build = None):
"""Renames a build from your build list."""
hwChannel = None
if ctx.guild:
# Not a pm
hwChannel = self.settings.getServerStat(ctx.guild, "HardwareChannel")
if not (not hwChannel or hwChannel == ""):
# We need the channel id
if not str(hwChannel) == str(ctx.channel.id):
msg = 'This isn\'t the channel for that...'
for chan in ctx.guild.channels:
if str(chan.id) == str(hwChannel):
msg = 'This isn\'t the channel for that. Take the hardware talk to the **{}** channel.'.format(chan.name)
await ctx.send(msg)
return
else:
hwChannel = self.bot.get_channel(hwChannel)
if not hwChannel:
# Nothing set - pm
hwChannel = ctx.author
# Make sure we're not already in a parts transaction
if str(ctx.author.id) in self.hwactive:
await ctx.send("You're already in a hardware session! You can leave with `{}cancelhw`".format(ctx.prefix))
return
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
if not len(buildList):
# No parts!
msg = 'You have no builds on file! You can add some with the `{}newhw` command.'.format(ctx.prefix)
await ctx.send(msg)
return
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
mainBuild = None
# Get build by name first - then by number
if build is not None:
for b in buildList:
if b['Name'].lower() == build.lower():
# Found it
mainBuild = b
break
if not mainBuild:
try:
build = int(build)-1
if build >= 0 and build < len(buildList):
mainBuild = buildList[build]
except:
pass
else:
# No build passed - get the main if it exists
for b in buildList:
if b['Main']:
mainBuild = b
break
if not mainBuild:
msg = "I couldn't find that build or number."
await ctx.send(msg)
return
# Set our HWActive flag
hw_id = self.gen_id()
self.hwactive[str(ctx.author.id)] = hw_id
# Post the dm reaction
if hwChannel == ctx.author and ctx.channel != ctx.author.dm_channel:
await ctx.message.add_reaction("📬")
# Here, we have a build
bname = Utils.suppressed(ctx,mainBuild['Name'])
msg = 'Alright, *{}*, what do you want to rename "{}" to?'.format(DisplayName.name(ctx.author), bname)
while True:
try:
buildName = await self.prompt(hw_id, ctx, msg, hwChannel, DisplayName.name(ctx.author))
except:
# Can't send to the destination
self._stop_hw(ctx.author)
if hwChannel == ctx.author:
# Must not accept pms
await ctx.send("It looks like you don't accept pms. Please enable them and try again.")
return
if not buildName:
self._stop_hw(ctx.author)
return
buildExists = False
for build in buildList:
if build['Name'].lower() == buildName.content.lower():
mesg = 'It looks like you already have a build by that name, *{}*. Try again.'.format(DisplayName.name(ctx.author))
await hwChannel.send(mesg)
buildExists = True
break
if not buildExists:
mainBuild['Name'] = buildName.content
# Flush settings to all servers
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
break
bname2 = Utils.suppressed(ctx,buildName.content)
msg = '*{}*, {} was renamed to {} successfully!'.format(DisplayName.name(ctx.author), bname, bname2)
self._stop_hw(ctx.author)
await hwChannel.send(msg)
@commands.command(pass_context=True)
async def gethw(self, ctx, *, user = None, search = None):
"""Searches the user's hardware for a specific search term."""
if not user:
usage = "Usage: `{}gethw [user] [search term]`".format(ctx.prefix)
return await ctx.send(usage)
# Let's check for username and search term
parts = user.split()
memFromName = None
entries = []
for j in range(len(parts)):
# Reverse search direction
i = len(parts)-1-j
memFromName = None
# Name = 0 up to i joined by space
nameStr = ' '.join(parts[0:i])
buildStr = ' '.join(parts[i:])
memFromName = DisplayName.memberForName(nameStr, ctx.guild)
if memFromName:
# Got a member - let's check the remainder length, and search!
if len(buildStr) < 3:
usage = "Search term must be at least 3 characters."
return await ctx.send(usage)
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware", [])
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
for build in buildList:
bParts = build['Hardware']
for line in bParts.splitlines():
if buildStr.lower() in line.lower():
entries.append({"name":"{}. {}".format(len(entries)+1,build["Name"]),"value":line})
if len(entries):
# We're in business
return await PickList.PagePicker(title="Search results for \"{}\" ({:,} total)".format(buildStr, len(entries)),list=entries,ctx=ctx).pick()
# If we're here - then we didn't find a member - set it to the author, and run another quick search
buildStr = user
if len(buildStr) < 3:
usage = "Search term must be at least 3 characters."
return await ctx.send(usage)
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware", [])
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
for build in buildList:
bParts = build['Hardware']
for line in bParts.splitlines():
if buildStr.lower() in line.lower():
entries.append({"name":"{}. {}".format(len(entries)+1,build["Name"]),"value":line})
if len(entries):
# We're in business
return await PickList.PagePicker(title="Search results for \"{}\" ({:,} total)".format(buildStr, len(entries)),list=entries,ctx=ctx).pick()
return await Message.EmbedText(title="Nothing found for that search.",color=ctx.author).send(ctx)
@commands.command(pass_context=True)
async def hw(self, ctx, *, user : str = None, build = None):
"""Lists the hardware for either the user's default build - or the passed build."""
if not user:
user = "{}".format(ctx.author.mention)
# Let's check for username and build name
parts = user.split()
memFromName = None
buildParts = None
for j in range(len(parts)):
# Reverse search direction
i = len(parts)-1-j
# Name = 0 up to i joined by space
nameStr = ' '.join(parts[0:i])
buildStr = ' '.join(parts[i:])
memFromName = DisplayName.memberForName(nameStr, ctx.guild)
if memFromName:
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware")
if buildList == None:
buildList = []
for build in buildList:
if build['Name'].lower() == buildStr.lower():
# Ha! Found it!
buildParts = build
break
if buildParts:
# We're in business
break
else:
memFromName = None
if not memFromName:
# Try again with numbers
for j in range(len(parts)):
# Reverse search direction
i = len(parts)-1-j
# Name = 0 up to i joined by space
nameStr = ' '.join(parts[0:i])
buildStr = ' '.join(parts[i:])
memFromName = DisplayName.memberForName(nameStr, ctx.guild)
if memFromName:
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
try:
buildStr = int(buildStr)-1
if buildStr >= 0 and buildStr < len(buildList):
buildParts = buildList[buildStr]
except Exception:
memFromName = None
buildParts = None
if buildParts:
# We're in business
break
else:
memFromName = None
if not memFromName:
# One last shot - check if it's a build for us
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
for build in buildList:
if build['Name'].lower() == user.lower():
memFromName = ctx.author
buildParts = build
break
if not memFromName:
# Okay - *this* time is the last - check for number
try:
user_as_build = int(user)-1
if user_as_build >= 0 and user_as_build < len(buildList):
buildParts = buildList[user_as_build]
memFromName = ctx.author
except Exception:
pass
if not memFromName:
# Last check for a user passed as the only param
memFromName = DisplayName.memberForName(user, ctx.guild)
if not memFromName:
# We couldn't find them :(
msg = "I couldn't find that user/build combo..."
return await ctx.send(msg)
if buildParts == None:
# Check if that user has no builds
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware")
if buildList == None:
buildList = []
if not len(buildList):
# No parts!
msg = '*{}* has no builds on file! They can add some with the `{}newhw` command.'.format(DisplayName.name(memFromName), ctx.prefix)
return await ctx.send(msg)
# Must be the default build
for build in buildList:
if build['Main']:
buildParts = build
break
if not buildParts:
# Well... uh... no defaults
msg = "I couldn't find that user/build combo..."
return await ctx.send(msg)
# At this point - we *should* have a user and a build
msg_head = "__**{}'s {}:**__\n\n".format(DisplayName.name(memFromName), buildParts['Name'])
msg = msg_head + buildParts['Hardware']
if len(msg) > 2000: # is there somwhere the discord char count is defined, to avoid hardcoding?
msg = buildParts['Hardware'] # if the header pushes us over the limit, omit it and send just the string
await ctx.send(Utils.suppressed(ctx,msg))
@commands.command(pass_context=True)
async def rawhw(self, ctx, *, user : str = None, build = None):
"""Lists the raw markdown for either the user's default build - or the passed build."""
if not user:
user = "{}#{}".format(ctx.author.name, ctx.author.discriminator)
# Let's check for username and build name
parts = user.split()
memFromName = None
buildParts = None
for j in range(len(parts)):
# Reverse search direction
i = len(parts)-1-j
# Name = 0 up to i joined by space
nameStr = ' '.join(parts[0:i])
buildStr = ' '.join(parts[i:])
memFromName = DisplayName.memberForName(nameStr, ctx.guild)
if memFromName:
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware")
if buildList == None:
buildList = []
for build in buildList:
if build['Name'].lower() == buildStr.lower():
# Ha! Found it!
buildParts = build
break
if buildParts:
# We're in business
break
else:
memFromName = None
if not memFromName:
# Try again with numbers
for j in range(len(parts)):
# Reverse search direction
i = len(parts)-1-j
# Name = 0 up to i joined by space
nameStr = ' '.join(parts[0:i])
buildStr = ' '.join(parts[i:])
memFromName = DisplayName.memberForName(nameStr, ctx.guild)
if memFromName:
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
try:
buildStr = int(buildStr)-1
if buildStr >= 0 and buildStr < len(buildList):
buildParts = buildList[buildStr]
except Exception:
memFromName = None
buildParts = None
if buildParts:
# We're in business
break
else:
memFromName = None
if not memFromName:
# One last shot - check if it's a build for us
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
for build in buildList:
if build['Name'].lower() == user.lower():
memFromName = ctx.author
buildParts = build
break
if not memFromName:
# Okay - *this* time is the last - check for number
try:
user_as_build = int(user)-1
if user_as_build >= 0 and user_as_build < len(buildList):
buildParts = buildList[user_as_build]
memFromName = ctx.author
except Exception:
pass
if not memFromName:
# Last check for a user passed as the only param
memFromName = DisplayName.memberForName(user, ctx.guild)
if not memFromName:
# We couldn't find them :(
msg = "I couldn't find that user/build combo..."
return await ctx.send(msg)
if buildParts == None:
# Check if that user has no builds
buildList = self.settings.getGlobalUserStat(memFromName, "Hardware")
if buildList == None:
buildList = []
if not len(buildList):
# No parts!
msg = '*{}* has no builds on file! They can add some with the `{}newhw` command.'.format(DisplayName.name(memFromName), ctx.prefix)
return await ctx.send(msg)
# Must be the default build
for build in buildList:
if build['Main']:
buildParts = build
break
if not buildParts:
# Well... uh... no defaults
msg = "I couldn't find that user/build combo..."
return await ctx.send(msg)
# At this point - we *should* have a user and a build
p = discord.utils.escape_markdown(buildParts['Hardware'])
msg = "__**{}'s {} (Raw Markdown):**__\n\n{}".format(DisplayName.name(memFromName), buildParts['Name'], p)
await ctx.send(Utils.suppressed(ctx,msg))
@commands.command(pass_context=True)
async def listhw(self, ctx, *, user = None):
"""Lists the builds for the specified user - or yourself if no user passed."""
usage = 'Usage: `{}listhw [user]`'.format(ctx.prefix)
if not user:
user = "{}#{}".format(ctx.author.name, ctx.author.discriminator)
member = DisplayName.memberForName(user, ctx.guild)
if not member:
return await ctx.send(usage)
buildList = self.settings.getGlobalUserStat(member, "Hardware")
if buildList == None:
buildList = []
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
if not len(buildList):
msg = '*{}* has no builds on file! They can add some with the `{}newhw` command.'.format(DisplayName.name(member), ctx.prefix)
return await ctx.send(msg)
items = [{"name":"{}. {}".format(i,x["Name"]+(" (Main Build)" if x["Main"] else "")),"value":Utils.truncate_string(x["Hardware"])} for i,x in enumerate(buildList,start=1)]
return await PickList.PagePicker(title="{}'s Builds ({:,} total)".format(DisplayName.name(member),len(buildList)),list=items,ctx=ctx).pick()
@commands.command()
async def lhw(self, ctx, *, user = None):
"""Lists only the titles of the builds for the specified user - or yourself if no user passed."""
usage = 'Usage: `{}lhw [user]`'.format(ctx.prefix)
if not user: user = ctx.author.id
member = DisplayName.memberForName(user, ctx.guild)
if not member: return await ctx.send(usage)
buildList = self.settings.getGlobalUserStat(member, "Hardware", [])
buildList = sorted(buildList, key=lambda x:x['Name'].lower())
if not len(buildList):
msg = '*{}* has no builds on file! They can add some with the `{}newhw` command.'.format(DisplayName.name(member), ctx.prefix)
return await ctx.send(msg)
desc = "\n".join([Utils.truncate_string("{}. {}".format(i,x["Name"]+(" (Main Build)" if x["Main"] else ""))) for i,x in enumerate(buildList,start=1)])
return await PickList.PagePicker(
title="{}'s Builds ({:,} total)".format(DisplayName.name(member),len(buildList)),
description=desc,
ctx=ctx
).pick()
@commands.command(pass_context=True)
async def newhw(self, ctx):
"""Initiate a new-hardware conversation with the bot. The hardware added will also be set as the Main Build."""
buildList = self.settings.getGlobalUserStat(ctx.author, "Hardware")
if buildList == None:
buildList = []
hwChannel = None
if ctx.guild:
# Not a pm
hwChannel = self.settings.getServerStat(ctx.guild, "HardwareChannel")
if not (not hwChannel or hwChannel == ""):
# We need the channel id
if not str(hwChannel) == str(ctx.channel.id):
msg = 'This isn\'t the channel for that...'
for chan in ctx.guild.channels:
if str(chan.id) == str(hwChannel):
msg = 'This isn\'t the channel for that. Take the hardware talk to the **{}** channel.'.format(chan.name)
return await ctx.send(msg)
else:
hwChannel = self.bot.get_channel(hwChannel)
if not hwChannel:
# Nothing set - pm
hwChannel = ctx.author
# Make sure we're not already in a parts transaction
if str(ctx.author.id) in self.hwactive:
return await ctx.send("You're already in a hardware session! You can leave with `{}cancelhw`".format(ctx.prefix))
# Set our HWActive flag
hw_id = self.gen_id()
self.hwactive[str(ctx.author.id)] = hw_id
msg = 'Alright, *{}*, let\'s add a new build.\n\n'.format(DisplayName.name(ctx.author))
if len(buildList) == 1:
msg += 'You currently have *1 build* on file.\n\n'
else:
msg += 'You currently have *{} builds* on file.\n\nLet\'s get started!'.format(len(buildList))
try:
await hwChannel.send(msg)
except:
# Can't send to the destination
self._stop_hw(ctx.author)
if hwChannel == ctx.author:
# Must not accept pms
await ctx.send("It looks like you don't accept pms. Please enable them and try again.")
return
if hwChannel == ctx.author and ctx.channel != ctx.author.dm_channel:
await ctx.message.add_reaction("📬")
msg = '*{}*, tell me what you\'d like to call this build (type stop to cancel):'.format(DisplayName.name(ctx.author))
# Get the build name
newBuild = { 'Main': True }
while True:
buildName = await self.prompt(hw_id, ctx, msg, hwChannel, DisplayName.name(ctx.author))
if not buildName:
self._stop_hw(ctx.author)
return
buildExists = False
for build in buildList:
if build['Name'].lower() == buildName.content.lower():
mesg = 'It looks like you already have a build by that name, *{}*. Try again.'.format(DisplayName.name(ctx.author))
await hwChannel.send(mesg)
buildExists = True
break
if not buildExists:
newBuild['Name'] = buildName.content
break
bname = Utils.suppressed(ctx,buildName.content)
msg = 'Alright, *{}*, what parts does "{}" have? (Please include *all* parts for this build - you can add new lines with *shift + enter*)\n'.format(DisplayName.name(ctx.author), bname)
msg += 'You can also pass pcpartpicker links to have them formatted automagically - I can also format them using different styles.\n'
msg += 'For example: '
msg += '```https://pcpartpicker.com/list/123456 mdblock``` would format with the markdown block style.\n'
msg += 'Markdown styles available are *normal, md, mdblock, bold, bolditalic*'
while True:
parts = await self.prompt(hw_id, ctx, msg, hwChannel, DisplayName.name(ctx.author))
if not parts:
self._stop_hw(ctx.author)
return
if 'pcpartpicker.com' in parts.content.lower():
# Possibly a pc partpicker link?
msg = 'It looks like you sent a pc part picker link - did you want me to try and format that? (y/n/stop)'
test = await self.confirm(hw_id, ctx, parts, hwChannel, msg)
if test == None:
self._stop_hw(ctx.author)
return
elif test == True:
partList = parts.content.split()
if len(partList) == 1:
partList.append(None)
output = None
try:
output = await PCPP.getMarkdown(partList[0], partList[1], False)
except:
pass
#output = PCPP.getMarkdown(parts.content)
if not output:
msg = 'Something went wrong! Make sure you use a valid pcpartpicker link.'
await hwChannel.send(msg)
self._stop_hw(ctx.author)
return
if len(output) > 2000:
msg = "That's an *impressive* list of parts - but the max length allowed for messages in Discord is 2000 characters, and you're at *{}*.".format(len(output))
msg += '\nMaybe see if you can prune up that list a bit and try again?'
await hwChannel.send(msg)
self._stop_hw(ctx.author)
return
# Make sure
conf = await self.confirm(hw_id, ctx, output, hwChannel, None, ctx.author)
if conf == None:
# Timed out
self._stop_hw(ctx.author)
return
elif conf == False:
# Didn't get our answer
msg = 'Alright, *{}*, what parts does "{}" have? (Please include *all* parts for this build - you can add new lines with *shift + enter*)'.format(DisplayName.name(ctx.author), bname)
continue
m = '{} set to:\n{}'.format(bname, output)
await hwChannel.send(m)
newBuild['Hardware'] = output
break
newBuild['Hardware'] = parts.content
break
# Check if we already have a main build and clear it
for build in buildList:
if build['Main']:
build['Main'] = False
buildList.append(newBuild)
self.settings.setGlobalUserStat(ctx.author, "Hardware", buildList)
msg = '*{}*, {} was created successfully! It has been set as your main build. To select a different main, you can use `{}mainhw`'.format(DisplayName.name(ctx.author), bname, ctx.prefix)
self._stop_hw(ctx.author)
await hwChannel.send(msg)
# New HW helper methods
def channelCheck(self, msg, dest = None):
if self.stillHardwaring(msg.author) == False:
# any message is a valid check if we're not editing
return True
if dest:
# We have a target channel
if type(dest) is discord.User or type(dest) is discord.Member:
dest = dest.dm_channel.id
elif type(dest) is discord.TextChannel:
dest = dest.id
elif type(dest) is discord.Guild:
dest = dest.get_channel(dest.id).id
if not dest == msg.channel.id:
return False
else:
# Just make sure it's in pm or the hw channel
			if isinstance(msg.channel, discord.TextChannel):
# Let's check our server stuff
hwChannel = self.settings.getServerStat(msg.guild, "HardwareChannel")
if not (not hwChannel or hwChannel == ""):
# We need the channel id
					if not str(hwChannel) == str(msg.channel.id):
return False
else:
# Nothing set - pm
if not type(msg.channel) == discord.DMChannel:
return False
return True
# Makes sure we're still editing - if this gets set to False,
# that means the user stopped editing/newhw
def stillHardwaring(self, author):
return str(author.id) in self.hwactive
def confirmCheck(self, msg, dest = None):
if not self.channelCheck(msg, dest):
return False
msgStr = msg.content.lower()
if msgStr.startswith('y'):
return True
if msgStr.startswith('n'):
return True
elif msgStr.startswith('stop'):
return True
return False
async def confirm(self, hw_id, ctx, message, dest = None, m = None, author = None):
# Get author name
authorName = None
if author:
if type(author) is str:
authorName = author
else:
try:
authorName = DisplayName.name(author)
except Exception:
pass
else:
if message:
try:
author = message.author
except Exception:
pass
try:
authorName = DisplayName.name(message.author)
except Exception:
pass
if not dest:
dest = message.channel
if not m:
if authorName:
msg = '*{}*, I got:'.format(Utils.suppressed(ctx,authorName))
else:
msg = "I got:"
if type(message) is str:
msg2 = Utils.suppressed(ctx,message)
else:
msg2 = '{}'.format(Utils.suppressed(ctx,message.content))
msg3 = 'Is that correct? (y/n/stop)'
await dest.send(msg)
await dest.send(msg2)
await dest.send(msg3)
else:
msg = m
await dest.send(Utils.suppressed(ctx,msg))
while True:
def littleCheck(m):
return ctx.author.id == m.author.id and self.confirmCheck(m, dest) and len(m.content)
try:
talk = await self.bot.wait_for('message', check=littleCheck, timeout=300)
except Exception:
talk = None
# See if we're still in the right context
if not hw_id == self.hwactive.get(str(ctx.author.id),None):
return None
# Hardware ended
if not self.stillHardwaring(ctx.author):
return None
if not talk:
if authorName:
msg = "*{}*, I'm out of time...".format(authorName)
else:
msg = "I'm out of time..."
await dest.send(msg)
return None
else:
# We got something
if talk.content.lower().startswith('y'):
return True
elif talk.content.lower().startswith('stop'):
if authorName:
msg = "No problem, *{}!* See you later!".format(authorName)
else:
msg = "No problem! See you later!"
await dest.send(msg)
return None
else:
return False
async def prompt(self, hw_id, ctx, message, dest = None, author = None):
# Get author name
authorName = None
if author:
if type(author) is str:
authorName = author
else:
try:
authorName = DisplayName.name(author)
except Exception:
pass
else:
if message:
try:
author = message.author
except Exception:
pass
try:
authorName = DisplayName.name(message.author)
except Exception:
pass
if not dest:
dest = ctx.channel
await dest.send(Utils.suppressed(ctx,message))
while True:
def littleCheck(m):
return ctx.author.id == m.author.id and self.channelCheck(m, dest) and len(m.content)
try:
talk = await self.bot.wait_for('message', check=littleCheck, timeout=300)
except Exception:
talk = None
# See if we're still in the right context
if not hw_id == self.hwactive.get(str(ctx.author.id),None):
return None
# Hardware ended
if not self.stillHardwaring(ctx.author):
return None
if not talk:
msg = "*{}*, I'm out of time...".format(authorName)
await dest.send(msg)
return None
else:
# Check for a stop
if talk.content.lower() == 'stop':
msg = "No problem, *{}!* See you later!".format(authorName, ctx.prefix)
await dest.send(msg)
return None
# Make sure
conf = await self.confirm(hw_id, ctx, talk, dest, "", author)
if conf == True:
# We're sure - return the value
return talk
elif conf == False:
# Not sure - ask again
return await self.prompt(hw_id, ctx, message, dest, author)
else:
# Timed out
return None
| 34.907763 | 193 | 0.641743 | 5,229 | 38,224 | 4.667049 | 0.083955 | 0.032822 | 0.022128 | 0.022128 | 0.8104 | 0.79364 | 0.767046 | 0.743239 | 0.719103 | 0.713162 | 0 | 0.003868 | 0.235768 | 38,224 | 1,094 | 194 | 34.939671 | 0.831439 | 0.094365 | 0 | 0.804762 | 0 | 0.036905 | 0.174242 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010714 | false | 0.033333 | 0.003571 | 0.004762 | 0.110714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |