hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c56bee727081ed52a7aeec0f7b767c67d3a92196 | 39 | bzl | Python | go/deps.bzl | f110/rules_extras | 23175122db6205204ff30291dc9a2d62752a1862 | [
"MIT"
] | 1 | 2021-02-09T04:19:24.000Z | 2021-02-09T04:19:24.000Z | go/deps.bzl | f110/rules_extras | 23175122db6205204ff30291dc9a2d62752a1862 | [
"MIT"
] | null | null | null | go/deps.bzl | f110/rules_extras | 23175122db6205204ff30291dc9a2d62752a1862 | [
"MIT"
] | null | null | null | def go_extras_dependencies():
pass
| 13 | 29 | 0.74359 | 5 | 39 | 5.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179487 | 39 | 2 | 30 | 19.5 | 0.84375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
3d73d0ffea34c41360b45203c041882c2d3577ad | 144 | py | Python | utils/__init__.py | fy255/perceptual_cgh | 71a999a8ca8fb355d2f8fb41f97321e48f219324 | [
"MIT"
] | 2 | 2022-02-24T16:19:17.000Z | 2022-03-27T01:13:33.000Z | utils/__init__.py | fy255/perceptual_cgh | 71a999a8ca8fb355d2f8fb41f97321e48f219324 | [
"MIT"
] | null | null | null | utils/__init__.py | fy255/perceptual_cgh | 71a999a8ca8fb355d2f8fb41f97321e48f219324 | [
"MIT"
] | 1 | 2022-03-27T01:13:34.000Z | 2022-03-27T01:13:34.000Z | from .image_processing import *
from utils.image_dataset import *
from .torch_complex import *
from .configs import *
from .slmDisplay import *
| 24 | 33 | 0.791667 | 19 | 144 | 5.842105 | 0.526316 | 0.36036 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 144 | 5 | 34 | 28.8 | 0.895161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3de3e2bfb4de170799a9ea745d11cce53ca2fe7f | 1,709 | py | Python | mapi/util/decoder.py | gmax-ws/py-mapi | 0ce2de81f680f8bd6071b4c032437948bb0e3921 | [
"MIT"
] | null | null | null | mapi/util/decoder.py | gmax-ws/py-mapi | 0ce2de81f680f8bd6071b4c032437948bb0e3921 | [
"MIT"
] | null | null | null | mapi/util/decoder.py | gmax-ws/py-mapi | 0ce2de81f680f8bd6071b4c032437948bb0e3921 | [
"MIT"
] | null | null | null | import struct
__all_ = ['int8', 'uint8', 'int16', 'uint16', 'int32',
'uint32', 'int64', 'uint64', 'utf8', 'utf16',
'uint16be', 'uint32be', 'to_hex', 'guid', 'override']
def override(f):
return f
def int8(data):
return None if data is None else struct.unpack("b", data)[0]
def uint8(data):
return None if data is None else struct.unpack("B", data)[0]
def int16(data):
return None if data is None else struct.unpack("<h", data)[0]
def uint16(data):
return None if data is None else struct.unpack("<H", data)[0]
def uint16be(data):
return None if data is None else struct.unpack(">H", data)[0]
def int32(data):
return None if data is None else struct.unpack("<i", data)[0]
def uint32(data):
return None if data is None else struct.unpack("<I", data)[0]
def uint32be(data):
return None if data is None else struct.unpack(">I", data)[0]
def int64(data):
return None if data is None else struct.unpack("<q", data)[0]
def uint64(data):
return None if data is None else struct.unpack("<Q", data)[0]
def float(data):
return None if data is None else struct.unpack("<f", data)[0]
def double(data):
return None if data is None else struct.unpack("<d", data)[0]
def utf8(data):
return None if data is None else data.decode("utf-8")
def utf16(data):
return None if data is None else data.decode("utf-16")
def to_hex(data):
return "".join(["%02X" % b for b in data])
def guid(data):
data1 = to_hex(data[0:4])
data2 = to_hex(data[4:6])
data3 = to_hex(data[6:8])
data4a = to_hex(data[8:10])
data4b = to_hex(data[10:])
return "%s-%s-%s-%s-%s" % (data1, data2, data3, data4a, data4b)
| 21.632911 | 67 | 0.632534 | 287 | 1,709 | 3.731707 | 0.184669 | 0.140056 | 0.183007 | 0.20915 | 0.634921 | 0.634921 | 0.634921 | 0.634921 | 0.634921 | 0.634921 | 0 | 0.058997 | 0.206554 | 1,709 | 78 | 68 | 21.910256 | 0.730826 | 0 | 0 | 0 | 0 | 0 | 0.079579 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.395349 | false | 0 | 0.023256 | 0.372093 | 0.813953 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
3de5427b8b9e215242e5bfe77ca12dd7bce2fa09 | 84,337 | py | Python | baidubce/services/bcc/bcc_client.py | onionsheep/bce-sdk-python | 29c260b26eb49dbbd41d205915ab25fc3529f20b | [
"Apache-2.0"
] | null | null | null | baidubce/services/bcc/bcc_client.py | onionsheep/bce-sdk-python | 29c260b26eb49dbbd41d205915ab25fc3529f20b | [
"Apache-2.0"
] | null | null | null | baidubce/services/bcc/bcc_client.py | onionsheep/bce-sdk-python | 29c260b26eb49dbbd41d205915ab25fc3529f20b | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2014 Baidu.com, Inc. All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file
# except in compliance with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed under the
# License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific language governing permissions
# and limitations under the License.
"""
This module provides a client class for BCC.
"""
from __future__ import unicode_literals
import copy
import json
import logging
import random
import string
import uuid
from baidubce import bce_base_client
from baidubce.auth import bce_v1_signer
from baidubce.http import bce_http_client
from baidubce.http import handler
from baidubce.http import http_methods
from baidubce.services.bcc import bcc_model
from baidubce.utils import aes128_encrypt_16char_key
from baidubce.utils import required
from baidubce import compat
_logger = logging.getLogger(__name__)
FETCH_MODE_SYNC = b"sync"
FETCH_MODE_ASYNC = b"async"
ENCRYPTION_ALGORITHM = "AES256"
default_billing_to_purchase_created = bcc_model.Billing('Postpaid')
default_billing_to_purchase_reserved = bcc_model.Billing()
class BccClient(bce_base_client.BceBaseClient):
"""
Bcc base sdk client
"""
prefix = b'/v2'
def __init__(self, config=None):
bce_base_client.BceBaseClient.__init__(self, config)
def _merge_config(self, config=None):
if config is None:
return self.config
else:
new_config = copy.copy(self.config)
new_config.merge_non_none_values(config)
return new_config
def _send_request(self, http_method, path,
body=None, headers=None, params=None,
config=None, body_parser=None):
config = self._merge_config(config)
if body_parser is None:
body_parser = handler.parse_json
return bce_http_client.send_request(
config, bce_v1_signer.sign, [handler.parse_error, body_parser],
http_method, BccClient.prefix + path, body, headers, params)
@required(cpu_count=int,
memory_capacity_in_gb=int,
image_id=(bytes, str)) # ***Unicode***
def create_instance(self, cpu_count, memory_capacity_in_gb, image_id, instance_type=None,
billing=None, create_cds_list=None, root_disk_size_in_gb = 0, root_disk_storage_type = None,
network_capacity_in_mbps=0, purchase_count=1, cardCount=1, name=None,
admin_pass=None, zone_name=None, subnet_id=None, security_group_id=None,
gpuCard=None, fpgaCard=None,
client_token=None, config=None):
"""
Create a bcc Instance with the specified options.
You must fill the field of clientToken,which is especially for keeping idempotent.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_type:
The specified Specification to create the instance,
See more detail on
https://cloud.baidu.com/doc/BCC/API.html#InstanceType
:type instance_type: string
:param cpu_count:
The parameter to specified the cpu core to create the instance.
:type cpu_count: int
:param memory_capacity_in_gb:
The parameter to specified the capacity of memory in GB to create the instance.
:type memory_capacity_in_gb: int
:param image_id:
The id of image, list all available image in BccClient.list_images.
:type image_id: string
:param billing:
Billing information.
:type billing: bcc_model.Billing
:param create_cds_list:
The optional list of volume detail info to create.
:type create_cds_list: list<bcc_model.CreateCdsModel>
:param network_capacity_in_mbps:
The optional parameter to specify the bandwidth in Mbps for the new instance.
It must among 0 and 200, default value is 0.
If it's specified to 0, it will get the internal ip address only.
:type network_capacity_in_mbps: int
:param purchase_count:
The number of instance to buy, the default value is 1.
:type purchase_count: int
:param name:
The optional parameter to desc the instance that will be created.
:type name: string
:param admin_pass:
The optional parameter to specify the password for the instance.
If specify the adminPass,the adminPass must be a 8-16 characters String
which must contains letters, numbers and symbols.
The symbols only contains "!@#$%^*()".
The adminPass will be encrypted in AES-128 algorithm
with the substring of the former 16 characters of user SecretKey.
If not specify the adminPass, it will be specified by an random string.
See more detail on
https://bce.baidu.com/doc/BCC/API.html#.7A.E6.31.D8.94.C1.A1.C2.1A.8D.92.ED.7F.60.7D.AF
:type admin_pass: string
:param zone_name:
The optional parameter to specify the available zone for the instance.
See more detail through list_zones method
:type zone_name: string
:param subnet_id:
The optional parameter to specify the id of subnet from vpc, optional param
default value is default subnet from default vpc
:type subnet_id: string
:param security_group_id:
The optional parameter to specify the securityGroupId of the instance
vpcId of the securityGroupId must be the same as the vpcId of subnetId
See more detail through listSecurityGroups method
:type security_group_id: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:param fpgaCard:
specify the fpgaCard info of creating FPGA instance,
see all of supported fpga card type at baidubce.services.bcc.fpga_card_type
:type gpuCard: string
:param gpuCard:
specify the gpuCard info of creating GPU instance,
see all of supported gpu card type at baidubce.services.bcc.gpu_card_type
:type gpuCard: string
:param cardCount:
The parameter to specify the card count for creating GPU/FPGA instance
:type cardCount: int
:param root_disk_size_in_gb:
The parameter to specify the root disk size in GB.
The root disk excludes the system disk, available is 40-500GB.
:type root_disk_size_in_gb: int
:param root_disk_storage_type:
The parameter to specify the root disk storage type.
Default use of HP1 cloud disk.
:type root_disk_storage_type: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/instance'
params = {}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
if billing is None:
billing = default_billing_to_purchase_created
body = {
'cpuCount': cpu_count,
'memoryCapacityInGB': memory_capacity_in_gb,
'imageId': image_id,
'billing': billing.__dict__
}
if instance_type is not None:
body['instanceType'] = instance_type
if root_disk_size_in_gb != 0:
body['rootDiskSizeInGb'] = root_disk_size_in_gb
if root_disk_storage_type is not None:
body['rootDiskStorageType'] = root_disk_storage_type
if create_cds_list is not None:
body['createCdsList'] = create_cds_list
if network_capacity_in_mbps != 0:
body['networkCapacityInMbps'] = network_capacity_in_mbps
if purchase_count > 0:
body['purchaseCount'] = purchase_count
if name is not None:
body['name'] = name
if admin_pass is not None:
secret_access_key = self.config.credentials.secret_access_key
cipher_admin_pass = aes128_encrypt_16char_key(admin_pass, secret_access_key)
body['adminPass'] = cipher_admin_pass
if zone_name is not None:
body['zoneName'] = zone_name
if subnet_id is not None:
body['subnetId'] = subnet_id
if security_group_id is not None:
body['securityGroupId'] = security_group_id
if gpuCard is not None:
body['gpuCard'] = gpuCard
body['cardCount'] = cardCount if cardCount > 1 else 1
if fpgaCard is not None:
body['fpgaCard'] = fpgaCard
body['cardCount'] = cardCount if cardCount > 1 else 1
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(cpu_count=int, memory_capacity_in_gb=int, dedicated_host_id=(bytes, str), # ***Unicode***
image_id=(bytes, str)) # ***Unicode***
def create_instance_from_dedicated_host(self, cpu_count, memory_capacity_in_gb, image_id,
dedicated_host_id, ephemeral_disks=None,
purchase_count=1, name=None, admin_pass=None,
subnet_id=None, security_group_id=None,
client_token=None, config=None):
"""
Create a Instance from dedicatedHost with the specified options.
You must fill the field of clientToken,which is especially for keeping idempotent.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param cpu_count:
The specified number of cpu core to create the instance,
is less than or equal to the remain of dedicated host.
:type cpu_count: int
:param memory_capacity_in_gb:
The capacity of memory to create the instance,
is less than or equal to the remain of dedicated host.
:type memory_capacity_in_gb: int
:param image_id:
The id of image, list all available image in BccClient.list_images.
:type image_id: string
:param dedicated_host_id:
The id of dedicated host, we can locate the instance in specified dedicated host.
:type dedicated_host_id: string
:param ephemeral_disks:
The optional list of ephemeral volume detail info to create.
:type ephemeral_disks: list<bcc_model.EphemeralDisk>
:param purchase_count:
The number of instance to buy, the default value is 1.
:type purchase_count: int
:param name:
The optional parameter to desc the instance that will be created.
:type name: string
:param admin_pass:
The optional parameter to specify the password for the instance.
If specify the adminPass,the adminPass must be a 8-16 characters String
which must contains letters, numbers and symbols.
The symbols only contains "!@#$%^*()".
The adminPass will be encrypted in AES-128 algorithm
with the substring of the former 16 characters of user SecretKey.
If not specify the adminPass, it will be specified by an random string.
See more detail on
https://bce.baidu.com/doc/BCC/API.html#.7A.E6.31.D8.94.C1.A1.C2.1A.8D.92.ED.7F.60.7D.AF
:type admin_pass: string
:param subnet_id:
The optional parameter to specify the id of subnet from vpc, optional param
default value is default subnet from default vpc
:type subnet_id: string
:param security_group_id:
The optional parameter to specify the securityGroupId of the instance
vpcId of the securityGroupId must be the same as the vpcId of subnetId
See more detail through listSecurityGroups method
:type security_group_id: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/instance'
params = {}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
body = {
'cpuCount': cpu_count,
'memoryCapacityInGB': memory_capacity_in_gb,
'imageId': image_id,
'dedicatedHostId': dedicated_host_id
}
if ephemeral_disks is not None:
body['ephemeralDisks'] = ephemeral_disks
if purchase_count > 0:
body['purchaseCount'] = purchase_count
if name is not None:
body['name'] = name
if admin_pass is not None:
body['adminPass'] = admin_pass
if subnet_id is not None:
body['subnetId'] = subnet_id
if security_group_id is not None:
body['securityGroupId'] = security_group_id
return self._send_request(http_methods.POST, path, json.dumps(body), params=params,
config=config)
def list_instances(self, marker=None, max_keys=None, internal_ip=None, dedicated_host_id=None,
zone_name=None, config=None):
"""
Return a list of instances owned by the authenticated user.
:param marker:
The optional parameter marker specified in the original request to specify
where in the results to begin listing.
Together with the marker, specifies the list result which listing should begin.
If the marker is not specified, the list result will listing from the first one.
:type marker: string
:param max_keys:
The optional parameter to specifies the max number of list result to return.
The default value is 1000.
:type max_keys: int
:param internal_ip:
The identified internal ip of instance.
:type internal_ip: string
:param dedicated_host_id:
get instance list filtered by id of dedicated host
:type dedicated_host_id: string
:param zone_name:
get instance list filtered by name of available zone
:type zone_name: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/instance'
params = {}
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
if internal_ip is not None:
params['internalIp'] = internal_ip
if dedicated_host_id is not None:
params['dedicatedHostId'] = dedicated_host_id
if zone_name is not None:
params['zoneName'] = zone_name
return self._send_request(http_methods.GET, path, params=params, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***
def get_instance(self, instance_id, config=None):
"""
Get the detail information of specified instance.
:param instance_id:
The id of instance.
:type instance_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = instance_id.encode(encoding='utf-8')
path = b'/instance/%s' % instance_id
return self._send_request(http_methods.GET, path, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***s
def start_instance(self, instance_id, config=None):
"""
Starting the instance owned by the user.
You can start the instance only when the instance is Stopped,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id: id of instance proposed to start
:type instance_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
params = {
'start': None
}
return self._send_request(http_methods.PUT, path, params=params, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***
def stop_instance(self, instance_id, force_stop=False, config=None):
"""
Stopping the instance owned by the user.
You can stop the instance only when the instance is Running,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id:
The id of instance.
:type instance_id: string
:param force_stop:
The optional parameter to stop the instance forcibly.If true,
it will stop the instance just like power off immediately
and it may result in losing important data which have not been written to disk.
:type force_stop: boolean
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'forceStop': force_stop
}
params = {
'stop': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***
def reboot_instance(self, instance_id, force_stop=False, config=None):
"""
Rebooting the instance owned by the user.
You can reboot the instance only when the instance is Running,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id:
The id of instance.
:type instance_id: string
:param force_stop:
The optional parameter to stop the instance forcibly.If true,
it will stop the instance just like power off immediately
and it may result in losing important data which have not been written to disk.
:type force_stop: boolean
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'forceStop': force_stop
}
params = {
'reboot': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str), # ***Unicode***
admin_pass=(bytes, str)) # ***Unicode***
def modify_instance_password(self, instance_id, admin_pass, config=None):
"""
Modifying the password of the instance.
You can change the instance password only when the instance is Running or Stopped ,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id:
The id of instance.
:type instance_id: string
:param admin_pass:
The new password to update.
The adminPass will be encrypted in AES-128 algorithm
with the substring of the former 16 characters of user SecretKey.
:type admin_pass: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
secret_access_key = self.config.credentials.secret_access_key
cipher_admin_pass = aes128_encrypt_16char_key(admin_pass, secret_access_key)
path = b'/instance/%s' % instance_id
body = {
'adminPass': cipher_admin_pass
}
params = {
'changePass': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str), # ***Unicode***
name=(bytes, str)) # ***Unicode***
def modify_instance_attributes(self, instance_id, name, config=None):
"""
Modifying the special attribute to new value of the instance.
You can reboot the instance only when the instance is Running or Stopped ,
otherwise, it's will get 409 errorCode.
:param instance_id:
The id of instance.
:type instance_id: string
:param name:
The new value for instance's name.
:type name: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'name': name
}
params = {
'modifyAttribute': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str),
desc=(bytes, str))
def modify_instance_desc(self, instance_id, desc, config=None):
"""
Modifying the description of the instance.
You can reboot the instance only when the instance is Running or Stopped ,
otherwise, it's will get 409 errorCode.
:param instance_id:
The id of instance.
:type instance_id: string
:param desc:
The new value for instance's description.
:type name: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'desc': desc
}
params = {
'modifyDesc': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str), # ***Unicode***
image_id=(bytes, str), # ***Unicode***
admin_pass=(bytes, str)) # ***Unicode***
def rebuild_instance(self, instance_id, image_id, admin_pass, config=None):
"""
Rebuilding the instance owned by the user.
After rebuilding the instance,
all of snapshots created from original instance system disk will be deleted,
all of customized images will be saved for using in the future.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id:
The id of instance.
:type instance_id: string
:param image_id:
The id of the image which is used to rebuild the instance.
:type image_id: string
:param admin_pass:
The admin password to login the instance.
The admin password will be encrypted in AES-128 algorithm
with the substring of the former 16 characters of user SecretKey.
See more detail on
https://bce.baidu.com/doc/BCC/API.html#.7A.E6.31.D8.94.C1.A1.C2.1A.8D.92.ED.7F.60.7D.AF
:type admin_pass: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
secret_access_key = self.config.credentials.secret_access_key
cipher_admin_pass = aes128_encrypt_16char_key(admin_pass, secret_access_key)
path = b'/instance/%s' % instance_id
body = {
'imageId': image_id,
'adminPass': cipher_admin_pass
}
params = {
'rebuild': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***
def release_instance(self, instance_id, config=None):
"""
Releasing the instance owned by the user.
Only the Postpaid instance or Prepaid which is expired can be released.
After releasing the instance,
all of the data will be deleted.
all of volumes attached will be auto detached, but the volume snapshots will be saved.
all of snapshots created from original instance system disk will be deleted,
all of customized images created from original instance system disk will be reserved.
:param instance_id:
The id of instance.
:type instance_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
return self._send_request(http_methods.DELETE, path, config=config)
@required(instance_id=(bytes, str), # ***Unicode***
cpu_count=int,
memory_capacity_in_gb=int)
def resize_instance(self, instance_id, cpu_count, memory_capacity_in_gb,
client_token=None, config=None):
"""
Resizing the instance owned by the user.
The Prepaid instance can not be downgrade.
Only the Running/Stopped instance can be resized, otherwise, it's will get 409 errorCode.
After resizing the instance,it will be reboot once.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id:
The id of instance.
:type instance_id: string
:param cpu_count:
The parameter of specified the cpu core to resize the instance.
:type cpu_count: int
:param memory_capacity_in_gb:
The parameter of specified the capacity of memory in GB to resize the instance.
:type memory_capacity_in_gb: int
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'cpuCount': cpu_count,
'memoryCapacityInGB': memory_capacity_in_gb
}
params = None
if client_token is None:
params = {
'resize': None,
'clientToken': generate_client_token()
}
else:
params = {
'resize': None,
'clientToken': client_token
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str), # ***Unicode***
security_group_id=(bytes, str)) # ***Unicode***
def bind_instance_to_security_group(self, instance_id, security_group_id, config=None):
"""
Binding the instance to specified securitygroup.
:param instance_id:
The id of the instance.
:type instance_id: string
:param securitygroup_id:
The id of the securitygroup.
:type securitygroup_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'securityGroupId': security_group_id
}
params = {
'bind': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str), # ***Unicode***
security_group_id=(bytes, str)) # ***Unicode***
def unbind_instance_from_security_group(self, instance_id, security_group_id, config=None):
"""
Unbinding the instance from securitygroup.
:param instance_id:
The id of the instance.
:type instance_id: string
:param securitygroup_id:
The id of the securitygroup.
:type securitygroup_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
body = {
'securityGroupId': security_group_id
}
params = {
'unbind': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str),
tags=list)
def bind_instance_to_tags(self, instance_id, tags, config=None):
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s/tag' % instance_id
tag_list = [tag.__dict__ for tag in tags]
body = {
'changeTags': tag_list
}
params = {
'bind': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str),
tags=list)
def unbind_instance_from_tags(self, instance_id, tags, config=None):
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s/tag' % instance_id
tag_list = [tag.__dict__ for tag in tags]
body = {
'changeTags': tag_list
}
params = {
'unbind': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***
def get_instance_vnc(self, instance_id, config=None):
"""
Getting the vnc url to access the instance.
The vnc url can be used once.
:param instance_id:
The id of the instance.
:type instance_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s/vnc' % instance_id
return self._send_request(http_methods.GET, path, config=config)
@required(instance_id=(bytes, str)) # ***Unicode***
def purchase_reserved_instance(self,
instance_id,
billing=None,
client_token=None,
config=None):
"""
PurchaseReserved the instance with fixed duration.
You can not purchaseReserved the instance which is resizing.
This is an asynchronous interface,
you can get the latest status by BccClient.get_instance.
:param instance_id:
The id of the instance.
:type instance_id: string
:param billing:
Billing information.
:type billing: bcc_model.Billing
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
instance_id = compat.convert_to_bytes(instance_id)
path = b'/instance/%s' % instance_id
if billing is None:
billing = default_billing_to_purchase_reserved
body = {
'billing': billing.__dict__
}
params = None
if client_token is None:
params = {
'purchaseReserved': None,
'clientToken': generate_client_token()
}
else:
params = {
'purchaseReserved': None,
'clientToken': client_token
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
def list_instance_specs(self, config=None):
"""
The interface will be deprecated in the future,
we suggest to use triad (instanceType, cpuCount, memoryCapacityInGB) to specified the instance configuration.
Listing all of specification for instance resource to buy.
See more detail on
https://bce.baidu.com/doc/BCC/API.html#.E5.AE.9E.E4.BE.8B.E5.A5.97.E9.A4.90.E8.A7.84.E6.A0.BC
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/instance/spec'
return self._send_request(http_methods.GET, path, config=config)
@required(cds_size_in_gb=int)
def create_volume_with_cds_size(self, cds_size_in_gb, billing=None, purchase_count=1,
storage_type='hp1', zone_name=None, client_token=None,
config=None):
"""
Create a volume with the specified options.
You can use this method to create a new empty volume by specified options
or you can create a new volume from customized volume snapshot but not system disk snapshot.
By using the cdsSizeInGB parameter you can create a newly empty volume.
By using snapshotId parameter to create a volume form specific snapshot.
:param cds_size_in_gb:
The size of volume to create in GB.
By specifying the snapshotId,
it will create volume from the specified snapshot and the parameter cdsSizeInGB will be ignored.
:type cds_size_in_gb: int
:param billing:
Billing information.
:type billing: bcc_model.Billing
:param purchase_count:
The optional parameter to specify how many volumes to buy, default value is 1.
The maximum to create for one time is 5.
:type purchase_count: int
:param storage_type:
The storage type of volume, see more detail in
https://bce.baidu.com/doc/BCC/API.html#StorageType
:type storage_type: menu{'hp1', 'std1'}
:param zone_name:
The optional parameter to specify the available zone for the volume.
See more detail through list_zones method
:type zone_name: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/volume'
params = {}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
if billing is None:
billing = default_billing_to_purchase_created
body = {
'cdsSizeInGB': cds_size_in_gb,
'billing': billing.__dict__
}
if purchase_count is not None:
body['purchaseCount'] = purchase_count
if storage_type is not None:
body['storageType'] = storage_type
if zone_name is not None:
body['zoneName'] = zone_name
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(snapshot_id=(bytes, str)) # ***Unicode***
def create_volume_with_snapshot_id(self, snapshot_id, billing=None, purchase_count=1,
storage_type='hp1', zone_name=None, client_token=None,
config=None):
"""
Create a volume with the specified options.
You can use this method to create a new empty volume by specified options
or you can create a new volume from customized volume snapshot but not system disk snapshot.
By using the cdsSizeInGB parameter you can create a newly empty volume.
By using snapshotId parameter to create a volume form specific snapshot.
:param snapshot_id:
The id of snapshot.
By specifying the snapshotId,
it will create volume from the specified snapshot and the parameter cdsSizeInGB will be ignored.
:type snapshot_id: string
:param billing:
Billing information.
:type billing: bcc_model.Billing
:param purchase_count:
The optional parameter to specify how many volumes to buy, default value is 1.
The maximum to create for one time is 5.
:type purchase_count: int
:param storage_type:
The storage type of volume, see more detail in
https://bce.baidu.com/doc/BCC/API.html#StorageType
:type storage_type: menu{'hp1', 'std1'}
:param zone_name:
The optional parameter to specify the available zone for the volume.
See more detail through list_zones method
:type zone_name: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/volume'
params = {}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
if billing is None:
billing = default_billing_to_purchase_created
body = {
'snapshotId': snapshot_id,
'billing': billing.__dict__
}
if purchase_count is not None:
body['purchaseCount'] = purchase_count
if storage_type is not None:
body['storageType'] = storage_type
if zone_name is not None:
body['zoneName'] = zone_name
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
def list_volumes(self, instance_id=None, zone_name=None, marker=None, max_keys=None,
config=None):
"""
Listing volumes owned by the authenticated user.
:param instance_id:
The id of instance. The optional parameter to list the volume.
If it's specified,only the volumes attached to the specified instance will be listed.
:type instance_id: string
:param zone_name:
The name of available zone. The optional parameter to list volumes
:type zone_name: string
:param marker:
The optional parameter marker specified in the original request to specify
where in the results to begin listing.
Together with the marker, specifies the list result which listing should begin.
If the marker is not specified, the list result will listing from the first one.
:type marker: string
:param max_keys:
The optional parameter to specifies the max number of list result to return.
The default value is 1000.
:type max_keys: int
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/volume'
params = {}
if instance_id is not None:
params['instanceId'] = instance_id
if zone_name is not None:
params['zoneName'] = zone_name
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
return self._send_request(http_methods.GET, path, params=params, config=config)
@required(volume_id=(bytes, str)) # ***Unicode***
def get_volume(self, volume_id, config=None):
"""
Get the detail information of specified volume.
:param volume_id:
The id of the volume.
:type volume_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
return self._send_request(http_methods.GET, path, config=config)
@required(volume_id=(bytes, str), # ***Unicode***
instance_id=(bytes, str)) # ***Unicode***
def attach_volume(self, volume_id, instance_id, config=None):
"""
Attaching the specified volume to a specified instance.
You can attach the specified volume to a specified instance only
when the volume is Available and the instance is Running or Stopped,
otherwise, it's will get 409 errorCode.
:param volume_id:
The id of the volume which will be attached to specified instance.
:type volume_id: string
:param instance_id:
The id of the instance which will be attached with a volume.
:type instance_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
body = {
'instanceId': instance_id
}
params = {
'attach': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(volume_id=(bytes, str), # ***Unicode***
instance_id=(bytes, str)) # ***Unicode***
def detach_volume(self, volume_id, instance_id, config=None):
"""
Detaching the specified volume from a specified instance.
You can detach the specified volume from a specified instance only
when the instance is Running or Stopped ,
otherwise, it's will get 409 errorCode.
:param volume_id:
The id of the volume which will be attached to specified instance.
:type volume_id: string
:param instance_id:
The id of the instance which will be attached with a volume.
:type instance_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
body = {
'instanceId': instance_id
}
params = {
'detach': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(volume_id=(bytes, str)) # ***Unicode***
def release_volume(self, volume_id, config=None):
"""
Releasing the specified volume owned by the user.
You can release the specified volume only
when the instance is among state of Available/Expired/Error,
otherwise, it's will get 409 errorCode.
:param volume_id:
The id of the volume which will be released.
:type volume_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
return self._send_request(http_methods.DELETE, path, config=config)
@required(volume_id=(bytes, str), # ***Unicode***
new_cds_size=int)
def resize_volume(self, volume_id, new_cds_size, client_token=None, config=None):
"""
Resizing the specified volume with newly size.
You can resize the specified volume only when the volume is Available,
otherwise, it's will get 409 errorCode.
The prepaid volume can not be downgrade.
This is an asynchronous interface,
you can get the latest status by BccClient.get_volume.
:param volume_id:
The id of volume which you want to resize.
:type volume_id: string
:param new_cds_size:
The new volume size you want to resize in GB.
:type new_cds_size: int
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
body = {
'newCdsSizeInGB': new_cds_size
}
params = None
if client_token is None:
params = {
'resize': None,
'clientToken': generate_client_token()
}
else:
params = {
'resize': None,
'clientToken': client_token
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(volume_id=(bytes, str), # ***Unicode***
snapshot_id=(bytes, str)) # ***Unicode***
def rollback_volume(self, volume_id, snapshot_id, config=None):
"""
Rollback the volume with the specified volume snapshot.
You can rollback the specified volume only when the volume is Available,
otherwise, it's will get 409 errorCode.
The snapshot used to rollback must be created by the volume,
otherwise,it's will get 404 errorCode.
If rolling back the system volume,the instance must be Running or Stopped,
otherwise, it's will get 409 errorCode.After rolling back the
volume,all the system disk data will erase.
:param volume_id:
The id of volume which will be rollback.
:type volume_id: string
:param snapshot_id:
The id of snapshot which will be used to rollback the volume.
:type snapshot_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
body = {
'snapshotId': snapshot_id
}
params = {
'rollback': None,
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(volume_id=(bytes, str)) # ***Unicode***
def purchase_reserved_volume(self,
volume_id,
billing=None,
client_token=None,
config=None):
"""
PurchaseReserved the instance with fixed duration.
You can not purchaseReserved the instance which is resizing.
This is an asynchronous interface,
you can get the latest status by BccClient.get_volume.
:param volume_id:
The id of volume which will be renew.
:type volume_id: string
:param billing:
Billing information.
:type billing: bcc_model.Billing
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
if billing is None:
billing = default_billing_to_purchase_reserved
body = {
'billing': billing.__dict__
}
params = None
if client_token is None:
params = {
'purchaseReserved': None,
'clientToken': generate_client_token()
}
else:
params = {
'purchaseReserved': None,
'clientToken': client_token
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(volume_id=(bytes, str),
name=(bytes, str),
desc=(bytes, str))
def modify_volume_Attribute(self,
volume_id,
cdsName,
config=None):
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
body = {
'cdsName': cdsName
}
params = {
'modify': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(volume_id=(bytes, str))
def modify_volume_charge_type(self,
volume_id,
billing=None,
config=None):
volume_id = compat.convert_to_bytes(volume_id)
path = b'/volume/%s' % volume_id
if billing is None:
billing = default_billing_to_purchase_reserved
body = {
'billing': billing.__dict__
}
params = {
'modifyChargeType': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(image_name=(bytes, str), # ***Unicode***
instance_id=(bytes, str)) # ***Unicode***
def create_image_from_instance_id(self,
image_name,
instance_id,
client_token=None,
config=None):
"""
Creating a customized image which can be used for creating instance.
You can create an image from an instance with this method.
While creating an image from an instance, the instance must be Running or Stopped,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_image.
:param image_name:
The name for the image that will be created.
The name length from 1 to 65,only contains letters,digital and underline.
:type image_name: string
:param instance_id:
The optional parameter specify the id of the instance which will be used to create the new image.
When instanceId and snapshotId are specified ,only instanceId will be used.
:type instance_id: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/image'
params = None
if client_token is None:
params = {
'clientToken': generate_client_token()
}
else:
params = {
'clientToken': client_token
}
body = {
'imageName': image_name,
'instanceId': instance_id
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(image_name=(bytes, str), # ***Unicode***
snapshot_id=(bytes, str)) # ***Unicode***
def create_image_from_snapshot_id(self,
image_name,
snapshot_id,
client_token=None,
config=None):
"""
Creating a customized image which can be used for creating instance.
You can create an image from an snapshot with tihs method.
You can create the image only from system snapshot.
While creating an image from a system snapshot,the snapshot must be Available,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_image.
:param image_name:
The name for the image that will be created.
The name length from 1 to 65,only contains letters,digital and underline.
:type image_name: string
:param snapshot_id:
The optional parameter specify the id of the snapshot which will be used to create the new image.
When instanceId and snapshotId are specified ,only instanceId will be used.
:type snapshot_id: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/image'
params = {}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
body = {
'imageName': image_name,
'snapshotId': snapshot_id
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
def list_images(self, image_type='All', marker=None, max_keys=None, config=None):
"""
Listing images owned by the authenticated user.
:param image_type:
The optional parameter to filter image to list.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#ImageType"
:type image_type: menu{'All', System', 'Custom', 'Integration'}
:param marker:
The optional parameter marker specified in the original request to specify
where in the results to begin listing.
Together with the marker, specifies the list result which listing should begin.
If the marker is not specified, the list result will listing from the first one.
:type marker: string
:param max_keys:
The optional parameter to specifies the max number of list result to return.
The default value is 1000.
:type max_keys: int
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/image'
params = {
'imageType': image_type
}
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
return self._send_request(http_methods.GET, path, params=params, config=config)
@required(image_id=(bytes, str)) # ***Unicode***
def get_image(self, image_id, config=None):
"""
Get the detail information of specified image.
:param image_id:
The id of image.
:type image_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s' % image_id
return self._send_request(http_methods.GET, path, config=config)
@required(image_id=(bytes, str)) # ***Unicode***
def delete_image(self, image_id, config=None):
"""
Deleting the specified image.
Only the customized image can be deleted,
otherwise, it's will get 403 errorCode.
:param image_id:
The id of image.
:type image_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s' % image_id
return self._send_request(http_methods.DELETE, path, config=config)
@required(image_id=(bytes, str),
name=(bytes, str),
destRegions=list)
def remote_copy_image(self,
image_id,
name,
destRegions,
config=None):
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s' % image_id
body = {
'name': name,
'destRegion': destRegions
}
params = {
'remoteCopy': None
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(image_id=(bytes, str))
def cancle_remote_copy_image(self,
image_id,
config=None):
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s' % image_id
params = {
'cancelRemoteCopy': None
}
return self._send_request(http_methods.POST, path, params=params, config=config)
@required(image_id=(bytes, str))
def share_image(self,
image_id,
account=None,
account_id=None,
config=None):
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s' % image_id
body = {}
if account is not None:
body['account'] = account
if account_id is not None:
body['accountId'] = account_id
params = {
'share': None
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(image_id=(bytes, str))
def unshare_image(self,
image_id,
account=None,
account_id=None,
config=None):
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s' % image_id
body = {}
if account is not None:
body['account'] = account
if account_id is not None:
body['accountId'] = account_id
params = {
'unshare': None
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(image_id=(bytes, str))
def list_shared_user(self,
image_id,
config=None):
image_id = compat.convert_to_bytes(image_id)
path = b'/image/%s/sharedUsers' % image_id
return self._send_request(http_methods.GET, path, config=config)
@required(instance_ids=list)
def list_os(self,
instance_ids=None,
config=None):
path = b'/image/os'
instance_id_list = instance_ids
body = {
'instanceIds': instance_id_list
}
return self._send_request(http_methods.POST, path, json.dumps(body), config=config)
@required(volume_id=(bytes, str), # ***Unicode***
snapshot_name=(bytes, str)) # ***Unicode***
def create_snapshot(self,
volume_id,
snapshot_name,
desc=None,
client_token=None,
config=None):
"""
Creating snapshot from specified volume.
You can create snapshot from system volume and CDS volume.
While creating snapshot from system volume,the instance must be Running or Stopped,
otherwise, it's will get 409 errorCode.
While creating snapshot from CDS volume, the volume must be InUs or Available,
otherwise, it's will get 409 errorCode.
This is an asynchronous interface,
you can get the latest status by BccClient.get_snapshot.
:param volume_id:
The id which specify where the snapshot will be created from.
If you want to create an snapshot from a customized volume, a id of the volume will be set.
If you want to create an snapshot from a system volume, a id of the instance will be set.
:type volume_id: string
:param snapshot_name:
The name for the snapshot that will be created.
The name length from 1 to 65,only contains letters,digital and underline.
:type snapshot_name: string
:param desc:
The optional parameter to describe the information of the new snapshot.
:type desc: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/snapshot'
params = None
if client_token is None:
params = {
'clientToken': generate_client_token()
}
else:
params = {
'clientToken': client_token
}
body = {
'volumeId': volume_id,
'snapshotName': snapshot_name
}
if desc is not None:
body['desc'] = desc
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
def list_snapshots(self, marker=None, max_keys=None, volume_id=None, config=None):
"""
List snapshots
:param marker:
The optional parameter marker specified in the original request to specify
where in the results to begin listing.
Together with the marker, specifies the list result which listing should begin.
If the marker is not specified, the list result will listing from the first one.
:type params: string
:param max_keys:
The optional parameter to specifies the max number of list result to return.
The default value is 1000.
:type params: int
:param volume_id:
The id of the volume.
:type volume_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/snapshot'
params = None
if marker is not None or max_keys is not None or volume_id is not None:
params = {}
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
if volume_id is not None:
params['volumeId'] = volume_id
return self._send_request(http_methods.GET, path, params=params, config=config)
@required(snapshot_id=(bytes, str)) # ***Unicode***
def get_snapshot(self, snapshot_id, config=None):
"""
Get the detail information of specified snapshot.
:param snapshot_id:
The id of snapshot.
:type snapshot_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
snapshot_id = compat.convert_to_bytes(snapshot_id)
path = b'/snapshot/%s' % snapshot_id
return self._send_request(http_methods.GET, path, config=config)
@required(snapshot_id=(bytes, str)) # ***Unicode***
def delete_snapshot(self, snapshot_id, config=None):
"""
Deleting the specified snapshot.
Only when the snapshot is CreatedFailed or Available,the specified snapshot can be deleted.
otherwise, it's will get 403 errorCode.
:param snapshot_id:
The id of snapshot.
:type snapshot_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
snapshot_id = compat.convert_to_bytes(snapshot_id)
path = b'/snapshot/%s' % snapshot_id
return self._send_request(http_methods.DELETE, path, config=config)
@required(name=(bytes, str), # ***Unicode***
rules=list)
def create_security_group(self,
name,
rules,
vpc_id=None,
desc=None,
client_token=None,
config=None):
"""
Creating a newly SecurityGroup with specified rules.
:param name:
The name of SecurityGroup that will be created.
:type name: string
:param rules:
The list of rules which define how the SecurityGroup works.
:type rules: list<bcc_model.SecurityGroupRuleModel>
:param vpc_id:
The optional parameter to specify the id of VPC to SecurityGroup
:type vpc_id: string
:param desc:
The optional parameter to describe the SecurityGroup that will be created.
:type desc: string
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/securityGroup'
params = None
if client_token is None:
params = {
'clientToken': generate_client_token()
}
else:
params = {
'clientToken': client_token
}
rule_list = [rule.__dict__ for rule in rules]
body = {
'name': name,
'rules': rule_list
}
if vpc_id is not None:
body['vpcId'] = vpc_id
if desc is not None:
body['desc'] = desc
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
def list_security_groups(self, instance_id=None, vpc_id=None, marker=None, max_keys=None,
config=None):
"""
Listing SecurityGroup owned by the authenticated user.
:param instance_id:
The id of instance. The optional parameter to list the SecurityGroup.
If it's specified,only the SecurityGroup related to the specified instance will be listed
:type instance_id: string
:param vpc_id:
filter by vpcId, optional parameter
:type vpc_id: string
:param marker:
The optional parameter marker specified in the original request to specify
where in the results to begin listing.
Together with the marker, specifies the list result which listing should begin.
If the marker is not specified, the list result will listing from the first one.
:type marker: string
:param max_keys:
The optional parameter to specifies the max number of list result to return.
The default value is 1000.
:type max_keys: int
:return:
:rtype baidubce.bce_response.BceResponse
"""
path = b'/securityGroup'
params = None
if instance_id is not None or marker is not None or max_keys is not None:
params = {}
if instance_id is not None:
params['instanceId'] = instance_id
if vpc_id is not None:
params['vpcId'] = vpc_id
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
return self._send_request(http_methods.GET, path,
params=params, config=config)
@required(security_group_id=(bytes, str)) # ***Unicode***
def delete_security_group(self, security_group_id, config=None):
"""
Deleting the specified SecurityGroup.
:param security_group_id:
The id of SecurityGroup that will be deleted.
:type security_group_id: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
security_group_id = compat.convert_to_bytes(security_group_id)
path = b'/securityGroup/%s' % security_group_id
return self._send_request(http_methods.DELETE, path, config=config)
@required(security_group_id=(bytes, str), # ***Unicode***
rule=bcc_model.SecurityGroupRuleModel)
def authorize_security_group_rule(self, security_group_id, rule, client_token=None,
config=None):
"""
authorize a security group rule to the specified security group
:param security_group_id:
The id of SecurityGroup that will be authorized.
:type security_group_id: string
:param rule:
security group rule detail.
Through protocol/portRange/direction/sourceIp/sourceGroupId, we can confirmed only one rule.
:type rule: bcc_model.SecurityGroupRuleModel
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
security_group_id = compat.convert_to_bytes(security_group_id)
path = b'/securityGroup/%s' % security_group_id
params = {'authorizeRule': ''}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
body = {
'rule': rule.__dict__
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(security_group_id=(bytes, str), # ***Unicode***
rule=bcc_model.SecurityGroupRuleModel)
def revoke_security_group_rule(self, security_group_id, rule, client_token=None, config=None):
"""
revoke a security group rule from the specified security group
:param security_group_id:
The id of SecurityGroup that will be revoked.
:type security_group_id: string
:param rule:
security group rule detail.
Through protocol/portRange/direction/sourceIp/sourceGroupId, we can confirmed only one rule.
:type rule: bcc_model.SecurityGroupRuleModel
:param client_token:
An ASCII string whose length is less than 64.
The request will be idempotent if client token is provided.
If the clientToken is not specified by the user,
a random String generated by default algorithm will be used.
See more detail at
https://bce.baidu.com/doc/BCC/API.html#.E5.B9.82.E7.AD.89.E6.80.A7
:type client_token: string
:return:
:rtype baidubce.bce_response.BceResponse
"""
security_group_id = compat.convert_to_bytes(security_group_id)
path = b'/securityGroup/%s' % security_group_id
params = {'revokeRule': ''}
if client_token is None:
params['clientToken'] = generate_client_token()
else:
params['clientToken'] = client_token
body = {
'rule': rule.__dict__
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
def list_zones(self, config=None):
"""
Get zone detail list within current region
:param config:
:return:
"""
path = b'/zone'
return self._send_request(http_methods.GET, path, config=config)
@required(asp_name=(bytes, str),
time_points=list,
repeat_week_days=list,
retention_days=(bytes, str))
def create_asp(self,
asp_name=None,
time_points=None,
repeat_week_days=None,
retention_days=None,
client_token=None,
config=None):
path = b'/asp'
params = None
if client_token is None:
params = {
'clientToken': generate_client_token()
}
else:
params = {
'clientToken': client_token
}
body = {
'name': asp_name,
'timePoints': time_points,
'repeatWeekdays': repeat_week_days,
'retentionDays': retention_days
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(asp_id=(bytes, str),
volume_ids=list)
def attach_asp(self,
asp_id=None,
volume_ids=None,
config=None):
asp_id = compat.convert_to_bytes(asp_id)
path = b'/asp/%s' % asp_id
body = {
'volumeIds': volume_ids
}
params = {
'attach': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(asp_id=(bytes, str),
volume_ids=list)
def detach_asp(self,
asp_id=None,
volume_ids=None,
config=None):
asp_id = compat.convert_to_bytes(asp_id)
path = b'/asp/%s' % asp_id
body = {
'volumeIds': volume_ids
}
params = {
'detach': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(asp_id=(bytes, str))
def delete_asp(self,
asp_id=None,
config=None):
asp_id = compat.convert_to_bytes(asp_id)
path = b'/asp/%s' % asp_id
return self._send_request(http_methods.DELETE, path, config=config)
def list_asps(self, marker=None, max_keys=None, asp_name=None, volume_name=None, config=None):
path = b'/asp'
params = None
if marker is not None or max_keys is not None or asp_name is not None or volume_name is not None:
params = {}
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
if asp_name is not None:
params['aspName'] = asp_name
if volume_name is not None:
params['volumeName'] = volume_name
return self._send_request(http_methods.GET, path, params=params, config=config)
@required(asp_id=(bytes, str))
def get_asp(self, asp_id=None, config=None):
asp_id = compat.convert_to_bytes(asp_id)
path = b'/asp/%s' % asp_id
return self._send_request(http_methods.GET, path, config=config)
@required(keypair_name=(bytes, str))
def create_keypair(self,
keypair_name=None,
keypair_desc=None,
config=None):
path = b'/keypair'
body = {
'name': keypair_name,
'description': keypair_desc
}
params = {
'create': None
}
return self._send_request(http_methods.POST, path, json.dumps(body),
params=params, config=config)
@required(keypair_name=(bytes, str),
public_key=(bytes, str))
def import_keypair(self,
keypair_name=None,
keypair_desc=None,
public_key=None,
config=None):
path = b'/keypair'
body = {
'name': keypair_name,
'publicKey': public_key
}
if keypair_desc is not None:
body['description'] = keypair_desc
params = {
'import': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
def list_keypairs(self, marker=None, max_keys=None, config=None):
path = b'/keypair'
params = None
if marker is not None or max_keys is not None:
params = {}
if marker is not None:
params['marker'] = marker
if max_keys is not None:
params['maxKeys'] = max_keys
return self._send_request(http_methods.GET, path, params=params, config=config)
@required(keypair_id=(bytes, str))
def get_keypair(self, keypair_id=None, config=None):
keypair_id = compat.convert_to_bytes(keypair_id)
path = b'/keypair/%s' % keypair_id
return self._send_request(http_methods.GET, path, config=config)
@required(keypair_id=(bytes, str),
instance_ids=list)
def attach_keypair(self,
keypair_id=None,
instance_ids=None,
config=None):
keypair_id = compat.convert_to_bytes(keypair_id)
path = b'/keypair/%s' % keypair_id
body = {
'instanceIds': instance_ids
}
params = {
'attach': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(keypair_id=(bytes, str),
instance_id=list)
def detach_keypair(self,
keypair_id=None,
instance_ids=None,
config=None):
keypair_id = compat.convert_to_bytes(keypair_id)
path = b'/keypair/%s' % keypair_id
body = {
'instanceIds': instance_ids
}
params = {
'detach': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(keypair_id=(bytes, str))
def delete_keypair(self,
keypair_id=None,
config=None):
keypair_id = compat.convert_to_bytes(keypair_id)
path = b'/keypair/%s' % keypair_id
return self._send_request(http_methods.DELETE, path, config=config)
@required(keypair_id=(bytes, str),
keypair_name=(bytes, str))
def rename_keypair(self,
keypair_id=None,
keypair_name=None,
config=None):
keypair_id = compat.convert_to_bytes(keypair_id)
path = b'/keypair/%s' % keypair_id
body = {
'name': keypair_name
}
params = {
'rename': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
@required(keypair_id=(bytes, str),
keypair_desc=(bytes, str))
def update_keypair_desc(self,
keypair_id=None,
keypair_desc=None,
config=None):
keypair_id = compat.convert_to_bytes(keypair_id)
path = b'/keypair/%s' % keypair_id
body = {
'description': keypair_desc
}
params = {
'updateDesc': None
}
return self._send_request(http_methods.PUT, path, json.dumps(body),
params=params, config=config)
def generate_client_token_by_uuid():
"""
The default method to generate the random string for client_token
if the optional parameter client_token is not specified by the user.
:return:
:rtype string
"""
return str(uuid.uuid4())
def generate_client_token_by_random():
"""
The alternative method to generate the random string for client_token
if the optional parameter client_token is not specified by the user.
:return:
:rtype string
"""
client_token = ''.join(random.sample(string.ascii_letters + string.digits, 36))
return client_token
generate_client_token = generate_client_token_by_uuid
| 37.038647 | 117 | 0.59318 | 10,075 | 84,337 | 4.804963 | 0.056377 | 0.028507 | 0.019665 | 0.029498 | 0.8371 | 0.803863 | 0.771246 | 0.760938 | 0.738133 | 0.710246 | 0 | 0.007922 | 0.33244 | 84,337 | 2,276 | 118 | 37.054921 | 0.851936 | 0.405504 | 0 | 0.676301 | 0 | 0 | 0.057719 | 0.000973 | 0 | 0 | 0 | 0 | 0 | 1 | 0.070328 | false | 0.015414 | 0.017341 | 0 | 0.159923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9aa73320a32658fc4d17087353089c56c4dd9b53 | 38,508 | py | Python | pybind/nos/v6_0_2c/interface_vlan/interface/vlan/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/nos/v6_0_2c/interface_vlan/interface/vlan/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/nos/v6_0_2c/interface_vlan/interface/vlan/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import private_vlan
import ip
import ipv6
import mac
import spanning_tree
class vlan(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-interface - based on the path /interface-vlan/interface/vlan. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: The list of vlans in the managed device. Each row
represents a vlan. User can create/delete an entry in
to this list.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__name','__transport_service','__ifindex','__description','__vlan_name','__private_vlan','__ip','__ipv6','__mac','__remote_span','__spanning_tree',)
_yang_name = 'vlan'
_rest_name = 'Vlan'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__spanning_tree = YANGDynClass(base=spanning_tree.spanning_tree, is_container='container', presence=False, yang_name="spanning-tree", rest_name="spanning-tree", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Spanning tree commands', u'sort-priority': u'98', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-xstp', defining_module='brocade-xstp', yang_type='container', is_config=True)
self.__name = YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..8191']}), is_leaf=True, yang_name="name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-suppress-range': None, u'cli-custom-range': None}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='vlan-type', is_config=True)
self.__ip = YANGDynClass(base=ip.ip, is_container='container', presence=False, yang_name="ip", rest_name="ip", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol (IP).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
self.__ipv6 = YANGDynClass(base=ipv6.ipv6, is_container='container', presence=False, yang_name="ipv6", rest_name="ipv6", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol v6 (IPv6).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
self.__vlan_name = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 32']}), is_leaf=True, yang_name="vlan-name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Alternative name for the VLAN', u'cli-multi-value': None, u'alt-name': u'name'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)
self.__mac = YANGDynClass(base=mac.mac, is_container='container', presence=False, yang_name="mac", rest_name="mac", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure MAC parameters', u'callpoint': u'MacaclAccessgroupIntVlanCP', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_MAC_ACL_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-mac-access-list', defining_module='brocade-mac-access-list', yang_type='container', is_config=True)
self.__transport_service = YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), restriction_dict={'range': [u'1..1000']}), is_leaf=True, yang_name="transport-service", rest_name="transport-service", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u' Set tlsid for Transparent vlan ', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_TRANSPORT_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint16', is_config=True)
self.__private_vlan = YANGDynClass(base=private_vlan.private_vlan, is_container='container', presence=False, yang_name="private-vlan", rest_name="private-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure vlan as private vlan', u'cli-full-no': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_PVLAN_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
self.__ifindex = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="ifindex", rest_name="ifindex", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint64', is_config=False)
self.__remote_span = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="remote-span", rest_name="rspan-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure the vlan as rspan vlan', u'alt-name': u'rspan-vlan', u'callpoint': u'interface_vlan'}}, namespace='urn:brocade.com:mgmt:brocade-span', defining_module='brocade-span', yang_type='empty', is_config=True)
self.__description = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 63']}), is_leaf=True, yang_name="description", rest_name="description", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Interface specific description', u'cli-multi-value': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_BASIC_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'interface-vlan', u'interface', u'vlan']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'interface', u'Vlan']
def _get_name(self):
"""
Getter method for name, mapped from YANG variable /interface_vlan/interface/vlan/name (vlan-type)
YANG Description: The vlan identifier.
"""
return self.__name
def _set_name(self, v, load=False):
"""
Setter method for name, mapped from YANG variable /interface_vlan/interface/vlan/name (vlan-type)
If this variable is read-only (config: false) in the
source YANG file, then _set_name is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_name() directly.
YANG Description: The vlan identifier.
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..8191']}), is_leaf=True, yang_name="name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-suppress-range': None, u'cli-custom-range': None}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='vlan-type', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """name must be of a type compatible with vlan-type""",
'defined-type': "brocade-interface:vlan-type",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..8191']}), is_leaf=True, yang_name="name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-suppress-range': None, u'cli-custom-range': None}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='vlan-type', is_config=True)""",
})
self.__name = t
if hasattr(self, '_set'):
self._set()
def _unset_name(self):
self.__name = YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..8191']}), is_leaf=True, yang_name="name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-suppress-range': None, u'cli-custom-range': None}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='vlan-type', is_config=True)
def _get_transport_service(self):
"""
Getter method for transport_service, mapped from YANG variable /interface_vlan/interface/vlan/transport_service (uint16)
YANG Description: Transparent vlan
"""
return self.__transport_service
def _set_transport_service(self, v, load=False):
"""
Setter method for transport_service, mapped from YANG variable /interface_vlan/interface/vlan/transport_service (uint16)
If this variable is read-only (config: false) in the
source YANG file, then _set_transport_service is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_transport_service() directly.
YANG Description: Transparent vlan
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), restriction_dict={'range': [u'1..1000']}), is_leaf=True, yang_name="transport-service", rest_name="transport-service", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u' Set tlsid for Transparent vlan ', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_TRANSPORT_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint16', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """transport_service must be of a type compatible with uint16""",
'defined-type': "uint16",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), restriction_dict={'range': [u'1..1000']}), is_leaf=True, yang_name="transport-service", rest_name="transport-service", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u' Set tlsid for Transparent vlan ', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_TRANSPORT_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint16', is_config=True)""",
})
self.__transport_service = t
if hasattr(self, '_set'):
self._set()
def _unset_transport_service(self):
self.__transport_service = YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), restriction_dict={'range': [u'1..1000']}), is_leaf=True, yang_name="transport-service", rest_name="transport-service", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u' Set tlsid for Transparent vlan ', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_TRANSPORT_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint16', is_config=True)
def _get_ifindex(self):
"""
Getter method for ifindex, mapped from YANG variable /interface_vlan/interface/vlan/ifindex (uint64)
"""
return self.__ifindex
def _set_ifindex(self, v, load=False):
"""
Setter method for ifindex, mapped from YANG variable /interface_vlan/interface/vlan/ifindex (uint64)
If this variable is read-only (config: false) in the
source YANG file, then _set_ifindex is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ifindex() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="ifindex", rest_name="ifindex", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ifindex must be of a type compatible with uint64""",
'defined-type': "uint64",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="ifindex", rest_name="ifindex", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint64', is_config=False)""",
})
self.__ifindex = t
if hasattr(self, '_set'):
self._set()
def _unset_ifindex(self):
self.__ifindex = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="ifindex", rest_name="ifindex", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='uint64', is_config=False)
def _get_description(self):
"""
Getter method for description, mapped from YANG variable /interface_vlan/interface/vlan/description (string)
"""
return self.__description
def _set_description(self, v, load=False):
"""
Setter method for description, mapped from YANG variable /interface_vlan/interface/vlan/description (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_description is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_description() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 63']}), is_leaf=True, yang_name="description", rest_name="description", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Interface specific description', u'cli-multi-value': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_BASIC_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """description must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 63']}), is_leaf=True, yang_name="description", rest_name="description", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Interface specific description', u'cli-multi-value': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_BASIC_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)""",
})
self.__description = t
if hasattr(self, '_set'):
self._set()
def _unset_description(self):
self.__description = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 63']}), is_leaf=True, yang_name="description", rest_name="description", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Interface specific description', u'cli-multi-value': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_BASIC_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)
def _get_vlan_name(self):
"""
Getter method for vlan_name, mapped from YANG variable /interface_vlan/interface/vlan/vlan_name (string)
"""
return self.__vlan_name
def _set_vlan_name(self, v, load=False):
"""
Setter method for vlan_name, mapped from YANG variable /interface_vlan/interface/vlan/vlan_name (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_vlan_name is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_vlan_name() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 32']}), is_leaf=True, yang_name="vlan-name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Alternative name for the VLAN', u'cli-multi-value': None, u'alt-name': u'name'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """vlan_name must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 32']}), is_leaf=True, yang_name="vlan-name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Alternative name for the VLAN', u'cli-multi-value': None, u'alt-name': u'name'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)""",
})
self.__vlan_name = t
if hasattr(self, '_set'):
self._set()
def _unset_vlan_name(self):
self.__vlan_name = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1 .. 32']}), is_leaf=True, yang_name="vlan-name", rest_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Alternative name for the VLAN', u'cli-multi-value': None, u'alt-name': u'name'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='string', is_config=True)
def _get_private_vlan(self):
"""
Getter method for private_vlan, mapped from YANG variable /interface_vlan/interface/vlan/private_vlan (container)
YANG Description: Configure a Private Vlan
"""
return self.__private_vlan
def _set_private_vlan(self, v, load=False):
"""
Setter method for private_vlan, mapped from YANG variable /interface_vlan/interface/vlan/private_vlan (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_private_vlan is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_private_vlan() directly.
YANG Description: Configure a Private Vlan
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=private_vlan.private_vlan, is_container='container', presence=False, yang_name="private-vlan", rest_name="private-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure vlan as private vlan', u'cli-full-no': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_PVLAN_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """private_vlan must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=private_vlan.private_vlan, is_container='container', presence=False, yang_name="private-vlan", rest_name="private-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure vlan as private vlan', u'cli-full-no': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_PVLAN_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)""",
})
self.__private_vlan = t
if hasattr(self, '_set'):
self._set()
def _unset_private_vlan(self):
self.__private_vlan = YANGDynClass(base=private_vlan.private_vlan, is_container='container', presence=False, yang_name="private-vlan", rest_name="private-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure vlan as private vlan', u'cli-full-no': None, u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_L2_PVLAN_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
def _get_ip(self):
"""
Getter method for ip, mapped from YANG variable /interface_vlan/interface/vlan/ip (container)
YANG Description: The IP configurations for an interface.
"""
return self.__ip
def _set_ip(self, v, load=False):
"""
Setter method for ip, mapped from YANG variable /interface_vlan/interface/vlan/ip (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_ip is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ip() directly.
YANG Description: The IP configurations for an interface.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=ip.ip, is_container='container', presence=False, yang_name="ip", rest_name="ip", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol (IP).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ip must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=ip.ip, is_container='container', presence=False, yang_name="ip", rest_name="ip", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol (IP).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)""",
})
self.__ip = t
if hasattr(self, '_set'):
self._set()
def _unset_ip(self):
self.__ip = YANGDynClass(base=ip.ip, is_container='container', presence=False, yang_name="ip", rest_name="ip", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol (IP).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
def _get_ipv6(self):
"""
Getter method for ipv6, mapped from YANG variable /interface_vlan/interface/vlan/ipv6 (container)
YANG Description: The IP configurations for an interface.
"""
return self.__ipv6
def _set_ipv6(self, v, load=False):
"""
Setter method for ipv6, mapped from YANG variable /interface_vlan/interface/vlan/ipv6 (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_ipv6 is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ipv6() directly.
YANG Description: The IP configurations for an interface.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=ipv6.ipv6, is_container='container', presence=False, yang_name="ipv6", rest_name="ipv6", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol v6 (IPv6).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ipv6 must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=ipv6.ipv6, is_container='container', presence=False, yang_name="ipv6", rest_name="ipv6", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol v6 (IPv6).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)""",
})
self.__ipv6 = t
if hasattr(self, '_set'):
self._set()
def _unset_ipv6(self):
self.__ipv6 = YANGDynClass(base=ipv6.ipv6, is_container='container', presence=False, yang_name="ipv6", rest_name="ipv6", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'The Internet Protocol v6 (IPv6).', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-interface', defining_module='brocade-interface', yang_type='container', is_config=True)
def _get_mac(self):
"""
Getter method for mac, mapped from YANG variable /interface_vlan/interface/vlan/mac (container)
"""
return self.__mac
def _set_mac(self, v, load=False):
"""
Setter method for mac, mapped from YANG variable /interface_vlan/interface/vlan/mac (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_mac is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mac() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=mac.mac, is_container='container', presence=False, yang_name="mac", rest_name="mac", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure MAC parameters', u'callpoint': u'MacaclAccessgroupIntVlanCP', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_MAC_ACL_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-mac-access-list', defining_module='brocade-mac-access-list', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mac must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=mac.mac, is_container='container', presence=False, yang_name="mac", rest_name="mac", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure MAC parameters', u'callpoint': u'MacaclAccessgroupIntVlanCP', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_MAC_ACL_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-mac-access-list', defining_module='brocade-mac-access-list', yang_type='container', is_config=True)""",
})
self.__mac = t
if hasattr(self, '_set'):
self._set()
def _unset_mac(self):
self.__mac = YANGDynClass(base=mac.mac, is_container='container', presence=False, yang_name="mac", rest_name="mac", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure MAC parameters', u'callpoint': u'MacaclAccessgroupIntVlanCP', u'sort-priority': u'RUNNCFG_INTERFACE_LEVEL_MAC_ACL_CONFIG'}}, namespace='urn:brocade.com:mgmt:brocade-mac-access-list', defining_module='brocade-mac-access-list', yang_type='container', is_config=True)
def _get_remote_span(self):
"""
Getter method for remote_span, mapped from YANG variable /interface_vlan/interface/vlan/remote_span (empty)
"""
return self.__remote_span
def _set_remote_span(self, v, load=False):
"""
Setter method for remote_span, mapped from YANG variable /interface_vlan/interface/vlan/remote_span (empty)
If this variable is read-only (config: false) in the
source YANG file, then _set_remote_span is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_remote_span() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="remote-span", rest_name="rspan-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure the vlan as rspan vlan', u'alt-name': u'rspan-vlan', u'callpoint': u'interface_vlan'}}, namespace='urn:brocade.com:mgmt:brocade-span', defining_module='brocade-span', yang_type='empty', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """remote_span must be of a type compatible with empty""",
'defined-type': "empty",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="remote-span", rest_name="rspan-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure the vlan as rspan vlan', u'alt-name': u'rspan-vlan', u'callpoint': u'interface_vlan'}}, namespace='urn:brocade.com:mgmt:brocade-span', defining_module='brocade-span', yang_type='empty', is_config=True)""",
})
self.__remote_span = t
if hasattr(self, '_set'):
self._set()
def _unset_remote_span(self):
self.__remote_span = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="remote-span", rest_name="rspan-vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure the vlan as rspan vlan', u'alt-name': u'rspan-vlan', u'callpoint': u'interface_vlan'}}, namespace='urn:brocade.com:mgmt:brocade-span', defining_module='brocade-span', yang_type='empty', is_config=True)
def _get_spanning_tree(self):
"""
Getter method for spanning_tree, mapped from YANG variable /interface_vlan/interface/vlan/spanning_tree (container)
"""
return self.__spanning_tree
def _set_spanning_tree(self, v, load=False):
"""
Setter method for spanning_tree, mapped from YANG variable /interface_vlan/interface/vlan/spanning_tree (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_spanning_tree is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_spanning_tree() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=spanning_tree.spanning_tree, is_container='container', presence=False, yang_name="spanning-tree", rest_name="spanning-tree", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Spanning tree commands', u'sort-priority': u'98', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-xstp', defining_module='brocade-xstp', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """spanning_tree must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=spanning_tree.spanning_tree, is_container='container', presence=False, yang_name="spanning-tree", rest_name="spanning-tree", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Spanning tree commands', u'sort-priority': u'98', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-xstp', defining_module='brocade-xstp', yang_type='container', is_config=True)""",
})
self.__spanning_tree = t
if hasattr(self, '_set'):
self._set()
def _unset_spanning_tree(self):
self.__spanning_tree = YANGDynClass(base=spanning_tree.spanning_tree, is_container='container', presence=False, yang_name="spanning-tree", rest_name="spanning-tree", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Spanning tree commands', u'sort-priority': u'98', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-xstp', defining_module='brocade-xstp', yang_type='container', is_config=True)
name = __builtin__.property(_get_name, _set_name)
transport_service = __builtin__.property(_get_transport_service, _set_transport_service)
ifindex = __builtin__.property(_get_ifindex)
description = __builtin__.property(_get_description, _set_description)
vlan_name = __builtin__.property(_get_vlan_name, _set_vlan_name)
private_vlan = __builtin__.property(_get_private_vlan, _set_private_vlan)
ip = __builtin__.property(_get_ip, _set_ip)
ipv6 = __builtin__.property(_get_ipv6, _set_ipv6)
mac = __builtin__.property(_get_mac, _set_mac)
remote_span = __builtin__.property(_get_remote_span, _set_remote_span)
spanning_tree = __builtin__.property(_get_spanning_tree, _set_spanning_tree)
_pyangbind_elements = {'name': name, 'transport_service': transport_service, 'ifindex': ifindex, 'description': description, 'vlan_name': vlan_name, 'private_vlan': private_vlan, 'ip': ip, 'ipv6': ipv6, 'mac': mac, 'remote_span': remote_span, 'spanning_tree': spanning_tree, }
| 75.952663 | 654 | 0.737431 | 5,254 | 38,508 | 5.17244 | 0.045679 | 0.037533 | 0.047395 | 0.032382 | 0.865764 | 0.850751 | 0.843134 | 0.831175 | 0.8279 | 0.817302 | 0 | 0.010056 | 0.124546 | 38,508 | 506 | 655 | 76.102767 | 0.796067 | 0.148177 | 0 | 0.461279 | 0 | 0.037037 | 0.402525 | 0.173707 | 0 | 0 | 0 | 0 | 0 | 1 | 0.121212 | false | 0 | 0.043771 | 0 | 0.276094 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b14fd14bb806504f039aa7455931473b8a75c37c | 159 | py | Python | QuestionAnswerSummaryAndReasoning/utils/__init__.py | xiaobuguilaile/qa-summary-and-reasoning | b0d71f8c7b31e5a856c4904b0d053892ca0ae184 | [
"MIT"
] | 1 | 2021-12-16T03:30:28.000Z | 2021-12-16T03:30:28.000Z | QuestionAnswerSummaryAndReasoning/utils/__init__.py | xiaobuguilaile/qa-summary-and-reasoning | b0d71f8c7b31e5a856c4904b0d053892ca0ae184 | [
"MIT"
] | 3 | 2021-05-21T14:39:35.000Z | 2022-02-10T02:16:54.000Z | QuestionAnswerSummaryAndReasoning/utils/__init__.py | xiaobuguilaile/qa-summary-and-reasoning | b0d71f8c7b31e5a856c4904b0d053892ca0ae184 | [
"MIT"
] | null | null | null |
from .build_w2v import *
from .data_reader import *
from .data_utils import *
from .dataset_split import *
from .preprocess import *
from .tokenizer import *
| 19.875 | 28 | 0.767296 | 22 | 159 | 5.363636 | 0.5 | 0.423729 | 0.237288 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007463 | 0.157233 | 159 | 7 | 29 | 22.714286 | 0.873134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b185d2d236d8f13580a103b327466a0555d74380 | 109 | py | Python | backend/app/rest/statistic/views.py | air-services/boilerplate | 8c69927c299882048f9aaaaa58483eea2a6a1cd0 | [
"MIT"
] | null | null | null | backend/app/rest/statistic/views.py | air-services/boilerplate | 8c69927c299882048f9aaaaa58483eea2a6a1cd0 | [
"MIT"
] | null | null | null | backend/app/rest/statistic/views.py | air-services/boilerplate | 8c69927c299882048f9aaaaa58483eea2a6a1cd0 | [
"MIT"
] | null | null | null | from app.core.crud import CrudView
class CardView(CrudView):
pass
class IconView(CrudView):
pass
| 10.9 | 34 | 0.724771 | 14 | 109 | 5.642857 | 0.714286 | 0.303797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.201835 | 109 | 9 | 35 | 12.111111 | 0.908046 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.4 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
b18efd38a675242a2dda8083f39f7f6632229491 | 21 | py | Python | pudzu-dates/pudzu/dates/__init__.py | Udzu/pudzu-packages | 928930271fac701fbbbf02e6d607e2c5b24cc1e7 | [
"MIT"
] | 1 | 2022-02-28T06:59:40.000Z | 2022-02-28T06:59:40.000Z | pudzu-dates/pudzu/dates/__init__.py | Udzu/pudzu-packages | 928930271fac701fbbbf02e6d607e2c5b24cc1e7 | [
"MIT"
] | 2 | 2022-03-12T01:10:04.000Z | 2022-03-12T01:10:11.000Z | pudzu-dates/pudzu/dates/__init__.py | Udzu/pudzu-packages | 928930271fac701fbbbf02e6d607e2c5b24cc1e7 | [
"MIT"
] | null | null | null | from .dates import *
| 10.5 | 20 | 0.714286 | 3 | 21 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
49249cbfc75ef5c344cc519ff87e1a5bd967b805 | 107 | py | Python | main/models.py | debajyotiroyc/WebMapping_with_DjangoandGoogleAPIS | f1f8e76f86510d9886c33ec26113db52502deef9 | [
"MIT"
] | null | null | null | main/models.py | debajyotiroyc/WebMapping_with_DjangoandGoogleAPIS | f1f8e76f86510d9886c33ec26113db52502deef9 | [
"MIT"
] | null | null | null | main/models.py | debajyotiroyc/WebMapping_with_DjangoandGoogleAPIS | f1f8e76f86510d9886c33ec26113db52502deef9 | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import User
# Create your models here.
| 17.833333 | 44 | 0.757009 | 16 | 107 | 5.0625 | 0.6875 | 0.246914 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186916 | 107 | 5 | 45 | 21.4 | 0.931034 | 0.224299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4948a27df17c8d51e00ab045e1e6f571d02ac058 | 209 | py | Python | app/utils/jwt.py | coolexplorer/py-session | 86ffbd3f1744df18d2deeaa77eebe5dd9980c6ae | [
"MIT"
] | 2 | 2021-08-17T07:15:46.000Z | 2021-08-17T07:18:15.000Z | app/utils/jwt.py | coolexplorer/py-session | 86ffbd3f1744df18d2deeaa77eebe5dd9980c6ae | [
"MIT"
] | 1 | 2022-02-24T06:52:08.000Z | 2022-02-24T06:52:08.000Z | app/utils/jwt.py | coolexplorer/py-session | 86ffbd3f1744df18d2deeaa77eebe5dd9980c6ae | [
"MIT"
] | null | null | null | import jwt
from config import config
settings: config.Settings = config.get_settings()
def decode(token: str) -> dict:
return jwt.decode(token, settings.jwt_secret, algorithms=[settings.jwt_algorithm]) | 23.222222 | 86 | 0.770335 | 28 | 209 | 5.642857 | 0.535714 | 0.177215 | 0.253165 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.124402 | 209 | 9 | 86 | 23.222222 | 0.863388 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
495f4a774b5b584679558394e88b981f6a7773a9 | 47 | py | Python | src/aws_ssm_copy/__main__.py | mvanholsteijn/aws-ssm-copy-parameters | e32e1933ee02e070d7538a1f3ee09d2a5bfadb9c | [
"Apache-2.0"
] | 44 | 2018-03-06T18:19:58.000Z | 2022-03-29T22:47:25.000Z | src/aws_ssm_copy/__main__.py | mvanholsteijn/aws-ssm-copy-parameters | e32e1933ee02e070d7538a1f3ee09d2a5bfadb9c | [
"Apache-2.0"
] | 18 | 2018-03-08T15:15:41.000Z | 2022-03-30T16:42:57.000Z | src/aws_ssm_copy/__main__.py | mvanholsteijn/aws-ssm-copy-parameters | e32e1933ee02e070d7538a1f3ee09d2a5bfadb9c | [
"Apache-2.0"
] | 17 | 2018-03-06T20:33:02.000Z | 2022-02-02T19:59:32.000Z | from aws_ssm_copy.ssm_copy import main
main()
| 11.75 | 38 | 0.808511 | 9 | 47 | 3.888889 | 0.666667 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12766 | 47 | 3 | 39 | 15.666667 | 0.853659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
498d35bcf534e55e5f754e2aa54f05981133aacb | 68,663 | py | Python | sdk/storage/azure-storage-blob/tests/test_block_blob_async.py | isabella232/azure-sdk-for-python | c9240aa68c2870a1a72249b4c86f3ca95551cc29 | [
"MIT"
] | null | null | null | sdk/storage/azure-storage-blob/tests/test_block_blob_async.py | isabella232/azure-sdk-for-python | c9240aa68c2870a1a72249b4c86f3ca95551cc29 | [
"MIT"
] | 1 | 2021-02-23T23:11:26.000Z | 2021-02-23T23:11:26.000Z | sdk/storage/azure-storage-blob/tests/test_block_blob_async.py | isabella232/azure-sdk-for-python | c9240aa68c2870a1a72249b4c86f3ca95551cc29 | [
"MIT"
] | null | null | null | # coding: utf-8
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
import os
import unittest
import pytest
import asyncio
import uuid
from datetime import datetime, timedelta
from azure.storage.blob._shared.policies import StorageContentValidation
from azure.core.exceptions import HttpResponseError, ResourceExistsError, ResourceModifiedError, ResourceNotFoundError
from azure.core.pipeline.transport import AioHttpTransport
from multidict import CIMultiDict, CIMultiDictProxy
from devtools_testutils import ResourceGroupPreparer, StorageAccountPreparer
from _shared.testcase import GlobalStorageAccountPreparer, GlobalResourceGroupPreparer
from _shared.asynctestcase import AsyncStorageTestCase
from azure.storage.blob import (
BlobType,
ContentSettings,
BlobBlock,
StandardBlobTier,
generate_blob_sas,
BlobSasPermissions, CustomerProvidedEncryptionKey
)
from azure.storage.blob.aio import (
BlobServiceClient,
ContainerClient,
BlobClient,
)
#------------------------------------------------------------------------------
TEST_BLOB_PREFIX = 'blob'
LARGE_BLOB_SIZE = 64 * 1024 + 5
#------------------------------------------------------------------------------
class AiohttpTestTransport(AioHttpTransport):
"""Workaround to vcrpy bug: https://github.com/kevin1024/vcrpy/pull/461
"""
async def send(self, request, **config):
response = await super(AiohttpTestTransport, self).send(request, **config)
if not isinstance(response.headers, CIMultiDictProxy):
response.headers = CIMultiDictProxy(CIMultiDict(response.internal_response.headers))
response.content_type = response.headers.get("content-type")
return response
class StorageBlockBlobTestAsync(AsyncStorageTestCase):
#--Helpers-----------------------------------------------------------------
async def _setup(self, storage_account, key, container_name='utcontainer'):
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
self.bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=key,
connection_data_block_size=4 * 1024,
max_single_put_size=32 * 1024,
max_block_size=4 * 1024,
transport=AiohttpTestTransport())
self.config = self.bsc._config
self.container_name = self.get_resource_name(container_name)
if self.is_live:
try:
await self.bsc.create_container(self.container_name)
except ResourceExistsError:
pass
def _teardown(self, FILE_PATH):
if os.path.isfile(FILE_PATH):
try:
os.remove(FILE_PATH)
except:
pass
def _get_blob_reference(self):
return self.get_resource_name(TEST_BLOB_PREFIX)
def _get_blob_with_special_chars_reference(self):
return 'भारत¥test/testsubÐirÍ/'+self.get_resource_name('srcÆblob')
async def _create_source_blob_url_with_special_chars(self, tags=None):
blob_name = self._get_blob_with_special_chars_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.upload_blob(self.get_random_bytes(8 * 1024))
sas_token_for_special_chars = generate_blob_sas(
blob.account_name,
blob.container_name,
blob.blob_name,
snapshot=blob.snapshot,
account_key=blob.credential.account_key,
permission=BlobSasPermissions(read=True),
expiry=datetime.utcnow() + timedelta(hours=1),
)
return BlobClient.from_blob_url(blob.url, credential=sas_token_for_special_chars).url
async def _create_blob(self, tags=None, data=b'', **kwargs):
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.upload_blob(data, tags=tags, **kwargs)
return blob
async def assertBlobEqual(self, container_name, blob_name, expected_data):
blob = self.bsc.get_blob_client(container_name, blob_name)
stream = await blob.download_blob()
actual_data = await stream.readall()
self.assertEqual(actual_data, expected_data)
class NonSeekableFile(object):
def __init__(self, wrapped_file):
self.wrapped_file = wrapped_file
def write(self, data):
self.wrapped_file.write(data)
def read(self, count):
return self.wrapped_file.read(count)
#--Test cases for block blobs --------------------------------------------
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_with_and_without_overwrite(
self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key)
blob = await self._create_blob(data=b"source blob data")
# Act
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob_client = self.bsc.get_blob_client(self.container_name, blob_name)
await new_blob_client.upload_blob(b'destination blob data')
# Assert
with self.assertRaises(ResourceExistsError):
await new_blob_client.upload_blob_from_url(source_blob, overwrite=False)
new_blob = await new_blob_client.upload_blob_from_url(source_blob, overwrite=True)
self.assertIsNotNone(new_blob)
new_blob_download = await new_blob_client.download_blob()
new_blob_content = await new_blob_download.readall()
self.assertEqual(new_blob_content, b'source blob data')
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_with_existing_blob(
self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key, container_name="testcontainer")
blob = await self._create_blob(data=b"test data")
# Act
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob_client = self.bsc.get_blob_client(self.container_name, blob_name)
new_blob = await new_blob_client.upload_blob_from_url(source_blob)
# Assert
self.assertIsNotNone(new_blob)
downloaded_blob = await new_blob_client.download_blob()
new_blob_content = await downloaded_blob.readall()
self.assertEqual(new_blob_content, b'test data')
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_with_standard_tier_specified(
self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key, container_name="testcontainer")
blob = await self._create_blob()
self.bsc.get_blob_client(self.container_name, blob.blob_name)
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
# Act
source_blob = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob = self.bsc.get_blob_client(self.container_name, blob_name)
blob_tier = StandardBlobTier.Hot
await new_blob.upload_blob_from_url(source_blob, standard_blob_tier=blob_tier)
new_blob_properties = await new_blob.get_blob_properties()
# Assert
self.assertEqual(new_blob_properties.blob_tier, blob_tier)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_with_destination_lease(
self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key)
source_blob = await self._create_blob()
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=source_blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob_url = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, source_blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob_client = self.bsc.get_blob_client(self.container_name, blob_name)
await new_blob_client.upload_blob(data="test")
new_blob_lease = await new_blob_client.acquire_lease()
with self.assertRaises(HttpResponseError):
await new_blob_client.upload_blob_from_url(
source_blob_url, destination_lease="baddde9e-8247-4276-8bfa-c7a8081eba1d", overwrite=True)
with self.assertRaises(HttpResponseError):
await new_blob_client.upload_blob_from_url(source_blob_url)
await new_blob_client.upload_blob_from_url(
source_blob_url, destination_lease=new_blob_lease)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_if_match_condition(
self, resource_group, location, storage_account, storage_account_key):
# Act
await self._setup(storage_account, storage_account_key)
source_blob = await self._create_blob()
early_test_datetime = (datetime.utcnow() - timedelta(minutes=15))
late_test_datetime = (datetime.utcnow() + timedelta(minutes=15))
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=source_blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob_url = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, source_blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob_client = self.bsc.get_blob_client(self.container_name, blob_name)
await new_blob_client.upload_blob(data="fake data")
# Assert
with self.assertRaises(ResourceModifiedError):
await new_blob_client.upload_blob_from_url(
source_blob_url, if_modified_since=late_test_datetime, overwrite=True)
await new_blob_client.upload_blob_from_url(
source_blob_url, if_modified_since=early_test_datetime, overwrite=True)
with self.assertRaises(ResourceModifiedError):
await new_blob_client.upload_blob_from_url(
source_blob_url, if_unmodified_since=early_test_datetime, overwrite=True)
await new_blob_client.upload_blob_from_url(
source_blob_url, if_unmodified_since=late_test_datetime, overwrite=True)
with self.assertRaises(ResourceNotFoundError):
await new_blob_client.upload_blob_from_url(
source_blob_url, source_if_modified_since=late_test_datetime, overwrite=True)
await new_blob_client.upload_blob_from_url(
source_blob_url, source_if_modified_since=early_test_datetime, overwrite=True)
with self.assertRaises(ResourceNotFoundError):
await new_blob_client.upload_blob_from_url(
source_blob_url, source_if_unmodified_since=early_test_datetime, overwrite=True)
await new_blob_client.upload_blob_from_url(
source_blob_url, source_if_unmodified_since=late_test_datetime, overwrite=True)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_with_cpk(self, resource_group, location, storage_account, storage_account_key):
# Act
await self._setup(storage_account, storage_account_key)
source_blob = await self._create_blob(data=b"This is test data to be copied over.")
test_cpk = CustomerProvidedEncryptionKey(key_value="MDEyMzQ1NjcwMTIzNDU2NzAxMjM0NTY3MDEyMzQ1Njc=",
key_hash="3QFFFpRA5+XANHqwwbT4yXDmrT/2JaLt/FKHjzhOdoE=")
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=source_blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob_url = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, source_blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob = self.bsc.get_blob_client(self.container_name, blob_name)
await new_blob.upload_blob_from_url(
source_blob_url, include_source_blob_properties=True, cpk=test_cpk)
# Assert
with self.assertRaises(HttpResponseError):
await new_blob.create_snapshot()
await new_blob.create_snapshot(cpk=test_cpk)
self.assertIsNotNone(new_blob.create_snapshot)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_overwrite_properties(
self, resource_group, location, storage_account, storage_account_key):
# Act
await self._setup(storage_account, storage_account_key)
source_blob_content_settings = ContentSettings(content_language='spanish')
new_blob_content_settings = ContentSettings(content_language='english')
source_blob_tags = {"tag1": "sourcetag", "tag2": "secondsourcetag"}
new_blob_tags = {"tag1": "copytag"}
new_blob_cpk = CustomerProvidedEncryptionKey(key_value="MDEyMzQ1NjcwMTIzNDU2NzAxMjM0NTY3MDEyMzQ1Njc=",
key_hash="3QFFFpRA5+XANHqwwbT4yXDmrT/2JaLt/FKHjzhOdoE=")
source_blob = await self._create_blob(
data=b"This is test data to be copied over.",
tags=source_blob_tags,
content_settings=source_blob_content_settings,
)
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=source_blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob_url = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, source_blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob = self.bsc.get_blob_client(self.container_name, blob_name)
await new_blob.upload_blob_from_url(source_blob_url,
include_source_blob_properties=True,
tags=new_blob_tags,
content_settings=new_blob_content_settings,
cpk=new_blob_cpk)
new_blob_props = await new_blob.get_blob_properties(cpk=new_blob_cpk)
# Assert that source blob properties did not take precedence.
self.assertEqual(new_blob_props.tag_count, 1)
self.assertEqual(new_blob_props.content_settings.content_language, new_blob_content_settings.content_language)
self.assertEqual(new_blob_props.encryption_key_sha256, new_blob_cpk.key_hash)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_with_source_content_md5(
self, resource_group, location, storage_account, storage_account_key):
# Act
await self._setup(storage_account, storage_account_key)
source_blob = await self._create_blob(data=b"This is test data to be copied over.")
source_blob_props = await source_blob.get_blob_properties()
source_md5 = source_blob_props.content_settings.content_md5
bad_source_md5 = StorageContentValidation.get_content_md5(b"this is bad data")
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=source_blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob_url = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, source_blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob = self.bsc.get_blob_client(self.container_name, blob_name)
# Assert
await new_blob.upload_blob_from_url(
source_blob_url, include_source_blob_properties=True, source_content_md5=source_md5)
with self.assertRaises(HttpResponseError):
await new_blob.upload_blob_from_url(
source_blob_url, include_source_blob_properties=False, source_content_md5=bad_source_md5)
new_blob_props = await new_blob.get_blob_properties()
new_blob_content_md5 = new_blob_props.content_settings.content_md5
self.assertEqual(new_blob_content_md5, source_md5)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_upload_blob_from_url_source_and_destination_properties(
self, resource_group, location, storage_account, storage_account_key):
# Act
await self._setup(storage_account, storage_account_key)
content_settings = ContentSettings(
content_type='application/octet-stream',
content_language='spanish',
content_disposition='inline'
)
source_blob = await self._create_blob(
data=b"This is test data to be copied over.",
tags={"tag1": "firsttag", "tag2": "secondtag", "tag3": "thirdtag"},
content_settings=content_settings,
standard_blob_tier=StandardBlobTier.Cool
)
await source_blob.acquire_lease()
source_blob_props = await source_blob.get_blob_properties()
sas = generate_blob_sas(account_name=storage_account.name, account_key=storage_account_key,
container_name=self.container_name, blob_name=source_blob.blob_name,
permission=BlobSasPermissions(read=True), expiry=datetime.utcnow() + timedelta(hours=1))
source_blob_url = '{0}/{1}/{2}?{3}'.format(
self.account_url(storage_account, "blob"), self.container_name, source_blob.blob_name, sas)
blob_name = self.get_resource_name("blobcopy")
new_blob_copy1 = self.bsc.get_blob_client(self.container_name, blob_name)
new_blob_copy2 = self.bsc.get_blob_client(self.container_name, 'blob2copy')
await new_blob_copy1.upload_blob_from_url(
source_blob_url, include_source_blob_properties=True)
await new_blob_copy2.upload_blob_from_url(
source_blob_url, include_source_blob_properties=False)
new_blob_copy1_props = await new_blob_copy1.get_blob_properties()
new_blob_copy2_props = await new_blob_copy2.get_blob_properties()
# Assert
self.assertEqual(new_blob_copy1_props.content_settings.content_language,
source_blob_props.content_settings.content_language)
self.assertNotEqual(new_blob_copy2_props.content_settings.content_language,
source_blob_props.content_settings.content_language)
self.assertEqual(source_blob_props.lease.status, 'locked')
self.assertEqual(new_blob_copy1_props.lease.status, 'unlocked')
self.assertEqual(new_blob_copy2_props.lease.status, 'unlocked')
self.assertEqual(source_blob_props.blob_tier, 'Cool')
self.assertEqual(new_blob_copy1_props.blob_tier, 'Hot')
self.assertEqual(new_blob_copy2_props.blob_tier, 'Hot')
self.assertEqual(source_blob_props.tag_count, 3)
self.assertEqual(new_blob_copy1_props.tag_count, None)
self.assertEqual(new_blob_copy2_props.tag_count, None)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block(self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key)
# Arrange
blob = await self._create_blob()
# Act
for i in range(5):
headers = await blob.stage_block(i, 'block {0}'.format(i).encode('utf-8'))
self.assertIn('content_crc64', headers)
# Assert
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_copy_blob_async(self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key)
dest_blob = await self._create_blob()
source_blob_url = await self._create_source_blob_url_with_special_chars()
# Act
copy_props = await dest_blob.start_copy_from_url(source_blob_url, requires_sync=True)
# Assert
self.assertIsNotNone(copy_props)
self.assertIsNotNone(copy_props['copy_id'])
self.assertEqual('success', copy_props['copy_status'])
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_from_url_and_commit(self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key)
dest_blob = await self._create_blob()
source_blob_url = await self._create_source_blob_url_with_special_chars()
split = 4 * 1024
# Act part 1: make put block from url calls
await dest_blob.stage_block_from_url(
block_id=1,
source_url=source_blob_url,
source_offset=0,
source_length=split)
await dest_blob.stage_block_from_url(
block_id=2,
source_url=source_blob_url,
source_offset=split,
source_length=split)
# Assert blocks
committed, uncommitted = await dest_blob.get_block_list('all')
self.assertEqual(len(uncommitted), 2)
self.assertEqual(len(committed), 0)
# Act part 2: commit the blocks
await dest_blob.commit_block_list(['1', '2'])
committed, uncommitted = await dest_blob.get_block_list('all')
self.assertEqual(len(uncommitted), 0)
self.assertEqual(len(committed), 2)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_with_response(self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account.name, storage_account_key)
# Arrange
def return_response(resp, _, headers):
return (resp, headers)
blob = await self._create_blob()
# Act
resp, headers = await blob.stage_block(0, 'block 0', cls=return_response)
# Assert
self.assertEqual(201, resp.status_code)
self.assertIn('x-ms-content-crc64', headers)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_unicode(self, resource_group, location, storage_account, storage_account_key):
await self._setup(storage_account, storage_account_key)
# Arrange
blob = await self._create_blob()
# Act
headers = await blob.stage_block('1', u'啊齄丂狛狜')
self.assertIn('content_crc64', headers)
# Assert
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_with_md5(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob = await self._create_blob()
# Act
await blob.stage_block(1, b'block', validate_content=True)
# Assert
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_list(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.stage_block('1', b'AAA')
await blob.stage_block('2', b'BBB')
await blob.stage_block('3', b'CCC')
# Act
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
put_block_list_resp = await blob.commit_block_list(block_list)
# Assert
content = await blob.download_blob()
actual = await content.readall()
self.assertEqual(actual, b'AAABBBCCC')
self.assertEqual(content.properties.etag, put_block_list_resp.get('etag'))
self.assertEqual(content.properties.last_modified, put_block_list_resp.get('last_modified'))
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_list_invalid_block_id(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.stage_block('1', b'AAA')
await blob.stage_block('2', b'BBB')
await blob.stage_block('3', b'CCC')
# Act
try:
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='4')]
await blob.commit_block_list(block_list)
self.fail()
except HttpResponseError as e:
self.assertGreaterEqual(str(e).find('specified block list is invalid'), 0)
# Assert
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_put_block_list_with_md5(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.stage_block('1', b'AAA')
await blob.stage_block('2', b'BBB')
await blob.stage_block('3', b'CCC')
# Act
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
await blob.commit_block_list(block_list, validate_content=True)
# Assert
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def _test_put_block_list_with_blob_tier_specified(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob_client = self.bsc.get_blob_client(self.container_name, blob_name)
await blob_client.stage_block('1', b'AAA')
await blob_client.stage_block('2', b'BBB')
await blob_client.stage_block('3', b'CCC')
blob_tier = StandardBlobTier.Cool
# Act
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
await blob_client.commit_block_list(block_list,
standard_blob_tier=blob_tier)
# Assert
blob_properties = await blob_client.get_blob_properties()
self.assertEqual(blob_properties.blob_tier, blob_tier)
@GlobalResourceGroupPreparer()
@StorageAccountPreparer(random_name_enabled=True, location="canadacentral", name_prefix='storagename')
@AsyncStorageTestCase.await_prepared_test
async def test_get_block_list_no_blocks(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
tags = {"tag1": "firsttag", "tag2": "secondtag", "tag3": "thirdtag"}
blob = await self._create_blob(tags=tags)
# Act
with self.assertRaises(ResourceModifiedError):
await blob.get_block_list('all', if_tags_match_condition="\"condition tag\"='wrong tag'")
block_list = await blob.get_block_list('all', if_tags_match_condition="\"tag1\"='firsttag'")
# Assert
self.assertIsNotNone(block_list)
self.assertEqual(len(block_list[1]), 0)
self.assertEqual(len(block_list[0]), 0)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_get_block_list_uncommitted_blocks(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.stage_block('1', b'AAA')
await blob.stage_block('2', b'BBB')
await blob.stage_block('3', b'CCC')
# Act
block_list = await blob.get_block_list('uncommitted')
# Assert
self.assertIsNotNone(block_list)
self.assertEqual(len(block_list), 2)
self.assertEqual(len(block_list[1]), 3)
self.assertEqual(len(block_list[0]), 0)
self.assertEqual(block_list[1][0].id, '1')
self.assertEqual(block_list[1][0].size, 3)
self.assertEqual(block_list[1][1].id, '2')
self.assertEqual(block_list[1][1].size, 3)
self.assertEqual(block_list[1][2].id, '3')
self.assertEqual(block_list[1][2].size, 3)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_get_block_list_committed_blocks(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
await blob.stage_block('1', b'AAA')
await blob.stage_block('2', b'BBB')
await blob.stage_block('3', b'CCC')
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
await blob.commit_block_list(block_list)
# Act
block_list = await blob.get_block_list('committed')
# Assert
self.assertIsNotNone(block_list)
self.assertEqual(len(block_list), 2)
self.assertEqual(len(block_list[1]), 0)
self.assertEqual(len(block_list[0]), 3)
self.assertEqual(block_list[0][0].id, '1')
self.assertEqual(block_list[0][0].size, 3)
self.assertEqual(block_list[0][1].id, '2')
self.assertEqual(block_list[0][1].size, 3)
self.assertEqual(block_list[0][2].id, '3')
self.assertEqual(block_list[0][2].size, 3)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_small_block_blob_with_no_overwrite(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data1 = b'hello world'
data2 = b'hello second world'
# Act
create_resp = await blob.upload_blob(data1, overwrite=True)
with self.assertRaises(ResourceExistsError):
await blob.upload_blob(data2, overwrite=False)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data1)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
self.assertEqual(props.blob_type, BlobType.BlockBlob)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_small_block_blob_with_overwrite(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data1 = b'hello world'
data2 = b'hello second world'
# Act
create_resp = await blob.upload_blob(data1, overwrite=True)
update_resp = await blob.upload_blob(data2, overwrite=True)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data2)
self.assertEqual(props.etag, update_resp.get('etag'))
self.assertEqual(props.last_modified, update_resp.get('last_modified'))
self.assertEqual(props.blob_type, BlobType.BlockBlob)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_large_block_blob_with_no_overwrite(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data1 = self.get_random_bytes(LARGE_BLOB_SIZE)
data2 = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
create_resp = await blob.upload_blob(data1, overwrite=True, metadata={'blobdata': 'data1'})
with self.assertRaises(ResourceExistsError):
await blob.upload_blob(data2, overwrite=False, metadata={'blobdata': 'data2'})
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data1)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
self.assertEqual(props.blob_type, BlobType.BlockBlob)
self.assertEqual(props.metadata, {'blobdata': 'data1'})
self.assertEqual(props.size, LARGE_BLOB_SIZE)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_large_block_blob_with_overwrite(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data1 = self.get_random_bytes(LARGE_BLOB_SIZE)
data2 = self.get_random_bytes(LARGE_BLOB_SIZE + 512)
# Act
create_resp = await blob.upload_blob(data1, overwrite=True, metadata={'blobdata': 'data1'})
update_resp = await blob.upload_blob(data2, overwrite=True, metadata={'blobdata': 'data2'})
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data2)
self.assertEqual(props.etag, update_resp.get('etag'))
self.assertEqual(props.last_modified, update_resp.get('last_modified'))
self.assertEqual(props.blob_type, BlobType.BlockBlob)
self.assertEqual(props.metadata, {'blobdata': 'data2'})
self.assertEqual(props.size, LARGE_BLOB_SIZE + 512)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_single_put(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = b'hello world'
# Act
create_resp = await blob.upload_blob(data)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_0_bytes(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = b''
# Act
create_resp = await blob.upload_blob(data)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_from_bytes_blob_unicode(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = b'hello world'
# Act
create_resp = await blob.upload_blob(data)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_from_bytes_blob_with_lease_id(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob = await self._create_blob()
data = self.get_random_bytes(LARGE_BLOB_SIZE)
lease = await blob.acquire_lease()
# Act
create_resp = await blob.upload_blob(data, lease=lease)
# Assert
output = await blob.download_blob(lease=lease)
actual = await output.readall()
self.assertEqual(actual, data)
self.assertEqual(output.properties.etag, create_resp.get('etag'))
self.assertEqual(output.properties.last_modified, create_resp.get('last_modified'))
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_with_metadata(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
metadata = {'hello': 'world', 'number': '42'}
# Act
await blob.upload_blob(data, metadata=metadata)
# Assert
md = await blob.get_blob_properties()
md = md.metadata
self.assertDictEqual(md, metadata)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_with_properties(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
content_settings=ContentSettings(
content_type='image/png',
content_language='spanish')
await blob.upload_blob(data, content_settings=content_settings)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
properties = await blob.get_blob_properties()
self.assertEqual(properties.content_settings.content_type, content_settings.content_type)
self.assertEqual(properties.content_settings.content_language, content_settings.content_language)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_with_progress(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
progress = []
def callback(response):
current = response.context['upload_stream_current']
total = response.context['data_stream_total']
if current is not None:
progress.append((current, total))
create_resp = await blob.upload_blob(data, raw_response_hook=callback)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assert_upload_progress(len(data), self.config.max_block_size, progress)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_with_index(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
await blob.upload_blob(data[3:])
# Assert
db = await blob.download_blob()
output = await db.readall()
self.assertEqual(data[3:], output)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_with_index_and_count(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
await blob.upload_blob(data[3:], length=5)
# Assert
db = await blob.download_blob()
output = await db.readall()
self.assertEqual(data[3:8], output)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_frm_bytes_with_index_cnt_props(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
content_settings=ContentSettings(
content_type='image/png',
content_language='spanish')
await blob.upload_blob(data[3:], length=5, content_settings=content_settings)
# Assert
db = await blob.download_blob()
output = await db.readall()
self.assertEqual(data[3:8],output)
properties = await blob.get_blob_properties()
self.assertEqual(properties.content_settings.content_type, content_settings.content_type)
self.assertEqual(properties.content_settings.content_language, content_settings.content_language)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_bytes_non_parallel(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
await blob.upload_blob(data, length=LARGE_BLOB_SIZE, max_concurrency=1)
# Assert
await self.assertBlobEqual(self.container_name, blob.blob_name, data)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def _test_create_blob_from_bytes_with_blob_tier_specified(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob_client = self.bsc.get_blob_client(self.container_name, blob_name)
data = b'hello world'
blob_tier = StandardBlobTier.Cool
# Act
await blob_client.upload_blob(data, standard_blob_tier=blob_tier)
blob_properties = await blob_client.get_blob_properties()
# Assert
self.assertEqual(blob_properties.blob_tier, blob_tier)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_path(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'create_blob_from_path.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
with open(FILE_PATH, 'rb') as stream:
create_resp = await blob.upload_blob(stream)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
self._teardown(FILE_PATH)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_path_non_parallel(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(100)
FILE_PATH = 'from_path_non_parallel.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
with open(FILE_PATH, 'rb') as stream:
create_resp = await blob.upload_blob(stream, length=100, max_concurrency=1)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
self._teardown(FILE_PATH)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def _test_upload_blob_from_path_non_parallel_with_standard_blob_tier(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
FILE_PATH = 'non_parallel_with_standard_blob_tier.temp.{}.dat'.format(str(uuid.uuid4()))
data = self.get_random_bytes(100)
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
blob_tier = StandardBlobTier.Cool
# Act
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, length=100, max_concurrency=1, standard_blob_tier=blob_tier)
props = await blob.get_blob_properties()
# Assert
self.assertEqual(props.blob_tier, blob_tier)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_path_with_progress(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'blob_from_path_with_progressasync.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
progress = []
def callback(response):
current = response.context['upload_stream_current']
total = response.context['data_stream_total']
if current is not None:
progress.append((current, total))
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, raw_response_hook=callback)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assert_upload_progress(len(data), self.config.max_block_size, progress)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_path_with_properties(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'create_blob_from_path_with_propertiesasync.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
content_settings=ContentSettings(
content_type='image/png',
content_language='spanish')
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, content_settings=content_settings)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
properties = await blob.get_blob_properties()
self.assertEqual(properties.content_settings.content_type, content_settings.content_type)
self.assertEqual(properties.content_settings.content_language, content_settings.content_language)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_stream_chunked_upload(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'create_blob_from_stream_chunked_uploadasync.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
with open(FILE_PATH, 'rb') as stream:
create_resp = await blob.upload_blob(stream)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_frm_stream_nonseek_chunk_upload_knwn_size(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
blob_size = len(data) - 66
FILE_PATH = 'create_frm_stream_nonseek_chunk_upload_knwn_sizeasync.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
with open(FILE_PATH, 'rb') as stream:
non_seekable_file = StorageBlockBlobTestAsync.NonSeekableFile(stream)
await blob.upload_blob(non_seekable_file, length=blob_size, max_concurrency=1)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data[:blob_size])
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_blob_frm_strm_nonseek_chunk_upld_unkwn_size(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'strm_nonseek_chunk_upld_unkwn_size_async.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
with open(FILE_PATH, 'rb') as stream:
non_seekable_file = StorageBlockBlobTestAsync.NonSeekableFile(stream)
await blob.upload_blob(non_seekable_file, max_concurrency=1)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_stream_with_progress_chunked_upload(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'from_stream_with_progress_chunked_upload.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
progress = []
def callback(response):
current = response.context['upload_stream_current']
total = response.context['data_stream_total']
if current is not None:
progress.append((current, total))
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, raw_response_hook=callback)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assert_upload_progress(len(data), self.config.max_block_size, progress)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_stream_chunked_upload_with_count(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'blob_from_stream_chunked_upload_with.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
blob_size = len(data) - 301
with open(FILE_PATH, 'rb') as stream:
resp = await blob.upload_blob(stream, length=blob_size)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data[:blob_size])
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_frm_stream_chu_upld_with_countandprops(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = '_frm_stream_chu_upld_with_count.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
content_settings=ContentSettings(
content_type='image/png',
content_language='spanish')
blob_size = len(data) - 301
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, length=blob_size, content_settings=content_settings)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data[:blob_size])
properties = await blob.get_blob_properties()
self.assertEqual(properties.content_settings.content_type, content_settings.content_type)
self.assertEqual(properties.content_settings.content_language, content_settings.content_language)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_stream_chunked_upload_with_properties(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = '_from_stream_chunked_upload_with_propert.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
# Act
content_settings=ContentSettings(
content_type='image/png',
content_language='spanish')
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, content_settings=content_settings)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
properties = await blob.get_blob_properties()
self.assertEqual(properties.content_settings.content_type, content_settings.content_type)
self.assertEqual(properties.content_settings.content_language, content_settings.content_language)
self._teardown(FILE_PATH)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def _test_create_blob_from_stream_chunked_upload_with_properties(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
FILE_PATH = 'tream_chunked_upload_with_properti.temp.{}.dat'.format(str(uuid.uuid4()))
with open(FILE_PATH, 'wb') as stream:
stream.write(data)
blob_tier = StandardBlobTier.Cool
# Act
content_settings = ContentSettings(
content_type='image/png',
content_language='spanish')
with open(FILE_PATH, 'rb') as stream:
await blob.upload_blob(stream, content_settings=content_settings, max_concurrency=2,
standard_blob_tier=blob_tier)
properties = await blob.get_blob_properties()
# Assert
self.assertEqual(properties.blob_tier, blob_tier)
self._teardown(FILE_PATH)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_text(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
text = u'hello 啊齄丂狛狜 world'
data = text.encode('utf-8')
# Act
create_resp = await blob.upload_blob(text)
props = await blob.get_blob_properties()
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assertEqual(props.etag, create_resp.get('etag'))
self.assertEqual(props.last_modified, create_resp.get('last_modified'))
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_text_with_encoding(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
text = u'hello 啊齄丂狛狜 world'
data = text.encode('utf-16')
# Act
await blob.upload_blob(text, encoding='utf-16')
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_text_with_encoding_and_progress(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
text = u'hello 啊齄丂狛狜 world'
data = text.encode('utf-16')
# Act
progress = []
def callback(response):
current = response.context['upload_stream_current']
total = response.context['data_stream_total']
if current is not None:
progress.append((current, total))
await blob.upload_blob(text, encoding='utf-16', raw_response_hook=callback)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, data)
self.assert_upload_progress(len(data), self.config.max_block_size, progress)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_from_text_chunked_upload(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
await self._setup(storage_account, storage_account_key)
# Arrange
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_text_data(LARGE_BLOB_SIZE)
encoded_data = data.encode('utf-8')
# Act
await blob.upload_blob(data)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, encoded_data)
# Assert
await self.assertBlobEqual(self.container_name, blob_name, encoded_data)
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_with_md5(self, resource_group, location, storage_account, storage_account_key):
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = b'hello world'
# Act
await blob.upload_blob(data, validate_content=True)
# Assert
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
@AsyncStorageTestCase.await_prepared_test
async def test_create_blob_with_md5_chunked(self, resource_group, location, storage_account, storage_account_key):
# parallel tests introduce random order of requests, can only run live
# Arrange
await self._setup(storage_account, storage_account_key)
blob_name = self._get_blob_reference()
blob = self.bsc.get_blob_client(self.container_name, blob_name)
data = self.get_random_bytes(LARGE_BLOB_SIZE)
# Act
await blob.upload_blob(data, validate_content=True)
# Assert
#------------------------------------------------------------------------------
| 46.900956 | 149 | 0.697086 | 8,216 | 68,663 | 5.469693 | 0.049659 | 0.080064 | 0.046908 | 0.070407 | 0.87917 | 0.851977 | 0.827922 | 0.800975 | 0.784797 | 0.776275 | 0 | 0.006992 | 0.210594 | 68,663 | 1,463 | 150 | 46.933014 | 0.822058 | 0.047376 | 0 | 0.660612 | 0 | 0 | 0.042323 | 0.014859 | 0 | 0 | 0 | 0 | 0.14914 | 1 | 0.010516 | false | 0.001912 | 0.01434 | 0.003824 | 0.034417 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
49956d18cae29bfa5f7debe70156d0c229f829b5 | 179,306 | py | Python | cottonformation/res/emr.py | gitter-badger/cottonformation-project | 354f1dce7ea106e209af2d5d818b6033a27c193c | [
"BSD-2-Clause"
] | null | null | null | cottonformation/res/emr.py | gitter-badger/cottonformation-project | 354f1dce7ea106e209af2d5d818b6033a27c193c | [
"BSD-2-Clause"
] | null | null | null | cottonformation/res/emr.py | gitter-badger/cottonformation-project | 354f1dce7ea106e209af2d5d818b6033a27c193c | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
This module
"""
import attr
import typing
from ..core.model import (
Property, Resource, Tag, GetAtt, TypeHint, TypeCheck,
)
from ..core.constant import AttrMeta
#--- Property declaration ---
@attr.s
class ClusterComputeLimits(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ComputeLimits"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html
Property Document:
- ``rp_MaximumCapacityUnits``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-maximumcapacityunits
- ``rp_MinimumCapacityUnits``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-minimumcapacityunits
- ``rp_UnitType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-unittype
- ``p_MaximumCoreCapacityUnits``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-maximumcorecapacityunits
- ``p_MaximumOnDemandCapacityUnits``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-maximumondemandcapacityunits
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ComputeLimits"
rp_MaximumCapacityUnits: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "MaximumCapacityUnits"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-maximumcapacityunits"""
rp_MinimumCapacityUnits: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "MinimumCapacityUnits"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-minimumcapacityunits"""
rp_UnitType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "UnitType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-unittype"""
p_MaximumCoreCapacityUnits: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "MaximumCoreCapacityUnits"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-maximumcorecapacityunits"""
p_MaximumOnDemandCapacityUnits: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "MaximumOnDemandCapacityUnits"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-computelimits.html#cfn-elasticmapreduce-cluster-computelimits-maximumondemandcapacityunits"""
@attr.s
class ClusterSpotProvisioningSpecification(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.SpotProvisioningSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html
Property Document:
- ``rp_TimeoutAction``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-timeoutaction
- ``rp_TimeoutDurationMinutes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-timeoutdurationminutes
- ``p_AllocationStrategy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-allocationstrategy
- ``p_BlockDurationMinutes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-blockdurationminutes
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.SpotProvisioningSpecification"
rp_TimeoutAction: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "TimeoutAction"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-timeoutaction"""
rp_TimeoutDurationMinutes: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "TimeoutDurationMinutes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-timeoutdurationminutes"""
p_AllocationStrategy: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "AllocationStrategy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-allocationstrategy"""
p_BlockDurationMinutes: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "BlockDurationMinutes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-spotprovisioningspecification.html#cfn-elasticmapreduce-cluster-spotprovisioningspecification-blockdurationminutes"""
@attr.s
class ClusterManagedScalingPolicy(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ManagedScalingPolicy"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-managedscalingpolicy.html
Property Document:
- ``p_ComputeLimits``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-managedscalingpolicy.html#cfn-elasticmapreduce-cluster-managedscalingpolicy-computelimits
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ManagedScalingPolicy"
p_ComputeLimits: typing.Union['ClusterComputeLimits', dict] = attr.ib(
default=None,
converter=ClusterComputeLimits.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterComputeLimits)),
metadata={AttrMeta.PROPERTY_NAME: "ComputeLimits"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-managedscalingpolicy.html#cfn-elasticmapreduce-cluster-managedscalingpolicy-computelimits"""
@attr.s
class ClusterKeyValue(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.KeyValue"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-keyvalue.html
Property Document:
- ``p_Key``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-keyvalue.html#cfn-elasticmapreduce-cluster-keyvalue-key
- ``p_Value``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-keyvalue.html#cfn-elasticmapreduce-cluster-keyvalue-value
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.KeyValue"
p_Key: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Key"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-keyvalue.html#cfn-elasticmapreduce-cluster-keyvalue-key"""
p_Value: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Value"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-keyvalue.html#cfn-elasticmapreduce-cluster-keyvalue-value"""
@attr.s
class ClusterVolumeSpecification(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.VolumeSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html
Property Document:
- ``rp_SizeInGB``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html#cfn-elasticmapreduce-cluster-volumespecification-sizeingb
- ``rp_VolumeType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html#cfn-elasticmapreduce-cluster-volumespecification-volumetype
- ``p_Iops``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html#cfn-elasticmapreduce-cluster-volumespecification-iops
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.VolumeSpecification"
rp_SizeInGB: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "SizeInGB"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html#cfn-elasticmapreduce-cluster-volumespecification-sizeingb"""
rp_VolumeType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "VolumeType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html#cfn-elasticmapreduce-cluster-volumespecification-volumetype"""
p_Iops: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "Iops"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-volumespecification.html#cfn-elasticmapreduce-cluster-volumespecification-iops"""
@attr.s
class InstanceGroupConfigConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.Configuration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html
Property Document:
- ``p_Classification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html#cfn-emr-cluster-configuration-classification
- ``p_ConfigurationProperties``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html#cfn-emr-cluster-configuration-configurationproperties
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html#cfn-emr-cluster-configuration-configurations
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.Configuration"
p_Classification: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Classification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html#cfn-emr-cluster-configuration-classification"""
p_ConfigurationProperties: typing.Dict[str, TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_mapping(key_validator=attr.validators.instance_of(str), value_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type))),
metadata={AttrMeta.PROPERTY_NAME: "ConfigurationProperties"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html#cfn-emr-cluster-configuration-configurationproperties"""
p_Configurations: typing.List[typing.Union['InstanceGroupConfigConfiguration', dict]] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-cluster-configuration.html#cfn-emr-cluster-configuration-configurations"""
@attr.s
class InstanceGroupConfigMetricDimension(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.MetricDimension"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-metricdimension.html
Property Document:
- ``rp_Key``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-metricdimension.html#cfn-elasticmapreduce-instancegroupconfig-metricdimension-key
- ``rp_Value``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-metricdimension.html#cfn-elasticmapreduce-instancegroupconfig-metricdimension-value
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.MetricDimension"
rp_Key: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Key"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-metricdimension.html#cfn-elasticmapreduce-instancegroupconfig-metricdimension-key"""
rp_Value: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Value"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-metricdimension.html#cfn-elasticmapreduce-instancegroupconfig-metricdimension-value"""
@attr.s
class InstanceFleetConfigConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.Configuration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html
Property Document:
- ``p_Classification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html#cfn-elasticmapreduce-instancefleetconfig-configuration-classification
- ``p_ConfigurationProperties``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html#cfn-elasticmapreduce-instancefleetconfig-configuration-configurationproperties
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html#cfn-elasticmapreduce-instancefleetconfig-configuration-configurations
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.Configuration"
p_Classification: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Classification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html#cfn-elasticmapreduce-instancefleetconfig-configuration-classification"""
p_ConfigurationProperties: typing.Dict[str, TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_mapping(key_validator=attr.validators.instance_of(str), value_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type))),
metadata={AttrMeta.PROPERTY_NAME: "ConfigurationProperties"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html#cfn-elasticmapreduce-instancefleetconfig-configuration-configurationproperties"""
p_Configurations: typing.List[typing.Union['InstanceFleetConfigConfiguration', dict]] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-configuration.html#cfn-elasticmapreduce-instancefleetconfig-configuration-configurations"""
@attr.s
class ClusterKerberosAttributes(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.KerberosAttributes"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html
Property Document:
- ``rp_KdcAdminPassword``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-kdcadminpassword
- ``rp_Realm``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-realm
- ``p_ADDomainJoinPassword``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-addomainjoinpassword
- ``p_ADDomainJoinUser``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-addomainjoinuser
- ``p_CrossRealmTrustPrincipalPassword``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-crossrealmtrustprincipalpassword
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.KerberosAttributes"
rp_KdcAdminPassword: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "KdcAdminPassword"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-kdcadminpassword"""
rp_Realm: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Realm"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-realm"""
p_ADDomainJoinPassword: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "ADDomainJoinPassword"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-addomainjoinpassword"""
p_ADDomainJoinUser: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "ADDomainJoinUser"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-addomainjoinuser"""
p_CrossRealmTrustPrincipalPassword: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "CrossRealmTrustPrincipalPassword"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-kerberosattributes.html#cfn-elasticmapreduce-cluster-kerberosattributes-crossrealmtrustprincipalpassword"""
@attr.s
class InstanceGroupConfigSimpleScalingPolicyConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.SimpleScalingPolicyConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html
Property Document:
- ``rp_ScalingAdjustment``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration-scalingadjustment
- ``p_AdjustmentType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration-adjustmenttype
- ``p_CoolDown``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration-cooldown
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.SimpleScalingPolicyConfiguration"
rp_ScalingAdjustment: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "ScalingAdjustment"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration-scalingadjustment"""
p_AdjustmentType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "AdjustmentType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration-adjustmenttype"""
p_CoolDown: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "CoolDown"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-instancegroupconfig-simplescalingpolicyconfiguration-cooldown"""
@attr.s
class ClusterApplication(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.Application"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html
Property Document:
- ``p_AdditionalInfo``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-additionalinfo
- ``p_Args``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-args
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-name
- ``p_Version``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-version
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.Application"
p_AdditionalInfo: typing.Dict[str, TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_mapping(key_validator=attr.validators.instance_of(str), value_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type))),
metadata={AttrMeta.PROPERTY_NAME: "AdditionalInfo"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-additionalinfo"""
p_Args: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Args"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-args"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-name"""
p_Version: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Version"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-application.html#cfn-elasticmapreduce-cluster-application-version"""
@attr.s
class ClusterConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.Configuration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html
Property Document:
- ``p_Classification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html#cfn-elasticmapreduce-cluster-configuration-classification
- ``p_ConfigurationProperties``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html#cfn-elasticmapreduce-cluster-configuration-configurationproperties
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html#cfn-elasticmapreduce-cluster-configuration-configurations
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.Configuration"
p_Classification: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Classification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html#cfn-elasticmapreduce-cluster-configuration-classification"""
p_ConfigurationProperties: typing.Dict[str, TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_mapping(key_validator=attr.validators.instance_of(str), value_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type))),
metadata={AttrMeta.PROPERTY_NAME: "ConfigurationProperties"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html#cfn-elasticmapreduce-cluster-configuration-configurationproperties"""
p_Configurations: typing.List[typing.Union['ClusterConfiguration', dict]] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-configuration.html#cfn-elasticmapreduce-cluster-configuration-configurations"""
@attr.s
class ClusterScriptBootstrapActionConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ScriptBootstrapActionConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scriptbootstrapactionconfig.html
Property Document:
- ``rp_Path``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scriptbootstrapactionconfig.html#cfn-elasticmapreduce-cluster-scriptbootstrapactionconfig-path
- ``p_Args``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scriptbootstrapactionconfig.html#cfn-elasticmapreduce-cluster-scriptbootstrapactionconfig-args
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ScriptBootstrapActionConfig"
rp_Path: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Path"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scriptbootstrapactionconfig.html#cfn-elasticmapreduce-cluster-scriptbootstrapactionconfig-path"""
p_Args: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Args"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scriptbootstrapactionconfig.html#cfn-elasticmapreduce-cluster-scriptbootstrapactionconfig-args"""
@attr.s
class InstanceGroupConfigCloudWatchAlarmDefinition(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.CloudWatchAlarmDefinition"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html
Property Document:
- ``rp_ComparisonOperator``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-comparisonoperator
- ``rp_MetricName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-metricname
- ``rp_Period``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-period
- ``rp_Threshold``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-threshold
- ``p_Dimensions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-dimensions
- ``p_EvaluationPeriods``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-evaluationperiods
- ``p_Namespace``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-namespace
- ``p_Statistic``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-statistic
- ``p_Unit``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-unit
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.CloudWatchAlarmDefinition"
rp_ComparisonOperator: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ComparisonOperator"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-comparisonoperator"""
rp_MetricName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "MetricName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-metricname"""
rp_Period: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "Period"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-period"""
rp_Threshold: float = attr.ib(
default=None,
validator=attr.validators.instance_of(float),
metadata={AttrMeta.PROPERTY_NAME: "Threshold"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-threshold"""
p_Dimensions: typing.List[typing.Union['InstanceGroupConfigMetricDimension', dict]] = attr.ib(
default=None,
converter=InstanceGroupConfigMetricDimension.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceGroupConfigMetricDimension), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Dimensions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-dimensions"""
p_EvaluationPeriods: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "EvaluationPeriods"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-evaluationperiods"""
p_Namespace: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Namespace"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-namespace"""
p_Statistic: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Statistic"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-statistic"""
p_Unit: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Unit"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-instancegroupconfig-cloudwatchalarmdefinition-unit"""
@attr.s
class ClusterMetricDimension(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.MetricDimension"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-metricdimension.html
Property Document:
- ``rp_Key``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-metricdimension.html#cfn-elasticmapreduce-cluster-metricdimension-key
- ``rp_Value``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-metricdimension.html#cfn-elasticmapreduce-cluster-metricdimension-value
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.MetricDimension"
rp_Key: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Key"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-metricdimension.html#cfn-elasticmapreduce-cluster-metricdimension-key"""
rp_Value: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Value"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-metricdimension.html#cfn-elasticmapreduce-cluster-metricdimension-value"""
@attr.s
class ClusterOnDemandProvisioningSpecification(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.OnDemandProvisioningSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ondemandprovisioningspecification.html
Property Document:
- ``rp_AllocationStrategy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ondemandprovisioningspecification.html#cfn-elasticmapreduce-cluster-ondemandprovisioningspecification-allocationstrategy
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.OnDemandProvisioningSpecification"
rp_AllocationStrategy: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "AllocationStrategy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ondemandprovisioningspecification.html#cfn-elasticmapreduce-cluster-ondemandprovisioningspecification-allocationstrategy"""
@attr.s
class InstanceFleetConfigSpotProvisioningSpecification(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.SpotProvisioningSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html
Property Document:
- ``rp_TimeoutAction``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-timeoutaction
- ``rp_TimeoutDurationMinutes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-timeoutdurationminutes
- ``p_AllocationStrategy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-allocationstrategy
- ``p_BlockDurationMinutes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-blockdurationminutes
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.SpotProvisioningSpecification"
rp_TimeoutAction: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "TimeoutAction"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-timeoutaction"""
rp_TimeoutDurationMinutes: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "TimeoutDurationMinutes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-timeoutdurationminutes"""
p_AllocationStrategy: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "AllocationStrategy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-allocationstrategy"""
p_BlockDurationMinutes: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "BlockDurationMinutes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-spotprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-spotprovisioningspecification-blockdurationminutes"""
@attr.s
class StepKeyValue(Property):
"""
AWS Object Type = "AWS::EMR::Step.KeyValue"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-keyvalue.html
Property Document:
- ``p_Key``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-keyvalue.html#cfn-elasticmapreduce-step-keyvalue-key
- ``p_Value``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-keyvalue.html#cfn-elasticmapreduce-step-keyvalue-value
"""
AWS_OBJECT_TYPE = "AWS::EMR::Step.KeyValue"
p_Key: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Key"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-keyvalue.html#cfn-elasticmapreduce-step-keyvalue-key"""
p_Value: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Value"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-keyvalue.html#cfn-elasticmapreduce-step-keyvalue-value"""
@attr.s
class InstanceGroupConfigScalingAction(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.ScalingAction"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingaction.html
Property Document:
- ``rp_SimpleScalingPolicyConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingaction.html#cfn-elasticmapreduce-instancegroupconfig-scalingaction-simplescalingpolicyconfiguration
- ``p_Market``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingaction.html#cfn-elasticmapreduce-instancegroupconfig-scalingaction-market
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.ScalingAction"
rp_SimpleScalingPolicyConfiguration: typing.Union['InstanceGroupConfigSimpleScalingPolicyConfiguration', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigSimpleScalingPolicyConfiguration.from_dict,
validator=attr.validators.instance_of(InstanceGroupConfigSimpleScalingPolicyConfiguration),
metadata={AttrMeta.PROPERTY_NAME: "SimpleScalingPolicyConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingaction.html#cfn-elasticmapreduce-instancegroupconfig-scalingaction-simplescalingpolicyconfiguration"""
p_Market: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Market"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingaction.html#cfn-elasticmapreduce-instancegroupconfig-scalingaction-market"""
@attr.s
class InstanceGroupConfigScalingTrigger(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.ScalingTrigger"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingtrigger.html
Property Document:
- ``rp_CloudWatchAlarmDefinition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingtrigger.html#cfn-elasticmapreduce-instancegroupconfig-scalingtrigger-cloudwatchalarmdefinition
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.ScalingTrigger"
rp_CloudWatchAlarmDefinition: typing.Union['InstanceGroupConfigCloudWatchAlarmDefinition', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigCloudWatchAlarmDefinition.from_dict,
validator=attr.validators.instance_of(InstanceGroupConfigCloudWatchAlarmDefinition),
metadata={AttrMeta.PROPERTY_NAME: "CloudWatchAlarmDefinition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingtrigger.html#cfn-elasticmapreduce-instancegroupconfig-scalingtrigger-cloudwatchalarmdefinition"""
@attr.s
class InstanceFleetConfigOnDemandProvisioningSpecification(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.OnDemandProvisioningSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ondemandprovisioningspecification.html
Property Document:
- ``rp_AllocationStrategy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ondemandprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-ondemandprovisioningspecification-allocationstrategy
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.OnDemandProvisioningSpecification"
rp_AllocationStrategy: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "AllocationStrategy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ondemandprovisioningspecification.html#cfn-elasticmapreduce-instancefleetconfig-ondemandprovisioningspecification-allocationstrategy"""
@attr.s
class InstanceGroupConfigVolumeSpecification(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.VolumeSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html
Property Document:
- ``rp_SizeInGB``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification-sizeingb
- ``rp_VolumeType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification-volumetype
- ``p_Iops``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification-iops
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.VolumeSpecification"
rp_SizeInGB: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "SizeInGB"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification-sizeingb"""
rp_VolumeType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "VolumeType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification-volumetype"""
p_Iops: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "Iops"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification-iops"""
@attr.s
class InstanceFleetConfigVolumeSpecification(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.VolumeSpecification"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html
Property Document:
- ``rp_SizeInGB``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html#cfn-elasticmapreduce-instancefleetconfig-volumespecification-sizeingb
- ``rp_VolumeType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html#cfn-elasticmapreduce-instancefleetconfig-volumespecification-volumetype
- ``p_Iops``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html#cfn-elasticmapreduce-instancefleetconfig-volumespecification-iops
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.VolumeSpecification"
rp_SizeInGB: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "SizeInGB"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html#cfn-elasticmapreduce-instancefleetconfig-volumespecification-sizeingb"""
rp_VolumeType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "VolumeType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html#cfn-elasticmapreduce-instancefleetconfig-volumespecification-volumetype"""
p_Iops: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "Iops"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-volumespecification.html#cfn-elasticmapreduce-instancefleetconfig-volumespecification-iops"""
@attr.s
class ClusterScalingConstraints(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ScalingConstraints"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingconstraints.html
Property Document:
- ``rp_MaxCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingconstraints.html#cfn-elasticmapreduce-cluster-scalingconstraints-maxcapacity
- ``rp_MinCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingconstraints.html#cfn-elasticmapreduce-cluster-scalingconstraints-mincapacity
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ScalingConstraints"
rp_MaxCapacity: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "MaxCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingconstraints.html#cfn-elasticmapreduce-cluster-scalingconstraints-maxcapacity"""
rp_MinCapacity: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "MinCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingconstraints.html#cfn-elasticmapreduce-cluster-scalingconstraints-mincapacity"""
@attr.s
class ClusterSimpleScalingPolicyConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.SimpleScalingPolicyConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html
Property Document:
- ``rp_ScalingAdjustment``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-cluster-simplescalingpolicyconfiguration-scalingadjustment
- ``p_AdjustmentType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-cluster-simplescalingpolicyconfiguration-adjustmenttype
- ``p_CoolDown``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-cluster-simplescalingpolicyconfiguration-cooldown
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.SimpleScalingPolicyConfiguration"
rp_ScalingAdjustment: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "ScalingAdjustment"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-cluster-simplescalingpolicyconfiguration-scalingadjustment"""
p_AdjustmentType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "AdjustmentType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-cluster-simplescalingpolicyconfiguration-adjustmenttype"""
p_CoolDown: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "CoolDown"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-simplescalingpolicyconfiguration.html#cfn-elasticmapreduce-cluster-simplescalingpolicyconfiguration-cooldown"""
@attr.s
class ClusterPlacementType(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.PlacementType"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-placementtype.html
Property Document:
- ``rp_AvailabilityZone``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-placementtype.html#cfn-elasticmapreduce-cluster-placementtype-availabilityzone
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.PlacementType"
rp_AvailabilityZone: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "AvailabilityZone"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-placementtype.html#cfn-elasticmapreduce-cluster-placementtype-availabilityzone"""
@attr.s
class InstanceGroupConfigScalingConstraints(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.ScalingConstraints"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingconstraints.html
Property Document:
- ``rp_MaxCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingconstraints.html#cfn-elasticmapreduce-instancegroupconfig-scalingconstraints-maxcapacity
- ``rp_MinCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingconstraints.html#cfn-elasticmapreduce-instancegroupconfig-scalingconstraints-mincapacity
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.ScalingConstraints"
rp_MaxCapacity: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "MaxCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingconstraints.html#cfn-elasticmapreduce-instancegroupconfig-scalingconstraints-maxcapacity"""
rp_MinCapacity: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "MinCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingconstraints.html#cfn-elasticmapreduce-instancegroupconfig-scalingconstraints-mincapacity"""
@attr.s
class InstanceFleetConfigInstanceFleetProvisioningSpecifications(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.InstanceFleetProvisioningSpecifications"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications.html
Property Document:
- ``p_OnDemandSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications-ondemandspecification
- ``p_SpotSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications-spotspecification
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.InstanceFleetProvisioningSpecifications"
p_OnDemandSpecification: typing.Union['InstanceFleetConfigOnDemandProvisioningSpecification', dict] = attr.ib(
default=None,
converter=InstanceFleetConfigOnDemandProvisioningSpecification.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(InstanceFleetConfigOnDemandProvisioningSpecification)),
metadata={AttrMeta.PROPERTY_NAME: "OnDemandSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications-ondemandspecification"""
p_SpotSpecification: typing.Union['InstanceFleetConfigSpotProvisioningSpecification', dict] = attr.ib(
default=None,
converter=InstanceFleetConfigSpotProvisioningSpecification.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(InstanceFleetConfigSpotProvisioningSpecification)),
metadata={AttrMeta.PROPERTY_NAME: "SpotSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-instancefleetconfig-instancefleetprovisioningspecifications-spotspecification"""
@attr.s
class InstanceFleetConfigEbsBlockDeviceConfig(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.EbsBlockDeviceConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig.html
Property Document:
- ``rp_VolumeSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig.html#cfn-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig-volumespecification
- ``p_VolumesPerInstance``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig.html#cfn-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig-volumesperinstance
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.EbsBlockDeviceConfig"
rp_VolumeSpecification: typing.Union['InstanceFleetConfigVolumeSpecification', dict] = attr.ib(
default=None,
converter=InstanceFleetConfigVolumeSpecification.from_dict,
validator=attr.validators.instance_of(InstanceFleetConfigVolumeSpecification),
metadata={AttrMeta.PROPERTY_NAME: "VolumeSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig.html#cfn-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig-volumespecification"""
p_VolumesPerInstance: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "VolumesPerInstance"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig.html#cfn-elasticmapreduce-instancefleetconfig-ebsblockdeviceconfig-volumesperinstance"""
@attr.s
class ClusterHadoopJarStepConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.HadoopJarStepConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html
Property Document:
- ``rp_Jar``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-jar
- ``p_Args``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-args
- ``p_MainClass``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-mainclass
- ``p_StepProperties``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-stepproperties
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.HadoopJarStepConfig"
rp_Jar: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Jar"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-jar"""
p_Args: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Args"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-args"""
p_MainClass: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "MainClass"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-mainclass"""
p_StepProperties: typing.List[typing.Union['ClusterKeyValue', dict]] = attr.ib(
default=None,
converter=ClusterKeyValue.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterKeyValue), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StepProperties"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-hadoopjarstepconfig.html#cfn-elasticmapreduce-cluster-hadoopjarstepconfig-stepproperties"""
@attr.s
class StepHadoopJarStepConfig(Property):
"""
AWS Object Type = "AWS::EMR::Step.HadoopJarStepConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html
Property Document:
- ``rp_Jar``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-jar
- ``p_Args``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-args
- ``p_MainClass``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-mainclass
- ``p_StepProperties``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-stepproperties
"""
AWS_OBJECT_TYPE = "AWS::EMR::Step.HadoopJarStepConfig"
rp_Jar: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Jar"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-jar"""
p_Args: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Args"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-args"""
p_MainClass: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "MainClass"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-mainclass"""
p_StepProperties: typing.List[typing.Union['StepKeyValue', dict]] = attr.ib(
default=None,
converter=StepKeyValue.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(StepKeyValue), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StepProperties"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-step-hadoopjarstepconfig.html#cfn-elasticmapreduce-step-hadoopjarstepconfig-stepproperties"""
@attr.s
class ClusterBootstrapActionConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.BootstrapActionConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-bootstrapactionconfig.html
Property Document:
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-bootstrapactionconfig.html#cfn-elasticmapreduce-cluster-bootstrapactionconfig-name
- ``rp_ScriptBootstrapAction``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-bootstrapactionconfig.html#cfn-elasticmapreduce-cluster-bootstrapactionconfig-scriptbootstrapaction
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.BootstrapActionConfig"
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-bootstrapactionconfig.html#cfn-elasticmapreduce-cluster-bootstrapactionconfig-name"""
rp_ScriptBootstrapAction: typing.Union['ClusterScriptBootstrapActionConfig', dict] = attr.ib(
default=None,
converter=ClusterScriptBootstrapActionConfig.from_dict,
validator=attr.validators.instance_of(ClusterScriptBootstrapActionConfig),
metadata={AttrMeta.PROPERTY_NAME: "ScriptBootstrapAction"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-bootstrapactionconfig.html#cfn-elasticmapreduce-cluster-bootstrapactionconfig-scriptbootstrapaction"""
@attr.s
class ClusterStepConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.StepConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html
Property Document:
- ``rp_HadoopJarStep``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html#cfn-elasticmapreduce-cluster-stepconfig-hadoopjarstep
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html#cfn-elasticmapreduce-cluster-stepconfig-name
- ``p_ActionOnFailure``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html#cfn-elasticmapreduce-cluster-stepconfig-actiononfailure
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.StepConfig"
rp_HadoopJarStep: typing.Union['ClusterHadoopJarStepConfig', dict] = attr.ib(
default=None,
converter=ClusterHadoopJarStepConfig.from_dict,
validator=attr.validators.instance_of(ClusterHadoopJarStepConfig),
metadata={AttrMeta.PROPERTY_NAME: "HadoopJarStep"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html#cfn-elasticmapreduce-cluster-stepconfig-hadoopjarstep"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html#cfn-elasticmapreduce-cluster-stepconfig-name"""
p_ActionOnFailure: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "ActionOnFailure"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-stepconfig.html#cfn-elasticmapreduce-cluster-stepconfig-actiononfailure"""
@attr.s
class ClusterEbsBlockDeviceConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.EbsBlockDeviceConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsblockdeviceconfig.html
Property Document:
- ``rp_VolumeSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsblockdeviceconfig.html#cfn-elasticmapreduce-cluster-ebsblockdeviceconfig-volumespecification
- ``p_VolumesPerInstance``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsblockdeviceconfig.html#cfn-elasticmapreduce-cluster-ebsblockdeviceconfig-volumesperinstance
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.EbsBlockDeviceConfig"
rp_VolumeSpecification: typing.Union['ClusterVolumeSpecification', dict] = attr.ib(
default=None,
converter=ClusterVolumeSpecification.from_dict,
validator=attr.validators.instance_of(ClusterVolumeSpecification),
metadata={AttrMeta.PROPERTY_NAME: "VolumeSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsblockdeviceconfig.html#cfn-elasticmapreduce-cluster-ebsblockdeviceconfig-volumespecification"""
p_VolumesPerInstance: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "VolumesPerInstance"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsblockdeviceconfig.html#cfn-elasticmapreduce-cluster-ebsblockdeviceconfig-volumesperinstance"""
@attr.s
class ClusterCloudWatchAlarmDefinition(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.CloudWatchAlarmDefinition"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html
Property Document:
- ``rp_ComparisonOperator``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-comparisonoperator
- ``rp_MetricName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-metricname
- ``rp_Period``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-period
- ``rp_Threshold``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-threshold
- ``p_Dimensions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-dimensions
- ``p_EvaluationPeriods``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-evaluationperiods
- ``p_Namespace``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-namespace
- ``p_Statistic``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-statistic
- ``p_Unit``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-unit
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.CloudWatchAlarmDefinition"
rp_ComparisonOperator: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ComparisonOperator"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-comparisonoperator"""
rp_MetricName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "MetricName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-metricname"""
rp_Period: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "Period"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-period"""
rp_Threshold: float = attr.ib(
default=None,
validator=attr.validators.instance_of(float),
metadata={AttrMeta.PROPERTY_NAME: "Threshold"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-threshold"""
p_Dimensions: typing.List[typing.Union['ClusterMetricDimension', dict]] = attr.ib(
default=None,
converter=ClusterMetricDimension.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterMetricDimension), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Dimensions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-dimensions"""
p_EvaluationPeriods: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "EvaluationPeriods"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-evaluationperiods"""
p_Namespace: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Namespace"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-namespace"""
p_Statistic: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Statistic"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-statistic"""
p_Unit: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Unit"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-cloudwatchalarmdefinition.html#cfn-elasticmapreduce-cluster-cloudwatchalarmdefinition-unit"""
@attr.s
class ClusterInstanceFleetProvisioningSpecifications(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.InstanceFleetProvisioningSpecifications"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetprovisioningspecifications.html
Property Document:
- ``p_OnDemandSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-cluster-instancefleetprovisioningspecifications-ondemandspecification
- ``p_SpotSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-cluster-instancefleetprovisioningspecifications-spotspecification
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.InstanceFleetProvisioningSpecifications"
p_OnDemandSpecification: typing.Union['ClusterOnDemandProvisioningSpecification', dict] = attr.ib(
default=None,
converter=ClusterOnDemandProvisioningSpecification.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterOnDemandProvisioningSpecification)),
metadata={AttrMeta.PROPERTY_NAME: "OnDemandSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-cluster-instancefleetprovisioningspecifications-ondemandspecification"""
p_SpotSpecification: typing.Union['ClusterSpotProvisioningSpecification', dict] = attr.ib(
default=None,
converter=ClusterSpotProvisioningSpecification.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterSpotProvisioningSpecification)),
metadata={AttrMeta.PROPERTY_NAME: "SpotSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetprovisioningspecifications.html#cfn-elasticmapreduce-cluster-instancefleetprovisioningspecifications-spotspecification"""
@attr.s
class InstanceGroupConfigScalingRule(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.ScalingRule"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html
Property Document:
- ``rp_Action``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-action
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-name
- ``rp_Trigger``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-trigger
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-description
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.ScalingRule"
rp_Action: typing.Union['InstanceGroupConfigScalingAction', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigScalingAction.from_dict,
validator=attr.validators.instance_of(InstanceGroupConfigScalingAction),
metadata={AttrMeta.PROPERTY_NAME: "Action"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-action"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-name"""
rp_Trigger: typing.Union['InstanceGroupConfigScalingTrigger', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigScalingTrigger.from_dict,
validator=attr.validators.instance_of(InstanceGroupConfigScalingTrigger),
metadata={AttrMeta.PROPERTY_NAME: "Trigger"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-trigger"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-scalingrule.html#cfn-elasticmapreduce-instancegroupconfig-scalingrule-description"""
@attr.s
class ClusterEbsConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.EbsConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsconfiguration.html
Property Document:
- ``p_EbsBlockDeviceConfigs``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsconfiguration.html#cfn-elasticmapreduce-cluster-ebsconfiguration-ebsblockdeviceconfigs
- ``p_EbsOptimized``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsconfiguration.html#cfn-elasticmapreduce-cluster-ebsconfiguration-ebsoptimized
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.EbsConfiguration"
p_EbsBlockDeviceConfigs: typing.List[typing.Union['ClusterEbsBlockDeviceConfig', dict]] = attr.ib(
default=None,
converter=ClusterEbsBlockDeviceConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterEbsBlockDeviceConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "EbsBlockDeviceConfigs"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsconfiguration.html#cfn-elasticmapreduce-cluster-ebsconfiguration-ebsblockdeviceconfigs"""
p_EbsOptimized: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "EbsOptimized"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-ebsconfiguration.html#cfn-elasticmapreduce-cluster-ebsconfiguration-ebsoptimized"""
@attr.s
class ClusterInstanceTypeConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.InstanceTypeConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html
Property Document:
- ``rp_InstanceType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-instancetype
- ``p_BidPrice``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-bidprice
- ``p_BidPriceAsPercentageOfOnDemandPrice``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-bidpriceaspercentageofondemandprice
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-configurations
- ``p_EbsConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-ebsconfiguration
- ``p_WeightedCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-weightedcapacity
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.InstanceTypeConfig"
rp_InstanceType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "InstanceType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-instancetype"""
p_BidPrice: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "BidPrice"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-bidprice"""
p_BidPriceAsPercentageOfOnDemandPrice: float = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(float)),
metadata={AttrMeta.PROPERTY_NAME: "BidPriceAsPercentageOfOnDemandPrice"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-bidpriceaspercentageofondemandprice"""
p_Configurations: typing.List[typing.Union['ClusterConfiguration', dict]] = attr.ib(
default=None,
converter=ClusterConfiguration.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterConfiguration), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-configurations"""
p_EbsConfiguration: typing.Union['ClusterEbsConfiguration', dict] = attr.ib(
default=None,
converter=ClusterEbsConfiguration.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterEbsConfiguration)),
metadata={AttrMeta.PROPERTY_NAME: "EbsConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-ebsconfiguration"""
p_WeightedCapacity: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "WeightedCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancetypeconfig.html#cfn-elasticmapreduce-cluster-instancetypeconfig-weightedcapacity"""
@attr.s
class ClusterScalingTrigger(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ScalingTrigger"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingtrigger.html
Property Document:
- ``rp_CloudWatchAlarmDefinition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingtrigger.html#cfn-elasticmapreduce-cluster-scalingtrigger-cloudwatchalarmdefinition
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ScalingTrigger"
rp_CloudWatchAlarmDefinition: typing.Union['ClusterCloudWatchAlarmDefinition', dict] = attr.ib(
default=None,
converter=ClusterCloudWatchAlarmDefinition.from_dict,
validator=attr.validators.instance_of(ClusterCloudWatchAlarmDefinition),
metadata={AttrMeta.PROPERTY_NAME: "CloudWatchAlarmDefinition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingtrigger.html#cfn-elasticmapreduce-cluster-scalingtrigger-cloudwatchalarmdefinition"""
@attr.s
class InstanceGroupConfigEbsBlockDeviceConfig(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.EbsBlockDeviceConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig.html
Property Document:
- ``rp_VolumeSpecification``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification
- ``p_VolumesPerInstance``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumesperinstance
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.EbsBlockDeviceConfig"
rp_VolumeSpecification: typing.Union['InstanceGroupConfigVolumeSpecification', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigVolumeSpecification.from_dict,
validator=attr.validators.instance_of(InstanceGroupConfigVolumeSpecification),
metadata={AttrMeta.PROPERTY_NAME: "VolumeSpecification"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumespecification"""
p_VolumesPerInstance: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "VolumesPerInstance"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration-ebsblockdeviceconfig.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfig-volumesperinstance"""
@attr.s
class ClusterInstanceFleetConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.InstanceFleetConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html
Property Document:
- ``p_InstanceTypeConfigs``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-instancetypeconfigs
- ``p_LaunchSpecifications``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-launchspecifications
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-name
- ``p_TargetOnDemandCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-targetondemandcapacity
- ``p_TargetSpotCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-targetspotcapacity
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.InstanceFleetConfig"
p_InstanceTypeConfigs: typing.List[typing.Union['ClusterInstanceTypeConfig', dict]] = attr.ib(
default=None,
converter=ClusterInstanceTypeConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterInstanceTypeConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "InstanceTypeConfigs"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-instancetypeconfigs"""
p_LaunchSpecifications: typing.Union['ClusterInstanceFleetProvisioningSpecifications', dict] = attr.ib(
default=None,
converter=ClusterInstanceFleetProvisioningSpecifications.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterInstanceFleetProvisioningSpecifications)),
metadata={AttrMeta.PROPERTY_NAME: "LaunchSpecifications"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-launchspecifications"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-name"""
p_TargetOnDemandCapacity: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "TargetOnDemandCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-targetondemandcapacity"""
p_TargetSpotCapacity: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "TargetSpotCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancefleetconfig.html#cfn-elasticmapreduce-cluster-instancefleetconfig-targetspotcapacity"""
@attr.s
class InstanceFleetConfigEbsConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.EbsConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsconfiguration.html
Property Document:
- ``p_EbsBlockDeviceConfigs``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsconfiguration.html#cfn-elasticmapreduce-instancefleetconfig-ebsconfiguration-ebsblockdeviceconfigs
- ``p_EbsOptimized``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsconfiguration.html#cfn-elasticmapreduce-instancefleetconfig-ebsconfiguration-ebsoptimized
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.EbsConfiguration"
p_EbsBlockDeviceConfigs: typing.List[typing.Union['InstanceFleetConfigEbsBlockDeviceConfig', dict]] = attr.ib(
default=None,
converter=InstanceFleetConfigEbsBlockDeviceConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceFleetConfigEbsBlockDeviceConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "EbsBlockDeviceConfigs"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsconfiguration.html#cfn-elasticmapreduce-instancefleetconfig-ebsconfiguration-ebsblockdeviceconfigs"""
p_EbsOptimized: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "EbsOptimized"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-ebsconfiguration.html#cfn-elasticmapreduce-instancefleetconfig-ebsconfiguration-ebsoptimized"""
@attr.s
class InstanceGroupConfigEbsConfiguration(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.EbsConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration.html
Property Document:
- ``p_EbsBlockDeviceConfigs``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfigs
- ``p_EbsOptimized``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration.html#cfn-emr-ebsconfiguration-ebsoptimized
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.EbsConfiguration"
p_EbsBlockDeviceConfigs: typing.List[typing.Union['InstanceGroupConfigEbsBlockDeviceConfig', dict]] = attr.ib(
default=None,
converter=InstanceGroupConfigEbsBlockDeviceConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceGroupConfigEbsBlockDeviceConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "EbsBlockDeviceConfigs"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration.html#cfn-emr-ebsconfiguration-ebsblockdeviceconfigs"""
p_EbsOptimized: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "EbsOptimized"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-emr-ebsconfiguration.html#cfn-emr-ebsconfiguration-ebsoptimized"""
@attr.s
class ClusterScalingAction(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ScalingAction"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingaction.html
Property Document:
- ``rp_SimpleScalingPolicyConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingaction.html#cfn-elasticmapreduce-cluster-scalingaction-simplescalingpolicyconfiguration
- ``p_Market``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingaction.html#cfn-elasticmapreduce-cluster-scalingaction-market
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ScalingAction"
rp_SimpleScalingPolicyConfiguration: typing.Union['ClusterSimpleScalingPolicyConfiguration', dict] = attr.ib(
default=None,
converter=ClusterSimpleScalingPolicyConfiguration.from_dict,
validator=attr.validators.instance_of(ClusterSimpleScalingPolicyConfiguration),
metadata={AttrMeta.PROPERTY_NAME: "SimpleScalingPolicyConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingaction.html#cfn-elasticmapreduce-cluster-scalingaction-simplescalingpolicyconfiguration"""
p_Market: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Market"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingaction.html#cfn-elasticmapreduce-cluster-scalingaction-market"""
@attr.s
class ClusterScalingRule(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.ScalingRule"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html
Property Document:
- ``rp_Action``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-action
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-name
- ``rp_Trigger``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-trigger
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-description
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.ScalingRule"
rp_Action: typing.Union['ClusterScalingAction', dict] = attr.ib(
default=None,
converter=ClusterScalingAction.from_dict,
validator=attr.validators.instance_of(ClusterScalingAction),
metadata={AttrMeta.PROPERTY_NAME: "Action"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-action"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-name"""
rp_Trigger: typing.Union['ClusterScalingTrigger', dict] = attr.ib(
default=None,
converter=ClusterScalingTrigger.from_dict,
validator=attr.validators.instance_of(ClusterScalingTrigger),
metadata={AttrMeta.PROPERTY_NAME: "Trigger"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-trigger"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-scalingrule.html#cfn-elasticmapreduce-cluster-scalingrule-description"""
@attr.s
class ClusterAutoScalingPolicy(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.AutoScalingPolicy"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-autoscalingpolicy.html
Property Document:
- ``rp_Constraints``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-autoscalingpolicy.html#cfn-elasticmapreduce-cluster-autoscalingpolicy-constraints
- ``rp_Rules``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-autoscalingpolicy.html#cfn-elasticmapreduce-cluster-autoscalingpolicy-rules
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.AutoScalingPolicy"
rp_Constraints: typing.Union['ClusterScalingConstraints', dict] = attr.ib(
default=None,
converter=ClusterScalingConstraints.from_dict,
validator=attr.validators.instance_of(ClusterScalingConstraints),
metadata={AttrMeta.PROPERTY_NAME: "Constraints"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-autoscalingpolicy.html#cfn-elasticmapreduce-cluster-autoscalingpolicy-constraints"""
rp_Rules: typing.List[typing.Union['ClusterScalingRule', dict]] = attr.ib(
default=None,
converter=ClusterScalingRule.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterScalingRule), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Rules"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-autoscalingpolicy.html#cfn-elasticmapreduce-cluster-autoscalingpolicy-rules"""
@attr.s
class InstanceGroupConfigAutoScalingPolicy(Property):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig.AutoScalingPolicy"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-autoscalingpolicy.html
Property Document:
- ``rp_Constraints``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-autoscalingpolicy.html#cfn-elasticmapreduce-instancegroupconfig-autoscalingpolicy-constraints
- ``rp_Rules``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-autoscalingpolicy.html#cfn-elasticmapreduce-instancegroupconfig-autoscalingpolicy-rules
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig.AutoScalingPolicy"
rp_Constraints: typing.Union['InstanceGroupConfigScalingConstraints', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigScalingConstraints.from_dict,
validator=attr.validators.instance_of(InstanceGroupConfigScalingConstraints),
metadata={AttrMeta.PROPERTY_NAME: "Constraints"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-autoscalingpolicy.html#cfn-elasticmapreduce-instancegroupconfig-autoscalingpolicy-constraints"""
rp_Rules: typing.List[typing.Union['InstanceGroupConfigScalingRule', dict]] = attr.ib(
default=None,
converter=InstanceGroupConfigScalingRule.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceGroupConfigScalingRule), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Rules"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancegroupconfig-autoscalingpolicy.html#cfn-elasticmapreduce-instancegroupconfig-autoscalingpolicy-rules"""
@attr.s
class ClusterInstanceGroupConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.InstanceGroupConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html
Property Document:
- ``rp_InstanceCount``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-instancecount
- ``rp_InstanceType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-instancetype
- ``p_AutoScalingPolicy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-autoscalingpolicy
- ``p_BidPrice``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-bidprice
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-configurations
- ``p_EbsConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-ebsconfiguration
- ``p_Market``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-market
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-name
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.InstanceGroupConfig"
rp_InstanceCount: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "InstanceCount"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-instancecount"""
rp_InstanceType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "InstanceType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-instancetype"""
p_AutoScalingPolicy: typing.Union['ClusterAutoScalingPolicy', dict] = attr.ib(
default=None,
converter=ClusterAutoScalingPolicy.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterAutoScalingPolicy)),
metadata={AttrMeta.PROPERTY_NAME: "AutoScalingPolicy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-autoscalingpolicy"""
p_BidPrice: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "BidPrice"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-bidprice"""
p_Configurations: typing.List[typing.Union['ClusterConfiguration', dict]] = attr.ib(
default=None,
converter=ClusterConfiguration.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterConfiguration), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-configurations"""
p_EbsConfiguration: typing.Union['ClusterEbsConfiguration', dict] = attr.ib(
default=None,
converter=ClusterEbsConfiguration.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterEbsConfiguration)),
metadata={AttrMeta.PROPERTY_NAME: "EbsConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-ebsconfiguration"""
p_Market: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Market"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-market"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-instancegroupconfig.html#cfn-elasticmapreduce-cluster-instancegroupconfig-name"""
@attr.s
class InstanceFleetConfigInstanceTypeConfig(Property):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig.InstanceTypeConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html
Property Document:
- ``rp_InstanceType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-instancetype
- ``p_BidPrice``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-bidprice
- ``p_BidPriceAsPercentageOfOnDemandPrice``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-bidpriceaspercentageofondemandprice
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-configurations
- ``p_EbsConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-ebsconfiguration
- ``p_WeightedCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-weightedcapacity
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig.InstanceTypeConfig"
rp_InstanceType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "InstanceType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-instancetype"""
p_BidPrice: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "BidPrice"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-bidprice"""
p_BidPriceAsPercentageOfOnDemandPrice: float = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(float)),
metadata={AttrMeta.PROPERTY_NAME: "BidPriceAsPercentageOfOnDemandPrice"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-bidpriceaspercentageofondemandprice"""
p_Configurations: typing.List[typing.Union['InstanceFleetConfigConfiguration', dict]] = attr.ib(
default=None,
converter=InstanceFleetConfigConfiguration.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceFleetConfigConfiguration), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-configurations"""
p_EbsConfiguration: typing.Union['InstanceFleetConfigEbsConfiguration', dict] = attr.ib(
default=None,
converter=InstanceFleetConfigEbsConfiguration.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(InstanceFleetConfigEbsConfiguration)),
metadata={AttrMeta.PROPERTY_NAME: "EbsConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-ebsconfiguration"""
p_WeightedCapacity: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "WeightedCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-instancefleetconfig-instancetypeconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfig-weightedcapacity"""
@attr.s
class ClusterJobFlowInstancesConfig(Property):
"""
AWS Object Type = "AWS::EMR::Cluster.JobFlowInstancesConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html
Property Document:
- ``p_AdditionalMasterSecurityGroups``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-additionalmastersecuritygroups
- ``p_AdditionalSlaveSecurityGroups``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-additionalslavesecuritygroups
- ``p_CoreInstanceFleet``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-coreinstancefleet
- ``p_CoreInstanceGroup``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-coreinstancegroup
- ``p_Ec2KeyName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-ec2keyname
- ``p_Ec2SubnetId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-ec2subnetid
- ``p_Ec2SubnetIds``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-ec2subnetids
- ``p_EmrManagedMasterSecurityGroup``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-emrmanagedmastersecuritygroup
- ``p_EmrManagedSlaveSecurityGroup``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-emrmanagedslavesecuritygroup
- ``p_HadoopVersion``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-hadoopversion
- ``p_KeepJobFlowAliveWhenNoSteps``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-keepjobflowalivewhennosteps
- ``p_MasterInstanceFleet``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-masterinstancefleet
- ``p_MasterInstanceGroup``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-masterinstancegroup
- ``p_Placement``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-placement
- ``p_ServiceAccessSecurityGroup``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-serviceaccesssecuritygroup
- ``p_TerminationProtected``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-terminationprotected
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster.JobFlowInstancesConfig"
p_AdditionalMasterSecurityGroups: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "AdditionalMasterSecurityGroups"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-additionalmastersecuritygroups"""
p_AdditionalSlaveSecurityGroups: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "AdditionalSlaveSecurityGroups"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-additionalslavesecuritygroups"""
p_CoreInstanceFleet: typing.Union['ClusterInstanceFleetConfig', dict] = attr.ib(
default=None,
converter=ClusterInstanceFleetConfig.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterInstanceFleetConfig)),
metadata={AttrMeta.PROPERTY_NAME: "CoreInstanceFleet"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-coreinstancefleet"""
p_CoreInstanceGroup: typing.Union['ClusterInstanceGroupConfig', dict] = attr.ib(
default=None,
converter=ClusterInstanceGroupConfig.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterInstanceGroupConfig)),
metadata={AttrMeta.PROPERTY_NAME: "CoreInstanceGroup"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-coreinstancegroup"""
p_Ec2KeyName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Ec2KeyName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-ec2keyname"""
p_Ec2SubnetId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Ec2SubnetId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-ec2subnetid"""
p_Ec2SubnetIds: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Ec2SubnetIds"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-ec2subnetids"""
p_EmrManagedMasterSecurityGroup: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "EmrManagedMasterSecurityGroup"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-emrmanagedmastersecuritygroup"""
p_EmrManagedSlaveSecurityGroup: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "EmrManagedSlaveSecurityGroup"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-emrmanagedslavesecuritygroup"""
p_HadoopVersion: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "HadoopVersion"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-hadoopversion"""
p_KeepJobFlowAliveWhenNoSteps: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "KeepJobFlowAliveWhenNoSteps"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-keepjobflowalivewhennosteps"""
p_MasterInstanceFleet: typing.Union['ClusterInstanceFleetConfig', dict] = attr.ib(
default=None,
converter=ClusterInstanceFleetConfig.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterInstanceFleetConfig)),
metadata={AttrMeta.PROPERTY_NAME: "MasterInstanceFleet"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-masterinstancefleet"""
p_MasterInstanceGroup: typing.Union['ClusterInstanceGroupConfig', dict] = attr.ib(
default=None,
converter=ClusterInstanceGroupConfig.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterInstanceGroupConfig)),
metadata={AttrMeta.PROPERTY_NAME: "MasterInstanceGroup"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-masterinstancegroup"""
p_Placement: typing.Union['ClusterPlacementType', dict] = attr.ib(
default=None,
converter=ClusterPlacementType.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterPlacementType)),
metadata={AttrMeta.PROPERTY_NAME: "Placement"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-placement"""
p_ServiceAccessSecurityGroup: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "ServiceAccessSecurityGroup"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-serviceaccesssecuritygroup"""
p_TerminationProtected: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "TerminationProtected"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-elasticmapreduce-cluster-jobflowinstancesconfig.html#cfn-elasticmapreduce-cluster-jobflowinstancesconfig-terminationprotected"""
#--- Resource declaration ---
@attr.s
class InstanceFleetConfig(Resource):
"""
AWS Object Type = "AWS::EMR::InstanceFleetConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html
Property Document:
- ``rp_ClusterId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-clusterid
- ``rp_InstanceFleetType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancefleettype
- ``p_InstanceTypeConfigs``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfigs
- ``p_LaunchSpecifications``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-launchspecifications
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-name
- ``p_TargetOnDemandCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-targetondemandcapacity
- ``p_TargetSpotCapacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-targetspotcapacity
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceFleetConfig"
rp_ClusterId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ClusterId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-clusterid"""
rp_InstanceFleetType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "InstanceFleetType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancefleettype"""
p_InstanceTypeConfigs: typing.List[typing.Union['InstanceFleetConfigInstanceTypeConfig', dict]] = attr.ib(
default=None,
converter=InstanceFleetConfigInstanceTypeConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceFleetConfigInstanceTypeConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "InstanceTypeConfigs"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-instancetypeconfigs"""
p_LaunchSpecifications: typing.Union['InstanceFleetConfigInstanceFleetProvisioningSpecifications', dict] = attr.ib(
default=None,
converter=InstanceFleetConfigInstanceFleetProvisioningSpecifications.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(InstanceFleetConfigInstanceFleetProvisioningSpecifications)),
metadata={AttrMeta.PROPERTY_NAME: "LaunchSpecifications"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-launchspecifications"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-name"""
p_TargetOnDemandCapacity: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "TargetOnDemandCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-targetondemandcapacity"""
p_TargetSpotCapacity: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "TargetSpotCapacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-instancefleetconfig.html#cfn-elasticmapreduce-instancefleetconfig-targetspotcapacity"""
@attr.s
class InstanceGroupConfig(Resource):
"""
AWS Object Type = "AWS::EMR::InstanceGroupConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html
Property Document:
- ``rp_InstanceCount``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfiginstancecount-
- ``rp_InstanceRole``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-instancerole
- ``rp_InstanceType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-instancetype
- ``rp_JobFlowId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-jobflowid
- ``p_AutoScalingPolicy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-elasticmapreduce-instancegroupconfig-autoscalingpolicy
- ``p_BidPrice``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-bidprice
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-configurations
- ``p_EbsConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-ebsconfiguration
- ``p_Market``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-market
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-name
"""
AWS_OBJECT_TYPE = "AWS::EMR::InstanceGroupConfig"
rp_InstanceCount: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "InstanceCount"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfiginstancecount-"""
rp_InstanceRole: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "InstanceRole"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-instancerole"""
rp_InstanceType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "InstanceType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-instancetype"""
rp_JobFlowId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "JobFlowId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-jobflowid"""
p_AutoScalingPolicy: typing.Union['InstanceGroupConfigAutoScalingPolicy', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigAutoScalingPolicy.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(InstanceGroupConfigAutoScalingPolicy)),
metadata={AttrMeta.PROPERTY_NAME: "AutoScalingPolicy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-elasticmapreduce-instancegroupconfig-autoscalingpolicy"""
p_BidPrice: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "BidPrice"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-bidprice"""
p_Configurations: typing.List[typing.Union['InstanceGroupConfigConfiguration', dict]] = attr.ib(
default=None,
converter=InstanceGroupConfigConfiguration.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(InstanceGroupConfigConfiguration), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-configurations"""
p_EbsConfiguration: typing.Union['InstanceGroupConfigEbsConfiguration', dict] = attr.ib(
default=None,
converter=InstanceGroupConfigEbsConfiguration.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(InstanceGroupConfigEbsConfiguration)),
metadata={AttrMeta.PROPERTY_NAME: "EbsConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-ebsconfiguration"""
p_Market: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Market"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-market"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-instancegroupconfig.html#cfn-emr-instancegroupconfig-name"""
@attr.s
class Step(Resource):
"""
AWS Object Type = "AWS::EMR::Step"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html
Property Document:
- ``rp_ActionOnFailure``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-actiononfailure
- ``rp_HadoopJarStep``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-hadoopjarstep
- ``rp_JobFlowId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-jobflowid
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-name
"""
AWS_OBJECT_TYPE = "AWS::EMR::Step"
rp_ActionOnFailure: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ActionOnFailure"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-actiononfailure"""
rp_HadoopJarStep: typing.Union['StepHadoopJarStepConfig', dict] = attr.ib(
default=None,
converter=StepHadoopJarStepConfig.from_dict,
validator=attr.validators.instance_of(StepHadoopJarStepConfig),
metadata={AttrMeta.PROPERTY_NAME: "HadoopJarStep"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-hadoopjarstep"""
rp_JobFlowId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "JobFlowId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-jobflowid"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-step.html#cfn-elasticmapreduce-step-name"""
@attr.s
class Studio(Resource):
"""
AWS Object Type = "AWS::EMR::Studio"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html
Property Document:
- ``rp_AuthMode``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-authmode
- ``rp_DefaultS3Location``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-defaults3location
- ``rp_EngineSecurityGroupId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-enginesecuritygroupid
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-name
- ``rp_ServiceRole``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-servicerole
- ``rp_SubnetIds``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-subnetids
- ``rp_UserRole``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-userrole
- ``rp_VpcId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-vpcid
- ``rp_WorkspaceSecurityGroupId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-workspacesecuritygroupid
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-description
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-tags
"""
AWS_OBJECT_TYPE = "AWS::EMR::Studio"
rp_AuthMode: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "AuthMode"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-authmode"""
rp_DefaultS3Location: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "DefaultS3Location"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-defaults3location"""
rp_EngineSecurityGroupId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "EngineSecurityGroupId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-enginesecuritygroupid"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-name"""
rp_ServiceRole: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ServiceRole"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-servicerole"""
rp_SubnetIds: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "SubnetIds"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-subnetids"""
rp_UserRole: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "UserRole"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-userrole"""
rp_VpcId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "VpcId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-vpcid"""
rp_WorkspaceSecurityGroupId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "WorkspaceSecurityGroupId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-workspacesecuritygroupid"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-description"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#cfn-emr-studio-tags"""
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#aws-resource-emr-studio-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_StudioId(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#aws-resource-emr-studio-return-values"""
return GetAtt(resource=self, attr_name="StudioId")
@property
def rv_Url(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studio.html#aws-resource-emr-studio-return-values"""
return GetAtt(resource=self, attr_name="Url")
@attr.s
class SecurityConfiguration(Resource):
"""
AWS Object Type = "AWS::EMR::SecurityConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-securityconfiguration.html
Property Document:
- ``rp_SecurityConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-securityconfiguration.html#cfn-emr-securityconfiguration-securityconfiguration
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-securityconfiguration.html#cfn-emr-securityconfiguration-name
"""
AWS_OBJECT_TYPE = "AWS::EMR::SecurityConfiguration"
rp_SecurityConfiguration: dict = attr.ib(
default=None,
validator=attr.validators.instance_of(dict),
metadata={AttrMeta.PROPERTY_NAME: "SecurityConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-securityconfiguration.html#cfn-emr-securityconfiguration-securityconfiguration"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-securityconfiguration.html#cfn-emr-securityconfiguration-name"""
@attr.s
class Cluster(Resource):
"""
AWS Object Type = "AWS::EMR::Cluster"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html
Property Document:
- ``rp_Instances``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-instances
- ``rp_JobFlowRole``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-jobflowrole
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-name
- ``rp_ServiceRole``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-servicerole
- ``p_AdditionalInfo``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-additionalinfo
- ``p_Applications``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-applications
- ``p_AutoScalingRole``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-autoscalingrole
- ``p_BootstrapActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-bootstrapactions
- ``p_Configurations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-configurations
- ``p_CustomAmiId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-customamiid
- ``p_EbsRootVolumeSize``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-ebsrootvolumesize
- ``p_KerberosAttributes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-kerberosattributes
- ``p_LogEncryptionKmsKeyId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-logencryptionkmskeyid
- ``p_LogUri``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-loguri
- ``p_ManagedScalingPolicy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-managedscalingpolicy
- ``p_ReleaseLabel``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-releaselabel
- ``p_ScaleDownBehavior``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-scaledownbehavior
- ``p_SecurityConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-securityconfiguration
- ``p_StepConcurrencyLevel``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-stepconcurrencylevel
- ``p_Steps``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-steps
- ``p_VisibleToAllUsers``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-visibletoallusers
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-tags
"""
AWS_OBJECT_TYPE = "AWS::EMR::Cluster"
rp_Instances: typing.Union['ClusterJobFlowInstancesConfig', dict] = attr.ib(
default=None,
converter=ClusterJobFlowInstancesConfig.from_dict,
validator=attr.validators.instance_of(ClusterJobFlowInstancesConfig),
metadata={AttrMeta.PROPERTY_NAME: "Instances"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-instances"""
rp_JobFlowRole: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "JobFlowRole"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-jobflowrole"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-name"""
rp_ServiceRole: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ServiceRole"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-servicerole"""
p_AdditionalInfo: dict = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(dict)),
metadata={AttrMeta.PROPERTY_NAME: "AdditionalInfo"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-additionalinfo"""
p_Applications: typing.List[typing.Union['ClusterApplication', dict]] = attr.ib(
default=None,
converter=ClusterApplication.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterApplication), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Applications"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-applications"""
p_AutoScalingRole: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "AutoScalingRole"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-autoscalingrole"""
p_BootstrapActions: typing.List[typing.Union['ClusterBootstrapActionConfig', dict]] = attr.ib(
default=None,
converter=ClusterBootstrapActionConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterBootstrapActionConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "BootstrapActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-bootstrapactions"""
p_Configurations: typing.List[typing.Union['ClusterConfiguration', dict]] = attr.ib(
default=None,
converter=ClusterConfiguration.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterConfiguration), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Configurations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-configurations"""
p_CustomAmiId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "CustomAmiId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-customamiid"""
p_EbsRootVolumeSize: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "EbsRootVolumeSize"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-ebsrootvolumesize"""
p_KerberosAttributes: typing.Union['ClusterKerberosAttributes', dict] = attr.ib(
default=None,
converter=ClusterKerberosAttributes.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterKerberosAttributes)),
metadata={AttrMeta.PROPERTY_NAME: "KerberosAttributes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-kerberosattributes"""
p_LogEncryptionKmsKeyId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "LogEncryptionKmsKeyId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-logencryptionkmskeyid"""
p_LogUri: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "LogUri"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-loguri"""
p_ManagedScalingPolicy: typing.Union['ClusterManagedScalingPolicy', dict] = attr.ib(
default=None,
converter=ClusterManagedScalingPolicy.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(ClusterManagedScalingPolicy)),
metadata={AttrMeta.PROPERTY_NAME: "ManagedScalingPolicy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-managedscalingpolicy"""
p_ReleaseLabel: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "ReleaseLabel"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-releaselabel"""
p_ScaleDownBehavior: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "ScaleDownBehavior"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-scaledownbehavior"""
p_SecurityConfiguration: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "SecurityConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-securityconfiguration"""
p_StepConcurrencyLevel: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "StepConcurrencyLevel"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-stepconcurrencylevel"""
p_Steps: typing.List[typing.Union['ClusterStepConfig', dict]] = attr.ib(
default=None,
converter=ClusterStepConfig.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(ClusterStepConfig), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Steps"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-steps"""
p_VisibleToAllUsers: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "VisibleToAllUsers"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-visibletoallusers"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#cfn-elasticmapreduce-cluster-tags"""
@property
def rv_MasterPublicDNS(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticmapreduce-cluster.html#aws-resource-elasticmapreduce-cluster-return-values"""
return GetAtt(resource=self, attr_name="MasterPublicDNS")
@attr.s
class StudioSessionMapping(Resource):
"""
AWS Object Type = "AWS::EMR::StudioSessionMapping"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html
Property Document:
- ``rp_IdentityName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-identityname
- ``rp_IdentityType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-identitytype
- ``rp_SessionPolicyArn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-sessionpolicyarn
- ``rp_StudioId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-studioid
"""
AWS_OBJECT_TYPE = "AWS::EMR::StudioSessionMapping"
rp_IdentityName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "IdentityName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-identityname"""
rp_IdentityType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "IdentityType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-identitytype"""
rp_SessionPolicyArn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "SessionPolicyArn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-sessionpolicyarn"""
rp_StudioId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "StudioId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-emr-studiosessionmapping.html#cfn-emr-studiosessionmapping-studioid"""
| 72.242546 | 296 | 0.789198 | 17,541 | 179,306 | 7.97674 | 0.015507 | 0.089423 | 0.041038 | 0.063422 | 0.93983 | 0.933869 | 0.919883 | 0.884298 | 0.882783 | 0.882369 | 0 | 0.000129 | 0.091916 | 179,306 | 2,481 | 297 | 72.271665 | 0.859199 | 0.34363 | 0 | 0.556429 | 0 | 0 | 0.101216 | 0.065519 | 0 | 0 | 0 | 0 | 0 | 1 | 0.002857 | false | 0.004286 | 0.002857 | 0 | 0.255714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
620535e80c94944e77763b5a638ec1ce66c8e863 | 121 | py | Python | test_file_example.py | corps-g/scale_plots | ea18cab82da331628c2932a0cedfe3878166c84a | [
"MIT"
] | null | null | null | test_file_example.py | corps-g/scale_plots | ea18cab82da331628c2932a0cedfe3878166c84a | [
"MIT"
] | null | null | null | test_file_example.py | corps-g/scale_plots | ea18cab82da331628c2932a0cedfe3878166c84a | [
"MIT"
] | 1 | 2021-02-25T20:17:03.000Z | 2021-02-25T20:17:03.000Z | import scale_plots
scale_plots.Test('scale.rev05.44groupcov')
print('\n')
scale_plots.Test('scale.rev08.56groupcov7.1')
| 20.166667 | 45 | 0.785124 | 18 | 121 | 5.111111 | 0.611111 | 0.326087 | 0.304348 | 0.413043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 0.049587 | 121 | 5 | 46 | 24.2 | 0.713043 | 0 | 0 | 0 | 0 | 0 | 0.404959 | 0.38843 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6217173a4264c90b96cf76aef9717d580ec33379 | 5,800 | py | Python | tests/test_mit_integration_by_year.py | DrewStern/benford-analysis | ad1e2e78daec7c971087032116fbb908dc791bad | [
"MIT"
] | 1 | 2020-11-08T18:37:36.000Z | 2020-11-08T18:37:36.000Z | tests/test_mit_integration_by_year.py | DrewStern/benford-analysis | ad1e2e78daec7c971087032116fbb908dc791bad | [
"MIT"
] | null | null | null | tests/test_mit_integration_by_year.py | DrewStern/benford-analysis | ad1e2e78daec7c971087032116fbb908dc791bad | [
"MIT"
] | null | null | null | class MitIntegrationByYearTestCases(unittest.TestCase):
def setUp(self) -> None:
data_path = self.read_presidential_votes_county_data()
self.election_result_repository = ElectionResultRepository(data_path)
self.election_result_service = ElectionResultService(self.election_result_repository)
self.benford_analysis_service = BenfordAnalysisService(self.election_result_service)
def test_calculate_benford_distribution_by_year_2000(self):
expected_distribution = [0, 28.39, 18.69, 13.44, 10.63, 8.04, 6.76, 4.93, 5.2, 3.92]
actual_distribution = self.benford_analysis_service.calculate_benford_distribution(self.election_result_service.get_election_results(year_filter="2000"))
self.assertEqual(expected_distribution, actual_distribution)
expected_deviations = ["INF", 5.68, 6.19, 7.52, 9.59, 1.77, 0.9, 15.0, 1.96, 14.78]
actual_deviations = self.benford_analysis_service.calculate_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_deviations, actual_deviations)
expected_max_dev = 15.0
actual_max_dev = self.benford_analysis_service.get_maximum_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_max_dev, actual_max_dev)
def test_calculate_benford_distribution_by_year_2004(self):
expected_distribution = [0, 28.58, 17.76, 12.97, 10.46, 8.7, 6.71, 5.71, 4.77, 4.34]
actual_distribution = self.benford_analysis_service.calculate_benford_distribution(self.election_result_service.get_election_results(year_filter="2004"))
self.assertEqual(expected_distribution, actual_distribution)
expected_deviations = ["INF", 5.05, 0.91, 3.76, 7.84, 10.13, 0.15, 1.55, 6.47, 5.65]
actual_deviations = self.benford_analysis_service.calculate_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_deviations, actual_deviations)
expected_max_dev = 10.13
actual_max_dev = self.benford_analysis_service.get_maximum_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_max_dev, actual_max_dev)
def test_calculate_benford_distribution_by_year_2008(self):
expected_distribution = [0, 28.46, 17.8, 13.02, 10.56, 8.32, 6.67, 5.88, 4.85, 4.44]
actual_distribution = self.benford_analysis_service.calculate_benford_distribution(self.election_result_service.get_election_results(year_filter="2008"))
self.assertEqual(expected_distribution, actual_distribution)
expected_deviations = ["INF", 5.45, 1.14, 4.16, 8.87, 5.32, 0.45, 1.38, 4.9, 3.48]
actual_deviations = self.benford_analysis_service.calculate_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_deviations, actual_deviations)
expected_max_dev = 8.87
actual_max_dev = self.benford_analysis_service.get_maximum_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_max_dev, actual_max_dev)
def test_calculate_benford_distribution_by_year_2012(self):
expected_distribution = [0, 28.83, 17.73, 12.96, 10.3, 8.25, 6.75, 5.34, 5.05, 4.78]
actual_distribution = self.benford_analysis_service.calculate_benford_distribution(self.election_result_service.get_election_results(year_filter="2012"))
self.assertEqual(expected_distribution, actual_distribution)
expected_deviations = ["INF", 4.22, 0.74, 3.68, 6.19, 4.43, 0.75, 7.93, 0.98, 3.91]
actual_deviations = self.benford_analysis_service.calculate_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_deviations, actual_deviations)
expected_max_dev = 7.93
actual_max_dev = self.benford_analysis_service.get_maximum_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_max_dev, actual_max_dev)
def test_calculate_benford_distribution_by_year_2016(self):
expected_distribution = [0, 29.67, 17.17, 12.99, 9.9, 8.05, 6.61, 6.16, 5.01, 4.44]
actual_distribution = self.benford_analysis_service.calculate_benford_distribution(self.election_result_service.get_election_results(year_filter="2016"))
self.assertEqual(expected_distribution, actual_distribution)
expected_deviations = ["INF", 1.43, 2.44, 3.92, 2.06, 1.9, 1.34, 6.21, 1.76, 3.48]
actual_deviations = self.benford_analysis_service.calculate_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_deviations, actual_deviations)
expected_max_dev = 6.21
actual_max_dev = self.benford_analysis_service.get_maximum_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_max_dev, actual_max_dev)
def test_calculate_benford_distribution_by_year_2020(self):
expected_distribution = [4.06, 29.02, 17.21, 11.72, 9.55, 7.44, 6.15, 5.28, 5.0, 4.56]
actual_distribution = self.benford_analysis_service.calculate_benford_distribution(self.election_result_service.get_election_results(year_filter="2020"))
self.assertEqual(expected_distribution, actual_distribution)
expected_deviations = ["INF", 3.59, 2.22, 6.24, 1.55, 5.82, 8.21, 8.97, 1.96, 0.87]
actual_deviations = self.benford_analysis_service.calculate_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_deviations, actual_deviations)
expected_max_dev = 8.97
actual_max_dev = self.benford_analysis_service.get_maximum_deviation_from_benford_distribution(actual_distribution)
self.assertEqual(expected_max_dev, actual_max_dev) | 80.555556 | 162 | 0.762759 | 780 | 5,800 | 5.297436 | 0.153846 | 0.110358 | 0.087367 | 0.119555 | 0.818732 | 0.792594 | 0.792594 | 0.782672 | 0.782672 | 0.728945 | 0 | 0.083636 | 0.146552 | 5,800 | 72 | 163 | 80.555556 | 0.751111 | 0 | 0 | 0.454545 | 0 | 0 | 0.00733 | 0 | 0 | 0 | 0 | 0 | 0.272727 | 1 | 0.106061 | false | 0 | 0 | 0 | 0.121212 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6223d8335b459d09f126d5d48ef3009eaead7edf | 1,806 | py | Python | GPy/testing/gplvm_tests.py | rokroskar/GPy | 0f8dbba56d480902c86cfe8bad9e79d9eabae009 | [
"BSD-3-Clause"
] | 1 | 2016-08-04T21:28:11.000Z | 2016-08-04T21:28:11.000Z | GPy/testing/gplvm_tests.py | rokroskar/GPy | 0f8dbba56d480902c86cfe8bad9e79d9eabae009 | [
"BSD-3-Clause"
] | null | null | null | GPy/testing/gplvm_tests.py | rokroskar/GPy | 0f8dbba56d480902c86cfe8bad9e79d9eabae009 | [
"BSD-3-Clause"
] | null | null | null | # Copyright (c) 2012, Nicolo Fusi
# Licensed under the BSD 3-clause license (see LICENSE.txt)
import unittest
import numpy as np
import GPy
class GPLVMTests(unittest.TestCase):
def test_bias_kern(self):
num_data, num_inducing, input_dim, output_dim = 10, 3, 2, 4
X = np.random.rand(num_data, input_dim)
k = GPy.kern.rbf(input_dim) + GPy.kern.white(input_dim, 0.00001)
K = k.K(X)
Y = np.random.multivariate_normal(np.zeros(num_data),K,output_dim).T
k = GPy.kern.bias(input_dim) + GPy.kern.white(input_dim, 0.00001)
m = GPy.models.GPLVM(Y, input_dim, kernel = k)
m.randomize()
self.assertTrue(m.checkgrad())
def test_linear_kern(self):
num_data, num_inducing, input_dim, output_dim = 10, 3, 2, 4
X = np.random.rand(num_data, input_dim)
k = GPy.kern.rbf(input_dim) + GPy.kern.white(input_dim, 0.00001)
K = k.K(X)
Y = np.random.multivariate_normal(np.zeros(num_data),K,output_dim).T
k = GPy.kern.linear(input_dim) + GPy.kern.white(input_dim, 0.00001)
m = GPy.models.GPLVM(Y, input_dim, kernel = k)
m.randomize()
self.assertTrue(m.checkgrad())
def test_rbf_kern(self):
num_data, num_inducing, input_dim, output_dim = 10, 3, 2, 4
X = np.random.rand(num_data, input_dim)
k = GPy.kern.rbf(input_dim) + GPy.kern.white(input_dim, 0.00001)
K = k.K(X)
Y = np.random.multivariate_normal(np.zeros(num_data),K,output_dim).T
k = GPy.kern.rbf(input_dim) + GPy.kern.white(input_dim, 0.00001)
m = GPy.models.GPLVM(Y, input_dim, kernel = k)
m.randomize()
self.assertTrue(m.checkgrad())
if __name__ == "__main__":
print "Running unit tests, please be (very) patient..."
unittest.main()
| 40.133333 | 76 | 0.640089 | 289 | 1,806 | 3.806228 | 0.238754 | 0.152727 | 0.043636 | 0.081818 | 0.79 | 0.79 | 0.79 | 0.79 | 0.79 | 0.79 | 0 | 0.039971 | 0.224252 | 1,806 | 44 | 77 | 41.045455 | 0.745182 | 0.04928 | 0 | 0.675676 | 0 | 0 | 0.032089 | 0 | 0 | 0 | 0 | 0 | 0.081081 | 0 | null | null | 0 | 0.081081 | null | null | 0.027027 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6232fa55a9f32fc347eef9e78412d729dfaa2436 | 40 | py | Python | __init__.py | XXXalice/umemiya_killme | 8d6d13cf28ded05d89bbb2b7a705afae9d0932bb | [
"MIT"
] | 1 | 2020-05-29T18:11:54.000Z | 2020-05-29T18:11:54.000Z | __init__.py | XXXalice/umemiya_killme | 8d6d13cf28ded05d89bbb2b7a705afae9d0932bb | [
"MIT"
] | 1 | 2020-06-01T14:09:00.000Z | 2020-06-01T14:09:00.000Z | __init__.py | XXXalice/umemiya_killme | 8d6d13cf28ded05d89bbb2b7a705afae9d0932bb | [
"MIT"
] | 1 | 2020-05-29T18:12:02.000Z | 2020-05-29T18:12:02.000Z | # API自体を参照用パッケージング
# config.pyのimportのため | 20 | 21 | 0.85 | 3 | 40 | 11.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 40 | 2 | 21 | 20 | 0.918919 | 0.9 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
623d198d38eec0d10b1e69dd6acd24891d8efc04 | 5,692 | py | Python | UnityEngine/ParticleSystem/MinMaxCurve/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | UnityEngine/ParticleSystem/MinMaxCurve/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | UnityEngine/ParticleSystem/MinMaxCurve/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | from typing import overload
from UdonPie import System
from UdonPie import UnityEngine
from UdonPie.Undefined import *
class MinMaxCurve:
def __new__(cls, arg1=None):
'''
:returns: MinMaxCurve
:rtype: UnityEngine.ParticleSystem.MinMaxCurve
'''
pass
@staticmethod
@overload
def ctor(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
:returns: ParticleSystem+MinMaxCurve
:rtype: UnityEngine.ParticleSystem+MinMaxCurve
'''
pass
@staticmethod
@overload
def ctor(arg1, arg2):
'''
:param arg1: Single
:type arg1: System.Single or float
:param arg2: AnimationCurve
:type arg2: UnityEngine.AnimationCurve
:returns: ParticleSystem+MinMaxCurve
:rtype: UnityEngine.ParticleSystem+MinMaxCurve
'''
pass
@staticmethod
@overload
def ctor(arg1, arg2, arg3):
'''
:param arg1: Single
:type arg1: System.Single or float
:param arg2: AnimationCurve
:type arg2: UnityEngine.AnimationCurve
:param arg3: AnimationCurve
:type arg3: UnityEngine.AnimationCurve
:returns: ParticleSystem+MinMaxCurve
:rtype: UnityEngine.ParticleSystem+MinMaxCurve
'''
pass
@staticmethod
@overload
def ctor(arg1, arg2):
'''
:param arg1: Single
:type arg1: System.Single or float
:param arg2: Single
:type arg2: System.Single or float
:returns: ParticleSystem+MinMaxCurve
:rtype: UnityEngine.ParticleSystem+MinMaxCurve
'''
pass
@staticmethod
def ctor(arg1=None, arg2=None, arg3=None):
pass
@staticmethod
def op_Implicit(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
:returns: ParticleSystem+MinMaxCurve
:rtype: UnityEngine.ParticleSystem+MinMaxCurve
'''
pass
@staticmethod
def get_mode():
'''
:returns: ParticleSystemCurveMode
:rtype: UnityEngine.ParticleSystemCurveMode
'''
pass
@staticmethod
def set_mode(arg1):
'''
:param arg1: ParticleSystemCurveMode
:type arg1: UnityEngine.ParticleSystemCurveMode
'''
pass
@staticmethod
def get_curveMultiplier():
'''
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
def set_curveMultiplier(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
'''
pass
@staticmethod
def get_curveMax():
'''
:returns: AnimationCurve
:rtype: UnityEngine.AnimationCurve
'''
pass
@staticmethod
def set_curveMax(arg1):
'''
:param arg1: AnimationCurve
:type arg1: UnityEngine.AnimationCurve
'''
pass
@staticmethod
def get_curveMin():
'''
:returns: AnimationCurve
:rtype: UnityEngine.AnimationCurve
'''
pass
@staticmethod
def set_curveMin(arg1):
'''
:param arg1: AnimationCurve
:type arg1: UnityEngine.AnimationCurve
'''
pass
@staticmethod
def get_constantMax():
'''
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
def set_constantMax(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
'''
pass
@staticmethod
def get_constantMin():
'''
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
def set_constantMin(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
'''
pass
@staticmethod
def get_constant():
'''
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
def set_constant(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
'''
pass
@staticmethod
def get_curve():
'''
:returns: AnimationCurve
:rtype: UnityEngine.AnimationCurve
'''
pass
@staticmethod
def set_curve(arg1):
'''
:param arg1: AnimationCurve
:type arg1: UnityEngine.AnimationCurve
'''
pass
@staticmethod
@overload
def Evaluate(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
@overload
def Evaluate(arg1, arg2):
'''
:param arg1: Single
:type arg1: System.Single or float
:param arg2: Single
:type arg2: System.Single or float
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
def Evaluate(arg1=None, arg2=None):
pass
@staticmethod
def Equals(arg1):
'''
:param arg1: Object
:type arg1: System.Object
:returns: Boolean
:rtype: System.Boolean
'''
pass
@staticmethod
def ToString():
'''
:returns: String
:rtype: System.String
'''
pass
@staticmethod
def GetHashCode():
'''
:returns: Int32
:rtype: System.Int32
'''
pass
@staticmethod
def GetType():
'''
:returns: Type
:rtype: System.Type
'''
pass
| 21.00369 | 55 | 0.542691 | 478 | 5,692 | 6.41841 | 0.106695 | 0.151239 | 0.142438 | 0.080508 | 0.754889 | 0.720339 | 0.707953 | 0.707953 | 0.706975 | 0.567797 | 0 | 0.020188 | 0.364722 | 5,692 | 270 | 56 | 21.081481 | 0.828263 | 0.418131 | 0 | 0.67 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0.3 | 0.04 | 0 | 0.35 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
6583683298b3a8e73775d8a08c0106d5ec29c879 | 15,793 | py | Python | lyapunov_reachability/common/models.py | 62subinh/deeprl_safety_specification | 2ae7c4d1564f16539336bedc48110743ecd60935 | [
"MIT"
] | 3 | 2021-03-23T13:46:04.000Z | 2021-09-28T14:54:43.000Z | lyapunov_reachability/common/models.py | 62subinh/deeprl_safety_specification | 2ae7c4d1564f16539336bedc48110743ecd60935 | [
"MIT"
] | null | null | null | lyapunov_reachability/common/models.py | 62subinh/deeprl_safety_specification | 2ae7c4d1564f16539336bedc48110743ecd60935 | [
"MIT"
] | 1 | 2020-11-28T08:05:09.000Z | 2020-11-28T08:05:09.000Z | import math
import numpy as np
import copy
import torch
import torch.nn as nn
from lyapunov_reachability.common.networks import functional_finder, Mlp, Cnn
from lyapunov_reachability.common.utils import output_shape, clip_but_pass_gradient
EPS = 1e-8
MATH_LOG_2PI = np.log(2 * np.pi)
def normal_likelihood(input_, mu, log_std):
return -0.5 * (((input_ - mu) / torch.exp(log_std)) ** 2 + 2 * log_std + MATH_LOG_2PI).sum(1, keepdim=True)
def normal_entropy(log_std):
return log_std + 0.5 + 0.5 * MATH_LOG_2PI
def atanh(x):
return 0.5 * torch.log((1. + EPS + x)/(1. + EPS - x))
class ProbCritic(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None, ):
"""
In: state & action
Out: Q-value for given state-action pair
:param ob_space : Shape of the observation space.
:param ac_space : Shape of the action space. (must be 1-dimensional)
:param extractor : Class of the extractor network.
:param extractor_params : Keyword arguments for the extractor network. (optional)
:param decoder : Class of the decoder network. (optional)
:param decoder_params : Keyword arguments for the decoder network.
"""
super(ProbCritic, self).__init__()
self.ac_size = ac_space
self.decoder = None
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size + self.ac_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size + self.ac_size, extractor_params), dtype=np.int)
self.value_layer = nn.Linear(self.feature_size, 1)
def forward(self, observation, action):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
# Compute Q--value.
feature = self.extractor(torch.cat((state, action), dim=-1))
pre_q = self.value_layer(feature.view(-1, self.feature_size))
return clip_but_pass_gradient(pre_q, 0., 1.)
# return torch.exp(clip_but_pass_gradient(pre_q, lower=-10., upper=0.))
class GeneralCritic(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None,
value_processing='none'):
"""
In: state & action
Out: Q-value for given state-action pair
:param ob_space : Shape of the observation space.
:param ac_space : Shape of the action space. (must be 1-dimensional)
:param extractor : Class of the extractor network.
:param extractor_params : Keyword arguments for the extractor network. (optional)
:param decoder : Class of the decoder network. (optional)
:param decoder_params : Keyword arguments for the decoder network.
"""
super(GeneralCritic, self).__init__()
self.ac_size = ac_space
self.decoder = None
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size + self.ac_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size + self.ac_size, extractor_params), dtype=np.int)
self.value_layer = nn.Linear(self.feature_size, 1)
self.value_processing = functional_finder(value_processing)
def forward(self, observation, action):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
# Compute Q--value.
feature = self.extractor(torch.cat((state, action), dim=-1))
pre_q = self.value_layer(feature.view(-1, self.feature_size))
return self.value_processing(pre_q)
class Value(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None, ):
"""
In: state & action
Out: Q-value for given state-action pair
:param ob_space : Shape of the observation space.
:param ac_space : Shape of the action space. (must be 1-dimensional)
:param extractor : Class of the extractor network.
:param extractor_params : Keyword arguments for the extractor network. (optional)
:param decoder : Class of the decoder network. (optional)
:param decoder_params : Keyword arguments for the decoder network.
"""
super(Value, self).__init__()
self.ac_size = ac_space
self.decoder = None
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size, extractor_params), dtype=np.int)
self.value_layer = nn.Linear(self.feature_size, 1)
def forward(self, observation):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
# Compute value function.
feature = self.extractor(state)
return self.value_layer(feature.view(-1, self.feature_size))
class DetActor(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None, ):
"""
Deterministic actor for DDPG implementation.
"""
super(DetActor, self).__init__()
self.ac_size = ac_space
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size, extractor_params), dtype=np.int)
self.mean_layer = nn.Linear(self.feature_size, self.ac_size)
def forward(self, observation):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
feature = self.extractor(state)
return torch.tanh(self.mean_layer(feature.view(-1, self.feature_size)))
class Actor(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None,):
super(Actor, self).__init__()
self.ac_size = ac_space
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size + self.ac_size, extractor_params), dtype=np.int)
self.mean_layer = nn.Linear(self.feature_size, self.ac_size)
self.logstd_layer = nn.Linear(self.feature_size, self.ac_size)
def forward(self, observation):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
feature = self.extractor(state).view(-1, self.feature_size)
mean = self.mean_layer(feature)
logstd = clip_but_pass_gradient(self.logstd_layer(feature), -6., 2.)
std = torch.exp(logstd)
# Reparameterization trick
pre_sample = mean + std * torch.randn(mean.size(), dtype=mean.dtype, device=mean.device)
sample = torch.tanh(pre_sample)
log_prob = normal_likelihood(pre_sample, mean, logstd) - torch.log(-sample ** 2 + 1. + EPS).sum(1, keepdim=True)
return sample, torch.tanh(mean), log_prob
def sample(self, observation, deterministic=False):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
feature = self.extractor(state).view(-1, self.feature_size)
mean = self.mean_layer(feature)
logstd = clip_but_pass_gradient(self.logstd_layer(feature), -6., 2.)
std = torch.exp(logstd)
if deterministic:
return torch.tanh(mean)
else:
pre_sample = mean + std * torch.randn(mean.size(), dtype=mean.dtype, device=mean.device)
return torch.tanh(pre_sample)
def log_prob(self, observation, action):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
feature = self.extractor(state).view(-1, self.feature_size)
mean = self.mean_layer(feature)
logstd = clip_but_pass_gradient(self.logstd_layer(feature), -6., 2.)
pre_action = atanh(action)
log_prob = normal_likelihood(pre_action, mean, logstd) - torch.log(-action ** 2 + 1. + EPS).sum(1, keepdim=True)
return log_prob
class VAE(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None, ):
"""
Conditional VAE, designed to match with BCQ (Fujimoto, 2019).
"""
super(VAE, self).__init__()
self.ac_size = ac_space
self.obs_decoder = None
if decoder is not None:
self.obs_decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.obs_decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
vae_params = copy.deepcopy(extractor_params)
vae_params['activ'] = 'relu'
self.feature_size = np.prod(output_shape(self.ob_size, vae_params), dtype=np.int)
# encoder
self.enc_net = extractor(self.ob_size + self.ac_size, **vae_params)
self.mu_layer = nn.Linear(self.feature_size, 2 * self.ac_size)
self.logstd_layer = nn.Linear(self.feature_size, 2 * self.ac_size)
# decoder
self.dec_net = extractor(self.ob_size + 2 * self.ac_size, **vae_params)
self.recon_layer = nn.Linear(self.feature_size, self.ac_size)
del vae_params
def encode(self, obs, act):
if self.obs_decoder is not None:
state = self.obs_decoder(obs)
else:
state = obs
encoded = self.enc_net(torch.cat((state, act), dim=-1))
return self.mu_layer(encoded), self.logstd_layer(encoded)
def reparameterize(self, mu, logstd):
std = torch.exp(logstd)
eps = torch.clamp(torch.randn_like(std), min=-0.5, max=0.5)
return mu + eps * std
def decode(self, obs, z):
if self.obs_decoder is not None:
state = self.obs_decoder(obs)
else:
state = obs
pre_recon = self.dec_net(torch.cat((state, z), dim=-1))
return torch.tanh(self.recon_layer(pre_recon))
def forward(self, obs, act):
mu, logstd = self.encode(obs, act)
z = self.reparameterize(mu, logstd)
return self.decode(obs, z), mu, logstd
def generate(self, obs):
"""
:param obs: Observation. shape=(batch_size, ob_space)
:return: Generated actions. shape=(batch_size, ac_size)
"""
z_n = torch.randn((list(obs.shape)[0], 2 * self.ac_size), device=obs.device)
return self.decode(obs, z_n)
class Perturb(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None, ):
"""
In: state & action
Out: Action perturbation, pre-scaled.
:param ob_space : Shape of the observation space.
:param ac_space : Shape of the action space. (must be 1-dimensional)
:param extractor : Class of the extractor network.
:param extractor_params : Keyword arguments for the extractor network. (optional)
:param decoder : Class of the decoder network. (optional)
:param decoder_params : Keyword arguments for the decoder network.
"""
super(Perturb, self).__init__()
self.ac_size = ac_space
self.decoder = None
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size + self.ac_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size + self.ac_size, extractor_params), dtype=np.int)
self.perturb_layer = nn.Linear(self.feature_size, self.ac_size)
def forward(self, observation, action):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
feature = self.extractor(torch.cat((state, action), dim=-1))
perturbation = self.perturb_layer(feature.view(-1, self.feature_size))
return torch.tanh(perturbation)
class Lambda(nn.Module):
def __init__(self, ob_space, ac_space, extractor, extractor_params, decoder=None, decoder_params=None, ):
"""
In: state,
Out: state-wise log-lambda (Lagrangian multiplier), scalar.
:param ob_space : Shape of the observation space.
:param ac_space : Shape of the action space. (must be 1-dimensional)
:param extractor : Class of the extractor network.
:param extractor_params : Keyword arguments for the extractor network. (optional)
:param decoder : Class of the decoder network. (optional)
:param decoder_params : Keyword arguments for the decoder network.
"""
super(Lambda, self).__init__()
self.ac_size = ac_space
self.decoder = None
if decoder is not None:
self.decoder = decoder(ob_space, **decoder_params)
self.ob_size = np.prod(np.array(output_shape(ob_space, decoder_params)), dtype=np.int)
else:
self.decoder = None
self.ob_size = np.prod(np.array(ob_space), dtype=np.int)
self.extractor = extractor(self.ob_size, **extractor_params)
self.feature_size = np.prod(output_shape(self.ob_size, extractor_params), dtype=np.int)
self.lambda_layer = nn.Linear(self.feature_size, 1)
def forward(self, observation):
if self.decoder is not None:
state = self.decoder(observation)
else:
state = observation
feature = self.extractor(state)
log_lambda = self.lambda_layer(feature.view(-1, self.feature_size))
return clip_but_pass_gradient(log_lambda, lower=-10., upper=6.)
| 41.670185 | 120 | 0.632622 | 2,062 | 15,793 | 4.651794 | 0.079534 | 0.025646 | 0.034404 | 0.031693 | 0.802544 | 0.786384 | 0.779817 | 0.776793 | 0.756568 | 0.745934 | 0 | 0.006437 | 0.262205 | 15,793 | 378 | 121 | 41.780423 | 0.81677 | 0.18103 | 0 | 0.662551 | 0 | 0 | 0.001044 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.102881 | false | 0.024691 | 0.028807 | 0.012346 | 0.238683 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
65c1a73aa3c0ca26464ef6469b0b082971dac3a1 | 8,494 | py | Python | def/relatorios.py | cienwfp/appAnaliseMovimento | 6b20b08602af7839909b0065db4b9725ec79d3b2 | [
"MIT"
] | null | null | null | def/relatorios.py | cienwfp/appAnaliseMovimento | 6b20b08602af7839909b0065db4b9725ec79d3b2 | [
"MIT"
] | null | null | null | def/relatorios.py | cienwfp/appAnaliseMovimento | 6b20b08602af7839909b0065db4b9725ec79d3b2 | [
"MIT"
] | null | null | null | import os
import time
from reportlab.lib import utils
from reportlab.lib.pagesizes import A4, landscape
from reportlab.lib.units import mm
from reportlab.pdfgen import canvas
from reportlab.lib.enums import TA_JUSTIFY, TA_CENTER
from reportlab.platypus import SimpleDocTemplate, NextPageTemplate, PageTemplate, PageBreak, Paragraph, Spacer, Image, Table, TableStyle, Frame
from reportlab.lib.styles import getSampleStyleSheet, ParagraphStyle
from reportlab.lib import colors
class relatorios():
def addPageNumber(canvas, doc):
"""
Add the page number
"""
page_num = canvas.getPageNumber()
text = "Pag %s" % page_num
canvas.drawRightString(200*mm, 10*mm, text)
def relatorioOpcao1(name, cpf, placa, table):
'''
Report of Option one. This has content of the
ten last observations of the veficle
'''
if os.path.isdir(os.getcwd() + '/report/' + str(cpf)):
rep = os.getcwd() + '/report/' + str(cpf) + str('/reportOption1.pdf')
else:
os.makedirs(os.getcwd() + '/report/' + str(cpf))
rep = os.getcwd() + '/report/' + str(cpf) + str('/reportOption1.pdf')
doc = SimpleDocTemplate(rep,pagesize=(A4),
rightMargin=30,leftMargin=30,
topMargin=30,bottomMargin=30)
styles=getSampleStyleSheet()
Story=[]
logo = os.getcwd() + '/payload/' + str('logo.png')
formatted_time = time.ctime()
#### Página 2 ######################################################################
im = Image(logo, 20*mm, 20*mm)
im.hAlign = 'LEFT'
Story.append(im)
Story.append(Spacer(1, 48))
styles.add(ParagraphStyle(name='Justify', alignment=TA_JUSTIFY))
ptext = '<font size="12">Olá senhor %s</font>' %(name)
Story.append(Paragraph(ptext, styles["Justify"]))
Story.append(Spacer(1, 12))
ptext = '<font size="12">Este relatório foi produzido seguindo os seguintes parâmetros de pesquisa:</font>'
Story.append(Paragraph(ptext, styles["Justify"]))
Story.append(Spacer(1, 12))
ptext = '<font size="12">Placa: %s</font>'%(placa)
Story.append(Paragraph(ptext, styles["Justify"]))
Story.append(Spacer(1, 36))
styleN = styles["Normal"]
# Make heading for each column and start data list
column1Heading = "COLUMN ONE HEADING"
column2Heading = "COLUMN TWO HEADING"
# Assemble data for each column using simple loop to append it into data list
data = [[column1Heading,column2Heading]]
#for i in table:
# data.append([str(i),str(i)])
data_ = table.drop(columns = ['idMovimento'])
data = [data_.columns[:,].values.astype(str).tolist()] + data_.values.tolist()
tableThatSplitsOverPages = Table(data, repeatRows=1)
tableThatSplitsOverPages.hAlign = 'CENTER'
tblStyle = TableStyle([('TEXTCOLOR',(0,0),(-1,-1),colors.black),
('FONTSIZE',(0,0),(-1,-1),5.5),
('VALIGN',(0,0),(-1,-1),'TOP'),
('LINEBELOW',(0,0),(-1,-1),1,colors.black),
('BOX',(0,0),(-1,-1),1,colors.black),
('BOX',(0,0),(-1,-1),1,colors.black)])
tblStyle.add('BACKGROUND',(0,0),(-1,0),colors.lightblue)
tblStyle.add('BACKGROUND',(0,1),(-1,-1),colors.white)
tableThatSplitsOverPages.setStyle(tblStyle)
Story.append(tableThatSplitsOverPages)
doc.build(Story,onFirstPage=relatorios.addPageNumber, onLaterPages=relatorios.addPageNumber)
def relatorioOpcao2(name, cpf, placa, dataInicial, dataFinal, table):
'''
Report of Option two. This has data between periods.
This produce graph.
'''
if os.path.isdir(os.getcwd() + '/report/' + str(cpf)):
rep = os.getcwd() + '/report/' + str(cpf) + str('/reportOption2.pdf')
else:
os.makedirs(os.getcwd() + '/report/' + str(cpf))
rep = os.getcwd() + '/report/' + str(cpf) + str('/reportOption2.pdf')
doc = SimpleDocTemplate(rep,pagesize=(A4),
rightMargin=30,leftMargin=30,
topMargin=30,bottomMargin=30)
styles=getSampleStyleSheet()
Story=[]
logo = os.getcwd() + '/payload/' + str('logo.png')
formatted_time = time.ctime()
#### Página 1 ######################################################################
cabecalho = ["SECRETARIA DE POLÍCIA CIVIL",
"SUBSECRETARIA DE INTELIGÊNCIA",
"COORDENAÇÃO DE P&D"]
styles.add(ParagraphStyle(name='Center', alignment=TA_CENTER))
ptext = '<font size="24">RELATÓRIO</font>'
Story.append(Paragraph(ptext, styles["Center"]))
Story.append(Spacer(1, 60))
im = Image(logo, 50*mm, 50*mm)
Story.append(im)
Story.append(Spacer(1, 60))
for part in cabecalho:
ptext = '<font size="16">%s</font>' % part.strip()
Story.append(Paragraph(ptext, styles["Center"]))
Story.append(Spacer(1, 12))
Story.append(Spacer(1, 144))
styles.add(ParagraphStyle(name='Justify', alignment=TA_JUSTIFY))
ptext = '<font size="12"> emitido: %s</font>' % formatted_time
Story.append(Paragraph(ptext, styles["Normal"]))
ptext = '<font size="12">Para: %s</font>' % name
Story.append(Paragraph(ptext, styles["Normal"]))
Story.append(PageBreak())
#### Página 2 ######################################################################
im = Image(logo, 20*mm, 20*mm)
im.hAlign = 'LEFT'
Story.append(im)
Story.append(Spacer(1, 48))
ptext = '<font size="12">Olá senhor %s</font>' %(name)
Story.append(Paragraph(ptext, styles["Justify"]))
Story.append(Spacer(1, 12))
ptext = '<font size="12">Este relatório foi produzido seguindo os seguintes parâmetros de pesquisa:</font>'
Story.append(Paragraph(ptext, styles["Justify"]))
Story.append(Spacer(1, 12))
ptext = '<font size="12">Placa: %s</font>'%(placa)
Story.append(Paragraph(ptext, styles["Justify"]))
ptext = '<font size="12">Período: De %s até %s</font>'%(dataInicial, dataFinal)
Story.append(Paragraph(ptext, styles["Justify"]))
Story.append(Spacer(1, 48))
graf = os.getcwd() + '/graph/' + str(cpf) + str('/graficos.png')
im = Image(graf, 180*mm, 155*mm)
Story.append(im)
Story.append(PageBreak())
#### Tabela #########################################################################
#Story.append(landscape(A4))
styleN = styles["Normal"]
# Make heading for each column and start data list
column1Heading = "COLUMN ONE HEADING"
column2Heading = "COLUMN TWO HEADING"
# Assemble data for each column using simple loop to append it into data list
data = [[column1Heading,column2Heading]]
#for i in table:
# data.append([str(i),str(i)])
data_ = table.drop(columns = ['idMovimento', 'Ano', 'Mes', 'Dia', 'Sem', 'Hora', 'Minuto', 'Segundo'])
data = [data_.columns[:,].values.astype(str).tolist()] + data_.values.tolist()
tableThatSplitsOverPages = Table(data, repeatRows=1)
tableThatSplitsOverPages.hAlign = 'CENTER'
tblStyle = TableStyle([('TEXTCOLOR',(0,0),(-1,-1),colors.black),
('FONTSIZE',(0,0),(-1,-1),5.5),
('VALIGN',(0,0),(-1,-1),'TOP'),
('LINEBELOW',(0,0),(-1,-1),1,colors.black),
('BOX',(0,0),(-1,-1),1,colors.black),
('BOX',(0,0),(-1,-1),1,colors.black)])
tblStyle.add('BACKGROUND',(0,0),(-1,0),colors.lightblue)
tblStyle.add('BACKGROUND',(0,1),(-1,-1),colors.white)
tableThatSplitsOverPages.setStyle(tblStyle)
Story.append(tableThatSplitsOverPages)
doc.build(Story,onFirstPage=relatorios.addPageNumber, onLaterPages=relatorios.addPageNumber) | 37.751111 | 143 | 0.542972 | 903 | 8,494 | 5.089701 | 0.222591 | 0.076588 | 0.009138 | 0.046997 | 0.737163 | 0.732158 | 0.71584 | 0.709095 | 0.700392 | 0.700392 | 0 | 0.031676 | 0.275253 | 8,494 | 225 | 144 | 37.751111 | 0.714912 | 0.068872 | 0 | 0.744361 | 0 | 0 | 0.151257 | 0.003459 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022556 | false | 0 | 0.075188 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
65c35e98e3a36e88b2e1e2a2001ddb5b7f610bee | 118 | py | Python | pytorch_toolbelt/utils/catalyst/__init__.py | valyukov/pytorch-toolbelt | 71a8c2ea98fae06c6eb165f8966c4b1e5c9e67fc | [
"MIT"
] | 377 | 2021-03-24T05:13:49.000Z | 2022-03-31T06:14:24.000Z | reference_code/GSNet-release/pytorch_toolbelt/utils/catalyst/__init__.py | PionnerLC/gsnet | 69c418fd5c8ec9ee90b4298888f59d9ce5b37749 | [
"MIT"
] | 89 | 2021-03-24T12:32:51.000Z | 2022-03-29T12:11:06.000Z | reference_code/GSNet-release/pytorch_toolbelt/utils/catalyst/__init__.py | PionnerLC/gsnet | 69c418fd5c8ec9ee90b4298888f59d9ce5b37749 | [
"MIT"
] | 51 | 2021-03-26T00:36:17.000Z | 2022-03-16T00:51:11.000Z | from __future__ import absolute_import
from .metrics import *
from .visualization import *
from .criterions import *
| 19.666667 | 38 | 0.805085 | 14 | 118 | 6.428571 | 0.5 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144068 | 118 | 5 | 39 | 23.6 | 0.891089 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b832515fa82e4f16879fbe53d09437adca524d56 | 118 | py | Python | niteru/utils.py | ninoseki/niteru | 9d05a221bffc6c399ce10586ce5546fdb0b483c8 | [
"MIT"
] | 3 | 2021-04-30T17:33:53.000Z | 2021-08-24T13:05:49.000Z | niteru/utils.py | ninoseki/niteru | 9d05a221bffc6c399ce10586ce5546fdb0b483c8 | [
"MIT"
] | null | null | null | niteru/utils.py | ninoseki/niteru | 9d05a221bffc6c399ce10586ce5546fdb0b483c8 | [
"MIT"
] | null | null | null | from niteru.dataclasses import ParsedHTML
def is_html(parsed: ParsedHTML) -> bool:
return len(parsed.tags) != 0
| 19.666667 | 41 | 0.737288 | 16 | 118 | 5.375 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010101 | 0.161017 | 118 | 5 | 42 | 23.6 | 0.858586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
b84ba8737bb63301cf75fd959ef76a6f4387de4e | 25 | py | Python | grove/factory/__init__.py | Mehmet-Erkan/grove.py | 9f0add055e5cac0b3b3533ba67bd485d0b4f99db | [
"MIT"
] | 122 | 2018-12-04T16:42:32.000Z | 2022-03-16T09:15:07.000Z | grove/factory/__init__.py | Mehmet-Erkan/grove.py | 9f0add055e5cac0b3b3533ba67bd485d0b4f99db | [
"MIT"
] | 28 | 2019-03-27T19:26:25.000Z | 2022-03-30T04:49:54.000Z | grove/factory/__init__.py | Mehmet-Erkan/grove.py | 9f0add055e5cac0b3b3533ba67bd485d0b4f99db | [
"MIT"
] | 91 | 2018-06-30T06:35:23.000Z | 2022-03-20T14:56:15.000Z |
from .factory import *
| 6.25 | 22 | 0.68 | 3 | 25 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.24 | 25 | 3 | 23 | 8.333333 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b869c5a7cef846dad8d14b3c0e0aea67109aff8d | 70 | py | Python | alarme/extras/action/gsm/__init__.py | insolite/alarme | 2312e88299a07d47435f475e5617213404e6d365 | [
"MIT"
] | null | null | null | alarme/extras/action/gsm/__init__.py | insolite/alarme | 2312e88299a07d47435f475e5617213404e6d365 | [
"MIT"
] | 1 | 2017-02-04T13:03:05.000Z | 2017-02-04T13:03:05.000Z | alarme/extras/action/gsm/__init__.py | insolite/alarme | 2312e88299a07d47435f475e5617213404e6d365 | [
"MIT"
] | null | null | null | from .call_action import CallAction
from .sms_action import SmsAction
| 23.333333 | 35 | 0.857143 | 10 | 70 | 5.8 | 0.7 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 70 | 2 | 36 | 35 | 0.935484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b86ff411b3c32138f0a0358edb1e12c165ded45d | 84 | py | Python | stocks/stocks.py | mhoss2008/Stocks | 44bdb4bbdf5aca88e84f71b3884b3d4953b4f025 | [
"MIT"
] | null | null | null | stocks/stocks.py | mhoss2008/Stocks | 44bdb4bbdf5aca88e84f71b3884b3d4953b4f025 | [
"MIT"
] | null | null | null | stocks/stocks.py | mhoss2008/Stocks | 44bdb4bbdf5aca88e84f71b3884b3d4953b4f025 | [
"MIT"
] | null | null | null | def New_Project():
"""
New Project Testing
"""
print("New Project") | 14 | 24 | 0.547619 | 9 | 84 | 5 | 0.555556 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.297619 | 84 | 6 | 24 | 14 | 0.762712 | 0.22619 | 0 | 0 | 0 | 0 | 0.22 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
b21ef953ad925f5d1bdebfc26c17865ad82e56ba | 2,891 | py | Python | ssh/ssh_testing.py | wirth-m/cs338 | f4bdc6629bbe4bda5c8f5a1989e51f6c92ca6633 | [
"MIT"
] | null | null | null | ssh/ssh_testing.py | wirth-m/cs338 | f4bdc6629bbe4bda5c8f5a1989e51f6c92ca6633 | [
"MIT"
] | null | null | null | ssh/ssh_testing.py | wirth-m/cs338 | f4bdc6629bbe4bda5c8f5a1989e51f6c92ca6633 | [
"MIT"
] | null | null | null | # Antonia Ritter and McKenna Wirth
# CS338 Spring 2022
# SSH File Formats Assignment
# RSA Testing
import math
p = 0x00f577cca4e0c4492ed39c896c8a52344dd3369daf380e5d0dfa241b84272f865f0462412337cd08355812e849f15e7c6e27429c8eb28d3129c038ba703841864d7deb64aaa5bce1e343e0fead0d98a4cca58c93d99348105851ea2f8535d177e4b35f3824a6db7a091805ce9f3ae3dd111fe49090c474992708f15c7bee763a3384a047fa6f88615696f3014fb3b790003c8de27df5272a269f4e81173817dee3808c48b5218d854aac310b959612b6636a7e94e2a9ed96e4e40cd14ead00e88b
q = 0x00df7e543cf1df2965e6ecce343ee5de780c0cb2d6b222e50c05770b867c4fa0a6ec386086f681d7641fd2f925331d1e5db993cd89cc6c96a38786a826ef1583e6a3047b1bd28fb4b46183772ee1256363113da89e86c3af4006b807d5eae0c90042bf2d7593ccebcdff881eba3ee8a4869bc07ae944b60bebc14770c007c84ecbad0e38145c4c7f82046b3bec0f3540da8b6d34096bdd336cfd5e381b2aa5eb48d9a8d4e7bf807b73b3b5593adaa2dd5c1ff9390164f9a69f9a689c5688393729
e = 65537
d = 4607533641130930694729632356118812402743921811935311628416463571209425490759360448675330320778431311999518993427979387327361314974987141187269672857138540366784774202164588694272228813990671790097106740681333742306296056959780759432132387164153899397525820626889525916370211217060981378740691263045588283518874692118264409565415991294509130305982267586687596990175876204430464852584322587335403218896868505118996830695512179554441929802744652203756582253770739133074144802537978567551417846310455685263620108936790823683642736554766116485829188875566624332318402235689205910925173153565541652074492169599826905082092605110088123257027107238360910565967130794581772283753586792586589270346424586070612502262704933500529864676386346251982681725596864901888052867987345437192524681718132956443891222768802422079706202869682674753703431856814971566771624893497979129549204409514206800253084934344277839057249762920897614544549681
n = 0x00d64c7d0018176f754615721c611ccf0b81665e5f3ee2009a437eb2ac4829f64031abc53d658f820fa91ccea72aa3b3b4bedcdee55e6983e31b9dcfd35d3f4bd06ac9b3cfbd859617bc60868a6e475866b93b164011c1437ebdd819ba947234e65909b1066a0bc8c76625feec16f49fa514f92eaf3c545bd739c2f002ba64151dc636383050f7c6f5d045fa6c2032a20e334d7c18d090b9a7e29a97d52a2fe66ea9822ad14eb68a28b911009eb5b9301f06e7e38227c040d15d36094e3f750ba2953442f8b97070cc5a9ed1acd5313cdf65b339c77b1cb55003801c6f2b1c2c80347b2997e522042c8353f9ce6533af2c058d0c28f79bb02a36f1aaf9bbd49840d425288baaf37b1cc5cdb41a64c5ebbc87f460c7dc6edbe0eec50a6d796e9eeb2e06c500aa98e0af816340c734fe64173ed5dc78f0f6cfda23ae3cf47b0872f9a3a41d495318211596b0cb25a10baf675cda95b971be787209d7ddb78c1ec0f0396320c2d274f55918c578be58bcdf28d6b5a029c20694256fed4034860e1b43
if n==p*q:
print("n = p*q as it should")
else:
print("n doesn't equal p*q")
lambdaN = math.lcm(p-1, q-1)
# print(f"lambda(n) = {lambdaN}")
check1 = math.gcd(e, lambdaN)
print(f"{check1} should be 1")
check2 = (e * d) % lambdaN
print(f"{check2} should be 1")
| 103.25 | 929 | 0.945002 | 78 | 2,891 | 35.025641 | 0.564103 | 0.002196 | 0.002196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.687165 | 0.032515 | 2,891 | 27 | 930 | 107.074074 | 0.289596 | 0.042892 | 0 | 0 | 0 | 0 | 0.028623 | 0 | 0 | 1 | 0.56087 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.066667 | 0.266667 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b23136ffc301df13ded06d233d8e03e80a1fbf03 | 12,291 | py | Python | packages/core/minos-microservice-saga/tests/test_saga/test_executions/test_saga/test_raw.py | sorasful/minos-python | 1189330eebf6444627a2af6b29f347670f95a4dd | [
"MIT"
] | 4 | 2021-04-12T15:40:14.000Z | 2021-04-17T18:54:15.000Z | packages/core/minos-microservice-saga/tests/test_saga/test_executions/test_saga/test_raw.py | sorasful/minos-python | 1189330eebf6444627a2af6b29f347670f95a4dd | [
"MIT"
] | 143 | 2021-04-06T08:51:09.000Z | 2022-01-28T11:58:06.000Z | tests/test_saga/test_executions/test_saga/test_raw.py | Clariteia/minos_microservice_saga | fd94d378c5560d04ff4c0557247761085f61e7d0 | [
"MIT"
] | null | null | null | import unittest
from unittest.mock import (
MagicMock,
patch,
)
from uuid import (
UUID,
uuid4,
)
from minos.saga import (
SagaContext,
SagaExecution,
SagaPausedExecutionStepException,
SagaResponse,
)
from tests.utils import (
ADD_ORDER,
Foo,
MinosTestCase,
)
class TestSagaExecution(MinosTestCase):
def setUp(self) -> None:
super().setUp()
self.user = uuid4()
self.publish_mock = MagicMock(side_effect=self.broker_publisher.send)
self.broker_publisher.send = self.publish_mock
def test_from_raw(self):
with patch("uuid.uuid4", return_value=UUID("a74d9d6d-290a-492e-afcc-70607958f65d")):
expected = SagaExecution.from_definition(ADD_ORDER, user=self.user)
observed = SagaExecution.from_raw(expected)
self.assertEqual(expected, observed)
def test_from_raw_without_user(self):
with patch("uuid.uuid4", return_value=UUID("a74d9d6d-290a-492e-afcc-70607958f65d")):
expected = SagaExecution.from_definition(ADD_ORDER)
observed = SagaExecution.from_raw(expected)
self.assertEqual(expected, observed)
def test_created(self):
with patch("uuid.uuid4", return_value=UUID("a74d9d6d-290a-492e-afcc-70607958f65d")):
execution = SagaExecution.from_definition(ADD_ORDER, user=self.user)
expected = {
"already_rollback": False,
"context": SagaContext().avro_str,
"definition": {
"committed": True,
"steps": [
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_order"},
"on_success": {"callback": "tests.utils.handle_order_success"},
"on_error": None,
"on_failure": {"callback": "tests.utils.send_delete_order"},
},
{
"cls": "minos.saga.definitions.steps.local.LocalSagaStep",
"on_execute": {"callback": "tests.utils.create_payment"},
"on_failure": {"callback": "tests.utils.delete_payment"},
},
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_ticket"},
"on_success": {"callback": "tests.utils.handle_ticket_success"},
"on_error": {"callback": "tests.utils.handle_ticket_error"},
"on_failure": {"callback": "tests.utils.send_delete_ticket"},
},
],
},
"executed_steps": [],
"paused_step": None,
"status": "created",
"user": str(self.user),
"uuid": "a74d9d6d-290a-492e-afcc-70607958f65d",
}
observed = execution.raw
self.assertEqual(
SagaContext.from_avro_str(expected.pop("context")), SagaContext.from_avro_str(observed.pop("context"))
)
self.assertEqual(expected, observed)
def test_created_without_user(self):
with patch("uuid.uuid4", return_value=UUID("a74d9d6d-290a-492e-afcc-70607958f65d")):
execution = SagaExecution.from_definition(ADD_ORDER)
expected = {
"already_rollback": False,
"context": SagaContext().avro_str,
"definition": {
"committed": True,
"steps": [
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_order"},
"on_success": {"callback": "tests.utils.handle_order_success"},
"on_error": None,
"on_failure": {"callback": "tests.utils.send_delete_order"},
},
{
"cls": "minos.saga.definitions.steps.local.LocalSagaStep",
"on_execute": {"callback": "tests.utils.create_payment"},
"on_failure": {"callback": "tests.utils.delete_payment"},
},
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_ticket"},
"on_success": {"callback": "tests.utils.handle_ticket_success"},
"on_error": {"callback": "tests.utils.handle_ticket_error"},
"on_failure": {"callback": "tests.utils.send_delete_ticket"},
},
],
},
"executed_steps": [],
"paused_step": None,
"status": "created",
"user": None,
"uuid": "a74d9d6d-290a-492e-afcc-70607958f65d",
}
observed = execution.raw
self.assertEqual(
SagaContext.from_avro_str(expected.pop("context")), SagaContext.from_avro_str(observed.pop("context"))
)
self.assertEqual(expected, observed)
async def test_partial_step(self):
raw = {
"already_rollback": False,
"context": SagaContext().avro_str,
"definition": {
"committed": True,
"steps": [
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_order"},
"on_success": {"callback": "tests.utils.handle_order_success"},
"on_error": None,
"on_failure": {"callback": "tests.utils.send_delete_order"},
},
{
"cls": "minos.saga.definitions.steps.local.LocalSagaStep",
"on_execute": {"callback": "tests.utils.create_payment"},
"on_failure": {"callback": "tests.utils.delete_payment"},
},
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_ticket"},
"on_success": {"callback": "tests.utils.handle_ticket_success"},
"on_error": {"callback": "tests.utils.handle_ticket_error"},
"on_failure": {"callback": "tests.utils.send_delete_ticket"},
},
],
},
"executed_steps": [],
"paused_step": {
"cls": "minos.saga.executions.steps.remote.RemoteSagaStepExecution",
"definition": {
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_order"},
"on_success": {"callback": "tests.utils.handle_order_success"},
"on_error": {"callback": "tests.utils.handle_ticket_error"},
"on_failure": {"callback": "tests.utils.send_delete_order"},
},
"status": "paused-by-on-execute",
"already_rollback": False,
},
"user": str(self.user),
"status": "paused",
"uuid": "a74d9d6d-290a-492e-afcc-70607958f65d",
}
with patch("uuid.uuid4", return_value=UUID("a74d9d6d-290a-492e-afcc-70607958f65d")):
expected = SagaExecution.from_definition(ADD_ORDER, user=self.user)
with self.assertRaises(SagaPausedExecutionStepException):
await expected.execute()
observed = SagaExecution.from_raw(raw)
self.assertEqual(expected, observed)
async def test_executed_step(self):
raw = {
"already_rollback": False,
"context": SagaContext(order=Foo("hola"), payment="payment").avro_str,
"definition": {
"committed": True,
"steps": [
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_order"},
"on_success": {"callback": "tests.utils.handle_order_success"},
"on_error": None,
"on_failure": {"callback": "tests.utils.send_delete_order"},
},
{
"cls": "minos.saga.definitions.steps.local.LocalSagaStep",
"on_execute": {"callback": "tests.utils.create_payment"},
"on_failure": {"callback": "tests.utils.delete_payment"},
},
{
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_ticket"},
"on_success": {"callback": "tests.utils.handle_ticket_success"},
"on_error": {"callback": "tests.utils.handle_ticket_error"},
"on_failure": {"callback": "tests.utils.send_delete_ticket"},
},
],
},
"executed_steps": [
{
"cls": "minos.saga.executions.steps.remote.RemoteSagaStepExecution",
"definition": {
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_order"},
"on_success": {"callback": "tests.utils.handle_order_success"},
"on_error": None,
"on_failure": {"callback": "tests.utils.send_delete_order"},
},
"status": "finished",
"related_services": ["ticket", "order"],
"already_rollback": False,
},
{
"cls": "minos.saga.executions.steps.local.LocalSagaStepExecution",
"definition": {
"cls": "minos.saga.definitions.steps.local.LocalSagaStep",
"on_execute": {"callback": "tests.utils.create_payment"},
"on_failure": {"callback": "tests.utils.delete_payment"},
},
"status": "finished",
"related_services": ["order"],
"already_rollback": False,
},
],
"paused_step": {
"cls": "minos.saga.executions.steps.remote.RemoteSagaStepExecution",
"definition": {
"cls": "minos.saga.definitions.steps.remote.RemoteSagaStep",
"on_execute": {"callback": "tests.utils.send_create_ticket"},
"on_success": {"callback": "tests.utils.handle_ticket_success"},
"on_error": {"callback": "tests.utils.handle_ticket_error"},
"on_failure": {"callback": "tests.utils.send_delete_ticket"},
},
"status": "paused-by-on-execute",
"already_rollback": False,
"related_services": ["order"],
},
"user": str(self.user),
"status": "paused",
"uuid": "a74d9d6d-290a-492e-afcc-70607958f65d",
}
with patch("uuid.uuid4", return_value=UUID("a74d9d6d-290a-492e-afcc-70607958f65d")):
expected = SagaExecution.from_definition(ADD_ORDER, user=self.user)
with self.assertRaises(SagaPausedExecutionStepException):
await expected.execute()
response = SagaResponse(Foo("hola"), {"ticket"})
with self.assertRaises(SagaPausedExecutionStepException):
await expected.execute(response)
observed = SagaExecution.from_raw(raw)
self.assertEqual(expected, observed)
if __name__ == "__main__":
unittest.main()
| 45.522222 | 114 | 0.515825 | 1,020 | 12,291 | 5.997059 | 0.096078 | 0.081739 | 0.144188 | 0.079124 | 0.901586 | 0.888998 | 0.887363 | 0.880007 | 0.840772 | 0.817721 | 0 | 0.026187 | 0.353755 | 12,291 | 269 | 115 | 45.69145 | 0.743925 | 0 | 0 | 0.649402 | 0 | 0 | 0.369457 | 0.230331 | 0 | 0 | 0 | 0 | 0.043825 | 1 | 0.01992 | false | 0 | 0.01992 | 0 | 0.043825 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b24796e60981077ce50ff945bbe53b8b06e9e3a9 | 1,702 | py | Python | 08/euler8.py | adamkkarl/ProjectEuler | 767286c74a484b7a569d6060ab6d54a298195aa3 | [
"MIT"
] | 2 | 2018-05-07T00:16:57.000Z | 2018-05-22T02:57:16.000Z | 08/euler8.py | adamkkarl/ProjectEuler | 767286c74a484b7a569d6060ab6d54a298195aa3 | [
"MIT"
] | null | null | null | 08/euler8.py | adamkkarl/ProjectEuler | 767286c74a484b7a569d6060ab6d54a298195aa3 | [
"MIT"
] | null | null | null | #!/bin/python3
__author__ = "Adam Karl"
"""Find the greatest product of K consecutive digits in the N digit number"""
#https://projecteuler.net/problem=8
largeNum = """73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450""".replace("\n", "")
def largestProductInSeries(k, n):
max_prod = 0
for start in range(len(n)-k):
product = 1
for index in range(k):
product *= int(n[start+index])
if product > max_prod:
max_prod = product
return max_prod
def main():
print("Find the greatest product of K consecutive digits. K = ", end="")
k = int(input())
print("Largest product is %d" % largestProductInSeries(k, largeNum))
if __name__ == "__main__":
main()
| 36.212766 | 77 | 0.834313 | 110 | 1,702 | 12.763636 | 0.6 | 0.019943 | 0.021368 | 0.031339 | 0.059829 | 0.059829 | 0.059829 | 0.059829 | 0 | 0 | 0 | 0.662706 | 0.109871 | 1,702 | 46 | 78 | 37 | 0.264026 | 0.027615 | 0 | 0 | 0 | 0 | 0.706853 | 0.634518 | 0 | 1 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0 | 0 | 0.083333 | 0.055556 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b24f129ee25c2a640cd84c925cc969bd1991e17a | 120 | py | Python | src/testcases/CreateMethod/out.py | BlackBeard98/Code-Generation | 769ef8fd9162c0fae6326dc9d08e20bb5d20c067 | [
"MIT"
] | 1 | 2021-11-03T01:24:21.000Z | 2021-11-03T01:24:21.000Z | src/testcases/CreateMethod/out.py | BlackBeard98/Code-Generation | 769ef8fd9162c0fae6326dc9d08e20bb5d20c067 | [
"MIT"
] | null | null | null | src/testcases/CreateMethod/out.py | BlackBeard98/Code-Generation | 769ef8fd9162c0fae6326dc9d08e20bb5d20c067 | [
"MIT"
] | null | null | null | class A:
def __init__(self) ->None:
pass
def A_generated(self, name='test'):
print(f'{name}')
| 15 | 39 | 0.541667 | 16 | 120 | 3.75 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.3 | 120 | 7 | 40 | 17.142857 | 0.714286 | 0 | 0 | 0 | 1 | 0 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0.2 | 0 | 0 | 0.6 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
b279fd66f981d8785b547e04a2bc7f3c54ddee6e | 46 | py | Python | build/lib.linux-x86_64-2.7/ryu/app/experiments/PureSDN/test2.py | Helloworld1995/Ryu_SDN_Controller | 2680f967debca361adc6ff14ddadcbbcde0c7082 | [
"Apache-2.0"
] | 1 | 2021-03-11T01:47:35.000Z | 2021-03-11T01:47:35.000Z | build/lib.linux-x86_64-2.7/ryu/app/experiments/PureSDN/test2.py | Helloworld1995/Ryu_SDN_Controller | 2680f967debca361adc6ff14ddadcbbcde0c7082 | [
"Apache-2.0"
] | null | null | null | build/lib.linux-x86_64-2.7/ryu/app/experiments/PureSDN/test2.py | Helloworld1995/Ryu_SDN_Controller | 2680f967debca361adc6ff14ddadcbbcde0c7082 | [
"Apache-2.0"
] | null | null | null | list=[123,321,77,68]
_list=list[:]
print _list | 15.333333 | 20 | 0.717391 | 9 | 46 | 3.444444 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.232558 | 0.065217 | 46 | 3 | 21 | 15.333333 | 0.488372 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.333333 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b2a439342929887299c65dd4de056c20c92ce1d1 | 72 | py | Python | collagen/callbacks/logging/__init__.py | MIPT-Oulu/Collagen | 0cbc4285d60e5c9fcc89f629fcf4321e80b7452c | [
"MIT"
] | 4 | 2019-05-14T14:44:51.000Z | 2020-03-13T08:37:48.000Z | collagen/callbacks/logging/__init__.py | MIPT-Oulu/Collagen | 0cbc4285d60e5c9fcc89f629fcf4321e80b7452c | [
"MIT"
] | 26 | 2019-04-21T20:35:22.000Z | 2022-03-12T00:32:57.000Z | collagen/callbacks/logging/__init__.py | MIPT-Oulu/Collagen | 0cbc4285d60e5c9fcc89f629fcf4321e80b7452c | [
"MIT"
] | 1 | 2019-05-14T14:53:28.000Z | 2019-05-14T14:53:28.000Z | from ._git import *
from .loggers import *
from .visualization import *
| 18 | 28 | 0.75 | 9 | 72 | 5.888889 | 0.555556 | 0.377358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 72 | 3 | 29 | 24 | 0.883333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a22c3c35df1601dc7aa311a08b32a3722d77a3f0 | 1,175 | py | Python | memorious/tests/test_extract.py | x0rzkov/memorious | 88454a4e6bfe6621be3064acac0e8cbe1ae396fe | [
"MIT"
] | null | null | null | memorious/tests/test_extract.py | x0rzkov/memorious | 88454a4e6bfe6621be3064acac0e8cbe1ae396fe | [
"MIT"
] | null | null | null | memorious/tests/test_extract.py | x0rzkov/memorious | 88454a4e6bfe6621be3064acac0e8cbe1ae396fe | [
"MIT"
] | null | null | null | import os
import tempfile
from memorious.operations.extract import extract_7zip, extract_tar, extract_zip
def test_extract_7zip(context):
file_path = os.path.realpath(__file__)
archive_path = os.path.normpath(os.path.join(
file_path, "../testdata/test.7z"
))
extract_dir = tempfile.mkdtemp(prefix="memorious_test")
assert extract_7zip(archive_path, extract_dir, context) == [
os.path.join(extract_dir, "test/a/1.txt")
]
def test_extract_zip(context):
file_path = os.path.realpath(__file__)
archive_path = os.path.normpath(os.path.join(
file_path, "../testdata/test.zip"
))
extract_dir = tempfile.mkdtemp(prefix="memorious_test")
assert extract_zip(archive_path, extract_dir, context) == [
os.path.join(extract_dir, "test/a/1.txt")
]
def test_extract_tar(context):
file_path = os.path.realpath(__file__)
archive_path = os.path.normpath(os.path.join(
file_path, "../testdata/test.tar.gz"
))
extract_dir = tempfile.mkdtemp(prefix="memorious_test")
assert extract_tar(archive_path, extract_dir, context) == [
os.path.join(extract_dir, "test/a/1.txt")
]
| 31.756757 | 79 | 0.693617 | 160 | 1,175 | 4.79375 | 0.18125 | 0.093872 | 0.078227 | 0.066493 | 0.826597 | 0.826597 | 0.826597 | 0.826597 | 0.826597 | 0.603651 | 0 | 0.007216 | 0.174468 | 1,175 | 36 | 80 | 32.638889 | 0.783505 | 0 | 0 | 0.5 | 0 | 0 | 0.119149 | 0.019574 | 0 | 0 | 0 | 0 | 0.1 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a249d1aa6036ee17924e1a86f1320ab04dd872ea | 51 | py | Python | lambda_scraper/drivers/driver.py | Galsor/lambda-scraper | ef38b47f16db5b415fbc67a31ec093f5d0b8cda1 | [
"MIT"
] | null | null | null | lambda_scraper/drivers/driver.py | Galsor/lambda-scraper | ef38b47f16db5b415fbc67a31ec093f5d0b8cda1 | [
"MIT"
] | null | null | null | lambda_scraper/drivers/driver.py | Galsor/lambda-scraper | ef38b47f16db5b415fbc67a31ec093f5d0b8cda1 | [
"MIT"
] | null | null | null | from abc import ABC
class Driver(ABC):
pass
| 7.285714 | 19 | 0.666667 | 8 | 51 | 4.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.27451 | 51 | 6 | 20 | 8.5 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
a258fc893601570348ba8775a345ac0e2fa99a95 | 69 | py | Python | drf_jsonapi/serializers/__init__.py | kelly-vacasa/drf-jsonapi | 1fef7ef18078aa29a84cf01833fab99cef44b6a3 | [
"MIT"
] | null | null | null | drf_jsonapi/serializers/__init__.py | kelly-vacasa/drf-jsonapi | 1fef7ef18078aa29a84cf01833fab99cef44b6a3 | [
"MIT"
] | null | null | null | drf_jsonapi/serializers/__init__.py | kelly-vacasa/drf-jsonapi | 1fef7ef18078aa29a84cf01833fab99cef44b6a3 | [
"MIT"
] | null | null | null | from .objects import *
from .utils import *
from .resources import *
| 17.25 | 24 | 0.73913 | 9 | 69 | 5.666667 | 0.555556 | 0.392157 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 69 | 3 | 25 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a2fb49d6102e864f6a0134bd3d4a50bd62e6ed6e | 27 | py | Python | docs/archive/slash/__init__.py | sjdv1982/seamless | 1b814341e74a56333c163f10e6f6ceab508b7df9 | [
"MIT"
] | 15 | 2017-06-07T12:49:12.000Z | 2020-07-25T18:06:04.000Z | docs/archive/slash/__init__.py | sjdv1982/seamless | 1b814341e74a56333c163f10e6f6ceab508b7df9 | [
"MIT"
] | 110 | 2016-06-21T23:20:44.000Z | 2022-02-24T16:15:22.000Z | docs/archive/slash/__init__.py | sjdv1982/seamless | 1b814341e74a56333c163f10e6f6ceab508b7df9 | [
"MIT"
] | 6 | 2016-06-21T11:19:22.000Z | 2019-01-21T13:45:39.000Z | from .slash0 import slash0
| 13.5 | 26 | 0.814815 | 4 | 27 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 0.148148 | 27 | 1 | 27 | 27 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0c018e74ce66f9347dc1894ceaf4244f674b1b66 | 154 | py | Python | flowx/poisson/__init__.py | AbhilashReddyM/flowX | 57efa4c076d40d559ffc82ce541d340cf2519a3d | [
"BSD-3-Clause"
] | null | null | null | flowx/poisson/__init__.py | AbhilashReddyM/flowX | 57efa4c076d40d559ffc82ce541d340cf2519a3d | [
"BSD-3-Clause"
] | 2 | 2020-04-14T20:00:37.000Z | 2020-04-17T21:26:23.000Z | flowx/poisson/__init__.py | AbhilashReddyM/flowX | 57efa4c076d40d559ffc82ce541d340cf2519a3d | [
"BSD-3-Clause"
] | 3 | 2020-04-18T22:30:55.000Z | 2021-06-08T15:23:21.000Z | from .jacobi import solve_jacobi
from .direct import solve_direct
from .cg import solve_cg
from .superlu import solve_lu
from .sparse import build_sparse
| 25.666667 | 32 | 0.837662 | 25 | 154 | 4.96 | 0.4 | 0.354839 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12987 | 154 | 5 | 33 | 30.8 | 0.925373 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0c277adab45fde4735fdf5f01d28a9aa250ca662 | 27 | py | Python | michiru/transports/__init__.py | moeIO/michiru | f1bafb90c2d82debee9e0402b426eba592038f24 | [
"WTFPL"
] | 1 | 2018-01-25T15:39:12.000Z | 2018-01-25T15:39:12.000Z | michiru/transports/__init__.py | moeIO/michiru | f1bafb90c2d82debee9e0402b426eba592038f24 | [
"WTFPL"
] | null | null | null | michiru/transports/__init__.py | moeIO/michiru | f1bafb90c2d82debee9e0402b426eba592038f24 | [
"WTFPL"
] | null | null | null | from . import irc, discord
| 13.5 | 26 | 0.740741 | 4 | 27 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185185 | 27 | 1 | 27 | 27 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a74f59bed43289e9bca9d8f77566c687ee2274bc | 41 | py | Python | Python_proficiency_test/latex/codes/18b.py | ALFA-group/neural_program_comprehension | 0253911f376cf282af5a5627e38e0a591ad38860 | [
"MIT"
] | 6 | 2020-04-24T08:16:51.000Z | 2021-11-01T09:50:46.000Z | Python_proficiency_test/latex/codes/18b.py | ALFA-group/neural_program_comprehension | 0253911f376cf282af5a5627e38e0a591ad38860 | [
"MIT"
] | null | null | null | Python_proficiency_test/latex/codes/18b.py | ALFA-group/neural_program_comprehension | 0253911f376cf282af5a5627e38e0a591ad38860 | [
"MIT"
] | 4 | 2021-02-17T20:21:31.000Z | 2022-02-14T12:43:23.000Z | x: ['secret value!']
y: ['secret value!'] | 20.5 | 20 | 0.585366 | 6 | 41 | 4 | 0.666667 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121951 | 41 | 2 | 21 | 20.5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0.619048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a7557b2f7067d6cdcf76bf842fc578d3687d8bea | 882 | py | Python | tests/test_text.py | Kwentar/Eve | fa4af5d795030d4e214f73bc1100b049792274e1 | [
"MIT"
] | null | null | null | tests/test_text.py | Kwentar/Eve | fa4af5d795030d4e214f73bc1100b049792274e1 | [
"MIT"
] | null | null | null | tests/test_text.py | Kwentar/Eve | fa4af5d795030d4e214f73bc1100b049792274e1 | [
"MIT"
] | null | null | null | import unittest
from model.text import Text
class TestText(unittest.TestCase):
def test_split(self):
text = '''Белеет парус одинокий
В тумане неба голубом.
Что ищет он в стране далекой?
Что кинул он в стране родной?'''
t = Text(text)
self.assertListEqual([x.paragraph for x in t.paragraphs],
['Белеет парус одинокий', 'В тумане неба голубом.',
'Что ищет он в стране далекой?', 'Что кинул он в стране родной?'])
text = '''Здравствуй!
Как твои дела? Что нового? У меня все хорошо.
Пишу тебе, потому что мне нужен этот тест.'''
t = Text(text)
self.assertListEqual([x.paragraph for x in t.paragraphs],
['Здравствуй!', 'Как твои дела? Что нового? У меня все хорошо.',
'Пишу тебе, потому что мне нужен этот тест.'])
| 40.090909 | 96 | 0.590703 | 115 | 882 | 4.521739 | 0.426087 | 0.023077 | 0.069231 | 0.076923 | 0.826923 | 0.826923 | 0.826923 | 0.826923 | 0.826923 | 0.826923 | 0 | 0 | 0.312925 | 882 | 21 | 97 | 42 | 0.858086 | 0 | 0 | 0.210526 | 0 | 0 | 0.456916 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 1 | 0.052632 | false | 0 | 0.105263 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a79e86dc83dbc263b828e0481623fe9733646ac2 | 36 | py | Python | csinventorypy/__init__.py | JannesT3011/csinventory-py | eb46354ad74b3439e9c8a069481d8c3d14664f5a | [
"MIT"
] | null | null | null | csinventorypy/__init__.py | JannesT3011/csinventory-py | eb46354ad74b3439e9c8a069481d8c3d14664f5a | [
"MIT"
] | null | null | null | csinventorypy/__init__.py | JannesT3011/csinventory-py | eb46354ad74b3439e9c8a069481d8c3d14664f5a | [
"MIT"
] | null | null | null | from .csinventory import CSInventory | 36 | 36 | 0.888889 | 4 | 36 | 8 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.969697 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ac105021ef7f405c5d9b08a32cad01215b8910bc | 4,572 | py | Python | py/testdir_single_jvm/test_exec2_cbind_fail.py | jeffreybreen/h2o | 1d1bc87c13ca81e67f333fa6146959dc70af4d6c | [
"Apache-2.0"
] | 1 | 2017-12-14T04:57:51.000Z | 2017-12-14T04:57:51.000Z | py/testdir_single_jvm/test_exec2_cbind_fail.py | hansight/h2o | d7eee8b69dd3538b83412989eee361d2a2883857 | [
"Apache-2.0"
] | null | null | null | py/testdir_single_jvm/test_exec2_cbind_fail.py | hansight/h2o | d7eee8b69dd3538b83412989eee361d2a2883857 | [
"Apache-2.0"
] | 1 | 2018-12-18T06:18:55.000Z | 2018-12-18T06:18:55.000Z | import unittest, random, sys, time
sys.path.extend(['.','..','py'])
import h2o, h2o_exec as h2e, h2o_hosts, h2o_import as h2i
class Basic(unittest.TestCase):
def tearDown(self):
h2o.check_sandbox_for_errors()
@classmethod
def setUpClass(cls):
global SEED, localhost
SEED = h2o.setup_random_seed()
localhost = h2o.decide_if_localhost()
if (localhost):
h2o.build_cloud(1)
else:
h2o_hosts.build_cloud_with_hosts(1)
@classmethod
def tearDownClass(cls):
h2o.tear_down_cloud()
@unittest.skip("Skip RefCnt Failing Test")
def test_exec2_cbind_fail1(self):
for i in range(5):
# execExpr = "a=c(0,0,0); b=c(0,0,0)"
execExpr = "a=c(0,0,0)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "b = a"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
h2o.check_sandbox_for_errors()
def test_exec2_cbind_fail2(self):
for i in range(5):
execExpr = "a=c(0,0,0); b=c(0,0,0)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
h2o.check_sandbox_for_errors()
@unittest.skip("Skip RefCnt Failing Test")
def test_exec2_cbind_fail3(self):
for i in range(5):
execExpr = "h <- cbind(c(0,0,0), c(1,1,1))"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
# have to make sure they're created as keys for reuse between execs
execExpr = "a=c(0,0,0); b=c(0,0,0); d=c(0,0,0); e=c(0,0,0); f=c(0,0,0); g= c(0,0,0);"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "b=a; d=a; f=a; g=a;"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b, d)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b, d, e)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b, d, e, f)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "h <- cbind(a, b, d, e, f, g)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
h2o.check_sandbox_for_errors()
@unittest.skip("Skip RefCnt Failing Test")
def test_exec2_cbind_fail4(self):
for i in range(5):
execExpr = "b=c(0,0,0,0,0,0,0,0,0,0,0,0)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
# have to make sure they're created as keys for reuse between execs
execExpr = "a=b"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "d=b"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "cbind(a,b,d)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
h2o.check_sandbox_for_errors()
@unittest.skip("Skip RefCnt Failing Test")
def test_exec2_cbind_fail5(self):
print "try combining different compression schemes"
for i in range(5):
execExpr = "b=c(0,0,0,0,0,0,0,0,0,0,0,0)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = 'a=runif(b, -1)',
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "d=b"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "cbind(a,b,d)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
h2o.check_sandbox_for_errors()
@unittest.skip("Skip RefCnt Failing Test")
def test_exec2_cbind_fail6(self):
print "fails with argument exception if rows are unequal size"
for i in range(5):
execExpr = "b=c(0,0,0,0,0,0,0,0,0,0,0,0)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
# have to make sure they're created as keys for reuse between execs
execExpr = "a=c(0,0,0,0,0,0,0,0,0,0,0,NA)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "d=b"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
execExpr = "cbind(a,b,d)"
h2e.exec_expr(execExpr=execExpr, timeoutSecs=30)
h2o.check_sandbox_for_errors()
if __name__ == '__main__':
h2o.unit_main()
| 37.170732 | 97 | 0.590332 | 653 | 4,572 | 4 | 0.151608 | 0.051302 | 0.058576 | 0.053599 | 0.792879 | 0.783691 | 0.780245 | 0.769525 | 0.76876 | 0.76876 | 0 | 0.060891 | 0.277997 | 4,572 | 122 | 98 | 37.47541 | 0.730385 | 0.050962 | 0 | 0.6 | 0 | 0.052632 | 0.162667 | 0.026073 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.021053 | null | null | 0.021053 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ac15254d55b2b5a458a5619646411028c57eb401 | 7,004 | py | Python | python/oneflow/test/modules/test_chunk.py | wangyuyue/oneflow | 0a71c22fe8355392acc8dc0e301589faee4c4832 | [
"Apache-2.0"
] | 1 | 2021-09-13T02:34:53.000Z | 2021-09-13T02:34:53.000Z | python/oneflow/test/modules/test_chunk.py | wangyuyue/oneflow | 0a71c22fe8355392acc8dc0e301589faee4c4832 | [
"Apache-2.0"
] | null | null | null | python/oneflow/test/modules/test_chunk.py | wangyuyue/oneflow | 0a71c22fe8355392acc8dc0e301589faee4c4832 | [
"Apache-2.0"
] | 1 | 2021-01-17T03:34:39.000Z | 2021-01-17T03:34:39.000Z | """
Copyright 2020 The OneFlow Authors. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import unittest
from collections import OrderedDict
import numpy as np
from test_util import GenArgList
import oneflow as flow
import oneflow.unittest
def _test_2_dim_forward(test_case, device):
np_arr = np.random.randn(2, 3).astype(np.float32)
input = flow.Tensor(np_arr, device=flow.device(device))
dim = 0
chunks = 2
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(1, 3), (1, 3)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 1
chunks = 2
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(2, 1), (2, 2)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 1
chunks = 3
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(2, 1), (2, 1), (2, 1)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
def _test_2_dim_tensor_function_forward(test_case, device):
np_arr = np.random.randn(2, 3).astype(np.float32)
input = flow.Tensor(np_arr, device=flow.device(device))
dim = 0
chunks = 2
of_out = input.chunk(chunks, dim)
np_out_shape = [(1, 3), (1, 3)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 1
chunks = 2
of_out = input.chunk(chunks, dim)
np_out_shape = [(2, 1), (2, 2)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 1
chunks = 3
of_out = input.chunk(chunks, dim)
np_out_shape = [(2, 1), (2, 1), (2, 1)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
def _test_4_dim_forward(test_case, device):
np_arr = np.random.randn(5, 3, 6, 9).astype(np.float32)
input = flow.Tensor(np_arr, device=flow.device(device))
dim = 2
chunks = 3
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(5, 3, 2, 9), (5, 3, 2, 9), (5, 3, 2, 9)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 2
chunks = 4
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(5, 3, 1, 9), (5, 3, 1, 9), (5, 3, 1, 9), (5, 3, 3, 9)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 3
chunks = 3
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(5, 3, 6, 3), (5, 3, 6, 3), (5, 3, 6, 3)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 3
chunks = 2
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(5, 3, 6, 4), (5, 3, 6, 5)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 3
chunks = 4
of_out = flow.chunk(input, chunks, dim)
np_out_shape = [(5, 3, 6, 2), (5, 3, 6, 2), (5, 3, 6, 2), (5, 3, 6, 3)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
def _test_4_dim_tensor_function_forward(test_case, device):
np_arr = np.random.randn(5, 3, 6, 9).astype(np.float32)
input = flow.Tensor(np_arr, device=flow.device(device))
dim = 2
chunks = 3
of_out = input.chunk(chunks, dim)
np_out_shape = [(5, 3, 2, 9), (5, 3, 2, 9), (5, 3, 2, 9)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 2
chunks = 4
of_out = input.chunk(chunks, dim)
np_out_shape = [(5, 3, 1, 9), (5, 3, 1, 9), (5, 3, 1, 9), (5, 3, 3, 9)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 3
chunks = 3
of_out = input.chunk(chunks, dim)
np_out_shape = [(5, 3, 6, 3), (5, 3, 6, 3), (5, 3, 6, 3)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 3
chunks = 2
of_out = input.chunk(chunks, dim)
np_out_shape = [(5, 3, 6, 4), (5, 3, 6, 5)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
dim = 3
chunks = 4
of_out = input.chunk(chunks, dim)
np_out_shape = [(5, 3, 6, 2), (5, 3, 6, 2), (5, 3, 6, 2), (5, 3, 6, 3)]
for i in range(0, chunks):
of_out_shape = of_out[i].numpy().shape
test_case.assertTrue(np.allclose(of_out_shape, np_out_shape[i], 1e-05, 1e-05))
def _test_chunk_backward(test_case, device):
np_arr = np.random.randn(2, 3).astype(np.float32)
input = flow.Tensor(np_arr, device=flow.device(device))
input.requires_grad = True
y = flow.chunk(input, chunks=2, dim=0)
(z1, z2) = (y[0].sum(), y[1].sum())
z1.backward()
z2.backward()
np_grad = np.ones((2, 3))
test_case.assertTrue(np.array_equal(input.grad.numpy(), np_grad))
@flow.unittest.skip_unless_1n1d()
class TestChunk(flow.unittest.TestCase):
def test_chunk(test_case):
arg_dict = OrderedDict()
arg_dict["test_fun"] = [
_test_2_dim_forward,
_test_4_dim_forward,
_test_2_dim_tensor_function_forward,
_test_4_dim_tensor_function_forward,
_test_chunk_backward,
]
arg_dict["device"] = ["cpu", "cuda"]
for arg in GenArgList(arg_dict):
arg[0](test_case, *arg[1:])
if __name__ == "__main__":
unittest.main()
| 37.058201 | 86 | 0.628355 | 1,218 | 7,004 | 3.399015 | 0.106732 | 0.077295 | 0.077295 | 0.082126 | 0.767874 | 0.75942 | 0.75942 | 0.744203 | 0.744203 | 0.744203 | 0 | 0.064978 | 0.222159 | 7,004 | 188 | 87 | 37.255319 | 0.694934 | 0.082953 | 0 | 0.782051 | 0 | 0 | 0.004521 | 0 | 0 | 0 | 0 | 0 | 0.108974 | 1 | 0.038462 | false | 0 | 0.038462 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ac199358f20b16eca8276bd3e1213f27b2834564 | 48 | py | Python | py_ice_cascade/hillslope/tests/test_null.py | keithfma/py_ice_cascade | 22fe9d51777c52dfff7cce2be9f3ed1b3c6f8bc2 | [
"MIT"
] | null | null | null | py_ice_cascade/hillslope/tests/test_null.py | keithfma/py_ice_cascade | 22fe9d51777c52dfff7cce2be9f3ed1b3c6f8bc2 | [
"MIT"
] | null | null | null | py_ice_cascade/hillslope/tests/test_null.py | keithfma/py_ice_cascade | 22fe9d51777c52dfff7cce2be9f3ed1b3c6f8bc2 | [
"MIT"
] | null | null | null | # TODO: add tests to confirm expected behaviors
| 24 | 47 | 0.791667 | 7 | 48 | 5.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 48 | 1 | 48 | 48 | 0.95 | 0.9375 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 1 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ac813c1df4cbc2cde6004408c688eed3f8601322 | 26 | py | Python | commands/lights/__init__.py | bcbwilla/bcbbot | 4dbfd011aed270464391c33129a46f4c362999d8 | [
"MIT"
] | 1 | 2016-03-03T17:37:27.000Z | 2016-03-03T17:37:27.000Z | commands/lights/__init__.py | bcbwilla/bcbbot | 4dbfd011aed270464391c33129a46f4c362999d8 | [
"MIT"
] | null | null | null | commands/lights/__init__.py | bcbwilla/bcbbot | 4dbfd011aed270464391c33129a46f4c362999d8 | [
"MIT"
] | null | null | null | from lights import Lights
| 13 | 25 | 0.846154 | 4 | 26 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3bb87b7fc739c681b427f3b188cb7870acbff010 | 39 | py | Python | test/test_version.py | maple-q/python-web-framwork | 8b7b1f9f6b0d0d735a654c32d8c849151968c99f | [
"MIT"
] | null | null | null | test/test_version.py | maple-q/python-web-framwork | 8b7b1f9f6b0d0d735a654c32d8c849151968c99f | [
"MIT"
] | null | null | null | test/test_version.py | maple-q/python-web-framwork | 8b7b1f9f6b0d0d735a654c32d8c849151968c99f | [
"MIT"
] | null | null | null | import tramp
print(tramp.__version__)
| 9.75 | 24 | 0.820513 | 5 | 39 | 5.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 39 | 3 | 25 | 13 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
3bce5b06a9175687331cd047c22f6199d6e3271b | 215 | py | Python | main.py | hichem2h/task-analyzer | a69902888fed468acae129b83eea4b801b1b3069 | [
"Apache-2.0"
] | null | null | null | main.py | hichem2h/task-analyzer | a69902888fed468acae129b83eea4b801b1b3069 | [
"Apache-2.0"
] | null | null | null | main.py | hichem2h/task-analyzer | a69902888fed468acae129b83eea4b801b1b3069 | [
"Apache-2.0"
] | null | null | null | from task_analyzer import TaskAnalyzer
from task01_tests import check_remainder, check_negative
task_analyzer = TaskAnalyzer([check_remainder, check_negative])
task_analyzer.analyze()
print(task_analyzer.failures) | 30.714286 | 63 | 0.865116 | 27 | 215 | 6.555556 | 0.481481 | 0.271186 | 0.214689 | 0.305085 | 0.440678 | 0.440678 | 0 | 0 | 0 | 0 | 0 | 0.01005 | 0.074419 | 215 | 7 | 64 | 30.714286 | 0.879397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0.2 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
3bd955efb0da641159a5533833ebd31564230687 | 55 | py | Python | envs/swimmer_rep/swimmer_rep/envs/__init__.py | hari-sikchi/stable-baselines | f3627c4b8625c1021a1a893b0a2fd8bfed9e84ed | [
"MIT"
] | null | null | null | envs/swimmer_rep/swimmer_rep/envs/__init__.py | hari-sikchi/stable-baselines | f3627c4b8625c1021a1a893b0a2fd8bfed9e84ed | [
"MIT"
] | null | null | null | envs/swimmer_rep/swimmer_rep/envs/__init__.py | hari-sikchi/stable-baselines | f3627c4b8625c1021a1a893b0a2fd8bfed9e84ed | [
"MIT"
] | null | null | null | from swimmer_rep.envs.swimmer_rep import SwimmerEnvRep
| 27.5 | 54 | 0.890909 | 8 | 55 | 5.875 | 0.75 | 0.425532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072727 | 55 | 1 | 55 | 55 | 0.921569 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ce17871c3719ef5f66116afdb0c29a4e27c07015 | 43 | py | Python | PyPortOpt/Optimizers/test.py | 2037/PyPortOpt | dabf9bd2e34ab533e2ab85db335027339933326f | [
"MIT"
] | null | null | null | PyPortOpt/Optimizers/test.py | 2037/PyPortOpt | dabf9bd2e34ab533e2ab85db335027339933326f | [
"MIT"
] | null | null | null | PyPortOpt/Optimizers/test.py | 2037/PyPortOpt | dabf9bd2e34ab533e2ab85db335027339933326f | [
"MIT"
] | null | null | null | def displayText():
print("Optimized!")
| 14.333333 | 23 | 0.651163 | 4 | 43 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 43 | 2 | 24 | 21.5 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.232558 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
ce1d0e4162f8794e2519a32ee8ea948d38349d1e | 7,120 | py | Python | mealpy/bio_based/BBO.py | Alhassan20/mealpy | 7ed365c5c495ad1c1e066662c90159b3d5e9b8e3 | [
"MIT"
] | 162 | 2020-08-31T10:13:06.000Z | 2022-03-31T09:38:19.000Z | mealpy/bio_based/BBO.py | Alhassan20/mealpy | 7ed365c5c495ad1c1e066662c90159b3d5e9b8e3 | [
"MIT"
] | 51 | 2020-09-13T10:46:31.000Z | 2022-03-30T06:12:08.000Z | mealpy/bio_based/BBO.py | Alhassan20/mealpy | 7ed365c5c495ad1c1e066662c90159b3d5e9b8e3 | [
"MIT"
] | 58 | 2020-09-12T13:29:18.000Z | 2022-03-31T09:38:21.000Z | #!/usr/bin/env python
# ------------------------------------------------------------------------------------------------------%
# Created by "Thieu Nguyen" at 12:24, 18/03/2020 %
# %
# Email: nguyenthieu2102@gmail.com %
# Homepage: https://www.researchgate.net/profile/Thieu_Nguyen6 %
# Github: https://github.com/thieu1995 %
#-------------------------------------------------------------------------------------------------------%
from numpy.random import uniform
from numpy import array, where
from copy import deepcopy
from mealpy.optimizer import Root
class BaseBBO(Root):
"""
My version of: Biogeography-based optimization (BBO)
Biogeography-Based Optimization
Link:
https://ieeexplore.ieee.org/abstract/document/4475427
"""
def __init__(self, obj_func=None, lb=None, ub=None, verbose=True, epoch=750, pop_size=100, p_m=0.01, elites=2, **kwargs):
super().__init__(obj_func, lb, ub, verbose, kwargs)
self.epoch = epoch
self.pop_size = pop_size
self.p_m = p_m # mutation probability
self.elites = elites # Number of elites will be keep for next generation
def train(self):
pop = [self.create_solution() for _ in range(self.pop_size)]
pop_sorted = sorted(pop, key=lambda temp: temp[self.ID_FIT])
# Save the best solutions and costs in the elite arrays
pop_elites = deepcopy(pop_sorted[:self.elites])
# Compute migration rates, assuming the population is sorted from most train to least train
mu = (self.pop_size + 1 - array(range(1, self.pop_size + 1))) / (self.pop_size + 1)
mr = 1 - mu
for epoch in range(self.epoch):
# Use migration rates to decide how much information to share between solutions
pop_new = deepcopy(pop)
for i in range(self.pop_size):
# Probabilistic migration to the i-th position
list_fitness = [item[self.ID_FIT] for item in pop]
pos_old = pop_new[i][self.ID_POS]
# Pick a position from which to emigrate (roulette wheel selection)
idx_selected = self.get_index_roulette_wheel_selection(list_fitness)
# this is the migration step
pos_new = where(uniform(0, 1, self.problem_size) < mr[i], pop_new[idx_selected][self.ID_POS], pos_old)
# Mutation
temp = uniform(self.lb, self.ub)
pos_new = where(uniform(0, 1, self.problem_size) < self.p_m, temp, pos_new)
# Re-calculated fitness
pop_new[i] = [pos_new, self.get_fitness_position(pos_new)]
# replace the solutions with their new migrated and mutated versions then Merge Populations
pop = sorted(deepcopy(pop_new + pop_elites), key=lambda temp: temp[self.ID_FIT])
pop = pop[:self.pop_size]
# Update all elite solutions
for i in range(self.elites):
if pop_elites[i][self.ID_FIT] > pop[i][self.ID_FIT]:
pop_elites[i] = deepcopy(pop[i])
self.loss_train.append(pop_elites[self.ID_MIN_PROB][self.ID_FIT])
if self.verbose:
print("> Epoch: {}, Best fit: {}".format(epoch + 1, pop_elites[self.ID_MIN_PROB][self.ID_FIT]))
self.solution = pop_elites[self.ID_MIN_PROB]
return pop_elites[self.ID_MIN_PROB][self.ID_POS], pop_elites[self.ID_MIN_PROB][self.ID_FIT], self.loss_train
class OriginalBBO(Root):
"""
The original version of: Biogeography-based optimization (BBO)
Biogeography-Based Optimization
Link:
https://ieeexplore.ieee.org/abstract/document/4475427
"""
def __init__(self, obj_func=None, lb=None, ub=None, verbose=True, epoch=750, pop_size=100, p_m=0.01, elites=2, **kwargs):
super().__init__(obj_func, lb, ub, verbose, kwargs)
self.epoch = epoch
self.pop_size = pop_size
self.p_m = p_m # mutation probability
self.elites = elites # Number of elites will be keep for next generation
def train(self):
pop = [self.create_solution() for _ in range(self.pop_size)]
pop_sorted = sorted(pop, key=lambda temp: temp[self.ID_FIT])
# Save the best solutions and costs in the elite arrays
pop_elites = deepcopy(pop_sorted[:self.elites])
# Compute migration rates, assuming the population is sorted from most train to least train
mu = (self.pop_size + 1 - array(range(1, self.pop_size+1))) / (self.pop_size + 1)
mr = 1 - mu
for epoch in range(self.epoch):
# Use migration rates to decide how much information to share between solutions
pop_new = deepcopy(pop)
for i in range(self.pop_size):
# Probabilistic migration to the i-th position
for j in range(self.problem_size):
if uniform() < mr[i]: # Should we immigrate?
# Pick a position from which to emigrate (roulette wheel selection)
random_number = uniform() * sum(mu)
select = mu[0]
select_index = 0
while (random_number > select) and (select_index < self.pop_size - 1):
select_index += 1
select += mu[select_index]
# this is the migration step
pop_new[i][self.ID_POS][j] = pop[select_index][self.ID_POS][j]
# Mutation
for i in range(self.pop_size):
temp = uniform(self.lb, self.ub)
pos_new = where(uniform(0, 1, self.problem_size) < self.p_m, temp, pop_new[i][self.ID_POS])
# Re-calculated fitness
pop_new[i][self.ID_FIT] = self.get_fitness_position(pos_new)
# replace the solutions with their new migrated and mutated versions then Merge Populations
pop = deepcopy(pop_new)
pop = pop + pop_elites
pop = sorted(pop, key=lambda temp: temp[self.ID_FIT])
pop = pop[:self.pop_size]
# Update all elite solutions
for i in range(self.elites):
if pop_elites[i][self.ID_FIT] > pop[i][self.ID_FIT]:
pop_elites[i] = deepcopy(pop[i])
self.loss_train.append(pop_elites[self.ID_MIN_PROB][self.ID_FIT])
if self.verbose:
print("> Epoch: {}, Best fit: {}".format(epoch + 1, pop_elites[self.ID_MIN_PROB][self.ID_FIT]))
self.solution = pop_elites[self.ID_MIN_PROB]
return pop_elites[self.ID_MIN_PROB][self.ID_POS], pop_elites[self.ID_MIN_PROB][self.ID_FIT], self.loss_train
| 50.140845 | 125 | 0.559972 | 898 | 7,120 | 4.257238 | 0.199332 | 0.051792 | 0.046037 | 0.039236 | 0.806696 | 0.793618 | 0.76746 | 0.761705 | 0.761705 | 0.750196 | 0 | 0.015857 | 0.317978 | 7,120 | 141 | 126 | 50.496454 | 0.771417 | 0.315028 | 0 | 0.653846 | 0 | 0 | 0.010438 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051282 | false | 0 | 0.051282 | 0 | 0.153846 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ce3c17393e73184aea7ed167927eb9df7d07b271 | 48 | py | Python | convokit/forecaster/CRAFT/__init__.py | sophieball/Cornell-Conversational-Analysis-Toolkit | da65358baffc34a518114be2d94f1748f8e01240 | [
"MIT"
] | 371 | 2016-07-19T22:10:13.000Z | 2022-03-28T08:04:32.000Z | convokit/forecaster/CRAFT/__init__.py | sophieball/Cornell-Conversational-Analysis-Toolkit | da65358baffc34a518114be2d94f1748f8e01240 | [
"MIT"
] | 92 | 2017-07-25T22:04:11.000Z | 2022-03-29T13:46:07.000Z | convokit/forecaster/CRAFT/__init__.py | sophieball/Cornell-Conversational-Analysis-Toolkit | da65358baffc34a518114be2d94f1748f8e01240 | [
"MIT"
] | 105 | 2016-07-04T15:04:53.000Z | 2022-03-30T01:36:38.000Z | from .CRAFTUtil import *
from .CRAFTNN import *
| 16 | 24 | 0.75 | 6 | 48 | 6 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 48 | 2 | 25 | 24 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cbf00c976424122df9e542ff9db091fb8c81e650 | 23 | py | Python | urls/project/stag.py | hoatle/django-boilerplate | 12a4ac7d663196a8a860f51ca34039bc5c00bad6 | [
"BSD-3-Clause"
] | null | null | null | urls/project/stag.py | hoatle/django-boilerplate | 12a4ac7d663196a8a860f51ca34039bc5c00bad6 | [
"BSD-3-Clause"
] | null | null | null | urls/project/stag.py | hoatle/django-boilerplate | 12a4ac7d663196a8a860f51ca34039bc5c00bad6 | [
"BSD-3-Clause"
] | null | null | null | from urls.stag import * | 23 | 23 | 0.782609 | 4 | 23 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 23 | 1 | 23 | 23 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cbf9f5d2c655e96a2df2aee03ee8c0396612efe0 | 7,316 | py | Python | psydac/linalg/tests/test_kron_solver_3d.py | GabrielJie/psydac | 51814f04501fa14bc100f0ab224f50a2bbe86612 | [
"MIT"
] | 1 | 2022-01-19T02:26:49.000Z | 2022-01-19T02:26:49.000Z | psydac/linalg/tests/test_kron_solver_3d.py | GabrielJie/psydac | 51814f04501fa14bc100f0ab224f50a2bbe86612 | [
"MIT"
] | null | null | null | psydac/linalg/tests/test_kron_solver_3d.py | GabrielJie/psydac | 51814f04501fa14bc100f0ab224f50a2bbe86612 | [
"MIT"
] | null | null | null | # -*- coding: UTF-8 -*-
import pytest
import time
import numpy as np
from mpi4py import MPI
from scipy.sparse import csc_matrix, dia_matrix, kron
from scipy.sparse.linalg import splu
from psydac.ddm.cart import CartDecomposition
from psydac.linalg.stencil import StencilVectorSpace, StencilVector, StencilMatrix
from psydac.linalg.kron import kronecker_solve_3d_par
from psydac.linalg.direct_solvers import SparseSolver, BandedSolver
# ... return X, solution of (A1 kron A2 kron A3)X = Y
def kron_solve_seq_ref(A1, A2, A3, Y):
# ...
A1_csr = A1.tosparse().tocsr()
A2_csr = A2.tosparse().tocsr()
A3_csr = A3.tosparse().tocsr()
C = csc_matrix(kron(kron(A1_csr, A2_csr), A3_csr))
C_op = splu(C)
X = C_op.solve(Y.flatten())
return X.reshape(Y.shape)
# ...
# ... convert a 1D stencil matrix to band matrix
def to_bnd(A):
dmat = dia_matrix(A.toarray())
la = abs(dmat.offsets.min())
ua = dmat.offsets.max()
cmat = dmat.tocsr()
A_bnd = np.zeros((1+ua+2*la, cmat.shape[1]))
for i,j in zip(*cmat.nonzero()):
A_bnd[la+ua+i-j, j] = cmat[i,j]
return A_bnd, la, ua
# ...
#===============================================================================
@pytest.mark.parametrize( 'n1', [8,16] )
@pytest.mark.parametrize( 'n2', [4,8] )
@pytest.mark.parametrize( 'n3', [4] )
@pytest.mark.parametrize( 'p1', [1, 2, 3] )
@pytest.mark.parametrize( 'p2', [2] )
@pytest.mark.parametrize( 'p3', [2] )
@pytest.mark.parallel
def test_kron_solver_3d_band_par( n1, n2, n3, p1, p2, p3, P1=False, P2=False, P3=False ):
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
# ... 2D MPI cart
cart = CartDecomposition(
npts = [n1, n2, n3],
pads = [p1, p2, p3],
periods = [P1, P2, P3],
reorder = True,
comm = comm
)
# ...
sizes1 = cart.global_ends[0] - cart.global_starts[0] + 1
sizes2 = cart.global_ends[1] - cart.global_starts[1] + 1
sizes3 = cart.global_ends[2] - cart.global_starts[2] + 1
disps = cart.global_starts[0]*n2*n3 + cart.global_starts[1]*n3 + cart.global_starts[2]
sizes = sizes1*sizes2*sizes3
# ...
# ... Vector Spaces
V = StencilVectorSpace(cart)
[s1, s2, s3] = V.starts
[e1, e2, e3] = V.ends
# TODO: make MPI type available through property
mpi_type = V._mpi_type
# ...
V1 = StencilVectorSpace([n1], [p1], [P1])
V2 = StencilVectorSpace([n2], [p2], [P2])
V3 = StencilVectorSpace([n3], [p3], [P3])
# ... Matrices and Direct solvers
A1 = StencilMatrix(V1, V1)
A1[:,-p1:0 ] = -4
A1[:, 0 :1 ] = 10*p1
A1[:, 1 :p1+1] = -4
A1.remove_spurious_entries()
A1_bnd, la1, ua1 = to_bnd(A1)
solver_1 = BandedSolver(ua1, la1, A1_bnd)
A2 = StencilMatrix(V2, V2)
A2[:,-p2:0 ] = -1
A2[:, 0 :1 ] = 2*p2
A2[:, 1 :p2+1] = -1
A2.remove_spurious_entries()
A2_bnd, la2, ua2 = to_bnd(A2)
solver_2 = BandedSolver(ua2, la2, A2_bnd)
A3 = StencilMatrix(V3, V3)
A3[:,-p3:0 ] = -2
A3[:, 0 :1 ] = 3*p2
A3[:, 1 :p3+1] = -2
A3.remove_spurious_entries()
A3_bnd, la3, ua3 = to_bnd(A3)
solver_3 = BandedSolver(ua3, la3, A3_bnd)
# ... RHS
Y = StencilVector(V)
Y_glob = np.array([[[(i1+1)*100+(i2+1)*10 +(i3+1) for i3 in range(n3)] for i2 in range(n2)] for i1 in range(n1)])
Y[s1:e1+1, s2:e2+1, s3:e3+1] = Y_glob[s1:e1+1, s2:e2+1, s3:e3+1]
Y.update_ghost_regions()
# ...
X_glob = kron_solve_seq_ref(A1, A2, A3, Y_glob)
X = kronecker_solve_3d_par(solver_1, solver_2, solver_3, Y)
for i in range(comm.Get_size()):
if rank == i:
print('rank= ', rank)
print('X_glob = \n', X_glob)
print('X = \n', X.toarray().reshape(n1, n2, n3))
print('', flush=True)
time.sleep(0.1)
comm.Barrier()
# ...
# ... Check data
assert np.allclose( X[s1:e1+1, s2:e2+1, s3:e3+1], X_glob[s1:e1+1, s2:e2+1, s3:e3+1], rtol=1e-8, atol=1e-8 )
#===============================================================================
#===============================================================================
@pytest.mark.parametrize( 'n1', [8,16] )
@pytest.mark.parametrize( 'n2', [4,8] )
@pytest.mark.parametrize( 'n3', [4] )
@pytest.mark.parametrize( 'p1', [1, 2, 3] )
@pytest.mark.parametrize( 'p2', [2] )
@pytest.mark.parametrize( 'p3', [2] )
@pytest.mark.parallel
def test_kron_solver_3d_sparse_par( n1, n2, n3, p1, p2, p3, P1=False, P2=False, P3=False ):
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
# ... 2D MPI cart
cart = CartDecomposition(
npts = [n1, n2, n3],
pads = [p1, p2, p3],
periods = [P1, P2, P3],
reorder = True,
comm = comm
)
# ...
sizes1 = cart.global_ends[0] - cart.global_starts[0] + 1
sizes2 = cart.global_ends[1] - cart.global_starts[1] + 1
sizes3 = cart.global_ends[2] - cart.global_starts[2] + 1
disps = cart.global_starts[0]*n2*n3 + cart.global_starts[1]*n3 + cart.global_starts[2]
sizes = sizes1*sizes2*sizes3
# ...
# ... Vector Spaces
V = StencilVectorSpace(cart)
[s1, s2, s3] = V.starts
[e1, e2, e3] = V.ends
# TODO: make MPI type available through property
mpi_type = V._mpi_type
# ...
V1 = StencilVectorSpace([n1], [p1], [P1])
V2 = StencilVectorSpace([n2], [p2], [P2])
V3 = StencilVectorSpace([n3], [p3], [P3])
# ... Matrices and Direct solvers
A1 = StencilMatrix(V1, V1)
A1[:,-p1:0 ] = -4
A1[:, 0 :1 ] = 10*p1
A1[:, 1 :p1+1] = -4
A1.remove_spurious_entries()
solver_1 = SparseSolver(A1.tosparse())
A2 = StencilMatrix(V2, V2)
A2[:,-p2:0 ] = -1
A2[:, 0 :1 ] = 2*p2
A2[:, 1 :p2+1] = -1
A2.remove_spurious_entries()
solver_2 = SparseSolver(A2.tosparse())
A3 = StencilMatrix(V3, V3)
A3[:,-p3:0 ] = -2
A3[:, 0 :1 ] = 3*p2
A3[:, 1 :p3+1] = -2
A3.remove_spurious_entries()
solver_3 = SparseSolver(A3.tosparse())
# ... RHS
Y = StencilVector(V)
Y_glob = np.array([[[(i1+1)*100+(i2+1)*10 +(i3+1) for i3 in range(n3)] for i2 in range(n2)] for i1 in range(n1)])
Y[s1:e1+1, s2:e2+1, s3:e3+1] = Y_glob[s1:e1+1, s2:e2+1, s3:e3+1]
Y.update_ghost_regions()
# ...
X_glob = kron_solve_seq_ref(A1, A2, A3, Y_glob)
X = kronecker_solve_3d_par(solver_1, solver_2, solver_3, Y)
for i in range(comm.Get_size()):
if rank == i:
print('rank= ', rank)
print('X_glob = \n', X_glob)
print('X = \n', X.toarray().reshape(n1, n2, n3))
print('', flush=True)
time.sleep(0.1)
comm.Barrier()
# ...
# ... Check data
assert np.allclose( X[s1:e1+1, s2:e2+1, s3:e3+1], X_glob[s1:e1+1, s2:e2+1, s3:e3+1], rtol=1e-8, atol=1e-8 )
#===============================================================================
#===============================================================================
# SCRIPT FUNCTIONALITY
#===============================================================================
if __name__ == "__main__":
import sys
# pytest.main( sys.argv )
test_kron_solver_3d_band_par( 8, 4, 4, 2, 1, 1)
| 30.231405 | 117 | 0.534308 | 1,054 | 7,316 | 3.57685 | 0.165085 | 0.047745 | 0.066844 | 0.014854 | 0.729178 | 0.729178 | 0.722016 | 0.722016 | 0.71618 | 0.71618 | 0 | 0.082199 | 0.236741 | 7,316 | 241 | 118 | 30.356846 | 0.592944 | 0.132313 | 0 | 0.726115 | 0 | 0 | 0.012993 | 0 | 0 | 0 | 0 | 0.004149 | 0.012739 | 1 | 0.025478 | false | 0 | 0.070064 | 0 | 0.10828 | 0.050955 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
022281f59f6dcd46f10246eba13c9a73a410ae82 | 229 | py | Python | featuretools/primitives/base/api.py | ridicolos/featuretools | 0af409da206e0b691ec64a3e0e618a43f1701dd9 | [
"BSD-3-Clause"
] | 942 | 2020-11-10T02:59:39.000Z | 2022-03-31T16:34:33.000Z | featuretools/primitives/base/api.py | 167rgc911/featuretools | bbad3f7392b203b7b9c250a93465052e7fc06bbc | [
"BSD-3-Clause"
] | 721 | 2020-11-09T23:12:06.000Z | 2022-03-31T22:33:35.000Z | featuretools/primitives/base/api.py | 167rgc911/featuretools | bbad3f7392b203b7b9c250a93465052e7fc06bbc | [
"BSD-3-Clause"
] | 127 | 2020-11-10T10:12:30.000Z | 2022-03-27T08:55:05.000Z | # flake8: noqa
from .aggregation_primitive_base import (
AggregationPrimitive,
make_agg_primitive
)
from .primitive_base import PrimitiveBase
from .transform_primitive_base import TransformPrimitive, make_trans_primitive
| 28.625 | 78 | 0.842795 | 25 | 229 | 7.36 | 0.56 | 0.211957 | 0.309783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004951 | 0.117904 | 229 | 7 | 79 | 32.714286 | 0.905941 | 0.052402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
025945c75c0f012cbb88ec6957f0650fb198d1d0 | 904 | py | Python | src/fortresstools/command/rsync.py | gembcior/FortressTools | 0230a9625034038ad6839c8dfab7f6b6a7f2108b | [
"MIT"
] | null | null | null | src/fortresstools/command/rsync.py | gembcior/FortressTools | 0230a9625034038ad6839c8dfab7f6b6a7f2108b | [
"MIT"
] | null | null | null | src/fortresstools/command/rsync.py | gembcior/FortressTools | 0230a9625034038ad6839c8dfab7f6b6a7f2108b | [
"MIT"
] | null | null | null | from .base import BaseCommand
from ..executor import *
class RsyncFromRemoteCommand(BaseCommand):
def get_supported_executors(self):
return [ShellExecutor, SshExecutor, VenvExecutor, DockerExecutor]
def execute(self):
rsync = "rsync -az --delete"
source = f"{self.login}@{self.remote}:{self.directory}/"
destination = self.directory
cmd = f"{rsync} {source} {destination} {self.options}"
return self._executor.run(cmd)
class RsyncToRemoteCommand(BaseCommand):
def get_supported_executors(self):
return [ShellExecutor, SshExecutor, VenvExecutor, DockerExecutor]
def execute(self):
rsync = "rsync -az --delete"
source = f"{self.directory}/"
destination = f"{self.login}@{self.remote}:{self.directory}"
cmd = f"{rsync} {source} {destination} {self.options}"
return self._executor.run(cmd)
| 33.481481 | 73 | 0.668142 | 94 | 904 | 6.361702 | 0.329787 | 0.086957 | 0.056856 | 0.086957 | 0.789298 | 0.789298 | 0.789298 | 0.70903 | 0.70903 | 0.70903 | 0 | 0 | 0.205752 | 904 | 26 | 74 | 34.769231 | 0.832869 | 0 | 0 | 0.6 | 0 | 0 | 0.254425 | 0.096239 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0.1 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
5a2e0a531cb9b64aad61f51a01d7b85144ae930b | 106 | py | Python | terrascript/postgresql/__init__.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | terrascript/postgresql/__init__.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | terrascript/postgresql/__init__.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | # terrascript/postgresql/__init__.py
import terrascript
class postgresql(terrascript.Provider):
pass | 17.666667 | 39 | 0.811321 | 11 | 106 | 7.454545 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113208 | 106 | 6 | 40 | 17.666667 | 0.87234 | 0.320755 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
cea3a6642739faace26e8cf60858d818bb6d9356 | 225 | py | Python | docker_registry_cli/commands/ls.py | iknowright/docker-registry-cli | 23cba75c62ec1de211c68cb53f4babbfcb41aa90 | [
"MIT"
] | null | null | null | docker_registry_cli/commands/ls.py | iknowright/docker-registry-cli | 23cba75c62ec1de211c68cb53f4babbfcb41aa90 | [
"MIT"
] | null | null | null | docker_registry_cli/commands/ls.py | iknowright/docker-registry-cli | 23cba75c62ec1de211c68cb53f4babbfcb41aa90 | [
"MIT"
] | null | null | null | from docker_registry_cli import out
from docker_registry_cli.registry import api
class ListRepo:
"""Listing repositories"""
def __call__(self):
out.write("\n".join(api.list_repos().json()["repositories"]))
| 22.5 | 69 | 0.715556 | 29 | 225 | 5.241379 | 0.689655 | 0.131579 | 0.236842 | 0.276316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151111 | 225 | 9 | 70 | 25 | 0.795812 | 0.088889 | 0 | 0 | 0 | 0 | 0.070352 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ceb3e52837c9ba023b3ccf740e18be1f509fe743 | 35 | py | Python | teste.py | dsjocimar/python | 5716f46a9fa7f64aa78a39df9c262c5392571340 | [
"MIT"
] | null | null | null | teste.py | dsjocimar/python | 5716f46a9fa7f64aa78a39df9c262c5392571340 | [
"MIT"
] | null | null | null | teste.py | dsjocimar/python | 5716f46a9fa7f64aa78a39df9c262c5392571340 | [
"MIT"
] | null | null | null | print(f'{"Lojas da Bahia":^40} ur') | 35 | 35 | 0.628571 | 7 | 35 | 3.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.085714 | 35 | 1 | 35 | 35 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0.694444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
cecacf5e1b18a712ea862ea760d22c6af9a0d9fc | 42 | py | Python | grafog/transforms/__init__.py | rish-16/grafog | 4141f94f6fa19bbbfda8403b60a5720214bb4a42 | [
"MIT"
] | 2 | 2022-03-06T03:21:03.000Z | 2022-03-23T14:15:09.000Z | grafog/__init__.py | rish-16/grafog | 4141f94f6fa19bbbfda8403b60a5720214bb4a42 | [
"MIT"
] | 1 | 2022-03-23T14:20:37.000Z | 2022-03-28T08:09:03.000Z | grafog/transforms/__init__.py | rish-16/grafog | 4141f94f6fa19bbbfda8403b60a5720214bb4a42 | [
"MIT"
] | null | null | null | from grafog.transforms.transforms import * | 42 | 42 | 0.857143 | 5 | 42 | 7.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 42 | 1 | 42 | 42 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ced0d4c39a3a579308dacc9dd9eced402be07522 | 372 | py | Python | martypy/Exceptions.py | robotical/martypy | afc1f89d471875ca1beb775f375438f97fc33679 | [
"Apache-2.0"
] | 8 | 2017-08-02T11:31:50.000Z | 2022-01-05T14:36:53.000Z | martypy/Exceptions.py | robotical/martypy | afc1f89d471875ca1beb775f375438f97fc33679 | [
"Apache-2.0"
] | 17 | 2017-07-24T22:39:43.000Z | 2022-01-05T14:41:20.000Z | martypy/Exceptions.py | robotical/martypy | afc1f89d471875ca1beb775f375438f97fc33679 | [
"Apache-2.0"
] | 5 | 2017-11-12T08:51:18.000Z | 2020-11-27T09:28:46.000Z |
class UnavailableCommandException(Exception):
pass
class MartyCommandException(Exception):
pass
class MartyConfigException(Exception):
pass
class MartyConnectException(Exception):
pass
class ArgumentOutOfRangeException(Exception):
pass
class UnavailableClientTypeException(Exception):
pass
class MartyTransferException(Exception):
pass
| 16.173913 | 48 | 0.790323 | 28 | 372 | 10.5 | 0.357143 | 0.309524 | 0.367347 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153226 | 372 | 22 | 49 | 16.909091 | 0.933333 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
ced2e7a90232d39453e541216c5fe4a4031cb375 | 4,270 | py | Python | yolo2/models/yolo2_mobilenetv3_large.py | gan3sh500/keras-YOLOv3-model-set | 1a1108c52073c01c130618dc68607f455adadf28 | [
"MIT"
] | 601 | 2019-08-24T10:14:52.000Z | 2022-03-29T15:05:33.000Z | yolo2/models/yolo2_mobilenetv3_large.py | lvxiaojie111/keras-YOLOv3-model-set | 80f263700d8251376e59a7908dd1ae8408e5eb01 | [
"MIT"
] | 220 | 2019-10-04T18:57:59.000Z | 2022-03-31T15:30:37.000Z | yolo2/models/yolo2_mobilenetv3_large.py | lvxiaojie111/keras-YOLOv3-model-set | 80f263700d8251376e59a7908dd1ae8408e5eb01 | [
"MIT"
] | 218 | 2019-10-31T03:32:11.000Z | 2022-03-25T14:44:19.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""YOLO_v2 MobileNetV3Large Model Defined in Keras."""
from tensorflow.keras.layers import MaxPooling2D, Lambda, Concatenate, GlobalAveragePooling2D, Softmax
from tensorflow.keras.models import Model
from common.backbones.mobilenet_v3 import MobileNetV3Large
from yolo2.models.layers import compose, DarknetConv2D, DarknetConv2D_BN_Leaky, Depthwise_Separable_Conv2D_BN_Leaky, yolo2_predictions, yolo2lite_predictions
def yolo2_mobilenetv3large_body(inputs, num_anchors, num_classes, alpha=1.0):
"""Create YOLO_V2 MobileNetV3Large model CNN body in Keras."""
mobilenetv3large = MobileNetV3Large(input_tensor=inputs, weights='imagenet', include_top=False, alpha=alpha)
print('backbone layers number: {}'.format(len(mobilenetv3large.layers)))
# input: 416 x 416 x 3
# mobilenetv3large.output(layer 194, final feature map): 13 x 13 x (960*alpha)
# expanded_conv_14/Add(layer 191, end of block14): 13 x 13 x (160*alpha)
# activation_29(layer 146, middle in block12) : 26 x 26 x (672*alpha)
# expanded_conv_11/Add(layer 143, end of block11) : 26 x 26 x (112*alpha)
# NOTE: activation layer name may different for TF1.x/2.x, so we
# use index to fetch layer
# f1: 13 x 13 x (960*alpha)
f1 = mobilenetv3large.output
# f2: 26 x 26 x (672*alpha)
f2 = mobilenetv3large.layers[146].output
f1_channel_num = int(960*alpha)
f2_channel_num = int(672*alpha)
y = yolo2_predictions((f1, f2), (f1_channel_num, f2_channel_num), num_anchors, num_classes)
return Model(inputs, y)
def yolo2lite_mobilenetv3large_body(inputs, num_anchors, num_classes, alpha=1.0):
"""Create YOLO_V2 Lite MobileNetV3Large model CNN body in Keras."""
mobilenetv3large = MobileNetV3Large(input_tensor=inputs, weights='imagenet', include_top=False, alpha=alpha)
print('backbone layers number: {}'.format(len(mobilenetv3large.layers)))
# input: 416 x 416 x 3
# mobilenetv3large.output(layer 194, final feature map): 13 x 13 x (960*alpha)
# expanded_conv_14/Add(layer 191, end of block14): 13 x 13 x (160*alpha)
# activation_29(layer 146, middle in block12) : 26 x 26 x (672*alpha)
# expanded_conv_11/Add(layer 143, end of block11) : 26 x 26 x (112*alpha)
# NOTE: activation layer name may different for TF1.x/2.x, so we
# use index to fetch layer
# f1: 13 x 13 x (960*alpha)
f1 = mobilenetv3large.output
# f2: 26 x 26 x (672*alpha)
f2 = mobilenetv3large.layers[146].output
f1_channel_num = int(960*alpha)
f2_channel_num = int(672*alpha)
y = yolo2lite_predictions((f1, f2), (f1_channel_num, f2_channel_num), num_anchors, num_classes)
return Model(inputs, y)
def tiny_yolo2_mobilenetv3large_body(inputs, num_anchors, num_classes, alpha=1.0):
"""Create Tiny YOLO_V2 MobileNetV3Large model CNN body in Keras."""
mobilenetv3large = MobileNetV3Large(input_tensor=inputs, weights='imagenet', include_top=False, alpha=alpha)
print('backbone layers number: {}'.format(len(mobilenetv3large.layers)))
# input: 416 x 416 x 3
# mobilenetv3large.output(layer 194, final feature map): 13 x 13 x (960*alpha)
# f1: 13 x 13 x (960*alpha)
f1 = mobilenetv3large.output
f1_channel_num = int(960*alpha)
y = compose(
DarknetConv2D_BN_Leaky(f1_channel_num, (3,3)),
DarknetConv2D(num_anchors*(num_classes+5), (1,1), name='predict_conv'))(f1)
return Model(inputs, y)
def tiny_yolo2lite_mobilenetv3large_body(inputs, num_anchors, num_classes, alpha=1.0):
"""Create Tiny YOLO_V2 Lite MobileNetV3Large model CNN body in Keras."""
mobilenetv3large = MobileNetV3Large(input_tensor=inputs, weights='imagenet', include_top=False, alpha=alpha)
print('backbone layers number: {}'.format(len(mobilenetv3large.layers)))
# input: 416 x 416 x 3
# mobilenetv3large.output(layer 194, final feature map): 13 x 13 x (960*alpha)
# f1: 13 x 13 x (960*alpha)
f1 = mobilenetv3large.output
f1_channel_num = int(960*alpha)
y = compose(
Depthwise_Separable_Conv2D_BN_Leaky(f1_channel_num, (3,3), block_id_str='pred_1'),
DarknetConv2D(num_anchors*(num_classes+5), (1,1), name='predict_conv'))(f1)
return Model(inputs, y)
| 42.7 | 157 | 0.717799 | 621 | 4,270 | 4.782609 | 0.191626 | 0.020202 | 0.016835 | 0.020202 | 0.868013 | 0.849495 | 0.845791 | 0.83165 | 0.83165 | 0.83165 | 0 | 0.093017 | 0.171663 | 4,270 | 99 | 158 | 43.131313 | 0.746678 | 0.348009 | 0 | 0.7 | 0 | 0 | 0.060739 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.3 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0c802a73991007153f0138a3634c5694008ead6b | 189 | py | Python | satdetect/viz/__init__.py | michaelchughes/satdetect | 23319fd1c6bd7d36709b5948584efcc723c428c1 | [
"MIT"
] | 3 | 2016-03-15T17:27:31.000Z | 2019-02-25T16:46:05.000Z | satdetect/viz/__init__.py | michaelchughes/satdetect | 23319fd1c6bd7d36709b5948584efcc723c428c1 | [
"MIT"
] | 1 | 2018-05-24T21:57:08.000Z | 2018-05-24T21:57:08.000Z | satdetect/viz/__init__.py | michaelchughes/satdetect | 23319fd1c6bd7d36709b5948584efcc723c428c1 | [
"MIT"
] | 2 | 2016-07-08T10:14:59.000Z | 2019-02-25T16:46:08.000Z | from VizUtil import imshow, showExamples, makeImageWithBBoxAnnotations
import ShowExamples # Script
__all__ = [ShowExamples,
imshow, showExamples, makeImageWithBBoxAnnotations] | 31.5 | 70 | 0.804233 | 14 | 189 | 10.571429 | 0.571429 | 0.243243 | 0.621622 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 189 | 6 | 71 | 31.5 | 0.919255 | 0.031746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
0ca700f6e75d6b6d5b265bb8a619026e3a3d3642 | 3,837 | py | Python | src/test/jython/responses/__init__.py | jdewinne/xlr-sdelements-plugin | 177f0b6df92b8fb1450b926996c353a9bd07e32b | [
"MIT"
] | null | null | null | src/test/jython/responses/__init__.py | jdewinne/xlr-sdelements-plugin | 177f0b6df92b8fb1450b926996c353a9bd07e32b | [
"MIT"
] | 2 | 2018-04-24T14:27:24.000Z | 2018-05-07T22:00:08.000Z | src/test/jython/responses/__init__.py | jdewinne/xlr-sdelements-plugin | 177f0b6df92b8fb1450b926996c353a9bd07e32b | [
"MIT"
] | 1 | 2017-04-20T15:45:01.000Z | 2017-04-20T15:45:01.000Z | GET_TASK_RESPONSE = '{"accepted": true, "ad_hoc": false, "artifact_proxy": "ABC-XYZ", "assigned_to": [{ "first_name": "Admin", "last_name": "Testerton", "is_active": true, "email": "admin@example.com", "role": { "id": "UR1", "name": "User" }, "id": 1}], "text": "Insecure forgotten password.", "id": "1-T2", "library_task_created": "2010-10-20T17:46:50Z", "library_task_updated": "2015-05-07T18:58:26.732000Z", "note_count": 0, "phase": "Requirements", "priority": "8", "relevant": true, "status": "TS2", "task_id": "T2", "title": "Secure forgotten password", "updated": "2015-07-08T02:16:33.923315Z", "url": "http://example.com/bunits/bu1/app1/proj1/tasks/phase/requirements/1-T2", "verification_status": null}'
GET_TASKS_RESPONSE = '{"results": [{"id": "1-T2","task_id": "T2","url": "http://example.com/bunits/new-business-unit/...","title": "Secure forgotten password","text": "Insecure forgotten password and password reset...","priority": 8,"phase": "Requirements","ad_hoc": false,"relevant": true,"accepted": true,"assigned_to": [],"updated": "2015-06-16T19:37:44.710100Z","library_task_created": "2015-06-16T19:36:57.863684Z","library_task_updated": "2015-06-16T19:36:57.836874Z","verification_status": null,"status": "TS2","note_count": 0,"artifact_proxy": null}]}'
GET_TASKS_33_RESPONSE = '{"results": [{"id": "1-T2","task_id": "T2","url": "http://example.com/bunits/new-business-unit/...","title": "Secure forgotten password","text": "Insecure forgotten password and password reset...","priority": 8,"phase": "Requirements","ad_hoc": false,"relevant": true,"accepted": true,"assigned_to": [],"updated": "2015-06-16T19:37:44.710100Z","library_task_created": "2015-06-16T19:36:57.863684Z","library_task_updated": "2015-06-16T19:36:57.836874Z","verification_status": null,"status": "TS2","note_count": 0,"artifact_proxy": null},{"id": "1-T3","task_id": "T3","url": "http://example.com/bunits/new-business-unit/...","title": "Secure forgotten password","text": "Insecure forgotten password and password reset...","priority": 8,"phase": "Requirements","ad_hoc": false,"relevant": true,"accepted": true,"assigned_to": [],"updated": "2015-06-16T19:37:44.710100Z","library_task_created": "2015-06-16T19:36:57.863684Z","library_task_updated": "2015-06-16T19:36:57.836874Z","verification_status": null,"status": "TS2","note_count": 0,"artifact_proxy": null},{"id": "1-T4","task_id": "T4","url": "http://example.com/bunits/new-business-unit/...","title": "Secure forgotten password","text": "Insecure forgotten password and password reset...","priority": 8,"phase": "Requirements","ad_hoc": false,"relevant": true,"accepted": true,"assigned_to": [],"updated": "2015-06-16T19:37:44.710100Z","library_task_created": "2015-06-16T19:36:57.863684Z","library_task_updated": "2015-06-16T19:36:57.836874Z","verification_status": null,"status": "TS3","note_count": 0,"artifact_proxy": null}]}'
GET_APPLICATION_RESPONSE = '{"results": [{"id": 1,"business_unit": 1,"name": "Application Test","created": "2015-04-15T20:27:24.396442Z","updated": "2015-04-15T20:27:24.389957Z","priority": "0-none","slug": "application-test"}]}'
GET_PROJECT_RESPONSE = '{"results": [{"id": 1,"slug": "project-test","url": "http://example.com/bunits/bu-test/app-test/project-test","application": 1,"profile" : {"id": "P9","name": "Android App","logo_url": "/static/images/android.png"},"archived": false,"name": "Project Test","creator": 1,"description": "API Project","tags": ["foo", "bar"],"created": "2015-04-15T19:30:04.132712Z","updated": "2015-04-15T19:57:15.042353Z","parent": null,"users": [{"id": "1","email": "test@example.com","role": "PR4","first_name": "Admin","last_name": "Testerton","is_active": true}],"groups": [{"id": "G1","name": "Devs","role": "PR4"}],"custom_attributes": { "slug": "value"},"locked_on": null,"locked_by": null,"locked": false}]}'
| 639.5 | 1,609 | 0.685953 | 536 | 3,837 | 4.776119 | 0.253731 | 0.051563 | 0.051563 | 0.05625 | 0.632422 | 0.602734 | 0.602734 | 0.589844 | 0.589844 | 0.55625 | 0 | 0.117289 | 0.057858 | 3,837 | 5 | 1,610 | 767.4 | 0.590871 | 0 | 0 | 0 | 0 | 1 | 0.966119 | 0.340109 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
0ccc87e5cd948a46790bcb103de0bbb509eef677 | 1,260 | py | Python | temboo/core/Library/Foursquare/Lists/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 7 | 2016-03-07T02:07:21.000Z | 2022-01-21T02:22:41.000Z | temboo/core/Library/Foursquare/Lists/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | null | null | null | temboo/core/Library/Foursquare/Lists/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 8 | 2016-06-14T06:01:11.000Z | 2020-04-22T09:21:44.000Z | from temboo.Library.Foursquare.Lists.AddItem import AddItem, AddItemInputSet, AddItemResultSet, AddItemChoreographyExecution
from temboo.Library.Foursquare.Lists.AddList import AddList, AddListInputSet, AddListResultSet, AddListChoreographyExecution
from temboo.Library.Foursquare.Lists.DeleteItem import DeleteItem, DeleteItemInputSet, DeleteItemResultSet, DeleteItemChoreographyExecution
from temboo.Library.Foursquare.Lists.FollowList import FollowList, FollowListInputSet, FollowListResultSet, FollowListChoreographyExecution
from temboo.Library.Foursquare.Lists.ListDetails import ListDetails, ListDetailsInputSet, ListDetailsResultSet, ListDetailsChoreographyExecution
from temboo.Library.Foursquare.Lists.ListFollowers import ListFollowers, ListFollowersInputSet, ListFollowersResultSet, ListFollowersChoreographyExecution
from temboo.Library.Foursquare.Lists.UnfollowList import UnfollowList, UnfollowListInputSet, UnfollowListResultSet, UnfollowListChoreographyExecution
from temboo.Library.Foursquare.Lists.UpdateItem import UpdateItem, UpdateItemInputSet, UpdateItemResultSet, UpdateItemChoreographyExecution
from temboo.Library.Foursquare.Lists.UpdateList import UpdateList, UpdateListInputSet, UpdateListResultSet, UpdateListChoreographyExecution
| 126 | 154 | 0.9 | 99 | 1,260 | 11.454545 | 0.424242 | 0.079365 | 0.134921 | 0.214286 | 0.253968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 1,260 | 9 | 155 | 140 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
0cd46e709d05f08bdd1fe8128cdab16c8b1370fe | 8,856 | py | Python | tests/test_plotter.py | TheGoldfish01/pydpf-core | 75ca8a180454f94cedafbc68c1d6f20dcfc4c795 | [
"MIT"
] | 11 | 2021-01-31T15:50:02.000Z | 2021-10-01T23:15:38.000Z | tests/test_plotter.py | TheGoldfish01/pydpf-core | 75ca8a180454f94cedafbc68c1d6f20dcfc4c795 | [
"MIT"
] | 46 | 2021-01-14T05:00:50.000Z | 2021-10-06T18:30:37.000Z | tests/test_plotter.py | TheGoldfish01/pydpf-core | 75ca8a180454f94cedafbc68c1d6f20dcfc4c795 | [
"MIT"
] | 3 | 2021-06-30T07:18:30.000Z | 2021-09-15T08:43:11.000Z | import os
import pytest
from ansys import dpf
from ansys.dpf import core
from ansys.dpf.core import Model, Operator
from ansys.dpf.core import errors as dpf_errors
from ansys.dpf.core import misc
if misc.module_exists("pyvista"):
HAS_PYVISTA = True
from ansys.dpf.core.plotter import Plotter as DpfPlotter
from pyvista.plotting.renderer import CameraPosition # noqa: F401
else:
HAS_PYVISTA = False
# currently running dpf on docker. Used for testing on CI
RUNNING_DOCKER = os.environ.get("DPF_DOCKER", False)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_chart_plotter(plate_msup):
model = Model(plate_msup)
mesh = model.metadata.meshed_region
tfq = model.metadata.time_freq_support
timeids = list(range(1, tfq.n_sets + 1))
disp = model.results.displacement()
disp.inputs.time_scoping.connect(timeids)
new_fields_container = disp.get_output(0, dpf.core.types.fields_container)
pl = DpfPlotter(model.metadata.meshed_region)
ret = pl.plot_chart(new_fields_container)
assert ret
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plotter_on_mesh(allkindofcomplexity):
model = Model(allkindofcomplexity)
pl = DpfPlotter(model.metadata.meshed_region)
cpos = pl.plot_mesh()
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plotter_on_field(allkindofcomplexity):
model = Model(allkindofcomplexity)
stress = model.results.stress()
stress.inputs.requested_location.connect("Elemental")
avg_op = Operator("to_elemental_fc")
avg_op.inputs.fields_container.connect(stress.outputs.fields_container)
fc = avg_op.outputs.fields_container()
field = fc[1]
pl = DpfPlotter(model.metadata.meshed_region)
fields_container = dpf.core.FieldsContainer()
fields_container.add_label("time")
fields_container.add_field({"time": 1}, field)
cpos = pl.plot_contour(fields_container)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plotter_on_fields_container_elemental(allkindofcomplexity):
model = Model(allkindofcomplexity)
stress = model.results.stress()
stress.inputs.requested_location.connect("Elemental")
avg_op = Operator("to_elemental_fc")
avg_op.inputs.fields_container.connect(stress.outputs.fields_container)
fc = avg_op.outputs.fields_container()
pl = DpfPlotter(model.metadata.meshed_region)
cpos = pl.plot_contour(fc)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plotter_on_fields_container_nodal(allkindofcomplexity):
model = Model(allkindofcomplexity)
stress = model.results.stress()
stress.inputs.requested_location.connect("Elemental")
avg_op = Operator("to_nodal_fc")
avg_op.inputs.fields_container.connect(stress.outputs.fields_container)
fc = avg_op.outputs.fields_container()
pl = DpfPlotter(model.metadata.meshed_region)
cpos = pl.plot_contour(fc)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plot_fieldscontainer_on_mesh(allkindofcomplexity):
model = Model(allkindofcomplexity)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Elemental")
avg_op = Operator("to_elemental_fc")
avg_op.inputs.fields_container.connect(stress.outputs.fields_container)
fc = avg_op.outputs.fields_container()
mesh.plot(fc)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_field_elemental_plot(allkindofcomplexity):
model = Model(allkindofcomplexity)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Elemental")
avg_op = Operator("to_elemental_fc")
avg_op.inputs.fields_container.connect(stress.outputs.fields_container)
fc = avg_op.outputs.fields_container()
f = fc[1]
f.plot()
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_field_nodal_plot(allkindofcomplexity):
model = Model(allkindofcomplexity)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Elemental")
avg_op = Operator("to_nodal_fc")
avg_op.inputs.fields_container.connect(stress.outputs.fields_container)
fc = avg_op.outputs.fields_container()
f = fc[1]
f.plot()
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_field_solid_plot(allkindofcomplexity):
model = Model(allkindofcomplexity)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Nodal")
fc = stress.outputs.fields_container()
f = fc[1]
f.plot()
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_field_shell_plot(allkindofcomplexity):
model = Model(allkindofcomplexity)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Nodal")
fc = stress.outputs.fields_container()
f = fc[0]
f.plot()
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_field_solid_plot_scoping_nodal(multishells):
model = core.Model(multishells)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Nodal")
scoping = core.Scoping()
scoping.location = "Nodal"
l = list(range(0, 400))
l += list(range(1500, 2000))
l += list(range(2200, 2600))
scoping.ids = l
stress.inputs.mesh_scoping.connect(scoping)
s = stress.outputs.fields_container()
f = s[0]
f.plot()
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_field_shell_plot_scoping_elemental(multishells):
model = core.Model(multishells)
mesh = model.metadata.meshed_region
stress = model.results.stress()
scoping = core.Scoping()
scoping.location = "Elemental"
l = list(range(3000, 4500))
scoping.ids = l
stress.inputs.mesh_scoping.connect(scoping)
avg = core.Operator("to_elemental_fc")
avg.inputs.fields_container.connect(stress.outputs.fields_container)
s = avg.outputs.fields_container()
f = s[1]
f.plot(shell_layers=core.shell_layers.top)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plot_fieldscontainer_on_mesh_scoping(multishells):
model = core.Model(multishells)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Nodal")
scoping = core.Scoping()
scoping.location = "Nodal"
l = list(range(0, 400))
l += list(range(1500, 2000))
l += list(range(2200, 2600))
scoping.ids = l
stress.inputs.mesh_scoping.connect(scoping)
s = stress.outputs.fields_container()
mesh.plot(s, shell_layers=core.shell_layers.top)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_plot_fields_on_mesh_scoping(multishells):
model = core.Model(multishells)
mesh = model.metadata.meshed_region
stress = model.results.stress()
stress.inputs.requested_location.connect("Nodal")
scoping = core.Scoping()
scoping.location = "Nodal"
l = list(range(0, 400))
l += list(range(1500, 2000))
l += list(range(2200, 2600))
scoping.ids = l
stress.inputs.mesh_scoping.connect(scoping)
s = stress.outputs.fields_container()
mesh.plot(s[0])
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_throw_on_several_time_steps(plate_msup):
model = core.Model(plate_msup)
scoping = core.Scoping()
scoping.ids = range(3, len(model.metadata.time_freq_support.time_frequencies) + 1)
stress = model.results.displacement()
stress.inputs.time_scoping.connect(scoping)
fc = stress.outputs.fields_container()
mesh = model.metadata.meshed_region
with pytest.raises(dpf_errors.FieldContainerPlottingError):
mesh.plot(fc)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
def test_throw_complex_file(complex_model):
model = core.Model(complex_model)
stress = model.results.displacement()
fc = stress.outputs.fields_container()
mesh = model.metadata.meshed_region
with pytest.raises(dpf_errors.ComplexPlottingError):
mesh.plot(fc)
@pytest.mark.skipif(not HAS_PYVISTA, reason="Please install pyvista")
@pytest.mark.skipif(RUNNING_DOCKER, reason="Path hidden within docker container")
def test_plot_contour_using_vtk_file(complex_model):
model = core.Model(complex_model)
stress = model.results.displacement()
fc = stress.outputs.fields_container()
pl = DpfPlotter(model.metadata.meshed_region)
pl._plot_contour_using_vtk_file(fc)
| 36.595041 | 86 | 0.746612 | 1,155 | 8,856 | 5.526407 | 0.110823 | 0.0893 | 0.075826 | 0.0705 | 0.826101 | 0.782861 | 0.768917 | 0.768917 | 0.751997 | 0.731318 | 0 | 0.011206 | 0.143519 | 8,856 | 241 | 87 | 36.746888 | 0.830323 | 0.007565 | 0 | 0.689655 | 0 | 0 | 0.07216 | 0 | 0 | 0 | 0 | 0 | 0.004926 | 1 | 0.083744 | false | 0 | 0.044335 | 0 | 0.128079 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0b2fa106f0dfcdbf569e3419549da19ec5c10039 | 611 | py | Python | temboo/core/Library/Facebook/Actions/General/Likes/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 7 | 2016-03-07T02:07:21.000Z | 2022-01-21T02:22:41.000Z | temboo/core/Library/Facebook/Actions/General/Likes/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | null | null | null | temboo/core/Library/Facebook/Actions/General/Likes/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 8 | 2016-06-14T06:01:11.000Z | 2020-04-22T09:21:44.000Z | from temboo.Library.Facebook.Actions.General.Likes.CreateLike import CreateLike, CreateLikeInputSet, CreateLikeResultSet, CreateLikeChoreographyExecution
from temboo.Library.Facebook.Actions.General.Likes.DeleteLike import DeleteLike, DeleteLikeInputSet, DeleteLikeResultSet, DeleteLikeChoreographyExecution
from temboo.Library.Facebook.Actions.General.Likes.ReadLikes import ReadLikes, ReadLikesInputSet, ReadLikesResultSet, ReadLikesChoreographyExecution
from temboo.Library.Facebook.Actions.General.Likes.UpdateLike import UpdateLike, UpdateLikeInputSet, UpdateLikeResultSet, UpdateLikeChoreographyExecution
| 122.2 | 153 | 0.895254 | 52 | 611 | 10.519231 | 0.461538 | 0.073126 | 0.124314 | 0.182815 | 0.321755 | 0.321755 | 0.321755 | 0 | 0 | 0 | 0 | 0 | 0.045827 | 611 | 4 | 154 | 152.75 | 0.93825 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
0b40ceb58b8ec96d6f03ddd93f97dd8c10e9f5d7 | 144 | py | Python | test.py | Anders-Fogh/AF09 | 35ddff26c12d3ff0fd555fab6f81fd47ea6da24e | [
"MIT"
] | 1 | 2021-04-12T06:57:19.000Z | 2021-04-12T06:57:19.000Z | test.py | Anders-Fogh/AF09 | 35ddff26c12d3ff0fd555fab6f81fd47ea6da24e | [
"MIT"
] | null | null | null | test.py | Anders-Fogh/AF09 | 35ddff26c12d3ff0fd555fab6f81fd47ea6da24e | [
"MIT"
] | null | null | null | import subprocess
subprocess.call("wget -O build.sh https://gitlab.com/wireguard-vpn/build/-/raw/master/build.sh && bash build.sh", shell=True)
| 48 | 125 | 0.756944 | 23 | 144 | 4.73913 | 0.73913 | 0.192661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069444 | 144 | 2 | 126 | 72 | 0.813433 | 0 | 0 | 0 | 0 | 0.5 | 0.652778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
0b7f0813ba098a5eceec74f6e337144eeceeb53f | 96 | py | Python | venv/lib/python3.8/site-packages/debugpy/_vendored/pydevd/_pydev_imps/_pydev_inspect.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/debugpy/_vendored/pydevd/_pydev_imps/_pydev_inspect.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/debugpy/_vendored/pydevd/_pydev_imps/_pydev_inspect.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/67/6e/24/8caea8de436eda06d88e601739fde6cccea189752a0ec710254e46f832 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.40625 | 0 | 96 | 1 | 96 | 96 | 0.489583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0b9a7d49be9b10ca928ccdf7ec521def2774a87b | 100 | py | Python | flowtext/__init__.py | Oneflow-Inc/text | d7c45c4ed517eab49e8578193f18641b213a6643 | [
"BSD-3-Clause"
] | 1 | 2022-03-11T07:07:14.000Z | 2022-03-11T07:07:14.000Z | flowtext/__init__.py | Oneflow-Inc/text | d7c45c4ed517eab49e8578193f18641b213a6643 | [
"BSD-3-Clause"
] | 2 | 2021-11-23T02:36:15.000Z | 2021-12-01T07:37:30.000Z | flowtext/__init__.py | Oneflow-Inc/text | d7c45c4ed517eab49e8578193f18641b213a6643 | [
"BSD-3-Clause"
] | null | null | null | from . import data
from . import models
from . import nn
from . import datasets
from . import utils
| 16.666667 | 22 | 0.75 | 15 | 100 | 5 | 0.466667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 100 | 5 | 23 | 20 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f01b6137393c3d11105598ccc6759a10e5cdaaa7 | 43 | py | Python | protocol/__init__.py | Anarchid/uberserver | 312b69e379e8999c440f1dd6e5c0b00a9932e7b1 | [
"MIT"
] | null | null | null | protocol/__init__.py | Anarchid/uberserver | 312b69e379e8999c440f1dd6e5c0b00a9932e7b1 | [
"MIT"
] | null | null | null | protocol/__init__.py | Anarchid/uberserver | 312b69e379e8999c440f1dd6e5c0b00a9932e7b1 | [
"MIT"
] | null | null | null | import AutoDict, Battle, Channel, Protocol
| 21.5 | 42 | 0.813953 | 5 | 43 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 43 | 1 | 43 | 43 | 0.921053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f04f348b80ede60461850ce695314c5324b3015d | 13,461 | py | Python | tests/test_models.py | CodeYellowBV/django-partial-index | 38cf7f772c4706eff35d8b324c89758f3334ee1a | [
"BSD-3-Clause"
] | null | null | null | tests/test_models.py | CodeYellowBV/django-partial-index | 38cf7f772c4706eff35d8b324c89758f3334ee1a | [
"BSD-3-Clause"
] | null | null | null | tests/test_models.py | CodeYellowBV/django-partial-index | 38cf7f772c4706eff35d8b324c89758f3334ee1a | [
"BSD-3-Clause"
] | 1 | 2021-01-28T17:59:36.000Z | 2021-01-28T17:59:36.000Z | """
Tests for actual use of the indexes after creating models with them.
"""
from django.db import IntegrityError
from django.test import TransactionTestCase
from django.utils import timezone
from django.core.exceptions import NON_FIELD_ERRORS, ValidationError
from testapp.models import User, Room, RoomBookingText, JobText, ComparisonText, NullableRoomNumberText, RoomBookingQ, JobQ, ComparisonQ, NullableRoomNumberQ, Label
class PartialIndexRoomBookingTest(TransactionTestCase):
"""Test that partial unique constraints work as expected when inserting data to the db.
Models and indexes are created when django creates the test db, they do not need to be set up.
"""
def setUp(self):
self.user1 = User.objects.create(name='User1')
self.user2 = User.objects.create(name='User2')
self.room1 = Room.objects.create(name='Room1')
self.room2 = Room.objects.create(name='Room2')
def test_roombooking_text_different_rooms(self):
RoomBookingText.objects.create(user=self.user1, room=self.room1)
RoomBookingText.objects.create(user=self.user1, room=self.room2)
def test_roombooking_q_different_rooms(self):
RoomBookingQ.objects.create(user=self.user1, room=self.room1)
RoomBookingQ.objects.create(user=self.user1, room=self.room2)
def test_roombooking_text_different_users(self):
RoomBookingText.objects.create(user=self.user1, room=self.room1)
RoomBookingText.objects.create(user=self.user2, room=self.room1)
def test_roombooking_q_different_users(self):
RoomBookingQ.objects.create(user=self.user1, room=self.room1)
RoomBookingQ.objects.create(user=self.user2, room=self.room1)
def test_roombooking_text_same_mark_first_deleted(self):
for i in range(3):
book = RoomBookingText.objects.create(user=self.user1, room=self.room1)
book.deleted_at = timezone.now()
book.save()
RoomBookingText.objects.create(user=self.user1, room=self.room1)
def test_roombooking_q_same_mark_first_deleted(self):
for i in range(3):
book = RoomBookingQ.objects.create(user=self.user1, room=self.room1)
book.deleted_at = timezone.now()
book.save()
RoomBookingQ.objects.create(user=self.user1, room=self.room1)
def test_roombooking_text_same_conflict(self):
RoomBookingText.objects.create(user=self.user1, room=self.room1)
with self.assertRaises(IntegrityError):
RoomBookingText.objects.create(user=self.user1, room=self.room1)
def test_roombooking_q_same_conflict(self):
RoomBookingQ.objects.create(user=self.user1, room=self.room1)
room_booking = RoomBookingQ(user=self.user1, room=self.room1)
with self.assertRaises(ValidationError) as cm:
room_booking.full_clean()
self.assertSetEqual({NON_FIELD_ERRORS}, set(cm.exception.message_dict.keys()))
self.assertEqual('unique_together', cm.exception.error_dict[NON_FIELD_ERRORS][0].code)
with self.assertRaises(IntegrityError):
room_booking.save()
class PartialIndexJobTest(TransactionTestCase):
"""Test that partial unique constraints work as expected when inserting data to the db.
Models and indexes are created when django creates the test db, they do not need to be set up.
"""
def test_job_text_same_id(self):
job1 = JobText.objects.create(order=1, group=1)
job2 = JobText.objects.create(order=1, group=2)
self.assertEqual(job1.order, job2.order)
def test_job_q_same_id(self):
job1 = JobQ.objects.create(order=1, group=1)
job2 = JobQ.objects.create(order=1, group=2)
self.assertEqual(job1.order, job2.order)
def test_job_text_same_group(self):
JobText.objects.create(order=1, group=1)
with self.assertRaises(IntegrityError):
JobText.objects.create(order=2, group=1)
def test_job_q_same_group(self):
JobQ.objects.create(order=1, group=1)
job = JobQ(order=2, group=1)
with self.assertRaises(ValidationError) as cm:
job.full_clean()
self.assertSetEqual({'group'}, set(cm.exception.message_dict.keys()))
self.assertEqual('unique', cm.exception.error_dict['group'][0].code)
with self.assertRaises(IntegrityError):
job.save()
def test_job_text_complete_same_group(self):
job1 = JobText.objects.create(order=1, group=1, is_complete=True)
job2 = JobText.objects.create(order=1, group=1)
self.assertEqual(job1.order, job2.order)
def test_job_q_complete_same_group(self):
job1 = JobQ.objects.create(order=1, group=1, is_complete=True)
job2 = JobQ.objects.create(order=1, group=1)
self.assertEqual(job1.order, job2.order)
def test_job_text_complete_later_same_group(self):
job1 = JobText.objects.create(order=1, group=1)
job2 = JobText.objects.create(order=1, group=1, is_complete=True)
self.assertEqual(job1.order, job2.order)
def test_job_q_complete_later_same_group(self):
job1 = JobQ.objects.create(order=1, group=1)
job2 = JobQ.objects.create(order=1, group=1, is_complete=True)
self.assertEqual(job1.order, job2.order)
class PartialIndexComparisonTest(TransactionTestCase):
"""Test that partial unique constraints work as expected when inserting data to the db.
Models and indexes are created when django creates the test db, they do not need to be set up.
"""
def test_comparison_text_duplicate_same_number(self):
ComparisonText.objects.create(a=1, b=1)
with self.assertRaises(IntegrityError):
ComparisonText.objects.create(a=1, b=1)
def test_comparison_q_duplicate_same_number(self):
ComparisonQ.objects.create(a=1, b=1)
with self.assertRaises(IntegrityError):
ComparisonQ.objects.create(a=1, b=1)
def test_comparison_text_different_same_number(self):
ComparisonText.objects.create(a=1, b=1)
ComparisonText.objects.create(a=2, b=2)
def test_comparison_q_different_same_number(self):
ComparisonQ.objects.create(a=1, b=1)
ComparisonQ.objects.create(a=2, b=2)
def test_comparison_text_duplicate_different_numbers(self):
ComparisonText.objects.create(a=1, b=2)
ComparisonText.objects.create(a=1, b=2)
def test_comparison_q_duplicate_different_numbers(self):
ComparisonQ.objects.create(a=1, b=2)
ComparisonQ.objects.create(a=1, b=2)
class PartialIndexRoomNumberTest(TransactionTestCase):
"""Test that partial unique constraints work as expected when inserting data to the db.
Models and indexes are created when django creates the test db, they do not need to be set up.
"""
def setUp(self):
self.room1 = Room.objects.create(name='Room1')
self.room2 = Room.objects.create(name='Room2')
def test_nullable_roomnumber_text_different_rooms(self):
NullableRoomNumberText.objects.create(room=self.room1, room_number=1)
NullableRoomNumberText.objects.create(room=self.room2, room_number=1)
def test_nullable_roomnumber_q_different_rooms(self):
NullableRoomNumberQ.objects.create(room=self.room1, room_number=1)
NullableRoomNumberQ.objects.create(room=self.room2, room_number=1)
def test_nullable_roomnumber_text_different_room_numbers(self):
NullableRoomNumberText.objects.create(room=self.room1, room_number=1)
NullableRoomNumberText.objects.create(room=self.room1, room_number=2)
def test_nullable_roomnumber_q_different_users(self):
NullableRoomNumberQ.objects.create(room=self.room1, room_number=1)
NullableRoomNumberQ.objects.create(room=self.room1, room_number=2)
def test_nullable_roomnumber_text_same_mark_first_deleted(self):
for i in range(3):
nr = NullableRoomNumberText.objects.create(room=self.room1, room_number=1)
nr.deleted_at = timezone.now()
nr.save()
NullableRoomNumberText.objects.create(room=self.room1, room_number=1)
def test_nullable_roomnumber_q_same_mark_first_deleted(self):
for i in range(3):
nr = NullableRoomNumberQ.objects.create(room=self.room1, room_number=1)
nr.deleted_at = timezone.now()
nr.save()
NullableRoomNumberQ.objects.create(room=self.room1, room_number=1)
def test_nullable_roomnumber_text_same_conflict(self):
NullableRoomNumberText.objects.create(room=self.room1, room_number=1)
with self.assertRaises(IntegrityError):
NullableRoomNumberText.objects.create(room=self.room1, room_number=1)
def test_nullable_roomnumber_q_same_conflict(self):
NullableRoomNumberQ.objects.create(room=self.room1, room_number=1)
with self.assertRaises(IntegrityError):
NullableRoomNumberQ.objects.create(room=self.room1, room_number=1)
def test_nullable_roomnumber_text_same_no_conflict_for_null_number(self):
NullableRoomNumberText.objects.create(room=self.room1, room_number=None)
NullableRoomNumberText.objects.create(room=self.room1, room_number=None)
def test_nullable_roomnumber_q_same_no_conflict_for_null_number(self):
NullableRoomNumberQ.objects.create(room=self.room1, room_number=None)
NullableRoomNumberQ.objects.create(room=self.room1, room_number=None)
class PartialIndexLabelValidationTest(TransactionTestCase):
"""Test that partial unique validations are all executed."""
def setUp(self):
self.room1 = Room.objects.create(name='room 1')
self.room2 = Room.objects.create(name='room 2')
self.user1 = User.objects.create(name='user 1')
self.user2 = User.objects.create(name='user 2')
def test_single_unique_constraints_are_still_evaluated(self):
Label.objects.create(label='a', user=self.user1, room=self.room1, uuid='11111111-0000-0000-0000-000000000000', created_at='2019-01-01T00:00:00')
label = Label(label='b', user=self.user2, room=self.room2, uuid='22222222-0000-0000-0000-000000000000', created_at='2019-01-01T00:00:00')
with self.assertRaises(ValidationError) as cm:
label.full_clean()
self.assertSetEqual({'created_at'}, set(cm.exception.message_dict.keys()))
self.assertEqual('unique', cm.exception.error_dict['created_at'][0].code)
with self.assertRaises(IntegrityError):
label.save()
def test_standard_single_field_unique_constraints_do_not_block_evaluation_of_partial_index_constraints(self):
Label.objects.create(label='a', user=self.user1, room=self.room1, uuid='11111111-0000-0000-0000-000000000000', created_at='2019-01-01T00:00:00')
label = Label(label='b', user=self.user2, room=self.room2, uuid='11111111-0000-0000-0000-000000000000', created_at='2019-01-01T00:00:00')
with self.assertRaises(ValidationError) as cm:
label.full_clean()
self.assertSetEqual({'created_at', 'uuid'}, set(cm.exception.message_dict.keys()))
self.assertEqual('unique', cm.exception.error_dict['created_at'][0].code)
self.assertEqual('unique', cm.exception.error_dict['uuid'][0].code)
with self.assertRaises(IntegrityError):
label.save()
def test_standard_unique_together_constraints_do_not_block_evaluation_of_partial_index_constraints(self):
Label.objects.create(label='a', user=self.user1, room=self.room1, uuid='11111111-0000-0000-0000-000000000000', created_at='2019-01-01T11:11:11')
label = Label(label='b', user=self.user1, room=self.room1, uuid='11111111-0000-0000-0000-000000000000', created_at='2019-01-02T22:22:22')
with self.assertRaises(ValidationError) as cm:
label.full_clean()
self.assertSetEqual({NON_FIELD_ERRORS, 'uuid'}, set(cm.exception.message_dict.keys()))
self.assertEqual(1, len(cm.exception.error_dict['uuid']))
self.assertEqual(1, len(cm.exception.error_dict[NON_FIELD_ERRORS]))
self.assertEqual('unique', cm.exception.error_dict['uuid'][0].code)
self.assertEqual('unique_together', cm.exception.error_dict[NON_FIELD_ERRORS][0].code)
with self.assertRaises(IntegrityError):
label.save()
def test_all_partial_constraints_are_included_in_validation_errors(self):
Label.objects.create(label='a', user=self.user1, room=self.room1, uuid='11111111-0000-0000-0000-000000000000', created_at='2019-01-01T11:11:11')
label = Label(label='a', user=self.user1, room=self.room1, uuid='22222222-0000-0000-0000-000000000000', created_at='2019-01-02T22:22:22')
with self.assertRaises(ValidationError) as cm:
label.full_clean()
self.assertSetEqual({NON_FIELD_ERRORS}, set(cm.exception.message_dict.keys()))
self.assertEqual(2, len(cm.exception.error_dict[NON_FIELD_ERRORS]))
self.assertEqual('unique_together', cm.exception.error_dict[NON_FIELD_ERRORS][0].code)
self.assertEqual(['label', 'room'], cm.exception.error_dict[NON_FIELD_ERRORS][0].params['unique_check'])
self.assertEqual('unique_together', cm.exception.error_dict[NON_FIELD_ERRORS][1].code)
self.assertEqual(['label', 'user'], cm.exception.error_dict[NON_FIELD_ERRORS][1].params['unique_check'])
with self.assertRaises(IntegrityError):
label.save()
| 46.902439 | 164 | 0.718149 | 1,767 | 13,461 | 5.298246 | 0.092247 | 0.105533 | 0.052767 | 0.036317 | 0.882824 | 0.858257 | 0.810617 | 0.782525 | 0.768105 | 0.698355 | 0 | 0.050952 | 0.168932 | 13,461 | 286 | 165 | 47.066434 | 0.785912 | 0.062923 | 0 | 0.507538 | 0 | 0 | 0.056118 | 0.022957 | 0 | 0 | 0 | 0 | 0.221106 | 1 | 0.19598 | false | 0 | 0.025126 | 0 | 0.246231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f06f8af4892d8c7c9cb43750bc79b0231517aa84 | 14 | py | Python | vit/version.py | RonMcKay/vit | e37a75da8ff0856b9b7efd5c226741d2892ee175 | [
"MIT"
] | 171 | 2017-03-06T19:49:23.000Z | 2020-07-26T16:32:06.000Z | vit/version.py | RonMcKay/vit | e37a75da8ff0856b9b7efd5c226741d2892ee175 | [
"MIT"
] | 255 | 2017-02-01T11:49:12.000Z | 2020-07-26T22:31:25.000Z | vit/version.py | RonMcKay/vit | e37a75da8ff0856b9b7efd5c226741d2892ee175 | [
"MIT"
] | 26 | 2017-01-17T20:31:13.000Z | 2020-06-17T13:09:01.000Z | VIT = '2.2.0'
| 7 | 13 | 0.428571 | 4 | 14 | 1.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.272727 | 0.214286 | 14 | 1 | 14 | 14 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0.357143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b2b86854da7fc8365bf2418ca3368c913d54f2c5 | 820 | py | Python | nofap.py | tunglambk/discord-C-bot | 341d5f82370fa4e35f6f822a06ec83ee2b704579 | [
"MIT"
] | null | null | null | nofap.py | tunglambk/discord-C-bot | 341d5f82370fa4e35f6f822a06ec83ee2b704579 | [
"MIT"
] | null | null | null | nofap.py | tunglambk/discord-C-bot | 341d5f82370fa4e35f6f822a06ec83ee2b704579 | [
"MIT"
] | null | null | null | nofap = [
'https://cdn.discordapp.com/attachments/856061609632595978/859418145394786324/1.PNG',
'https://cdn.discordapp.com/attachments/856061609632595978/859418149216059412/2.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418151363936286/3.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418153275490354/4.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418152987394058/5.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418154454876170/6.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418156087246848/7.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418159147909170/8.jpg',
'https://cdn.discordapp.com/attachments/856061609632595978/859418160351150090/8.png'
] | 74.545455 | 89 | 0.796341 | 82 | 820 | 7.963415 | 0.317073 | 0.11026 | 0.248086 | 0.289433 | 0.721286 | 0.721286 | 0.568147 | 0 | 0 | 0 | 0 | 0.431347 | 0.058537 | 820 | 11 | 90 | 74.545455 | 0.414508 | 0 | 0 | 0 | 0 | 0 | 0.898904 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b2fd4fcdf967db9e79949483e60a79464afc1bed | 91 | py | Python | taggo/util/__init__.py | vttach/taggo | 7eff7286d635a9fa7d790cbe57cf08477eeb5a37 | [
"MIT"
] | null | null | null | taggo/util/__init__.py | vttach/taggo | 7eff7286d635a9fa7d790cbe57cf08477eeb5a37 | [
"MIT"
] | null | null | null | taggo/util/__init__.py | vttach/taggo | 7eff7286d635a9fa7d790cbe57cf08477eeb5a37 | [
"MIT"
] | null | null | null | from .request import make_request, send_payload
__all__ = ['make_request', 'send_payload'] | 30.333333 | 47 | 0.791209 | 12 | 91 | 5.333333 | 0.583333 | 0.34375 | 0.46875 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098901 | 91 | 3 | 48 | 30.333333 | 0.780488 | 0 | 0 | 0 | 0 | 0 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
330b72e68c01414607102fedf272ccba01c5d84d | 34 | py | Python | pypkgtemp/__init__.py | k-sunako/CryptoMath | 467288c26301606ed1667f424e276d81c20ab640 | [
"MIT"
] | null | null | null | pypkgtemp/__init__.py | k-sunako/CryptoMath | 467288c26301606ed1667f424e276d81c20ab640 | [
"MIT"
] | 1 | 2021-06-01T22:11:33.000Z | 2021-06-01T22:11:33.000Z | pypkgtemp/__init__.py | costrouc/python-package-template | 1058f8f2ec4a34c0a600072b9eb5fe8d6fcb9b09 | [
"MIT"
] | null | null | null | print('hello world -- pypkgtemp')
| 17 | 33 | 0.705882 | 4 | 34 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
683661a95d455a9109191be8edae2387e55a48b8 | 208 | py | Python | approvaltests/approval_exception.py | skalinets/ApprovalTests.Python | ec7933d66a5a275632b46da0e55e7ca7de06c092 | [
"Apache-2.0"
] | 82 | 2015-10-06T19:25:21.000Z | 2022-03-04T00:09:31.000Z | approvaltests/approval_exception.py | skalinets/ApprovalTests.Python | ec7933d66a5a275632b46da0e55e7ca7de06c092 | [
"Apache-2.0"
] | 92 | 2015-10-05T00:27:22.000Z | 2022-02-13T18:53:10.000Z | approvaltests/approval_exception.py | skalinets/ApprovalTests.Python | ec7933d66a5a275632b46da0e55e7ca7de06c092 | [
"Apache-2.0"
] | 49 | 2015-08-04T18:02:59.000Z | 2022-03-06T18:49:09.000Z | class ApprovalException(Exception):
def __init__(self, value: str) -> None:
self.value = value
def __str__(self):
return self.value
class FrameNotFound(ApprovalException):
pass
| 18.909091 | 43 | 0.673077 | 22 | 208 | 6 | 0.545455 | 0.204545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.235577 | 208 | 10 | 44 | 20.8 | 0.830189 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0.142857 | 0 | 0.142857 | 0.714286 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
6844ac4a17a15e4f3e9204027faa0ab1e5a12080 | 23 | py | Python | my_classes/.history/ModulesPackages_PackageNamespaces/example3b/module2_20210726190930.py | minefarmer/deep-Dive-1 | b0675b853180c5b5781888266ea63a3793b8d855 | [
"Unlicense"
] | null | null | null | my_classes/.history/ModulesPackages_PackageNamespaces/example3b/module2_20210726190930.py | minefarmer/deep-Dive-1 | b0675b853180c5b5781888266ea63a3793b8d855 | [
"Unlicense"
] | null | null | null | my_classes/.history/ModulesPackages_PackageNamespaces/example3b/module2_20210726190930.py | minefarmer/deep-Dive-1 | b0675b853180c5b5781888266ea63a3793b8d855 | [
"Unlicense"
] | null | null | null | import module1
print() | 7.666667 | 14 | 0.782609 | 3 | 23 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0.130435 | 23 | 3 | 15 | 7.666667 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
685829103242991e98d17c2130739dbc597f59f9 | 1,222 | py | Python | tests/test_utility.py | neuroio/neuroio-python | 160f96515877e5e2ee0e888b7424c77cb2d7496a | [
"MIT"
] | null | null | null | tests/test_utility.py | neuroio/neuroio-python | 160f96515877e5e2ee0e888b7424c77cb2d7496a | [
"MIT"
] | 6 | 2021-09-06T08:23:09.000Z | 2021-11-10T16:19:20.000Z | tests/test_utility.py | neuroio/neuroio-python | 160f96515877e5e2ee0e888b7424c77cb2d7496a | [
"MIT"
] | null | null | null | import pytest
import respx
from neuroio.constants import API_BASE_URL
@respx.mock
def test_test_compare_200(client):
request = respx.post(f"{API_BASE_URL}/v1/utility/compare/").respond(
status_code=200
)
response = client.utility.compare(b"image1", b"image2")
assert request.called
assert response.status_code == 200
@respx.mock
@pytest.mark.asyncio
async def test_async_compare_200(async_client):
request = respx.post(f"{API_BASE_URL}/v1/utility/compare/").respond(
status_code=200
)
response = await async_client.utility.compare(b"image1", b"image2")
assert request.called
assert response.status_code == 200
@respx.mock
def test_asm_200(client):
request = respx.post(f"{API_BASE_URL}/v1/utility/asm/").respond(
status_code=200
)
response = client.utility.asm(b"image")
assert request.called
assert response.status_code == 200
@respx.mock
@pytest.mark.asyncio
async def test_async_asm_200(async_client):
request = respx.post(f"{API_BASE_URL}/v1/utility/asm/").respond(
status_code=200
)
response = await async_client.utility.asm(b"image")
assert request.called
assert response.status_code == 200
| 23.960784 | 72 | 0.713584 | 172 | 1,222 | 4.877907 | 0.203488 | 0.095352 | 0.123957 | 0.104887 | 0.873659 | 0.873659 | 0.873659 | 0.873659 | 0.873659 | 0.849821 | 0 | 0.043435 | 0.171031 | 1,222 | 50 | 73 | 24.44 | 0.784798 | 0 | 0 | 0.594595 | 0 | 0 | 0.13257 | 0.104746 | 0 | 0 | 0 | 0 | 0.216216 | 1 | 0.054054 | false | 0 | 0.081081 | 0 | 0.135135 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
686a4b7db8dc5a67b828b1b6e7cac341a7ea331c | 76 | py | Python | timestamps_middleware/__init__.py | pachacamac/TinyDBTimestamps | 15460cd2654bd136d99516cc2ac2ab3b468e48dd | [
"BSD-3-Clause"
] | 1 | 2020-12-25T13:51:47.000Z | 2020-12-25T13:51:47.000Z | timestamps_middleware/__init__.py | pachacamac/TinyDBTimestamps | 15460cd2654bd136d99516cc2ac2ab3b468e48dd | [
"BSD-3-Clause"
] | 2 | 2020-12-15T12:47:21.000Z | 2020-12-15T20:46:35.000Z | timestamps_middleware/__init__.py | pachacamac/TinyDBTimestamps | 15460cd2654bd136d99516cc2ac2ab3b468e48dd | [
"BSD-3-Clause"
] | 1 | 2020-12-25T13:51:50.000Z | 2020-12-25T13:51:50.000Z | from timestamps_middleware.timestamps_middleware import TimestampsMiddleware | 76 | 76 | 0.947368 | 7 | 76 | 10 | 0.714286 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039474 | 76 | 1 | 76 | 76 | 0.958904 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d7b9e55630834bba3334bcca193a8b083df52f87 | 9,450 | py | Python | pydip/test/datc_tests/test__i__building.py | otech-nl/pydip | fbe085ccc0473b145d2fe424460163ad90c8ea57 | [
"MIT"
] | 13 | 2018-04-04T02:48:59.000Z | 2020-09-13T09:38:24.000Z | pydip/test/datc_tests/test__i__building.py | otech-nl/pydip | fbe085ccc0473b145d2fe424460163ad90c8ea57 | [
"MIT"
] | 6 | 2018-06-24T19:52:37.000Z | 2021-02-11T20:13:58.000Z | pydip/test/datc_tests/test__i__building.py | otech-nl/pydip | fbe085ccc0473b145d2fe424460163ad90c8ea57 | [
"MIT"
] | 5 | 2018-02-28T21:03:46.000Z | 2022-03-25T15:58:27.000Z | import pytest
from pydip.map.predefined import vanilla_dip
from pydip.player.unit import Unit
from pydip.player.unit import UnitTypes
from pydip.test.adjustment_helper import AdjustmentHelper
from pydip.test.command_helper import AdjustmentCommandHelper, AdjustmentCommandType
from pydip.test.player_helper import PlayerHelper
def test_i_1a__too_many_build_orders__with_validation():
"""
This test has been modified from the DATC -- it includes a build from a non-home territory, which
fails for a completely separate reason, before we even reach order resolution.
"""
# Germany has captured one new territory, all other players have stayed still
player_units = vanilla_dip.generate_starting_player_units()
player_units['Germany'] = {
Unit(UnitTypes.FLEET, 'Holland Coast'),
Unit(UnitTypes.TROOP, 'Prussia'),
Unit(UnitTypes.TROOP, 'Tyrolia'),
}
helper = AdjustmentHelper(
[
PlayerHelper('Germany', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Berlin'),
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Kiel'),
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Munich'),
]),
],
player_units=player_units,
)
with pytest.raises(AssertionError):
helper.resolve__validated()
def test_i_1b__too_many_build_orders():
"""
This test has been modified from the DATC -- it includes a build from a non-home territory, which
fails for a completely separate reason, before we even reach order resolution.
"""
# Germany has captured one new territory, all other players have stayed still
player_units = vanilla_dip.generate_starting_player_units()
player_units['Germany'] = {
Unit(UnitTypes.FLEET, 'Holland Coast'),
Unit(UnitTypes.TROOP, 'Prussia'),
Unit(UnitTypes.TROOP, 'Tyrolia'),
}
helper = AdjustmentHelper(
[
PlayerHelper('Germany', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Berlin'),
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Kiel'),
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Munich'),
]),
],
player_units=player_units,
)
# prioritize first-issued command
results = helper.resolve()
expected_results = vanilla_dip.generate_starting_player_units()
expected_results['Germany'] = {
Unit(UnitTypes.FLEET, 'Holland Coast'),
Unit(UnitTypes.TROOP, 'Prussia'),
Unit(UnitTypes.TROOP, 'Tyrolia'),
Unit(UnitTypes.TROOP, 'Berlin'),
}
assert results == expected_results
def test_i_2__fleets_cannot_be_built_in_land_areas():
# Russia has captured one new territory, all other players have stayed still
player_units = vanilla_dip.generate_starting_player_units()
player_units['Russia'] = {
Unit(UnitTypes.FLEET, 'Sweden Coast'),
Unit(UnitTypes.TROOP, 'Prussia'),
Unit(UnitTypes.TROOP, 'Galicia'),
Unit(UnitTypes.FLEET, 'Black Sea'),
}
with pytest.raises(AssertionError):
AdjustmentHelper(
[
PlayerHelper('Russia', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.FLEET, 'Moscow'),
]),
],
player_units=player_units,
)
def test_i_3__supply_center_must_be_empty_for_building():
# Germany has captured one new territory, all other players have stayed still
player_units = vanilla_dip.generate_starting_player_units()
player_units['Germany'] = {
Unit(UnitTypes.FLEET, 'Holland Coast'),
Unit(UnitTypes.TROOP, 'Berlin'),
Unit(UnitTypes.TROOP, 'Tyrolia'),
}
with pytest.raises(AssertionError):
AdjustmentHelper(
[
PlayerHelper('Germany', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Berlin'),
]),
],
player_units=player_units,
)
def test_i_4__both_coasts_must_be_empty_for_building():
# Russia has captured one new territory, all other players have stayed still
player_units = vanilla_dip.generate_starting_player_units()
player_units['Russia'] = {
Unit(UnitTypes.FLEET, 'St. Petersburg South Coast'),
Unit(UnitTypes.TROOP, 'Prussia'),
Unit(UnitTypes.TROOP, 'Galicia'),
Unit(UnitTypes.FLEET, 'Rumania Coast'),
}
with pytest.raises(AssertionError):
AdjustmentHelper(
[
PlayerHelper('Russia', [
AdjustmentCommandHelper(
AdjustmentCommandType.CREATE,
UnitTypes.FLEET,
'St. Petersburg North Coast',
),
]),
],
player_units=player_units,
)
def test_i_5__building_in_home_supply_center_that_is_not_owned():
# Russia has captured Berlin, and moved out, and Germany will be capturing Holland
player_units = vanilla_dip.generate_starting_player_units()
player_units['Germany'] = {
Unit(UnitTypes.FLEET, 'Kiel Coast'),
Unit(UnitTypes.TROOP, 'Holland'),
}
player_units['Russia'] = {
Unit(UnitTypes.FLEET, 'St. Petersburg South Coast'),
Unit(UnitTypes.TROOP, 'Moscow'),
Unit(UnitTypes.FLEET, 'Sevastopol Coast'),
Unit(UnitTypes.TROOP, 'Prussia'),
Unit(UnitTypes.TROOP, 'Warsaw'),
}
owned_territories = vanilla_dip.generate_home_territories()
owned_territories['Germany'] = { 'Kiel', 'Munich' }
owned_territories['Russia'] = { 'St. Petersburg', 'Sevastopol', 'Moscow', 'Berlin', 'Warsaw' }
with pytest.raises(AssertionError):
AdjustmentHelper(
[
PlayerHelper('Germany', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Berlin'),
]),
],
player_units=player_units,
owned_territories=owned_territories,
)
def test_i_6__building_in_owned_supply_center_that_is_not_a_home_supply_center():
# Germany has captured Warsaw, and moved out
player_units = vanilla_dip.generate_starting_player_units()
player_units['Germany'] = {
Unit(UnitTypes.FLEET, 'Kiel Coast'),
Unit(UnitTypes.TROOP, 'Munich'),
Unit(UnitTypes.TROOP, 'Ukraine'),
Unit(UnitTypes.TROOP, 'Berlin'),
}
player_units['Russia'] = {
Unit(UnitTypes.FLEET, 'St. Petersburg South Coast'),
Unit(UnitTypes.TROOP, 'Moscow'),
Unit(UnitTypes.FLEET, 'Sevastopol Coast'),
}
owned_territories = vanilla_dip.generate_home_territories()
owned_territories['Germany'] = { 'Kiel', 'Munich', 'Berlin', 'Warsaw' }
owned_territories['Russia'] = { 'St. Petersburg', 'Sevastopol', 'Moscow' }
with pytest.raises(AssertionError):
AdjustmentHelper(
[
PlayerHelper('Germany', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Warsaw'),
]),
],
player_units=player_units,
owned_territories=owned_territories,
)
def test_i_7a__only_one_build_in_a_home_supply_center__validated():
# Russia has captured two territories, everyone else has stayed put
player_units = vanilla_dip.generate_starting_player_units()
player_units['Russia'] = {
Unit(UnitTypes.FLEET, 'Sweden Coast'),
Unit(UnitTypes.TROOP, 'Warsaw'),
Unit(UnitTypes.TROOP, 'Ukraine'),
Unit(UnitTypes.FLEET, 'Rumania Coast'),
}
helper = AdjustmentHelper(
[
PlayerHelper('Russia', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Moscow'),
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Moscow'),
]),
],
player_units=player_units,
)
with pytest.raises(AssertionError):
helper.resolve__validated()
def test_i_7b__only_one_build_in_a_home_supply_center__not_validated():
# Russia has captured two territories, everyone else has stayed put
player_units = vanilla_dip.generate_starting_player_units()
player_units['Russia'] = {
Unit(UnitTypes.FLEET, 'Sweden Coast'),
Unit(UnitTypes.TROOP, 'Warsaw'),
Unit(UnitTypes.TROOP, 'Ukraine'),
Unit(UnitTypes.FLEET, 'Rumania Coast'),
}
helper = AdjustmentHelper(
[
PlayerHelper('Russia', [
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Moscow'),
AdjustmentCommandHelper(AdjustmentCommandType.CREATE, UnitTypes.TROOP, 'Moscow'),
]),
],
player_units=player_units,
)
results = helper.resolve()
expected_results = vanilla_dip.generate_starting_player_units()
expected_results['Russia'] = {
Unit(UnitTypes.FLEET, 'Sweden Coast'),
Unit(UnitTypes.TROOP, 'Warsaw'),
Unit(UnitTypes.TROOP, 'Ukraine'),
Unit(UnitTypes.FLEET, 'Rumania Coast'),
Unit(UnitTypes.TROOP, 'Moscow'),
}
assert results == expected_results
| 36.486486 | 101 | 0.647196 | 917 | 9,450 | 6.437296 | 0.14831 | 0.09131 | 0.08538 | 0.067085 | 0.875148 | 0.843131 | 0.836693 | 0.808233 | 0.790446 | 0.790446 | 0 | 0.001272 | 0.251111 | 9,450 | 258 | 102 | 36.627907 | 0.832839 | 0.107937 | 0 | 0.691176 | 0 | 0 | 0.102004 | 0 | 0 | 0 | 0 | 0 | 0.044118 | 1 | 0.044118 | false | 0 | 0.034314 | 0 | 0.078431 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d7e05e250a197cd81be752155feefe93453a6516 | 212 | py | Python | lib/__init__.py | cca/libraries_course_lists2 | 5dd68101f8267d82827e21a3ae46e3acf4ad21b0 | [
"ECL-2.0"
] | null | null | null | lib/__init__.py | cca/libraries_course_lists2 | 5dd68101f8267d82827e21a3ae46e3acf4ad21b0 | [
"ECL-2.0"
] | 8 | 2019-08-29T20:18:34.000Z | 2021-02-26T15:50:17.000Z | lib/__init__.py | cca/libraries_course_lists2 | 5dd68101f8267d82827e21a3ae46e3acf4ad21b0 | [
"ECL-2.0"
] | null | null | null | # import everything from all sub-modules
from .add_to_taxos import *
from .course import *
from .get_groups import *
from .get_taxos import *
from .group import *
from .taxonomy import *
from .utilities import *
| 23.555556 | 40 | 0.764151 | 31 | 212 | 5.096774 | 0.483871 | 0.379747 | 0.189873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.160377 | 212 | 8 | 41 | 26.5 | 0.88764 | 0.179245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0bc5c6786c2856cb15abd6fa37ba257b20de3a0d | 195 | py | Python | server/group_chat_app/admin.py | AzamatKomaev/social_site | 4db54c593e6315aa9a767b4db75f43a0cd7f2683 | [
"MIT"
] | 2 | 2021-11-16T04:46:47.000Z | 2022-02-15T12:06:08.000Z | server/group_chat_app/admin.py | AzamatKomaev/social_site | 4db54c593e6315aa9a767b4db75f43a0cd7f2683 | [
"MIT"
] | null | null | null | server/group_chat_app/admin.py | AzamatKomaev/social_site | 4db54c593e6315aa9a767b4db75f43a0cd7f2683 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import *
admin.site.register(GroupChat)
admin.site.register(GroupChatRole)
admin.site.register(GroupMessage)
admin.site.register(GroupChatRequest)
| 21.666667 | 37 | 0.830769 | 24 | 195 | 6.75 | 0.5 | 0.222222 | 0.419753 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071795 | 195 | 8 | 38 | 24.375 | 0.895028 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
0405c0998f94cd10f9796d2d6aa30f9c50a138da | 47 | py | Python | library/src/test/unit/detectors/__init__.py | unSAD-admin/unSAD | 9f1d0e680a0086d140bc8d1c55fe21dd7de87df5 | [
"Apache-2.0"
] | 3 | 2019-11-01T04:51:51.000Z | 2019-12-17T04:25:18.000Z | library/src/test/unit/utils/__init__.py | unSAD-admin/unSAD | 9f1d0e680a0086d140bc8d1c55fe21dd7de87df5 | [
"Apache-2.0"
] | 1 | 2019-11-11T18:29:36.000Z | 2019-11-11T18:29:36.000Z | library/src/test/unit/detectors/__init__.py | unSAD-admin/unSAD | 9f1d0e680a0086d140bc8d1c55fe21dd7de87df5 | [
"Apache-2.0"
] | 2 | 2019-12-18T11:49:00.000Z | 2020-03-27T20:06:15.000Z | # Created by Xinyu Zhu on 10/22/2019, 12:47 PM
| 23.5 | 46 | 0.702128 | 11 | 47 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.315789 | 0.191489 | 47 | 1 | 47 | 47 | 0.552632 | 0.93617 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
041272fc416c45a24d28406f95ac9af3c82f6ca5 | 53,330 | py | Python | test_autogalaxy/unit/profiles/mass_profiles/test_total_mass_profiles.py | jonathanfrawley/PyAutoGalaxy_copy | 1cedbfdcf65020538128163f7d8a7f8e169646e0 | [
"MIT"
] | null | null | null | test_autogalaxy/unit/profiles/mass_profiles/test_total_mass_profiles.py | jonathanfrawley/PyAutoGalaxy_copy | 1cedbfdcf65020538128163f7d8a7f8e169646e0 | [
"MIT"
] | null | null | null | test_autogalaxy/unit/profiles/mass_profiles/test_total_mass_profiles.py | jonathanfrawley/PyAutoGalaxy_copy | 1cedbfdcf65020538128163f7d8a7f8e169646e0 | [
"MIT"
] | null | null | null | import os
from autoconf import conf
import autogalaxy as ag
import numpy as np
import pytest
grid = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [2.0, 4.0]])
class TestPointMass:
def test__deflections__correct_values(self):
# The radial coordinate at (1.0, 1.0) is sqrt(2)
# This is decomposed into (y,x) angles of sin(45) = cos(45) = sqrt(2) / 2.0
# Thus, for an EinR of 1.0, the deflection angle is (1.0 / sqrt(2)) * (sqrt(2) / 2.0)
point_mass = ag.mp.PointMass(centre=(0.0, 0.0), einstein_radius=1.0)
deflections = point_mass.deflections_from_grid(grid=np.array([[1.0, 1.0]]))
assert deflections[0, 0] == pytest.approx(0.5, 1e-3)
assert deflections[0, 1] == pytest.approx(0.5, 1e-3)
point_mass = ag.mp.PointMass(centre=(0.0, 0.0), einstein_radius=2.0)
deflections = point_mass.deflections_from_grid(grid=np.array([[1.0, 1.0]]))
assert deflections[0, 0] == pytest.approx(2.0, 1e-3)
assert deflections[0, 1] == pytest.approx(2.0, 1e-3)
point_mass = ag.mp.PointMass(centre=(0.0, 0.0), einstein_radius=1.0)
deflections = point_mass.deflections_from_grid(grid=np.array([[2.0, 2.0]]))
assert deflections[0, 0] == pytest.approx(0.25, 1e-3)
assert deflections[0, 1] == pytest.approx(0.25, 1e-3)
point_mass = ag.mp.PointMass(centre=(0.0, 0.0), einstein_radius=1.0)
deflections = point_mass.deflections_from_grid(grid=np.array([[2.0, 1.0]]))
assert deflections[0, 0] == pytest.approx(0.4, 1e-3)
assert deflections[0, 1] == pytest.approx(0.2, 1e-3)
point_mass = ag.mp.PointMass(centre=(0.0, 0.0), einstein_radius=2.0)
deflections = point_mass.deflections_from_grid(grid=np.array([[4.0, 9.0]]))
assert deflections[0, 0] == pytest.approx(16.0 / 97.0, 1e-3)
assert deflections[0, 1] == pytest.approx(36.0 / 97.0, 1e-3)
point_mass = ag.mp.PointMass(centre=(1.0, 2.0), einstein_radius=1.0)
deflections = point_mass.deflections_from_grid(grid=np.array([[2.0, 3.0]]))
assert deflections[0, 0] == pytest.approx(0.5, 1e-3)
assert deflections[0, 1] == pytest.approx(0.5, 1e-3)
def test__deflections__change_geometry(self):
point_mass_0 = ag.mp.PointMass(centre=(0.0, 0.0))
point_mass_1 = ag.mp.PointMass(centre=(1.0, 1.0))
deflections_0 = point_mass_0.deflections_from_grid(grid=np.array([[1.0, 1.0]]))
deflections_1 = point_mass_1.deflections_from_grid(grid=np.array([[0.0, 0.0]]))
assert deflections_0[0, 0] == pytest.approx(-deflections_1[0, 0], 1e-5)
assert deflections_0[0, 1] == pytest.approx(-deflections_1[0, 1], 1e-5)
point_mass_0 = ag.mp.PointMass(centre=(0.0, 0.0))
point_mass_1 = ag.mp.PointMass(centre=(0.0, 0.0))
deflections_0 = point_mass_0.deflections_from_grid(grid=np.array([[1.0, 0.0]]))
deflections_1 = point_mass_1.deflections_from_grid(grid=np.array([[0.0, 1.0]]))
assert deflections_0[0, 0] == pytest.approx(deflections_1[0, 1], 1e-5)
assert deflections_0[0, 1] == pytest.approx(deflections_1[0, 0], 1e-5)
def test__multiple_coordinates_in__multiple_coordinates_out(self):
point_mass = ag.mp.PointMass(centre=(1.0, 2.0), einstein_radius=1.0)
deflections = point_mass.deflections_from_grid(
grid=np.array([[2.0, 3.0], [2.0, 3.0], [2.0, 3.0]])
)
assert deflections[0, 0] == pytest.approx(0.5, 1e-3)
assert deflections[0, 1] == pytest.approx(0.5, 1e-3)
assert deflections[1, 0] == pytest.approx(0.5, 1e-3)
assert deflections[1, 1] == pytest.approx(0.5, 1e-3)
assert deflections[2, 0] == pytest.approx(0.5, 1e-3)
assert deflections[2, 1] == pytest.approx(0.5, 1e-3)
point_mass = ag.mp.PointMass(centre=(0.0, 0.0), einstein_radius=1.0)
deflections = point_mass.deflections_from_grid(
grid=np.array([[1.0, 1.0], [2.0, 2.0], [1.0, 1.0], [2.0, 2.0]])
)
assert deflections[0, 0] == pytest.approx(0.5, 1e-3)
assert deflections[0, 1] == pytest.approx(0.5, 1e-3)
assert deflections[1, 0] == pytest.approx(0.25, 1e-3)
assert deflections[1, 1] == pytest.approx(0.25, 1e-3)
assert deflections[2, 0] == pytest.approx(0.5, 1e-3)
assert deflections[2, 1] == pytest.approx(0.5, 1e-3)
assert deflections[3, 0] == pytest.approx(0.25, 1e-3)
assert deflections[3, 1] == pytest.approx(0.25, 1e-3)
def test__output_are_autoarrays(self):
grid = ag.Grid2D.uniform(shape_native=(2, 2), pixel_scales=1.0, sub_size=1)
point_mass = ag.mp.PointMass()
deflections = point_mass.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
class TestBrokenPowerLaw:
def test__convergence_correct_values(self):
broken_power_law = ag.mp.SphericalBrokenPowerLaw(
centre=(0, 0),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
convergence = broken_power_law.convergence_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert convergence == pytest.approx(0.0355237, 1e-4)
convergence = broken_power_law.convergence_from_grid(
grid=np.array([[0.5, 1.0], [0.5, 1.0]])
)
assert convergence == pytest.approx([0.0355237, 0.0355237], 1e-4)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(0.096225, 0.055555),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
convergence = broken_power_law.convergence_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert convergence == pytest.approx(0.05006035, 1e-4)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(-0.113433, 0.135184),
einstein_radius=1.0,
inner_slope=1.8,
outer_slope=2.2,
break_radius=0.1,
)
convergence = broken_power_law.convergence_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert convergence == pytest.approx(0.034768, 1e-4)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(0.113433, -0.135184),
einstein_radius=1.0,
inner_slope=1.8,
outer_slope=2.2,
break_radius=0.1,
)
convergence = broken_power_law.convergence_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert convergence == pytest.approx(0.03622852, 1e-4)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(-0.173789, -0.030643),
einstein_radius=1.0,
inner_slope=1.8,
outer_slope=2.2,
break_radius=0.1,
)
convergence = broken_power_law.convergence_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert convergence == pytest.approx(0.026469, 1e-4)
def test__deflections__correct_values(self):
broken_power_law = ag.mp.SphericalBrokenPowerLaw(
centre=(0, 0),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert deflections[0, 0] == pytest.approx(0.404076, 1e-3)
assert deflections[0, 1] == pytest.approx(0.808152, 1e-3)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0], [0.5, 1.0]])
)
assert deflections[0, 0] == pytest.approx(0.404076, 1e-3)
assert deflections[0, 1] == pytest.approx(0.808152, 1e-3)
assert deflections[1, 0] == pytest.approx(0.404076, 1e-3)
assert deflections[1, 1] == pytest.approx(0.808152, 1e-3)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(0.096225, 0.055555),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert deflections[0, 0] == pytest.approx(0.40392, 1e-3)
assert deflections[0, 1] == pytest.approx(0.811619, 1e-3)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(-0.07142, -0.085116),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert deflections[0, 0] == pytest.approx(0.4005338, 1e-3)
assert deflections[0, 1] == pytest.approx(0.8067221, 1e-3)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(0.109423, 0.019294),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert deflections[0, 0] == pytest.approx(0.399651, 1e-3)
assert deflections[0, 1] == pytest.approx(0.813372, 1e-3)
broken_power_law = ag.mp.EllipticalBrokenPowerLaw(
centre=(0, 0),
elliptical_comps=(-0.216506, -0.125),
einstein_radius=1.0,
inner_slope=1.5,
outer_slope=2.5,
break_radius=0.1,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
assert deflections[0, 0] == pytest.approx(0.402629, 1e-3)
assert deflections[0, 1] == pytest.approx(0.798795, 1e-3)
def test__convergence__change_geometry(self):
broken_power_law_0 = ag.mp.SphericalBrokenPowerLaw(centre=(0.0, 0.0))
broken_power_law_1 = ag.mp.SphericalBrokenPowerLaw(centre=(1.0, 1.0))
convergence_0 = broken_power_law_0.convergence_from_grid(
grid=np.array([[1.0, 1.0]])
)
convergence_1 = broken_power_law_1.convergence_from_grid(
grid=np.array([[0.0, 0.0]])
)
assert convergence_0 == convergence_1
broken_power_law_0 = ag.mp.SphericalBrokenPowerLaw(centre=(0.0, 0.0))
broken_power_law_1 = ag.mp.SphericalBrokenPowerLaw(centre=(0.0, 0.0))
convergence_0 = broken_power_law_0.convergence_from_grid(
grid=np.array([[1.0, 0.0]])
)
convergence_1 = broken_power_law_1.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence_0 == convergence_1
broken_power_law_0 = ag.mp.EllipticalBrokenPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.111111)
)
broken_power_law_1 = ag.mp.EllipticalBrokenPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, -0.111111)
)
convergence_0 = broken_power_law_0.convergence_from_grid(
grid=np.array([[1.0, 0.0]])
)
convergence_1 = broken_power_law_1.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence_0 == convergence_1
def test__deflections__change_geometry(self):
broken_power_law_0 = ag.mp.SphericalBrokenPowerLaw(centre=(0.0, 0.0))
broken_power_law_1 = ag.mp.SphericalBrokenPowerLaw(centre=(1.0, 1.0))
deflections_0 = broken_power_law_0.deflections_from_grid(
grid=np.array([[1.0, 1.0]])
)
deflections_1 = broken_power_law_1.deflections_from_grid(
grid=np.array([[0.0, 0.0]])
)
assert deflections_0[0, 0] == pytest.approx(-deflections_1[0, 0], 1e-5)
assert deflections_0[0, 1] == pytest.approx(-deflections_1[0, 1], 1e-5)
broken_power_law_0 = ag.mp.SphericalBrokenPowerLaw(centre=(0.0, 0.0))
broken_power_law_1 = ag.mp.SphericalBrokenPowerLaw(centre=(0.0, 0.0))
deflections_0 = broken_power_law_0.deflections_from_grid(
grid=np.array([[1.0, 0.0]])
)
deflections_1 = broken_power_law_1.deflections_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert deflections_0[0, 0] == pytest.approx(deflections_1[0, 1], 1e-5)
assert deflections_0[0, 1] == pytest.approx(deflections_1[0, 0], 1e-5)
broken_power_law_0 = ag.mp.EllipticalBrokenPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.111111)
)
broken_power_law_1 = ag.mp.EllipticalBrokenPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, -0.111111)
)
deflections_0 = broken_power_law_0.deflections_from_grid(
grid=np.array([[1.0, 0.0]])
)
deflections_1 = broken_power_law_1.deflections_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert deflections_0[0, 0] == pytest.approx(deflections_1[0, 1], 1e-5)
assert deflections_0[0, 1] == pytest.approx(deflections_1[0, 0], 1e-5)
def test__deflections__compare_to_power_law(self):
broken_power_law = ag.mp.SphericalBrokenPowerLaw(
centre=(0, 0),
einstein_radius=2.0,
inner_slope=1.999,
outer_slope=2.0001,
break_radius=0.0001,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
# Use of ratio avoids normalization definition difference effects
broken_yx_ratio = deflections[0, 0] / deflections[0, 1]
power_law = ag.mp.SphericalPowerLaw(
centre=(0, 0), einstein_radius=2.0, slope=2.0
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.5, 1.0]]))
power_law_yx_ratio = deflections[0, 0] / deflections[0, 1]
assert broken_yx_ratio == pytest.approx(power_law_yx_ratio, 1.0e-4)
broken_power_law = ag.mp.SphericalBrokenPowerLaw(
centre=(0, 0),
einstein_radius=2.0,
inner_slope=2.399,
outer_slope=2.4001,
break_radius=0.0001,
)
deflections = broken_power_law.deflections_from_grid(
grid=np.array([[0.5, 1.0]])
)
# Use of ratio avoids normalization difference effects
broken_yx_ratio = deflections[0, 0] / deflections[0, 1]
power_law = ag.mp.SphericalPowerLaw(
centre=(0, 0), einstein_radius=2.0, slope=2.4
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.5, 1.0]]))
power_law_yx_ratio = deflections[0, 0] / deflections[0, 1]
assert broken_yx_ratio == pytest.approx(power_law_yx_ratio, 1.0e-4)
def test__output_are_autoarrays(self):
grid = ag.Grid2D.uniform(shape_native=(2, 2), pixel_scales=1.0, sub_size=1)
cored_power_law = ag.mp.EllipticalBrokenPowerLaw()
convergence = cored_power_law.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
deflections = cored_power_law.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
cored_power_law = ag.mp.SphericalBrokenPowerLaw()
convergence = cored_power_law.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
deflections = cored_power_law.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
class TestCoredPowerLaw:
def test__convergence_correct_values(self):
cored_power_law = ag.mp.SphericalCoredPowerLaw(
centre=(1, 1), einstein_radius=1.0, slope=2.2, core_radius=0.1
)
convergence = cored_power_law.convergence_func(grid_radius=1.0)
assert convergence == pytest.approx(0.39762, 1e-4)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
slope=2.3,
core_radius=0.2,
)
convergence = cored_power_law.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(0.45492, 1e-3)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=2.0,
slope=1.7,
core_radius=0.2,
)
convergence = cored_power_law.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(1.3887, 1e-3)
def test__potential_correct_values(self):
power_law = ag.mp.SphericalCoredPowerLaw(
centre=(-0.7, 0.5), einstein_radius=1.0, slope=1.8, core_radius=0.2
)
potential = power_law.potential_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert potential == pytest.approx(0.54913, 1e-3)
power_law = ag.mp.SphericalCoredPowerLaw(
centre=(0.2, -0.2), einstein_radius=0.5, slope=2.4, core_radius=0.5
)
potential = power_law.potential_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert potential == pytest.approx(0.01820, 1e-3)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.2, -0.2),
elliptical_comps=(-0.216506, -0.125),
einstein_radius=0.5,
slope=2.4,
core_radius=0.5,
)
potential = cored_power_law.potential_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert potential == pytest.approx(0.02319, 1e-3)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
slope=1.8,
core_radius=0.2,
)
potential = cored_power_law.potential_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert potential == pytest.approx(0.71185, 1e-3)
def test__deflections__correct_values(self):
power_law = ag.mp.SphericalCoredPowerLaw(
centre=(-0.7, 0.5), einstein_radius=1.0, slope=1.8, core_radius=0.2
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert deflections[0, 0] == pytest.approx(0.80677, 1e-3)
assert deflections[0, 1] == pytest.approx(-0.30680, 1e-3)
power_law = ag.mp.SphericalCoredPowerLaw(
centre=(0.2, -0.2), einstein_radius=0.5, slope=2.4, core_radius=0.5
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert deflections[0, 0] == pytest.approx(-0.00321, 1e-3)
assert deflections[0, 1] == pytest.approx(0.09316, 1e-3)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
slope=1.8,
core_radius=0.2,
)
deflections = cored_power_law.deflections_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.9869, 1e-3)
assert deflections[0, 1] == pytest.approx(-0.54882, 1e-3)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.2, -0.2),
elliptical_comps=(-0.216506, -0.125),
einstein_radius=0.5,
slope=2.4,
core_radius=0.5,
)
deflections = cored_power_law.deflections_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.01111, 1e-3)
assert deflections[0, 1] == pytest.approx(0.11403, 1e-3)
def test__convergence__change_geometry(self):
cored_power_law_0 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
cored_power_law_1 = ag.mp.SphericalCoredPowerLaw(centre=(1.0, 1.0))
convergence_0 = cored_power_law_0.convergence_from_grid(
grid=np.array([[1.0, 1.0]])
)
convergence_1 = cored_power_law_1.convergence_from_grid(
grid=np.array([[0.0, 0.0]])
)
assert convergence_0 == convergence_1
cored_power_law_0 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
cored_power_law_1 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
convergence_0 = cored_power_law_0.convergence_from_grid(
grid=np.array([[1.0, 0.0]])
)
convergence_1 = cored_power_law_1.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence_0 == convergence_1
cored_power_law_0 = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.111111)
)
cored_power_law_1 = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, -0.111111)
)
convergence_0 = cored_power_law_0.convergence_from_grid(
grid=np.array([[1.0, 0.0]])
)
convergence_1 = cored_power_law_1.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence_0 == convergence_1
def test__potential__change_geometry(self):
cored_power_law_0 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
cored_power_law_1 = ag.mp.SphericalCoredPowerLaw(centre=(1.0, 1.0))
potential_0 = cored_power_law_0.potential_from_grid(grid=np.array([[1.0, 1.0]]))
potential_1 = cored_power_law_1.potential_from_grid(grid=np.array([[0.0, 0.0]]))
assert potential_0 == potential_1
cored_power_law_0 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
cored_power_law_1 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
potential_0 = cored_power_law_0.potential_from_grid(grid=np.array([[1.0, 0.0]]))
potential_1 = cored_power_law_1.potential_from_grid(grid=np.array([[0.0, 1.0]]))
assert potential_0 == potential_1
cored_power_law_0 = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.111111)
)
cored_power_law_1 = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, -0.111111)
)
potential_0 = cored_power_law_0.potential_from_grid(grid=np.array([[1.0, 0.0]]))
potential_1 = cored_power_law_1.potential_from_grid(grid=np.array([[0.0, 1.0]]))
assert potential_0 == potential_1
def test__deflections__change_geometry(self):
cored_power_law_0 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
cored_power_law_1 = ag.mp.SphericalCoredPowerLaw(centre=(1.0, 1.0))
deflections_0 = cored_power_law_0.deflections_from_grid(
grid=np.array([[1.0, 1.0]])
)
deflections_1 = cored_power_law_1.deflections_from_grid(
grid=np.array([[0.0, 0.0]])
)
assert deflections_0[0, 0] == pytest.approx(-deflections_1[0, 0], 1e-5)
assert deflections_0[0, 1] == pytest.approx(-deflections_1[0, 1], 1e-5)
cored_power_law_0 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
cored_power_law_1 = ag.mp.SphericalCoredPowerLaw(centre=(0.0, 0.0))
deflections_0 = cored_power_law_0.deflections_from_grid(
grid=np.array([[1.0, 0.0]])
)
deflections_1 = cored_power_law_1.deflections_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert deflections_0[0, 0] == pytest.approx(deflections_1[0, 1], 1e-5)
assert deflections_0[0, 1] == pytest.approx(deflections_1[0, 0], 1e-5)
cored_power_law_0 = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.111111)
)
cored_power_law_1 = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0), elliptical_comps=(0.0, -0.111111)
)
deflections_0 = cored_power_law_0.deflections_from_grid(
grid=np.array([[1.0, 0.0]])
)
deflections_1 = cored_power_law_1.deflections_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert deflections_0[0, 0] == pytest.approx(deflections_1[0, 1], 1e-5)
assert deflections_0[0, 1] == pytest.approx(deflections_1[0, 0], 1e-5)
def test__spherical_and_elliptical_match(self):
elliptical = ag.mp.EllipticalCoredPowerLaw(
centre=(1.1, 1.1),
elliptical_comps=(0.0, 0.0),
einstein_radius=3.0,
slope=2.2,
core_radius=0.1,
)
spherical = ag.mp.SphericalCoredPowerLaw(
centre=(1.1, 1.1), einstein_radius=3.0, slope=2.2, core_radius=0.1
)
assert elliptical.convergence_from_grid(grid=grid) == pytest.approx(
spherical.convergence_from_grid(grid=grid), 1e-4
)
assert elliptical.potential_from_grid(grid=grid) == pytest.approx(
spherical.potential_from_grid(grid=grid), 1e-4
)
assert elliptical.deflections_from_grid(grid=grid) == pytest.approx(
spherical.deflections_from_grid(grid=grid), 1e-4
)
def test__output_are_autoarrays(self):
grid = ag.Grid2D.uniform(shape_native=(2, 2), pixel_scales=1.0, sub_size=1)
cored_power_law = ag.mp.EllipticalCoredPowerLaw()
convergence = cored_power_law.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = cored_power_law.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = cored_power_law.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
cored_power_law = ag.mp.SphericalCoredPowerLaw()
convergence = cored_power_law.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = cored_power_law.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = cored_power_law.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
class TestPowerLaw:
def test__convergence_correct_values(self):
power_law = ag.mp.SphericalPowerLaw(
centre=(0.0, 0.0), einstein_radius=1.0, slope=2.0
)
convergence = power_law.convergence_from_grid(grid=np.array([[1.0, 0.0]]))
assert convergence == pytest.approx(0.5, 1e-3)
power_law = ag.mp.SphericalPowerLaw(
centre=(0.0, 0.0), einstein_radius=2.0, slope=2.2
)
convergence = power_law.convergence_from_grid(grid=np.array([[2.0, 0.0]]))
assert convergence == pytest.approx(0.4, 1e-3)
power_law = ag.mp.SphericalPowerLaw(
centre=(0.0, 0.0), einstein_radius=2.0, slope=2.2
)
convergence = power_law.convergence_from_grid(grid=np.array([[2.0, 0.0]]))
assert convergence == pytest.approx(0.4, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
slope=2.3,
)
convergence = power_law.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(0.466666, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=2.0,
slope=1.7,
)
convergence = power_law.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(1.4079, 1e-3)
def test__potential_correct_values(self):
power_law = ag.mp.SphericalPowerLaw(
centre=(-0.7, 0.5), einstein_radius=1.3, slope=2.3
)
potential = power_law.potential_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert potential == pytest.approx(1.90421, 1e-3)
power_law = ag.mp.SphericalPowerLaw(
centre=(-0.7, 0.5), einstein_radius=1.3, slope=1.8
)
potential = power_law.potential_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert potential == pytest.approx(0.93758, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
slope=2.2,
)
potential = power_law.potential_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert potential == pytest.approx(1.53341, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
slope=1.8,
)
potential = power_law.potential_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert potential == pytest.approx(0.96723, 1e-3)
def test__deflections__correct_values(self):
power_law = ag.mp.SphericalPowerLaw(
centre=(0.2, 0.2), einstein_radius=1.0, slope=2.0
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert deflections[0, 0] == pytest.approx(-0.31622, 1e-3)
assert deflections[0, 1] == pytest.approx(-0.94868, 1e-3)
power_law = ag.mp.SphericalPowerLaw(
centre=(0.2, 0.2), einstein_radius=1.0, slope=2.5
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert deflections[0, 0] == pytest.approx(-1.59054, 1e-3)
assert deflections[0, 1] == pytest.approx(-4.77162, 1e-3)
power_law = ag.mp.SphericalPowerLaw(
centre=(0.2, 0.2), einstein_radius=1.0, slope=1.5
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert deflections[0, 0] == pytest.approx(-0.06287, 1e-3)
assert deflections[0, 1] == pytest.approx(-0.18861, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(0, 0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
slope=2.0,
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert deflections[0, 0] == pytest.approx(0.79421, 1e-3)
assert deflections[0, 1] == pytest.approx(0.50734, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(0, 0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
slope=2.5,
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert deflections[0, 0] == pytest.approx(1.29641, 1e-3)
assert deflections[0, 1] == pytest.approx(0.99629, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(0, 0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
slope=1.5,
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert deflections[0, 0] == pytest.approx(0.48036, 1e-3)
assert deflections[0, 1] == pytest.approx(0.26729, 1e-3)
power_law = ag.mp.EllipticalPowerLaw(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
slope=1.9,
)
deflections = power_law.deflections_from_grid(grid=np.array([[0.1625, 0.1625]]))
# assert deflections[0, 0] == pytest.approx(1.12841, 1e-3)
# assert deflections[0, 1] == pytest.approx(-0.60205, 1e-3)
def test__compare_to_cored_power_law(self):
power_law = ag.mp.EllipticalPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.333333, 0.0),
einstein_radius=1.0,
slope=2.3,
)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.333333, 0.0),
einstein_radius=1.0,
slope=2.3,
core_radius=0.0,
)
assert power_law.potential_from_grid(grid=grid) == pytest.approx(
cored_power_law.potential_from_grid(grid=grid), 1e-3
)
assert power_law.potential_from_grid(grid=grid) == pytest.approx(
cored_power_law.potential_from_grid(grid=grid), 1e-3
)
assert power_law.deflections_from_grid(grid=grid) == pytest.approx(
cored_power_law.deflections_from_grid(grid=grid), 1e-3
)
assert power_law.deflections_from_grid(grid=grid) == pytest.approx(
cored_power_law.deflections_from_grid(grid=grid), 1e-3
)
def test__spherical_and_elliptical_match(self):
elliptical = ag.mp.EllipticalPowerLaw(
centre=(1.1, 1.1),
elliptical_comps=(0.0, 0.0),
einstein_radius=3.0,
slope=2.4,
)
spherical = ag.mp.SphericalPowerLaw(
centre=(1.1, 1.1), einstein_radius=3.0, slope=2.4
)
assert elliptical.convergence_from_grid(grid=grid) == pytest.approx(
spherical.convergence_from_grid(grid=grid), 1e-4
)
assert elliptical.potential_from_grid(grid=grid) == pytest.approx(
spherical.potential_from_grid(grid=grid), 1e-4
)
assert elliptical.deflections_from_grid(grid=grid) == pytest.approx(
spherical.deflections_from_grid(grid=grid), 1e-4
)
def test__output_are_autoarrays(self):
grid = ag.Grid2D.uniform(shape_native=(2, 2), pixel_scales=1.0, sub_size=1)
power_law = ag.mp.EllipticalPowerLaw()
convergence = power_law.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = power_law.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = power_law.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
power_law = ag.mp.SphericalPowerLaw()
convergence = power_law.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = power_law.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = power_law.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
class TestCoredIsothermal:
def test__convergence_correct_values(self):
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(1, 1), einstein_radius=1.0, core_radius=0.1
)
convergence = cored_isothermal.convergence_func(grid_radius=1.0)
assert convergence == pytest.approx(0.49752, 1e-4)
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(1, 1), einstein_radius=1.0, core_radius=0.1
)
convergence = cored_isothermal.convergence_func(grid_radius=1.0)
assert convergence == pytest.approx(0.49752, 1e-4)
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(0.0, 0.0), einstein_radius=1.0, core_radius=0.2
)
convergence = cored_isothermal.convergence_from_grid(
grid=np.array([[1.0, 0.0]])
)
assert convergence == pytest.approx(0.49029, 1e-3)
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(0.0, 0.0), einstein_radius=2.0, core_radius=0.2
)
convergence = cored_isothermal.convergence_from_grid(
grid=np.array([[1.0, 0.0]])
)
assert convergence == pytest.approx(2.0 * 0.49029, 1e-3)
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(0.0, 0.0), einstein_radius=1.0, core_radius=0.2
)
convergence = cored_isothermal.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence == pytest.approx(0.49029, 1e-3)
# axis ratio changes only einstein_rescaled, so wwe can use the above value and times by 1.0/1.5.
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
core_radius=0.2,
)
convergence = cored_isothermal.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence == pytest.approx(0.49029 * 1.33333, 1e-3)
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.0),
einstein_radius=2.0,
core_radius=0.2,
)
convergence = cored_isothermal.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence == pytest.approx(2.0 * 0.49029, 1e-3)
# for axis_ratio = 1.0, the factor is 1/2
# for axis_ratio = 0.5, the factor is 1/(1.5)
# So the change in the value is 0.5 / (1/1.5) = 1.0 / 0.75
# axis ratio changes only einstein_rescaled, so wwe can use the above value and times by 1.0/1.5.
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(0.0, 0.0),
elliptical_comps=(0.0, 0.333333),
einstein_radius=1.0,
core_radius=0.2,
)
convergence = cored_isothermal.convergence_from_grid(
grid=np.array([[0.0, 1.0]])
)
assert convergence == pytest.approx((1.0 / 0.75) * 0.49029, 1e-3)
def test__potential__correct_values(self):
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(-0.7, 0.5), einstein_radius=1.3, core_radius=0.2
)
potential = cored_isothermal.potential_from_grid(
grid=np.array([[0.1875, 0.1625]])
)
assert potential == pytest.approx(0.72231, 1e-3)
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(0.2, -0.2), einstein_radius=0.5, core_radius=0.5
)
potential = cored_isothermal.potential_from_grid(
grid=np.array([[0.1875, 0.1625]])
)
assert potential == pytest.approx(0.03103, 1e-3)
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
core_radius=0.2,
)
potential = cored_isothermal.potential_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert potential == pytest.approx(0.74354, 1e-3)
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(0.2, -0.2),
elliptical_comps=(-0.216506, -0.125),
einstein_radius=0.5,
core_radius=0.5,
)
potential = cored_isothermal.potential_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert potential == pytest.approx(0.04024, 1e-3)
def test__deflections__correct_values(self):
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(-0.7, 0.5), einstein_radius=1.3, core_radius=0.2
)
deflections = cored_isothermal.deflections_from_grid(
grid=np.array([[0.1875, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.98582, 1e-3)
assert deflections[0, 1] == pytest.approx(-0.37489, 1e-3)
cored_isothermal = ag.mp.SphericalCoredIsothermal(
centre=(0.2, -0.2), einstein_radius=0.5, core_radius=0.5
)
deflections = cored_isothermal.deflections_from_grid(
grid=np.array([[0.1875, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(-0.00559, 1e-3)
assert deflections[0, 1] == pytest.approx(0.16216, 1e-3)
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
core_radius=0.2,
)
deflections = cored_isothermal.deflections_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.95429, 1e-3)
assert deflections[0, 1] == pytest.approx(-0.52047, 1e-3)
cored_isothermal = ag.mp.EllipticalCoredIsothermal(
centre=(0.2, -0.2),
elliptical_comps=(-0.216506, -0.125),
einstein_radius=0.5,
core_radius=0.5,
)
deflections = cored_isothermal.deflections_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.02097, 1e-3)
assert deflections[0, 1] == pytest.approx(0.20500, 1e-3)
def test__compare_to_cored_power_law(self):
power_law = ag.mp.EllipticalCoredIsothermal(
centre=(0.0, 0.0),
elliptical_comps=(0.333333, 0.0),
einstein_radius=1.0,
core_radius=0.1,
)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.333333, 0.0),
einstein_radius=1.0,
slope=2.0,
core_radius=0.1,
)
assert power_law.potential_from_grid(grid=grid) == pytest.approx(
cored_power_law.potential_from_grid(grid=grid), 1e-3
)
assert power_law.potential_from_grid(grid=grid) == pytest.approx(
cored_power_law.potential_from_grid(grid=grid), 1e-3
)
assert power_law.deflections_from_grid(grid=grid) == pytest.approx(
cored_power_law.deflections_from_grid(grid=grid), 1e-3
)
assert power_law.deflections_from_grid(grid=grid) == pytest.approx(
cored_power_law.deflections_from_grid(grid=grid), 1e-3
)
def test__spherical_and_elliptical_match(self):
elliptical = ag.mp.EllipticalCoredIsothermal(
centre=(1.1, 1.1),
elliptical_comps=(0.0, 0.0),
einstein_radius=3.0,
core_radius=1.0,
)
spherical = ag.mp.SphericalCoredIsothermal(
centre=(1.1, 1.1), einstein_radius=3.0, core_radius=1.0
)
assert elliptical.convergence_from_grid(grid=grid) == pytest.approx(
spherical.convergence_from_grid(grid=grid), 1e-4
)
assert elliptical.potential_from_grid(grid=grid) == pytest.approx(
spherical.potential_from_grid(grid=grid), 1e-4
)
assert elliptical.deflections_from_grid(grid=grid) == pytest.approx(
spherical.deflections_from_grid(grid=grid), 1e-4
)
def test__output_are_autoarrays(self):
grid = ag.Grid2D.uniform(shape_native=(2, 2), pixel_scales=1.0, sub_size=1)
cored_isothermal = ag.mp.EllipticalCoredIsothermal()
convergence = cored_isothermal.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = cored_isothermal.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = cored_isothermal.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
cored_isothermal = ag.mp.SphericalCoredIsothermal()
convergence = cored_isothermal.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = cored_isothermal.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = cored_isothermal.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
class TestIsothermal:
def test__convergence__correct_values(self):
# eta = 1.0
# kappa = 0.5 * 1.0 ** 1.0
isothermal = ag.mp.SphericalIsothermal(centre=(0.0, 0.0), einstein_radius=2.0)
convergence = isothermal.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(0.5 * 2.0, 1e-3)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.0), einstein_radius=1.0
)
convergence = isothermal.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(0.5, 1e-3)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.0), einstein_radius=2.0
)
convergence = isothermal.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(0.5 * 2.0, 1e-3)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.333333), einstein_radius=1.0
)
convergence = isothermal.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
assert convergence == pytest.approx(0.66666, 1e-3)
def test__potential__correct_values(self):
isothermal = ag.mp.SphericalIsothermal(centre=(-0.7, 0.5), einstein_radius=1.3)
potential = isothermal.potential_from_grid(grid=np.array([[0.1875, 0.1625]]))
assert potential == pytest.approx(1.23435, 1e-3)
isothermal = ag.mp.EllipticalIsothermal(
centre=(-0.7, 0.5),
elliptical_comps=(0.152828, -0.088235),
einstein_radius=1.3,
)
potential = isothermal.potential_from_grid(grid=np.array([[0.1625, 0.1625]]))
assert potential == pytest.approx(1.19268, 1e-3)
def test__deflections__correct_values(self):
isothermal = ag.mp.SphericalIsothermal(centre=(-0.7, 0.5), einstein_radius=1.3)
deflections = isothermal.deflections_from_grid(
grid=np.array([[0.1875, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(1.21510, 1e-4)
assert deflections[0, 1] == pytest.approx(-0.46208, 1e-4)
isothermal = ag.mp.SphericalIsothermal(centre=(-0.1, 0.1), einstein_radius=5.0)
deflections = isothermal.deflections_from_grid(
grid=np.array([[0.1875, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(4.88588, 1e-4)
assert deflections[0, 1] == pytest.approx(1.06214, 1e-4)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0, 0), elliptical_comps=(0.0, 0.333333), einstein_radius=1.0
)
deflections = isothermal.deflections_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.79421, 1e-3)
assert deflections[0, 1] == pytest.approx(0.50734, 1e-3)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0, 0), elliptical_comps=(0.0, 0.333333), einstein_radius=1.0
)
deflections = isothermal.deflections_from_grid(
grid=np.array([[0.1625, 0.1625]])
)
assert deflections[0, 0] == pytest.approx(0.79421, 1e-3)
assert deflections[0, 1] == pytest.approx(0.50734, 1e-3)
def test__shear__correct_values(self):
isothermal = ag.mp.SphericalIsothermal(centre=(0.0, 0.0), einstein_radius=2.0)
convergence = isothermal.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
shear = isothermal.shear_from_grid(grid=np.array([[0.0, 1.0]]))
assert shear[0, 0] == pytest.approx(0.0, 1e-4)
assert shear[0, 1] == pytest.approx(-convergence, 1e-4)
convergence = isothermal.convergence_from_grid(grid=np.array([[2.0, 1.0]]))
shear = isothermal.shear_from_grid(grid=np.array([[2.0, 1.0]]))
assert shear[0, 0] == pytest.approx(-(4.0 / 5.0) * convergence, 1e-4)
assert shear[0, 1] == pytest.approx((3.0 / 5.0) * convergence, 1e-4)
convergence = isothermal.convergence_from_grid(grid=np.array([[3.0, 5.0]]))
shear = isothermal.shear_from_grid(grid=np.array([[3.0, 5.0]]))
assert shear[0, 0] == pytest.approx(-(30.0 / 34.0) * convergence, 1e-4)
assert shear[0, 1] == pytest.approx(-(16.0 / 34.0) * convergence, 1e-4)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0.0, 0.0), elliptical_comps=(0.0, 0.0), einstein_radius=2.0
)
convergence = isothermal.convergence_from_grid(grid=np.array([[0.0, 1.0]]))
shear = isothermal.shear_from_grid(grid=np.array([[0.0, 1.0]]))
assert shear[0, 0] == pytest.approx(0.0, 1e-4)
assert shear[0, 1] == pytest.approx(-convergence, 1e-4)
isothermal = ag.mp.EllipticalIsothermal(
centre=(0.0, 0.0), elliptical_comps=(0.3, 0.4), einstein_radius=2.0
)
shear = isothermal.shear_from_grid(grid=np.array([[0.0, 1.0]]))
assert shear[0, 0] == pytest.approx(0.35355, 1e-4)
assert shear[0, 1] == pytest.approx(-1.06066, 1e-4)
def test__compare_to_cored_power_law(self):
isothermal = ag.mp.EllipticalIsothermal(
centre=(0.0, 0.0), elliptical_comps=(0.333333, 0.0), einstein_radius=1.0
)
cored_power_law = ag.mp.EllipticalCoredPowerLaw(
centre=(0.0, 0.0),
elliptical_comps=(0.333333, 0.0),
einstein_radius=1.0,
core_radius=0.0,
)
assert isothermal.potential_from_grid(grid=grid) == pytest.approx(
cored_power_law.potential_from_grid(grid=grid), 1e-3
)
assert isothermal.potential_from_grid(grid=grid) == pytest.approx(
cored_power_law.potential_from_grid(grid=grid), 1e-3
)
assert isothermal.deflections_from_grid(grid=grid) == pytest.approx(
cored_power_law.deflections_from_grid(grid=grid), 1e-3
)
assert isothermal.deflections_from_grid(grid=grid) == pytest.approx(
cored_power_law.deflections_from_grid(grid=grid), 1e-3
)
def test__spherical_and_elliptical_match(self):
elliptical = ag.mp.EllipticalIsothermal(
centre=(1.1, 1.1), elliptical_comps=(0.0, 0.0), einstein_radius=3.0
)
spherical = ag.mp.SphericalIsothermal(centre=(1.1, 1.1), einstein_radius=3.0)
assert elliptical.convergence_from_grid(grid=grid) == pytest.approx(
spherical.convergence_from_grid(grid=grid), 1e-4
)
assert elliptical.potential_from_grid(grid=grid) == pytest.approx(
spherical.potential_from_grid(grid=grid), 1e-4
)
assert elliptical.deflections_from_grid(grid=grid) == pytest.approx(
spherical.deflections_from_grid(grid=grid), 1e-4
)
def test__output_are_autoarrays(self):
grid = ag.Grid2D.uniform(shape_native=(2, 2), pixel_scales=1.0, sub_size=1)
isothermal = ag.mp.EllipticalIsothermal()
convergence = isothermal.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = isothermal.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = isothermal.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
isothermal = ag.mp.SphericalIsothermal()
convergence = isothermal.convergence_from_grid(grid=grid)
assert convergence.shape_native == (2, 2)
potential = isothermal.potential_from_grid(grid=grid)
assert potential.shape_native == (2, 2)
deflections = isothermal.deflections_from_grid(grid=grid)
assert deflections.shape_native == (2, 2)
| 35.672241 | 106 | 0.592425 | 7,034 | 53,330 | 4.29201 | 0.034262 | 0.028817 | 0.019278 | 0.054256 | 0.958264 | 0.95005 | 0.938721 | 0.930639 | 0.917754 | 0.88006 | 0 | 0.096634 | 0.277574 | 53,330 | 1,494 | 107 | 35.696118 | 0.686973 | 0.015076 | 0 | 0.662868 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.195632 | 1 | 0.035138 | false | 0 | 0.004748 | 0 | 0.045584 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0425939c11a40d373f9ab1d4c3cdcbc6dbc612c4 | 45 | py | Python | integration_tests/test-packages/python/testpkgtres/testpkgtres/__init__.py | bburns632/doppel-cli | e0c708f565db0f558bca6f2fbe28a41d45121344 | [
"BSD-3-Clause"
] | null | null | null | integration_tests/test-packages/python/testpkgtres/testpkgtres/__init__.py | bburns632/doppel-cli | e0c708f565db0f558bca6f2fbe28a41d45121344 | [
"BSD-3-Clause"
] | null | null | null | integration_tests/test-packages/python/testpkgtres/testpkgtres/__init__.py | bburns632/doppel-cli | e0c708f565db0f558bca6f2fbe28a41d45121344 | [
"BSD-3-Clause"
] | null | null | null |
from testpkgtres.SomeClass import SomeClass
| 15 | 43 | 0.866667 | 5 | 45 | 7.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 45 | 2 | 44 | 22.5 | 0.975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
043099ef38731873ef67ebc120d2a99ac1062093 | 99 | py | Python | muak/layouts/__init__.py | ok65/muak | 885d2bca83b90a37c17df88c1aeb66b49958ae2d | [
"WTFPL"
] | null | null | null | muak/layouts/__init__.py | ok65/muak | 885d2bca83b90a37c17df88c1aeb66b49958ae2d | [
"WTFPL"
] | null | null | null | muak/layouts/__init__.py | ok65/muak | 885d2bca83b90a37c17df88c1aeb66b49958ae2d | [
"WTFPL"
] | null | null | null |
from muak.layouts.layout import Layout
from muak.layouts.horizontal_layout import HorizontalLayout | 33 | 59 | 0.878788 | 13 | 99 | 6.615385 | 0.538462 | 0.186047 | 0.348837 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080808 | 99 | 3 | 59 | 33 | 0.945055 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9bcd333dcf239c2e1887f18f8eddbbea6ffc0282 | 26 | py | Python | __init__.py | cmhc/cs | 75d413be045eabbdbd5842806f7ffdd5768138eb | [
"Apache-2.0"
] | 1 | 2020-02-17T17:50:57.000Z | 2020-02-17T17:50:57.000Z | __init__.py | cmhc/cs | 75d413be045eabbdbd5842806f7ffdd5768138eb | [
"Apache-2.0"
] | null | null | null | __init__.py | cmhc/cs | 75d413be045eabbdbd5842806f7ffdd5768138eb | [
"Apache-2.0"
] | null | null | null | #coding:utf8
import html
| 6.5 | 12 | 0.769231 | 4 | 26 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.153846 | 26 | 3 | 13 | 8.666667 | 0.863636 | 0.423077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9bdd535acfb0b66d1cbe6fc204708f8a29e6f764 | 208 | py | Python | main/context_processors.py | abhay-ranawat/playlistor | d1e7d0c5b7db736c1168c6650d15410865664038 | [
"MIT"
] | 206 | 2019-07-04T05:28:42.000Z | 2022-03-30T03:27:05.000Z | main/context_processors.py | abhay-ranawat/playlistor | d1e7d0c5b7db736c1168c6650d15410865664038 | [
"MIT"
] | 11 | 2019-10-18T21:52:53.000Z | 2021-11-30T09:45:29.000Z | main/context_processors.py | abhay-ranawat/playlistor | d1e7d0c5b7db736c1168c6650d15410865664038 | [
"MIT"
] | 16 | 2019-07-17T08:52:30.000Z | 2022-03-17T08:34:45.000Z | from django.conf import settings
from .utils import generate_auth_token
def default_context(request):
token = generate_auth_token()
return {"APPLE_DEVELOPER_TOKEN": token, "DEBUG": settings.DEBUG}
| 23.111111 | 68 | 0.774038 | 27 | 208 | 5.703704 | 0.62963 | 0.155844 | 0.220779 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139423 | 208 | 8 | 69 | 26 | 0.860335 | 0 | 0 | 0 | 1 | 0 | 0.125 | 0.100962 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9bdd6b2bf0ca45ca2bf1b4e30e669ef014b6fc6d | 8,154 | py | Python | test_autoarray/inversion/inversion/test_inversion_util.py | Jammy2211/PyAutoArray | 1fb9c84ca2a3333abedfbf96d070fc355e2628e4 | [
"MIT"
] | 5 | 2019-09-26T02:18:25.000Z | 2021-12-11T16:29:20.000Z | test_autoarray/inversion/inversion/test_inversion_util.py | Jammy2211/PyAutoArray | 1fb9c84ca2a3333abedfbf96d070fc355e2628e4 | [
"MIT"
] | 3 | 2020-03-30T14:25:57.000Z | 2021-12-21T17:10:55.000Z | test_autoarray/inversion/inversion/test_inversion_util.py | Jammy2211/PyAutoArray | 1fb9c84ca2a3333abedfbf96d070fc355e2628e4 | [
"MIT"
] | 4 | 2020-03-03T11:35:41.000Z | 2022-01-21T17:37:35.000Z | import autoarray as aa
import numpy as np
class TestCurvatureRegMatrix:
def test__uses_pixel_neighbors_to_add_matrices_correctly(self):
pixel_neighbors = np.array(
[
[1, 3, -1, -1],
[4, 2, 0, -1],
[1, 5, -1, -1],
[4, 6, 0, -1],
[7, 1, 5, 3],
[4, 2, 8, -1],
[7, 3, -1, -1],
[4, 8, 6, -1],
[7, 5, -1, -1],
]
)
pixel_neighbors_sizes = np.array([2, 3, 2, 3, 4, 3, 2, 3, 2])
regularization_matrix = aa.util.regularization.constant_regularization_matrix_from(
coefficient=1.0,
pixel_neighbors=pixel_neighbors,
pixel_neighbors_sizes=pixel_neighbors_sizes,
)
curvature_matrix = np.ones(regularization_matrix.shape)
curvature_reg_matrix = curvature_matrix + regularization_matrix
curvature_reg_matrix_util = aa.util.inversion.curvature_reg_matrix_from(
curvature_matrix=curvature_matrix,
regularization_matrix=regularization_matrix,
pixel_neighbors=pixel_neighbors,
pixel_neighbors_sizes=pixel_neighbors_sizes,
)
assert (curvature_reg_matrix == curvature_reg_matrix_util).all()
class TestPreconditionerMatrix:
def test__simple_calculations(self):
mapping_matrix = np.array(
[
[1.0, 0.0, 0.0],
[1.0, 0.0, 0.0],
[0.0, 1.0, 0.0],
[0.0, 1.0, 0.0],
[0.0, 0.0, 1.0],
[0.0, 0.0, 1.0],
]
)
preconditioner_matrix = aa.util.inversion.preconditioner_matrix_via_mapping_matrix_from(
mapping_matrix=mapping_matrix,
preconditioner_noise_normalization=1.0,
regularization_matrix=np.zeros((3, 3)),
)
assert (
preconditioner_matrix
== np.array([[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 2.0]])
).all()
preconditioner_matrix = aa.util.inversion.preconditioner_matrix_via_mapping_matrix_from(
mapping_matrix=mapping_matrix,
preconditioner_noise_normalization=2.0,
regularization_matrix=np.zeros((3, 3)),
)
assert (
preconditioner_matrix
== np.array([[4.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 4.0]])
).all()
regularization_matrix = np.array(
[[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]]
)
preconditioner_matrix = aa.util.inversion.preconditioner_matrix_via_mapping_matrix_from(
mapping_matrix=mapping_matrix,
preconditioner_noise_normalization=2.0,
regularization_matrix=regularization_matrix,
)
assert (
preconditioner_matrix
== np.array([[5.0, 2.0, 3.0], [4.0, 9.0, 6.0], [7.0, 8.0, 13.0]])
).all()
class TestPixelizationQuantity:
def test__residuals(self,):
pixelization_values = np.ones(3)
reconstructed_data_1d = np.ones(9)
slim_index_for_sub_slim_index = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
all_sub_slim_indexes_for_pixelization_index = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
pixelization_residuals = aa.util.inversion.inversion_residual_map_from(
reconstruction=pixelization_values,
data=reconstructed_data_1d,
slim_index_for_sub_slim_index=slim_index_for_sub_slim_index,
all_sub_slim_indexes_for_pixelization_index=all_sub_slim_indexes_for_pixelization_index,
)
assert (pixelization_residuals == np.zeros(3)).all()
pixelization_values = np.ones(3)
reconstructed_data_1d = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 3.0, 3.0, 3.0])
slim_index_for_sub_slim_index = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
all_sub_slim_indexes_for_pixelization_index = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
pixelization_residuals = aa.util.inversion.inversion_residual_map_from(
reconstruction=pixelization_values,
data=reconstructed_data_1d,
slim_index_for_sub_slim_index=slim_index_for_sub_slim_index,
all_sub_slim_indexes_for_pixelization_index=all_sub_slim_indexes_for_pixelization_index,
)
assert (pixelization_residuals == np.array([0.0, 1.0, 2.0])).all()
def test__normalized_residuals__pixelization_perfectly_reconstructed_data__quantities_like_residuals_all_zeros(
self,
):
pixelization_values = np.ones(3)
reconstructed_data_1d = np.ones(9)
noise_map_1d = np.ones(9)
slim_index_for_sub_slim_index = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
all_sub_slim_indexes_for_pixelization_index = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
pixelization_normalized_residuals = aa.util.inversion.inversion_normalized_residual_map_from(
reconstruction=pixelization_values,
data=reconstructed_data_1d,
noise_map_1d=noise_map_1d,
slim_index_for_sub_slim_index=slim_index_for_sub_slim_index,
all_sub_slim_indexes_for_pixelization_index=all_sub_slim_indexes_for_pixelization_index,
)
assert (pixelization_normalized_residuals == np.zeros(3)).all()
pixelization_values = np.ones(3)
reconstructed_data_1d = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 3.0, 3.0, 3.0])
noise_map_1d = np.array([0.5, 0.5, 0.5, 1.0, 1.0, 1.0, 2.0, 2.0, 2.0])
slim_index_for_sub_slim_index = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
all_sub_slim_indexes_for_pixelization_index = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
pixelization_normalized_residuals = aa.util.inversion.inversion_normalized_residual_map_from(
reconstruction=pixelization_values,
data=reconstructed_data_1d,
noise_map_1d=noise_map_1d,
slim_index_for_sub_slim_index=slim_index_for_sub_slim_index,
all_sub_slim_indexes_for_pixelization_index=all_sub_slim_indexes_for_pixelization_index,
)
assert (pixelization_normalized_residuals == np.array([0.0, 1.0, 1.0])).all()
def test__chi_squared__pixelization_perfectly_reconstructed_data__quantities_like_residuals_all_zeros(
self,
):
pixelization_values = np.ones(3)
reconstructed_data_1d = np.ones(9)
noise_map_1d = np.ones(9)
slim_index_for_sub_slim_index = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
all_sub_slim_indexes_for_pixelization_index = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
pixelization_chi_squareds = aa.util.inversion.inversion_chi_squared_map_from(
reconstruction=pixelization_values,
data=reconstructed_data_1d,
noise_map_1d=noise_map_1d,
slim_index_for_sub_slim_index=slim_index_for_sub_slim_index,
all_sub_slim_indexes_for_pixelization_index=all_sub_slim_indexes_for_pixelization_index,
)
assert (pixelization_chi_squareds == np.zeros(3)).all()
pixelization_values = np.ones(3)
reconstructed_data_1d = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 3.0, 3.0, 3.0])
noise_map_1d = np.array([0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 4.0, 4.0, 4.0])
slim_index_for_sub_slim_index = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
all_sub_slim_indexes_for_pixelization_index = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
pixelization_chi_squareds = aa.util.inversion.inversion_chi_squared_map_from(
reconstruction=pixelization_values,
data=reconstructed_data_1d,
noise_map_1d=noise_map_1d,
slim_index_for_sub_slim_index=slim_index_for_sub_slim_index,
all_sub_slim_indexes_for_pixelization_index=all_sub_slim_indexes_for_pixelization_index,
)
assert (pixelization_chi_squareds == np.array([0.0, 4.0, 0.25])).all()
| 41.181818 | 116 | 0.610621 | 1,117 | 8,154 | 4.081468 | 0.068039 | 0.025444 | 0.027638 | 0.026322 | 0.855451 | 0.811362 | 0.811362 | 0.801272 | 0.801272 | 0.801272 | 0 | 0.07155 | 0.278391 | 8,154 | 197 | 117 | 41.390863 | 0.703263 | 0 | 0 | 0.608974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064103 | 1 | 0.032051 | false | 0 | 0.012821 | 0 | 0.064103 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
500e7637234cb6b6db944943503a5f9ec9004da6 | 1,189 | py | Python | Programming basics/Exercises PROVERKI/lab/trade_commissions.py | antonarnaudov/SoftUniProjects | 01cbdce2b350b57240045d1bc3e21d34f9d0351d | [
"MIT"
] | null | null | null | Programming basics/Exercises PROVERKI/lab/trade_commissions.py | antonarnaudov/SoftUniProjects | 01cbdce2b350b57240045d1bc3e21d34f9d0351d | [
"MIT"
] | null | null | null | Programming basics/Exercises PROVERKI/lab/trade_commissions.py | antonarnaudov/SoftUniProjects | 01cbdce2b350b57240045d1bc3e21d34f9d0351d | [
"MIT"
] | null | null | null | city = input()
sells_area = float(input())
commissions = 0
if city == 'Sofia':
if 0 <= sells_area <= 500:
commissions = sells_area * 0.05
elif 500 <= sells_area <= 1000:
commissions = sells_area * 0.07
elif 1000 <= sells_area <= 10000:
commissions = sells_area * 0.08
elif sells_area > 10000:
commissions = sells_area * 0.12
else:
print('error')
elif city == 'Varna':
if 0 <= sells_area <= 500:
commissions = sells_area * 0.045
elif 500 <= sells_area <= 1000:
commissions = sells_area * 0.075
elif 1000 <= sells_area <= 10000:
commissions = sells_area * 0.10
elif sells_area > 10000:
commissions = sells_area * 0.13
else:
print('error')
elif city == 'Plovdiv':
if 0 <= sells_area <= 500:
commissions = sells_area * 0.055
elif 500 <= sells_area <= 1000:
commissions = sells_area * 0.08
elif 1000 <= sells_area <= 10000:
commissions = sells_area * 0.12
elif sells_area > 10000:
commissions = sells_area * 0.145
else:
print('error')
else:
print('error')
if commissions != 0:
print(f'{commissions:.2f}') | 28.309524 | 40 | 0.587048 | 153 | 1,189 | 4.398693 | 0.196078 | 0.334324 | 0.356612 | 0.374443 | 0.786033 | 0.720654 | 0.71471 | 0.71471 | 0.534918 | 0 | 0 | 0.140476 | 0.293524 | 1,189 | 42 | 41 | 28.309524 | 0.660714 | 0 | 0 | 0.6 | 0 | 0 | 0.045378 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.125 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
accf88a733e11cc1f2fcb52973508acdd9e98f60 | 33,124 | py | Python | desktop/core/ext-py/repoze.who-2.3/repoze/who/plugins/tests/test_authtkt.py | kokosing/hue | 2307f5379a35aae9be871e836432e6f45138b3d9 | [
"Apache-2.0"
] | 5,079 | 2015-01-01T03:39:46.000Z | 2022-03-31T07:38:22.000Z | desktop/core/ext-py/repoze.who-2.3/repoze/who/plugins/tests/test_authtkt.py | zks888/hue | 93a8c370713e70b216c428caa2f75185ef809deb | [
"Apache-2.0"
] | 1,623 | 2015-01-01T08:06:24.000Z | 2022-03-30T19:48:52.000Z | desktop/core/ext-py/repoze.who-2.3/repoze/who/plugins/tests/test_authtkt.py | zks888/hue | 93a8c370713e70b216c428caa2f75185ef809deb | [
"Apache-2.0"
] | 2,033 | 2015-01-04T07:18:02.000Z | 2022-03-28T19:55:47.000Z | import unittest
class TestAuthTktCookiePlugin(unittest.TestCase):
tempdir = None
_now_testing = None
def setUp(self):
pass
def tearDown(self):
if self.tempdir is not None:
import shutil
shutil.rmtree(self.tempdir)
if self._now_testing is not None:
self._setNowTesting(self._now_testing)
def _getTargetClass(self):
from repoze.who.plugins.auth_tkt import AuthTktCookiePlugin
return AuthTktCookiePlugin
def _makeEnviron(self, kw=None):
from wsgiref.util import setup_testing_defaults
environ = {}
setup_testing_defaults(environ)
if kw is not None:
environ.update(kw)
environ['REMOTE_ADDR'] = '1.1.1.1'
environ['HTTP_HOST'] = 'localhost'
return environ
def _makeOne(self, secret='s33kr3t', *arg, **kw):
plugin = self._getTargetClass()(secret, *arg, **kw)
return plugin
def _makeTicket(self, userid='userid', remote_addr='0.0.0.0',
tokens = [], userdata='userdata',
cookie_name='auth_tkt', secure=False,
time=None, digest_algo="md5"):
import repoze.who._auth_tkt as auth_tkt
ticket = auth_tkt.AuthTicket(
'secret',
userid,
remote_addr,
tokens=tokens,
user_data=userdata,
time=time,
cookie_name=cookie_name,
secure=secure,
digest_algo=digest_algo)
return ticket.cookie_value()
def _setNowTesting(self, value):
from repoze.who.plugins import auth_tkt
auth_tkt._UTCNOW, self._now_testing = value, auth_tkt._UTCNOW
def test_class_conforms_to_IIdentifier(self):
from zope.interface.verify import verifyClass
from repoze.who.interfaces import IIdentifier
klass = self._getTargetClass()
verifyClass(IIdentifier, klass)
def test_instance_conforms_to_IIdentifier(self):
from zope.interface.verify import verifyObject
from repoze.who.interfaces import IIdentifier
klass = self._getTargetClass()
verifyObject(IIdentifier, self._makeOne())
def test_class_conforms_to_IAuthenticator(self):
from zope.interface.verify import verifyClass
from repoze.who.interfaces import IAuthenticator
klass = self._getTargetClass()
verifyClass(IAuthenticator, klass)
def test_instance_conforms_to_IAuthenticator(self):
from zope.interface.verify import verifyObject
from repoze.who.interfaces import IAuthenticator
klass = self._getTargetClass()
verifyObject(IAuthenticator, self._makeOne())
def test_timeout_no_reissue(self):
self.assertRaises(ValueError, self._makeOne, 'userid', timeout=1)
def test_timeout_lower_than_reissue(self):
self.assertRaises(ValueError, self._makeOne, 'userid', timeout=1,
reissue_time=2)
def test_identify_nocookie(self):
plugin = self._makeOne('secret')
environ = self._makeEnviron()
result = plugin.identify(environ)
self.assertEqual(result, None)
def test_identify_good_cookie_include_ip(self):
plugin = self._makeOne('secret', include_ip=True)
val = self._makeTicket(remote_addr='1.1.1.1', userdata='foo=123')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(len(result), 4)
self.assertEqual(result['tokens'], [''])
self.assertEqual(result['repoze.who.plugins.auth_tkt.userid'], 'userid')
self.assertEqual(result['userdata'], {'foo': '123'})
self.assertTrue('timestamp' in result)
self.assertEqual(environ['REMOTE_USER_TOKENS'], [''])
self.assertEqual(environ['REMOTE_USER_DATA'],'foo=123')
self.assertEqual(environ['AUTH_TYPE'],'cookie')
def test_identify_good_cookie_dont_include_ip(self):
plugin = self._makeOne('secret', include_ip=False)
val = self._makeTicket(userdata='foo=123')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(len(result), 4)
self.assertEqual(result['tokens'], [''])
self.assertEqual(result['repoze.who.plugins.auth_tkt.userid'], 'userid')
self.assertEqual(result['userdata'], {'foo': '123'})
self.assertTrue('timestamp' in result)
self.assertEqual(environ['REMOTE_USER_TOKENS'], [''])
self.assertEqual(environ['REMOTE_USER_DATA'],'foo=123')
self.assertEqual(environ['AUTH_TYPE'],'cookie')
def test_identify_good_cookie_int_useridtype(self):
plugin = self._makeOne('secret', include_ip=False)
val = self._makeTicket(userid='1', userdata='userid_type=int')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(len(result), 4)
self.assertEqual(result['tokens'], [''])
self.assertEqual(result['repoze.who.plugins.auth_tkt.userid'], 1)
self.assertEqual(result['userdata'], {'userid_type': 'int'})
self.assertTrue('timestamp' in result)
self.assertEqual(environ['REMOTE_USER_TOKENS'], [''])
self.assertEqual(environ['REMOTE_USER_DATA'],'userid_type=int')
self.assertEqual(environ['AUTH_TYPE'],'cookie')
def test_identify_good_cookie_unknown_useridtype(self):
plugin = self._makeOne('secret', include_ip=False)
val = self._makeTicket(userid='userid', userdata='userid_type=unknown')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(len(result), 4)
self.assertEqual(result['tokens'], [''])
self.assertEqual(result['repoze.who.plugins.auth_tkt.userid'], 'userid')
self.assertEqual(result['userdata'], {'userid_type':'unknown'})
self.assertTrue('timestamp' in result)
self.assertEqual(environ['REMOTE_USER_TOKENS'], [''])
self.assertEqual(environ['REMOTE_USER_DATA'],'userid_type=unknown')
self.assertEqual(environ['AUTH_TYPE'],'cookie')
def test_identify_bad_cookie(self):
plugin = self._makeOne('secret', include_ip=True)
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=bogus'})
result = plugin.identify(environ)
self.assertEqual(result, None)
def test_identify_bad_cookie_expired(self):
import time
plugin = self._makeOne('secret', timeout=2, reissue_time=1)
val = self._makeTicket(userid='userid', time=time.time()-3)
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(result, None)
def test_identify_with_checker_and_existing_account(self):
plugin = self._makeOne('secret', userid_checker=dummy_userid_checker)
val = self._makeTicket(userid='existing', userdata='foo=123')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(len(result), 4)
self.assertEqual(result['tokens'], [''])
self.assertEqual(result['repoze.who.plugins.auth_tkt.userid'], 'existing')
self.assertEqual(result['userdata'], {'foo': '123'})
self.assertTrue('timestamp' in result)
self.assertEqual(environ['REMOTE_USER_TOKENS'], [''])
self.assertEqual(environ['REMOTE_USER_DATA'],'foo=123')
self.assertEqual(environ['AUTH_TYPE'],'cookie')
def test_identify_with_alternate_hash(self):
plugin = self._makeOne('secret', include_ip=False, digest_algo="sha256")
val = self._makeTicket(userdata='foo=123', digest_algo="sha256")
md5_val = self._makeTicket(userdata='foo=123')
self.assertNotEqual(val, md5_val)
# md5 is 16*2 characters long, sha256 is 32*2
self.assertEqual(len(val), len(md5_val)+32)
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.identify(environ)
self.assertEqual(len(result), 4)
self.assertEqual(result['tokens'], [''])
self.assertEqual(result['repoze.who.plugins.auth_tkt.userid'], 'userid')
self.assertEqual(result['userdata'], {'foo': '123'})
self.assertTrue('timestamp' in result)
self.assertEqual(environ['REMOTE_USER_TOKENS'], [''])
self.assertEqual(environ['REMOTE_USER_DATA'],'foo=123')
self.assertEqual(environ['AUTH_TYPE'],'cookie')
def test_identify_bad_cookie_with_alternate_hash(self):
plugin = self._makeOne('secret', include_ip=True, digest_algo="sha256")
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=bogus'})
result = plugin.identify(environ)
self.assertEqual(result, None)
def test_remember_creds_same(self):
plugin = self._makeOne('secret')
val = self._makeTicket(userid='userid', userdata='foo=123')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':{'foo': '123'}})
self.assertEqual(result, None)
def test_remember_creds_same_alternate_hash(self):
plugin = self._makeOne('secret', digest_algo="sha1")
val = self._makeTicket(userid='userid', userdata='foo=123', digest_algo="sha1")
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % val})
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':{'foo': '123'}})
self.assertEqual(result, None)
def test_remember_creds_hash_mismatch(self):
plugin = self._makeOne('secret', digest_algo="sha1")
old_val = self._makeTicket(userid='userid', userdata='foo=123', digest_algo="md5")
new_val = self._makeTicket(userid='userid', userdata='foo=123', digest_algo="sha1")
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':{'foo': '123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_secure_alternate_hash(self):
plugin = self._makeOne('secret', secure=True, digest_algo="sha512")
val = self._makeTicket(userid='userid', secure=True, userdata='foo=123', digest_algo="sha512")
environ = self._makeEnviron()
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':{'foo':'123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'secure; '
'HttpOnly' % val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost; '
'secure; HttpOnly'
% val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost; '
'secure; HttpOnly'
% val))
def test_remember_creds_secure(self):
plugin = self._makeOne('secret', secure=True)
val = self._makeTicket(userid='userid', secure=True, userdata='foo=123')
environ = self._makeEnviron()
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':{'foo':'123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'secure; '
'HttpOnly' % val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost; '
'secure; HttpOnly'
% val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost; '
'secure; HttpOnly'
% val))
def test_remember_creds_different(self):
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='other', userdata='foo=123')
result = plugin.remember(environ, {'repoze.who.userid':'other',
'userdata':{'foo':'123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_different_strips_port(self):
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val,
'HTTP_HOST': 'localhost:8080',
})
new_val = self._makeTicket(userid='other', userdata='foo=123')
result = plugin.remember(environ, {'repoze.who.userid':'other',
'userdata':{'foo': '123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_different_include_ip(self):
plugin = self._makeOne('secret', include_ip=True)
old_val = self._makeTicket(userid='userid', remote_addr='1.1.1.1')
environ = self._makeEnviron({'HTTP_COOKIE': 'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='other',
userdata='foo=123',
remote_addr='1.1.1.1')
result = plugin.remember(environ, {'repoze.who.userid':'other',
'userdata':{'foo': '123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_different_bad_old_cookie(self):
plugin = self._makeOne('secret')
old_val = 'BOGUS'
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='other', userdata='foo=123')
result = plugin.remember(environ, {'repoze.who.userid':'other',
'userdata':{'foo': '123'}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_different_with_tokens(self):
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='userid',
userdata='foo=123',
tokens=['foo', 'bar'],
)
result = plugin.remember(environ, {'repoze.who.userid': 'userid',
'userdata': {'foo': '123'},
'tokens': ['foo', 'bar'],
})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_different_with_tuple_tokens(self):
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='userid',
userdata='foo=123',
tokens=['foo', 'bar'],
)
result = plugin.remember(environ, {'repoze.who.userid': 'userid',
'userdata': {'foo': '123'},
'tokens': ('foo', 'bar'),
})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/' % new_val))
self.assertEqual(result[1],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=localhost'
% new_val))
self.assertEqual(result[2],
('Set-Cookie',
'auth_tkt="%s"; '
'Path=/; '
'Domain=.localhost'
% new_val))
def test_remember_creds_different_int_userid(self):
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='1', userdata='userid_type=int')
result = plugin.remember(environ, {'repoze.who.userid':1,
'userdata':{}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; Path=/' % new_val))
def test_remember_creds_different_long_userid(self):
try:
long
except NameError: #pragma NO COVER Python >= 3.0
return
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='1', userdata='userid_type=int')
result = plugin.remember(environ, {'repoze.who.userid':long(1),
'userdata':{}})
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; Path=/' % new_val))
def test_remember_creds_different_unicode_userid(self):
plugin = self._makeOne('secret')
old_val = self._makeTicket(userid='userid')
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
userid = b'\xc2\xa9'.decode('utf-8')
if type(b'') == type(''):
userdata = 'userid_type=unicode'
else: # pragma: no cover Py3k
userdata = ''
new_val = self._makeTicket(userid=userid.encode('utf-8'),
userdata=userdata)
result = plugin.remember(environ, {'repoze.who.userid':userid,
'userdata':{}})
self.assertEqual(type(result[0][1]), str)
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; Path=/' % new_val))
def test_remember_creds_reissue(self):
import time
plugin = self._makeOne('secret', reissue_time=1)
old_val = self._makeTicket(userid='userid', userdata='',
time=time.time()-2)
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='userid', userdata='')
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':''})
self.assertEqual(type(result[0][1]), str)
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; Path=/' % new_val))
def test_remember_creds_reissue_alternate_hash(self):
import time
plugin = self._makeOne('secret', reissue_time=1, digest_algo="sha256")
old_val = self._makeTicket(userid='userid', userdata='',
time=time.time()-2, digest_algo="sha256")
environ = self._makeEnviron({'HTTP_COOKIE':'auth_tkt=%s' % old_val})
new_val = self._makeTicket(userid='userid', userdata='',
digest_algo="sha256")
result = plugin.remember(environ, {'repoze.who.userid':'userid',
'userdata':''})
self.assertEqual(type(result[0][1]), str)
self.assertEqual(len(result), 3)
self.assertEqual(result[0],
('Set-Cookie',
'auth_tkt="%s"; Path=/' % new_val))
def test_l10n_sane_cookie_date(self):
from datetime import datetime
now = datetime(2009, 11, 8, 16, 15, 22)
self._setNowTesting(now)
plugin = self._makeOne('secret')
environ = {'HTTP_HOST': 'example.com'}
tkt = self._makeTicket(userid='chris', userdata='')
result = plugin.remember(environ, {'repoze.who.userid': 'chris',
'max_age': '500'})
name, value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(
value.endswith('; Expires=Sun, 08 Nov 2009 16:23:42 GMT'))
def test_remember_max_age(self):
from datetime import datetime
now = datetime(2009, 11, 8, 16, 15, 22)
self._setNowTesting(now)
plugin = self._makeOne('secret')
environ = {'HTTP_HOST': 'example.com'}
tkt = self._makeTicket(userid='chris', userdata='')
result = plugin.remember(environ, {'repoze.who.userid': 'chris',
'max_age': '500'})
name, value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(
value.startswith('auth_tkt="%s"; Path=/; Max-Age=500' % tkt),
value)
self.assertTrue(
value.endswith('; Expires=Sun, 08 Nov 2009 16:23:42 GMT'))
name, value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(
value.startswith(
'auth_tkt="%s"; Path=/; Domain=example.com; Max-Age=500'
% tkt), value)
self.assertTrue(
value.endswith('; Expires=Sun, 08 Nov 2009 16:23:42 GMT'))
name, value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(
value.startswith(
'auth_tkt="%s"; Path=/; Domain=.example.com; Max-Age=500' % tkt),
value)
self.assertTrue(
value.endswith('; Expires=Sun, 08 Nov 2009 16:23:42 GMT'))
def test_forget(self):
from datetime import datetime
now = datetime(2009, 11, 5, 16, 15, 22)
self._setNowTesting(now)
plugin = self._makeOne('secret')
environ = self._makeEnviron()
headers = plugin.forget(environ, None)
self.assertEqual(len(headers), 3)
header = headers[0]
name, value = header
self.assertEqual(name, 'Set-Cookie')
self.assertEqual(value,
'auth_tkt="INVALID"; Path=/; '
'Max-Age=0; Expires=Thu, 05 Nov 2009 16:15:22 GMT'
)
header = headers[1]
name, value = header
self.assertEqual(name, 'Set-Cookie')
self.assertEqual(value,
'auth_tkt="INVALID"; Path=/; Domain=localhost; '
'Max-Age=0; Expires=Thu, 05 Nov 2009 16:15:22 GMT'
)
header = headers[2]
name, value = header
self.assertEqual(name, 'Set-Cookie')
self.assertEqual(value,
'auth_tkt="INVALID"; Path=/; Domain=.localhost; '
'Max-Age=0; Expires=Thu, 05 Nov 2009 16:15:22 GMT'
)
def test_authenticate_non_auth_tkt_credentials(self):
plugin = self._makeOne()
self.assertEqual(plugin.authenticate(environ={}, identity={}), None)
def test_authenticate_without_checker(self):
plugin = self._makeOne()
identity = {'repoze.who.plugins.auth_tkt.userid': 'phred'}
self.assertEqual(plugin.authenticate({}, identity), 'phred')
def test_authenticate_with_checker_and_non_existing_account(self):
plugin = self._makeOne('secret', userid_checker=dummy_userid_checker)
identity = {'repoze.who.plugins.auth_tkt.userid': 'phred'}
self.assertEqual(plugin.authenticate({}, identity), None)
def test_authenticate_with_checker_and_existing_account(self):
plugin = self._makeOne('secret', userid_checker=dummy_userid_checker)
identity = {'repoze.who.plugins.auth_tkt.userid': 'existing'}
self.assertEqual(plugin.authenticate({}, identity), 'existing')
def test_factory_wo_secret_wo_secretfile_raises_ValueError(self):
from repoze.who.plugins.auth_tkt import make_plugin
self.assertRaises(ValueError, make_plugin)
def test_factory_w_secret_w_secretfile_raises_ValueError(self):
from repoze.who.plugins.auth_tkt import make_plugin
self.assertRaises(ValueError, make_plugin, 'secret', 'secretfile')
def test_factory_w_bad_secretfile_raises_ValueError(self):
from repoze.who.plugins.auth_tkt import make_plugin
self.assertRaises(ValueError, make_plugin, secretfile='nonesuch.txt')
def test_factory_w_secret(self):
from repoze.who.plugins.auth_tkt import make_plugin
plugin = make_plugin('secret')
self.assertEqual(plugin.cookie_name, 'auth_tkt')
self.assertEqual(plugin.secret, 'secret')
self.assertEqual(plugin.include_ip, False)
self.assertEqual(plugin.secure, False)
def test_factory_w_secretfile(self):
import os
from tempfile import mkdtemp
from repoze.who.plugins.auth_tkt import make_plugin
tempdir = self.tempdir = mkdtemp()
path = os.path.join(tempdir, 'who.secret')
secret = open(path, 'w')
secret.write('s33kr1t\n')
secret.flush()
secret.close()
plugin = make_plugin(secretfile=path)
self.assertEqual(plugin.secret, 's33kr1t')
def test_factory_with_timeout_and_reissue_time(self):
from repoze.who.plugins.auth_tkt import make_plugin
plugin = make_plugin('secret', timeout=5, reissue_time=1)
self.assertEqual(plugin.timeout, 5)
self.assertEqual(plugin.reissue_time, 1)
def test_factory_with_userid_checker(self):
from repoze.who.plugins.auth_tkt import make_plugin
plugin = make_plugin(
'secret',
userid_checker='repoze.who.plugins.auth_tkt:make_plugin')
self.assertEqual(plugin.userid_checker, make_plugin)
def test_factory_with_alternate_hash(self):
from repoze.who.plugins.auth_tkt import make_plugin
import hashlib
plugin = make_plugin('secret', digest_algo="sha1")
self.assertEqual(plugin.digest_algo, hashlib.sha1)
def test_factory_with_alternate_hash_func(self):
from repoze.who.plugins.auth_tkt import make_plugin
import hashlib
plugin = make_plugin('secret', digest_algo=hashlib.sha1)
self.assertEqual(plugin.digest_algo, hashlib.sha1)
def test_factory_with_bogus_hash(self):
from repoze.who.plugins.auth_tkt import make_plugin
self.assertRaises(ValueError, make_plugin,
secret="fiddly", digest_algo='foo23')
def test_remember_max_age_unicode(self):
from repoze.who._compat import u
plugin = self._makeOne('secret')
environ = {'HTTP_HOST':'example.com'}
tkt = self._makeTicket(userid='chris', userdata='')
result = plugin.remember(environ, {'repoze.who.userid': 'chris',
'max_age': u('500')})
name, value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(isinstance(value, str))
self.assertTrue(
value.startswith('auth_tkt="%s"; Path=/; Max-Age=500' % tkt),
(value, tkt))
self.assertTrue('; Expires=' in value)
name,value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(
value.startswith(
'auth_tkt="%s"; Path=/; Domain=example.com; Max-Age=500'
% tkt), value)
self.assertTrue('; Expires=' in value)
name,value = result.pop(0)
self.assertEqual('Set-Cookie', name)
self.assertTrue(
value.startswith(
'auth_tkt="%s"; Path=/; Domain=.example.com; Max-Age=500' % tkt),
value)
self.assertTrue('; Expires=' in value)
def dummy_userid_checker(userid):
return userid == 'existing'
| 44.402145 | 102 | 0.539065 | 3,318 | 33,124 | 5.180229 | 0.068113 | 0.10996 | 0.027461 | 0.04317 | 0.840237 | 0.823132 | 0.801257 | 0.785664 | 0.774203 | 0.750465 | 0 | 0.019996 | 0.33118 | 33,124 | 745 | 103 | 44.461745 | 0.755845 | 0.002838 | 0 | 0.682635 | 0 | 0 | 0.163296 | 0.010446 | 0 | 0 | 0 | 0 | 0.229042 | 1 | 0.086826 | false | 0.001497 | 0.052395 | 0.001497 | 0.152695 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c59d0daa3e89cade6bdb3177645f160c770e841d | 6,439 | py | Python | tests/test_multiply.py | skn123/butterfly | 7217b5d93bc78e1229fed3761bcc70d943f604b7 | [
"Apache-2.0"
] | 52 | 2020-08-05T08:32:24.000Z | 2022-03-27T21:56:34.000Z | tests/test_multiply.py | skn123/butterfly | 7217b5d93bc78e1229fed3761bcc70d943f604b7 | [
"Apache-2.0"
] | 13 | 2020-09-14T23:34:32.000Z | 2022-02-15T10:51:03.000Z | tests/test_multiply.py | skn123/butterfly | 7217b5d93bc78e1229fed3761bcc70d943f604b7 | [
"Apache-2.0"
] | 11 | 2020-10-15T07:03:25.000Z | 2022-03-25T12:03:49.000Z | import math
import unittest
import numpy as np
import torch
from torch import nn
from torch.nn import functional as F
import torch_butterfly
class ButterflyTest(unittest.TestCase):
def setUp(self):
self.rtol = 1e-3
self.atol = 1e-5
def test_multiply(self):
for batch_size, n in [(10, 4096), (8192, 512)]: # Test size smaller than 1024 and large batch size for race conditions
# for batch_size, n in [(10, 64)]:
# for batch_size, n in [(1, 2)]:
log_n = int(math.log2(n))
nstacks = 2
nblocks = 3
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
# for device in ['cuda']:
for complex in [False, True]:
# for complex in [False]:
for increasing_stride in [True, False]:
# for increasing_stride in [True]:
if batch_size > 1024 and (device == 'cpu'):
continue
dtype = torch.float32 if not complex else torch.complex64
# complex randn already has the correct scaling of stddev=1.0
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstacks, nblocks, log_n, n // 2, 2, 2), dtype=dtype, requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstacks, n), dtype=dtype, requires_grad=True, device=twiddle.device)
output = torch_butterfly.butterfly_multiply(twiddle, input, increasing_stride)
output_torch = torch_butterfly.multiply.butterfly_multiply_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, complex, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, complex, increasing_stride))
# if device == 'cuda' and batch_size > 1024 and not complex and increasing_stride:
# print((d_twiddle - d_twiddle_torch).abs().mean(dim=(0, 2, 3, 4)))
# print(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().mean(dim=(0, 2, 3, 4)))
# i = ((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().argmax()
# print(d_twiddle.flatten()[i])
# print(d_twiddle_torch.flatten()[i])
# print(d_twiddle.flatten()[i-5:i+5])
# print(d_twiddle_torch.flatten()[i-5:i+5])
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(batch_size, n), device, complex, increasing_stride))
def test_input_padding_output_slicing(self):
batch_size = 10
nstacks = 2
nblocks = 3
for n in [32, 4096]:
log_n = int(math.log2(n))
for input_size in [n // 2 - 1, n - 4, 2 * n + 7]:
for output_size in [None, n // 2 - 2, n - 5]:
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for complex in [False, True]:
for increasing_stride in [True, False]:
dtype = torch.float32 if not complex else torch.complex64
# complex randn already has the correct scaling of stddev=1.0
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstacks, nblocks, log_n, n // 2, 2, 2), dtype=dtype, requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstacks, input_size), dtype=dtype, requires_grad=True, device=twiddle.device)
output = torch_butterfly.butterfly_multiply(twiddle, input, increasing_stride, output_size)
output_torch = torch_butterfly.multiply.butterfly_multiply_torch(twiddle, input, increasing_stride, output_size)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, complex, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, complex, increasing_stride))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(d_twiddle - d_twiddle_torch).abs().max().item(),
(batch_size, n), device, complex, increasing_stride))
if __name__ == "__main__":
unittest.main()
| 67.072917 | 152 | 0.528498 | 729 | 6,439 | 4.469136 | 0.144033 | 0.068754 | 0.063843 | 0.039288 | 0.838858 | 0.807551 | 0.732351 | 0.717311 | 0.717311 | 0.699509 | 0 | 0.02916 | 0.366206 | 6,439 | 95 | 153 | 67.778947 | 0.769174 | 0.12533 | 0 | 0.57971 | 0 | 0 | 0.004452 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 1 | 0.043478 | false | 0 | 0.101449 | 0 | 0.15942 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c5af6e418ec664980ab735a1496bf60dc457839d | 135 | py | Python | students/K33402/Velts Andrey/lab0304/backend/charity/admin.py | ShubhamKunal/ITMO_ICT_WebDevelopment_2020-2021 | bb91c91a56d21cec2b12ae4cc722eaa652a88420 | [
"MIT"
] | 4 | 2020-09-03T15:41:42.000Z | 2021-12-24T15:28:20.000Z | students/K33402/Velts Andrey/lab0304/backend/charity/admin.py | ShubhamKunal/ITMO_ICT_WebDevelopment_2020-2021 | bb91c91a56d21cec2b12ae4cc722eaa652a88420 | [
"MIT"
] | 48 | 2020-09-13T20:22:42.000Z | 2021-04-30T11:13:30.000Z | students/K33402/Velts Andrey/lab0304/backend/charity/admin.py | ShubhamKunal/ITMO_ICT_WebDevelopment_2020-2021 | bb91c91a56d21cec2b12ae4cc722eaa652a88420 | [
"MIT"
] | 69 | 2020-09-06T10:32:37.000Z | 2021-11-28T18:13:17.000Z | from django.contrib import admin
from .models import Charity
@admin.register(Charity)
class CharityAdmin(admin.ModelAdmin):
pass
| 16.875 | 37 | 0.792593 | 17 | 135 | 6.294118 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 135 | 7 | 38 | 19.285714 | 0.91453 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
a8952e492b913409f8dd5d0d817240ba8973fc67 | 178 | py | Python | tests/test_import.py | useblocks/libpdf | 2c1cf007ac9c7a74871342d67bff118254ee8dd5 | [
"MIT"
] | 7 | 2021-01-21T20:42:01.000Z | 2022-03-16T06:11:49.000Z | tests/test_import.py | useblocks/libpdf | 2c1cf007ac9c7a74871342d67bff118254ee8dd5 | [
"MIT"
] | 8 | 2021-01-19T09:27:15.000Z | 2022-02-04T22:50:15.000Z | tests/test_import.py | useblocks/libpdf | 2c1cf007ac9c7a74871342d67bff118254ee8dd5 | [
"MIT"
] | null | null | null | """Import tests."""
def test_import():
"""Check if the app modules can be imported."""
from libpdf import core # pylint: disable=import-outside-toplevel
del core
| 19.777778 | 70 | 0.668539 | 24 | 178 | 4.916667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.207865 | 178 | 8 | 71 | 22.25 | 0.836879 | 0.539326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.666667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a89e43199390c0f78ffd517d22f7bf8dfa888de8 | 38 | py | Python | skdaccess/solar/sdo/__init__.py | samiriff/scikit-dataaccess-ode | dc08fd67c772d3cd83d0d34183196661b6b53778 | [
"MIT"
] | 41 | 2017-06-12T19:52:07.000Z | 2022-01-02T11:03:01.000Z | skdaccess/solar/sdo/__init__.py | skdaccess/skdaccess | 935bfd54149abd9542fe38e77b7eabab48b1c3a1 | [
"MIT"
] | 2 | 2018-06-19T20:01:30.000Z | 2018-11-15T21:03:28.000Z | skdaccess/solar/sdo/__init__.py | skdaccess/skdaccess | 935bfd54149abd9542fe38e77b7eabab48b1c3a1 | [
"MIT"
] | 13 | 2017-08-28T12:00:26.000Z | 2021-12-08T10:12:44.000Z | from .data_fetcher import DataFetcher
| 19 | 37 | 0.868421 | 5 | 38 | 6.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
764156ef938271585aa26100843d52892bbbceab | 24 | py | Python | gui/__init__.py | gbroques/gerrymandering | 8ce8f64ed9e6cadf002310cae50052cedd0fa32f | [
"MIT"
] | null | null | null | gui/__init__.py | gbroques/gerrymandering | 8ce8f64ed9e6cadf002310cae50052cedd0fa32f | [
"MIT"
] | null | null | null | gui/__init__.py | gbroques/gerrymandering | 8ce8f64ed9e6cadf002310cae50052cedd0fa32f | [
"MIT"
] | null | null | null | from gui.app import App
| 12 | 23 | 0.791667 | 5 | 24 | 3.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
764f202d6c2aeaa911f45a13f3b2959a9fbe9d34 | 135 | py | Python | arraytool/documentation/examples/__init__.py | markuskreitzer/arraytool | e8d50a4a062dae4d9551daf75c808d2733b6563c | [
"CNRI-Python"
] | 1 | 2021-07-26T14:14:58.000Z | 2021-07-26T14:14:58.000Z | arraytool/documentation/examples/__init__.py | elec3647/arraytool | e8d50a4a062dae4d9551daf75c808d2733b6563c | [
"CNRI-Python"
] | null | null | null | arraytool/documentation/examples/__init__.py | elec3647/arraytool | e8d50a4a062dae4d9551daf75c808d2733b6563c | [
"CNRI-Python"
] | 1 | 2022-03-27T01:32:47.000Z | 2022-03-27T01:32:47.000Z | # Author: Srinivasa Rao Zinka (srinivas . zinka [at] gmail . com)
# Copyright (c) 2011 Srinivasa Rao Zinka
# License: New BSD License.
| 33.75 | 65 | 0.718519 | 19 | 135 | 5.105263 | 0.736842 | 0.247423 | 0.350515 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036036 | 0.177778 | 135 | 3 | 66 | 45 | 0.837838 | 0.948148 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
767ada446328b3f655c5b6817d5373b44318f794 | 160 | py | Python | api/views.py | ankit98040/E-Commerce-Project-Django-React | 4fc9bc9950c925f9fe60e77953e55f751a43e8bd | [
"Apache-2.0"
] | null | null | null | api/views.py | ankit98040/E-Commerce-Project-Django-React | 4fc9bc9950c925f9fe60e77953e55f751a43e8bd | [
"Apache-2.0"
] | null | null | null | api/views.py | ankit98040/E-Commerce-Project-Django-React | 4fc9bc9950c925f9fe60e77953e55f751a43e8bd | [
"Apache-2.0"
] | 1 | 2021-05-15T07:23:37.000Z | 2021-05-15T07:23:37.000Z | from django.http import JsonResponse
# Create your views here.
def home(request):
return JsonResponse({'info': 'Django React Course', 'name': "hitesh"})
| 20 | 74 | 0.7125 | 20 | 160 | 5.7 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15625 | 160 | 7 | 75 | 22.857143 | 0.844444 | 0.14375 | 0 | 0 | 0 | 0 | 0.244444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
76b9d9e99f0acf2274927cd7c1303bd8c1a0da47 | 14,744 | py | Python | sdk/python/pulumi_azure/network/network_interface.py | kenny-wealth/pulumi-azure | e57e3a81f95bf622e7429c53f0bff93e33372aa1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/network/network_interface.py | kenny-wealth/pulumi-azure | e57e3a81f95bf622e7429c53f0bff93e33372aa1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/network/network_interface.py | kenny-wealth/pulumi-azure | e57e3a81f95bf622e7429c53f0bff93e33372aa1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import json
import warnings
import pulumi
import pulumi.runtime
from typing import Union
from .. import utilities, tables
class NetworkInterface(pulumi.CustomResource):
applied_dns_servers: pulumi.Output[list]
"""
If the VM that uses this NIC is part of an Availability Set, then this list will have the union of all DNS servers from all NICs that are part of the Availability Set
"""
dns_servers: pulumi.Output[list]
"""
List of DNS servers IP addresses to use for this NIC, overrides the VNet-level server list
"""
enable_accelerated_networking: pulumi.Output[bool]
"""
Enables Azure Accelerated Networking using SR-IOV. Only certain VM instance sizes are supported. Refer to [Create a Virtual Machine with Accelerated Networking](https://docs.microsoft.com/en-us/azure/virtual-network/create-vm-accelerated-networking-cli). Defaults to `false`.
"""
enable_ip_forwarding: pulumi.Output[bool]
"""
Enables IP Forwarding on the NIC. Defaults to `false`.
"""
internal_dns_name_label: pulumi.Output[str]
"""
Relative DNS name for this NIC used for internal communications between VMs in the same VNet
"""
internal_fqdn: pulumi.Output[str]
ip_configurations: pulumi.Output[list]
"""
One or more `ip_configuration` associated with this NIC as documented below.
* `applicationGatewayBackendAddressPoolsIds` (`list`)
* `applicationSecurityGroupIds` (`list`)
* `loadBalancerBackendAddressPoolsIds` (`list`)
* `loadBalancerInboundNatRulesIds` (`list`)
* `name` (`str`) - The name of the network interface. Changing this forces a new resource to be created.
* `primary` (`bool`)
* `private_ip_address` (`str`) - The first private IP address of the network interface.
* `privateIpAddressAllocation` (`str`)
* `privateIpAddressVersion` (`str`)
* `publicIpAddressId` (`str`)
* `subnet_id` (`str`)
"""
location: pulumi.Output[str]
"""
The location/region where the network interface is created. Changing this forces a new resource to be created.
"""
mac_address: pulumi.Output[str]
"""
The media access control (MAC) address of the network interface.
"""
name: pulumi.Output[str]
"""
The name of the network interface. Changing this forces a new resource to be created.
"""
network_security_group_id: pulumi.Output[str]
"""
The ID of the Network Security Group to associate with the network interface.
"""
private_ip_address: pulumi.Output[str]
"""
The first private IP address of the network interface.
"""
private_ip_addresses: pulumi.Output[list]
"""
The private IP addresses of the network interface.
"""
resource_group_name: pulumi.Output[str]
"""
The name of the resource group in which to create the network interface. Changing this forces a new resource to be created.
"""
tags: pulumi.Output[dict]
"""
A mapping of tags to assign to the resource.
"""
virtual_machine_id: pulumi.Output[str]
"""
Reference to a VM with which this NIC has been associated.
"""
def __init__(__self__, resource_name, opts=None, applied_dns_servers=None, dns_servers=None, enable_accelerated_networking=None, enable_ip_forwarding=None, internal_dns_name_label=None, internal_fqdn=None, ip_configurations=None, location=None, mac_address=None, name=None, network_security_group_id=None, resource_group_name=None, tags=None, virtual_machine_id=None, __props__=None, __name__=None, __opts__=None):
"""
Manages a Network Interface located in a Virtual Network, usually attached to a Virtual Machine.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[list] applied_dns_servers: If the VM that uses this NIC is part of an Availability Set, then this list will have the union of all DNS servers from all NICs that are part of the Availability Set
:param pulumi.Input[list] dns_servers: List of DNS servers IP addresses to use for this NIC, overrides the VNet-level server list
:param pulumi.Input[bool] enable_accelerated_networking: Enables Azure Accelerated Networking using SR-IOV. Only certain VM instance sizes are supported. Refer to [Create a Virtual Machine with Accelerated Networking](https://docs.microsoft.com/en-us/azure/virtual-network/create-vm-accelerated-networking-cli). Defaults to `false`.
:param pulumi.Input[bool] enable_ip_forwarding: Enables IP Forwarding on the NIC. Defaults to `false`.
:param pulumi.Input[str] internal_dns_name_label: Relative DNS name for this NIC used for internal communications between VMs in the same VNet
:param pulumi.Input[list] ip_configurations: One or more `ip_configuration` associated with this NIC as documented below.
:param pulumi.Input[str] location: The location/region where the network interface is created. Changing this forces a new resource to be created.
:param pulumi.Input[str] mac_address: The media access control (MAC) address of the network interface.
:param pulumi.Input[str] name: The name of the network interface. Changing this forces a new resource to be created.
:param pulumi.Input[str] network_security_group_id: The ID of the Network Security Group to associate with the network interface.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the network interface. Changing this forces a new resource to be created.
:param pulumi.Input[dict] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] virtual_machine_id: Reference to a VM with which this NIC has been associated.
The **ip_configurations** object supports the following:
* `applicationGatewayBackendAddressPoolsIds` (`pulumi.Input[list]`)
* `applicationSecurityGroupIds` (`pulumi.Input[list]`)
* `loadBalancerBackendAddressPoolsIds` (`pulumi.Input[list]`)
* `loadBalancerInboundNatRulesIds` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the network interface. Changing this forces a new resource to be created.
* `primary` (`pulumi.Input[bool]`)
* `private_ip_address` (`pulumi.Input[str]`) - The first private IP address of the network interface.
* `privateIpAddressAllocation` (`pulumi.Input[str]`)
* `privateIpAddressVersion` (`pulumi.Input[str]`)
* `publicIpAddressId` (`pulumi.Input[str]`)
* `subnet_id` (`pulumi.Input[str]`)
> This content is derived from https://github.com/terraform-providers/terraform-provider-azurerm/blob/master/website/docs/r/network_interface.html.markdown.
"""
if __name__ is not None:
warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
resource_name = __name__
if __opts__ is not None:
warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
opts = __opts__
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = dict()
__props__['applied_dns_servers'] = applied_dns_servers
__props__['dns_servers'] = dns_servers
__props__['enable_accelerated_networking'] = enable_accelerated_networking
__props__['enable_ip_forwarding'] = enable_ip_forwarding
__props__['internal_dns_name_label'] = internal_dns_name_label
__props__['internal_fqdn'] = internal_fqdn
if ip_configurations is None:
raise TypeError("Missing required property 'ip_configurations'")
__props__['ip_configurations'] = ip_configurations
__props__['location'] = location
__props__['mac_address'] = mac_address
__props__['name'] = name
__props__['network_security_group_id'] = network_security_group_id
if resource_group_name is None:
raise TypeError("Missing required property 'resource_group_name'")
__props__['resource_group_name'] = resource_group_name
__props__['tags'] = tags
__props__['virtual_machine_id'] = virtual_machine_id
__props__['private_ip_address'] = None
__props__['private_ip_addresses'] = None
super(NetworkInterface, __self__).__init__(
'azure:network/networkInterface:NetworkInterface',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name, id, opts=None, applied_dns_servers=None, dns_servers=None, enable_accelerated_networking=None, enable_ip_forwarding=None, internal_dns_name_label=None, internal_fqdn=None, ip_configurations=None, location=None, mac_address=None, name=None, network_security_group_id=None, private_ip_address=None, private_ip_addresses=None, resource_group_name=None, tags=None, virtual_machine_id=None):
"""
Get an existing NetworkInterface resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param str id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[list] applied_dns_servers: If the VM that uses this NIC is part of an Availability Set, then this list will have the union of all DNS servers from all NICs that are part of the Availability Set
:param pulumi.Input[list] dns_servers: List of DNS servers IP addresses to use for this NIC, overrides the VNet-level server list
:param pulumi.Input[bool] enable_accelerated_networking: Enables Azure Accelerated Networking using SR-IOV. Only certain VM instance sizes are supported. Refer to [Create a Virtual Machine with Accelerated Networking](https://docs.microsoft.com/en-us/azure/virtual-network/create-vm-accelerated-networking-cli). Defaults to `false`.
:param pulumi.Input[bool] enable_ip_forwarding: Enables IP Forwarding on the NIC. Defaults to `false`.
:param pulumi.Input[str] internal_dns_name_label: Relative DNS name for this NIC used for internal communications between VMs in the same VNet
:param pulumi.Input[list] ip_configurations: One or more `ip_configuration` associated with this NIC as documented below.
:param pulumi.Input[str] location: The location/region where the network interface is created. Changing this forces a new resource to be created.
:param pulumi.Input[str] mac_address: The media access control (MAC) address of the network interface.
:param pulumi.Input[str] name: The name of the network interface. Changing this forces a new resource to be created.
:param pulumi.Input[str] network_security_group_id: The ID of the Network Security Group to associate with the network interface.
:param pulumi.Input[str] private_ip_address: The first private IP address of the network interface.
:param pulumi.Input[list] private_ip_addresses: The private IP addresses of the network interface.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the network interface. Changing this forces a new resource to be created.
:param pulumi.Input[dict] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] virtual_machine_id: Reference to a VM with which this NIC has been associated.
The **ip_configurations** object supports the following:
* `applicationGatewayBackendAddressPoolsIds` (`pulumi.Input[list]`)
* `applicationSecurityGroupIds` (`pulumi.Input[list]`)
* `loadBalancerBackendAddressPoolsIds` (`pulumi.Input[list]`)
* `loadBalancerInboundNatRulesIds` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the network interface. Changing this forces a new resource to be created.
* `primary` (`pulumi.Input[bool]`)
* `private_ip_address` (`pulumi.Input[str]`) - The first private IP address of the network interface.
* `privateIpAddressAllocation` (`pulumi.Input[str]`)
* `privateIpAddressVersion` (`pulumi.Input[str]`)
* `publicIpAddressId` (`pulumi.Input[str]`)
* `subnet_id` (`pulumi.Input[str]`)
> This content is derived from https://github.com/terraform-providers/terraform-provider-azurerm/blob/master/website/docs/r/network_interface.html.markdown.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = dict()
__props__["applied_dns_servers"] = applied_dns_servers
__props__["dns_servers"] = dns_servers
__props__["enable_accelerated_networking"] = enable_accelerated_networking
__props__["enable_ip_forwarding"] = enable_ip_forwarding
__props__["internal_dns_name_label"] = internal_dns_name_label
__props__["internal_fqdn"] = internal_fqdn
__props__["ip_configurations"] = ip_configurations
__props__["location"] = location
__props__["mac_address"] = mac_address
__props__["name"] = name
__props__["network_security_group_id"] = network_security_group_id
__props__["private_ip_address"] = private_ip_address
__props__["private_ip_addresses"] = private_ip_addresses
__props__["resource_group_name"] = resource_group_name
__props__["tags"] = tags
__props__["virtual_machine_id"] = virtual_machine_id
return NetworkInterface(resource_name, opts=opts, __props__=__props__)
def translate_output_property(self, prop):
return tables._CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
def translate_input_property(self, prop):
return tables._SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
| 63.551724 | 418 | 0.711205 | 1,871 | 14,744 | 5.361839 | 0.12186 | 0.054825 | 0.044657 | 0.033493 | 0.789673 | 0.764952 | 0.758174 | 0.749601 | 0.73126 | 0.719099 | 0 | 0.000085 | 0.205779 | 14,744 | 231 | 419 | 63.82684 | 0.856618 | 0.443909 | 0 | 0.022222 | 1 | 0 | 0.16965 | 0.041387 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044444 | false | 0.011111 | 0.066667 | 0.022222 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
76c870787c20eefe3c11a739a12f1c2fc004a46b | 277 | py | Python | data_structures/hash_table/conftest.py | ShannonTully/data-structures-and-algorithms | f43f24f9f60beb37a2717f387c1050c9fd1ece60 | [
"MIT"
] | null | null | null | data_structures/hash_table/conftest.py | ShannonTully/data-structures-and-algorithms | f43f24f9f60beb37a2717f387c1050c9fd1ece60 | [
"MIT"
] | null | null | null | data_structures/hash_table/conftest.py | ShannonTully/data-structures-and-algorithms | f43f24f9f60beb37a2717f387c1050c9fd1ece60 | [
"MIT"
] | null | null | null | """conftest for Hash Table tests."""
from .hash_table import HashTable
import pytest
@pytest.fixture
def empty_hash_table():
"""Empty hash table."""
return HashTable()
@pytest.fixture
def basic_hash_table():
"""Not empty hash table."""
return HashTable(1)
| 16.294118 | 36 | 0.693141 | 36 | 277 | 5.194444 | 0.444444 | 0.28877 | 0.224599 | 0.213904 | 0.31016 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004386 | 0.176895 | 277 | 16 | 37 | 17.3125 | 0.815789 | 0.252708 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
4f638f15f4d8c53153616c2dd141aa0dd0ccb886 | 29 | py | Python | hydra/gen/__init__.py | JimBoonie/hydra | 63665090812e4e209c67d5dc0b84b5bb35a57ead | [
"MIT"
] | 28 | 2015-12-30T22:38:16.000Z | 2021-03-21T07:52:39.000Z | hydra/gen/__init__.py | JimBoonie/hydra | 63665090812e4e209c67d5dc0b84b5bb35a57ead | [
"MIT"
] | 2 | 2017-02-23T09:54:09.000Z | 2018-12-14T12:20:28.000Z | hydra/gen/__init__.py | JimBoonie/hydra | 63665090812e4e209c67d5dc0b84b5bb35a57ead | [
"MIT"
] | 7 | 2017-02-23T09:43:24.000Z | 2022-01-10T12:17:36.000Z | from .devebec import devebec
| 14.5 | 28 | 0.827586 | 4 | 29 | 6 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.