hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
e188e75485c3ee41664d84bc3869842cd21e813d | 75 | py | Python | examples/customizations/new_builder/hello/__init__.py | cournape/Bento | 37de23d784407a7c98a4a15770ffc570d5f32d70 | [
"BSD-3-Clause"
] | 55 | 2015-01-20T21:12:52.000Z | 2021-11-23T12:29:32.000Z | examples/simples/single_extension_waf/hello/__init__.py | esc/Bento | 5e13318c0a74e956f9e80fa7617fb31ffc356088 | [
"BSD-3-Clause"
] | 6 | 2015-01-16T07:01:29.000Z | 2021-08-19T20:00:17.000Z | examples/simples/single_extension_waf/hello/__init__.py | esc/Bento | 5e13318c0a74e956f9e80fa7617fb31ffc356088 | [
"BSD-3-Clause"
] | 6 | 2015-08-12T18:11:47.000Z | 2019-01-05T08:36:05.000Z | from hello.bar import \
foo
from hello._bar import \
hello
| 15 | 24 | 0.6 | 10 | 75 | 4.4 | 0.5 | 0.409091 | 0.545455 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.346667 | 75 | 4 | 25 | 18.75 | 0.897959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
e1aa17f43e9cf675d09da8ec45d93d43b74a863f | 135 | py | Python | app/auth/views.py | carlsplace/learn-flask | 268a49806c1d42d7038ac1c788058bca7a90246d | [
"MIT"
] | null | null | null | app/auth/views.py | carlsplace/learn-flask | 268a49806c1d42d7038ac1c788058bca7a90246d | [
"MIT"
] | null | null | null | app/auth/views.py | carlsplace/learn-flask | 268a49806c1d42d7038ac1c788058bca7a90246d | [
"MIT"
] | null | null | null | from flask import render_template
from . import auth
@auth.route('/login')
def login():
return render_template('auth/login.html')
| 19.285714 | 45 | 0.740741 | 19 | 135 | 5.157895 | 0.578947 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 135 | 6 | 46 | 22.5 | 0.837607 | 0 | 0 | 0 | 0 | 0 | 0.155556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
bedc90f2a187c9e165cff2fb9a6e45ddb7afd5e6 | 35,037 | py | Python | test/probe/test_object_metadata_replication.py | OyTao/swift-learning | 09fa9dddd72f4aeebd2576c517f3b4d7988a7fa1 | [
"Apache-2.0"
] | null | null | null | test/probe/test_object_metadata_replication.py | OyTao/swift-learning | 09fa9dddd72f4aeebd2576c517f3b4d7988a7fa1 | [
"Apache-2.0"
] | null | null | null | test/probe/test_object_metadata_replication.py | OyTao/swift-learning | 09fa9dddd72f4aeebd2576c517f3b4d7988a7fa1 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python -u
# Copyright (c) 2010-2012 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from io import StringIO
import unittest
import os
import uuid
from swift.common.direct_client import direct_get_suffix_hashes
from swift.common.exceptions import DiskFileDeleted
from swift.common.internal_client import UnexpectedResponse
from swift.container.backend import ContainerBroker
from swift.common import utils
from swiftclient import client
from swift.common.ring import Ring
from swift.common.utils import Timestamp, get_logger, hash_path
from swift.obj.diskfile import DiskFileManager
from swift.common.storage_policy import POLICIES
from test.probe.brain import BrainSplitter
from test.probe.common import ReplProbeTest
class Test(ReplProbeTest):
def setUp(self):
"""
Reset all environment and start all servers.
"""
super(Test, self).setUp()
self.container_name = 'container-%s' % uuid.uuid4()
self.object_name = 'object-%s' % uuid.uuid4()
self.brain = BrainSplitter(self.url, self.token, self.container_name,
self.object_name, 'object',
policy=self.policy)
self.container_brain = BrainSplitter(self.url, self.token,
self.container_name)
self.int_client = self.make_internal_client(object_post_as_copy=False)
def _get_object_info(self, account, container, obj, number):
obj_conf = self.configs['object-server']
config_path = obj_conf[number]
options = utils.readconf(config_path, 'app:object-server')
swift_dir = options.get('swift_dir', '/etc/swift')
ring = POLICIES.get_object_ring(int(self.policy), swift_dir)
part, nodes = ring.get_nodes(account, container, obj)
for node in nodes:
# assumes one to one mapping
if node['port'] == int(options.get('bind_port')):
device = node['device']
break
else:
return None
mgr = DiskFileManager(options, get_logger(options))
disk_file = mgr.get_diskfile(device, part, account, container, obj,
self.policy)
info = disk_file.read_metadata()
return info
def _assert_consistent_object_metadata(self):
obj_info = []
for i in range(1, 5):
info_i = self._get_object_info(self.account, self.container_name,
self.object_name, i)
if info_i:
obj_info.append(info_i)
self.assertGreater(len(obj_info), 1)
for other in obj_info[1:]:
self.assertDictEqual(obj_info[0], other)
def _assert_consistent_deleted_object(self):
for i in range(1, 5):
try:
info = self._get_object_info(self.account, self.container_name,
self.object_name, i)
if info is not None:
self.fail('Expected no disk file info but found %s' % info)
except DiskFileDeleted:
pass
def _get_db_info(self, account, container, number):
server_type = 'container'
obj_conf = self.configs['%s-server' % server_type]
config_path = obj_conf[number]
options = utils.readconf(config_path, 'app:container-server')
root = options.get('devices')
swift_dir = options.get('swift_dir', '/etc/swift')
ring = Ring(swift_dir, ring_name=server_type)
part, nodes = ring.get_nodes(account, container)
for node in nodes:
# assumes one to one mapping
if node['port'] == int(options.get('bind_port')):
device = node['device']
break
else:
return None
path_hash = utils.hash_path(account, container)
_dir = utils.storage_directory('%ss' % server_type, part, path_hash)
db_dir = os.path.join(root, device, _dir)
db_file = os.path.join(db_dir, '%s.db' % path_hash)
db = ContainerBroker(db_file)
return db.get_info()
def _assert_consistent_container_dbs(self):
db_info = []
for i in range(1, 5):
info_i = self._get_db_info(self.account, self.container_name, i)
if info_i:
db_info.append(info_i)
self.assertGreater(len(db_info), 1)
for other in db_info[1:]:
self.assertEqual(db_info[0]['hash'], other['hash'],
'Container db hash mismatch: %s != %s'
% (db_info[0]['hash'], other['hash']))
def _assert_object_metadata_matches_listing(self, listing, metadata):
self.assertEqual(listing['bytes'], int(metadata['content-length']))
self.assertEqual(listing['hash'], metadata['etag'])
self.assertEqual(listing['content_type'], metadata['content-type'])
modified = Timestamp(metadata['x-timestamp']).isoformat
self.assertEqual(listing['last_modified'], modified)
def _put_object(self, headers=None, body=u'stuff'):
headers = headers or {}
self.int_client.upload_object(StringIO(body), self.account,
self.container_name,
self.object_name, headers)
def _post_object(self, headers):
self.int_client.set_object_metadata(self.account, self.container_name,
self.object_name, headers)
def _delete_object(self):
self.int_client.delete_object(self.account, self.container_name,
self.object_name)
def _get_object(self, headers=None, expect_statuses=(2,)):
return self.int_client.get_object(self.account,
self.container_name,
self.object_name,
headers,
acceptable_statuses=expect_statuses)
def _get_object_metadata(self):
return self.int_client.get_object_metadata(self.account,
self.container_name,
self.object_name)
def _assert_consistent_suffix_hashes(self):
opart, onodes = self.object_ring.get_nodes(
self.account, self.container_name, self.object_name)
name_hash = hash_path(
self.account, self.container_name, self.object_name)
results = []
for node in onodes:
results.append(
(node,
direct_get_suffix_hashes(node, opart, [name_hash[-3:]])))
for (node, hashes) in results[1:]:
self.assertEqual(results[0][1], hashes,
'Inconsistent suffix hashes found: %s' % results)
def test_object_delete_is_replicated(self):
self.brain.put_container(policy_index=int(self.policy))
# put object
self._put_object()
# put newer object with sysmeta to first server subset
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object()
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# delete object on second server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._delete_object()
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# run replicator
self.get_to_final_state()
# check object deletion has been replicated on first server set
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._get_object(expect_statuses=(4,))
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# check object deletion persists on second server set
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._get_object(expect_statuses=(4,))
# put newer object to second server set
self._put_object()
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# run replicator
self.get_to_final_state()
# check new object has been replicated on first server set
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._get_object()
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# check new object persists on second server set
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._get_object()
def test_object_after_replication_with_subsequent_post(self):
self.brain.put_container(policy_index=0)
# put object
self._put_object(headers={'Content-Type': 'foo'}, body=u'older')
# put newer object to first server subset
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'bar'}, body=u'newer')
metadata = self._get_object_metadata()
etag = metadata['etag']
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# post some user meta to all servers
self._post_object({'x-object-meta-bar': 'meta-bar'})
# run replicator
self.get_to_final_state()
# check that newer data has been replicated to second server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
metadata = self._get_object_metadata()
self.assertEqual(etag, metadata['etag'])
self.assertEqual('bar', metadata['content-type'])
self.assertEqual('meta-bar', metadata['x-object-meta-bar'])
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
self._assert_consistent_object_metadata()
self._assert_consistent_container_dbs()
self._assert_consistent_suffix_hashes()
def test_sysmeta_after_replication_with_subsequent_put(self):
sysmeta = {'x-object-sysmeta-foo': 'older'}
sysmeta2 = {'x-object-sysmeta-foo': 'newer'}
usermeta = {'x-object-meta-bar': 'meta-bar'}
self.brain.put_container(policy_index=0)
# put object with sysmeta to first server subset
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers=sysmeta)
metadata = self._get_object_metadata()
for key in sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta[key])
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# put object with updated sysmeta to second server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers=sysmeta2)
metadata = self._get_object_metadata()
for key in sysmeta2:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta2[key])
self._post_object(usermeta)
metadata = self._get_object_metadata()
for key in usermeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], usermeta[key])
for key in sysmeta2:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta2[key])
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# run replicator
self.get_to_final_state()
# check sysmeta has been replicated to first server subset
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
metadata = self._get_object_metadata()
for key in usermeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], usermeta[key])
for key in sysmeta2.keys():
self.assertIn(key, metadata, key)
self.assertEqual(metadata[key], sysmeta2[key])
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# check user sysmeta ok on second server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
metadata = self._get_object_metadata()
for key in usermeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], usermeta[key])
for key in sysmeta2.keys():
self.assertIn(key, metadata, key)
self.assertEqual(metadata[key], sysmeta2[key])
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
self._assert_consistent_object_metadata()
self._assert_consistent_container_dbs()
self._assert_consistent_suffix_hashes()
def test_sysmeta_after_replication_with_subsequent_post(self):
sysmeta = {'x-object-sysmeta-foo': 'sysmeta-foo'}
usermeta = {'x-object-meta-bar': 'meta-bar'}
transient_sysmeta = {
'x-object-transient-sysmeta-bar': 'transient-sysmeta-bar'}
self.brain.put_container(policy_index=int(self.policy))
# put object
self._put_object()
# put newer object with sysmeta to first server subset
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers=sysmeta)
metadata = self._get_object_metadata()
for key in sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta[key])
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# post some user meta to second server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
user_and_transient_sysmeta = dict(usermeta)
user_and_transient_sysmeta.update(transient_sysmeta)
self._post_object(user_and_transient_sysmeta)
metadata = self._get_object_metadata()
for key in user_and_transient_sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], user_and_transient_sysmeta[key])
for key in sysmeta:
self.assertNotIn(key, metadata)
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# run replicator
self.get_to_final_state()
# check user metadata has been replicated to first server subset
# and sysmeta is unchanged
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
metadata = self._get_object_metadata()
expected = dict(sysmeta)
expected.update(usermeta)
expected.update(transient_sysmeta)
for key in expected.keys():
self.assertIn(key, metadata, key)
self.assertEqual(metadata[key], expected[key])
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# check user metadata and sysmeta both on second server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
metadata = self._get_object_metadata()
for key in expected.keys():
self.assertIn(key, metadata, key)
self.assertEqual(metadata[key], expected[key])
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
self._assert_consistent_object_metadata()
self._assert_consistent_container_dbs()
self._assert_consistent_suffix_hashes()
def test_sysmeta_after_replication_with_prior_post(self):
sysmeta = {'x-object-sysmeta-foo': 'sysmeta-foo'}
usermeta = {'x-object-meta-bar': 'meta-bar'}
transient_sysmeta = {
'x-object-transient-sysmeta-bar': 'transient-sysmeta-bar'}
self.brain.put_container(policy_index=int(self.policy))
# put object
self._put_object()
# put user meta to first server subset
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
user_and_transient_sysmeta = dict(usermeta)
user_and_transient_sysmeta.update(transient_sysmeta)
self._post_object(user_and_transient_sysmeta)
metadata = self._get_object_metadata()
for key in user_and_transient_sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], user_and_transient_sysmeta[key])
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# put newer object with sysmeta to second server subset
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers=sysmeta)
metadata = self._get_object_metadata()
for key in sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta[key])
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# run replicator
self.get_to_final_state()
# check stale user metadata is not replicated to first server subset
# and sysmeta is unchanged
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
metadata = self._get_object_metadata()
for key in sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta[key])
for key in user_and_transient_sysmeta:
self.assertNotIn(key, metadata)
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# check stale user metadata is removed from second server subset
# and sysmeta is replicated
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
metadata = self._get_object_metadata()
for key in sysmeta:
self.assertIn(key, metadata)
self.assertEqual(metadata[key], sysmeta[key])
for key in user_and_transient_sysmeta:
self.assertNotIn(key, metadata)
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
self._assert_consistent_object_metadata()
self._assert_consistent_container_dbs()
self._assert_consistent_suffix_hashes()
def test_post_ctype_replicated_when_previous_incomplete_puts(self):
# primary half handoff half
# ------------ ------------
# t0.data: ctype = foo
# t1.data: ctype = bar
# t2.meta: ctype = baz
#
# ...run replicator and expect...
#
# t1.data:
# t2.meta: ctype = baz
self.brain.put_container(policy_index=0)
# incomplete write to primary half
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers={'Content-Type': 'foo'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# handoff write
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'bar'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# content-type update to primary half
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._post_object(headers={'Content-Type': 'baz'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
self.get_to_final_state()
# check object metadata
metadata = client.head_object(self.url, self.token,
self.container_name,
self.object_name)
# check container listing metadata
container_metadata, objs = client.get_container(self.url, self.token,
self.container_name)
        for obj in objs:
            if obj['name'] == self.object_name:
                break
        else:
            self.fail('obj not found in container listing')
expected = 'baz'
self.assertEqual(obj['content_type'], expected)
self._assert_object_metadata_matches_listing(obj, metadata)
self._assert_consistent_container_dbs()
self._assert_consistent_object_metadata()
self._assert_consistent_suffix_hashes()
def test_put_ctype_replicated_when_subsequent_post(self):
# primary half handoff half
# ------------ ------------
# t0.data: ctype = foo
# t1.data: ctype = bar
# t2.meta:
#
# ...run replicator and expect...
#
# t1.data: ctype = bar
# t2.meta:
self.brain.put_container(policy_index=0)
# incomplete write
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers={'Content-Type': 'foo'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# handoff write
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'bar'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# metadata update with newest data unavailable
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._post_object(headers={'X-Object-Meta-Color': 'Blue'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
self.get_to_final_state()
# check object metadata
metadata = client.head_object(self.url, self.token,
self.container_name,
self.object_name)
# check container listing metadata
container_metadata, objs = client.get_container(self.url, self.token,
self.container_name)
for obj in objs:
if obj['name'] == self.object_name:
break
else:
self.fail('obj not found in container listing')
expected = 'bar'
self.assertEqual(obj['content_type'], expected)
self.assertEqual(metadata['x-object-meta-color'], 'Blue')
self._assert_object_metadata_matches_listing(obj, metadata)
self._assert_consistent_container_dbs()
self._assert_consistent_object_metadata()
self._assert_consistent_suffix_hashes()
def test_post_ctype_replicated_when_subsequent_post_without_ctype(self):
# primary half handoff half
# ------------ ------------
# t0.data: ctype = foo
# t1.data: ctype = bar
# t2.meta: ctype = bif
# t3.data: ctype = baz, color = 'Red'
# t4.meta: color = Blue
#
# ...run replicator and expect...
#
# t1.data:
# t4-delta.meta: ctype = baz, color = Blue
self.brain.put_container(policy_index=0)
# incomplete write
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers={'Content-Type': 'foo',
'X-Object-Sysmeta-Test': 'older'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# handoff write
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'bar',
'X-Object-Sysmeta-Test': 'newer'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# incomplete post with content type
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._post_object(headers={'Content-Type': 'bif'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# incomplete post to handoff with content type
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._post_object(headers={'Content-Type': 'baz',
'X-Object-Meta-Color': 'Red'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# complete post with no content type
self._post_object(headers={'X-Object-Meta-Color': 'Blue',
'X-Object-Sysmeta-Test': 'ignored'})
# 'baz' wins over 'bar' but 'Blue' wins over 'Red'
self.get_to_final_state()
# check object metadata
metadata = self._get_object_metadata()
# check container listing metadata
container_metadata, objs = client.get_container(self.url, self.token,
self.container_name)
        for obj in objs:
            if obj['name'] == self.object_name:
                break
        else:
            self.fail('obj not found in container listing')
expected = 'baz'
self.assertEqual(obj['content_type'], expected)
self.assertEqual(metadata['x-object-meta-color'], 'Blue')
self.assertEqual(metadata['x-object-sysmeta-test'], 'newer')
self._assert_object_metadata_matches_listing(obj, metadata)
self._assert_consistent_container_dbs()
self._assert_consistent_object_metadata()
self._assert_consistent_suffix_hashes()
def test_put_ctype_replicated_when_subsequent_posts_without_ctype(self):
# primary half handoff half
# ------------ ------------
# t0.data: ctype = foo
# t1.data: ctype = bar
# t2.meta:
# t3.meta
#
# ...run replicator and expect...
#
# t1.data: ctype = bar
# t3.meta
self.brain.put_container(policy_index=0)
self._put_object(headers={'Content-Type': 'foo',
'X-Object-Sysmeta-Test': 'older'})
# incomplete write to handoff half
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'bar',
'X-Object-Sysmeta-Test': 'newer'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# incomplete post with no content type to primary half
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._post_object(headers={'X-Object-Meta-Color': 'Red',
'X-Object-Sysmeta-Test': 'ignored'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# incomplete post with no content type to handoff half
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._post_object(headers={'X-Object-Meta-Color': 'Blue'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
self.get_to_final_state()
# check object metadata
metadata = self._get_object_metadata()
# check container listing metadata
container_metadata, objs = client.get_container(self.url, self.token,
self.container_name)
        for obj in objs:
            if obj['name'] == self.object_name:
                break
        else:
            self.fail('obj not found in container listing')
expected = 'bar'
self.assertEqual(obj['content_type'], expected)
self.assertEqual(metadata['x-object-meta-color'], 'Blue')
self.assertEqual(metadata['x-object-sysmeta-test'], 'newer')
self._assert_object_metadata_matches_listing(obj, metadata)
self._assert_consistent_container_dbs()
self._assert_consistent_object_metadata()
self._assert_consistent_suffix_hashes()
def test_posted_metadata_only_persists_after_prior_put(self):
# newer metadata posted to subset of nodes should persist after an
# earlier put on other nodes, but older content-type on that subset
# should not persist
self.brain.put_container(policy_index=0)
# incomplete put to handoff
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'oldest',
'X-Object-Sysmeta-Test': 'oldest',
'X-Object-Meta-Test': 'oldest'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# incomplete put to primary
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers={'Content-Type': 'oldest',
'X-Object-Sysmeta-Test': 'oldest',
'X-Object-Meta-Test': 'oldest'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# incomplete post with content-type to handoff
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._post_object(headers={'Content-Type': 'newer',
'X-Object-Meta-Test': 'newer'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# incomplete put to primary
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers={'Content-Type': 'newest',
'X-Object-Sysmeta-Test': 'newest',
'X-Object-Meta-Test': 'newer'})
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# incomplete post with no content-type to handoff which still has
# out of date content-type
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._post_object(headers={'X-Object-Meta-Test': 'newest'})
metadata = self._get_object_metadata()
self.assertEqual(metadata['x-object-meta-test'], 'newest')
self.assertEqual(metadata['content-type'], 'newer')
self.brain.start_primary_half()
self.container_brain.start_primary_half()
self.get_to_final_state()
# check object metadata
metadata = self._get_object_metadata()
self.assertEqual(metadata['x-object-meta-test'], 'newest')
self.assertEqual(metadata['x-object-sysmeta-test'], 'newest')
self.assertEqual(metadata['content-type'], 'newest')
# check container listing metadata
container_metadata, objs = client.get_container(self.url, self.token,
self.container_name)
for obj in objs:
if obj['name'] == self.object_name:
break
self.assertEqual(obj['content_type'], 'newest')
        self._assert_object_metadata_matches_listing(obj, metadata)
self._assert_consistent_container_dbs()
self._assert_consistent_object_metadata()
self._assert_consistent_suffix_hashes()
def test_post_trumped_by_prior_delete(self):
# new metadata and content-type posted to subset of nodes should not
# cause object to persist after replication of an earlier delete on
# other nodes.
self.brain.put_container(policy_index=0)
# incomplete put
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._put_object(headers={'Content-Type': 'oldest',
'X-Object-Sysmeta-Test': 'oldest',
'X-Object-Meta-Test': 'oldest'})
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# incomplete put then delete
self.brain.stop_handoff_half()
self.container_brain.stop_handoff_half()
self._put_object(headers={'Content-Type': 'oldest',
'X-Object-Sysmeta-Test': 'oldest',
'X-Object-Meta-Test': 'oldest'})
self._delete_object()
self.brain.start_handoff_half()
self.container_brain.start_handoff_half()
# handoff post
self.brain.stop_primary_half()
self.container_brain.stop_primary_half()
self._post_object(headers={'Content-Type': 'newest',
'X-Object-Sysmeta-Test': 'ignored',
'X-Object-Meta-Test': 'newest'})
# check object metadata
metadata = self._get_object_metadata()
self.assertEqual(metadata['x-object-sysmeta-test'], 'oldest')
self.assertEqual(metadata['x-object-meta-test'], 'newest')
self.assertEqual(metadata['content-type'], 'newest')
self.brain.start_primary_half()
self.container_brain.start_primary_half()
# delete trumps later post
self.get_to_final_state()
# check object is now deleted
self.assertRaises(UnexpectedResponse, self._get_object_metadata)
container_metadata, objs = client.get_container(self.url, self.token,
self.container_name)
self.assertEqual(0, len(objs))
self._assert_consistent_container_dbs()
self._assert_consistent_deleted_object()
self._assert_consistent_suffix_hashes()
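The probe tests above all assert the same resolution rule: after replication, each object attribute (user metadata, sysmeta, content-type) independently converges to the value carried by the newest timestamp, and a tombstone newer than the data file removes the object. A minimal illustrative sketch of that last-writer-wins rule (this is not Swift's actual reconciliation code, just the semantics the assertions check):

```python
def resolve_newest(updates):
    """Pick the value carried by the newest timestamp.

    `updates` is a list of (timestamp, value) pairs. Swift applies this
    rule per attribute: user metadata, sysmeta and content-type each
    resolve independently by their own timestamps.
    """
    return max(updates, key=lambda tv: tv[0])[1]


# A content-type posted at t=3 beats the t=1 PUT but loses to a t=4 PUT:
assert resolve_newest([(1, 'oldest'), (3, 'newer'), (4, 'newest')]) == 'newest'
```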
if __name__ == "__main__":
unittest.main()
| 41.661118 | 79 | 0.616577 | 3,975 | 35,037 | 5.159748 | 0.07673 | 0.049147 | 0.071965 | 0.086884 | 0.80312 | 0.776207 | 0.757923 | 0.740761 | 0.722477 | 0.692296 | 0 | 0.003045 | 0.28761 | 35,037 | 840 | 80 | 41.710714 | 0.81867 | 0.135714 | 0 | 0.744068 | 0 | 0 | 0.071412 | 0.014541 | 0 | 0 | 0 | 0 | 0.184746 | 1 | 0.040678 | false | 0.001695 | 0.027119 | 0.00339 | 0.079661 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
836a1cca030e9e54fa8e9c66f8b697b8c77b3587 | 81 | py | Python | respite/urls/__init__.py | altayaydemir/bilgi-shuttle-api | da5cf850816c11c6e09ed5d7c5ce414483ef8131 | [
"Apache-2.0"
] | 3 | 2016-01-26T00:22:34.000Z | 2016-01-26T12:22:27.000Z | respite/urls/__init__.py | altayaydemir/bilgi-shuttle-api | da5cf850816c11c6e09ed5d7c5ce414483ef8131 | [
"Apache-2.0"
] | 5 | 2016-01-11T19:03:42.000Z | 2021-08-14T15:34:23.000Z | respite/urls/__init__.py | altayaydemir/bilgi-shuttle-api | da5cf850816c11c6e09ed5d7c5ce414483ef8131 | [
"Apache-2.0"
] | 2 | 2017-03-14T20:24:15.000Z | 2017-03-21T09:13:54.000Z | from respite.urls.routes import route
from respite.urls.resource import resource
# --- PyDigitsConverison/UnitTests/DigitUnitTests.py (repo: ashyrokoriadov/PyDigitsConversion, license: MIT) ---
from decimal_digit import DecimalDigit
from octal_digit import OctalDigit
from binary_digit import BinaryDigit
from hexadecimal_digit import HexadecimalDigit
import unittest
class Test_DecimalDigit(unittest.TestCase):
def setUp(self):
self.decimal_digit = DecimalDigit()
def test_conversion_to_binary_only_integer_part(self):
self.decimal_digit.digit_value = '12'
binary_digit = self.decimal_digit.get_binary()
self.assertEqual(binary_digit, "1100.0", "10: 12 should be 2: 1100.0")
    def test_conversion_to_binary_integer_and_fractional_parts(self):
self.decimal_digit.digit_value = '12.18'
binary_digit = self.decimal_digit.get_binary()
self.assertEqual(binary_digit, "1100.0010111000", "10: 12.18 should be 2: 1100.0010111000")
def test_conversion_to_decimal_only_integer_part(self):
self.decimal_digit.digit_value = '12'
decimal_digit_new = self.decimal_digit.get_decimal()
self.assertEqual(decimal_digit_new, "12", "10: 12 should be 10: 12")
    def test_conversion_to_decimal_integer_and_fractional_parts(self):
self.decimal_digit.digit_value = '12.18'
decimal_digit_new = self.decimal_digit.get_decimal()
self.assertEqual(decimal_digit_new, "12.18", "10: 12.18 should be 10: 12.18")
def test_conversion_to_octal_only_integer_part(self):
self.decimal_digit.digit_value = '1234'
octal_digit = self.decimal_digit.get_octal()
self.assertEqual(octal_digit, "2322.0", "10: 1234 should be 8: 2322.0")
    def test_conversion_to_octal_integer_and_fractional_parts(self):
self.decimal_digit.digit_value = '1234.56'
octal_digit = self.decimal_digit.get_octal()
self.assertEqual(octal_digit, "2322.4365605075", "10: 1234.56 should be 8: 2322.4365605075")
def test_conversion_to_hexadecimal_only_integer_part(self):
self.decimal_digit.digit_value = '1234'
hexadecimal_digit = self.decimal_digit.get_hexadecimal()
        self.assertEqual(hexadecimal_digit, "4D2.0", "10: 1234 should be 16: 4D2.0")
    def test_conversion_to_hexadecimal_integer_and_fractional_parts(self):
self.decimal_digit.digit_value = '1234.56'
hexadecimal_digit = self.decimal_digit.get_hexadecimal()
self.assertEqual(hexadecimal_digit, "4D2.8F5C28F5C2", "10: 1234.56 should be 16: 4D2.8F5C28F5C2")
def test_passed_value_is_null_hexadecimal(self):
self.decimal_digit.digit_value = None
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_hexadecimal()
def test_passed_value_is_null_binary(self):
self.decimal_digit.digit_value = None
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_binary()
def test_passed_value_is_null_decimal(self):
self.decimal_digit.digit_value = None
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_decimal()
def test_passed_value_is_null_octal(self):
self.decimal_digit.digit_value = None
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_octal()
def test_passed_value_is_empty_hexadecimal(self):
self.decimal_digit.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_hexadecimal()
def test_passed_value_is_empty_binary(self):
self.decimal_digit.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_binary()
def test_passed_value_is_empty_decimal(self):
self.decimal_digit.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_decimal()
def test_passed_value_is_empty_octal(self):
self.decimal_digit.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_octal()
def test_passed_value_is_nan_hexadecimal(self):
self.decimal_digit.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_hexadecimal()
def test_passed_value_is_nan_binary(self):
self.decimal_digit.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_binary()
def test_passed_value_is_nan_decimal(self):
self.decimal_digit.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_decimal()
def test_passed_value_is_nan_octal(self):
self.decimal_digit.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_octal()
def test_passed_value_is_whitespace_hexadecimal(self):
self.decimal_digit.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_hexadecimal()
def test_passed_value_is_whitespace_binary(self):
self.decimal_digit.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_binary()
def test_passed_value_is_whitespace_decimal(self):
self.decimal_digit.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_decimal()
def test_passed_value_is_whitespace_octal(self):
self.decimal_digit.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.decimal_digit.get_octal()
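The fractional expectations above (e.g. 10: 12.18 -> 2: 1100.0010111000) follow from the standard repeated-doubling algorithm, truncated to a fixed number of digits. A small sketch (illustrative only, not the DecimalDigit implementation under test):

```python
def decimal_fraction_to_binary(fraction, digits=10):
    """Convert a decimal fraction in [0, 1) to `digits` binary digits.

    Repeated doubling: on each pass, the integer carry of fraction * 2
    is the next binary digit; digits beyond `digits` are truncated.
    """
    bits = []
    for _ in range(digits):
        fraction *= 2
        bit = int(fraction)
        bits.append(str(bit))
        fraction -= bit
    return "".join(bits)


# 0.18 -> "0010111000", matching the "12.18 -> 1100.0010111000" expectation.
assert decimal_fraction_to_binary(0.18) == "0010111000"
```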
class Test_BinaryDigit(unittest.TestCase):
def setUp(self):
self.binary_digit_integer = BinaryDigit()
self.binary_digit_integer.digit_value = '111010011'
self.binary_digit_integer_and_fraction = BinaryDigit()
self.binary_digit_integer_and_fraction.digit_value = '111010011.11100011110'
def test_conversion_to_binary_only_integer_part(self):
binary_digit = self.binary_digit_integer.get_binary()
self.assertEqual(binary_digit, "111010011", "2: 111010011 should be 2: 111010011")
    def test_conversion_to_binary_integer_and_fractional_parts(self):
binary_digit = self.binary_digit_integer_and_fraction.get_binary()
self.assertEqual(binary_digit, "111010011.11100011110", "2: 111010011.11100011110 should be 2: 111010011.11100011110")
def test_conversion_to_decimal_only_integer_part(self):
decimal_digit = self.binary_digit_integer.get_decimal()
self.assertEqual(decimal_digit, "467.0", "2: 111010011.0 should be 10: 467.0")
    def test_conversion_to_decimal_integer_and_fractional_parts(self):
decimal_digit_new = self.binary_digit_integer_and_fraction.get_decimal()
self.assertEqual(decimal_digit_new, "467.8896484375", "2: 111010011.11100011110 should be 10: 467.8896484375")
def test_conversion_to_octal_only_integer_part(self):
octal_digit = self.binary_digit_integer.get_octal()
self.assertEqual(octal_digit, "723.0", "2: 111010011 should be 8: 723.0")
    def test_conversion_to_octal_integer_and_fractional_parts(self):
octal_digit = self.binary_digit_integer_and_fraction.get_octal()
self.assertEqual(octal_digit, "723.7074", "2: 111010011.11100011110 should be 8: 723.7074")
def test_conversion_to_hexadecimal_only_integer_part(self):
hexadecimal_digit = self.binary_digit_integer.get_hexadecimal()
        self.assertEqual(hexadecimal_digit, "1D3.0", "2: 111010011 should be 16: 1D3.0")
    def test_conversion_to_hexadecimal_integer_and_fractional_parts(self):
hexadecimal_digit = self.binary_digit_integer_and_fraction.get_hexadecimal()
self.assertEqual(hexadecimal_digit, "1D3.E3C", "2: 111010011.11100011110 should be 16: 1D3.E3C")
def test_passed_value_is_null_hexadecimal(self):
self.binary_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_hexadecimal()
def test_passed_value_is_null_binary(self):
self.binary_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_binary()
def test_passed_value_is_null_decimal(self):
self.binary_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_decimal()
def test_passed_value_is_null_octal(self):
self.binary_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_octal()
def test_passed_value_is_empty_hexadecimal(self):
self.binary_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_hexadecimal()
def test_passed_value_is_empty_binary(self):
self.binary_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_binary()
def test_passed_value_is_empty_decimal(self):
self.binary_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_decimal()
def test_passed_value_is_empty_octal(self):
self.binary_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_octal()
def test_passed_value_is_nan_hexadecimal(self):
self.binary_digit_integer.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_hexadecimal()
def test_passed_value_is_nan_binary(self):
self.binary_digit_integer.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_binary()
def test_passed_value_is_nan_decimal(self):
self.binary_digit_integer.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_decimal()
def test_passed_value_is_nan_octal(self):
self.binary_digit_integer.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_octal()
def test_passed_value_is_whitespace_hexadecimal(self):
self.binary_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_hexadecimal()
def test_passed_value_is_whitespace_binary(self):
self.binary_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_binary()
def test_passed_value_is_whitespace_decimal(self):
self.binary_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_decimal()
def test_passed_value_is_whitespace_octal(self):
self.binary_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.binary_digit_integer.get_octal()
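The binary-to-hexadecimal expectations above come from grouping bits four at a time from the radix point (octal works the same way with 3-bit groups). A sketch of the integer-part mapping (illustrative only, not the BinaryDigit implementation under test):

```python
def binary_integer_to_hex(bits):
    """Map a binary integer string to hexadecimal via 4-bit groups.

    Pad on the left so the length is a multiple of 4, then translate
    each nibble to its hex digit.
    """
    width = -(-len(bits) // 4) * 4  # round length up to a multiple of 4
    bits = bits.zfill(width)
    return "".join("0123456789ABCDEF"[int(bits[i:i + 4], 2)]
                   for i in range(0, len(bits), 4))


# "111010011" -> "1D3", matching the expectation asserted above.
assert binary_integer_to_hex("111010011") == "1D3"
```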
class Test_OctalDigit(unittest.TestCase):
def setUp(self):
self.digit_octal = OctalDigit()
self.digit_octal.digit_value='12475.0'
self.digit_octal_integer_and_fraction = OctalDigit()
self.digit_octal_integer_and_fraction.digit_value='12475.30712601014'
def test_conversion_to_binary_only_integer_part(self):
binary_digit = self.digit_octal.get_binary()
self.assertEqual(binary_digit, "1010100111101.0", "8: 12475 should be 2: 1010100111101.0")
    def test_conversion_to_binary_integer_and_fractional_parts(self):
binary_digit = self.digit_octal_integer_and_fraction.get_binary()
self.assertEqual(binary_digit, "1010100111101.0110001110", "8: 12475.30712601014 should be 2: 1010100111101.0110001110")
def test_conversion_to_decimal_only_integer_part(self):
decimal_digit_new = self.digit_octal.get_decimal()
self.assertEqual(decimal_digit_new, "5437.0", "8: 12475 should be 10: 5437.0")
    def test_conversion_to_decimal_integer_and_fractional_parts(self):
decimal_digit_new = self.digit_octal_integer_and_fraction.get_decimal()
self.assertEqual(decimal_digit_new, "5437.3889999999664724", "8: 12475.30712601014 should be 10: 5437.3889999999664724")
def test_conversion_to_octal_only_integer_part(self):
octal_digit = self.digit_octal.get_octal()
self.assertEqual(octal_digit, "12475.0", "8: 12475 should be 8: 12475.0")
    def test_conversion_to_octal_integer_and_fractional_parts(self):
octal_digit = self.digit_octal_integer_and_fraction.get_octal()
self.assertEqual(octal_digit, "12475.30712601014", "8: 12475.30712601014 should be 8: 12475.30712601014")
def test_conversion_to_hexadecimal_only_integer_part(self):
hexadecimal_digit = self.digit_octal.get_hexadecimal()
        self.assertEqual(hexadecimal_digit, "153D.0", "8: 12475 should be 16: 153D.0")
    def test_conversion_to_hexadecimal_integer_and_fractional_parts(self):
hexadecimal_digit = self.digit_octal_integer_and_fraction.get_hexadecimal()
self.assertEqual(hexadecimal_digit, "153D.6395810624", "8: 12475.30712601014 should be 16: 153D.6395810624")
def test_passed_value_is_null_hexadecimal(self):
self.digit_octal.digit_value = None
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_hexadecimal()
def test_passed_value_is_null_binary(self):
self.digit_octal.digit_value = None
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_binary()
def test_passed_value_is_null_decimal(self):
self.digit_octal.digit_value = None
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_decimal()
def test_passed_value_is_null_octal(self):
self.digit_octal.digit_value = None
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_octal()
def test_passed_value_is_empty_hexadecimal(self):
self.digit_octal.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_hexadecimal()
def test_passed_value_is_empty_binary(self):
self.digit_octal.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_binary()
def test_passed_value_is_empty_decimal(self):
self.digit_octal.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_decimal()
def test_passed_value_is_empty_octal(self):
self.digit_octal.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_octal()
def test_passed_value_is_nan_hexadecimal(self):
self.digit_octal.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_hexadecimal()
def test_passed_value_is_nan_binary(self):
self.digit_octal.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_binary()
def test_passed_value_is_nan_decimal(self):
self.digit_octal.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_decimal()
def test_passed_value_is_nan_octal(self):
self.digit_octal.digit_value = 'ABC'
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_octal()
def test_passed_value_is_whitespace_hexadecimal(self):
self.digit_octal.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_hexadecimal()
def test_passed_value_is_whitespace_binary(self):
self.digit_octal.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_binary()
def test_passed_value_is_whitespace_decimal(self):
self.digit_octal.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_decimal()
def test_passed_value_is_whitespace_octal(self):
self.digit_octal.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.digit_octal.get_octal()
class Test_HexadecimalDigit(unittest.TestCase):
def setUp(self):
self.hexadecimal_digit_integer = HexadecimalDigit()
self.hexadecimal_digit_integer.digit_value='271.0'
self.digit_integer_and_fraction = HexadecimalDigit()
self.digit_integer_and_fraction.digit_value='271.1C28'
def test_conversion_to_binary_only_integer_part(self):
binary_digit = self.hexadecimal_digit_integer.get_binary()
self.assertEqual(binary_digit, "1001110001.0", "16: 271.0 should be 2: 1001110001.0")
    def test_conversion_to_binary_integer_and_fractional_parts(self):
binary_digit = self.digit_integer_and_fraction.get_binary()
self.assertEqual(binary_digit, "1001110001.0001110000", "16: 271.1C28 should be 2: 1001110001.0001110000")
def test_conversion_to_decimal_only_integer_part(self):
decimal_digit_new = self.hexadecimal_digit_integer.get_decimal()
self.assertEqual(decimal_digit_new, "625.0", "16: 271.0 should be 10: 625.0")
    def test_conversion_to_decimal_integer_and_fractional_parts(self):
decimal_digit_new = self.digit_integer_and_fraction.get_decimal()
self.assertEqual(decimal_digit_new, "625.1099853515625", "16: 271.1C28 should be 10: 625.1099853515625")
def test_conversion_to_octal_only_integer_part(self):
octal_digit = self.hexadecimal_digit_integer.get_octal()
self.assertEqual(octal_digit, "1161.0", "16: 271.0 should be 8: 1161.0")
    def test_conversion_to_octal_integer_and_fractional_parts(self):
octal_digit = self.digit_integer_and_fraction.get_octal()
self.assertEqual(octal_digit, "1161.0702436560", "16: 271.1C28 should be 8: 1161.0702436560")
def test_conversion_to_hexadecimal_only_integer_part(self):
hexadecimal_digit = self.hexadecimal_digit_integer.get_hexadecimal()
        self.assertEqual(hexadecimal_digit, "271.0", "16: 271.0 should be 16: 271.0")
    def test_conversion_to_hexadecimal_integer_and_fractional_parts(self):
hexadecimal_digit = self.digit_integer_and_fraction.get_hexadecimal()
self.assertEqual(hexadecimal_digit, "271.1C28", "16: 271.1C28 should be 16: 271.1C28")
def test_passed_value_is_null_hexadecimal(self):
self.hexadecimal_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_hexadecimal()
def test_passed_value_is_null_binary(self):
self.hexadecimal_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_binary()
def test_passed_value_is_null_decimal(self):
self.hexadecimal_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_decimal()
def test_passed_value_is_null_octal(self):
self.hexadecimal_digit_integer.digit_value = None
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_octal()
def test_passed_value_is_empty_hexadecimal(self):
self.hexadecimal_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_hexadecimal()
def test_passed_value_is_empty_binary(self):
self.hexadecimal_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_binary()
def test_passed_value_is_empty_decimal(self):
self.hexadecimal_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_decimal()
def test_passed_value_is_empty_octal(self):
self.hexadecimal_digit_integer.digit_value = ''
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_octal()
def test_passed_value_is_nan_hexadecimal(self):
self.hexadecimal_digit_integer.digit_value = '11-11'
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_hexadecimal()
def test_passed_value_is_nan_binary(self):
self.hexadecimal_digit_integer.digit_value = '11-11'
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_binary()
def test_passed_value_is_nan_decimal(self):
self.hexadecimal_digit_integer.digit_value = '11-11'
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_decimal()
def test_passed_value_is_nan_octal(self):
self.hexadecimal_digit_integer.digit_value = '11-11'
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_octal()
def test_passed_value_is_whitespace_hexadecimal(self):
self.hexadecimal_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_hexadecimal()
def test_passed_value_is_whitespace_binary(self):
self.hexadecimal_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_binary()
def test_passed_value_is_whitespace_decimal(self):
self.hexadecimal_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_decimal()
def test_passed_value_is_whitespace_octal(self):
self.hexadecimal_digit_integer.digit_value = ' '
with self.assertRaises(ValueError) as cm:
self.hexadecimal_digit_integer.get_octal()
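The hexadecimal-to-decimal expectations above (e.g. 16: 271.1C28 -> 10: 625.1099853515625) are plain positional expansion. A one-line sketch of the fractional part (illustrative only, not the HexadecimalDigit implementation under test):

```python
def hex_fraction_to_decimal(hex_digits):
    """Positional expansion of a hexadecimal fraction: digit_k * 16**-(k+1)."""
    return sum(int(d, 16) * 16 ** -(i + 1) for i, d in enumerate(hex_digits))


# "1C28" -> 0.1099853515625, matching the "271.1C28 -> 625.1099853515625" case.
assert hex_fraction_to_decimal("1C28") == 0.1099853515625
```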
if __name__ == '__main__':
    unittest.main()
# --- ross/tests/test_misalignment.py (repo: CisneirosRaphael/ross_c, license: MIT) ---
import os
from pathlib import Path
from tempfile import tempdir
import numpy as np
import pytest
from numpy.testing import assert_allclose, assert_almost_equal
import ross as rs
from ross.defects.misalignment import MisalignmentFlex
from ross.units import Q_
steel2 = rs.Material(name="Steel", rho=7850, E=2.17e11, G_s=81.2e9)
# Rotor with 6 DoFs, with internal damping, with 33 shaft elements, 2 disks and 2 bearings.
i_d = 0
o_d = 0.019
n = 33
# fmt: off
L = np.array(
[0 , 25, 64, 104, 124, 143, 175, 207, 239, 271,
303, 335, 345, 355, 380, 408, 436, 466, 496, 526,
556, 586, 614, 647, 657, 667, 702, 737, 772, 807,
842, 862, 881, 914]
)/ 1000
# fmt: on
L = [L[i] - L[i - 1] for i in range(1, len(L))]
shaft_elem = [
rs.ShaftElement6DoF(
material=steel2,
L=l,
idl=i_d,
odl=o_d,
idr=i_d,
odr=o_d,
alpha=8.0501,
beta=1.0e-5,
rotary_inertia=True,
shear_effects=True,
)
for l in L
]
Id = 0.003844540885417
Ip = 0.007513248437500
disk0 = rs.DiskElement6DoF(n=12, m=2.6375, Id=Id, Ip=Ip)
disk1 = rs.DiskElement6DoF(n=24, m=2.6375, Id=Id, Ip=Ip)
kxx1 = 4.40e5
kyy1 = 4.6114e5
kzz = 0
cxx1 = 27.4
cyy1 = 2.505
czz = 0
kxx2 = 9.50e5
kyy2 = 1.09e8
cxx2 = 50.4
cyy2 = 100.4553
bearing0 = rs.BearingElement6DoF(
n=4, kxx=kxx1, kyy=kyy1, cxx=cxx1, cyy=cyy1, kzz=kzz, czz=czz
)
bearing1 = rs.BearingElement6DoF(
n=31, kxx=kxx2, kyy=kyy2, cxx=cxx2, cyy=cyy2, kzz=kzz, czz=czz
)
rotor = rs.Rotor(shaft_elem, [disk0, disk1], [bearing0, bearing1])
@pytest.fixture
def mis_comb():
massunbt = np.array([5e-4, 0])
phaseunbt = np.array([-np.pi / 2, 0])
misalignment = rotor.run_misalignment(
coupling="flex",
dt=0.1,
tI=0,
tF=5,
kd=40 * 10 ** (3),
ks=38 * 10 ** (3),
eCOUPx=2 * 10 ** (-4),
eCOUPy=2 * 10 ** (-4),
misalignment_angle=5 * np.pi / 180,
TD=0,
TL=0,
n1=0,
speed=1200,
massunb=massunbt,
phaseunb=phaseunbt,
mis_type="combined",
print_progress=False,
)
return misalignment
def test_mis_comb_parameters(mis_comb):
assert mis_comb.dt == 0.1
assert mis_comb.tI == 0
assert mis_comb.tF == 5
assert mis_comb.kd == 40 * 10 ** (3)
assert mis_comb.ks == 38 * 10 ** (3)
assert mis_comb.eCOUPx == 2 * 10 ** (-4)
assert mis_comb.eCOUPy == 2 * 10 ** (-4)
assert mis_comb.misalignment_angle == 5 * np.pi / 180
assert mis_comb.TD == 0
assert mis_comb.TL == 0
assert mis_comb.n1 == 0
assert mis_comb.speed == 1200
def test_mis_comb_forces(mis_comb):
    assert mis_comb.forces[mis_comb.n1 * 6, :] == pytest.approx(
        np.full(51, -4.40604748)
    )
    assert mis_comb.forces[mis_comb.n1 * 6 + 1, :] == pytest.approx(
        np.full(51, 1.0821174)
    )
    assert mis_comb.forces[mis_comb.n2 * 6, :] == pytest.approx(
        np.full(51, 4.40604748)
    )
    assert mis_comb.forces[mis_comb.n2 * 6 + 1, :] == pytest.approx(
        np.full(51, -1.0821174)
    )
@pytest.fixture
def mis_parallel():
massunbt = np.array([5e-4, 0])
phaseunbt = np.array([-np.pi / 2, 0])
misalignment = rotor.run_misalignment(
coupling="flex",
dt=0.1,
tI=0,
tF=5,
kd=40 * 10 ** (3),
ks=38 * 10 ** (3),
eCOUPx=2 * 10 ** (-4),
eCOUPy=2 * 10 ** (-4),
misalignment_angle=5 * np.pi / 180,
TD=0,
TL=0,
n1=0,
speed=1200,
massunb=massunbt,
phaseunb=phaseunbt,
mis_type="parallel",
print_progress=False,
)
return misalignment
def test_mis_parallel_parameters(mis_parallel):
assert mis_parallel.dt == 0.1
assert mis_parallel.tI == 0
assert mis_parallel.tF == 5
assert mis_parallel.kd == 40 * 10 ** (3)
assert mis_parallel.ks == 38 * 10 ** (3)
assert mis_parallel.eCOUPx == 2 * 10 ** (-4)
assert mis_parallel.eCOUPy == 2 * 10 ** (-4)
assert mis_parallel.misalignment_angle == 5 * np.pi / 180
assert mis_parallel.TD == 0
assert mis_parallel.TL == 0
assert mis_parallel.n1 == 0
assert mis_parallel.speed == 1200
def test_mis_parallel_forces(mis_parallel):
    assert mis_parallel.forces[mis_parallel.n1 * 6, :] == pytest.approx(
        np.full(51, -6.78312529)
    )
    assert mis_parallel.forces[mis_parallel.n1 * 6 + 1, :] == pytest.approx(
        np.full(51, 1.0821174)
    )
assert mis_parallel.forces[mis_parallel.n2 * 6, :] == pytest.approx(
# fmt: off
np.array(
[6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529, 6.78312529, 6.78312529, 6.78312529, 6.78312529,
6.78312529
]
)
# fmt: on
)
assert mis_parallel.forces[mis_parallel.n2 * 6 + 1, :] == pytest.approx(
# fmt: off
np.array(
[-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174, -1.0821174, -1.0821174, -1.0821174, -1.0821174,
-1.0821174
]
)
# fmt: on
)


@pytest.fixture
def mis_angular():
massunbt = np.array([5e-4, 0])
phaseunbt = np.array([-np.pi / 2, 0])
misalignment = rotor.run_misalignment(
coupling="flex",
dt=0.1,
tI=0,
tF=5,
kd=40 * 10 ** (3),
ks=38 * 10 ** (3),
eCOUPx=2 * 10 ** (-4),
eCOUPy=2 * 10 ** (-4),
misalignment_angle=5 * np.pi / 180,
TD=0,
TL=0,
n1=0,
speed=1200,
massunb=massunbt,
phaseunb=phaseunbt,
mis_type="angular",
print_progress=False,
)
return misalignment


def test_mis_angular_parameters(mis_angular):
assert mis_angular.dt == 0.1
assert mis_angular.tI == 0
assert mis_angular.tF == 5
assert mis_angular.kd == 40 * 10 ** (3)
assert mis_angular.ks == 38 * 10 ** (3)
assert mis_angular.eCOUPx == 2 * 10 ** (-4)
assert mis_angular.eCOUPy == 2 * 10 ** (-4)
assert mis_angular.misalignment_angle == 5 * np.pi / 180
assert mis_angular.TD == 0
assert mis_angular.TL == 0
assert mis_angular.n1 == 0
assert mis_angular.speed == 1200


def test_mis_angular_forces(mis_angular):
assert mis_angular.forces[mis_angular.n1 * 6, :] == pytest.approx(
# fmt: off
np.array(
[2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782, 2.37707782, 2.37707782, 2.37707782, 2.37707782,
2.37707782
]
)
# fmt: on
)
assert mis_angular.forces[mis_angular.n1 * 6 + 1, :] == pytest.approx(
# fmt: off
np.array(
[-2.66453526e-15, 1.66147096e-11, 3.32343042e-11, 4.98774355e-11,
6.64779343e-11, 8.30784330e-11, 9.97633087e-11, 1.16381571e-10,
1.32931444e-10, 1.49549262e-10, 1.66149317e-10, 1.82851956e-10,
1.99536387e-10, 2.16086704e-10, 2.32772024e-10, 2.49321896e-10,
2.65872213e-10, 2.82556645e-10, 2.99107850e-10, 3.15792725e-10,
3.32375905e-10, 3.48960416e-10, 3.65779851e-10, 3.82330612e-10,
3.99150490e-10, 4.15429913e-10, 4.32250680e-10, 4.48800552e-10,
4.65620431e-10, 4.82170748e-10, 4.98721064e-10, 5.15271381e-10,
5.31820366e-10, 5.48641577e-10, 5.65191449e-10, 5.81740878e-10,
5.98291194e-10, 6.15111961e-10, 6.31661390e-10, 6.48481713e-10,
6.64762023e-10, 6.81716905e-10, 6.97726765e-10, 7.14276194e-10,
7.31366523e-10, 7.47917284e-10, 7.64467156e-10, 7.81017029e-10,
7.98106914e-10, 8.14657675e-10, 8.30667091e-10
]
)
# fmt: on
)
assert mis_angular.forces[mis_angular.n2 * 6, :] == pytest.approx(
# fmt: off
np.array(
[-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782, -2.37707782, -2.37707782, -2.37707782, -2.37707782,
-2.37707782
]
)
# fmt: on
)
assert mis_angular.forces[mis_angular.n2 * 6 + 1, :] == pytest.approx(
# fmt: off
np.array(
[ 2.66453526e-15, -1.66147096e-11, -3.32343042e-11, -4.98774355e-11,
-6.64779343e-11, -8.30784330e-11, -9.97633087e-11, -1.16381571e-10,
-1.32931444e-10, -1.49549262e-10, -1.66149317e-10, -1.82851956e-10,
-1.99536387e-10, -2.16086704e-10, -2.32772024e-10, -2.49321896e-10,
-2.65872213e-10, -2.82556645e-10, -2.99107850e-10, -3.15792725e-10,
-3.32375905e-10, -3.48960416e-10, -3.65779851e-10, -3.82330612e-10,
-3.99150490e-10, -4.15429913e-10, -4.32250680e-10, -4.48800552e-10,
-4.65620431e-10, -4.82170748e-10, -4.98721064e-10, -5.15271381e-10,
-5.31820366e-10, -5.48641577e-10, -5.65191449e-10, -5.81740878e-10,
-5.98291194e-10, -6.15111961e-10, -6.31661390e-10, -6.48481713e-10,
-6.64762023e-10, -6.81716905e-10, -6.97726765e-10, -7.14276194e-10,
-7.31366523e-10, -7.47917284e-10, -7.64467156e-10, -7.81017029e-10,
-7.98106914e-10, -8.14657675e-10, -8.30667091e-10
]
)
# fmt: on
)


@pytest.fixture
def mis_rigid():
massunbt = np.array([5e-4, 0])
phaseunbt = np.array([-np.pi / 2, 0])
misalignment = rotor.run_misalignment(
coupling="rigid",
dt=0.0001,
tI=0,
tF=0.005,
eCOUP=2e-4,
TD=0,
TL=0,
n1=0,
speed=1200,
massunb=massunbt,
phaseunb=phaseunbt,
print_progress=False,
)
return misalignment


def test_mis_rigid_parameters(mis_rigid):
assert mis_rigid.dt == 0.0001
assert mis_rigid.tI == 0
assert mis_rigid.tF == 0.005
assert mis_rigid.eCOUP == 2e-4
assert mis_rigid.TD == 0
assert mis_rigid.TL == 0
assert mis_rigid.n1 == 0
assert mis_rigid.speed == 1200


def test_mis_rigid_forces(mis_rigid):
assert mis_rigid.forces[mis_rigid.n1 * 6, :] == pytest.approx(
# fmt: off
np.array(
[0.00000000e+00, 4.36964689e+00, 1.74771689e+01, 3.93186457e+01,
6.98878473e+01, 1.09176370e+02, 1.57173806e+02, 2.13867938e+02,
2.79244943e+02, 3.53289607e+02, 4.35985520e+02, 5.27315254e+02,
6.27260511e+02, 7.35802235e+02, 8.52920676e+02, 9.78595415e+02,
1.11280534e+03, 1.25552860e+03, 1.40674247e+03, 1.56642326e+03,
1.73454619e+03, 1.91108521e+03, 2.09601285e+03, 2.28930016e+03,
2.49091658e+03, 2.70082986e+03, 2.91900610e+03, 3.14540972e+03,
3.38000359e+03, 3.62274912e+03, 3.87360643e+03, 4.13253452e+03,
4.39949148e+03, 4.67443471e+03, 4.95732112e+03, 5.24810728e+03,
5.54674961e+03, 5.85320444e+03, 6.16742807e+03, 6.48937676e+03,
6.81900661e+03, 7.15627347e+03, 7.50113266e+03, 7.85353879e+03,
8.21344538e+03, 8.58080462e+03, 8.95556694e+03, 9.33768077e+03,
9.72709218e+03, 1.01237446e+04, 1.05275788e+04
]
)
# fmt: on
)
assert mis_rigid.forces[mis_rigid.n1 * 6 + 1, :] == pytest.approx(
# fmt: off
np.array(
[ 0. , -695.44989191, -1390.76098145, -2085.80340207,
-2780.44896672, -3474.57174737, -4168.04855579, -4860.75930785,
-5552.58725831, -6243.41909941, -6933.14492308, -7621.65805293,
-8308.85475863, -8994.63387028, -9678.89631528, -10361.54460307,
-11042.48228557, -11721.61342139, -12398.84207127, -13074.07184973,
-13747.20555483, -14418.14489306, -15086.79031112, -15753.04094068,
-16416.79465562, -17077.94823571, -17736.39762468, -18392.03826576,
-19044.76549395, -19694.47496121, -20341.06306924, -20984.42738466,
-21624.46701205, -22261.08290347, -22894.17808613, -23523.65779508,
-24149.42950266, -24771.40284243, -25389.48943124, -26003.60259834,
-26613.65703604, -27219.56839029, -27821.25281318, -28418.62650121,
-29011.60524412, -29600.10400851, -30184.03657848, -30763.31527277,
-31337.85075352, -31907.55193714, -32472.32601231
]
)
# fmt: on
)
assert mis_rigid.forces[mis_rigid.n2 * 6, :] == pytest.approx(
# fmt: off
np.array(
[ 0.00000000e+00, -4.36964689e+00, -1.74771689e+01, -3.93186457e+01,
-6.98878473e+01, -1.09176370e+02, -1.57173806e+02, -2.13867938e+02,
-2.79244943e+02, -3.53289607e+02, -4.35985520e+02, -5.27315254e+02,
-6.27260511e+02, -7.35802235e+02, -8.52920676e+02, -9.78595415e+02,
-1.11280534e+03, -1.25552860e+03, -1.40674247e+03, -1.56642326e+03,
-1.73454619e+03, -1.91108521e+03, -2.09601285e+03, -2.28930016e+03,
-2.49091658e+03, -2.70082986e+03, -2.91900610e+03, -3.14540972e+03,
-3.38000359e+03, -3.62274912e+03, -3.87360643e+03, -4.13253452e+03,
-4.39949148e+03, -4.67443471e+03, -4.95732112e+03, -5.24810728e+03,
-5.54674961e+03, -5.85320444e+03, -6.16742807e+03, -6.48937676e+03,
-6.81900661e+03, -7.15627347e+03, -7.50113266e+03, -7.85353879e+03,
-8.21344538e+03, -8.58080462e+03, -8.95556694e+03, -9.33768077e+03,
-9.72709218e+03, -1.01237446e+04, -1.05275788e+04
]
)
# fmt: on
)
assert mis_rigid.forces[mis_rigid.n2 * 6 + 1, :] == pytest.approx(
# fmt: off
np.array(
[ 0. , 695.44989191, 1390.76098145, 2085.80340207,
2780.44896672, 3474.57174737, 4168.04855579, 4860.75930785,
5552.58725831, 6243.41909941, 6933.14492308, 7621.65805293,
8308.85475863, 8994.63387028, 9678.89631528, 10361.54460307,
11042.48228557, 11721.61342139, 12398.84207127, 13074.07184973,
13747.20555483, 14418.14489306, 15086.79031112, 15753.04094068,
16416.79465562, 17077.94823571, 17736.39762468, 18392.03826576,
19044.76549395, 19694.47496121, 20341.06306924, 20984.42738466,
21624.46701205, 22261.08290347, 22894.17808613, 23523.65779508,
24149.42950266, 24771.40284243, 25389.48943124, 26003.60259834,
26613.65703604, 27219.56839029, 27821.25281318, 28418.62650121,
29011.60524412, 29600.10400851, 30184.03657848, 30763.31527277,
31337.85075352, 31907.55193714, 32472.32601231
]
)
# fmt: on
)


# File: skconfig/parameter/convience.py (thomasjpfan/skconfig, MIT license)
from numpy.random import RandomState
from .types import NoneParam
from .types import ObjectParam
from .types import IntParam
from .types import UnionParam


def RandomStateParam():
return UnionParam(NoneParam(), IntParam(), ObjectParam(RandomState))


# File: irekua_permissions/data_collections/sites.py (CONABIO-audio/irekua-permissions, BSD-4-Clause license)
def view(user, collection_site=None, **kwargs):
if collection_site is None:
return False
collection = collection_site.collection
if collection.is_open:
return True
if not user.is_authenticated:
return False
if collection_site.created_by == user:
return True
if user.is_special:
return True
if collection.collection_type.is_admin(user):
return True
if collection.is_admin(user):
return True
if not collection.is_user(user):
return False
role = collection.get_user_role(user)
return role.has_permission('view_collection_sites')


def create(user, collection=None, **kwargs):
if collection is None:
return False
if not user.is_authenticated:
return False
if user.is_superuser:
return True
if collection.collection_type.is_admin(user):
return True
if collection.is_admin(user):
return True
if not collection.is_user(user):
return False
role = collection.get_user_role(user)
return role.has_permission('add_collection_site')


def change(user, collection_site=None, **kwargs):
if collection_site is None:
return False
if not user.is_authenticated:
return False
if collection_site.created_by == user:
return True
if user.is_superuser:
return True
collection = collection_site.collection
if collection.collection_type.is_admin(user):
return True
if collection.is_admin(user):
return True
if not collection.is_user(user):
return False
role = collection.get_user_role(user)
return role.has_permission('change_collection_sites')


def delete(user, collection_site=None, **kwargs):
if collection_site is None:
return False
if not user.is_authenticated:
return False
if collection_site.created_by == user:
return True
if user.is_superuser:
return True
collection = collection_site.collection
if collection.collection_type.is_admin(user):
return True
return collection.is_admin(user)


# File: test/unittests/kernel_unittests.py (AndrewRLawrence/dp_gp_lvm, MIT license)
"""
This file defines unit tests for the various kernel implementations.
"""

from src.kernels.rbf_kernel import k_ard_rbf
from src.utils.constants import GP_DEFAULT_JITTER
from src.utils.types import NP_DTYPE
import numpy as np
import tensorflow as tf
import unittest


def k_ard_rbf_covariance_matrix_naive(input_0, gamma, alpha, beta,
                                      input_1=None, include_noise=False, include_jitter=False):
    """
    Naive (loop-based) reference implementation of the ARD RBF covariance matrix,
    used to validate the vectorised TensorFlow kernel.

    :param input_0: First input, [N0 x Q].
    :param gamma: ARD weights (inverse squared length-scales), [Q].
    :param alpha: Signal variance.
    :param beta: Noise precision; the noise variance is 1/beta.
    :param input_1: Optional second input, [N1 x Q]. Defaults to input_0 when None.
    :param include_noise: If True, add (1/beta) * I to the diagonal (k(x, x) only).
    :param include_jitter: If True, add jitter * I to the diagonal (k(x, x) only).
    :return: Covariance matrix, [N0 x N1].
    """
# Check shapes/sizes.
[n0, q0] = np.shape(input_0)
if input_1 is not None:
        include_noise = False  # Override: noise is only added for k(x, x).
        include_jitter = False  # Override: jitter is only added for k(x, x).
else:
input_1 = input_0
[n1, q1] = np.shape(input_1)
assert q0 == q1, 'Input dimensionality must be the same for inputs 0 and 1.'
assert q0 == np.size(gamma), 'ARD weights must be same size as input dimensionality.'
cov_matrix = np.zeros((n0, n1))
for i in range(n0):
for k in range(n1):
exp_value = 0.0
for j in range(q0):
exp_value += gamma[j] * np.square(input_0[i, j] - input_1[k, j])
cov_matrix[i, k] = alpha * np.exp(-0.5 * exp_value)
if include_noise:
cov_matrix += np.reciprocal(beta) * np.eye(n0)
if include_jitter:
cov_matrix += GP_DEFAULT_JITTER * np.eye(n0)
return cov_matrix
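The triple loop above is a deliberately naive reference; the same matrix can be computed with NumPy broadcasting. A minimal sketch of the noise-free case (`k_ard_rbf_vectorised` is a hypothetical helper, not part of this test suite):

```python
import numpy as np

def k_ard_rbf_vectorised(x0, x1, gamma, alpha):
    """Broadcasted ARD RBF covariance: alpha * exp(-0.5 * sum_j gamma_j * (x0_ij - x1_kj)^2)."""
    sq_dist = np.sum(gamma * (x0[:, np.newaxis, :] - x1[np.newaxis, :, :]) ** 2, axis=-1)
    return alpha * np.exp(-0.5 * sq_dist)

rng = np.random.default_rng(1)
x0 = rng.standard_normal((6, 4))
x1 = rng.standard_normal((3, 4))
gamma = np.exp(rng.standard_normal(4))
alpha = 2.5

# Loop-based reference mirroring the naive implementation (noise-free case).
reference = np.zeros((6, 3))
for i in range(6):
    for k in range(3):
        reference[i, k] = alpha * np.exp(-0.5 * np.sum(gamma * (x0[i] - x1[k]) ** 2))

print(np.allclose(reference, k_ard_rbf_vectorised(x0, x1, gamma, alpha)))  # True
```

Broadcasting the [N0, 1, Q] and [1, N1, Q] views avoids the O(N0 * N1 * Q) Python-level loop while computing the identical quantity.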


def k_ard_rbf_covariance_diagonal_naive(input_0, gamma, alpha, beta, include_noise=False, include_jitter=False):
    """
    Naive reference implementation of the diagonal of the ARD RBF covariance matrix k(input_0, input_0).

    :param input_0: Input, [N x Q].
    :param gamma: ARD weights (inverse squared length-scales), [Q].
    :param alpha: Signal variance.
    :param beta: Noise precision; the noise variance is 1/beta.
    :param include_noise: If True, include the noise variance 1/beta on the diagonal.
    :param include_jitter: If True, include jitter on the diagonal.
    :return: Diagonal of the covariance matrix, [N].
    """
return np.diag(k_ard_rbf_covariance_matrix_naive(input_0, gamma, alpha, beta, input_1=None,
include_noise=include_noise, include_jitter=include_jitter))


def k_ard_rbf_psi_0_naive(num_samples, alpha):
    """
    Naive psi-0 statistic for the ARD RBF kernel: the expected trace of K(X, X)
    under q(X), which is simply N * alpha since k(x, x) = alpha.

    :param num_samples: Number of samples N.
    :param alpha: Signal variance.
    :return: Psi-0 statistic (a scalar).
    """
return num_samples * alpha


def k_ard_rbf_psi_1_naive(x_mean, x_var, x_u, gamma, alpha):
    """
    Naive psi-1 statistic for the ARD RBF kernel: the element-wise expectation of
    K(X, X_u) under q(X) = N(x_mean, diag(x_var)).

    :param x_mean: Mean of q(X), [N x Q].
    :param x_var: Diagonal variance of q(X), [N x Q].
    :param x_u: Inducing inputs, [M x Q].
    :param gamma: ARD weights (inverse squared length-scales), [Q].
    :param alpha: Signal variance.
    :return: Psi-1 statistic, [N x M].
    """
# Determine number of samples, number of inducing points, and number of latent dimensions.
assert np.shape(x_mean) == np.shape(x_var), 'Shape of mean and variance of q(X) must be the same.'
[n, q] = np.shape(x_mean)
[m, qu] = np.shape(x_u)
assert n > m, 'Number of observations must be greater than number of inducing points.'
assert q == qu, 'Latent dimensionality of X and inducing input must be the same.'
assert q == np.size(gamma), 'ARD weights must be same size as latent dimensionality.'
# Initialise log_psi_1.
log_psi_1 = np.log(alpha) * np.ones((n, m))
# Loop through each dimension.
for i in range(n):
for k in range(m):
for j in range(q):
denominator = gamma[j] * x_var[i, j] + 1.0
log_psi_1[i, k] -= 0.5 * (np.log(denominator) +
gamma[j] * np.square(x_mean[i, j] - x_u[k, j]) / denominator)
return np.exp(log_psi_1)
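A useful sanity check on the closed form above: when the latent variance goes to zero, the expectation is over a point mass, so psi-1 must collapse to the plain cross-covariance K(x_mean, x_u). A self-contained sketch of that limit (`rbf_ard` and `psi_1` below are hypothetical NumPy re-implementations, not part of this test suite):

```python
import numpy as np

def rbf_ard(x0, x1, gamma, alpha):
    """Plain ARD RBF cross-covariance K(x0, x1)."""
    sq_dist = np.sum(gamma * (x0[:, np.newaxis, :] - x1[np.newaxis, :, :]) ** 2, axis=-1)
    return alpha * np.exp(-0.5 * sq_dist)

def psi_1(x_mean, x_var, x_u, gamma, alpha):
    """Psi-1 statistic <K(X, x_u)> under q(X) = N(x_mean, diag(x_var))."""
    denom = gamma * x_var[:, np.newaxis, :] + 1.0                           # [N, 1, Q]
    quad = gamma * (x_mean[:, np.newaxis, :] - x_u[np.newaxis, :, :]) ** 2  # [N, M, Q]
    log_psi = np.log(alpha) - 0.5 * np.sum(np.log(denom) + quad / denom, axis=-1)
    return np.exp(log_psi)

rng = np.random.default_rng(0)
x_mean = rng.standard_normal((5, 3))
x_u = rng.standard_normal((2, 3))
gamma = np.exp(rng.standard_normal(3))
alpha = 1.7

# Zero latent variance: every per-dimension denominator is 1, so psi_1 == K(x_mean, x_u).
zero_var = np.zeros_like(x_mean)
print(np.allclose(psi_1(x_mean, zero_var, x_u, gamma, alpha),
                  rbf_ard(x_mean, x_u, gamma, alpha)))  # True
```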


def k_ard_rbf_psi_2_naive(x_mean, x_var, x_u, gamma, alpha):
    """
    Naive psi-2 statistic for the ARD RBF kernel: the sum over samples of the
    expectation of k(x_u, x_n) k(x_n, x_u) under q(X) = N(x_mean, diag(x_var)).

    :param x_mean: Mean of q(X), [N x Q].
    :param x_var: Diagonal variance of q(X), [N x Q].
    :param x_u: Inducing inputs, [M x Q].
    :param gamma: ARD weights (inverse squared length-scales), [Q].
    :param alpha: Signal variance.
    :return: Psi-2 statistic, [M x M].
    """
# Determine number of samples, number of inducing points, and number of latent dimensions.
assert np.shape(x_mean) == np.shape(x_var), 'Shape of mean and variance of q(X) must be the same.'
[n, q] = np.shape(x_mean)
[m, qu] = np.shape(x_u)
assert n > m, 'Number of observations must be greater than number of inducing points.'
assert q == qu, 'Latent dimensionality of X and inducing input must be the same.'
assert q == np.size(gamma), 'ARD weights must be same size as latent dimensionality.'
# Initialise log_psi_2.
log_psi_2 = 2.0 * np.log(alpha) * np.ones((n, m, m))
# Loop through each dimension.
for i in range(n):
for k1 in range(m):
for k2 in range(m):
for j in range(q):
denominator = 2.0 * gamma[j] * x_var[i, j] + 1.0
x_u_bar = 0.5 * (x_u[k1, j] + x_u[k2, j])
log_psi_2[i, k1, k2] -= 0.5 * np.log(denominator) + \
0.25 * gamma[j] * np.square(x_u[k1, j] - x_u[k2, j]) + \
gamma[j] * np.square(x_mean[i, j] - x_u_bar) / denominator
return np.sum(np.exp(log_psi_2), axis=0) # [M x M].


class TestRbfKernel(unittest.TestCase):

    def setUp(self, seed=1):
        """
        Set up test data and an ARD RBF kernel with random hyperparameters.

        :param seed: Seed for NumPy's random number generator, for reproducibility.
        """
np.random.seed(seed=seed)
self.n = 100
self.n0 = 100
self.n1 = 75
self.d = 20
self.m = 25
self.q = 10
self.y = np.random.standard_normal((self.n, self.d)).astype(NP_DTYPE)
self.x = np.random.standard_normal((self.n, self.q)).astype(NP_DTYPE)
self.x0 = np.random.standard_normal((self.n0, self.q)).astype(NP_DTYPE)
self.x1 = np.random.standard_normal((self.n1, self.q)).astype(NP_DTYPE)
self.x_mean = np.random.standard_normal((self.n, self.q)).astype(NP_DTYPE)
self.x_var = np.square(np.random.standard_normal((self.n, self.q)).astype(NP_DTYPE)) # [N x Q].
self.x_covar = np.stack(tuple([np.diag(self.x_var[i, :]) for i in range(self.n)]), axis=0) # [N x Q x Q].
self.x_u = np.random.standard_normal((self.m, self.q)).astype(NP_DTYPE)
self.gamma = np.exp(np.random.standard_normal(self.q).astype(NP_DTYPE))
self.alpha = np.square(np.random.standard_normal(1).astype(NP_DTYPE) + 1.0)
self.beta = np.square(np.random.standard_normal(1).astype(NP_DTYPE) + np.sqrt(50.0))
self.kernel = k_ard_rbf(gamma=self.gamma[np.newaxis, :],
alpha=np.reshape(self.alpha, (1, 1)),
beta=np.reshape(self.beta, (1, 1)))
# TensorFlow session.
self.tf_session = tf.Session()

    def tearDown(self):
"""
Close the TensorFlow session.
"""
self.tf_session.close()

    def test_covariance_matrix(self):
# Calculate a bunch of covariance matrices in a naive manner.
k_xx_naive = k_ard_rbf_covariance_matrix_naive(input_0=self.x0,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=None,
include_noise=False,
include_jitter=True)
k_xx_naive_noisy = k_ard_rbf_covariance_matrix_naive(input_0=self.x0,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=None,
include_noise=True,
include_jitter=True)
k_01_naive = k_ard_rbf_covariance_matrix_naive(input_0=self.x0,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=self.x1,
include_noise=False,
include_jitter=True)
k_10_naive = k_ard_rbf_covariance_matrix_naive(input_0=self.x1,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=self.x0,
include_noise=False,
include_jitter=False)
k_uu_naive = k_ard_rbf_covariance_matrix_naive(input_0=self.x_u,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=self.x_u,
include_noise=False,
include_jitter=False)
k_uu_naive_noisy = k_ard_rbf_covariance_matrix_naive(input_0=self.x_u,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=None,
include_noise=True,
include_jitter=True)
k_xmxm_naive = k_ard_rbf_covariance_matrix_naive(input_0=self.x_mean,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=None,
include_noise=False,
include_jitter=True)
k_xmxm_naive_noisy = k_ard_rbf_covariance_matrix_naive(input_0=self.x_mean,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
input_1=None,
include_noise=True,
include_jitter=False)
k_xx, k_xx_noisy, \
k_01, k_10, \
k_uu, k_uu_noisy, \
k_xmxm, k_xmxm_noisy = self.tf_session.run((tf.squeeze(self.kernel.covariance_matrix(input_0=self.x0,
input_1=None,
include_noise=False,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x0,
input_1=None,
include_noise=True,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x0,
input_1=self.x1,
include_noise=False,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x1,
input_1=self.x0,
include_noise=False,
include_jitter=False)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x_u,
input_1=self.x_u,
include_noise=False,
include_jitter=False)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x_u,
input_1=None,
include_noise=True,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x_mean,
input_1=None,
include_noise=False,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_matrix(input_0=self.x_mean,
input_1=None,
include_noise=True,
include_jitter=False))
))
# Compare all matrices to the naive ones.
np.testing.assert_equal(k_xx_naive.shape, k_xx.shape)
np.testing.assert_allclose(k_xx_naive, k_xx)
np.testing.assert_equal(k_xx_naive_noisy.shape, k_xx_noisy.shape)
np.testing.assert_allclose(k_xx_naive_noisy, k_xx_noisy)
np.testing.assert_equal(k_01_naive.shape, k_01.shape)
np.testing.assert_allclose(k_01_naive, k_01)
np.testing.assert_equal(k_10_naive.shape, k_10.shape)
np.testing.assert_allclose(k_10_naive, k_10)
np.testing.assert_equal(k_uu_naive.shape, k_uu.shape)
np.testing.assert_allclose(k_uu_naive, k_uu)
np.testing.assert_equal(k_uu_naive_noisy.shape, k_uu_noisy.shape)
np.testing.assert_allclose(k_uu_naive_noisy, k_uu_noisy)
np.testing.assert_equal(k_xmxm_naive.shape, k_xmxm.shape)
np.testing.assert_allclose(k_xmxm_naive, k_xmxm)
np.testing.assert_equal(k_xmxm_naive_noisy.shape, k_xmxm_noisy.shape)
np.testing.assert_allclose(k_xmxm_naive_noisy, k_xmxm_noisy)

    def test_covariance_diag(self):
# Calculate a bunch of covariance matrix diagonals in a naive manner.
k_x0_naive = k_ard_rbf_covariance_diagonal_naive(input_0=self.x0,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=False,
include_jitter=True)
k_x0_naive_noisy = k_ard_rbf_covariance_diagonal_naive(input_0=self.x0,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=True,
include_jitter=True)
k_x1_naive = k_ard_rbf_covariance_diagonal_naive(input_0=self.x1,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=False,
include_jitter=False)
k_x1_naive_noisy = k_ard_rbf_covariance_diagonal_naive(input_0=self.x1,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=True,
include_jitter=False)
k_uu_naive = k_ard_rbf_covariance_diagonal_naive(input_0=self.x_u,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=False,
include_jitter=False)
k_uu_naive_noisy = k_ard_rbf_covariance_diagonal_naive(input_0=self.x_u,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=True,
include_jitter=True)
k_xmxm_naive = k_ard_rbf_covariance_diagonal_naive(input_0=self.x_mean,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=False,
include_jitter=True)
k_xmxm_naive_noisy = k_ard_rbf_covariance_diagonal_naive(input_0=self.x_mean,
gamma=self.gamma,
alpha=self.alpha,
beta=self.beta,
include_noise=True,
include_jitter=False)
k_x0, k_x0_noisy, \
k_x1, k_x1_noisy, \
k_uu, k_uu_noisy, \
k_xmxm, k_xmxm_noisy = self.tf_session.run((tf.squeeze(self.kernel.covariance_diag(input_0=self.x0,
include_noise=False,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x0,
include_noise=True,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x1,
include_noise=False,
include_jitter=False)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x1,
include_noise=True,
include_jitter=False)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x_u,
include_noise=False,
include_jitter=False)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x_u,
include_noise=True,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x_mean,
include_noise=False,
include_jitter=True)),
tf.squeeze(self.kernel.covariance_diag(input_0=self.x_mean,
include_noise=True,
include_jitter=False))
))
# Compare all matrices to the naive ones.
np.testing.assert_equal(k_x0_naive.shape, k_x0.shape)
np.testing.assert_allclose(k_x0_naive, k_x0)
np.testing.assert_equal(k_x0_naive_noisy.shape, k_x0_noisy.shape)
np.testing.assert_allclose(k_x0_naive_noisy, k_x0_noisy)
np.testing.assert_equal(k_x1_naive.shape, k_x1.shape)
np.testing.assert_allclose(k_x1_naive, k_x1)
np.testing.assert_equal(k_x1_naive_noisy.shape, k_x1_noisy.shape)
np.testing.assert_allclose(k_x1_naive_noisy, k_x1_noisy)
np.testing.assert_equal(k_uu_naive.shape, k_uu.shape)
np.testing.assert_allclose(k_uu_naive, k_uu)
np.testing.assert_equal(k_uu_naive_noisy.shape, k_uu_noisy.shape)
np.testing.assert_allclose(k_uu_naive_noisy, k_uu_noisy)
np.testing.assert_equal(k_xmxm_naive.shape, k_xmxm.shape)
np.testing.assert_allclose(k_xmxm_naive, k_xmxm)
np.testing.assert_equal(k_xmxm_naive_noisy.shape, k_xmxm_noisy.shape)
np.testing.assert_allclose(k_xmxm_naive_noisy, k_xmxm_noisy)

    def test_psi_0(self):
# Calculate psi 0s in a naive manner.
psi_0_xx_naive = k_ard_rbf_psi_0_naive(self.n, np.squeeze(self.alpha))
psi_0_x0_naive = k_ard_rbf_psi_0_naive(self.n0, np.squeeze(self.alpha))
psi_0_x1_naive = k_ard_rbf_psi_0_naive(self.n1, np.squeeze(self.alpha))
psi_0_xu_naive = k_ard_rbf_psi_0_naive(self.m, np.squeeze(self.alpha))
psi_0_xx, psi_0_x0, psi_0_x1, psi_0_xu = self.tf_session.run((
tf.squeeze(self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x,
latent_input_covariance=self.x_covar)),
tf.squeeze(self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x0,
latent_input_covariance=self.x_covar)),
tf.squeeze(self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x1,
latent_input_covariance=self.x_covar)),
tf.squeeze(self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x_u,
latent_input_covariance=self.x_covar))
))
# Compare all psi 0s to the naive ones.
np.testing.assert_equal(psi_0_xx_naive.shape, psi_0_xx.shape)
np.testing.assert_allclose(psi_0_xx_naive, psi_0_xx)
np.testing.assert_equal(psi_0_x0_naive.shape, psi_0_x0.shape)
np.testing.assert_allclose(psi_0_x0_naive, psi_0_x0)
np.testing.assert_equal(psi_0_x1_naive.shape, psi_0_x1.shape)
np.testing.assert_allclose(psi_0_x1_naive, psi_0_x1)
np.testing.assert_equal(psi_0_xu_naive.shape, psi_0_xu.shape)
np.testing.assert_allclose(psi_0_xu_naive, psi_0_xu)

    def test_psi_1(self):
# Calculate psi 1 in a naive manner.
psi_1_naive = k_ard_rbf_psi_1_naive(self.x_mean, self.x_var, self.x_u, self.gamma, self.alpha)
psi_1 = self.tf_session.run(tf.squeeze(self.kernel.psi_1(inducing_input=self.x_u,
latent_input_mean=self.x_mean,
latent_input_covariance=self.x_covar)))
# Compare psi 1 to naive one.
np.testing.assert_equal(psi_1_naive.shape, psi_1.shape)
np.testing.assert_allclose(psi_1_naive, psi_1)

    def test_psi_2(self):
# Calculate psi 2 in a naive manner.
psi_2_naive = k_ard_rbf_psi_2_naive(self.x_mean, self.x_var, self.x_u, self.gamma, self.alpha)
psi_2 = self.tf_session.run(tf.squeeze(self.kernel.psi_2(inducing_input=self.x_u,
latent_input_mean=self.x_mean,
latent_input_covariance=self.x_covar)))
# Compare psi 2 to naive one.
np.testing.assert_equal(psi_2_naive.shape, psi_2.shape)
np.testing.assert_allclose(psi_2_naive, psi_2)


class TestRbfBatchKernel(unittest.TestCase):

    def setUp(self, seed=1):
        """
        Set up test data and a batch of D ARD RBF kernels with random hyperparameters.

        :param seed: Seed for NumPy's random number generator, for reproducibility.
        """
np.random.seed(seed=seed)
self.n = 100
self.n0 = 100
self.n1 = 75
self.d = 7
self.m = 25
self.q = 10
self.y = np.random.standard_normal((self.n, self.d)).astype(NP_DTYPE)
self.x = np.random.standard_normal((self.n, self.q)).astype(NP_DTYPE)
self.x0 = np.random.standard_normal((self.n0, self.q)).astype(NP_DTYPE)
self.x1 = np.random.standard_normal((self.n1, self.q)).astype(NP_DTYPE)
self.x_mean = np.random.standard_normal((self.n, self.q)).astype(NP_DTYPE)
self.x_var = np.square(np.random.standard_normal((self.n, self.q)).astype(NP_DTYPE)) # [N x Q].
self.x_covar = np.stack(tuple([np.diag(self.x_var[i, :]) for i in range(self.n)]), axis=0) # [N x Q x Q].
self.x_u = np.random.standard_normal((self.m, self.q)).astype(NP_DTYPE)
self.gamma = np.exp(np.random.standard_normal((self.d, self.q)).astype(NP_DTYPE))
self.alpha = np.square(np.random.standard_normal((self.d, 1)).astype(NP_DTYPE) + 1.0)
self.beta = np.square(np.random.standard_normal((self.d, 1)).astype(NP_DTYPE) + np.sqrt(50.0))
self.kernel = k_ard_rbf(gamma=self.gamma, alpha=self.alpha, beta=self.beta)
# TensorFlow session.
self.tf_session = tf.Session()

    def tearDown(self):
"""
Close the TensorFlow session.
"""
self.tf_session.close()

    def test_covariance_matrix(self):
# Calculate a bunch of covariance matrices in a naive manner.
k_xx_naive = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x0,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=None,
include_noise=False,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N0 x N0].
k_xx_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x0,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=None,
include_noise=True,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N0 x N0].
k_01_naive = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x0,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=self.x1,
include_noise=False,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N0 x N1].
k_10_naive = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x1,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=self.x0,
include_noise=False,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x N1 x N0].
k_uu_naive = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x_u,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=self.x_u,
include_noise=False,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x M x M].
k_uu_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x_u,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=None,
include_noise=True,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x M x M].
k_xmxm_naive = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x_mean,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=None,
include_noise=False,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N x N].
k_xmxm_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_matrix_naive(input_0=self.x_mean,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
input_1=None,
include_noise=True,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x N x N].
k_xx, k_xx_noisy, \
k_01, k_10, \
k_uu, k_uu_noisy, \
k_xmxm, k_xmxm_noisy = self.tf_session.run((self.kernel.covariance_matrix(input_0=self.x0,
input_1=None,
include_noise=False,
include_jitter=True),
self.kernel.covariance_matrix(input_0=self.x0,
input_1=None,
include_noise=True,
include_jitter=True),
self.kernel.covariance_matrix(input_0=self.x0,
input_1=self.x1,
include_noise=False,
include_jitter=True),
self.kernel.covariance_matrix(input_0=self.x1,
input_1=self.x0,
include_noise=False,
include_jitter=False),
self.kernel.covariance_matrix(input_0=self.x_u,
input_1=self.x_u,
include_noise=False,
include_jitter=False),
self.kernel.covariance_matrix(input_0=self.x_u,
input_1=None,
include_noise=True,
include_jitter=True),
self.kernel.covariance_matrix(input_0=self.x_mean,
input_1=None,
include_noise=False,
include_jitter=True),
self.kernel.covariance_matrix(input_0=self.x_mean,
input_1=None,
include_noise=True,
include_jitter=False)
))
# Compare all matrices to the naive ones.
np.testing.assert_equal(k_xx_naive.shape, k_xx.shape)
np.testing.assert_allclose(k_xx_naive, k_xx)
np.testing.assert_equal(k_xx_naive_noisy.shape, k_xx_noisy.shape)
np.testing.assert_allclose(k_xx_naive_noisy, k_xx_noisy)
np.testing.assert_equal(k_01_naive.shape, k_01.shape)
np.testing.assert_allclose(k_01_naive, k_01)
np.testing.assert_equal(k_10_naive.shape, k_10.shape)
np.testing.assert_allclose(k_10_naive, k_10)
np.testing.assert_equal(k_uu_naive.shape, k_uu.shape)
np.testing.assert_allclose(k_uu_naive, k_uu)
np.testing.assert_equal(k_uu_naive_noisy.shape, k_uu_noisy.shape)
np.testing.assert_allclose(k_uu_naive_noisy, k_uu_noisy)
np.testing.assert_equal(k_xmxm_naive.shape, k_xmxm.shape)
np.testing.assert_allclose(k_xmxm_naive, k_xmxm)
np.testing.assert_equal(k_xmxm_naive_noisy.shape, k_xmxm_noisy.shape)
np.testing.assert_allclose(k_xmxm_naive_noisy, k_xmxm_noisy)

    def test_covariance_diag(self):
# Calculate a bunch of covariance matrix diagonals in a naive manner.
k_x0_naive = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x0,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=False,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N0].
k_x0_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x0,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=True,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N0].
k_x1_naive = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x1,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=False,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x N1].
k_x1_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x1,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=True,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x N1].
k_uu_naive = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x_u,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=False,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x M].
k_uu_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x_u,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=True,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x M].
k_xmxm_naive = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x_mean,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=False,
include_jitter=True)
for i in range(self.d)]),
axis=0) # [D x N].
k_xmxm_naive_noisy = np.stack(tuple([k_ard_rbf_covariance_diagonal_naive(input_0=self.x_mean,
gamma=self.gamma[i],
alpha=self.alpha[i],
beta=self.beta[i],
include_noise=True,
include_jitter=False)
for i in range(self.d)]),
axis=0) # [D x N].
k_x0, k_x0_noisy, \
k_x1, k_x1_noisy, \
k_uu, k_uu_noisy, \
k_xmxm, k_xmxm_noisy = self.tf_session.run((self.kernel.covariance_diag(input_0=self.x0,
include_noise=False,
include_jitter=True),
self.kernel.covariance_diag(input_0=self.x0,
include_noise=True,
include_jitter=True),
self.kernel.covariance_diag(input_0=self.x1,
include_noise=False,
include_jitter=False),
self.kernel.covariance_diag(input_0=self.x1,
include_noise=True,
include_jitter=False),
self.kernel.covariance_diag(input_0=self.x_u,
include_noise=False,
include_jitter=False),
self.kernel.covariance_diag(input_0=self.x_u,
include_noise=True,
include_jitter=True),
self.kernel.covariance_diag(input_0=self.x_mean,
include_noise=False,
include_jitter=True),
self.kernel.covariance_diag(input_0=self.x_mean,
include_noise=True,
include_jitter=False)
))
# Compare all matrices to the naive ones.
np.testing.assert_equal(k_x0_naive.shape, k_x0.shape)
np.testing.assert_allclose(k_x0_naive, k_x0)
np.testing.assert_equal(k_x0_naive_noisy.shape, k_x0_noisy.shape)
np.testing.assert_allclose(k_x0_naive_noisy, k_x0_noisy)
np.testing.assert_equal(k_x1_naive.shape, k_x1.shape)
np.testing.assert_allclose(k_x1_naive, k_x1)
np.testing.assert_equal(k_x1_naive_noisy.shape, k_x1_noisy.shape)
np.testing.assert_allclose(k_x1_naive_noisy, k_x1_noisy)
np.testing.assert_equal(k_uu_naive.shape, k_uu.shape)
np.testing.assert_allclose(k_uu_naive, k_uu)
np.testing.assert_equal(k_uu_naive_noisy.shape, k_uu_noisy.shape)
np.testing.assert_allclose(k_uu_naive_noisy, k_uu_noisy)
np.testing.assert_equal(k_xmxm_naive.shape, k_xmxm.shape)
np.testing.assert_allclose(k_xmxm_naive, k_xmxm)
np.testing.assert_equal(k_xmxm_naive_noisy.shape, k_xmxm_noisy.shape)
np.testing.assert_allclose(k_xmxm_naive_noisy, k_xmxm_noisy)

    def test_psi_0(self):
# Calculate psi 0s in a naive manner.
psi_0_xx_naive = np.stack(tuple([k_ard_rbf_psi_0_naive(self.n, self.alpha[i])
for i in range(self.d)]),
axis=0) # [D x 1].
psi_0_x0_naive = np.stack(tuple([k_ard_rbf_psi_0_naive(self.n0, self.alpha[i])
for i in range(self.d)]),
axis=0) # [D x 1].
psi_0_x1_naive = np.stack(tuple([k_ard_rbf_psi_0_naive(self.n1, self.alpha[i])
for i in range(self.d)]),
axis=0) # [D x 1].
psi_0_xu_naive = np.stack(tuple([k_ard_rbf_psi_0_naive(self.m, self.alpha[i])
for i in range(self.d)]),
axis=0) # [D x 1].
psi_0_xx, psi_0_x0, psi_0_x1, psi_0_xu = self.tf_session.run((
self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x,
latent_input_covariance=self.x_covar),
self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x0,
latent_input_covariance=self.x_covar),
self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x1,
latent_input_covariance=self.x_covar),
self.kernel.psi_0(inducing_input=self.x_u,
latent_input_mean=self.x_u,
latent_input_covariance=self.x_covar)
))
# Compare all psi 0s to the naive ones.
np.testing.assert_equal(psi_0_xx_naive.shape, psi_0_xx.shape)
np.testing.assert_allclose(psi_0_xx_naive, psi_0_xx)
np.testing.assert_equal(psi_0_x0_naive.shape, psi_0_x0.shape)
np.testing.assert_allclose(psi_0_x0_naive, psi_0_x0)
np.testing.assert_equal(psi_0_x1_naive.shape, psi_0_x1.shape)
np.testing.assert_allclose(psi_0_x1_naive, psi_0_x1)
np.testing.assert_equal(psi_0_xu_naive.shape, psi_0_xu.shape)
np.testing.assert_allclose(psi_0_xu_naive, psi_0_xu)

    def test_psi_1(self):
# Calculate psi 1 in a naive manner.
psi_1_naive = np.stack(tuple([k_ard_rbf_psi_1_naive(self.x_mean, self.x_var, self.x_u,
self.gamma[i], self.alpha[i])
for i in range(self.d)]),
axis=0) # [D x N x M].
psi_1 = self.tf_session.run(self.kernel.psi_1(inducing_input=self.x_u,
latent_input_mean=self.x_mean,
latent_input_covariance=self.x_covar))
        # Compare psi 1 to the naive one.
np.testing.assert_equal(psi_1_naive.shape, psi_1.shape)
np.testing.assert_allclose(psi_1_naive, psi_1)

    def test_psi_2(self):
# Calculate psi 2 in a naive manner.
psi_2_naive = np.stack(tuple([k_ard_rbf_psi_2_naive(self.x_mean, self.x_var, self.x_u,
self.gamma[i], self.alpha[i])
for i in range(self.d)]),
axis=0) # [D x M x M].
psi_2 = self.tf_session.run(self.kernel.psi_2(inducing_input=self.x_u,
latent_input_mean=self.x_mean,
latent_input_covariance=self.x_covar))
        # Compare psi 2 to the naive one.
np.testing.assert_equal(psi_2_naive.shape, psi_2.shape)
np.testing.assert_allclose(psi_2_naive, psi_2)
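The per-point diagonal covariance stack built in `setUp` above (`self.x_covar`) can be checked in isolation: the loop over `np.diag` is equivalent to a broadcasted construction. A standalone sketch, independent of the test module:

```python
import numpy as np

n, q = 4, 3
rng = np.random.default_rng(0)
x_var = np.square(rng.standard_normal((n, q)))  # per-point variances, [N x Q]

# Loop-based construction, as used in setUp.
covar_loop = np.stack([np.diag(x_var[i, :]) for i in range(n)], axis=0)  # [N x Q x Q]

# Broadcasted alternative: scale an identity mask by each variance row.
covar_vec = x_var[:, :, None] * np.eye(q)  # [N x Q x Q]

assert covar_loop.shape == (n, q, q)
assert np.allclose(covar_loop, covar_vec)
```

The broadcasted form avoids the Python-level loop and scales better for large `N`.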
# datasetgen/ui/functions.py — Cloud-PG/dataset-generator (Apache-2.0)
import dash
import dash_bootstrap_components as dbc
import dash_core_components as dcc
import dash_html_components as html
from dash.dependencies import Input, Output


class FunctionUI(object):
def __init__(self, app: 'dash.dash.Dash'):
assert isinstance(
app, dash.dash.Dash
), "Function UI needs main app reference..."
self._app = app

    def elements(self):
        """Returns the HTML elements of the UI."""
        raise NotImplementedError

    def callbacks(self):
        """Returns the element callbacks of the UI."""
        raise NotImplementedError

    def to_dict(self):
        """Export the UI parameters as a dict.

        This method is required to call the generator functions.
        """
        raise NotImplementedError

    @property
    def name(self):
        return repr(self)

    @property
    def name_id(self):
        return "-".join(str(self).lower().split())


class RandomGenerator(FunctionUI):
def __init__(self, app: 'dash.dash.Dash'):
super().__init__(app)
self._num_files = 100
self._min_file_size = 100
self._max_file_size = 24000
self._size_function_generator = "gen_random_sizes"

    def __repr__(self):
return "Random Generator"

    def to_dict(self):
return {
'num_files': self._num_files,
'min_file_size': self._min_file_size,
'max_file_size': self._max_file_size,
'size_generator_function': self._size_function_generator,
}

    def callbacks(self):
@self._app.callback(
Output(f'{self.name_id}-num-file-val', 'children'),
[Input(f'{self.name_id}-num-files', 'value')])
def change_num_files(value):
self._num_files = value
return f"Num. Files: {value}"
@self._app.callback(
Output(f'{self.name_id}-file-size-val', 'children'),
[Input(f'{self.name_id}-file-size', 'value')],
)
def change_size(value):
self._min_file_size, self._max_file_size = value
return f"File Size (MB): {self._min_file_size}-{self._max_file_size}"
@self._app.callback(
Output(f'{self.name_id}-size-function-val', 'children'),
[Input(f'{self.name_id}-size-function', 'value')],
)
def update_function_ui(value):
self._size_function_generator = value
if value == "gen_random_sizes":
return "File size function generator: [0]"
elif value == "gen_in_range_random_sizes":
return "File size function generator: [1]"

    def elements(self):
return html.Div([
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-num-file-val',
children="Num. Files: "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-num-files',
min=1,
max=100000,
step=1,
value=self._num_files,
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
10000: {'label': '10000', 'style': {'font-size': "8px"}},
20000: {'label': '20000', 'style': {'font-size': "8px"}},
30000: {'label': '30000', 'style': {'font-size': "8px"}},
50000: {'label': '50000', 'style': {'font-size': "8px"}},
100000: {'label': '100000', 'style': {'font-size': "8px"}},
},
), width=6)
]),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-file-size-val',
children="File Size (MB): "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.RangeSlider(
id=f'{self.name_id}-file-size',
min=1,
max=24000,
step=1,
value=[self._min_file_size, self._max_file_size],
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
2000: {'label': '2000', 'style': {'font-size': "8px"}},
4000: {'label': '4000', 'style': {'font-size': "8px"}},
8000: {'label': '8000', 'style': {'font-size': "8px"}},
16000: {'label': '16000', 'style': {'font-size': "8px"}},
24000: {'label': '24000', 'style': {'font-size': "8px"}},
},
allowCross=False,
), width=6)
]),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-size-function-val',
children="File size function generator: [1]"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Dropdown(
id=f'{self.name_id}-size-function',
options=[
{'label': "(0) gen random sizes",
'value': "gen_random_sizes"},
{'label': "(1) gen in range random sizes",
'value': "gen_in_range_random_sizes"},
],
value='gen_in_range_random_sizes'
), width=6),
]),
])


class HighFrequencyDataset(FunctionUI):
    """UI for HighFrequencyDataset generator."""

    def __init__(self, app: 'dash.dash.Dash'):
super().__init__(app)
self._num_files: int = 100
self._min_file_size: int = 100
self._max_file_size: int = 24000
self._lambda_less_req_files: float = 1.
self._lambda_more_req_files: float = 10.
self._perc_more_req_files: float = 10.
self._perc_files_x_day: float = 25.
self._size_function_generator = "gen_random_sizes"

    def __repr__(self):
return "High Frequency Dataset"

    def to_dict(self):
return {
'num_files': self._num_files,
'min_file_size': self._min_file_size,
'max_file_size': self._max_file_size,
'lambda_less_req_files': self._lambda_less_req_files,
'lambda_more_req_files': self._lambda_more_req_files,
'perc_more_req_files': self._perc_more_req_files,
'perc_files_x_day': self._perc_files_x_day,
'size_generator_function': self._size_function_generator,
}

    def callbacks(self):
@self._app.callback(
Output(f'{self.name_id}-num-file-val', 'children'),
[Input(f'{self.name_id}-num-files', 'value')])
def change_num_files(value):
self._num_files = value
return f"Num. Files: {value}"
@self._app.callback(
Output(f'{self.name_id}-file-size-val', 'children'),
[Input(f'{self.name_id}-file-size', 'value')],
)
def change_size(value):
self._min_file_size, self._max_file_size = value
return f"File Size (MB): {self._min_file_size}-{self._max_file_size}"
@self._app.callback(
Output(f'{self.name_id}-size-function-val', 'children'),
[Input(f'{self.name_id}-size-function', 'value')],
)
def update_function_ui(value):
self._size_function_generator = value
if value == "gen_random_sizes":
return "File size function generator: [0]"
elif value == "gen_in_range_random_sizes":
return "File size function generator: [1]"
@self._app.callback(
Output(f'{self.name_id}-hidden-div-lambda-less', 'children'),
[Input(f'{self.name_id}-lambda-less-req-files', 'value')],
)
def change_lambda_less_req_files(value):
self._lambda_less_req_files = value
@self._app.callback(
Output(f'{self.name_id}-hidden-div-lambda-more', 'children'),
[Input(f'{self.name_id}-lambda-more-req-files', 'value')],
)
def change_lambda_more_req_files(value):
self._lambda_more_req_files = value
@self._app.callback(
Output(f'{self.name_id}-perc-more-req-files-val', 'children'),
[Input(f'{self.name_id}-perc-more-req-files', 'value')],
)
def change_percentage_more_req_files(value):
self._perc_more_req_files = value
return f"More requested files: {value}%"
@self._app.callback(
Output(f'{self.name_id}-perc-files-x-day-val', 'children'),
[Input(f'{self.name_id}-perc-files-x-day', 'value')],
)
def change_percentage_files_x_day(value):
self._perc_files_x_day = value
return f"Files x day: {value}%"

    def elements(self):
return html.Div([
html.Div(
# For empty output callbacks
id=f'{self.name_id}-hidden-div-lambda-less',
style={'display': "none"}),
html.Div(
# For empty output callbacks
id=f'{self.name_id}-hidden-div-lambda-more',
style={'display': "none"}),
dbc.Row([
dbc.Col(
html.Div(
                        "NOTE: This function does NOT take into account the above num. of req. x day parameter",
style={'color': "rgb(251, 0, 0)",
'padding-bottom': "2em"},
),
width={'size': 9, 'offset': 3}),
]),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-num-file-val',
children="Num. Files: "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-num-files',
min=1,
max=100000,
step=1,
value=self._num_files,
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
10000: {'label': '10000', 'style': {'font-size': "8px"}},
20000: {'label': '20000', 'style': {'font-size': "8px"}},
30000: {'label': '30000', 'style': {'font-size': "8px"}},
50000: {'label': '50000', 'style': {'font-size': "8px"}},
100000: {'label': '100000', 'style': {'font-size': "8px"}},
},
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-file-size-val',
children="File Size (MB): "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.RangeSlider(
id=f'{self.name_id}-file-size',
min=1,
max=24000,
step=1,
value=[self._min_file_size, self._max_file_size],
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
2000: {'label': '2000', 'style': {'font-size': "8px"}},
4000: {'label': '4000', 'style': {'font-size': "8px"}},
8000: {'label': '8000', 'style': {'font-size': "8px"}},
16000: {'label': '16000', 'style': {'font-size': "8px"}},
24000: {'label': '24000', 'style': {'font-size': "8px"}},
},
allowCross=False,
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-size-function-val',
children="File size function generator: [1]"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Dropdown(
id=f'{self.name_id}-size-function',
options=[
{'label': "(0) gen random sizes",
'value': "gen_random_sizes"},
{'label': "(1) gen in range random sizes",
'value': "gen_in_range_random_sizes"},
],
value='gen_in_range_random_sizes'
), width=6),
], style={'padding-bottom': "2em"},),
dbc.Row([
dbc.Col(
html.H5(children="Poisson distribution parameters"),
width={'size': "auto", 'offset': 1}
)
]),
dbc.Row([
dbc.Col(
html.Hr(),
width={'size': "8", 'offset': 1}
)
]),
dbc.Row([
dbc.Col(
html.H5(children="Lambda less requested files"),
width={'size': "auto", 'offset': 2}
),
dbc.Col(dcc.Input(
id=f'{self.name_id}-lambda-less-req-files',
type="number",
placeholder="Lambda less requested files",
value=self._lambda_less_req_files,
), width={'size': "auto", 'offset': 1}),
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(children="Lambda more requested files"),
width={'size': "auto", 'offset': 2}
),
dbc.Col(dcc.Input(
id=f'{self.name_id}-lambda-more-req-files',
type="number",
placeholder="Lambda more requested files",
value=self._lambda_more_req_files,
), width={'size': "auto", 'offset': 1}),
], style={'padding-bottom': "2em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-perc-more-req-files-val',
children="More requested files: %"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-perc-more-req-files',
min=1,
max=100,
step=1,
value=self._perc_more_req_files,
marks={
10: {'label': '10%', 'style': {'font-size': "8px"}},
20: {'label': '20%', 'style': {'font-size': "8px"}},
30: {'label': '30%', 'style': {'font-size': "8px"}},
40: {'label': '40%', 'style': {'font-size': "8px"}},
50: {'label': '50%', 'style': {'font-size': "8px"}},
60: {'label': '60%', 'style': {'font-size': "8px"}},
70: {'label': '70%', 'style': {'font-size': "8px"}},
80: {'label': '80%', 'style': {'font-size': "8px"}},
90: {'label': '90%', 'style': {'font-size': "8px"}},
100: {'label': '100%', 'style': {'font-size': "8px"}},
},
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-perc-files-x-day-val',
children="Files x day: %"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-perc-files-x-day',
min=1,
max=100,
step=1,
value=self._perc_files_x_day,
marks={
10: {'label': '10%', 'style': {'font-size': "8px"}},
20: {'label': '20%', 'style': {'font-size': "8px"}},
30: {'label': '30%', 'style': {'font-size': "8px"}},
40: {'label': '40%', 'style': {'font-size': "8px"}},
50: {'label': '50%', 'style': {'font-size': "8px"}},
60: {'label': '60%', 'style': {'font-size': "8px"}},
70: {'label': '70%', 'style': {'font-size': "8px"}},
80: {'label': '80%', 'style': {'font-size': "8px"}},
90: {'label': '90%', 'style': {'font-size': "8px"}},
100: {'label': '100%', 'style': {'font-size': "8px"}},
},
), width=6)
], style={'padding-bottom': "1em"}),
])
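Each generator UI's `to_dict` export is meant to be unpacked into the corresponding dataset-generator call. The pattern can be sketched with a hypothetical generator signature (`make_dataset` is illustrative only; the actual `datasetgen` functions may take different parameters):

```python
# Hypothetical generator signature for illustration; the real
# datasetgen API may differ.
def make_dataset(num_files, min_file_size, max_file_size, **kwargs):
    # Extra keys exported by to_dict (percentages, generator-function
    # names, ...) are absorbed by **kwargs.
    return {'files': num_files, 'size_range': (min_file_size, max_file_size)}


params = {
    'num_files': 100,
    'min_file_size': 100,
    'max_file_size': 24000,
    'perc_files_x_day': 25.0,
    'size_generator_function': 'gen_random_sizes',
}
result = make_dataset(**params)
print(result['size_range'])  # (100, 24000)
```

Keeping the UI state in a plain dict this way decouples the Dash layer from the generator implementations.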


class RecencyFocusedDataset(FunctionUI):
    """UI for RecencyFocusedDataset generator."""

    def __init__(self, app: 'dash.dash.Dash'):
super().__init__(app)
self._num_files: int = 100
self._min_file_size: int = 100
self._max_file_size: int = 24000
self._perc_files_x_day: float = 25.
self._size_function_generator = "gen_random_sizes"

    def __repr__(self):
return "Recency Focused Dataset"

    def to_dict(self):
return {
'num_files': self._num_files,
'min_file_size': self._min_file_size,
'max_file_size': self._max_file_size,
'perc_files_x_day': self._perc_files_x_day,
'size_generator_function': self._size_function_generator,
}

    def callbacks(self):
@self._app.callback(
Output(f'{self.name_id}-num-file-val', 'children'),
[Input(f'{self.name_id}-num-files', 'value')])
def change_num_files(value):
self._num_files = value
return f"Num. Files: {value}"
@self._app.callback(
Output(f'{self.name_id}-file-size-val', 'children'),
[Input(f'{self.name_id}-file-size', 'value')],
)
def change_size(value):
self._min_file_size, self._max_file_size = value
return f"File Size (MB): {self._min_file_size}-{self._max_file_size}"
@self._app.callback(
Output(f'{self.name_id}-size-function-val', 'children'),
[Input(f'{self.name_id}-size-function', 'value')],
)
def update_function_ui(value):
self._size_function_generator = value
if value == "gen_random_sizes":
return "File size function generator: [0]"
elif value == "gen_in_range_random_sizes":
return "File size function generator: [1]"
@self._app.callback(
Output(f'{self.name_id}-perc-files-x-day-val', 'children'),
[Input(f'{self.name_id}-perc-files-x-day', 'value')],
)
def change_percentage_files_x_day(value):
self._perc_files_x_day = value
return f"Files x day: {value}%"

    def elements(self):
return html.Div([
html.Div(
# For empty output callbacks
id=f'{self.name_id}-hidden-div-lambda-less',
style={'display': "none"}),
html.Div(
# For empty output callbacks
id=f'{self.name_id}-hidden-div-lambda-more',
style={'display': "none"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-num-file-val',
children="Num. Files: "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-num-files',
min=1,
max=100000,
step=1,
value=self._num_files,
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
10000: {'label': '10000', 'style': {'font-size': "8px"}},
20000: {'label': '20000', 'style': {'font-size': "8px"}},
30000: {'label': '30000', 'style': {'font-size': "8px"}},
50000: {'label': '50000', 'style': {'font-size': "8px"}},
100000: {'label': '100000', 'style': {'font-size': "8px"}},
},
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-file-size-val',
children="File Size (MB): "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.RangeSlider(
id=f'{self.name_id}-file-size',
min=1,
max=24000,
step=1,
value=[self._min_file_size, self._max_file_size],
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
2000: {'label': '2000', 'style': {'font-size': "8px"}},
4000: {'label': '4000', 'style': {'font-size': "8px"}},
8000: {'label': '8000', 'style': {'font-size': "8px"}},
16000: {'label': '16000', 'style': {'font-size': "8px"}},
24000: {'label': '24000', 'style': {'font-size': "8px"}},
},
allowCross=False,
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-size-function-val',
children="File size function generator: [1]"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Dropdown(
id=f'{self.name_id}-size-function',
options=[
{'label': "(0) gen random sizes",
'value': "gen_random_sizes"},
{'label': "(1) gen in range random sizes",
'value': "gen_in_range_random_sizes"},
],
value='gen_in_range_random_sizes'
), width=6),
], style={'padding-bottom': "2em"},),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-perc-files-x-day-val',
children="Files x day: %"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-perc-files-x-day',
min=1,
max=100,
step=1,
value=self._perc_files_x_day,
marks={
10: {'label': '10%', 'style': {'font-size': "8px"}},
20: {'label': '20%', 'style': {'font-size': "8px"}},
30: {'label': '30%', 'style': {'font-size': "8px"}},
40: {'label': '40%', 'style': {'font-size': "8px"}},
50: {'label': '50%', 'style': {'font-size': "8px"}},
60: {'label': '60%', 'style': {'font-size': "8px"}},
70: {'label': '70%', 'style': {'font-size': "8px"}},
80: {'label': '80%', 'style': {'font-size': "8px"}},
90: {'label': '90%', 'style': {'font-size': "8px"}},
100: {'label': '100%', 'style': {'font-size': "8px"}},
},
), width=6)
], style={'padding-bottom': "1em"}),
])


class SizeFocusedDataset(FunctionUI):
    """UI for SizeFocusedDataset generator."""

    def __init__(self, app: 'dash.dash.Dash'):
super().__init__(app)
self._num_files: int = 100
self._min_file_size: int = 1000
self._max_file_size: int = 8000
self._noise_min_file_size: int = 16000
self._noise_max_file_size: int = 24000
self._perc_noise: float = 5.
self._perc_files_x_day: float = 25.
self._size_function_generator = "gen_random_sizes"

    def __repr__(self):
return "Size Focused Dataset"

    def to_dict(self):
return {
'num_files': self._num_files,
'min_file_size': self._min_file_size,
'max_file_size': self._max_file_size,
'noise_min_file_size': self._noise_min_file_size,
'noise_max_file_size': self._noise_max_file_size,
'perc_noise': self._perc_noise,
'perc_files_x_day': self._perc_files_x_day,
'size_generator_function': self._size_function_generator,
}

    def callbacks(self):
@self._app.callback(
Output(f'{self.name_id}-num-file-val', 'children'),
[Input(f'{self.name_id}-num-files', 'value')])
def change_num_files(value):
self._num_files = value
return f"Num. Files: {value}"
@self._app.callback(
Output(f'{self.name_id}-file-size-val', 'children'),
[Input(f'{self.name_id}-file-size', 'value')],
)
def change_size(value):
self._min_file_size, self._max_file_size = value
return f"File Size (MB): {self._min_file_size}-{self._max_file_size}"
@self._app.callback(
Output(f'{self.name_id}-noise-file-size-val', 'children'),
[Input(f'{self.name_id}-noise-file-size', 'value')],
)
        def change_noise_size(value):
self._noise_min_file_size, self._noise_max_file_size = value
return f"Noise File Size (MB): {self._noise_min_file_size}-{self._noise_max_file_size}"
@self._app.callback(
Output(f'{self.name_id}-size-function-val', 'children'),
[Input(f'{self.name_id}-size-function', 'value')],
)
def update_function_ui(value):
self._size_function_generator = value
if value == "gen_random_sizes":
return "File size function generator: [0]"
elif value == "gen_in_range_random_sizes":
return "File size function generator: [1]"
@self._app.callback(
Output(f'{self.name_id}-perc-noise-val', 'children'),
[Input(f'{self.name_id}-perc-noise', 'value')],
)
        def change_percentage_noise(value):
self._perc_noise = value
return f"Noise: {value}%"
@self._app.callback(
Output(f'{self.name_id}-perc-files-x-day-val', 'children'),
[Input(f'{self.name_id}-perc-files-x-day', 'value')],
)
def change_percentage_files_x_day(value):
self._perc_files_x_day = value
return f"Files x day: {value}%"

    def elements(self):
return html.Div([
html.Div(
# For empty output callbacks
id=f'{self.name_id}-hidden-div-lambda-less',
style={'display': "none"}),
html.Div(
# For empty output callbacks
id=f'{self.name_id}-hidden-div-lambda-more',
style={'display': "none"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-num-file-val',
children="Num. Files: "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-num-files',
min=1,
max=100000,
step=1,
value=self._num_files,
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
10000: {'label': '10000', 'style': {'font-size': "8px"}},
20000: {'label': '20000', 'style': {'font-size': "8px"}},
30000: {'label': '30000', 'style': {'font-size': "8px"}},
50000: {'label': '50000', 'style': {'font-size': "8px"}},
100000: {'label': '100000', 'style': {'font-size': "8px"}},
},
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-file-size-val',
children="File Size (MB): "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.RangeSlider(
id=f'{self.name_id}-file-size',
min=1,
max=24000,
step=1,
value=[self._min_file_size, self._max_file_size],
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
2000: {'label': '2000', 'style': {'font-size': "8px"}},
4000: {'label': '4000', 'style': {'font-size': "8px"}},
8000: {'label': '8000', 'style': {'font-size': "8px"}},
16000: {'label': '16000', 'style': {'font-size': "8px"}},
24000: {'label': '24000', 'style': {'font-size': "8px"}},
},
allowCross=False,
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-noise-file-size-val',
children="Noise File Size (MB): "),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.RangeSlider(
id=f'{self.name_id}-noise-file-size',
min=1,
max=24000,
step=1,
value=[self._noise_min_file_size,
self._noise_max_file_size],
marks={
1000: {'label': '1000', 'style': {'font-size': "8px"}},
2000: {'label': '2000', 'style': {'font-size': "8px"}},
4000: {'label': '4000', 'style': {'font-size': "8px"}},
8000: {'label': '8000', 'style': {'font-size': "8px"}},
16000: {'label': '16000', 'style': {'font-size': "8px"}},
24000: {'label': '24000', 'style': {'font-size': "8px"}},
},
allowCross=False,
), width=6)
], style={'padding-bottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-size-function-val',
children="File size function generator: [1]"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Dropdown(
id=f'{self.name_id}-size-function',
options=[
{'label': "(0) gen random sizes",
'value': "gen_random_sizes"},
{'label': "(1) gen in range random sizes",
'value': "gen_in_range_random_sizes"},
],
value='gen_in_range_random_sizes'
), width=6),
], style={'padding-bottom': "2em"},),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-perc-noise-val',
children="Noise: %"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-perc-noise',
min=1,
max=100,
step=1,
value=self._perc_noise,
marks={
10: {'label': '10%', 'style': {'fontSize': "8px"}},
20: {'label': '20%', 'style': {'fontSize': "8px"}},
30: {'label': '30%', 'style': {'fontSize': "8px"}},
40: {'label': '40%', 'style': {'fontSize': "8px"}},
50: {'label': '50%', 'style': {'fontSize': "8px"}},
60: {'label': '60%', 'style': {'fontSize': "8px"}},
70: {'label': '70%', 'style': {'fontSize': "8px"}},
80: {'label': '80%', 'style': {'fontSize': "8px"}},
90: {'label': '90%', 'style': {'fontSize': "8px"}},
100: {'label': '100%', 'style': {'fontSize': "8px"}},
},
), width=6)
], style={'paddingBottom': "1em"}),
dbc.Row([
dbc.Col(
html.H5(id=f'{self.name_id}-perc-files-x-day-val',
children="Files x day: %"),
width={'size': 3, 'offset': 1}),
dbc.Col(dcc.Slider(
id=f'{self.name_id}-perc-files-x-day',
min=1,
max=100,
step=1,
value=self._perc_files_x_day,
marks={
10: {'label': '10%', 'style': {'fontSize': "8px"}},
20: {'label': '20%', 'style': {'fontSize': "8px"}},
30: {'label': '30%', 'style': {'fontSize': "8px"}},
40: {'label': '40%', 'style': {'fontSize': "8px"}},
50: {'label': '50%', 'style': {'fontSize': "8px"}},
60: {'label': '60%', 'style': {'fontSize': "8px"}},
70: {'label': '70%', 'style': {'fontSize': "8px"}},
80: {'label': '80%', 'style': {'fontSize': "8px"}},
90: {'label': '90%', 'style': {'fontSize': "8px"}},
100: {'label': '100%', 'style': {'fontSize': "8px"}},
},
), width=6)
], style={'paddingBottom': "1em"}),
])
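# The marks dictionaries in the sliders above repeat the same per-value style
# by hand; they could be generated with a small helper. A minimal sketch — the
# make_marks name is an assumption, not part of the original layout (note that
# Dash expects camelCased CSS keys in style dicts, e.g. fontSize):

```python
def make_marks(values, suffix=""):
    """Build a dcc.Slider/RangeSlider marks dict with a shared small font."""
    return {
        v: {'label': f"{v}{suffix}", 'style': {'fontSize': "8px"}}
        for v in values
    }

# marks for the percentage sliders
percent_marks = make_marks(range(10, 101, 10), suffix="%")
# marks for the file-size range sliders
size_marks = make_marks([1000, 2000, 4000, 8000, 16000, 24000])
```

# The slider definitions above could then pass marks=percent_marks or
# marks=size_marks instead of repeating the literal dicts.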
# File: loldib/getratings/models/NA/na_nidalee/na_nidalee_mid.py (repo: koliupy/loldib, license: Apache-2.0)
from getratings.models.ratings import Ratings
class NA_Nidalee_Mid_Aatrox(Ratings):
pass
class NA_Nidalee_Mid_Ahri(Ratings):
pass
class NA_Nidalee_Mid_Akali(Ratings):
pass
class NA_Nidalee_Mid_Alistar(Ratings):
pass
class NA_Nidalee_Mid_Amumu(Ratings):
pass
class NA_Nidalee_Mid_Anivia(Ratings):
pass
class NA_Nidalee_Mid_Annie(Ratings):
pass
class NA_Nidalee_Mid_Ashe(Ratings):
pass
class NA_Nidalee_Mid_AurelionSol(Ratings):
pass
class NA_Nidalee_Mid_Azir(Ratings):
pass
class NA_Nidalee_Mid_Bard(Ratings):
pass
class NA_Nidalee_Mid_Blitzcrank(Ratings):
pass
class NA_Nidalee_Mid_Brand(Ratings):
pass
class NA_Nidalee_Mid_Braum(Ratings):
pass
class NA_Nidalee_Mid_Caitlyn(Ratings):
pass
class NA_Nidalee_Mid_Camille(Ratings):
pass
class NA_Nidalee_Mid_Cassiopeia(Ratings):
pass
class NA_Nidalee_Mid_Chogath(Ratings):
pass
class NA_Nidalee_Mid_Corki(Ratings):
pass
class NA_Nidalee_Mid_Darius(Ratings):
pass
class NA_Nidalee_Mid_Diana(Ratings):
pass
class NA_Nidalee_Mid_Draven(Ratings):
pass
class NA_Nidalee_Mid_DrMundo(Ratings):
pass
class NA_Nidalee_Mid_Ekko(Ratings):
pass
class NA_Nidalee_Mid_Elise(Ratings):
pass
class NA_Nidalee_Mid_Evelynn(Ratings):
pass
class NA_Nidalee_Mid_Ezreal(Ratings):
pass
class NA_Nidalee_Mid_Fiddlesticks(Ratings):
pass
class NA_Nidalee_Mid_Fiora(Ratings):
pass
class NA_Nidalee_Mid_Fizz(Ratings):
pass
class NA_Nidalee_Mid_Galio(Ratings):
pass
class NA_Nidalee_Mid_Gangplank(Ratings):
pass
class NA_Nidalee_Mid_Garen(Ratings):
pass
class NA_Nidalee_Mid_Gnar(Ratings):
pass
class NA_Nidalee_Mid_Gragas(Ratings):
pass
class NA_Nidalee_Mid_Graves(Ratings):
pass
class NA_Nidalee_Mid_Hecarim(Ratings):
pass
class NA_Nidalee_Mid_Heimerdinger(Ratings):
pass
class NA_Nidalee_Mid_Illaoi(Ratings):
pass
class NA_Nidalee_Mid_Irelia(Ratings):
pass
class NA_Nidalee_Mid_Ivern(Ratings):
pass
class NA_Nidalee_Mid_Janna(Ratings):
pass
class NA_Nidalee_Mid_JarvanIV(Ratings):
pass
class NA_Nidalee_Mid_Jax(Ratings):
pass
class NA_Nidalee_Mid_Jayce(Ratings):
pass
class NA_Nidalee_Mid_Jhin(Ratings):
pass
class NA_Nidalee_Mid_Jinx(Ratings):
pass
class NA_Nidalee_Mid_Kalista(Ratings):
pass
class NA_Nidalee_Mid_Karma(Ratings):
pass
class NA_Nidalee_Mid_Karthus(Ratings):
pass
class NA_Nidalee_Mid_Kassadin(Ratings):
pass
class NA_Nidalee_Mid_Katarina(Ratings):
pass
class NA_Nidalee_Mid_Kayle(Ratings):
pass
class NA_Nidalee_Mid_Kayn(Ratings):
pass
class NA_Nidalee_Mid_Kennen(Ratings):
pass
class NA_Nidalee_Mid_Khazix(Ratings):
pass
class NA_Nidalee_Mid_Kindred(Ratings):
pass
class NA_Nidalee_Mid_Kled(Ratings):
pass
class NA_Nidalee_Mid_KogMaw(Ratings):
pass
class NA_Nidalee_Mid_Leblanc(Ratings):
pass
class NA_Nidalee_Mid_LeeSin(Ratings):
pass
class NA_Nidalee_Mid_Leona(Ratings):
pass
class NA_Nidalee_Mid_Lissandra(Ratings):
pass
class NA_Nidalee_Mid_Lucian(Ratings):
pass
class NA_Nidalee_Mid_Lulu(Ratings):
pass
class NA_Nidalee_Mid_Lux(Ratings):
pass
class NA_Nidalee_Mid_Malphite(Ratings):
pass
class NA_Nidalee_Mid_Malzahar(Ratings):
pass
class NA_Nidalee_Mid_Maokai(Ratings):
pass
class NA_Nidalee_Mid_MasterYi(Ratings):
pass
class NA_Nidalee_Mid_MissFortune(Ratings):
pass
class NA_Nidalee_Mid_MonkeyKing(Ratings):
pass
class NA_Nidalee_Mid_Mordekaiser(Ratings):
pass
class NA_Nidalee_Mid_Morgana(Ratings):
pass
class NA_Nidalee_Mid_Nami(Ratings):
pass
class NA_Nidalee_Mid_Nasus(Ratings):
pass
class NA_Nidalee_Mid_Nautilus(Ratings):
pass
class NA_Nidalee_Mid_Nidalee(Ratings):
pass
class NA_Nidalee_Mid_Nocturne(Ratings):
pass
class NA_Nidalee_Mid_Nunu(Ratings):
pass
class NA_Nidalee_Mid_Olaf(Ratings):
pass
class NA_Nidalee_Mid_Orianna(Ratings):
pass
class NA_Nidalee_Mid_Ornn(Ratings):
pass
class NA_Nidalee_Mid_Pantheon(Ratings):
pass
class NA_Nidalee_Mid_Poppy(Ratings):
pass
class NA_Nidalee_Mid_Quinn(Ratings):
pass
class NA_Nidalee_Mid_Rakan(Ratings):
pass
class NA_Nidalee_Mid_Rammus(Ratings):
pass
class NA_Nidalee_Mid_RekSai(Ratings):
pass
class NA_Nidalee_Mid_Renekton(Ratings):
pass
class NA_Nidalee_Mid_Rengar(Ratings):
pass
class NA_Nidalee_Mid_Riven(Ratings):
pass
class NA_Nidalee_Mid_Rumble(Ratings):
pass
class NA_Nidalee_Mid_Ryze(Ratings):
pass
class NA_Nidalee_Mid_Sejuani(Ratings):
pass
class NA_Nidalee_Mid_Shaco(Ratings):
pass
class NA_Nidalee_Mid_Shen(Ratings):
pass
class NA_Nidalee_Mid_Shyvana(Ratings):
pass
class NA_Nidalee_Mid_Singed(Ratings):
pass
class NA_Nidalee_Mid_Sion(Ratings):
pass
class NA_Nidalee_Mid_Sivir(Ratings):
pass
class NA_Nidalee_Mid_Skarner(Ratings):
pass
class NA_Nidalee_Mid_Sona(Ratings):
pass
class NA_Nidalee_Mid_Soraka(Ratings):
pass
class NA_Nidalee_Mid_Swain(Ratings):
pass
class NA_Nidalee_Mid_Syndra(Ratings):
pass
class NA_Nidalee_Mid_TahmKench(Ratings):
pass
class NA_Nidalee_Mid_Taliyah(Ratings):
pass
class NA_Nidalee_Mid_Talon(Ratings):
pass
class NA_Nidalee_Mid_Taric(Ratings):
pass
class NA_Nidalee_Mid_Teemo(Ratings):
pass
class NA_Nidalee_Mid_Thresh(Ratings):
pass
class NA_Nidalee_Mid_Tristana(Ratings):
pass
class NA_Nidalee_Mid_Trundle(Ratings):
pass
class NA_Nidalee_Mid_Tryndamere(Ratings):
pass
class NA_Nidalee_Mid_TwistedFate(Ratings):
pass
class NA_Nidalee_Mid_Twitch(Ratings):
pass
class NA_Nidalee_Mid_Udyr(Ratings):
pass
class NA_Nidalee_Mid_Urgot(Ratings):
pass
class NA_Nidalee_Mid_Varus(Ratings):
pass
class NA_Nidalee_Mid_Vayne(Ratings):
pass
class NA_Nidalee_Mid_Veigar(Ratings):
pass
class NA_Nidalee_Mid_Velkoz(Ratings):
pass
class NA_Nidalee_Mid_Vi(Ratings):
pass
class NA_Nidalee_Mid_Viktor(Ratings):
pass
class NA_Nidalee_Mid_Vladimir(Ratings):
pass
class NA_Nidalee_Mid_Volibear(Ratings):
pass
class NA_Nidalee_Mid_Warwick(Ratings):
pass
class NA_Nidalee_Mid_Xayah(Ratings):
pass
class NA_Nidalee_Mid_Xerath(Ratings):
pass
class NA_Nidalee_Mid_XinZhao(Ratings):
pass
class NA_Nidalee_Mid_Yasuo(Ratings):
pass
class NA_Nidalee_Mid_Yorick(Ratings):
pass
class NA_Nidalee_Mid_Zac(Ratings):
pass
class NA_Nidalee_Mid_Zed(Ratings):
pass
class NA_Nidalee_Mid_Ziggs(Ratings):
pass
class NA_Nidalee_Mid_Zilean(Ratings):
pass
class NA_Nidalee_Mid_Zyra(Ratings):
pass
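# Hand-writing one empty Ratings subclass per opposing champion, as above,
# could also be done dynamically with type(). A minimal sketch: the
# three-champion list is truncated for illustration, and the Ratings class
# below is a stand-in for getratings.models.ratings.Ratings:

```python
class Ratings:
    """Stand-in for getratings.models.ratings.Ratings."""
    pass

CHAMPIONS = ["Aatrox", "Ahri", "Akali"]  # truncated list for illustration

generated = {}
for champ in CHAMPIONS:
    class_name = f"NA_Nidalee_Mid_{champ}"
    # type(name, bases, namespace) creates a class object at runtime
    generated[class_name] = type(class_name, (Ratings,), {})

# expose the classes at module level, mirroring the hand-written file
globals().update(generated)
```

# If Ratings is an ORM model, dynamic creation may need extra care (e.g.
# setting __module__ for model registration), so this is only a sketch.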
# -*- coding: utf-8 -*-
# File: src/ikazuchi/tests/data/rst/api_call_sourceblock.py (repo: t2y/ikazuchi, license: Apache-2.0)
DATA_SET = [
# standalone sourceblock
( # 0
[u"::\n"
u"",
u" first code",
u" second code",
u""],
u"::\n",
[u"::\n",
u" first code",
u" second code",
u""]
),
( # 1
[u"::\n"
u"",
u" first code",
u" second code"],
u"::\n",
[u"::\n",
u" first code",
u" second code"]
),
( # 2
[u"::\n"
u"",
u"",
u" first code",
u" second code"],
u"::\n",
[u"::\n"
u"",
u"",
u" first code",
u" second code"]
),
( # 3
[u":: \n"
u"",
u" first code",
u" second code",
u""],
u":: \n",
[u":: \n",
u" first code",
u" second code",
u""]
),
( # 4
[u"::\n"
u" first code",
u" second code",
u""],
u"::\n",
[u"::\n first code",
u" second code",
u""]
),
# cases where the first line has text before "::"
( # 5
[u"that is::\n"
u"",
u" first code",
u" second code",
u""],
u"that is::\n",
[u"that is<span class=notranslate>::</span>\n",
u" first code",
u" second code",
u""]
),
( # 6
[u"that is::\n"
u"",
u" first code",
u" second code"],
u"that is::\n",
[u"that is<span class=notranslate>::</span>\n",
u" first code",
u" second code"]
),
( # 7
[u"that is::\n"
u"",
u"",
u" first code",
u" second code"],
u"that is::\n",
[u"that is<span class=notranslate>::</span>\n",
u"",
u" first code",
u" second code"]
),
( # 8
[u"that is:: \n"
u"",
u" first code",
u" second code",
u""],
u"that is:: \n",
[u"that is<span class=notranslate>:: </span>\n",
u" first code",
u" second code",
u""]
),
( # 9
[u"that is::\n",
u" first code",
u" second code",
u""],
u"that is::\n",
[u"that is<span class=notranslate>::</span>\n",
u" first code",
u" second code",
u""]
),
]
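# A subtle point in the data above: entries written as u"::\n" followed by
# u"" on the next line *without* a comma are a single string, because Python
# concatenates adjacent string literals; with a comma they are separate list
# elements. A minimal sketch of the difference:

```python
# no comma: adjacent literals fuse into one element
fused = ["::\n"
         "",
         " first code"]

# trailing comma: each literal is its own element
split = ["::\n",
         "",
         " first code"]

assert fused == ["::\n", " first code"]       # 2 elements
assert split == ["::\n", "", " first code"]   # 3 elements
```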
# File: senlin-7.0.0/senlin/tests/unit/engine/actions/test_replace_nodes.py (repo: scottwedge/OpenStack-Stein, license: Apache-2.0)
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
from senlin.common import consts
from senlin.engine.actions import base as ab
from senlin.engine.actions import cluster_action as ca
from senlin.engine import cluster as cm
from senlin.engine import dispatcher
from senlin.objects import action as ao
from senlin.objects import dependency as dobj
from senlin.objects import node as no
from senlin.tests.unit.common import base
from senlin.tests.unit.common import utils
@mock.patch.object(cm.Cluster, 'load')
class ClusterReplaceNodesTest(base.SenlinTestCase):
def setUp(self):
super(ClusterReplaceNodesTest, self).setUp()
self.ctx = utils.dummy_context()
@mock.patch.object(ao.Action, 'update')
@mock.patch.object(ab.Action, 'create')
@mock.patch.object(no.Node, 'get')
@mock.patch.object(dobj.Dependency, 'create')
@mock.patch.object(dispatcher, 'start_action')
@mock.patch.object(ca.ClusterAction, '_wait_for_dependents')
def test_do_replace_nodes(self, mock_wait, mock_start, mock_dep,
mock_get_node, mock_action, mock_update,
mock_load):
cluster = mock.Mock(id='CLUSTER_ID', desired_capacity=10)
mock_load.return_value = cluster
action = ca.ClusterAction(cluster.id, 'CLUSTER_ACTION', self.ctx)
action.id = 'CLUSTER_ACTION_ID'
action.inputs = {'O_NODE_1': 'R_NODE_1'}
action.outputs = {}
origin_node = mock.Mock(id='O_NODE_1', cluster_id='CLUSTER_ID',
ACTIVE='ACTIVE', status='ACTIVE')
replace_node = mock.Mock(id='R_NODE_1', cluster_id='',
ACTIVE='ACTIVE', status='ACTIVE')
mock_get_node.side_effect = [origin_node, replace_node]
mock_action.side_effect = ['NODE_LEAVE_1', 'NODE_JOIN_1']
mock_wait.return_value = (action.RES_OK, 'Free to fly!')
# do the action
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_OK, res_code)
self.assertEqual('Completed replacing nodes.', res_msg)
mock_get_node.assert_has_calls([
mock.call(action.context, 'O_NODE_1'),
mock.call(action.context, 'R_NODE_1')])
mock_load.assert_called_once_with(
action.context,
'CLUSTER_ID')
mock_action.assert_has_calls([
mock.call(action.context, 'O_NODE_1', 'NODE_LEAVE',
name='node_leave_O_NODE_1',
cause='Derived Action'),
mock.call(action.context, 'R_NODE_1', 'NODE_JOIN',
name='node_join_R_NODE_1',
cause='Derived Action',
inputs={'cluster_id': 'CLUSTER_ID'})])
mock_dep.assert_has_calls([
mock.call(action.context,
['NODE_JOIN_1'],
'CLUSTER_ACTION_ID'),
mock.call(action.context,
['NODE_JOIN_1'],
'NODE_LEAVE_1')])
mock_update.assert_has_calls([
mock.call(action.context,
'NODE_JOIN_1',
{'status': 'READY'}),
mock.call(action.context,
'NODE_LEAVE_1',
{'status': 'READY'})])
mock_start.assert_called_once_with()
mock_wait.assert_called_once_with()
cluster.remove_node.assert_called_once_with(origin_node)
cluster.add_node.assert_called_once_with(replace_node)
cluster.eval_status.assert_called_once_with(
action.context, consts.CLUSTER_REPLACE_NODES)
@mock.patch.object(no.Node, 'get')
def test_do_replace_nodes_original_not_found(self, mock_get_node,
mock_load):
action = ca.ClusterAction('ID', 'CLUSTER_ACTION', self.ctx)
action.inputs = {'ORIGIN_NODE': 'REPLACE_NODE'}
origin_node = None
replace_node = mock.Mock(id='REPLACE_NODE', cluster_id='',
ACTIVE='ACTIVE', status='ACTIVE')
mock_get_node.side_effect = [origin_node, replace_node]
# do the action
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_ERROR, res_code)
self.assertEqual('Original node ORIGIN_NODE not found.',
res_msg)
@mock.patch.object(no.Node, 'get')
def test_do_replace_nodes_replacement_not_found(self, mock_get_node,
mock_load):
action = ca.ClusterAction('ID', 'CLUSTER_ACTION', self.ctx)
action.inputs = {'ORIGIN_NODE': 'REPLACE_NODE'}
origin_node = mock.Mock(id='ORIGIN_NODE', cluster_id='CLUSTER_ID',
ACTIVE='ACTIVE', status='ACTIVE')
replace_node = None
mock_get_node.side_effect = [origin_node, replace_node]
# do the action
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_ERROR, res_code)
self.assertEqual('Replacement node REPLACE_NODE not found.',
res_msg)
@mock.patch.object(no.Node, 'get')
def test_do_replace_nodes_not_a_member(self, mock_get_node,
mock_load):
cluster = mock.Mock(id='FAKE_CLUSTER')
mock_load.return_value = cluster
action = ca.ClusterAction(cluster.id, 'CLUSTER_ACTION', self.ctx)
action.inputs = {'ORIGIN_NODE': 'REPLACE_NODE'}
origin_node = mock.Mock(id='ORIGIN_NODE', cluster_id='')
mock_get_node.return_value = origin_node
# do action
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_ERROR, res_code)
self.assertEqual('Node ORIGIN_NODE is not a member of the '
'cluster FAKE_CLUSTER.', res_msg)
@mock.patch.object(no.Node, 'get')
def test_do_replace_nodes_node_already_member(self, mock_get_node,
mock_load):
cluster = mock.Mock(id='FAKE_CLUSTER')
mock_load.return_value = cluster
action = ca.ClusterAction(cluster.id, 'CLUSTER_ACTION', self.ctx)
action.inputs = {'ORIGIN_NODE': 'REPLACE_NODE'}
replace_node = mock.Mock(id='REPLACE_NODE',
cluster_id='FAKE_CLUSTER')
mock_get_node.return_value = replace_node
# do it
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_ERROR, res_code)
self.assertEqual('Node REPLACE_NODE is already owned by cluster '
'FAKE_CLUSTER.', res_msg)
@mock.patch.object(no.Node, 'get')
def test_do_replace_nodes_in_other_cluster(self, mock_get_node,
mock_load):
cluster = mock.Mock(id='CLUSTER_ID', desired_capacity=10)
mock_load.return_value = cluster
action = ca.ClusterAction(cluster.id, 'CLUSTER_ACTION', self.ctx)
action.id = 'CLUSTER_ACTION_ID'
action.inputs = {'ORIGIN_NODE': 'REPLACE_NODE'}
action.outputs = {}
origin_node = mock.Mock(id='ORIGIN_NODE', cluster_id='CLUSTER_ID',
ACTIVE='ACTIVE', status='ACTIVE')
replace_node = mock.Mock(id='REPLACE_NODE', cluster_id='FAKE_CLUSTER',
ACTIVE='ACTIVE', status='ACTIVE')
mock_get_node.side_effect = [origin_node, replace_node]
# do it
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_ERROR, res_code)
self.assertEqual('Node REPLACE_NODE is already owned by cluster '
'FAKE_CLUSTER.', res_msg)
@mock.patch.object(no.Node, 'get')
def test_do_replace_nodes_node_not_active(self, mock_get_node, mock_load):
cluster = mock.Mock(id='CLUSTER_ID', desired_capacity=10)
mock_load.return_value = cluster
action = ca.ClusterAction(cluster.id, 'CLUSTER_ACTION', self.ctx)
action.id = 'CLUSTER_ACTION_ID'
action.inputs = {'ORIGIN_NODE': 'REPLACE_NODE'}
action.outputs = {}
origin_node = mock.Mock(id='ORIGIN_NODE', cluster_id='CLUSTER_ID',
ACTIVE='ACTIVE', status='ACTIVE')
replace_node = mock.Mock(id='REPLACE_NODE', cluster_id='',
ACTIVE='ACTIVE', status='ERROR')
mock_get_node.side_effect = [origin_node, replace_node]
# do it
res_code, res_msg = action.do_replace_nodes()
# assertions
self.assertEqual(action.RES_ERROR, res_code)
self.assertEqual("Node REPLACE_NODE is not in ACTIVE status.", res_msg)
@mock.patch.object(ao.Action, 'update')
@mock.patch.object(ab.Action, 'create')
@mock.patch.object(no.Node, 'get')
@mock.patch.object(dobj.Dependency, 'create')
@mock.patch.object(dispatcher, 'start_action')
@mock.patch.object(ca.ClusterAction, '_wait_for_dependents')
def test_do_replace_failed_waiting(self, mock_wait, mock_start, mock_dep,
mock_get_node, mock_action,
mock_update, mock_load):
cluster = mock.Mock(id='CLUSTER_ID', desired_capacity=10)
mock_load.return_value = cluster
action = ca.ClusterAction(cluster.id, 'CLUSTER_ACTION', self.ctx)
action.id = 'CLUSTER_ACTION_ID'
action.inputs = {'O_NODE_1': 'R_NODE_1'}
action.outputs = {}
origin_node = mock.Mock(id='O_NODE_1', cluster_id='CLUSTER_ID',
ACTIVE='ACTIVE', status='ACTIVE')
replace_node = mock.Mock(id='R_NODE_1', cluster_id='',
ACTIVE='ACTIVE', status='ACTIVE')
mock_get_node.side_effect = [origin_node, replace_node]
mock_action.side_effect = ['NODE_LEAVE_1', 'NODE_JOIN_1']
mock_wait.return_value = (action.RES_TIMEOUT, 'Timeout!')
# do the action
res_code, res_msg = action.do_replace_nodes()
# assertions
mock_action.assert_has_calls([
mock.call(action.context, 'O_NODE_1', 'NODE_LEAVE',
name='node_leave_O_NODE_1',
cause='Derived Action'),
mock.call(action.context, 'R_NODE_1', 'NODE_JOIN',
name='node_join_R_NODE_1',
cause='Derived Action',
inputs={'cluster_id': 'CLUSTER_ID'})])
mock_dep.assert_has_calls([
mock.call(action.context,
['NODE_JOIN_1'],
'CLUSTER_ACTION_ID'),
mock.call(action.context,
['NODE_JOIN_1'],
'NODE_LEAVE_1')])
mock_update.assert_has_calls([
mock.call(action.context,
'NODE_JOIN_1',
{'status': 'READY'}),
mock.call(action.context,
'NODE_LEAVE_1',
{'status': 'READY'})])
self.assertEqual(action.RES_TIMEOUT, res_code)
self.assertEqual('Timeout!', res_msg)
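# The tests above script successive return values through
# mock_get_node.side_effect lists. A standalone sketch of that unittest.mock
# pattern — the Service class here is illustrative, not part of senlin:

```python
from unittest import mock

class Service:
    def fetch(self, key):
        raise RuntimeError("real backend unavailable in tests")

svc = Service()
with mock.patch.object(Service, 'fetch') as mock_fetch:
    # each call consumes the next side_effect item, like mock_get_node above
    mock_fetch.side_effect = ['origin_node', 'replacement_node']
    first = svc.fetch('O_NODE_1')
    second = svc.fetch('R_NODE_1')

# recorded calls survive the patch context, so call order can be asserted
mock_fetch.assert_has_calls([mock.call('O_NODE_1'), mock.call('R_NODE_1')])
```

# Exiting the with-block restores the real Service.fetch, just as the
# method decorators in the test class undo their patches after each test.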
# File: apps/demo/migrations/0001_initial.py (repo: pydtools/django_table_sharding_example, license: MIT)
# Generated by Django 3.2.9 on 2021-12-01 09:46
import apps.base.model_sharding
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='DeviceLog2021',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2021',
'verbose_name_plural': 'DeviceLog2021',
'db_table': 'demo_device_log_2021',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2022',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2022',
'verbose_name_plural': 'DeviceLog2022',
'db_table': 'demo_device_log_2022',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2023',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2023',
'verbose_name_plural': 'DeviceLog2023',
'db_table': 'demo_device_log_2023',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2024',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2024',
'verbose_name_plural': 'DeviceLog2024',
'db_table': 'demo_device_log_2024',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2025',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2025',
'verbose_name_plural': 'DeviceLog2025',
'db_table': 'demo_device_log_2025',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2026',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2026',
'verbose_name_plural': 'DeviceLog2026',
'db_table': 'demo_device_log_2026',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2027',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2027',
'verbose_name_plural': 'DeviceLog2027',
'db_table': 'demo_device_log_2027',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2028',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2028',
'verbose_name_plural': 'DeviceLog2028',
'db_table': 'demo_device_log_2028',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2029',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2029',
'verbose_name_plural': 'DeviceLog2029',
'db_table': 'demo_device_log_2029',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2030',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2030',
'verbose_name_plural': 'DeviceLog2030',
'db_table': 'demo_device_log_2030',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='DeviceLog2031',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('status', models.SmallIntegerField(default=0, help_text='状态', verbose_name='status')),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
('create_time', models.DateTimeField(auto_now_add=True, help_text='创建时间')),
('update_time', models.DateTimeField(auto_now=True, help_text='修改时间')),
],
options={
'verbose_name': 'DeviceLog2031',
'verbose_name_plural': 'DeviceLog2031',
'db_table': 'demo_device_log_2031',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='Log2020',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('level', models.PositiveSmallIntegerField(default=0)),
('content', models.TextField()),
('time', models.DateTimeField(auto_now_add=True)),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
],
options={
'verbose_name': 'Log2020',
'verbose_name_plural': 'Log2020',
'db_table': 'demo_log_2020',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='Log2021',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('level', models.PositiveSmallIntegerField(default=0)),
('content', models.TextField()),
('time', models.DateTimeField(auto_now_add=True)),
('is_deleted', models.BooleanField(default=False, help_text='是否删除', verbose_name='is_deleted')),
],
options={
'verbose_name': 'Log2021',
'verbose_name_plural': 'Log2021',
'db_table': 'demo_log_2021',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User0',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User0',
'verbose_name_plural': 'User0',
'db_table': 'demo_user_0',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User1',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User1',
'verbose_name_plural': 'User1',
'db_table': 'demo_user_1',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User2',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User2',
'verbose_name_plural': 'User2',
'db_table': 'demo_user_2',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User3',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User3',
'verbose_name_plural': 'User3',
'db_table': 'demo_user_3',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User4',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User4',
'verbose_name_plural': 'User4',
'db_table': 'demo_user_4',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User5',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User5',
'verbose_name_plural': 'User5',
'db_table': 'demo_user_5',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User6',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User6',
'verbose_name_plural': 'User6',
'db_table': 'demo_user_6',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User7',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User7',
'verbose_name_plural': 'User7',
'db_table': 'demo_user_7',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User8',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User8',
'verbose_name_plural': 'User8',
'db_table': 'demo_user_8',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
migrations.CreateModel(
name='User9',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=50, unique=True)),
('name', models.CharField(max_length=50)),
('age', models.IntegerField(default=18)),
('active', models.BooleanField(default=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'User9',
'verbose_name_plural': 'User9',
'db_table': 'demo_user_9',
},
bases=(models.Model, apps.base.model_sharding.ShardingMixin),
),
]
| 50.059406 | 114 | 0.557555 | 1,913 | 20,224 | 5.64506 | 0.063774 | 0.094731 | 0.093712 | 0.105936 | 0.867858 | 0.847764 | 0.847764 | 0.847764 | 0.843134 | 0.843134 | 0 | 0.023799 | 0.301919 | 20,224 | 403 | 115 | 50.183623 | 0.741111 | 0.002225 | 0 | 0.689394 | 1 | 0 | 0.15929 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.005051 | 0 | 0.015152 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a1516d95136883397bb8a81ec72a44041dc8a0b6 | 58,436 | py | Python | src/autorun/generate_singlecam_sdf.py | jannsta1/torf | b7866bf1a824b3ab6f44b7fa5da0c7a781766fd0 | [
"BSD-2-Clause"
] | 3 | 2021-06-15T12:01:22.000Z | 2022-01-21T23:17:37.000Z | src/autorun/generate_singlecam_sdf.py | jannsta1/torf | b7866bf1a824b3ab6f44b7fa5da0c7a781766fd0 | [
"BSD-2-Clause"
] | null | null | null | src/autorun/generate_singlecam_sdf.py | jannsta1/torf | b7866bf1a824b3ab6f44b7fa5da0c7a781766fd0 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python2
import numpy as np
import argparse
import rospy
import sys
import os
def generate_singlecam_sdf_function(pitch_angle_deg=0.0, ros_update_rate=10.0,
yaw_angle_left=-2.3562, yaw_angle_right=2.3562,
include_cam_visualisation=True,
fov_horizontal_rad=0.73675,
cam_w=752,
cam_h=480,
):
pitch_angle_rad = (np.pi/2.0) - np.deg2rad(pitch_angle_deg)
# get the PX4 source directory from the environment
px4_dir = os.environ.get('PX4_SRC_DIR')
if not px4_dir:
raise EnvironmentError('the PX4_SRC_DIR environment variable must be set for this script to work')
else:
save_dir = os.path.join(px4_dir, 'Tools/sitl_gazebo/models/typhoon_2cam/typhoon_2cam.sdf')
print ('saving output to: {}'.format(save_dir))
# main camera string
main_camera_bit = """
<sensor name='camera' type='camera'>
<pose>-0.051 0 -0.162 0 """ + str(pitch_angle_rad) + """ 3.14159</pose>
<camera name='__default__'>
<horizontal_fov>""" + str(fov_horizontal_rad) + """</horizontal_fov> <!-- e.g. 1.7453293 rad = 100 deg, 1.5707963 rad = 90 deg, 1.0471976 rad = 60 deg -->
<image>
<width>""" + str(cam_w) + """</width>
<height>""" + str(cam_h) + """</height>
</image>
<clip>
<near>0.1</near>
<far>100</far>
</clip>
</camera>
<always_on>1</always_on>
<update_rate>""" + str(ros_update_rate) + """</update_rate>
<visualize>true</visualize>
<plugin name='camera_plugin' filename='libgazebo_ros_camera.so'>
<!--<robotNamespace></robotNamespace>-->
<alwaysOn>true</alwaysOn>
<imageTopicName>image_raw</imageTopicName>
<cameraInfoTopicName>camera_info</cameraInfoTopicName>
<updateRate>""" + str(ros_update_rate) + """</updateRate>
<cameraName>usb_cam</cameraName>
<frameName>/robot_camera_link</frameName>
</plugin>
</sensor>
"""
preliminary_str = """
<sdf version='1.5'>
<model name='typhoon_2cam'>
<!-- Typhoon H body -->
<pose>0 0 0.26 0 0 3.1415927</pose>
<link name='base_link'>
<pose>0 0 0 0 0 0</pose>
<inertial>
<!-- rear reference point X: -100.044mm, CAD offset: 1001.049mm -->
<!-- top reference point Z: 33.8663mm, CAD offset: 42.8698mm -->
<pose>0.001005 0 -0.0090035 0 0 0</pose>
<mass>2.02</mass>
<inertia>
<ixx>0.011</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.015</iyy>
<iyz>0</iyz>
<izz>0.021</izz>
</inertia>
</inertial>
<collision name='base_link_collision'>
<pose>0 0 0.0 0 0 0</pose>
<geometry>
<box>
<size>0.67 0.67 0.15</size>
</box>
</geometry>
<surface>
<contact>
<ode>
<min_depth>0.001</min_depth>
<max_vel>0</max_vel>
</ode>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='base_link_visual'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/main_body_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<link name="cgo3_mount_link">
<inertial>
<!-- place holder -->
<pose>-0.041 0 -0.162 0 0 0</pose>
<mass>0.1</mass>
<inertia>
<ixx>0.001</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.001</iyy>
<iyz>0</iyz>
<izz>0.001</izz>
</inertia>
</inertial>
<visual name='cgo3_mount_visual'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/cgo3_mount_remeshed_v1.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
</link>
<joint name='cgo3_mount_joint' type='revolute'>
<child>cgo3_mount_link</child>
<parent>base_link</parent>
<pose>0 0 0 0 0 0</pose>
<axis>
<xyz>0 0 1</xyz>
<limit>
<lower>0</lower>
<upper>0</upper>
<effort>100</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>1</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name="cgo3_vertical_arm_link">
<inertial>
<!-- place holder -->
<pose>-0.041 0 -0.162 0 0 0</pose>
<mass>0.1</mass>
<inertia>
<ixx>0.001</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.001</iyy>
<iyz>0</iyz>
<izz>0.001</izz>
</inertia>
</inertial>
<visual name='cgo3_vertical_arm_visual'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/cgo3_vertical_arm_remeshed_v1.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
</link>
<joint name='cgo3_vertical_arm_joint' type='revolute'>
<child>cgo3_vertical_arm_link</child>
<parent>cgo3_mount_link</parent>
<pose>-0.026 0 -0.10 0 0 0</pose>
<!--
<controlIndex>6</controlIndex>
-->
<axis>
<xyz>0 0 1</xyz>
<limit>
<lower>-1e16</lower>
<upper>1e16</upper>
<effort>100</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.1</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
<limit>
<!-- testing soft limits -->
<cfm>0.1</cfm>
<erp>0.2</erp>
</limit>
</ode>
</physics>
</joint>
<link name="cgo3_horizontal_arm_link">
<inertial>
<!-- place holder -->
<pose>-0.041 0 -0.081 0 0 0</pose>
<mass>0.1</mass>
<inertia>
<ixx>0.001</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.001</iyy>
<iyz>0</iyz>
<izz>0.001</izz>
</inertia>
</inertial>
<visual name='cgo3_horizontal_arm_visual'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/cgo3_horizontal_arm_remeshed_v1.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
</link>
<joint name='cgo3_horizontal_arm_joint' type='revolute'>
<child>cgo3_horizontal_arm_link</child>
<parent>cgo3_vertical_arm_link</parent>
<pose>0.026 0 -0.162 0 0 0</pose>
<!--
<controlIndex>7</controlIndex>
-->
<axis>
<xyz>-1 0 0</xyz>
<limit>
<lower>-0.785398</lower>
<upper>0.785398</upper>
<effort>100</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.1</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
<limit>
<!-- testing soft limits -->
<cfm>0.1</cfm>
<erp>0.2</erp>
</limit>
</ode>
</physics>
</joint>
<link name="cgo3_camera_link">
<inertial>
<!-- place holder -->
<pose>-0.041 0 -0.162 0 0 0</pose>
<mass>0.1</mass>
<inertia>
<ixx>0.001</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.001</iyy>
<iyz>0</iyz>
<izz>0.001</izz>
</inertia>
</inertial>
<collision name='cgo3_camera_collision'>
<pose>-0.041 0 -0.162 0 0 0</pose>
<geometry>
<sphere>
<radius>0.035</radius>
</sphere>
</geometry>
<surface>
<friction>
<ode>
<mu>1</mu>
<mu2>1</mu2>
</ode>
</friction>
<contact>
<ode>
<kp>1e+8</kp>
<kd>1</kd>
<max_vel>0.01</max_vel>
<min_depth>0.001</min_depth>
</ode>
</contact>
</surface>
</collision>
<visual name='cgo3_camera_visual'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/cgo3_camera_remeshed_v1.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<sensor name="camera_imu" type="imu">
<always_on>1</always_on>
</sensor>
"""
final_str = """
</link>
<joint name='cgo3_camera_joint' type='revolute'>
<child>cgo3_camera_link</child>
<parent>cgo3_horizontal_arm_link</parent>
<pose>-0.041 0.03 -0.162 0 0 0</pose>
<axis>
<xyz>0 -1 0</xyz>
<limit>
<lower>-1.05</lower>
<upper>2.09</upper>
<effort>100</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.1</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
<limit>
<!-- testing soft limits -->
<cfm>0.1</cfm>
<erp>0.2</erp>
</limit>
</ode>
</physics>
</joint>
<link name="left_leg">
<inertial>
<!-- place holder -->
<pose>0 -0.14314 -0.207252 0 0 0</pose>
<mass>0.1</mass>
<inertia>
<ixx>0.001</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.001</iyy>
<iyz>0</iyz>
<izz>0.001</izz>
</inertia>
</inertial>
<collision name='collision'>
<pose>-0.005 -0.14314 -0.207252 0 1.56893 0</pose>
<geometry>
<cylinder>
<radius>0.012209</radius>
<length>0.3</length>
</cylinder>
</geometry>
<surface>
<friction>
<ode>
<mu>1</mu>
<mu2>1</mu2>
</ode>
</friction>
<contact>
<ode>
<kp>1e+8</kp>
<kd>1</kd>
<max_vel>0.01</max_vel>
<min_depth>0.001</min_depth>
</ode>
</contact>
</surface>
</collision>
<collision name='collision_bar'>
<pose>0.00052 -0.08503 -0.121187 -0.501318 0 0</pose>
<geometry>
<cylinder>
<radius>0.00914984</radius>
<length>0.176893</length>
</cylinder>
</geometry>
<surface>
<friction>
<ode>
<mu>1</mu>
<mu2>1</mu2>
</ode>
</friction>
<contact>
<ode>
<kp>1e+8</kp>
<kd>1</kd>
<max_vel>0.01</max_vel>
<min_depth>0.001</min_depth>
</ode>
</contact>
</surface>
</collision>
<visual name='base_link_left_leg'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/leg2_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
</link>
<joint name='left_leg_joint' type='revolute'>
<child>left_leg</child>
<parent>base_link</parent>
<pose>0.00026 -0.040515 -0.048 0 0 0</pose>
<axis>
<xyz>-1 0 0</xyz>
<limit>
<lower>0</lower>
<upper>1</upper>
<effort>100</effort>
<velocity>-1</velocity>
<stiffness>100000000</stiffness>
<dissipation>1</dissipation>
</limit>
<dynamics>
<damping>0.1</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name="right_leg">
<pose>0 0 0 0 0 0</pose>
<inertial>
<!-- place holder -->
<pose>0 0.14314 -0.207252 0 0 0</pose>
<mass>0.1</mass>
<inertia>
<ixx>0.001</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.001</iyy>
<iyz>0</iyz>
<izz>0.001</izz>
</inertia>
</inertial>
<collision name='collision'>
<pose>-0.005 0.14314 -0.207252 0 1.56893 0</pose>
<geometry>
<cylinder>
<radius>0.012209</radius>
<length>0.3</length>
</cylinder>
</geometry>
<surface>
<friction>
<ode>
<mu>1</mu>
<mu2>1</mu2>
</ode>
</friction>
<contact>
<ode>
<kp>1e+8</kp>
<kd>1</kd>
<max_vel>0.01</max_vel>
<min_depth>0.001</min_depth>
</ode>
</contact>
</surface>
</collision>
<collision name='collision_bar'>
<pose>0.00052 0.08503 -0.121187 0.501318 0 0</pose>
<geometry>
<cylinder>
<radius>0.00914984</radius>
<length>0.176893</length>
</cylinder>
</geometry>
<surface>
<friction>
<ode>
<mu>1</mu>
<mu2>1</mu2>
</ode>
</friction>
<contact>
<ode>
<kp>1e+8</kp>
<kd>1</kd>
<max_vel>0.01</max_vel>
<min_depth>0.001</min_depth>
</ode>
</contact>
</surface>
</collision>
<visual name='base_link_right_leg'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/leg1_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
</link>
<joint name='right_leg_joint' type='revolute'>
<child>right_leg</child>
<parent>base_link</parent>
<pose>0.00026 0.040515 -0.048 0 0 0</pose>
<axis>
<xyz>1 0 0</xyz>
<limit>
<lower>0</lower>
<upper>1</upper>
<effort>100</effort>
<velocity>-1</velocity>
<stiffness>100000000</stiffness>
<dissipation>1</dissipation>
</limit>
<dynamics>
<damping>0.1</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name='typhoon_2cam/imu_link'>
<pose>0 0 0 0 0 3.1415927</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.015</mass>
<inertia>
<ixx>1e-05</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>1e-05</iyy>
<iyz>0</iyz>
<izz>1e-05</izz>
</inertia>
</inertial>
</link>
<joint name='typhoon_2cam/imu_joint' type='revolute'>
<child>typhoon_2cam/imu_link</child>
<parent>base_link</parent>
<axis>
<xyz>1 0 0</xyz>
<limit>
<lower>0</lower>
<upper>0</upper>
<effort>0</effort>
<velocity>0</velocity>
</limit>
<dynamics>
<spring_reference>0</spring_reference>
<spring_stiffness>0</spring_stiffness>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
</joint>
<link name='rotor_3'>
<pose>0.211396 0.119762 0.082219 0 0 0</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.005</mass>
<inertia>
<ixx>9.75e-07</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.000273104</iyy>
<iyz>0</iyz>
<izz>0.000274004</izz>
</inertia>
</inertial>
<collision name='rotor_3_collision'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<cylinder>
<length>0.005</length>
<radius>0.128</radius>
</cylinder>
</geometry>
<surface>
<contact>
<ode/>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='rotor_3_visual'>
<pose>-0.211396 -0.119762 -0.082219 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/prop_ccw_assembly_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/Blue</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<joint name='rotor_3_joint' type='revolute'>
<child>rotor_3</child>
<parent>base_link</parent>
<axis>
<xyz>0.0446 -0.0825 1.8977</xyz>
<limit>
<lower>-1e+16</lower>
<upper>1e+16</upper>
<effort>10</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.005</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name='rotor_0'>
<pose>-0.209396 0.122762 0.082219 0 0 2.09439510239</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.005</mass>
<inertia>
<ixx>9.75e-07</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.000273104</iyy>
<iyz>0</iyz>
<izz>0.000274004</izz>
</inertia>
</inertial>
<collision name='rotor_0_collision'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<cylinder>
<length>0.005</length>
<radius>0.128</radius>
</cylinder>
</geometry>
<surface>
<contact>
<ode/>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='rotor_0_visual'>
<pose>-0.211396 -0.119762 -0.082219 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/prop_ccw_assembly_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/Blue</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<joint name='rotor_0_joint' type='revolute'>
<child>rotor_0</child>
<parent>base_link</parent>
<axis>
<xyz>0.046 0.0827 1.8977</xyz>
<limit>
<lower>-1e+16</lower>
<upper>1e+16</upper>
<effort>10</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.005</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name='rotor_4'>
<pose>-0.00187896 0.242705 0.0822169 0 0 0</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.005</mass>
<inertia>
<ixx>9.75e-07</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.000273104</iyy>
<iyz>0</iyz>
<izz>0.000274004</izz>
</inertia>
</inertial>
<collision name='rotor_4_collision'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<cylinder>
<length>0.005</length>
<radius>0.128</radius>
</cylinder>
</geometry>
<surface>
<contact>
<ode/>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='rotor_4_visual'>
<pose>0.00187896 -0.242705 -0.0822169 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/prop_cw_assembly_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<joint name='rotor_4_joint' type='revolute'>
<child>rotor_4</child>
<parent>base_link</parent>
<axis>
<xyz>-0.09563 -0.0003 1.8976</xyz>
<limit>
<lower>-1e+16</lower>
<upper>1e+16</upper>
<effort>10</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.005</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name='rotor_1'>
<pose>0.211396 -0.119762 0.082219 0 0 -2.09439510239</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.005</mass>
<inertia>
<ixx>9.75e-07</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.000273104</iyy>
<iyz>0</iyz>
<izz>0.000274004</izz>
</inertia>
</inertial>
<collision name='rotor_1_collision'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<cylinder>
<length>0.005</length>
<radius>0.128</radius>
</cylinder>
</geometry>
<surface>
<contact>
<ode/>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='rotor_1_visual'>
<pose>0.00187896 -0.242705 -0.0822169 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/prop_cw_assembly_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<joint name='rotor_1_joint' type='revolute'>
<child>rotor_1</child>
<parent>base_link</parent>
<axis>
<xyz>0.0486 0.0811 1.8976</xyz>
<limit>
<lower>-1e+16</lower>
<upper>1e+16</upper>
<effort>10</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.005</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name='rotor_5'>
<pose>-0.00187896 -0.242705 0.0822169 0 0 -2.09439510239</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.005</mass>
<inertia>
<ixx>9.75e-07</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.000273104</iyy>
<iyz>0</iyz>
<izz>0.000274004</izz>
</inertia>
</inertial>
<collision name='rotor_5_collision'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<cylinder>
<length>0.005</length>
<radius>0.128</radius>
</cylinder>
</geometry>
<surface>
<contact>
<ode/>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='rotor_5_visual'>
<pose>-0.211396 -0.119762 -0.082219 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/prop_ccw_assembly_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/Blue</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<joint name='rotor_5_joint' type='revolute'>
<child>rotor_5</child>
<parent>base_link</parent>
<axis>
<xyz>-0.033996 -0.0006 0.68216</xyz>
<limit>
<lower>-1e+16</lower>
<upper>1e+16</upper>
<effort>10</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.005</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<link name='rotor_2'>
<pose>-0.209396 -0.122762 0.082219 0 0 2.09439510239</pose>
<inertial>
<pose>0 0 0 0 0 0</pose>
<mass>0.005</mass>
<inertia>
<ixx>9.75e-07</ixx>
<ixy>0</ixy>
<ixz>0</ixz>
<iyy>0.000273104</iyy>
<iyz>0</iyz>
<izz>0.000274004</izz>
</inertia>
</inertial>
<collision name='rotor_2_collision'>
<pose>0 0 0 0 0 0</pose>
<geometry>
<cylinder>
<length>0.005</length>
<radius>0.128</radius>
</cylinder>
</geometry>
<surface>
<contact>
<ode/>
</contact>
<friction>
<ode/>
</friction>
</surface>
</collision>
<visual name='rotor_2_visual'>
<pose>0.00187896 -0.242705 -0.0822169 0 0 0</pose>
<geometry>
<mesh>
<scale>0.001 0.001 0.001</scale>
<uri>model://typhoon_2cam/meshes/prop_cw_assembly_remeshed_v3.stl</uri>
</mesh>
</geometry>
<material>
<script>
<name>Gazebo/DarkGrey</name>
<uri>file://media/materials/scripts/gazebo.material</uri>
</script>
</material>
</visual>
<gravity>1</gravity>
<velocity_decay/>
<self_collide>0</self_collide>
</link>
<joint name='rotor_2_joint' type='revolute'>
<child>rotor_2</child>
<parent>base_link</parent>
<axis>
<xyz>0.0404 -0.0876 1.8976</xyz>
<limit>
<lower>-1e+16</lower>
<upper>1e+16</upper>
<effort>10</effort>
<velocity>-1</velocity>
</limit>
<dynamics>
<damping>0.005</damping>
</dynamics>
<use_parent_model_frame>1</use_parent_model_frame>
</axis>
<physics>
<ode>
<implicit_spring_damper>1</implicit_spring_damper>
</ode>
</physics>
</joint>
<!--
<include>
<uri>model://sonar</uri>
</include>
<joint name="sonar_joint" type="revolute">
<child>sonar_model::link</child>
<parent>typhoon_2cam::base_link</parent>
<axis>
<xyz>0 0 1</xyz>
<limit>
<upper>0</upper>
<lower>0</lower>
</limit>
</axis>
</joint>
-->
<plugin name='rosbag' filename='libgazebo_multirotor_base_plugin.so'>
<robotNamespace></robotNamespace>
<linkName>base_link</linkName>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
</plugin>
<plugin name='front_right_motor_model' filename='libgazebo_motor_model.so'>
<robotNamespace></robotNamespace>
<jointName>rotor_0_joint</jointName>
<linkName>rotor_0</linkName>
<turningDirection>ccw</turningDirection>
<timeConstantUp>0.0125</timeConstantUp>
<timeConstantDown>0.025</timeConstantDown>
<maxRotVelocity>1500</maxRotVelocity>
<motorConstant>8.54858e-06</motorConstant>
<momentConstant>0.06</momentConstant>
<commandSubTopic>/gazebo/command/motor_speed</commandSubTopic>
<motorNumber>4</motorNumber>
<rotorDragCoefficient>0.000806428</rotorDragCoefficient>
<rollingMomentCoefficient>1e-06</rollingMomentCoefficient>
<motorSpeedPubTopic>/motor_speed/4</motorSpeedPubTopic>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
<!--
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
</plugin>
<plugin name='back_left_motor_model' filename='libgazebo_motor_model.so'>
<robotNamespace></robotNamespace>
<jointName>rotor_1_joint</jointName>
<linkName>rotor_1</linkName>
<turningDirection>cw</turningDirection>
<timeConstantUp>0.0125</timeConstantUp>
<timeConstantDown>0.025</timeConstantDown>
<maxRotVelocity>1500</maxRotVelocity>
<motorConstant>8.54858e-06</motorConstant>
<momentConstant>0.06</momentConstant>
<commandSubTopic>/gazebo/command/motor_speed</commandSubTopic>
<motorNumber>5</motorNumber>
<rotorDragCoefficient>0.000806428</rotorDragCoefficient>
<rollingMomentCoefficient>1e-06</rollingMomentCoefficient>
<motorSpeedPubTopic>/motor_speed/5</motorSpeedPubTopic>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
<!--
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
</plugin>
<plugin name='front_left_motor_model' filename='libgazebo_motor_model.so'>
<robotNamespace></robotNamespace>
<jointName>rotor_2_joint</jointName>
<linkName>rotor_2</linkName>
<turningDirection>cw</turningDirection>
<timeConstantUp>0.0125</timeConstantUp>
<timeConstantDown>0.025</timeConstantDown>
<maxRotVelocity>1500</maxRotVelocity>
<motorConstant>8.54858e-06</motorConstant>
<momentConstant>0.06</momentConstant>
<commandSubTopic>/gazebo/command/motor_speed</commandSubTopic>
<motorNumber>2</motorNumber>
<rotorDragCoefficient>0.000806428</rotorDragCoefficient>
<rollingMomentCoefficient>1e-06</rollingMomentCoefficient>
<motorSpeedPubTopic>/motor_speed/2</motorSpeedPubTopic>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
<!--
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
</plugin>
<plugin name='back_right_motor_model' filename='libgazebo_motor_model.so'>
<robotNamespace></robotNamespace>
<jointName>rotor_3_joint</jointName>
<linkName>rotor_3</linkName>
<turningDirection>ccw</turningDirection>
<timeConstantUp>0.0125</timeConstantUp>
<timeConstantDown>0.025</timeConstantDown>
<maxRotVelocity>1500</maxRotVelocity>
<motorConstant>8.54858e-06</motorConstant>
<momentConstant>0.06</momentConstant>
<commandSubTopic>/gazebo/command/motor_speed</commandSubTopic>
<motorNumber>3</motorNumber>
<rotorDragCoefficient>0.000806428</rotorDragCoefficient>
<rollingMomentCoefficient>1e-06</rollingMomentCoefficient>
<motorSpeedPubTopic>/motor_speed/3</motorSpeedPubTopic>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
<!--
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
</plugin>
<plugin name='mid_right_motor_model' filename='libgazebo_motor_model.so'>
<robotNamespace></robotNamespace>
<jointName>rotor_4_joint</jointName>
<linkName>rotor_4</linkName>
<turningDirection>cw</turningDirection>
<timeConstantUp>0.0125</timeConstantUp>
<timeConstantDown>0.025</timeConstantDown>
<maxRotVelocity>1500</maxRotVelocity>
<motorConstant>8.54858e-06</motorConstant>
<momentConstant>0.06</momentConstant>
<commandSubTopic>/gazebo/command/motor_speed</commandSubTopic>
<motorNumber>0</motorNumber>
<rotorDragCoefficient>0.000806428</rotorDragCoefficient>
<rollingMomentCoefficient>1e-06</rollingMomentCoefficient>
<motorSpeedPubTopic>/motor_speed/0</motorSpeedPubTopic>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
<!--
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
</plugin>
<plugin name='front_left_motor_model' filename='libgazebo_motor_model.so'>
<robotNamespace></robotNamespace>
<jointName>rotor_5_joint</jointName>
<linkName>rotor_5</linkName>
<turningDirection>ccw</turningDirection>
<timeConstantUp>0.0125</timeConstantUp>
<timeConstantDown>0.025</timeConstantDown>
<maxRotVelocity>1500</maxRotVelocity>
<motorConstant>8.54858e-06</motorConstant>
<momentConstant>0.06</momentConstant>
<commandSubTopic>/gazebo/command/motor_speed</commandSubTopic>
<motorNumber>1</motorNumber>
<rotorDragCoefficient>0.000806428</rotorDragCoefficient>
<rollingMomentCoefficient>1e-06</rollingMomentCoefficient>
<motorSpeedPubTopic>/motor_speed/1</motorSpeedPubTopic>
<rotorVelocitySlowdownSim>10</rotorVelocitySlowdownSim>
<!--
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
</plugin>
<plugin name="gps_plugin" filename="libgazebo_gps_plugin.so">
<robotNamespace></robotNamespace>
<gpsNoise>false</gpsNoise>
</plugin>
<plugin name='magnetometer_plugin' filename='libgazebo_magnetometer_plugin.so'>
<robotNamespace/>
<pubRate>20</pubRate>
<!--<noiseDensity>0.0004</noiseDensity>
<randomWalk>6.4e-06</randomWalk>-->
<noiseDensity>0.0</noiseDensity>
<randomWalk>0.0</randomWalk>
<biasCorrelationTime>600</biasCorrelationTime>
<magTopic>/mag</magTopic>
</plugin>
<plugin name='barometer_plugin' filename='libgazebo_barometer_plugin.so'>
<robotNamespace/>
<pubRate>10</pubRate>
<baroTopic>/baro</baroTopic>
</plugin>
<plugin name='mavlink_interface' filename='libgazebo_mavlink_interface.so'>
<robotNamespace></robotNamespace>
<imuSubTopic>/imu</imuSubTopic>
<gpsSubTopic>/gps</gpsSubTopic>
<magSubTopic>/mag</magSubTopic>
<baroSubTopic>/baro</baroSubTopic>
<lidarSubTopic>/link/lidar</lidarSubTopic> <!-- JS -->
<!-- <lidarSubTopic>/sf10a/link/lidar</lidarSubTopic> # JS-->
<mavlink_addr>INADDR_ANY</mavlink_addr>
<mavlink_udp_port>14560</mavlink_udp_port>
<serialEnabled>false</serialEnabled>
<serialDevice>/dev/ttyACM0</serialDevice>
<baudRate>921600</baudRate>
<qgc_addr>INADDR_ANY</qgc_addr>
<qgc_udp_port>14550</qgc_udp_port>
<sdk_addr>INADDR_ANY</sdk_addr>
<sdk_udp_port>14540</sdk_udp_port>
<hil_mode>false</hil_mode>
<hil_state_level>false</hil_state_level>
<enable_lockstep>true</enable_lockstep> <!-- JS -->
<use_tcp>true</use_tcp>
<motorSpeedCommandPubTopic>/gazebo/command/motor_speed</motorSpeedCommandPubTopic>
<control_channels>
<channel name="rotor0">
<input_index>0</input_index>
<input_offset>0</input_offset>
<input_scaling>1500</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>100</zero_position_armed>
<joint_control_type>velocity</joint_control_type>
<!-- gazebo_motor_model has the joint_control_pid active in this model
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
<joint_name>rotor_4_joint</joint_name>
</channel>
<channel name="rotor1">
<input_index>1</input_index>
<input_offset>0</input_offset>
<input_scaling>1500</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>100</zero_position_armed>
<joint_control_type>velocity</joint_control_type>
<!-- gazebo_motor_model has the joint_control_pid active in this model
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
<joint_name>rotor_5_joint</joint_name>
</channel>
<channel name="rotor2">
<input_index>2</input_index>
<input_offset>0</input_offset>
<input_scaling>1500</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>100</zero_position_armed>
<joint_control_type>velocity</joint_control_type>
<!-- gazebo_motor_model has the joint_control_pid active in this model
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
<joint_name>rotor_2_joint</joint_name>
</channel>
<channel name="rotor3">
<input_index>3</input_index>
<input_offset>0</input_offset>
<input_scaling>1500</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>100</zero_position_armed>
<joint_control_type>velocity</joint_control_type>
<!-- gazebo_motor_model has the joint_control_pid active in this model
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
<joint_name>rotor_3_joint</joint_name>
</channel>
<channel name="rotor4">
<input_index>4</input_index>
<input_offset>0</input_offset>
<input_scaling>1500</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>100</zero_position_armed>
<joint_control_type>velocity</joint_control_type>
<!-- gazebo_motor_model has the joint_control_pid active in this model
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
<joint_name>rotor_0_joint</joint_name>
</channel>
<channel name="rotor5">
<input_index>5</input_index>
<input_offset>0</input_offset>
<input_scaling>1500</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>100</zero_position_armed>
<joint_control_type>velocity</joint_control_type>
<!-- gazebo_motor_model has the joint_control_pid active in this model
<joint_control_pid>
<p>0.1</p>
<i>0</i>
<d>0</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>3</cmdMax>
<cmdMin>-3</cmdMin>
</joint_control_pid>
-->
<joint_name>rotor_1_joint</joint_name>
</channel>
<channel name="gimbal_roll">
<input_index>6</input_index>
<input_offset>0</input_offset>
<input_scaling>-3.1415</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>0</zero_position_armed>
<joint_control_type>position_gztopic</joint_control_type>
<gztopic>/gimbal_roll_cmd</gztopic>
<joint_name>typhoon_2cam::cgo3_camera_joint</joint_name>
</channel>
<channel name="gimbal_pitch">
<input_index>7</input_index>
<input_offset>0</input_offset>
<input_scaling>3.1415</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>0</zero_position_armed>
<joint_control_type>position_gztopic</joint_control_type>
<gztopic>/gimbal_pitch_cmd</gztopic>
<joint_name>typhoon_2cam::cgo3_camera_joint</joint_name>
</channel>
<channel name="gimbal_yaw">
<input_index>8</input_index>
<input_offset>0</input_offset>
<input_scaling>-3.1415</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>0</zero_position_armed>
<joint_control_type>position_gztopic</joint_control_type>
<gztopic>/gimbal_yaw_cmd</gztopic>
<joint_name>typhoon_2cam::cgo3_vertical_arm_joint</joint_name>
</channel>
<channel name="left_leg">
<input_index>9</input_index>
<input_offset>1</input_offset>
<input_scaling>0.5</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>0</zero_position_armed>
<joint_control_type>position</joint_control_type>
<joint_control_pid>
<p>3.5</p>
<i>0.5</i>
<d>0</d>
<iMax>4</iMax>
<iMin>-4</iMin>
<cmdMax>6</cmdMax>
<cmdMin>-6</cmdMin>
</joint_control_pid>
<joint_name>left_leg_joint</joint_name>
</channel>
<channel name="right_leg">
<input_index>10</input_index>
<input_offset>1</input_offset>
<input_scaling>0.5</input_scaling>
<zero_position_disarmed>0</zero_position_disarmed>
<zero_position_armed>0</zero_position_armed>
<joint_control_type>position</joint_control_type>
<joint_control_pid>
<p>3.5</p>
<i>0.5</i>
<d>0</d>
<iMax>4</iMax>
<iMin>-4</iMin>
<cmdMax>6</cmdMax>
<cmdMin>-6</cmdMin>
</joint_control_pid>
<joint_name>right_leg_joint</joint_name>
</channel>
</control_channels>
</plugin>
<static>0</static>
<plugin name='gazebo_imu_plugin' filename='libgazebo_imu_plugin.so'>
<robotNamespace></robotNamespace>
<linkName>typhoon_2cam/imu_link</linkName>
<imuTopic>/imu</imuTopic>
<gyroscopeNoiseDensity>0.0</gyroscopeNoiseDensity>
<gyroscopeRandomWalk>0.0</gyroscopeRandomWalk>
<gyroscopeTurnOnBiasSigma>0.0</gyroscopeTurnOnBiasSigma>
<accelerometerNoiseDensity>0.0</accelerometerNoiseDensity>
<accelerometerRandomWalk>0.0</accelerometerRandomWalk>
<accelerometerTurnOnBiasSigma>0.0</accelerometerTurnOnBiasSigma>
<!--
<gyroscopeNoiseDensity>0.0003394</gyroscopeNoiseDensity>
<gyroscopeRandomWalk>3.8785e-05</gyroscopeRandomWalk>
<gyroscopeBiasCorrelationTime>1000.0</gyroscopeBiasCorrelationTime>
<gyroscopeTurnOnBiasSigma>0.0087</gyroscopeTurnOnBiasSigma>
<accelerometerNoiseDensity>0.004</accelerometerNoiseDensity>
<accelerometerRandomWalk>0.006</accelerometerRandomWalk>
<accelerometerBiasCorrelationTime>300.0</accelerometerBiasCorrelationTime>
<accelerometerTurnOnBiasSigma>0.196</accelerometerTurnOnBiasSigma>
-->
</plugin>
<plugin name='gimbal_controller' filename='libgazebo_gimbal_controller_plugin.so'>
<joint_yaw>typhoon_2cam::cgo3_vertical_arm_joint</joint_yaw>
<joint_roll>typhoon_2cam::cgo3_horizontal_arm_joint</joint_roll>
<joint_pitch>typhoon_2cam::cgo3_camera_joint</joint_pitch>
<control_gimbal_channels>
<channel>
<joint_control_pid>
<p>0.5</p>
<i>0.01245</i>
<d>0.01</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>1.0</cmdMax>
<cmdMin>-1.0</cmdMin>
</joint_control_pid>
<joint_axis>joint_yaw</joint_axis>
</channel>
<!--
<channel>
<joint_control_pid>
<p>0.5</p>
<i>0.01245</i>
<d>0.01</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>1.0</cmdMax>
<cmdMin>-1.0</cmdMin>
</joint_control_pid>
<joint_axis>joint_yaw</joint_axis>
</channel>
<channel>
<joint_control_pid>
<p>0.8</p>
<i>0.035</i>
<d>0.02</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>0.3</cmdMax>
<cmdMin>-0.3</cmdMin>
</joint_control_pid>
<joint_axis>joint_roll</joint_axis>
</channel>
## untouched
<channel>
<joint_control_pid>
<p>2.068</p>
<i>0.01245</i>
<d>0.01</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>1.0</cmdMax>
<cmdMin>-1.0</cmdMin>
</joint_control_pid>
<joint_axis>joint_yaw</joint_axis>
</channel>
<channel>
<joint_control_pid>
<p>2.068</p>
<i>0.01245</i>
<d>0.01</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>0.3</cmdMax>
<cmdMin>-0.3</cmdMin>
</joint_control_pid>
<joint_axis>joint_roll</joint_axis>
</channel>
<channel>
<joint_control_pid>
<p>2.068</p>
<i>0.01245</i>
<d>0.01</d>
<iMax>0</iMax>
<iMin>0</iMin>
<cmdMax>0.3</cmdMax>
<cmdMin>-0.3</cmdMin>
</joint_control_pid>
<joint_axis>joint_pitch</joint_axis>
</channel>
-->
<channel>
<joint_control_pid>
<p>15</p>
<i>0.01245</i>
<d>0.01</d>
<iMax>0.5</iMax>
<iMin>-0.5</iMin>
<cmdMax>0.9</cmdMax>
<cmdMin>-0.9</cmdMin>
</joint_control_pid>
<joint_axis>joint_roll</joint_axis>
</channel>
<channel>
<joint_control_pid>
<p>7.5</p>
<i>0.001245</i>
<d>0.01</d>
<iMax>0.5</iMax>
<iMin>-0.5</iMin>
<cmdMax>0.9</cmdMax>
<cmdMin>-0.9</cmdMin>
</joint_control_pid>
<joint_axis>joint_pitch</joint_axis>
</channel>
</control_gimbal_channels>
<gimbal_imu>camera_imu</gimbal_imu>
</plugin>
<!-- JS
<include>
<uri>model://sf10a</uri>
<pose>0.08 0 -0.04 0 0 0</pose>
</include>
<joint name="lidar_joint" type="fixed">
<child>sf10a::link</child>
<parent>base_link</parent>
<axis>
<xyz>0 0 1</xyz>
<limit>
<upper>0</upper>
<lower>0</lower>
</limit>
</axis>
</joint>
-->
<!-- This lidar was too short range -->
<include>
<uri>model://lidar_long</uri>
<pose>0.08 0 -0.08 0 3.1415 0</pose>
</include>
<joint name="lidar_joint" type="fixed">
<child>lidar_long::link</child>
<parent>base_link</parent>
<axis>
<xyz>0 0 1</xyz>
<limit>
<upper>0</upper>
<lower>0</lower>
</limit>
</axis>
</joint>
<plugin name="p3d_camera_controller" filename="libgazebo_ros_p3d.so">
<bodyName>cgo3_camera_link</bodyName>
<topicName>camera_gt</topicName>
<!-- <frameName>test_link</frameName> leave blank so that the world frame is used here -->
<updateRate>100.0</updateRate>
</plugin>
<plugin name="p3d_body_controller" filename="libgazebo_ros_p3d.so">
<bodyName>base_link</bodyName>
<topicName>body_ground_truth</topicName>
<!-- <frameName>test_link</frameName> leave blank so that the world frame is used here -->
<updateRate>100.0</updateRate>
</plugin>
<plugin name="imu_plugin" filename="libgazebo_ros_imu.so">
<alwaysOn>true</alwaysOn>
<bodyName>cgo3_camera_link</bodyName>
<topicName>camera_imu_ros</topicName>
<serviceName>cam_imu_service</serviceName>
<gaussianNoise>0.0</gaussianNoise>
<updateRate>100.0</updateRate>
</plugin>
</model>
</sdf>
"""
full_str = preliminary_str + main_camera_bit + final_str
# print (full_str)
# print main_camera_bit
# savefile
with open(save_dir, 'w') as filehandle:
filehandle.write(full_str)
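The motor plugins in the generated SDF follow the RotorS-style convention in which static thrust scales with the square of rotor speed (thrust ≈ motorConstant · ω²). A small hedged helper for sanity-checking the configured limits; the formula is an assumption based on gazebo_motor_model's documented model, not something this script computes:

```python
def max_rotor_thrust(motor_constant=8.54858e-06, max_rot_velocity=1500.0):
    """Approximate maximum per-rotor thrust in newtons.

    Assumes the RotorS model F = motorConstant * omega**2; the defaults are
    the values written into the SDF above.
    """
    return motor_constant * max_rot_velocity ** 2
```

With the defaults above this gives roughly 19.2 N per rotor, a quick way to check that a parameter edit still lets six rotors lift the airframe.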
if __name__ == '__main__':
    parser = argparse.ArgumentParser(description="This parses instructions for autorun.")
    # todo - can this be combined with argparse cx?
    parser.add_argument('-p', '--pitch_offset_deg', type=float, default=0,
                        help='pitch angle w.r.t. the ground - 0 is at the ground, '
                             '90 is horizontal in the direction of motion')
    args = parser.parse_args(rospy.myargv(argv=sys.argv)[1:])
    print(args)
    generate_singlecam_sdf_function(pitch_angle_deg=args.pitch_offset_deg)
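The `joint_control_pid` blocks in the SDF above (`p`/`i`/`d`, `iMax`/`iMin`, `cmdMax`/`cmdMin`) correspond to a clamped PID loop. The sketch below is an illustrative model of how those fields interact, not the Gazebo plugin's actual implementation:

```python
def pid_step(error, prev_error, integral, dt, p, i, d,
             i_max, i_min, cmd_max, cmd_min):
    """One clamped PID update: returns (command, new_integral)."""
    # Accumulate the error and clamp it to the iMin/iMax anti-windup bounds.
    integral = min(max(integral + error * dt, i_min), i_max)
    cmd = p * error + i * integral + d * (error - prev_error) / dt
    # Saturate the output at the cmdMin/cmdMax limits.
    return min(max(cmd, cmd_min), cmd_max), integral

# Gains taken from the left_leg channel above (p=3.5, i=0.5, d=0).
cmd, acc = pid_step(error=1.0, prev_error=0.0, integral=0.0, dt=0.1,
                    p=3.5, i=0.5, d=0.0, i_max=4, i_min=-4,
                    cmd_max=6, cmd_min=-6)
```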
a1c3bd74d6f2b1d7a97017a1367f044d9218ca1d | 8,183 | py | Python | test/test_matrix_algorithms.py | evgenyim/bsse-spring-2020-graph-BD | 73e98918c9a7bebec866fd2c0e4d40dfe45e8c9d | [
"MIT"
] | 3 | 2020-02-20T17:34:23.000Z | 2020-04-03T12:58:44.000Z | test/test_matrix_algorithms.py | evgenyim/bsse-spring-2020-graph-BD | 73e98918c9a7bebec866fd2c0e4d40dfe45e8c9d | [
"MIT"
] | 1 | 2020-05-13T06:52:24.000Z | 2020-05-13T06:52:24.000Z | test/test_matrix_algorithms.py | evgenyim/bsse-spring-2020-graph-BD | 73e98918c9a7bebec866fd2c0e4d40dfe45e8c9d | [
"MIT"
] | 2 | 2020-02-20T17:38:14.000Z | 2020-04-24T09:45:26.000Z | import os
import tempfile
from src.matrix_algorithms import *
def test_evalCFPQ():
g = Grammar()
g_path = os.path.dirname(__file__) + '/resources/test3.txt'
g.read_from_file(g_path)
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph.txt'
gr.read_graph(gr_path)
t = evalCFPQ(g, gr)
for left, rules in g.rules.items():
for rule in rules:
if len(rule) == 1:
assert rule[0].islower()
else:
assert len(rule) == 2
assert rule[0].isupper()
assert rule[1].isupper()
m = {}
for term in g.nonterminals:
m[term] = t[term].toarray()
pairsS = [(0, 2), (0, 3), (1, 2), (1, 3), (2, 2), (2, 3)]
pairsA = [(0, 1), (1, 2), (2, 0)]
pairsB = [(2, 3), (3, 2)]
pairsS1 = [(0, 2), (0, 3), (1, 2), (1, 3), (2, 2), (2, 3)]
n = len(gr.vertices)
for i in range(n):
for j in range(n):
for term in g.nonterminals:
if term == 'S' and (i, j) in pairsS:
assert m[term].item((i, j))
elif term == 'A' and (i, j) in pairsA:
assert m[term].item((i, j))
elif term == 'B' and (i, j) in pairsB:
assert m[term].item((i, j))
elif term == 'S1' and (i, j) in pairsS1:
assert m[term].item((i, j))
else:
assert not m[term].item((i, j))
def test_evalCFPQ_empty_graph():
g = Grammar()
g_path = os.path.dirname(__file__) + '/resources/test3.txt'
g.read_from_file(g_path)
temp = tempfile.NamedTemporaryFile()
gr = Graph()
gr_path = temp.name
gr.read_graph(gr_path)
t = evalCFPQ(g, gr)
assert t['S'].toarray().size == 0
def test_evalCFPQ_amb_grammar():
g = Grammar()
path = os.path.dirname(__file__) + '/resources/ambiguous_grammar.txt'
g.read_from_file(path)
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph2.txt'
gr.read_graph(gr_path)
t = evalCFPQ(g, gr)
m = {}
for term in g.nonterminals:
m[term] = t[term].toarray()
pairsA = [(0, 1), (0, 2)]
pairsQ2 = [(0, 1)]
pairsQ3 = [(0, 2)]
n = len(gr.vertices)
for i in range(n):
for j in range(n):
for term in g.nonterminals:
if term == 'A' and (i, j) in pairsA:
assert m[term].item((i, j))
elif term == 'Q2' and (i, j) in pairsQ2:
assert m[term].item((i, j))
elif term == 'Q3' and (i, j) in pairsQ3:
assert m[term].item((i, j))
else:
assert not m[term].item((i, j))
def test_evalCFPQ_inh_amb_grammar():
g = Grammar()
path = os.path.dirname(__file__) + '/resources/inh_amb_grammar.txt'
g.read_from_file(path)
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph3.txt'
gr.read_graph(gr_path)
t = evalCFPQ(g, gr)
m = {}
for term in g.nonterminals:
m[term] = t[term].toarray()
n = len(gr.vertices)
for i in range(n):
for j in range(n):
assert not m['S'].item((i, j))
def test_evalCFPQ_inh_amb_grammar2():
g = Grammar()
path = os.path.dirname(__file__) + '/resources/inh_amb_grammar.txt'
g.read_from_file(path)
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph4.txt'
gr.read_graph(gr_path)
t = evalCFPQ(g, gr)
m = {}
for term in g.nonterminals:
m[term] = t[term].toarray()
n = len(gr.vertices)
for i in range(n):
for j in range(n):
if i == 0 and j == 3:
assert m['S'].item((i, j))
else:
assert not m['S'].item((i, j))
def test_evalCFPQ_from_file():
g_path = os.path.dirname(__file__) + '/resources/test3.txt'
gr_path = os.path.dirname(__file__) + '/resources/graph.txt'
key_path = os.path.dirname(__file__) + '/resources/test_key_evalCPFQ.txt'
temp = tempfile.NamedTemporaryFile()
evalCFPQ_from_file(g_path, gr_path, temp.name)
    with open(temp.name) as got, open(key_path) as want:
        assert got.readlines() == want.readlines()
def test_evalCFPQ_tensor():
g_path = os.path.dirname(__file__) + '/resources/test_tensor.txt'
    with open(g_path) as file:
        lines = file.read().splitlines()
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph.txt'
gr.read_graph(gr_path)
t_gr, _, terms, _ = evalCFPQ_tensor(lines, gr)
m = {}
for term in terms:
m[term] = t_gr[term].toarray()
pairsS = [(0, 2), (0, 3), (1, 2), (1, 3), (2, 2), (2, 3)]
pairsA = [(0, 1), (1, 2), (2, 0)]
pairsB = [(2, 3), (3, 2)]
n = len(gr.vertices)
for i in range(n):
for j in range(n):
for term in terms:
if term == 'S' and (i, j) in pairsS:
assert m[term].item((i, j))
elif term == 'a' and (i, j) in pairsA:
assert m[term].item((i, j))
elif term == 'b' and (i, j) in pairsB:
assert m[term].item((i, j))
else:
assert not m[term].item((i, j))
def test_evalCFPQ_tensor_empty_graph():
g_path = os.path.dirname(__file__) + '/resources/test_tensor.txt'
    with open(g_path) as file:
        lines = file.read().splitlines()
temp = tempfile.NamedTemporaryFile()
gr = Graph()
gr_path = temp.name
gr.read_graph(gr_path)
t_gr, _, terms, _ = evalCFPQ_tensor(lines, gr)
assert t_gr['S'].nnz == 0
def test_evalCFPQ_tensor_amb_grammar():
path = os.path.dirname(__file__) + '/resources/tensor_ambiguous_grammar.txt'
    with open(path) as file:
        lines = file.read().splitlines()
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph2.txt'
gr.read_graph(gr_path)
t_gr, _, terms, _ = evalCFPQ_tensor(lines, gr)
m = {}
for term in terms:
m[term] = t_gr[term].toarray()
pairsA = [(0, 1), (0, 2)]
pairsp = [(0, 1)]
pairsm = [(0, 2)]
n = len(gr.vertices)
for i in range(n):
for j in range(n):
for term in terms:
if (term == 'A' or term == 'a') and (i, j) in pairsA:
assert m[term].item((i, j))
elif term == 'p' and (i, j) in pairsp:
assert m[term].item((i, j))
elif term == 'm' and (i, j) in pairsm:
assert m[term].item((i, j))
else:
assert not m[term].item((i, j))
def test_evalCFPQ_tensor_inh_amb_grammar():
path = os.path.dirname(__file__) + '/resources/tensor_inh_amb_grammar.txt'
    with open(path) as file:
        lines = file.read().splitlines()
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph3.txt'
gr.read_graph(gr_path)
t_gr, _, terms, _ = evalCFPQ_tensor(lines, gr)
m = {}
for term in terms:
m[term] = t_gr[term].toarray()
n = len(gr.vertices)
for i in range(n):
for j in range(n):
assert not m['S'].item((i, j))
def test_evalCFPQ_tensor_inh_amb_grammar2():
path = os.path.dirname(__file__) + '/resources/tensor_inh_amb_grammar.txt'
    with open(path) as file:
        lines = file.read().splitlines()
gr = Graph()
gr_path = os.path.dirname(__file__) + '/resources/graph4.txt'
gr.read_graph(gr_path)
t_gr, _, terms, _ = evalCFPQ_tensor(lines, gr)
m = {}
for term in terms:
m[term] = t_gr[term].toarray()
n = len(gr.vertices)
for i in range(n):
for j in range(n):
if i == 0 and j == 3:
assert m['S'].item((i, j))
else:
assert not m['S'].item((i, j))
def test_evalCFPQ_tensor_from_file():
g_path = os.path.dirname(__file__) + '/resources/test_tensor.txt'
gr_path = os.path.dirname(__file__) + '/resources/graph.txt'
key_path = os.path.dirname(__file__) + '/resources/test_key_evalCPFQ_tensor.txt'
temp = tempfile.NamedTemporaryFile()
evalCFPQ_tensor_from_file(g_path, gr_path, temp.name)
    with open(temp.name) as got, open(key_path) as want:
        assert got.readlines() == want.readlines()
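The tests above exercise a matrix-based CFPQ evaluator from `src.matrix_algorithms`. The core idea — a boolean matrix product propagating nonterminals over graph edges until a fixpoint — can be sketched independently of that module. The mini-grammar encoding below (`terminal_rules`, `binary_rules`) is hypothetical and does not match the project's grammar-file format; the pair-set join plays the role of the boolean matrix multiplication:

```python
def cfpq(edges, terminal_rules, binary_rules):
    """Tiny CFPQ fixpoint over sets of reachable vertex pairs.

    edges: list of (u, label, v); terminal_rules maps label -> nonterminal;
    binary_rules is a list of (head, left, right) CNF productions.
    """
    pairs = {nt: set() for nt in
             set(terminal_rules.values()) | {h for h, _, _ in binary_rules}}
    for u, label, v in edges:
        pairs[terminal_rules[label]].add((u, v))
    changed = True
    while changed:
        changed = False
        for head, left, right in binary_rules:
            # "Matrix product": join (u, k) from left with (k, v) from right.
            derived = {(u, v) for (u, k) in pairs[left]
                       for (k2, v) in pairs[right] if k == k2}
            if not derived <= pairs[head]:
                pairs[head] |= derived
                changed = True
    return pairs

# S -> A B over the two-edge path 0 -a-> 1 -b-> 2
result = cfpq([(0, "a", 1), (1, "b", 2)],
              {"a": "A", "b": "B"}, [("S", "A", "B")])
```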
| 33.536885 | 84 | 0.545888 | 1,180 | 8,183 | 3.573729 | 0.074576 | 0.017074 | 0.056913 | 0.096751 | 0.892341 | 0.874793 | 0.871473 | 0.86673 | 0.850842 | 0.820489 | 0 | 0.017922 | 0.29769 | 8,183 | 243 | 85 | 33.674897 | 0.715852 | 0 | 0 | 0.788991 | 0 | 0 | 0.078822 | 0.058658 | 0 | 0 | 0 | 0 | 0.142202 | 1 | 0.055046 | false | 0 | 0.013761 | 0 | 0.068807 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b803fc18f737e5b2b9c365f695233e7a1324d858 | 135 | py | Python | build/lib/mortgage_package/__init__.py | lukavuko/mortgage-filter-package | 187d771c441f93b6a5dd2c5bf67ee519d1888430 | [
"MIT"
] | null | null | null | build/lib/mortgage_package/__init__.py | lukavuko/mortgage-filter-package | 187d771c441f93b6a5dd2c5bf67ee519d1888430 | [
"MIT"
] | null | null | null | build/lib/mortgage_package/__init__.py | lukavuko/mortgage-filter-package | 187d771c441f93b6a5dd2c5bf67ee519d1888430 | [
"MIT"
] | null | null | null | from mortgage_package.mortgage_filter import *
from mortgage_package.mortgage_base import *
from mortgage_package.exceptions import *
| 27 | 46 | 0.859259 | 17 | 135 | 6.529412 | 0.411765 | 0.324324 | 0.513514 | 0.486486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096296 | 135 | 4 | 47 | 33.75 | 0.909836 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
62f3c5c8e59ca2e09c86f57bfe72468ff083192d | 12,727 | py | Python | pylark/api_service_attendance_user_approval_create.py | chyroc/pylark | a54cce6b814935fd3c72668b262b54c8ee461484 | [
"Apache-2.0"
] | 7 | 2021-08-18T00:42:05.000Z | 2022-03-14T09:49:15.000Z | pylark/api_service_attendance_user_approval_create.py | chyroc/pylark | a54cce6b814935fd3c72668b262b54c8ee461484 | [
"Apache-2.0"
] | null | null | null | pylark/api_service_attendance_user_approval_create.py | chyroc/pylark | a54cce6b814935fd3c72668b262b54c8ee461484 | [
"Apache-2.0"
] | 1 | 2022-03-14T09:49:20.000Z | 2022-03-14T09:49:20.000Z | # Code generated by lark_sdk_gen. DO NOT EDIT.
from pylark.lark_request import RawRequestReq, _new_method_option
from pylark import lark_type, lark_type_sheet, lark_type_approval
import attr
import typing
import io
@attr.s
class CreateAttendanceUserApprovalReqUserApprovalTrip(object):
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
    reason: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "reason"}
    )  # Business trip reason
    approve_pass_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_pass_time"}
    )  # Time the approval passed, in the format yyyy-MM-dd HH:mm:ss
    approve_apply_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_apply_time"}
    )  # Time the approval was applied for, in the format yyyy-MM-dd HH:mm:ss
@attr.s
class CreateAttendanceUserApprovalReqUserApprovalOvertimeWork(object):
    duration: float = attr.ib(
        default=None, metadata={"req_type": "json", "key": "duration"}
    )  # Overtime duration
    unit: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "unit"}
    )  # Overtime duration unit; allowed values: 1 (day), 2 (hour)
    category: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "category"}
    )  # Overtime date type; allowed values: 1 (workday), 2 (rest day), 3 (holiday)
    type: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "type"}
    )  # Overtime rule type; allowed values: 0 (no overtime rule), 1 (time off in lieu), 2 (overtime pay)
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
@attr.s
class CreateAttendanceUserApprovalReqUserApprovalLeave(object):
    uniq_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "uniq_id"}
    )  # Unique leave-type ID identifying one kind of leave; length less than 14
    unit: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "unit"}
    )  # Leave duration unit; allowed values: 1 (day), 2 (hour), 3 (half day), 4 (half hour)
    interval: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "interval"}
    )  # Leave duration (in seconds)
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
    i18n_names: I18nNames = attr.ib(
        default=None, metadata={"req_type": "json", "key": "i18n_names"}
    )  # Localized leave names, a map keyed by "ch", "en" and "ja" (Chinese, English and Japanese)
    default_locale: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "default_locale"}
    )  # Default locale; the Feishu client supports Chinese, English and Japanese, and when a user switches language the default-locale name is used if the leave type has no name in that language; allowed values: ch (Chinese), en (English), ja (Japanese)
    reason: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "reason"}
    )  # Leave reason; required field
@attr.s
class CreateAttendanceUserApprovalReqUserApprovalOut(object):
    uniq_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "uniq_id"}
    )  # Unique out-of-office type ID identifying one kind of out-of-office; length less than 14
    unit: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "unit"}
    )  # Out-of-office duration unit; allowed values: 1 (day), 2 (hour), 3 (half day), 4 (half hour)
    interval: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "interval"}
    )  # Duration (in seconds)
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
    i18n_names: I18nNames = attr.ib(
        default=None, metadata={"req_type": "json", "key": "i18n_names"}
    )  # Localized out-of-office names, a map keyed by "ch", "en" and "ja" (Chinese, English and Japanese)
    default_locale: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "default_locale"}
    )  # Default locale; the Feishu client supports Chinese, English and Japanese, and when a user switches language the default-locale name is used if the out-of-office type has no name in that language
    reason: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "reason"}
    )  # Out-of-office reason
@attr.s
class CreateAttendanceUserApprovalReqUserApproval(object):
    user_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "user_id"}
    )  # User the approval applies to
    date: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "date"}
    )  # Date the approval takes effect
    outs: typing.List[CreateAttendanceUserApprovalReqUserApprovalOut] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "outs"}
    )  # Out-of-office records
    leaves: typing.List[CreateAttendanceUserApprovalReqUserApprovalLeave] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "leaves"}
    )  # Leave records
    overtime_works: typing.List[
        CreateAttendanceUserApprovalReqUserApprovalOvertimeWork
    ] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "overtime_works"}
    )  # Overtime records
    trips: typing.List[CreateAttendanceUserApprovalReqUserApprovalTrip] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "trips"}
    )  # Business trip records
@attr.s
class CreateAttendanceUserApprovalReq(object):
    employee_type: lark_type.EmployeeType = attr.ib(
        factory=lambda: lark_type.EmployeeType(),
        metadata={"req_type": "query", "key": "employee_type"},
    )  # Employee ID type of the user_id values in the request body; required; allowed values: employee_id (the employee's employeeId), employee_no (the employee number); example value: "employee_id"
    user_approval: CreateAttendanceUserApprovalReqUserApproval = attr.ib(
        default=None, metadata={"req_type": "json", "key": "user_approval"}
    )  # Approval details
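The `attr.ib` metadata above tags every field with where it travels on the wire (a `req_type` of `json` for the body versus `query` for the query string) plus its wire key. The stdlib sketch below shows how such metadata can drive serialization; it uses `dataclasses` rather than `attrs`, with hypothetical field values, and pylark's real request builder may differ:

```python
from dataclasses import dataclass, field, fields

@dataclass
class DemoReq:
    # Mirrors the metadata convention used by the attrs classes above.
    employee_type: str = field(
        default="employee_id",
        metadata={"req_type": "query", "key": "employee_type"},
    )
    reason: str = field(default="", metadata={"req_type": "json", "key": "reason"})

def to_wire(req):
    """Split a request object into (json_body, query_params) by field metadata."""
    body, query = {}, {}
    for f in fields(req):
        target = body if f.metadata.get("req_type") == "json" else query
        target[f.metadata["key"]] = getattr(req, f.name)
    return body, query

body, query = to_wire(DemoReq(reason="business trip"))
```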
@attr.s
class CreateAttendanceUserApprovalRespUserApprovalTrip(object):
    approval_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approval_id"}
    )  # Approval instance ID
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
    reason: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "reason"}
    )  # Business trip reason
    approve_pass_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_pass_time"}
    )  # Time the approval passed, in the format yyyy-MM-dd HH:mm:ss
    approve_apply_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_apply_time"}
    )  # Time the approval was applied for, in the format yyyy-MM-dd HH:mm:ss
@attr.s
class CreateAttendanceUserApprovalRespUserApprovalOvertimeWork(object):
    approval_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approval_id"}
    )  # Approval instance ID
    duration: float = attr.ib(
        default=None, metadata={"req_type": "json", "key": "duration"}
    )  # Overtime duration
    unit: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "unit"}
    )  # Overtime duration unit; allowed values: 1 (day), 2 (hour)
    category: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "category"}
    )  # Overtime date type; allowed values: 1 (workday), 2 (rest day), 3 (holiday)
    type: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "type"}
    )  # Overtime rule type; allowed values: 0 (no overtime rule), 1 (time off in lieu), 2 (overtime pay), 3 (linked to an overtime rule but with neither time off in lieu nor overtime pay)
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
@attr.s
class CreateAttendanceUserApprovalRespUserApprovalLeave(object):
    approval_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approval_id"}
    )  # Approval instance ID
    uniq_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "uniq_id"}
    )  # Unique leave-type ID identifying one kind of leave; length less than 14
    unit: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "unit"}
    )  # Leave duration unit; allowed values: 1 (day), 2 (hour), 3 (half day), 4 (half hour)
    interval: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "interval"}
    )  # Leave duration (in seconds)
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
    i18n_names: I18nNames = attr.ib(
        default=None, metadata={"req_type": "json", "key": "i18n_names"}
    )  # Localized leave names, a map keyed by "ch", "en" and "ja" (Chinese, English and Japanese)
    default_locale: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "default_locale"}
    )  # Default locale; the Feishu client supports Chinese, English and Japanese, and when a user switches language the default-locale name is used if the leave type has no name in that language; allowed values: ch (Chinese), en (English), ja (Japanese)
    reason: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "reason"}
    )  # Leave reason
    approve_pass_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_pass_time"}
    )  # Time the approval passed, in the format yyyy-MM-dd HH:mm:ss
    approve_apply_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_apply_time"}
    )  # Time the approval was applied for, in the format yyyy-MM-dd HH:mm:ss
@attr.s
class CreateAttendanceUserApprovalRespUserApprovalOut(object):
    approval_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approval_id"}
    )  # Approval instance ID
    uniq_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "uniq_id"}
    )  # Unique out-of-office type ID identifying one kind of out-of-office; length less than 14
    unit: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "unit"}
    )  # Out-of-office duration unit; allowed values: 1 (day), 2 (hour), 3 (half day), 4 (half hour)
    interval: int = attr.ib(
        default=0, metadata={"req_type": "json", "key": "interval"}
    )  # Duration (in seconds)
    start_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "start_time"}
    )  # Start time, in the format yyyy-MM-dd HH:mm:ss
    end_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "end_time"}
    )  # End time, in the format yyyy-MM-dd HH:mm:ss
    i18n_names: I18nNames = attr.ib(
        default=None, metadata={"req_type": "json", "key": "i18n_names"}
    )  # Localized out-of-office names, a map keyed by "ch", "en" and "ja" (Chinese, English and Japanese)
    default_locale: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "default_locale"}
    )  # Default locale; the Feishu client supports Chinese, English and Japanese, and when a user switches language the default-locale name is used if the out-of-office type has no name in that language
    reason: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "reason"}
    )  # Out-of-office reason
    approve_pass_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_pass_time"}
    )  # Time the approval passed
    approve_apply_time: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "approve_apply_time"}
    )  # Time the approval was applied for


@attr.s
class CreateAttendanceUserApprovalRespUserApproval(object):
    user_id: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "user_id"}
    )  # ID of the user the approval applies to
    date: str = attr.ib(
        default="", metadata={"req_type": "json", "key": "date"}
    )  # Date the approval takes effect
    outs: typing.List[CreateAttendanceUserApprovalRespUserApprovalOut] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "outs"}
    )  # Outing records
    leaves: typing.List[CreateAttendanceUserApprovalRespUserApprovalLeave] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "leaves"}
    )  # Leave records
    overtime_works: typing.List[
        CreateAttendanceUserApprovalRespUserApprovalOvertimeWork
    ] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "overtime_works"}
    )  # Overtime records
    trips: typing.List[CreateAttendanceUserApprovalRespUserApprovalTrip] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "trips"}
    )  # Business trip records


@attr.s
class CreateAttendanceUserApprovalResp(object):
    user_approvals: typing.List[CreateAttendanceUserApprovalRespUserApproval] = attr.ib(
        factory=lambda: [], metadata={"req_type": "json", "key": "user_approvals"}
    )  # List of approval results


def _gen_create_attendance_user_approval_req(request, options) -> RawRequestReq:
    return RawRequestReq(
        dataclass=CreateAttendanceUserApprovalResp,
        scope="Attendance",
        api="CreateAttendanceUserApproval",
        method="POST",
        url="https://open.feishu.cn/open-apis/attendance/v1/user_approvals",
        body=request,
        method_option=_new_method_option(options),
        need_tenant_access_token=True,
    )
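The `metadata={"req_type": "json", "key": ...}` convention on every field above lets a generic serializer map Python attribute names onto the JSON field names the API expects. A minimal sketch of that idea, using stdlib dataclasses instead of `attr` so it is self-contained; the `Leave` class and `to_json_dict` helper are illustrative stand-ins, not part of this SDK:

```python
from dataclasses import dataclass, field, fields


@dataclass
class Leave:
    # Illustrative stand-in for the attrs-based response classes above.
    approval_id: str = field(default="", metadata={"req_type": "json", "key": "approval_id"})
    unit: int = field(default=0, metadata={"req_type": "json", "key": "unit"})


def to_json_dict(obj):
    # Emit only fields tagged req_type == "json", keyed by their declared JSON name.
    return {
        f.metadata["key"]: getattr(obj, f.name)
        for f in fields(obj)
        if f.metadata.get("req_type") == "json"
    }


print(to_json_dict(Leave(approval_id="A1", unit=2)))  # {'approval_id': 'A1', 'unit': 2}
```

The same walk in reverse (JSON key back to field name) would deserialize a response body into these classes.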

# pylearn2/training_algorithms/tests/test_learning_rule.py
# (from BouchardLab/pylearn2, BSD-3-Clause)
import numpy as np
from theano.compat.six.moves import zip as izip
from pylearn2.costs.cost import SumOfCosts
from pylearn2.testing.cost import SumOfOneHalfParamsSquared
from pylearn2.testing.cost import SumOfParams
from pylearn2.testing.datasets import ArangeDataset
from pylearn2.training_algorithms.sgd import SGD
from pylearn2.training_algorithms.learning_rule import Momentum
from pylearn2.training_algorithms.learning_rule import AdaDelta
from pylearn2.training_algorithms.learning_rule import AdaGrad
from pylearn2.training_algorithms.learning_rule import RMSProp
from test_sgd import DummyCost, DummyModel
# used by all learning rule tests
scales = [.01, .02, .05, 1., 5.]
shapes = [(1,), (9,), (8, 7), (6, 5, 4), (3, 2, 2, 2)]
learning_rate = .001


def test_momentum():
    """
    Make sure that learning_rule.Momentum obtains the same parameter values as
    with a hand-crafted sgd w/ momentum implementation, given a dummy model and
    learning rate scaler for each parameter.
    """
    # We include a cost other than SumOfParams so that data is actually
    # queried from the training set, and the expected number of updates
    # are applied.
    cost = SumOfCosts([SumOfParams(), (0., DummyCost())])
    model = DummyModel(shapes, lr_scalers=scales)
    dataset = ArangeDataset(1)
    momentum = 0.5

    sgd = SGD(cost=cost,
              learning_rate=learning_rate,
              learning_rule=Momentum(momentum),
              batch_size=1)
    sgd.setup(model=model, dataset=dataset)

    manual = [param.get_value() for param in model.get_params()]
    inc = [-learning_rate * scale for scale in scales]
    manual = [param + i for param, i in izip(manual, inc)]

    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))

    manual = [param - learning_rate * scale + i * momentum
              for param, scale, i in izip(manual, scales, inc)]
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))


def test_nesterov_momentum():
    """
    Make sure that learning_rule.Momentum obtains the same parameter values as
    with a hand-crafted sgd w/ momentum implementation, given a dummy model and
    learning rate scaler for each parameter.
    """
    # We include a cost other than SumOfParams so that data is actually
    # queried from the training set, and the expected number of updates
    # are applied.
    cost = SumOfCosts([SumOfParams(), (0., DummyCost())])
    model = DummyModel(shapes, lr_scalers=scales)
    dataset = ArangeDataset(1)
    momentum = 0.5

    sgd = SGD(cost=cost,
              learning_rate=learning_rate,
              learning_rule=Momentum(momentum, nesterov_momentum=True),
              batch_size=1)
    sgd.setup(model=model, dataset=dataset)

    manual = [param.get_value() for param in model.get_params()]
    vel = [-learning_rate * scale for scale in scales]
    updates = [-learning_rate * scale + v * momentum
               for scale, v in izip(scales, vel)]
    manual = [param + update for param, update in izip(manual, updates)]

    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))

    vel = [-learning_rate * scale + i * momentum
           for scale, i in izip(scales, vel)]
    updates = [-learning_rate * scale + v * momentum
               for scale, v in izip(scales, vel)]
    manual = [param + update for param, update in izip(manual, updates)]
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))
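The hand-crafted updates in the two tests above correspond to the classical and Nesterov momentum rules. Written as standalone NumPy steps (function names are ours, chosen for illustration, not pylearn2 API):

```python
import numpy as np


def momentum_step(param, vel, grad, lr, mu):
    # Classical momentum: velocity accumulates the scaled gradient,
    # and the parameter moves by the new velocity.
    vel = mu * vel - lr * grad
    return param + vel, vel


def nesterov_step(param, vel, grad, lr, mu):
    # Nesterov momentum, in the formulation test_nesterov_momentum checks:
    # a plain gradient step plus the momentum part of the *updated*
    # velocity (the "lookahead" term).
    vel = mu * vel - lr * grad
    return param - lr * grad + mu * vel, vel


p, v = momentum_step(np.array(1.0), np.array(0.0), np.array(0.5), lr=0.1, mu=0.5)
# p == 0.95, v == -0.05
```

With a zero starting velocity both rules produce the same velocity, but Nesterov's parameter moves further along the gradient, which is exactly the difference between the two `manual` computations above.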


def test_adadelta():
    """
    Make sure that learning_rule.AdaDelta obtains the same parameter values as
    with a hand-crafted AdaDelta implementation, given a dummy model and
    learning rate scaler for each parameter.

    Reference:
    "AdaDelta: An Adaptive Learning Rate Method", Matthew D. Zeiler.
    """
    # We include a cost other than SumOfParams so that data is actually
    # queried from the training set, and the expected number of updates
    # are applied.
    cost = SumOfCosts([SumOfOneHalfParamsSquared(), (0., DummyCost())])
    model = DummyModel(shapes, lr_scalers=scales)
    dataset = ArangeDataset(1)
    decay = 0.95

    sgd = SGD(cost=cost,
              learning_rate=learning_rate,
              learning_rule=AdaDelta(decay),
              batch_size=1)
    sgd.setup(model=model, dataset=dataset)

    state = {}
    for param in model.get_params():
        param_shape = param.get_value().shape
        state[param] = {}
        state[param]['g2'] = np.zeros(param_shape)
        state[param]['dx2'] = np.zeros(param_shape)

    def adadelta_manual(model, state):
        rval = []
        for scale, param in izip(scales, model.get_params()):
            pstate = state[param]
            param_val = param.get_value()
            # begin adadelta
            pstate['g2'] = decay * pstate['g2'] + (1 - decay) * param_val ** 2
            rms_g_t = np.sqrt(pstate['g2'] + scale * learning_rate)
            rms_dx_tm1 = np.sqrt(pstate['dx2'] + scale * learning_rate)
            dx_t = -rms_dx_tm1 / rms_g_t * param_val
            pstate['dx2'] = decay * pstate['dx2'] + (1 - decay) * dx_t ** 2
            rval += [param_val + dx_t]
        return rval

    manual = adadelta_manual(model, state)
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))

    manual = adadelta_manual(model, state)
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param in
               izip(manual, model.get_params()))
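For comparison with `adadelta_manual` above: the canonical AdaDelta step from Zeiler's paper keeps decaying averages of squared gradients and squared updates, with a fixed epsilon inside both square roots; the test instead folds `scale * learning_rate` in where the paper uses epsilon. A sketch of the canonical form (our naming, `eps` value illustrative):

```python
import numpy as np


def adadelta_step(param, g2, dx2, grad, decay=0.95, eps=1e-6):
    # Decaying average of squared gradients: E[g^2]_t.
    g2 = decay * g2 + (1 - decay) * grad ** 2
    # Update scaled by RMS of past updates over RMS of current gradients.
    dx = -np.sqrt(dx2 + eps) / np.sqrt(g2 + eps) * grad
    # Decaying average of squared updates: E[dx^2]_t.
    dx2 = decay * dx2 + (1 - decay) * dx ** 2
    return param + dx, g2, dx2
```

Note the gradient in the test is the parameter value itself, because the cost is `SumOfOneHalfParamsSquared` (d/dp of p²/2 is p).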


def test_adagrad():
    """
    Make sure that learning_rule.AdaGrad obtains the same parameter values as
    with a hand-crafted AdaGrad implementation, given a dummy model and
    learning rate scaler for each parameter.

    Reference:
    "Adaptive subgradient methods for online learning and
    stochastic optimization", Duchi J, Hazan E, Singer Y.
    """
    # We include a cost other than SumOfParams so that data is actually
    # queried from the training set, and the expected number of updates
    # are applied.
    cost = SumOfCosts([SumOfOneHalfParamsSquared(), (0., DummyCost())])
    model = DummyModel(shapes, lr_scalers=scales)
    dataset = ArangeDataset(1)

    sgd = SGD(cost=cost,
              learning_rate=learning_rate,
              learning_rule=AdaGrad(),
              batch_size=1)
    sgd.setup(model=model, dataset=dataset)

    state = {}
    for param in model.get_params():
        param_shape = param.get_value().shape
        state[param] = {}
        state[param]['sg2'] = np.zeros(param_shape)

    def adagrad_manual(model, state):
        rval = []
        for scale, param in izip(scales, model.get_params()):
            pstate = state[param]
            param_val = param.get_value()
            # begin adagrad
            pstate['sg2'] += param_val ** 2
            dx_t = - (scale * learning_rate
                      / np.sqrt(pstate['sg2'])
                      * param_val)
            rval += [param_val + dx_t]
        return rval

    manual = adagrad_manual(model, state)
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))

    manual = adagrad_manual(model, state)
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param in
               izip(manual, model.get_params()))


def test_rmsprop():
    """
    Make sure that learning_rule.RMSProp obtains the same parameter values as
    with a hand-crafted RMSProp implementation, given a dummy model and
    learning rate scaler for each parameter.
    """
    # We include a cost other than SumOfParams so that data is actually
    # queried from the training set, and the expected number of updates
    # are applied.
    cost = SumOfCosts([SumOfOneHalfParamsSquared(), (0., DummyCost())])
    scales = [.01, .02, .05, 1., 5.]
    shapes = [(1,), (9,), (8, 7), (6, 5, 4), (3, 2, 2, 2)]
    model = DummyModel(shapes, lr_scalers=scales)
    dataset = ArangeDataset(1)
    learning_rate = .001
    decay = 0.90
    max_scaling = 1e5

    sgd = SGD(cost=cost,
              learning_rate=learning_rate,
              learning_rule=RMSProp(decay),
              batch_size=1)
    sgd.setup(model=model, dataset=dataset)

    state = {}
    for param in model.get_params():
        param_shape = param.get_value().shape
        state[param] = {}
        state[param]['g2'] = np.zeros(param_shape)

    def rmsprop_manual(model, state):
        rval = []
        epsilon = 1. / max_scaling
        for scale, param in izip(scales, model.get_params()):
            pstate = state[param]
            param_val = param.get_value()
            # begin rmsprop
            pstate['g2'] = decay * pstate['g2'] + (1 - decay) * param_val ** 2
            rms_g_t = np.maximum(np.sqrt(pstate['g2']), epsilon)
            dx_t = - scale * learning_rate / rms_g_t * param_val
            rval += [param_val + dx_t]
        return rval

    manual = rmsprop_manual(model, state)
    sgd.train(dataset=dataset)
    assert all(np.allclose(manual_param, sgd_param.get_value())
               for manual_param, sgd_param
               in izip(manual, model.get_params()))

# jsfuzz/utils/BLACKLIST.py (from gustavopinto/entente, BSD-2-Clause)
from jsfuzz.utils.utils import string_to_hash
duktape_utils = [
'utils.js', 'util-base.js', 'util-buffer.js',
'util-helloworld.js', 'util-number.js', 'util-object.js',
'util-promise.js', 'util-regexp.js', 'util-string.js', 'util-symbol.js'
v8_data = [
'ai-astar-data.js',
'audio-beat-detection-data.js',
'audio-dft-data.js',
'audio-fft-data.js',
'audio-oscillator-data.js',
'imaging-darkroom-data.js',
'imaging-desaturate-data.js',
'imaging-gaussian-blur-data.js',
'json-parse-financial-data.js',
'json-stringify-tinderbox-data.js',
'stanford-crypto-aes-data.js',
'stanford-crypto-ccm-data.js',
'stanford-crypto-pbkdf2-data.js',
'stanford-crypto-sha256-iterative-data.js'
]
BLACKLIST = [
"mozilla/non262/String/normalize-generateddata-part0.js",
"mozilla/non262/String/normalize-generateddata-part1-not-listed.js",
"mozilla/non262/String/match-004.js",
"mozilla/non262/String/generics-deprecated.js",
"mozilla/non262/String/normalize-generateddata-part3.js",
"mozilla/non262/String/string-object-length.js",
"mozilla/non262/String/normalize-generateddata-part1.js",
"mozilla/non262/String/normalize-generateddata-part2.js",
"mozilla/non262/String/match-001.js",
"mozilla/non262/String/match-003.js",
"mozilla/non262/String/match-002.js",
"mozilla/non262/regress/regress-425360.js",
"mozilla/non262/regress/regress-595230-2.js",
"mozilla/non262/regress/regress-560998-1.js",
"mozilla/non262/regress/regress-233483-2.js",
"mozilla/non262/regress/regress-341360.js",
"mozilla/non262/regress/regress-314401.js",
"mozilla/non262/regress/regress-453024.js",
"mozilla/non262/regress/regress-607799.js",
"mozilla/non262/regress/regress-233483.js",
"mozilla/non262/regress/regress-356693.js",
"mozilla/non262/regress/regress-384758.js",
"mozilla/non262/regress/regress-418540.js",
"mozilla/non262/regress/regress-466747.js",
"mozilla/non262/regress/regress-442333-01.js",
"mozilla/non262/regress/regress-607863.js",
"mozilla/non262/regress/regress-585257.js",
"mozilla/non262/TypedObject/arrayzerolen.js",
"mozilla/non262/TypedObject/simpleequiv.js",
"mozilla/non262/TypedObject/structtypeindexedfields.js",
"mozilla/non262/TypedObject/method_from.js",
"mozilla/non262/TypedObject/referencetypemultiple.js",
"mozilla/non262/TypedObject/referencetypealiasing.js",
"mozilla/non262/TypedObject/structtypeenumerate.js",
"mozilla/non262/TypedObject/architecture.js",
"mozilla/non262/TypedObject/method_reduce.js",
"mozilla/non262/TypedObject/structtypereflection.js",
"mozilla/non262/TypedObject/structtypeprototype.js",
"mozilla/non262/TypedObject/method_build.js",
"mozilla/non262/TypedObject/memory.js",
"mozilla/non262/TypedObject/scalar_types.js",
"mozilla/non262/TypedObject/storageopaque.js",
"mozilla/non262/TypedObject/atopbuffer.js",
"mozilla/non262/TypedObject/referencetypeuninit.js",
"mozilla/non262/TypedObject/atopbufferwithoffset.js",
"mozilla/non262/TypedObject/redimension.js",
"mozilla/non262/TypedObject/referencetypecoercions.js",
"mozilla/non262/TypedObject/arrayequiv.js",
"mozilla/non262/TypedObject/referencetypetrace.js",
"mozilla/non262/TypedObject/numerictypes.js",
"mozilla/non262/TypedObject/structequiv.js",
"mozilla/non262/TypedObject/structtypestructuralassign.js",
"mozilla/non262/TypedObject/arraytype.js",
"mozilla/non262/TypedObject/arrayofstructs.js",
"mozilla/non262/TypedObject/method_filter.js",
"mozilla/non262/TypedObject/method_map.js",
"mozilla/non262/TypedObject/structtypegetownproperty.js",
"mozilla/non262/TypedObject/objecttype.js",
"mozilla/non262/Intl/getCalendarInfo.js",
"mozilla/non262/Intl/getDisplayNames.js",
"mozilla/non262/Intl/getLocaleInfo.js",
"mozilla/non262/iterable/regress-341815.js",
"mozilla/non262/iterable/regress-341821.js",
"mozilla/non262/SIMD/load-int8x16.js",
"mozilla/non262/SIMD/bug1023145.js",
"mozilla/non262/SIMD/unary-operations.js",
"mozilla/non262/SIMD/minmax.js",
"mozilla/non262/SIMD/typedobjects.js",
"mozilla/non262/SIMD/shifts.js",
"mozilla/non262/SIMD/load-int16x8.js",
"mozilla/non262/SIMD/load-unsigned-integers.js",
"mozilla/non262/SIMD/load-int32x4.js",
"mozilla/non262/SIMD/select-bitselect.js",
"mozilla/non262/SIMD/load-floats.js",
"mozilla/non262/SIMD/toString.js",
"mozilla/non262/SIMD/store.js",
"mozilla/non262/SIMD/swizzle-shuffle.js",
"mozilla/non262/SIMD/check.js",
"mozilla/non262/SIMD/conversions.js",
"mozilla/non262/SIMD/constructors.js",
"mozilla/non262/SIMD/bug953270.js",
"mozilla/non262/SIMD/binary-operations.js",
"mozilla/non262/SIMD/comparisons.js",
"mozilla/non262/SIMD/splat.js",
"mozilla/non262/SIMD/replaceLane.js",
"mozilla/non262/SIMD/load-sab-buffer-compat.js",
"mozilla/non262/SIMD/ToSource.js",
"mozilla/non262/SIMD/float64x2-arithmetic.js",
"mozilla/non262/GC/regress-319980-01.js",
"mozilla/non262/Date/time-zone-2038-pst.js",
"mozilla/non262/Date/time-zones-posix.js",
"mozilla/non262/Date/time-zone-pst.js",
"mozilla/non262/Date/time-zones-pedantic.js",
"mozilla/non262/Date/time-zones.js",
"mozilla/non262/Date/15.9.5.5.js",
"mozilla/non262/lexical-environment/var-in-catch-body-annex-b-eval-for-of.js",
"mozilla/non262/lexical-environment/redeclaring-global-properties.js",
"mozilla/non262/lexical-environment/nondefinable-function-same-script.js",
"mozilla/non262/lexical-environment/var-in-catch-body-annex-b-eval-destructuring.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-property.js",
"mozilla/non262/Error/constructor-proto.js",
"mozilla/non262/Error/prototype.js",
"mozilla/non262/Error/prototype-properties.js",
"mozilla/non262/Promise/self-resolve.js",
"mozilla/non262/Promise/promise-basics.js",
"mozilla/non262/Promise/enqueue-promise-reactions.js",
"mozilla/non262/Promise/get-wait-for-all-promise.js",
"mozilla/non262/Promise/promise-rejection-tracking.js",
"mozilla/non262/Promise/promise-all.js",
"mozilla/non262/Promise/iterator-close.js",
"mozilla/non262/Promise/promise-subclassing.js",
"mozilla/non262/Promise/iterator-primitive.js",
"mozilla/non262/class/outerBinding.js",
"mozilla/non262/destructuring/rest-parameter-aray-iterator.js",
"mozilla/non262/syntax/statement-versus-statementlistitem.js",
"mozilla/non262/syntax/identifier_vertical_tilde.js",
"mozilla/non262/Scope/regress-184107.js",
"mozilla/non262/Scope/regress-446026-01.js",
"mozilla/non262/misc/function-definition-evaluate.js",
"mozilla/non262/async-functions/identity.js",
"mozilla/non262/async-functions/await-newline.js",
"mozilla/non262/async-functions/ErrorStack.js",
"mozilla/non262/async-functions/arguments_callee.js",
"mozilla/non262/async-functions/methods.js",
"mozilla/non262/async-functions/semantics.js",
"mozilla/non262/Function/10.2.1.1.6.js",
"mozilla/non262/TypedArray/subarray-validation.js",
"mozilla/non262/TypedArray/filter-validation.js",
"mozilla/non262/TypedArray/slice-validation.js",
"mozilla/non262/TypedArray/map-validation.js",
"mozilla/non262/RegExp/octal-002.js",
"mozilla/non262/RegExp/octal-003.js",
"mozilla/non262/RegExp/exec-002.js",
"mozilla/non262/RegExp/regress-9141.js",
"mozilla/non262/RegExp/properties-001.js",
"mozilla/non262/RegExp/multiline-001.js",
"mozilla/non262/RegExp/regress-6359.js",
"mozilla/non262/RegExp/properties-002.js",
"mozilla/non262/RegExp/octal-001.js",
"mozilla/non262/RegExp/regress-001.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-4-of-8.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-14-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-19-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-22-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-27-of-32.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-1-of-8.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-7-of-8.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-12-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-11-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-16-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-07-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-21-of-32.js",
"mozilla/non262/object/15.2.3.6-function-length.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-02-of-32.js",
"mozilla/non262/object/15.2.3.6-redefinition-3-of-4.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-05-of-32.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-6-of-8.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-26-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-18-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-24-of-32.js",
"mozilla/non262/object/15.2.3.6-redefinition-2-of-4.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-13-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-20-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-23-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-06-of-32.js",
"mozilla/non262/object/15.2.3.6-redefinition-1-of-4.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-10-of-32.js",
"mozilla/non262/object/15.2.3.6-new-definition.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-31-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-32-of-32.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-2-of-8.js",
"mozilla/non262/object/freeze-global-eval-const.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-8-of-8.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-08-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-15-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-30-of-32.js",
"mozilla/non262/object/setPrototypeOf-cycle.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-29-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-01-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-17-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-04-of-32.js",
"mozilla/non262/object/15.2.3.6-redefinition-4-of-4.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-5-of-8.js",
"mozilla/non262/object/15.2.3.6-middle-redefinition-3-of-8.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-25-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-09-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-28-of-32.js",
"mozilla/non262/object/15.2.3.6-dictionary-redefinition-03-of-32.js",
"mozilla/non262/extensions/String-methods-infinite-recursion.js",
"mozilla/non262/extensions/toLocaleString-infinite-recursion.js",
"mozilla/non262/extensions/regress-367121.js",
"mozilla/non262/extensions/redeclaration-of-catch-warning.js",
"mozilla/non262/extensions/toLength.js",
"mozilla/non262/extensions/clone-simple.js",
"mozilla/non262/extensions/proxy-proto-setter.js",
"mozilla/non262/extensions/regress-369404.js",
"mozilla/non262/extensions/file-mapped-arraybuffers.js",
"mozilla/non262/extensions/clone-v1-typed-array.js",
"mozilla/non262/extensions/clone-errors.js",
"mozilla/non262/extensions/regress-372309.js",
"mozilla/non262/extensions/clone-many-transferables.js",
"mozilla/non262/extensions/regress-636818.js",
"mozilla/non262/extensions/non_syntactic.js",
"mozilla/non262/extensions/regress-327608.js",
"mozilla/non262/extensions/clone-transferables.js",
"mozilla/non262/extensions/expclo.js",
"mozilla/non262/extensions/regress-367589.js",
"mozilla/non262/extensions/toSource-infinite-recursion.js",
"mozilla/non262/extensions/function-definition-with.js",
"mozilla/non262/extensions/regress-407720.js",
"mozilla/non262/extensions/arrow-as-end-of-expression-closure.js",
"mozilla/non262/extensions/clone-forge.js",
"mozilla/non262/extensions/regress-443569.js",
"mozilla/non262/extensions/expclo2.js",
"mozilla/non262/extensions/array-toString-recursion.js",
"mozilla/non262/extensions/expression-closure-syntax.js",
"mozilla/non262/extensions/clone-typed-array.js",
"mozilla/non262/extensions/regress-352372.js",
"mozilla/non262/extensions/sps-generators.js",
"mozilla/non262/Intl/DateTimeFormat/mozExtensions.js",
"mozilla/non262/Intl/DateTimeFormat/tz-environment-variable.js",
"mozilla/non262/Intl/RelativeTimeFormat/relativetimeformat.js",
"mozilla/non262/Intl/RelativeTimeFormat/supportedLocalesOf.js",
"mozilla/non262/Intl/RelativeTimeFormat/format.js",
"mozilla/non262/Intl/RelativeTimeFormat/construct-newtarget.js",
"mozilla/non262/Intl/extensions/options-value-emulates-undefined.js",
"mozilla/non262/Intl/extensions/unicode-extension-sequences.js",
"JSI/tests/update.js",
"JSI/tests/alias.js",
"JSI/tests/interp2.js",
"JSI/tests/expr.js",
"JSI/tests/syntax.js",
"JSI/tests/interp.js",
"JSI/tests/exec.js",
"JSI/tests/info.js",
"JSI/tests/signal.js",
"JSI/tests/eval.js",
"JSI/tests/while2.js",
"JSI/tests/util.js",
"JSI/tests/badfunc.js",
"JSI/tests/file.js",
"JSI/tests/logging.js",
"JSI/tests/sqlite.js",
"JSI/tests/bind.js",
"JSI/tests/file2.js",
"JSI/tests/format.js",
"JSI/tests/b64.js",
"JSI/tests/time.js",
"v8/benchmarks/data/kraken/tests/stanford-crypto-aes.js",
"v8/benchmarks/data/kraken/tests/stanford-crypto-ccm.js",
"DukTape/ecmascript/test-dev-undecl-var-error-messages.js",
"DukTape/ecmascript/test-dev-finalizer-silent-error.js",
"DukTape/ecmascript/test-dev-logicalnot-refcount.js",
"DukTape/ecmascript/test-dev-finalizer-loop.js",
"DukTape/ecmascript/test-dev-coroutine-basic.js",
"DukTape/ecmascript/test-dev-refcount-finalizer-3.js",
"DukTape/ecmascript/test-dev-finalize-reachable.js",
"DukTape/ecmascript/test-dev-getpropc-misc.js",
"DukTape/ecmascript/test-dev-call-prop-side-effect-order.js",
"DukTape/ecmascript/test-dev-inlined-unary-lnot.js",
"DukTape/ecmascript/test-dev-markandsweep-finalizer-1.js",
"DukTape/ecmascript/test-commonjs-require-id.js",
"DukTape/ecmascript/test-dev-markandsweep-during-finalization.js",
"DukTape/ecmascript/test-dev-finalizer-heapdestruct-rescue.js",
"DukTape/ecmascript/test-err-callstack-headroom-1.js",
"DukTape/ecmascript/test-bug-throw-in-catch.js",
"DukTape/ecmascript/test-misc-pointer-tostring.js",
"DukTape/ecmascript/test-dev-16bit-overflows.js",
"DukTape/ecmascript/test-bug-finalizer-rescue.js",
"DukTape/ecmascript/test-commonjs-require-resolution.js",
"DukTape/ecmascript/test-dev-notail-directive.js",
"DukTape/ecmascript/test-bug-proxy-finalizer-double-call.js",
"DukTape/ecmascript/test-dev-finalizer-heapdestruct-spawn1.js",
"DukTape/ecmascript/test-bug-stringtable-leak.js",
"DukTape/ecmascript/test-dev-buffer-delete-elem.js",
"DukTape/ecmascript/test-dev-finalizer-heapdestruct-runonce.js",
"DukTape/ecmascript/test-bi-duktape-errhandler.js",
"DukTape/ecmascript/test-bi-duktape-line.js",
"DukTape/ecmascript/test-bug-act-linenumber-gh143.js",
"DukTape/ecmascript/test-dev-refcount-finalizer-1.js",
"DukTape/ecmascript/test-commonjs-require-environment.js",
"DukTape/ecmascript/test-base64-enc-basic.js",
"DukTape/ecmascript/test-dev-markandsweep-finalizer-2.js",
"DukTape/ecmascript/test-dev-valstack-shrink-check-2.js",
"DukTape/ecmascript/test-bi-logger.js",
"DukTape/ecmascript/test-commonjs-module-logname.js",
"DukTape/ecmascript/test-dev-finalizer-inherited.js",
"DukTape/ecmascript/test-bug-finally-ljtype-gh287.js",
"DukTape/ecmascript/test-dev-valstack-shrink-check-1.js",
"DukTape/ecmascript/test-bi-function-nonstd-caller-prop.js",
"DukTape/ecmascript/test-dev-hex-enc.js",
"DukTape/ecmascript/test-bug-refcount-finalizer-garbage-loop.js",
"DukTape/ecmascript/test-dev-call-error-messages.js",
"DukTape/ecmascript/test-commonjs-require-example.js",
"DukTape/ecmascript/test-dev-refcount-finalizer-2.js",
"DukTape/ecmascript/test-dev-markandsweep-finalizer-3.js",
"DukTape/ecmascript/test-dev-pointer-object.js",
"DukTape/ecmascript/test-bug-object-prop-alloc-unbounded.js",
"DukTape/ecmascript/test-bug-object-literal-getset-tempreg.js",
"DukTape/ecmascript/test-bi-duktape-act.js",
"DukTape/ecmascript/test-commonjs-require-circular.js",
"DukTape/ecmascript/test-bi-duktape-json-lightfunc.js",
"DukTape/ecmascript/test-commonjs-module-search-function.js",
"DukTape/ecmascript/test-bi-duktape-thread-prototype-class.js",
"DukTape/ecmascript/test-dev-finalizer-heapdestruct-spawn2.js",
"DukTape/ecmascript/test-dev-named-funcexpr-refcount.js",
"DukTape/ecmascript/test-bi-proxy-internal-keys.js",
"DukTape/ecmascript/test-bug-tailcall-thread-yield-resume.js",
"DukTape/ecmascript/test-bi-duktape-json-custom.js",
"DukTape/ecmascript/test-dev-finalizer-refzero-for-pending.js",
"DukTape/ecmascript/test-commonjs-require-tweaked-id.js",
"DukTape/ecmascript/test-dev-internal-property-basics.js",
"DukTape/ecmascript/test-dev-markandsweep-finalizer-4.js",
"DukTape/ecmascript/test-err-callstack-headroom-2.js",
"DukTape/ecmascript/test-commonjs-require-resolution-randomized.js",
"DukTape/ecmascript/test-dev-finalizer-heapdestruct-argument.js",
"DukTape/ecmascript/test-bug-finalizer-repro-gh1311.js",
"DukTape/ecmascript/test-commonjs-module-return.js",
"JerryJS/regression/tests/regression-test-issue-1873.js",
"JerryJS/regression/tests/arguments-postfix-strict.js",
"JerryJS/regression/tests/arguments-prefix-strict.js",
"mozilla/non262/String/regress-306591.js",
"mozilla/non262/String/replace-GetMethod.js",
"mozilla/non262/String/generics.js",
"mozilla/non262/regress/regress-355556.js",
"mozilla/non262/jit/regress-452498-01.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-label.js",
"mozilla/non262/class/superPropDVG.js",
"mozilla/non262/class/superPropBasicCalls.js",
"mozilla/non262/destructuring/cover-init-name-syntax.js",
"mozilla/non262/async-functions/toSource.js",
"mozilla/non262/Function/has-instance-jitted.js",
"mozilla/non262/Array/regress-415540.js",
"mozilla/non262/Array/generics.js",
"mozilla/non262/RegExp/regress-yarr-regexp.js",
"mozilla/non262/object/regress-444787.js",
"mozilla/non262/extensions/regress-469625-01.js",
"mozilla/non262/extensions/regress-90596-001.js",
"mozilla/non262/extensions/regress-470310.js",
"mozilla/non262/extensions/object-toSource-undefined-getter.js",
"mozilla/non262/extensions/regress-336410-1.js",
"mozilla/non262/extensions/decompile-for-of.js",
"mozilla/non262/extensions/regress-336409-1.js",
"mozilla/non262/extensions/regress-314874.js",
"mozilla/non262/extensions/regress-379566.js",
"mozilla/non262/extensions/regress-369696-01.js",
"mozilla/non262/extensions/eval-native-callback-is-indirect.js",
"mozilla/non262/extensions/regress-459606.js",
"mozilla/non262/extensions/regress-96284-001.js",
"mozilla/non262/extensions/regress-336410-2.js",
"mozilla/non262/extensions/regress-381304.js",
"mozilla/non262/extensions/regress-333541.js",
"mozilla/non262/extensions/regress-355497.js",
"mozilla/non262/extensions/object-toSource-with-symbol-keys.js",
"mozilla/non262/extensions/regress-311161.js",
"mozilla/non262/extensions/regress-342960.js",
"mozilla/non262/extensions/clone-leaf-object.js",
"mozilla/non262/extensions/regress-336409-2.js",
"mozilla/non262/extensions/regress-381303.js",
"DukTape/ecmascript/test-bi-textencoder.js",
"DukTape/ecmascript/test-dev-buffer-interop.js",
"DukTape/ecmascript/test-bi-textdecoder.js",
"mozilla/non262/extensions/regress-355052-01.js",
"mozilla/non262/extensions/regress-355052-02.js",
"mozilla/non262/extensions/regress-355052-03.js",
"WebKit/es6/Reflect_Reflect.enumerate.js",
"mozilla/non262/GC/regress-324278.js",
"mozilla/non262/extensions/regress-375801.js",
"mozilla/non262/extensions/regress-445818.js",
"mozilla/non262/extensions/regress-96284-002.js",
"mozilla/non262/extensions/regress-367629.js",
"mozilla/non262/extensions/regress-381301.js",
"mozilla/non262/extensions/toSource-0.js",
"mozilla/non262/extensions/regress-44009.js",
"WebKit/es6/typed_arrays_correct_prototype_chains.js",
"mozilla/non262/regress/regress-464334.js",
"mozilla/non262/regress/regress-596103.js",
"mozilla/non262/regress/regress-404755.js",
"mozilla/non262/regress/regress-592556-c35.js",
"mozilla/non262/GC/",
"mozilla/non262/generators/gen-with-call-obj.js",
"mozilla/non262/generators/regress-466206.js",
"mozilla/non262/async-functions/subclass.js",
"mozilla/non262/Array/regress-360681-02.js",
"mozilla/non262/Array/regress-360681-01.js",
"mozilla/non262/Array/regress-474529.js",
"mozilla/non262/object/clear-dictionary-accessor-getset.js",
"mozilla/non262/extensions/clone-object-deep.js",
"mozilla/non262/extensions/regress-650753.js",
"mozilla/non262/extensions/regress-354297.js",
"mozilla/non262/extensions/clone-object.js",
"mozilla/non262/extensions/regress-417131.js",
"mozilla/non262/extensions/weakmap.js",
"mozilla/non262/extensions/regress-412926.js",
"mozilla/non262/extensions/regress-311792-02.js",
"mozilla/non262/extensions/recursion.js",
"mozilla/non262/extensions/clone-sab.js",
"mozilla/non262/extensions/regress-311792-01.js",
"mozilla/non262/extensions/sharedtypedarray.js",
"mozilla/non262/get-set/regress-375976.js",
"mozilla/non262/lexical-environment/unscopables-strict.js",
"mozilla/non262/regress/regress-407727-01.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-properties.js",
"DukTape/ecmascript/test-dev-finalizer-markandsweep-zero-refcount.js",
"DukTape/ecmascript/test-dev-bound-func-callstack.js",
"DukTape/ecmascript/test-bug-currpc-sync-gh294.js",
"DukTape/ecmascript/test-bi-arraybuffer-proto-slice.js",
"DukTape/ecmascript/test-base64-random-roundtrip.js",
"DukTape/ecmascript/test-bi-typedarray-misc-iff.js",
"DukTape/ecmascript/test-bi-symbol-misc.js",
"DukTape/ecmascript/test-bi-reflect-construct-callstack.js",
"DukTape/ecmascript/test-dev-plain-pointer.js",
"DukTape/ecmascript/test-commonjs-require-subrequire-name.js",
"DukTape/ecmascript/test-commonjs-require-filename.js",
"DukTape/ecmascript/test-dev-typedarray-view-1.js",
"DukTape/ecmascript/test-bi-typedarray.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-compare.js",
"DukTape/ecmascript/test-bug-base64-dec-whitespace-padding.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-instance-enum.js",
"DukTape/ecmascript/test-bug-bufferobject-cast-gh336.js",
"DukTape/ecmascript/test-bi-proxy-apply-yield.js",
"DukTape/ecmascript/test-bi-typedarray-proto-set.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-class.js",
"DukTape/ecmascript/test-bi-typedarray-proto-subarray.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-assign-nonnumber.js",
"DukTape/ecmascript/test-bug-commonjs-relative-id.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-json.js",
"DukTape/ecmascript/test-bi-duktape-enc-jx.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-json-stringify.js",
"DukTape/ecmascript/test-bi-uint8array-plainof.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-differences.js",
"DukTape/ecmascript/test-bi-typedarray-misc-zeroing.js",
"DukTape/ecmascript/test-commonjs-module-filename.js",
"DukTape/ecmascript/test-bi-arraybuffer-constructor.js",
"DukTape/ecmascript/test-dev-lightfunc-finalizer.js",
"DukTape/ecmascript/test-bi-json-enc-key-autoquote.js",
"DukTape/ecmascript/test-dev-api-verbose-error-messages-gh441.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-concat.js",
"DukTape/ecmascript/test-bi-proxy-construct-yield.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-misc-write-coercion.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-slowbuffer.js",
"DukTape/ecmascript/test-bug-object-delprop-eidx-1.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-isbuffer.js",
"DukTape/ecmascript/test-dev-writable-error-filename-gh387.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-subclassing.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-this-safety.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-buffer-property.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-slice-inherit.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-valueof.js",
"DukTape/ecmascript/test-bi-proxy-property-safety.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-write.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-constructor.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-fill.js",
"DukTape/ecmascript/test-dev-prop-error-messages.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-misc-iff.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-required-props.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-bytelength.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-noassert.js",
"DukTape/ecmascript/test-dev-buffer-to-string.js",
"DukTape/ecmascript/test-bi-global-global-binding.js",
"DukTape/ecmascript/test-dev-finalizer-skip.js",
"DukTape/ecmascript/test-bi-symbol-custom.js",
"DukTape/ecmascript/test-bug-object-defprop-eidx-1.js",
"DukTape/ecmascript/test-dev-lightfunc-accessor.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-varint.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-copy.js",
"DukTape/ecmascript/test-bi-json-enc-fastpath.js",
"DukTape/ecmascript/test-bi-typedarray-write-index.js",
"DukTape/ecmascript/test-dev-regexp-negative-jump-offset.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-instance-properties.js",
"DukTape/ecmascript/test-bi-json-enc-fastpath-plainbuf.js",
"DukTape/ecmascript/test-dev-hex-dec-brute.js",
"DukTape/ecmascript/test-bi-typedarray-constructor.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-constructor-properties.js",
"DukTape/ecmascript/test-dev-coroutine-bound-func.js",
"DukTape/ecmascript/test-dev-tailcall-constructor-normal-mixing.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-readfield.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-slice.js",
"DukTape/ecmascript/test-dev-fastint-basic.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-tostring.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-misc-isview.js",
"DukTape/ecmascript/test-bug-object-defprop-eidx-2.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-misc-retval.js",
"DukTape/ecmascript/test-dev-bound-thread-start-func.js",
"DukTape/ecmascript/test-bi-duktape.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-tojson.js",
"DukTape/ecmascript/test-dev-refcount-leak-basic.js",
"DukTape/ecmascript/test-bug-duktape-gc-retval.js",
"DukTape/ecmascript/test-commonjs-module-exports-circular.js",
"DukTape/ecmascript/test-dev-fromcharcode-nonbmp.js",
"DukTape/ecmascript/test-bi-duktape-enc-jc.js",
"DukTape/ecmascript/test-commonjs-module-exports-repl.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-subarray.js",
"DukTape/ecmascript/test-bi-string-frombuffer.js",
"DukTape/ecmascript/test-bi-dataview-read-methods.js",
"DukTape/ecmascript/test-bi-object-proto-tostring-custom.js",
"DukTape/ecmascript/test-bug-buffer-assign-x.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-equals.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-fill-string.js",
"DukTape/ecmascript/test-bug-nodejs-buffer-slice-isview.js",
"DukTape/ecmascript/test-bug-recursive-voluntary-markandsweep.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-varint-special.js",
"DukTape/ecmascript/test-commonjs-module-load-error.js",
"DukTape/ecmascript/test-bi-dataview-constructor.js",
"DukTape/ecmascript/test-bug-jx-minusinf.js",
"DukTape/ecmascript/test-dev-rom-builtins-1.js",
"DukTape/ecmascript/test-bug-json-fastpath-boxedptr.js",
"DukTape/ecmascript/test-dev-lightfunc.js",
"DukTape/ecmascript/test-dev-finalizer-markandsweep-refzero.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-isencoding.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-proto-writefield.js",
"DukTape/ecmascript/test-bi-string-constructor-custom.js",
"DukTape/ecmascript/test-bi-arraybuffer-isview.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-instance-indexed.js",
"DukTape/ecmascript/test-dev-func-varmap-drop.js",
"DukTape/ecmascript/test-bi-nodejs-buffer-defineproperty.js",
"DukTape/ecmascript/test-bi-dataview-write-methods.js",
"mozilla/non262/extensions/clone-sab-leak.js",
"mozilla/non262/extensions/clone-sab-failure.js",
"JerryJS/regression/tests/regression-test-issue-1621.js",
"DukTape/ecmascript/test-bi-uint8array-allocplain.js",
"DukTape/ecmascript/test-dev-string-to-buffer.js",
"DukTape/ecmascript/test-bug-currpc-unwind-gh294.js",
"DukTape/ecmascript/test-bi-typedarray-read-index.js",
"DukTape/ecmascript/test-bi-performance.js",
"mozilla/non262/Intl/",
"mozilla/non262/Intl/Collator/toStringTag.js",
"mozilla/non262/Intl/Collator/call.js",
"mozilla/non262/Intl/Collator/compare.js",
"mozilla/non262/Intl/Collator/caseFirst.js",
"mozilla/non262/Intl/Collator/supportedLocalesOf.js",
"mozilla/non262/Intl/Collator/construct-newtarget.js",
"mozilla/non262/Intl/NumberFormat/remove-unicode-extensions.js",
"mozilla/non262/Intl/NumberFormat/toStringTag.js",
"mozilla/non262/Intl/NumberFormat/formatToParts.js",
"mozilla/non262/Intl/NumberFormat/StringBuffer.js",
"mozilla/non262/Intl/NumberFormat/call.js",
"mozilla/non262/Intl/NumberFormat/significantDigitsOfZero.js",
"mozilla/non262/Intl/NumberFormat/unwrapping.js",
"mozilla/non262/Intl/NumberFormat/negativeZeroFractionDigits.js",
"mozilla/non262/Intl/NumberFormat/supportedLocalesOf.js",
"mozilla/non262/Intl/NumberFormat/format.js",
"mozilla/non262/Intl/NumberFormat/construct-newtarget.js",
"mozilla/non262/Intl/NumberFormat/format-as-code-or-name.js",
"mozilla/non262/Intl/DateTimeFormat/calendar-aliases.js",
"mozilla/non262/Intl/DateTimeFormat/toStringTag.js",
"mozilla/non262/Intl/DateTimeFormat/formatToParts.js",
"mozilla/non262/Intl/DateTimeFormat/format_timeZone.js",
"mozilla/non262/Intl/DateTimeFormat/timeZone_backzone_links.js",
"mozilla/non262/Intl/DateTimeFormat/timeZone_backzone.js",
"mozilla/non262/Intl/DateTimeFormat/call.js",
"mozilla/non262/Intl/DateTimeFormat/unwrapping.js",
"mozilla/non262/Intl/DateTimeFormat/hourCycle.js",
"mozilla/non262/Intl/DateTimeFormat/timeZone.js",
"mozilla/non262/Intl/DateTimeFormat/timeZone_backward_links.js",
"mozilla/non262/Intl/DateTimeFormat/timeZone_notbackward_links.js",
"mozilla/non262/Intl/DateTimeFormat/supportedLocalesOf.js",
"mozilla/non262/Intl/DateTimeFormat/format.js",
"mozilla/non262/Intl/DateTimeFormat/construct-newtarget.js",
"mozilla/non262/Intl/DateTimeFormat/islamic.js",
"mozilla/non262/Intl/PluralRules/resolvedOptions-overridden-species.js",
"mozilla/non262/Intl/PluralRules/call.js",
"mozilla/non262/Intl/PluralRules/pluralrules.js",
"mozilla/non262/Intl/PluralRules/negativeZeroFractionDigits.js",
"mozilla/non262/Intl/PluralRules/select.js",
"mozilla/non262/Intl/PluralRules/supportedLocalesOf.js",
"mozilla/non262/Intl/PluralRules/construct-newtarget.js",
"DukTape/ecmascript/test-dev-call-special-misc.js",
"WebKit/es6/Promise_Promise.race.js",
"WebKit/es6/Promise_Promise.all_generic_iterables.js",
"WebKit/es6/Promise_Promise.all.js",
"WebKit/es6/Promise_basic_functionality.js",
"WebKit/es6/Promise_is_subclassable_basic_functionality.js",
"WebKit/es6/Promise_Promise.race_generic_iterables.js",
"WebKit/es6/Promise_is_subclassable_Promise.race.js",
"WebKit/es6/Promise_is_subclassable_Promise.all.js",
"DukTape/ecmascript/test-base64-dec-brute.js",
"DukTape/ecmascript/test-dev-syntax-error-line-2.js",
"test262/language/global-code/super-call-arrow.js",
"test262/language/global-code/export.js",
"test262/language/global-code/new.target.js",
"test262/language/global-code/super-prop-arrow.js",
"test262/language/global-code/super-prop.js",
"test262/language/global-code/super-call.js",
"test262/language/global-code/new.target-arrow.js",
"test262/language/global-code/return.js",
"test262/language/global-code/import.js",
"test262/language/global-code/yield-strict.js",
"test262/language/punctuators/S7.7_A2_T5.js",
"test262/language/punctuators/S7.7_A2_T6.js",
"test262/language/punctuators/S7.7_A2_T8.js",
"test262/language/punctuators/S7.7_A2_T2.js",
"test262/language/punctuators/S7.7_A2_T7.js",
"test262/language/punctuators/S7.7_A2_T9.js",
"test262/language/punctuators/S7.7_A2_T4.js",
"test262/language/punctuators/S7.7_A2_T1.js",
"test262/language/punctuators/S7.7_A2_T10.js",
"test262/language/punctuators/S7.7_A2_T3.js",
"test262/language/keywords/ident-ref-return.js",
"test262/language/keywords/ident-ref-typeof.js",
"test262/language/keywords/ident-ref-if.js",
"test262/language/keywords/ident-ref-function.js",
"test262/language/keywords/ident-ref-this.js",
"test262/language/keywords/ident-ref-finally.js",
"test262/language/keywords/ident-ref-catch.js",
"test262/language/keywords/ident-ref-try.js",
"test262/language/keywords/ident-ref-var.js",
"test262/language/keywords/ident-ref-in.js",
"test262/language/keywords/ident-ref-continue.js",
"test262/language/keywords/ident-ref-instanceof.js",
"test262/language/keywords/ident-ref-while.js",
"test262/language/keywords/ident-ref-throw.js",
"test262/language/keywords/ident-ref-void.js",
"test262/language/keywords/ident-ref-new.js",
"test262/language/keywords/ident-ref-do.js",
"test262/language/keywords/ident-ref-case.js",
"test262/language/keywords/ident-ref-break.js",
"test262/language/keywords/ident-ref-switch.js",
"test262/language/keywords/ident-ref-else.js",
"test262/language/keywords/ident-ref-for.js",
"test262/language/keywords/ident-ref-default.js",
"test262/language/keywords/ident-ref-with.js",
"test262/language/keywords/ident-ref-delete.js",
"test262/language/module-code/parse-err-decl-pos-import-do-while.js",
"test262/language/module-code/early-dup-export-dflt.js",
"test262/language/module-code/parse-err-decl-pos-export-object-setter.js",
"test262/language/module-code/parse-err-decl-pos-export-for-lhs.js",
"test262/language/module-code/parse-err-decl-pos-import-class-expr-meth-gen-static.js",
"test262/language/module-code/early-import-as-eval.js",
"test262/language/module-code/parse-err-decl-pos-import-for-in-const.js",
"test262/language/module-code/parse-err-decl-pos-import-class-decl-meth-static.js",
"test262/language/module-code/parse-err-decl-pos-export-function-decl.js",
"test262/language/module-code/parse-err-decl-pos-import-for-in-lhs.js",
"test262/language/module-code/parse-err-decl-pos-import-class-decl-method-gen.js",
"test262/language/module-code/parse-err-decl-pos-import-switch-dftl.js",
"test262/language/module-code/early-import-arguments.js",
"test262/language/module-code/early-dup-export-as-star-as.js",
"test262/language/module-code/parse-err-decl-pos-export-try-try.js",
"test262/language/module-code/parse-err-decl-pos-import-try-catch.js",
"test262/language/module-code/comment-single-line-html-close.js",
"test262/language/module-code/parse-err-decl-pos-export-try-catch.js",
"test262/language/module-code/parse-err-decl-pos-export-switch-case-dflt.js",
"test262/language/module-code/parse-err-decl-pos-export-for-of-let.js",
"test262/language/module-code/parse-err-decl-pos-export-object-method.js",
"test262/language/module-code/parse-err-decl-pos-import-try-try.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-1.js",
"test262/language/module-code/early-new-target.js",
"test262/language/module-code/parse-err-decl-pos-import-try-catch-finally.js",
"test262/language/module-code/parse-err-decl-pos-export-for-of-const.js",
"test262/language/module-code/parse-err-decl-pos-import-switch-case-dflt.js",
"test262/language/module-code/parse-err-decl-pos-export-for-of-var.js",
"test262/language/module-code/parse-err-decl-pos-export-switch-case.js",
"test262/language/module-code/parse-err-decl-pos-export-try-catch-finally.js",
"test262/language/module-code/parse-err-decl-pos-import-class-expr-meth-gen.js",
"test262/language/module-code/parse-err-decl-pos-export-class-expr-meth-gen-static.js",
"test262/language/module-code/parse-err-invoke-anon-gen-decl.js",
"test262/language/module-code/parse-err-decl-pos-import-labeled.js",
"test262/language/module-code/instn-resolve-empty-import.js",
"test262/language/module-code/parse-err-decl-pos-export-class-decl-method-gen-static.js",
"test262/language/module-code/parse-err-decl-pos-export-function-expr.js",
"test262/language/module-code/parse-err-export-dflt-var.js",
"test262/language/module-code/parse-err-decl-pos-import-for-of-var.js",
"test262/language/module-code/parse-err-decl-pos-import-if-else.js",
"test262/language/module-code/parse-err-semi-export-star.js",
"test262/language/module-code/early-export-unresolvable.js",
"test262/language/module-code/parse-err-decl-pos-import-object-getter.js",
"test262/language/module-code/early-import-eval.js",
"test262/language/module-code/comment-single-line-html-open.js",
"test262/language/module-code/parse-err-decl-pos-export-for-in-lhs.js",
"test262/language/module-code/parse-err-decl-pos-import-object-gen-method.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-7.js",
"test262/language/module-code/parse-err-decl-pos-import-for-let.js",
"test262/language/module-code/parse-err-decl-pos-export-switch-dftl.js",
"test262/language/module-code/early-lex-and-var.js",
"test262/language/module-code/parse-err-export-dflt-const.js",
"test262/language/module-code/parse-err-decl-pos-export-for-in-const.js",
"test262/language/module-code/parse-err-invoke-anon-fun-decl.js",
"test262/language/module-code/parse-err-decl-pos-export-if-if.js",
"test262/language/module-code/parse-err-decl-pos-import-for-in-var.js",
"test262/language/module-code/parse-err-decl-pos-import-object-setter.js",
"test262/language/module-code/parse-err-decl-pos-export-class-expr-meth-static.js",
"test262/language/module-code/early-super.js",
"test262/language/module-code/parse-err-decl-pos-export-if-else.js",
"test262/language/module-code/parse-err-decl-pos-import-for-of-const.js",
"test262/language/module-code/parse-err-syntax.js",
"test262/language/module-code/parse-err-decl-pos-import-function-expr.js",
"test262/language/module-code/parse-err-decl-pos-import-block-stmt-list.js",
"test262/language/module-code/early-dup-export-id.js",
"test262/language/module-code/comment-multi-line-html-close.js",
"test262/language/module-code/parse-err-export-dflt-expr.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-6.js",
"test262/language/module-code/parse-err-decl-pos-export-for-const.js",
"test262/language/module-code/early-export-global.js",
"test262/language/module-code/early-dup-lables.js",
"test262/language/module-code/parse-err-semi-named-export-from.js",
"test262/language/module-code/instn-resolve-err-syntax.js",
"test262/language/module-code/early-dup-export-id-as.js",
"test262/language/module-code/parse-err-decl-pos-import-while.js",
"test262/language/module-code/parse-err-decl-pos-import-object-method.js",
"test262/language/module-code/parse-err-decl-pos-import-for-var.js",
"test262/language/module-code/parse-err-semi-name-space-export.js",
"test262/language/module-code/parse-err-decl-pos-import-for-of-let.js",
"test262/language/module-code/parse-err-decl-pos-export-for-in-let.js",
"test262/language/module-code/instn-resolve-empty-export.js",
"test262/language/module-code/parse-err-decl-pos-export-class-decl-meth-static.js",
"test262/language/module-code/parse-err-decl-pos-export-arrow-function.js",
"test262/language/module-code/parse-err-decl-pos-import-if-if.js",
"test262/language/module-code/instn-resolve-order-depth.js",
"test262/language/module-code/parse-err-semi-dflt-expr.js",
"test262/language/module-code/parse-err-decl-pos-export-for-in-var.js",
"test262/language/module-code/parse-err-decl-pos-export-generator-decl.js",
"test262/language/module-code/parse-err-decl-pos-export-for-var.js",
"test262/language/module-code/parse-err-decl-pos-import-function-decl.js",
"test262/language/module-code/parse-err-return.js",
"test262/language/module-code/parse-err-decl-pos-import-generator-decl.js",
"test262/language/module-code/parse-err-decl-pos-import-class-decl-meth.js",
"test262/language/module-code/early-dup-export-dflt-id.js",
"test262/language/module-code/instn-resolve-err-reference.js",
"test262/language/module-code/parse-err-decl-pos-export-class-decl-meth.js",
"test262/language/module-code/early-undef-continue.js",
"test262/language/module-code/parse-err-decl-pos-import-switch-case.js",
"test262/language/module-code/parse-err-decl-pos-export-block-stmt-list.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-8.js",
"test262/language/module-code/parse-err-decl-pos-export-class-expr-meth-gen.js",
"test262/language/module-code/parse-err-decl-pos-import-block-stmt.js",
"test262/language/module-code/parse-err-decl-pos-export-class-decl-method-gen.js",
"test262/language/module-code/parse-err-hoist-lex-fun.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-4.js",
"test262/language/module-code/parse-err-decl-pos-export-labeled.js",
"test262/language/module-code/parse-err-decl-pos-export-object-gen-method.js",
"test262/language/module-code/early-dup-lex.js",
"test262/language/module-code/parse-err-hoist-lex-gen.js",
"test262/language/module-code/parse-err-decl-pos-export-class-expr-meth.js",
"test262/language/module-code/early-import-as-arguments.js",
"test262/language/module-code/parse-err-decl-pos-import-generator-expr.js",
"test262/language/module-code/parse-err-decl-pos-export-for-let.js",
"test262/language/module-code/parse-err-decl-pos-export-while.js",
"test262/language/module-code/parse-err-yield.js",
"test262/language/module-code/parse-err-semi-named-export.js",
"test262/language/module-code/parse-err-decl-pos-import-class-decl-method-gen-static.js",
"test262/language/module-code/parse-err-decl-pos-import-try-finally.js",
"test262/language/module-code/parse-err-decl-pos-import-for-const.js",
"test262/language/module-code/parse-err-decl-pos-export-for-of-lhs.js",
"test262/language/module-code/parse-err-decl-pos-import-for-of-lhs.js",
"test262/language/module-code/parse-err-reference.js",
"test262/language/module-code/instn-resolve-order-src.js",
"test262/language/module-code/parse-err-decl-pos-import-for-in-let.js",
"test262/language/module-code/early-dup-export-star-as-dflt.js",
"test262/language/module-code/parse-err-decl-pos-export-block-stmt.js",
"test262/language/module-code/early-strict-mode.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-5.js",
"test262/language/module-code/parse-err-export-dflt-let.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-3.js",
"test262/language/module-code/privatename-not-valid-earlyerr-module-2.js",
"test262/language/module-code/parse-err-decl-pos-import-for-lhs.js",
"test262/language/module-code/early-dup-export-decl.js",
"test262/language/module-code/parse-err-decl-pos-import-class-expr-meth-static.js",
"test262/language/module-code/parse-err-decl-pos-export-do-while.js",
"test262/language/module-code/parse-err-decl-pos-export-generator-expr.js",
"test262/language/module-code/early-undef-break.js",
"test262/language/module-code/parse-err-decl-pos-import-arrow-function.js",
"test262/language/module-code/parse-err-decl-pos-export-try-finally.js",
"test262/language/module-code/parse-err-decl-pos-export-object-getter.js",
"test262/language/module-code/parse-err-decl-pos-import-class-expr-meth.js",
"test262/language/import/escaped-as-import-specifier.js",
"test262/language/import/dup-bound-names.js",
"test262/language/import/escaped-from.js",
"test262/language/import/escaped-as-namespace-import.js",
"test262/language/statements/do-while/let-array-with-newline.js",
"test262/language/statements/do-while/S12.6.1_A6_T2.js",
"test262/language/statements/do-while/decl-const.js",
"test262/language/statements/do-while/S12.6.1_A6_T1.js",
"test262/language/statements/do-while/S12.6.1_A12.js",
"test262/language/statements/do-while/S12.6.1_A6_T3.js",
"test262/language/statements/do-while/S12.6.1_A6_T5.js",
"test262/language/statements/do-while/decl-gen.js",
"test262/language/statements/do-while/S12.6.1_A15.js",
"test262/language/statements/do-while/S12.6.1_A6_T6.js",
"test262/language/statements/do-while/labelled-fn-stmt.js",
"test262/language/statements/do-while/decl-cls.js",
"test262/language/statements/do-while/decl-fun.js",
"test262/language/statements/do-while/decl-let.js",
"test262/language/statements/do-while/decl-async-fun.js",
"test262/language/statements/do-while/S12.6.1_A6_T4.js",
"test262/language/statements/do-while/decl-async-gen.js",
"test262/language/statements/expression/S12.4_A1.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-async-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-let-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-async-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-let-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/let-array-with-newline.js",
"test262/language/statements/for-await-of/async-func-dstr-let-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-init-yield-ident-invalid.js",
"test262/language/statements/for-await-of/async-func-dstr-const-async-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-dstr-let-async-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-async-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-async-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-let-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-nested-array-invalid.js",
"test262/language/statements/for-await-of/async-func-dstr-let-async-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-func-dstr-const-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-var-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-var-async-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-async-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-async-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-var-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-async-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-var-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-let-async-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-async-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-async-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-nested-obj-invalid.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-async-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-dstr-var-async-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-target-simple-strict.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-async-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-var-async-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-var-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-func-dstr-let-async-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-async-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-dstr-var-async-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-async-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-const-async-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-const-async-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-let-async-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-let-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-var-async-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-var-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/escaped-of.js",
"test262/language/statements/for-await-of/async-func-dstr-var-async-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-target-yield-invalid.js",
"test262/language/statements/for-await-of/async-func-decl-dstr-array-elem-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-await-of/async-func-dstr-var-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-dstr-const-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-dstr-const-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-async-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-const-async-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-async-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-async-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-let-async-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-const-async-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-async-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-async-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-const-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-await-of/async-func-dstr-let-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-var-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-func-dstr-const-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-const-async-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-await-of/async-func-dstr-const-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-const-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-await-of/async-gen-dstr-let-ary-ptrn-rest-init-ary.js",
"test262/language/statements/const/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/const/redeclaration-error-from-within-strict-mode-function-const.js",
"test262/language/statements/const/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/const/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/const/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/const/syntax/with-initializer-if-expression-statement-else-statement.js",
"test262/language/statements/const/syntax/without-initializer-while-expression-statement.js",
"test262/language/statements/const/syntax/without-initializer-label-statement.js",
"test262/language/statements/const/syntax/without-initializer-default-statement-list.js",
"test262/language/statements/const/syntax/without-initializer-if-expression-statement.js",
"test262/language/statements/const/syntax/without-initializer-for-statement.js",
"test262/language/statements/const/syntax/with-initializer-while-expression-statement.js",
"test262/language/statements/const/syntax/without-initializer-case-expression-statement-list.js",
"test262/language/statements/const/syntax/const-declaring-let-split-across-two-lines.js",
"test262/language/statements/const/syntax/with-initializer-for-statement.js",
"test262/language/statements/const/syntax/with-initializer-do-statement-while-expression.js",
"test262/language/statements/const/syntax/block-scope-syntax-const-declarations-mixed-without-with-initialiser.js",
"test262/language/statements/const/syntax/with-initializer-label-statement.js",
"test262/language/statements/const/syntax/block-scope-syntax-const-declarations-without-initialiser.js",
"test262/language/statements/const/syntax/block-scope-syntax-const-declarations-mixed-with-without-initialiser.js",
"test262/language/statements/const/syntax/without-initializer-do-statement-while-expression.js",
"test262/language/statements/const/syntax/without-initializer-if-expression-statement-else-statement.js",
"test262/language/statements/const/syntax/with-initializer-if-expression-statement.js",
"test262/language/statements/const/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/const/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/variable/S12.2_A8_T7.js",
"test262/language/statements/variable/S12.2_A8_T4.js",
"test262/language/statements/variable/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/variable/S12.2_A8_T5.js",
"test262/language/statements/variable/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/variable/id-arguments-strict.js",
"test262/language/statements/variable/S12.2_A8_T8.js",
"test262/language/statements/variable/12.2.1-1gs.js",
"test262/language/statements/variable/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/variable/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/variable/id-eval-strict.js",
"test262/language/statements/variable/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/variable/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/variable/S12.2_A8_T3.js",
"test262/language/statements/variable/12.2.1-4gs.js",
"test262/language/statements/variable/S12.2_A8_T6.js",
"test262/language/statements/variable/S12.2_A8_T2.js",
"test262/language/statements/variable/S12.2_A8_T1.js",
"test262/language/statements/for-of/dstr-var-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-of/dstr-obj-rest-not-last-element-invalid.js",
"test262/language/statements/for-of/dstr-array-elem-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-of/let-array-with-newline.js",
"test262/language/statements/for-of/decl-const.js",
"test262/language/statements/for-of/dstr-array-rest-before-rest.js",
"test262/language/statements/for-of/dstr-array-elem-nested-obj-invalid.js",
"test262/language/statements/for-of/dstr-let-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-of/head-lhs-invalid-asnmt-ptrn-ary.js",
"test262/language/statements/for-of/dstr-const-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-of/head-lhs-cover-non-asnmt-trgt.js",
"test262/language/statements/for-of/dstr-obj-prop-nested-array-invalid.js",
"test262/language/statements/for-of/head-let-bound-names-dup.js",
"test262/language/statements/for-of/dstr-array-elem-target-simple-strict.js",
"test262/language/statements/for-of/dstr-array-rest-init.js",
"test262/language/statements/for-of/labelled-fn-stmt-let.js",
"test262/language/statements/for-of/dstr-let-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-of/dstr-obj-prop-elem-init-yield-ident-invalid.js",
"test262/language/statements/for-of/head-const-bound-names-dup.js",
"test262/language/statements/for-of/dstr-obj-id-init-yield-ident-invalid.js",
"test262/language/statements/for-of/labelled-fn-stmt-const.js",
"test262/language/statements/for-of/head-let-bound-names-let.js",
"test262/language/statements/for-of/dstr-array-rest-before-elision.js",
"test262/language/statements/for-of/dstr-var-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-of/dstr-array-rest-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-of/dstr-array-rest-nested-array-invalid.js",
"test262/language/statements/for-of/dstr-obj-prop-nested-obj-invalid.js",
"test262/language/statements/for-of/head-lhs-invalid-asnmt-ptrn-obj.js",
"test262/language/statements/for-of/dstr-array-rest-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-of/dstr-array-elem-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-of/labelled-fn-stmt-lhs.js",
"test262/language/statements/for-of/dstr-array-rest-before-element.js",
"test262/language/statements/for-of/dstr-var-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-of/head-expr-no-expr.js",
"test262/language/statements/for-of/dstr-let-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-of/dstr-let-ary-ptrn-rest-init-id.js",
"test262/language/statements/for-of/dstr-const-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for-of/dstr-obj-id-identifier-yield-expr.js",
"test262/language/statements/for-of/dstr-let-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-of/decl-gen.js",
"test262/language/statements/for-of/dstr-const-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-of/head-var-no-expr.js",
"test262/language/statements/for-of/head-let-bound-names-in-stmt.js",
"test262/language/statements/for-of/dstr-array-elem-init-yield-ident-invalid.js",
"test262/language/statements/for-of/labelled-fn-stmt-var.js",
"test262/language/statements/for-of/dstr-obj-prop-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-of/dstr-var-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-of/dstr-const-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-of/dstr-array-rest-elision-invalid.js",
"test262/language/statements/for-of/dstr-let-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-of/decl-cls.js",
"test262/language/statements/for-of/escaped-of.js",
"test262/language/statements/for-of/decl-fun.js",
"test262/language/statements/for-of/decl-let.js",
"test262/language/statements/for-of/dstr-array-elem-nested-array-invalid.js",
"test262/language/statements/for-of/dstr-obj-id-init-simple-strict.js",
"test262/language/statements/for-of/head-lhs-let.js",
"test262/language/statements/for-of/dstr-obj-prop-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-of/head-decl-no-expr.js",
"test262/language/statements/for-of/head-const-bound-names-in-stmt.js",
"test262/language/statements/for-of/dstr-obj-id-simple-strict.js",
"test262/language/statements/for-of/dstr-const-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-of/dstr-obj-id-identifier-yield-ident-invalid.js",
"test262/language/statements/for-of/dstr-array-rest-nested-obj-invalid.js",
"test262/language/statements/for-of/head-lhs-non-asnmt-trgt.js",
"test262/language/statements/for-of/dstr-var-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for-of/dstr-const-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for-of/dstr-var-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for-of/head-const-bound-names-let.js",
"test262/language/statements/for-of/dstr-array-elem-target-yield-invalid.js",
"test262/language/statements/for-of/decl-async-fun.js",
"test262/language/statements/for-of/dstr-obj-prop-elem-target-yield-ident-invalid.js",
"test262/language/statements/for-of/dstr-array-rest-yield-ident-invalid.js",
"test262/language/statements/for-of/decl-async-gen.js",
"test262/language/statements/async-function/early-errors-declaration-duplicate-parameters.js",
"test262/language/statements/async-function/early-errors-declaration-NSPL-with-USD.js",
"test262/language/statements/async-function/await-as-identifier-reference.js",
"test262/language/statements/async-function/early-errors-declaration-binding-identifier-arguments.js",
"test262/language/statements/async-function/early-errors-declaration-formals-body-duplicate.js",
"test262/language/statements/async-function/await-as-label-identifier-escaped.js",
"test262/language/statements/async-function/early-errors-declaration-await-in-formals-default.js",
"test262/language/statements/async-function/early-errors-declaration-formals-contains-super-property.js",
"test262/language/statements/async-function/await-as-binding-identifier.js",
"test262/language/statements/async-function/escaped-async.js",
"test262/language/statements/async-function/dflt-params-rest.js",
"test262/language/statements/async-function/dflt-params-duplicates.js",
"test262/language/statements/async-function/let-newline-await-in-async-function.js",
"test262/language/statements/async-function/await-as-label-identifier.js",
"test262/language/statements/async-function/early-errors-declaration-await-in-formals.js",
"test262/language/statements/async-function/await-as-binding-identifier-escaped.js",
"test262/language/statements/async-function/early-errors-declaration-eval-in-formal-parameters.js",
"test262/language/statements/async-function/early-errors-declaration-arguments-in-formal-parameters.js",
"test262/language/statements/async-function/early-errors-declaration-body-contains-super-property.js",
"test262/language/statements/async-function/early-errors-declaration-binding-identifier-eval.js",
"test262/language/statements/async-function/early-errors-declaration-formals-contains-super-call.js",
"test262/language/statements/async-function/rest-params-trailing-comma-early-error.js",
"test262/language/statements/async-function/early-errors-declaration-body-contains-super-call.js",
"test262/language/statements/async-function/await-as-identifier-reference-escaped.js",
"test262/language/statements/return/S12.9_A1_T9.js",
"test262/language/statements/return/S12.9_A1_T1.js",
"test262/language/statements/return/S12.9_A1_T2.js",
"test262/language/statements/return/S12.9_A1_T7.js",
"test262/language/statements/return/S12.9_A1_T4.js",
"test262/language/statements/return/S12.9_A1_T3.js",
"test262/language/statements/return/S12.9_A1_T10.js",
"test262/language/statements/return/S12.9_A1_T6.js",
"test262/language/statements/return/S12.9_A1_T8.js",
"test262/language/statements/return/S12.9_A1_T5.js",
"test262/language/statements/if/labelled-fn-stmt-lone.js",
"test262/language/statements/if/if-stmt-else-cls.js",
"test262/language/statements/if/if-decl-else-stmt-strict.js",
"test262/language/statements/if/let-array-with-newline.js",
"test262/language/statements/if/labelled-fn-stmt-first.js",
"test262/language/statements/if/if-fun-else-fun-strict.js",
"test262/language/statements/if/if-decl-else-decl-strict.js",
"test262/language/statements/if/if-async-fun-else-stmt.js",
"test262/language/statements/if/if-stmt-else-decl-strict.js",
"test262/language/statements/if/if-let-no-else.js",
"test262/language/statements/if/if-cls-else-cls.js",
"test262/language/statements/if/if-decl-no-else-strict.js",
"test262/language/statements/if/if-stmt-else-fun-strict.js",
"test262/language/statements/if/S12.5_A8.js",
"test262/language/statements/if/if-const-else-stmt.js",
"test262/language/statements/if/if-let-else-let.js",
"test262/language/statements/if/if-async-fun-no-else.js",
"test262/language/statements/if/if-stmt-else-const.js",
"test262/language/statements/if/S12.5_A6_T1.js",
"test262/language/statements/if/if-const-no-else.js",
"test262/language/statements/if/if-let-else-stmt.js",
"test262/language/statements/if/if-stmt-else-let.js",
"test262/language/statements/if/if-fun-else-stmt-strict.js",
"test262/language/statements/if/if-stmt-else-gen.js",
"test262/language/statements/if/if-const-else-const.js",
"test262/language/statements/if/if-async-gen-else-async-gen.js",
"test262/language/statements/if/S12.5_A6_T2.js",
"test262/language/statements/if/if-cls-no-else.js",
"test262/language/statements/if/if-gen-else-stmt.js",
"test262/language/statements/if/if-stmt-else-async-fun.js",
"test262/language/statements/if/if-gen-else-gen.js",
"test262/language/statements/if/S12.5_A11.js",
"test262/language/statements/if/if-gen-no-else.js",
"test262/language/statements/if/if-stmt-else-async-gen.js",
"test262/language/statements/if/if-fun-no-else-strict.js",
"test262/language/statements/if/if-async-gen-else-stmt.js",
"test262/language/statements/if/if-cls-else-stmt.js",
"test262/language/statements/if/if-async-gen-no-else.js",
"test262/language/statements/if/if-async-fun-else-async-fun.js",
"test262/language/statements/if/labelled-fn-stmt-second.js",
"test262/language/statements/class/definition/methods-gen-yield-as-identifier-in-nested-function.js",
"test262/language/statements/class/definition/early-errors-class-method-eval-in-formal-parameters.js",
"test262/language/statements/class/definition/methods-gen-yield-as-parameter.js",
"test262/language/statements/class/definition/early-errors-class-method-await-in-formals-default.js",
"test262/language/statements/class/definition/early-errors-class-method-duplicate-parameters.js",
"test262/language/statements/class/definition/early-errors-class-method-await-in-formals.js",
"test262/language/statements/class/definition/methods-gen-yield-as-function-expression-binding-identifier.js",
"test262/language/statements/class/definition/early-errors-class-method-formals-body-duplicate.js",
"test262/language/statements/class/definition/methods-gen-yield-as-logical-or-expression.js",
"test262/language/statements/class/definition/early-errors-class-method-arguments-in-formal-parameters.js",
"test262/language/statements/class/definition/methods-gen-yield-star-after-newline.js",
"test262/language/statements/class/definition/early-errors-class-method-formals-contains-super-call.js",
"test262/language/statements/class/definition/early-errors-class-method-NSPL-with-USD.js",
"test262/language/statements/class/definition/early-errors-class-method-body-contains-super-call.js",
"test262/language/statements/class/definition/methods-gen-yield-weak-binding.js",
"test262/language/statements/class/async-gen-method-static-await-as-binding-identifier-escaped.js",
"test262/language/statements/class/async-gen-method-static-await-as-identifier-reference-escaped.js",
"test262/language/statements/class/async-meth-static-dflt-params-rest.js",
"test262/language/statements/class/fields-arrow-fnc-init-err-contains-super.js",
"test262/language/statements/class/async-gen-method-static-yield-as-label-identifier.js",
"test262/language/statements/class/dstr-async-gen-meth-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/gen-method-static-yield-as-binding-identifier-escaped.js",
"test262/language/statements/class/meth-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/fields-asi-3.js",
"test262/language/statements/class/fields-private-ternary-init-err-contains-super.js",
"test262/language/statements/class/dstr-meth-static-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/dstr-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/meth-static-dflt-params-duplicates.js",
"test262/language/statements/class/async-gen-method-yield-as-label-identifier.js",
"test262/language/statements/class/dstr-async-gen-meth-static-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/dstr-async-gen-meth-static-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/dstr-async-gen-meth-static-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/gen-method-yield-as-identifier-reference.js",
"test262/language/statements/class/dstr-gen-meth-static-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/dstr-async-gen-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/fields-duplicate-privatenames.js",
"test262/language/statements/class/dstr-meth-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/gen-method-yield-as-label-identifier.js",
"test262/language/statements/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/fields-string-literal-name-init-err-contains-arguments.js",
"test262/language/statements/class/async-method-await-as-binding-identifier.js",
"test262/language/statements/class/async-gen-method-static-await-as-identifier-reference.js",
"test262/language/statements/class/class-name-ident-let.js",
"test262/language/statements/class/meth-static-dflt-params-rest.js",
"test262/language/statements/class/async-gen-method-yield-as-identifier-reference-escaped.js",
"test262/language/statements/class/dstr-meth-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/gen-meth-dflt-params-duplicates.js",
"test262/language/statements/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/err-method-delete-twice-covered-call-expression-privatename.js",
"test262/language/statements/class/dstr-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/dstr-async-gen-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/async-method-static-await-as-identifier-reference-escaped.js",
"test262/language/statements/class/gen-meth-static-dflt-params-rest.js",
"test262/language/statements/class/fields-equality-init-err-contains-super.js",
"test262/language/statements/class/async-method-await-as-identifier-reference.js",
"test262/language/statements/class/getter-param-dflt.js",
"test262/language/statements/class/dstr-async-gen-meth-static-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/dstr-gen-meth-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/dstr-meth-static-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/gen-method-static-yield-as-label-identifier-escaped.js",
"test262/language/statements/class/dstr-meth-static-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/gen-method-static-yield-as-label-identifier.js",
"test262/language/statements/class/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/dstr-meth-static-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/gen-method-static-yield-identifier-spread-strict.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-4.js",
"test262/language/statements/class/dstr-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/async-gen-method-static-yield-as-identifier-reference.js",
"test262/language/statements/class/gen-method-yield-as-binding-identifier-escaped.js",
"test262/language/statements/class/gen-meth-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/dstr-meth-static-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/fields-private-arrow-fnc-init-err-contains-super.js",
"test262/language/statements/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/meth-dflt-params-rest.js",
"test262/language/statements/class/async-gen-method-static-await-as-label-identifier.js",
"test262/language/statements/class/class-name-ident-yield.js",
"test262/language/statements/class/async-method-await-as-label-identifier.js",
"test262/language/statements/class/err-method-delete-covered-call-expression-privatename.js",
"test262/language/statements/class/gen-method-yield-as-label-identifier-escaped.js",
"test262/language/statements/class/dstr-async-gen-meth-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-7.js",
"test262/language/statements/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/dstr-async-gen-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/async-method-static-await-as-binding-identifier.js",
"test262/language/statements/class/dstr-gen-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-5.js",
"test262/language/statements/class/fields-literal-name-init-err-contains-super.js",
"test262/language/statements/class/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/err-field-delete-twice-covered-call-expression-privatename.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-3.js",
"test262/language/statements/class/async-meth-static-dflt-params-duplicates.js",
"test262/language/statements/class/class-name-ident-await-escaped-module.js",
"test262/language/statements/class/gen-meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/async-gen-method-static-yield-identifier-spread-strict.js",
"test262/language/statements/class/gen-method-yield-identifier-spread-strict.js",
"test262/language/statements/class/gen-method-yield-as-identifier-reference-escaped.js",
"test262/language/statements/class/err-method-delete-twice-covered-member-expression-privatename.js",
"test262/language/statements/class/dstr-meth-static-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/async-meth-dflt-params-duplicates.js",
"test262/language/statements/class/async-method-static-await-as-binding-identifier-escaped.js",
"test262/language/statements/class/async-meth-escaped-async.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-6.js",
"test262/language/statements/class/gen-method-yield-identifier-strict.js",
"test262/language/statements/class/meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/dstr-meth-static-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/async-gen-method-await-as-identifier-reference.js",
"test262/language/statements/class/gen-method-static-yield-as-identifier-reference-escaped.js",
"test262/language/statements/class/fields-ternary-init-err-contains-arguments.js",
"test262/language/statements/class/async-gen-method-yield-as-binding-identifier.js",
"test262/language/statements/class/async-gen-meth-dflt-params-duplicates.js",
"test262/language/statements/class/async-gen-meth-escaped-async.js",
"test262/language/statements/class/class-name-ident-let-escaped.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-8.js",
"test262/language/statements/class/async-gen-meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/async-gen-meth-dflt-params-rest.js",
"test262/language/statements/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/gen-method-static-yield-as-identifier-reference.js",
"test262/language/statements/class/fields-arrow-fnc-init-err-contains-arguments.js",
"test262/language/statements/class/class-name-ident-static-escaped.js",
"test262/language/statements/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/err-method-delete-covered-member-expression-privatename.js",
"test262/language/statements/class/fields-private-typeof-init-err-contains-arguments.js",
"test262/language/statements/class/method-param-yield.js",
"test262/language/statements/class/err-field-delete-covered-member-expression-privatename.js",
"test262/language/statements/class/dstr-meth-static-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/fields-asi-4.js",
"test262/language/statements/class/async-gen-method-static-yield-as-identifier-reference-escaped.js",
"test262/language/statements/class/async-gen-method-static-await-as-binding-identifier.js",
"test262/language/statements/class/async-gen-method-await-as-binding-identifier.js",
"test262/language/statements/class/dstr-gen-meth-static-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/gen-method-static-yield-as-binding-identifier.js",
"test262/language/statements/class/fields-equality-init-err-contains-arguments.js",
"test262/language/statements/class/fields-literal-name-propname-constructor.js",
"test262/language/statements/class/async-gen-method-await-as-label-identifier-escaped.js",
"test262/language/statements/class/fields-typeof-init-err-contains-arguments.js",
"test262/language/statements/class/async-gen-method-await-as-label-identifier.js",
"test262/language/statements/class/err-field-delete-call-expression-privatename.js",
"test262/language/statements/class/async-gen-method-yield-as-binding-identifier-escaped.js",
"test262/language/statements/class/dstr-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/fields-typeof-init-err-contains-super.js",
"test262/language/statements/class/dstr-gen-meth-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/err-field-delete-twice-covered-member-expression-privatename.js",
"test262/language/statements/class/class-name-ident-yield-escaped.js",
"test262/language/statements/class/syntax/early-errors/class-body-special-method-get-propname-constructor.js",
"test262/language/statements/class/syntax/early-errors/class-body-static-method-get-contains-direct-super.js",
"test262/language/statements/class/syntax/early-errors/class-body-special-method-get-contains-direct-super.js",
"test262/language/statements/class/syntax/early-errors/class-body-static-method-propname-prototype.js",
"test262/language/statements/class/syntax/early-errors/class-body-special-method-generator-propname-constructor.js",
"test262/language/statements/class/syntax/early-errors/class-body-has-direct-super-missing-class-heritage.js",
"test262/language/statements/class/syntax/early-errors/class-body-static-method-get-propname-prototype.js",
"test262/language/statements/class/syntax/early-errors/class-body-special-method-generator-contains-direct-super.js",
"test262/language/statements/class/syntax/early-errors/class-body-static-method-set-propname-prototype.js",
"test262/language/statements/class/syntax/early-errors/class-body-method-contains-direct-super.js",
"test262/language/statements/class/syntax/early-errors/class-definition-evaluation-scriptbody-duplicate-binding.js",
"test262/language/statements/class/syntax/early-errors/class-body-special-method-set-propname-constructor.js",
"test262/language/statements/class/syntax/early-errors/class-body-contains-multiple-constructor.js",
"test262/language/statements/class/syntax/early-errors/class-body-static-method-contains-direct-super.js",
"test262/language/statements/class/syntax/early-errors/class-definition-evaluation-block-duplicate-binding.js",
"test262/language/statements/class/syntax/early-errors/class-body-static-method-set-contains-direct-super.js",
"test262/language/statements/class/syntax/early-errors/class-body-special-method-set-contains-direct-super.js",
"test262/language/statements/class/syntax/escaped-static.js",
"test262/language/statements/class/async-gen-meth-static-dflt-params-rest.js",
"test262/language/statements/class/async-method-static-await-as-identifier-reference.js",
"test262/language/statements/class/gen-method-yield-as-binding-identifier.js",
"test262/language/statements/class/dstr-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/async-gen-method-yield-as-identifier-reference.js",
"test262/language/statements/class/fields-private-ternary-init-err-contains-arguments.js",
"test262/language/statements/class/async-gen-method-static-await-as-label-identifier-escaped.js",
"test262/language/statements/class/async-method-await-as-label-identifier-escaped.js",
"test262/language/statements/class/async-gen-method-static-yield-as-binding-identifier-escaped.js",
"test262/language/statements/class/dstr-meth-static-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/async-gen-meth-static-dflt-params-duplicates.js",
"test262/language/statements/class/dstr-gen-meth-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/dstr-gen-meth-static-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/static-gen-method-param-dflt-yield.js",
"test262/language/statements/class/dstr-gen-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/fields-string-name-propname-constructor.js",
"test262/language/statements/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/dstr-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/dstr-gen-meth-static-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/async-method-await-as-identifier-reference-escaped.js",
"test262/language/statements/class/async-meth-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/gen-method-static-yield-identifier-strict.js",
"test262/language/statements/class/dstr-gen-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/async-gen-method-await-as-binding-identifier-escaped.js",
"test262/language/statements/class/gen-meth-static-dflt-params-duplicates.js",
"test262/language/statements/class/static-method-param-yield.js",
"test262/language/statements/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/dstr-gen-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/async-method-static-await-as-label-identifier.js",
"test262/language/statements/class/async-gen-meth-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/gen-method-param-dflt-yield.js",
"test262/language/statements/class/dstr-meth-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/dstr-async-gen-meth-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/async-meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/statements/class/fields-private-typeof-init-err-contains-super.js",
"test262/language/statements/class/dstr-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/dstr-gen-meth-static-ary-ptrn-rest-init-ary.js",
"test262/language/statements/class/meth-dflt-params-duplicates.js",
"test262/language/statements/class/err-field-delete-covered-call-expression-privatename.js",
"test262/language/statements/class/fields-private-literal-name-init-err-contains-super.js",
"test262/language/statements/class/dstr-meth-static-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/dstr-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/class-name-ident-await-module.js",
"test262/language/statements/class/strict-mode/with.js",
"test262/language/statements/class/dstr-meth-static-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/class/class-name-ident-static.js",
"test262/language/statements/class/fields-private-literal-name-init-err-contains-arguments.js",
"test262/language/statements/class/async-gen-method-static-yield-as-label-identifier-escaped.js",
"test262/language/statements/class/dstr-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/fields-privatename-constructor-err.js",
"test262/language/statements/class/async-gen-method-await-as-identifier-reference-escaped.js",
"test262/language/statements/class/async-method-await-as-binding-identifier-escaped.js",
"test262/language/statements/class/async-gen-method-static-yield-as-binding-identifier.js",
"test262/language/statements/class/dstr-async-gen-meth-static-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/fields-ternary-init-err-contains-super.js",
"test262/language/statements/class/fields-string-literal-name-init-err-contains-super.js",
"test262/language/statements/class/fields-private-arrow-fnc-init-err-contains-arguments.js",
"test262/language/statements/class/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/async-gen-method-static-yield-identifier-strict.js",
"test262/language/statements/class/async-gen-method-yield-identifier-strict.js",
"test262/language/statements/class/dstr-gen-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-1.js",
"test262/language/statements/class/dstr-gen-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/err-method-delete-call-expression-privatename.js",
"test262/language/statements/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/class/async-gen-method-yield-as-label-identifier-escaped.js",
"test262/language/statements/class/async-method-static-await-as-label-identifier-escaped.js",
"test262/language/statements/class/async-meth-dflt-params-rest.js",
"test262/language/statements/class/err-field-delete-member-expression-privatename.js",
"test262/language/statements/class/dstr-gen-meth-static-ary-ptrn-rest-init-obj.js",
"test262/language/statements/class/fields-literal-name-init-err-contains-arguments.js",
"test262/language/statements/class/dstr-async-gen-meth-static-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/class/gen-meth-dflt-params-rest.js",
"test262/language/statements/class/privatename-not-valid-earlyerr-script-2.js",
"test262/language/statements/class/err-method-delete-member-expression-privatename.js",
"test262/language/statements/class/dstr-meth-static-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/class/async-gen-method-yield-identifier-spread-strict.js",
"test262/language/statements/generators/yield-as-label-identifier-escaped.js",
"test262/language/statements/generators/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/generators/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/generators/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/generators/yield-identifier-strict.js",
"test262/language/statements/generators/yield-as-parameter.js",
"test262/language/statements/generators/yield-as-binding-identifier-escaped.js",
"test262/language/statements/generators/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/generators/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/generators/yield-as-binding-identifier.js",
"test262/language/statements/generators/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/generators/yield-as-identifier-reference.js",
"test262/language/statements/generators/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/generators/dflt-params-rest.js",
"test262/language/statements/generators/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/generators/use-strict-with-non-simple-param.js",
"test262/language/statements/generators/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/generators/yield-star-after-newline.js",
"test262/language/statements/generators/dflt-params-duplicates.js",
"test262/language/statements/generators/yield-as-identifier-reference-escaped.js",
"test262/language/statements/generators/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/generators/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/generators/param-dflt-yield.js",
"test262/language/statements/generators/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/generators/yield-as-label-identifier.js",
"test262/language/statements/generators/rest-params-trailing-comma-early-error.js",
"test262/language/statements/generators/yield-as-logical-or-expression.js",
"test262/language/statements/generators/yield-identifier-spread-strict.js",
"test262/language/statements/generators/yield-weak-binding.js",
"test262/language/statements/continue/S12.7_A8_T1.js",
"test262/language/statements/continue/S12.7_A8_T2.js",
"test262/language/statements/continue/S12.7_A5_T2.js",
"test262/language/statements/continue/S12.7_A1_T3.js",
"test262/language/statements/continue/S12.7_A1_T4.js",
"test262/language/statements/continue/S12.7_A1_T2.js",
"test262/language/statements/continue/S12.7_A1_T1.js",
"test262/language/statements/continue/S12.7_A5_T1.js",
"test262/language/statements/continue/S12.7_A5_T3.js",
"test262/language/statements/continue/S12.7_A6.js",
"test262/language/statements/while/let-array-with-newline.js",
"test262/language/statements/while/S12.6.2_A15.js",
"test262/language/statements/while/decl-const.js",
"test262/language/statements/while/S12.6.2_A6_T6.js",
"test262/language/statements/while/S12.6.2_A6_T2.js",
"test262/language/statements/while/decl-gen.js",
"test262/language/statements/while/S12.6.2_A6_T5.js",
"test262/language/statements/while/labelled-fn-stmt.js",
"test262/language/statements/while/S12.6.2_A6_T3.js",
"test262/language/statements/while/decl-cls.js",
"test262/language/statements/while/decl-fun.js",
"test262/language/statements/while/decl-let.js",
"test262/language/statements/while/decl-async-fun.js",
"test262/language/statements/while/S12.6.2_A6_T4.js",
"test262/language/statements/while/decl-async-gen.js",
"test262/language/statements/while/S12.6.2_A6_T1.js",
"test262/language/statements/let/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/let/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/let/redeclaration-error-from-within-strict-mode-function.js",
"test262/language/statements/let/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/let/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/let/syntax/attempt-to-redeclare-let-binding-with-var.js",
"test262/language/statements/let/syntax/without-initialisers-in-statement-positions-label-statement.js",
"test262/language/statements/let/syntax/let-let-declaration-with-initializer-split-across-two-lines.js",
"test262/language/statements/let/syntax/with-initialisers-in-statement-positions-while-expression-statement.js",
"test262/language/statements/let/syntax/let-newline-yield-in-generator-function.js",
"test262/language/statements/let/syntax/let-let-declaration-split-across-two-lines.js",
"test262/language/statements/let/syntax/without-initialisers-in-statement-positions-if-expression-statement-else-statement.js",
"test262/language/statements/let/syntax/with-initialisers-in-statement-positions-if-expression-statement.js",
"test262/language/statements/let/syntax/with-initialisers-in-statement-positions-label-statement.js",
"test262/language/statements/let/syntax/without-initialisers-in-statement-positions-while-expression-statement.js",
"test262/language/statements/let/syntax/with-initialisers-in-statement-positions-for-statement.js",
"test262/language/statements/let/syntax/let-newline-yield-in-normal-function.js",
"test262/language/statements/let/syntax/with-initialisers-in-statement-positions-do-statement-while-expression.js",
"test262/language/statements/let/syntax/with-initialisers-in-statement-positions-if-expression-statement-else-statement.js",
"test262/language/statements/let/syntax/attempt-to-redeclare-let-binding-with-function-declaration.js",
"test262/language/statements/let/syntax/identifier-let-disallowed-as-boundname.js",
"test262/language/statements/let/syntax/without-initialisers-in-statement-positions-do-statement-while-expression.js",
"test262/language/statements/let/syntax/without-initialisers-in-statement-positions-for-statement.js",
"test262/language/statements/let/syntax/identifier-let-allowed-as-lefthandside-expression-strict.js",
"test262/language/statements/let/syntax/let-newline-await-in-normal-function.js",
"test262/language/statements/let/syntax/without-initialisers-in-statement-positions-if-expression-statement.js",
"test262/language/statements/let/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/let/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for-in/dstr-obj-rest-not-last-element-invalid.js",
"test262/language/statements/for-in/dstr-array-elem-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-in/let-array-with-newline.js",
"test262/language/statements/for-in/decl-const.js",
"test262/language/statements/for-in/dstr-array-rest-before-rest.js",
"test262/language/statements/for-in/dstr-array-elem-nested-obj-invalid.js",
"test262/language/statements/for-in/head-lhs-invalid-asnmt-ptrn-ary.js",
"test262/language/statements/for-in/head-lhs-cover-non-asnmt-trgt.js",
"test262/language/statements/for-in/dstr-obj-prop-nested-array-invalid.js",
"test262/language/statements/for-in/head-let-bound-names-dup.js",
"test262/language/statements/for-in/dstr-array-elem-target-simple-strict.js",
"test262/language/statements/for-in/dstr-array-rest-init.js",
"test262/language/statements/for-in/labelled-fn-stmt-let.js",
"test262/language/statements/for-in/dstr-obj-prop-elem-init-yield-ident-invalid.js",
"test262/language/statements/for-in/head-const-bound-names-dup.js",
"test262/language/statements/for-in/dstr-obj-id-init-yield-ident-invalid.js",
"test262/language/statements/for-in/labelled-fn-stmt-const.js",
"test262/language/statements/for-in/head-let-bound-names-let.js",
"test262/language/statements/for-in/dstr-array-rest-before-elision.js",
"test262/language/statements/for-in/dstr-array-rest-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-in/dstr-array-rest-nested-array-invalid.js",
"test262/language/statements/for-in/dstr-obj-prop-nested-obj-invalid.js",
"test262/language/statements/for-in/head-lhs-invalid-asnmt-ptrn-obj.js",
"test262/language/statements/for-in/dstr-array-rest-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-in/dstr-array-elem-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-in/labelled-fn-stmt-lhs.js",
"test262/language/statements/for-in/dstr-array-rest-before-element.js",
"test262/language/statements/for-in/dstr-obj-id-identifier-yield-expr.js",
"test262/language/statements/for-in/decl-gen.js",
"test262/language/statements/for-in/head-let-bound-names-in-stmt.js",
"test262/language/statements/for-in/dstr-array-elem-init-yield-ident-invalid.js",
"test262/language/statements/for-in/labelled-fn-stmt-var.js",
"test262/language/statements/for-in/dstr-obj-prop-nested-obj-yield-ident-invalid.js",
"test262/language/statements/for-in/S12.6.4_A15.js",
"test262/language/statements/for-in/dstr-array-rest-elision-invalid.js",
"test262/language/statements/for-in/decl-cls.js",
"test262/language/statements/for-in/decl-fun.js",
"test262/language/statements/for-in/decl-let.js",
"test262/language/statements/for-in/dstr-array-elem-nested-array-invalid.js",
"test262/language/statements/for-in/dstr-obj-id-init-simple-strict.js",
"test262/language/statements/for-in/dstr-obj-prop-nested-array-yield-ident-invalid.js",
"test262/language/statements/for-in/head-const-bound-names-in-stmt.js",
"test262/language/statements/for-in/dstr-obj-id-simple-strict.js",
"test262/language/statements/for-in/dstr-obj-id-identifier-yield-ident-invalid.js",
"test262/language/statements/for-in/dstr-array-rest-nested-obj-invalid.js",
"test262/language/statements/for-in/head-lhs-non-asnmt-trgt.js",
"test262/language/statements/for-in/head-const-bound-names-let.js",
"test262/language/statements/for-in/dstr-array-elem-target-yield-invalid.js",
"test262/language/statements/for-in/decl-async-fun.js",
"test262/language/statements/for-in/dstr-obj-prop-elem-target-yield-ident-invalid.js",
"test262/language/statements/for-in/dstr-array-rest-yield-ident-invalid.js",
"test262/language/statements/for-in/decl-async-gen.js",
"test262/language/statements/block/S12.1_A4_T2.js",
"test262/language/statements/block/S12.1_A4_T1.js",
"test262/language/statements/try/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/try/S12.14_A16_T8.js",
"test262/language/statements/try/S12.14_A16_T9.js",
"test262/language/statements/try/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/try/S12.14_A16_T15.js",
"test262/language/statements/try/S12.14_A16_T11.js",
"test262/language/statements/try/early-catch-lex.js",
"test262/language/statements/try/S12.14_A16_T5.js",
"test262/language/statements/try/S12.14_A16_T6.js",
"test262/language/statements/try/S12.14_A16_T7.js",
"test262/language/statements/try/catch-parameter-boundnames-restriction-arguments-negative-early.js",
"test262/language/statements/try/early-catch-duplicates.js",
"test262/language/statements/try/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/try/S12.14_A16_T1.js",
"test262/language/statements/try/S12.14_A16_T3.js",
"test262/language/statements/try/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/try/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/try/S12.14_A16_T12.js",
"test262/language/statements/try/S12.14_A16_T13.js",
"test262/language/statements/try/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/try/optional-catch-binding-parens.js",
"test262/language/statements/try/early-catch-function.js",
"test262/language/statements/try/S12.14_A16_T14.js",
"test262/language/statements/try/S12.14_A16_T10.js",
"test262/language/statements/try/catch-parameter-boundnames-restriction-eval-negative-early.js",
"test262/language/statements/try/S12.14_A16_T2.js",
"test262/language/statements/try/early-catch-var.js",
"test262/language/statements/async-generator/yield-as-label-identifier-escaped.js",
"test262/language/statements/async-generator/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/async-generator/await-as-identifier-reference.js",
"test262/language/statements/async-generator/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/async-generator/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/async-generator/yield-identifier-strict.js",
"test262/language/statements/async-generator/yield-as-binding-identifier-escaped.js",
"test262/language/statements/async-generator/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/async-generator/await-as-label-identifier-escaped.js",
"test262/language/statements/async-generator/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/async-generator/yield-as-binding-identifier.js",
"test262/language/statements/async-generator/await-as-binding-identifier.js",
"test262/language/statements/async-generator/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/async-generator/yield-as-identifier-reference.js",
"test262/language/statements/async-generator/escaped-async.js",
"test262/language/statements/async-generator/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/async-generator/dflt-params-rest.js",
"test262/language/statements/async-generator/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/async-generator/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/async-generator/dflt-params-duplicates.js",
"test262/language/statements/async-generator/yield-as-identifier-reference-escaped.js",
"test262/language/statements/async-generator/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/async-generator/await-as-label-identifier.js",
"test262/language/statements/async-generator/await-as-binding-identifier-escaped.js",
"test262/language/statements/async-generator/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/async-generator/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/async-generator/yield-as-label-identifier.js",
"test262/language/statements/async-generator/rest-params-trailing-comma-early-error.js",
"test262/language/statements/async-generator/yield-identifier-spread-strict.js",
"test262/language/statements/async-generator/await-as-identifier-reference-escaped.js",
"test262/language/statements/break/S12.8_A6.js",
"test262/language/statements/break/S12.8_A5_T1.js",
"test262/language/statements/break/S12.8_A1_T1.js",
"test262/language/statements/break/S12.8_A5_T2.js",
"test262/language/statements/break/S12.8_A1_T4.js",
"test262/language/statements/break/S12.8_A5_T3.js",
"test262/language/statements/break/S12.8_A8_T1.js",
"test262/language/statements/break/S12.8_A1_T3.js",
"test262/language/statements/break/S12.8_A8_T2.js",
"test262/language/statements/break/S12.8_A1_T2.js",
"test262/language/statements/with/let-array-with-newline.js",
"test262/language/statements/with/decl-const.js",
"test262/language/statements/with/12.10.1-11gs.js",
"test262/language/statements/with/decl-gen.js",
"test262/language/statements/with/labelled-fn-stmt.js",
"test262/language/statements/with/decl-cls.js",
"test262/language/statements/with/decl-fun.js",
"test262/language/statements/with/decl-let.js",
"test262/language/statements/with/decl-async-fun.js",
"test262/language/statements/with/decl-async-gen.js",
"test262/language/statements/for/dstr-var-ary-ptrn-rest-init-id.js",
"test262/language/statements/for/S12.6.3_A8.1_T1.js",
"test262/language/statements/for/let-array-with-newline.js",
"test262/language/statements/for/decl-const.js",
"test262/language/statements/for/dstr-let-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for/dstr-const-ary-ptrn-rest-init-id.js",
"test262/language/statements/for/S12.6.3_A11_T3.js",
"test262/language/statements/for/S12.6.3_A8_T3.js",
"test262/language/statements/for/S12.6.3_A4_T2.js",
"test262/language/statements/for/labelled-fn-stmt-let.js",
"test262/language/statements/for/S12.6.3_A7_T1.js",
"test262/language/statements/for/dstr-let-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for/labelled-fn-stmt-const.js",
"test262/language/statements/for/dstr-var-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for/S12.6.3_A8_T2.js",
"test262/language/statements/for/S12.6.3_A4.1.js",
"test262/language/statements/for/dstr-var-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for/dstr-let-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for/dstr-let-ary-ptrn-rest-init-id.js",
"test262/language/statements/for/dstr-const-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/for/S12.6.3_A7.1_T1.js",
"test262/language/statements/for/dstr-let-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for/decl-gen.js",
"test262/language/statements/for/S12.6.3_A11.1_T3.js",
"test262/language/statements/for/dstr-const-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for/head-let-bound-names-in-stmt.js",
"test262/language/statements/for/S12.6.3_A12.1_T3.js",
"test262/language/statements/for/labelled-fn-stmt-var.js",
"test262/language/statements/for/dstr-var-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for/S12.6.3_A8_T1.js",
"test262/language/statements/for/dstr-const-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for/dstr-let-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for/decl-cls.js",
"test262/language/statements/for/decl-fun.js",
"test262/language/statements/for/decl-let.js",
"test262/language/statements/for/S12.6.3_A8.1_T2.js",
"test262/language/statements/for/S12.6.3_A4_T1.js",
"test262/language/statements/for/S12.6.3_A7.1_T2.js",
"test262/language/statements/for/head-const-bound-names-in-stmt.js",
"test262/language/statements/for/dstr-const-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/for/dstr-var-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/for/dstr-const-ary-ptrn-rest-init-ary.js",
"test262/language/statements/for/dstr-var-ary-ptrn-rest-init-obj.js",
"test262/language/statements/for/S12.6.3_A8.1_T3.js",
"test262/language/statements/for/labelled-fn-stmt-expr.js",
"test262/language/statements/for/decl-async-fun.js",
"test262/language/statements/for/S12.6.3_A7_T2.js",
"test262/language/statements/for/decl-async-gen.js",
"test262/language/statements/for/S12.6.3_A12_T3.js",
"test262/language/statements/labeled/let-array-with-newline.js",
"test262/language/statements/labeled/decl-const.js",
"test262/language/statements/labeled/decl-async-function.js",
"test262/language/statements/labeled/value-yield-strict.js",
"test262/language/statements/labeled/value-await-module.js",
"test262/language/statements/labeled/value-await-module-escaped.js",
"test262/language/statements/labeled/decl-gen.js",
"test262/language/statements/labeled/decl-cls.js",
"test262/language/statements/labeled/decl-let.js",
"test262/language/statements/labeled/decl-async-generator.js",
"test262/language/statements/labeled/continue.js",
"test262/language/statements/labeled/decl-fun-strict.js",
"test262/language/statements/labeled/value-yield-strict-escaped.js",
"test262/language/statements/switch/S12.11_A3_T5.js",
"test262/language/statements/switch/S12.11_A3_T2.js",
"test262/language/statements/switch/S12.11_A3_T4.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/statements/switch/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/statements/switch/S12.11_A3_T3.js",
"test262/language/statements/switch/S12.11_A2_T1.js",
"test262/language/statements/switch/S12.11_A3_T1.js",
"test262/language/statements/function/name-eval-strict.js",
"test262/language/statements/function/early-body-super-call.js",
"test262/language/statements/function/name-arguments-strict.js",
"test262/language/statements/function/param-arguments-strict.js",
"test262/language/statements/function/dstr-ary-ptrn-rest-init-id.js",
"test262/language/statements/function/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/function/enable-strict-via-outer-script.js",
"test262/language/statements/function/param-duplicated-strict-body-2.js",
"test262/language/statements/function/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/statements/function/enable-strict-via-body.js",
"test262/language/statements/function/early-body-super-prop.js",
"test262/language/statements/function/early-params-super-call.js",
"test262/language/statements/function/invalid-name-dot.js",
"test262/language/statements/function/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/statements/function/S13_A7_T3.js",
"test262/language/statements/function/param-eval-strict-body.js",
"test262/language/statements/function/13.1-5gs.js",
"test262/language/statements/function/invalid-name-two-dots.js",
"test262/language/statements/function/param-eval-strict.js",
"test262/language/statements/function/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/function/invalid-3-names.js",
"test262/language/statements/function/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/function/13.1-13gs.js",
"test262/language/statements/function/name-eval-strict-body.js",
"test262/language/statements/function/param-duplicated-strict-body-3.js",
"test262/language/statements/function/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/statements/function/param-arguments-strict-body.js",
"test262/language/statements/function/13.1-1gs.js",
"test262/language/statements/function/dflt-params-rest.js",
"test262/language/statements/function/enable-strict-via-outer-body.js",
"test262/language/statements/function/param-duplicated-strict-1.js",
"test262/language/statements/function/13.1-4gs.js",
"test262/language/statements/function/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/statements/function/use-strict-with-non-simple-param.js",
"test262/language/statements/function/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/statements/function/dflt-params-duplicates.js",
"test262/language/statements/function/13.1-8gs.js",
"test262/language/statements/function/name-arguments-strict-body.js",
"test262/language/statements/function/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/statements/function/13.0_4-5gs.js",
"test262/language/statements/function/invalid-function-body-1.js",
"test262/language/statements/function/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/statements/function/param-dflt-yield-strict.js",
"test262/language/statements/function/invalid-2-names.js",
"test262/language/statements/function/param-duplicated-strict-2.js",
"test262/language/statements/function/param-duplicated-strict-body-1.js",
"test262/language/statements/function/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/statements/function/invalid-function-body-2.js",
"test262/language/statements/function/invalid-function-body-3.js",
"test262/language/statements/function/param-duplicated-strict-3.js",
"test262/language/statements/function/rest-params-trailing-comma-early-error.js",
"test262/language/statements/function/early-params-super-prop.js",
"test262/language/statements/debugger/expression.js",
"test262/language/export/escaped-as-export-specifier.js",
"test262/language/export/escaped-from.js",
"test262/language/export/escaped-default.js",
"test262/language/future-reserved-words/private-strict.js",
"test262/language/future-reserved-words/let-strict.js",
"test262/language/future-reserved-words/public-strict.js",
"test262/language/future-reserved-words/export.js",
"test262/language/future-reserved-words/debugger.js",
"test262/language/future-reserved-words/public-strict-escaped.js",
"test262/language/future-reserved-words/const.js",
"test262/language/future-reserved-words/protected-strict.js",
"test262/language/future-reserved-words/interface-strict.js",
"test262/language/future-reserved-words/implements-strict.js",
"test262/language/future-reserved-words/interface-strict-escaped.js",
"test262/language/future-reserved-words/static-strict.js",
"test262/language/future-reserved-words/static-strict-escaped.js",
"test262/language/future-reserved-words/enum.js",
"test262/language/future-reserved-words/implements-strict-escaped.js",
"test262/language/future-reserved-words/let-strict-escaped.js",
"test262/language/future-reserved-words/package-strict-escaped.js",
"test262/language/future-reserved-words/package-strict.js",
"test262/language/future-reserved-words/class.js",
"test262/language/future-reserved-words/extends.js",
"test262/language/future-reserved-words/private-strict-escaped.js",
"test262/language/future-reserved-words/super.js",
"test262/language/future-reserved-words/protected-strict-escaped.js",
"test262/language/future-reserved-words/yield-strict-escaped.js",
"test262/language/future-reserved-words/import.js",
"test262/language/future-reserved-words/yield-strict.js",
"test262/language/rest-parameters/position-invalid.js",
"test262/language/identifiers/val-class.js",
"test262/language/identifiers/val-this.js",
"test262/language/identifiers/val-do.js",
"test262/language/identifiers/val-catch-via-escape-hex.js",
"test262/language/identifiers/val-debugger.js",
"test262/language/identifiers/vertical-tilde-continue-escaped.js",
"test262/language/identifiers/val-typeof-via-escape-hex4.js",
"test262/language/identifiers/val-switch.js",
"test262/language/identifiers/val-false-via-escape-hex.js",
"test262/language/identifiers/val-new-via-escape-hex4.js",
"test262/language/identifiers/val-typeof.js",
"test262/language/identifiers/val-false-via-escape-hex4.js",
"test262/language/identifiers/val-do-via-escape-hex4.js",
"test262/language/identifiers/val-finally-via-escape-hex4.js",
"test262/language/identifiers/val-super.js",
"test262/language/identifiers/val-new-via-escape-hex.js",
"test262/language/identifiers/val-case-via-escape-hex.js",
"test262/language/identifiers/val-with-via-escape-hex.js",
"test262/language/identifiers/val-true-via-escape-hex.js",
"test262/language/identifiers/val-break-via-escape-hex.js",
"test262/language/identifiers/val-const.js",
"test262/language/identifiers/val-while-via-escape-hex4.js",
"test262/language/identifiers/val-default-via-escape-hex.js",
"test262/language/identifiers/val-this-via-escape-hex4.js",
"test262/language/identifiers/val-while.js",
"test262/language/identifiers/val-true.js",
"test262/language/identifiers/val-extends.js",
"test262/language/identifiers/val-enum.js",
"test262/language/identifiers/val-instanceof-via-escape-hex4.js",
"test262/language/identifiers/val-function-via-escape-hex4.js",
"test262/language/identifiers/val-new.js",
"test262/language/identifiers/val-export-via-escape-hex.js",
"test262/language/identifiers/unicode-escape-nls-err.js",
"test262/language/identifiers/val-try-via-escape-hex.js",
"test262/language/identifiers/val-finally-via-escape-hex.js",
"test262/language/identifiers/val-if-via-escape-hex4.js",
"test262/language/identifiers/val-instanceof-via-escape-hex.js",
"test262/language/identifiers/val-else-via-escape-hex.js",
"test262/language/identifiers/val-delete-via-escape-hex4.js",
"test262/language/identifiers/val-function.js",
"test262/language/identifiers/val-var-via-escape-hex4.js",
"test262/language/identifiers/val-with.js",
"test262/language/identifiers/val-return.js",
"test262/language/identifiers/val-while-via-escape-hex.js",
"test262/language/identifiers/val-finally.js",
"test262/language/identifiers/val-default.js",
"test262/language/identifiers/val-default-via-escape-hex4.js",
"test262/language/identifiers/val-this-via-escape-hex.js",
"test262/language/identifiers/val-throw-via-escape-hex.js",
"test262/language/identifiers/val-return-via-escape-hex4.js",
"test262/language/identifiers/val-for-via-escape-hex.js",
"test262/language/identifiers/val-extends-via-escape-hex.js",
"test262/language/identifiers/val-const-via-escape-hex4.js",
"test262/language/identifiers/val-else.js",
"test262/language/identifiers/val-import-via-escape-hex4.js",
"test262/language/identifiers/val-function-via-escape-hex.js",
"test262/language/identifiers/val-in-via-escape-hex4.js",
"test262/language/identifiers/val-do-via-escape-hex.js",
"test262/language/identifiers/val-break.js",
"test262/language/identifiers/val-super-via-escape-hex4.js",
"test262/language/identifiers/val-instanceof.js",
"test262/language/identifiers/val-return-via-escape-hex.js",
"test262/language/identifiers/val-if.js",
"test262/language/identifiers/vertical-tilde-start-escaped.js",
"test262/language/identifiers/vertical-tilde-start.js",
"test262/language/identifiers/val-import.js",
"test262/language/identifiers/val-delete-via-escape-hex.js",
"test262/language/identifiers/val-continue.js",
"test262/language/identifiers/val-false.js",
"test262/language/identifiers/val-case-via-escape-hex4.js",
"test262/language/identifiers/val-try-via-escape-hex4.js",
"test262/language/identifiers/val-case.js",
"test262/language/identifiers/val-in.js",
"test262/language/identifiers/val-enum-via-escape-hex4.js",
"test262/language/identifiers/val-super-via-escape-hex.js",
"test262/language/identifiers/val-throw-via-escape-hex4.js",
"test262/language/identifiers/val-switch-via-escape-hex4.js",
"test262/language/identifiers/val-null-via-escape-hex4.js",
"test262/language/identifiers/val-export.js",
"test262/language/identifiers/val-in-via-escape-hex.js",
"test262/language/identifiers/val-throw.js",
"test262/language/identifiers/val-catch.js",
"test262/language/identifiers/val-else-via-escape-hex4.js",
"test262/language/identifiers/val-var-via-escape-hex.js",
"test262/language/identifiers/val-continue-via-escape-hex4.js",
"test262/language/identifiers/val-switch-via-escape-hex.js",
"test262/language/identifiers/val-var.js",
"test262/language/identifiers/val-export-via-escape-hex4.js",
"test262/language/identifiers/val-const-via-escape-hex.js",
"test262/language/identifiers/val-void.js",
"test262/language/identifiers/vertical-tilde-continue.js",
"test262/language/identifiers/val-null.js",
"test262/language/identifiers/val-delete.js",
"test262/language/identifiers/val-enum-via-escape-hex.js",
"test262/language/identifiers/val-for-via-escape-hex4.js",
"test262/language/identifiers/val-void-via-escape-hex4.js",
"test262/language/identifiers/val-if-via-escape-hex.js",
"test262/language/identifiers/val-class-via-escape-hex4.js",
"test262/language/identifiers/val-catch-via-escape-hex4.js",
"test262/language/identifiers/val-try.js",
"test262/language/identifiers/val-yield-strict.js",
"test262/language/identifiers/val-true-via-escape-hex4.js",
"test262/language/identifiers/val-typeof-via-escape-hex.js",
"test262/language/identifiers/val-with-via-escape-hex4.js",
"test262/language/identifiers/val-for.js",
"test262/language/identifiers/val-continue-via-escape-hex.js",
"test262/language/identifiers/val-extends-via-escape-hex4.js",
"test262/language/identifiers/val-debugger-via-escape-hex.js",
"test262/language/identifiers/val-import-via-escape-hex.js",
"test262/language/identifiers/val-class-via-escape-hex.js",
"test262/language/identifiers/val-break-via-escape-hex4.js",
"test262/language/identifiers/val-void-via-escape-hex.js",
"test262/language/identifiers/val-debugger-via-escape-hex4.js",
"test262/language/identifiers/val-null-via-escape-hex.js",
"test262/language/reserved-words/ident-reference-null-escaped.js",
"test262/language/reserved-words/label-ident-null.js",
"test262/language/reserved-words/label-ident-true-escaped.js",
"test262/language/reserved-words/await-module.js",
"test262/language/reserved-words/ident-reference-false.js",
"test262/language/reserved-words/ident-reference-true-escaped.js",
"test262/language/reserved-words/label-ident-false.js",
"test262/language/reserved-words/label-ident-false-escaped.js",
"test262/language/reserved-words/label-ident-null-escaped.js",
"test262/language/reserved-words/ident-reference-null.js",
"test262/language/reserved-words/label-ident-true.js",
"test262/language/reserved-words/ident-reference-true.js",
"test262/language/reserved-words/ident-reference-false-escaped.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-var-declaration-nested-in-function.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/generator-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/function-declaration-attempt-to-redeclare-with-var-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-generator-declaration-attempt-to-redeclare-with-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-class-declaration.js",
"test262/language/block-scope/syntax/redeclaration/let-declaration-attempt-to-redeclare-with-async-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/const-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/var-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-function-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-let-declaration.js",
"test262/language/block-scope/syntax/redeclaration/class-declaration-attempt-to-redeclare-with-async-generator-declaration.js",
"test262/language/block-scope/syntax/redeclaration/async-function-declaration-attempt-to-redeclare-with-const-declaration.js",
"test262/language/block-scope/syntax/for-in/disallow-multiple-lexical-bindings-without-and-with-initializer.js",
"test262/language/block-scope/syntax/for-in/disallow-initialization-assignment.js",
"test262/language/block-scope/syntax/for-in/disallow-multiple-lexical-bindings-with-and-without-initializer.js",
"test262/language/block-scope/syntax/for-in/disallow-multiple-lexical-bindings-with-initializer.js",
"test262/language/block-scope/syntax/for-in/disallow-multiple-lexical-bindings.js",
"test262/language/block-scope/syntax/function-declarations/in-statement-position-while-expression-statement.js",
"test262/language/block-scope/syntax/function-declarations/in-statement-position-if-expression-statement-else-statement.js",
"test262/language/block-scope/syntax/function-declarations/in-statement-position-if-expression-statement.js",
"test262/language/block-scope/syntax/function-declarations/in-statement-position-do-statement-while-expression.js",
"test262/language/block-scope/syntax/function-declarations/in-statement-position-for-statement.js",
"test262/language/types/boolean/S8.3_A2.2.js",
"test262/language/types/boolean/S8.3_A2.1.js",
"test262/language/types/null/S8.2_A2.js",
"test262/language/types/string/S8.4_A14_T3.js",
"test262/language/types/string/S8.4_A14_T2.js",
"test262/language/types/string/S8.4_A14_T1.js",
"test262/language/types/string/S8.4_A13_T2.js",
"test262/language/types/string/S8.4_A13_T3.js",
"test262/language/types/string/S8.4_A13_T1.js",
"test262/language/types/reference/S8.7.2_A1_T2.js",
"test262/language/types/reference/S8.7.2_A1_T1.js",
"test262/language/literals/string/S7.8.4_A1.1_T1.js",
"test262/language/literals/string/S7.8.4_A7.2_T4.js",
"test262/language/literals/string/S7.8.4_A3.1_T1.js",
"test262/language/literals/string/legacy-non-octal-escape-sequence-strict.js",
"test262/language/literals/string/S7.8.4_A1.1_T2.js",
"test262/language/literals/string/S7.8.4_A3.2_T2.js",
"test262/language/literals/string/unicode-escape-nls-err-double.js",
"test262/language/literals/string/S7.8.4_A7.2_T6.js",
"test262/language/literals/string/S7.8.4_A7.2_T3.js",
"test262/language/literals/string/S7.8.4_A3.1_T2.js",
"test262/language/literals/string/S7.8.4_A1.2_T1.js",
"test262/language/literals/string/S7.8.4_A7.2_T2.js",
"test262/language/literals/string/S7.8.4_A4.3_T1.js",
"test262/language/literals/string/S7.8.4_A4.3_T2.js",
"test262/language/literals/string/legacy-octal-escape-sequence-prologue-strict.js",
"test262/language/literals/string/S7.8.4_A3.2_T1.js",
"test262/language/literals/string/unicode-escape-nls-err-single.js",
"test262/language/literals/string/S7.8.4_A7.2_T5.js",
"test262/language/literals/string/S7.8.4_A7.1_T4.js",
"test262/language/literals/string/S7.8.4_A1.2_T2.js",
"test262/language/literals/string/S7.8.4_A7.2_T1.js",
"test262/language/literals/string/legacy-octal-escape-sequence-strict.js",
"test262/language/literals/regexp/S7.8.5_A1.2_T4.js",
"test262/language/literals/regexp/invalid-braced-quantifier-lower.js",
"test262/language/literals/regexp/invalid-range-negative-lookbehind.js",
"test262/language/literals/regexp/u-invalid-optional-lookahead.js",
"test262/language/literals/regexp/invalid-range-lookbehind.js",
"test262/language/literals/regexp/u-dec-esc.js",
"test262/language/literals/regexp/u-invalid-extended-pattern-char.js",
"test262/language/literals/regexp/u-invalid-non-empty-class-ranges-no-dash-a.js",
"test262/language/literals/regexp/early-err-dup-flag.js",
"test262/language/literals/regexp/regexp-source-char-no-paragraph-separator.js",
"test262/language/literals/regexp/S7.8.5_A2.5_T3.js",
"test262/language/literals/regexp/u-invalid-identity-escape.js",
"test262/language/literals/regexp/S7.8.5_A1.5_T3.js",
"test262/language/literals/regexp/regexp-source-char-no-line-separator.js",
"test262/language/literals/regexp/invalid-braced-quantifier-range.js",
"test262/language/literals/regexp/S7.8.5_A1.2_T3.js",
"test262/language/literals/regexp/u-invalid-optional-negative-lookahead.js",
"test262/language/literals/regexp/unicode-escape-nls-err.js",
"test262/language/literals/regexp/u-invalid-range-lookbehind.js",
"test262/language/literals/regexp/u-invalid-range-negative-lookbehind.js",
"test262/language/literals/regexp/S7.8.5_A1.3_T3.js",
"test262/language/literals/regexp/u-invalid-optional-negative-lookbehind.js",
"test262/language/literals/regexp/S7.8.5_A2.3_T3.js",
"test262/language/literals/regexp/u-invalid-legacy-octal-escape.js",
"test262/language/literals/regexp/S7.8.5_A2.2_T1.js",
"test262/language/literals/regexp/early-err-bad-flag.js",
"test262/language/literals/regexp/u-invalid-range-negative-lookahead.js",
"test262/language/literals/regexp/early-err-flags-unicode-escape.js",
"test262/language/literals/regexp/u-invalid-optional-lookbehind.js",
"test262/language/literals/regexp/invalid-optional-lookbehind.js",
"test262/language/literals/regexp/u-unicode-esc-bounds.js",
"test262/language/literals/regexp/u-unicode-esc-non-hex.js",
"test262/language/literals/regexp/S7.8.5_A1.3_T1.js",
"test262/language/literals/regexp/invalid-braced-quantifier-exact.js",
"test262/language/literals/regexp/S7.8.5_A1.2_T2.js",
"test262/language/literals/regexp/regexp-first-char-no-line-separator.js",
"test262/language/literals/regexp/early-err-pattern.js",
"test262/language/literals/regexp/S7.8.5_A2.3_T1.js",
"test262/language/literals/regexp/u-invalid-non-empty-class-ranges-no-dash-b.js",
"test262/language/literals/regexp/S7.8.5_A1.2_T1.js",
"test262/language/literals/regexp/regexp-first-char-no-paragraph-separator.js",
"test262/language/literals/regexp/u-invalid-non-empty-class-ranges-no-dash-ab.js",
"test262/language/literals/regexp/S7.8.5_A2.2_T2.js",
"test262/language/literals/regexp/u-invalid-non-empty-class-ranges.js",
"test262/language/literals/regexp/invalid-optional-negative-lookbehind.js",
"test262/language/literals/regexp/u-invalid-class-escape.js",
"test262/language/literals/regexp/S7.8.5_A2.5_T1.js",
"test262/language/literals/regexp/S7.8.5_A1.5_T1.js",
"test262/language/literals/regexp/u-invalid-oob-decimal-escape.js",
"test262/language/literals/regexp/u-invalid-range-lookahead.js",
"test262/language/literals/numeric/numeric-followed-by-ident.js",
"test262/language/literals/numeric/binary-invalid-digit.js",
"test262/language/literals/numeric/7.8.3-1gs.js",
"test262/language/literals/numeric/numeric-separator-literal-dd-nsl-err.js",
"test262/language/literals/numeric/S7.8.3_A6.1_T1.js",
"test262/language/literals/numeric/non-octal-decimal-integer-strict.js",
"test262/language/literals/numeric/octal-invalid-unicode.js",
"test262/language/literals/numeric/numeric-separator-literal-bil-nsl-bd-err.js",
"test262/language/literals/numeric/octal-invalid-digit.js",
"test262/language/literals/numeric/numeric-separator-literal-dot-nsl-err.js",
"test262/language/literals/numeric/numeric-separator-literal-dot-nsl-ep-err.js",
"test262/language/literals/numeric/numeric-separator-literal-nzd-nsl-dds-leading-zero-err.js",
"test262/language/literals/numeric/numeric-separator-literal-nzd-nsl-dds-dunder-err.js",
"test262/language/literals/numeric/binary-invalid-unicode.js",
"test262/language/literals/numeric/numeric-separator-literal-hil-nsl-hd-dunder-err.js",
"test262/language/literals/numeric/S7.8.3_A6.1_T2.js",
"test262/language/literals/numeric/7.8.3-2gs.js",
"test262/language/literals/numeric/numeric-separator-literal-oil-nsl-od-dunder-err.js",
"test262/language/literals/numeric/numeric-separator-literal-hil-nsl-hd-err.js",
"test262/language/literals/numeric/numeric-separator-literal-oil-od-nsl-od-err.js",
"test262/language/literals/numeric/numeric-separator-literal-dds-nsl-err.js",
"test262/language/literals/numeric/octal-invalid-truncated.js",
"test262/language/literals/numeric/binary-invalid-truncated.js",
"test262/language/literals/numeric/numeric-separator-literal-dds-nsl-dds-dunder-err.js",
"test262/language/literals/numeric/numeric-separator-literal-dot-dds-nsl-ep-err.js",
"test262/language/literals/numeric/numeric-separator-literal-oil-nsl-od-err.js",
"test262/language/literals/numeric/binary-invalid-leading.js",
"test262/language/literals/numeric/numeric-separator-literal-bil-bd-nsl-bd-err.js",
"test262/language/literals/numeric/S7.8.3_A6.2_T2.js",
"test262/language/literals/numeric/numeric-separator-literal-hil-hd-nsl-hd-err.js",
"test262/language/literals/numeric/legacy-octal-integer-strict.js",
"test262/language/literals/numeric/numeric-separator-literal-dil-dot-dds-nsl-ep-dd-err.js",
"test262/language/literals/numeric/numeric-separator-literal-bil-nsl-bd-dunder-err.js",
"test262/language/literals/numeric/octal-invalid-leading.js",
"test262/language/literals/numeric/numeric-separator-literal-dil-dot-nsl-ep-err.js",
"test262/language/literals/numeric/numeric-separator-literal-unicode-err.js",
"test262/language/literals/numeric/numeric-separator-literal-dil-dot-nsl-err.js",
"test262/language/literals/numeric/S7.8.3_A6.2_T1.js",
"test262/language/literals/numeric/numeric-separator-literal-dd-nsl-dds-dunder-err.js",
"test262/language/literals/bigint/binary-invalid-digit.js",
"test262/language/literals/bigint/mv-is-not-integer-dot-dds.js",
"test262/language/literals/bigint/mv-is-not-integer-dil-dot-dds.js",
"test262/language/literals/bigint/octal-invalid-digit.js",
"test262/language/literals/bigint/hexadecimal-invalid-digit.js",
"test262/language/literals/bigint/exponent-part.js",
"test262/language/white-space/S7.2_A5_T5.js",
"test262/language/white-space/mongolian-vowel-separator.js",
"test262/language/white-space/S7.2_A5_T1.js",
"test262/language/white-space/S7.2_A5_T3.js",
"test262/language/white-space/S7.2_A5_T4.js",
"test262/language/white-space/S7.2_A5_T2.js",
"test262/language/arguments-object/10.5-1gs.js",
"test262/language/expressions/compound-assignment/add-non-simple.js",
"test262/language/expressions/compound-assignment/left-shift-non-simple.js",
"test262/language/expressions/compound-assignment/btws-and-non-simple.js",
"test262/language/expressions/compound-assignment/div-non-simple.js",
"test262/language/expressions/compound-assignment/right-shift-non-simple.js",
"test262/language/expressions/compound-assignment/11.13.2-6-1gs.js",
"test262/language/expressions/compound-assignment/subtract-non-simple.js",
"test262/language/expressions/compound-assignment/btws-or-non-simple.js",
"test262/language/expressions/compound-assignment/mod-div-non-simple.js",
"test262/language/expressions/compound-assignment/u-right-shift-non-simple.js",
"test262/language/expressions/compound-assignment/btws-xor-non-simple.js",
"test262/language/expressions/compound-assignment/mult-non-simple.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-typeof-unary-expression-base.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-void-unary-expression-base.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-bitnot-unary-expression-base.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-negate-unary-expression-base.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-logical-not-unary-expression-base.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-plus-unary-expression-base.js",
"test262/language/expressions/exponentiation/exp-operator-syntax-error-delete-unary-expression-base.js",
"test262/language/expressions/async-arrow-function/await-as-identifier-reference.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-await-in-formals-default.js",
"test262/language/expressions/async-arrow-function/await-as-param-ident-nested-arrow-parameter-position.js",
"test262/language/expressions/async-arrow-function/await-as-label-identifier-escaped.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-formals-body-duplicate.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-formals-contains-super-property.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-duplicate-parameters.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-formals-lineterminator.js",
"test262/language/expressions/async-arrow-function/await-as-binding-identifier.js",
"test262/language/expressions/async-arrow-function/escaped-async.js",
"test262/language/expressions/async-arrow-function/dflt-params-rest.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-body-contains-super-call.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-formals-contains-super-call.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-eval-in-formal-parameters.js",
"test262/language/expressions/async-arrow-function/await-as-param-nested-arrow-parameter-position.js",
"test262/language/expressions/async-arrow-function/dflt-params-duplicates.js",
"test262/language/expressions/async-arrow-function/await-as-param-rest-nested-arrow-parameter-position.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-arguments-in-formal-parameters.js",
"test262/language/expressions/async-arrow-function/await-as-label-identifier.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-body-contains-super-property.js",
"test262/language/expressions/async-arrow-function/await-as-binding-identifier-escaped.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-NSPL-with-USD.js",
"test262/language/expressions/async-arrow-function/await-as-param-nested-arrow-body-position.js",
"test262/language/expressions/async-arrow-function/rest-params-trailing-comma-early-error.js",
"test262/language/expressions/async-arrow-function/await-as-identifier-reference-escaped.js",
"test262/language/expressions/async-arrow-function/early-errors-arrow-await-in-formals.js",
"test262/language/expressions/prefix-decrement/non-simple.js",
"test262/language/expressions/prefix-decrement/11.4.5-2-2gs.js",
"test262/language/expressions/prefix-decrement/target-newtarget.js",
"test262/language/expressions/prefix-decrement/target-cover-yieldexpr.js",
"test262/language/expressions/prefix-decrement/target-cover-newtarget.js",
"test262/language/expressions/postfix-decrement/non-simple.js",
"test262/language/expressions/postfix-decrement/target-newtarget.js",
"test262/language/expressions/postfix-decrement/target-cover-yieldexpr.js",
"test262/language/expressions/postfix-decrement/target-cover-newtarget.js",
"test262/language/expressions/async-function/early-errors-expression-NSPL-with-USD.js",
"test262/language/expressions/async-function/await-as-identifier-reference.js",
"test262/language/expressions/async-function/early-errors-expression-binding-identifier-arguments.js",
"test262/language/expressions/async-function/named-await-as-identifier-reference-escaped.js",
"test262/language/expressions/async-function/early-errors-expression-binding-identifier-eval.js",
"test262/language/expressions/async-function/early-errors-expression-body-contains-super-property.js",
"test262/language/expressions/async-function/early-errors-expression-formals-body-duplicate.js",
"test262/language/expressions/async-function/await-as-label-identifier-escaped.js",
"test262/language/expressions/async-function/early-errors-expression-body-contains-super-call.js",
"test262/language/expressions/async-function/named-await-as-binding-identifier-escaped.js",
"test262/language/expressions/async-function/early-errors-expression-formals-contains-super-call.js",
"test262/language/expressions/async-function/early-errors-expression-not-simple-assignment-target.js",
"test262/language/expressions/async-function/named-await-as-label-identifier-escaped.js",
"test262/language/expressions/async-function/await-as-binding-identifier.js",
"test262/language/expressions/async-function/escaped-async.js",
"test262/language/expressions/async-function/named-await-as-binding-identifier.js",
"test262/language/expressions/async-function/named-dflt-params-duplicates.js",
"test262/language/expressions/async-function/await-as-label-identifier.js",
"test262/language/expressions/async-function/nameless-dflt-params-duplicates.js",
"test262/language/expressions/async-function/await-as-binding-identifier-escaped.js",
"test262/language/expressions/async-function/early-errors-expression-eval-in-formal-parameters.js",
"test262/language/expressions/async-function/named-await-as-identifier-reference.js",
"test262/language/expressions/async-function/named-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/async-function/nameless-dflt-params-rest.js",
"test262/language/expressions/async-function/await-as-identifier-reference-escaped.js",
"test262/language/expressions/async-function/named-await-as-label-identifier.js",
"test262/language/expressions/async-function/named-dflt-params-rest.js",
"test262/language/expressions/async-function/nameless-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/async-function/early-errors-expression-formals-contains-super-property.js",
"test262/language/expressions/delete/11.4.1-5-a-5gs.js",
"test262/language/expressions/delete/identifier-strict.js",
"test262/language/expressions/class/async-gen-method-static-await-as-binding-identifier-escaped.js",
"test262/language/expressions/class/async-gen-method-static-await-as-identifier-reference-escaped.js",
"test262/language/expressions/class/async-meth-static-dflt-params-rest.js",
"test262/language/expressions/class/fields-arrow-fnc-init-err-contains-super.js",
"test262/language/expressions/class/async-gen-method-static-yield-as-label-identifier.js",
"test262/language/expressions/class/dstr-async-gen-meth-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/gen-method-static-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/class/meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/fields-asi-3.js",
"test262/language/expressions/class/fields-private-ternary-init-err-contains-super.js",
"test262/language/expressions/class/dstr-meth-static-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/dstr-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/meth-static-dflt-params-duplicates.js",
"test262/language/expressions/class/async-gen-method-yield-as-label-identifier.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/gen-method-yield-as-identifier-reference.js",
"test262/language/expressions/class/dstr-gen-meth-static-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/dstr-async-gen-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/fields-duplicate-privatenames.js",
"test262/language/expressions/class/dstr-meth-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/gen-method-yield-as-label-identifier.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/fields-string-literal-name-init-err-contains-arguments.js",
"test262/language/expressions/class/async-method-await-as-binding-identifier.js",
"test262/language/expressions/class/async-gen-method-static-await-as-identifier-reference.js",
"test262/language/expressions/class/class-name-ident-let.js",
"test262/language/expressions/class/meth-static-dflt-params-rest.js",
"test262/language/expressions/class/async-gen-method-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/class/dstr-meth-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/gen-meth-dflt-params-duplicates.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/err-method-delete-twice-covered-call-expression-privatename.js",
"test262/language/expressions/class/method-param-dflt-yield.js",
"test262/language/expressions/class/dstr-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/dstr-async-gen-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/async-method-static-await-as-identifier-reference-escaped.js",
"test262/language/expressions/class/gen-meth-static-dflt-params-rest.js",
"test262/language/expressions/class/fields-equality-init-err-contains-super.js",
"test262/language/expressions/class/async-method-await-as-identifier-reference.js",
"test262/language/expressions/class/getter-param-dflt.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/dstr-gen-meth-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/dstr-meth-static-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/gen-method-static-yield-as-label-identifier-escaped.js",
"test262/language/expressions/class/dstr-meth-static-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/gen-method-static-yield-as-label-identifier.js",
"test262/language/expressions/class/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/dstr-meth-static-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/gen-method-static-yield-identifier-spread-strict.js",
"test262/language/expressions/class/dstr-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/async-gen-method-static-yield-as-identifier-reference.js",
"test262/language/expressions/class/gen-method-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/class/gen-meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/dstr-meth-static-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/fields-private-arrow-fnc-init-err-contains-super.js",
"test262/language/expressions/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/meth-dflt-params-rest.js",
"test262/language/expressions/class/async-gen-method-static-await-as-label-identifier.js",
"test262/language/expressions/class/class-name-ident-yield.js",
"test262/language/expressions/class/async-method-await-as-label-identifier.js",
"test262/language/expressions/class/err-method-delete-covered-call-expression-privatename.js",
"test262/language/expressions/class/gen-method-yield-as-label-identifier-escaped.js",
"test262/language/expressions/class/dstr-async-gen-meth-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/fields-comp-name-init-err-contains-super.js",
"test262/language/expressions/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/dstr-async-gen-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/async-method-static-await-as-binding-identifier.js",
"test262/language/expressions/class/dstr-gen-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/fields-literal-name-init-err-contains-super.js",
"test262/language/expressions/class/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/err-field-delete-twice-covered-call-expression-privatename.js",
"test262/language/expressions/class/async-meth-static-dflt-params-duplicates.js",
"test262/language/expressions/class/class-name-ident-await-escaped-module.js",
"test262/language/expressions/class/gen-meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/async-gen-method-static-yield-identifier-spread-strict.js",
"test262/language/expressions/class/gen-method-yield-identifier-spread-strict.js",
"test262/language/expressions/class/gen-method-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/class/err-method-delete-twice-covered-member-expression-privatename.js",
"test262/language/expressions/class/dstr-meth-static-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/async-meth-dflt-params-duplicates.js",
"test262/language/expressions/class/async-method-static-await-as-binding-identifier-escaped.js",
"test262/language/expressions/class/gen-method-yield-identifier-strict.js",
"test262/language/expressions/class/meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/dstr-meth-static-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/async-gen-method-await-as-identifier-reference.js",
"test262/language/expressions/class/gen-method-static-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/class/fields-ternary-init-err-contains-arguments.js",
"test262/language/expressions/class/async-gen-method-yield-as-binding-identifier.js",
"test262/language/expressions/class/async-gen-meth-dflt-params-duplicates.js",
"test262/language/expressions/class/class-name-ident-let-escaped.js",
"test262/language/expressions/class/async-gen-meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/async-gen-meth-dflt-params-rest.js",
"test262/language/expressions/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/gen-method-static-yield-as-identifier-reference.js",
"test262/language/expressions/class/fields-arrow-fnc-init-err-contains-arguments.js",
"test262/language/expressions/class/class-name-ident-static-escaped.js",
"test262/language/expressions/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/err-method-delete-covered-member-expression-privatename.js",
"test262/language/expressions/class/fields-private-typeof-init-err-contains-arguments.js",
"test262/language/expressions/class/err-field-delete-covered-member-expression-privatename.js",
"test262/language/expressions/class/dstr-meth-static-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/fields-asi-4.js",
"test262/language/expressions/class/async-gen-method-static-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/class/async-gen-method-static-await-as-binding-identifier.js",
"test262/language/expressions/class/async-gen-method-await-as-binding-identifier.js",
"test262/language/expressions/class/dstr-gen-meth-static-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/gen-method-static-yield-as-binding-identifier.js",
"test262/language/expressions/class/fields-equality-init-err-contains-arguments.js",
"test262/language/expressions/class/fields-literal-name-propname-constructor.js",
"test262/language/expressions/class/async-gen-method-await-as-label-identifier-escaped.js",
"test262/language/expressions/class/fields-typeof-init-err-contains-arguments.js",
"test262/language/expressions/class/async-gen-method-await-as-label-identifier.js",
"test262/language/expressions/class/err-field-delete-call-expression-privatename.js",
"test262/language/expressions/class/async-gen-method-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/class/dstr-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/fields-typeof-init-err-contains-super.js",
"test262/language/expressions/class/dstr-gen-meth-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/err-field-delete-twice-covered-member-expression-privatename.js",
"test262/language/expressions/class/class-name-ident-yield-escaped.js",
"test262/language/expressions/class/async-gen-meth-static-dflt-params-rest.js",
"test262/language/expressions/class/async-method-static-await-as-identifier-reference.js",
"test262/language/expressions/class/gen-method-yield-as-binding-identifier.js",
"test262/language/expressions/class/dstr-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/async-gen-method-yield-as-identifier-reference.js",
"test262/language/expressions/class/fields-private-ternary-init-err-contains-arguments.js",
"test262/language/expressions/class/async-gen-method-static-await-as-label-identifier-escaped.js",
"test262/language/expressions/class/async-method-await-as-label-identifier-escaped.js",
"test262/language/expressions/class/async-gen-method-static-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/class/dstr-meth-static-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/async-gen-meth-static-dflt-params-duplicates.js",
"test262/language/expressions/class/dstr-gen-meth-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/dstr-gen-meth-static-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/static-gen-method-param-dflt-yield.js",
"test262/language/expressions/class/dstr-gen-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/fields-string-name-propname-constructor.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/dstr-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/dstr-gen-meth-static-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/async-method-await-as-identifier-reference-escaped.js",
"test262/language/expressions/class/async-meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/gen-method-static-yield-identifier-strict.js",
"test262/language/expressions/class/dstr-gen-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/async-gen-method-await-as-binding-identifier-escaped.js",
"test262/language/expressions/class/gen-meth-static-dflt-params-duplicates.js",
"test262/language/expressions/class/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/dstr-gen-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/async-method-static-await-as-label-identifier.js",
"test262/language/expressions/class/async-gen-meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/gen-method-param-dflt-yield.js",
"test262/language/expressions/class/dstr-meth-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/dstr-async-gen-meth-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/async-meth-static-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/class/fields-private-typeof-init-err-contains-super.js",
"test262/language/expressions/class/dstr-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/dstr-gen-meth-static-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/class/meth-dflt-params-duplicates.js",
"test262/language/expressions/class/err-field-delete-covered-call-expression-privatename.js",
"test262/language/expressions/class/fields-private-literal-name-init-err-contains-super.js",
"test262/language/expressions/class/dstr-meth-static-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/dstr-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/class-name-ident-await-module.js",
"test262/language/expressions/class/dstr-meth-static-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/class/class-name-ident-static.js",
"test262/language/expressions/class/fields-private-literal-name-init-err-contains-arguments.js",
"test262/language/expressions/class/async-gen-method-static-yield-as-label-identifier-escaped.js",
"test262/language/expressions/class/dstr-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/fields-privatename-constructor-err.js",
"test262/language/expressions/class/async-gen-method-await-as-identifier-reference-escaped.js",
"test262/language/expressions/class/async-method-await-as-binding-identifier-escaped.js",
"test262/language/expressions/class/async-gen-method-static-yield-as-binding-identifier.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/fields-ternary-init-err-contains-super.js",
"test262/language/expressions/class/fields-string-literal-name-init-err-contains-super.js",
"test262/language/expressions/class/fields-private-arrow-fnc-init-err-contains-arguments.js",
"test262/language/expressions/class/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/async-gen-method-static-yield-identifier-strict.js",
"test262/language/expressions/class/async-gen-method-yield-identifier-strict.js",
"test262/language/expressions/class/dstr-gen-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/dstr-gen-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/err-method-delete-call-expression-privatename.js",
"test262/language/expressions/class/dstr-gen-meth-static-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/class/async-gen-method-yield-as-label-identifier-escaped.js",
"test262/language/expressions/class/async-method-static-await-as-label-identifier-escaped.js",
"test262/language/expressions/class/fields-comp-name-init-err-contains-arguments.js",
"test262/language/expressions/class/async-meth-dflt-params-rest.js",
"test262/language/expressions/class/err-field-delete-member-expression-privatename.js",
"test262/language/expressions/class/dstr-gen-meth-static-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/class/fields-literal-name-init-err-contains-arguments.js",
"test262/language/expressions/class/dstr-async-gen-meth-static-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/class/gen-meth-dflt-params-rest.js",
"test262/language/expressions/class/err-method-delete-member-expression-privatename.js",
"test262/language/expressions/class/dstr-meth-static-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/class/static-method-param-dflt-yield.js",
"test262/language/expressions/class/async-gen-method-yield-identifier-spread-strict.js",
"test262/language/expressions/property-accessors/non-identifier-name.js",
"test262/language/expressions/new.target/escaped-new.js",
"test262/language/expressions/new.target/escaped-target.js",
"test262/language/expressions/generators/yield-as-label-identifier-escaped.js",
"test262/language/expressions/generators/dstr-ary-ptrn-rest-init-id.js",
"test262/language/expressions/generators/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/generators/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/generators/named-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/generators/yield-identifier-strict.js",
"test262/language/expressions/generators/yield-as-parameter.js",
"test262/language/expressions/generators/named-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/generators/yield-as-binding-identifier-escaped.js",
"test262/language/expressions/generators/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/generators/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/generators/yield-as-binding-identifier.js",
"test262/language/expressions/generators/named-yield-as-identifier-reference.js",
"test262/language/expressions/generators/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/generators/yield-as-identifier-reference.js",
"test262/language/expressions/generators/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/generators/named-yield-as-binding-identifier.js",
"test262/language/expressions/generators/dflt-params-rest.js",
"test262/language/expressions/generators/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/generators/named-yield-identifier-strict.js",
"test262/language/expressions/generators/use-strict-with-non-simple-param.js",
"test262/language/expressions/generators/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/generators/yield-star-after-newline.js",
"test262/language/expressions/generators/dflt-params-duplicates.js",
"test262/language/expressions/generators/yield-as-identifier-reference-escaped.js",
"test262/language/expressions/generators/named-yield-as-label-identifier-escaped.js",
"test262/language/expressions/generators/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/generators/named-yield-as-label-identifier.js",
"test262/language/expressions/generators/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/generators/yield-as-generator-expression-binding-identifier.js",
"test262/language/expressions/generators/param-dflt-yield.js",
"test262/language/expressions/generators/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/generators/yield-as-label-identifier.js",
"test262/language/expressions/generators/rest-params-trailing-comma-early-error.js",
"test262/language/expressions/generators/yield-as-logical-or-expression.js",
"test262/language/expressions/generators/yield-identifier-spread-strict.js",
"test262/language/expressions/generators/yield-weak-binding.js",
"test262/language/expressions/generators/named-yield-identifier-spread-strict.js",
"test262/language/expressions/this/S11.1.1_A1.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-6.js",
"test262/language/expressions/template-literal/invalid-hexidecimal-character-escape-sequence-truncated-1.js",
"test262/language/expressions/template-literal/invalid-hexidecimal-character-escape-sequence-truncated-3.js",
"test262/language/expressions/template-literal/invalid-legacy-octal-escape-sequence.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-3.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-8.js",
"test262/language/expressions/template-literal/unicode-escape-nls-err.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-7.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-5.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-4.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-2.js",
"test262/language/expressions/template-literal/invalid-hexidecimal-character-escape-sequence-truncated-2.js",
"test262/language/expressions/template-literal/invalid-unicode-escape-sequence-1.js",
"test262/language/expressions/postfix-increment/11.3.1-2-1gs.js",
"test262/language/expressions/postfix-increment/non-simple.js",
"test262/language/expressions/postfix-increment/target-newtarget.js",
"test262/language/expressions/postfix-increment/target-cover-yieldexpr.js",
"test262/language/expressions/postfix-increment/target-cover-newtarget.js",
"test262/language/expressions/call/S11.2.4_A1.3_T1.js",
"test262/language/expressions/async-generator/dstr-named-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/async-generator/early-errors-expression-NSPL-with-USD.js",
"test262/language/expressions/async-generator/dstr-named-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/async-generator/dstr-named-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/async-generator/yield-as-label-identifier-escaped.js",
"test262/language/expressions/async-generator/dstr-ary-ptrn-rest-init-id.js",
"test262/language/expressions/async-generator/early-errors-expression-yield-as-function-binding-identifier.js",
"test262/language/expressions/async-generator/await-as-identifier-reference.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-contains-yield-expr.js",
"test262/language/expressions/async-generator/early-errors-expression-arguments-in-formal-parameters.js",
"test262/language/expressions/async-generator/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/async-generator/early-errors-expression-binding-identifier-arguments.js",
"test262/language/expressions/async-generator/named-await-as-identifier-reference-escaped.js",
"test262/language/expressions/async-generator/dstr-named-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/async-generator/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/async-generator/named-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/async-generator/yield-identifier-strict.js",
"test262/language/expressions/async-generator/early-errors-expression-binding-identifier-eval.js",
"test262/language/expressions/async-generator/early-errors-expression-body-contains-super-property.js",
"test262/language/expressions/async-generator/named-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/async-generator/yield-as-binding-identifier-escaped.js",
"test262/language/expressions/async-generator/dstr-named-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/async-generator/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/async-generator/await-as-label-identifier-escaped.js",
"test262/language/expressions/async-generator/early-errors-expression-body-contains-super-call.js",
"test262/language/expressions/async-generator/named-await-as-binding-identifier-escaped.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-contains-await.js",
"test262/language/expressions/async-generator/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-contains-super-call.js",
"test262/language/expressions/async-generator/yield-as-binding-identifier.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-body-duplicate-let.js",
"test262/language/expressions/async-generator/early-errors-expression-yield-star-after-newline.js",
"test262/language/expressions/async-generator/early-errors-expression-not-simple-assignment-target.js",
"test262/language/expressions/async-generator/named-yield-as-identifier-reference.js",
"test262/language/expressions/async-generator/named-await-as-label-identifier-escaped.js",
"test262/language/expressions/async-generator/await-as-binding-identifier.js",
"test262/language/expressions/async-generator/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-body-duplicate-const.js",
"test262/language/expressions/async-generator/yield-as-identifier-reference.js",
"test262/language/expressions/async-generator/escaped-async.js",
"test262/language/expressions/async-generator/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/async-generator/named-yield-as-binding-identifier.js",
"test262/language/expressions/async-generator/named-await-as-binding-identifier.js",
"test262/language/expressions/async-generator/dflt-params-rest.js",
"test262/language/expressions/async-generator/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/async-generator/named-dflt-params-duplicates.js",
"test262/language/expressions/async-generator/dstr-named-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/async-generator/named-yield-identifier-strict.js",
"test262/language/expressions/async-generator/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/async-generator/dflt-params-duplicates.js",
"test262/language/expressions/async-generator/dstr-named-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/async-generator/yield-as-identifier-reference-escaped.js",
"test262/language/expressions/async-generator/named-yield-as-label-identifier-escaped.js",
"test262/language/expressions/async-generator/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/async-generator/await-as-label-identifier.js",
"test262/language/expressions/async-generator/await-as-binding-identifier-escaped.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-contains-await-expr.js",
"test262/language/expressions/async-generator/early-errors-expression-label-name-await.js",
"test262/language/expressions/async-generator/named-yield-as-label-identifier.js",
"test262/language/expressions/async-generator/dstr-named-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/async-generator/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/async-generator/early-errors-expression-eval-in-formal-parameters.js",
"test262/language/expressions/async-generator/named-await-as-identifier-reference.js",
"test262/language/expressions/async-generator/dstr-named-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/async-generator/early-errors-expression-label-name-yield.js",
"test262/language/expressions/async-generator/dstr-named-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/async-generator/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-contains-yield.js",
"test262/language/expressions/async-generator/named-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/async-generator/yield-as-label-identifier.js",
"test262/language/expressions/async-generator/dstr-named-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/async-generator/rest-params-trailing-comma-early-error.js",
"test262/language/expressions/async-generator/yield-identifier-spread-strict.js",
"test262/language/expressions/async-generator/dstr-named-ary-ptrn-rest-init-id.js",
"test262/language/expressions/async-generator/early-errors-expression-await-as-function-binding-identifier.js",
"test262/language/expressions/async-generator/await-as-identifier-reference-escaped.js",
"test262/language/expressions/async-generator/named-await-as-label-identifier.js",
"test262/language/expressions/async-generator/named-yield-identifier-spread-strict.js",
"test262/language/expressions/async-generator/named-dflt-params-rest.js",
"test262/language/expressions/async-generator/early-errors-expression-formals-contains-super-property.js",
"test262/language/expressions/await/no-operand.js",
"test262/language/expressions/await/await-BindingIdentifier-nested.js",
"test262/language/expressions/await/early-errors-await-not-simple-assignment-target.js",
"test262/language/expressions/arrow-function/param-dflt-yield-id-strict.js",
"test262/language/expressions/arrow-function/dstr-ary-ptrn-rest-init-id.js",
"test262/language/expressions/arrow-function/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/arrow-function/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/arrow-function/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/arrow-function/param-dflt-yield-expr.js",
"test262/language/expressions/arrow-function/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/arrow-function/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/arrow-function/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/arrow-function/dflt-params-rest.js",
"test262/language/expressions/arrow-function/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/arrow-function/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/arrow-function/dflt-params-duplicates.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-object-4.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-no-arguments.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-no-yield.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-no-eval.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-object-5.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-identifier-futurereservedword.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-rest.js",
"test262/language/expressions/arrow-function/syntax/early-errors/asi-restriction-invalid-parenless-parameters.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-object-3.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-eval.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-identifier-strict-futurereservedword.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-array-1.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-identifier.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-arguments.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-array-3.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-object-2.js",
"test262/language/expressions/arrow-function/syntax/early-errors/use-strict-with-non-simple-param.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-object-6.js",
"test262/language/expressions/arrow-function/syntax/early-errors/asi-restriction-invalid-parenless-parameters-expression-body.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-bindingidentifier-rest.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-array-2.js",
"test262/language/expressions/arrow-function/syntax/early-errors/asi-restriction-invalid.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-duplicates-binding-object-1.js",
"test262/language/expressions/arrow-function/syntax/early-errors/arrowparameters-cover-no-yield.js",
"test262/language/expressions/arrow-function/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/arrow-function/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/arrow-function/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/arrow-function/rest-params-trailing-comma-early-error.js",
"test262/language/expressions/prefix-increment/non-simple.js",
"test262/language/expressions/prefix-increment/target-newtarget.js",
"test262/language/expressions/prefix-increment/target-cover-yieldexpr.js",
"test262/language/expressions/prefix-increment/target-cover-newtarget.js",
"test262/language/expressions/function/name-eval-strict.js",
"test262/language/expressions/function/early-body-super-call.js",
"test262/language/expressions/function/name-arguments-strict.js",
"test262/language/expressions/function/dstr-ary-ptrn-rest-init-id.js",
"test262/language/expressions/function/dstr-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/function/param-duplicated-strict-body-2.js",
"test262/language/expressions/function/dstr-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/function/early-body-super-prop.js",
"test262/language/expressions/function/early-params-super-call.js",
"test262/language/expressions/function/dstr-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/function/param-eval-strict-body.js",
"test262/language/expressions/function/dstr-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/function/dstr-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/function/name-eval-strict-body.js",
"test262/language/expressions/function/param-duplicated-strict-body-3.js",
"test262/language/expressions/function/dstr-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/function/dflt-params-rest.js",
"test262/language/expressions/function/param-duplicated-strict-1.js",
"test262/language/expressions/function/dstr-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/function/use-strict-with-non-simple-param.js",
"test262/language/expressions/function/dstr-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/function/dflt-params-duplicates.js",
"test262/language/expressions/function/name-arguments-strict-body.js",
"test262/language/expressions/function/dstr-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/function/dstr-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/function/param-dflt-yield-strict.js",
"test262/language/expressions/function/param-duplicated-strict-2.js",
"test262/language/expressions/function/param-duplicated-strict-body-1.js",
"test262/language/expressions/function/dstr-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/function/param-duplicated-strict-3.js",
"test262/language/expressions/function/rest-params-trailing-comma-early-error.js",
"test262/language/expressions/function/early-params-super-prop.js",
"test262/language/expressions/object/cover-initialized-name.js",
"test262/language/expressions/object/dstr-async-gen-meth-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/object/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/object/dstr-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/object/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/object/dstr-async-gen-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/object/getter-body-strict-inside.js",
"test262/language/expressions/object/setter-body-strict-inside.js",
"test262/language/expressions/object/dstr-meth-ary-ptrn-rest-init-id.js",
"test262/language/expressions/object/dstr-meth-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/object/setter-param-arguments-strict-outside.js",
"test262/language/expressions/object/dstr-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/object/dstr-async-gen-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/object/getter-param-dflt.js",
"test262/language/expressions/object/dstr-gen-meth-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/object/setter-param-eval-strict-inside.js",
"test262/language/expressions/object/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/object/dstr-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/object/prop-def-invalid-async-prefix.js",
"test262/language/expressions/object/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/object/setter-param-arguments-strict-inside.js",
"test262/language/expressions/object/dstr-async-gen-meth-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/object/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/object/dstr-async-gen-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/object/dstr-gen-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/object/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/object/setter-param-eval-strict-outside.js",
"test262/language/expressions/object/dstr-async-gen-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/object/setter-body-strict-outside.js",
"test262/language/expressions/object/dstr-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/object/dstr-gen-meth-ary-ptrn-rest-init-id.js",
"test262/language/expressions/object/identifier-shorthand-invalid-computed-name.js",
"test262/language/expressions/object/dstr-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/object/dstr-gen-meth-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/object/dstr-gen-meth-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/object/dstr-meth-dflt-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/object/dstr-gen-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/object/dstr-async-gen-meth-dflt-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/object/dstr-gen-meth-dflt-ary-ptrn-rest-init-obj.js",
"test262/language/expressions/object/method-definition/async-gen-await-as-binding-identifier.js",
"test262/language/expressions/object/method-definition/async-await-as-label-identifier.js",
"test262/language/expressions/object/method-definition/meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/object/method-definition/gen-yield-identifier-strict.js",
"test262/language/expressions/object/method-definition/async-gen-yield-as-identifier-reference.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-formals-contains-super-call.js",
"test262/language/expressions/object/method-definition/async-await-as-binding-identifier.js",
"test262/language/expressions/object/method-definition/name-param-redecl.js",
"test262/language/expressions/object/method-definition/generator-super-call-param.js",
"test262/language/expressions/object/method-definition/async-await-as-identifier-reference-escaped.js",
"test262/language/expressions/object/method-definition/gen-meth-dflt-params-duplicates.js",
"test262/language/expressions/object/method-definition/gen-yield-as-label-identifier.js",
"test262/language/expressions/object/method-definition/yield-as-parameter.js",
"test262/language/expressions/object/method-definition/async-gen-yield-as-label-identifier-escaped.js",
"test262/language/expressions/object/method-definition/async-await-as-identifier-reference.js",
"test262/language/expressions/object/method-definition/name-super-call-param.js",
"test262/language/expressions/object/method-definition/async-gen-yield-identifier-spread-strict.js",
"test262/language/expressions/object/method-definition/async-gen-await-as-identifier-reference-escaped.js",
"test262/language/expressions/object/method-definition/gen-yield-identifier-spread-strict.js",
"test262/language/expressions/object/method-definition/generator-param-redecl-let.js",
"test262/language/expressions/object/method-definition/gen-meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/object/method-definition/async-gen-yield-identifier-strict.js",
"test262/language/expressions/object/method-definition/meth-dflt-params-rest.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-body-contains-super-call.js",
"test262/language/expressions/object/method-definition/escaped-set.js",
"test262/language/expressions/object/method-definition/generator-param-init-yield.js",
"test262/language/expressions/object/method-definition/async-gen-await-as-label-identifier-escaped.js",
"test262/language/expressions/object/method-definition/generator-super-call-body.js",
"test262/language/expressions/object/method-definition/async-meth-dflt-params-duplicates.js",
"test262/language/expressions/object/method-definition/async-meth-escaped-async.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-async-lineterminator.js",
"test262/language/expressions/object/method-definition/async-await-as-binding-identifier-escaped.js",
"test262/language/expressions/object/method-definition/async-gen-meth-dflt-params-duplicates.js",
"test262/language/expressions/object/method-definition/async-gen-meth-escaped-async.js",
"test262/language/expressions/object/method-definition/async-gen-meth-dflt-params-rest.js",
"test262/language/expressions/object/method-definition/name-super-call-body.js",
"test262/language/expressions/object/method-definition/gen-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-eval-in-formal-parameters.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-await-in-formals-default.js",
"test262/language/expressions/object/method-definition/gen-yield-as-identifier-reference.js",
"test262/language/expressions/object/method-definition/async-gen-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/object/method-definition/generator-param-redecl-const.js",
"test262/language/expressions/object/method-definition/async-gen-await-as-identifier-reference.js",
"test262/language/expressions/object/method-definition/use-strict-with-non-simple-param.js",
"test262/language/expressions/object/method-definition/yield-star-after-newline.js",
"test262/language/expressions/object/method-definition/generator-use-strict-with-non-simple-param.js",
"test262/language/expressions/object/method-definition/gen-yield-as-identifier-reference-escaped.js",
"test262/language/expressions/object/method-definition/gen-yield-as-label-identifier-escaped.js",
"test262/language/expressions/object/method-definition/async-gen-yield-as-label-identifier.js",
"test262/language/expressions/object/method-definition/gen-yield-as-binding-identifier.js",
"test262/language/expressions/object/method-definition/setter-use-strict-with-non-simple-param.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-formals-body-duplicate.js",
"test262/language/expressions/object/method-definition/async-meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-NSPL-with-USD.js",
"test262/language/expressions/object/method-definition/async-gen-meth-rest-params-trailing-comma-early-error.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-arguments-in-formal-parameters.js",
"test262/language/expressions/object/method-definition/meth-dflt-params-duplicates.js",
"test262/language/expressions/object/method-definition/escaped-get.js",
"test262/language/expressions/object/method-definition/async-gen-await-as-binding-identifier-escaped.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-await-in-formals.js",
"test262/language/expressions/object/method-definition/early-errors-object-method-duplicate-parameters.js",
"test262/language/expressions/object/method-definition/async-gen-yield-as-binding-identifier-escaped.js",
"test262/language/expressions/object/method-definition/async-gen-await-as-label-identifier.js",
"test262/language/expressions/object/method-definition/yield-as-logical-or-expression.js",
"test262/language/expressions/object/method-definition/async-meth-dflt-params-rest.js",
"test262/language/expressions/object/method-definition/gen-meth-dflt-params-rest.js",
"test262/language/expressions/object/method-definition/yield-weak-binding.js",
"test262/language/expressions/object/method-definition/async-gen-yield-as-binding-identifier.js",
"test262/language/expressions/object/method-definition/async-await-as-label-identifier-escaped.js",
"test262/language/expressions/object/method-definition/generator-param-id-yield.js",
"test262/language/expressions/object/dstr-meth-ary-ptrn-rest-init-ary.js",
"test262/language/expressions/object/dstr-async-gen-meth-ary-ptrn-rest-init-id.js",
"test262/language/expressions/object/dstr-meth-dflt-ary-ptrn-rest-not-final-id.js",
"test262/language/expressions/object/getter-body-strict-outside.js",
"test262/language/expressions/object/dstr-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/object/dstr-meth-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/object/identifier-shorthand-invalid-zero.js",
"test262/language/expressions/object/dstr-gen-meth-dflt-ary-ptrn-rest-not-final-ary.js",
"test262/language/expressions/object/dstr-gen-meth-ary-ptrn-rest-not-final-obj.js",
"test262/language/expressions/object/dstr-gen-meth-dflt-ary-ptrn-rest-init-id.js",
"test262/language/expressions/object/11.1.5-1gs.js",
"test262/language/expressions/yield/in-iteration-stmt.js",
"test262/language/expressions/yield/star-in-iteration-stmt.js",
"test262/language/expressions/yield/invalid-left-hand-side.js",
"test262/language/expressions/assignment/target-boolean.js",
"test262/language/expressions/assignment/dstr-obj-rest-not-last-element-invalid.js",
"test262/language/expressions/assignment/dstr-array-elem-nested-obj-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-array-rest-before-rest.js",
"test262/language/expressions/assignment/target-string.js",
"test262/language/expressions/assignment/dstr-array-elem-nested-obj-invalid.js",
"test262/language/expressions/assignment/dstr-obj-prop-nested-array-invalid.js",
"test262/language/expressions/assignment/target-newtarget.js",
"test262/language/expressions/assignment/target-null.js",
"test262/language/expressions/assignment/dstr-array-elem-target-simple-strict.js",
"test262/language/expressions/assignment/dstr-array-rest-init.js",
"test262/language/expressions/assignment/target-cover-yieldexpr.js",
"test262/language/expressions/assignment/id-arguments-strict.js",
"test262/language/expressions/assignment/dstr-obj-prop-elem-init-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-obj-id-init-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-array-rest-before-elision.js",
"test262/language/expressions/assignment/dstr-array-rest-nested-array-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-array-rest-nested-array-invalid.js",
"test262/language/expressions/assignment/dstr-obj-prop-nested-obj-invalid.js",
"test262/language/expressions/assignment/dstr-array-rest-nested-obj-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-array-elem-nested-array-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-array-rest-before-element.js",
"test262/language/expressions/assignment/dstr-obj-id-identifier-yield-expr.js",
"test262/language/expressions/assignment/dstr-array-elem-init-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-obj-prop-nested-obj-yield-ident-invalid.js",
"test262/language/expressions/assignment/id-eval-strict.js",
"test262/language/expressions/assignment/dstr-array-rest-elision-invalid.js",
"test262/language/expressions/assignment/dstr-array-elem-nested-array-invalid.js",
"test262/language/expressions/assignment/dstr-obj-id-init-simple-strict.js",
"test262/language/expressions/assignment/dstr-obj-prop-nested-array-yield-ident-invalid.js",
"test262/language/expressions/assignment/target-cover-newtarget.js",
"test262/language/expressions/assignment/dstr-obj-id-simple-strict.js",
"test262/language/expressions/assignment/dstr-obj-id-identifier-yield-ident-invalid.js",
"test262/language/expressions/assignment/dstr-array-rest-nested-obj-invalid.js",
"test262/language/expressions/assignment/non-simple-target.js",
"test262/language/expressions/assignment/dstr-array-elem-target-yield-invalid.js",
"test262/language/expressions/assignment/dstr-obj-prop-elem-target-yield-ident-invalid.js",
"test262/language/expressions/assignment/target-number.js",
"test262/language/expressions/assignment/dstr-array-rest-yield-ident-invalid.js",
"test262/language/expressions/conditional/in-branch-2.js",
"test262/language/expressions/conditional/in-condition.js",
"test262/language/asi/S7.9_A6.3_T1.js",
"test262/language/asi/S7.9_A6.2_T5.js",
"test262/language/asi/S7.9_A5.7_T1.js",
"test262/language/asi/S7.9_A6.3_T5.js",
"test262/language/asi/S7.9.2_A1_T6.js",
"test262/language/asi/S7.9_A6.2_T7.js",
"test262/language/asi/S7.9_A6.3_T7.js",
"test262/language/asi/S7.9_A10_T4.js",
"test262/language/asi/S7.9_A6.3_T4.js",
"test262/language/asi/S7.9_A10_T2.js",
"test262/language/asi/S7.9_A6.2_T1.js",
"test262/language/asi/S7.9.2_A1_T1.js",
"test262/language/asi/S7.9_A6.3_T2.js",
"test262/language/asi/S7.9_A10_T8.js",
"test262/language/asi/S7.9_A9_T6.js",
"test262/language/asi/S7.9_A6.2_T8.js",
"test262/language/asi/S7.9.2_A1_T3.js",
"test262/language/asi/S7.9_A6.3_T6.js",
"test262/language/asi/S7.9_A11_T8.js",
"test262/language/asi/S7.9_A6.4_T1.js",
"test262/language/asi/S7.9_A10_T6.js",
"test262/language/asi/S7.9_A5.3_T1.js",
"test262/language/asi/S7.9_A11_T4.js",
"test262/language/asi/S7.9_A9_T8.js",
"test262/language/asi/S7.9_A6.3_T3.js",
"test262/language/asi/S7.9_A6.2_T9.js",
"test262/language/asi/S7.9_A6.2_T2.js",
"test262/language/asi/S7.9_A6.2_T6.js",
"test262/language/asi/S7.9_A6.2_T4.js",
"test262/language/asi/S7.9_A6.2_T10.js",
"test262/language/asi/S7.9_A6.4_T2.js",
"test262/language/asi/S7.9_A6.2_T3.js",
"test262/language/asi/S7.9_A5.1_T1.js",
"test262/language/asi/S7.9_A4.js",
"test262/language/asi/S7.9_A9_T7.js",
"test262/language/line-terminators/S7.3_A2.1_T2.js",
"test262/language/line-terminators/S7.3_A3.1_T3.js",
"test262/language/line-terminators/S7.3_A6_T2.js",
"test262/language/line-terminators/S7.3_A6_T1.js",
"test262/language/line-terminators/S7.3_A6_T4.js",
"test262/language/line-terminators/S7.3_A3.2_T3.js",
"test262/language/line-terminators/S7.3_A2.2_T2.js",
"test262/language/line-terminators/S7.3_A3.4_T1.js",
"test262/language/line-terminators/S7.3_A3.3_T1.js",
"test262/language/line-terminators/S7.3_A3.2_T1.js",
"test262/language/line-terminators/S7.3_A6_T3.js",
"test262/language/comments/S7.4_A4_T4.js",
"test262/language/comments/single-line-html-close-without-lt.js",
"test262/language/comments/S7.4_A3.js",
"test262/language/comments/S7.4_A4_T1.js",
"test262/language/comments/S7.4_A2_T2.js",
"test262/language/comments/multi-line-html-close-extra.js",
"test262/annexB/language/statements/for-in/var-objectbindingpattern-initializer.js",
"test262/annexB/language/statements/for-in/var-arraybindingpattern-initializer.js",
"test262/annexB/language/statements/for-in/strict-initializer.js",
"test262/annexB/language/statements/for-in/let-initializer.js",
"test262/annexB/language/statements/for-in/const-initializer.js",
"test262/annexB/language/statements/for-in/bare-initializer.js",
"test262/annexB/language/expressions/template-literal/legacy-octal-escape-sequence-strict.js",
"test262/annexB/language/expressions/object/__proto__-duplicate.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-03-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-FC_NFKC_Closure.js",
"test262/built-ins/RegExp/property-escapes/non-existent-binary-property-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Default_Ignorable_Code_Point.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-13.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-09-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-Line_Break-with-value.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFKD-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-FC_NFKC_Closure.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-separator-and-value-only-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-In-prefix-Script.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-unclosed-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Full_Composition_Exclusion.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-existing-value-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_F.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-08.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_ID_Continue-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-02.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-General_Category.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-separator-only-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-11.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-no-braces-value-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Uppercase.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-09.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFKC.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-Is-prefix-Script-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_Yes.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-value-Script_Extensions-negated.js",
"test262/built-ins/RegExp/property-escapes/non-existent-binary-property.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Grapheme_Extend-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-Line_Break-with-value-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-10-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Composition_Exclusion-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_N-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-02-negated.js",
"test262/built-ins/RegExp/property-escapes/character-class-range-no-dash-end.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Lowercase.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Composition_Exclusion.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-separator-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-14-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_ID_Start-negated.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-General_Category-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-03.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script_Extensions-equals.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_T.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script-equals-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-separator.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-value-Script.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_T-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-01-negated.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-unclosed.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-05-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-unopened.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Prepended_Concatenation_Mark.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFC-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFC.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script_Extensions.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script_Extensions-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-Is-prefix-Script.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-In-prefix-Script-implicit-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-FC_NFKC_Closure-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-04.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_ID_Continue.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-value-General_Category-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-invalid.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-separator-and-value-only.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-no-braces-value.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Default_Ignorable_Code_Point-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-11-negated.js",
"test262/built-ins/RegExp/property-escapes/character-class-range-end.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-08-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-12.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-04-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_No-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_Y.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_Invalid-negated.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-General_Category-equals.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-and-value-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-01.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-In-prefix-Script-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-empty-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-Block-with-value.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Grapheme_Extend.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_N.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-separator-only.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_Y-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-In-prefix-Block-implicit.js",
"test262/built-ins/RegExp/property-escapes/character-class-range-no-dash-start.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script_Extensions-equals-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-circumflex-negation-negated.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-General_Category-equals-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Math.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFKC-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-invalid-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-05.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-value-Script_Extensions.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-unopened-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-12-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_F-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-07.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_Yes-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Hyphen-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Uppercase-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Lowercase-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-07-negated.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-and-value.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_No.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-In-prefix-Script-implicit.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Full_Composition_Exclusion-negated.js",
"test262/built-ins/RegExp/property-escapes/binary-property-with-value-ASCII_-_Invalid.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Math-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Alphabetic-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-13-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Grapheme_Link.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_ID_Start.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-value-Script-negated.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-value-general-category.js",
"test262/built-ins/RegExp/property-escapes/non-binary-property-without-value-Script-equals.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-In-prefix-Block-implicit-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-no-braces.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Prepended_Concatenation_Mark-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-10.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-Line_Break.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFD.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-Block-with-value-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Hyphen.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-14.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-empty.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-06.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-FC_NFKC_Closure-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-no-braces-negated.js",
"test262/built-ins/RegExp/property-escapes/non-existent-property-existing-value.js",
"test262/built-ins/RegExp/property-escapes/unsupported-property-Line_Break-negated.js",
"test262/built-ins/RegExp/property-escapes/loose-matching-06-negated.js",
"test262/built-ins/RegExp/property-escapes/grammar-extension-circumflex-negation.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Grapheme_Link-negated.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFD-negated.js",
"test262/built-ins/RegExp/property-escapes/character-class-range-start.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Other_Alphabetic.js",
"test262/built-ins/RegExp/property-escapes/unsupported-binary-property-Expands_On_NFKD.js",
]
FAILED_AT_LEAST_ONE_ENGINE = [
"DukTape/ecmascript/test-dev-syntax-error-line-2.js",
"DukTape/ecmascript/test-misc-asmjs.js",
"DukTape/ecmascript/test-regexp-charclass-canon-misc.js",
"JerryJS/ecma/tests/22.02.01-009.js",
"JerryJS/ecma/tests/22.02.03-019.js",
"JerryJS/ecma/tests/24.01.02-013.js",
"JerryJS/ecma/tests/24.01.04-007.js",
"JerryJS/regression/tests/arithmetics-2.js",
"JerryJS/regression/tests/for-parse.js",
"JerryJS/regression/tests/global-uri-coding.js",
"JerryJS/regression/tests/json-parse.js",
"JerryJS/regression/tests/regression-test-issue-1386.js",
"JerryJS/regression/tests/regression-test-issue-1616.js",
"JerryJS/regression/tests/regression-test-issue-1936.js",
"JerryJS/regression/tests/regression-test-issue-782.js",
"mozilla/non262/Array/fill.js",
"mozilla/non262/Array/join-01.js",
"mozilla/non262/Array/length-truncate-nonconfigurable-sparse.js",
"mozilla/non262/Array/redefine-nonwritable-length-custom-conversion-throw.js",
"mozilla/non262/Array/sort_basics.js",
"mozilla/non262/Array/sort-non-function.js",
"mozilla/non262/Array/to-length.js",
"mozilla/non262/Array/toLocaleString-01.js",
"mozilla/non262/Array/unscopables.js",
"mozilla/non262/Array/unshift-01.js",
"mozilla/non262/Array/unshift-with-enumeration.js",
"mozilla/non262/arrow-functions/arrow-not-as-end-of-statement.js",
"mozilla/non262/async-functions/async-contains-unicode-escape.js",
"mozilla/non262/async-functions/async-property-name-error.js",
"mozilla/non262/async-functions/await-in-arrow-parameters.js",
"mozilla/non262/async-functions/create-function-parse-before-getprototype.js",
"mozilla/non262/async-functions/duplicate-__proto__.js",
"mozilla/non262/async-functions/forbidden-as-consequent.js",
"mozilla/non262/async-functions/property.js",
"mozilla/non262/AsyncGenerators/async-generator-declaration-in-modules.js",
"mozilla/non262/AsyncGenerators/create-function-parse-before-getprototype.js",
"mozilla/non262/AsyncGenerators/for-await-bad-syntax.js",
"mozilla/non262/class/boundFunctionSubclassing.js",
"mozilla/non262/class/compPropNames.js",
"mozilla/non262/class/methDefnGen.js",
"mozilla/non262/class/methDefn.js",
"mozilla/non262/class/method-named-static.js",
"mozilla/non262/class/newTargetDirectInvoke.js",
"mozilla/non262/class/newTargetDVG.js",
"mozilla/non262/class/newTargetEval.js",
"mozilla/non262/class/superCallBadNewTargetPrototype.js",
"mozilla/non262/class/superPropDerivedCalls.js",
"mozilla/non262/class/superPropEvalInsideNested.js",
"mozilla/non262/class/superPropNoOverwriting.js",
"mozilla/non262/class/superThisStrictNoBoxing.js",
"mozilla/non262/DataView/get-set-index-range.js",
"mozilla/non262/Date/15.9.5.5-02.js",
"mozilla/non262/Date/constructor-convert-all-arguments.js",
"mozilla/non262/Date/regress-188211.js",
"mozilla/non262/Date/regress-301738-01.js",
"mozilla/non262/Date/timeclip.js",
"mozilla/non262/Date/toJSON-01.js",
"mozilla/non262/Date/toString-generic.js",
"mozilla/non262/destructuring/bug1396261.js",
"mozilla/non262/destructuring/iterator-primitive.js",
"mozilla/non262/destructuring/rest-parameter-arguments.js",
"mozilla/non262/destructuring/rest-parameter-function-length.js",
"mozilla/non262/destructuring/rest-parameter.js",
"mozilla/non262/destructuring/rest-parameter-spread-call-optimization.js",
"mozilla/non262/destructuring/rest-parameter-syntax.js",
"mozilla/non262/Error/constructor-ordering.js",
"mozilla/non262/expressions/destructuring-array-done.js",
"mozilla/non262/expressions/trailing_comma_arrow.js",
"mozilla/non262/expressions/trailing_comma_getter_setter.js",
"mozilla/non262/extensions/15.9.4.2.js",
"mozilla/non262/extensions/error-tostring-function.js",
"mozilla/non262/extensions/function-caller-skips-eval-frames.js",
"mozilla/non262/extensions/mutable-proto-special-form.js",
"mozilla/non262/extensions/regress-341956-01.js",
"mozilla/non262/extensions/regress-429739.js",
"mozilla/non262/extensions/regress-452329.js",
"mozilla/non262/extensions/regress-466905-04.js",
"mozilla/non262/extensions/regress-473040.js",
"mozilla/non262/extensions/regress-476869.js",
"mozilla/non262/extensions/regress-482263.js",
"mozilla/non262/extensions/reviver-mutates-holder-array-nonnative.js",
"mozilla/non262/extensions/reviver-mutates-holder-object-nonnative.js",
"mozilla/non262/extensions/scope-001.js",
"mozilla/non262/Function/arrow-has-duplicated.js",
"mozilla/non262/Function/bound-prototype.js",
"mozilla/non262/Function/create-function-parse-before-getprototype.js",
"mozilla/non262/Function/function-constructor-toString-arguments-before-parsing-params.js",
"mozilla/non262/Function/function-name-binding.js",
"mozilla/non262/Function/function-name-for.js",
"mozilla/non262/Function/parameter-redeclaration.js",
"mozilla/non262/Function/regress-338121-01.js",
"mozilla/non262/Function/regress-338121-02.js",
"mozilla/non262/Function/regress-338121-03.js",
"mozilla/non262/generators/construct-newtarget.js",
"mozilla/non262/generators/create-function-parse-before-getprototype.js",
"mozilla/non262/generators/delegating-yield-1.js",
"mozilla/non262/generators/delegating-yield-3.js",
"mozilla/non262/generators/delegating-yield-5.js",
"mozilla/non262/generators/delegating-yield-6.js",
"mozilla/non262/generators/delegating-yield-7.js",
"mozilla/non262/generators/forbidden-as-consequent.js",
"mozilla/non262/generators/syntax.js",
"mozilla/non262/JSON/parse-arguments.js",
"mozilla/non262/JSON/stringify-boxed-primitives.js",
"mozilla/non262/JSON/stringify-missing-arguments.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-if.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-parameter.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-same-name.js",
"mozilla/non262/lexical-environment/block-scoped-functions-deprecated-redecl.js",
"mozilla/non262/lexical-environment/for-loop.js",
"mozilla/non262/Map/iterable.js",
"mozilla/non262/Map/NaN-as-key.js",
"mozilla/non262/Math/20.2.2.ToNumber.js",
"mozilla/non262/misc/function-definition-eval.js",
"mozilla/non262/misc/future-reserved-words.js",
"mozilla/non262/misc/new-with-non-constructor.js",
"mozilla/non262/misc/regress-bug632003.js",
"mozilla/non262/Number/20.1.3.2-toExponential.js",
"mozilla/non262/object/defineGetter-defineSetter.js",
"mozilla/non262/object/defineProperties-order.js",
"mozilla/non262/object/getter-name.js",
"mozilla/non262/object/isPrototypeOf.js",
"mozilla/non262/object/method-non-constructor.js",
"mozilla/non262/object/mutation-prevention-methods.js",
"mozilla/non262/object/object-create-with-primitive-second-arg.js",
"mozilla/non262/object/propertyIsEnumerable.js",
"mozilla/non262/object/values-entries-lazy-props.js",
"mozilla/non262/Proxy/getPrototypeOf.js",
"mozilla/non262/Proxy/global-receiver.js",
"mozilla/non262/Proxy/setPrototypeOf.js",
"mozilla/non262/Proxy/trap-null.js",
"mozilla/non262/RegExp/15.5.4.11.js",
"mozilla/non262/RegExp/character-escape-class-s-mongolian-vowel-separator.js",
"mozilla/non262/RegExp/compile-lastIndex.js",
"mozilla/non262/RegExp/constructor-IsRegExp.js",
"mozilla/non262/RegExp/constructor-ordering.js",
"mozilla/non262/RegExp/descriptor.js",
"mozilla/non262/RegExp/flags-param-handling.js",
"mozilla/non262/RegExp/getter-name.js",
"mozilla/non262/RegExp/instance-property-storage-introspection.js",
"mozilla/non262/RegExp/lastIndex-nonwritable.js",
"mozilla/non262/RegExp/match.js",
"mozilla/non262/RegExp/match-trace.js",
"mozilla/non262/RegExp/RegExpExec-exec.js",
"mozilla/non262/RegExp/RegExpExec-return.js",
"mozilla/non262/RegExp/RegExp_rightContext_as_array.js",
"mozilla/non262/RegExp/RegExp_rightContext.js",
"mozilla/non262/RegExp/regress-330684.js",
"mozilla/non262/RegExp/replace-compile-elembase.js",
"mozilla/non262/RegExp/replace-compile.js",
"mozilla/non262/RegExp/replace-global-unicode.js",
"mozilla/non262/RegExp/replace-sticky.js",
"mozilla/non262/RegExp/search.js",
"mozilla/non262/RegExp/search-trace.js",
"mozilla/non262/RegExp/source.js",
"mozilla/non262/RegExp/split-deleted-flags.js",
"mozilla/non262/RegExp/split-flags-on-obj.js",
"mozilla/non262/RegExp/split-invalid-lastIndex.js",
"mozilla/non262/RegExp/split.js",
"mozilla/non262/RegExp/split-obj.js",
"mozilla/non262/RegExp/unicode-back-reference.js",
"mozilla/non262/RegExp/unicode-character-class-escape.js",
"mozilla/non262/RegExp/unicode-class-empty.js",
"mozilla/non262/RegExp/unicode-ignoreCase-escape.js",
"mozilla/non262/RegExp/unicode-ignoreCase.js",
"mozilla/non262/RegExp/unicode-ignoreCase-word-boundary.js",
"mozilla/non262/regress/regress-172699.js",
"mozilla/non262/regress/regress-290575.js",
"mozilla/non262/regress/regress-317476.js",
"mozilla/non262/regress/regress-467495-06.js",
"mozilla/non262/regress/regress-477234.js",
"mozilla/non262/regress/regress-511859.js",
"mozilla/non262/regress/regress-551763-2.js",
"mozilla/non262/regress/regress-810525.js",
"mozilla/non262/Set/NaN-as-key.js",
"mozilla/non262/statements/for-inof-loop-const-declaration.js",
"mozilla/non262/statements/for-in-with-assignment-semantics.js",
"mozilla/non262/statements/for-of-var-with-initializer.js",
"mozilla/non262/strict/15.4.5.1.js",
"mozilla/non262/strict/8.7.2.js",
"mozilla/non262/strict/strict-function-statements.js",
"mozilla/non262/String/internalUsage.js",
"mozilla/non262/String/IsRegExp.js",
"mozilla/non262/String/lastIndexOf-ToNumber-when-searchStr-larger-than-string.js",
"mozilla/non262/String/match-defines-match-elements.js",
"mozilla/non262/String/match.js",
"mozilla/non262/String/match-throws-nonwritable-lastIndex-global.js",
"mozilla/non262/String/replace.js",
"mozilla/non262/String/replace-throws-nonwritable-lastIndex-global.js",
"mozilla/non262/String/replace-updates-global-lastIndex.js",
"mozilla/non262/String/search-GetMethod.js",
"mozilla/non262/String/search.js",
"mozilla/non262/String/split-01.js",
"mozilla/non262/String/split-GetMethod.js",
"mozilla/non262/String/split.js",
"mozilla/non262/Symbol/constructor.js",
"mozilla/non262/Symbol/symbol-object-not-unboxed-for-value-to-id.js",
"mozilla/non262/Symbol/well-known.js",
"mozilla/non262/syntax/escaped-let-static-identifier.js",
"mozilla/non262/syntax/identifiers-with-extended-unicode-escape.js",
"mozilla/non262/syntax/let-as-label.js",
"mozilla/non262/syntax/omitted-catch-binding.js",
"mozilla/non262/syntax/unicode_other_id_continue.js",
"mozilla/non262/syntax/unicode_other_id_start.js",
"mozilla/non262/template-strings/tagTempl.js",
"mozilla/non262/TypedArray/constructor-iterable-modified-array-iterator-next.js",
"mozilla/non262/TypedArray/constructor-iterable-undefined-or-null.js",
"mozilla/non262/TypedArray/constructor-undefined-args.js",
"mozilla/non262/TypedArray/fill.js",
"mozilla/non262/TypedArray/from_constructor.js",
"mozilla/non262/TypedArray/from_errors.js",
"mozilla/non262/TypedArray/from-iterable-validation.js",
"mozilla/non262/TypedArray/from-non-iterable-validation.js",
"mozilla/non262/TypedArray/has-property-op.js",
"mozilla/non262/TypedArray/length.js",
"mozilla/non262/TypedArray/map-and-filter.js",
"mozilla/non262/TypedArray/object-defineproperty.js",
"mozilla/non262/TypedArray/of.js",
"mozilla/non262/TypedArray/of-validation.js",
"mozilla/non262/TypedArray/set.js",
"mozilla/non262/TypedArray/slice.js",
"mozilla/non262/TypedArray/sort_basics.js",
"mozilla/non262/TypedArray/sort-negative-nan.js",
"WebKit/es6/function_name_property_variables_function.js",
"WebKit/es6/miscellaneous_built-in_prototypes_are_not_instances.js",
"WebKit/es6/Proxy_internal_get_calls_RegExp_constructor.js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype[Symbol.match].js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype[Symbol.replace].js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype[Symbol.search].js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype[Symbol.split].js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype.test.js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype.toString.js",
"WebKit/es6/Proxy_internal_get_calls_String.prototype.match.js",
"WebKit/es6/Proxy_internal_get_calls_String.prototype.replace.js",
"WebKit/es6/Proxy_internal_get_calls_String.prototype.search.js",
"WebKit/es6/Proxy_internal_get_calls_String.prototype.split.js",
"WebKit/es6/Proxy_ownKeys_duplicates.js",
"WebKit/es6/RegExp.prototype_properties_RegExp.prototype.flags.js",
"WebKit/es6/RegExp.prototype_properties_RegExp.prototype[Symbol.match].js",
"WebKit/es6/RegExp.prototype_properties_RegExp.prototype[Symbol.replace].js",
"WebKit/es6/RegExp.prototype_properties_RegExp.prototype[Symbol.search].js",
"WebKit/es6/RegExp.prototype_properties_RegExp.prototype[Symbol.split].js",
"WebKit/es6/well-known_symbols_Symbol.match.js",
"WebKit/es6/well-known_symbols_Symbol.match_String.prototype.endsWith.js",
"WebKit/es6/well-known_symbols_Symbol.match_String.prototype.includes.js",
"WebKit/es6/well-known_symbols_Symbol.match_String.prototype.startsWith.js",
"WebKit/es6/well-known_symbols_Symbol.replace.js",
"WebKit/es6/well-known_symbols_Symbol.search.js",
"WebKit/es6/well-known_symbols_Symbol.species_RegExp.prototype[Symbol.split].js",
"WebKit/es6/well-known_symbols_Symbol.split.js",
"WebKit/microbenchmarks/emscripten-cube2hash.js",
"WebKit/microbenchmarks/regexp-prototype-search-observable-side-effects2.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects3-flags.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects3-global.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects3-ignoreCase.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects3-multiline.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects3-sticky.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects3-unicode.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects4.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects.js",
"WebKit/microbenchmarks/regexp-prototype-test-observable-side-effects2.js",
"WebKit/microbenchmarks/regexp-prototype-test-observable-side-effects.js",
"WebKit/microbenchmarks/string-prototype-search-observable-side-effects2.js",
"WebKit/microbenchmarks/string-prototype-search-observable-side-effects3.js",
"WebKit/microbenchmarks/string-prototype-search-observable-side-effects4.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects3-flags.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects3-global.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects3-ignoreCase.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects3-multiline.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects3-sticky.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects3-unicode.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects4.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects.js",
"DukTape/ecmascript/test-expr-lhs-newoper.js",
"DukTape/ecmascript/test-expr-lhs-this.js",
"DukTape/ecmascript/test-spec-redeclare-global-nonconfig-plain.js",
"JerryJS/ecma/tests/11.12-008.js",
"JerryJS/regression/tests/array-prototype-tolocalestring.js",
"JerryJS/regression/tests/error.js",
"JerryJS/regression/tests/for-in.js",
"JerryJS/regression/tests/regression-test-issue-1387.js",
"JerryJS/regression/tests/regression-test-issue-1597.js",
"JerryJS/regression/tests/regression-test-issue-1918.js",
"JerryJS/regression/tests/regression-test-issue-2143.js",
"JerryJS/regression/tests/throw-number.js",
"JerryJS/regression/tests/throw-string.js",
"mozilla/non262/Array/from_errors.js",
"mozilla/non262/Array/from_primitive.js",
"mozilla/non262/Array/frozen-dense-array.js",
"mozilla/non262/Array/indexOf-packed-array.js",
"mozilla/non262/Array/pop-no-has-trap.js",
"mozilla/non262/Array/regress-465980-02.js",
"mozilla/non262/Array/shift-no-has-trap.js",
"mozilla/non262/async-functions/await-error.js",
"mozilla/non262/async-functions/properties.js",
"mozilla/non262/async-functions/toString.js",
"mozilla/non262/class/className.js",
"mozilla/non262/class/methodName.js",
"mozilla/non262/class/newTargetArrow.js",
"mozilla/non262/class/strictExecution.js",
"mozilla/non262/class/superCallOrder.js",
"mozilla/non262/class/superPropBasicNew.js",
"mozilla/non262/Date/regress-309925-02.js",
"mozilla/non262/destructuring/duplicate-__proto__.js",
"mozilla/non262/destructuring/yield-with-escape-in-object-destr-function.js",
"mozilla/non262/destructuring/yield-with-escape-in-object-destr-script.js",
"mozilla/non262/Exceptions/regress-350650-n.js",
"mozilla/non262/expressions/object-literal-computed-property-evaluation.js",
"mozilla/non262/expressions/object-literal-__proto__.js",
"mozilla/non262/expressions/regress-394673.js",
"mozilla/non262/expressions/ToPropertyKey-symbols.js",
"mozilla/non262/extensions/censor-strict-caller.js",
"mozilla/non262/extensions/inc-dec-functioncall.js",
"mozilla/non262/extensions/regress-366668-01.js",
"mozilla/non262/extensions/too-many-arguments-constructing-bound-function.js",
"mozilla/non262/Function/arguments-parameter-shadowing.js",
"mozilla/non262/Function/bound-length-and-name.js",
"mozilla/non262/Function/builtin-no-construct.js",
"mozilla/non262/Function/configurable-length.js",
"mozilla/non262/Function/function-name-method.js",
"mozilla/non262/Function/function-name-property.js",
"mozilla/non262/Function/return-finally.js",
"mozilla/non262/Function/throw-type-error.js",
"mozilla/non262/generators/properties.js",
"mozilla/non262/generators/return-finally.js",
"mozilla/non262/jit/regress-470739.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-generators.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-notapplicable.js",
"mozilla/non262/lexical-environment/block-scoped-functions-annex-b-with.js",
"mozilla/non262/lexical-environment/unscopables-mutation.js",
"mozilla/non262/Math/cbrt-approx.js",
"mozilla/non262/object/15.2.3.14-01.js",
"mozilla/non262/object/accessor-name.js",
"mozilla/non262/object/entries.js",
"mozilla/non262/object/property-descriptor-order.js",
"mozilla/non262/object/toPrimitive.js",
"mozilla/non262/object/values.js",
"mozilla/non262/Proxy/proxy-for-in.js",
"mozilla/non262/Proxy/revoke-as-side-effect.js",
"mozilla/non262/Reflect/argumentsList.js",
"mozilla/non262/Reflect/defineProperty.js",
"mozilla/non262/RegExp/escape.js",
"mozilla/non262/RegExp/exec.js",
"mozilla/non262/RegExp/exec-lastIndex-negative.js",
"mozilla/non262/RegExp/exec-lastIndex-ToInteger.js",
"mozilla/non262/RegExp/lastIndex-exec.js",
"mozilla/non262/RegExp/lastIndex-match-or-replace.js",
"mozilla/non262/RegExp/lastIndex-search.js",
"mozilla/non262/RegExp/match-local-tolength-recompilation.js",
"mozilla/non262/RegExp/replace-local-tolength-lastindex.js",
"mozilla/non262/RegExp/replace-local-tolength-recompilation.js",
"mozilla/non262/RegExp/replace-sticky-lastIndex.js",
"mozilla/non262/RegExp/replace-twoBytes.js",
"mozilla/non262/RegExp/split-limit.js",
"mozilla/non262/RegExp/split-prop-access.js",
"mozilla/non262/RegExp/split-trace.js",
"mozilla/non262/RegExp/toString.js",
"mozilla/non262/RegExp/unicode-braced.js",
"mozilla/non262/RegExp/unicode-class-braced.js",
"mozilla/non262/RegExp/unicode-class-lead-trail.js",
"mozilla/non262/RegExp/unicode-class-negated.js",
"mozilla/non262/RegExp/unicode-class-range.js",
"mozilla/non262/RegExp/unicode-class-raw.js",
"mozilla/non262/RegExp/unicode-disallow-extended.js",
"mozilla/non262/RegExp/unicode-lead-trail.js",
"mozilla/non262/RegExp/unicode-raw.js",
"mozilla/non262/regress/regress-211590.js",
"mozilla/non262/regress/regress-452346.js",
"mozilla/non262/regress/regress-58116.js",
"mozilla/non262/regress/regress-584355.js",
"mozilla/non262/String/AdvanceStringIndex.js",
"mozilla/non262/String/match-GetMethod.js",
"mozilla/non262/String/string-code-point-upper-lower-mapping.js",
"mozilla/non262/Symbol/conversions.js",
"mozilla/non262/Symbol/enumeration-order.js",
"mozilla/non262/Symbol/toStringTag.js",
"mozilla/non262/syntax/declaration-forbidden-in-label.js",
"mozilla/non262/syntax/escaped-strict-reserved-words-and-yield.js",
"mozilla/non262/TypedArray/constructor-ArrayBuffer-species.js",
"mozilla/non262/TypedArray/constructor-iterable-packed-array-side-effect.js",
"mozilla/non262/TypedArray/from_mapping.js",
"mozilla/non262/TypedArray/sort-non-function.js",
"WebKit/es6/Proxy_internal_get_calls_RegExp.prototype.flags.js",
"WebKit/microbenchmarks/map-for-each.js",
"WebKit/microbenchmarks/map-for-of.js",
"WebKit/microbenchmarks/regexp-prototype-split-observable-side-effects2.js",
"WebKit/microbenchmarks/set-for-each.js",
"WebKit/microbenchmarks/set-for-of.js",
"WebKit/microbenchmarks/string-prototype-split-observable-side-effects2.js",
"JerryJS/ecma/tests/13.02-002.js",
"JerryJS/ecma/tests/22.02.02-002.js",
"JerryJS/regression/tests/array-prototype-push.js",
"JerryJS/regression/tests/array-prototype-sort.js",
"JerryJS/regression/tests/date-construct.js",
"JerryJS/regression/tests/date-utc.js",
"JerryJS/regression/tests/number-prototype-to-fixed.js",
"JerryJS/regression/tests/number-prototype-to-precision.js",
"JerryJS/regression/tests/object-define-properties.js",
"JerryJS/regression/tests/regexp-assertions.js",
"JerryJS/regression/tests/regression-test-issue-1065.js",
"JerryJS/regression/tests/regression-test-issue-1080.js",
"JerryJS/regression/tests/regression-test-issue-312.js",
"JerryJS/regression/tests/regression-test-issue-358.js",
"JerryJS/regression/tests/regression-test-issue-783.js",
"mozilla/non262/Array/from-iterator-close.js",
"mozilla/non262/Array/regress-313153.js",
"mozilla/non262/Array/regress-387501.js",
"mozilla/non262/Array/set-with-indexed-property-on-prototype-chain.js",
"mozilla/non262/async-functions/construct-newtarget.js",
"mozilla/non262/AsyncGenerators/for-await-of-error.js",
"mozilla/non262/class/derivedConstructorInlining.js",
"mozilla/non262/class/derivedConstructorTDZExplicitThis.js",
"mozilla/non262/class/derivedConstructorTDZOffEdge.js",
"mozilla/non262/class/derivedConstructorTDZReturnUndefined.js",
"mozilla/non262/class/superPropDelete.js",
"mozilla/non262/class/superPropOrdering.js",
"mozilla/non262/class/uninitializedThisError.js",
"mozilla/non262/Date/15.9.4.2.js",
"mozilla/non262/Date/15.9.5.6.js",
"mozilla/non262/Date/15.9.5.7.js",
"mozilla/non262/Date/non-iso.js",
"mozilla/non262/Date/setTime-argument-shortcircuiting.js",
"mozilla/non262/Date/two-digit-years.js",
"mozilla/non262/destructuring/order.js",
"mozilla/non262/destructuring/order-super.js",
"mozilla/non262/eval/line-terminator-paragraph-terminator.js",
"mozilla/non262/Exceptions/errstack-001.js",
"mozilla/non262/Exceptions/regress-257751.js",
"mozilla/non262/expressions/destructuring-pattern-parenthesized.js",
"mozilla/non262/expressions/inNotObjectError.js",
"mozilla/non262/expressions/regress-96526-delelem.js",
"mozilla/non262/extensions/Boolean-toSource.js",
"mozilla/non262/extensions/DataView-construct-arguments-detaching.js",
"mozilla/non262/extensions/errorcolumnblame.js",
"mozilla/non262/extensions/Number-toSource.js",
"mozilla/non262/extensions/RegExp-replace-lastParen.js",
"mozilla/non262/extensions/regress-312385-01.js",
"mozilla/non262/extensions/regress-369696-02.js",
"mozilla/non262/extensions/regress-369696-03.js",
"mozilla/non262/extensions/regress-385729.js",
"mozilla/non262/extensions/regress-50447-1.js",
"mozilla/non262/extensions/String-toSource.js",
"mozilla/non262/extensions/unterminated-literal-error-location.js",
"mozilla/non262/Function/function-name-assignment.js",
"mozilla/non262/generators/runtime.js",
"mozilla/non262/generators/yield-error.js",
"mozilla/non262/generators/yield-star-iterator-close.js",
"mozilla/non262/lexical-environment/eval-nondefinable-function.js",
"mozilla/non262/lexical-environment/var-in-catch-body-annex-b.js",
"mozilla/non262/Map/constructor-iterator-close.js",
"mozilla/non262/object/destructuring-shorthand-defaults.js",
"mozilla/non262/Proxy/ownkeys-trap-duplicates.js",
"mozilla/non262/Proxy/regress-bug950407.js",
"mozilla/non262/RegExp/flags.js",
"mozilla/non262/RegExp/prototype.js",
"mozilla/non262/RegExp/replace.js",
"mozilla/non262/RegExp/replace-trace.js",
"mozilla/non262/RegExp/unicode-index.js",
"mozilla/non262/regress/regress-1383630.js",
"mozilla/non262/regress/regress-243869.js",
"mozilla/non262/regress/regress-3649-n.js",
"mozilla/non262/regress/regress-469758.js",
"mozilla/non262/regress/regress-591846.js",
"mozilla/non262/regress/regress-609617.js",
"mozilla/non262/regress/regress-619003-1.js",
"mozilla/non262/regress/regress-665355.js",
"mozilla/non262/statements/for-in-with-assignment-syntax.js",
"mozilla/non262/statements/for-of-iterator-close.js",
"mozilla/non262/statements/property-reference-self-assignment.js",
"mozilla/non262/statements/try-completion.js",
"mozilla/non262/String/string-upper-lower-mapping.js",
"mozilla/non262/syntax/keyword-unescaped-requirement.js",
"mozilla/non262/template-strings/lineNumber.js",
"mozilla/non262/TypedArray/set-toobject.js",
"mozilla/non262/TypedArray/test-integrity-level.js",
"WebKit/es6/proper_tail_calls_tail_call_optimisation_direct_recursion.js",
"WebKit/es6/proper_tail_calls_tail_call_optimisation_mutual_recursion.js",
"WebKit/es6/String.prototype_methods_String.prototype.padEnd.js",
"WebKit/es6/String.prototype_methods_String.prototype.padStart.js",
"WebKit/microbenchmarks/regexp-prototype-is-not-instance.js",
"WebKit/microbenchmarks/regexp-prototype-search-observable-side-effects.js",
"WebKit/microbenchmarks/string-prototype-search-observable-side-effects.js",
"JerryJS/ecma/tests/15.03.03.02-001.js",
"JerryJS/regression/tests/parser-oom.js",
"JerryJS/regression/tests/regexp-construct.js",
"JerryJS/regression/tests/string-prototype-trim.js",
"mozilla/non262/Array/toLocaleString.js",
"mozilla/non262/eval/exhaustive-global-strictcaller-direct-normalcode.js",
"mozilla/non262/eval/exhaustive-global-strictcaller-direct-strictcode.js",
"mozilla/non262/eval/exhaustive-global-strictcaller-indirect-strictcode.js",
"mozilla/non262/Exceptions/catchguard-002-n.js",
"mozilla/non262/Exceptions/catchguard-003-n.js",
"mozilla/non262/extensions/clone-complex-object.js",
"mozilla/non262/extensions/clone-regexp.js",
"mozilla/non262/extensions/column-numbers.js",
"mozilla/non262/extensions/regress-304897.js",
"mozilla/non262/extensions/regress-311583.js",
"mozilla/non262/extensions/regress-313803.js",
"mozilla/non262/extensions/regress-322957.js",
"mozilla/non262/extensions/regress-465276.js",
"mozilla/non262/extensions/regress-465453.js",
"mozilla/non262/extensions/regress-475971.js",
"mozilla/non262/extensions/regress-476414-01.js",
"mozilla/non262/extensions/regress-476414-02.js",
"mozilla/non262/extensions/regress-480579.js",
"mozilla/non262/extensions/regress-481516.js",
"mozilla/non262/extensions/regress-90596-002.js",
"mozilla/non262/extensions/symbol-uneval.js",
"mozilla/non262/Function/function-bind.js",
"mozilla/non262/global/delete-global-NaN-property.js",
"mozilla/non262/jit/regress-451673.js",
"mozilla/non262/jit/regress-451974-01.js",
"mozilla/non262/jit/regress-451974-02.js",
"mozilla/non262/lexical-environment/block-scoped-functions-strict.js",
"mozilla/non262/object/toLocaleString-01.js",
"mozilla/non262/Proxy/proxy-no-receiver-overwrite.js",
"mozilla/non262/RegExp/regress-375715-01-n.js",
"mozilla/non262/regress/regress-168347.js",
"mozilla/non262/regress/regress-243389-n.js",
"mozilla/non262/regress/regress-344711-n.js",
"mozilla/non262/regress/regress-450833.js",
"mozilla/non262/regress/regress-452498-168-2.js",
"mozilla/non262/regress/regress-458851.js",
"mozilla/non262/regress/regress-465013.js",
"mozilla/non262/regress/regress-465308.js",
"mozilla/non262/regress/regress-465567-01.js",
"mozilla/non262/regress/regress-466787.js",
"mozilla/non262/regress/regress-467495-03.js",
"mozilla/non262/regress/regress-467495-05.js",
"mozilla/non262/regress/regress-469044.js",
"mozilla/non262/regress/regress-477758.js",
"mozilla/non262/regress/regress-640075.js",
"mozilla/non262/regress/regress-96128-n.js",
"mozilla/non262/template-strings/debugLineNumber.js",
"mozilla/non262/TypedArray/constructor-length-too-large.js",
"mozilla/non262/TypedArray/seal-and-freeze.js",
"mozilla/non262/TypedObject/size_and_alignment.js",
"mozilla/non262/Unicode/regress-352044-02-n.js",
"mozilla/non262/Unicode/uc-001-n.js",
"mozilla/non262/Unicode/uc-002-n.js",
"WebKit/es6/miscellaneous_no_assignments_allowed_in_for-in_head.js",
"WebKit/es6/Proxy_enumerate_handler.js",
]
def get_blacklist():
    ''' Get the set of hash IDs for all invalid files '''
    invalid_files = []
    invalid_files.extend(BLACKLIST)
    invalid_files.extend(FAILED_AT_LEAST_ONE_ENGINE)
    blacklist = set()
    # Hash every invalid file path, not just the BLACKLIST entries,
    # so files in FAILED_AT_LEAST_ONE_ENGINE are excluded as well.
    for filepath in invalid_files:
        blacklist.add(string_to_hash(filepath))
    return blacklist
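The lookup pattern above — hashing every invalid file path into a set so membership checks are O(1) — can be sketched standalone. `string_to_hash` is defined elsewhere in this module and not shown in this chunk, so a hashlib-based stand-in is assumed here:

```python
import hashlib

def string_to_hash(filepath):
    # Hypothetical stand-in for the module's string_to_hash helper,
    # which is defined outside this chunk.
    return hashlib.sha1(filepath.encode("utf-8")).hexdigest()

def build_blacklist(*file_lists):
    # Union every invalid-file list, then store path hashes in a set
    # so test selection can probe membership in O(1).
    blacklist = set()
    for file_list in file_lists:
        for filepath in file_list:
            blacklist.add(string_to_hash(filepath))
    return blacklist
```

Because the result is a set, paths appearing in both lists are deduplicated automatically.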
| 73.771518 | 147 | 0.77129 | 35,600 | 260,561 | 5.624522 | 0.039382 | 0.106481 | 0.188396 | 0.126078 | 0.917636 | 0.849321 | 0.793206 | 0.746915 | 0.665505 | 0.606084 | 0 | 0.05235 | 0.0677 | 260,561 | 3,531 | 148 | 73.79241 | 0.771922 | 0.000123 | 0 | 0.000567 | 0 | 0.42695 | 0.890573 | 0.888197 | 0 | 0 | 0 | 0 | 0.000567 | 1 | 0.000284 | false | 0 | 0.016454 | 0 | 0.017021 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
c54753030f5670279e1b9be3a246f963bc32b2c3 | 8,976 | py | Python | django_seo_js/tests/test_middlewares.py | denisvlr/django-seo-js | 342a17320e3c5a60c4af9064577df3f0646c4ebf | [
"MIT"
] | null | null | null | django_seo_js/tests/test_middlewares.py | denisvlr/django-seo-js | 342a17320e3c5a60c4af9064577df3f0646c4ebf | [
"MIT"
] | null | null | null | django_seo_js/tests/test_middlewares.py | denisvlr/django-seo-js | 342a17320e3c5a60c4af9064577df3f0646c4ebf | [
"MIT"
] | null | null | null | from mock import Mock
from django.test import TestCase
from django_seo_js.tests.utils import override_settings
from django_seo_js.middleware import HashBangMiddleware, UserAgentMiddleware
class BaseMiddlewareTest(TestCase):
pass
class HashBangMiddlewareTest(TestCase):
@override_settings(BACKEND='django_seo_js.backends.TestBackend')
def setUp(self):
super(HashBangMiddlewareTest, self).setUp()
self.middleware = HashBangMiddleware()
self.request = Mock()
self.request.path = "/"
self.request.GET = {}
def test_has_escaped_fragment(self):
self.request.GET = {"_escaped_fragment_": None}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
def test_does_not_have_escaped_fragment(self):
self.request.GET = {}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestBackend', ENABLED=False)
def test_has_escaped_fragment_skips_if_disabled_via_enabled(self):
self.middleware = HashBangMiddleware()
self.request.GET = {}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestServiceDownBackend')
def test_has_escaped_fragment_skips_if_service_is_down(self):
self.middleware = HashBangMiddleware()
self.request.GET = {"_escaped_fragment_": None}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestBackend')
def test_overriding_skips_sitemap_xml_by_default(self):
self.middleware = HashBangMiddleware()
self.request.path = "/sitemap.xml"
self.request.GET = {"_escaped_fragment_": None}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(
BACKEND='django_seo_js.backends.TestBackend',
IGNORE_URLS=["/foo.html", "/bar/ibbity.html", ],
IGNORE_EXTENSIONS=[],
)
def test_overriding_skips_custom_overrides_xml_by_default(self):
self.middleware = HashBangMiddleware()
self.request.path = "/sitemap.xml"
self.request.GET = {"_escaped_fragment_": None}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
self.request.path = "/foo.html"
self.assertEqual(self.middleware.process_request(self.request), None)
self.request.path = "/bar/ibbity.html"
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestBackend')
def test_overriding_skips_gifs_by_default(self):
self.middleware = HashBangMiddleware()
        self.request.path = "/foo.gif"
self.request.GET = {"_escaped_fragment_": None}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(
BACKEND='django_seo_js.backends.TestBackend',
IGNORE_EXTENSIONS=[".html", ".txt", ]
)
def test_overriding_skips_custom_overrides_gifs_by_default(self):
self.middleware = HashBangMiddleware()
self.request.path = "/foo.gif"
self.request.GET = {"_escaped_fragment_": None}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
self.request.path = "/foo.html"
self.assertEqual(self.middleware.process_request(self.request), None)
self.request.path = "/bar/ibbity.txt"
self.assertEqual(self.middleware.process_request(self.request), None)
class UserAgentMiddlewareTest(TestCase):
@override_settings(BACKEND='django_seo_js.backends.TestBackend')
def setUp(self):
super(UserAgentMiddlewareTest, self).setUp()
self.middleware = UserAgentMiddleware()
self.request = Mock()
self.request.path = "/"
self.request.META = {}
def test_matches_one_of_the_default_user_agents(self):
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
def test_does_not_match_one_of_the_default_user_agents(self):
self.request.META = {
"HTTP_USER_AGENT": "This user-agent is not a search engine."
}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(
USER_AGENTS=["TestUserAgent", ],
BACKEND='django_seo_js.backends.TestBackend'
)
def test_overriding_matches(self):
self.middleware = UserAgentMiddleware()
self.request.META = {
"HTTP_USER_AGENT": "The TestUserAgent v1.0"
}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
@override_settings(
USER_AGENTS=["TestUserAgent", ],
BACKEND='django_seo_js.backends.TestBackend'
)
def test_overriding_does_not_match_properly(self):
self.middleware = UserAgentMiddleware()
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(
USER_AGENTS=["TestUserAgent", ],
BACKEND='django_seo_js.backends.TestBackend'
)
def test_missing_user_agent_still_works(self):
self.middleware = UserAgentMiddleware()
self.request.META = {}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestBackend', ENABLED=False)
def test_overriding_matches_skips_if_disabled_via_enabled(self):
self.middleware = UserAgentMiddleware()
self.request.META = {
"HTTP_USER_AGENT": "The TestUserAgent v1.0"
}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestServiceDownBackend')
def test_overriding_matches_skips_if_service_is_down(self):
self.middleware = UserAgentMiddleware()
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestBackend')
def test_overriding_skips_sitemap_xml_by_default(self):
self.middleware = UserAgentMiddleware()
self.request.path = "/sitemap.xml"
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(
BACKEND='django_seo_js.backends.TestBackend',
IGNORE_URLS=["/foo.html", "/bar/ibbity.html", ],
IGNORE_EXTENSIONS=[],
)
def test_overriding_skips_custom_overrides_xml_by_default(self):
self.middleware = UserAgentMiddleware()
self.request.path = "/sitemap.xml"
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
self.request.path = "/foo.html"
self.assertEqual(self.middleware.process_request(self.request), None)
self.request.path = "/bar/ibbity.html"
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(BACKEND='django_seo_js.backends.TestBackend')
def test_overriding_skips_gifs_by_default(self):
self.middleware = UserAgentMiddleware()
self.request.path = "/foo.gif"
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request), None)
@override_settings(
BACKEND='django_seo_js.backends.TestBackend',
IGNORE_EXTENSIONS=[".html", ".txt", ]
)
def test_overriding_skips_custom_overrides_gifs_by_default(self):
self.middleware = UserAgentMiddleware()
self.request.path = "/foo.gif"
self.request.META = {
"HTTP_USER_AGENT": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}
self.assertEqual(self.middleware.process_request(self.request).content, "Test")
self.request.path = "/foo.html"
self.assertEqual(self.middleware.process_request(self.request), None)
self.request.path = "/bar/ibbity.txt"
self.assertEqual(self.middleware.process_request(self.request), None)
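These tests exercise two prerender triggers — the `_escaped_fragment_` query parameter and a crawler user-agent — plus the ignore lists for URLs and file extensions. A minimal sketch of the decision logic the tests assert (an illustration only, not django_seo_js's actual implementation):

```python
def should_prerender(path, query, meta,
                     user_agents=("Googlebot",),
                     ignore_urls=("/sitemap.xml",),
                     ignore_extensions=(".gif",)):
    # Mirrors the behavior the tests above check: skip ignored URLs and
    # extensions first, then trigger on ?_escaped_fragment_= or a bot UA.
    if path in ignore_urls or path.endswith(tuple(ignore_extensions)):
        return False
    if "_escaped_fragment_" in query:
        return True
    user_agent = meta.get("HTTP_USER_AGENT", "")
    return any(bot in user_agent for bot in user_agents)
```

Overriding `ignore_urls`/`ignore_extensions` mirrors the `IGNORE_URLS`/`IGNORE_EXTENSIONS` settings the custom-override tests patch in.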
| 41.748837 | 105 | 0.695187 | 1,016 | 8,976 | 5.898622 | 0.097441 | 0.124812 | 0.0856 | 0.130652 | 0.925079 | 0.912565 | 0.900551 | 0.878859 | 0.844152 | 0.844152 | 0 | 0.004377 | 0.185495 | 8,976 | 214 | 106 | 41.943925 | 0.815347 | 0 | 0 | 0.774011 | 0 | 0.039548 | 0.196524 | 0.066845 | 0 | 0 | 0 | 0 | 0.152542 | 0 | null | null | 0.00565 | 0.022599 | null | null | 0.00565 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3d9b529fd6552d645769b6618bddf928f84a3341 | 1,676 | py | Python | src/admin_panel/services/user_passes_test.py | Rey092/myhouse24_django | b2b31873006ec4917c2ed043350f2841745fadfb | [
"MIT"
] | null | null | null | src/admin_panel/services/user_passes_test.py | Rey092/myhouse24_django | b2b31873006ec4917c2ed043350f2841745fadfb | [
"MIT"
] | null | null | null | src/admin_panel/services/user_passes_test.py | Rey092/myhouse24_django | b2b31873006ec4917c2ed043350f2841745fadfb | [
"MIT"
] | null | null | null | # region ACCESS
def check_access(user, access_type):
    if not user.is_authenticated:
        return False
    is_superuser = user.is_superuser
    is_director = False
    has_access_flag = False
    if user.role is not None:
        is_director = user.role.name == "Директор"  # "Director"
        has_access_flag = getattr(user.role, access_type)
    return any([is_superuser, is_director, has_access_flag])
def statistics_access(user):
return check_access(user, "statistics_access")
def cashbox_access(user):
return check_access(user, "cashbox_access")
def receipt_access(user):
return check_access(user, "receipt_access")
def account_access(user):
return check_access(user, "account_access")
def flat_access(user):
return check_access(user, "flat_access")
def house_user_access(user):
return check_access(user, "house_user_access")
def house_access(user):
return check_access(user, "house_access")
def message_access(user):
return check_access(user, "message_access")
def call_request_access(user):
return check_access(user, "call_request_access")
def meter_data_access(user):
return check_access(user, "meter_data_access")
def site_access(user):
return check_access(user, "site_access")
def service_access(user):
return check_access(user, "service_access")
def tariff_access(user):
return check_access(user, "tariff_access")
def role_access(user):
return check_access(user, "role_access")
def staff_access(user):
return check_access(user, "staff_access")
def payments_detail_access(user):
return check_access(user, "payments_detail_access")
# endregion ACCESS
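These predicates have the signature Django's `user_passes_test` decorator expects (as the filename suggests), which is how views would normally consume them. A dependency-free sketch of that wiring, with a stub decorator and stub users standing in for Django's — names here are illustrative, not from the project:

```python
from functools import wraps
from types import SimpleNamespace

def user_passes_test_sketch(test_func):
    # Minimal stand-in for django.contrib.auth.decorators.user_passes_test:
    # evaluate the predicate on request.user before running the view.
    def decorator(view):
        @wraps(view)
        def wrapped(request, *args, **kwargs):
            if not test_func(request.user):
                return "403 Forbidden"
            return view(request, *args, **kwargs)
        return wrapped
    return decorator

def statistics_access(user):
    # Stub predicate with the same shape as the access checks above.
    return user.is_authenticated and getattr(user.role, "statistics_access", False)

@user_passes_test_sketch(statistics_access)
def statistics_view(request):
    return "200 OK"

allowed = SimpleNamespace(is_authenticated=True,
                          role=SimpleNamespace(statistics_access=True))
denied = SimpleNamespace(is_authenticated=False, role=None)
```

In real Django the decorator redirects to the login page rather than returning a string; the sketch only shows how the predicate gates the view.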
| 19.952381 | 58 | 0.715394 | 222 | 1,676 | 5.117117 | 0.184685 | 0.290493 | 0.224472 | 0.295775 | 0.445423 | 0.445423 | 0.06338 | 0 | 0 | 0 | 0 | 0.006598 | 0.186158 | 1,676 | 83 | 59 | 20.192771 | 0.826246 | 0.0179 | 0 | 0 | 0 | 0 | 0.146074 | 0.01339 | 0 | 0 | 0 | 0 | 0 | 1 | 0.395349 | false | 0.116279 | 0 | 0.372093 | 0.813953 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 7 |
3d9bfee65369269925e23e07f1173cd783ab724d | 3,171 | py | Python | modelUtility.py | renhaocui/ensembleTopic | dd33bcce4d0f81e3f3412c4ce9e92268ea77a0c6 | [
"MIT"
] | 1 | 2018-08-28T15:28:54.000Z | 2018-08-28T15:28:54.000Z | modelUtility.py | renhaocui/ensembleTopic | dd33bcce4d0f81e3f3412c4ce9e92268ea77a0c6 | [
"MIT"
] | null | null | null | modelUtility.py | renhaocui/ensembleTopic | dd33bcce4d0f81e3f3412c4ce9e92268ea77a0c6 | [
"MIT"
] | null | null | null | import textCleaner
import random
import operator
def readData(docFileName, labelFileName):
docFile = open(docFileName, 'r')
labelFile = open(labelFileName, 'r')
docOutput = []
labelOutput = []
labelCorpus = {}
for line in labelFile:
labels = line.strip().replace('"', '').split(' ')
index = random.randint(0, len(labels)-1)
output = labels[index]
labelOutput.append(output)
if output not in labelCorpus:
labelCorpus[output] = 1.0
else:
labelCorpus[output] += 1.0
labelFile.close()
for line in docFile:
docOutput.append(textCleaner.tweetCleaner(line.strip().lower()))
docFile.close()
    candLabel = max(labelCorpus.items(), key=operator.itemgetter(1))[0]
return docOutput, labelOutput, candLabel
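The per-line random label pick and the majority-label selection in `readData` can be expressed compactly with `random.choice` and `collections.Counter`; a standalone sketch of just that logic:

```python
import random
from collections import Counter

def pick_labels(label_lines):
    # One uniformly random label per line, plus the most frequent pick
    # overall -- the same result readData builds with a manual count dict.
    label_output = []
    for line in label_lines:
        labels = line.strip().replace('"', '').split(' ')
        label_output.append(random.choice(labels))
    cand_label = Counter(label_output).most_common(1)[0][0]
    return label_output, cand_label
```

`Counter.most_common(1)` replaces the `max(..., key=operator.itemgetter(1))` idiom used above.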
def readData2(labelFileName, alchemyFileName):
labelFile = open(labelFileName, 'r')
alchemyFile = open(alchemyFileName, 'r')
labelOutput = []
labelCorpus = {}
alchemyOutput = []
for line in labelFile:
labels = line.strip().replace('"', '').split(' ')
index = random.randint(0, len(labels)-1)
output = labels[index]
labelOutput.append(output)
if output not in labelCorpus:
labelCorpus[output] = 1.0
else:
labelCorpus[output] += 1.0
labelFile.close()
for line in alchemyFile:
labels = {}
temp = line.strip().split(' ')
for item in temp:
prob = float(item.split(':')[1])
path = item.split(':')[0].split('/')
label = path[len(path)-1]
if label not in labels:
labels[label] = prob
else:
labels[label] += prob
alchemyOutput.append(labels)
alchemyFile.close()
return labelOutput, alchemyOutput
def readData3(docFileName, labelFileName, alchemyFileName):
docFile = open(docFileName, 'r')
labelFile = open(labelFileName, 'r')
alchemyFile = open(alchemyFileName, 'r')
docOutput = []
labelOutput = []
labelCorpus = {}
alchemyOutput = []
for line in labelFile:
labels = line.strip().replace('"', '').split(' ')
index = random.randint(0, len(labels)-1)
output = labels[index]
labelOutput.append(output)
if output not in labelCorpus:
labelCorpus[output] = 1.0
else:
labelCorpus[output] += 1.0
labelFile.close()
    candLabel = max(labelCorpus.items(), key=operator.itemgetter(1))[0]
for line in docFile:
docOutput.append(textCleaner.tweetCleaner(line.strip().lower()))
docFile.close()
for line in alchemyFile:
labels = {}
temp = line.strip().split(' ')
for item in temp:
prob = float(item.split(':')[1])
path = item.split(':')[0].split('/')
label = path[len(path)-1]
if label not in labels:
labels[label] = prob
else:
labels[label] += prob
alchemyOutput.append(labels)
alchemyFile.close()
return docOutput, labelOutput, candLabel, alchemyOutput | 28.567568 | 75 | 0.581205 | 317 | 3,171 | 5.81388 | 0.164038 | 0.008682 | 0.034183 | 0.061856 | 0.831796 | 0.831796 | 0.831796 | 0.831796 | 0.729246 | 0.729246 | 0 | 0.013304 | 0.288868 | 3,171 | 111 | 76 | 28.567568 | 0.803991 | 0 | 0 | 0.9 | 0 | 0 | 0.00662 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0 | 0.033333 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3dd5d3ad52ad48a546f41d262b6467e8700a6de2 | 159 | py | Python | threeML_utils/__init__.py | grburgess/3ml_utils | 3996df2958c39df63b9a9b18c93f57da77518c28 | [
"BSD-3-Clause"
] | null | null | null | threeML_utils/__init__.py | grburgess/3ml_utils | 3996df2958c39df63b9a9b18c93f57da77518c28 | [
"BSD-3-Clause"
] | null | null | null | threeML_utils/__init__.py | grburgess/3ml_utils | 3996df2958c39df63b9a9b18c93f57da77518c28 | [
"BSD-3-Clause"
] | 2 | 2020-02-03T17:54:13.000Z | 2021-03-03T08:20:26.000Z | from .display_posterior_model_counts import display_posterior_model_counts
from .ppc_tools import compute_ppc, PPC
__all__ = ['display_posterior_model_counts']
| 31.8 | 73 | 0.880503 | 22 | 159 | 5.681818 | 0.454545 | 0.384 | 0.504 | 0.648 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081761 | 159 | 4 | 74 | 39.75 | 0.856164 | 0 | 0 | 0 | 0 | 0 | 0.188679 | 0.188679 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3de6f10ef28342a291f8c431dbcc6e5691f3e5a2 | 152,670 | py | Python | src/encoded/tests/test_audit_experiment.py | ENCODE-DCC/encoded | 77688076259af7441a9ffc3e3104f115c988d8e9 | [
"MIT"
] | 102 | 2015-05-20T01:17:43.000Z | 2022-03-07T06:03:55.000Z | src/encoded/tests/test_audit_experiment.py | ENCODE-DCC/encoded | 77688076259af7441a9ffc3e3104f115c988d8e9 | [
"MIT"
] | 901 | 2015-01-07T23:11:57.000Z | 2022-03-18T13:56:12.000Z | src/encoded/tests/test_audit_experiment.py | ENCODE-DCC/encoded | 77688076259af7441a9ffc3e3104f115c988d8e9 | [
"MIT"
] | 65 | 2015-02-06T23:00:26.000Z | 2022-01-22T07:58:44.000Z | import pytest, re
from .constants import RED_DOT
def collect_audit_errors(result, error_types=None):
errors = result.json['audit']
errors_list = []
if error_types:
for error_type in error_types:
errors_list.extend(errors[error_type])
else:
for error_type in errors:
errors_list.extend(errors[error_type])
return errors_list
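`collect_audit_errors` flattens the per-severity audit dict that `@@index-data` returns, optionally filtered to specific severity levels. A standalone illustration with a stub response object (the helper is repeated here so the snippet runs on its own):

```python
def collect_audit_errors(result, error_types=None):
    # Same flattening as the helper above: gather error records from
    # the requested severity levels, or from all levels by default.
    errors = result.json['audit']
    errors_list = []
    for error_type in (error_types if error_types else errors):
        errors_list.extend(errors[error_type])
    return errors_list

class FakeResult:
    # Stand-in for a webtest response: .json carries an 'audit' dict
    # keyed by severity level, each holding a list of error records.
    def __init__(self, audit):
        self.json = {'audit': audit}

result = FakeResult({
    'ERROR': [{'category': 'missing target'}],
    'WARNING': [{'category': 'mixed libraries'},
                {'category': 'missing RIN'}],
})
```

This is the shape the `any(...)`/`all(...)` assertions below iterate over.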
def test_audit_experiment_missing_fragmentation_method(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'HiC'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing fragmentation method'
for error in collect_audit_errors(res))
def test_audit_experiment_inconsistent_fragmentation_method(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'HiC'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(library_1['@id'], {'fragmentation_methods': ['chemical (HindIII restriction)']})
testapp.patch_json(library_2['@id'], {'fragmentation_methods': ['chemical (MboI restriction)']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent fragmentation method'
for error in collect_audit_errors(res))
def test_audit_experiment_consistent_fragmentation_method(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'HiC'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(library_1['@id'], {'fragmentation_methods': ['chemical (HindIII restriction)', 'chemical (MboI restriction)']})
testapp.patch_json(library_2['@id'], {'fragmentation_methods': ['chemical (MboI restriction)', 'chemical (HindIII restriction)']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'inconsistent fragmentation method'
for error in collect_audit_errors(res))
def test_audit_experiment_mixed_libraries(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2):
testapp.patch_json(library_1['@id'], {'nucleic_acid_term_name': 'DNA'})
testapp.patch_json(library_2['@id'], {'nucleic_acid_term_name': 'RNA'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'mixed libraries'
for error in collect_audit_errors(res))
def test_audit_experiment_RNA_library_RIN(testapp,
base_experiment,
replicate_1_1,
library_1):
testapp.patch_json(library_1['@id'], {'nucleic_acid_term_name': 'RNA'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing RIN'
for error in collect_audit_errors(res))
testapp.patch_json(library_1['@id'], {'rna_integrity_number': 7})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'missing RIN'
for error in collect_audit_errors(res))
def test_audit_experiment_RNA_library_RIN_excluded_assays(testapp,
base_experiment,
replicate_1_1,
library_1):
testapp.patch_json(library_1['@id'], {'nucleic_acid_term_name': 'RNA'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(base_experiment['@id'],{'assay_term_name': 'eCLIP'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'missing RIN'
               for error in collect_audit_errors(res))
def test_audit_experiment_released_with_unreleased_files(testapp, base_experiment, file_fastq):
testapp.patch_json(base_experiment['@id'], {'status': 'released',
'date_released': '2016-01-01'})
testapp.patch_json(file_fastq['@id'], {'status': 'in progress'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'mismatched file status'
for error in collect_audit_errors(res))
def test_ChIP_possible_control(testapp, base_experiment, ctrl_experiment, IgG_ctrl_rep):
testapp.patch_json(base_experiment['@id'], {'possible_controls': [ctrl_experiment['@id']],
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'invalid possible_control'
for error in collect_audit_errors(res))
def test_ChIP_possible_control_roadmap(testapp, base_experiment, ctrl_experiment, IgG_ctrl_rep,
award):
testapp.patch_json(award['@id'], {'rfa': 'Roadmap'})
testapp.patch_json(base_experiment['@id'], {'possible_controls': [ctrl_experiment['@id']],
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'invalid possible_control'
for error in collect_audit_errors(res))
def test_audit_input_control(
testapp,
base_experiment,
ctrl_experiment,
construct_genetic_modification,
base_biosample,
base_library,
base_replicate,
):
# Non-tagged ChIP
testapp.patch_json(
base_experiment['@id'],
{
'possible_controls': [ctrl_experiment['@id']],
'assay_term_name': 'ChIP-seq'
}
)
testapp.patch_json(ctrl_experiment['@id'], {'control_type': 'wild type'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing input control'
for error in collect_audit_errors(res)
)
testapp.patch_json(
ctrl_experiment['@id'], {'control_type': 'input library'}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing input control'
for error in collect_audit_errors(res)
)
# Non-tagged Mint-ChIP
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'Mint-ChIP-seq'})
testapp.patch_json(ctrl_experiment['@id'], {'control_type': 'wild type'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing input control'
for error in collect_audit_errors(res)
)
testapp.patch_json(ctrl_experiment['@id'], {'control_type': 'input library'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing input control'
for error in collect_audit_errors(res)
)
# Tagged ChIP
testapp.patch_json(
construct_genetic_modification['@id'],
{'introduced_tags': [{'name': 'FLAG', 'location': 'internal'}]}
)
testapp.patch_json(
base_biosample['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']]}
)
testapp.patch_json(
base_replicate['@id'], {'library': base_library['@id']}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing input control'
for error in collect_audit_errors(res)
)
testapp.patch_json(
ctrl_experiment['@id'], {'control_type': 'control'}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing input control'
for error in collect_audit_errors(res)
)
testapp.patch_json(
ctrl_experiment['@id'], {'control_type': 'wild type'}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing input control'
for error in collect_audit_errors(res)
)
def test_audit_experiment_target(testapp, base_experiment):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing target'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'PLAC-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing target'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'CUT&RUN'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing target'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'CUT&Tag'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing target'
for error in collect_audit_errors(res))
def test_audit_experiment_replicated(testapp, base_experiment, base_replicate, base_library, a549):
testapp.patch_json(base_experiment['@id'], {'status': 'submitted', 'date_submitted': '2015-03-03'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'unreplicated experiment' and error['level_name'] == 'INTERNAL_ACTION'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'biosample_ontology': a549['uuid']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'unreplicated experiment' and error['level_name'] == 'NOT_COMPLIANT'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'single-cell RNA sequencing assay'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'unreplicated experiment'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'long read single-cell RNA-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'unreplicated experiment'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
testapp.patch_json(base_experiment['@id'], {'replicates': []})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'unreplicated experiment' and error['level_name'] == 'NOT_COMPLIANT'
for error in collect_audit_errors(res))
def test_audit_experiment_technical_replicates_same_library(testapp, base_experiment,
base_replicate, base_replicate_two,
base_library):
testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
testapp.patch_json(base_replicate_two['@id'], {'library': base_library['@id']})
testapp.patch_json(base_experiment['@id'], {
'replicates': [base_replicate['@id'], base_replicate_two['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'sequencing runs labeled as technical replicates'
for error in collect_audit_errors(res))
def test_audit_experiment_biological_replicates_biosample(
testapp, base_experiment, base_biosample,
library_1, library_2, replicate_1_1, replicate_2_1):
testapp.patch_json(library_1['@id'], {'biosample': base_biosample['@id']})
testapp.patch_json(library_2['@id'], {'biosample': base_biosample['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'biological replicates with identical biosample'
for error in collect_audit_errors(res))


def test_audit_experiment_technical_replicates_biosample(
        testapp, base_experiment, biosample_1, biosample_2,
        library_1, library_2, replicate_1_1, replicate_1_2):
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(replicate_1_2['@id'], {'library': library_2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'technical replicates with not identical biosample'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_libraryless_replicated(
        testapp, base_experiment, base_replicate, base_library):
    testapp.patch_json(base_experiment['@id'], {'status': 'submitted', 'date_submitted': '2015-03-03'})
    testapp.patch_json(base_experiment['@id'], {'replicates': [base_replicate['@id']]})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'replicate with no library'
               for error in collect_audit_errors(res))


def test_audit_experiment_single_cell_replicated(
        testapp, base_experiment, base_replicate, base_library):
    testapp.patch_json(base_experiment['@id'], {'status': 'submitted', 'date_submitted': '2015-03-03'})
    testapp.patch_json(base_experiment['@id'],
                       {'assay_term_name': 'single-cell RNA sequencing assay'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'unreplicated experiment'
               for error in collect_audit_errors(res))


def test_audit_experiment_RNA_bind_n_seq_replicated(testapp, base_experiment, base_replicate,
                                                    base_library):
    testapp.patch_json(base_experiment['@id'], {'status': 'submitted', 'date_submitted': '2015-03-03'})
    testapp.patch_json(base_experiment['@id'],
                       {'assay_term_name': 'RNA Bind-n-Seq'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'unreplicated experiment'
               for error in collect_audit_errors(res))


def test_audit_experiment_roadmap_replicated(
        testapp, base_experiment, base_replicate, base_library, award):
    testapp.patch_json(award['@id'], {'rfa': 'Roadmap'})
    testapp.patch_json(base_experiment['@id'], {'award': award['@id']})
    testapp.patch_json(base_experiment['@id'],
                       {'status': 'released', 'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'unreplicated experiment'
               for error in collect_audit_errors(res))


def test_audit_experiment_spikeins(testapp, base_experiment, base_replicate, base_library):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
    testapp.patch_json(base_library['@id'], {'size_range': '>200'})
    testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing spikeins'
               for error in collect_audit_errors(res))


def test_audit_experiment_target_mismatch(
        testapp, base_experiment, base_replicate, base_target, antibody_lot):
    testapp.patch_json(base_replicate['@id'], {'antibody': antibody_lot['uuid']})
    testapp.patch_json(
        base_experiment['@id'], {'assay_term_name': 'ChIP-seq', 'target': base_target['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent target'
               for error in collect_audit_errors(res))


def test_audit_experiment_no_characterizations_antibody(testapp,
                                                        base_experiment,
                                                        base_replicate,
                                                        base_library,
                                                        base_biosample,
                                                        antibody_lot,
                                                        target,
                                                        k562):
    testapp.patch_json(base_replicate['@id'], {'antibody': antibody_lot['@id'],
                                               'library': base_library['@id']})
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
                                                'biosample_ontology': k562['uuid'],
                                                'target': target['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'uncharacterized antibody'
               for error in collect_audit_errors(res))


def test_audit_experiment_wrong_organism_histone_antibody(testapp,
                                                          base_experiment,
                                                          wrangler,
                                                          base_antibody,
                                                          base_replicate,
                                                          base_library,
                                                          base_biosample,
                                                          mouse_H3K9me3,
                                                          target_H3K9me3,
                                                          base_antibody_characterization1,
                                                          base_antibody_characterization2,
                                                          mouse,
                                                          human,
                                                          k562,
                                                          mel):
    # Mouse biosample in mouse ChIP-seq experiment but supporting antibody characterizations
    # are compliant in human but not mouse.
    base_antibody['targets'] = [mouse_H3K9me3['@id'], target_H3K9me3['@id']]
    histone_antibody = testapp.post_json('/antibody_lot', base_antibody).json['@graph'][0]
    testapp.patch_json(base_biosample['@id'], {'organism': mouse['@id']})
    characterization_reviews = [
        {
            'biosample_ontology': mel['uuid'],
            'organism': mouse['@id'],
            'lane_status': 'not compliant',
            'lane': 1
        },
        {
            'biosample_ontology': k562['uuid'],
            'organism': human['@id'],
            'lane_status': 'compliant',
            'lane': 2
        }
    ]
    testapp.patch_json(
        base_antibody_characterization1['@id'],
        {'target': target_H3K9me3['@id'],
         'characterizes': histone_antibody['@id'],
         'status': 'compliant',
         'reviewed_by': wrangler['@id'],
         'characterization_reviews': characterization_reviews})
    testapp.patch_json(
        base_antibody_characterization2['@id'],
        {'target': target_H3K9me3['@id'],
         'characterizes': histone_antibody['@id'],
         'status': 'compliant',
         'reviewed_by': wrangler['@id']})
    testapp.patch_json(base_replicate['@id'], {'antibody': histone_antibody['@id'],
                                               'library': base_library['@id'],
                                               'experiment': base_experiment['@id']})
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
                                                'biosample_ontology': mel['uuid'],
                                                'target': mouse_H3K9me3['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'antibody not characterized to standard'
               for error in collect_audit_errors(res))


def test_audit_experiment_partially_characterized_antibody(testapp,
                                                           base_experiment,
                                                           wrangler,
                                                           base_target,
                                                           base_antibody,
                                                           base_replicate,
                                                           base_library,
                                                           base_biosample,
                                                           base_antibody_characterization1,
                                                           base_antibody_characterization2,
                                                           human,
                                                           hepg2,
                                                           k562):
    # K562 biosample in ChIP-seq experiment with exempt primary in K562 and in progress
    # secondary - leading to partial characterization.
    base_antibody['targets'] = [base_target['@id']]
    TF_antibody = testapp.post_json('/antibody_lot', base_antibody).json['@graph'][0]
    characterization_reviews = [
        {
            'biosample_ontology': hepg2['uuid'],
            'organism': human['@id'],
            'lane_status': 'not compliant',
            'lane': 1
        },
        {
            'biosample_ontology': k562['uuid'],
            'organism': human['@id'],
            'lane_status': 'exempt from standards',
            'lane': 2
        }
    ]
    testapp.patch_json(
        base_antibody_characterization1['@id'],
        {'target': base_target['@id'],
         'characterizes': TF_antibody['@id'],
         'status': 'compliant',
         'reviewed_by': wrangler['@id'],
         'characterization_reviews': characterization_reviews})
    testapp.patch_json(base_replicate['@id'], {'antibody': TF_antibody['@id'],
                                               'library': base_library['@id'],
                                               'experiment': base_experiment['@id']})
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
                                                'biosample_ontology': k562['uuid'],
                                                'target': base_target['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'partially characterized antibody'
               for error in collect_audit_errors(res))


def test_audit_experiment_antibody_characterizations_NTR_biosample(testapp,
                                                                   base_experiment,
                                                                   wrangler,
                                                                   base_target,
                                                                   base_antibody,
                                                                   base_replicate,
                                                                   base_library,
                                                                   base_antibody_characterization1,
                                                                   human,
                                                                   hepg2,
                                                                   ntr_biosample_type):
    # Antibody has characterization reviews on an NTR biosample type
    base_antibody['targets'] = [base_target['@id']]
    TF_antibody = testapp.post_json('/antibody_lot', base_antibody).json['@graph'][0]
    characterization_reviews = [
        {
            'biosample_ontology': hepg2['uuid'],
            'organism': human['@id'],
            'lane_status': 'compliant',
            'lane': 1
        },
        {
            'biosample_ontology': ntr_biosample_type['uuid'],
            'organism': human['@id'],
            'lane_status': 'exempt from standards',
            'lane': 2
        }
    ]
    testapp.patch_json(
        base_antibody_characterization1['@id'],
        {'target': base_target['@id'],
         'characterizes': TF_antibody['@id'],
         'status': 'compliant',
         'reviewed_by': wrangler['@id'],
         'characterization_reviews': characterization_reviews})
    testapp.patch_json(base_replicate['@id'], {'antibody': TF_antibody['@id'],
                                               'library': base_library['@id'],
                                               'experiment': base_experiment['@id']})
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
                                                'biosample_ontology': hepg2['uuid'],
                                                'target': base_target['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert not any(error['category'] == 'NTR biosample'
                   for error in collect_audit_errors(res))


def test_audit_experiment_geo_submission(testapp, base_experiment):
    testapp.patch_json(
        base_experiment['@id'], {'status': 'released', 'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'experiment not submitted to GEO'
               for error in collect_audit_errors(res))


def test_audit_experiment_biosample_match(testapp, base_experiment,
                                          base_biosample, base_replicate,
                                          base_library, h1, ileum, biosample_1,
                                          biosample_2, library_no_biosample,
                                          base_replicate_two):
    testapp.patch_json(base_biosample['@id'], {'biosample_ontology': h1['uuid']})
    testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
    testapp.patch_json(base_experiment['@id'], {'biosample_ontology': ileum['uuid']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent library biosample'
               for error in collect_audit_errors(res))
    # https://encodedcc.atlassian.net/browse/ENCD-5674
    testapp.patch_json(library_no_biosample['@id'],
                       {'mixed_biosamples': [biosample_1['@id'], biosample_2['@id']]})
    testapp.patch_json(base_replicate_two['@id'], {'library': library_no_biosample['@id']})
    res_errors = collect_audit_errors(testapp.get(base_experiment['@id'] + '@@index-data'))
    assert any(error['category'] == 'inconsistent library biosample'
               and 'generated from mixed biosamples' in error['detail']
               for error in res_errors)
    assert any(error['category'] == 'inconsistent library biosample'
               and 'both standard and mixed biosamples' in error['detail']
               for error in res_errors)


def test_audit_experiment_biosample_and_mixed_biosamples(testapp, base_experiment, base_replicate,
                                                         base_library, library_no_biosample, biosample_1,
                                                         biosample_2, base_replicate_two):
    # https://encodedcc.atlassian.net/browse/ENCD-5674
    testapp.patch_json(library_no_biosample['@id'],
                       {'mixed_biosamples': [biosample_1['@id'], biosample_2['@id']]})
    testapp.patch_json(base_replicate_two['@id'],
                       {'library': library_no_biosample['@id']})
    testapp.patch_json(base_replicate['@id'],
                       {'library': base_library['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent library biosample'
               for error in collect_audit_errors(res))


def test_audit_experiment_documents(testapp, base_experiment, base_library, base_replicate):
    testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing documents'
               for error in collect_audit_errors(res))


def test_audit_experiment_documents_excluded(testapp, base_experiment,
                                             base_library, award, base_replicate):
    testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
    testapp.patch_json(award['@id'], {'rfa': 'modENCODE'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    # modENCODE awards are excluded from the missing-documents audit, so the
    # audit must not fire at all; `any(... != ...)` would be vacuously true.
    assert all(error['category'] != 'missing documents'
               for error in collect_audit_errors(res))


def test_audit_experiment_links_included(testapp, base_experiment,
                                         base_library, award, base_replicate):
    testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
    testapp.patch_json(award['@id'], {'rfa': 'modENCODE'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(re.search(r'{.+?\|.+?}', error['detail'])
               for error in collect_audit_errors(res))


def test_audit_experiment_model_organism_mismatched_sex(testapp,
                                                        base_experiment,
                                                        replicate_1_1,
                                                        replicate_2_1,
                                                        library_1,
                                                        library_2,
                                                        biosample_1,
                                                        biosample_2,
                                                        mouse_donor_1_6):
    testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_1['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_2['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_1['@id'], {'model_organism_sex': 'male'})
    testapp.patch_json(biosample_2['@id'], {'model_organism_sex': 'female'})
    testapp.patch_json(biosample_1['@id'], {'model_organism_age_units': 'day',
                                            'model_organism_age': '54'})
    testapp.patch_json(biosample_2['@id'], {'model_organism_age_units': 'day',
                                            'model_organism_age': '54'})
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent sex'
               for error in collect_audit_errors(res))


def test_audit_experiment_model_organism_mismatched_age(testapp,
                                                        base_experiment,
                                                        replicate_1_1,
                                                        replicate_2_1,
                                                        library_1,
                                                        library_2,
                                                        biosample_1,
                                                        biosample_2,
                                                        mouse_donor_1_6,
                                                        mouse_donor_2):
    testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_1['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_2['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_1['@id'], {'model_organism_age_units': 'day',
                                            'model_organism_age': '51'})
    testapp.patch_json(biosample_2['@id'], {'model_organism_age_units': 'day',
                                            'model_organism_age': '54'})
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent age'
               for error in collect_audit_errors(res))


def test_audit_experiment_model_organism_mismatched_donor(testapp,
                                                          base_experiment,
                                                          replicate_1_1,
                                                          replicate_2_1,
                                                          library_1,
                                                          library_2,
                                                          biosample_1,
                                                          biosample_2,
                                                          mouse_donor_1_6,
                                                          mouse_donor_2_6):
    testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_2_6['@id']})
    testapp.patch_json(biosample_1['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_2['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_1['@id'], {'model_organism_sex': 'mixed'})
    testapp.patch_json(biosample_2['@id'], {'model_organism_sex': 'mixed'})
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent donor'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_library_without_biosample(testapp, base_experiment, base_replicate,
                                                         library_no_biosample, biosample_1,
                                                         biosample_2):
    testapp.patch_json(base_replicate['@id'], {'library': library_no_biosample['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing biosample'
               for error in collect_audit_errors(res))
    testapp.patch_json(library_no_biosample['@id'],
                       {'mixed_biosamples': [biosample_1['@id'], biosample_2['@id']]})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'missing biosample'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_no_size_range(
    testapp,
    experiment_with_RNA_library,
):
    res = testapp.get(experiment_with_RNA_library.json['object']['@id'] + '@@index-data')
    assert any(error['category'] == 'missing RNA fragment size'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_with_size_range(
    testapp,
    experiment_with_RNA_library,
    base_library,
):
    testapp.patch_json(base_library['@id'], {'size_range': '>200'})
    res = testapp.get(experiment_with_RNA_library.json['object']['@id'] + '@@index-data')
    assert all(error['category'] != 'missing RNA fragment size'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_no_size_range_RNA_microarray(
    testapp,
    experiment_with_RNA_library,
):
    testapp.patch_json(experiment_with_RNA_library.json['object']['@id'],
                       {'assay_term_name': 'transcription profiling by array assay'})
    res = testapp.get(experiment_with_RNA_library.json['object']['@id'] + '@@index-data')
    assert all(error['category'] != 'missing RNA fragment size'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_no_size_range_long_read_RNA(
    testapp,
    experiment_with_RNA_library,
):
    testapp.patch_json(experiment_with_RNA_library.json['object']['@id'],
                       {'assay_term_name': 'long read RNA-seq'})
    res = testapp.get(experiment_with_RNA_library.json['object']['@id'] + '@@index-data')
    assert all(error['category'] != 'missing RNA fragment size'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_no_size_range_Bru_seq(
    testapp,
    experiment_with_RNA_library,
):
    # https://encodedcc.atlassian.net/browse/ENCD-5457
    testapp.patch_json(experiment_with_RNA_library.json['object']['@id'],
                       {'assay_term_name': 'Bru-seq'})
    res = testapp.get(experiment_with_RNA_library.json['object']['@id'] + '@@index-data')
    assert any(error['category'] == 'missing RNA fragment size'
               for error in collect_audit_errors(res, error_types=['WARNING']))
    assert all(error['category'] != 'missing RNA fragment size'
               for error in collect_audit_errors(res, error_types=['NOT_COMPLIANT']))


def test_audit_experiment_with_RNA_library_missing_read_length_long_read_RNA_seq(
    testapp,
    experiment_no_read_length,
    pipeline_bam,
):
    testapp.patch_json(pipeline_bam['@id'], {'title': 'Long read RNA-seq pipeline'})
    res = testapp.get(experiment_no_read_length.json['object']['@id'] + '@@index-data')
    assert all(error['category'] != 'missing read_length'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_missing_read_length_RNA_seq(
    testapp,
    experiment_no_read_length,
    pipeline_bam,
):
    testapp.patch_json(pipeline_bam['@id'], {'title': 'RNA-seq of long RNAs (paired-end, stranded)'})
    res = testapp.get(experiment_no_read_length.json['object']['@id'] + '@@index-data')
    assert any(error['category'] == 'missing read_length'
               for error in collect_audit_errors(res))


def test_audit_experiment_with_RNA_library_missing_read_length_bulk_RNA_seq(
    testapp,
    experiment_no_read_length,
    pipeline_bam,
):
    testapp.patch_json(pipeline_bam['@id'], {'title': 'Bulk RNA-seq'})
    res = testapp.get(experiment_no_read_length.json['object']['@id'] + '@@index-data')
    assert any(error['category'] == 'missing read_length'
               for error in collect_audit_errors(res))


def test_audit_experiment_replicate_with_file(testapp, file_fastq,
                                              base_experiment,
                                              base_replicate,
                                              base_library):
    testapp.patch_json(file_fastq['@id'], {'replicate': base_replicate['@id']})
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
    testapp.patch_json(base_experiment['@id'], {'status': 'released',
                                                'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'missing raw data in replicate'
               for error in collect_audit_errors(res))


def test_audit_experiment_replicate_with_archived_file(
    testapp,
    file_fastq,
    base_experiment,
    base_replicate,
    base_library
):
    testapp.patch_json(file_fastq['@id'], {
        'replicate': base_replicate['@id'],
        'status': 'archived'})
    testapp.patch_json(base_experiment['@id'], {
        'assay_term_name': 'RNA-seq',
        'status': 'released',
        'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'missing raw data in replicate'
               for error in collect_audit_errors(res))


def test_audit_experiment_replicate_with_no_fastq_files(testapp, file_bam,
                                                        base_experiment,
                                                        base_replicate,
                                                        base_library):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
    testapp.patch_json(base_experiment['@id'], {'status': 'released',
                                                'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing raw data in replicate'
               for error in collect_audit_errors(res))


def test_audit_experiment_replicate_with_no_files(testapp,
                                                  base_experiment,
                                                  base_replicate,
                                                  base_library):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
    testapp.patch_json(base_experiment['@id'], {'status': 'released',
                                                'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing raw data in replicate'
               for error in collect_audit_errors(res))


def test_audit_experiment_replicate_with_no_files_dream(testapp,
                                                        base_experiment,
                                                        base_replicate,
                                                        base_library):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq',
                                                'internal_tags': ['DREAM'],
                                                'status': 'released',
                                                'date_released': '2016-01-01'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'missing raw data in replicate'
               for error in collect_audit_errors(res))


def test_audit_experiment_replicate_with_no_files_warning(testapp, file_bed_methyl,
                                                          base_experiment,
                                                          base_replicate,
                                                          base_library):
    testapp.patch_json(file_bed_methyl['@id'], {'replicate': base_replicate['@id']})
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
    testapp.patch_json(base_experiment['@id'], {'status': 'in progress'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing raw data in replicate'
               for error in collect_audit_errors(res, ['ERROR']))


def test_audit_experiment_pipeline_assay_term_name_consistency(
        testapp,
        experiment, bam_file,
        analysis_step_run_bam,
        analysis_step_version_bam,
        analysis_step_bam,
        pipeline_bam):
    testapp.patch_json(experiment['@id'], {'status': 'released', 'date_released': '2016-01-01'})
    testapp.patch_json(bam_file['@id'], {'step_run': analysis_step_run_bam['@id']})
    testapp.patch_json(pipeline_bam['@id'],
                       {'title': 'RNA-seq of long RNAs (single-end, unstranded)',
                        'assay_term_names': ['RNA-seq', 'RAMPAGE']})
    testapp.patch_json(experiment['@id'], {'assay_term_name': 'ChIP-seq'})
    res = testapp.get(experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent assay_term_name'
               for error in collect_audit_errors(res))


def test_audit_experiment_pipeline_without_assay_term_names(
        testapp,
        experiment, bam_file,
        analysis_step_run_bam,
        analysis_step_version_bam,
        analysis_step_bam,
        pipeline_without_assay_term_names_bam):
    testapp.patch_json(experiment['@id'], {'status': 'released', 'date_released': '2016-01-01'})
    testapp.patch_json(bam_file['@id'], {'step_run': analysis_step_run_bam['@id']})
    testapp.patch_json(pipeline_without_assay_term_names_bam['@id'],
                       {'title': 'RNA-seq of long RNAs (single-end, unstranded)'})
    testapp.patch_json(experiment['@id'], {'assay_term_name': 'ChIP-seq'})
    res = testapp.get(experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent assay_term_name'
               for error in collect_audit_errors(res))


def test_audit_experiment_not_uploaded_files(testapp, file_bam,
                                             base_experiment,
                                             base_replicate,
                                             base_library):
    testapp.patch_json(file_bam['@id'], {'status': 'upload failed'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'file validation error'
               for error in collect_audit_errors(res))


def test_audit_experiment_uploading_files(testapp, file_bam,
                                          base_experiment,
                                          base_replicate,
                                          base_library):
    testapp.patch_json(file_bam['@id'], {'status': 'uploading'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'file validation error'
               for error in collect_audit_errors(res))
    assert any(error['category'] == 'file in uploading state'
               for error in collect_audit_errors(res))


def test_audit_experiment_mismatched_length_sequencing_files(testapp, file_bam, file_fastq,
                                                             base_experiment, file_fastq_2,
                                                             base_replicate,
                                                             base_library):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'mixed run types'
               for error in collect_audit_errors(res))


def test_audit_experiment_mismatched_platforms(testapp, file_fastq,
                                               base_experiment, file_fastq_2,
                                               base_replicate, platform1,
                                               base_library, platform2):
    testapp.patch_json(file_fastq['@id'], {'platform': platform1['@id']})
    testapp.patch_json(file_fastq_2['@id'], {'platform': platform2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent platforms'
               for error in collect_audit_errors(res))


def test_audit_experiment_archived_files_mismatched_platforms(
        testapp, file_fastq, base_experiment, file_fastq_2, base_replicate,
        platform1, base_library, platform2):
    testapp.patch_json(file_fastq['@id'], {'platform': platform1['@id'],
                                           'status': 'archived'})
    testapp.patch_json(file_fastq_2['@id'], {'platform': platform2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'inconsistent platforms'
               for error in collect_audit_errors(res))


def test_audit_experiment_internal_tag(testapp, base_experiment,
                                       base_biosample,
                                       library_1,
                                       replicate_1_1):
    testapp.patch_json(base_biosample['@id'], {'internal_tags': ['ENTEx']})
    testapp.patch_json(library_1['@id'], {'biosample': base_biosample['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent internal tags'
               for error in collect_audit_errors(res))


def test_audit_experiment_internal_tags(testapp, base_experiment,
                                        biosample_1,
                                        biosample_2,
                                        library_1,
                                        library_2,
                                        replicate_1_1,
                                        replicate_1_2):
    testapp.patch_json(biosample_1['@id'], {'internal_tags': ['ENTEx']})
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(biosample_2['@id'], {'internal_tags': ['ENTEx', 'SESCC']})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
    testapp.patch_json(replicate_1_2['@id'], {'library': library_2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent internal tags'
               for error in collect_audit_errors(res))


def test_audit_experiment_internal_tags2(testapp, base_experiment,
                                         biosample_1,
                                         biosample_2,
                                         library_1,
                                         library_2,
                                         replicate_1_1,
                                         replicate_1_2):
    testapp.patch_json(biosample_1['@id'], {'internal_tags': ['ENTEx']})
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
    testapp.patch_json(replicate_1_2['@id'], {'library': library_2['@id']})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'inconsistent internal tags'
               for error in collect_audit_errors(res))


def test_audit_experiment_mismatched_inter_paired_sequencing_files(testapp,
                                                                   base_experiment,
                                                                   replicate_1_1,
                                                                   replicate_2_1,
                                                                   library_1,
                                                                   library_2,
                                                                   biosample_1,
                                                                   biosample_2,
                                                                   mouse_donor_1_6,
                                                                   mouse_donor_2,
                                                                   file_fastq_6,
                                                                   file_fastq_4):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'mixed run types'
               for error in collect_audit_errors(res))


def test_audit_experiment_DNase_mismatched_inter_paired_sequencing_files(testapp,
                                                                         base_experiment,
                                                                         replicate_1_1,
                                                                         replicate_2_1,
                                                                         library_1,
                                                                         library_2,
                                                                         biosample_1,
                                                                         biosample_2,
                                                                         mouse_donor_1_6,
                                                                         mouse_donor_2,
                                                                         file_fastq_6,
                                                                         file_fastq_4):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'DNase-seq'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'mixed run types'
               for error in collect_audit_errors(res))


def test_audit_experiment_mismatched_inter_length_sequencing_files(testapp,
                                                                   base_experiment,
                                                                   replicate_1_1,
                                                                   replicate_2_1,
                                                                   library_1,
                                                                   library_2,
                                                                   biosample_1,
                                                                   biosample_2,
                                                                   mouse_donor_1_6,
                                                                   mouse_donor_2,
                                                                   file_fastq_3,
                                                                   file_fastq_4,
                                                                   file_fastq_5):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
    testapp.patch_json(file_fastq_3['@id'], {'read_length': 50})
    testapp.patch_json(file_fastq_5['@id'], {'read_length': 150})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'mixed read lengths'
               for error in collect_audit_errors(res))
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'eCLIP'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'mixed read lengths'
               for error in collect_audit_errors(res))
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'Mint-ChIP-seq'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'mixed read lengths'
               for error in collect_audit_errors(res))


def test_audit_experiment_mismatched_valid_inter_length_sequencing_files(testapp,
                                                                         base_experiment,
                                                                         replicate_1_1,
                                                                         replicate_2_1,
                                                                         library_1,
                                                                         library_2,
                                                                         biosample_1,
                                                                         biosample_2,
                                                                         mouse_donor_1_6,
                                                                         mouse_donor_2,
                                                                         file_fastq_3,
                                                                         file_fastq_4,
                                                                         file_fastq_5):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
    testapp.patch_json(file_fastq_3['@id'], {'read_length': 50})
    testapp.patch_json(file_fastq_5['@id'], {'read_length': 52})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'mixed read lengths'
               for error in collect_audit_errors(res))


def test_audit_experiment_DNase_mismatched_valid_inter_length_sequencing_files(
        testapp, base_experiment,
        replicate_1_1, replicate_2_1,
        library_1, library_2,
        biosample_1, biosample_2,
        mouse_donor_1_6, mouse_donor_2,
        file_fastq_3, file_fastq_4,
        file_fastq_5):
    testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'DNase-seq'})
    testapp.patch_json(file_fastq_4['@id'], {'read_length': 27})
    testapp.patch_json(file_fastq_3['@id'], {'read_length': 27})
    testapp.patch_json(file_fastq_5['@id'], {'read_length': 36})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert all(error['category'] != 'mixed read lengths'
               for error in collect_audit_errors(res))


def test_audit_experiment_long_rna_standards_crispr(testapp,
                                                    base_experiment,
                                                    replicate_1_1,
                                                    replicate_2_1,
                                                    library_1,
                                                    library_2,
                                                    biosample_1,
                                                    biosample_2,
                                                    mouse_donor_1_6,
                                                    file_fastq_3,
                                                    file_fastq_4,
                                                    file_bam_1_1,
                                                    file_bam_2_1,
                                                    file_tsv_1_2,
                                                    mad_quality_metric_1_2,
                                                    bam_quality_metric_1_1,
                                                    bam_quality_metric_2_1,
                                                    analysis_step_run_bam,
                                                    analysis_step_version_bam,
                                                    analysis_step_bam,
                                                    pipeline_bam):
    testapp.patch_json(file_fastq_3['@id'], {'read_length': 20})
    testapp.patch_json(file_fastq_4['@id'], {'read_length': 100})
    testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
                                             'assembly': 'mm10'})
    testapp.patch_json(file_bam_2_1['@id'], {'step_run': analysis_step_run_bam['@id'],
                                             'assembly': 'mm10'})
    testapp.patch_json(pipeline_bam['@id'],
                       {'title': 'RNA-seq of long RNAs (paired-end, stranded)'})
    testapp.patch_json(bam_quality_metric_1_1['@id'], {'Uniquely mapped reads number': 5000000})
    testapp.patch_json(bam_quality_metric_2_1['@id'], {'Uniquely mapped reads number': 10000000})
    testapp.patch_json(bam_quality_metric_1_1['@id'],
                       {'Number of reads mapped to multiple loci': 10})
    testapp.patch_json(bam_quality_metric_2_1['@id'],
                       {'Number of reads mapped to multiple loci': 100})
    testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id']})
    testapp.patch_json(biosample_1['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_2['@id'], {'organism': '/organisms/mouse/'})
    testapp.patch_json(biosample_1['@id'], {'model_organism_sex': 'mixed'})
    testapp.patch_json(biosample_2['@id'], {'model_organism_sex': 'mixed'})
    testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id'],
                                          'size_range': '>200'})
    testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id'],
                                          'size_range': '>200'})
    testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
    testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
    testapp.patch_json(base_experiment['@id'],
                       {'status': 'released',
                        'date_released': '2016-01-01',
                        'assay_term_name': 'CRISPR genome editing followed by RNA-seq'})
    res = testapp.get(base_experiment['@id'] + '@@index-data')
    assert any(error['category'] == 'missing spikeins' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_standards_control_read_depth_encode4(testapp,
experiment_chip_control,
experiment_chip_H3K27me3,
experiment_mint_chip,
file_bam_1_chip,
file_bam_2_chip,
file_tsv_1_2,
file_bam_control_chip,
chip_alignment_quality_metric_insufficient_read_depth,
chip_alignment_quality_metric_extremely_low_read_depth,
analysis_step_run_chip_encode4,
analysis_step_version_chip_encode4,
analysis_step_chip_encode4,
pipeline_chip_encode4,
replicate_1_mint_chip,
file_fastq_1_chip):
testapp.patch_json(chip_alignment_quality_metric_extremely_low_read_depth['@id'], {'quality_metric_of': [file_bam_control_chip['@id']]})
testapp.patch_json(file_bam_control_chip['@id'], {'step_run': analysis_step_run_chip_encode4['@id']})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_control_chip['@id'], file_bam_1_chip['@id']],
'dataset': experiment_chip_H3K27me3['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'step_run': analysis_step_run_chip_encode4['uuid'],
'output_type': 'peaks and background as input for IDR'})
testapp.patch_json(experiment_chip_H3K27me3['@id'], {'status': 'submitted',
'date_submitted': '2015-01-01',
'possible_controls': [experiment_chip_control['@id']]})
res = testapp.get(experiment_chip_H3K27me3['@id'] + '@@index-data')
assert any(error['category'] ==
'control extremely low read depth' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_1_chip['@id'], {
'dataset': experiment_mint_chip['@id'],
'replicate': replicate_1_mint_chip['@id']
})
testapp.patch_json(file_bam_1_chip['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_tsv_1_2['@id'], {'dataset': experiment_mint_chip['@id']})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'control extremely low read depth' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_standards_peak_but_no_qc_encode4(testapp,
experiment_chip_control,
experiment_chip_H3K27me3,
experiment_mint_chip,
file_fastq_control_chip,
file_fastq_1_chip,
file_bam_1_chip,
file_tsv_1_2,
chip_alignment_quality_metric_extremely_low_read_depth,
file_bam_control_chip,
analysis_step_run_chip_encode4,
analysis_step_version_chip_encode4,
analysis_step_chip_encode4,
pipeline_chip_encode4,
replicate_1_mint_chip):
testapp.patch_json(chip_alignment_quality_metric_extremely_low_read_depth['@id'], {'quality_metric_of': [file_bam_1_chip['@id']]})
testapp.patch_json(file_fastq_control_chip['@id'], {'dataset': experiment_chip_control['@id']})
testapp.patch_json(file_fastq_1_chip['@id'], {'controlled_by': [file_fastq_control_chip['@id']],
'dataset': experiment_chip_H3K27me3['@id']})
testapp.patch_json(file_bam_1_chip['@id'], {'step_run': analysis_step_run_chip_encode4['@id'],
'dataset': experiment_chip_H3K27me3['@id'],
'derived_from': [file_fastq_1_chip['@id']]})
testapp.patch_json(file_bam_control_chip['@id'], {'step_run': analysis_step_run_chip_encode4['@id'],
'dataset': experiment_chip_control['@id'],
'derived_from': [file_fastq_control_chip['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_control_chip['@id'], file_bam_1_chip['@id']],
'dataset': experiment_chip_H3K27me3['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'step_run': analysis_step_run_chip_encode4['uuid'],
'output_type': 'peaks and background as input for IDR'})
testapp.patch_json(experiment_chip_H3K27me3['@id'], {'possible_controls': [experiment_chip_control['@id']]})
res = testapp.get(experiment_chip_H3K27me3['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control quality metric' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_1_chip['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_bam_1_chip['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_tsv_1_2['@id'], {'dataset': experiment_mint_chip['@id']})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control quality metric' for error in collect_audit_errors(res))
def test_audit_experiment_missing_control_alignment_chip_encode4(testapp,
experiment_chip_control,
experiment_chip_H3K27me3,
experiment_mint_chip,
file_fastq_control_chip,
file_fastq_1_chip,
file_bam_1_chip,
file_tsv_1_2,
chip_alignment_quality_metric_extremely_low_read_depth,
file_bam_control_chip,
analysis_step_run_chip_encode4,
analysis_step_version_chip_encode4,
analysis_step_chip_encode4,
pipeline_chip_encode4,
replicate_1_mint_chip):
testapp.patch_json(chip_alignment_quality_metric_extremely_low_read_depth['@id'], {'quality_metric_of': [file_bam_1_chip['@id']]})
testapp.patch_json(file_fastq_control_chip['@id'], {'dataset': experiment_chip_control['@id']})
testapp.patch_json(file_fastq_1_chip['@id'], {'controlled_by': [file_fastq_control_chip['@id']],
'dataset': experiment_chip_H3K27me3['@id']})
testapp.patch_json(file_bam_1_chip['@id'], {'step_run': analysis_step_run_chip_encode4['@id'],
'dataset': experiment_chip_H3K27me3['@id'],
'derived_from': [file_fastq_1_chip['@id']]})
testapp.patch_json(file_bam_control_chip['@id'], {'step_run': analysis_step_run_chip_encode4['@id'],
'dataset': experiment_chip_control['@id'],
'derived_from': [file_fastq_control_chip['@id']],
'status': 'revoked'})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_control_chip['@id'], file_bam_1_chip['@id']],
'dataset': experiment_chip_H3K27me3['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'step_run': analysis_step_run_chip_encode4['uuid'],
'output_type': 'peaks and background as input for IDR'})
testapp.patch_json(experiment_chip_H3K27me3['@id'], {'possible_controls': [experiment_chip_control['@id']],
'status': 'released',
'date_released': '2019-10-08'})
res = testapp.get(experiment_chip_H3K27me3['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control alignments' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_1_chip['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_bam_1_chip['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_tsv_1_2['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(experiment_mint_chip['@id'], {
'status': 'released',
'date_released': '2019-10-08'})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control alignments' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_control_standards(
testapp,
base_experiment,
experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
mouse_donor_1_6,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2,
mad_quality_metric_1_2,
chip_seq_quality_metric,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam,
target_H3K9me3):
testapp.patch_json(chip_seq_quality_metric['@id'], {'quality_metric_of': [file_bam_2_1['@id']],
'processing_stage': 'filtered',
'total': 1000,
'mapped': 1000,
'read1': 100, 'read2': 100})
testapp.patch_json(file_fastq_3['@id'], {'read_length': 20,
'dataset': base_experiment['@id'],
'controlled_by': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 100,
'dataset': experiment['@id']})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': base_experiment['@id'],
'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': experiment['@id'],
'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_2_1['@id'],
file_bam_1_1['@id']],
'dataset': base_experiment['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks'})
testapp.patch_json(pipeline_bam['@id'], {'title':
'ChIP-seq read mapping'})
testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(experiment['@id'], {'control_type': 'input library',
'status': 'released',
'date_released': '2016-01-01',
'assay_term_name': 'ChIP-seq'})
testapp.patch_json(base_experiment['@id'], {'target': target_H3K9me3['@id'],
'status': 'submitted',
'date_submitted': '2015-01-01',
'possible_controls': [experiment['@id']],
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'control extremely low read depth' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_peaks_without_controls(
testapp,
base_experiment,
experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
mouse_donor_1_6,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2,
mad_quality_metric_1_2,
chip_seq_quality_metric,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam,
target_H3K9me3):
testapp.patch_json(chip_seq_quality_metric['@id'], {'quality_metric_of': [file_bam_2_1['@id']],
'processing_stage': 'filtered',
'total': 1000,
'mapped': 1000,
'read1': 100, 'read2': 100})
testapp.patch_json(file_fastq_3['@id'], {'read_length': 20,
'dataset': base_experiment['@id'],
'controlled_by': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 100,
'dataset': experiment['@id']})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': base_experiment['@id'],
'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': experiment['@id'],
'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_1_1['@id']],
'dataset': base_experiment['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks'})
testapp.patch_json(pipeline_bam['@id'], {'title':
'ChIP-seq read mapping'})
testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(experiment['@id'], {'control_type': 'control',
'status': 'released',
'date_released': '2016-01-01',
'assay_term_name': 'ChIP-seq'})
testapp.patch_json(base_experiment['@id'], {'target': target_H3K9me3['@id'],
'status': 'submitted',
'date_submitted': '2015-01-01',
'possible_controls': [experiment['@id']],
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control alignments' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_peaks_with_controls_but_no_qc(
testapp,
base_experiment,
experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
mouse_donor_1_6,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2,
mad_quality_metric_1_2,
chip_seq_quality_metric,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam,
target_H3K9me3):
testapp.patch_json(chip_seq_quality_metric['@id'], {'quality_metric_of': [file_bam_1_1['@id']],
'processing_stage': 'filtered',
'total': 1000,
'mapped': 1000,
'read1': 100, 'read2': 100})
testapp.patch_json(file_fastq_3['@id'], {'read_length': 20,
'dataset': base_experiment['@id'],
'controlled_by': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 100,
'dataset': experiment['@id']})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': base_experiment['@id'],
'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': experiment['@id'],
'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_1_1['@id'], file_bam_2_1['@id']],
'dataset': base_experiment['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks and background as input for IDR'})
testapp.patch_json(pipeline_bam['@id'], {'title':
'ChIP-seq read mapping'})
testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(experiment['@id'], {'control_type': 'input library',
'status': 'released',
'date_released': '2016-01-01',
'assay_term_name': 'ChIP-seq'})
testapp.patch_json(base_experiment['@id'], {'target': target_H3K9me3['@id'],
'status': 'submitted',
'date_submitted': '2015-01-01',
'possible_controls': [experiment['@id']],
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control quality metric' for error in collect_audit_errors(res))
testapp.patch_json(chip_seq_quality_metric['@id'], {'quality_metric_of': [
file_bam_1_1['@id'],
file_bam_2_1['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] !=
'missing control quality metric' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_peaks_with_subsampled_controls(
testapp,
base_experiment,
experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
mouse_donor_1_6,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2,
mad_quality_metric_1_2,
chip_seq_quality_metric,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam,
target_H3K9me3):
testapp.patch_json(analysis_step_bam['@id'], {'title': 'Alignment pooliing and subsampling step'})  # title intentionally misspelled; corrected below to clear the audit
testapp.patch_json(chip_seq_quality_metric['@id'], {'quality_metric_of': [file_bam_1_1['@id'], file_bam_2_1['@id']],
'processing_stage': 'filtered',
'total': 1002,
'mapped': 1002,
'read1': 100, 'read2': 100})
testapp.patch_json(file_fastq_3['@id'], {'read_length': 20,
'dataset': base_experiment['@id'],
'controlled_by': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 100,
'dataset': experiment['@id']})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': base_experiment['@id'],
'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'status': 'in progress',
'assembly': 'mm10', 'dataset': experiment['@id'],
'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_1_1['@id'], file_bam_2_1['@id']],
'dataset': base_experiment['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks and background as input for IDR'})
testapp.patch_json(pipeline_bam['@id'], {'title':
'Some subsampling pipeline'})
testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id'],
'organism': '/organisms/mouse/',
'model_organism_sex': 'mixed'})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(experiment['@id'], {'control_type': 'control',
'status': 'released',
'date_released': '2016-01-01',
'assay_term_name': 'ChIP-seq'})
testapp.patch_json(base_experiment['@id'], {'target': target_H3K9me3['@id'],
'status': 'submitted',
'date_submitted': '2015-01-01',
'possible_controls': [experiment['@id']],
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing control alignments' for error in collect_audit_errors(res))
testapp.patch_json(analysis_step_bam['@id'], {'title': 'Alignment pooling and subsampling step'})  # correctly spelled title: the audit should no longer fire
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] !=
'missing control alignments' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_no_target_standards(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
mouse_donor_1_6,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2,
mad_quality_metric_1_2,
chip_seq_quality_metric,
chipseq_filter_quality_metric,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam):
testapp.patch_json(chip_seq_quality_metric['@id'], {'quality_metric_of': [file_bam_1_1['@id']],
'processing_stage': 'unfiltered',
'total': 10000000,
'mapped': 10000000,
'read1': 100, 'read2': 100})
testapp.patch_json(file_fastq_3['@id'], {'read_length': 20})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 100})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'assembly': 'mm10',
'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'assembly': 'mm10',
'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(pipeline_bam['@id'], {'title':
'ChIP-seq read mapping'})
testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id']})
testapp.patch_json(biosample_2['@id'], {'donor': mouse_donor_1_6['@id']})
testapp.patch_json(biosample_1['@id'], {'organism': '/organisms/mouse/'})
testapp.patch_json(biosample_2['@id'], {'organism': '/organisms/mouse/'})
testapp.patch_json(biosample_1['@id'], {'model_organism_sex': 'mixed'})
testapp.patch_json(biosample_2['@id'], {'model_organism_sex': 'mixed'})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(base_experiment['@id'], {'status': 'released',
'date_released': '2016-01-01',
'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing target' for error in collect_audit_errors(res))
def test_audit_experiment_dnase_low_read_length(testapp,
base_experiment,
replicate_1_1,
library_1,
biosample_1,
mouse_donor_1_6,
file_fastq_3,
file_bam_1_1,
mad_quality_metric_1_2,
chip_seq_quality_metric,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam):
testapp.patch_json(file_fastq_3['@id'], {'read_length': 20})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id'],
'assembly': 'mm10',
'output_type': 'alignments',
'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(pipeline_bam['@id'], {'title':
'DNase-HS pipeline single-end - Version 2'})
testapp.patch_json(chip_seq_quality_metric['@id'], {'mapped': 23})
testapp.patch_json(biosample_1['@id'], {'donor': mouse_donor_1_6['@id']})
testapp.patch_json(biosample_1['@id'], {'organism': '/organisms/mouse/'})
testapp.patch_json(biosample_1['@id'], {'model_organism_sex': 'mixed'})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(base_experiment['@id'], {'status': 'released',
'date_released': '2016-01-01',
'assay_term_name': 'DNase-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'insufficient read length' for error in collect_audit_errors(res))
# duplication rate audit was removed from v54
def test_audit_experiment_out_of_date_analysis_added_fastq(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
experiment_mint_chip,
replicate_1_mint_chip):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
testapp.patch_json(file_fastq_4['@id'], {'replicate': replicate_1_1['@id']})
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_3['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'out of date analysis' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_4['@id'], {
'replicate': replicate_1_mint_chip['@id'],
'dataset': experiment_mint_chip['@id']
})
testapp.patch_json(file_fastq_3['@id'], {
'replicate': replicate_1_mint_chip['@id'],
'dataset': experiment_mint_chip['@id']
})
testapp.patch_json(file_bam_1_1['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_bam_2_1['@id'], {'dataset': experiment_mint_chip['@id']})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'out of date analysis' for error in collect_audit_errors(res))
def test_audit_experiment_out_of_date_analysis_removed_fastq(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
experiment_mint_chip):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_3['@id'], {'status': 'deleted'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'out of date analysis' for error in collect_audit_errors(res))
testapp.patch_json(file_bam_1_1['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_bam_2_1['@id'], {'dataset': experiment_mint_chip['@id']})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'out of date analysis' for error in collect_audit_errors(res))
def test_audit_experiment_not_out_of_date_analysis_DNase(testapp,
base_experiment,
replicate_1_1,
replicate_1_2,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'DNase-seq'})
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_3['@id'], {'replicate': replicate_1_1['@id']})
testapp.patch_json(file_fastq_4['@id'], {'replicate': replicate_1_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'out of date analysis' for error in collect_audit_errors(res))
def test_audit_experiment_out_of_date_analysis_DNase(testapp,
base_experiment,
replicate_1_1,
replicate_1_2,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'DNase-seq'})
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_fastq_3['@id'], {'replicate': replicate_1_1['@id'],
'status': 'deleted'})
testapp.patch_json(file_fastq_4['@id'], {'replicate': replicate_1_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'out of date analysis' for error in collect_audit_errors(res))
def test_audit_experiment_out_of_date_analysis_ENCODE4_DNase(
testapp,
base_experiment,
replicate_1_1,
replicate_1_2,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
analysis_step_run_dnase_encode4,
pipeline_dnase_encode4):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'DNase-seq'})
testapp.patch_json(file_bam_1_1['@id'], {
'derived_from': [file_fastq_3['@id'], file_fastq_4['@id']],
'step_run': analysis_step_run_dnase_encode4['@id']})
testapp.patch_json(file_fastq_3['@id'], {'replicate': replicate_1_1['@id']})
testapp.patch_json(file_fastq_4['@id'], {'replicate': replicate_1_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'out of date analysis' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_4['@id'], {'status': 'deleted'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'out of date analysis' for error in collect_audit_errors(res))
def test_audit_experiment_no_out_of_date_analysis(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1):
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_4['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] !=
'out of date analysis' for error in collect_audit_errors(res))
# def test_audit_experiment_modERN_control_missing_files() removed from v54
# def test_audit_experiment_modERN_experiment_missing_files() removed from v54
def test_audit_experiment_missing_genetic_modification(
testapp,
base_experiment,
base_target,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
tag_antibody,
biosample_1,
biosample_2,
donor_1,
donor_2,
k562):
testapp.patch_json(biosample_1['@id'], {'biosample_ontology': k562['uuid'],
'donor': donor_1['@id']})
testapp.patch_json(biosample_2['@id'], {'biosample_ontology': k562['uuid'],
'donor': donor_2['@id']})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(
replicate_1_1['@id'],
{'library': library_1['@id'], 'antibody': tag_antibody['@id']}
)
testapp.patch_json(
replicate_2_1['@id'],
{'library': library_2['@id'], 'antibody': tag_antibody['@id']}
)
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
'target': base_target['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'inconsistent genetic modification tags' for error in collect_audit_errors(res))
def test_audit_experiment_tagging_genetic_modification_characterization(
testapp,
construct_genetic_modification,
gm_characterization,
base_experiment,
base_target,
replicate_1_1,
library_1,
biosample_1,
donor_1,
k562):
testapp.patch_json(biosample_1['@id'], {'genetic_modifications': [construct_genetic_modification['@id']],
'biosample_ontology': k562['uuid'],
'donor': donor_1['@id']})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
'target': base_target['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing genetic modification characterization' for error in collect_audit_errors(res))
testapp.patch_json(gm_characterization['@id'], {'characterizes': construct_genetic_modification['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] !=
'missing genetic modification characterization' for error in collect_audit_errors(res))
def test_audit_experiment_tagging_biosample_characterization(
testapp,
construct_genetic_modification,
interference_genetic_modification,
biosample_characterization,
base_experiment,
base_target,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
donor_1,
k562,
award_encode4,
wrangler,
):
testapp.patch_json(biosample_1['@id'],
{'genetic_modifications': [interference_genetic_modification['@id']],
'biosample_ontology': k562['uuid'],
'donor': donor_1['@id']})
testapp.patch_json(biosample_2['@id'],
{'genetic_modifications': [interference_genetic_modification['@id']],
'biosample_ontology': k562['uuid'],
'donor': donor_1['@id']})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(base_experiment['@id'],
{'assay_term_name': 'ChIP-seq',
'award': award_encode4['@id'],
'target': base_target['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing biosample characterization'
for error in collect_audit_errors(res, ['WARNING']))
testapp.patch_json(biosample_1['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']]})
testapp.patch_json(biosample_2['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing biosample characterization'
for error in collect_audit_errors(res, ['ERROR']))
testapp.patch_json(biosample_characterization['@id'],
{'characterizes': biosample_1['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'missing biosample characterization'
for error in collect_audit_errors(res))
# Has characterization but hasn't been reviewed as compliant
assert any(
error['category'] == 'missing compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
# Characterization reviewed as compliant
testapp.patch_json(
biosample_characterization['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing compliant biosample characterization'
for error in collect_audit_errors(res)
)
# Characterization reviewed as not compliant
testapp.patch_json(
biosample_characterization['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'not compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'not compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
def test_audit_experiment_pooled_biosample_no_characterization(
testapp,
biosample_pooled_from_not_characterized_biosamples,
award_encode4,
base_experiment,
base_target,
base_replicate,
base_library,
):
testapp.patch_json(
base_experiment['@id'],
{
'assay_term_name': 'ChIP-seq',
'award': award_encode4['@id'],
'target': base_target['@id']
}
)
testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
testapp.patch_json(
base_library['@id'],
{'biosample': biosample_pooled_from_not_characterized_biosamples['@id']}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing biosample characterization'
for error in collect_audit_errors(res, error_types=['ERROR'])
)
def test_audit_experiment_pooled_biosample_partial_characterization(
testapp,
biosample_pooled_from_characterized_and_not_characterized_biosamples,
award_encode4,
base_experiment,
base_target,
base_replicate,
base_library,
):
testapp.patch_json(
base_experiment['@id'],
{
'assay_term_name': 'ChIP-seq',
'award': award_encode4['@id'],
'target': base_target['@id']
}
)
testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
testapp.patch_json(
base_library['@id'],
{'biosample': biosample_pooled_from_characterized_and_not_characterized_biosamples['@id']}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing biosample characterization'
for error in collect_audit_errors(res, error_types=['ERROR'])
)
def test_audit_experiment_pooled_biosample_characterization(
testapp,
biosample_pooled_from_characterized_biosamples,
award_encode4,
base_experiment,
base_target,
base_replicate,
base_library,
biosample_characterization,
biosample_characterization_no_review,
wrangler,
):
testapp.patch_json(
base_experiment['@id'],
{
'assay_term_name': 'ChIP-seq',
'award': award_encode4['@id'],
'target': base_target['@id']
}
)
testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
testapp.patch_json(
base_library['@id'],
{'biosample': biosample_pooled_from_characterized_biosamples['@id']}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing biosample characterization'
for error in collect_audit_errors(res)
)
# One compliant parent biosample
testapp.patch_json(
biosample_characterization['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
# One not compliant parent biosample
testapp.patch_json(
biosample_characterization_no_review['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'not compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'not compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
# Both parent biosamples are compliant
testapp.patch_json(
biosample_characterization_no_review['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] not in [
'missing biosample characterization',
'missing compliant biosample characterization',
'not compliant biosample characterization',
]
for error in collect_audit_errors(res, ['ERROR'])
)
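# The assertions in these tests repeatedly flatten the '@@index-data' audit
# payload and filter by severity. A minimal sketch of that pattern follows;
# the name _sketch_collect_audit_errors is hypothetical (the real
# collect_audit_errors helper is defined elsewhere in this test suite), and it
# takes the parsed JSON rather than the response object, as an assumption:
def _sketch_collect_audit_errors(res_json, error_types=None):
    # Audits are keyed by severity level ('ERROR', 'WARNING', ...), each
    # mapping to a list of detail dicts with a 'category' key.
    errors = []
    for level, details in res_json.get('audit', {}).items():
        if error_types is None or level in error_types:
            errors.extend(details)
    return errors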
@pytest.mark.parametrize(
'relationship',
[
'part_of',
'originated_from'
])
def test_biosample_characterization_parent_relationship(
testapp,
relationship,
construct_genetic_modification,
biosample_characterization,
base_experiment,
base_target,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2,
base_biosample,
donor_1,
k562,
award_encode4,
wrangler,
treatment_5
):
# Parent biosamples via part_of or originated_from can be checked for biosample
# characterizations if ontology, applied_modifications, and treatments match the child
testapp.patch_json(biosample_1['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']],
'biosample_ontology': k562['uuid'],
'donor': donor_1['@id'],
relationship: base_biosample['@id']})
testapp.patch_json(biosample_2['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']],
'biosample_ontology': k562['uuid'],
'donor': donor_1['@id'],
relationship: base_biosample['@id']})
testapp.patch_json(base_biosample['@id'],
{'biosample_ontology': k562['uuid'],
'genetic_modifications': [construct_genetic_modification['@id']]})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
testapp.patch_json(base_experiment['@id'],
{'assay_term_name': 'ChIP-seq',
'award': award_encode4['@id'],
'target': base_target['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing biosample characterization'
for error in collect_audit_errors(res, ['ERROR']))
# Parent biosample characterization not reviewed
testapp.patch_json(biosample_characterization['@id'],
{'characterizes': base_biosample['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'missing biosample characterization'
for error in collect_audit_errors(res))
assert any(
error['category'] == 'missing compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
# Parent has compliant characterization
testapp.patch_json(
biosample_characterization['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing compliant biosample characterization'
for error in collect_audit_errors(res)
)
# Parent has not compliant characterization
testapp.patch_json(
biosample_characterization['@id'],
{
'review': {
'lab': base_experiment['lab'],
'reviewed_by': wrangler['@id'],
'status': 'not compliant'
}
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'not compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
# If treatments or modifications differ between child and parent, parent won't be queried
testapp.patch_json(biosample_1['@id'],
{'treatments': [treatment_5['@id']]})
testapp.patch_json(biosample_2['@id'],
{'treatments': [treatment_5['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
assert all(
error['category'] != 'not compliant biosample characterization'
for error in collect_audit_errors(res)
)
# Adding the matching treatment to the parent means it is checked again
testapp.patch_json(base_biosample['@id'],
{'treatments': [treatment_5['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'not compliant biosample characterization'
for error in collect_audit_errors(res, ['ERROR'])
)
def test_audit_experiment_missing_unfiltered_bams(testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
file_fastq_3,
file_bam_1_1,
file_bam_2_1,
analysis_step_run_bam,
analysis_step_version_bam,
analysis_step_bam,
pipeline_bam):
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_3['@id']],
'assembly': 'hg19',
'output_type': 'unfiltered alignments'})
testapp.patch_json(file_bam_1_1['@id'], {'step_run': analysis_step_run_bam['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing unfiltered alignments' for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'Mint-ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'missing unfiltered alignments' for error in collect_audit_errors(res))
def test_audit_experiment_wrong_modification(
testapp,
base_experiment,
base_target,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
tag_antibody,
biosample_1,
biosample_2,
donor_1,
donor_2,
construct_genetic_modification,
k562):
testapp.patch_json(construct_genetic_modification['@id'],
{'modified_site_by_target_id': base_target['@id'],
'introduced_tags': [{'name': 'FLAG', 'location': 'internal'}]})
testapp.patch_json(biosample_1['@id'], {'biosample_ontology': k562['uuid'],
'donor': donor_1['@id']})
testapp.patch_json(biosample_2['@id'], {'biosample_ontology': k562['uuid'],
'donor': donor_2['@id']})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(
replicate_1_1['@id'],
{'library': library_1['@id'], 'antibody': tag_antibody['@id']}
)
testapp.patch_json(
replicate_2_1['@id'],
{'library': library_2['@id'], 'antibody': tag_antibody['@id']}
)
testapp.patch_json(biosample_1['@id'], {'genetic_modifications': [construct_genetic_modification['@id']]})
testapp.patch_json(biosample_2['@id'], {'genetic_modifications': [construct_genetic_modification['@id']]})
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
'target': base_target['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'inconsistent genetic modification tags' for error in collect_audit_errors(res))
testapp.patch_json(construct_genetic_modification['@id'],
{'introduced_tags': [{'name': 'eGFP', 'location': 'internal'}]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'inconsistent genetic modification tags'
for error in collect_audit_errors(res)
)
def test_audit_experiment_chip_seq_mapped_read_length(testapp,
base_experiment,
experiment_mint_chip,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2):
testapp.patch_json(file_fastq_3['@id'], {'read_length': 100})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 130})
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_2_1['@id'],
file_bam_1_1['@id']],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks'})
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'inconsistent mapped reads lengths' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_3['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_fastq_4['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_bam_1_1['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_bam_2_1['@id'], {'dataset': experiment_mint_chip['@id']})
testapp.patch_json(file_tsv_1_2['@id'], {'dataset': experiment_mint_chip['@id']})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'inconsistent mapped reads lengths' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_consistent_mapped_read_length(
testapp,
base_experiment,
file_fastq_3,
file_fastq_4,
file_bam_1_1,
file_bam_2_1,
file_tsv_1_2):
testapp.patch_json(file_fastq_3['@id'], {'read_length': 124})
testapp.patch_json(file_fastq_4['@id'], {'read_length': 130})
testapp.patch_json(file_bam_1_1['@id'], {'derived_from': [file_fastq_3['@id']]})
testapp.patch_json(file_bam_2_1['@id'], {'derived_from': [file_fastq_4['@id']]})
testapp.patch_json(file_tsv_1_2['@id'], {'derived_from': [file_bam_2_1['@id'],
file_bam_1_1['@id']],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks'})
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] !=
'inconsistent mapped reads lengths' for error in collect_audit_errors(res))
def test_audit_experiment_chip_seq_read_count(
testapp,
base_experiment,
experiment_mint_chip,
file_fastq_3,
file_fastq_4,
replicate_1_mint_chip):
testapp.patch_json(file_fastq_3['@id'], {'read_count': 124})
testapp.patch_json(file_fastq_4['@id'], {'read_count': 134})
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] ==
'low read count' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_3['@id'], {'read_count': 100000000})
testapp.patch_json(file_fastq_4['@id'], {'read_count': 100000000})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] !=
'low read count' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_4['@id'], {
'replicate': replicate_1_mint_chip['@id'],
'dataset': experiment_mint_chip['@id']
})
testapp.patch_json(file_fastq_3['@id'], {
'replicate': replicate_1_mint_chip['@id'],
'dataset': experiment_mint_chip['@id']
})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert all(error['category'] !=
'low read count' for error in collect_audit_errors(res))
testapp.patch_json(file_fastq_3['@id'], {'read_count': 124})
testapp.patch_json(file_fastq_4['@id'], {'read_count': 134})
res = testapp.get(experiment_mint_chip['@id'] + '@@index-data')
assert any(error['category'] ==
'low read count' for error in collect_audit_errors(res))
def test_audit_experiment_with_biosample_missing_nih_consent(testapp, experiment, replicate_url,
library_url, biosample, encode4_award):
testapp.patch_json(experiment['@id'], {'award': encode4_award['@id']})
r = testapp.get(experiment['@id'] + '@@index-data')
audits = r.json['audit']
assert any(
[
detail['category'] == 'missing nih_institutional_certification'
for audit in audits.values() for detail in audit
]
)
def test_audit_experiment_with_biosample_not_missing_nih_consent(testapp, experiment, replicate,
library, biosample, encode4_award):
testapp.patch_json(experiment['@id'], {'award': encode4_award['@id']})
testapp.patch_json(biosample['@id'], {'nih_institutional_certification': 'NICABC123'})
r = testapp.get(experiment['@id'] + '@@index-data')
audits = r.json['audit']
assert all(
[
detail['category'] != 'missing nih_institutional_certification'
for audit in audits.values() for detail in audit
]
)
def test_audit_fcc_experiment_nih_consent(
testapp,
experiment,
replicate,
library,
biosample,
encode4_award,
):
testapp.patch_json(encode4_award['@id'], {'component': 'functional characterization'})
testapp.patch_json(experiment['@id'], {'award': encode4_award['@id']})
r = testapp.get(experiment['@id'] + '@@index-data')
audits = r.json['audit']
assert not any(
[
detail['category'] == 'missing nih_institutional_certification'
for audit in audits.values() for detail in audit
]
)
def test_audit_experiment_computational_award_nih_consent(testapp, experiment, encode4_award):
testapp.patch_json(encode4_award['@id'], {'component': 'computational analysis'})
testapp.patch_json(experiment['@id'], {'award': encode4_award['@id']})
r = testapp.get(experiment['@id'] + '@@index-data')
audits = r.json['audit']
assert not any(
[
detail['category'] == 'missing nih_institutional_certification'
for audit in audits.values() for detail in audit
]
)
def test_is_matching_biosample_control(testapp, biosample, ctrl_experiment):
from encoded.audit.experiment import is_matching_biosample_control
exp = testapp.get(ctrl_experiment['@id'] + '@@index-data')
exp_embedded = exp.json['embedded']
bio = testapp.get(biosample['@id'] + '@@index-data')
bio_embedded = bio.json['embedded']
assert not is_matching_biosample_control(exp_embedded, bio_embedded['biosample_ontology']['term_id'])
testapp.patch_json(biosample['@id'], {'biosample_ontology': ctrl_experiment['biosample_ontology']})
bio = testapp.get(biosample['@id'] + '@@index-data')
bio_embedded = bio.json['embedded']
assert is_matching_biosample_control(exp_embedded, bio_embedded['biosample_ontology']['term_id'])
def test_audit_experiment_histone_characterized_no_primary(testapp,
base_experiment,
wrangler,
base_antibody,
base_replicate,
base_library,
base_biosample,
target_H3K9me3,
mouse_H3K9me3,
base_antibody_characterization2,
mouse,
mel):
# The supporting antibody only has secondary characterizations
testapp.patch_json(base_biosample['@id'], {'organism': mouse['@id']})
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'ChIP-seq',
'biosample_ontology': mel['uuid'],
'target': mouse_H3K9me3['@id']})
base_antibody['targets'] = [mouse_H3K9me3['@id']]
no_primary_antibody = testapp.post_json('/antibody_lot', base_antibody).json['@graph'][0]
testapp.patch_json(base_replicate['@id'], {'antibody': no_primary_antibody['@id'],
'library': base_library['@id'],
'experiment': base_experiment['@id']})
testapp.patch_json(
base_antibody_characterization2['@id'],
{'target': mouse_H3K9me3['@id'],
'characterizes': no_primary_antibody['@id'],
'status': 'not compliant',
'reviewed_by': wrangler['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'antibody not characterized to standard'
for error in collect_audit_errors(res))
def test_audit_experiment_tag_target(testapp, experiment, ctcf):
tag_target = testapp.post_json(
'/target',
{
'genes': [ctcf['uuid']],
'modifications': [{'modification': 'eGFP'}],
'label': 'eGFP-CTCF',
'investigated_as': ['other context']
}
).json['@graph'][0]
testapp.patch_json(experiment['@id'], {'assay_term_name': 'ChIP-seq',
'target': tag_target['@id']})
audits = testapp.get(experiment['@id'] + '@@index-data').json['audit']
assert any(detail['category'] == 'inconsistent experiment target'
for audit in audits.values() for detail in audit)
def test_audit_experiment_inconsist_mod_target(testapp, experiment,
library_url, replicate_url, biosample, ctcf,
construct_genetic_modification):
tag_target = testapp.post_json(
'/target',
{
'genes': [ctcf['uuid']],
'modifications': [{'modification': 'eGFP'}],
'label': 'eGFP-CTCF',
'investigated_as': ['other context']
}
).json['@graph'][0]
testapp.patch_json(
biosample['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']]}
)
testapp.patch_json(experiment['@id'], {'assay_term_name': 'ChIP-seq',
'target': tag_target['@id']})
audits = testapp.get(experiment['@id'] + '@@index-data').json['audit']
assert any(detail['category'] == 'inconsistent genetic modification targets'
for audit in audits.values() for detail in audit)
def test_audit_experiment_chip_seq_control_target_failures(
testapp,
base_experiment,
experiment,
file_fastq_3,
file_bam_1_1,
file_tsv_1_2,
analysis_step_run_bam,
pipeline_bam,
target_H3K9me3,
):
testapp.patch_json(
base_experiment['@id'],
{
'target': target_H3K9me3['@id'],
'possible_controls': [experiment['@id']],
'assay_term_name': 'ChIP-seq',
}
)
testapp.patch_json(
file_tsv_1_2['@id'],
{
'derived_from': [file_bam_1_1['@id']],
'dataset': base_experiment['@id'],
'file_format_type': 'narrowPeak',
'file_format': 'bed',
'output_type': 'peaks',
}
)
testapp.patch_json(
file_bam_1_1['@id'],
{
'step_run': analysis_step_run_bam['@id'],
'dataset': experiment['@id'],
'derived_from': [file_fastq_3['@id']]
}
)
testapp.patch_json(
experiment['@id'],
{
'target': target_H3K9me3['@id'],
'assay_term_name': 'ChIP-seq'
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(
error['category'] == 'missing control_type of control experiment'
for error in collect_audit_errors(res)
)
assert all(
error['category'] != 'improper control_type of control experiment'
for error in collect_audit_errors(res)
)
assert any(
error['category'] == 'unexpected target of control experiment'
for error in collect_audit_errors(res)
)
testapp.patch_json(
experiment['@id'],
{
'control_type': 'control',
'assay_term_name': 'ChIP-seq'
}
)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'missing control_type of control experiment'
for error in collect_audit_errors(res)
)
assert any(
error['category'] == 'improper control_type of control experiment'
for error in collect_audit_errors(res)
)
assert any(
error['category'] == 'unexpected target of control experiment'
for error in collect_audit_errors(res)
)
ctrl_exp = testapp.get(experiment['@id'] + '@@edit').json
ctrl_exp.pop('target')
testapp.put_json(experiment['@id'], ctrl_exp)
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(
error['category'] != 'unexpected target of control experiment'
for error in collect_audit_errors(res)
)
def test_audit_experiment_missing_queried_RNP_size_range(
testapp,
base_experiment,
replicate_1_1,
library_1
):
testapp.patch_json(base_experiment['@id'], {
'assay_term_name': 'eCLIP'
})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'missing queried_RNP_size_range'
for error in collect_audit_errors(res))
def test_audit_experiment_mixed_queried_RNP_size_range(
testapp,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2
):
testapp.patch_json(base_experiment['@id'], {
'assay_term_name': 'eCLIP'
})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {
'library': library_2['@id'],
'experiment': base_experiment['@id']
})
testapp.patch_json(library_1['@id'], {'queried_RNP_size_range': '150-200'})
testapp.patch_json(library_2['@id'], {'queried_RNP_size_range': '200-400'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'mixed queried_RNP_size_range'
for error in collect_audit_errors(res))
def test_audit_experiment_inconsistent_queried_RNP_size_range(
testapp,
base_experiment,
experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2
):
testapp.patch_json(base_experiment['@id'], {
'assay_term_name': 'eCLIP',
'possible_controls': [experiment['@id']]
})
testapp.patch_json(experiment['@id'], {'assay_term_name': 'eCLIP'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {
'library': library_2['@id'],
'experiment': experiment['@id']
})
testapp.patch_json(library_1['@id'], {'queried_RNP_size_range': '150-200'})
testapp.patch_json(library_2['@id'], {'queried_RNP_size_range': '200-400'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent queried_RNP_size_range'
for error in collect_audit_errors(res))
def test_audit_experiment_lacking_processed_data(
testapp,
base_experiment,
experiment,
file_fastq,
file_bam
):
testapp.patch_json(file_fastq['@id'], {
'dataset': base_experiment['@id'],
})
testapp.patch_json(file_bam['@id'], {
'dataset': base_experiment['@id'],
})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(warning['category'] != 'lacking processed data'
for warning in collect_audit_errors(res))
testapp.patch_json(file_bam['@id'], {
'dataset': experiment['@id']
})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(warning['category'] == 'lacking processed data'
for warning in collect_audit_errors(res))
testapp.patch_json(file_fastq['@id'], {
'dataset': experiment['@id']
})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(warning['category'] == 'lacking processed data'
for warning in collect_audit_errors(res))
def test_audit_experiment_control(testapp, base_matched_set, ChIP_experiment, experiment, base_experiment):
testapp.patch_json(base_matched_set['@id'], {'related_datasets': [experiment['@id'],
base_experiment['@id']]})
res = testapp.get(ChIP_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent control' for error in collect_audit_errors(res))
testapp.patch_json(base_matched_set['@id'], {'related_datasets': [experiment['@id']]})
res = testapp.get(ChIP_experiment['@id'] + '@@index-data')
assert not any(error['category'] == 'inconsistent control' for error in collect_audit_errors(res))
def test_audit_experiment_inconsistent_analysis_files(
testapp,
experiment_with_analysis,
experiment_with_analysis_2,
analysis_1,
analysis_2,
analysis_released,
file_bam_1_1,
file_bam_2_1,
bigWig_file,
bam_file
):
# No inconsistencies, all files in analyses and all analyses associated with dataset
testapp.patch_json(file_bam_1_1['@id'], {
'dataset': experiment_with_analysis['@id'],
})
testapp.patch_json(file_bam_2_1['@id'], {
'dataset': experiment_with_analysis['@id'],
})
testapp.patch_json(bam_file['@id'], {
'dataset': experiment_with_analysis['@id'],
})
testapp.patch_json(experiment_with_analysis['@id'], {
'analyses': [analysis_1['@id'], analysis_2['@id'], analysis_released['@id']]
})
res = testapp.get(experiment_with_analysis['@id'] + '@@index-data')
assert not any(error['category'] == 'inconsistent analysis files' for error in collect_audit_errors(res))
# Processed file not in any analysis
testapp.patch_json(bigWig_file['@id'], {
'dataset': experiment_with_analysis['@id'],
})
res = testapp.get(experiment_with_analysis['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent analysis files' for error in collect_audit_errors(res))
# Files in analysis belonging to a different dataset
testapp.patch_json(file_bam_1_1['@id'], {
'dataset': experiment_with_analysis_2['@id'],
})
testapp.patch_json(file_bam_2_1['@id'], {
'dataset': experiment_with_analysis_2['@id'],
})
res = testapp.get(experiment_with_analysis_2['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent analysis files' for error in collect_audit_errors(res))
# Deleted files are excluded from processed data
testapp.patch_json(file_bam_1_1['@id'], {
'status': 'deleted'
})
testapp.patch_json(file_bam_2_1['@id'], {
'status': 'deleted'
})
testapp.patch_json(bam_file['@id'], {
'dataset': experiment_with_analysis_2['@id'],
'status': 'deleted'
})
res = testapp.get(experiment_with_analysis_2['@id'] + '@@index-data')
assert not any(error['category'] == 'inconsistent analysis files' for error in collect_audit_errors(res))
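# The final step above relies on the audit treating deleted files as absent
# when comparing processed files against analyses. A sketch of that filtering
# step (illustrative only -- the function name and shape are assumptions, not
# the audit's real implementation):
def _sketch_active_files(files):
    # Keep only files whose status has not been set to 'deleted'.
    return [f for f in files if f.get('status') != 'deleted']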
def test_audit_experiment_inconsistent_genetic_modifications(
testapp,
construct_genetic_modification,
interference_genetic_modification,
base_experiment,
replicate_1_1,
replicate_2_1,
library_1,
library_2,
biosample_1,
biosample_2):
# one biosample with genetic modifications and one biosample without genetic modifications
testapp.patch_json(biosample_1['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']]})
testapp.patch_json(library_1['@id'], {'biosample': biosample_1['@id']})
testapp.patch_json(library_2['@id'], {'biosample': biosample_2['@id']})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent genetic modifications' for error in collect_audit_errors(res))
# biosamples with the same genetic modifications
testapp.patch_json(biosample_2['@id'],
{'genetic_modifications': [construct_genetic_modification['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert not any(error['category'] == 'inconsistent genetic modifications' for error in collect_audit_errors(res))
# biosamples with different genetic modifications
testapp.patch_json(biosample_2['@id'],
{'genetic_modifications': [interference_genetic_modification['@id']]})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent genetic modifications' for error in collect_audit_errors(res))
def test_audit_experiment_average_fragment_size(testapp, base_experiment, base_replicate, base_library):
# average_fragment_size may stand in for size_range; behavior should match
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'RNA-seq'})
testapp.patch_json(base_library['@id'], {'average_fragment_size': 220})
testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
res_errors = collect_audit_errors(res)
assert any(error['category'] == 'missing spikeins'
for error in res_errors)
assert all(error['category'] != 'missing RNA fragment size' for error in res_errors)
def test_audit_experiment_mixed_strand_specificity_libraries(
testapp, base_experiment, replicate_1_1, replicate_2_1,
library_1, library_2
):
# https://encodedcc.atlassian.net/browse/ENCD-5554
testapp.patch_json(library_1['@id'], {'strand_specificity': 'reverse'})
testapp.patch_json(replicate_1_1['@id'], {'library': library_1['@id']})
testapp.patch_json(replicate_2_1['@id'], {'library': library_2['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'mixed strand specificities'
for error in collect_audit_errors(res))
testapp.patch_json(library_2['@id'], {'strand_specificity': 'strand-specific'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'mixed strand specificities'
for error in collect_audit_errors(res))
def test_audit_experiment_inconsistent_analysis_status(testapp, experiment_with_analysis,
analysis_released, analysis_released_2,
analysis_1, experiment_rna):
# https://encodedcc.atlassian.net/browse/ENCD-5705
# Released analysis objects are disallowed in non-released datasets
testapp.patch_json(experiment_with_analysis['@id'],
{"analyses": [analysis_released["@id"]]})
res = testapp.get(experiment_with_analysis['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent analysis status'
and 'not released' in error['detail']
for error in collect_audit_errors(res))
# Released datasets must have a released analysis
testapp.patch_json(
experiment_with_analysis['@id'], {'status': 'released', 'date_released': '2021-01-01'})
testapp.patch_json(
experiment_with_analysis['@id'], {"analyses": [analysis_1["@id"]]})
res = testapp.get(experiment_with_analysis['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent analysis status'
and 'lacks a released analysis' in error['detail']
for error in collect_audit_errors(res))
# Multiple released analyses in a dataset are disallowed
testapp.patch_json(
experiment_with_analysis['@id'], {
"analyses": [analysis_released["@id"], analysis_released_2["@id"]]})
res = testapp.get(experiment_with_analysis['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent analysis status'
and 'released analyses' in error['detail']
for error in collect_audit_errors(res))
# Datasets lacking a released analysis (no analyses at all) are flagged
res = testapp.get(experiment_rna['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent analysis status'
and 'lacks a released analysis' in error['detail']
for error in collect_audit_errors(res))
def test_audit_experiment_mixed_biosamples_replication_type(testapp, base_experiment, biosample_1,
biosample_2, base_replicate,
library_no_biosample):
# https://encodedcc.atlassian.net/browse/ENCD-5706
testapp.patch_json(library_no_biosample['@id'], {
'mixed_biosamples': [biosample_1['@id'], biosample_2['@id']]})
testapp.patch_json(base_replicate['@id'], {'library': library_no_biosample['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert all(error['category'] != 'undetermined replication_type'
for error in collect_audit_errors(res))
def test_audit_experiment_single_cell_libraries(testapp, base_experiment, base_replicate, base_library):
testapp.patch_json(base_library['@id'], {'barcode_details': [{'barcode': 'ATTTCGC'}]})
testapp.patch_json(base_replicate['@id'], {'library': base_library['@id']})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert any(error['category'] == 'inconsistent barcode details'
for error in collect_audit_errors(res))
testapp.patch_json(base_experiment['@id'], {'assay_term_name': 'single-cell RNA sequencing assay'})
res = testapp.get(base_experiment['@id'] + '@@index-data')
assert not any(error['category'] == 'inconsistent barcode details'
for error in collect_audit_errors(res))
# Source: rdr_service/alembic/versions/9a0873b51fe0_add_columns_to_gc_metrics_for_wgs.py
# Repository: all-of-us/raw-data-repository (BSD-3-Clause)
"""add columns to gc metrics for wgs
Revision ID: 9a0873b51fe0
Revises: 235693878327
Create Date: 2020-05-05 10:54:18.411657
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '9a0873b51fe0'
down_revision = '235693878327'
branch_labels = None
depends_on = None
def upgrade(engine_name):
globals()["upgrade_%s" % engine_name]()
def downgrade(engine_name):
globals()["downgrade_%s" % engine_name]()
def upgrade_rdr():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('genomic_gc_validation_metrics', sa.Column('crai_md5_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('crai_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('cram_md5_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('cram_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('hf_vcf_md5_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('hf_vcf_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('hf_vcf_tbi_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('raw_vcf_md5_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('raw_vcf_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('raw_vcf_tbi_received', sa.SmallInteger(), nullable=False))
op.add_column('genomic_gc_validation_metrics', sa.Column('sex_ploidy', sa.String(length=10), nullable=True))
# ### end Alembic commands ###
def downgrade_rdr():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('genomic_gc_validation_metrics', 'sex_ploidy')
op.drop_column('genomic_gc_validation_metrics', 'raw_vcf_tbi_received')
op.drop_column('genomic_gc_validation_metrics', 'raw_vcf_received')
op.drop_column('genomic_gc_validation_metrics', 'raw_vcf_md5_received')
op.drop_column('genomic_gc_validation_metrics', 'hf_vcf_tbi_received')
op.drop_column('genomic_gc_validation_metrics', 'hf_vcf_received')
op.drop_column('genomic_gc_validation_metrics', 'hf_vcf_md5_received')
op.drop_column('genomic_gc_validation_metrics', 'cram_received')
op.drop_column('genomic_gc_validation_metrics', 'cram_md5_received')
op.drop_column('genomic_gc_validation_metrics', 'crai_received')
op.drop_column('genomic_gc_validation_metrics', 'crai_md5_received')
# ### end Alembic commands ###
def upgrade_metrics():
# ### commands auto generated by Alembic - please adjust! ###
pass
# ### end Alembic commands ###
def downgrade_metrics():
# ### commands auto generated by Alembic - please adjust! ###
pass
# ### end Alembic commands ###
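The `upgrade`/`downgrade` wrappers above dispatch to a per-engine function by looking up `upgrade_<engine_name>` in the module namespace, which is how Alembic's multi-database template routes one migration file to several engines. A minimal, self-contained sketch of the same dispatch pattern, with dummy functions standing in for the real migration steps:

```python
applied = []

def upgrade_rdr():
    # Stand-in for the real op.add_column(...) calls.
    applied.append('rdr')

def upgrade_metrics():
    # The metrics engine has no schema changes in this revision.
    applied.append('metrics')

def upgrade(engine_name):
    # Look up upgrade_<engine_name> in the module globals and invoke it.
    globals()["upgrade_%s" % engine_name]()

upgrade('rdr')
upgrade('metrics')
assert applied == ['rdr', 'metrics']
```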
# -*- coding: utf-8 -*-
# Source: ops-tests/feature/test_classifierd_ft_acl_udp_traffic.py
# Repository: learnopx/ops-classifierd (Apache-2.0)
#
# Copyright (C) 2016 Hewlett Packard Enterprise Development LP
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
OpenSwitch Test for ACL operations with UDP traffic.
This file consists of the following test cases:
Test1 : acl_udp_any_any_permit
Test2 : acl_udp_any_any_deny
Test3 : acl_permit_udp_hs1_hs2
Test4 : acl_deny_udp_hs1_hs2
Test5 : acl_permit_udp_prefix_len_mask
Test6 : acl_deny_udp_prefix_len_mask
Test7 : acl_permit_udp_dotted_netmask
Test8 : acl_deny_udp_dotted_netmask
Test9 : acl_permit_udp_non_contiguous_mask
Test10: acl_deny_udp_non_contiguous_mask
Test11: acl_permit_udp_dport_eq_param
Test12: acl_deny_udp_dport_eq_param
Test13: acl_deny_udp_dport_eq_param
Test14: acl_deny_udp_dport_eq_param
Test15: acl_modify_after_sending_udp_traffic
Test16: acl_deny_udp_on_multiple_ports
"""
from pytest import mark
from re import findall
from re import search
from topology_lib_scapy.library import ScapyThread
from topology_lib_scapy.library import send_traffic
from topology_lib_scapy.library import sniff_traffic
from time import sleep
TOPOLOGY = """
# +-------+ +-------+
# | | +--------+ | |
# | hs1 <-----> ops1 <-----> hs2 |
# | | +--------+ | |
# +-------+ +-------+
# Nodes
# [image="fs-genericx86-64:latest" \
# type=openswitch name="OpenSwitch 1"] ops1
# [type=host name="Host 1" image="openswitch/ubuntuscapy:latest"] hs1
# [type=host name="Host 2" image="openswitch/ubuntuscapy:latest"] hs2
[type=openswitch name="Switch 1"] ops1
[type=host name="Host 1" image="Ubuntu"] hs1
[type=host name="Host 2" image="Ubuntu"] hs2
# Links
hs1:1 -- ops1:1
ops1:2 -- hs2:1
"""
filter_udp = 'udp and port 48621 and ip src 1.1.1.1 and ip dst 1.1.1.2'
filter_udp_other = 'udp and port 5555 and ip src 1.1.1.1 and ip dst 1.1.1.2'
filter_icmp = 'icmp and ip src 1.1.1.1 and ip dst 1.1.1.2'
filter_udp_reverse = 'udp and port 48621 and ip src 1.1.1.2 and ip dst 1.1.1.1'
filter_icmp_reverse = 'icmp and ip src 1.1.1.2 and ip dst 1.1.1.1'
port_str = '1'
timeout = 25
count = 10
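The module-level BPF capture filters above all follow the same `proto / port / ip src / ip dst` shape. A small builder, a hypothetical helper rather than part of the test library, would keep them consistent:

```python
def build_filter(proto, src, dst, port=None):
    """Build a BPF capture filter string like the module-level filters."""
    parts = [proto]
    if port is not None:
        parts.append('port %d' % port)
    parts.append('ip src %s' % src)
    parts.append('ip dst %s' % dst)
    return ' and '.join(parts)

assert build_filter('udp', '1.1.1.1', '1.1.1.2', 48621) == \
    'udp and port 48621 and ip src 1.1.1.1 and ip dst 1.1.1.2'
assert build_filter('icmp', '1.1.1.1', '1.1.1.2') == \
    'icmp and ip src 1.1.1.1 and ip dst 1.1.1.2'
```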
def configure_permit_acl(ops1, name, seq_num, proto, src_ip,
src_port, dst_ip, dst_port):
"""
Configure an ACL with one permit rule
"""
with ops1.libs.vtysh.ConfigAccessListIpTestname(name) as ctx:
ctx.permit('', seq_num, proto, src_ip, src_port, dst_ip, dst_port)
def configure_deny_acl(ops1, name, seq_num, proto, src_ip,
src_port, dst_ip, dst_port):
"""
Configure an ACL with one deny rule
"""
with ops1.libs.vtysh.ConfigAccessListIpTestname(name) as ctx:
ctx.deny('', seq_num, proto, src_ip, src_port, dst_ip, dst_port)
def acl_permit_udp_any_any(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp any any" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
10 UDP packets are received on hs2
"""
global filter_udp, timeout, count, port_str
step('1.a Configure an ACL with 1 permit udp any any rule')
configure_permit_acl(ops1, 'test', '1', 'udp', 'any', '', 'any', '')
test1_result = ops1('show run')
assert search(
''
r'1\s+permit\s+udp\s+any\s+any'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('1.b Create UDP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
list_udp = [ip_packet, udp_packet]
proto_str = 'IP/UDP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
step('1.c Send and receive udp packets on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('1.d Verify results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
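Every test case repeats the same start/join choreography: the sniffer thread must be started before the sender so no packets are missed. A hypothetical `run_traffic` helper capturing that ordering, sketched here with plain `threading.Thread` objects standing in for `ScapyThread`:

```python
import threading

def run_traffic(txthread, rxthread):
    """Start the receiver first, then the sender, and wait for both."""
    rxthread.start()
    txthread.start()
    txthread.join()
    rxthread.join()

# Demonstrate the ordering with plain threads standing in for ScapyThread.
events = []
rx = threading.Thread(target=lambda: events.append('rx'))
tx = threading.Thread(target=lambda: events.append('tx'))
run_traffic(tx, rx)
assert sorted(events) == ['rx', 'tx']
```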
def acl_deny_udp_any_any(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp any any" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
none of them are received on hs2.
"""
global filter_udp, timeout, count, port_str
step('2.a Configure an ACL with 1 deny udp any any rule')
configure_deny_acl(ops1, 'test', '1', 'udp', 'any', '', 'any', '')
test1_result = ops1('show run')
assert search(
''
r'1\s+deny\s+udp\s+any\s+any'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('2.b Create UDP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
list_udp = [ip_packet, udp_packet]
proto_str = 'IP/UDP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
step('2.c Send and receive UDP packets on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('2.d Verify results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
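The result parsing also recurs in every test: split on the `'<Sniffed:'` marker, then pull the digits out with `findall`. A helper could isolate that; the sample string below is an assumption about the `outresult()` format, inferred only from how the tests split and index it:

```python
from re import findall

def sniffed_count(result, index=1):
    """Extract the Nth number after the '<Sniffed:' marker in a result string."""
    _, sniffcnt = result.split('<Sniffed:')
    return findall(r'[0-9]+', sniffcnt)[index]

# Assumed result shape, based only on the split/findall usage in the tests.
sample = 'sent 10 packets<Sniffed: 10 packets, 10 matched>'
assert sniffed_count(sample, 1) == '10'
```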
def acl_permit_udp_hs1_hs2(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp 1.1.1.1 1.1.1.2" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
10 UDP packets are received on hs2. Also, it verifies that other
protocol traffic is not received by hs2 by sending 10 ICMP packets.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('3.a Configure an ACL with 1 permit udp 1.1.1.1 1.1.1.2 rule')
configure_permit_acl(ops1, 'test', '1', 'udp', '1.1.1.1', '', '1.1.1.2',
'')
test1_result = ops1('show run')
assert search(
''
r'1\s+permit\s+udp\s+1.1.1.1\s+1.1.1.2'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('3.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('3.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('3.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('3.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('3.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_hs1_hs2(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp 1.1.1.1 1.1.1.2" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
none are received on hs2. It also sends 10 ICMP packets and verifies
they are likewise dropped by the ACL's implicit deny.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('4.a Configure an ACL with 1 deny udp 1.1.1.1 1.1.1.2 rule')
configure_deny_acl(
ops1, 'test', '1', 'udp', '1.1.1.1', '', '1.1.1.2', '')
test1_result = ops1('show run')
assert search(
''
r'1\s+deny\s+udp\s+1.1.1.1\s+1.1.1.2'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('4.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('4.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('4.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('4.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('4.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_permit_udp_prefix_len_mask(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp 1.1.1.0/31 1.1.1.0/30" rule on interface 1.
It then passes 10 UDP packets from hs1 to hs2 and verifies that
10 UDP packets are received on hs2. Also, it verifies that other
protocol traffic is not received by hs2 by sending 10 ICMP packets.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('5.a Configure an ACL with 1 permit udp 1.1.1.0/31 1.1.1.0/30 rule')
configure_permit_acl(ops1, 'test', '1', 'udp', '1.1.1.0/31', '',
'1.1.1.0/30',
'')
test1_result = ops1('show run')
assert search(
''
r'1\s+permit\s+udp\s+1.1.1.0/255.255.255.254\s+'
'1.1.1.0/255.255.255.252'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('5.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('5.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('5.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('5.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('5.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_prefix_len_mask(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp 1.1.1.0/31 1.1.1.0/30" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
none are received on hs2. It also sends 10 ICMP packets and verifies
they are likewise dropped by the ACL's implicit deny.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('6.a Configure an ACL with 1 deny udp 1.1.1.0/31 1.1.1.0/30 rule')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.1.1.0/31', '',
'1.1.1.0/30',
'')
test1_result = ops1('show run')
assert search(
''
r'1\s+deny\s+udp\s+1.1.1.0/255.255.255.254\s+'
'1.1.1.0/255.255.255.252'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('6.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('6.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('6.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('6.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('6.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_permit_udp_dotted_netmask(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp any
1.1.1.0/255.255.255.252" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
10 UDP packets are received on hs2. Also, it verifies that other
protocol traffic is not received by hs2 by sending 10 ICMP packets.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('7.a Configure an ACL with 1 permit udp any'
' 1.1.1.0/255.255.255.252 rule')
configure_permit_acl(
ops1, 'test', '1', 'udp', 'any', '',
'1.1.1.0/255.255.255.252', '')
test1_result = ops1('show run')
assert search(
''
r'1\s+permit\s+udp\s+any\s+'
'1.1.1.0/255.255.255.252'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('7.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('7.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('7.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('7.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('7.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_dotted_netmask(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp 1.1.1.0/255.255.255.254
1.1.1.0/255.255.255.252" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
none are received on hs2. It also sends 10 ICMP packets and verifies
they are likewise dropped by the ACL's implicit deny.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('8.a Configure an ACL with 1 deny udp 1.1.1.0/255.255.255.254 '
'1.1.1.0/255.255.255.252 rule')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.1.1.0/255.255.255.254', '',
'1.1.1.0/255.255.255.252',
'')
test1_result = ops1('show run')
assert search(
''
r'1\s+deny\s+udp\s+1.1.1.0/255.255.255.254\s+'
'1.1.1.0/255.255.255.252'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('8.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('8.c Send and receive udp traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('8.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('8.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('8.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_permit_udp_non_contiguous_mask(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp 1.0.1.0/255.0.255.254
any" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
10 UDP packets are received on hs2. Also, it verifies that other
protocol traffic is not received by hs2 by sending 10 ICMP packets.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('9.a Configure an ACL with 1 permit udp 1.0.1.0/255.0.255.254'
' any rule')
configure_permit_acl(
ops1, 'test', '1', 'udp', '1.0.1.0/255.0.255.254', '',
'any', '')
test1_result = ops1('show run')
assert search(
''
r'1\s+permit\s+udp\s+1.0.1.0/255.0.255.254\s+'
'any'.format(
**locals()
), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('9.b Create udp packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('9.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('9.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('9.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('9.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_non_contiguous_mask(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp 1.0.1.0/255.0.255.0
any" rule on interface 1.
It then sends 10 UDP packets from hs1 to hs2 and verifies that
none are received on hs2. It also sends 10 ICMP packets and verifies
they are likewise dropped by the ACL's implicit deny.
"""
global filter_udp, filter_icmp, timeout, count, port_str
step('10.a Configure an ACL with 1 deny udp 1.0.1.0/255.0.255.0 '
'any rule')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.0.1.0/255.0.255.0', '',
'any',
'')
test1_result = ops1('show run')
assert search(
''
r'1\s+deny\s+udp\s+1.0.1.0/255.0.255.0\s+'
'any'.format(**locals()), test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'(access-list\s+ip\s+test\s+in)'.format(
**locals()
), test1_result
)
step('10.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('10.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('10.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('10.e Send ICMP traffic')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('10.f Verify ICMP results')
if rxthread_icmp.outresult():
rest_icmp, sniffcnt_icmp = rxthread_icmp.outresult().split('<Sniffed:')
list_result_icmp = findall(r'[0-9]+', sniffcnt_icmp)
print(list_result_icmp)
assert (list_result_icmp[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_permit_udp_dport_eq_param(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp 1.1.1.1 1.1.1.2 eq 48621" rule on
interface 1.
It then sends 10 UDP packets to destination port 48621 from hs1 to
hs2 and verifies that all 10 are received. It also sends 10 UDP
packets to a different destination port (5555) and verifies that
none are received.
"""
global filter_udp, filter_udp_other, timeout, count, port_str
step('11.a Configure an ACL with 1 permit udp 1.1.1.1 1.1.1.2 '
'eq 48621 rule')
configure_permit_acl(ops1, 'test', '1', 'udp', '1.1.1.1', '',
'1.1.1.2',
'eq 48621')
test1_result = ops1('show run')
assert search(
r'1\s+permit\s+udp\s+1\.1\.1\.1\s+1\.1\.1\.2\s+eq\s+48621',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
step('11.b Create UDP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
udp_packet_other_port = hs1.libs.scapy.udp("dport=5555")
list_udp = [ip_packet, udp_packet]
list_udp_other = [ip_packet, udp_packet_other_port]
proto_str = 'IP/UDP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_udp_other = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp_other, '', count,
'', 0)
rxthread_udp_other = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp_other, count,
port_str, timeout)
step('11.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('11.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('11.e Send UDP traffic to a different port')
rxthread_udp_other.start()
txthread_udp_other.start()
txthread_udp_other.join()
rxthread_udp_other.join()
step('11.f Verify Other UDP results')
if rxthread_udp_other.outresult():
rest_udp_other, sniffcnt_udp_other = rxthread_udp_other.outresult(
).split('<Sniffed:')
list_result_udp_other = findall(r'[0-9]+', sniffcnt_udp_other)
print(list_result_udp_other)
assert (list_result_udp_other[1] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_dport_eq_param(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp 1.1.1.1 1.1.1.2 eq 48621" rule on
interface 1.
It then sends 10 UDP packets to destination port 48621 from hs1 to
hs2 and verifies that none are received. It also sends 10 UDP
packets to a different destination port (5555) and verifies that
none are received either, because of the implicit deny rule.
"""
global filter_udp, filter_udp_other, timeout, count, port_str
step('12.a Configure an ACL with 1 deny udp 1.1.1.1 1.1.1.2 eq 48621 '
'rule')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.1.1.1', '',
'1.1.1.2', 'eq 48621')
test1_result = ops1('show run')
assert search(
r'1\s+deny\s+udp\s+1\.1\.1\.1\s+1\.1\.1\.2\s+eq\s+48621',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
step('12.b Create UDP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
udp_packet_other_port = hs1.libs.scapy.udp("dport=5555")
list_udp = [ip_packet, udp_packet]
list_udp_other = [ip_packet, udp_packet_other_port]
proto_str = 'IP/UDP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_udp_other = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp_other, '', count,
'', 0)
rxthread_udp_other = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp_other, count,
port_str, timeout)
step('12.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('12.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('12.e Send UDP traffic to a different port')
rxthread_udp_other.start()
txthread_udp_other.start()
txthread_udp_other.join()
rxthread_udp_other.join()
step('12.f Verify Other UDP results')
if rxthread_udp_other.outresult():
rest_udp_other, sniffcnt_udp_other = (rxthread_udp_other.outresult()
.split('<Sniffed:'))
list_result_udp_other = findall(r'[0-9]+', sniffcnt_udp_other)
print(list_result_udp_other)
assert (list_result_udp_other[1] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_permit_udp_sport_eq_param(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 permit udp 1.1.1.1 eq 5555 1.1.1.2" rule on
interface 1.
It then sends 10 UDP packets with source port 5555 from hs1 to hs2
and verifies that all 10 are received. It also sends 10 UDP packets
with a different source port (1000) and verifies that none are
received.
"""
global filter_udp, filter_udp_other, timeout, count, port_str
step('13.a Configure an ACL with 1 permit udp 1.1.1.1 eq 5555 '
'1.1.1.2 rule')
configure_permit_acl(ops1, 'test', '1', 'udp', '1.1.1.1', 'eq 5555',
'1.1.1.2',
'')
test1_result = ops1('show run')
assert search(
r'1\s+permit\s+udp\s+1\.1\.1\.1\s+eq\s+5555\s+1\.1\.1\.2',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
step('13.b Create UDP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp()
udp_packet['dport'] = 48621
udp_packet['sport'] = 5555
udp_packet_other_port = hs1.libs.scapy.udp()
udp_packet_other_port['dport'] = 5555
udp_packet_other_port['sport'] = 1000
list_udp = [ip_packet, udp_packet]
list_udp_other = [ip_packet, udp_packet_other_port]
proto_str = 'IP/UDP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_udp_other = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp_other, '', count,
'', 0)
rxthread_udp_other = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp_other, count,
port_str, timeout)
step('13.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('13.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('13.e Send UDP traffic to a different port')
rxthread_udp_other.start()
txthread_udp_other.start()
txthread_udp_other.join()
rxthread_udp_other.join()
step('13.f Verify Other UDP results')
if rxthread_udp_other.outresult():
rest_udp_other, sniffcnt_udp_other = rxthread_udp_other.outresult(
).split('<Sniffed:')
list_result_udp_other = findall(r'[0-9]+', sniffcnt_udp_other)
print(list_result_udp_other)
assert (list_result_udp_other[1] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_sport_eq_param(ops1, hs1, hs2, topology, step):
"""
This test adds a "1 deny udp 1.1.1.1 eq 5555 1.1.1.2" rule on
interface 1.
It then sends 10 UDP packets with source port 5555 from hs1 to hs2
and verifies that none are received. It also sends 10 UDP packets
with a different source port (1000) and verifies that none are
received either, because of the implicit deny rule.
"""
global filter_udp, filter_udp_other, timeout, count, port_str
step('14.a Configure an ACL with 1 deny udp 1.1.1.1 eq 5555 '
'1.1.1.2 rule')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.1.1.1', 'eq 5555',
'1.1.1.2', '')
test1_result = ops1('show run')
assert search(
r'1\s+deny\s+udp\s+1\.1\.1\.1\s+eq\s+5555\s+1\.1\.1\.2',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
step('14.b Create UDP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp()
udp_packet['dport'] = 48621
udp_packet['sport'] = 5555
udp_packet_other_port = hs1.libs.scapy.udp()
udp_packet_other_port['dport'] = 5555
udp_packet_other_port['sport'] = 1000
list_udp = [ip_packet, udp_packet]
list_udp_other = [ip_packet, udp_packet_other_port]
proto_str = 'IP/UDP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_udp_other = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp_other, '', count,
'', 0)
rxthread_udp_other = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp_other, count,
port_str, timeout)
step('14.c Send and receive UDP traffic on hs1 and hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('14.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('14.e Send UDP traffic to a different port')
rxthread_udp_other.start()
txthread_udp_other.start()
txthread_udp_other.join()
rxthread_udp_other.join()
step('14.f Verify Other UDP results')
if rxthread_udp_other.outresult():
rest_udp_other, sniffcnt_udp_other = (rxthread_udp_other.outresult()
.split('<Sniffed:'))
list_result_udp_other = findall(r'[0-9]+', sniffcnt_udp_other)
print(list_result_udp_other)
assert (list_result_udp_other[1] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_modify_after_sending_udp_traffic(ops1, hs1, hs2, topology, step):
"""
This test sends some traffic after applying an ACL to interface 1.
It then stops the traffic, modifies the ACL, and verifies that the
traffic behavior complies with the modified ACL.
"""
global filter_udp, filter_icmp, count, timeout, port_str
step('15.a Configure an ACL with 1 permit udp 1.1.1.1 1.1.1.2 rule')
configure_permit_acl(
ops1, 'test', '1', 'udp', '1.1.1.1', '', '1.1.1.2', '')
test1_result = ops1('show run')
assert search(
r'1\s+permit\s+udp\s+1\.1\.1\.1\s+1\.1\.1\.2',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
step('15.b Create UDP and ICMP packets from hs1 to hs2')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
txthread_udp_repeat = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread_udp_repeat = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp_repeat = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp_repeat = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
step('15.c Send UDP packets')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('15.d Verify UDP results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '10')
step('15.e Send ICMP packets')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('15.f Verify ICMP results')
if rxthread_icmp.outresult():
rest, sniffcnt = rxthread_icmp.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert(list_result[2] == '0')
step('15.g Modify ACL with 1 deny udp 1.1.1.1 1.1.1.2 rule')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.1.1.1', '',
'1.1.1.2', '')
test1_result = ops1('show run')
assert search(
r'1\s+deny\s+udp\s+1\.1\.1\.1\s+1\.1\.1\.2',
test1_result
)
step('15.h Send UDP packets')
rxthread_udp_repeat.start()
txthread_udp_repeat.start()
txthread_udp_repeat.join()
rxthread_udp_repeat.join()
step('15.i Verify UDP results')
if rxthread_udp_repeat.outresult():
rest, sniffcnt = rxthread_udp_repeat.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert (list_result[1] == '0')
step('15.j Send ICMP packets')
rxthread_icmp_repeat.start()
txthread_icmp_repeat.start()
txthread_icmp_repeat.join()
rxthread_icmp_repeat.join()
step('15.k Verify ICMP results')
if rxthread_icmp_repeat.outresult():
rest, sniffcnt = rxthread_icmp_repeat.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert(list_result[2] == '0')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
def acl_deny_udp_on_multiple_ports(ops1, hs1, hs2, topology, step):
"""
This tests applies a deny rule for UDP and permit rule for ICMP on
interfaces 1 and 2. Then, it passes UDP traffic in both directions
and verifies that traffic is blocked. Next, it passes ICMP traffic
and verifies that the responses are received.
"""
global filter_udp, filter_icmp, filter_udp_reverse, filter_icmp_reverse
global count, timeout, port_str
step('16.a Configure a deny udp and permit icmp rule on ACL test')
configure_deny_acl(ops1, 'test', '1', 'udp', '1.1.1.1', '',
'1.1.1.2', '')
test1_result = ops1('show run')
assert search(
r'1\s+deny\s+udp\s+1\.1\.1\.1\s+1\.1\.1\.2',
test1_result
)
configure_permit_acl(ops1, 'test', '2', 'icmp', 'any', '', 'any', '')
test1_result = ops1('show run')
assert search(
r'2\s+permit\s+icmp\s+any\s+any',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
with ops1.libs.vtysh.ConfigInterface('2') as ctx:
ctx.apply_access_list_ip_in('test')
test1_result = ops1('show run')
assert search(
r'access-list\s+ip\s+test\s+in',
test1_result
)
step('16.b Create UDP and ICMP packets')
ip_packet = hs1.libs.scapy.ip("dst='1.1.1.2', src='1.1.1.1'")
ip_packet_reverse = hs2.libs.scapy.ip("dst='1.1.1.1', src='1.1.1.2'")
udp_packet = hs1.libs.scapy.udp("dport=48621")
icmp_packet = hs1.libs.scapy.icmp()
list_udp = [ip_packet, udp_packet]
list_icmp = [ip_packet, icmp_packet]
list_udp_reverse = [ip_packet_reverse, udp_packet]
list_icmp_reverse = [ip_packet_reverse, icmp_packet]
proto_str = 'IP/UDP'
icmp_proto_str = 'IP/ICMP'
txthread = ScapyThread(
send_traffic,
'hs1', topology, proto_str, list_udp, '', count,
'', 0)
rxthread = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_udp, count,
port_str, timeout)
txthread_icmp = ScapyThread(
send_traffic,
'hs1', topology, icmp_proto_str, list_icmp, '', count,
'', 0)
rxthread_icmp = ScapyThread(
sniff_traffic,
'hs2', topology, '', [], filter_icmp, count,
port_str, timeout)
txthread_udp_reverse = ScapyThread(
send_traffic,
'hs2', topology, proto_str, list_udp_reverse, '', count,
'', 0)
rxthread_udp_reverse = ScapyThread(
sniff_traffic,
'hs1', topology, '', [], filter_udp_reverse, count,
port_str, timeout)
txthread_icmp_reverse = ScapyThread(
send_traffic,
'hs2', topology, icmp_proto_str, list_icmp_reverse,
'',
count, '', 0)
rxthread_icmp_reverse = ScapyThread(
sniff_traffic,
'hs1', topology, '', [], filter_icmp_reverse,
count, port_str, timeout)
step('16.c Send UDP packets from hs1 to hs2')
rxthread.start()
txthread.start()
txthread.join()
rxthread.join()
step('16.d Verify results')
if rxthread.outresult():
rest, sniffcnt = rxthread.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert(list_result[1] == '0')
step('16.e Send UDP packets from hs2 to hs1')
rxthread_udp_reverse.start()
txthread_udp_reverse.start()
txthread_udp_reverse.join()
rxthread_udp_reverse.join()
step('16.f Verify results')
if rxthread_udp_reverse.outresult():
rest, sniffcnt = rxthread_udp_reverse.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert(list_result[1] == '0')
step('16.g Send ICMP traffic from hs1 to hs2')
rxthread_icmp.start()
txthread_icmp.start()
txthread_icmp.join()
rxthread_icmp.join()
step('16.h Verify results')
if rxthread_icmp.outresult():
rest, sniffcnt = rxthread_icmp.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert(list_result[2] == '10')
step('16.i Send ICMP traffic from hs2 to hs1')
rxthread_icmp_reverse.start()
txthread_icmp_reverse.start()
txthread_icmp_reverse.join()
rxthread_icmp_reverse.join()
step('16.j Verify results')
if rxthread_icmp_reverse.outresult():
rest, sniffcnt = rxthread_icmp_reverse.outresult().split('<Sniffed:')
list_result = findall(r'[0-9]+', sniffcnt)
print(list_result)
assert(list_result[2] == '10')
with ops1.libs.vtysh.ConfigAccessListIpTestname('test') as ctx:
ctx.no('1')
with ops1.libs.vtysh.Configure() as ctx:
ctx.no_access_list_ip('test')
test1_result = ops1('show run')
assert not search(
r'access-list\s+ip\s+test',
test1_result
)
@mark.test_id(10405)
@mark.platform_incompatible(['docker'])
def test_classifierd_ft_acl_udp_traffic(topology, step):
"""
Test traffic after applying ACEs to ports.
Build a topology of one switch and two hosts on the same subnet.
"""
ops1 = topology.get('ops1')
hs1 = topology.get('hs1')
hs2 = topology.get('hs2')
assert ops1 is not None
assert hs1 is not None
assert hs2 is not None
p1 = ops1.ports['1']
p2 = ops1.ports['2']
# Mark interfaces as enabled
assert not ops1(
'set interface {p1} user_config:admin=up'.format(**locals()),
shell='vsctl'
)
assert not ops1(
'set interface {p2} user_config:admin=up'.format(**locals()),
shell='vsctl'
)
# Configure interfaces
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.no_routing()
ctx.no_shutdown()
with ops1.libs.vtysh.ConfigInterface('2') as ctx:
ctx.no_routing()
ctx.no_shutdown()
ops1('show interface {p1}'.format(**locals()))
ops1('show interface {p2}'.format(**locals()))
hs1.send_command('service network-manager stop', shell='bash')
hs2.send_command('service network-manager stop', shell='bash')
hs1.libs.ip.interface('1', addr='1.1.1.1/24', up=True)
hs2.libs.ip.interface('1', addr='1.1.1.2/24', up=True)
with ops1.libs.vtysh.ConfigVlan('100') as ctx:
ctx.no_shutdown()
with ops1.libs.vtysh.ConfigInterface('1') as ctx:
ctx.vlan_access(100)
with ops1.libs.vtysh.ConfigInterface('2') as ctx:
ctx.vlan_access(100)
step('Wait until interfaces are up')
for portlbl in ['1', '2']:
wait_until_interface_up(ops1, portlbl)
ping = hs2.libs.ping.ping(1, '1.1.1.1')
step('Start scapy on host workstations')
hs1.libs.scapy.start_scapy()
hs2.libs.scapy.start_scapy()
step('Test1 : acl_udp_any_any_permit')
acl_permit_udp_any_any(ops1, hs1, hs2, topology, step)
step('Test2: acl_udp_any_any_deny')
acl_deny_udp_any_any(ops1, hs1, hs2, topology, step)
step('Test3: acl_permit_udp_hs1_hs2')
acl_permit_udp_hs1_hs2(ops1, hs1, hs2, topology, step)
step('Test4: acl_deny_udp_hs1_hs2')
acl_deny_udp_hs1_hs2(ops1, hs1, hs2, topology, step)
step('Test5: acl_permit_udp_prefix_len_mask')
acl_permit_udp_prefix_len_mask(ops1, hs1, hs2, topology, step)
step('Test6: acl_deny_udp_prefix_len_mask')
acl_deny_udp_prefix_len_mask(ops1, hs1, hs2, topology, step)
step('Test7: acl_permit_udp_dotted_netmask')
acl_permit_udp_dotted_netmask(ops1, hs1, hs2, topology, step)
step('Test8: acl_deny_udp_dotted_netmask')
acl_deny_udp_dotted_netmask(ops1, hs1, hs2, topology, step)
step('Test9: acl_permit_udp_non_contiguous_mask')
acl_permit_udp_non_contiguous_mask(ops1, hs1, hs2, topology, step)
step('Test10: acl_deny_udp_non_contiguous_mask')
acl_deny_udp_non_contiguous_mask(ops1, hs1, hs2, topology, step)
step('Test11: acl_permit_udp_dport_eq_param')
acl_permit_udp_dport_eq_param(ops1, hs1, hs2, topology, step)
step('Test12: acl_deny_udp_sport_eq_param')
acl_deny_udp_sport_eq_param(ops1, hs1, hs2, topology, step)
step('Test13: acl_permit_udp_sport_eq_param')
acl_permit_udp_sport_eq_param(ops1, hs1, hs2, topology, step)
step('Test14: acl_deny_udp_dport_eq_param')
acl_deny_udp_dport_eq_param(ops1, hs1, hs2, topology, step)
step('Test15: acl_modify_after_sending_udp_traffic')
acl_modify_after_sending_udp_traffic(ops1, hs1, hs2, topology, step)
step('Test16: acl_deny_udp_on_multiple_ports')
acl_deny_udp_on_multiple_ports(ops1, hs1, hs2, topology, step)
def wait_until_interface_up(switch, portlbl, timeout=30, polling_frequency=1):
"""
Wait until the interface, as mapped by the given portlbl, is marked as up.
:param switch: The switch node.
:param str portlbl: Port label that is mapped to the interfaces.
:param int timeout: Number of seconds to wait.
:param int polling_frequency: Frequency of the polling.
:return: None if interface is brought-up. If not, an assertion is raised.
"""
for i in range(timeout):
status = switch.libs.vtysh.show_interface(portlbl)
if status['interface_state'] == 'up':
break
sleep(polling_frequency)
else:
assert False, (
'Interface {}:{} never brought-up after '
'waiting for {} seconds'.format(
switch.identifier, portlbl, timeout
)
)
] | 11 | 2016-04-24T03:31:31.000Z | 2019-09-03T16:51:08.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.10.dev20160107235441 on 2016-10-25 03:29
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Configuration',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('configuration_name', models.CharField(max_length=200, unique=True)),
('creation_time', models.DateTimeField(auto_now_add=True)),
('display_order', models.IntegerField(default=0)),
],
),
migrations.CreateModel(
name='Operation',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('signal_name', models.CharField(choices=[('None', 'None'), ('lavfi.signalstats.BRNG', 'BRNG'), ('lavfi.cropdetect.y2', 'Crop Bottom'), ('lavfi.cropdetect.y1', 'Crop Top'), ('lavfi.cropdetect.x1', 'Crop Left'), ('lavfi.cropdetect.x2', 'Crop Right'), ('lavfi.cropdetect.h', 'Crop Height'), ('lavfi.cropdetect.w', 'Crop Width'), ('lavfi.cropdetect.x', 'Crop X'), ('lavfi.cropdetect.y', 'Crop Y'), ('lavfi.signalstats.HUEAVG', 'HUE AVG'), ('lavfi.signalstats.HUEMED', 'HUE MED'), ('lavfi.psnr.mse_avg', 'MSEf Avg'), ('lavfi.psnr.mse.u', 'MSEf U'), ('lavfi.psnr.mse.v', 'MSEf V'), ('lavfi.psnr.mse.y', 'MSEf Y'), ('lavfi.psnr.psnr_avg', 'PSNRf Avg'), ('lavfi.psnr.psnr.u', 'PSNRf U'), ('lavfi.psnr.psnr.v', 'PSNRf V'), ('lavfi.psnr.psnr.y', 'PSNRf Y'), ('lavfi.r128.I', 'R128.I'), ('lavfi.r128.LRA', 'R128.LRA'), ('lavfi.r128.LRA.high', 'R28.LRA.high'), ('lavfi.r128.LRA.low', 'R128.LRA.low'), ('lavfi.r128.M', 'R128.M'), ('lavfi.r128.S', 'R128.S'), ('lavfi.signalstats.SATAVG', 'SAT AVG'), ('lavfi.signalstats.SATHIGH', 'SAT HIGH'), ('lavfi.signalstats.SATLOW', 'SAT LOW'), ('lavfi.signalstats.SATMAX', 'SAT MAX'), ('lavfi.signalstats.SATMIN', 'SAT MIN'), ('lavfi.signalstats.TOUT', 'TOUT'), ('lavfi.signalstats.UAVG', 'U AVG'), ('lavfi.signalstats.UDIF', 'U DIF'), ('lavfi.signalstats.UHIGH', 'U HIGH'), ('lavfi.signalstats.ULOW', 'U LOW'), ('lavfi.signalstats.UMAX', 'U MAX'), ('lavfi.signalstats.UMIN', 'U MIN'), ('lavfi.signalstats.VAVG', 'V AVG'), ('lavfi.signalstats.VDIF', 'V DIF'), ('lavfi.signalstats.VHIGH', 'V HIGH'), ('lavfi.signalstats.VLOW', 'V LOW'), ('lavfi.signalstats.VMAX', 'V MAX'), ('lavfi.signalstats.VMIN', 'V MIN'), ('lavfi.signalstats.VREP', 'VREP'), ('lavfi.signalstats.YAVG', 'Y AVG'), ('lavfi.signalstats.YDIF', 'Y DIF'), ('lavfi.signalstats.YHIGH', 'Y HIGH'), ('lavfi.signalstats.YLOW', 'Y LOW'), ('lavfi.signalstats.YMAX', 'Y MAX'), ('lavfi.signalstats.YMIN', 'Y MIN')], max_length=200)),
('second_signal_name', models.CharField(choices=[('None', 'None'), ('lavfi.signalstats.BRNG', 'BRNG'), ('lavfi.cropdetect.y2', 'Crop Bottom'), ('lavfi.cropdetect.y1', 'Crop Top'), ('lavfi.cropdetect.x1', 'Crop Left'), ('lavfi.cropdetect.x2', 'Crop Right'), ('lavfi.cropdetect.h', 'Crop Height'), ('lavfi.cropdetect.w', 'Crop Width'), ('lavfi.cropdetect.x', 'Crop X'), ('lavfi.cropdetect.y', 'Crop Y'), ('lavfi.signalstats.HUEAVG', 'HUE AVG'), ('lavfi.signalstats.HUEMED', 'HUE MED'), ('lavfi.psnr.mse_avg', 'MSEf Avg'), ('lavfi.psnr.mse.u', 'MSEf U'), ('lavfi.psnr.mse.v', 'MSEf V'), ('lavfi.psnr.mse.y', 'MSEf Y'), ('lavfi.psnr.psnr_avg', 'PSNRf Avg'), ('lavfi.psnr.psnr.u', 'PSNRf U'), ('lavfi.psnr.psnr.v', 'PSNRf V'), ('lavfi.psnr.psnr.y', 'PSNRf Y'), ('lavfi.r128.I', 'R128.I'), ('lavfi.r128.LRA', 'R128.LRA'), ('lavfi.r128.LRA.high', 'R28.LRA.high'), ('lavfi.r128.LRA.low', 'R128.LRA.low'), ('lavfi.r128.M', 'R128.M'), ('lavfi.r128.S', 'R128.S'), ('lavfi.signalstats.SATAVG', 'SAT AVG'), ('lavfi.signalstats.SATHIGH', 'SAT HIGH'), ('lavfi.signalstats.SATLOW', 'SAT LOW'), ('lavfi.signalstats.SATMAX', 'SAT MAX'), ('lavfi.signalstats.SATMIN', 'SAT MIN'), ('lavfi.signalstats.TOUT', 'TOUT'), ('lavfi.signalstats.UAVG', 'U AVG'), ('lavfi.signalstats.UDIF', 'U DIF'), ('lavfi.signalstats.UHIGH', 'U HIGH'), ('lavfi.signalstats.ULOW', 'U LOW'), ('lavfi.signalstats.UMAX', 'U MAX'), ('lavfi.signalstats.UMIN', 'U MIN'), ('lavfi.signalstats.VAVG', 'V AVG'), ('lavfi.signalstats.VDIF', 'V DIF'), ('lavfi.signalstats.VHIGH', 'V HIGH'), ('lavfi.signalstats.VLOW', 'V LOW'), ('lavfi.signalstats.VMAX', 'V MAX'), ('lavfi.signalstats.VMIN', 'V MIN'), ('lavfi.signalstats.VREP', 'VREP'), ('lavfi.signalstats.YAVG', 'Y AVG'), ('lavfi.signalstats.YDIF', 'Y DIF'), ('lavfi.signalstats.YHIGH', 'Y HIGH'), ('lavfi.signalstats.YLOW', 'Y LOW'), ('lavfi.signalstats.YMAX', 'Y MAX'), ('lavfi.signalstats.YMIN', 'Y MIN')], default=None, max_length=100)),
('op_name', models.CharField(choices=[('average', 'average'), ('exceeds', 'exceeds'), ('average_difference', 'average_difference')], max_length=20)),
('cut_off_number', models.IntegerField(default=0)),
('display_order', models.IntegerField(default=0)),
('configuration', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='policies.Configuration')),
],
),
]
] | 2 | 2021-08-24T23:31:30.000Z | 2022-01-02T19:26:54.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['AutonomousDatabaseWalletArgs', 'AutonomousDatabaseWallet']
@pulumi.input_type
class AutonomousDatabaseWalletArgs:
def __init__(__self__, *,
autonomous_database_id: pulumi.Input[str],
password: pulumi.Input[str],
base64_encode_content: Optional[pulumi.Input[bool]] = None,
generate_type: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing an AutonomousDatabaseWallet resource.
:param pulumi.Input[str] autonomous_database_id: The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
:param pulumi.Input[str] password: The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
:param pulumi.Input[str] generate_type: The type of wallet to generate.
"""
pulumi.set(__self__, "autonomous_database_id", autonomous_database_id)
pulumi.set(__self__, "password", password)
if base64_encode_content is not None:
pulumi.set(__self__, "base64_encode_content", base64_encode_content)
if generate_type is not None:
pulumi.set(__self__, "generate_type", generate_type)
@property
@pulumi.getter(name="autonomousDatabaseId")
def autonomous_database_id(self) -> pulumi.Input[str]:
"""
The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
"""
return pulumi.get(self, "autonomous_database_id")
@autonomous_database_id.setter
def autonomous_database_id(self, value: pulumi.Input[str]):
pulumi.set(self, "autonomous_database_id", value)
@property
@pulumi.getter
def password(self) -> pulumi.Input[str]:
"""
The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: pulumi.Input[str]):
pulumi.set(self, "password", value)
@property
@pulumi.getter(name="base64EncodeContent")
def base64_encode_content(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "base64_encode_content")
@base64_encode_content.setter
def base64_encode_content(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "base64_encode_content", value)
@property
@pulumi.getter(name="generateType")
def generate_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of wallet to generate.
"""
return pulumi.get(self, "generate_type")
@generate_type.setter
def generate_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "generate_type", value)
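The docstrings above state the wallet password rule: at least 8 characters, at least 1 letter, and either 1 numeric character or 1 special character. A small client-side pre-check mirroring that rule can be sketched as follows — this helper is illustrative and not part of the generated SDK; the service remains the source of truth for validation:

```python
import re

def looks_like_valid_wallet_password(password: str) -> bool:
    """Pre-check mirroring the documented rule: >= 8 chars, >= 1 letter,
    and >= 1 digit or >= 1 special character. (Illustrative helper only.)"""
    if len(password) < 8:
        return False
    has_letter = bool(re.search(r'[A-Za-z]', password))
    has_digit = bool(re.search(r'\d', password))
    has_special = bool(re.search(r'[^A-Za-z0-9]', password))
    return has_letter and (has_digit or has_special)
```

Such a check can fail fast before a deployment is attempted; it does not replace server-side enforcement.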
@pulumi.input_type
class _AutonomousDatabaseWalletState:
def __init__(__self__, *,
autonomous_database_id: Optional[pulumi.Input[str]] = None,
base64_encode_content: Optional[pulumi.Input[bool]] = None,
content: Optional[pulumi.Input[str]] = None,
generate_type: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering AutonomousDatabaseWallet resources.
:param pulumi.Input[str] autonomous_database_id: The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
:param pulumi.Input[str] content: Content of the downloaded zipped wallet for the Autonomous Database. If `base64_encode_content` is set to `true`, then this content will be base64 encoded.
:param pulumi.Input[str] generate_type: The type of wallet to generate.
:param pulumi.Input[str] password: The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
"""
if autonomous_database_id is not None:
pulumi.set(__self__, "autonomous_database_id", autonomous_database_id)
if base64_encode_content is not None:
pulumi.set(__self__, "base64_encode_content", base64_encode_content)
if content is not None:
pulumi.set(__self__, "content", content)
if generate_type is not None:
pulumi.set(__self__, "generate_type", generate_type)
if password is not None:
pulumi.set(__self__, "password", password)
@property
@pulumi.getter(name="autonomousDatabaseId")
def autonomous_database_id(self) -> Optional[pulumi.Input[str]]:
"""
The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
"""
return pulumi.get(self, "autonomous_database_id")
@autonomous_database_id.setter
def autonomous_database_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "autonomous_database_id", value)
@property
@pulumi.getter(name="base64EncodeContent")
def base64_encode_content(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "base64_encode_content")
@base64_encode_content.setter
def base64_encode_content(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "base64_encode_content", value)
@property
@pulumi.getter
def content(self) -> Optional[pulumi.Input[str]]:
"""
Content of the downloaded zipped wallet for the Autonomous Database. If `base64_encode_content` is set to `true`, then this content will be base64 encoded.
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content", value)
@property
@pulumi.getter(name="generateType")
def generate_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of wallet to generate.
"""
return pulumi.get(self, "generate_type")
@generate_type.setter
def generate_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "generate_type", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
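As the `content` docstrings in the state class above note, when `base64_encode_content` is set to `true` the downloaded wallet content comes back base64 encoded; decoding it recovers the zip archive bytes. A minimal sketch using stand-in bytes — the payload below is fabricated for illustration, not a real wallet:

```python
import base64

# Fabricated stand-in for a wallet payload; real wallets are zip archives,
# which begin with the "PK" magic bytes.
raw_wallet = b"PK\x03\x04" + b"\x00" * 8

# What the provider would hand back when base64_encode_content=True:
encoded = base64.b64encode(raw_wallet).decode("ascii")

# Decoding on the consumer side recovers the original archive bytes:
decoded = base64.b64decode(encoded)
assert decoded == raw_wallet
assert decoded.startswith(b"PK")  # zip archive magic
```

Base64 encoding is what makes the binary archive safe to carry through string-typed outputs.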
class AutonomousDatabaseWallet(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
autonomous_database_id: Optional[pulumi.Input[str]] = None,
base64_encode_content: Optional[pulumi.Input[bool]] = None,
generate_type: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
## Import
Import is not supported for this resource.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] autonomous_database_id: The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
:param pulumi.Input[str] generate_type: The type of wallet to generate.
:param pulumi.Input[str] password: The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: AutonomousDatabaseWalletArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
## Import
Import is not supported for this resource.
:param str resource_name: The name of the resource.
:param AutonomousDatabaseWalletArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(AutonomousDatabaseWalletArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
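The `__init__` above dispatches between the two overloads: a single `AutonomousDatabaseWalletArgs` object, or plain keyword arguments, with both paths normalized to one `_internal_init` call. The idiom can be sketched in isolation — the class and function names here are illustrative stand-ins, not the SDK's:

```python
class WalletArgs:
    """Stand-in for a generated *Args class (illustrative only)."""
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

def init_resource(*args, **kwargs):
    # Normalize both calling conventions to a single property dict,
    # mirroring what get_resource_args_opts enables above.
    if args and isinstance(args[0], WalletArgs):
        return dict(args[0].__dict__)   # overload: single args object
    return dict(kwargs)                 # overload: keyword arguments

# Both spellings yield the same normalized properties:
assert init_resource(WalletArgs(password='x')) == init_resource(password='x')
```

The benefit of the pattern is that typed args objects and ad-hoc kwargs share one code path downstream.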
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
autonomous_database_id: Optional[pulumi.Input[str]] = None,
base64_encode_content: Optional[pulumi.Input[bool]] = None,
generate_type: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = AutonomousDatabaseWalletArgs.__new__(AutonomousDatabaseWalletArgs)
if autonomous_database_id is None and not opts.urn:
raise TypeError("Missing required property 'autonomous_database_id'")
__props__.__dict__["autonomous_database_id"] = autonomous_database_id
__props__.__dict__["base64_encode_content"] = base64_encode_content
__props__.__dict__["generate_type"] = generate_type
if password is None and not opts.urn:
raise TypeError("Missing required property 'password'")
__props__.__dict__["password"] = password
__props__.__dict__["content"] = None
super(AutonomousDatabaseWallet, __self__).__init__(
'oci:database/autonomousDatabaseWallet:AutonomousDatabaseWallet',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
autonomous_database_id: Optional[pulumi.Input[str]] = None,
base64_encode_content: Optional[pulumi.Input[bool]] = None,
content: Optional[pulumi.Input[str]] = None,
generate_type: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None) -> 'AutonomousDatabaseWallet':
"""
Get an existing AutonomousDatabaseWallet resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] autonomous_database_id: The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
:param pulumi.Input[str] content: Content of the downloaded zipped wallet for the Autonomous Database. If `base64_encode_content` is set to `true`, then this content will be base64 encoded.
:param pulumi.Input[str] generate_type: The type of wallet to generate.
:param pulumi.Input[str] password: The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _AutonomousDatabaseWalletState.__new__(_AutonomousDatabaseWalletState)
__props__.__dict__["autonomous_database_id"] = autonomous_database_id
__props__.__dict__["base64_encode_content"] = base64_encode_content
__props__.__dict__["content"] = content
__props__.__dict__["generate_type"] = generate_type
__props__.__dict__["password"] = password
return AutonomousDatabaseWallet(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="autonomousDatabaseId")
def autonomous_database_id(self) -> pulumi.Output[str]:
"""
The database [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
"""
return pulumi.get(self, "autonomous_database_id")
@property
@pulumi.getter(name="base64EncodeContent")
def base64_encode_content(self) -> pulumi.Output[Optional[bool]]:
return pulumi.get(self, "base64_encode_content")
@property
@pulumi.getter
def content(self) -> pulumi.Output[str]:
"""
Content of the downloaded zipped wallet for the Autonomous Database. If `base64_encode_content` is set to `true`, then this content will be base64 encoded.
"""
return pulumi.get(self, "content")
@property
@pulumi.getter(name="generateType")
def generate_type(self) -> pulumi.Output[Optional[str]]:
"""
The type of wallet to generate.
"""
return pulumi.get(self, "generate_type")
@property
@pulumi.getter
def password(self) -> pulumi.Output[str]:
"""
The password to encrypt the keys inside the wallet. The password must be at least 8 characters long and must include at least 1 letter and either 1 numeric character or 1 special character.
"""
return pulumi.get(self, "password")
# Source: examples/First_Project/mg_processes/signal1/bin/internal/ufomodel/couplings.py (JaySandesara/madminer, license: MIT)
# This file was automatically created by FeynRules 2.4.72
# Mathematica version: 11.3.0 for Mac OS X x86 (64-bit) (March 7, 2018)
# Date: Thu 8 Aug 2019 15:25:35
from object_library import all_couplings, Coupling
from function_library import complexconjugate, re, im, csc, sec, acsc, asec, cot
GC_1 = Coupling(name = 'GC_1',
value = '-(ee0*complex(0,1))/3.',
order = {'QED':1})
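Each `Coupling` in this UFO model pairs an analytic expression string (`value`) with a coupling-order dictionary (`order`, e.g. `{'QED':1}` or `{'NP':2,'QCD':1}`). A minimal stand-in — not the real `object_library.Coupling` — showing how such an expression string can be evaluated once numerical parameter values are supplied:

```python
import cmath

class Coupling:
    """Minimal stand-in for the UFO object_library Coupling (illustrative only)."""
    def __init__(self, name, value, order):
        self.name = name
        self.value = value   # analytic expression as a string
        self.order = order   # perturbative orders, e.g. {'QED': 1}

    def evaluate(self, params):
        # Evaluate the expression string against numerical parameter values;
        # cmath and complex are exposed because the expressions use them.
        return eval(self.value, {'cmath': cmath, 'complex': complex}, params)

gc1 = Coupling(name='GC_1', value='-(ee0*complex(0,1))/3.', order={'QED': 1})
val = gc1.evaluate({'ee0': 0.3})   # approximately -0.1j
```

In practice matrix-element generators substitute the model's parameter card values into these strings in exactly this spirit.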
GC_10 = Coupling(name = 'GC_10',
value = '-G',
order = {'QCD':1})
GC_100 = Coupling(name = 'GC_100',
value = 'cpu/Lambda**2',
order = {'NP':2,'QED':2})
GC_101 = Coupling(name = 'GC_101',
value = '(4*cpW*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_102 = Coupling(name = 'GC_102',
value = '(cQd1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_103 = Coupling(name = 'GC_103',
value = '(cQd8*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_104 = Coupling(name = 'GC_104',
value = '(cQe*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_105 = Coupling(name = 'GC_105',
value = '(2*cQl31*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_106 = Coupling(name = 'GC_106',
value = '(2*cQl32*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_107 = Coupling(name = 'GC_107',
value = '(2*cQl33*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_108 = Coupling(name = 'GC_108',
value = '(cQlM1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_109 = Coupling(name = 'GC_109',
value = '(cQlM2*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_11 = Coupling(name = 'GC_11',
value = 'complex(0,1)*G',
order = {'QCD':1})
GC_110 = Coupling(name = 'GC_110',
value = '(cQlM3*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_111 = Coupling(name = 'GC_111',
value = '(cQmu*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_112 = Coupling(name = 'GC_112',
value = '(4*cQq13Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_113 = Coupling(name = 'GC_113',
value = '(cQQ8*complex(0,1))/(2.*Lambda**2)',
order = {'NP':2})
GC_114 = Coupling(name = 'GC_114',
value = '(4*cQq83Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_115 = Coupling(name = 'GC_115',
value = '(cQt1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_116 = Coupling(name = 'GC_116',
value = '(cQt8*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_117 = Coupling(name = 'GC_117',
value = '(cQta*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_118 = Coupling(name = 'GC_118',
value = '(cQu1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_119 = Coupling(name = 'GC_119',
value = '(cQu8*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_12 = Coupling(name = 'GC_12',
value = 'complex(0,1)*G**2',
order = {'QCD':2})
GC_120 = Coupling(name = 'GC_120',
value = '(ctd1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_121 = Coupling(name = 'GC_121',
value = '(ctd8*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_122 = Coupling(name = 'GC_122',
value = '(cte*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_123 = Coupling(name = 'GC_123',
value = '(ctl1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_124 = Coupling(name = 'GC_124',
value = '(ctl2*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_125 = Coupling(name = 'GC_125',
value = '(ctl3*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_126 = Coupling(name = 'GC_126',
value = '-((ctlS3*complex(0,1))/Lambda**2)',
order = {'NP':2})
GC_127 = Coupling(name = 'GC_127',
value = '(ctlS3*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_128 = Coupling(name = 'GC_128',
value = '-(ctlT3*complex(0,1))/(4.*Lambda**2)',
order = {'NP':2})
GC_129 = Coupling(name = 'GC_129',
value = '(ctlT3*complex(0,1))/(4.*Lambda**2)',
order = {'NP':2})
GC_13 = Coupling(name = 'GC_13',
value = '(2*cdp*complex(0,1))/Lambda**2 - (cpDC*complex(0,1))/(2.*Lambda**2)',
order = {'NP':2,'QED':2})
GC_130 = Coupling(name = 'GC_130',
value = '-(ctlT3*complex(0,1))/(2.*Lambda**2)',
order = {'NP':2})
GC_131 = Coupling(name = 'GC_131',
value = '(ctlT3*complex(0,1))/(2.*Lambda**2)',
order = {'NP':2})
GC_132 = Coupling(name = 'GC_132',
value = '(ctmu*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_133 = Coupling(name = 'GC_133',
value = '(-2*ctp)/Lambda**2',
order = {'NP':2,'QED':3})
GC_134 = Coupling(name = 'GC_134',
value = '-(ctp/Lambda**2)',
order = {'NP':2,'QED':3})
GC_135 = Coupling(name = 'GC_135',
value = 'ctp/Lambda**2',
order = {'NP':2,'QED':3})
GC_136 = Coupling(name = 'GC_136',
value = '(2*ctp)/Lambda**2',
order = {'NP':2,'QED':3})
GC_137 = Coupling(name = 'GC_137',
value = '(ctp*complex(0,1))/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':3})
GC_138 = Coupling(name = 'GC_138',
value = '(3*ctp*complex(0,1))/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':3})
GC_139 = Coupling(name = 'GC_139',
value = 'ctp/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':3})
GC_14 = Coupling(name = 'GC_14',
value = '(4*cdp*complex(0,1))/Lambda**2 - (cpDC*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_140 = Coupling(name = 'GC_140',
value = '(3*ctp)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':3})
GC_141 = Coupling(name = 'GC_141',
value = '(ctq1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_142 = Coupling(name = 'GC_142',
value = '(ctq8*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_143 = Coupling(name = 'GC_143',
value = '(2*ctt1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_144 = Coupling(name = 'GC_144',
value = '(ctta*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_145 = Coupling(name = 'GC_145',
value = '(2*ctu1Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_146 = Coupling(name = 'GC_146',
value = '(2*ctu8Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_147 = Coupling(name = 'GC_147',
value = '-(ctW/Lambda**2)',
order = {'NP':2,'QED':2})
GC_148 = Coupling(name = 'GC_148',
value = '(ctW*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_149 = Coupling(name = 'GC_149',
value = 'ctW/Lambda**2',
order = {'NP':2,'QED':2})
GC_15 = Coupling(name = 'GC_15',
value = '-((c3pl1*complex(0,1))/Lambda**2) - (cpl1*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_150 = Coupling(name = 'GC_150',
value = '-((ctW*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_151 = Coupling(name = 'GC_151',
value = '(ctW*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_152 = Coupling(name = 'GC_152',
value = '(-2*cpWB*cw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_153 = Coupling(name = 'GC_153',
value = '(-2*cpWB*cw0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_154 = Coupling(name = 'GC_154',
value = '(2*cpWB*cw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_155 = Coupling(name = 'GC_155',
value = '(-6*cw0*cWWW*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':1})
GC_156 = Coupling(name = 'GC_156',
value = '-((c3pl1*ee0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_157 = Coupling(name = 'GC_157',
value = '-((c3pl1*ee0*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_158 = Coupling(name = 'GC_158',
value = '(c3pl1*ee0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':3})
GC_159 = Coupling(name = 'GC_159',
value = '-((c3pl2*ee0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_16 = Coupling(name = 'GC_16',
value = '(c3pl1*complex(0,1))/Lambda**2 - (cpl1*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_160 = Coupling(name = 'GC_160',
value = '-((c3pl2*ee0*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_161 = Coupling(name = 'GC_161',
value = '(c3pl2*ee0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':3})
GC_162 = Coupling(name = 'GC_162',
value = '-((c3pl3*ee0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_163 = Coupling(name = 'GC_163',
value = '-((c3pl3*ee0*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_164 = Coupling(name = 'GC_164',
value = '(c3pl3*ee0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':3})
GC_165 = Coupling(name = 'GC_165',
value = '-((c3pQ3Internal*ee0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_166 = Coupling(name = 'GC_166',
value = '-((c3pQ3Internal*ee0*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_167 = Coupling(name = 'GC_167',
value = '(c3pQ3Internal*ee0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':3})
GC_168 = Coupling(name = 'GC_168',
value = '-((c3pqiInternal*ee0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_169 = Coupling(name = 'GC_169',
value = '-((c3pqiInternal*ee0*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_17 = Coupling(name = 'GC_17',
value = '-(c3pl1/Lambda**2) + cpl1/Lambda**2',
order = {'NP':2,'QED':2})
GC_170 = Coupling(name = 'GC_170',
value = '(c3pqiInternal*ee0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':3})
GC_171 = Coupling(name = 'GC_171',
value = '(2*cpd*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_172 = Coupling(name = 'GC_172',
value = '(-2*cpDC*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_173 = Coupling(name = 'GC_173',
value = '(cpDC*ee0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_174 = Coupling(name = 'GC_174',
value = '(2*cpe*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_175 = Coupling(name = 'GC_175',
value = '(2*cpmu*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_176 = Coupling(name = 'GC_176',
value = '(2*cpt*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_177 = Coupling(name = 'GC_177',
value = '(2*cpta*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_178 = Coupling(name = 'GC_178',
value = '(2*cpu*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_179 = Coupling(name = 'GC_179',
value = '(-4*cpW*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_18 = Coupling(name = 'GC_18',
value = 'c3pl1/Lambda**2 + cpl1/Lambda**2',
order = {'NP':2,'QED':2})
GC_180 = Coupling(name = 'GC_180',
value = '(-2*cpWB*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_181 = Coupling(name = 'GC_181',
value = '(2*cpWB*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_182 = Coupling(name = 'GC_182',
value = '-((ctW*ee0)/Lambda**2)',
order = {'NP':2,'QED':3})
GC_183 = Coupling(name = 'GC_183',
value = '-((ctW*ee0*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':3})
GC_184 = Coupling(name = 'GC_184',
value = '(ctW*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_185 = Coupling(name = 'GC_185',
value = '(ctW*ee0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':3})
GC_186 = Coupling(name = 'GC_186',
value = '(-2*cpWB*cw0*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_187 = Coupling(name = 'GC_187',
value = '(2*cpWB*cw0*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_188 = Coupling(name = 'GC_188',
value = '(2*cpWB*cw0*ee0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_189 = Coupling(name = 'GC_189',
value = '(6*cw0*cWWW*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_19 = Coupling(name = 'GC_19',
value = '-((c3pl2*complex(0,1))/Lambda**2) - (cpl2*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_190 = Coupling(name = 'GC_190',
value = '(8*cpDC*ee0**2*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':4})
GC_191 = Coupling(name = 'GC_191',
value = '(-4*cpW*ee0**2*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':4})
GC_192 = Coupling(name = 'GC_192',
value = '(6*cw0*cWWW*ee0**2*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_193 = Coupling(name = 'GC_193',
value = '(-6*cG*G)/Lambda**2',
order = {'NP':2,'QCD':1})
GC_194 = Coupling(name = 'GC_194',
value = '(4*cpG*G)/Lambda**2',
order = {'NP':2,'QCD':1,'QED':2})
GC_195 = Coupling(name = 'GC_195',
value = '-((ctG*G)/Lambda**2)',
order = {'NP':2,'QCD':1,'QED':1})
GC_196 = Coupling(name = 'GC_196',
value = '(ctG*G)/Lambda**2',
order = {'NP':2,'QCD':1,'QED':1})
GC_197 = Coupling(name = 'GC_197',
value = '(ctG*complex(0,1)*G)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QCD':1,'QED':1})
GC_198 = Coupling(name = 'GC_198',
value = '(ctG*G)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QCD':1,'QED':1})
GC_199 = Coupling(name = 'GC_199',
value = '(6*cG*complex(0,1)*G**2)/Lambda**2',
order = {'NP':2,'QCD':2})
GC_2 = Coupling(name = 'GC_2',
value = '(2*ee0*complex(0,1))/3.',
order = {'QED':1})
GC_20 = Coupling(name = 'GC_20',
value = '(c3pl2*complex(0,1))/Lambda**2 - (cpl2*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_200 = Coupling(name = 'GC_200',
value = '(-4*cpG*complex(0,1)*G**2)/Lambda**2',
order = {'NP':2,'QCD':2,'QED':2})
GC_201 = Coupling(name = 'GC_201',
value = '-((ctG*complex(0,1)*G**2)/Lambda**2)',
order = {'NP':2,'QCD':2,'QED':1})
GC_202 = Coupling(name = 'GC_202',
value = '(ctG*complex(0,1)*G**2)/Lambda**2',
order = {'NP':2,'QCD':2,'QED':1})
GC_203 = Coupling(name = 'GC_203',
value = '-((ctG*G**2)/(Lambda**2*cmath.sqrt(2)))',
order = {'NP':2,'QCD':2,'QED':1})
GC_204 = Coupling(name = 'GC_204',
value = '-((ctG*complex(0,1)*G**2)/(Lambda**2*cmath.sqrt(2)))',
order = {'NP':2,'QCD':2,'QED':1})
GC_205 = Coupling(name = 'GC_205',
value = '(-3*cG*G**3)/Lambda**2',
order = {'NP':2,'QCD':3})
GC_206 = Coupling(name = 'GC_206',
value = '(3*cG*G**3)/Lambda**2',
order = {'NP':2,'QCD':3})
GC_207 = Coupling(name = 'GC_207',
value = '-((cG*complex(0,1)*G**4)/Lambda**2)',
order = {'NP':2,'QCD':4})
GC_208 = Coupling(name = 'GC_208',
value = '(cG*complex(0,1)*G**4)/Lambda**2',
order = {'NP':2,'QCD':4})
GC_209 = Coupling(name = 'GC_209',
value = '(c3pl1*complex(0,1)*MH**2)/Lambda**2 + (c3pl2*complex(0,1)*MH**2)/Lambda**2 - (cll1221*complex(0,1)*MH**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_21 = Coupling(name = 'GC_21',
value = '-(c3pl2/Lambda**2) + cpl2/Lambda**2',
order = {'NP':2,'QED':2})
GC_210 = Coupling(name = 'GC_210',
value = '(c3pl1*complex(0,1)*MH**2)/Lambda**2 + (c3pl2*complex(0,1)*MH**2)/Lambda**2 + (2*cdp*complex(0,1)*MH**2)/Lambda**2 - (cll1221*complex(0,1)*MH**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_211 = Coupling(name = 'GC_211',
value = '(c3pl1*complex(0,1)*MH**2)/Lambda**2 + (c3pl2*complex(0,1)*MH**2)/Lambda**2 - (cll1221*complex(0,1)*MH**2)/Lambda**2 + (cpDC*complex(0,1)*MH**2)/(2.*Lambda**2)',
order = {'NP':2,'QED':2})
GC_212 = Coupling(name = 'GC_212',
value = '(2*c3pl1*complex(0,1)*MH**2)/Lambda**2 + (2*c3pl2*complex(0,1)*MH**2)/Lambda**2 + (4*cdp*complex(0,1)*MH**2)/Lambda**2 - (2*cll1221*complex(0,1)*MH**2)/Lambda**2 - (cpDC*complex(0,1)*MH**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_213 = Coupling(name = 'GC_213',
value = '(3*c3pl1*complex(0,1)*MH**2)/Lambda**2 + (3*c3pl2*complex(0,1)*MH**2)/Lambda**2 - (6*cdp*complex(0,1)*MH**2)/Lambda**2 - (3*cll1221*complex(0,1)*MH**2)/Lambda**2 + (3*cpDC*complex(0,1)*MH**2)/(2.*Lambda**2)',
order = {'NP':2,'QED':2})
GC_214 = Coupling(name = 'GC_214',
value = '(3*c3pl1*complex(0,1)*MH**2)/Lambda**2 + (3*c3pl2*complex(0,1)*MH**2)/Lambda**2 + (6*cdp*complex(0,1)*MH**2)/Lambda**2 - (3*cll1221*complex(0,1)*MH**2)/Lambda**2 + (3*cpDC*complex(0,1)*MH**2)/(2.*Lambda**2)',
order = {'NP':2,'QED':2})
GC_215 = Coupling(name = 'GC_215',
value = '(-3*cpDC*ee0**2)/(2.*cw0*Lambda**2) - (3*cpDC*cw0*ee0**2)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_216 = Coupling(name = 'GC_216',
value = '(cpDC*ee0**2)/(cw0*Lambda**2) - (cpDC*cw0*ee0**2)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_217 = Coupling(name = 'GC_217',
value = '-(cpDC*ee0**2)/(2.*cw0*Lambda**2) - (cpDC*cw0*ee0**2)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_218 = Coupling(name = 'GC_218',
value = '-(cpDC*ee0**2*complex(0,1))/(2.*cw0*Lambda**2) - (cpDC*cw0*ee0**2*complex(0,1))/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_219 = Coupling(name = 'GC_219',
value = '-((cpDC*ee0**2*complex(0,1))/(cw0*Lambda**2)) + (cpDC*cw0*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_22 = Coupling(name = 'GC_22',
value = 'c3pl2/Lambda**2 + cpl2/Lambda**2',
order = {'NP':2,'QED':2})
GC_220 = Coupling(name = 'GC_220',
value = '(-3*cpDC*ee0**2*complex(0,1))/(2.*cw0*Lambda**2) - (3*cpDC*cw0*ee0**2*complex(0,1))/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_221 = Coupling(name = 'GC_221',
value = '(cpDC*ee0**2)/(2.*cw0*Lambda**2) + (cpDC*cw0*ee0**2)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_222 = Coupling(name = 'GC_222',
value = '-((cpDC*ee0**2)/(cw0*Lambda**2)) + (cpDC*cw0*ee0**2)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_223 = Coupling(name = 'GC_223',
value = '(3*cpDC*ee0**2)/(2.*cw0*Lambda**2) + (3*cpDC*cw0*ee0**2)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_224 = Coupling(name = 'GC_224',
value = '(24*cw0**2*cWWW*ee0**3*complex(0,1))/(Lambda**2*sw0**3)',
order = {'NP':2,'QED':4})
GC_225 = Coupling(name = 'GC_225',
value = '(ee0**2*complex(0,1))/(2.*sw0**2)',
order = {'QED':2})
GC_226 = Coupling(name = 'GC_226',
value = '-((ee0**2*complex(0,1))/sw0**2)',
order = {'QED':2})
GC_227 = Coupling(name = 'GC_227',
value = '(cw0**2*ee0**2*complex(0,1))/sw0**2',
order = {'QED':2})
GC_228 = Coupling(name = 'GC_228',
value = '(-2*cpDC*ee0**2)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_229 = Coupling(name = 'GC_229',
value = '(cpDC*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_23 = Coupling(name = 'GC_23',
value = '-((c3pl3*complex(0,1))/Lambda**2) - (cpl3*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_230 = Coupling(name = 'GC_230',
value = '(-2*cpDC*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_231 = Coupling(name = 'GC_231',
value = '(2*cpDC*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_232 = Coupling(name = 'GC_232',
value = '(2*cpDC*ee0**2)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_233 = Coupling(name = 'GC_233',
value = '(4*cpW*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_234 = Coupling(name = 'GC_234',
value = '(-4*cpW*cw0**2*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_235 = Coupling(name = 'GC_235',
value = '(6*cw0*cWWW*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_236 = Coupling(name = 'GC_236',
value = '(6*cw0**3*cWWW*ee0**2*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_237 = Coupling(name = 'GC_237',
value = '(-24*cw0*cWWW*ee0**3*complex(0,1))/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':4})
GC_238 = Coupling(name = 'GC_238',
value = '-ee0/(2.*sw0)',
order = {'QED':1})
GC_239 = Coupling(name = 'GC_239',
value = '-(ee0*complex(0,1))/(2.*sw0)',
order = {'QED':1})
GC_24 = Coupling(name = 'GC_24',
value = '(c3pl3*complex(0,1))/Lambda**2 - (cpl3*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_240 = Coupling(name = 'GC_240',
value = '(ee0*complex(0,1))/(2.*sw0)',
order = {'QED':1})
GC_241 = Coupling(name = 'GC_241',
value = '(ee0*complex(0,1))/(sw0*cmath.sqrt(2))',
order = {'QED':1})
GC_242 = Coupling(name = 'GC_242',
value = '-(cw0*ee0*complex(0,1))/(2.*sw0)',
order = {'QED':1})
GC_243 = Coupling(name = 'GC_243',
value = '(cw0*ee0*complex(0,1))/(2.*sw0)',
order = {'QED':1})
GC_244 = Coupling(name = 'GC_244',
value = '-((cw0*ee0*complex(0,1))/sw0)',
order = {'QED':1})
GC_245 = Coupling(name = 'GC_245',
value = '(cw0*ee0*complex(0,1))/sw0',
order = {'QED':1})
GC_246 = Coupling(name = 'GC_246',
value = '-ee0**2/(2.*sw0)',
order = {'QED':2})
GC_247 = Coupling(name = 'GC_247',
value = '-(ee0**2*complex(0,1))/(2.*sw0)',
order = {'QED':2})
GC_248 = Coupling(name = 'GC_248',
value = 'ee0**2/(2.*sw0)',
order = {'QED':2})
GC_249 = Coupling(name = 'GC_249',
value = '(2*cw0*ee0**2*complex(0,1))/sw0',
order = {'QED':2})
GC_25 = Coupling(name = 'GC_25',
value = '-(c3pl3/Lambda**2) + cpl3/Lambda**2',
order = {'NP':2,'QED':2})
GC_250 = Coupling(name = 'GC_250',
value = '(c3pl1*ee0*complex(0,1)*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_251 = Coupling(name = 'GC_251',
value = '(c3pl2*ee0*complex(0,1)*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_252 = Coupling(name = 'GC_252',
value = '(c3pl3*ee0*complex(0,1)*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_253 = Coupling(name = 'GC_253',
value = '(c3pQ3Internal*ee0*complex(0,1)*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_254 = Coupling(name = 'GC_254',
value = '(c3pqiInternal*ee0*complex(0,1)*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_255 = Coupling(name = 'GC_255',
value = '-((cpd*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_256 = Coupling(name = 'GC_256',
value = '-((cpd*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_257 = Coupling(name = 'GC_257',
value = '(cpd*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_258 = Coupling(name = 'GC_258',
value = '-(cpDC*ee0)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_259 = Coupling(name = 'GC_259',
value = '(cpDC*ee0*complex(0,1))/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_26 = Coupling(name = 'GC_26',
value = 'c3pl3/Lambda**2 + cpl3/Lambda**2',
order = {'NP':2,'QED':2})
GC_260 = Coupling(name = 'GC_260',
value = '-((cpDC*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_261 = Coupling(name = 'GC_261',
value = '(cpDC*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_262 = Coupling(name = 'GC_262',
value = '(cpDC*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_263 = Coupling(name = 'GC_263',
value = '-((cpe*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_264 = Coupling(name = 'GC_264',
value = '-((cpe*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_265 = Coupling(name = 'GC_265',
value = '(cpe*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_266 = Coupling(name = 'GC_266',
value = '-((cpl1*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_267 = Coupling(name = 'GC_267',
value = '-((cpl1*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_268 = Coupling(name = 'GC_268',
value = '(cpl1*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_269 = Coupling(name = 'GC_269',
value = '-((cpl2*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_27 = Coupling(name = 'GC_27',
value = '-((c3pQ3Internal*complex(0,1))/Lambda**2) - (cpQ3Internal*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_270 = Coupling(name = 'GC_270',
value = '-((cpl2*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_271 = Coupling(name = 'GC_271',
value = '(cpl2*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_272 = Coupling(name = 'GC_272',
value = '-((cpl3*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_273 = Coupling(name = 'GC_273',
value = '-((cpl3*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_274 = Coupling(name = 'GC_274',
value = '(cpl3*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_275 = Coupling(name = 'GC_275',
value = '-((cpmu*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_276 = Coupling(name = 'GC_276',
value = '-((cpmu*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_277 = Coupling(name = 'GC_277',
value = '(cpmu*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_278 = Coupling(name = 'GC_278',
value = '-((cpQ3Internal*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_279 = Coupling(name = 'GC_279',
value = '-((cpQ3Internal*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_28 = Coupling(name = 'GC_28',
value = '(c3pQ3Internal*complex(0,1))/Lambda**2 - (cpQ3Internal*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_280 = Coupling(name = 'GC_280',
value = '(cpQ3Internal*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_281 = Coupling(name = 'GC_281',
value = '-((cpqiInternal*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_282 = Coupling(name = 'GC_282',
value = '-((cpqiInternal*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_283 = Coupling(name = 'GC_283',
value = '(cpqiInternal*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_284 = Coupling(name = 'GC_284',
value = '-((cpt*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_285 = Coupling(name = 'GC_285',
value = '-((cpt*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_286 = Coupling(name = 'GC_286',
value = '(cpt*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_287 = Coupling(name = 'GC_287',
value = '-((cpta*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_288 = Coupling(name = 'GC_288',
value = '-((cpta*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_289 = Coupling(name = 'GC_289',
value = '(cpta*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_29 = Coupling(name = 'GC_29',
value = '-(c3pQ3Internal/Lambda**2) + cpQ3Internal/Lambda**2',
order = {'NP':2,'QED':2})
GC_290 = Coupling(name = 'GC_290',
value = '-((cpu*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_291 = Coupling(name = 'GC_291',
value = '-((cpu*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_292 = Coupling(name = 'GC_292',
value = '(cpu*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_293 = Coupling(name = 'GC_293',
value = '-((ctW*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_294 = Coupling(name = 'GC_294',
value = '(ctW*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_295 = Coupling(name = 'GC_295',
value = '(ctW*ee0*complex(0,1))/(Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':3})
GC_296 = Coupling(name = 'GC_296',
value = '(ctW*ee0)/(Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':3})
GC_297 = Coupling(name = 'GC_297',
value = '(4*cpW*cw0*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_298 = Coupling(name = 'GC_298',
value = '(-2*cpWB*cw0*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_299 = Coupling(name = 'GC_299',
value = '(2*cpWB*cw0*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_3 = Coupling(name = 'GC_3',
value = '-(ee0*complex(0,1))',
order = {'QED':1})
GC_30 = Coupling(name = 'GC_30',
value = 'c3pQ3Internal/Lambda**2 + cpQ3Internal/Lambda**2',
order = {'NP':2,'QED':2})
GC_300 = Coupling(name = 'GC_300',
value = '-((ctW*cw0*ee0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_301 = Coupling(name = 'GC_301',
value = '-((ctW*cw0*ee0*complex(0,1))/(Lambda**2*sw0))',
order = {'NP':2,'QED':3})
GC_302 = Coupling(name = 'GC_302',
value = '(ctW*cw0*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_303 = Coupling(name = 'GC_303',
value = '(ctW*cw0*ee0*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_304 = Coupling(name = 'GC_304',
value = '(-2*cpWB*cw0**2*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_305 = Coupling(name = 'GC_305',
value = '(2*cpWB*cw0**2*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_306 = Coupling(name = 'GC_306',
value = '(2*cpWB*cw0**2*ee0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_307 = Coupling(name = 'GC_307',
value = '(-6*cWWW*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_308 = Coupling(name = 'GC_308',
value = '(6*cw0**2*cWWW*ee0*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_309 = Coupling(name = 'GC_309',
value = '(-2*cpDC*ee0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':4})
GC_31 = Coupling(name = 'GC_31',
value = '-((c3pqiInternal*complex(0,1))/Lambda**2) - (cpqiInternal*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_310 = Coupling(name = 'GC_310',
value = '(-2*cpDC*ee0**2*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':4})
GC_311 = Coupling(name = 'GC_311',
value = '(2*cpDC*ee0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':4})
GC_312 = Coupling(name = 'GC_312',
value = '(-8*cpW*cw0*ee0**2*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':4})
GC_313 = Coupling(name = 'GC_313',
value = '(-6*cWWW*ee0**2*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_314 = Coupling(name = 'GC_314',
value = '(-6*cw0**2*cWWW*ee0**2*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_315 = Coupling(name = 'GC_315',
value = '(24*cWWW*ee0**3*complex(0,1))/(Lambda**2*sw0)',
order = {'NP':2,'QED':4})
GC_316 = Coupling(name = 'GC_316',
value = '(ee0*complex(0,1)*sw0)/(6.*cw0)',
order = {'QED':1})
GC_317 = Coupling(name = 'GC_317',
value = '-(ee0*complex(0,1)*sw0)/(2.*cw0)',
order = {'QED':1})
GC_318 = Coupling(name = 'GC_318',
value = '(-2*cpWB*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_319 = Coupling(name = 'GC_319',
value = '(-2*cpWB*complex(0,1)*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_32 = Coupling(name = 'GC_32',
value = '(c3pqiInternal*complex(0,1))/Lambda**2 - (cpqiInternal*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_320 = Coupling(name = 'GC_320',
value = '(2*cpWB*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_321 = Coupling(name = 'GC_321',
value = '(6*cWWW*complex(0,1)*sw0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_322 = Coupling(name = 'GC_322',
value = '(-2*cpWB*ee0*complex(0,1)*sw0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_323 = Coupling(name = 'GC_323',
value = '(2*cpWB*ee0*complex(0,1)*sw0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_324 = Coupling(name = 'GC_324',
value = '(2*cpWB*ee0*sw0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_325 = Coupling(name = 'GC_325',
value = '-((c3pl1*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_326 = Coupling(name = 'GC_326',
value = '-((c3pl1*ee0*complex(0,1)*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_327 = Coupling(name = 'GC_327',
value = '(c3pl1*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_328 = Coupling(name = 'GC_328',
value = '-((c3pl2*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_329 = Coupling(name = 'GC_329',
value = '-((c3pl2*ee0*complex(0,1)*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_33 = Coupling(name = 'GC_33',
value = '-(c3pqiInternal/Lambda**2) + cpqiInternal/Lambda**2',
order = {'NP':2,'QED':2})
GC_330 = Coupling(name = 'GC_330',
value = '(c3pl2*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_331 = Coupling(name = 'GC_331',
value = '-((c3pl3*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_332 = Coupling(name = 'GC_332',
value = '-((c3pl3*ee0*complex(0,1)*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_333 = Coupling(name = 'GC_333',
value = '(c3pl3*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_334 = Coupling(name = 'GC_334',
value = '-((c3pQ3Internal*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_335 = Coupling(name = 'GC_335',
value = '-((c3pQ3Internal*ee0*complex(0,1)*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_336 = Coupling(name = 'GC_336',
value = '(c3pQ3Internal*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_337 = Coupling(name = 'GC_337',
value = '-((c3pqiInternal*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_338 = Coupling(name = 'GC_338',
value = '-((c3pqiInternal*ee0*complex(0,1)*sw0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':3})
GC_339 = Coupling(name = 'GC_339',
value = '(c3pqiInternal*ee0*sw0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_34 = Coupling(name = 'GC_34',
value = 'c3pqiInternal/Lambda**2 + cpqiInternal/Lambda**2',
order = {'NP':2,'QED':2})
GC_340 = Coupling(name = 'GC_340',
value = '(6*cWWW*ee0*complex(0,1)*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_341 = Coupling(name = 'GC_341',
value = '(-12*cWWW*ee0**2*complex(0,1)*sw0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_342 = Coupling(name = 'GC_342',
value = '-(cw0*ee0*complex(0,1))/(2.*sw0) - (ee0*complex(0,1)*sw0)/(2.*cw0)',
order = {'QED':1})
GC_343 = Coupling(name = 'GC_343',
value = '(cw0*ee0*complex(0,1))/(2.*sw0) - (ee0*complex(0,1)*sw0)/(2.*cw0)',
order = {'QED':1})
GC_344 = Coupling(name = 'GC_344',
value = '(cw0*ee0)/(2.*sw0) + (ee0*sw0)/(2.*cw0)',
order = {'QED':1})
GC_345 = Coupling(name = 'GC_345',
value = '-((cw0*ee0**2*complex(0,1))/sw0) + (ee0**2*complex(0,1)*sw0)/cw0',
order = {'QED':2})
GC_346 = Coupling(name = 'GC_346',
value = '-((ctW*cw0)/Lambda**2) - (ctB*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_347 = Coupling(name = 'GC_347',
value = '(ctW*cw0)/Lambda**2 + (ctB*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_348 = Coupling(name = 'GC_348',
value = '-((ctW*cw0*complex(0,1))/(Lambda**2*cmath.sqrt(2))) + (ctB*complex(0,1)*sw0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_349 = Coupling(name = 'GC_349',
value = '-((ctW*cw0)/(Lambda**2*cmath.sqrt(2))) + (ctB*sw0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_35 = Coupling(name = 'GC_35',
value = '(2*cQl31*complex(0,1))/Lambda**2 + (cQlM1*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_350 = Coupling(name = 'GC_350',
value = '(ctB*cw0)/Lambda**2 - (ctW*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_351 = Coupling(name = 'GC_351',
value = '-((ctB*cw0)/Lambda**2) + (ctW*sw0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_352 = Coupling(name = 'GC_352',
value = '(ctB*cw0*complex(0,1))/(Lambda**2*cmath.sqrt(2)) + (ctW*complex(0,1)*sw0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_353 = Coupling(name = 'GC_353',
value = '(ctB*cw0)/(Lambda**2*cmath.sqrt(2)) + (ctW*sw0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_354 = Coupling(name = 'GC_354',
value = '-((cpd*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpd*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_355 = Coupling(name = 'GC_355',
value = '(cpd*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpd*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_356 = Coupling(name = 'GC_356',
value = '-(cpDC*cw0*ee0*complex(0,1))/(2.*Lambda**2*sw0) - (cpDC*ee0*complex(0,1)*sw0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_357 = Coupling(name = 'GC_357',
value = '(cpDC*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (cpDC*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_358 = Coupling(name = 'GC_358',
value = '-(cpDC*cw0*ee0)/(2.*Lambda**2*sw0) + (cpDC*ee0*sw0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_359 = Coupling(name = 'GC_359',
value = '(cpDC*cw0*ee0)/(2.*Lambda**2*sw0) + (cpDC*ee0*sw0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_36 = Coupling(name = 'GC_36',
value = '(2*cQl32*complex(0,1))/Lambda**2 + (cQlM2*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_360 = Coupling(name = 'GC_360',
value = '(3*cpDC*cw0*ee0)/(2.*Lambda**2*sw0) + (3*cpDC*ee0*sw0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_361 = Coupling(name = 'GC_361',
value = '-((cpe*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpe*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_362 = Coupling(name = 'GC_362',
value = '(cpe*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpe*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_363 = Coupling(name = 'GC_363',
value = '(c3pl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (cpl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_364 = Coupling(name = 'GC_364',
value = '-((c3pl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_365 = Coupling(name = 'GC_365',
value = '-((c3pl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) - (cpl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_366 = Coupling(name = 'GC_366',
value = '(c3pl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpl1*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_367 = Coupling(name = 'GC_367',
value = '(c3pl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (cpl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_368 = Coupling(name = 'GC_368',
value = '-((c3pl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_369 = Coupling(name = 'GC_369',
value = '-((c3pl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) - (cpl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_37 = Coupling(name = 'GC_37',
value = '(2*cQl33*complex(0,1))/Lambda**2 + (cQlM3*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_370 = Coupling(name = 'GC_370',
value = '(c3pl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpl2*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_371 = Coupling(name = 'GC_371',
value = '(c3pl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (cpl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_372 = Coupling(name = 'GC_372',
value = '-((c3pl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_373 = Coupling(name = 'GC_373',
value = '-((c3pl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) - (cpl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_374 = Coupling(name = 'GC_374',
value = '(c3pl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpl3*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_375 = Coupling(name = 'GC_375',
value = '-((cpmu*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpmu*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_376 = Coupling(name = 'GC_376',
value = '(cpmu*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpmu*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_377 = Coupling(name = 'GC_377',
value = '(c3pQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (cpQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_378 = Coupling(name = 'GC_378',
value = '-((c3pQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_379 = Coupling(name = 'GC_379',
value = '-((c3pQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) - (cpQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_38 = Coupling(name = 'GC_38',
value = '(2*cQq11Internal*complex(0,1))/Lambda**2 - (2*cQq13Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_380 = Coupling(name = 'GC_380',
value = '(c3pQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpQ3Internal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_381 = Coupling(name = 'GC_381',
value = '(c3pqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (cpqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_382 = Coupling(name = 'GC_382',
value = '-((c3pqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) - (c3pqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_383 = Coupling(name = 'GC_383',
value = '-((c3pqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) - (cpqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_384 = Coupling(name = 'GC_384',
value = '(c3pqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpqiInternal*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (c3pqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_385 = Coupling(name = 'GC_385',
value = '-((cpt*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpt*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_386 = Coupling(name = 'GC_386',
value = '(cpt*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpt*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_387 = Coupling(name = 'GC_387',
value = '-((cpta*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpta*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_388 = Coupling(name = 'GC_388',
value = '(cpta*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpta*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_389 = Coupling(name = 'GC_389',
value = '-((cpu*cw0*ee0*complex(0,1))/(Lambda**2*sw0)) + (cpu*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_39 = Coupling(name = 'GC_39',
value = '(2*cQq11Internal*complex(0,1))/Lambda**2 + (2*cQq13Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_390 = Coupling(name = 'GC_390',
value = '(cpu*cw0*ee0*complex(0,1))/(Lambda**2*sw0) + (cpu*ee0*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_391 = Coupling(name = 'GC_391',
value = '(cpDC*cw0*ee0**2*complex(0,1))/(Lambda**2*sw0) + (cpDC*ee0**2*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':4})
GC_392 = Coupling(name = 'GC_392',
value = '(-4*cpDC*cw0*ee0**2*complex(0,1))/(Lambda**2*sw0) + (4*cpDC*ee0**2*complex(0,1)*sw0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':4})
GC_393 = Coupling(name = 'GC_393',
value = '-(ee0**2*complex(0,1)) + (cw0**2*ee0**2*complex(0,1))/(2.*sw0**2) + (ee0**2*complex(0,1)*sw0**2)/(2.*cw0**2)',
order = {'QED':2})
GC_394 = Coupling(name = 'GC_394',
value = 'ee0**2*complex(0,1) + (cw0**2*ee0**2*complex(0,1))/(2.*sw0**2) + (ee0**2*complex(0,1)*sw0**2)/(2.*cw0**2)',
order = {'QED':2})
GC_395 = Coupling(name = 'GC_395',
value = '(4*cpW*cw0**2*complex(0,1))/Lambda**2 - (4*cpWB*cw0*complex(0,1)*sw0)/Lambda**2 + (4*cpBB*complex(0,1)*sw0**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_396 = Coupling(name = 'GC_396',
value = '(4*cpW*cw0**2*complex(0,1))/Lambda**2 + (4*cpWB*cw0*complex(0,1)*sw0)/Lambda**2 + (4*cpBB*complex(0,1)*sw0**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_397 = Coupling(name = 'GC_397',
value = '(4*cpBB*cw0**2*complex(0,1))/Lambda**2 - (4*cpWB*cw0*complex(0,1)*sw0)/Lambda**2 + (4*cpW*complex(0,1)*sw0**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_398 = Coupling(name = 'GC_398',
value = '(4*cpBB*cw0**2*complex(0,1))/Lambda**2 + (4*cpWB*cw0*complex(0,1)*sw0)/Lambda**2 + (4*cpW*complex(0,1)*sw0**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_399 = Coupling(name = 'GC_399',
value = '(2*cpWB*cw0**2*complex(0,1))/Lambda**2 + (4*cpBB*cw0*complex(0,1)*sw0)/Lambda**2 - (4*cpW*cw0*complex(0,1)*sw0)/Lambda**2 - (2*cpWB*complex(0,1)*sw0**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_4 = Coupling(name = 'GC_4',
value = 'ee0*complex(0,1)',
order = {'QED':1})
GC_40 = Coupling(name = 'GC_40',
value = '(cQQ1*complex(0,1))/Lambda**2 - (cQQ8*complex(0,1))/(6.*Lambda**2)',
order = {'NP':2})
GC_400 = Coupling(name = 'GC_400',
value = '(-2*cpWB*cw0**2*complex(0,1))/Lambda**2 + (4*cpBB*cw0*complex(0,1)*sw0)/Lambda**2 - (4*cpW*cw0*complex(0,1)*sw0)/Lambda**2 + (2*cpWB*complex(0,1)*sw0**2)/Lambda**2',
order = {'NP':2,'QED':2})
GC_401 = Coupling(name = 'GC_401',
value = '-((cpDC*cw0**2*ee0**2*complex(0,1))/(Lambda**2*sw0**2)) + (cpDC*ee0**2*complex(0,1)*sw0**2)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':4})
GC_402 = Coupling(name = 'GC_402',
value = '(2*cpDC*ee0**2*complex(0,1))/Lambda**2 + (cpDC*cw0**2*ee0**2*complex(0,1))/(Lambda**2*sw0**2) + (cpDC*ee0**2*complex(0,1)*sw0**2)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':4})
GC_403 = Coupling(name = 'GC_403',
value = '(-4*cpDC*ee0**2*complex(0,1))/Lambda**2 + (2*cpDC*cw0**2*ee0**2*complex(0,1))/(Lambda**2*sw0**2) + (2*cpDC*ee0**2*complex(0,1)*sw0**2)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':4})
GC_404 = Coupling(name = 'GC_404',
value = '(6*cpDC*ee0**2*complex(0,1))/Lambda**2 + (3*cpDC*cw0**2*ee0**2*complex(0,1))/(Lambda**2*sw0**2) + (3*cpDC*ee0**2*complex(0,1)*sw0**2)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':4})
GC_405 = Coupling(name = 'GC_405',
value = '-((complex(0,1)*MH**2)/vev0**2)',
order = {'QED':2})
GC_406 = Coupling(name = 'GC_406',
value = '(-2*complex(0,1)*MH**2)/vev0**2',
order = {'QED':2})
GC_407 = Coupling(name = 'GC_407',
value = '(-3*complex(0,1)*MH**2)/vev0**2',
order = {'QED':2})
GC_408 = Coupling(name = 'GC_408',
value = '-((complex(0,1)*MH**2)/vev0)',
order = {'QED':1})
GC_409 = Coupling(name = 'GC_409',
value = '(-3*complex(0,1)*MH**2)/vev0',
order = {'QED':1})
GC_41 = Coupling(name = 'GC_41',
value = '(cQQ1*complex(0,1))/Lambda**2 + (cQQ8*complex(0,1))/(3.*Lambda**2)',
order = {'NP':2})
GC_410 = Coupling(name = 'GC_410',
value = '-(ee0**2*vev0)/(2.*cw0)',
order = {'QED':1})
GC_411 = Coupling(name = 'GC_411',
value = '(ee0**2*vev0)/(2.*cw0)',
order = {'QED':1})
GC_412 = Coupling(name = 'GC_412',
value = '-((c3pl1*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':1})
GC_413 = Coupling(name = 'GC_413',
value = '-((c3pl2*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':1})
GC_414 = Coupling(name = 'GC_414',
value = '-((c3pl3*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':1})
GC_415 = Coupling(name = 'GC_415',
value = '-((c3pQ3Internal*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':1})
GC_416 = Coupling(name = 'GC_416',
value = '-((c3pqiInternal*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':1})
GC_417 = Coupling(name = 'GC_417',
value = '(2*cdp*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_418 = Coupling(name = 'GC_418',
value = '(6*cp*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_419 = Coupling(name = 'GC_419',
value = '(12*cp*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_42 = Coupling(name = 'GC_42',
value = '(2*cQq81Internal*complex(0,1))/Lambda**2 - (2*cQq83Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_420 = Coupling(name = 'GC_420',
value = '(18*cp*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_421 = Coupling(name = 'GC_421',
value = '(90*cp*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_422 = Coupling(name = 'GC_422',
value = '(cpd*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_423 = Coupling(name = 'GC_423',
value = '-(cpDC*vev0)/(2.*Lambda**2)',
order = {'NP':2,'QED':1})
GC_424 = Coupling(name = 'GC_424',
value = '-((cpDC*complex(0,1)*vev0)/Lambda**2)',
order = {'NP':2,'QED':1})
GC_425 = Coupling(name = 'GC_425',
value = '(cpe*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_426 = Coupling(name = 'GC_426',
value = '(4*cpG*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_427 = Coupling(name = 'GC_427',
value = '(cpmu*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_428 = Coupling(name = 'GC_428',
value = '(cpt*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_429 = Coupling(name = 'GC_429',
value = '(cpta*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_43 = Coupling(name = 'GC_43',
value = '(2*cQq81Internal*complex(0,1))/Lambda**2 + (2*cQq83Internal*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_430 = Coupling(name = 'GC_430',
value = '(cpu*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_431 = Coupling(name = 'GC_431',
value = '(4*cpW*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_432 = Coupling(name = 'GC_432',
value = '-((ctp*vev0)/Lambda**2)',
order = {'NP':2,'QED':2})
GC_433 = Coupling(name = 'GC_433',
value = '(ctp*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_434 = Coupling(name = 'GC_434',
value = '-((ctp*vev0)/(Lambda**2*cmath.sqrt(2)))',
order = {'NP':2,'QED':2})
GC_435 = Coupling(name = 'GC_435',
value = '(ctp*complex(0,1)*vev0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_436 = Coupling(name = 'GC_436',
value = '(3*ctp*complex(0,1)*vev0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_437 = Coupling(name = 'GC_437',
value = '(ctW*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_438 = Coupling(name = 'GC_438',
value = '(-2*cpWB*cw0*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_439 = Coupling(name = 'GC_439',
value = '(2*cpWB*cw0*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_44 = Coupling(name = 'GC_44',
value = '(-2*c3pl1*ee0*complex(0,1))/Lambda**2 + (2*cpl1*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_440 = Coupling(name = 'GC_440',
value = '-((c3pl1*ee0*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_441 = Coupling(name = 'GC_441',
value = '(c3pl1*ee0*vev0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_442 = Coupling(name = 'GC_442',
value = '-((c3pl2*ee0*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_443 = Coupling(name = 'GC_443',
value = '(c3pl2*ee0*vev0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_444 = Coupling(name = 'GC_444',
value = '-((c3pl3*ee0*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_445 = Coupling(name = 'GC_445',
value = '(c3pl3*ee0*vev0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_446 = Coupling(name = 'GC_446',
value = '-((c3pQ3Internal*ee0*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_447 = Coupling(name = 'GC_447',
value = '(c3pQ3Internal*ee0*vev0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_448 = Coupling(name = 'GC_448',
value = '-((c3pqiInternal*ee0*vev0*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_449 = Coupling(name = 'GC_449',
value = '(c3pqiInternal*ee0*vev0*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_45 = Coupling(name = 'GC_45',
value = '(2*c3pl1*ee0*complex(0,1))/Lambda**2 + (2*cpl1*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_450 = Coupling(name = 'GC_450',
value = '(cpDC*ee0*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_451 = Coupling(name = 'GC_451',
value = '(-4*cpW*ee0*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_452 = Coupling(name = 'GC_452',
value = '(2*cpWB*ee0*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_453 = Coupling(name = 'GC_453',
value = '-((ctW*ee0*complex(0,1)*vev0)/Lambda**2)',
order = {'NP':2,'QED':2})
GC_454 = Coupling(name = 'GC_454',
value = '(ctW*ee0*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_455 = Coupling(name = 'GC_455',
value = '(2*cpWB*cw0*ee0*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_456 = Coupling(name = 'GC_456',
value = '(-4*cpW*ee0**2*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':3})
GC_457 = Coupling(name = 'GC_457',
value = '(4*cpG*G*vev0)/Lambda**2',
order = {'NP':2,'QCD':1,'QED':1})
GC_458 = Coupling(name = 'GC_458',
value = '(ctG*complex(0,1)*G*vev0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QCD':1})
GC_459 = Coupling(name = 'GC_459',
value = '(-4*cpG*complex(0,1)*G**2*vev0)/Lambda**2',
order = {'NP':2,'QCD':2,'QED':1})
GC_46 = Coupling(name = 'GC_46',
value = '(-2*c3pl2*ee0*complex(0,1))/Lambda**2 + (2*cpl2*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_460 = Coupling(name = 'GC_460',
value = '-((ctG*G**2*vev0)/(Lambda**2*cmath.sqrt(2)))',
order = {'NP':2,'QCD':2})
GC_461 = Coupling(name = 'GC_461',
value = '-(ee0**2*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_462 = Coupling(name = 'GC_462',
value = '-(ee0**2*complex(0,1)*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_463 = Coupling(name = 'GC_463',
value = '(ee0**2*complex(0,1)*vev0)/(2.*sw0**2)',
order = {'QED':1})
GC_464 = Coupling(name = 'GC_464',
value = '(ee0**2*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_465 = Coupling(name = 'GC_465',
value = '(-2*cpDC*ee0**2*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_466 = Coupling(name = 'GC_466',
value = '(cpDC*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_467 = Coupling(name = 'GC_467',
value = '(-2*cpDC*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_468 = Coupling(name = 'GC_468',
value = '(2*cpDC*ee0**2*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_469 = Coupling(name = 'GC_469',
value = '(4*cpW*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_47 = Coupling(name = 'GC_47',
value = '(2*c3pl2*ee0*complex(0,1))/Lambda**2 + (2*cpl2*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_470 = Coupling(name = 'GC_470',
value = '(-4*cpW*cw0**2*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_471 = Coupling(name = 'GC_471',
value = '-(ee0**2*vev0)/(2.*sw0)',
order = {'QED':1})
GC_472 = Coupling(name = 'GC_472',
value = '(ee0**2*vev0)/(2.*sw0)',
order = {'QED':1})
GC_473 = Coupling(name = 'GC_473',
value = '(c3pl1*ee0*complex(0,1)*vev0*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_474 = Coupling(name = 'GC_474',
value = '(c3pl2*ee0*complex(0,1)*vev0*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_475 = Coupling(name = 'GC_475',
value = '(c3pl3*ee0*complex(0,1)*vev0*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_476 = Coupling(name = 'GC_476',
value = '(c3pQ3Internal*ee0*complex(0,1)*vev0*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_477 = Coupling(name = 'GC_477',
value = '(c3pqiInternal*ee0*complex(0,1)*vev0*cmath.sqrt(2))/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_478 = Coupling(name = 'GC_478',
value = '-((cpd*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_479 = Coupling(name = 'GC_479',
value = '(cpd*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_48 = Coupling(name = 'GC_48',
value = '(-2*c3pl3*ee0*complex(0,1))/Lambda**2 + (2*cpl3*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_480 = Coupling(name = 'GC_480',
value = '-(cpDC*ee0*vev0)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_481 = Coupling(name = 'GC_481',
value = '-((cpDC*ee0*complex(0,1)*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_482 = Coupling(name = 'GC_482',
value = '(cpDC*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_483 = Coupling(name = 'GC_483',
value = '(cpDC*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_484 = Coupling(name = 'GC_484',
value = '-((cpe*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_485 = Coupling(name = 'GC_485',
value = '(cpe*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_486 = Coupling(name = 'GC_486',
value = '-((cpl1*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_487 = Coupling(name = 'GC_487',
value = '(cpl1*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_488 = Coupling(name = 'GC_488',
value = '-((cpl2*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_489 = Coupling(name = 'GC_489',
value = '(cpl2*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_49 = Coupling(name = 'GC_49',
value = '(2*c3pl3*ee0*complex(0,1))/Lambda**2 + (2*cpl3*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_490 = Coupling(name = 'GC_490',
value = '-((cpl3*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_491 = Coupling(name = 'GC_491',
value = '(cpl3*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_492 = Coupling(name = 'GC_492',
value = '-((cpmu*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_493 = Coupling(name = 'GC_493',
value = '(cpmu*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_494 = Coupling(name = 'GC_494',
value = '-((cpQ3Internal*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_495 = Coupling(name = 'GC_495',
value = '(cpQ3Internal*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_496 = Coupling(name = 'GC_496',
value = '-((cpqiInternal*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_497 = Coupling(name = 'GC_497',
value = '(cpqiInternal*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_498 = Coupling(name = 'GC_498',
value = '-((cpt*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_499 = Coupling(name = 'GC_499',
value = '(cpt*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_5 = Coupling(name = 'GC_5',
value = 'ee0**2*complex(0,1)',
order = {'QED':2})
GC_50 = Coupling(name = 'GC_50',
value = '(-2*c3pQ3Internal*ee0*complex(0,1))/Lambda**2 + (2*cpQ3Internal*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_500 = Coupling(name = 'GC_500',
value = '-((cpta*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_501 = Coupling(name = 'GC_501',
value = '(cpta*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_502 = Coupling(name = 'GC_502',
value = '-((cpu*ee0*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_503 = Coupling(name = 'GC_503',
value = '(cpu*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_504 = Coupling(name = 'GC_504',
value = '(ctW*ee0*complex(0,1)*vev0)/(Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':2})
GC_505 = Coupling(name = 'GC_505',
value = '(4*cpW*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_506 = Coupling(name = 'GC_506',
value = '(2*cpWB*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_507 = Coupling(name = 'GC_507',
value = '-((ctW*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0))',
order = {'NP':2,'QED':2})
GC_508 = Coupling(name = 'GC_508',
value = '(ctW*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_509 = Coupling(name = 'GC_509',
value = '(2*cpWB*cw0**2*ee0*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_51 = Coupling(name = 'GC_51',
value = '(2*c3pQ3Internal*ee0*complex(0,1))/Lambda**2 + (2*cpQ3Internal*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_510 = Coupling(name = 'GC_510',
value = '(-2*cpDC*ee0**2*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_511 = Coupling(name = 'GC_511',
value = '(2*cpDC*ee0**2*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_512 = Coupling(name = 'GC_512',
value = '(-8*cpW*cw0*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0)',
order = {'NP':2,'QED':3})
GC_513 = Coupling(name = 'GC_513',
value = '(-2*cpWB*sw0*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_514 = Coupling(name = 'GC_514',
value = '(2*cpWB*sw0*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_515 = Coupling(name = 'GC_515',
value = '(2*cpWB*ee0*sw0*vev0)/Lambda**2',
order = {'NP':2,'QED':2})
GC_516 = Coupling(name = 'GC_516',
value = '-((c3pl1*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':2})
GC_517 = Coupling(name = 'GC_517',
value = '(c3pl1*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_518 = Coupling(name = 'GC_518',
value = '-((c3pl2*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':2})
GC_519 = Coupling(name = 'GC_519',
value = '(c3pl2*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_52 = Coupling(name = 'GC_52',
value = '(-2*c3pqiInternal*ee0*complex(0,1))/Lambda**2 + (2*cpqiInternal*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_520 = Coupling(name = 'GC_520',
value = '-((c3pl3*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':2})
GC_521 = Coupling(name = 'GC_521',
value = '(c3pl3*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_522 = Coupling(name = 'GC_522',
value = '-((c3pQ3Internal*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':2})
GC_523 = Coupling(name = 'GC_523',
value = '(c3pQ3Internal*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_524 = Coupling(name = 'GC_524',
value = '-((c3pqiInternal*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2))',
order = {'NP':2,'QED':2})
GC_525 = Coupling(name = 'GC_525',
value = '(c3pqiInternal*ee0*sw0*vev0*cmath.sqrt(2))/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_526 = Coupling(name = 'GC_526',
value = '(6*cp*complex(0,1)*vev0**2)/Lambda**2',
order = {'NP':2,'QED':1})
GC_527 = Coupling(name = 'GC_527',
value = '(36*cp*complex(0,1)*vev0**2)/Lambda**2',
order = {'NP':2,'QED':1})
GC_528 = Coupling(name = 'GC_528',
value = '-((cpWB*ee0*complex(0,1)*vev0**2)/Lambda**2)',
order = {'NP':2,'QED':1})
GC_529 = Coupling(name = 'GC_529',
value = '-(cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_53 = Coupling(name = 'GC_53',
value = '(2*c3pqiInternal*ee0*complex(0,1))/Lambda**2 + (2*cpqiInternal*ee0*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_530 = Coupling(name = 'GC_530',
value = '(cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_531 = Coupling(name = 'GC_531',
value = '-((cpDC*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2))',
order = {'NP':2,'QED':2})
GC_532 = Coupling(name = 'GC_532',
value = '(-3*cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_533 = Coupling(name = 'GC_533',
value = '(3*cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_534 = Coupling(name = 'GC_534',
value = '(cpWB*cw0*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_535 = Coupling(name = 'GC_535',
value = '(3*c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_536 = Coupling(name = 'GC_536',
value = '(3*c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_537 = Coupling(name = 'GC_537',
value = '(6*cp*complex(0,1)*vev0**3)/Lambda**2',
order = {'NP':2})
GC_538 = Coupling(name = 'GC_538',
value = '(2*cdp*complex(0,1)*vev0)/Lambda**2 - (cpDC*complex(0,1)*vev0)/(2.*Lambda**2)',
order = {'NP':2,'QED':1})
GC_539 = Coupling(name = 'GC_539',
value = '(4*cdp*complex(0,1)*vev0)/Lambda**2 - (cpDC*complex(0,1)*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_54 = Coupling(name = 'GC_54',
value = '-((c3pl1*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_540 = Coupling(name = 'GC_540',
value = '-((c3pl1*vev0)/Lambda**2) + (cpl1*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_541 = Coupling(name = 'GC_541',
value = '(c3pl1*vev0)/Lambda**2 + (cpl1*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_542 = Coupling(name = 'GC_542',
value = '-((c3pl2*vev0)/Lambda**2) + (cpl2*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_543 = Coupling(name = 'GC_543',
value = '(c3pl2*vev0)/Lambda**2 + (cpl2*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_544 = Coupling(name = 'GC_544',
value = '-((c3pl3*vev0)/Lambda**2) + (cpl3*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_545 = Coupling(name = 'GC_545',
value = '(c3pl3*vev0)/Lambda**2 + (cpl3*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_546 = Coupling(name = 'GC_546',
value = '-((c3pQ3Internal*vev0)/Lambda**2) + (cpQ3Internal*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_547 = Coupling(name = 'GC_547',
value = '(c3pQ3Internal*vev0)/Lambda**2 + (cpQ3Internal*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_548 = Coupling(name = 'GC_548',
value = '-((c3pqiInternal*vev0)/Lambda**2) + (cpqiInternal*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_549 = Coupling(name = 'GC_549',
value = '(c3pqiInternal*vev0)/Lambda**2 + (cpqiInternal*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_55 = Coupling(name = 'GC_55',
value = '-((c3pl1*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_550 = Coupling(name = 'GC_550',
value = '(c3pl1*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (c3pl2*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (cdp*complex(0,1)*MH**2*vev0)/Lambda**2 - (cll1221*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) - (cpDC*complex(0,1)*MH**2*vev0)/(4.*Lambda**2)',
order = {'NP':2,'QED':1})
GC_551 = Coupling(name = 'GC_551',
value = '(c3pl1*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (c3pl2*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (cdp*complex(0,1)*MH**2*vev0)/Lambda**2 - (cll1221*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (cpDC*complex(0,1)*MH**2*vev0)/(4.*Lambda**2)',
order = {'NP':2,'QED':1})
GC_552 = Coupling(name = 'GC_552',
value = '(3*c3pl1*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (3*c3pl2*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) - (3*cdp*complex(0,1)*MH**2*vev0)/Lambda**2 - (3*cll1221*complex(0,1)*MH**2*vev0)/(2.*Lambda**2) + (3*cpDC*complex(0,1)*MH**2*vev0)/(4.*Lambda**2)',
order = {'NP':2,'QED':1})
GC_553 = Coupling(name = 'GC_553',
value = '-(ee0**2*vev0)/(4.*cw0) - (cw0*ee0**2*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_554 = Coupling(name = 'GC_554',
value = '(ee0**2*vev0)/(4.*cw0) - (cw0*ee0**2*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_555 = Coupling(name = 'GC_555',
value = '-(ee0**2*vev0)/(4.*cw0) + (cw0*ee0**2*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_556 = Coupling(name = 'GC_556',
value = '(ee0**2*vev0)/(4.*cw0) + (cw0*ee0**2*vev0)/(4.*sw0**2)',
order = {'QED':1})
GC_557 = Coupling(name = 'GC_557',
value = '(-3*cpDC*ee0**2*vev0)/(2.*cw0*Lambda**2) - (3*cpDC*cw0*ee0**2*vev0)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_558 = Coupling(name = 'GC_558',
value = '(cpDC*ee0**2*vev0)/(cw0*Lambda**2) - (cpDC*cw0*ee0**2*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_559 = Coupling(name = 'GC_559',
value = '-(cpDC*ee0**2*vev0)/(2.*cw0*Lambda**2) - (cpDC*cw0*ee0**2*vev0)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_56 = Coupling(name = 'GC_56',
value = '(c3pl1*complex(0,1)*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_560 = Coupling(name = 'GC_560',
value = '-(cpDC*ee0**2*complex(0,1)*vev0)/(2.*cw0*Lambda**2) - (cpDC*cw0*ee0**2*complex(0,1)*vev0)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_561 = Coupling(name = 'GC_561',
value = '(cpDC*ee0**2*vev0)/(2.*cw0*Lambda**2) + (cpDC*cw0*ee0**2*vev0)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_562 = Coupling(name = 'GC_562',
value = '-((cpDC*ee0**2*vev0)/(cw0*Lambda**2)) + (cpDC*cw0*ee0**2*vev0)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_563 = Coupling(name = 'GC_563',
value = '(3*cpDC*ee0**2*vev0)/(2.*cw0*Lambda**2) + (3*cpDC*cw0*ee0**2*vev0)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':3})
GC_564 = Coupling(name = 'GC_564',
value = '-((ctW*cw0*complex(0,1)*vev0)/(Lambda**2*cmath.sqrt(2))) + (ctB*complex(0,1)*sw0*vev0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_565 = Coupling(name = 'GC_565',
value = '(ctB*cw0*complex(0,1)*vev0)/(Lambda**2*cmath.sqrt(2)) + (ctW*complex(0,1)*sw0*vev0)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_566 = Coupling(name = 'GC_566',
value = '(cpd*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpd*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_567 = Coupling(name = 'GC_567',
value = '-(cpDC*cw0*ee0*complex(0,1)*vev0)/(2.*Lambda**2*sw0) - (cpDC*ee0*complex(0,1)*sw0*vev0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_568 = Coupling(name = 'GC_568',
value = '-(cpDC*cw0*ee0*vev0)/(2.*Lambda**2*sw0) + (cpDC*ee0*sw0*vev0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_569 = Coupling(name = 'GC_569',
value = '(cpDC*cw0*ee0*vev0)/(2.*Lambda**2*sw0) + (cpDC*ee0*sw0*vev0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_57 = Coupling(name = 'GC_57',
value = '-((c3pl2*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_570 = Coupling(name = 'GC_570',
value = '(3*cpDC*cw0*ee0*vev0)/(2.*Lambda**2*sw0) + (3*cpDC*ee0*sw0*vev0)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_571 = Coupling(name = 'GC_571',
value = '(cpe*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpe*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_572 = Coupling(name = 'GC_572',
value = '-((c3pl1*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)) + (cpl1*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) - (c3pl1*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_573 = Coupling(name = 'GC_573',
value = '(c3pl1*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpl1*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_574 = Coupling(name = 'GC_574',
value = '-((c3pl2*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)) + (cpl2*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) - (c3pl2*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_575 = Coupling(name = 'GC_575',
value = '(c3pl2*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpl2*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (c3pl2*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_576 = Coupling(name = 'GC_576',
value = '-((c3pl3*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)) + (cpl3*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) - (c3pl3*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_577 = Coupling(name = 'GC_577',
value = '(c3pl3*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpl3*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (c3pl3*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_578 = Coupling(name = 'GC_578',
value = '(cpmu*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpmu*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_579 = Coupling(name = 'GC_579',
value = '-((c3pQ3Internal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)) + (cpQ3Internal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) - (c3pQ3Internal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_58 = Coupling(name = 'GC_58',
value = '-((c3pl2*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_580 = Coupling(name = 'GC_580',
value = '(c3pQ3Internal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpQ3Internal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (c3pQ3Internal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_581 = Coupling(name = 'GC_581',
value = '-((c3pqiInternal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0)) + (cpqiInternal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) - (c3pqiInternal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_582 = Coupling(name = 'GC_582',
value = '(c3pqiInternal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpqiInternal*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (c3pqiInternal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_583 = Coupling(name = 'GC_583',
value = '(cpt*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpt*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_584 = Coupling(name = 'GC_584',
value = '(cpta*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpta*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_585 = Coupling(name = 'GC_585',
value = '(cpu*cw0*ee0*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpu*ee0*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_586 = Coupling(name = 'GC_586',
value = '(cpDC*cw0*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0) + (cpDC*ee0**2*complex(0,1)*sw0*vev0)/(cw0*Lambda**2)',
order = {'NP':2,'QED':3})
GC_587 = Coupling(name = 'GC_587',
value = '-(ee0**2*complex(0,1)*vev0)/2. - (cw0**2*ee0**2*complex(0,1)*vev0)/(4.*sw0**2) - (ee0**2*complex(0,1)*sw0**2*vev0)/(4.*cw0**2)',
order = {'QED':1})
GC_588 = Coupling(name = 'GC_588',
value = 'ee0**2*complex(0,1)*vev0 + (cw0**2*ee0**2*complex(0,1)*vev0)/(2.*sw0**2) + (ee0**2*complex(0,1)*sw0**2*vev0)/(2.*cw0**2)',
order = {'QED':1})
GC_589 = Coupling(name = 'GC_589',
value = '(4*cpW*cw0**2*complex(0,1)*vev0)/Lambda**2 + (4*cpWB*cw0*complex(0,1)*sw0*vev0)/Lambda**2 + (4*cpBB*complex(0,1)*sw0**2*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_59 = Coupling(name = 'GC_59',
value = '(c3pl2*complex(0,1)*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_590 = Coupling(name = 'GC_590',
value = '(4*cpBB*cw0**2*complex(0,1)*vev0)/Lambda**2 - (4*cpWB*cw0*complex(0,1)*sw0*vev0)/Lambda**2 + (4*cpW*complex(0,1)*sw0**2*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_591 = Coupling(name = 'GC_591',
value = '(2*cpWB*cw0**2*complex(0,1)*vev0)/Lambda**2 + (4*cpBB*cw0*complex(0,1)*sw0*vev0)/Lambda**2 - (4*cpW*cw0*complex(0,1)*sw0*vev0)/Lambda**2 - (2*cpWB*complex(0,1)*sw0**2*vev0)/Lambda**2',
order = {'NP':2,'QED':1})
GC_592 = Coupling(name = 'GC_592',
value = '-((cpDC*cw0**2*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2)) + (cpDC*ee0**2*complex(0,1)*sw0**2*vev0)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':3})
GC_593 = Coupling(name = 'GC_593',
value = '(2*cpDC*ee0**2*complex(0,1)*vev0)/Lambda**2 + (cpDC*cw0**2*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2) + (cpDC*ee0**2*complex(0,1)*sw0**2*vev0)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':3})
GC_594 = Coupling(name = 'GC_594',
value = '(6*cpDC*ee0**2*complex(0,1)*vev0)/Lambda**2 + (3*cpDC*cw0**2*ee0**2*complex(0,1)*vev0)/(Lambda**2*sw0**2) + (3*cpDC*ee0**2*complex(0,1)*sw0**2*vev0)/(cw0**2*Lambda**2)',
order = {'NP':2,'QED':3})
GC_595 = Coupling(name = 'GC_595',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(24.*Lambda**2) - (cpDC*ee0*complex(0,1)*vev0**2)/(24.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_596 = Coupling(name = 'GC_596',
value = '-(cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2) + (cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_597 = Coupling(name = 'GC_597',
value = '-(cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2) + (cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2) - (cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_598 = Coupling(name = 'GC_598',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_599 = Coupling(name = 'GC_599',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_6 = Coupling(name = 'GC_6',
value = '2*ee0**2*complex(0,1)',
order = {'QED':2})
GC_60 = Coupling(name = 'GC_60',
value = '-((c3pl3*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_600 = Coupling(name = 'GC_600',
value = '(cpDC*ee0*vev0**2)/(8.*Lambda**2) - (cpDC*ee0*vev0**2)/(8.*Lambda**2*sw0**2) + (cpDC*cw0**2*ee0*vev0**2)/(8.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_601 = Coupling(name = 'GC_601',
value = '(c3pl1*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) + (c3pl2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) - (cll1221*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2)',
order = {'NP':2,'QED':2})
GC_602 = Coupling(name = 'GC_602',
value = '-(c3pl1*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cll1221*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':2})
GC_603 = Coupling(name = 'GC_603',
value = '-(c3pl1*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cdp*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) + (cll1221*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':2})
GC_604 = Coupling(name = 'GC_604',
value = '-(c3pl1*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cll1221*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cpDC*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':2})
GC_605 = Coupling(name = 'GC_605',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (cll1221*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_606 = Coupling(name = 'GC_606',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cll1221*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_607 = Coupling(name = 'GC_607',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) - (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_608 = Coupling(name = 'GC_608',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) + (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_609 = Coupling(name = 'GC_609',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) - (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) + (c3pl3*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0*cmath.sqrt(2)) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_61 = Coupling(name = 'GC_61',
value = '-((c3pl3*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_610 = Coupling(name = 'GC_610',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) - (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) + (c3pQ3Internal*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0*cmath.sqrt(2)) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_611 = Coupling(name = 'GC_611',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) - (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2)) + (c3pqiInternal*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0*cmath.sqrt(2)) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_612 = Coupling(name = 'GC_612',
value = '(c3pl1*ee0*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*ee0*vev0**2)/(4.*Lambda**2*sw0) - (cdp*ee0*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*ee0*vev0**2)/(4.*Lambda**2*sw0) + (cpDC*ee0*vev0**2)/(8.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_613 = Coupling(name = 'GC_613',
value = '(cpWB*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2) + (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(12.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_614 = Coupling(name = 'GC_614',
value = '(-2*cpWB*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2) - (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(6.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_615 = Coupling(name = 'GC_615',
value = '(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_616 = Coupling(name = 'GC_616',
value = '-((cpWB*ee0*complex(0,1)*vev0**2)/Lambda**2) + (c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_617 = Coupling(name = 'GC_617',
value = '(cpWB*ee0*complex(0,1)*vev0**2)/Lambda**2 + (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_618 = Coupling(name = 'GC_618',
value = '-(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_619 = Coupling(name = 'GC_619',
value = '(cpWB*ee0*complex(0,1)*vev0**2)/Lambda**2 - (c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpDC*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_62 = Coupling(name = 'GC_62',
value = '(c3pl3*complex(0,1)*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_620 = Coupling(name = 'GC_620',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(6.*Lambda**2) + (c3pl2*ee0*complex(0,1)*vev0**2)/(6.*Lambda**2) - (cll1221*ee0*complex(0,1)*vev0**2)/(6.*Lambda**2) + (cpWB*cw0*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_621 = Coupling(name = 'GC_621',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2) - (c3pl2*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2) + (cll1221*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2) - (2*cpWB*cw0*ee0*complex(0,1)*vev0**2)/(3.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_622 = Coupling(name = 'GC_622',
value = '-(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) - (cpWB*cw0*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_623 = Coupling(name = 'GC_623',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (cpWB*cw0*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_624 = Coupling(name = 'GC_624',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2) + (cpDC*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2) + (cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2) + (cpWB*cw0*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_625 = Coupling(name = 'GC_625',
value = '(c3pl1*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (c3pl2*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cll1221*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2) + (cpDC*cw0**2*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) + (cpWB*cw0*ee0*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_626 = Coupling(name = 'GC_626',
value = '(cpDC*ee0**2*vev0**2)/(8.*Lambda**2*sw0**3) + (cpWB*cw0*ee0**2*vev0**2)/(2.*Lambda**2*sw0**2) + (c3pl1*ee0**2*vev0**2)/(2.*Lambda**2*sw0) + (c3pl2*ee0**2*vev0**2)/(2.*Lambda**2*sw0) - (cdp*ee0**2*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*ee0**2*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_627 = Coupling(name = 'GC_627',
value = '(cpDC*ee0**2*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**3) + (cpWB*cw0*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (c3pl1*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pl2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_628 = Coupling(name = 'GC_628',
value = '-(cpDC*ee0**2*vev0**2)/(8.*Lambda**2*sw0**3) - (cpWB*cw0*ee0**2*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl1*ee0**2*vev0**2)/(2.*Lambda**2*sw0) - (c3pl2*ee0**2*vev0**2)/(2.*Lambda**2*sw0) + (cdp*ee0**2*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*ee0**2*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_629 = Coupling(name = 'GC_629',
value = '-(c3pl1*ee0**2*vev0**2)/(2.*cw0*Lambda**2) - (c3pl2*ee0**2*vev0**2)/(2.*cw0*Lambda**2) + (cdp*ee0**2*vev0**2)/(2.*cw0*Lambda**2) + (cll1221*ee0**2*vev0**2)/(2.*cw0*Lambda**2) + (5*cpDC*ee0**2*vev0**2)/(8.*cw0*Lambda**2) - (cpDC*ee0**2*vev0**2)/(8.*cw0*Lambda**2*sw0**2) + (5*cpDC*cw0*ee0**2*vev0**2)/(8.*Lambda**2*sw0**2) - (cpWB*ee0**2*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_63 = Coupling(name = 'GC_63',
value = '-((c3pQ3Internal*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_630 = Coupling(name = 'GC_630',
value = '(c3pl1*ee0**2*complex(0,1)*vev0**2)/(2.*cw0*Lambda**2) + (c3pl2*ee0**2*complex(0,1)*vev0**2)/(2.*cw0*Lambda**2) - (cll1221*ee0**2*complex(0,1)*vev0**2)/(2.*cw0*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2) + (cpDC*ee0**2*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0**2) - (cpDC*cw0*ee0**2*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**2) + (cpWB*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_631 = Coupling(name = 'GC_631',
value = '(c3pl1*ee0**2*vev0**2)/(2.*cw0*Lambda**2) + (c3pl2*ee0**2*vev0**2)/(2.*cw0*Lambda**2) - (cdp*ee0**2*vev0**2)/(2.*cw0*Lambda**2) - (cll1221*ee0**2*vev0**2)/(2.*cw0*Lambda**2) - (5*cpDC*ee0**2*vev0**2)/(8.*cw0*Lambda**2) + (cpDC*ee0**2*vev0**2)/(8.*cw0*Lambda**2*sw0**2) - (5*cpDC*cw0*ee0**2*vev0**2)/(8.*Lambda**2*sw0**2) + (cpWB*ee0**2*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_632 = Coupling(name = 'GC_632',
value = '(2*cpWB*ee0**2*complex(0,1)*vev0**2)/Lambda**2 - (cpDC*cw0**3*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**3) - (2*cpWB*cw0**2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) - (2*c3pl1*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) - (2*c3pl2*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) + (2*cll1221*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) + (cpDC*cw0*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_633 = Coupling(name = 'GC_633',
value = '-((c3pl1*ee0**2*complex(0,1)*vev0**2)/Lambda**2) - (c3pl2*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (cll1221*ee0**2*complex(0,1)*vev0**2)/Lambda**2 - (cpDC*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (2*cpWB*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_634 = Coupling(name = 'GC_634',
value = '-((c3pl1*cw0**2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2)) - (c3pl2*cw0**2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) + (cll1221*cw0**2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) + (cpDC*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (2*cpWB*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_635 = Coupling(name = 'GC_635',
value = '(-2*c3pl1*ee0**2*complex(0,1)*vev0**2)/Lambda**2 - (2*c3pl2*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (2*cll1221*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (cpDC*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (cpDC*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (4*cpWB*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0)',
order = {'NP':2,'QED':2})
GC_636 = Coupling(name = 'GC_636',
value = '-(cpDC*ee0*complex(0,1)*vev0**2)/(24.*cw0*Lambda**2*sw0) - (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(12.*cw0*Lambda**2) - (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(12.*cw0*Lambda**2) + (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(12.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_637 = Coupling(name = 'GC_637',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_638 = Coupling(name = 'GC_638',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_639 = Coupling(name = 'GC_639',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_64 = Coupling(name = 'GC_64',
value = '-((c3pQ3Internal*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_640 = Coupling(name = 'GC_640',
value = '-(cpDC*ee0*vev0**2)/(8.*cw0*Lambda**2*sw0) - (c3pl1*cw0*ee0*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*vev0**2)/(4.*Lambda**2*sw0) + (cdp*cw0*ee0*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*cw0*ee0*vev0**2)/(4.*Lambda**2*sw0) - (c3pl1*ee0*sw0*vev0**2)/(4.*cw0*Lambda**2) - (c3pl2*ee0*sw0*vev0**2)/(4.*cw0*Lambda**2) + (cdp*ee0*sw0*vev0**2)/(2.*cw0*Lambda**2) + (cll1221*ee0*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_641 = Coupling(name = 'GC_641',
value = '(cpd*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpd*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_642 = Coupling(name = 'GC_642',
value = '(cpWB*ee0*complex(0,1)*vev0**2)/Lambda**2 + (cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) - (c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cpDC*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_643 = Coupling(name = 'GC_643',
value = '(cpDC*cw0*ee0*vev0**2)/(2.*Lambda**2*sw0) + (cpDC*ee0*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_644 = Coupling(name = 'GC_644',
value = '(cpe*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpe*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_645 = Coupling(name = 'GC_645',
value = '(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpl1*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpl1*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_646 = Coupling(name = 'GC_646',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) - (c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpl1*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (cpl1*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_647 = Coupling(name = 'GC_647',
value = '-(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpl2*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpl2*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_648 = Coupling(name = 'GC_648',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpl2*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (cpl2*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_649 = Coupling(name = 'GC_649',
value = '-(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl3*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpl3*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pl3*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_65 = Coupling(name = 'GC_65',
value = '(c3pQ3Internal*complex(0,1)*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_650 = Coupling(name = 'GC_650',
value = '(cpDC*ee0*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl3*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpl3*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pl1*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (c3pl2*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) - (c3pl3*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2) - (cll1221*ee0*complex(0,1)*sw0*vev0**2)/(4.*cw0*Lambda**2) + (cpl3*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_651 = Coupling(name = 'GC_651',
value = '(cpmu*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpmu*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_652 = Coupling(name = 'GC_652',
value = '(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pQ3Internal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpQ3Internal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (c3pQ3Internal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_653 = Coupling(name = 'GC_653',
value = '-(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pQ3Internal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpQ3Internal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pQ3Internal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2) + (cpQ3Internal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_654 = Coupling(name = 'GC_654',
value = '(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pqiInternal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpqiInternal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) - (c3pqiInternal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_655 = Coupling(name = 'GC_655',
value = '-(c3pl1*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl2*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (c3pqiInternal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cll1221*cw0*ee0*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpqiInternal*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (c3pqiInternal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2) + (cpqiInternal*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_656 = Coupling(name = 'GC_656',
value = '(cpt*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpt*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_657 = Coupling(name = 'GC_657',
value = '(cpta*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpta*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_658 = Coupling(name = 'GC_658',
value = '(cpu*cw0*ee0*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0) + (cpu*ee0*complex(0,1)*sw0*vev0**2)/(2.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_659 = Coupling(name = 'GC_659',
value = '-(cpDC*cw0*ee0**2*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**3) + (cpDC*cw0**3*ee0**2*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**3) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (cpDC*cw0*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) + (cpDC*ee0**2*complex(0,1)*sw0*vev0**2)/(8.*cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_66 = Coupling(name = 'GC_66',
value = '-((c3pqiInternal*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_660 = Coupling(name = 'GC_660',
value = '(-3*cpWB*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (cpDC*cw0*ee0**2*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**3) + (cpDC*cw0**3*ee0**2*complex(0,1)*vev0**2)/(8.*Lambda**2*sw0**3) + (cpWB*cw0**2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) - (3*cpDC*ee0**2*complex(0,1)*vev0**2)/(8.*cw0*Lambda**2*sw0) + (c3pl1*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) + (c3pl2*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) - (cll1221*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) - (cpDC*cw0*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0) - (c3pl1*ee0**2*complex(0,1)*sw0*vev0**2)/(cw0*Lambda**2) - (c3pl2*ee0**2*complex(0,1)*sw0*vev0**2)/(cw0*Lambda**2) + (cll1221*ee0**2*complex(0,1)*sw0*vev0**2)/(cw0*Lambda**2) + (5*cpDC*ee0**2*complex(0,1)*sw0*vev0**2)/(8.*cw0*Lambda**2)',
order = {'NP':2,'QED':2})
GC_661 = Coupling(name = 'GC_661',
value = '-((c3pl1*ee0**2*complex(0,1)*vev0**2)/Lambda**2) - (c3pl2*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (cll1221*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (cpDC*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*cw0**2*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) - (c3pl1*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl2*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cll1221*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cpDC*cw0**2*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) - (c3pl1*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) - (c3pl2*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) + (cll1221*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) + (cpDC*ee0**2*complex(0,1)*sw0**2*vev0**2)/(4.*cw0**2*Lambda**2)',
order = {'NP':2,'QED':2})
GC_662 = Coupling(name = 'GC_662',
value = '(c3pl1*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (c3pl2*ee0**2*complex(0,1)*vev0**2)/Lambda**2 - (cll1221*ee0**2*complex(0,1)*vev0**2)/Lambda**2 - (cpDC*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*cw0**2*Lambda**2) + (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) - (c3pl1*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl2*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cll1221*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (2*cpWB*cw0*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0) - (2*cpWB*ee0**2*complex(0,1)*sw0*vev0**2)/(cw0*Lambda**2) - (c3pl1*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) - (c3pl2*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) + (cll1221*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) + (cpDC*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2)',
order = {'NP':2,'QED':2})
GC_663 = Coupling(name = 'GC_663',
value = '-((c3pl1*ee0**2*complex(0,1)*vev0**2)/Lambda**2) - (c3pl2*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (2*cdp*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (cll1221*ee0**2*complex(0,1)*vev0**2)/Lambda**2 + (5*cpDC*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*cw0**2*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) - (c3pl1*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) - (c3pl2*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (cdp*cw0**2*ee0**2*complex(0,1)*vev0**2)/(Lambda**2*sw0**2) + (cll1221*cw0**2*ee0**2*complex(0,1)*vev0**2)/(2.*Lambda**2*sw0**2) + (5*cpDC*cw0**2*ee0**2*complex(0,1)*vev0**2)/(4.*Lambda**2*sw0**2) - (c3pl1*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) - (c3pl2*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) + (cdp*ee0**2*complex(0,1)*sw0**2*vev0**2)/(cw0**2*Lambda**2) + (cll1221*ee0**2*complex(0,1)*sw0**2*vev0**2)/(2.*cw0**2*Lambda**2) + (5*cpDC*ee0**2*complex(0,1)*sw0**2*vev0**2)/(4.*cw0**2*Lambda**2)',
order = {'NP':2,'QED':2})
GC_664 = Coupling(name = 'GC_664',
value = '-(c3pl1*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (c3pl2*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cll1221*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_665 = Coupling(name = 'GC_665',
value = '(c3pl1*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) + (c3pl2*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) - (cdp*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (cll1221*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) + (cpDC*ee0**2*complex(0,1)*vev0**3)/(16.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_666 = Coupling(name = 'GC_666',
value = '-(c3pl1*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (c3pl2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) + (cdp*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2*sw0**2) + (cll1221*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (cpDC*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_667 = Coupling(name = 'GC_667',
value = '(c3pl1*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (c3pl2*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cll1221*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0**2)',
order = {'NP':2,'QED':1})
GC_668 = Coupling(name = 'GC_668',
value = '(cpDC*ee0**2*vev0**3)/(8.*Lambda**2*sw0**3) + (cpWB*cw0*ee0**2*vev0**3)/(2.*Lambda**2*sw0**2) + (c3pl1*ee0**2*vev0**3)/(4.*Lambda**2*sw0) + (c3pl2*ee0**2*vev0**3)/(4.*Lambda**2*sw0) - (cll1221*ee0**2*vev0**3)/(4.*Lambda**2*sw0) - (cpDC*ee0**2*vev0**3)/(8.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_669 = Coupling(name = 'GC_669',
value = '(cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) - (cpDC*cw0**2*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) + (cpWB*cw0*ee0**2*vev0**3)/(4.*Lambda**2*sw0**2) - (cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_67 = Coupling(name = 'GC_67',
value = '-((c3pqiInternal*complex(0,1)*cmath.sqrt(2))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_670 = Coupling(name = 'GC_670',
value = '(cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) + (cpDC*cw0**2*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) + (cpWB*cw0*ee0**2*vev0**3)/(4.*Lambda**2*sw0**2) + (c3pl1*ee0**2*vev0**3)/(4.*Lambda**2*sw0) + (c3pl2*ee0**2*vev0**3)/(4.*Lambda**2*sw0) - (cll1221*ee0**2*vev0**3)/(4.*Lambda**2*sw0) - (cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_671 = Coupling(name = 'GC_671',
value = '-(cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) + (cpDC*cw0**2*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) - (cpWB*cw0*ee0**2*vev0**3)/(4.*Lambda**2*sw0**2) + (cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_672 = Coupling(name = 'GC_672',
value = '-(cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) - (cpDC*cw0**2*ee0**2*vev0**3)/(16.*Lambda**2*sw0**3) - (cpWB*cw0*ee0**2*vev0**3)/(4.*Lambda**2*sw0**2) - (c3pl1*ee0**2*vev0**3)/(4.*Lambda**2*sw0) - (c3pl2*ee0**2*vev0**3)/(4.*Lambda**2*sw0) + (cll1221*ee0**2*vev0**3)/(4.*Lambda**2*sw0) + (cpDC*ee0**2*vev0**3)/(16.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_673 = Coupling(name = 'GC_673',
value = '-(cpDC*ee0**2*vev0**3)/(8.*Lambda**2*sw0**3) - (cpWB*cw0*ee0**2*vev0**3)/(2.*Lambda**2*sw0**2) - (c3pl1*ee0**2*vev0**3)/(4.*Lambda**2*sw0) - (c3pl2*ee0**2*vev0**3)/(4.*Lambda**2*sw0) + (cll1221*ee0**2*vev0**3)/(4.*Lambda**2*sw0) + (cpDC*ee0**2*vev0**3)/(8.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_674 = Coupling(name = 'GC_674',
value = '-(c3pl1*ee0**2*vev0**3)/(4.*cw0*Lambda**2) - (c3pl2*ee0**2*vev0**3)/(4.*cw0*Lambda**2) + (cll1221*ee0**2*vev0**3)/(4.*cw0*Lambda**2) + (cpDC*ee0**2*vev0**3)/(4.*cw0*Lambda**2) - (cpDC*ee0**2*vev0**3)/(8.*cw0*Lambda**2*sw0**2) + (cpDC*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cpWB*ee0**2*vev0**3)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_675 = Coupling(name = 'GC_675',
value = '-(c3pl1*ee0**2*vev0**3)/(8.*cw0*Lambda**2) - (c3pl2*ee0**2*vev0**3)/(8.*cw0*Lambda**2) + (cll1221*ee0**2*vev0**3)/(8.*cw0*Lambda**2) - (cpDC*ee0**2*vev0**3)/(16.*cw0*Lambda**2*sw0**2) - (c3pl1*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (c3pl2*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cll1221*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cpWB*ee0**2*vev0**3)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_676 = Coupling(name = 'GC_676',
value = '-(c3pl1*ee0**2*vev0**3)/(8.*cw0*Lambda**2) - (c3pl2*ee0**2*vev0**3)/(8.*cw0*Lambda**2) + (cll1221*ee0**2*vev0**3)/(8.*cw0*Lambda**2) - (cpDC*ee0**2*vev0**3)/(16.*cw0*Lambda**2*sw0**2) + (c3pl1*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (c3pl2*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cll1221*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cpDC*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cpWB*ee0**2*vev0**3)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_677 = Coupling(name = 'GC_677',
value = '(c3pl1*ee0**2*vev0**3)/(8.*cw0*Lambda**2) + (c3pl2*ee0**2*vev0**3)/(8.*cw0*Lambda**2) - (cll1221*ee0**2*vev0**3)/(8.*cw0*Lambda**2) + (cpDC*ee0**2*vev0**3)/(16.*cw0*Lambda**2*sw0**2) + (c3pl1*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (c3pl2*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (cll1221*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cpWB*ee0**2*vev0**3)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_678 = Coupling(name = 'GC_678',
value = '(c3pl1*ee0**2*vev0**3)/(8.*cw0*Lambda**2) + (c3pl2*ee0**2*vev0**3)/(8.*cw0*Lambda**2) - (cll1221*ee0**2*vev0**3)/(8.*cw0*Lambda**2) + (cpDC*ee0**2*vev0**3)/(16.*cw0*Lambda**2*sw0**2) - (c3pl1*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) - (c3pl2*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cll1221*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cpDC*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cpWB*ee0**2*vev0**3)/(4.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_679 = Coupling(name = 'GC_679',
value = '(c3pl1*ee0**2*vev0**3)/(4.*cw0*Lambda**2) + (c3pl2*ee0**2*vev0**3)/(4.*cw0*Lambda**2) - (cll1221*ee0**2*vev0**3)/(4.*cw0*Lambda**2) - (cpDC*ee0**2*vev0**3)/(4.*cw0*Lambda**2) + (cpDC*ee0**2*vev0**3)/(8.*cw0*Lambda**2*sw0**2) - (cpDC*cw0*ee0**2*vev0**3)/(8.*Lambda**2*sw0**2) + (cpWB*ee0**2*vev0**3)/(2.*Lambda**2*sw0)',
order = {'NP':2,'QED':1})
GC_68 = Coupling(name = 'GC_68',
value = '(c3pqiInternal*complex(0,1)*cmath.sqrt(2))/Lambda**2',
order = {'NP':2,'QED':2})
GC_680 = Coupling(name = 'GC_680',
value = '-(cpWB*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2) - (cpDC*cw0*ee0**2*complex(0,1)*vev0**3)/(16.*Lambda**2*sw0**3) + (cpDC*cw0**3*ee0**2*complex(0,1)*vev0**3)/(16.*Lambda**2*sw0**3) - (cpWB*cw0**2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (cpDC*ee0**2*complex(0,1)*vev0**3)/(16.*cw0*Lambda**2*sw0) + (cpDC*cw0*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0) + (cpDC*ee0**2*complex(0,1)*sw0*vev0**3)/(16.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_681 = Coupling(name = 'GC_681',
value = '-(cpDC*cw0*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**3) + (cpDC*cw0**3*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**3) - (cpDC*ee0**2*complex(0,1)*vev0**3)/(8.*cw0*Lambda**2*sw0) + (cpDC*cw0*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0) + (cpDC*ee0**2*complex(0,1)*sw0*vev0**3)/(8.*cw0*Lambda**2)',
order = {'NP':2,'QED':1})
GC_682 = Coupling(name = 'GC_682',
value = '(c3pl1*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2) + (c3pl2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2) - (cdp*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2) - (cll1221*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2) + (cpDC*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2) + (cpDC*ee0**2*complex(0,1)*vev0**3)/(8.*cw0**2*Lambda**2) + (cpDC*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) + (c3pl1*cw0**2*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) + (c3pl2*cw0**2*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) - (cdp*cw0**2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (cll1221*cw0**2*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) + (cpDC*cw0**2*ee0**2*complex(0,1)*vev0**3)/(16.*Lambda**2*sw0**2) + (cpWB*cw0*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2*sw0) + (cpWB*ee0**2*complex(0,1)*sw0*vev0**3)/(2.*cw0*Lambda**2) + (c3pl1*ee0**2*complex(0,1)*sw0**2*vev0**3)/(8.*cw0**2*Lambda**2) + (c3pl2*ee0**2*complex(0,1)*sw0**2*vev0**3)/(8.*cw0**2*Lambda**2) - (cdp*ee0**2*complex(0,1)*sw0**2*vev0**3)/(4.*cw0**2*Lambda**2) - (cll1221*ee0**2*complex(0,1)*sw0**2*vev0**3)/(8.*cw0**2*Lambda**2) + (cpDC*ee0**2*complex(0,1)*sw0**2*vev0**3)/(16.*cw0**2*Lambda**2)',
order = {'NP':2,'QED':1})
GC_683 = Coupling(name = 'GC_683',
value = '-(c3pl1*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2) - (c3pl2*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2) + (cdp*ee0**2*complex(0,1)*vev0**3)/Lambda**2 + (cll1221*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2) + (3*cpDC*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**3)/(4.*cw0**2*Lambda**2) - (cpDC*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (c3pl1*cw0**2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) - (c3pl2*cw0**2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) + (cdp*cw0**2*ee0**2*complex(0,1)*vev0**3)/(2.*Lambda**2*sw0**2) + (cll1221*cw0**2*ee0**2*complex(0,1)*vev0**3)/(4.*Lambda**2*sw0**2) + (3*cpDC*cw0**2*ee0**2*complex(0,1)*vev0**3)/(8.*Lambda**2*sw0**2) - (c3pl1*ee0**2*complex(0,1)*sw0**2*vev0**3)/(4.*cw0**2*Lambda**2) - (c3pl2*ee0**2*complex(0,1)*sw0**2*vev0**3)/(4.*cw0**2*Lambda**2) + (cdp*ee0**2*complex(0,1)*sw0**2*vev0**3)/(2.*cw0**2*Lambda**2) + (cll1221*ee0**2*complex(0,1)*sw0**2*vev0**3)/(4.*cw0**2*Lambda**2) + (3*cpDC*ee0**2*complex(0,1)*sw0**2*vev0**3)/(8.*cw0**2*Lambda**2)',
order = {'NP':2,'QED':1})
GC_684 = Coupling(name = 'GC_684',
value = '-((complex(0,1)*ymt)/vev0)',
order = {'QED':1})
GC_685 = Coupling(name = 'GC_685',
value = 'ymt/vev0',
order = {'QED':1})
GC_686 = Coupling(name = 'GC_686',
value = '-((ymt*cmath.sqrt(2))/vev0)',
order = {'QED':1})
GC_687 = Coupling(name = 'GC_687',
value = '(ymt*cmath.sqrt(2))/vev0',
order = {'QED':1})
GC_688 = Coupling(name = 'GC_688',
value = '(c3pl1*vev0*ymt)/(Lambda**2*cmath.sqrt(2)) + (c3pl2*vev0*ymt)/(Lambda**2*cmath.sqrt(2)) - (cll1221*vev0*ymt)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_689 = Coupling(name = 'GC_689',
value = '-((c3pl1*vev0*ymt)/(Lambda**2*cmath.sqrt(2))) - (c3pl2*vev0*ymt)/(Lambda**2*cmath.sqrt(2)) + (cll1221*vev0*ymt)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_69 = Coupling(name = 'GC_69',
value = '(cblS3*complex(0,1))/Lambda**2',
order = {'NP':2})
GC_690 = Coupling(name = 'GC_690',
value = '-(c3pl1*vev0*ymt)/(2.*Lambda**2) - (c3pl2*vev0*ymt)/(2.*Lambda**2) + (cll1221*vev0*ymt)/(2.*Lambda**2) - (cpDC*vev0*ymt)/(4.*Lambda**2)',
order = {'NP':2,'QED':1})
GC_691 = Coupling(name = 'GC_691',
value = '(c3pl1*complex(0,1)*vev0*ymt)/(2.*Lambda**2) + (c3pl2*complex(0,1)*vev0*ymt)/(2.*Lambda**2) - (cdp*complex(0,1)*vev0*ymt)/Lambda**2 - (cll1221*complex(0,1)*vev0*ymt)/(2.*Lambda**2) + (cpDC*complex(0,1)*vev0*ymt)/(4.*Lambda**2) + (ctp*complex(0,1)*vev0**2)/(Lambda**2*cmath.sqrt(2))',
order = {'NP':2,'QED':1})
GC_7 = Coupling(name = 'GC_7',
value = '-ee0**2/(2.*cw0)',
order = {'QED':2})
GC_70 = Coupling(name = 'GC_70',
value = '(2*cdp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_71 = Coupling(name = 'GC_71',
value = '(4*cdp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_72 = Coupling(name = 'GC_72',
value = '(2*cll1111*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_73 = Coupling(name = 'GC_73',
value = '(2*cll1122*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_74 = Coupling(name = 'GC_74',
value = '(2*cll1133*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_75 = Coupling(name = 'GC_75',
value = '(2*cll1221*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_76 = Coupling(name = 'GC_76',
value = '(2*cll1331*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_77 = Coupling(name = 'GC_77',
value = '(2*cll2222*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_78 = Coupling(name = 'GC_78',
value = '(2*cll2233*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_79 = Coupling(name = 'GC_79',
value = '(2*cll2332*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_8 = Coupling(name = 'GC_8',
value = '-(ee0**2*complex(0,1))/(2.*cw0)',
order = {'QED':2})
GC_80 = Coupling(name = 'GC_80',
value = '(2*cll3333*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_81 = Coupling(name = 'GC_81',
value = '(6*cp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_82 = Coupling(name = 'GC_82',
value = '(12*cp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_83 = Coupling(name = 'GC_83',
value = '(18*cp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_84 = Coupling(name = 'GC_84',
value = '(36*cp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_85 = Coupling(name = 'GC_85',
value = '(90*cp*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':3})
GC_86 = Coupling(name = 'GC_86',
value = '-((cpd*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_87 = Coupling(name = 'GC_87',
value = 'cpd/Lambda**2',
order = {'NP':2,'QED':2})
GC_88 = Coupling(name = 'GC_88',
value = '-cpDC/(2.*Lambda**2)',
order = {'NP':2,'QED':2})
GC_89 = Coupling(name = 'GC_89',
value = '-((cpDC*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_9 = Coupling(name = 'GC_9',
value = 'ee0**2/(2.*cw0)',
order = {'QED':2})
GC_90 = Coupling(name = 'GC_90',
value = '-((cpe*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_91 = Coupling(name = 'GC_91',
value = 'cpe/Lambda**2',
order = {'NP':2,'QED':2})
GC_92 = Coupling(name = 'GC_92',
value = '(4*cpG*complex(0,1))/Lambda**2',
order = {'NP':2,'QED':2})
GC_93 = Coupling(name = 'GC_93',
value = '-((cpmu*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_94 = Coupling(name = 'GC_94',
value = 'cpmu/Lambda**2',
order = {'NP':2,'QED':2})
GC_95 = Coupling(name = 'GC_95',
value = '-((cpt*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_96 = Coupling(name = 'GC_96',
value = 'cpt/Lambda**2',
order = {'NP':2,'QED':2})
GC_97 = Coupling(name = 'GC_97',
value = '-((cpta*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
GC_98 = Coupling(name = 'GC_98',
value = 'cpta/Lambda**2',
order = {'NP':2,'QED':2})
GC_99 = Coupling(name = 'GC_99',
value = '-((cpu*complex(0,1))/Lambda**2)',
order = {'NP':2,'QED':2})
order = {'NP':2,'QED':2})

# server/src/service/__init__.py (repo: lixinyang123/ERP, MIT license)
from service.DbService import *
from service.ProductService import *
from service.PurchaseService import *
from service.SaleService import *
from service.UserService import *
772ba01127bb538b41e0c77e4c43397163a0b1ea | 191 | py | Python | wyrd/__init__.py | meadsteve/constrained_types | 2a3b87a0b14be70ee2de963acf0eebf302dfe1d9 | [
"MIT"
] | 1 | 2021-05-03T08:53:33.000Z | 2021-05-03T08:53:33.000Z | wyrd/__init__.py | meadsteve/constrained_types | 2a3b87a0b14be70ee2de963acf0eebf302dfe1d9 | [
"MIT"
] | 16 | 2020-10-11T07:46:39.000Z | 2020-10-25T13:29:05.000Z | wyrd/__init__.py | meadsteve/constrained_types | 2a3b87a0b14be70ee2de963acf0eebf302dfe1d9 | [
"MIT"
] | null | null | null | """Library for helpers with Domain driven security"""
from . import constrained_types, read_once
from .version import __version__
__all__ = ["constrained_types", "read_once", "__version__"]
| 31.833333 | 59 | 0.780105 | 23 | 191 | 5.782609 | 0.652174 | 0.240602 | 0.300752 | 0.360902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115183 | 191 | 5 | 60 | 38.2 | 0.786982 | 0.246073 | 0 | 0 | 0 | 0 | 0.268116 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7733b7a6018c5c51bcab70849225034dc3561a51 | 3,510 | py | Python | gallery/views.py | Ianadika44/picture-gallery | eec05805c53f97faaeb163a37c0a1fbb99df1181 | [
"Unlicense"
] | null | null | null | gallery/views.py | Ianadika44/picture-gallery | eec05805c53f97faaeb163a37c0a1fbb99df1181 | [
"Unlicense"
] | 2 | 2021-06-08T21:38:06.000Z | 2021-06-10T22:56:25.000Z | gallery/views.py | Ianadika44/picture-gallery | eec05805c53f97faaeb163a37c0a1fbb99df1181 | [
"Unlicense"
] | null | null | null | from django.shortcuts import render, redirect
from django.http import HttpResponse, Http404
import datetime as dt
from .models import Article

# Create your views here.
def welcome(request):
    return render(request, 'welcome.html')

def gallery_of_day(request):
    date = dt.date.today()
    return render(request, 'all-gallery/today-gallery.html', {"date": date})

def past_days_gallery(request, past_date):
    try:
        # Convert the date string from the URL into a date object
        date = dt.datetime.strptime(past_date, '%Y-%m-%d').date()
    except ValueError:
        # Raise a 404 when the date string is malformed
        raise Http404()
    if date == dt.date.today():
        return redirect(gallery_today)
    gallery = Article.days_gallery(date)
    return render(request, 'gallery/past-gallery.html', {"date": date, "gallery": gallery})

def gallery_today(request):
    date = dt.date.today()
    gallery = Article.todays_gallery()
    return render(request, 'all-gallery/today-gallery.html', {"date": date, "gallery": gallery})

def search_results(request):
    if 'article' in request.GET and request.GET["article"]:
        search_term = request.GET.get("article")
        searched_articles = Article.search_by_title(search_term)
        message = f"{search_term}"
        return render(request, 'all-gallery/search.html', {"message": message, "articles": searched_articles})
    else:
        message = "You haven't searched for any term"
        return render(request, 'all-gallery/search.html', {"message": message})

def article(request, article_id):
    try:
        article = Article.objects.get(id=article_id)
    except Article.DoesNotExist:
        raise Http404()
    return render(request, "all-gallery/article.html", {"article": article}) | 30.789474 | 108 | 0.680912 | 441 | 3,510 | 5.326531 | 0.1678 | 0.07152 | 0.11324 | 0.07748 | 0.992337 | 0.93742 | 0.93742 | 0.906769 | 0.906769 | 0.906769 | 0 | 0.008514 | 0.196866 | 3,510 | 114 | 109 | 30.789474 | 0.824761 | 0.056695 | 0 | 0.773333 | 0 | 0 | 0.172466 | 0.070197 | 0 | 0 | 0 | 0 | 0.026667 | 1 | 0.16 | false | 0 | 0.106667 | 0.026667 | 0.48 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7751e2aaeda90f714bc0e30cb2111f6ed58d8a36 | 88 | py | Python | vibra/firefly/data/__init__.py | PerceptronV/vibra | ce8587987fb7a642f885af89ee20899b52a9d517 | [
"MIT"
] | null | null | null | vibra/firefly/data/__init__.py | PerceptronV/vibra | ce8587987fb7a642f885af89ee20899b52a9d517 | [
"MIT"
] | null | null | null | vibra/firefly/data/__init__.py | PerceptronV/vibra | ce8587987fb7a642f885af89ee20899b52a9d517 | [
"MIT"
] | null | null | null | from vibra.firefly.data.dataset import dataset
from vibra.firefly.data.data import data | 44 | 47 | 0.840909 | 14 | 88 | 5.285714 | 0.428571 | 0.243243 | 0.432432 | 0.540541 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 88 | 2 | 48 | 44 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
620d41e3ae6119934c75fef56b16f718ad938054 | 68,586 | py | Python | benchmarks/SimResults/Paper2_rr_spec_base/cmp_astarxalancbmkleslie3dnamd/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/Paper2_rr_spec_base/cmp_astarxalancbmkleslie3dnamd/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/Paper2_rr_spec_base/cmp_astarxalancbmkleslie3dnamd/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.163535,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.331137,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.892513,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.504947,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.874386,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.501485,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 1.88082,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.362284,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 7.18899,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.168615,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0183047,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.19321,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.135375,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.361825,
'Execution Unit/Register Files/Runtime Dynamic': 0.153679,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.511677,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 1.1886,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 3.95961,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.0024158,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.0024158,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00211223,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000822095,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00194467,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00888851,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0228741,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.130139,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 6.43323,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.29571,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.442011,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 8.96874,
'Instruction Fetch Unit/Runtime Dynamic': 0.899623,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.103516,
'L2/Runtime Dynamic': 0.025514,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 4.84484,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.75567,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.116718,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.116719,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 5.39826,
'Load Store Unit/Runtime Dynamic': 2.448,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.287808,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.575616,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.102144,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.103657,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.399995,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0486004,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.735081,
'Memory Management Unit/Runtime Dynamic': 0.152258,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 26.9563,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.58826,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.0328989,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.250612,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.87177,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 8.35678,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0596279,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.249523,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.328864,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.212744,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.343149,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.17321,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.729103,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.192898,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.77589,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0621295,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00892345,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0865726,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0659944,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.148702,
'Execution Unit/Register Files/Runtime Dynamic': 0.0749178,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.197294,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.482332,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.94125,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.0016797,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.0016797,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00148384,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000585811,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000948015,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00579126,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0153607,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0634421,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.03546,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.142282,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.215478,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 6.44983,
'Instruction Fetch Unit/Runtime Dynamic': 0.442354,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0608579,
'L2/Runtime Dynamic': 0.0151537,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.06371,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.890527,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0590946,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0590945,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.34277,
'Load Store Unit/Runtime Dynamic': 1.24106,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.145717,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.291434,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0517155,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0526007,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.25091,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0234103,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.495857,
'Memory Management Unit/Runtime Dynamic': 0.076011,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 18.7147,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.163434,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0115874,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.104725,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.279746,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.99557,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0591064,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.249114,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.32412,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.221212,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.356806,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.180104,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.758122,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.20331,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.78902,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0612333,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00927861,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0890242,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.068621,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.150258,
'Execution Unit/Register Files/Runtime Dynamic': 0.0778996,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.202329,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.495895,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.98641,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00196774,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00196774,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00172948,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000678031,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000985746,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00665072,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.01831,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0659671,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.19607,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.148149,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.224054,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 6.61823,
'Instruction Fetch Unit/Runtime Dynamic': 0.463131,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0618961,
'L2/Runtime Dynamic': 0.0156854,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.15351,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.935562,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0619998,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0619999,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.44629,
'Load Store Unit/Runtime Dynamic': 1.30332,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.152881,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.305763,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.054258,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0551477,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.260897,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0244052,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.510211,
'Memory Management Unit/Runtime Dynamic': 0.0795529,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 19.0151,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.161077,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0119407,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.10861,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.281627,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.12973,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0366681,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.23149,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.196774,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.17861,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.288091,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.145419,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.612119,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.174109,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.51098,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0371748,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0074917,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0679528,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0554057,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.105128,
'Execution Unit/Register Files/Runtime Dynamic': 0.0628974,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.152327,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.390631,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.70251,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00171675,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00171675,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00151495,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000597215,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000795907,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00574436,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0157576,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0532629,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 3.38798,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.136514,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.180905,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 5.77092,
'Instruction Fetch Unit/Runtime Dynamic': 0.392183,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0727159,
'L2/Runtime Dynamic': 0.0196208,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.93388,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.835875,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0548943,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0548944,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.19311,
'Load Store Unit/Runtime Dynamic': 1.16149,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.13536,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.270721,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0480397,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0490713,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.210652,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.022559,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.449285,
'Memory Management Unit/Runtime Dynamic': 0.0716303,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 17.5865,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.09779,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00924847,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.088632,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.19567,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.54311,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 3.9154683412375313,
'Runtime Dynamic': 3.9154683412375313,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.214129,
'Runtime Dynamic': 0.113149,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 82.4867,
'Peak Power': 115.599,
'Runtime Dynamic': 20.1383,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 82.2725,
'Total Cores/Runtime Dynamic': 20.0252,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.214129,
'Total L3s/Runtime Dynamic': 0.113149,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}}
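The flat, slash-separated keys in the power report above encode a component hierarchy ("Renaming Unit/Free List/Area" and so on). A small sketch can fold them back into a nested dict; the `nest` helper is hypothetical, not part of the original tooling, and assumes `/` only ever separates hierarchy levels:

```python
def nest(flat):
    """Fold flat 'A/B/C'-style keys into a nested dict (hypothetical helper)."""
    tree = {}
    for key, value in flat.items():
        node = tree
        parts = key.split('/')
        for part in parts[:-1]:          # walk/create intermediate levels
            node = node.setdefault(part, {})
        node[parts[-1]] = value          # leaf holds the numeric value
    return tree
```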
# mytoy/tests/test_toys.py (repo: fperez/mytoy, license: BSD-3-Clause)
from mytoy import toy

def test_toy_default():
    assert toy() == 1


def test_toy_0():
    assert toy(0) == 1


def test_toy_1():
    assert toy(1) == 2
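The three tests above pin down the behavior of `toy` for its default, `0`, and `1` arguments. A minimal implementation consistent with them would be the following; this is a hypothetical sketch, the real `mytoy.toy` may do more:

```python
def toy(n=0):
    """Hypothetical implementation consistent with test_toys.py above."""
    return n + 1
```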
# app.py (repo: rishabhc9/Python-Flask-SparkFun-Product-Data-Scraping-Website, license: MIT)
import json

import pandas as pd
import requests
from bs4 import BeautifulSoup
from flask import Flask, render_template, request
app = Flask(__name__)
@app.route('/', methods=["GET", "POST"])
def sparkfun():
    """Scrape product listings from a SparkFun category page.

    Falls back to a default category when no URL is submitted or the
    request fails.
    """
    url = request.form.get('url')
    try:
        result = requests.get(url)
        result.raise_for_status()
    except Exception:
        result = requests.get("https://www.sparkfun.com/categories/287")
    soup = BeautifulSoup(result.content, 'lxml')

    # Product titles: <div class="main"><h3><a>title</a></h3></div>
    prodtitle = [a.text
                 for main in soup.find_all('div', attrs={'class': 'main'})
                 for h3 in main.find_all('h3')
                 for a in h3.find_all('a')]

    # Product thumbnails: <div class="actions-wrap"><a><img src=...></a></div>
    image_links = [img['src']
                   for wrap in soup.find_all('div', attrs={'class': 'actions-wrap'})
                   for a in wrap.find_all('a')
                   for img in a.find_all('img')]

    # Price text blocks
    pricelist = [div.text for div in soup.find_all('div', attrs={'class': 'prices'})]

    # Lists of scraped fields -> dictionary -> DataFrame -> indented JSON
    data = {'Product Title': prodtitle,
            'Product Image Links': image_links,
            'prices': pricelist}
    df = pd.DataFrame.from_dict(data, orient='index').transpose()
    h = json.dumps(json.loads(df.to_json(orient="index")), indent=4)
    df.to_html('templates/data.html')
    return render_template("home.html", value=h)
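The DataFrame round-trip above just turns the parallel scraped lists into index-keyed JSON. The same shape can be produced with the standard library alone; this is a sketch with illustrative data, assuming equal-length lists (pandas would pad uneven lists with nulls, `zip` truncates instead):

```python
import json

titles = ["Board A", "Board B"]
links = ["/img/a.png", "/img/b.png"]
prices = ["$9.95", "$24.50"]

# Stdlib equivalent of DataFrame.from_dict(...).transpose().to_json(orient="index")
records = {str(i): {"Product Title": t, "Product Image Links": l, "prices": p}
           for i, (t, l, p) in enumerate(zip(titles, links, prices))}
payload = json.dumps(records, indent=4)
```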
@app.route('/data', methods=["GET", "POST"])
def datafunc():
return render_template("data.html")
@app.route('/productdetails', methods=["GET", "POST"])
def sparkfun2():
    """Scrape details, rating, and reviews from a single SparkFun product page.

    Falls back to a default product when no URL is submitted or the
    request fails.
    """
    url = request.form.get('url9')
    try:
        result = requests.get(url)
        result.raise_for_status()
    except Exception:
        result = requests.get('https://www.sparkfun.com/products/16811')
    soup = BeautifulSoup(result.content, 'lxml')

    prodesc = [p.text
               for tab in soup.find_all('div', attrs={'id': 'description-tab'})
               for p in tab.find_all('p')]
    prodprice = [h3.text
                 for div in soup.find_all('div', attrs={'class': 'display-price'})
                 for h3 in div.find_all('h3')]
    prodtitle = [h1.text
                 for div in soup.find_all('div', attrs={'class': 'product-title'})
                 for h1 in div.find_all('h1')]
    aggrating = [h3.text
                 for div in soup.find_all('div', attrs={'itemprop': 'aggregateRating'})
                 for h3 in div.find_all('h3')]
    reviewdes = [p.text
                 for div in soup.find_all('div', attrs={'class': 'review-text'})
                 for p in div.find_all('p')]
    reviewperson = [a['href']
                    for byline in soup.find_all('p', attrs={'class': 'review-byline text-muted'})
                    for a in byline.find_all('a')]

    proddata = {'Product Name': prodtitle,
                'Product Description': prodesc,
                'Product Price': prodprice,
                'Product Rating': aggrating,
                'Product Reviews': reviewdes,
                'Reviewed By': reviewperson}
    pro = json.dumps(proddata, indent=4)
    return render_template("productdetails.html", value999=pro)
if __name__ == "__main__":
    app.run(debug=True)
# policy_learning/datasets.py (repo: UT-Austin-RPL/BUDS, license: MIT)
import h5py
import torch
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms
import numpy as np
from models.model_utils import *
import init_path
from PIL import Image
import cv2
from policy_learning.models import PolicyType
from models.torch_utils import to_onehot
class SubtaskDataset(Dataset):
def __init__(self,
data_file,
subtask_file,
subgoal_embedding_file,
subtask_id,
data_modality=["image", "proprio"],
use_eye_in_hand=True,
use_subgoal_eye_in_hand=False,
policy_type=PolicyType.NO_SUBGOAL,
gripper_smoothing=False,
subgoal_cfg=None,
skill_training_cfg=None,
transform=None,
use_final_goal=False,
skip_task=[]):
"""
Args:
data_modality (list): provide a list of data modality. "image" - using image; "proprio": incorporating gripper pose + gripper position; "state": using low-dim states
subgoal_cfg (EasyDict): if None, no subgoal is used; or the subgoal_cfg is defined
"""
self.data_modality = data_modality
self.use_eye_in_hand = use_eye_in_hand
self.use_subgoal_eye_in_hand = use_subgoal_eye_in_hand
self.transform = transform
self.env_name = data_file["data"].attrs["env"]
self.subtask_id = subtask_id
self.policy_type = policy_type
self.subgoal_cfg = subgoal_cfg
self.skill_training_cfg = skill_training_cfg
subtask_segmentation = subtask_file["subtasks"][f"subtask_{subtask_id}"]["segmentation"][()]
self.agentview_image_names = []
self.eye_in_hand_image_names = []
self.goal_image_names = []
self.states = []
self.actions = []
self.proprios = []
self.agentview_images = []
self.eye_in_hand_images = []
self.goal_images = []
self.subgoal_indices = []
action_threshold = -1
smooth_window = 10
before_window = int(smooth_window * 0.3)
after_window = int(smooth_window * 0.7)
skip_action_indices = []
skip_ep_indices = []
if skip_task != []:
for idx in skip_task:
ep_indices = data_file["data/task"].attrs[f"{idx}"]
skip_ep_indices += ep_indices.tolist()
print("Skipping : ", skip_ep_indices, len(skip_ep_indices))
for (i, start_idx, end_idx) in subtask_segmentation:
if i in skip_ep_indices:
continue
actions = data_file[f"data/ep_{i}/actions"][()][start_idx:end_idx+1]
for j in range(len(actions)):
if gripper_smoothing:
action_history = list(actions[max(0, j- before_window):min(j+after_window, len(actions))][:, -1])
if j - smooth_window < 0:
action_history += [actions[0][-1]] * (abs(j-before_window))
elif j + smooth_window > len(actions):
action_history += [actions[-1][-1]] * (abs(j+after_window - len(actions)))
smoothed_action = np.mean(action_history)
self.actions.append(np.concatenate([actions[j][:-1], [smoothed_action]]))
else:
self.actions.append(actions[j])
self.actions = np.array(self.actions)
self.total_len = len(self.actions)
self.actions = safe_cuda(torch.from_numpy(self.actions).float())
self.policy_type = policy_type
if policy_type == PolicyType.NORMAL_SUBGOAL:
self.subgoal_images = []
count = 0
for (i, start_idx, end_idx) in subtask_segmentation:
if i in skip_ep_indices:
continue
agentview_image_names = data_file[f"data/ep_{i}/agentview_image_names"][()][start_idx:end_idx+1]
for j in range(len(agentview_image_names)):
future_idx = min(end_idx, start_idx + j + subgoal_cfg["horizon"]) - start_idx
self.subgoal_indices.append(future_idx + count)
count = len(self.subgoal_indices)
# if np.linalg.norm(data_file[f"data/ep_{i}/actions"][()][start_idx+j][:-1]) <= action_threshold:
# continue
# self.subgoal_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[future_idx])).transpose(2, 0, 1)))
# self.subgoal_images = safe_cuda(torch.stack(self.subgoal_images, dim=0))
assert(len(self.actions) == len(self.subgoal_indices))
assert(max(self.subgoal_indices) == len(self.actions)-1)
# elif policy_type == PolicyType.VAE_SUBGOAL:
# vae_embedding_file = subgoal_embedding_file
# self.vae_embeddings = []
# for (i, start_idx, end_idx) in subtask_segmentation:
# vae_embeddings = vae_embedding_file[f"data/ep_{i}/embedding"][()][start_idx:end_idx+1]
# for j in range(len(vae_embeddings)):
# if np.linalg.norm(data_file[f"data/ep_{i}/actions"][()][start_idx+j][:-1]) <= action_threshold:
# continue
# self.vae_embeddings.append(torch.from_numpy(vae_embeddings[j]))
# self.vae_embeddings = safe_cuda(torch.stack(self.vae_embeddings, dim=0)).float()
if "image" in data_modality:
for (i, start_idx, end_idx) in subtask_segmentation:
if i in skip_ep_indices:
continue
agentview_image_names = data_file[f"data/ep_{i}/agentview_image_names"][()][start_idx:end_idx+1]
if self.use_eye_in_hand:
eye_in_hand_image_names = data_file[f"data/ep_{i}/eye_in_hand_image_names"][()][start_idx:end_idx+1]
for j in range(len(agentview_image_names)):
if np.linalg.norm(data_file[f"data/ep_{i}/actions"][()][start_idx+j][:-1]) <= action_threshold:
continue
self.agentview_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[j])).transpose(2, 0, 1)))
if self.use_eye_in_hand:
self.eye_in_hand_images.append(torch.from_numpy(np.array(Image.open(eye_in_hand_image_names[j])).transpose(2, 0, 1)))
            self.agentview_images = safe_cuda(torch.stack(self.agentview_images, dim=0))
if self.use_eye_in_hand:
self.eye_in_hand_images = safe_cuda(torch.stack(self.eye_in_hand_images, dim=0))
assert(len(self.actions) == len(self.agentview_images))
if "proprio" in data_modality:
gripper_states_list = []
joint_states_list = []
for (i, start_idx, end_idx) in subtask_segmentation:
if i in skip_ep_indices:
continue
gripper_states = data_file[f"data/ep_{i}/gripper_states"][()]
if self.skill_training_cfg.use_gripper:
for j in range(start_idx, end_idx+1):
gripper_state = []
for k in range(j-5, j):
if k < 0:
gripper_state += gripper_states[0].tolist()
else:
gripper_state += gripper_states[k].tolist()
gripper_states_list.append(gripper_state)
if self.skill_training_cfg.use_joints:
joint_states = torch.from_numpy(data_file[f"data/ep_{i}/joint_states"][()][start_idx:end_idx+1])
for j in range(len(joint_states)):
joint_states_list.append(joint_states[j])
if self.skill_training_cfg.use_gripper and self.skill_training_cfg.use_joints:
self.proprios = safe_cuda(torch.cat([torch.stack(joint_states_list, dim=0),
torch.tensor(gripper_states_list)], dim=1)).float()
elif self.skill_training_cfg.use_gripper:
self.proprios = safe_cuda(torch.tensor(gripper_states_list)).float()
elif self.skill_training_cfg.use_joints:
self.proprios = safe_cuda(torch.stack(joint_states_list, dim=0)).float()
assert(len(self.proprios) == len(self.actions))
# if "state" in data_modality:
# # low dimensional state training
# for (i, start_idx, end_idx) in subtask_segmentation:
# low_dim_states = torch.from_numpy(data_file[f"data/ep_{i}/low_dim_states"][()][start_idx:end_idx + 1])
# self.states.append(low_dim_states)
# self.states = safe_cuda(torch.cat(self.states, dim=0)).float()
# assert(len(self.actions) == len(self.states))
print("Dataset info: ")
print("Action dim: ", self.action_dim)
print("Proprio dim: ", self.proprio_dim)
@property
def action_dim(self):
return self.actions.shape[-1]
@property
def proprio_dim(self):
if self.proprios == []:
return 0
else:
return self.proprios.shape[-1]
def __len__(self):
return self.total_len
def __getitem__(self, idx):
action = self.actions[idx, ...]
data = {"action": action}
if "image" in self.data_modality:
agentview_image = self.agentview_images[idx].float() / 255.
if self.use_eye_in_hand:
eye_in_hand_image = self.eye_in_hand_images[idx].float() / 255.
state_image = torch.cat((agentview_image, eye_in_hand_image), dim=0)
if self.transform is not None:
state_image = self.transform(state_image)
data["state_image"] = state_image
else:
if self.transform is not None:
state_image = self.transform(state_image)
data["state_image"] = agentview_image
if "proprio" in self.data_modality:
data["proprio"] = self.proprios[idx]
if "state" in self.data_modality:
data["state"] = self.states[idx]
if self.policy_type == PolicyType.NORMAL_SUBGOAL:
if self.use_subgoal_eye_in_hand:
subgoal_image = torch.cat((self.agentview_images[self.subgoal_indices[idx]].float() / 255.,
self.eye_in_hand_images[self.subgoal_indices[idx]].float() / 255.))
else:
subgoal_image = self.agentview_images[self.subgoal_indices[idx]].float() / 255.
# subgoal_image = self.subgoal_images[idx].float() / 255.
data["subgoal"] = subgoal_image
elif self.policy_type == PolicyType.VAE_SUBGOAL:
data["vae_embedding"] = self.vae_embeddings[idx]
return data
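The asymmetric gripper smoothing in `SubtaskDataset.__init__` above is easiest to see in isolation. The following standalone sketch is my reconstruction of the same windowing (30% of the window before each step, 70% after, with edge steps padded by repeating the first or last action), not a function from the repo:

```python
import numpy as np

def smooth_gripper(actions, window=10):
    """Reconstruction (assumption) of the dataset's gripper-action smoothing."""
    before = int(window * 0.3)
    after = int(window * 0.7)
    smoothed = []
    for j in range(len(actions)):
        hist = list(actions[max(0, j - before):min(j + after, len(actions))])
        # Mirror the padding conditions used in the loop above.
        if j - window < 0:
            hist += [actions[0]] * abs(j - before)
        elif j + window > len(actions):
            hist += [actions[-1]] * abs(j + after - len(actions))
        smoothed.append(np.mean(hist))
    return np.array(smoothed)
```

A constant gripper signal is left unchanged, which is a quick sanity check on the edge padding.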
class BCMetaDataset():
def __init__(self,
data_file_name,
subtasks_file_name,
subgoal_embedding_file_name,
use_rnn=False,
data_modality=["image", "proprio"],
use_eye_in_hand=True,
policy_type=PolicyType.NO_SUBGOAL,
subgoal_cfg=None,
skill_training_cfg=None,
subtask_id=[],
gripper_smoothing=False,
transform=None,
rnn_horizon=0,
skip_task_id=[]):
self.f = h5py.File(data_file_name, "r")
self.subtasks_f = h5py.File(subtasks_file_name, "r")
if subgoal_embedding_file_name is not None:
self.subgoal_embedding_f = h5py.File(subgoal_embedding_file_name, "r")
else:
self.subgoal_embedding_f = None
self.use_rnn = use_rnn
self.subtask_id = subtask_id
self.data_modality = data_modality
self.use_eye_in_hand = use_eye_in_hand
self.transform = transform
self.num_subtasks = self.subtasks_f["subtasks"].attrs["num_subtasks"]
self.policy_type = policy_type
self.subgoal_cfg = subgoal_cfg
self.skill_training_cfg = skill_training_cfg
self.gripper_smoothing = gripper_smoothing
self.rnn_horizon = rnn_horizon
self.skip_task_id = skip_task_id
print("Score of this data is: ", self.subtasks_f["subtasks"].attrs["score"])
self.datasets = []
def get_dataset(self, idx):
if self.subtask_id != []:
if idx not in self.subtask_id:
return None
if not self.use_rnn:
dataset = SubtaskDataset(self.f,
self.subtasks_f,
self.subgoal_embedding_f,
idx,
data_modality=self.data_modality,
use_eye_in_hand=self.use_eye_in_hand,
use_subgoal_eye_in_hand=self.subgoal_cfg.use_eye_in_hand,
policy_type=self.policy_type,
subgoal_cfg=self.subgoal_cfg,
skill_training_cfg=self.skill_training_cfg,
gripper_smoothing=self.gripper_smoothing,
transform=self.transform,
skip_task=self.skip_task_id)
else:
print("Using RNN")
dataset = SubtaskSequenceDataset(self.f,
self.subtasks_f,
self.subgoal_embedding_f,
idx,
data_modality=self.data_modality,
use_eye_in_hand=self.use_eye_in_hand,
use_subgoal_eye_in_hand=self.subgoal_cfg.use_eye_in_hand,
policy_type=self.policy_type,
subgoal_cfg=self.subgoal_cfg,
skill_training_cfg=self.skill_training_cfg,
gripper_smoothing=self.gripper_smoothing,
transform=self.transform,
rnn_horizon=self.rnn_horizon)
print(idx, len(dataset))
return dataset
def close(self):
self.f.close()
self.subtasks_f.close()
class SubtaskSequenceDataset(Dataset):
def __init__(self,
data_file,
subtask_file,
subgoal_embedding_file,
subtask_id,
data_modality=["image", "proprio"],
use_eye_in_hand=True,
use_subgoal_eye_in_hand=True,
policy_type=PolicyType.NO_SUBGOAL,
gripper_smoothing=False,
subgoal_cfg=None,
transform=None,
rnn_horizon=10):
num_eps = data_file["data"].attrs["num_eps"]
self.data_modality = data_modality
self.use_eye_in_hand = use_eye_in_hand
self.use_subgoal_eye_in_hand = use_subgoal_eye_in_hand
self.transform = transform
self.env_name = data_file["data"].attrs["env"]
self.subtask_id = subtask_id
self.policy_type = policy_type
self.subgoal_cfg = subgoal_cfg
subtask_segmentation = subtask_file["subtasks"][f"subtask_{subtask_id}"]["segmentation"][()]
self._idx_to_seg_id = dict()
self._seg_id_to_start_indices = dict()
self._seg_id_to_seg_length = dict()
self.seq_length = rnn_horizon
self.agentview_image_names = []
self.frontview_image_names = []
self.eye_in_hand_image_names = []
self.goal_image_names = []
self.actions = []
self.states = []
self.agentview_images = []
self.eye_in_hand_images = []
self.goal_images = []
self.subgoal_indices = []
self.proprios = []
        start_idx = 0  # overwritten per segment in the loops below
self.total_len = 0
if "image" in data_modality:
count = 0
for (seg_idx, (i, start_idx, end_idx)) in enumerate(subtask_segmentation):
agentview_image_names = data_file[f"data/ep_{i}/agentview_image_names"][()][start_idx:end_idx+1]
eye_in_hand_image_names = data_file[f"data/ep_{i}/eye_in_hand_image_names"][()][start_idx:end_idx+1]
self._seg_id_to_start_indices[seg_idx] = self.total_len
self._seg_id_to_seg_length[seg_idx] = end_idx - start_idx + 1
actions = data_file[f"data/ep_{i}/actions"][()][start_idx:end_idx+1]
for j in range(len(agentview_image_names)):
self._idx_to_seg_id[self.total_len] = seg_idx
self.total_len += 1
self.agentview_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[j])).transpose(2, 0, 1)))
self.eye_in_hand_images.append(torch.from_numpy(np.array(Image.open(eye_in_hand_image_names[j])).transpose(2, 0, 1)))
future_idx = min(end_idx, start_idx + j + subgoal_cfg["horizon"]) - start_idx
self.subgoal_indices.append(future_idx + count)
count = len(self.subgoal_indices)
self.actions.append(actions)
self.actions = np.vstack(self.actions)
self.actions = safe_cuda(torch.from_numpy(self.actions))
self.agentview_images = safe_cuda(torch.stack(self.agentview_images, dim=0))
self.eye_in_hand_images = safe_cuda(torch.stack(self.eye_in_hand_images, dim=0))
assert(len(self.actions) == len(self.subgoal_indices))
assert(max(self.subgoal_indices) == len(self.actions)-1)
# else:
# for (seg_idx, (i, start_idx, end_idx)) in enumerate(subtask_segmentation):
# low_dim_states = torch.from_numpy(data_file[f"data/ep_{i}/low_dim_states"][()][start_idx:end_idx+1])
# actions = data_file[f"data/ep_{i}/actions"][()][start_idx:end_idx+1]
# self.states.append(low_dim_states)
# self._seg_id_to_start_indices[seg_idx] = self.total_len
# self._seg_id_to_seg_length[seg_idx] = end_idx - start_idx + 1
# for j in range(len(actions)):
# self._idx_to_seg_id[self.total_len] = seg_idx
# self.total_len += 1
# self.actions.append(actions)
# self.states = safe_cuda(torch.from_numpy(np.vstack(self.states))).float()
# self.actions = safe_cuda(torch.from_numpy(np.vstack(self.actions))).float()
print("Finish loading: ", self.total_len)
@property
def action_dim(self):
return self.actions.shape[-1]
@property
def proprio_dim(self):
if self.proprios == []:
return 0
else:
return self.proprios.shape[-1]
def __len__(self):
return self.total_len
def __getitem__(self, idx):
seg_id = self._idx_to_seg_id[idx]
seg_start_index = self._seg_id_to_start_indices[seg_id]
seg_length = self._seg_id_to_seg_length[seg_id]
index_in_seg = idx - seg_start_index
end_index_in_seg = seg_length
seq_begin_index = max(0, index_in_seg)
seq_end_index = min(seg_length, index_in_seg + self.seq_length)
padding = max(0, seq_begin_index + self.seq_length - seg_length)
seq_begin_index += seg_start_index
seq_end_index += seg_start_index
action_seq = self.actions[seq_begin_index: seq_end_index].float()
if "image" in self.data_modality:
agentview_seq = self.agentview_images[seq_begin_index: seq_end_index]
eye_in_hand_seq = self.eye_in_hand_images[seq_begin_index: seq_end_index]
subgoal_index = self.subgoal_indices[seq_end_index-1]
if padding > 0:
# Pad
action_end_pad = torch.repeat_interleave(action_seq[-1].unsqueeze(0), padding, dim=0)
action_seq = torch.cat([action_seq] + [action_end_pad], dim=0)
agentview_end_pad = torch.repeat_interleave(agentview_seq[-1].unsqueeze(0), padding, dim=0)
agentview_seq = torch.cat([agentview_seq] + [agentview_end_pad], dim=0)
eye_in_hand_end_pad = torch.repeat_interleave(eye_in_hand_seq[-1].unsqueeze(0), padding, dim=0)
eye_in_hand_seq = torch.cat([eye_in_hand_seq] + [eye_in_hand_end_pad], dim=0)
if self.use_eye_in_hand:
obs_seq = torch.cat((agentview_seq, eye_in_hand_seq), dim=1).float() / 255.
else:
obs_seq = agentview_seq.float() / 255.
if self.use_subgoal_eye_in_hand:
subgoal = torch.cat((self.agentview_images[subgoal_index],
self.eye_in_hand_images[subgoal_index]), dim=1).float() / 255.
else:
subgoal = self.agentview_images[subgoal_index].float() / 255.
return {"obs_seq": obs_seq,
"action": action_seq,
"subgoal": subgoal}
# else:
# state_seq = self.states[seq_begin_index:seq_end_index]
# if padding > 0:
# action_end_pad = torch.repeat_interleave(action_seq[-1].unsqueeze(0), padding, dim=0)
# action_seq = torch.cat([action_seq] + [action_end_pad], dim=0)
# state_end_pad = torch.repeat_interleave(state_seq[-1].unsqueeze(0), padding, dim=0)
# state_seq = torch.cat([state_seq] + [state_end_pad], dim=0)
# return {"obs": state_seq,
# "actions": action_seq}
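The index arithmetic in `SubtaskSequenceDataset.__getitem__` above reduces to taking a fixed-length window from the segment and right-padding it by repeating the last element, so every RNN sample has exactly `seq_length` steps. A minimal sketch of that behavior (the helper is hypothetical):

```python
def pad_window(seq, start, horizon):
    """Take seq[start:start+horizon], right-padding with the last element
    so the result always has exactly `horizon` entries (hypothetical helper)."""
    window = list(seq[start:start + horizon])
    window += [window[-1]] * (horizon - len(window))
    return window
```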
class BaselineBCDataset(Dataset):
def __init__(self,
data_file_name,
data_modality=["image", "proprio"],
use_eye_in_hand=True,
use_subgoal_eye_in_hand=False,
subgoal_cfg=None,
transform=None,
skill_training_cfg=None,
baseline_type="single_skill"):
assert(baseline_type in ["single_skill", "gti", "rpl"])
data_file = h5py.File(data_file_name, "r")
self.data_modality = data_modality
self.use_eye_in_hand = use_eye_in_hand
self.transform = transform
self.subgoal_cfg = subgoal_cfg
self.skill_training_cfg = skill_training_cfg
self.baseline_type = baseline_type
self.use_subgoal_eye_in_hand = use_subgoal_eye_in_hand
self.env_name = data_file["data"].attrs["env"]
self.agentview_image_names = []
self.eye_in_hand_image_names = []
self.goal_image_names = []
self.states = []
self.actions = []
self.proprios = []
self.agentview_images = []
self.eye_in_hand_images = []
self.goal_images = []
self.subgoal_indices = []
self.subgoal_transforms = transforms.Compose([
transforms.Resize((64, 64)),
transforms.Grayscale(num_output_channels=1)
])
self.num_eps = data_file["data"].attrs["num_eps"]
for i in range(self.num_eps):
actions = data_file[f"data/ep_{i}/actions"][()]
for j in range(len(actions)):
self.actions.append(actions[j])
self.actions = np.array(self.actions)
self.total_len = len(self.actions)
self.actions = safe_cuda(torch.from_numpy(self.actions).float())
        # If GTI or RPL, also record subgoal indices: the frame `horizon` steps
        # ahead of each step, clamped to the end of its episode, with a running
        # offset so indices point into the concatenated image arrays.
        if self.baseline_type in ("gti", "rpl"):
            self.subgoal_images = []
            count = 0
            for i in range(self.num_eps):
                agentview_image_names = data_file[f"data/ep_{i}/agentview_image_names"][()]
                for j in range(len(agentview_image_names)):
                    future_idx = min(len(agentview_image_names) - 1, j + subgoal_cfg["horizon"])
                    self.subgoal_indices.append(count + future_idx)
                count = len(self.subgoal_indices)
            assert(len(self.actions) == len(self.subgoal_indices))
if "image" in data_modality:
for i in range(self.num_eps):
agentview_image_names = data_file[f"data/ep_{i}/agentview_image_names"][()]
if self.use_eye_in_hand:
eye_in_hand_image_names = data_file[f"data/ep_{i}/eye_in_hand_image_names"][()]
for j in range(len(agentview_image_names)):
self.agentview_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[j])).transpose(2, 0, 1)))
if self.use_eye_in_hand:
self.eye_in_hand_images.append(torch.from_numpy(np.array(Image.open(eye_in_hand_image_names[j])).transpose(2, 0, 1)))
            self.agentview_images = safe_cuda(torch.stack(self.agentview_images, dim=0))
if self.use_eye_in_hand:
self.eye_in_hand_images = safe_cuda(torch.stack(self.eye_in_hand_images, dim=0))
assert(len(self.actions) == len(self.agentview_images))
if "proprio" in data_modality:
gripper_states_list = []
joint_states_list = []
for i in range(self.num_eps):
gripper_states = data_file[f"data/ep_{i}/gripper_states"][()]
if self.skill_training_cfg.use_gripper:
for j in range(len(gripper_states)):
gripper_state = []
for k in range(j-5, j):
if k < 0:
gripper_state += gripper_states[0].tolist()
else:
gripper_state += gripper_states[k].tolist()
gripper_states_list.append(gripper_state)
if self.skill_training_cfg.use_joints:
joint_states = torch.from_numpy(data_file[f"data/ep_{i}/joint_states"][()])
for j in range(len(joint_states)):
joint_states_list.append(joint_states[j])
if self.skill_training_cfg.use_gripper and self.skill_training_cfg.use_joints:
self.proprios = safe_cuda(torch.cat([torch.stack(joint_states_list, dim=0),
torch.tensor(gripper_states_list)], dim=1)).float()
elif self.skill_training_cfg.use_gripper:
self.proprios = safe_cuda(torch.tensor(gripper_states_list)).float()
elif self.skill_training_cfg.use_joints:
self.proprios = safe_cuda(torch.stack(joint_states_list, dim=0)).float()
assert(len(self.proprios) == len(self.actions))
@property
def action_dim(self):
return self.actions.shape[-1]
@property
def proprio_dim(self):
if self.proprios == []:
return 0
else:
return self.proprios.shape[-1]
def __len__(self):
return self.total_len
def __getitem__(self, idx):
action = self.actions[idx, ...]
data = {"action": action}
if "image" in self.data_modality:
agentview_image = self.agentview_images[idx].float() / 255.
if self.use_eye_in_hand:
eye_in_hand_image = self.eye_in_hand_images[idx].float() / 255.
state_image = torch.cat((agentview_image, eye_in_hand_image), dim=0)
if self.transform is not None:
state_image = self.transform(state_image)
data["state_image"] = state_image
else:
if self.transform is not None:
state_image = self.transform(state_image)
data["state_image"] = agentview_image
if "proprio" in self.data_modality:
data["proprio"] = self.proprios[idx]
if self.baseline_type == "gti":
if self.use_subgoal_eye_in_hand:
subgoal_image = torch.cat((self.agentview_images[self.subgoal_indices[idx]].float() / 255.,
self.eye_in_hand_images[self.subgoal_indices[idx]].float() / 255.))
else:
subgoal_image = self.agentview_images[self.subgoal_indices[idx]].float() / 255.
data["subgoal"] = subgoal_image
data["subgoal_target"] = self.subgoal_transforms(self.agentview_images[self.subgoal_indices[idx]].float() / 255.)
elif self.baseline_type == "rpl":
if self.use_subgoal_eye_in_hand:
subgoal_image = torch.cat((self.agentview_images[self.subgoal_indices[idx]].float() / 255.,
self.eye_in_hand_images[self.subgoal_indices[idx]].float() / 255.))
else:
subgoal_image = self.agentview_images[self.subgoal_indices[idx]].float() / 255.
data["subgoal"] = subgoal_image
return data
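Both `SubtaskDataset` and `BaselineBCDataset` build the same 5-step gripper-state history for the "proprio" modality, padding steps before the start of the episode with the first frame. A standalone sketch (hypothetical helper mirroring those loops):

```python
def gripper_history(gripper_states, j, window=5):
    """Concatenate the `window` gripper states preceding step j, padding
    out-of-range steps with the first frame (hypothetical helper)."""
    hist = []
    for k in range(j - window, j):
        hist += list(gripper_states[max(k, 0)])
    return hist
```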
class VAEDataset(Dataset):
def __init__(self,
data_file_name,
transform):
data_file = h5py.File(data_file_name, "r")
self.num_eps = data_file["data"].attrs["num_eps"]
self.agentview_images = safe_cuda(torch.from_numpy((data_file["data/agentview_images"][()])))
self.total_len = len(self.agentview_images)
self.transform = transform
print("Finished loading!")
def __len__(self):
return self.total_len
def __getitem__(self, idx):
return {"state": self.agentview_images[idx].float() / 255.,
"target": self.transform(self.agentview_images[idx]).float() / 255.}
class MetaPolicyDataset(Dataset):
def __init__(self,
data_file_name,
embedding_file_name,
subtasks_file_name,
use_eye_in_hand=False,
use_embedding=False,
seq_length=10,
transform=None):
data_file = h5py.File(data_file_name, "r")
embedding_file = h5py.File(embedding_file_name, "r")
subtasks_file = h5py.File(subtasks_file_name, "r")
self.use_eye_in_hand = use_eye_in_hand
self.seq_length = seq_length
self.transform = transform
self.num_subtasks = subtasks_file["subtasks"].attrs["num_subtasks"]
self.num_eps = subtasks_file["subtasks"].attrs["num_eps"]
self.env_name = data_file["data"].attrs["env"]
self.embeddings = []
self.goal_embeddings = []
self.agentview_image_names = []
self.eye_in_hand_image_names = []
self.subgoal_embeddings = []
self.subtask_labels = []
self.agentview_images = []
self.eye_in_hand_images = []
self.total_len = 0
for ep_idx in range(self.num_eps):
            try:
                saved_ep_subtasks_seq = subtasks_file["subtasks"][f"ep_subtasks_seq_{ep_idx}"][()]
            except KeyError:
                # Skip episodes that have no saved subtask segmentation.
                continue
for (k, (start_idx, end_idx, subtask_label)) in enumerate(saved_ep_subtasks_seq):
if k == len(saved_ep_subtasks_seq) - 1:
e_idx = end_idx + 1
else:
e_idx = end_idx
agentview_image_names = data_file[f"data/ep_{ep_idx}/agentview_image_names"][()][start_idx:e_idx]
eye_in_hand_image_names = data_file[f"data/ep_{ep_idx}/eye_in_hand_image_names"][()][start_idx:e_idx]
embeddings = embedding_file[f"data/ep_{ep_idx}/embedding"][()][start_idx:e_idx]
for j in range(len(agentview_image_names)):
self.agentview_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[j])).transpose(2, 0, 1)))
self.eye_in_hand_images.append(torch.from_numpy(np.array(Image.open(eye_in_hand_image_names[j])).transpose(2, 0, 1)))
self.subgoal_embeddings.append(torch.from_numpy(embeddings[j]))
self.subtask_labels.append(subtask_label)
self.total_len += 1
self.subgoal_embedding_dim = len(self.subgoal_embeddings[-1])
        self.agentview_images = safe_cuda(torch.stack(self.agentview_images, dim=0))
self.eye_in_hand_images = safe_cuda(torch.stack(self.eye_in_hand_images, dim=0))
self.subgoal_embeddings = safe_cuda(torch.stack(self.subgoal_embeddings, dim=0))
assert(self.total_len == len(self.subtask_labels))
self.subtask_labels = safe_cuda(torch.from_numpy(np.array(self.subtask_labels)))
# print(self.agentview_images.shape)
print("Subtask: ", self.subtask_labels.shape)
data_file.close()
embedding_file.close()
subtasks_file.close()
def __len__(self):
return self.total_len
def __getitem__(self, idx):
agentview_image = self.agentview_images[idx].float() / 255.
if self.use_eye_in_hand:
eye_in_hand_image = self.eye_in_hand_images[idx].float() / 255.
state_image = torch.cat([agentview_image, eye_in_hand_image], dim=0)
else:
state_image = agentview_image
subgoal_embedding = self.subgoal_embeddings[idx].float()
subtask_label = self.subtask_labels[idx]
return {"state_image": state_image, "embedding": subgoal_embedding, "id_vector": to_onehot(subtask_label, self.num_subtasks)}, {"embedding": subgoal_embedding, "id": subtask_label}
class MultitaskMetaPolicyDataset(Dataset):
def __init__(self,
data_file_name,
embedding_file_name,
subtasks_file_name,
task_id,
use_eye_in_hand=False,
use_embedding=False,
seq_length=10,
testing_percentage=1.0,
training_task_id=-1,
transform=None):
data_file = h5py.File(data_file_name, "r")
embedding_file = h5py.File(embedding_file_name, "r")
subtasks_file = h5py.File(subtasks_file_name, "r")
self.use_eye_in_hand = use_eye_in_hand
self.seq_length = seq_length
self.transform = transform
self.num_subtasks = subtasks_file["subtasks"].attrs["num_subtasks"]
self.task_id = task_id
self.training_task_id = training_task_id
if training_task_id == -1:
self.ep_indices = data_file["data/task"].attrs[f"{self.task_id}"]
else:
ep_indices = []
            # Hard-coded episode groupings: each training_task_id selects a
            # pair of task ids whose episodes are combined.
            ids = [[0, 4], [2, 3], [1, 7]][self.training_task_id]
for i in ids:
ep_indices += (data_file["data/task"].attrs[f"{i}"]).tolist()
self.ep_indices = ep_indices
if testing_percentage < 1.0:
self.ep_indices = self.ep_indices[:int(len(self.ep_indices) * testing_percentage)]
self.num_eps = len(self.ep_indices)
print("Number of eps: ", self.num_eps)
self.env_name = data_file["data"].attrs["env"]
self.embeddings = []
self.goal_embeddings = []
self.agentview_image_names = []
self.eye_in_hand_image_names = []
self.subgoal_embeddings = []
self.subtask_labels = []
self.agentview_images = []
self.eye_in_hand_images = []
self.total_len = 0
for ep_idx in self.ep_indices:
            try:
                saved_ep_subtasks_seq = subtasks_file["subtasks"][f"ep_subtasks_seq_{ep_idx}"][()]
            except KeyError:
                # Skip episodes that have no saved subtask segmentation.
                continue
for (k, (start_idx, end_idx, subtask_label)) in enumerate(saved_ep_subtasks_seq):
if k == len(saved_ep_subtasks_seq) - 1:
e_idx = end_idx + 1
else:
e_idx = end_idx
agentview_image_names = data_file[f"data/ep_{ep_idx}/agentview_image_names"][()][start_idx:e_idx]
eye_in_hand_image_names = data_file[f"data/ep_{ep_idx}/eye_in_hand_image_names"][()][start_idx:e_idx]
embeddings = embedding_file[f"data/ep_{ep_idx}/embedding"][()][start_idx:e_idx]
for j in range(len(agentview_image_names)):
self.agentview_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[j])).transpose(2, 0, 1)))
self.eye_in_hand_images.append(torch.from_numpy(np.array(Image.open(eye_in_hand_image_names[j])).transpose(2, 0, 1)))
self.subgoal_embeddings.append(torch.from_numpy(embeddings[j]))
self.subtask_labels.append(subtask_label)
self.total_len += 1
self.subgoal_embedding_dim = len(self.subgoal_embeddings[-1])
        self.agentview_images = safe_cuda(torch.stack(self.agentview_images, dim=0))
self.eye_in_hand_images = safe_cuda(torch.stack(self.eye_in_hand_images, dim=0))
self.subgoal_embeddings = safe_cuda(torch.stack(self.subgoal_embeddings, dim=0))
assert(self.total_len == len(self.subtask_labels))
self.subtask_labels = safe_cuda(torch.from_numpy(np.array(self.subtask_labels)))
# print(self.agentview_images.shape)
print("Subtask: ", self.subtask_labels.shape)
data_file.close()
embedding_file.close()
subtasks_file.close()
def __len__(self):
return self.total_len
def __getitem__(self, idx):
agentview_image = self.agentview_images[idx].float() / 255.
if self.use_eye_in_hand:
eye_in_hand_image = self.eye_in_hand_images[idx].float() / 255.
state_image = torch.cat([agentview_image, eye_in_hand_image], dim=0)
else:
state_image = agentview_image
subgoal_embedding = self.subgoal_embeddings[idx].float()
subtask_label = self.subtask_labels[idx]
return {"state_image": state_image, "embedding": subgoal_embedding, "id_vector": to_onehot(subtask_label, self.num_subtasks)}, {"embedding": subgoal_embedding, "id": subtask_label}
# class MetaRNNPolicyDataset(Dataset):
# def __init__(self,
# data_file_name,
# embedding_file_name,
# subtasks_file_name,
# use_eye_in_hand=False,
# use_embedding=False,
# seq_length=10,
# transform=None):
# data_file = h5py.File(data_file_name, "r")
# embedding_file = h5py.File(embedding_file_name, "r")
# self.embedding_dim = 16 # embedding_file["data"].attrs["embedding_dim"]
# subtasks_file = h5py.File(subtasks_file_name, "r")
# self.use_eye_in_hand = use_eye_in_hand
# self.seq_length = seq_length
# self.transform = transform
# self.num_subtasks = subtasks_file["subtasks"].attrs["num_subtasks"]
# self.num_eps = subtasks_file["subtasks"].attrs["num_eps"]
# self.env_name = data_file["data"].attrs["env"]
# self.embeddings = []
# self.goal_embeddings = []
# self.agentview_image_names = []
# self.eye_in_hand_image_names = []
# self.goal_image_names = []
# self.subtask_labels = []
# self.agentview_images = []
# self.eye_in_hand_images = []
# self.goal_images = []
# self.total_len = 0
# self._idx_to_seg_id = dict()
# self._seg_id_to_start_indices = dict()
# self._seg_id_to_seg_length = dict()
# seg_idx = 0
# for ep_idx in range(self.num_eps):
# try:
# saved_ep_subtasks_seq = subtasks_file["subtasks"][f"ep_subtasks_seq_{ep_idx}"][()]
# except:
# continue
# for (start_idx, end_idx, subtask_label) in saved_ep_subtasks_seq:
# self._seg_id_to_start_indices[seg_idx] = self.total_len
# self._seg_id_to_seg_length[seg_idx] = end_idx - start_idx + 1
# goal_image_name = data_file[f"data/ep_{ep_idx}/agentview_image_names"][()][-1]
# agentview_image_names = data_file[f"data/ep_{ep_idx}/agentview_image_names"][()][start_idx:end_idx+1]
# eye_in_hand_image_names = data_file[f"data/ep_{ep_idx}/eye_in_hand_image_names"][()][start_idx:end_idx+1]
# for j in range(len(agentview_image_names)):
# self.agentview_images.append(torch.from_numpy(np.array(Image.open(agentview_image_names[j])).transpose(2, 0, 1)))
# self.eye_in_hand_images.append(torch.from_numpy(np.array(Image.open(eye_in_hand_image_names[j])).transpose(2, 0, 1)))
# self.goal_images.append(torch.from_numpy(np.array(Image.open(goal_image_name)).transpose(2, 0, 1)))
# self.subtask_labels.append(subtask_label)
# self._idx_to_seg_id[self.total_len] = seg_idx
# self.total_len += 1
# seg_idx += 1
# self.agentview_images =safe_cuda(torch.stack(self.agentview_images, dim=0))
# self.eye_in_hand_images = safe_cuda(torch.stack(self.eye_in_hand_images, dim=0))
# self.goal_images = safe_cuda(torch.stack(self.goal_images, dim=0))
# assert(self.total_len == len(self.subtask_labels))
# self.subtask_labels = safe_cuda(torch.from_numpy(np.array(self.subtask_labels)))
# # print(self.agentview_images.shape)
# print("Subtask: ", self.subtask_labels.shape)
# data_file.close()
# embedding_file.close()
# subtasks_file.close()
# def __len__(self):
# return self.total_len
# def __getitem__(self, idx):
# seg_id = self._idx_to_seg_id[idx]
# seg_start_index = self._seg_id_to_start_indices[seg_id]
# seg_length = self._seg_id_to_seg_length[seg_id]
# index_in_seg = idx - seg_start_index
# end_index_in_seg = seg_length
# seq_begin_index = max(0, index_in_seg)
# seq_end_index = min(seg_length, index_in_seg + self.seq_length)
# padding = max(0, seq_begin_index + self.seq_length - seg_length)
# seq_begin_index += seg_start_index
# seq_end_index += seg_start_index
# agentview_seq = self.agentview_images[seq_begin_index:seq_end_index]
# eye_in_hand_seq = self.eye_in_hand_images[seq_begin_index:seq_end_index]
# goal_seq = self.goal_images[seq_begin_index:seq_end_index]
# subtask_label_seq = self.subtask_labels[seq_begin_index:seq_end_index]
# if padding > 0:
# agentview_end_pad = torch.repeat_interleave(agentview_seq[-1].unsqueeze(0), padding, dim=0)
# agentview_seq = torch.cat([agentview_seq] + [agentview_end_pad], dim=0)
# eye_in_hand_end_pad = torch.repeat_interleave(eye_in_hand_seq[-1].unsqueeze(0), padding, dim=0)
# eye_in_hand_seq = torch.cat([eye_in_hand_seq] + [eye_in_hand_end_pad], dim=0)
# goal_end_pad = torch.repeat_interleave(goal_seq[-1].unsqueeze(0), padding, dim=0)
# goal_seq = torch.cat([goal_seq] + [goal_end_pad], dim=0)
# subtask_label_end_pad = torch.repeat_interleave(subtask_label_seq[-1].unsqueeze(0), padding, dim=0)
# subtask_label_seq = torch.cat([subtask_label_seq] + [subtask_label_end_pad], dim=0)
# if self.use_eye_in_hand:
# state_seq = torch.cat((agentview_seq, eye_in_hand_seq), dim=1).float() / 255.
# else:
# state_seq = agentview_seq.float() / 255.
# goal_seq = goal_seq.float() / 255.
# if self.transform is not None:
# goal_seq = self.transform(goal_seq)
# return {"state": state_seq,
# "goal": goal_seq}, subtask_label_seq
# File: sdk/python/pulumi_sakuracloud/cdrom.py (sacloud/pulumi-sakuracloud, ECL-2.0/Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['CDROMArgs', 'CDROM']
@pulumi.input_type
class CDROMArgs:
def __init__(__self__, *,
content: Optional[pulumi.Input[str]] = None,
content_file_name: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
hash: Optional[pulumi.Input[str]] = None,
icon_id: Optional[pulumi.Input[str]] = None,
iso_image_file: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
size: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
zone: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a CDROM resource.
:param pulumi.Input[str] content: The content to upload to as the CD-ROM. This conflicts with [`iso_image_file`].
:param pulumi.Input[str] content_file_name: The name of content file to upload to as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default:`config`.
:param pulumi.Input[str] description: The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
:param pulumi.Input[str] hash: The md5 checksum calculated from the base64 encoded file body.
:param pulumi.Input[str] icon_id: The icon id to attach to the CD-ROM.
:param pulumi.Input[str] iso_image_file: The file path to upload to as the CD-ROM. This conflicts with [`content`].
:param pulumi.Input[str] name: The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
:param pulumi.Input[int] size: The size of CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default:`5`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Any tags to assign to the CD-ROM.
:param pulumi.Input[str] zone: The name of zone that the CD-ROM will be created. (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
if content is not None:
pulumi.set(__self__, "content", content)
if content_file_name is not None:
pulumi.set(__self__, "content_file_name", content_file_name)
if description is not None:
pulumi.set(__self__, "description", description)
if hash is not None:
pulumi.set(__self__, "hash", hash)
if icon_id is not None:
pulumi.set(__self__, "icon_id", icon_id)
if iso_image_file is not None:
pulumi.set(__self__, "iso_image_file", iso_image_file)
if name is not None:
pulumi.set(__self__, "name", name)
if size is not None:
pulumi.set(__self__, "size", size)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if zone is not None:
pulumi.set(__self__, "zone", zone)
@property
@pulumi.getter
def content(self) -> Optional[pulumi.Input[str]]:
"""
The content to upload to as the CD-ROM. This conflicts with [`iso_image_file`].
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content", value)
@property
@pulumi.getter(name="contentFileName")
def content_file_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of content file to upload to as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default:`config`.
"""
return pulumi.get(self, "content_file_name")
@content_file_name.setter
def content_file_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content_file_name", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def hash(self) -> Optional[pulumi.Input[str]]:
"""
The md5 checksum calculated from the base64 encoded file body.
"""
return pulumi.get(self, "hash")
@hash.setter
def hash(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "hash", value)
@property
@pulumi.getter(name="iconId")
def icon_id(self) -> Optional[pulumi.Input[str]]:
"""
The icon id to attach to the CD-ROM.
"""
return pulumi.get(self, "icon_id")
@icon_id.setter
def icon_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "icon_id", value)
@property
@pulumi.getter(name="isoImageFile")
def iso_image_file(self) -> Optional[pulumi.Input[str]]:
"""
The file path to upload to as the CD-ROM. This conflicts with [`content`].
"""
return pulumi.get(self, "iso_image_file")
@iso_image_file.setter
def iso_image_file(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "iso_image_file", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def size(self) -> Optional[pulumi.Input[int]]:
"""
The size of CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default:`5`.
"""
return pulumi.get(self, "size")
@size.setter
def size(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "size", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Any tags to assign to the CD-ROM.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter
def zone(self) -> Optional[pulumi.Input[str]]:
"""
The name of zone that the CD-ROM will be created. (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
return pulumi.get(self, "zone")
@zone.setter
def zone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "zone", value)
@pulumi.input_type
class _CDROMState:
def __init__(__self__, *,
content: Optional[pulumi.Input[str]] = None,
content_file_name: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
hash: Optional[pulumi.Input[str]] = None,
icon_id: Optional[pulumi.Input[str]] = None,
iso_image_file: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
size: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
zone: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering CDROM resources.
:param pulumi.Input[str] content: The content to upload to as the CD-ROM. This conflicts with [`iso_image_file`].
:param pulumi.Input[str] content_file_name: The name of content file to upload to as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default:`config`.
:param pulumi.Input[str] description: The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
:param pulumi.Input[str] hash: The md5 checksum calculated from the base64 encoded file body.
:param pulumi.Input[str] icon_id: The icon id to attach to the CD-ROM.
:param pulumi.Input[str] iso_image_file: The file path to upload to as the CD-ROM. This conflicts with [`content`].
:param pulumi.Input[str] name: The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
:param pulumi.Input[int] size: The size of CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default:`5`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Any tags to assign to the CD-ROM.
:param pulumi.Input[str] zone: The name of zone that the CD-ROM will be created. (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
if content is not None:
pulumi.set(__self__, "content", content)
if content_file_name is not None:
pulumi.set(__self__, "content_file_name", content_file_name)
if description is not None:
pulumi.set(__self__, "description", description)
if hash is not None:
pulumi.set(__self__, "hash", hash)
if icon_id is not None:
pulumi.set(__self__, "icon_id", icon_id)
if iso_image_file is not None:
pulumi.set(__self__, "iso_image_file", iso_image_file)
if name is not None:
pulumi.set(__self__, "name", name)
if size is not None:
pulumi.set(__self__, "size", size)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if zone is not None:
pulumi.set(__self__, "zone", zone)
@property
@pulumi.getter
def content(self) -> Optional[pulumi.Input[str]]:
"""
The content to upload to as the CD-ROM. This conflicts with [`iso_image_file`].
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content", value)
@property
@pulumi.getter(name="contentFileName")
def content_file_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of content file to upload to as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default:`config`.
"""
return pulumi.get(self, "content_file_name")
@content_file_name.setter
def content_file_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content_file_name", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def hash(self) -> Optional[pulumi.Input[str]]:
"""
The md5 checksum calculated from the base64 encoded file body.
"""
return pulumi.get(self, "hash")
@hash.setter
def hash(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "hash", value)
@property
@pulumi.getter(name="iconId")
def icon_id(self) -> Optional[pulumi.Input[str]]:
"""
The icon id to attach to the CD-ROM.
"""
return pulumi.get(self, "icon_id")
@icon_id.setter
def icon_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "icon_id", value)
@property
@pulumi.getter(name="isoImageFile")
def iso_image_file(self) -> Optional[pulumi.Input[str]]:
"""
The file path to upload to as the CD-ROM. This conflicts with [`content`].
"""
return pulumi.get(self, "iso_image_file")
@iso_image_file.setter
def iso_image_file(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "iso_image_file", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def size(self) -> Optional[pulumi.Input[int]]:
"""
The size of CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default:`5`.
"""
return pulumi.get(self, "size")
@size.setter
def size(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "size", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Any tags to assign to the CD-ROM.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter
def zone(self) -> Optional[pulumi.Input[str]]:
"""
The name of zone that the CD-ROM will be created. (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
return pulumi.get(self, "zone")
@zone.setter
def zone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "zone", value)
class CDROM(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
content: Optional[pulumi.Input[str]] = None,
content_file_name: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
hash: Optional[pulumi.Input[str]] = None,
icon_id: Optional[pulumi.Input[str]] = None,
iso_image_file: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
size: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
zone: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages a SakuraCloud CD-ROM.
## Example Usage
```python
import pulumi
import pulumi_sakuracloud as sakuracloud
foobar = sakuracloud.CDROM("foobar",
description="description",
iso_image_file="example.iso",
size=5,
tags=[
"tag1",
"tag2",
])
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] content: The content to upload to as the CD-ROM. This conflicts with [`iso_image_file`].
:param pulumi.Input[str] content_file_name: The name of content file to upload to as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default:`config`.
:param pulumi.Input[str] description: The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
:param pulumi.Input[str] hash: The md5 checksum calculated from the base64 encoded file body.
:param pulumi.Input[str] icon_id: The icon id to attach to the CD-ROM.
:param pulumi.Input[str] iso_image_file: The file path to upload to as the CD-ROM. This conflicts with [`content`].
:param pulumi.Input[str] name: The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
:param pulumi.Input[int] size: The size of CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default:`5`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Any tags to assign to the CD-ROM.
:param pulumi.Input[str] zone: The name of zone that the CD-ROM will be created. (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[CDROMArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a SakuraCloud CD-ROM.
## Example Usage
```python
import pulumi
import pulumi_sakuracloud as sakuracloud
foobar = sakuracloud.CDROM("foobar",
description="description",
iso_image_file="example.iso",
size=5,
tags=[
"tag1",
"tag2",
])
```
:param str resource_name: The name of the resource.
:param CDROMArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(CDROMArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
content: Optional[pulumi.Input[str]] = None,
content_file_name: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
hash: Optional[pulumi.Input[str]] = None,
icon_id: Optional[pulumi.Input[str]] = None,
iso_image_file: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
size: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
zone: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = CDROMArgs.__new__(CDROMArgs)
__props__.__dict__["content"] = content
__props__.__dict__["content_file_name"] = content_file_name
__props__.__dict__["description"] = description
__props__.__dict__["hash"] = hash
__props__.__dict__["icon_id"] = icon_id
__props__.__dict__["iso_image_file"] = iso_image_file
__props__.__dict__["name"] = name
__props__.__dict__["size"] = size
__props__.__dict__["tags"] = tags
__props__.__dict__["zone"] = zone
super(CDROM, __self__).__init__(
'sakuracloud:index/cDROM:CDROM',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
content: Optional[pulumi.Input[str]] = None,
content_file_name: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
hash: Optional[pulumi.Input[str]] = None,
icon_id: Optional[pulumi.Input[str]] = None,
iso_image_file: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
size: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
zone: Optional[pulumi.Input[str]] = None) -> 'CDROM':
"""
Get an existing CDROM resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] content: The content to upload to as the CD-ROM. This conflicts with [`iso_image_file`].
:param pulumi.Input[str] content_file_name: The name of content file to upload to as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default:`config`.
:param pulumi.Input[str] description: The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
:param pulumi.Input[str] hash: The md5 checksum calculated from the base64 encoded file body.
:param pulumi.Input[str] icon_id: The icon id to attach to the CD-ROM.
:param pulumi.Input[str] iso_image_file: The file path to upload to as the CD-ROM. This conflicts with [`content`].
:param pulumi.Input[str] name: The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
:param pulumi.Input[int] size: The size of CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default:`5`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Any tags to assign to the CD-ROM.
:param pulumi.Input[str] zone: The name of zone that the CD-ROM will be created. (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _CDROMState.__new__(_CDROMState)
__props__.__dict__["content"] = content
__props__.__dict__["content_file_name"] = content_file_name
__props__.__dict__["description"] = description
__props__.__dict__["hash"] = hash
__props__.__dict__["icon_id"] = icon_id
__props__.__dict__["iso_image_file"] = iso_image_file
__props__.__dict__["name"] = name
__props__.__dict__["size"] = size
__props__.__dict__["tags"] = tags
__props__.__dict__["zone"] = zone
return CDROM(resource_name, opts=opts, __props__=__props__)
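The body above rebuilds a `_CDROMState` by allocating it with `__new__` and writing into `__dict__` directly, so no initializer ever runs. A minimal stand-alone sketch of that rehydration pattern (the `_MockState` class is hypothetical, not part of this SDK):

```python
class _MockState:
    """Stand-in for a generated state class whose __init__ is never called."""

    def __init__(self):
        raise RuntimeError("state objects are rehydrated, not constructed")


# Allocate without running __init__, then populate the instance dict directly,
# mirroring the __props__.__dict__[...] assignments in get() above.
state = _MockState.__new__(_MockState)
state.__dict__["size"] = 5
state.__dict__["zone"] = "is1a"

print(state.size, state.zone)
```

Because `__new__` only allocates, the guard in `__init__` is never triggered; the attributes exist solely because they were placed in the instance dict.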
@property
@pulumi.getter
def content(self) -> pulumi.Output[Optional[str]]:
"""
        The content to upload as the CD-ROM. This conflicts with [`iso_image_file`].
"""
return pulumi.get(self, "content")
@property
@pulumi.getter(name="contentFileName")
def content_file_name(self) -> pulumi.Output[Optional[str]]:
"""
        The name of the content file to upload as the CD-ROM. This is only used when `content` is specified. This conflicts with [`iso_image_file`]. Default: `config`.
"""
return pulumi.get(self, "content_file_name")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
The description of the CD-ROM. The length of this value must be in the range [`1`-`512`].
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def hash(self) -> pulumi.Output[str]:
"""
        The MD5 checksum calculated from the base64-encoded file body.
"""
return pulumi.get(self, "hash")
@property
@pulumi.getter(name="iconId")
def icon_id(self) -> pulumi.Output[Optional[str]]:
"""
        The icon ID to attach to the CD-ROM.
"""
return pulumi.get(self, "icon_id")
@property
@pulumi.getter(name="isoImageFile")
def iso_image_file(self) -> pulumi.Output[Optional[str]]:
"""
        The path of the file to upload as the CD-ROM. This conflicts with [`content`].
"""
return pulumi.get(self, "iso_image_file")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name of the CD-ROM. The length of this value must be in the range [`1`-`64`].
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def size(self) -> pulumi.Output[Optional[int]]:
"""
        The size of the CD-ROM in GiB. This must be one of [`5`/`10`]. Changing this forces a new resource to be created. Default: `5`.
"""
return pulumi.get(self, "size")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
Any tags to assign to the CD-ROM.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter
def zone(self) -> pulumi.Output[str]:
"""
        The name of the zone in which the CD-ROM will be created (e.g. `is1a`, `tk1a`). Changing this forces a new resource to be created.
"""
return pulumi.get(self, "zone")
# Copyright 1999-2018 Alibaba Group Holding Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import itertools
import unittest
import numpy as np
try:
import pandas as pd
except ImportError: # pragma: no cover
pd = None
from mars.dataframe.core import IndexValue
from mars.dataframe.operands import ObjectType
from mars.dataframe.utils import hash_dtypes
from mars.dataframe.utils import split_monotonic_index_min_max, \
build_split_idx_to_origin_idx, filter_index_value
from mars.dataframe.datasource.dataframe import from_pandas, DataFrameDataSource
from mars.dataframe.datasource.series import from_pandas as from_pandas_series, SeriesDataSource
from mars.dataframe.arithmetic import abs, add, \
DataFrameAbs, DataFrameAdd
from mars.dataframe.align import DataFrameIndexAlignMap, \
DataFrameIndexAlignReduce, DataFrameShuffleProxy
from mars.tests.core import TestBase
@unittest.skipIf(pd is None, 'pandas not installed')
class Test(TestBase):
def testAddWithoutShuffle(self):
# all the axes are monotonic
# data1 with index split into [0...4], [5...9],
# columns [3...7], [8...12]
data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
columns=np.arange(3, 13))
df1 = from_pandas(data1, chunk_size=5)
        # data2 with index split into [6...11], [2...5],
# columns [4...9], [10, 13]
data2 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(11, 1, -1),
columns=np.arange(4, 14))
df2 = from_pandas(data2, chunk_size=6)
df3 = add(df1, df2)
# test df3's index and columns
pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
self.assertTrue(df3.columns.should_be_monotonic)
self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
self.assertTrue(df3.index_value.should_be_monotonic)
pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
self.assertNotEqual(df3.index_value.key, df1.index_value.key)
self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 11)  # columns are recorded, so we can get the width
df3.tiles()
# test df3's index and columns after tiling
pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
self.assertTrue(df3.columns.should_be_monotonic)
self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
self.assertTrue(df3.index_value.should_be_monotonic)
pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
self.assertNotEqual(df3.index_value.key, df1.index_value.key)
self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 11)  # columns are recorded, so we can get the width
data1_index_min_max = [(0, True, 4, True), (5, True, 9, True)]
data1_columns_min_max = [[3, True, 7, True], [8, True, 12, True]]
data2_index_min_max = [(2, True, 5, True), (6, True, 11, True)]
data2_columns_min_max = [(4, True, 9, True), (10, True, 13, True)]
left_index_splits, right_index_splits = split_monotonic_index_min_max(
data1_index_min_max, True, data2_index_min_max, False)
left_columns_splits, right_columns_splits = split_monotonic_index_min_max(
data1_columns_min_max, True, data2_columns_min_max, True)
left_index_idx_to_original_idx = build_split_idx_to_origin_idx(left_index_splits)
right_index_idx_to_original_idx = build_split_idx_to_origin_idx(right_index_splits, False)
left_columns_idx_to_original_idx = build_split_idx_to_origin_idx(left_columns_splits)
right_columns_idx_to_original_idx = build_split_idx_to_origin_idx(right_columns_splits)
self.assertEqual(df3.chunk_shape, (7, 7))
for c in df3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
# test shape
idx = c.index
# test the left side
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignMap)
left_row_idx, left_row_inner_idx = left_index_idx_to_original_idx[idx[0]]
left_col_idx, left_col_inner_idx = left_columns_idx_to_original_idx[idx[1]]
expect_df1_input = df1.cix[left_row_idx, left_col_idx].data
self.assertIs(c.inputs[0].inputs[0], expect_df1_input)
left_index_min_max = left_index_splits[left_row_idx][left_row_inner_idx]
self.assertEqual(c.inputs[0].op.index_min, left_index_min_max[0])
self.assertEqual(c.inputs[0].op.index_min_close, left_index_min_max[1])
self.assertEqual(c.inputs[0].op.index_max, left_index_min_max[2])
self.assertEqual(c.inputs[0].op.index_max_close, left_index_min_max[3])
self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
left_column_min_max = left_columns_splits[left_col_idx][left_col_inner_idx]
self.assertEqual(c.inputs[0].op.column_min, left_column_min_max[0])
self.assertEqual(c.inputs[0].op.column_min_close, left_column_min_max[1])
self.assertEqual(c.inputs[0].op.column_max, left_column_min_max[2])
self.assertEqual(c.inputs[0].op.column_max_close, left_column_min_max[3])
expect_left_columns = filter_index_value(expect_df1_input.columns, left_column_min_max,
store_data=True)
pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), expect_left_columns.to_pandas())
pd.testing.assert_index_equal(c.inputs[0].dtypes.index, expect_left_columns.to_pandas())
# test the right side
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignMap)
right_row_idx, right_row_inner_idx = right_index_idx_to_original_idx[idx[0]]
right_col_idx, right_col_inner_idx = right_columns_idx_to_original_idx[idx[1]]
expect_df2_input = df2.cix[right_row_idx, right_col_idx].data
self.assertIs(c.inputs[1].inputs[0], expect_df2_input)
right_index_min_max = right_index_splits[right_row_idx][right_row_inner_idx]
self.assertEqual(c.inputs[1].op.index_min, right_index_min_max[0])
self.assertEqual(c.inputs[1].op.index_min_close, right_index_min_max[1])
self.assertEqual(c.inputs[1].op.index_max, right_index_min_max[2])
self.assertEqual(c.inputs[1].op.index_max_close, right_index_min_max[3])
self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data2.index))
right_column_min_max = right_columns_splits[right_col_idx][right_col_inner_idx]
self.assertEqual(c.inputs[1].op.column_min, right_column_min_max[0])
self.assertEqual(c.inputs[1].op.column_min_close, right_column_min_max[1])
self.assertEqual(c.inputs[1].op.column_max, right_column_min_max[2])
self.assertEqual(c.inputs[1].op.column_max_close, right_column_min_max[3])
expect_right_columns = filter_index_value(expect_df2_input.columns, left_column_min_max,
store_data=True)
pd.testing.assert_index_equal(c.inputs[1].columns.to_pandas(), expect_right_columns.to_pandas())
pd.testing.assert_index_equal(c.inputs[1].dtypes.index, expect_right_columns.to_pandas())
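The chunk-level alignment verified above mirrors what plain pandas does eagerly: adding two DataFrames aligns both axes on the sorted union of labels, with cells present on only one side becoming NaN. A small illustration of that baseline behaviour (toy data, not the fixtures used in the test):

```python
import numpy as np
import pandas as pd

a = pd.DataFrame(np.ones((3, 3)), index=[0, 1, 2], columns=[3, 4, 5])
b = pd.DataFrame(np.ones((3, 3)), index=[2, 3, 4], columns=[4, 5, 6])

c = a + b
# both axes become the sorted union; only overlapping cells hold values
print(c.shape)      # (5, 4)
print(c.loc[2, 4])  # 2.0
```

The tiled version has to reproduce exactly this result, which is why the chunk index/column min-max splits above are computed against the union of both operands.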
def testAddDataFrameAndSeriesWithAlignMap(self):
data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
columns=np.arange(3, 13))
df1 = from_pandas(data1, chunk_size=5)
s1 = df1[3]
df2 = add(df1, s1)
df2.tiles()
self.assertEqual(df2.shape, (df1.shape[0], np.nan))
self.assertEqual(df2.index_value.key, df1.index_value.key)
data1_columns_min_max = [[3, True, 7, True], [8, True, 12, True]]
data2_index_min_max = [(0, True, 4, True), (5, True, 9, True)]
left_columns_splits, right_index_splits = split_monotonic_index_min_max(
data1_columns_min_max, True, data2_index_min_max, True)
left_columns_idx_to_original_idx = build_split_idx_to_origin_idx(left_columns_splits)
right_index_idx_to_original_idx = build_split_idx_to_origin_idx(right_index_splits)
self.assertEqual(df2.chunk_shape, (2, 7))
for c in df2.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
# test shape
idx = c.index
# test the left side (dataframe)
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignMap)
left_col_idx, left_col_inner_idx = left_columns_idx_to_original_idx[idx[1]]
expect_df1_input = df1.cix[idx[0], left_col_idx].data
self.assertIs(c.inputs[0].inputs[0], expect_df1_input)
left_column_min_max = left_columns_splits[left_col_idx][left_col_inner_idx]
self.assertEqual(c.inputs[0].op.column_min, left_column_min_max[0])
self.assertEqual(c.inputs[0].op.column_min_close, left_column_min_max[1])
self.assertEqual(c.inputs[0].op.column_max, left_column_min_max[2])
self.assertEqual(c.inputs[0].op.column_max_close, left_column_min_max[3])
expect_left_columns = filter_index_value(expect_df1_input.columns, left_column_min_max,
store_data=True)
pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), expect_left_columns.to_pandas())
pd.testing.assert_index_equal(c.inputs[0].dtypes.index, expect_left_columns.to_pandas())
# test the right side (series)
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignMap)
right_row_idx, right_row_inner_idx = right_index_idx_to_original_idx[idx[1]]
expect_s1_input = s1.cix[(right_row_idx,)].data
self.assertIs(c.inputs[1].inputs[0], expect_s1_input)
right_index_min_max = right_index_splits[right_row_idx][right_row_inner_idx]
self.assertEqual(c.inputs[1].op.index_min, right_index_min_max[0])
self.assertEqual(c.inputs[1].op.index_min_close, right_index_min_max[1])
self.assertEqual(c.inputs[1].op.index_max, right_index_min_max[2])
self.assertEqual(c.inputs[1].op.index_max_close, right_index_min_max[3])
self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data1[3].index))
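A detail this test relies on: `dataframe + series` aligns the series *index* against the frame's *columns*, not against its rows. A direct pandas illustration of that rule (toy data for clarity):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame(np.arange(9).reshape(3, 3), columns=[0, 1, 2])
s = pd.Series([10, 20, 30], index=[0, 1, 2])

out = df + s
# s's index labels matched the column labels, so each column got its offset
print(out.iloc[0].tolist())  # [10, 21, 32]
```

This is why the test splits `data1_columns_min_max` against `data2_index_min_max`: the frame's column intervals and the series' index intervals are the two sides being aligned.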
def testAddDataFrameAndSeriesIdentical(self):
data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
columns=np.arange(10))
df1 = from_pandas(data1, chunk_size=5)
s1 = from_pandas_series(data1[3], chunk_size=5)
df2 = add(df1, s1)
df2.tiles()
self.assertEqual(df2.shape, (10, 10))
self.assertEqual(df2.index_value.key, df1.index_value.key)
self.assertEqual(df2.columns.key, df1.columns.key)
self.assertEqual(df2.columns.key, s1.index_value.key)
self.assertEqual(df2.chunk_shape, (2, 2))
for c in df2.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
self.assertEqual(c.shape, (5, 5))
self.assertEqual(c.index_value.key, df1.cix[c.index].index_value.key)
self.assertEqual(c.index_value.key, df2.cix[c.index].index_value.key)
self.assertEqual(c.columns.key, df1.cix[c.index].columns.key)
self.assertEqual(c.columns.key, df2.cix[c.index].columns.key)
pd.testing.assert_index_equal(c.columns.to_pandas(), df1.cix[c.index].columns.to_pandas())
pd.testing.assert_index_equal(c.columns.to_pandas(), df2.cix[c.index].columns.to_pandas())
pd.testing.assert_index_equal(c.dtypes.index, df1.cix[c.index].columns.to_pandas())
# test the left side
self.assertIsInstance(c.inputs[0].op, DataFrameDataSource)
self.assertIs(c.inputs[0], df1.cix[c.index].data)
# test the right side
self.assertIsInstance(c.inputs[1].op, SeriesDataSource)
self.assertIs(c.inputs[1], s1.cix[(c.index[1],)].data)
def testAddDataFrameAndSeriesWithShuffle(self):
data1 = pd.DataFrame(np.random.rand(10, 10),
index=[4, 9, 3, 2, 1, 5, 8, 6, 7, 10],
columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
df1 = from_pandas(data1, chunk_size=5)
s1 = from_pandas_series(data1[10], chunk_size=6)
df2 = add(df1, s1)
# test df2's index and columns
self.assertEqual(df2.shape, (df1.shape[0], np.nan))
self.assertEqual(df2.index_value.key, df1.index_value.key)
pd.testing.assert_index_equal(df2.columns.to_pandas(), pd.Int64Index([]))
self.assertNotEqual(df2.columns.key, df1.columns.key)
self.assertTrue(df2.columns.should_be_monotonic)
df2.tiles()
self.assertEqual(df2.chunk_shape, (2, 2))
for c in df2.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
idx = c.index
# test the left side
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignReduce)
expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 2)[c.index[1]]
for ic in c.inputs[0].inputs[0].inputs])
pd.testing.assert_series_equal(c.inputs[0].dtypes, expect_dtypes)
pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), c.inputs[0].dtypes.index)
pd.testing.assert_index_equal(c.inputs[0].index_value.to_pandas(), c.index_value.to_pandas())
self.assertIsInstance(c.inputs[0].inputs[0].op, DataFrameShuffleProxy)
for j, ci, ic in zip(itertools.count(0), c.inputs[0].inputs[0].inputs, df1.cix[idx[0], :]):
self.assertIsInstance(ci.op, DataFrameIndexAlignMap)
self.assertEqual(ci.index, (idx[0], j))
self.assertTrue(ci.op.column_shuffle_size, 2)
shuffle_segments = ci.op.column_shuffle_segments
expected_shuffle_segments = hash_dtypes(ic.data.dtypes, 2)
self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
pd.testing.assert_series_equal(ss, ess)
self.assertIs(ci.inputs[0], ic.data)
# test the right side
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignReduce)
self.assertEqual(c.inputs[1].op.object_type, ObjectType.series)
self.assertIsInstance(c.inputs[1].inputs[0].op, DataFrameShuffleProxy)
for j, ci, ic in zip(itertools.count(0), c.inputs[1].inputs[0].inputs, s1.chunks):
self.assertIsInstance(ci.op, DataFrameIndexAlignMap)
self.assertEqual(ci.index, (j,))
self.assertTrue(ci.op.index_shuffle_size, 2)
self.assertIs(ci.inputs[0], ic.data)
        # make sure the shuffle proxies' keys are different
proxy_keys = set()
for i in range(df2.chunk_shape[0]):
cs = [c for c in df2.chunks if c.index[0] == i]
lps = {c.inputs[0].inputs[0].op.key for c in cs}
self.assertEqual(len(lps), 1)
proxy_keys.add(lps.pop())
rps = {c.inputs[1].inputs[0].op.key for c in cs}
self.assertEqual(len(rps), 1)
proxy_keys.add(rps.pop())
self.assertEqual(len(proxy_keys), df2.chunk_shape[0] + 1)
def testAddSeriesAndSeriesWithAlignMap(self):
data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
columns=np.arange(3, 13))
df1 = from_pandas(data1, chunk_size=5)
s1 = df1.iloc[4]
s2 = df1[3]
s3 = add(s1, s2)
s3.tiles()
self.assertEqual(s3.shape, (np.nan,))
s1_index_min_max = [[3, True, 7, True], [8, True, 12, True]]
s2_index_min_max = [(0, True, 4, True), (5, True, 9, True)]
left_index_splits, right_index_splits = split_monotonic_index_min_max(
s1_index_min_max, True, s2_index_min_max, True)
left_index_idx_to_original_idx = build_split_idx_to_origin_idx(left_index_splits)
right_index_idx_to_original_idx = build_split_idx_to_origin_idx(right_index_splits)
self.assertEqual(s3.chunk_shape, (7,))
for c in s3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
# test shape
idx = c.index
# test the left side (series)
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignMap)
left_col_idx, left_col_inner_idx = left_index_idx_to_original_idx[idx[0]]
expect_s1_input = s1.cix[(left_col_idx,)].data
self.assertIs(c.inputs[0].inputs[0], expect_s1_input)
left_index_min_max = left_index_splits[left_col_idx][left_col_inner_idx]
self.assertEqual(c.inputs[0].op.index_min, left_index_min_max[0])
self.assertEqual(c.inputs[0].op.index_min_close, left_index_min_max[1])
self.assertEqual(c.inputs[0].op.index_max, left_index_min_max[2])
self.assertEqual(c.inputs[0].op.index_max_close, left_index_min_max[3])
self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.iloc[4].index))
expect_left_index = filter_index_value(expect_s1_input.index_value, left_index_min_max,
store_data=True)
pd.testing.assert_index_equal(c.inputs[0].index_value.to_pandas(), expect_left_index.to_pandas())
# test the right side (series)
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignMap)
right_row_idx, right_row_inner_idx = right_index_idx_to_original_idx[idx[0]]
expect_s2_input = s2.cix[(right_row_idx,)].data
self.assertIs(c.inputs[1].inputs[0], expect_s2_input)
right_index_min_max = right_index_splits[right_row_idx][right_row_inner_idx]
self.assertEqual(c.inputs[1].op.index_min, right_index_min_max[0])
self.assertEqual(c.inputs[1].op.index_min_close, right_index_min_max[1])
self.assertEqual(c.inputs[1].op.index_max, right_index_min_max[2])
self.assertEqual(c.inputs[1].op.index_max_close, right_index_min_max[3])
self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data1[3].index))
expect_right_index = filter_index_value(expect_s2_input.index_value, right_index_min_max,
store_data=True)
pd.testing.assert_index_equal(c.inputs[1].index_value.to_pandas(), expect_right_index.to_pandas())
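The `*_idx_to_original_idx` lookups used throughout map a flat split position back to the chunk it came from. A toy stand-in for that bookkeeping (`split_to_origin` is hypothetical, not Mars's `build_split_idx_to_origin_idx`):

```python
def split_to_origin(splits):
    """Map flat split position -> (original chunk index, position within chunk)."""
    mapping = {}
    flat = 0
    for origin_idx, chunk_splits in enumerate(splits):
        for inner_idx, _ in enumerate(chunk_splits):
            mapping[flat] = (origin_idx, inner_idx)
            flat += 1
    return mapping


m = split_to_origin([[(0, 4)], [(5, 7), (8, 9)]])
print(m)  # {0: (0, 0), 1: (1, 0), 2: (1, 1)}
```

With a mapping like this, a result chunk's index `idx[0]` can be translated into "which input chunk, and which slice of it" exactly as the assertions above do.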
def testAddSeriesAndSeriesIdentical(self):
data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
columns=np.arange(10))
s1 = from_pandas_series(data1[1], chunk_size=5)
s2 = from_pandas_series(data1[3], chunk_size=5)
s3 = add(s1, s2)
s3.tiles()
self.assertEqual(s3.shape, (10,))
self.assertEqual(s3.index_value.key, s1.index_value.key)
self.assertEqual(s3.index_value.key, s2.index_value.key)
self.assertEqual(s3.chunk_shape, (2,))
for c in s3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(c.op.object_type, ObjectType.series)
self.assertEqual(len(c.inputs), 2)
self.assertEqual(c.shape, (5,))
self.assertEqual(c.index_value.key, s1.cix[c.index].index_value.key)
self.assertEqual(c.index_value.key, s2.cix[c.index].index_value.key)
# test the left side
self.assertIsInstance(c.inputs[0].op, SeriesDataSource)
self.assertIs(c.inputs[0], s1.cix[c.index].data)
# test the right side
self.assertIsInstance(c.inputs[1].op, SeriesDataSource)
self.assertIs(c.inputs[1], s2.cix[c.index].data)
def testAddSeriesAndSeriesWithShuffle(self):
data1 = pd.DataFrame(np.random.rand(10, 10),
index=[4, 9, 3, 2, 1, 5, 8, 6, 7, 10],
columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
s1 = from_pandas_series(data1.iloc[4], chunk_size=5)
s2 = from_pandas_series(data1[10], chunk_size=6)
s3 = add(s1, s2)
# test s3's index
self.assertEqual(s3.shape, (np.nan,))
self.assertNotEqual(s3.index_value.key, s1.index_value.key)
self.assertNotEqual(s3.index_value.key, s2.index_value.key)
pd.testing.assert_index_equal(s3.index_value.to_pandas(), pd.Int64Index([]))
self.assertTrue(s3.index_value.should_be_monotonic)
s3.tiles()
self.assertEqual(s3.chunk_shape, (2,))
for c in s3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
# test the left side
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignReduce)
self.assertEqual(c.inputs[0].op.object_type, ObjectType.series)
self.assertIsInstance(c.inputs[0].inputs[0].op, DataFrameShuffleProxy)
for j, ci, ic in zip(itertools.count(0), c.inputs[0].inputs[0].inputs, s1.chunks):
self.assertIsInstance(ci.op, DataFrameIndexAlignMap)
self.assertEqual(ci.index, (j,))
self.assertTrue(ci.op.index_shuffle_size, 2)
self.assertIs(ci.inputs[0], ic.data)
# test the right side
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignReduce)
self.assertEqual(c.inputs[1].op.object_type, ObjectType.series)
self.assertIsInstance(c.inputs[1].inputs[0].op, DataFrameShuffleProxy)
for j, ci, ic in zip(itertools.count(0), c.inputs[1].inputs[0].inputs, s2.chunks):
self.assertIsInstance(ci.op, DataFrameIndexAlignMap)
self.assertEqual(ci.index, (j,))
self.assertTrue(ci.op.index_shuffle_size, 2)
self.assertIs(ci.inputs[0], ic.data)
        # make sure the shuffle proxies' keys are different
proxy_keys = set()
for c in s3.chunks:
proxy_keys.add(c.inputs[0].inputs[0].op.key)
proxy_keys.add(c.inputs[1].inputs[0].op.key)
self.assertEqual(len(proxy_keys), 2)
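Even with non-monotonic indexes, pandas produces a result on the sorted union of labels; the shuffle machinery exercised above reconstructs that same outcome chunk by chunk. For reference, the eager pandas behaviour (toy data):

```python
import pandas as pd

s1 = pd.Series([1.0, 2.0, 3.0], index=[3, 1, 2])
s2 = pd.Series([10.0, 20.0], index=[2, 5])

s3 = s1 + s2
# result index is the sorted union; labels present on one side only are NaN
print(list(s3.index))  # [1, 2, 3, 5]
print(s3.loc[2])       # 13.0
```

Since neither operand's label order is usable for range-based splitting here, hash shuffling is the only way to co-locate matching labels, which is why this test asserts `DataFrameShuffleProxy` inputs on both sides.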
def testAddIdenticalIndexAndColumns(self):
data1 = pd.DataFrame(np.random.rand(10, 10),
columns=np.arange(3, 13))
df1 = from_pandas(data1, chunk_size=5)
data2 = pd.DataFrame(np.random.rand(10, 10),
columns=np.arange(3, 13))
df2 = from_pandas(data2, chunk_size=5)
df3 = add(df1, df2)
# test df3's index and columns
pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
self.assertTrue(df3.columns.should_be_monotonic)
self.assertIsInstance(df3.index_value.value, IndexValue.RangeIndex)
self.assertTrue(df3.index_value.should_be_monotonic)
pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.RangeIndex(0, 10))
self.assertEqual(df3.index_value.key, df1.index_value.key)
self.assertEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape, (10, 10))  # columns are recorded, so we can get the full shape
df3.tiles()
self.assertEqual(df3.chunk_shape, (2, 2))
for c in df3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
self.assertEqual(c.shape, (5, 5))
self.assertEqual(c.index_value.key, df1.cix[c.index].index_value.key)
self.assertEqual(c.index_value.key, df2.cix[c.index].index_value.key)
self.assertEqual(c.columns.key, df1.cix[c.index].columns.key)
self.assertEqual(c.columns.key, df2.cix[c.index].columns.key)
pd.testing.assert_index_equal(c.columns.to_pandas(), df1.cix[c.index].columns.to_pandas())
pd.testing.assert_index_equal(c.columns.to_pandas(), df2.cix[c.index].columns.to_pandas())
pd.testing.assert_index_equal(c.dtypes.index, df1.cix[c.index].columns.to_pandas())
# test the left side
self.assertIs(c.inputs[0], df1.cix[c.index].data)
# test the right side
self.assertIs(c.inputs[1], df2.cix[c.index].data)
def testAddWithOneShuffle(self):
# only 1 axis is monotonic
# data1 with index split into [0...4], [5...9],
data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
df1 = from_pandas(data1, chunk_size=5)
        # data2 with index split into [6...11], [2...5],
data2 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(11, 1, -1),
columns=[5, 9, 12, 3, 11, 10, 6, 4, 1, 2])
df2 = from_pandas(data2, chunk_size=6)
df3 = add(df1, df2)
# test df3's index and columns
pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
self.assertTrue(df3.columns.should_be_monotonic)
self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
self.assertTrue(df3.index_value.should_be_monotonic)
pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
self.assertNotEqual(df3.index_value.key, df1.index_value.key)
self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 12)  # columns are recorded, so we can get the width
df3.tiles()
data1_index_min_max = [(0, True, 4, True), (5, True, 9, True)]
data2_index_min_max = [(2, True, 5, True), (6, True, 11, True)]
left_index_splits, right_index_splits = split_monotonic_index_min_max(
data1_index_min_max, True, data2_index_min_max, False)
left_index_idx_to_original_idx = build_split_idx_to_origin_idx(left_index_splits)
right_index_idx_to_original_idx = build_split_idx_to_origin_idx(right_index_splits, False)
self.assertEqual(df3.chunk_shape, (7, 2))
for c in df3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
idx = c.index
# test the left side
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignReduce)
expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 2)[c.index[1]]
for ic in c.inputs[0].inputs[0].inputs])
pd.testing.assert_series_equal(c.inputs[0].dtypes, expect_dtypes)
pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), c.inputs[0].dtypes.index)
self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
self.assertIsInstance(c.inputs[0].inputs[0].op, DataFrameShuffleProxy)
left_row_idx, left_row_inner_idx = left_index_idx_to_original_idx[idx[0]]
left_index_min_max = left_index_splits[left_row_idx][left_row_inner_idx]
ics = [ic for ic in df1.chunks if ic.index[0] == left_row_idx]
for j, ci, ic in zip(itertools.count(0), c.inputs[0].inputs[0].inputs, ics):
self.assertIsInstance(ci.op, DataFrameIndexAlignMap)
self.assertEqual(ci.index, (idx[0], j))
self.assertEqual(ci.op.index_min, left_index_min_max[0])
self.assertEqual(ci.op.index_min_close, left_index_min_max[1])
self.assertEqual(ci.op.index_max, left_index_min_max[2])
self.assertEqual(ci.op.index_max_close, left_index_min_max[3])
self.assertIsInstance(ci.index_value.to_pandas(), type(data1.index))
self.assertTrue(ci.op.column_shuffle_size, 2)
shuffle_segments = ci.op.column_shuffle_segments
expected_shuffle_segments = hash_dtypes(ic.data.dtypes, 2)
self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
pd.testing.assert_series_equal(ss, ess)
self.assertIs(ci.inputs[0], ic.data)
# test the right side
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignReduce)
expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 2)[c.index[1]]
for ic in c.inputs[1].inputs[0].inputs])
pd.testing.assert_series_equal(c.inputs[1].dtypes, expect_dtypes)
pd.testing.assert_index_equal(c.inputs[1].columns.to_pandas(), c.inputs[1].dtypes.index)
self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data1.index))
self.assertIsInstance(c.inputs[1].inputs[0].op, DataFrameShuffleProxy)
right_row_idx, right_row_inner_idx = right_index_idx_to_original_idx[idx[0]]
right_index_min_max = right_index_splits[right_row_idx][right_row_inner_idx]
ics = [ic for ic in df2.chunks if ic.index[0] == right_row_idx]
for j, ci, ic in zip(itertools.count(0), c.inputs[1].inputs[0].inputs, ics):
self.assertIsInstance(ci.op, DataFrameIndexAlignMap)
self.assertEqual(ci.index, (idx[0], j))
self.assertEqual(ci.op.index_min, right_index_min_max[0])
self.assertEqual(ci.op.index_min_close, right_index_min_max[1])
self.assertEqual(ci.op.index_max, right_index_min_max[2])
self.assertEqual(ci.op.index_max_close, right_index_min_max[3])
self.assertTrue(ci.op.column_shuffle_size, 2)
shuffle_segments = ci.op.column_shuffle_segments
expected_shuffle_segments = hash_dtypes(ic.data.dtypes, 2)
self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
pd.testing.assert_series_equal(ss, ess)
self.assertIs(ci.inputs[0], ic.data)
        # make sure the shuffle proxies' keys are different
proxy_keys = set()
for i in range(df3.chunk_shape[0]):
cs = [c for c in df3.chunks if c.index[0] == i]
lps = {c.inputs[0].inputs[0].op.key for c in cs}
self.assertEqual(len(lps), 1)
proxy_keys.add(lps.pop())
rps = {c.inputs[1].inputs[0].op.key for c in cs}
self.assertEqual(len(rps), 1)
proxy_keys.add(rps.pop())
self.assertEqual(len(proxy_keys), 2 * df3.chunk_shape[0])
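The `column_shuffle_segments` compared above come from hashing column labels into a fixed number of buckets, so that both operands route equal labels to the same reducer. A toy stand-in for that idea (`hash_split` is hypothetical, not Mars's `hash_dtypes`):

```python
def hash_split(labels, n_buckets):
    """Assign each label to a bucket by hash; a partition, so no label is lost."""
    buckets = [[] for _ in range(n_buckets)]
    for label in labels:
        buckets[hash(label) % n_buckets].append(label)
    return buckets


parts = hash_split([4, 1, 3, 2, 10, 5, 9, 8, 6, 7], 2)
# every label lands in exactly one bucket
print(sorted(parts[0] + parts[1]))  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

Because the bucket assignment depends only on the label, the same column on the left and right operand always meets in the same reduce chunk, regardless of original chunk layout.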
def testAddWithAllShuffle(self):
# no axis is monotonic
data1 = pd.DataFrame(np.random.rand(10, 10), index=[0, 10, 2, 3, 4, 5, 6, 7, 8, 9],
columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
df1 = from_pandas(data1, chunk_size=5)
data2 = pd.DataFrame(np.random.rand(10, 10), index=[11, 1, 2, 5, 7, 6, 8, 9, 10, 3],
columns=[5, 9, 12, 3, 11, 10, 6, 4, 1, 2])
df2 = from_pandas(data2, chunk_size=6)
df3 = add(df1, df2)
# test df3's index and columns
pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
self.assertTrue(df3.columns.should_be_monotonic)
self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
self.assertTrue(df3.index_value.should_be_monotonic)
pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
self.assertNotEqual(df3.index_value.key, df1.index_value.key)
self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 12)  # columns are recorded, so we can get the width
df3.tiles()
self.assertEqual(df3.chunk_shape, (2, 2))
proxy_keys = set()
for c in df3.chunks:
self.assertIsInstance(c.op, DataFrameAdd)
self.assertEqual(len(c.inputs), 2)
# test left side
self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignReduce)
expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 2)[c.index[1]]
for ic in c.inputs[0].inputs[0].inputs if ic.index[0] == 0])
pd.testing.assert_series_equal(c.inputs[0].dtypes, expect_dtypes)
pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), c.inputs[0].dtypes.index)
self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
self.assertIsInstance(c.inputs[0].inputs[0].op, DataFrameShuffleProxy)
proxy_keys.add(c.inputs[0].inputs[0].op.key)
for ic, ci in zip(c.inputs[0].inputs[0].inputs, df1.chunks):
self.assertIsInstance(ic.op, DataFrameIndexAlignMap)
self.assertEqual(ic.op.index_shuffle_size, 2)
self.assertIsInstance(ic.index_value.to_pandas(), type(data1.index))
self.assertEqual(ic.op.column_shuffle_size, 2)
self.assertIsNotNone(ic.columns)
shuffle_segments = ic.op.column_shuffle_segments
expected_shuffle_segments = hash_dtypes(ci.data.dtypes, 2)
self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
pd.testing.assert_series_equal(ss, ess)
self.assertIs(ic.inputs[0], ci.data)
# test right side
self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignReduce)
expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 2)[c.index[1]]
for ic in c.inputs[1].inputs[0].inputs if ic.index[0] == 0])
pd.testing.assert_series_equal(c.inputs[1].dtypes, expect_dtypes)
pd.testing.assert_index_equal(c.inputs[1].columns.to_pandas(), c.inputs[1].dtypes.index)
self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
self.assertIsInstance(c.inputs[1].inputs[0].op, DataFrameShuffleProxy)
proxy_keys.add(c.inputs[1].inputs[0].op.key)
for ic, ci in zip(c.inputs[1].inputs[0].inputs, df2.chunks):
self.assertIsInstance(ic.op, DataFrameIndexAlignMap)
self.assertEqual(ic.op.index_shuffle_size, 2)
self.assertIsInstance(ic.index_value.to_pandas(), type(data1.index))
self.assertEqual(ic.op.column_shuffle_size, 2)
self.assertIsNotNone(ic.columns)
shuffle_segments = ic.op.column_shuffle_segments
expected_shuffle_segments = hash_dtypes(ci.data.dtypes, 2)
self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
pd.testing.assert_series_equal(ss, ess)
self.assertIs(ic.inputs[0], ci.data)
self.assertEqual(len(proxy_keys), 2)

        data4 = pd.DataFrame(np.random.rand(10, 10), index=np.random.randint(-100, 100, size=(10,)),
                             columns=[np.random.bytes(10) for _ in range(10)])
        df4 = from_pandas(data4, chunk_size=3)

        data5 = pd.DataFrame(np.random.rand(10, 10), index=np.random.randint(-100, 100, size=(10,)),
                             columns=[np.random.bytes(10) for _ in range(10)])
        df5 = from_pandas(data5, chunk_size=3)

        df6 = add(df4, df5)

        # test df6's index and columns
        pd.testing.assert_index_equal(df6.columns.to_pandas(), (data4 + data5).columns)
        self.assertTrue(df6.columns.should_be_monotonic)
        self.assertIsInstance(df6.index_value.value, IndexValue.Int64Index)
        self.assertTrue(df6.index_value.should_be_monotonic)
        pd.testing.assert_index_equal(df6.index_value.to_pandas(), pd.Int64Index([]))
        self.assertNotEqual(df6.index_value.key, df4.index_value.key)
        self.assertNotEqual(df6.index_value.key, df5.index_value.key)
        self.assertEqual(df6.shape[1], 20)  # columns is recorded, so we can get it

        df6.tiles()

        self.assertEqual(df6.chunk_shape, (4, 4))
        proxy_keys = set()
        for c in df6.chunks:
            self.assertIsInstance(c.op, DataFrameAdd)
            self.assertEqual(len(c.inputs), 2)
            # test left side
            self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignReduce)
            expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 4)[c.index[1]]
                                       for ic in c.inputs[0].inputs[0].inputs if ic.index[0] == 0])
            pd.testing.assert_series_equal(c.inputs[0].dtypes, expect_dtypes)
            pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), c.inputs[0].dtypes.index)
            self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
            self.assertIsInstance(c.inputs[0].inputs[0].op, DataFrameShuffleProxy)
            proxy_keys.add(c.inputs[0].inputs[0].op.key)
            for ic, ci in zip(c.inputs[0].inputs[0].inputs, df4.chunks):
                self.assertIsInstance(ic.op, DataFrameIndexAlignMap)
                self.assertEqual(ic.op.index_shuffle_size, 4)
                self.assertIsInstance(ic.index_value.to_pandas(), type(data1.index))
                self.assertEqual(ic.op.column_shuffle_size, 4)
                self.assertIsNotNone(ic.columns)
                shuffle_segments = ic.op.column_shuffle_segments
                expected_shuffle_segments = hash_dtypes(ci.data.dtypes, 4)
                self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
                for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
                    pd.testing.assert_series_equal(ss, ess)
                self.assertIs(ic.inputs[0], ci.data)
            # test right side
            self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignReduce)
            expect_dtypes = pd.concat([hash_dtypes(ic.inputs[0].op.data.dtypes, 4)[c.index[1]]
                                       for ic in c.inputs[1].inputs[0].inputs if ic.index[0] == 0])
            pd.testing.assert_series_equal(c.inputs[1].dtypes, expect_dtypes)
            pd.testing.assert_index_equal(c.inputs[1].columns.to_pandas(), c.inputs[1].dtypes.index)
            self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data1.index))
            self.assertIsInstance(c.inputs[1].inputs[0].op, DataFrameShuffleProxy)
            proxy_keys.add(c.inputs[1].inputs[0].op.key)
            for ic, ci in zip(c.inputs[1].inputs[0].inputs, df5.chunks):
                self.assertIsInstance(ic.op, DataFrameIndexAlignMap)
                self.assertEqual(ic.op.index_shuffle_size, 4)
                self.assertIsInstance(ic.index_value.to_pandas(), type(data1.index))
                self.assertEqual(ic.op.column_shuffle_size, 4)
                self.assertIsNotNone(ic.columns)
                shuffle_segments = ic.op.column_shuffle_segments
                expected_shuffle_segments = hash_dtypes(ci.data.dtypes, 4)
                self.assertEqual(len(shuffle_segments), len(expected_shuffle_segments))
                for ss, ess in zip(shuffle_segments, expected_shuffle_segments):
                    pd.testing.assert_series_equal(ss, ess)
                self.assertIs(ic.inputs[0], ci.data)

        self.assertEqual(len(proxy_keys), 2)

    def testWithoutShuffleAndWithOneChunk(self):
        # only 1 axis is monotonic
        # data1 with index split into [0...4], [5...9],
        data1 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
                             columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
        df1 = from_pandas(data1, chunk_size=(5, 10))
        # data2 with index split into [6...11], [2, 5],
        data2 = pd.DataFrame(np.random.rand(10, 10), index=np.arange(11, 1, -1),
                             columns=[5, 9, 12, 3, 11, 10, 6, 4, 1, 2])
        df2 = from_pandas(data2, chunk_size=(6, 10))

        df3 = add(df1, df2)

        # test df3's index and columns
        pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
        self.assertTrue(df3.columns.should_be_monotonic)
        self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
        self.assertTrue(df3.index_value.should_be_monotonic)
        pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
        self.assertNotEqual(df3.index_value.key, df1.index_value.key)
        self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 12)  # columns is recorded, so we can get it

        df3.tiles()

        data1_index_min_max = [(0, True, 4, True), (5, True, 9, True)]
        data2_index_min_max = [(2, True, 5, True), (6, True, 11, True)]

        left_index_splits, right_index_splits = split_monotonic_index_min_max(
            data1_index_min_max, True, data2_index_min_max, False)

        left_index_idx_to_original_idx = build_split_idx_to_origin_idx(left_index_splits)
        right_index_idx_to_original_idx = build_split_idx_to_origin_idx(right_index_splits, False)

        self.assertEqual(df3.chunk_shape, (7, 1))
        for c in df3.chunks:
            self.assertIsInstance(c.op, DataFrameAdd)
            self.assertEqual(len(c.inputs), 2)
            # test shape
            idx = c.index
            # test the left side
            self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignMap)
            left_row_idx, left_row_inner_idx = left_index_idx_to_original_idx[idx[0]]
            expect_df1_input = df1.cix[left_row_idx, 0].data
            self.assertIs(c.inputs[0].inputs[0], expect_df1_input)
            left_index_min_max = left_index_splits[left_row_idx][left_row_inner_idx]
            self.assertEqual(c.inputs[0].op.index_min, left_index_min_max[0])
            self.assertEqual(c.inputs[0].op.index_min_close, left_index_min_max[1])
            self.assertEqual(c.inputs[0].op.index_max, left_index_min_max[2])
            self.assertEqual(c.inputs[0].op.index_max_close, left_index_min_max[3])
            self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
            self.assertEqual(c.inputs[0].op.column_min, expect_df1_input.columns.min_val)
            self.assertEqual(c.inputs[0].op.column_min_close, expect_df1_input.columns.min_val_close)
            self.assertEqual(c.inputs[0].op.column_max, expect_df1_input.columns.max_val)
            self.assertEqual(c.inputs[0].op.column_max_close, expect_df1_input.columns.max_val_close)
            expect_left_columns = expect_df1_input.columns
            pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), expect_left_columns.to_pandas())
            pd.testing.assert_index_equal(c.inputs[0].dtypes.index, expect_left_columns.to_pandas())
            # test the right side
            self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignMap)
            right_row_idx, right_row_inner_idx = right_index_idx_to_original_idx[idx[0]]
            expect_df2_input = df2.cix[right_row_idx, 0].data
            self.assertIs(c.inputs[1].inputs[0], expect_df2_input)
            right_index_min_max = right_index_splits[right_row_idx][right_row_inner_idx]
            self.assertEqual(c.inputs[1].op.index_min, right_index_min_max[0])
            self.assertEqual(c.inputs[1].op.index_min_close, right_index_min_max[1])
            self.assertEqual(c.inputs[1].op.index_max, right_index_min_max[2])
            self.assertEqual(c.inputs[1].op.index_max_close, right_index_min_max[3])
            self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data2.index))
            self.assertEqual(c.inputs[1].op.column_min, expect_df2_input.columns.min_val)
            self.assertEqual(c.inputs[1].op.column_min_close, expect_df2_input.columns.min_val_close)
            self.assertEqual(c.inputs[1].op.column_max, expect_df2_input.columns.max_val)
            self.assertEqual(c.inputs[1].op.column_max_close, expect_df2_input.columns.max_val_close)
            expect_right_columns = expect_df2_input.columns
            pd.testing.assert_index_equal(c.inputs[1].columns.to_pandas(), expect_right_columns.to_pandas())
            pd.testing.assert_index_equal(c.inputs[1].dtypes.index, expect_right_columns.to_pandas())

    def testBothOneChunk(self):
        # no axis is monotonic, but 1 chunk for all axes
        data1 = pd.DataFrame(np.random.rand(10, 10), index=[0, 10, 2, 3, 4, 5, 6, 7, 8, 9],
                             columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
        df1 = from_pandas(data1, chunk_size=10)
        data2 = pd.DataFrame(np.random.rand(10, 10), index=[11, 1, 2, 5, 7, 6, 8, 9, 10, 3],
                             columns=[5, 9, 12, 3, 11, 10, 6, 4, 1, 2])
        df2 = from_pandas(data2, chunk_size=10)

        df3 = add(df1, df2)

        # test df3's index and columns
        pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
        self.assertTrue(df3.columns.should_be_monotonic)
        self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
        self.assertTrue(df3.index_value.should_be_monotonic)
        pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
        self.assertNotEqual(df3.index_value.key, df1.index_value.key)
        self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 12)  # columns is recorded, so we can get it

        df3.tiles()

        self.assertEqual(df3.chunk_shape, (1, 1))
        for c in df3.chunks:
            self.assertIsInstance(c.op, DataFrameAdd)
            self.assertEqual(len(c.inputs), 2)
            # test the left side
            self.assertIs(c.inputs[0], df1.chunks[0].data)
            # test the right side
            self.assertIs(c.inputs[1], df2.chunks[0].data)

    def testWithShuffleAndOneChunk(self):
        # no axis is monotonic
        data1 = pd.DataFrame(np.random.rand(10, 10), index=[0, 10, 2, 3, 4, 5, 6, 7, 8, 9],
                             columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
        df1 = from_pandas(data1, chunk_size=(5, 10))
        data2 = pd.DataFrame(np.random.rand(10, 10), index=[11, 1, 2, 5, 7, 6, 8, 9, 10, 3],
                             columns=[5, 9, 12, 3, 11, 10, 6, 4, 1, 2])
        df2 = from_pandas(data2, chunk_size=(6, 10))

        df3 = add(df1, df2)

        # test df3's index and columns
        pd.testing.assert_index_equal(df3.columns.to_pandas(), (data1 + data2).columns)
        self.assertTrue(df3.columns.should_be_monotonic)
        self.assertIsInstance(df3.index_value.value, IndexValue.Int64Index)
        self.assertTrue(df3.index_value.should_be_monotonic)
        pd.testing.assert_index_equal(df3.index_value.to_pandas(), pd.Int64Index([]))
        self.assertNotEqual(df3.index_value.key, df1.index_value.key)
        self.assertNotEqual(df3.index_value.key, df2.index_value.key)
        self.assertEqual(df3.shape[1], 12)  # columns is recorded, so we can get it

        df3.tiles()

        self.assertEqual(df3.chunk_shape, (2, 1))
        proxy_keys = set()
        for c in df3.chunks:
            self.assertIsInstance(c.op, DataFrameAdd)
            self.assertEqual(len(c.inputs), 2)
            # test left side
            self.assertIsInstance(c.inputs[0].op, DataFrameIndexAlignReduce)
            expect_dtypes = pd.concat([ic.inputs[0].op.data.dtypes
                                       for ic in c.inputs[0].inputs[0].inputs if ic.index[0] == 0])
            pd.testing.assert_series_equal(c.inputs[0].dtypes, expect_dtypes)
            pd.testing.assert_index_equal(c.inputs[0].columns.to_pandas(), c.inputs[0].dtypes.index)
            self.assertIsInstance(c.inputs[0].index_value.to_pandas(), type(data1.index))
            self.assertIsInstance(c.inputs[0].inputs[0].op, DataFrameShuffleProxy)
            proxy_keys.add(c.inputs[0].inputs[0].op.key)
            for ic, ci in zip(c.inputs[0].inputs[0].inputs, df1.chunks):
                self.assertIsInstance(ic.op, DataFrameIndexAlignMap)
                self.assertEqual(ic.op.index_shuffle_size, 2)
                self.assertIsInstance(ic.index_value.to_pandas(), type(data1.index))
                self.assertEqual(ic.op.column_min, ci.columns.min_val)
                self.assertEqual(ic.op.column_min_close, ci.columns.min_val_close)
                self.assertEqual(ic.op.column_max, ci.columns.max_val)
                self.assertEqual(ic.op.column_max_close, ci.columns.max_val_close)
                self.assertIsNone(ic.op.column_shuffle_size)
                self.assertIsNotNone(ic.columns)
                self.assertIs(ic.inputs[0], ci.data)
            # test right side
            self.assertIsInstance(c.inputs[1].op, DataFrameIndexAlignReduce)
            expect_dtypes = pd.concat([ic.inputs[0].op.data.dtypes
                                       for ic in c.inputs[1].inputs[0].inputs if ic.index[0] == 0])
            pd.testing.assert_series_equal(c.inputs[1].dtypes, expect_dtypes)
            pd.testing.assert_index_equal(c.inputs[1].columns.to_pandas(), c.inputs[1].dtypes.index)
            self.assertIsInstance(c.inputs[1].index_value.to_pandas(), type(data1.index))
            self.assertIsInstance(c.inputs[1].inputs[0].op, DataFrameShuffleProxy)
            proxy_keys.add(c.inputs[1].inputs[0].op.key)
            for ic, ci in zip(c.inputs[1].inputs[0].inputs, df2.chunks):
                self.assertIsInstance(ic.op, DataFrameIndexAlignMap)
                self.assertEqual(ic.op.index_shuffle_size, 2)
                self.assertIsInstance(ic.index_value.to_pandas(), type(data1.index))
                self.assertEqual(ic.op.column_min, ci.columns.min_val)
                self.assertEqual(ic.op.column_min_close, ci.columns.min_val_close)
                self.assertEqual(ic.op.column_max, ci.columns.max_val)
                self.assertEqual(ic.op.column_max_close, ci.columns.max_val_close)
                self.assertIsNone(ic.op.column_shuffle_size)
                self.assertIsNotNone(ic.columns)
                self.assertIs(ic.inputs[0], ci.data)

        self.assertEqual(len(proxy_keys), 2)

    def testAddSelf(self):
        data = pd.DataFrame(np.random.rand(10, 10), index=np.random.randint(-100, 100, size=(10,)),
                            columns=[np.random.bytes(10) for _ in range(10)])
        df = from_pandas(data, chunk_size=3)

        df2 = add(df, df)

        # test df2's index and columns
        pd.testing.assert_index_equal(df2.columns.to_pandas(), (data + data).columns)
        self.assertTrue(df2.columns.should_be_monotonic)
        self.assertIsInstance(df2.index_value.value, IndexValue.Int64Index)
        self.assertTrue(df2.index_value.should_be_monotonic)
        pd.testing.assert_index_equal(df2.index_value.to_pandas(), pd.Int64Index([]))
        self.assertEqual(df2.index_value.key, df.index_value.key)
        self.assertEqual(df2.columns.key, df.columns.key)
        self.assertEqual(df2.shape[1], 10)

        df2.tiles()

        self.assertEqual(df2.chunk_shape, df.chunk_shape)
        for c in df2.chunks:
            self.assertIsInstance(c.op, DataFrameAdd)
            self.assertEqual(len(c.inputs), 2)
            # test the left side
            self.assertIs(c.inputs[0], df.cix[c.index].data)
            # test the right side
            self.assertIs(c.inputs[1], df.cix[c.index].data)

    def testDataFrameAddScalar(self):
        data = pd.DataFrame(np.random.rand(10, 10), index=np.arange(10),
                            columns=np.arange(3, 13))
        df = from_pandas(data, chunk_size=5)
        # test add with scalar
        result = add(df, 1)
        result2 = df.add(1)
        # test radd with scalar
        result3 = df.radd(1)
        result4 = df + 1
        result5 = 1 + df

        pd.testing.assert_index_equal(result.columns.to_pandas(), data.columns)
        self.assertIsInstance(result.index_value.value, IndexValue.Int64Index)
        pd.testing.assert_index_equal(result2.columns.to_pandas(), data.columns)
        self.assertIsInstance(result2.index_value.value, IndexValue.Int64Index)
        pd.testing.assert_index_equal(result3.columns.to_pandas(), data.columns)
        self.assertIsInstance(result3.index_value.value, IndexValue.Int64Index)
        pd.testing.assert_index_equal(result4.columns.to_pandas(), data.columns)
        self.assertIsInstance(result4.index_value.value, IndexValue.Int64Index)
        pd.testing.assert_index_equal(result5.columns.to_pandas(), data.columns)
        self.assertIsInstance(result5.index_value.value, IndexValue.Int64Index)

        # test NotImplemented, use other's radd instead
        class TestRadd:
            def __radd__(self, other):
                return 1

        other = TestRadd()
        ret = df + other
        self.assertEqual(ret, 1)

    def testSeriesAddScalar(self):
        data = pd.Series(range(10), index=[1, 3, 4, 2, 9, 10, 33, 23, 999, 123])
        s1 = from_pandas_series(data, chunk_size=3)
        r = s1.add(456)
        r.tiles()

        self.assertEqual(r.index_value.key, s1.index_value.key)
        self.assertEqual(r.chunk_shape, s1.chunk_shape)
        for cr in r.chunks:
            cs = s1.cix[cr.index]
            self.assertEqual(cr.index_value.key, cs.index_value.key)
            self.assertIsInstance(cr.op, DataFrameAdd)
            self.assertEqual(len(cr.inputs), 1)
            self.assertIsInstance(cr.inputs[0].op, SeriesDataSource)
            self.assertEqual(cr.op.rhs, 456)

        r = s1.radd(789)
        r.tiles()
        self.assertEqual(r.index_value.key, s1.index_value.key)
        self.assertEqual(r.chunk_shape, s1.chunk_shape)
        for cr in r.chunks:
            cs = s1.cix[cr.index]
            self.assertEqual(cr.index_value.key, cs.index_value.key)
            self.assertIsInstance(cr.op, DataFrameAdd)
            self.assertEqual(len(cr.inputs), 1)
            self.assertIsInstance(cr.inputs[0].op, SeriesDataSource)
            self.assertEqual(cr.op.lhs, 789)

    def testAbs(self):
        data1 = pd.DataFrame(np.random.rand(10, 10), index=[0, 10, 2, 3, 4, 5, 6, 7, 8, 9],
                             columns=[4, 1, 3, 2, 10, 5, 9, 8, 6, 7])
        df1 = from_pandas(data1, chunk_size=(5, 10))

        df2 = abs(df1)

        # test df2's index and columns
        pd.testing.assert_index_equal(df2.columns.to_pandas(), df1.columns.to_pandas())
        self.assertIsInstance(df2.index_value.value, IndexValue.Int64Index)
        self.assertEqual(df2.shape, (10, 10))

        df2.tiles()

        self.assertEqual(df2.chunk_shape, (2, 1))
        for c2, c1 in zip(df2.chunks, df1.chunks):
            self.assertIsInstance(c2.op, DataFrameAbs)
            self.assertEqual(len(c2.inputs), 1)
            # compare with input chunks
            self.assertEqual(c2.index, c1.index)
            pd.testing.assert_index_equal(c2.columns.to_pandas(), c1.columns.to_pandas())
            pd.testing.assert_index_equal(c2.index_value.to_pandas(), c1.index_value.to_pandas())
| 55.755665 | 110 | 0.642988 | 7,854 | 56,592 | 4.424242 | 0.036415 | 0.041902 | 0.023483 | 0.033383 | 0.914297 | 0.89562 | 0.872741 | 0.853287 | 0.827789 | 0.80419 | 0 | 0.039697 | 0.237048 | 56,592 | 1,014 | 111 | 55.810651 | 0.765083 | 0.047586 | 0 | 0.716584 | 0 | 0 | 0.000372 | 0 | 0 | 0 | 0 | 0 | 0.559406 | 1 | 0.022277 | false | 0 | 0.017327 | 0.001238 | 0.043317 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
02e5d67219cb0a052227216f47daa035eb63e6fc | 15,210 | py | Python | sim/migrations/0008_auto_20200715_0459.py | sb-git-cloud/covid-support-tool | 511f39507f5e4f66824fc859c152badf8bda8037 | [
"MIT"
] | null | null | null | sim/migrations/0008_auto_20200715_0459.py | sb-git-cloud/covid-support-tool | 511f39507f5e4f66824fc859c152badf8bda8037 | [
"MIT"
] | null | null | null | sim/migrations/0008_auto_20200715_0459.py | sb-git-cloud/covid-support-tool | 511f39507f5e4f66824fc859c152badf8bda8037 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.7 on 2020-07-15 04:59

import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('sim', '0007_auto_20200626_0040'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='sim',
            name='er_nct',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='er_nmri',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='er_nxray',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='highRiskIcuLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='highRiskLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='highRiskVentLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='icu_nct',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='icu_nmri',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='icu_nxray',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='lowRiskIcuLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='lowRiskLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='lowRiskVentLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='medRiskIcuLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='medRiskLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='medRiskVentLos',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_erhr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_erlr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_ermr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_icuhr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_iculr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_icumr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_wardhr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_wardlr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='mortrate_wardmr',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='narrivalsHR',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='narrivalsLR',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='narrivalsMR',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='tend',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='ward_nct',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='ward_nmri',
        ),
        migrations.RemoveField(
            model_name='sim',
            name='ward_nxray',
        ),
        migrations.AddField(
            model_name='sim',
            name='daysDialysis_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysDialysis_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysDialysis_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysEcmo_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysEcmo_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysEcmo_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysIcu_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysIcu_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysIcu_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysVent_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysVent_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='daysVent_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='narrivals_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='narrivals_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='narrivals_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='nconsultations_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='nconsultations_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='nconsultations_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='ninitials_cat1',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='ninitials_cat2',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='ninitials_cat3',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_cisatricurium',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_ct',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_dexmedetomidine',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_fentanyl',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_morphine',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_morphineOral',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_mri',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_oxycodone',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_ppe',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_propofol',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat1_vecuronium',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_cisatricurium',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_ct',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_dexmedetomidine',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_fentanyl',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_morphine',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_morphineOral',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_mri',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_oxycodone',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_ppe',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_propofol',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat2_vecuronium',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_cisatricurium',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_ct',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_dexmedetomidine',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_fentanyl',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_morphine',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_morphineOral',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_mri',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_oxycodone',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_ppe',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_propofol',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='requiredRes_cat3_vecuronium',
            field=models.PositiveSmallIntegerField(default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='sim',
            name='simTime',
            field=models.IntegerField(default=1, validators=[django.core.validators.MinValueValidator(1), django.core.validators.MaxValueValidator(21)]),
            preserve_default=False,
        ),
    ]
| 32.430704 | 153 | 0.55378 | 1,191 | 15,210 | 6.86314 | 0.078925 | 0.09469 | 0.126254 | 0.168339 | 0.936628 | 0.936628 | 0.886836 | 0.774774 | 0.774774 | 0.774774 | 0 | 0.014389 | 0.346614 | 15,210 | 468 | 154 | 32.5 | 0.80811 | 0.002959 | 0 | 0.794372 | 1 | 0 | 0.117391 | 0.044846 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.004329 | 0 | 0.010823 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
02ee4a856777461f7198a6eca707b8ee8c5a5238 | 2,655 | py | Python | conv_base.py | Rongpeng-Lin/pix2pixhd_Tensorflow | 3ab9dbefe290bff1023c61ca2f4efed88bf2c4bf | [
"Apache-2.0"
] | 20 | 2018-10-21T14:57:51.000Z | 2021-06-10T10:26:00.000Z | conv_base.py | xuyanging/pix2pixhd_Tensorflow | 3ab9dbefe290bff1023c61ca2f4efed88bf2c4bf | [
"Apache-2.0"
# Source: conv_base.py from xuyanging/pix2pixhd_Tensorflow (Apache-2.0)
import math

import tensorflow as tf


def conv(name, x, kers, outs, s, ref_pad, pad):
    """Strided 2-D convolution with optional reflection padding."""
    shape = [i.value for i in x.get_shape()]
    ker = int(math.sqrt(kers))      # kers is the kernel area, e.g. 9 -> 3x3
    ins = int(kers * shape[-1])     # fan-in
    ins_min, ins_max = -1 / math.sqrt(ins), 1 / math.sqrt(ins)  # min must be < max
    with tf.variable_scope(name):
        w = tf.get_variable('w',
                            [ker, ker, shape[-1], outs],
                            tf.float32,
                            tf.random_uniform_initializer(ins_min, ins_max))
        b = tf.get_variable('b',
                            [outs],
                            tf.float32,
                            tf.random_uniform_initializer(ins_min, ins_max))
        if ref_pad:
            # Reflection padding, then a VALID convolution.
            x_pad = tf.pad(x, [[0, 0], [ref_pad, ref_pad], [ref_pad, ref_pad], [0, 0]],
                           mode='REFLECT')
            return tf.nn.conv2d(x_pad, w, [1, s, s, 1], "VALID") + b
        else:
            padding = "SAME" if pad else "VALID"
            return tf.nn.conv2d(x, w, [1, s, s, 1], padding) + b


def conv_D(name, x, kers, outs, s, ref_pad, pad):
    """Same as conv(), but the weights use a truncated-normal initializer."""
    shape = [i.value for i in x.get_shape()]
    ker = int(math.sqrt(kers))
    ins = int(kers * shape[-1])
    ins_min, ins_max = -1 / math.sqrt(ins), 1 / math.sqrt(ins)
    with tf.variable_scope(name):
        w = tf.get_variable('w',
                            [ker, ker, shape[-1], outs],
                            tf.float32,
                            tf.initializers.truncated_normal(stddev=0.02))
        b = tf.get_variable('b',
                            [outs],
                            tf.float32,
                            tf.random_uniform_initializer(ins_min, ins_max))
        if ref_pad:
            x_pad = tf.pad(x, [[0, 0], [ref_pad, ref_pad], [ref_pad, ref_pad], [0, 0]],
                           mode='REFLECT')
            return tf.nn.conv2d(x_pad, w, [1, s, s, 1], "VALID") + b
        else:
            padding = "SAME" if pad else "VALID"
            return tf.nn.conv2d(x, w, [1, s, s, 1], padding) + b


def conv_trans(name, x, kers, outs, s, batch, pad):  # the batch size must be passed in
    """Transposed 2-D convolution that upsamples the spatial dims by the stride s."""
    shape = [i.value for i in x.get_shape()]
    h_in, w_in, c = shape[1], shape[2], shape[3]
    ker = int(math.sqrt(kers))
    outshape = [batch, int(s * h_in), int(s * w_in), outs]
    with tf.variable_scope(name):
        w = tf.get_variable('w',
                            [ker, ker, outs, c],  # [height, width, out_channels, in_channels]
                            tf.float32,
                            tf.random_uniform_initializer(0, 1))
        b = tf.get_variable('b',
                            [outs],
                            tf.float32,
                            tf.random_uniform_initializer(0, 1))
        padding = "SAME" if pad else "VALID"
        return tf.nn.conv2d_transpose(x, w, outshape, [1, s, s, 1], padding=padding) + b
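# Illustration (not part of the file above): the three padding modes used by
# conv() -- reflection pad + VALID, SAME, and plain VALID -- determine the
# spatial output size. The helper names below are hypothetical.

```python
import math

def conv_out_size(n, k, s, padding):
    """Spatial output size of a strided convolution over n input pixels."""
    if padding == "SAME":
        return math.ceil(n / s)
    return math.ceil((n - k + 1) / s)  # VALID

def conv_with_ref_pad_out_size(n, k, s, p):
    """VALID conv after padding p pixels per side, as conv() does with ref_pad."""
    return conv_out_size(n + 2 * p, k, s, "VALID")
```

# With a 3x3 kernel, ref_pad=1 and stride 1 preserve the input size, which is
# why pix2pixHD-style generators pair reflection padding with VALID convs.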
# Source: lesson7.4/tensorflow/python/training/gen_training_ops.py
# from magnusmel/Serverless-Deep-Learning-with-TensorFlow-and-AWS-Lambda (MIT)
"""Python wrappers around TensorFlow ops.
This file is MACHINE GENERATED! Do not edit.
Original C++ source file: training_ops.cc
"""
import collections as _collections
from tensorflow.python.eager import execute as _execute
from tensorflow.python.eager import context as _context
from tensorflow.python.eager import core as _core
from tensorflow.python.framework import dtypes as _dtypes
from tensorflow.python.framework import tensor_shape as _tensor_shape
from tensorflow.core.framework import op_def_pb2 as _op_def_pb2
# Needed to trigger the call to _set_call_cpp_shape_fn.
from tensorflow.python.framework import common_shapes as _common_shapes
from tensorflow.python.framework import op_def_registry as _op_def_registry
from tensorflow.python.framework import ops as _ops
from tensorflow.python.framework import op_def_library as _op_def_library
def apply_adadelta(var, accum, accum_update, lr, rho, epsilon, grad, use_locking=False, name=None):
  r"""Update '*var' according to the adadelta scheme.

  accum = rho() * accum + (1 - rho()) * grad.square();
  update = (update_accum + epsilon).sqrt() * (accum + epsilon()).rsqrt() * grad;
  update_accum = rho() * update_accum + (1 - rho()) * update.square();
  var -= update;

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    accum_update: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    rho: A `Tensor`. Must have the same type as `var`.
      Decay factor. Must be a scalar.
    epsilon: A `Tensor`. Must have the same type as `var`.
      Constant factor. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If True, updating of the var, accum and update_accum tensors will be protected by
      a lock; otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyAdadelta", var=var, accum=accum, accum_update=accum_update,
        lr=lr, rho=rho, epsilon=epsilon, grad=grad, use_locking=use_locking,
        name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_adadelta op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyAdadelta", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

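# Illustration (not part of the generated wrappers): the adadelta update
# equations from the docstring above, sketched in NumPy. `adadelta_step` is a
# hypothetical helper; note the kernel additionally scales the step by `lr`.

```python
import numpy as np

def adadelta_step(var, accum, accum_update, lr, rho, epsilon, grad):
    # accum = rho * accum + (1 - rho) * grad^2
    accum = rho * accum + (1 - rho) * np.square(grad)
    # update = sqrt(accum_update + eps) / sqrt(accum + eps) * grad
    update = np.sqrt(accum_update + epsilon) / np.sqrt(accum + epsilon) * grad
    # update_accum = rho * update_accum + (1 - rho) * update^2
    accum_update = rho * accum_update + (1 - rho) * np.square(update)
    # var -= lr * update  (lr scaling per the kernel, not shown in the equations)
    var = var - lr * update
    return var, accum, accum_update
```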
def apply_adagrad(var, accum, lr, grad, use_locking=False, name=None):
  r"""Update '*var' according to the adagrad scheme.

  accum += grad * grad
  var -= lr * grad * (1 / sqrt(accum))

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var and accum tensors will be protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyAdagrad", var=var, accum=accum, lr=lr, grad=grad,
        use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_adagrad op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyAdagrad", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

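# Illustration (not part of the generated wrappers): the two-line adagrad
# update from the docstring above in NumPy. `adagrad_step` is a hypothetical
# helper name.

```python
import numpy as np

def adagrad_step(var, accum, lr, grad):
    # accum += grad * grad
    accum = accum + np.square(grad)
    # var -= lr * grad * (1 / sqrt(accum))
    var = var - lr * grad / np.sqrt(accum)
    return var, accum
```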
def apply_adagrad_da(var, gradient_accumulator, gradient_squared_accumulator, grad, lr, l1, l2, global_step, use_locking=False, name=None):
  r"""Update '*var' according to the proximal adagrad scheme.

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    gradient_accumulator: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    gradient_squared_accumulator: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 regularization. Must be a scalar.
    global_step: A `Tensor` of type `int64`.
      Training step number. Must be a scalar.
    use_locking: An optional `bool`. Defaults to `False`.
      If True, updating of the var and accum tensors will be protected by
      a lock; otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyAdagradDA", var=var, gradient_accumulator=gradient_accumulator,
        gradient_squared_accumulator=gradient_squared_accumulator, grad=grad,
        lr=lr, l1=l1, l2=l2, global_step=global_step, use_locking=use_locking,
        name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_adagrad_da op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyAdagradDA", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

def apply_adam(var, m, v, beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad, use_locking=False, use_nesterov=False, name=None):
  r"""Update '*var' according to the Adam algorithm.

  lr_t <- learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t)
  m_t <- beta1 * m_{t-1} + (1 - beta1) * g_t
  v_t <- beta2 * v_{t-1} + (1 - beta2) * g_t * g_t
  variable <- variable - lr_t * m_t / (sqrt(v_t) + epsilon)

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    m: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    v: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    beta1_power: A `Tensor`. Must have the same type as `var`.
      Must be a scalar.
    beta2_power: A `Tensor`. Must have the same type as `var`.
      Must be a scalar.
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    beta1: A `Tensor`. Must have the same type as `var`.
      Momentum factor. Must be a scalar.
    beta2: A `Tensor`. Must have the same type as `var`.
      Momentum factor. Must be a scalar.
    epsilon: A `Tensor`. Must have the same type as `var`.
      Ridge term. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var, m, and v tensors will be protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    use_nesterov: An optional `bool`. Defaults to `False`.
      If `True`, uses the nesterov update.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  if use_nesterov is None:
    use_nesterov = False
  use_nesterov = _execute.make_bool(use_nesterov, "use_nesterov")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyAdam", var=var, m=m, v=v, beta1_power=beta1_power,
        beta2_power=beta2_power, lr=lr, beta1=beta1, beta2=beta2,
        epsilon=epsilon, grad=grad, use_locking=use_locking,
        use_nesterov=use_nesterov, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"), "use_nesterov",
              _op.get_attr("use_nesterov"))
  else:
    raise RuntimeError(
        "apply_adam op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyAdam", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

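# Illustration (not part of the generated wrappers): one Adam step per the
# docstring equations above, where beta1_power/beta2_power are beta1^t and
# beta2^t. `adam_step` is a hypothetical helper name.

```python
import math

def adam_step(var, m, v, beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad):
    # lr_t <- lr * sqrt(1 - beta2^t) / (1 - beta1^t)
    lr_t = lr * math.sqrt(1 - beta2_power) / (1 - beta1_power)
    # m_t <- beta1 * m_{t-1} + (1 - beta1) * g_t
    m = beta1 * m + (1 - beta1) * grad
    # v_t <- beta2 * v_{t-1} + (1 - beta2) * g_t^2
    v = beta2 * v + (1 - beta2) * grad * grad
    # var <- var - lr_t * m_t / (sqrt(v_t) + epsilon)
    var = var - lr_t * m / (math.sqrt(v) + epsilon)
    return var, m, v
```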
def apply_centered_rms_prop(var, mg, ms, mom, lr, rho, momentum, epsilon, grad, use_locking=False, name=None):
  r"""Update '*var' according to the centered RMSProp algorithm.

  The centered RMSProp algorithm uses an estimate of the centered second moment
  (i.e., the variance) for normalization, as opposed to regular RMSProp, which
  uses the (uncentered) second moment. This often helps with training, but is
  slightly more expensive in terms of computation and memory.

  Note that in dense implementation of this algorithm, mg, ms, and mom will
  update even if the grad is zero, but in this sparse implementation, mg, ms,
  and mom will not update in iterations during which the grad is zero.

  mean_square = decay * mean_square + (1-decay) * gradient ** 2
  mean_grad = decay * mean_grad + (1-decay) * gradient

  Delta = learning_rate * gradient / sqrt(mean_square + epsilon - mean_grad ** 2)

  mg <- rho * mg_{t-1} + (1-rho) * grad
  ms <- rho * ms_{t-1} + (1-rho) * grad * grad
  mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms - mg * mg + epsilon)
  var <- var - mom

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    mg: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    ms: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    mom: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    rho: A `Tensor`. Must have the same type as `var`.
      Decay rate. Must be a scalar.
    momentum: A `Tensor`. Must have the same type as `var`.
    epsilon: A `Tensor`. Must have the same type as `var`.
      Ridge term. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var, mg, ms, and mom tensors is
      protected by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyCenteredRMSProp", var=var, mg=mg, ms=ms, mom=mom, lr=lr,
        rho=rho, momentum=momentum, epsilon=epsilon, grad=grad,
        use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_centered_rms_prop op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyCenteredRMSProp", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

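# Illustration (not part of the generated wrappers): one centered-RMSProp step
# per the mg/ms/mom equations above. `centered_rmsprop_step` is a hypothetical
# helper; the variance estimate is ms - mg^2.

```python
import math

def centered_rmsprop_step(var, mg, ms, mom, lr, rho, momentum, epsilon, grad):
    # mg <- rho * mg_{t-1} + (1 - rho) * grad
    mg = rho * mg + (1 - rho) * grad
    # ms <- rho * ms_{t-1} + (1 - rho) * grad^2
    ms = rho * ms + (1 - rho) * grad * grad
    # mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms - mg^2 + epsilon)
    mom = momentum * mom + lr * grad / math.sqrt(ms - mg * mg + epsilon)
    # var <- var - mom
    var = var - mom
    return var, mg, ms, mom
```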
def apply_ftrl(var, accum, linear, grad, lr, l1, l2, lr_power, use_locking=False, name=None):
  r"""Update '*var' according to the Ftrl-proximal scheme.

  accum_new = accum + grad * grad
  linear += grad + (accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
  quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
  var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
  accum = accum_new

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    linear: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 regularization. Must be a scalar.
    lr_power: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var and accum tensors will be protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyFtrl", var=var, accum=accum, linear=linear, grad=grad, lr=lr,
        l1=l1, l2=l2, lr_power=lr_power, use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_ftrl op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyFtrl", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

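# Illustration (not part of the generated wrappers): the FTRL-proximal update
# exactly as the docstring above states it (signs included). `ftrl_step` is a
# hypothetical helper; lr_power is typically -0.5, so accum^(-lr_power) is
# sqrt(accum).

```python
import math

def ftrl_step(var, accum, linear, grad, lr, l1, l2, lr_power):
    # accum_new = accum + grad^2
    accum_new = accum + grad * grad
    # linear += grad + (accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
    linear = linear + grad + (accum_new ** -lr_power - accum ** -lr_power) / lr * var
    # quadratic = 1 / (accum_new^(lr_power) * lr) + 2 * l2
    quadratic = 1.0 / (accum_new ** lr_power * lr) + 2 * l2
    # var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0
    if abs(linear) > l1:
        var = (math.copysign(l1, linear) - linear) / quadratic
    else:
        var = 0.0
    return var, accum_new, linear
```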
def apply_ftrl_v2(var, accum, linear, grad, lr, l1, l2, l2_shrinkage, lr_power, use_locking=False, name=None):
  r"""Update '*var' according to the Ftrl-proximal scheme.

  grad_with_shrinkage = grad + 2 * l2_shrinkage * var
  accum_new = accum + grad_with_shrinkage * grad_with_shrinkage
  linear += grad_with_shrinkage +
      (accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
  quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
  var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
  accum = accum_new

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    linear: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 shrinkage regularization. Must be a scalar.
    l2_shrinkage: A `Tensor`. Must have the same type as `var`.
    lr_power: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var and accum tensors will be protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyFtrlV2", var=var, accum=accum, linear=linear, grad=grad, lr=lr,
        l1=l1, l2=l2, l2_shrinkage=l2_shrinkage, lr_power=lr_power,
        use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_ftrl_v2 op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyFtrlV2", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

def apply_gradient_descent(var, alpha, delta, use_locking=False, name=None):
  r"""Update '*var' by subtracting 'alpha' * 'delta' from it.

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    alpha: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    delta: A `Tensor`. Must have the same type as `var`. The change.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, the subtraction will be protected by a lock;
      otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyGradientDescent", var=var, alpha=alpha, delta=delta,
        use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_gradient_descent op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyGradientDescent", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

def apply_momentum(var, accum, lr, grad, momentum, use_locking=False, use_nesterov=False, name=None):
  r"""Update '*var' according to the momentum scheme. Set use_nesterov = True if you
  want to use Nesterov momentum.

  accum = accum * momentum + grad
  var -= lr * accum

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    momentum: A `Tensor`. Must have the same type as `var`.
      Momentum. Must be a scalar.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var and accum tensors will be protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    use_nesterov: An optional `bool`. Defaults to `False`.
      If `True`, the tensor passed to compute grad will be
      var - lr * momentum * accum, so in the end, the var you get is actually
      var - lr * momentum * accum.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  if use_nesterov is None:
    use_nesterov = False
  use_nesterov = _execute.make_bool(use_nesterov, "use_nesterov")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyMomentum", var=var, accum=accum, lr=lr, grad=grad,
        momentum=momentum, use_locking=use_locking, use_nesterov=use_nesterov,
        name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"), "use_nesterov",
              _op.get_attr("use_nesterov"))
  else:
    raise RuntimeError(
        "apply_momentum op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyMomentum", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

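# Illustration (not part of the generated wrappers): the plain (non-Nesterov)
# momentum update from the docstring above. `momentum_step` is a hypothetical
# helper name.

```python
def momentum_step(var, accum, lr, grad, momentum):
    # accum = accum * momentum + grad
    accum = accum * momentum + grad
    # var -= lr * accum
    var = var - lr * accum
    return var, accum
```

# Over repeated steps with a constant gradient, accum grows toward
# grad / (1 - momentum), which is the usual heavy-ball acceleration effect.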
def apply_proximal_adagrad(var, accum, lr, l1, l2, grad, use_locking=False, name=None):
  r"""Update '*var' and '*accum' according to FOBOS with Adagrad learning rate.

  accum += grad * grad
  prox_v = var - lr * grad * (1 / sqrt(accum))
  var = sign(prox_v)/(1+lr*l2) * max{|prox_v|-lr*l1,0}

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 regularization. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If True, updating of the var and accum tensors will be protected by
      a lock; otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyProximalAdagrad", var=var, accum=accum, lr=lr, l1=l1, l2=l2,
        grad=grad, use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_proximal_adagrad op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyProximalAdagrad", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

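# Illustration (not part of the generated wrappers): one proximal-adagrad
# (FOBOS) step per the equations above -- an adagrad step followed by the
# soft-thresholding proximal operator. `proximal_adagrad_step` is a
# hypothetical helper name.

```python
import math

def proximal_adagrad_step(var, accum, lr, l1, l2, grad):
    # accum += grad * grad
    accum = accum + grad * grad
    # prox_v = var - lr * grad * (1 / sqrt(accum))
    prox_v = var - lr * grad / math.sqrt(accum)
    # var = sign(prox_v) / (1 + lr*l2) * max(|prox_v| - lr*l1, 0)
    var = math.copysign(1.0, prox_v) / (1 + lr * l2) * max(abs(prox_v) - lr * l1, 0.0)
    return var, accum
```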
def apply_proximal_gradient_descent(var, alpha, l1, l2, delta, use_locking=False, name=None):
  r"""Update '*var' as FOBOS algorithm with fixed learning rate.

  prox_v = var - alpha * delta
  var = sign(prox_v)/(1+alpha*l2) * max{|prox_v|-alpha*l1,0}

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    alpha: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 regularization. Must be a scalar.
    delta: A `Tensor`. Must have the same type as `var`. The change.
    use_locking: An optional `bool`. Defaults to `False`.
      If True, the subtraction will be protected by a lock;
      otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyProximalGradientDescent", var=var, alpha=alpha, l1=l1, l2=l2,
        delta=delta, use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_proximal_gradient_descent op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyProximalGradientDescent", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

def apply_rms_prop(var, ms, mom, lr, rho, momentum, epsilon, grad, use_locking=False, name=None):
  r"""Update '*var' according to the RMSProp algorithm.

  Note that in dense implementation of this algorithm, ms and mom will
  update even if the grad is zero, but in this sparse implementation, ms
  and mom will not update in iterations during which the grad is zero.

  mean_square = decay * mean_square + (1-decay) * gradient ** 2
  Delta = learning_rate * gradient / sqrt(mean_square + epsilon)

  ms <- rho * ms_{t-1} + (1-rho) * grad * grad
  mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
  var <- var - mom

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    ms: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    mom: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    rho: A `Tensor`. Must have the same type as `var`.
      Decay rate. Must be a scalar.
    momentum: A `Tensor`. Must have the same type as `var`.
    epsilon: A `Tensor`. Must have the same type as `var`.
      Ridge term. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var, ms, and mom tensors is protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ApplyRMSProp", var=var, ms=ms, mom=mom, lr=lr, rho=rho,
        momentum=momentum, epsilon=epsilon, grad=grad,
        use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "use_locking",
              _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "apply_rms_prop op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "ApplyRMSProp", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result

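# Illustration (not part of the generated wrappers): one plain RMSProp step
# per the ms/mom equations above. `rmsprop_step` is a hypothetical helper
# name; with momentum = 0 it reduces to lr * grad / sqrt(ms + epsilon).

```python
import math

def rmsprop_step(var, ms, mom, lr, rho, momentum, epsilon, grad):
    # ms <- rho * ms_{t-1} + (1 - rho) * grad^2
    ms = rho * ms + (1 - rho) * grad * grad
    # mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
    mom = momentum * mom + lr * grad / math.sqrt(ms + epsilon)
    # var <- var - mom
    var = var - mom
    return var, ms, mom
```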
def resource_apply_adadelta(var, accum, accum_update, lr, rho, epsilon, grad, use_locking=False, name=None):
  r"""Update '*var' according to the adadelta scheme.

  accum = rho() * accum + (1 - rho()) * grad.square();
  update = (update_accum + epsilon).sqrt() * (accum + epsilon()).rsqrt() * grad;
  update_accum = rho() * update_accum + (1 - rho()) * update.square();
  var -= update;

  Args:
    var: A `Tensor` of type `resource`. Should be from a Variable().
    accum: A `Tensor` of type `resource`. Should be from a Variable().
    accum_update: A `Tensor` of type `resource`. Should be from a Variable().
    lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Scaling factor. Must be a scalar.
    rho: A `Tensor`. Must have the same type as `lr`.
      Decay factor. Must be a scalar.
    epsilon: A `Tensor`. Must have the same type as `lr`.
      Constant factor. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `lr`. The gradient.
    use_locking: An optional `bool`. Defaults to `False`.
      If True, updating of the var, accum and update_accum tensors will be protected by
      a lock; otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    The created Operation.
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "ResourceApplyAdadelta", var=var, accum=accum,
        accum_update=accum_update, lr=lr, rho=rho, epsilon=epsilon, grad=grad,
        use_locking=use_locking, name=name)
    return _op
  else:
    _attr_T, _inputs_T = _execute.args_to_matching_eager([lr, rho, epsilon, grad], _ctx)
    (lr, rho, epsilon, grad) = _inputs_T
    _attr_T = _attr_T.as_datatype_enum
    var = _ops.convert_to_tensor(var, _dtypes.resource)
    accum = _ops.convert_to_tensor(accum, _dtypes.resource)
    accum_update = _ops.convert_to_tensor(accum_update, _dtypes.resource)
    _inputs_flat = [var, accum, accum_update, lr, rho, epsilon, grad]
    _attrs = ("T", _attr_T, "use_locking", use_locking)
    _result = _execute.execute(b"ResourceApplyAdadelta", 0,
                               inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
                               name=name)
  return _result

def resource_apply_adagrad(var, accum, lr, grad, use_locking=False, name=None):
r"""Update '*var' according to the adagrad scheme.
accum += grad * grad
var -= lr * grad * (1 / sqrt(accum))
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyAdagrad", var=var, accum=accum, lr=lr, grad=grad,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, grad], _ctx)
(lr, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
_inputs_flat = [var, accum, lr, grad]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyAdagrad", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
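The adagrad update described in the docstring above can be sketched in plain NumPy; the helper name and shapes here are illustrative only, not part of this generated module:

```python
import numpy as np

def adagrad_update(var, accum, lr, grad):
    """One dense adagrad step: accum += grad**2; var -= lr * grad / sqrt(accum)."""
    accum = accum + grad * grad
    var = var - lr * grad / np.sqrt(accum)
    return var, accum

# With accum starting at 0 and grad = 2: accum becomes 4, and the step is lr * 2 / 2 = lr.
var, accum = adagrad_update(np.array([1.0]), np.array([0.0]), 0.5, np.array([2.0]))
```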
def resource_apply_adagrad_da(var, gradient_accumulator, gradient_squared_accumulator, grad, lr, l1, l2, global_step, use_locking=False, name=None):
r"""Update '*var' according to the proximal adagrad scheme.
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
gradient_accumulator: A `Tensor` of type `resource`.
Should be from a Variable().
gradient_squared_accumulator: A `Tensor` of type `resource`.
Should be from a Variable().
grad: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
The gradient.
lr: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `grad`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `grad`.
L2 regularization. Must be a scalar.
global_step: A `Tensor` of type `int64`.
Training step number. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyAdagradDA", var=var,
gradient_accumulator=gradient_accumulator,
gradient_squared_accumulator=gradient_squared_accumulator, grad=grad,
lr=lr, l1=l1, l2=l2, global_step=global_step, use_locking=use_locking,
name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([grad, lr, l1, l2], _ctx)
(grad, lr, l1, l2) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
gradient_accumulator = _ops.convert_to_tensor(gradient_accumulator, _dtypes.resource)
gradient_squared_accumulator = _ops.convert_to_tensor(gradient_squared_accumulator, _dtypes.resource)
global_step = _ops.convert_to_tensor(global_step, _dtypes.int64)
_inputs_flat = [var, gradient_accumulator, gradient_squared_accumulator, grad, lr, l1, l2, global_step]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyAdagradDA", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
def resource_apply_adam(var, m, v, beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad, use_locking=False, use_nesterov=False, name=None):
r"""Update '*var' according to the Adam algorithm.
lr_t <- learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t)
m_t <- beta1 * m_{t-1} + (1 - beta1) * g_t
v_t <- beta2 * v_{t-1} + (1 - beta2) * g_t * g_t
variable <- variable - lr_t * m_t / (sqrt(v_t) + epsilon)
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
m: A `Tensor` of type `resource`. Should be from a Variable().
v: A `Tensor` of type `resource`. Should be from a Variable().
beta1_power: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Must be a scalar.
beta2_power: A `Tensor`. Must have the same type as `beta1_power`.
Must be a scalar.
lr: A `Tensor`. Must have the same type as `beta1_power`.
Scaling factor. Must be a scalar.
beta1: A `Tensor`. Must have the same type as `beta1_power`.
Momentum factor. Must be a scalar.
beta2: A `Tensor`. Must have the same type as `beta1_power`.
Momentum factor. Must be a scalar.
epsilon: A `Tensor`. Must have the same type as `beta1_power`.
Ridge term. Must be a scalar.
grad: A `Tensor`. Must have the same type as `beta1_power`. The gradient.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var, m, and v tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
use_nesterov: An optional `bool`. Defaults to `False`.
If `True`, uses the nesterov update.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
if use_nesterov is None:
use_nesterov = False
use_nesterov = _execute.make_bool(use_nesterov, "use_nesterov")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyAdam", var=var, m=m, v=v, beta1_power=beta1_power,
beta2_power=beta2_power, lr=lr, beta1=beta1, beta2=beta2,
epsilon=epsilon, grad=grad, use_locking=use_locking,
use_nesterov=use_nesterov, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad], _ctx)
(beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
m = _ops.convert_to_tensor(m, _dtypes.resource)
v = _ops.convert_to_tensor(v, _dtypes.resource)
_inputs_flat = [var, m, v, beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad]
_attrs = ("T", _attr_T, "use_locking", use_locking, "use_nesterov",
use_nesterov)
_result = _execute.execute(b"ResourceApplyAdam", 0, inputs=_inputs_flat,
attrs=_attrs, ctx=_ctx, name=name)
return _result
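The four Adam update lines above translate directly to NumPy; this is a minimal sketch (names illustrative), useful for seeing the effect of the bias-correction factor:

```python
import numpy as np

def adam_update(var, m, v, t, lr, beta1, beta2, epsilon, grad):
    """One Adam step, following the four update lines in the docstring."""
    lr_t = lr * np.sqrt(1 - beta2**t) / (1 - beta1**t)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    var = var - lr_t * m / (np.sqrt(v) + epsilon)
    return var, m, v

# On the first step (t=1) with zero-initialized m and v, bias correction makes
# the step size approximately lr, largely independent of the gradient's scale.
var, m, v = adam_update(np.zeros(1), np.zeros(1), np.zeros(1),
                        t=1, lr=0.001, beta1=0.9, beta2=0.999,
                        epsilon=1e-8, grad=np.ones(1))
```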
def resource_apply_centered_rms_prop(var, mg, ms, mom, lr, rho, momentum, epsilon, grad, use_locking=False, name=None):
r"""Update '*var' according to the centered RMSProp algorithm.
The centered RMSProp algorithm uses an estimate of the centered second moment
(i.e., the variance) for normalization, as opposed to regular RMSProp, which
uses the (uncentered) second moment. This often helps with training, but is
slightly more expensive in terms of computation and memory.
Note that in dense implementation of this algorithm, mg, ms, and mom will
update even if the grad is zero, but in this sparse implementation, mg, ms,
and mom will not update in iterations during which the grad is zero.
mean_square = decay * mean_square + (1-decay) * gradient ** 2
mean_grad = decay * mean_grad + (1-decay) * gradient
Delta = learning_rate * gradient / sqrt(mean_square + epsilon - mean_grad ** 2)
mg <- rho * mg_{t-1} + (1-rho) * grad
ms <- rho * ms_{t-1} + (1-rho) * grad * grad
mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms - mg * mg + epsilon)
var <- var - mom
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
mg: A `Tensor` of type `resource`. Should be from a Variable().
ms: A `Tensor` of type `resource`. Should be from a Variable().
mom: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
rho: A `Tensor`. Must have the same type as `lr`.
Decay rate. Must be a scalar.
momentum: A `Tensor`. Must have the same type as `lr`.
epsilon: A `Tensor`. Must have the same type as `lr`.
Ridge term. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var, mg, ms, and mom tensors is
protected by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyCenteredRMSProp", var=var, mg=mg, ms=ms, mom=mom, lr=lr,
rho=rho, momentum=momentum, epsilon=epsilon, grad=grad,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, rho, momentum, epsilon, grad], _ctx)
(lr, rho, momentum, epsilon, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
mg = _ops.convert_to_tensor(mg, _dtypes.resource)
ms = _ops.convert_to_tensor(ms, _dtypes.resource)
mom = _ops.convert_to_tensor(mom, _dtypes.resource)
_inputs_flat = [var, mg, ms, mom, lr, rho, momentum, epsilon, grad]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyCenteredRMSProp", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
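As a sanity check on the mg/ms/mom/var lines in the docstring, here is a minimal NumPy sketch of one centered RMSProp step (helper name and values illustrative):

```python
import numpy as np

def centered_rmsprop_update(var, mg, ms, mom, lr, rho, momentum, epsilon, grad):
    """One centered RMSProp step: the denominator subtracts the squared mean gradient."""
    mg = rho * mg + (1 - rho) * grad
    ms = rho * ms + (1 - rho) * grad * grad
    mom = momentum * mom + lr * grad / np.sqrt(ms - mg * mg + epsilon)
    var = var - mom
    return var, mg, ms, mom

# With rho = momentum = 0 and epsilon = 1, the denominator is sqrt(1 - 1 + 1) = 1,
# so the step reduces to lr * grad.
var, mg, ms, mom = centered_rmsprop_update(
    np.zeros(1), np.zeros(1), np.zeros(1), np.zeros(1),
    lr=0.1, rho=0.0, momentum=0.0, epsilon=1.0, grad=np.ones(1))
```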
def resource_apply_ftrl(var, accum, linear, grad, lr, l1, l2, lr_power, use_locking=False, name=None):
r"""Update '*var' according to the Ftrl-proximal scheme.
accum_new = accum + grad * grad
linear += grad - (accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
accum = accum_new
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
linear: A `Tensor` of type `resource`. Should be from a Variable().
grad: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
The gradient.
lr: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `grad`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `grad`.
L2 regularization. Must be a scalar.
lr_power: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyFtrl", var=var, accum=accum, linear=linear, grad=grad,
lr=lr, l1=l1, l2=l2, lr_power=lr_power, use_locking=use_locking,
name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([grad, lr, l1, l2, lr_power], _ctx)
(grad, lr, l1, l2, lr_power) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
linear = _ops.convert_to_tensor(linear, _dtypes.resource)
_inputs_flat = [var, accum, linear, grad, lr, l1, l2, lr_power]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyFtrl", 0, inputs=_inputs_flat,
attrs=_attrs, ctx=_ctx, name=name)
return _result
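The five pseudocode lines in the Ftrl-proximal docstring can be transcribed into NumPy as follows; this is a sketch of the documented formulas only (names illustrative), not the op's kernel:

```python
import numpy as np

def ftrl_update(var, accum, linear, grad, lr, l1, l2, lr_power):
    """One FTRL-proximal step, transcribing the docstring's update lines."""
    accum_new = accum + grad * grad
    linear = linear + grad - (accum_new**(-lr_power) - accum**(-lr_power)) / lr * var
    quadratic = 1.0 / (accum_new**lr_power * lr) + 2 * l2
    # Soft-threshold: zero the weight unless |linear| exceeds l1.
    var = np.where(np.abs(linear) > l1,
                   (np.sign(linear) * l1 - linear) / quadratic,
                   0.0)
    return var, accum_new, linear

# With var = 0, l1 = l2 = 0, lr = 1, lr_power = -0.5: accum_new = 2,
# linear = 1, quadratic = sqrt(2), so var = -1/sqrt(2).
var, accum, linear = ftrl_update(np.zeros(1), np.ones(1), np.zeros(1),
                                 grad=np.ones(1), lr=1.0, l1=0.0, l2=0.0,
                                 lr_power=-0.5)
```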
def resource_apply_ftrl_v2(var, accum, linear, grad, lr, l1, l2, l2_shrinkage, lr_power, use_locking=False, name=None):
r"""Update '*var' according to the Ftrl-proximal scheme.
grad_with_shrinkage = grad + 2 * l2_shrinkage * var
accum_new = accum + grad_with_shrinkage * grad_with_shrinkage
linear += grad_with_shrinkage -
(accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
accum = accum_new
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
linear: A `Tensor` of type `resource`. Should be from a Variable().
grad: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
The gradient.
lr: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `grad`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `grad`.
L2 shrinkage regularization. Must be a scalar.
l2_shrinkage: A `Tensor`. Must have the same type as `grad`.
lr_power: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyFtrlV2", var=var, accum=accum, linear=linear, grad=grad,
lr=lr, l1=l1, l2=l2, l2_shrinkage=l2_shrinkage, lr_power=lr_power,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([grad, lr, l1, l2, l2_shrinkage, lr_power], _ctx)
(grad, lr, l1, l2, l2_shrinkage, lr_power) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
linear = _ops.convert_to_tensor(linear, _dtypes.resource)
_inputs_flat = [var, accum, linear, grad, lr, l1, l2, l2_shrinkage, lr_power]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyFtrlV2", 0, inputs=_inputs_flat,
attrs=_attrs, ctx=_ctx, name=name)
return _result
def resource_apply_gradient_descent(var, alpha, delta, use_locking=False, name=None):
r"""Update '*var' by subtracting 'alpha' * 'delta' from it.
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
alpha: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
delta: A `Tensor`. Must have the same type as `alpha`. The change.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, the subtraction will be protected by a lock;
otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyGradientDescent", var=var, alpha=alpha, delta=delta,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([alpha, delta], _ctx)
(alpha, delta) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
_inputs_flat = [var, alpha, delta]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyGradientDescent", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
def resource_apply_momentum(var, accum, lr, grad, momentum, use_locking=False, use_nesterov=False, name=None):
r"""Update '*var' according to the momentum scheme. Set use_nesterov = True if you
want to use Nesterov momentum.
accum = accum * momentum + grad
var -= lr * accum
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
momentum: A `Tensor`. Must have the same type as `lr`.
Momentum. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
use_nesterov: An optional `bool`. Defaults to `False`.
If `True`, the tensor passed to compute grad will be
var - lr * momentum * accum, so in the end, the var you get is actually
var - lr * momentum * accum.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
if use_nesterov is None:
use_nesterov = False
use_nesterov = _execute.make_bool(use_nesterov, "use_nesterov")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyMomentum", var=var, accum=accum, lr=lr, grad=grad,
momentum=momentum, use_locking=use_locking, use_nesterov=use_nesterov,
name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, grad, momentum], _ctx)
(lr, grad, momentum) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
_inputs_flat = [var, accum, lr, grad, momentum]
_attrs = ("T", _attr_T, "use_locking", use_locking, "use_nesterov",
use_nesterov)
_result = _execute.execute(b"ResourceApplyMomentum", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
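The two-line momentum scheme above (without the Nesterov variant) looks like this as a NumPy sketch; names are illustrative:

```python
import numpy as np

def momentum_update(var, accum, lr, grad, momentum):
    """One momentum step: accum = accum * momentum + grad; var -= lr * accum."""
    accum = accum * momentum + grad
    var = var - lr * accum
    return var, accum

# accum = 0.5 * 0.9 + 1 = 1.45, so var = 1 - 0.1 * 1.45 = 0.855.
var, accum = momentum_update(np.array([1.0]), np.array([0.5]),
                             lr=0.1, grad=np.array([1.0]), momentum=0.9)
```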
def resource_apply_proximal_adagrad(var, accum, lr, l1, l2, grad, use_locking=False, name=None):
r"""Update '*var' and '*accum' according to FOBOS with Adagrad learning rate.
accum += grad * grad
prox_v = var - lr * grad * (1 / sqrt(accum))
var = sign(prox_v)/(1+lr*l2) * max{|prox_v|-lr*l1,0}
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `lr`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `lr`.
L2 regularization. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyProximalAdagrad", var=var, accum=accum, lr=lr, l1=l1,
l2=l2, grad=grad, use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, l1, l2, grad], _ctx)
(lr, l1, l2, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
_inputs_flat = [var, accum, lr, l1, l2, grad]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyProximalAdagrad", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
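The FOBOS-with-Adagrad update in the docstring combines an adagrad step with a soft-threshold; a minimal NumPy sketch (illustrative names and values):

```python
import numpy as np

def proximal_adagrad_update(var, accum, lr, l1, l2, grad):
    """One FOBOS step with an Adagrad learning rate, per the docstring."""
    accum = accum + grad * grad
    prox_v = var - lr * grad / np.sqrt(accum)
    # Shrink toward zero by lr * l1, then scale down by the l2 penalty.
    var = np.sign(prox_v) / (1 + lr * l2) * np.maximum(np.abs(prox_v) - lr * l1, 0)
    return var, accum

# prox_v = 1 - 0.5 * 1 / 1 = 0.5; with l1 = 0.2, var = max(0.5 - 0.1, 0) = 0.4.
var, accum = proximal_adagrad_update(np.array([1.0]), np.array([0.0]),
                                     lr=0.5, l1=0.2, l2=0.0,
                                     grad=np.array([1.0]))
```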
def resource_apply_proximal_gradient_descent(var, alpha, l1, l2, delta, use_locking=False, name=None):
r"""Update '*var' as FOBOS algorithm with fixed learning rate.
prox_v = var - alpha * delta
var = sign(prox_v)/(1+alpha*l2) * max{|prox_v|-alpha*l1,0}
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
alpha: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `alpha`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `alpha`.
L2 regularization. Must be a scalar.
delta: A `Tensor`. Must have the same type as `alpha`. The change.
use_locking: An optional `bool`. Defaults to `False`.
If True, the subtraction will be protected by a lock;
otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyProximalGradientDescent", var=var, alpha=alpha, l1=l1,
l2=l2, delta=delta, use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([alpha, l1, l2, delta], _ctx)
(alpha, l1, l2, delta) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
_inputs_flat = [var, alpha, l1, l2, delta]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyProximalGradientDescent", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
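The fixed-learning-rate FOBOS update above is the same proximal step without an accumulator; sketched in NumPy (illustrative):

```python
import numpy as np

def proximal_gd_update(var, alpha, l1, l2, delta):
    """One fixed-rate FOBOS step: gradient step, then soft-threshold by alpha * l1."""
    prox_v = var - alpha * delta
    return np.sign(prox_v) / (1 + alpha * l2) * np.maximum(np.abs(prox_v) - alpha * l1, 0)

# prox_v = 1 - 0.5 = 0.5; with l1 = 0.4, var = max(0.5 - 0.2, 0) = 0.3.
var = proximal_gd_update(np.array([1.0]), alpha=0.5, l1=0.4, l2=0.0,
                         delta=np.array([1.0]))
```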
def resource_apply_rms_prop(var, ms, mom, lr, rho, momentum, epsilon, grad, use_locking=False, name=None):
r"""Update '*var' according to the RMSProp algorithm.
Note that in dense implementation of this algorithm, ms and mom will
update even if the grad is zero, but in this sparse implementation, ms
and mom will not update in iterations during which the grad is zero.
mean_square = decay * mean_square + (1-decay) * gradient ** 2
Delta = learning_rate * gradient / sqrt(mean_square + epsilon)
ms <- rho * ms_{t-1} + (1-rho) * grad * grad
mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
var <- var - mom
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
ms: A `Tensor` of type `resource`. Should be from a Variable().
mom: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
rho: A `Tensor`. Must have the same type as `lr`.
Decay rate. Must be a scalar.
momentum: A `Tensor`. Must have the same type as `lr`.
epsilon: A `Tensor`. Must have the same type as `lr`.
Ridge term. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var, ms, and mom tensors is protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceApplyRMSProp", var=var, ms=ms, mom=mom, lr=lr, rho=rho,
momentum=momentum, epsilon=epsilon, grad=grad,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, rho, momentum, epsilon, grad], _ctx)
(lr, rho, momentum, epsilon, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
ms = _ops.convert_to_tensor(ms, _dtypes.resource)
mom = _ops.convert_to_tensor(mom, _dtypes.resource)
_inputs_flat = [var, ms, mom, lr, rho, momentum, epsilon, grad]
_attrs = ("T", _attr_T, "use_locking", use_locking)
_result = _execute.execute(b"ResourceApplyRMSProp", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
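The ms/mom/var lines of the RMSProp docstring above, as a minimal NumPy sketch (names illustrative):

```python
import numpy as np

def rmsprop_update(var, ms, mom, lr, rho, momentum, epsilon, grad):
    """One RMSProp step per the ms/mom/var lines in the docstring."""
    ms = rho * ms + (1 - rho) * grad * grad
    mom = momentum * mom + lr * grad / np.sqrt(ms + epsilon)
    var = var - mom
    return var, ms, mom

# With rho = momentum = epsilon = 0 and grad = 1: ms = 1, mom = lr, var = 2 - 0.5.
var, ms, mom = rmsprop_update(np.array([2.0]), np.zeros(1), np.zeros(1),
                              lr=0.5, rho=0.0, momentum=0.0, epsilon=0.0,
                              grad=np.ones(1))
```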
def resource_sparse_apply_adadelta(var, accum, accum_update, lr, rho, epsilon, grad, indices, use_locking=False, name=None):
r"""var: Should be from a Variable().
Args:
var: A `Tensor` of type `resource`.
accum: A `Tensor` of type `resource`. Should be from a Variable().
accum_update: A `Tensor` of type `resource`.
: Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Learning rate. Must be a scalar.
rho: A `Tensor`. Must have the same type as `lr`.
Decay factor. Must be a scalar.
epsilon: A `Tensor`. Must have the same type as `lr`.
Constant factor. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyAdadelta", var=var, accum=accum,
accum_update=accum_update, lr=lr, rho=rho, epsilon=epsilon, grad=grad,
indices=indices, use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, rho, epsilon, grad], _ctx)
(lr, rho, epsilon, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
accum_update = _ops.convert_to_tensor(accum_update, _dtypes.resource)
_inputs_flat = [var, accum, accum_update, lr, rho, epsilon, grad, indices]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyAdadelta", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
def resource_sparse_apply_adagrad(var, accum, lr, grad, indices, use_locking=False, name=None):
r"""Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
That is for rows we have grad for, we update var and accum as follows:
accum += grad * grad
var -= lr * grad * (1 / sqrt(accum))
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Learning rate. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyAdagrad", var=var, accum=accum, lr=lr, grad=grad,
indices=indices, use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, grad], _ctx)
(lr, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
_inputs_flat = [var, accum, lr, grad, indices]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyAdagrad", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
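The sparse variant touches only the rows named in `indices`; a minimal NumPy sketch of that per-row update (illustrative — duplicate-index handling here is sequential and is not a claim about the op's kernel):

```python
import numpy as np

def sparse_adagrad_update(var, accum, lr, grad, indices):
    """Adagrad applied only to the rows listed in `indices`; other rows untouched."""
    for g, idx in zip(grad, indices):
        accum[idx] += g * g
        var[idx] -= lr * g / np.sqrt(accum[idx])
    return var, accum

# Only row 1 is updated: accum[1] = 4, var[1] = 1 - 0.5 * 2 / 2 = 0.5.
var = np.array([1.0, 1.0])
accum = np.array([0.0, 0.0])
var, accum = sparse_adagrad_update(var, accum, lr=0.5,
                                   grad=np.array([2.0]), indices=[1])
```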
def resource_sparse_apply_adagrad_da(var, gradient_accumulator, gradient_squared_accumulator, grad, indices, lr, l1, l2, global_step, use_locking=False, name=None):
r"""Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
gradient_accumulator: A `Tensor` of type `resource`.
Should be from a Variable().
gradient_squared_accumulator: A `Tensor` of type `resource`.
Should be from a Variable().
grad: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
lr: A `Tensor`. Must have the same type as `grad`.
Learning rate. Must be a scalar.
l1: A `Tensor`. Must have the same type as `grad`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `grad`.
L2 regularization. Must be a scalar.
global_step: A `Tensor` of type `int64`.
Training step number. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyAdagradDA", var=var,
gradient_accumulator=gradient_accumulator,
gradient_squared_accumulator=gradient_squared_accumulator, grad=grad,
indices=indices, lr=lr, l1=l1, l2=l2, global_step=global_step,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([grad, lr, l1, l2], _ctx)
(grad, lr, l1, l2) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
gradient_accumulator = _ops.convert_to_tensor(gradient_accumulator, _dtypes.resource)
gradient_squared_accumulator = _ops.convert_to_tensor(gradient_squared_accumulator, _dtypes.resource)
global_step = _ops.convert_to_tensor(global_step, _dtypes.int64)
_inputs_flat = [var, gradient_accumulator, gradient_squared_accumulator, grad, indices, lr, l1, l2, global_step]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyAdagradDA", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
def resource_sparse_apply_centered_rms_prop(var, mg, ms, mom, lr, rho, momentum, epsilon, grad, indices, use_locking=False, name=None):
r"""Update '*var' according to the centered RMSProp algorithm.
The centered RMSProp algorithm uses an estimate of the centered second moment
(i.e., the variance) for normalization, as opposed to regular RMSProp, which
uses the (uncentered) second moment. This often helps with training, but is
slightly more expensive in terms of computation and memory.
Note that in dense implementation of this algorithm, mg, ms, and mom will
update even if the grad is zero, but in this sparse implementation, mg, ms,
and mom will not update in iterations during which the grad is zero.
mean_square = decay * mean_square + (1-decay) * gradient ** 2
mean_grad = decay * mean_grad + (1-decay) * gradient
Delta = learning_rate * gradient / sqrt(mean_square + epsilon - mean_grad ** 2)
ms <- rho * ms_{t-1} + (1-rho) * grad * grad
mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
var <- var - mom
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
mg: A `Tensor` of type `resource`. Should be from a Variable().
ms: A `Tensor` of type `resource`. Should be from a Variable().
mom: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
rho: A `Tensor`. Must have the same type as `lr`.
Decay rate. Must be a scalar.
momentum: A `Tensor`. Must have the same type as `lr`.
epsilon: A `Tensor`. Must have the same type as `lr`.
Ridge term. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var, ms and mom.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var, mg, ms, and mom tensors is
protected by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyCenteredRMSProp", var=var, mg=mg, ms=ms, mom=mom,
lr=lr, rho=rho, momentum=momentum, epsilon=epsilon, grad=grad,
indices=indices, use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, rho, momentum, epsilon, grad], _ctx)
(lr, rho, momentum, epsilon, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
mg = _ops.convert_to_tensor(mg, _dtypes.resource)
ms = _ops.convert_to_tensor(ms, _dtypes.resource)
mom = _ops.convert_to_tensor(mom, _dtypes.resource)
_inputs_flat = [var, mg, ms, mom, lr, rho, momentum, epsilon, grad, indices]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyCenteredRMSProp", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
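The centered RMSProp update quoted in the docstring above can be checked against a minimal pure-Python sketch. The helper below operates on one selected row as plain floats; its name, and the deliberately large `epsilon` used in the example (chosen only to keep the arithmetic exact), are illustrative and not part of this module.

```python
import math

def centered_rms_prop_row(var, mg, ms, mom, lr, rho, momentum, epsilon, grad):
    # One row's worth of the centered RMSProp update from the docstring:
    # ms tracks the (uncentered) second moment, mg the first moment, and
    # the step is normalized by the centered estimate ms - mg**2.
    ms = rho * ms + (1.0 - rho) * grad * grad
    mg = rho * mg + (1.0 - rho) * grad
    mom = momentum * mom + lr * grad / math.sqrt(ms - mg * mg + epsilon)
    var = var - mom
    return var, mg, ms, mom
```

For example, with `var=1.0`, zero accumulators, `lr=0.4`, `rho=0.5`, `momentum=0.0`, `epsilon=3.0`, and `grad=2.0`, the denominator is `sqrt(2 - 1 + 3) = 2`, so `var` moves to `0.6`.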
def resource_sparse_apply_ftrl(var, accum, linear, grad, indices, lr, l1, l2, lr_power, use_locking=False, name=None):
r"""Update relevant entries in '*var' according to the Ftrl-proximal scheme.
That is, for rows we have grad for, we update var, accum and linear as follows:
accum_new = accum + grad * grad
linear += grad + (accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
accum = accum_new
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
linear: A `Tensor` of type `resource`. Should be from a Variable().
grad: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
lr: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `grad`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `grad`.
L2 regularization. Must be a scalar.
lr_power: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyFtrl", var=var, accum=accum, linear=linear,
grad=grad, indices=indices, lr=lr, l1=l1, l2=l2, lr_power=lr_power,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([grad, lr, l1, l2, lr_power], _ctx)
(grad, lr, l1, l2, lr_power) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
linear = _ops.convert_to_tensor(linear, _dtypes.resource)
_inputs_flat = [var, accum, linear, grad, indices, lr, l1, l2, lr_power]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyFtrl", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
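The Ftrl-proximal pseudocode quoted in the docstring above transcribes directly into pure Python for a single row. The helper name and plain-float arguments below are illustrative only, not part of the generated module.

```python
def ftrl_row(var, accum, linear, grad, lr, l1, l2, lr_power):
    # Literal transcription of the docstring's per-row update.
    accum_new = accum + grad * grad
    linear = linear + grad + (accum_new ** -lr_power - accum ** -lr_power) / lr * var
    quadratic = 1.0 / (accum_new ** lr_power * lr) + 2.0 * l2
    if abs(linear) > l1:
        sign = 1.0 if linear >= 0 else -1.0
        var = (sign * l1 - linear) / quadratic
    else:
        var = 0.0  # l1 shrinks small coordinates exactly to zero
    return var, accum_new, linear
```

With `var=1.0`, `accum=1.0`, `linear=0.0`, `grad=2.0`, `lr=1.0`, `l1=0.5`, `l2=0.0`, and `lr_power=-1.0`, this gives `accum_new=5`, `linear=6`, `quadratic=5`, and `var=(0.5-6)/5 = -1.1`.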
def resource_sparse_apply_ftrl_v2(var, accum, linear, grad, indices, lr, l1, l2, l2_shrinkage, lr_power, use_locking=False, name=None):
r"""Update relevant entries in '*var' according to the Ftrl-proximal scheme.
That is, for rows we have grad for, we update var, accum and linear as follows:
grad_with_shrinkage = grad + 2 * l2_shrinkage * var
accum_new = accum + grad_with_shrinkage * grad_with_shrinkage
linear += grad_with_shrinkage +
(accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
accum = accum_new
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
linear: A `Tensor` of type `resource`. Should be from a Variable().
grad: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
lr: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `grad`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `grad`.
L2 regularization. Must be a scalar.
l2_shrinkage: A `Tensor`. Must have the same type as `grad`.
L2 shrinkage regularization. Must be a scalar.
lr_power: A `Tensor`. Must have the same type as `grad`.
Scaling factor. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyFtrlV2", var=var, accum=accum, linear=linear,
grad=grad, indices=indices, lr=lr, l1=l1, l2=l2,
l2_shrinkage=l2_shrinkage, lr_power=lr_power, use_locking=use_locking,
name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([grad, lr, l1, l2, l2_shrinkage, lr_power], _ctx)
(grad, lr, l1, l2, l2_shrinkage, lr_power) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
linear = _ops.convert_to_tensor(linear, _dtypes.resource)
_inputs_flat = [var, accum, linear, grad, indices, lr, l1, l2, l2_shrinkage, lr_power]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyFtrlV2", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
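The V2 variant's docstring above differs from plain Ftrl only in folding a shrinkage term into the gradient before accumulating. A self-contained pure-Python sketch of that per-row rule (function name and float arguments are illustrative, not part of this module):

```python
def ftrl_v2_row(var, accum, linear, grad, lr, l1, l2, l2_shrinkage, lr_power):
    # grad_with_shrinkage is the only difference from the non-V2 rule.
    g = grad + 2.0 * l2_shrinkage * var
    accum_new = accum + g * g
    linear = linear + g + (accum_new ** -lr_power - accum ** -lr_power) / lr * var
    quadratic = 1.0 / (accum_new ** lr_power * lr) + 2.0 * l2
    if abs(linear) > l1:
        sign = 1.0 if linear >= 0 else -1.0
        var = (sign * l1 - linear) / quadratic
    else:
        var = 0.0
    return var, accum_new, linear
```

With `var=1.0`, `accum=1.0`, `linear=0.0`, `grad=2.0`, `lr=1.0`, `l1=0.5`, `l2=0.0`, `l2_shrinkage=0.5`, and `lr_power=-1.0`: the shrunken gradient is `3`, so `accum_new=10`, `linear=12`, `quadratic=10`, and `var=(0.5-12)/10 = -1.15`.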
def resource_sparse_apply_momentum(var, accum, lr, grad, indices, momentum, use_locking=False, use_nesterov=False, name=None):
r"""Update relevant entries in '*var' and '*accum' according to the momentum scheme.
Set use_nesterov = True if you want to use Nesterov momentum.
That is, for rows we have grad for, we update var and accum as follows:
accum = accum * momentum + grad
var -= lr * accum
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Learning rate. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
momentum: A `Tensor`. Must have the same type as `lr`.
Momentum. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
use_nesterov: An optional `bool`. Defaults to `False`.
If `True`, the tensor passed to compute grad will be
var - lr * momentum * accum, so in the end, the var you get is actually
var - lr * momentum * accum.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
if use_nesterov is None:
use_nesterov = False
use_nesterov = _execute.make_bool(use_nesterov, "use_nesterov")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyMomentum", var=var, accum=accum, lr=lr, grad=grad,
indices=indices, momentum=momentum, use_locking=use_locking,
use_nesterov=use_nesterov, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, grad, momentum], _ctx)
(lr, grad, momentum) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
_inputs_flat = [var, accum, lr, grad, indices, momentum]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking, "use_nesterov", use_nesterov)
_result = _execute.execute(b"ResourceSparseApplyMomentum", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
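The two-line momentum rule quoted in the docstring above, applied only to the rows named in `indices`, can be sketched in pure Python. Plain lists stand in for tensors here; the function itself is hypothetical, not part of the generated module.

```python
def sparse_momentum_rows(var, accum, lr, grad, indices, momentum):
    # Only rows listed in `indices` are touched; every other row keeps
    # its previous value and its stale accumulator.
    for g, i in zip(grad, indices):
        accum[i] = accum[i] * momentum + g
        var[i] = var[i] - lr * accum[i]
    return var, accum
```

For example, updating only row 1 of a two-row variable with `lr=0.1`, `momentum=0.9`, `grad=[2.0]`, and a prior `accum[1]=0.5` yields `accum[1]=2.45` and `var[1]=1.0-0.245=0.755`, while row 0 is untouched.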
def resource_sparse_apply_proximal_adagrad(var, accum, lr, l1, l2, grad, indices, use_locking=False, name=None):
r"""Sparse update entries in '*var' and '*accum' according to the FOBOS algorithm.
That is, for rows we have grad for, we update var and accum as follows:
accum += grad * grad
prox_v = var
prox_v -= lr * grad * (1 / sqrt(accum))
var = sign(prox_v)/(1+lr*l2) * max{|prox_v|-lr*l1,0}
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
accum: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Learning rate. Must be a scalar.
l1: A `Tensor`. Must have the same type as `lr`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `lr`.
L2 regularization. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyProximalAdagrad", var=var, accum=accum, lr=lr,
l1=l1, l2=l2, grad=grad, indices=indices, use_locking=use_locking,
name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, l1, l2, grad], _ctx)
(lr, l1, l2, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
accum = _ops.convert_to_tensor(accum, _dtypes.resource)
_inputs_flat = [var, accum, lr, l1, l2, grad, indices]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyProximalAdagrad", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
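The four-line proximal-Adagrad rule quoted in the docstring above can be sketched per row in pure Python; the helper name and plain-float arguments are illustrative, not part of this module.

```python
import math

def proximal_adagrad_row(var, accum, lr, l1, l2, grad):
    # Adagrad step followed by the FOBOS proximal (soft-threshold) step,
    # exactly as written in the docstring.
    accum = accum + grad * grad
    prox_v = var - lr * grad * (1.0 / math.sqrt(accum))
    sign = 1.0 if prox_v >= 0 else -1.0
    var = sign / (1.0 + lr * l2) * max(abs(prox_v) - lr * l1, 0.0)
    return var, accum
```

With `var=1.0`, `accum=0.0`, `lr=0.5`, `l1=0.2`, `l2=0.0`, and `grad=2.0`: `accum` becomes `4`, the proximal point is `1 - 0.5*2*(1/2) = 0.5`, and the l1 threshold `lr*l1 = 0.1` shrinks it to `0.4`.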
def resource_sparse_apply_proximal_gradient_descent(var, alpha, l1, l2, grad, indices, use_locking=False, name=None):
r"""Sparse update '*var' as the FOBOS algorithm with a fixed learning rate.
That is, for rows we have grad for, we update var as follows:
prox_v = var - alpha * grad
var = sign(prox_v)/(1+alpha*l2) * max{|prox_v|-alpha*l1,0}
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
alpha: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `alpha`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `alpha`.
L2 regularization. Must be a scalar.
grad: A `Tensor`. Must have the same type as `alpha`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var.
use_locking: An optional `bool`. Defaults to `False`.
If True, the subtraction will be protected by a lock;
otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyProximalGradientDescent", var=var, alpha=alpha,
l1=l1, l2=l2, grad=grad, indices=indices, use_locking=use_locking,
name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([alpha, l1, l2, grad], _ctx)
(alpha, l1, l2, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
_inputs_flat = [var, alpha, l1, l2, grad, indices]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyProximalGradientDescent",
0, inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
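The fixed-learning-rate FOBOS rule quoted in the docstring above is a one-step gradient move followed by soft-thresholding. A minimal per-row sketch (the helper name is illustrative, not part of this module):

```python
def proximal_gd_row(var, alpha, l1, l2, grad):
    # Gradient step, then the closed-form proximal operator of
    # alpha*(l1*|x| + l2/2*x^2): soft-threshold and rescale.
    prox_v = var - alpha * grad
    sign = 1.0 if prox_v >= 0 else -1.0
    return sign / (1.0 + alpha * l2) * max(abs(prox_v) - alpha * l1, 0.0)
```

Note the l1 term can zero a coordinate outright: with `alpha=0.1`, `l1=1.0`, `l2=0.0`, `grad=2.0`, a row at `1.0` moves to `0.7`, while a row already at `0.1` lands exactly on `0.0`.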
def resource_sparse_apply_rms_prop(var, ms, mom, lr, rho, momentum, epsilon, grad, indices, use_locking=False, name=None):
r"""Update '*var' according to the RMSProp algorithm.
Note that in the dense implementation of this algorithm, ms and mom will
update even if the grad is zero, but in this sparse implementation, ms
and mom will not update in iterations during which the grad is zero.
mean_square = decay * mean_square + (1-decay) * gradient ** 2
Delta = learning_rate * gradient / sqrt(mean_square + epsilon)
ms <- rho * ms_{t-1} + (1-rho) * grad * grad
mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
var <- var - mom
Args:
var: A `Tensor` of type `resource`. Should be from a Variable().
ms: A `Tensor` of type `resource`. Should be from a Variable().
mom: A `Tensor` of type `resource`. Should be from a Variable().
lr: A `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Scaling factor. Must be a scalar.
rho: A `Tensor`. Must have the same type as `lr`.
Decay rate. Must be a scalar.
momentum: A `Tensor`. Must have the same type as `lr`.
epsilon: A `Tensor`. Must have the same type as `lr`.
Ridge term. Must be a scalar.
grad: A `Tensor`. Must have the same type as `lr`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var, ms and mom.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var, ms, and mom tensors is protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
The created Operation.
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"ResourceSparseApplyRMSProp", var=var, ms=ms, mom=mom, lr=lr, rho=rho,
momentum=momentum, epsilon=epsilon, grad=grad, indices=indices,
use_locking=use_locking, name=name)
return _op
else:
_attr_T, _inputs_T = _execute.args_to_matching_eager([lr, rho, momentum, epsilon, grad], _ctx)
(lr, rho, momentum, epsilon, grad) = _inputs_T
_attr_T = _attr_T.as_datatype_enum
_attr_Tindices, (indices,) = _execute.args_to_matching_eager([indices], _ctx)
_attr_Tindices = _attr_Tindices.as_datatype_enum
var = _ops.convert_to_tensor(var, _dtypes.resource)
ms = _ops.convert_to_tensor(ms, _dtypes.resource)
mom = _ops.convert_to_tensor(mom, _dtypes.resource)
_inputs_flat = [var, ms, mom, lr, rho, momentum, epsilon, grad, indices]
_attrs = ("T", _attr_T, "Tindices", _attr_Tindices, "use_locking",
use_locking)
_result = _execute.execute(b"ResourceSparseApplyRMSProp", 0,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
return _result
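The (uncentered) RMSProp rule quoted in the docstring above, for one selected row, in pure Python; the helper name and plain-float arguments are illustrative, not part of this module.

```python
import math

def rms_prop_row(var, ms, mom, lr, rho, momentum, epsilon, grad):
    # ms <- rho * ms + (1-rho) * grad^2
    # mom <- momentum * mom + lr * grad / sqrt(ms + epsilon)
    # var <- var - mom
    ms = rho * ms + (1.0 - rho) * grad * grad
    mom = momentum * mom + lr * grad / math.sqrt(ms + epsilon)
    var = var - mom
    return var, ms, mom
```

With `var=1.0`, zero accumulators, `lr=0.3`, `rho=0.5`, `momentum=0.0`, `epsilon=0.25`, and `grad=2.0`: `ms` becomes `2`, the denominator is `sqrt(2.25)=1.5`, so `mom=0.4` and `var` moves to `0.6`.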
def sparse_apply_adadelta(var, accum, accum_update, lr, rho, epsilon, grad, indices, use_locking=False, name=None):
r"""Update relevant entries in '*var' and '*accum' according to the adadelta scheme.
Args:
var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Should be from a Variable().
accum: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
accum_update: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
lr: A `Tensor`. Must have the same type as `var`.
Learning rate. Must be a scalar.
rho: A `Tensor`. Must have the same type as `var`.
Decay factor. Must be a scalar.
epsilon: A `Tensor`. Must have the same type as `var`.
Constant factor. Must be a scalar.
grad: A `Tensor`. Must have the same type as `var`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
A mutable `Tensor`. Has the same type as `var`. Same as "var".
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"SparseApplyAdadelta", var=var, accum=accum,
accum_update=accum_update, lr=lr, rho=rho, epsilon=epsilon, grad=grad,
indices=indices, use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
"use_locking", _op.get_attr("use_locking"))
else:
raise RuntimeError(
"sparse_apply_adadelta op does not support eager execution. Arg 'out' is a ref.")
_execute.record_gradient(
"SparseApplyAdadelta", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
def sparse_apply_adagrad(var, accum, lr, grad, indices, use_locking=False, name=None):
r"""Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
That is, for rows we have grad for, we update var and accum as follows:
accum += grad * grad
var -= lr * grad * (1 / sqrt(accum))
Args:
var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Should be from a Variable().
accum: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
lr: A `Tensor`. Must have the same type as `var`.
Learning rate. Must be a scalar.
grad: A `Tensor`. Must have the same type as `var`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
A mutable `Tensor`. Has the same type as `var`. Same as "var".
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"SparseApplyAdagrad", var=var, accum=accum, lr=lr, grad=grad,
indices=indices, use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
"use_locking", _op.get_attr("use_locking"))
else:
raise RuntimeError(
"sparse_apply_adagrad op does not support eager execution. Arg 'out' is a ref.")
_execute.record_gradient(
"SparseApplyAdagrad", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
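The sparse Adagrad rule quoted in the docstring above touches only the rows named in `indices`. A pure-Python sketch with plain lists standing in for tensors (the function itself is hypothetical, not part of the generated module):

```python
import math

def sparse_adagrad_rows(var, accum, lr, grad, indices):
    # accum += grad^2 and var -= lr * grad / sqrt(accum), row by row.
    for g, i in zip(grad, indices):
        accum[i] = accum[i] + g * g
        var[i] = var[i] - lr * g / math.sqrt(accum[i])
    return var, accum
```

Updating only row 1 with `lr=0.5` and `grad=[2.0]` from zero accumulators gives `accum[1]=4` and `var[1]=1 - 0.5*2/2 = 0.5`; row 0 is untouched.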
def sparse_apply_adagrad_da(var, gradient_accumulator, gradient_squared_accumulator, grad, indices, lr, l1, l2, global_step, use_locking=False, name=None):
r"""Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
Args:
var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Should be from a Variable().
gradient_accumulator: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
gradient_squared_accumulator: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
grad: A `Tensor`. Must have the same type as `var`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
lr: A `Tensor`. Must have the same type as `var`.
Learning rate. Must be a scalar.
l1: A `Tensor`. Must have the same type as `var`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `var`.
L2 regularization. Must be a scalar.
global_step: A `Tensor` of type `int64`.
Training step number. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If True, updating of the var and accum tensors will be protected by
a lock; otherwise the behavior is undefined, but may exhibit less contention.
name: A name for the operation (optional).
Returns:
A mutable `Tensor`. Has the same type as `var`. Same as "var".
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"SparseApplyAdagradDA", var=var,
gradient_accumulator=gradient_accumulator,
gradient_squared_accumulator=gradient_squared_accumulator, grad=grad,
indices=indices, lr=lr, l1=l1, l2=l2, global_step=global_step,
use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
"use_locking", _op.get_attr("use_locking"))
else:
raise RuntimeError(
"sparse_apply_adagrad_da op does not support eager execution. Arg 'out' is a ref.")
_execute.record_gradient(
"SparseApplyAdagradDA", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
def sparse_apply_centered_rms_prop(var, mg, ms, mom, lr, rho, momentum, epsilon, grad, indices, use_locking=False, name=None):
r"""Update '*var' according to the centered RMSProp algorithm.
The centered RMSProp algorithm uses an estimate of the centered second moment
(i.e., the variance) for normalization, as opposed to regular RMSProp, which
uses the (uncentered) second moment. This often helps with training, but is
slightly more expensive in terms of computation and memory.
Note that in the dense implementation of this algorithm, mg, ms, and mom will
update even if the grad is zero, but in this sparse implementation, mg, ms,
and mom will not update in iterations during which the grad is zero.
mean_square = decay * mean_square + (1-decay) * gradient ** 2
mean_grad = decay * mean_grad + (1-decay) * gradient
Delta = learning_rate * gradient / sqrt(mean_square + epsilon - mean_grad ** 2)
ms <- rho * ms_{t-1} + (1-rho) * grad * grad
mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
var <- var - mom
Args:
var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Should be from a Variable().
mg: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
ms: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
mom: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
lr: A `Tensor`. Must have the same type as `var`.
Scaling factor. Must be a scalar.
rho: A `Tensor`. Must have the same type as `var`.
Decay rate. Must be a scalar.
momentum: A `Tensor`. Must have the same type as `var`.
epsilon: A `Tensor`. Must have the same type as `var`.
Ridge term. Must be a scalar.
grad: A `Tensor`. Must have the same type as `var`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var, ms and mom.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var, mg, ms, and mom tensors is
protected by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
A mutable `Tensor`. Has the same type as `var`. Same as "var".
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"SparseApplyCenteredRMSProp", var=var, mg=mg, ms=ms, mom=mom, lr=lr,
rho=rho, momentum=momentum, epsilon=epsilon, grad=grad,
indices=indices, use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
"use_locking", _op.get_attr("use_locking"))
else:
raise RuntimeError(
"sparse_apply_centered_rms_prop op does not support eager execution. Arg 'out' is a ref.")
_execute.record_gradient(
"SparseApplyCenteredRMSProp", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
def sparse_apply_ftrl(var, accum, linear, grad, indices, lr, l1, l2, lr_power, use_locking=False, name=None):
r"""Update relevant entries in '*var' according to the Ftrl-proximal scheme.
That is, for rows we have grad for, we update var, accum and linear as follows:
accum_new = accum + grad * grad
linear += grad + (accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
accum = accum_new
Args:
var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Should be from a Variable().
accum: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
linear: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
grad: A `Tensor`. Must have the same type as `var`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
lr: A `Tensor`. Must have the same type as `var`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `var`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `var`.
L2 regularization. Must be a scalar.
lr_power: A `Tensor`. Must have the same type as `var`.
Scaling factor. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
A mutable `Tensor`. Has the same type as `var`. Same as "var".
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"SparseApplyFtrl", var=var, accum=accum, linear=linear, grad=grad,
indices=indices, lr=lr, l1=l1, l2=l2, lr_power=lr_power,
use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
"use_locking", _op.get_attr("use_locking"))
else:
raise RuntimeError(
"sparse_apply_ftrl op does not support eager execution. Arg 'out' is a ref.")
_execute.record_gradient(
"SparseApplyFtrl", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
def sparse_apply_ftrl_v2(var, accum, linear, grad, indices, lr, l1, l2, l2_shrinkage, lr_power, use_locking=False, name=None):
r"""Update relevant entries in '*var' according to the Ftrl-proximal scheme.
That is, for rows we have grad for, we update var, accum and linear as follows:
grad_with_shrinkage = grad + 2 * l2_shrinkage * var
accum_new = accum + grad_with_shrinkage * grad_with_shrinkage
linear += grad_with_shrinkage +
(accum_new^(-lr_power) - accum^(-lr_power)) / lr * var
quadratic = 1.0 / (accum_new^(lr_power) * lr) + 2 * l2
var = (sign(linear) * l1 - linear) / quadratic if |linear| > l1 else 0.0
accum = accum_new
Args:
var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
Should be from a Variable().
accum: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
linear: A mutable `Tensor`. Must have the same type as `var`.
Should be from a Variable().
grad: A `Tensor`. Must have the same type as `var`. The gradient.
indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
A vector of indices into the first dimension of var and accum.
lr: A `Tensor`. Must have the same type as `var`.
Scaling factor. Must be a scalar.
l1: A `Tensor`. Must have the same type as `var`.
L1 regularization. Must be a scalar.
l2: A `Tensor`. Must have the same type as `var`.
L2 shrinkage regulariation. Must be a scalar.
l2_shrinkage: A `Tensor`. Must have the same type as `var`.
lr_power: A `Tensor`. Must have the same type as `var`.
Scaling factor. Must be a scalar.
use_locking: An optional `bool`. Defaults to `False`.
If `True`, updating of the var and accum tensors will be protected
by a lock; otherwise the behavior is undefined, but may exhibit less
contention.
name: A name for the operation (optional).
Returns:
A mutable `Tensor`. Has the same type as `var`. Same as "var".
"""
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_ctx = _context.context()
if _ctx.in_graph_mode():
_, _, _op = _op_def_lib._apply_op_helper(
"SparseApplyFtrlV2", var=var, accum=accum, linear=linear, grad=grad,
indices=indices, lr=lr, l1=l1, l2=l2, l2_shrinkage=l2_shrinkage,
lr_power=lr_power, use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
"use_locking", _op.get_attr("use_locking"))
else:
raise RuntimeError(
"sparse_apply_ftrl_v2 op does not support eager execution. Arg 'out'' is a ref.")
_execute.record_gradient(
"SparseApplyFtrlV2", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
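The V2 variant differs from plain Ftrl only in that the shrinkage term is folded into the gradient before the accumulator and linear updates. A minimal NumPy sketch of the docstring math (illustrative only; function name and looping style are assumptions, not the generated op):

```python
import numpy as np

def sparse_ftrl_v2_reference(var, accum, linear, grad, indices,
                             lr, l1, l2, l2_shrinkage, lr_power):
    # FtrlV2 update per the docstring: shrinkage enters the gradient first.
    for i, row in enumerate(indices):
        g_shr = grad[i] + 2.0 * l2_shrinkage * var[row]
        accum_new = accum[row] + g_shr * g_shr
        linear[row] += g_shr + (accum_new ** -lr_power
                                - accum[row] ** -lr_power) / lr * var[row]
        quadratic = 1.0 / (accum_new ** lr_power * lr) + 2.0 * l2
        var[row] = ((np.sign(linear[row]) * l1 - linear[row]) / quadratic
                    if abs(linear[row]) > l1 else 0.0)
        accum[row] = accum_new
    return var

# With var starting at 0, the shrinkage term vanishes and the first step
# matches the plain Ftrl update.
var = np.array([0.0])
accum = np.array([1.0])
linear = np.array([0.0])
sparse_ftrl_v2_reference(var, accum, linear, grad=np.array([1.0]), indices=[0],
                         lr=1.0, l1=0.0, l2=0.0, l2_shrinkage=0.0,
                         lr_power=0.5)
```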

def sparse_apply_momentum(var, accum, lr, grad, indices, momentum, use_locking=False, use_nesterov=False, name=None):
  r"""Update relevant entries in '*var' and '*accum' according to the momentum scheme.

  Set use_nesterov = True if you want to use Nesterov momentum.

  That is for rows we have grad for, we update var and accum as follows:

      accum = accum * momentum + grad
      var -= lr * accum

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Learning rate. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
      A vector of indices into the first dimension of var and accum.
    momentum: A `Tensor`. Must have the same type as `var`.
      Momentum. Must be a scalar.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var and accum tensors will be protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    use_nesterov: An optional `bool`. Defaults to `False`.
      If `True`, the tensor passed to compute grad will be
      var - lr * momentum * accum, so in the end, the var you get is actually
      var - lr * momentum * accum.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  if use_nesterov is None:
    use_nesterov = False
  use_nesterov = _execute.make_bool(use_nesterov, "use_nesterov")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "SparseApplyMomentum", var=var, accum=accum, lr=lr, grad=grad,
        indices=indices, momentum=momentum, use_locking=use_locking,
        use_nesterov=use_nesterov, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
              "use_locking", _op.get_attr("use_locking"), "use_nesterov",
              _op.get_attr("use_nesterov"))
  else:
    raise RuntimeError(
        "sparse_apply_momentum op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "SparseApplyMomentum", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result
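The (non-Nesterov) momentum rule above is simple enough to state as a NumPy sketch. This is an illustrative reference for the documented math, not the generated op; the function name is an assumption:

```python
import numpy as np

def sparse_momentum_reference(var, accum, lr, grad, indices, momentum):
    # Per selected row: accum = accum * momentum + grad; var -= lr * accum.
    for i, row in enumerate(indices):
        accum[row] = accum[row] * momentum + grad[i]
        var[row] -= lr * accum[row]
    return var

# Only row 1 receives a gradient; row 0 stays put.
var = np.array([1.0, 1.0])
accum = np.array([0.0, 0.0])
sparse_momentum_reference(var, accum, lr=0.1, grad=np.array([0.5]),
                          indices=[1], momentum=0.9)
```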

def sparse_apply_proximal_adagrad(var, accum, lr, l1, l2, grad, indices, use_locking=False, name=None):
  r"""Sparse update entries in '*var' and '*accum' according to FOBOS algorithm.

  That is for rows we have grad for, we update var and accum as follows:

      accum += grad * grad
      prox_v = var
      prox_v -= lr * grad * (1 / sqrt(accum))
      var = sign(prox_v)/(1+lr*l2) * max{|prox_v|-lr*l1,0}

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    accum: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Learning rate. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 regularization. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
      A vector of indices into the first dimension of var and accum.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var and accum tensors will be protected by
      a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "SparseApplyProximalAdagrad", var=var, accum=accum, lr=lr, l1=l1,
        l2=l2, grad=grad, indices=indices, use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
              "use_locking", _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "sparse_apply_proximal_adagrad op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "SparseApplyProximalAdagrad", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result
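The FOBOS-with-Adagrad-scaling step above amounts to a gradient step with a per-slot learning rate followed by soft-thresholding. A hedged NumPy sketch of the documented math (function name assumed, not the generated op):

```python
import numpy as np

def sparse_proximal_adagrad_reference(var, accum, lr, l1, l2, grad, indices):
    # FOBOS step with an Adagrad-scaled learning rate on the selected rows.
    for i, row in enumerate(indices):
        accum[row] += grad[i] ** 2
        prox = var[row] - lr * grad[i] / np.sqrt(accum[row])
        # Soft-threshold by lr*l1, then shrink by the l2 factor.
        var[row] = (np.sign(prox) / (1.0 + lr * l2)
                    * max(abs(prox) - lr * l1, 0.0))
    return var

var = np.array([2.0])
accum = np.array([0.0])
sparse_proximal_adagrad_reference(var, accum, lr=1.0, l1=0.0, l2=0.0,
                                  grad=np.array([1.0]), indices=[0])
```

With l1 = l2 = 0 this degenerates to a plain Adagrad step, which makes the proximal part easy to sanity-check.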

def sparse_apply_proximal_gradient_descent(var, alpha, l1, l2, grad, indices, use_locking=False, name=None):
  r"""Sparse update '*var' as FOBOS algorithm with fixed learning rate.

  That is for rows we have grad for, we update var as follows:

      prox_v = var - alpha * grad
      var = sign(prox_v)/(1+alpha*l2) * max{|prox_v|-alpha*l1,0}

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    alpha: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    l1: A `Tensor`. Must have the same type as `var`.
      L1 regularization. Must be a scalar.
    l2: A `Tensor`. Must have the same type as `var`.
      L2 regularization. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
      A vector of indices into the first dimension of var and accum.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, the subtraction will be protected by a lock;
      otherwise the behavior is undefined, but may exhibit less contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "SparseApplyProximalGradientDescent", var=var, alpha=alpha, l1=l1,
        l2=l2, grad=grad, indices=indices, use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
              "use_locking", _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "sparse_apply_proximal_gradient_descent op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "SparseApplyProximalGradientDescent", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result
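This is the same proximal step as the Adagrad variant above, but with a fixed learning rate `alpha` and no accumulator. A NumPy sketch of the documented math (function name assumed, not the generated op):

```python
import numpy as np

def sparse_proximal_gd_reference(var, alpha, l1, l2, grad, indices):
    # One FOBOS step with a fixed learning rate on the selected rows.
    for i, row in enumerate(indices):
        prox = var[row] - alpha * grad[i]
        var[row] = (np.sign(prox) / (1.0 + alpha * l2)
                    * max(abs(prox) - alpha * l1, 0.0))
    return var

var = np.array([1.0])
sparse_proximal_gd_reference(var, alpha=0.5, l1=0.1, l2=0.0,
                             grad=np.array([1.0]), indices=[0])
```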

def sparse_apply_rms_prop(var, ms, mom, lr, rho, momentum, epsilon, grad, indices, use_locking=False, name=None):
  r"""Update '*var' according to the RMSProp algorithm.

  Note that in the dense implementation of this algorithm, ms and mom will
  update even if the grad is zero, but in this sparse implementation, ms
  and mom will not update in iterations during which the grad is zero.

      mean_square = decay * mean_square + (1-decay) * gradient ** 2
      Delta = learning_rate * gradient / sqrt(mean_square + epsilon)

      ms <- rho * ms_{t-1} + (1-rho) * grad * grad
      mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
      var <- var - mom

  Args:
    var: A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int64`, `int32`, `uint8`, `uint16`, `int16`, `int8`, `complex64`, `complex128`, `qint8`, `quint8`, `qint32`, `half`.
      Should be from a Variable().
    ms: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    mom: A mutable `Tensor`. Must have the same type as `var`.
      Should be from a Variable().
    lr: A `Tensor`. Must have the same type as `var`.
      Scaling factor. Must be a scalar.
    rho: A `Tensor`. Must have the same type as `var`.
      Decay rate. Must be a scalar.
    momentum: A `Tensor`. Must have the same type as `var`.
    epsilon: A `Tensor`. Must have the same type as `var`.
      Ridge term. Must be a scalar.
    grad: A `Tensor`. Must have the same type as `var`. The gradient.
    indices: A `Tensor`. Must be one of the following types: `int32`, `int64`.
      A vector of indices into the first dimension of var, ms and mom.
    use_locking: An optional `bool`. Defaults to `False`.
      If `True`, updating of the var, ms, and mom tensors is protected
      by a lock; otherwise the behavior is undefined, but may exhibit less
      contention.
    name: A name for the operation (optional).

  Returns:
    A mutable `Tensor`. Has the same type as `var`. Same as "var".
  """
  if use_locking is None:
    use_locking = False
  use_locking = _execute.make_bool(use_locking, "use_locking")
  _ctx = _context.context()
  if _ctx.in_graph_mode():
    _, _, _op = _op_def_lib._apply_op_helper(
        "SparseApplyRMSProp", var=var, ms=ms, mom=mom, lr=lr, rho=rho,
        momentum=momentum, epsilon=epsilon, grad=grad, indices=indices,
        use_locking=use_locking, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "Tindices", _op.get_attr("Tindices"),
              "use_locking", _op.get_attr("use_locking"))
  else:
    raise RuntimeError(
        "sparse_apply_rms_prop op does not support eager execution. Arg 'out' is a ref.")
  _execute.record_gradient(
      "SparseApplyRMSProp", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result
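The key sparse-vs-dense distinction noted in the docstring is that `ms` and `mom` decay only for rows that actually receive a gradient. A NumPy sketch of the documented update (illustrative only; function name assumed, not the generated op):

```python
import numpy as np

def sparse_rms_prop_reference(var, ms, mom, lr, rho, momentum, epsilon,
                              grad, indices):
    # ms and mom are only touched for rows that actually receive a gradient.
    for i, row in enumerate(indices):
        g = grad[i]
        ms[row] = rho * ms[row] + (1.0 - rho) * g * g
        mom[row] = momentum * mom[row] + lr * g / np.sqrt(ms[row] + epsilon)
        var[row] -= mom[row]
    return var

# Row 0 gets a gradient; row 1's var, ms and mom are untouched.
var = np.array([1.0, 1.0])
ms = np.array([0.0, 0.0])
mom = np.array([0.0, 0.0])
sparse_rms_prop_reference(var, ms, mom, lr=1.0, rho=0.0, momentum=0.0,
                          epsilon=0.0, grad=np.array([2.0]), indices=[0])
```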

def _InitOpDefLibrary(op_list_proto_bytes):
  op_list = _op_def_pb2.OpList()
  op_list.ParseFromString(op_list_proto_bytes)
  _op_def_registry.register_op_list(op_list)
  op_def_lib = _op_def_library.OpDefLibrary()
  op_def_lib.add_op_list(op_list)
  return op_def_lib

# op {
# name: "ApplyAdadelta"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum_update"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyAdagrad"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyAdagradDA"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "gradient_accumulator"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "gradient_squared_accumulator"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "global_step"
# type: DT_INT64
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyAdam"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "m"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "v"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "beta1_power"
# type_attr: "T"
# }
# input_arg {
# name: "beta2_power"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "beta1"
# type_attr: "T"
# }
# input_arg {
# name: "beta2"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "use_nesterov"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyCenteredRMSProp"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "mg"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "ms"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "mom"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyFtrl"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "linear"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyFtrlV2"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "linear"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "l2_shrinkage"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyGradientDescent"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "alpha"
# type_attr: "T"
# }
# input_arg {
# name: "delta"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyMomentum"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "use_nesterov"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyProximalAdagrad"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyProximalGradientDescent"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "alpha"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "delta"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ApplyRMSProp"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "ms"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "mom"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "ResourceApplyAdadelta"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum_update"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyAdagrad"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyAdagradDA"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "gradient_accumulator"
# type: DT_RESOURCE
# }
# input_arg {
# name: "gradient_squared_accumulator"
# type: DT_RESOURCE
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "global_step"
# type: DT_INT64
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyAdam"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "m"
# type: DT_RESOURCE
# }
# input_arg {
# name: "v"
# type: DT_RESOURCE
# }
# input_arg {
# name: "beta1_power"
# type_attr: "T"
# }
# input_arg {
# name: "beta2_power"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "beta1"
# type_attr: "T"
# }
# input_arg {
# name: "beta2"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "use_nesterov"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyCenteredRMSProp"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "mg"
# type: DT_RESOURCE
# }
# input_arg {
# name: "ms"
# type: DT_RESOURCE
# }
# input_arg {
# name: "mom"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyFtrl"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "linear"
# type: DT_RESOURCE
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyFtrlV2"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "linear"
# type: DT_RESOURCE
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "l2_shrinkage"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyGradientDescent"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "alpha"
# type_attr: "T"
# }
# input_arg {
# name: "delta"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyMomentum"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "use_nesterov"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyProximalAdagrad"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyProximalGradientDescent"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "alpha"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "delta"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceApplyRMSProp"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "ms"
# type: DT_RESOURCE
# }
# input_arg {
# name: "mom"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
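The `ResourceApplyRMSProp` signature above (var/ms/mom state plus lr, rho, momentum, epsilon, grad) corresponds to the standard RMSProp recurrence. A minimal pure-Python sketch, assuming dense one-dimensional state (the function name and list-based state are illustrative, not TF's kernel):

```python
import math

def apply_rms_prop(var, ms, mom, lr, rho, momentum, epsilon, grad):
    """One dense RMSProp step; returns updated (var, ms, mom) lists.
    ms tracks a decayed average of squared gradients; mom accumulates
    the scaled step with momentum."""
    new_var, new_ms, new_mom = [], [], []
    for v, m, mo, g in zip(var, ms, mom, grad):
        m2 = rho * m + (1.0 - rho) * g * g            # ms <- rho*ms + (1-rho)*g^2
        mo2 = momentum * mo + lr * g / math.sqrt(m2 + epsilon)
        new_ms.append(m2)
        new_mom.append(mo2)
        new_var.append(v - mo2)                       # var <- var - mom
    return new_var, new_ms, new_mom
```

The centered variant (`ResourceApplyCenteredRMSProp`, with the extra `mg` input) additionally tracks a decayed mean of gradients and divides by the estimated variance `ms - mg^2` instead of the raw second moment.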
# op {
# name: "ResourceSparseApplyAdadelta"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum_update"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyAdagrad"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
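Compared with the dense op, `ResourceSparseApplyAdagrad` adds an `indices` input (int32 or int64, per the `Tindices` attr) and touches only the selected rows of `var` and `accum`. A hedged plain-Python sketch of that row-wise update (helper name is mine; real kernels index rows of a tensor, not list elements):

```python
import math

def sparse_apply_adagrad(var, accum, lr, grad, indices):
    """Adagrad restricted to the rows named by indices:
    accum[i] += g^2; var[i] -= lr * g / sqrt(accum[i])."""
    for g, i in zip(grad, indices):
        accum[i] += g * g
        var[i] -= lr * g / math.sqrt(accum[i])
    return var, accum
```

Rows not named in `indices` are left untouched, which is what makes these sparse variants cheap for embedding-style updates where only a few rows receive gradients per step.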
# op {
# name: "ResourceSparseApplyAdagradDA"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "gradient_accumulator"
# type: DT_RESOURCE
# }
# input_arg {
# name: "gradient_squared_accumulator"
# type: DT_RESOURCE
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "global_step"
# type: DT_INT64
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyCenteredRMSProp"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "mg"
# type: DT_RESOURCE
# }
# input_arg {
# name: "ms"
# type: DT_RESOURCE
# }
# input_arg {
# name: "mom"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyFtrl"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "linear"
# type: DT_RESOURCE
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyFtrlV2"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "linear"
# type: DT_RESOURCE
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "l2_shrinkage"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyMomentum"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "use_nesterov"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
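`ResourceSparseApplyMomentum` is the only sparse op in this stretch with a `use_nesterov` attr alongside `use_locking`. The effect of that flag, per the op's documented semantics, can be sketched as (illustrative helper, list-based state):

```python
def sparse_apply_momentum(var, accum, lr, grad, indices, momentum,
                          use_nesterov=False):
    """Momentum update on selected rows. With use_nesterov the step uses
    the lookahead form lr*(g + momentum*accum) instead of lr*accum."""
    for g, i in zip(grad, indices):
        accum[i] = accum[i] * momentum + g
        if use_nesterov:
            var[i] -= lr * (g + accum[i] * momentum)
        else:
            var[i] -= lr * accum[i]
    return var, accum
```

On the first step from zero accumulators the two modes differ: plain momentum takes a step of `lr*g`, while Nesterov takes `lr*g*(1 + momentum)`.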
# op {
# name: "ResourceSparseApplyProximalAdagrad"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "accum"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyProximalGradientDescent"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "alpha"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "ResourceSparseApplyRMSProp"
# input_arg {
# name: "var"
# type: DT_RESOURCE
# }
# input_arg {
# name: "ms"
# type: DT_RESOURCE
# }
# input_arg {
# name: "mom"
# type: DT_RESOURCE
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "SparseApplyAdadelta"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum_update"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyAdagrad"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyAdagradDA"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "gradient_accumulator"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "gradient_squared_accumulator"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "global_step"
# type: DT_INT64
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyCenteredRMSProp"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "mg"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "ms"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "mom"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyFtrl"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "linear"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyFtrlV2"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "linear"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "l2_shrinkage"
# type_attr: "T"
# }
# input_arg {
# name: "lr_power"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyMomentum"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "use_nesterov"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyProximalAdagrad"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "accum"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyProximalGradientDescent"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "alpha"
# type_attr: "T"
# }
# input_arg {
# name: "l1"
# type_attr: "T"
# }
# input_arg {
# name: "l2"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
# op {
# name: "SparseApplyRMSProp"
# input_arg {
# name: "var"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "ms"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "mom"
# type_attr: "T"
# is_ref: true
# }
# input_arg {
# name: "lr"
# type_attr: "T"
# }
# input_arg {
# name: "rho"
# type_attr: "T"
# }
# input_arg {
# name: "momentum"
# type_attr: "T"
# }
# input_arg {
# name: "epsilon"
# type_attr: "T"
# }
# input_arg {
# name: "grad"
# type_attr: "T"
# }
# input_arg {
# name: "indices"
# type_attr: "Tindices"
# }
# output_arg {
# name: "out"
# type_attr: "T"
# is_ref: true
# }
# attr {
# name: "T"
# type: "type"
# allowed_values {
# list {
# type: DT_FLOAT
# type: DT_DOUBLE
# type: DT_INT64
# type: DT_INT32
# type: DT_UINT8
# type: DT_UINT16
# type: DT_INT16
# type: DT_INT8
# type: DT_COMPLEX64
# type: DT_COMPLEX128
# type: DT_QINT8
# type: DT_QUINT8
# type: DT_QINT32
# type: DT_HALF
# }
# }
# }
# attr {
# name: "Tindices"
# type: "type"
# allowed_values {
# list {
# type: DT_INT32
# type: DT_INT64
# }
# }
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# }
_op_def_lib = _InitOpDefLibrary(b"\n\262\001\n\rApplyAdadelta\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\024\n\014accum_update\"\001T\200\001\001\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\203\001\n\014ApplyAdagrad\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\335\001\n\016ApplyAdagradDA\022\013\n\003var\"\001T\200\001\001\022\034\n\024gradient_accumulator\"\001T\200\001\001\022$\n\034gradient_squared_accumulator\"\001T\200\001\001\022\t\n\004grad\"\001T\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\017\n\013global_step\030\t\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\353\001\n\tApplyAdam\022\013\n\003var\"\001T\200\001\001\022\t\n\001m\"\001T\200\001\001\022\t\n\001v\"\001T\200\001\001\022\020\n\013beta1_power\"\001T\022\020\n\013beta2_power\"\001T\022\007\n\002lr\"\001T\022\n\n\005beta1\"\001T\022\n\n\005beta2\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\"\030\n\014use_nesterov\022\004bool\032\002(\000\n\310\001\n\024ApplyCenteredRMSProp\022\013\n\003var\"\001T\200\001\001\022\n\n\002mg\"\001T\200\001\001\022\n\n\002ms\"\001T\200\001\001\022\013\n\003mom\"\001T\200\001\001\022\007\n\002lr\"\001T\022\010\n\003r
ho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\261\001\n\tApplyFtrl\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\016\n\006linear\"\001T\200\001\001\022\t\n\004grad\"\001T\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\r\n\010lr_power\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\306\001\n\013ApplyFtrlV2\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\016\n\006linear\"\001T\200\001\001\022\t\n\004grad\"\001T\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\021\n\014l2_shrinkage\"\001T\022\r\n\010lr_power\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\200\001\n\024ApplyGradientDescent\022\013\n\003var\"\001T\200\001\001\022\n\n\005alpha\"\001T\022\n\n\005delta\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\255\001\n\rApplyMomentum\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\022\r\n\010momentum\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\"\030\n\014use_nesterov\022\004bool\032\002(\000\n\235\001\n\024ApplyProximalAdagrad\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\007\n\002lr\"\001T\022\007\n\00
2l1\"\001T\022\007\n\002l2\"\001T\022\t\n\004grad\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\232\001\n\034ApplyProximalGradientDescent\022\013\n\003var\"\001T\200\001\001\022\n\n\005alpha\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\n\n\005delta\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\264\001\n\014ApplyRMSProp\022\013\n\003var\"\001T\200\001\001\022\n\n\002ms\"\001T\200\001\001\022\013\n\003mom\"\001T\200\001\001\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\n\244\001\n\025ResourceApplyAdadelta\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\020\n\014accum_update\030\024\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\ny\n\024ResourceApplyAdagrad\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\317\001\n\026ResourceApplyAdagradDA\022\007\n\003var\030\024\022\030\n\024gradient_accumulator\030\024\022 
\n\034gradient_squared_accumulator\030\024\022\t\n\004grad\"\001T\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\017\n\013global_step\030\t\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\335\001\n\021ResourceApplyAdam\022\007\n\003var\030\024\022\005\n\001m\030\024\022\005\n\001v\030\024\022\020\n\013beta1_power\"\001T\022\020\n\013beta2_power\"\001T\022\007\n\002lr\"\001T\022\n\n\005beta1\"\001T\022\n\n\005beta2\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\"\030\n\014use_nesterov\022\004bool\032\002(\000\210\001\001\n\266\001\n\034ResourceApplyCenteredRMSProp\022\007\n\003var\030\024\022\006\n\002mg\030\024\022\006\n\002ms\030\024\022\007\n\003mom\030\024\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\243\001\n\021ResourceApplyFtrl\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\n\n\006linear\030\024\022\t\n\004grad\"\001T\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\r\n\010lr_power\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\270\001\n\023ResourceApplyFtrlV2\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\n\n\006linear\030\024\022\t\n\004grad\"\001T\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\021\n\014l2_shrinkage\"\001T\022\r\n\010lr_power\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bo
ol\032\002(\000\210\001\001\nz\n\034ResourceApplyGradientDescent\022\007\n\003var\030\024\022\n\n\005alpha\"\001T\022\n\n\005delta\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\243\001\n\025ResourceApplyMomentum\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\022\r\n\010momentum\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\"\030\n\014use_nesterov\022\004bool\032\002(\000\210\001\001\n\223\001\n\034ResourceApplyProximalAdagrad\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\t\n\004grad\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\224\001\n$ResourceApplyProximalGradientDescent\022\007\n\003var\030\024\022\n\n\005alpha\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\n\n\005delta\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\246\001\n\024ResourceApplyRMSProp\022\007\n\003var\030\024\022\006\n\002ms\030\024\022\007\n\003mom\030\024\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\331\001\n\033ResourceSparseApplyAdadelta\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\020\n\014accum_update\030\024\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\"\035\n\001T\022\004type:\022\n\0202\01
6\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\256\001\n\032ResourceSparseApplyAdagrad\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\204\002\n\034ResourceSparseApplyAdagradDA\022\007\n\003var\030\024\022\030\n\024gradient_accumulator\030\024\022 \n\034gradient_squared_accumulator\030\024\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\017\n\013global_step\030\t\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\353\001\n\"ResourceSparseApplyCenteredRMSProp\022\007\n\003var\030\024\022\006\n\002mg\030\024\022\006\n\002ms\030\024\022\007\n\003mom\030\024\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\330\001\n\027ResourceSparseApplyFtrl\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\n\n\006linear\030\024\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\r\n\010lr_power\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\
"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\355\001\n\031ResourceSparseApplyFtrlV2\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\n\n\006linear\030\024\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\021\n\014l2_shrinkage\"\001T\022\r\n\010lr_power\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\330\001\n\033ResourceSparseApplyMomentum\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\r\n\010momentum\"\001T\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\"\030\n\014use_nesterov\022\004bool\032\002(\000\210\001\001\n\310\001\n\"ResourceSparseApplyProximalAdagrad\022\007\n\003var\030\024\022\t\n\005accum\030\024\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\310\001\n*ResourceSparseApplyProximalGradientDescent\022\007\n\003var\030\024\022\n\n\005alpha\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\333\001\n\032ResourceSparseApplyRMSProp\022\007\n\003var\030\024\022\006\n\002ms\030\024\022\007\n\003mom\030\024\022\007\n
\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\347\001\n\023SparseApplyAdadelta\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\024\n\014accum_update\"\001T\200\001\001\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\270\001\n\022SparseApplyAdagrad\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\222\002\n\024SparseApplyAdagradDA\022\013\n\003var\"\001T\200\001\001\022\034\n\024gradient_accumulator\"\001T\200\001\001\022$\n\034gradient_squared_accumulator\"\001T\200\001\001\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\017\n\013global_step\030\t\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\375\001\n\032SparseApplyCenteredRMSProp\022\013\n\003var\"\001T\200\001\001\022\n\n\002mg\"\001T\200\001\001\022\n\n\002ms\"\001T\200\
001\001\022\013\n\003mom\"\001T\200\001\001\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\346\001\n\017SparseApplyFtrl\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\016\n\006linear\"\001T\200\001\001\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\r\n\010lr_power\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\373\001\n\021SparseApplyFtrlV2\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\016\n\006linear\"\001T\200\001\001\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\021\n\014l2_shrinkage\"\001T\022\r\n\010lr_power\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\342\001\n\023SparseApplyMomentum\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\007\n\002lr\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\022\r\n\010momentum\"\001T\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\"\030\n\014use_nesterov\022\00
4bool\032\002(\000\n\322\001\n\032SparseApplyProximalAdagrad\022\013\n\003var\"\001T\200\001\001\022\r\n\005accum\"\001T\200\001\001\022\007\n\002lr\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\316\001\n\"SparseApplyProximalGradientDescent\022\013\n\003var\"\001T\200\001\001\022\n\n\005alpha\"\001T\022\007\n\002l1\"\001T\022\007\n\002l2\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000\n\351\001\n\022SparseApplyRMSProp\022\013\n\003var\"\001T\200\001\001\022\n\n\002ms\"\001T\200\001\001\022\013\n\003mom\"\001T\200\001\001\022\007\n\002lr\"\001T\022\010\n\003rho\"\001T\022\r\n\010momentum\"\001T\022\014\n\007epsilon\"\001T\022\t\n\004grad\"\001T\022\023\n\007indices\"\010Tindices\032\013\n\003out\"\001T\200\001\001\"\035\n\001T\022\004type:\022\n\0202\016\001\002\t\003\004\021\005\006\010\022\013\014\r\023\"\030\n\010Tindices\022\004type:\006\n\0042\002\003\t\"\027\n\013use_locking\022\004bool\032\002(\000")
| 34.017816 | 19,244 | 0.61923 | 26,773 | 190,942 | 4.224368 | 0.017219 | 0.037984 | 0.034165 | 0.025288 | 0.956657 | 0.953519 | 0.953094 | 0.951803 | 0.951803 | 0.950459 | 0 | 0.083235 | 0.243948 | 190,942 | 5,612 | 19,245 | 34.023877 | 0.700205 | 0.587964 | 0 | 0.754579 | 1 | 0.049451 | 0.198466 | 0.141211 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041209 | false | 0 | 0.010073 | 0 | 0.112637 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
b861664b8daa1e124c94d066b4531012609e4f54 | 2,720 | py | Python | Script-http-socks4/6.py | Alpha-Demon404/RE-14 | b5b46a9f0eee218f2a642b615c77135c33c6f4ad | [
"MIT"
] | 39 | 2020-02-26T09:44:36.000Z | 2022-03-23T00:18:25.000Z | Script-http-socks4/6.py | B4BY-DG/reverse-enginnering | b5b46a9f0eee218f2a642b615c77135c33c6f4ad | [
"MIT"
] | 15 | 2020-05-14T10:07:26.000Z | 2022-01-06T02:55:32.000Z | Script-http-socks4/6.py | B4BY-DG/reverse-enginnering | b5b46a9f0eee218f2a642b615c77135c33c6f4ad | [
"MIT"
] | 41 | 2020-03-16T22:36:38.000Z | 2022-03-17T14:47:19.000Z | import marshal,zlib,base64,dis
exec(marshal.loads(zlib.decompress(base64.b32decode("PCOEKVWLVZSTOEJ5W6JQBYJ3DD2NBZL2XBFEECF3ZKPCAMKBJQZOXJFPECWW4NOKXVGJR4SL7QPSYR2JHA5GPH756C3KWVVNWXFN6PL67J6ITXY76F57SEYO57YX26T4PQ6L56PF7TU7DTOTZ7TW6HW7XR4TYPY6D67HQPB7HU7HZ6LYP3Z7R4GV4P6ZXR57T57PUYWYS574LYJPN674A7G774LZ6P5773T6XR3DPDSFENJ5OZUISUVLWN4DHSZA4LMUFISLGPTSLDKW57XLWCUH4A7F3F6Z423MRVQZSYKW237RUNW6XGMVFFPHTVVOHW3XAT7NFVKGQC46J4JLF4WZXJVJPYOS67CBRLL5MY3GCPRNOL3JDTWDLPCKVRLZMTLZVUWPRK22E2SFISO4BQVHOBKYQ33RH42JNSJKH7PJ26RLHKXKU5GDIVYIJP6ZXBS2KDWEGLEVPKHGZ23I3W54HGEFLGTLHO5R3LFG4LJBI6LTEJSGXMITKEUVGUVZCTHPAUTXJGFRJUSNXVJRUAG2EOGGXFRPVQTIYCMDNT3ZXMA7UG3KGKZOOX2BA4OC2OA4W6MIAYZEWR2FYT3Y6V44H2UEX5VMEVVTIP6BM26FO6FWCYF5GWW7UY6G3FKVHNW2OY4JWQK23CM2ZM7KWDKFLJSY5466VDGRS23W5XSWHCVJWRHJGD6OAOUMYAWK2NISJ4Z6ACESRM7VPZKE5AREFB4RGGVHBZFHFPFVTK45NIKTSO3O5S5KLWKDXGXWDA6S76RTQ6GIS2F26FCFEO6N2ZQK3FTDRGIQYEDAQEFVVVWV65VKJ7JQ25AN3UVSB45BUQ4XE24WTG4YNZHUXGKUNVEBSDLGT5QEQECRMQL5L6YAWOWKTXRTHTNDC3ZAYRVOFC5GDXYD5I3EF5NU5NFWZLCCRTC33FJBGCMPY2VMAFNQEIEZBKACBTFJUOU5RMMSKA3I2QI67JCU2ATE5GZAHIBZA4Y2V7MIBMWNBDFTC3ITBBZWEZFT2E3Z5GPALZBA3VQHRYETLTRGIWMGTH7BAAMSUGHHNQORBVMI2HDNSIYJYAQEOQG6PMAYNQCIAUXWYZA6NMUVQKCGQAYXDOFAAR6PNKBTFIEBJRQQVGJR4NBT3FSPMZCAXGVQ7HW53JNNWO5VYPAAWESDNS35AYHX6ISSJXILOWTADSUCF7D2ZQIJECAQHHQSUGGDAE6EAHCZJ2MSUEJVLJTRGWBBXJ6XAKTFGINASVRYAKS4SAN4VHOFTZAHKN43EPYBFWF3HDQB4QD2OKQFZW3RSMG6XMXG3C6NWLAGNA7GOGQI3VNDMKHXQQL4VKUNKGBOLZ6CBF6S5WISSOLNVVL3RVEO6Z6IYHQNLEHWFVNYC5VWYVICPR525WXJ7SE5AUGOLVWUJLDVZGAO57IBDQPC2WE6ANITNFXZJ4HQJZEH7YVFHWIIASCHYQZFGOQFIQDDVC2AKYQ2BKSDAGYAEG335B6QJMJ2TTRIZLGGDMWADZKRBY4GPHCDQAPAWNQY5OEIHQNRYB6PQQOBI2NMYNTSUJPKIVZPMMNYQ2WJ5ZVUG6RIEUVFDY5URJJFB6UK2CD5QWC2SJPFPODADRWBYFG3BBBBQWXJGBZUQ3WBQMLJ6SPD2YHSR3TNJD3KJQIMNBHTAAD6W6KSXRSYMNAIWDV6DBOAJVM27VNDX2EJGBQB7VPHOJSMMAS74KSELUPYYBFCDMTTMECR5ZRQ7GRSAI3QSVVT5OJBMQID3WYAQFZCMCA3KAPKN3RQUJKCOWAF7F25POFVQQ527A2OOMGL5VKDJOFXLIAD5BKEK6BQCKKBUDR5LGX5K34HTLHEM6Z2ER7BGQUHXIMQMEUKSDSO5CAPB3L5YLU6Q5QDWIALONB7WQM3CQCCJHP36VDZOE7QANS5THVAZUWTIWB4QEH7QCTGIPH65YCFL7C
6Q63IOTWNZS7WAH33GG66TUBQ2DCNAOTQ3UY6IPF5Q2GSGCLVS42DJRGAKZSAN2JVTANSCPOHBOMCNEKLQE2S3F6RWZU3DJZUAQ3IWFAFFKDAOTT4BTWDZV7SYOLMBIKSVM36W4DR32QKMCCB4FHFTWMWDG7ATQNKB3FLN6AGJCWXDBW5P666BAKKYY5YHRCMUY5BCPXUAEYBPN5VBZ7LO23PNHLBN2NEHUJQEGTETYFY3IDK2FBEQ3F26DQLM2AMYHSDMBJGE27CYLWRJ7BKA2WVFNDN6RULQCXMLHG6A44ABD5YJGJQAS6BACRCFQNZADSY24MDZLZLHQKFCQQ43YHGDQOP2GMNC3AB4KZMB2XK2MHTLDO3SLI7QFN5BU5CZDQN2OAFTLMOD42JQB6CGG4ZWBII57XENOWDM4TMELEYVTTDVWBZ3U3BHYH7SF2TLPEGMDXYIBSJ3AK2C6YF5ADP5FP6ZDH5SLW2PR3RMDQ2IKGXBYPFCPM65N57YIPO55OXHB6HV56DZPP567L4XW54K7357T7NXV734TNXOL6PE2XV7YPU4PTXP327LZ7XXN4QZ7PX37XUR4PZ4X26L2734Y6I52O7PT3P55TPH74HS43XP3G7LZ7YHP67P75PZ7U7D6P774HU6P656PP5B76GGX62"))))
#dis.dis(exe)
| 680 | 2,674 | 0.991176 | 16 | 2,720 | 168.5 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190722 | 0.001471 | 2,720 | 3 | 2,675 | 906.666667 | 0.801915 | 0.004412 | 0 | 0 | 0 | 0 | 0.966383 | 0.966383 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 10 |
b89e6c7517ef65d45ab804cf4a84cbb05b2d772f | 9,342 | py | Python | pypy/interpreter/pyparser/dfa_generated.py | yxzoro/pypy | 6e47b3d3e5513d9639a21554963a6ace172ccfee | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | pypy/interpreter/pyparser/dfa_generated.py | yxzoro/pypy | 6e47b3d3e5513d9639a21554963a6ace172ccfee | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | pypy/interpreter/pyparser/dfa_generated.py | yxzoro/pypy | 6e47b3d3e5513d9639a21554963a6ace172ccfee | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | # THIS FILE IS AUTOMATICALLY GENERATED BY gendfa.py
# DO NOT EDIT
# TO REGENERATE THE FILE, RUN:
# python gendfa.py > dfa_generated.py
from pypy.interpreter.pyparser import automata
accepts = [True, True, True, True, True, True, True, True,
True, True, True, False, True, True, True, True,
True, False, False, False, False, True, False,
False, False, True, False, True, False, True,
False, True, False, True, False, False, True,
False, False, True, True, True, False, False,
True, False, False, False, True]
states = [
# 0
{'\t': 0, '\n': 15, '\x0c': 0,
'\r': 16, ' ': 0, '!': 11, '"': 19,
'#': 21, '$': 17, '%': 14, '&': 14,
"'": 18, '(': 15, ')': 15, '*': 8,
'+': 14, ',': 15, '-': 12, '.': 7,
'/': 13, '0': 5, '1': 6, '2': 6,
'3': 6, '4': 6, '5': 6, '6': 6,
'7': 6, '8': 6, '9': 6, ':': 15,
';': 15, '<': 10, '=': 14, '>': 9,
'@': 14, 'A': 1, 'B': 2, 'C': 1,
'D': 1, 'E': 1, 'F': 2, 'G': 1,
'H': 1, 'I': 1, 'J': 1, 'K': 1,
'L': 1, 'M': 1, 'N': 1, 'O': 1,
'P': 1, 'Q': 1, 'R': 3, 'S': 1,
'T': 1, 'U': 4, 'V': 1, 'W': 1,
'X': 1, 'Y': 1, 'Z': 1, '[': 15,
'\\': 20, ']': 15, '^': 14, '_': 1,
'`': 15, 'a': 1, 'b': 2, 'c': 1,
'd': 1, 'e': 1, 'f': 2, 'g': 1,
'h': 1, 'i': 1, 'j': 1, 'k': 1,
'l': 1, 'm': 1, 'n': 1, 'o': 1,
'p': 1, 'q': 1, 'r': 3, 's': 1,
't': 1, 'u': 4, 'v': 1, 'w': 1,
'x': 1, 'y': 1, 'z': 1, '{': 15,
'|': 14, '}': 15, '~': 15,
'\x80': 1},
# 1
{'0': 1, '1': 1, '2': 1, '3': 1,
'4': 1, '5': 1, '6': 1, '7': 1,
'8': 1, '9': 1, 'A': 1, 'B': 1,
'C': 1, 'D': 1, 'E': 1, 'F': 1,
'G': 1, 'H': 1, 'I': 1, 'J': 1,
'K': 1, 'L': 1, 'M': 1, 'N': 1,
'O': 1, 'P': 1, 'Q': 1, 'R': 1,
'S': 1, 'T': 1, 'U': 1, 'V': 1,
'W': 1, 'X': 1, 'Y': 1, 'Z': 1,
'_': 1, 'a': 1, 'b': 1, 'c': 1,
'd': 1, 'e': 1, 'f': 1, 'g': 1,
'h': 1, 'i': 1, 'j': 1, 'k': 1,
'l': 1, 'm': 1, 'n': 1, 'o': 1,
'p': 1, 'q': 1, 'r': 1, 's': 1,
't': 1, 'u': 1, 'v': 1, 'w': 1,
'x': 1, 'y': 1, 'z': 1, '\x80': 1},
# 2
{'"': 19, "'": 18, '0': 1, '1': 1,
'2': 1, '3': 1, '4': 1, '5': 1,
'6': 1, '7': 1, '8': 1, '9': 1,
'A': 1, 'B': 1, 'C': 1, 'D': 1,
'E': 1, 'F': 1, 'G': 1, 'H': 1,
'I': 1, 'J': 1, 'K': 1, 'L': 1,
'M': 1, 'N': 1, 'O': 1, 'P': 1,
'Q': 1, 'R': 4, 'S': 1, 'T': 1,
'U': 1, 'V': 1, 'W': 1, 'X': 1,
'Y': 1, 'Z': 1, '_': 1, 'a': 1,
'b': 1, 'c': 1, 'd': 1, 'e': 1,
'f': 1, 'g': 1, 'h': 1, 'i': 1,
'j': 1, 'k': 1, 'l': 1, 'm': 1,
'n': 1, 'o': 1, 'p': 1, 'q': 1,
'r': 4, 's': 1, 't': 1, 'u': 1,
'v': 1, 'w': 1, 'x': 1, 'y': 1,
'z': 1, '\x80': 1},
# 3
{'"': 19, "'": 18, '0': 1, '1': 1,
'2': 1, '3': 1, '4': 1, '5': 1,
'6': 1, '7': 1, '8': 1, '9': 1,
'A': 1, 'B': 4, 'C': 1, 'D': 1,
'E': 1, 'F': 4, 'G': 1, 'H': 1,
'I': 1, 'J': 1, 'K': 1, 'L': 1,
'M': 1, 'N': 1, 'O': 1, 'P': 1,
'Q': 1, 'R': 1, 'S': 1, 'T': 1,
'U': 1, 'V': 1, 'W': 1, 'X': 1,
'Y': 1, 'Z': 1, '_': 1, 'a': 1,
'b': 4, 'c': 1, 'd': 1, 'e': 1,
'f': 4, 'g': 1, 'h': 1, 'i': 1,
'j': 1, 'k': 1, 'l': 1, 'm': 1,
'n': 1, 'o': 1, 'p': 1, 'q': 1,
'r': 1, 's': 1, 't': 1, 'u': 1,
'v': 1, 'w': 1, 'x': 1, 'y': 1,
'z': 1, '\x80': 1},
# 4
{'"': 19, "'": 18, '0': 1, '1': 1,
'2': 1, '3': 1, '4': 1, '5': 1,
'6': 1, '7': 1, '8': 1, '9': 1,
'A': 1, 'B': 1, 'C': 1, 'D': 1,
'E': 1, 'F': 1, 'G': 1, 'H': 1,
'I': 1, 'J': 1, 'K': 1, 'L': 1,
'M': 1, 'N': 1, 'O': 1, 'P': 1,
'Q': 1, 'R': 1, 'S': 1, 'T': 1,
'U': 1, 'V': 1, 'W': 1, 'X': 1,
'Y': 1, 'Z': 1, '_': 1, 'a': 1,
'b': 1, 'c': 1, 'd': 1, 'e': 1,
'f': 1, 'g': 1, 'h': 1, 'i': 1,
'j': 1, 'k': 1, 'l': 1, 'm': 1,
'n': 1, 'o': 1, 'p': 1, 'q': 1,
'r': 1, 's': 1, 't': 1, 'u': 1,
'v': 1, 'w': 1, 'x': 1, 'y': 1,
'z': 1, '\x80': 1},
# 5
{'.': 27, '0': 25, '1': 26, '2': 26,
'3': 26, '4': 26, '5': 26, '6': 26,
'7': 26, '8': 26, '9': 26, 'B': 24,
'E': 28, 'J': 15, 'O': 23, 'X': 22,
'b': 24, 'e': 28, 'j': 15, 'o': 23,
'x': 22},
# 6
{'.': 27, '0': 6, '1': 6, '2': 6,
'3': 6, '4': 6, '5': 6, '6': 6,
'7': 6, '8': 6, '9': 6, 'E': 28,
'J': 15, 'e': 28, 'j': 15},
# 7
{'.': 30, '0': 29, '1': 29, '2': 29,
'3': 29, '4': 29, '5': 29, '6': 29,
'7': 29, '8': 29, '9': 29},
# 8
{'*': 14, '=': 15},
# 9
{'=': 15, '>': 14},
# 10
{'<': 14, '=': 15, '>': 15},
# 11
{'=': 15},
# 12
{'=': 15, '>': 15},
# 13
{'/': 14, '=': 15},
# 14
{'=': 15},
# 15
{},
# 16
{'\n': 15},
# 17
{'0': 31, '1': 31, '2': 31, '3': 31,
'4': 31, '5': 31, '6': 31, '7': 31,
'8': 31, '9': 31},
# 18
{automata.DEFAULT: 35, '\n': 32,
'\r': 32, "'": 33, '\\': 34},
# 19
{automata.DEFAULT: 38, '\n': 32,
'\r': 32, '"': 36, '\\': 37},
# 20
{'\n': 15, '\r': 16},
# 21
{automata.DEFAULT: 21, '\n': 32, '\r': 32},
# 22
{'0': 39, '1': 39, '2': 39, '3': 39,
'4': 39, '5': 39, '6': 39, '7': 39,
'8': 39, '9': 39, 'A': 39, 'B': 39,
'C': 39, 'D': 39, 'E': 39, 'F': 39,
'a': 39, 'b': 39, 'c': 39, 'd': 39,
'e': 39, 'f': 39},
# 23
{'0': 40, '1': 40, '2': 40, '3': 40,
'4': 40, '5': 40, '6': 40, '7': 40},
# 24
{'0': 41, '1': 41},
# 25
{'.': 27, '0': 25, '1': 26, '2': 26,
'3': 26, '4': 26, '5': 26, '6': 26,
'7': 26, '8': 26, '9': 26, 'E': 28,
'J': 15, 'e': 28, 'j': 15},
# 26
{'.': 27, '0': 26, '1': 26, '2': 26,
'3': 26, '4': 26, '5': 26, '6': 26,
'7': 26, '8': 26, '9': 26, 'E': 28,
'J': 15, 'e': 28, 'j': 15},
# 27
{'0': 27, '1': 27, '2': 27, '3': 27,
'4': 27, '5': 27, '6': 27, '7': 27,
'8': 27, '9': 27, 'E': 42, 'J': 15,
'e': 42, 'j': 15},
# 28
{'+': 43, '-': 43, '0': 44, '1': 44,
'2': 44, '3': 44, '4': 44, '5': 44,
'6': 44, '7': 44, '8': 44, '9': 44},
# 29
{'0': 29, '1': 29, '2': 29, '3': 29,
'4': 29, '5': 29, '6': 29, '7': 29,
'8': 29, '9': 29, 'E': 42, 'J': 15,
'e': 42, 'j': 15},
# 30
{'.': 15},
# 31
{'0': 31, '1': 31, '2': 31, '3': 31,
'4': 31, '5': 31, '6': 31, '7': 31,
'8': 31, '9': 31},
# 32
{},
# 33
{"'": 15},
# 34
{automata.DEFAULT: 45, '\n': 15, '\r': 16},
# 35
{automata.DEFAULT: 35, '\n': 32,
'\r': 32, "'": 15, '\\': 34},
# 36
{'"': 15},
# 37
{automata.DEFAULT: 46, '\n': 15, '\r': 16},
# 38
{automata.DEFAULT: 38, '\n': 32,
'\r': 32, '"': 15, '\\': 37},
# 39
{'0': 39, '1': 39, '2': 39, '3': 39,
'4': 39, '5': 39, '6': 39, '7': 39,
'8': 39, '9': 39, 'A': 39, 'B': 39,
'C': 39, 'D': 39, 'E': 39, 'F': 39,
'a': 39, 'b': 39, 'c': 39, 'd': 39,
'e': 39, 'f': 39},
# 40
{'0': 40, '1': 40, '2': 40, '3': 40,
'4': 40, '5': 40, '6': 40, '7': 40},
# 41
{'0': 41, '1': 41},
# 42
{'+': 47, '-': 47, '0': 48, '1': 48,
'2': 48, '3': 48, '4': 48, '5': 48,
'6': 48, '7': 48, '8': 48, '9': 48},
# 43
{'0': 44, '1': 44, '2': 44, '3': 44,
'4': 44, '5': 44, '6': 44, '7': 44,
'8': 44, '9': 44},
# 44
{'0': 44, '1': 44, '2': 44, '3': 44,
'4': 44, '5': 44, '6': 44, '7': 44,
'8': 44, '9': 44, 'J': 15, 'j': 15},
# 45
{automata.DEFAULT: 45, '\n': 32,
'\r': 32, "'": 15, '\\': 34},
# 46
{automata.DEFAULT: 46, '\n': 32,
'\r': 32, '"': 15, '\\': 37},
# 47
{'0': 48, '1': 48, '2': 48, '3': 48,
'4': 48, '5': 48, '6': 48, '7': 48,
'8': 48, '9': 48},
# 48
{'0': 48, '1': 48, '2': 48, '3': 48,
'4': 48, '5': 48, '6': 48, '7': 48,
'8': 48, '9': 48, 'J': 15, 'j': 15},
]
pseudoDFA = automata.DFA(states, accepts)
accepts = [False, False, False, False, False, True]
states = [
# 0
{automata.DEFAULT: 0, '"': 1, '\\': 2},
# 1
{automata.DEFAULT: 4, '"': 3, '\\': 2},
# 2
{automata.DEFAULT: 4},
# 3
{automata.DEFAULT: 4, '"': 5, '\\': 2},
# 4
{automata.DEFAULT: 4, '"': 1, '\\': 2},
# 5
{automata.DEFAULT: 4, '"': 5, '\\': 2},
]
double3DFA = automata.NonGreedyDFA(states, accepts)
accepts = [False, False, False, False, False, True]
states = [
# 0
{automata.DEFAULT: 0, "'": 1, '\\': 2},
# 1
{automata.DEFAULT: 4, "'": 3, '\\': 2},
# 2
{automata.DEFAULT: 4},
# 3
{automata.DEFAULT: 4, "'": 5, '\\': 2},
# 4
{automata.DEFAULT: 4, "'": 1, '\\': 2},
# 5
{automata.DEFAULT: 4, "'": 5, '\\': 2},
]
single3DFA = automata.NonGreedyDFA(states, accepts)
accepts = [False, True, False, False]
states = [
# 0
{automata.DEFAULT: 0, "'": 1, '\\': 2},
# 1
{},
# 2
{automata.DEFAULT: 3},
# 3
{automata.DEFAULT: 3, "'": 1, '\\': 2},
]
singleDFA = automata.DFA(states, accepts)
accepts = [False, True, False, False]
states = [
# 0
{automata.DEFAULT: 0, '"': 1, '\\': 2},
# 1
{},
# 2
{automata.DEFAULT: 3},
# 3
{automata.DEFAULT: 3, '"': 1, '\\': 2},
]
doubleDFA = automata.DFA(states, accepts)
| 29.19375 | 59 | 0.308927 | 1,565 | 9,342 | 1.840256 | 0.070927 | 0.140625 | 0.054167 | 0.055556 | 0.792708 | 0.788194 | 0.736111 | 0.684375 | 0.671875 | 0.654514 | 0 | 0.228008 | 0.331942 | 9,342 | 319 | 60 | 29.285266 | 0.233456 | 0.032862 | 0 | 0.541667 | 1 | 0 | 0.084598 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.004167 | 0 | 0.004167 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b8a36dc93914f2d559456678858e42b1ab8eaf01 | 1,875 | py | Python | tests/app/lib/test_timezone_converter.py | Wynndow/meeting_room_project | 8fa01b8558d7a34811782f2a207d50ad02bdc878 | [
"MIT"
] | 11 | 2016-08-11T09:41:01.000Z | 2021-01-20T16:52:51.000Z | tests/app/lib/test_timezone_converter.py | Wynndow/meeting_room_project | 8fa01b8558d7a34811782f2a207d50ad02bdc878 | [
"MIT"
] | 40 | 2016-11-11T17:37:44.000Z | 2021-11-11T16:10:57.000Z | tests/app/lib/test_timezone_converter.py | Wynndow/meeting_room_project | 8fa01b8558d7a34811782f2a207d50ad02bdc878 | [
"MIT"
] | 4 | 2016-09-28T08:05:21.000Z | 2019-01-23T04:17:56.000Z | from app.lib.timezone_converter import TimeZoneConverter
class TestTimeZoneConverter():
def test_it_converts_utc_times_to_london_times_out_of_BST(self):
google_api_utc_time_string = u'2016-01-01T09:30:00Z'
converted_time_string = TimeZoneConverter.utc_to_london(google_api_utc_time_string)
assert converted_time_string == u'2016-01-01T09:30:00'
def test_it_converts_utc_times_to_london_times_during_BST(self):
google_api_utc_time_string = u'2016-10-28T09:30:00Z'
converted_time_string = TimeZoneConverter.utc_to_london(google_api_utc_time_string)
assert converted_time_string == u'2016-10-28T10:30:00'
def test_it_converts_correctly_on_the_day_the_clocks_go_forward_pre_change(self):
google_api_utc_time_string = u'2017-03-26T00:59:59Z'
converted_time_string = TimeZoneConverter.utc_to_london(google_api_utc_time_string)
assert converted_time_string == u'2017-03-26T00:59:59'
def test_it_converts_correctly_on_the_day_the_clocks_go_forward_post_change(self):
google_api_utc_time_string = u'2017-03-26T01:00:00Z'
converted_time_string = TimeZoneConverter.utc_to_london(google_api_utc_time_string)
assert converted_time_string == u'2017-03-26T02:00:00'
def test_it_converts_correctly_on_the_day_the_clocks_go_back_pre_change(self):
google_api_utc_time_string = u'2016-10-30T00:59:59Z'
converted_time_string = TimeZoneConverter.utc_to_london(google_api_utc_time_string)
assert converted_time_string == u'2016-10-30T01:59:59'
def test_it_converts_correctly_on_the_day_the_clocks_go_back_post_change(self):
google_api_utc_time_string = u'2016-10-30T01:00:00Z'
converted_time_string = TimeZoneConverter.utc_to_london(google_api_utc_time_string)
assert converted_time_string == u'2016-10-30T01:00:00'
| 46.875 | 91 | 0.7888 | 292 | 1,875 | 4.544521 | 0.19863 | 0.180859 | 0.108515 | 0.144687 | 0.912585 | 0.912585 | 0.90957 | 0.884702 | 0.884702 | 0.76413 | 0 | 0.10396 | 0.138133 | 1,875 | 39 | 92 | 48.076923 | 0.717203 | 0 | 0 | 0.230769 | 0 | 0 | 0.1248 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 1 | 0.230769 | false | 0 | 0.038462 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b24c8d9ac3b663fa30e60da8c8048f222b9c35b3 | 13,149 | py | Python | Ecomscrapers/amazon-review.py | sudhanshu-jha/Scrapers | 1203c5ed3ebb4b0664af41e95bde3fc15662af64 | [
"MIT"
] | null | null | null | Ecomscrapers/amazon-review.py | sudhanshu-jha/Scrapers | 1203c5ed3ebb4b0664af41e95bde3fc15662af64 | [
"MIT"
] | null | null | null | Ecomscrapers/amazon-review.py | sudhanshu-jha/Scrapers | 1203c5ed3ebb4b0664af41e95bde3fc15662af64 | [
"MIT"
] | 1 | 2019-05-29T09:54:14.000Z | 2019-05-29T09:54:14.000Z | # -*- coding: utf-8 -*-
import os
import sys
import urllib,urllib2,cookielib
import datetime,time
import re
import random
from bs4 import BeautifulSoup as soup
import io
text_file = open("amazonreviewlinks.txt", "r")
lines = text_file.read().split(',')
no = len(lines)
#opening product link
print no
for hij in lines :
Review_link_len = 0
hdr1 = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
hdr2 = {'User-Agent': 'Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
hdr3 = {'User-Agent': 'Mozilla/5.0 (compatible; MSIE 10.0; Macintosh; Intel Mac OS X 10_7_3; Trident/6.0)',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
hdr4 = {'User-Agent': 'Opera/9.80 (X11; Linux i686; U; ru) Presto/2.8.131 Version/11.11',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
hdr5 = {'User-Agent': 'Mozilla/5.0 (iPad; CPU OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5355d Safari/8536.25',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
while Review_link_len == 0 :
try :
hdr = random.choice([hdr1,hdr2,hdr3,hdr4,hdr5])
req = urllib2.Request(hij, headers=hdr)
cj = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
response = opener.open(req,timeout=10)
content = response.read()
response.close()
page_soup = soup(content,"html.parser")
site1 = page_soup.findAll("a",{"id" : "acrCustomerReviewLink"})
Review_link_len = len(site1)
print Review_link_len
except :
hdr = random.choice([hdr1,hdr2,hdr3,hdr4,hdr5])
req = urllib2.Request(hij, headers=hdr)
cj = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
response = opener.open(req,timeout=10)
content = response.read()
response.close()
page_soup = soup(content,"html.parser")
site1 = page_soup.findAll("a",{"id" : "acrCustomerReviewLink"})
Review_link_len = len(site1)
print Review_link_len
site2 = site1[0]['href']
amazon_homepage = "https://www.amazon.in"
site = amazon_homepage + site2
print site
	#opening review link
hdr = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
len_captcha = 0
while len_captcha == 0 :
try :
req = urllib2.Request(site, headers=hdr)
cj = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
response = opener.open(req,timeout = 10)
content = response.read()
response.close()
page_soup = soup(content,"html.parser")
review_element = page_soup.findAll("div",{"class" : "a-section celwidget"})
captcha = page_soup.findAll("li",{"class" : "a-last"})
len_captcha = len(captcha)
except :
req = urllib2.Request(site, headers=hdr)
cj = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
response = opener.open(req,timeout = 10)
content = response.read()
response.close()
page_soup = soup(content,"html.parser")
review_element = page_soup.findAll("div",{"class" : "a-section celwidget"})
captcha = page_soup.findAll("li",{"class" : "a-last"})
len_captcha = len(captcha)
next_button1 = page_soup.findAll("li",{"class" : "a-last"})[0]
next_button = next_button1.findAll("a")
next_button_len = len(next_button)
if next_button_len == 0 :
next_button_len = 50
else :
next_button = next_button1.findAll("a")[0]['href']
print len(next_button)
try :
filename = page_soup.findAll("a",{"class" : "a-link-normal"})[0].text.strip()
#filename = filename.replace('\'', "!")
filename = filename + '.csv'
f = io.open(filename,"w",encoding="utf-8")
headers = "Customer name,Customer ratings out of 5.0,Review date,Review word count,Customer review\n"
f.write(unicode(headers,"utf-8"))
except :
filename = page_soup.findAll("a",{"class" : "a-link-normal"})[0].text.strip()[:10]
#filename = filename.replace('\'', "!")
filename = filename + '.csv'
f = io.open(filename,"w",encoding="utf-8")
headers = "Customer name,Customer ratings out of 5.0,Review date,Review word count,Customer review\n"
f.write(unicode(headers,"utf-8"))
#loop for content extraction
while next_button_len != 50 :
for single in review_element :
try :
review007 = single.findAll("span",{"data-hook" : "review-body"})[0].getText()
review007 = review007.replace(",", "|")
review007 = len(review007.split())
review = single.findAll("span",{"data-hook" : "review-body"})[0].getText()
review = (review.encode('utf-8', 'ignore')).encode("utf-8",errors='ignore')
review = unicode(review,"utf-8",errors='ignore')
review = review.replace(",", "|")
print review
except :
review = "can not extract review"
try :
ratings = single.findAll("a",{"class" : "a-link-normal"})[0]['title'][:3]
ratings = (ratings.encode('utf-8', 'ignore')).encode("utf-8",errors='ignore')
ratings = unicode(ratings,"utf-8",errors='ignore')
ratings = ratings.replace(",", "|")
print ratings
except :
ratings = "can not extract ratings"
try :
review_date = single.findAll("span",{"class" : "a-size-base a-color-secondary review-date"})[0].getText()[3:]
review_date = (review_date.encode('utf-8', 'ignore')).encode("utf-8",errors='ignore')
review_date = unicode(review_date,"utf-8",errors='ignore')
review_date = review_date.replace(",", "|")
print review_date
except :
				review_date = "can not extract review date"
try :
review_length = unicode(review007)
print review_length
except :
review_length = "can not extract review length"
try :
customer_name = single.findAll("a",{"data-hook" : "review-author"})[0].getText().strip()
customer_name = (customer_name.encode('utf-8', 'ignore')).encode("utf-8",errors='ignore')
customer_name = unicode(customer_name,"utf-8",errors='ignore')
customer_name = customer_name.replace(",", "|")
print customer_name
except :
customer_name = "can not extract customer name"
			data1 = customer_name + "," + ratings + "," + review_date + "," + review_length + "," + review + "\n"
try :
f.write(data1)
except :
				data1 = unicode("can not find customer name") + "," + unicode("can not extract review") + "\n"
f.write(data1)
next_button1 = page_soup.findAll("li",{"class" : "a-last"})[0]
next_button = next_button1.findAll("a")
next_button_len = len(next_button)
if next_button_len == 0 :
next_button_len = 50
else :
next_button = next_button1.findAll("a")[0]['href']
try :
amazon_homepage = "https://www.amazon.in"
site = amazon_homepage + next_button
except :
site = 0
print site
if site == 0 :
next_button_len = 50
else :
        import time  # used to back off between failed fetches

        def fetch_review_page(site):
            """Fetch `site`, retrying until the pagination element is present.

            The "a-last" list item disappears when Amazon serves a captcha
            page, so its absence (or a connection error) triggers a retry.
            """
            while True:
                try:
                    req = urllib2.Request(site, headers=hdr)
                    cj = cookielib.CookieJar()
                    cj.clear_session_cookies()
                    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
                    response = opener.open(req, timeout=10)
                    content = response.read()
                    response.close()
                except Exception:
                    print "sleeping due to connection errors"
                    time.sleep(10)
                    continue
                page_soup = soup(content, "html.parser")
                if len(page_soup.findAll("li", {"class": "a-last"})) > 0:
                    return page_soup

        page_soup = fetch_review_page(site)
        review_element = page_soup.findAll("div", {"class": "a-section celwidget"})
f.close()
# app/util/helper.py (repo: xiaomi2019/lolita_son, license: MIT)
#coding:utf8
import time
def get_svr_tm():
    return int(time.time())
# tests/conftest.py (repo: arne-cl/rst-converter-service, license: BSD-3-Clause)
import pytest
@pytest.fixture
def fixtures_input_dir():
    return 'tests/fixtures/input'
# django/bosscore/test/test_resource_views.py (repo: jhuapl-boss/boss, license: Apache-2.0)
# Copyright 2016 The Johns Hopkins University Applied Physics Laboratory
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from rest_framework.test import APITestCase
from django.conf import settings
from .setup_db import SetupTestDB, TEST_DATA_EXPERIMENTS
from bosscore.models import Channel
version = settings.BOSS_VERSION
class ResourceViewsCollectionTests(APITestCase):
    """
    Class to test the resource service
    """

    def setUp(self):
        """
        Initialize the database
        :return:
        """
        dbsetup = SetupTestDB()
        user = dbsetup.create_user('testuser')
        dbsetup.add_role('resource-manager')
        dbsetup.set_user(user)
        self.client.force_login(user)
        dbsetup.insert_test_data()

    def test_get_collection_doesnotexist(self):
        """
        Get a collection that does not exist
        """
        url = '/' + version + '/collection/col10/'

        # Get a collection that does not exist
        response = self.client.get(url)
        self.assertEqual(response.status_code, 404)

    def test_get_collection_exist(self):
        """
        Get a valid collection
        """
        url = '/' + version + '/collection/col1/'

        # Get an existing collection
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['name'], 'col1')

    def test_post_collection(self):
        """
        Post a new collection (valid)
        """
        url = '/' + version + '/collection/col55'
        data = {'description': 'A new collection for unit tests'}

        # Create a new collection
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

    def test_post_collection_special_characters(self):
        """
        Post a new collection whose name contains special characters (valid)
        """
        url = '/' + version + '/collection/col55-22'
        data = {'description': 'A new collection for unit tests'}

        # Create a new collection
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['name'], 'col55-22')

    def test_post_collection_already_exists(self):
        """
        Post a new collection (invalid - name already exists)
        """
        url = '/' + version + '/collection/col1/'
        data = {'description': 'A new collection for unit tests'}

        # Try to create a collection that already exists
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 400)

    def test_post_collection_no_data(self):
        """
        Post a new collection without a request body (valid)
        """
        url = '/' + version + '/collection/col55/'

        # Create a new collection with no data
        response = self.client.post(url)
        self.assertEqual(response.status_code, 201)

    def test_put_collection_exists(self):
        """
        Update a collection (valid - the collection exists)
        """
        url = '/' + version + '/collection/col1/'
        data = {'description': 'A new collection for unit tests. Updated'}

        # Update an existing collection
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 200)

    def test_put_collection_doesnotexist(self):
        """
        Update a collection that does not exist
        """
        url = '/' + version + '/collection/col55/'
        data = {'description': 'A new collection for unit tests. Updated'}

        # Update a collection that does not exist
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 404)

    def test_put_collection_name(self):
        """
        Update collection name (valid)
        """
        url = '/' + version + '/collection/col1/'
        data = {'name': 'col10'}

        # Rename an existing collection
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 200)

    def test_delete_collection(self):
        """
        Delete a collection (valid - the collection has no experiments)
        """
        url = '/' + version + '/collection/col55/'
        data = {'description': 'A new collection for unit tests'}

        # Create a new collection
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

        # Delete the collection
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 204)

    def test_flag_delete_collection(self):
        """
        Delete a collection (valid - check that the deleted flag is set correctly)
        """
        url = '/' + version + '/collection/col55/'
        data = {'description': 'A new collection for unit tests'}

        # Create a new collection
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

        # Delete the collection
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 204)

        # Get on a deleted collection
        response = self.client.get(url)
        self.assertEqual(response.status_code, 404)

    def test_delete_collection_invalid(self):
        """
        Delete a collection (invalid - violates integrity constraint)
        """
        url = '/' + version + '/collection/col1/'

        # Delete a collection that still has experiments
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 400)

    def test_delete_collection_doesnotexist(self):
        """
        Delete a collection (invalid - the collection does not exist)
        """
        url = '/' + version + '/collection/col10/'

        # Delete a collection that does not exist
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 404)

    def test_get_collections(self):
        """
        Get list of collections
        """
        url = '/' + version + '/collection/'

        # List all collections
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['collections'][0], 'col1')
class ResourceViewsExperimentTests(APITestCase):
    """
    Class to test the resource service
    """

    def setUp(self):
        """
        Initialize the database
        """
        dbsetup = SetupTestDB()
        user = dbsetup.create_user('testuser')
        dbsetup.add_role('resource-manager')
        dbsetup.set_user(user)
        self.client.force_login(user)
        dbsetup.insert_test_data()

    def test_get_experiment_doesnotexist(self):
        """
        Get an experiment that does not exist
        """
        url = '/' + version + '/collection/col1/experiment/exp10/'
        # Get an experiment that does not exist
        response = self.client.get(url)
        self.assertEqual(response.status_code, 404)

    def test_get_experiment_exist(self):
        """
        Get a valid experiment
        """
        url = '/' + version + '/collection/col1/experiment/exp1/'
        # Get an existing experiment
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['name'], 'exp1')

    def test_post_experiment(self):
        """
        Post a new experiment (valid - the post has all the required data and does not already exist)
        """
        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post a new experiment
        url = '/' + version + '/collection/col1/experiment/exp2'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'isotropic', 'num_time_samples': 10}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

    def test_post_experiment_not_unique(self):
        """
        Post a new experiment with a name that already exists in the database but is unique to the collection
        """
        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post a new experiment
        url = '/' + version + '/collection/col2/experiment/exp1'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'isotropic', 'num_time_samples': 10}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

    def test_post_experiment_no_collection(self):
        """
        Post a new experiment (valid - no collection in the post data; it is picked up from the request URL)
        """
        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post a new experiment
        url = '/' + version + '/collection/col1/experiment/exp2'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'anisotropic', 'num_time_samples': 10, 'dummy': 'dummy'}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

    def test_post_experiment_no_time(self):
        """
        Post a new experiment (valid - no num_time_samples in the post data)
        """
        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post a new experiment
        url = '/' + version + '/collection/col1/experiment/exp2'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'anisotropic'}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

        url = '/' + version + '/collection/col1/experiment/exp2/'
        # Get the new experiment and check the time-sample default
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['name'], 'exp2')
        self.assertEqual(response.data['num_time_samples'], 1)

    def test_post_experiment_exists(self):
        """
        Post a new experiment (invalid - collection and experiment already exist)
        """
        # Get the collection
        url = '/' + version + '/collection/col1/'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)

        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post an experiment that already exists
        url = '/' + version + '/collection/col1/experiment/exp1'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'anisotropic', 'num_time_samples': 10}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 400)

    def test_post_experiment_with_time_step(self):
        """
        Post a new experiment (valid - the post has all the required data, does not already exist, and includes a
        time step)
        """
        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post a new experiment
        url = '/' + version + '/collection/col1/experiment/exp2'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'anisotropic', 'num_time_samples': 10,
                'time_step': 1, 'time_step_unit': 'nanoseconds'}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

    def test_post_experiment_no_data(self):
        """
        Post a new experiment (invalid - the post has no body)
        """
        # Post a new experiment with no body
        url = '/' + version + '/collection/col1/experiment/exp2'
        response = self.client.post(url)
        self.assertEqual(response.status_code, 400)

    def test_put_experiment_exists(self):
        """
        Update an experiment (valid - the experiment exists)
        """
        url = '/' + version + '/collection/col1/experiment/exp1'
        data = {'description': 'A new experiment for unit tests. Updated'}

        # Update an existing experiment
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 200)

    def test_put_experiment_doesnotexist(self):
        """
        Update an experiment that does not exist
        """
        url = '/' + version + '/collection/col1/experiment/exp55'
        data = {'description': 'A new experiment for unit tests. Updated'}
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 404)

    def test_put_experiment_name(self):
        """
        Update experiment name (valid)
        """
        url = '/' + version + '/collection/col1/experiment/exp1'
        data = {'name': 'exp10'}
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 200)

    def test_delete_experiment(self):
        """
        Delete an experiment
        """
        # Get the coordinate frame
        url = '/' + version + '/coord/cf1'
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        cf = response.data['name']

        # Post a new experiment
        url = '/' + version + '/collection/col1/experiment/exp2'
        data = {'description': 'This is a new experiment', 'coord_frame': cf,
                'num_hierarchy_levels': 10, 'hierarchy_method': 'isotropic', 'num_time_samples': 10}
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

        # Delete the new experiment
        url = '/' + version + '/collection/col1/experiment/exp2'
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 204)

        response = self.client.get(url)
        self.assertEqual(response.status_code, 404)
        self.assertEqual(response.json()['code'], 4005)

    def test_delete_experiment_invalid(self):
        """
        Delete an experiment (invalid - violates integrity constraint)
        """
        url = '/' + version + '/collection/col1/experiment/exp1/'
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 400)

    def test_delete_experiment_doesnotexist(self):
        """
        Delete an experiment (invalid - the experiment does not exist)
        """
        url = '/' + version + '/collection/col1/experiment/exp10'
        # Delete an experiment that does not exist
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 404)

    def test_get_experiments(self):
        """
        Get list of experiments for a collection
        """
        url = '/' + version + '/collection/col1/experiment/'
        # List all experiments in the collection
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertCountEqual(response.data['experiments'], TEST_DATA_EXPERIMENTS)
class ResourceViewsCoordinateTests(APITestCase):
    """
    Class to test the resource service for coordinate frame objects
    """

    def setUp(self):
        """
        Initialize the database
        """
        dbsetup = SetupTestDB()
        user = dbsetup.create_user('testuser')
        dbsetup.add_role('resource-manager')
        dbsetup.set_user(user)
        self.client.force_login(user)
        dbsetup.insert_test_data()

    def test_get_coordinateframes(self):
        """
        Get list of coordinate frames
        """
        url = '/' + version + '/coord/'
        # List all coordinate frames
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['coords'][0], 'cf1')

    def test_get_coordinateframes_owner(self):
        """
        Get list of coordinate frames owned by the requester
        """
        url = '/' + version + '/coord/?owner=True'
        # List coordinate frames filtered by owner
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['coords'][0], 'cf1')

    def test_get_coordinateframe_doesnotexist(self):
        """
        Get a coordinate frame that does not exist
        """
        url = '/' + version + '/coord/cf10'
        # Get a coordinate frame that does not exist
        response = self.client.get(url)
        self.assertEqual(response.status_code, 404)

    def test_get_coordinateframe_exist(self):
        """
        Get a valid coordinate frame
        """
        url = '/' + version + '/coord/cf1'
        # Get an existing coordinate frame
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data['name'], 'cf1')

    def test_post_coordinateframe(self):
        """
        Post a new coordinate frame (valid)
        """
        url = '/' + version + '/coord/cf10'
        data = {'description': 'This is a test coordinateframe', 'x_start': 0, 'x_stop': 1000,
                'y_start': 0, 'y_stop': 1000, 'z_start': 0, 'z_stop': 1000,
                'x_voxel_size': 4, 'y_voxel_size': 4, 'z_voxel_size': 4, 'voxel_unit': 'nanometers',
                'time_step_unit': 'nanoseconds'}
        # Create a new coordinate frame
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

    def test_post_coordinateframe_already_exists(self):
        """
        Post a new coordinate frame (invalid - name already exists)
        """
        url = '/' + version + '/coord/cf1'
        data = {'description': 'This is a test coordinateframe', 'x_start': 0, 'x_stop': 1000,
                'y_start': 0, 'y_stop': 1000, 'z_start': 0, 'z_stop': 1000,
                'x_voxel_size': 4, 'y_voxel_size': 4, 'z_voxel_size': 4, 'voxel_unit': 'nanometers',
                'time_step_unit': 'nanoseconds'}
        # Try to create a coordinate frame that already exists
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 400)

    def test_put_coorddinateframe_exists(self):
        """
        Update a coordinate frame (valid - the coordinate frame exists)
        """
        url = '/' + version + '/coord/cf1'
        data = {'description': 'This is a test coordinateframe. Updated'}
        # Update an existing coordinate frame
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 200)

    def test_put_coorddinateframe_extrafields(self):
        """
        Update a coordinate frame with a field that may not be changed (invalid)
        """
        url = '/' + version + '/coord/cf1'
        data = {'description': 'This is a test coordinateframe. Updated', 'x_start': 22}
        # Update an existing coordinate frame with a read-only field
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 400)

    def test_put_coordinateframe_doesnotexist(self):
        """
        Update a coordinate frame that does not exist
        """
        url = '/' + version + '/coord/cf55'
        data = {'description': 'This is a test coordinateframe. Updated'}
        # Update a coordinate frame that does not exist
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 404)

    def test_put_coordinateframe_name(self):
        """
        Update coordinate frame name (valid)
        """
        url = '/' + version + '/coord/cf1'
        data = {'name': 'cf10'}
        # Rename an existing coordinate frame
        response = self.client.put(url, data=data)
        self.assertEqual(response.status_code, 200)

    def test_delete_coordinateframe(self):
        """
        Delete a coordinate frame (valid for an unused frame; invalid when the frame is still referenced)
        """
        url = '/' + version + '/coord/cf55/'
        data = {'description': 'This is a test coordinateframe', 'x_start': 0, 'x_stop': 1000,
                'y_start': 0, 'y_stop': 1000, 'z_start': 0, 'z_stop': 1000,
                'x_voxel_size': 4, 'y_voxel_size': 4, 'z_voxel_size': 4, 'voxel_unit': 'nanometers',
                'time_step_unit': 'nanoseconds', 'time_step': 1}
        # Create a new coordinate frame
        response = self.client.post(url, data=data)
        self.assertEqual(response.status_code, 201)

        # Delete the unused coordinate frame
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 204)

        # Get on the resource should return an error since it is marked for deletion
        response = self.client.get(url)
        resp = response.json()
        self.assertEqual(resp['code'], 4005)

        # Deleting a coordinate frame that is still in use should fail
        url = '/' + version + '/coord/cf1/'
        response = self.client.delete(url)
        resp = response.json()
        self.assertEqual(resp['code'], 4003)

    def test_delete_coordinateframe_invalid(self):
        """
        Delete a coordinate frame (invalid - violates integrity constraint)
        """
        url = '/' + version + '/coord/cf1/'
        # Delete a coordinate frame that is still referenced by an experiment
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 400)

    def test_delete_coordinateframe_doesnotexist(self):
        """
        Delete a coordinate frame (invalid - the coordinate frame does not exist)
        """
        url = '/' + version + '/coord/cf55/'
        # Delete a coordinate frame that does not exist
        response = self.client.delete(url)
        self.assertEqual(response.status_code, 404)
class ResourceViewsChannelTests(APITestCase):
"""
Class to test the resource service
"""
def setUp(self):
"""
Initialize the database
"""
dbsetup = SetupTestDB()
self.super_user = dbsetup.create_super_user()
user = dbsetup.create_user('testuser')
dbsetup.add_role('resource-manager')
dbsetup.set_user(user)
self.client.force_login(user)
dbsetup.insert_test_data()
def test_get_channel_doesnotexist(self):
"""
Get a Channel that does not exist
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel55'
# Get an existing collection
response = self.client.get(url)
self.assertEqual(response.status_code, 404)
def test_get_channel_exist(self):
"""
Get a valid experiment
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1/'
# Get an existing experiment
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['name'], 'channel1')
self.assertEqual(response.data['downsample_status'], 'NOT_DOWNSAMPLED')
def test_post_channel(self):
"""
Post a new channel (Valid - the post has all the required data and does not already exist)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
def test_post_channel_set_cloudvol_storage_no_cv_path(self):
"""
When using CloudVolume storage type w/o providing cv_path, cv_path
should default to /{coll}/{exp}/{chan}.
"""
coll = 'col1'
exp = 'exp1'
chan = 'channel10'
url = '/' + version + f'/collection/{coll}/experiment/{exp}/channel/{chan}/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image',
'storage_type': Channel.StorageType.CLOUD_VOLUME}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
self.assertEqual(response.json()['cv_path'], f'/{coll}/{exp}/{chan}')
def test_post_channel_set_bucket_forbidden_for_non_admins(self):
"""
Only admins should be able to set the bucket name.
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image', 'bucket': 'my.bucket.boss'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 403)
def test_post_channel_set_bucket_as_admin(self):
"""
Only admins should be able to set the bucket name.
"""
self.client.force_login(self.super_user)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
bucket_name = 'my.bucket.boss'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image', 'bucket': bucket_name}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
self.assertEqual(response.data['bucket'], bucket_name)
def test_post_channel_spdb_with_cv_path(self):
"""
Setting cv_path when storage_type != CloudVolume is invalid.
"""
self.client.force_login(self.super_user)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image',
'cv_path': '/custom/cv', 'storage_type': Channel.StorageType.SPDB}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 400)
def test_post_channel_cloudvol_with_cv_path_forbidden_when_not_admin(self):
"""
Setting cv_path is isvalid when not an admin.
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image',
'cv_path': '/custom/cv', 'storage_type': Channel.StorageType.CLOUD_VOLUME}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 403)
def test_post_channel_cloudvol_with_cv_path_as_admin(self):
"""
Setting cv_path when storage_type == CloudVolume is valid when done as admin.
"""
self.client.force_login(self.super_user)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
cv_path = '/custom/cv'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image',
'cv_path': cv_path, 'storage_type': Channel.StorageType.CLOUD_VOLUME}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
self.assertEqual(response.json()['cv_path'], cv_path)
def test_post_channel_with_valid_timestep(self):
"""
Post a new channel with the default_time_step
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image', 'default_time_sample': 5}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
def test_post_channel_with_invalid_timestep(self):
"""
Post a new channel with the default_time_step
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image', 'default_time_sample': 15}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 400)
def test_post_channel_no_experiment(self):
"""
Post a new channel (valid - No experiment in the post data. This is picked up from the request)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'type': 'image', 'datatype': 'uint8'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
def test_post_channel_exists(self):
"""
Post a new channel (invalid - Collection,experiment, channel already exist)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1/'
data = {'description': 'This is a new channel', 'type': 'image', 'datatype': 'uint8'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 400)
def test_post_channel_annotation_without_source(self):
"""
Post a new channel of type annotation w/o providing a source channel.
This used to be forbidden but we decided to allow this.
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
def test_post_channel_annotation_with_source(self):
"""
Post a new channel of type annotation
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1'], 'related': ['channel2']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
# Get the new channel
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['sources'], ['channel1'])
self.assertEqual(response.data['related'], ['channel2'])
def test_post_channel_annotation_with_multiple_sources(self):
"""
Post a new channel of type annotation with multiple sources (valid)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1', 'channel2']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
# Get the new channel
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['sources'], ['channel1', 'channel2'])
# Ensure that this is Asymmetrical
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['sources'], [])
def test_post_channel_annotation_with_common_source_related(self):
"""
Post a new channel of type annotation (invalid - the same channel appears in both sources and related)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1'],
'related': ['channel1', 'channel3']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 400)
def test_post_channel_bad_source(self):
"""
Post a new channel of type annotation (invalid - source channel does not exist)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1eeee'],
'related': ['channel1', 'channel3']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 400)
def test_post_channel_bad_related(self):
"""
Post a new channel of type annotation (invalid - related channel does not exist)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1'],
'related': ['channel3eee']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 400)
def test_post_channel_annotation_with_multiple_related(self):
"""
Post a new channel of type annotation with multiple related channels (valid)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1'],
'related': ['channel2', 'channel3']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
# Get the new channel
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['related'], ['channel2', 'channel3'])
# Make sure it is symmetrical
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel2/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['related'], ['channel33'])
def test_put_channel(self):
"""
Update a channel (Valid - The channel exists)
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'description': 'A new channel for unit tests. Updated'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
def test_put_channel_set_cv_path_forbidden_for_non_admins(self):
"""
Update a channel (Invalid - only admins can set cv_path)
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'cv_path': '/my/custom/cv/dataset'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 403)
def test_put_channel_set_cv_path_as_admin(self):
"""
Update a channel's bucket (Valid - admins can set cv_path)
"""
self.client.force_login(self.super_user)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'cv_path': '/my/custom/cv/dataset'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
def test_put_channel_set_bucket_forbidden_for_non_admins(self):
"""
Update a channel (Invalid - only admins can set the bucket name)
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'bucket': 'new.bucket.boss'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 403)
def test_put_channel_set_bucket_as_admin(self):
"""
Update a channel's bucket (Valid - admins can set the bucket name)
"""
self.client.force_login(self.super_user)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'bucket': 'new.bucket.boss'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
def test_put_channel_set_storage_type_forbidden_for_non_admins(self):
"""
Update a channel (Invalid - only admins can set the storage type after creation)
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'storage_type': Channel.StorageType.CLOUD_VOLUME}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 403)
def test_put_channel_set_storage_type_as_admin(self):
"""
Update a channel's storage type (Valid - admins can change this after creation)
"""
self.client.force_login(self.super_user)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'storage_type': Channel.StorageType.CLOUD_VOLUME}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
def test_put_channel_source(self):
"""
Update a channel (Valid - The channel exists)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1'],
'related': ['channel2', 'channel3']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
data = {'description': 'A new channel for unit tests. Updated', 'default_time_sample': 1,
'sources': ['channel2'],
'related': ['channel3']
}
# Update the existing channel
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
def test_put_channel_downsample(self):
"""
Try to update the downsample properties of a channel (invalid - these fields are read-only)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1'],
'related': ['channel2', 'channel3']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
data = {'downsample_status': 'DOWNSAMPLED'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 400)
data = {'downsample_arn': 'asdfasfasdf'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 400)
def test_put_channel_remove_source(self):
"""
Update a channel (Valid - The channel exists)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint8',
'sources': ['channel1', 'channel2']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
# Get an existing channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(set(response.data['sources']), {'channel1', 'channel2'})
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33'
data = {'description': 'A new channel for unit tests. Updated', 'sources': ['channel2']}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
# Get an existing channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(set(response.data['sources']), {'channel2'})
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33'
data = {'description': 'A new channel for unit tests. Updated', 'sources': []}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
# Get an existing channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['sources'], [])
def test_put_channel_remove_related(self):
"""
Update a channel (Valid - The channel exists)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'image', 'datatype': 'uint8',
'related': ['channel1', 'channel2']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
# Get an existing channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(set(response.data['related']), {'channel1', 'channel2'})
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33'
data = {'description': 'A new channel for unit tests. Updated', 'related': ['channel2']}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
# Get an existing channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(set(response.data['related']), {'channel2'})
def test_put_channel_doesnotexist(self):
"""
Update a channel that does not exist
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel55/'
data = {'description': 'A new channel for unit tests. Updated'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 404)
def test_put_channel_name(self):
"""
Update channel name (valid)
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1/'
data = {'name': 'channel10'}
response = self.client.put(url, data=data)
self.assertEqual(response.status_code, 200)
def test_delete_channel(self):
"""
Delete a channel
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10/'
data = {'description': 'This is a new channel', 'datatype': 'uint8', 'type': 'image'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10'
response = self.client.delete(url)
self.assertEqual(response.status_code, 204)
response = self.client.get(url)
self.assertEqual(response.status_code, 404)
self.assertEqual(response.json()['code'], 4005)
def test_delete_channel_invalid(self):
"""
Delete a channel (invalid - Violates integrity constraint because channels are linked to it)
"""
# Post a new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint64',
'sources': ['channel1'], 'related': ['channel2']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1'
response = self.client.delete(url)
self.assertEqual(response.status_code, 400)
# Ensure channel still exists
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel1/'
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
def test_delete_channel_ignore_derived_channels_marked_for_deletion(self):
"""
Delete a channel (allow when all derived channels are marked for deletion)
"""
# Post new channels
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel11/'
data = {'description': 'This is a new source channel', 'type': 'image', 'datatype': 'uint8'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel22/'
data = {'description': 'This is a new related channel', 'type': 'image', 'datatype': 'uint8'}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
data = {'description': 'This is a new channel', 'type': 'annotation', 'datatype': 'uint64',
'sources': ['channel11'], 'related': ['channel22']}
response = self.client.post(url, data=data)
self.assertEqual(response.status_code, 201)
# Delete the new channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel33/'
response = self.client.delete(url)
self.assertEqual(response.status_code, 204)
# Delete the source channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel11'
response = self.client.delete(url)
self.assertEqual(response.status_code, 204)
# Delete the related channel
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel22'
response = self.client.delete(url)
self.assertEqual(response.status_code, 204)
def test_delete_channel_doesnotexist(self):
"""
Delete a channel (invalid - The channel does not exist )
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/channel10'
response = self.client.delete(url)
self.assertEqual(response.status_code, 404)
def test_get_channels(self):
"""
Get the list of channels
"""
url = '/' + version + '/collection/col1/experiment/exp1/channel/'
# Get an existing channel
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['channels'][0], 'channel1')
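The linking semantics the tests above assert — `sources` is asymmetric (only the posting channel records the link) while `related` is symmetric (both channels record it) — can be sketched with a minimal in-memory model. The dict layout and `link` helper below are hypothetical illustrations, not part of the Boss API:

```python
# Hypothetical sketch of the channel-linking rules asserted by the tests:
# 'sources' is one-way, 'related' is mirrored on both channels.
channels = {name: {"sources": [], "related": []}
            for name in ("channel1", "channel2", "channel33")}

def link(name, sources=(), related=()):
    for src in sources:
        channels[name]["sources"].append(src)   # one-way: src is unchanged
    for rel in related:
        channels[name]["related"].append(rel)   # two-way: mirror the link
        channels[rel]["related"].append(name)

link("channel33", sources=["channel1"], related=["channel2"])
print(channels["channel1"]["sources"])   # [] (asymmetric)
print(channels["channel2"]["related"])   # ['channel33'] (symmetric)
```

This mirrors the assertions in `test_post_channel_annotation_with_multiple_sources` (channel1 keeps empty sources) and `test_post_channel_annotation_with_multiple_related` (channel2 reports channel33 as related).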
# File: nodeeditor/PyFlowGraph.py (repo: madhusenthilvel/NodeEditor, license: MIT)
import FreeCAD, FreeCADGui
import nodeeditor.PythonObjects
from nodeeditor.PythonObjects import FeaturePython,ViewProvider
import nodeeditor.pfwrap as pfwrap
from nodeeditor.say import *
class _PyFlowGraph(FeaturePython):
def __init__(self,obj):
FeaturePython.__init__(self, obj)
obj.Proxy = self
self.Type = self.__class__.__name__
class _PyFlowGraphViewProvider(ViewProvider):
def recompute(self):
obj=self.Object
say("Recompute ",obj.Label)
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
a=PyFlowGraph()
data=eval(a.graph)
instance.loadFromData(data)
pfwrap.getInstance().show()
def setupContextMenu(self, obj, menu):
action = menu.addAction("load and show Graph ...")
action.triggered.connect(self.recompute)
action = menu.addAction("show PyFlow ...")
action.triggered.connect(self.showPyFlow)
action = menu.addAction("hide PyFlow ...")
action.triggered.connect(self.hidePyFlow)
action = menu.addAction("clear Graph ...")
action.triggered.connect(self.clearGraph)
def hidePyFlow(self):
pfwrap.deleteInstance()
def showPyFlow(self):
try:
FreeCAD.PF.hide()
except:
pass
pfwrap.getInstance().show()
def clearGraph(self):
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
def setEdit(self,vobj,mode=0):
say("set edit deactivated")
self.recompute()
return False
# application classes
def PyFlowGraph():
name="PyFlowGraph"
obj = FreeCAD.ActiveDocument.getObject(name)
if obj == None:
#obj = FreeCAD.ActiveDocument.addObject("Part::FeaturePython",name)
obj=FreeCAD.ActiveDocument.addObject("App::DocumentObjectGroupPython",name)
obj.addProperty("App::PropertyString", "graph", "Data","serialized data of the flow graph")
_PyFlowGraph(obj)
_PyFlowGraphViewProvider(obj.ViewObject,'/home/thomas/.FreeCAD/Mod.PyFlow/NodeEditor/icons/BB.svg')
return obj
import time
import sys
if sys.version_info[0] != 2:
from importlib import reload
class _PyFlowRef(FeaturePython):
def __init__(self,obj):
FeaturePython.__init__(self, obj)
obj.Proxy = self
self.Type = self.__class__.__name__
self.lastExec=0
def myExecute(self,fp):
if not fp.ViewObject.Visibility:
sayl(fp.Label,"hidden --no execute")
return
try:
_=self.lastExec
except:
self.lastExec=0
say ("pause",self.lastExec+fp.pauseAfter*0.001 -time.time())
if self.lastExec+fp.pauseAfter*0.001>time.time():
sayl("still pausing ...")
say (self.lastExec+fp.pauseAfter*0.001 -time.time())
return
self.lastExec = time.time()
say("My Execute")
import nodeeditor.dev
reload (nodeeditor.dev)
nodeeditor.dev.myExecute_PyFlowRef(self,fp)
class _PyFlowRefViewProvider(ViewProvider):
def recompute(self):
obj=self.Object
say("Recompute ",obj.Label)
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
a=PyFlowGraph()
data=eval(a.graph)
instance.loadFromData(data)
pfwrap.getInstance().show()
def XsetupContextMenu(self, obj, menu):
action = menu.addAction("load and show Graph ...")
action.triggered.connect(self.recompute)
action = menu.addAction("show PyFlow ...")
action.triggered.connect(self.showPyFlow)
action = menu.addAction("hide PyFlow ...")
action.triggered.connect(self.hidePyFlow)
action = menu.addAction("clear Graph ...")
action.triggered.connect(self.clearGraph)
def hidePyFlow(self):
pfwrap.deleteInstance()
def showPyFlow(self):
try:
FreeCAD.PF.hide()
except:
pass
pfwrap.getInstance().show()
def clearGraph(self):
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
def setEdit(self,vobj,mode=0):
say("set edit deactivated")
self.recompute()
return False
# application classes
def PyFlowRef(name="Ref2",):
obj = FreeCAD.ActiveDocument.getObject(name)
if 1 or obj == None:
obj = FreeCAD.ActiveDocument.addObject("Part::FeaturePython",name)
#obj=FreeCAD.ActiveDocument.addObject("App::DocumentObjectGroupPython",name)
obj.addProperty("App::PropertyString", "refname", "Data","name of the node in pyflow")
obj.addProperty("App::PropertyLinkList", "sources", "Data",)
obj.addProperty("App::PropertyInteger", "pauseAfter", "_aux","minimum time between consecutive recomputes")
obj.pauseAfter=1000
_PyFlowRef(obj)
_PyFlowRefViewProvider(obj.ViewObject,'/home/thomas/.FreeCAD/Mod.PyFlow/NodeEditor/icons/BB.svg')
say(obj)
#obj.myExecute()
return obj
class _Blinker(FeaturePython):
def __init__(self,obj):
FeaturePython.__init__(self, obj)
obj.Proxy = self
self.Type = self.__class__.__name__
self.lastExec=0
def myExecute(self,fp):
if not fp.ViewObject.Visibility:
sayl(fp.Label,"hidden --no execute")
return
import nodeeditor.dev
reload (nodeeditor.dev)
nodeeditor.dev.myExecute_Blinker(self,fp)
class _BlinkerViewProvider(ViewProvider):
def recompute(self):
obj=self.Object
say("Recompute ",obj.Label)
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
a=PyFlowGraph()
data=eval(a.graph)
instance.loadFromData(data)
pfwrap.getInstance().show()
def XsetupContextMenu(self, obj, menu):
action = menu.addAction("load and show Graph ...")
action.triggered.connect(self.recompute)
action = menu.addAction("show PyFlow ...")
action.triggered.connect(self.showPyFlow)
action = menu.addAction("hide PyFlow ...")
action.triggered.connect(self.hidePyFlow)
action = menu.addAction("clear Graph ...")
action.triggered.connect(self.clearGraph)
def hidePyFlow(self):
pfwrap.deleteInstance()
def showPyFlow(self):
try:
FreeCAD.PF.hide()
except:
pass
pfwrap.getInstance().show()
def clearGraph(self):
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
def setEdit(self,vobj,mode=0):
say("set edit deactivated")
self.recompute()
return False
# application classes
def Blinker(name="Document_Blinker",):
obj = FreeCAD.ActiveDocument.getObject(name)
if 1 or obj == None:
obj = FreeCAD.ActiveDocument.addObject("Part::FeaturePython",name)
#obj=FreeCAD.ActiveDocument.addObject("App::DocumentObjectGroupPython",name)
obj.addProperty("App::PropertyString", "signalName", "Data","name of the signal")
obj.signalName='blink'
obj.addProperty("App::PropertyLinkList", "sources", "Data",)
_Blinker(obj)
_BlinkerViewProvider(obj.ViewObject,'/home/thomas/.FreeCAD/Mod.PyFlow/NodeEditor/icons/BB.svg')
say(obj)
#obj.myExecute()
return obj
class _Receiver(FeaturePython):
def __init__(self,obj):
FeaturePython.__init__(self, obj)
obj.Proxy = self
self.Type = self.__class__.__name__
self.lastExec=0
def myExecute(self,fp):
if not fp.ViewObject.Visibility:
sayl(fp.Label,"hidden --no execute")
return
import nodeeditor.dev
reload (nodeeditor.dev)
nodeeditor.dev.myExecute_Receiver(self,fp)
class _ReceiverViewProvider(ViewProvider):
def recompute(self):
obj=self.Object
say("Recompute ",obj.Label)
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
a=PyFlowGraph()
data=eval(a.graph)
instance.loadFromData(data)
pfwrap.getInstance().show()
def XsetupContextMenu(self, obj, menu):
action = menu.addAction("load and show Graph ...")
action.triggered.connect(self.recompute)
action = menu.addAction("show PyFlow ...")
action.triggered.connect(self.showPyFlow)
action = menu.addAction("hide PyFlow ...")
action.triggered.connect(self.hidePyFlow)
action = menu.addAction("clear Graph ...")
action.triggered.connect(self.clearGraph)
def hidePyFlow(self):
pfwrap.deleteInstance()
def showPyFlow(self):
try:
FreeCAD.PF.hide()
except:
pass
pfwrap.getInstance().show()
def clearGraph(self):
instance=pfwrap.getInstance()
instance.graphManager.get().clear()
def setEdit(self,vobj,mode=0):
say("set edit deactivated")
self.recompute()
return False
# application classes
def Receiver(name="Document_Receiver",):
obj = FreeCAD.ActiveDocument.getObject(name)
if 1 or obj == None:
obj = FreeCAD.ActiveDocument.addObject("Part::FeaturePython",name)
#obj=FreeCAD.ActiveDocument.addObject("App::DocumentObjectGroupPython",name)
obj.addProperty("App::PropertyString", "senderName", "Data","name of the signal sender")
_Receiver(obj)
_ReceiverViewProvider(obj.ViewObject,'/home/thomas/.FreeCAD/Mod.PyFlow/NodeEditor/icons/BB.svg')
say(obj)
#obj.myExecute()
return obj
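The `pauseAfter` throttling used in `_PyFlowRef.myExecute` above can be isolated into a plain-Python sketch. The class name, default, and `now` parameter below are illustrative only (not part of the FreeCAD API); the comparison logic matches the `lastExec + pauseAfter * 0.001 > time.time()` check in the module:

```python
import time

class RecomputeThrottle:
    """Sketch of the pauseAfter logic in _PyFlowRef.myExecute:
    skip a recompute that arrives within pause_ms of the previous one."""
    def __init__(self, pause_ms=1000):
        self.pause_ms = pause_ms
        self.last_exec = 0.0

    def should_run(self, now=None):
        now = time.time() if now is None else now
        if self.last_exec + self.pause_ms * 0.001 > now:
            return False  # still pausing
        self.last_exec = now
        return True

t = RecomputeThrottle(pause_ms=1000)
print(t.should_run(now=10.0))  # True: first call always runs
print(t.should_run(now=10.5))  # False: within the 1 s pause
print(t.should_run(now=11.1))  # True: pause elapsed
```

Keeping the timestamp update inside the guarded branch means skipped calls do not extend the pause window, which matches the behavior of `myExecute`.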
# File: userbot/plugins/iloveyou.py (repo: LUCKYRAJPUTOP/EllipsUserbot, license: MIT)
# PLUGIN MADE BY @H1M4N5HU0P FOR darkbot
# Keep the credits above when reusing this plugin
from darkbot.utils import admin_cmd
@borg.on(admin_cmd(pattern="iloveyou ?(.*)"))
async def _(event):
if not event.text[0].isalpha() and event.text[0] not in ("/", "#", "@", "!"):
await event.edit("""😘😘😘😘😘😘😘😘
😘😘😘😘😘😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘
😘😘😘😘😘😘😘😘
😘😘😘😘😘😘😘😘\n
😘😘
😘😘
😘😘
😘😘
😘😘
😘😘
😘😘
😘😘
😘😘😘😘😘😘😘😘
😘😘😘😘😘😘😘😘\n
😘😘😘😘😘
😘😘😘😘😘😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘😘😘😘😘😘
😘😘😘😘😘\n
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘😘
😘\n
😘😘😘😘😘😘😘😘
😘😘😘😘😘😘😘😘
😘😘
😘😘
😘😘😘😘😘😘
😘😘😘😘😘😘
😘😘
😘😘
😘😘😘😘😘😘😘😘
😘😘😘😘😘😘😘😘\n
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘😘
😘😘
😘😘
😘😘
😘😘
😘😘\n
😘😘😘😘😘😘
😘😘😘😘😘😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘😘😘😘😘😘
😘😘😘😘😘\n
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘 😘😘
😘😘😘😘😘😘
😘😘😘😘""")
# File: frame/__init__.py (repo: shaxov/spy-eye, license: MIT)
from . import filters
from . import drawer
# File: roles/fortigate.ztp/files/python/organizational_workflow_data_model/__init__.py (repo: ftntcorecse/FA-ZTP, license: BSD-2-Clause)
from .organizational_workflow_data_model import (
    Locations, DeviceLocations, FortiGates, StaticRoutes,
    Networks, FortiSwitches, FortiSwitchPorts, FortiAPs,
    SSIDs, APProfiles, IPSec, SDWANInterfaces, SDWANRules, SDWANSLAs,
    BGPNeighbors, BGPNeighborGroups, BGPRouteMaps, BGPCommunityLists,
    PolicyPackages, ADOMPolicies, GlobalPolicies,
    AddressGroups, ServiceGroups,
)
ac131b62d23e9d1157bc4bcfced2983488c286fc | 84 | py | Python | utils/string_utils.py | sesam-community/content-extractor2 | 54eb604c2c82f4915028a76d8ca74dff5a7388ea | [
"Apache-2.0"
] | 1 | 2019-03-13T09:49:27.000Z | 2019-03-13T09:49:27.000Z | utils/string_utils.py | timurgen/content-extractor | ca0ae3c30320e054d940b8161c4fdca92ed646b1 | [
"Apache-2.0"
] | 1 | 2019-11-28T10:49:57.000Z | 2019-11-28T10:49:57.000Z | utils/string_utils.py | sesam-community/content-extractor2 | 54eb604c2c82f4915028a76d8ca74dff5a7388ea | [
"Apache-2.0"
] | 1 | 2019-03-13T09:18:58.000Z | 2019-03-13T09:18:58.000Z | def str_to_bool(s: str) -> bool:
return s in ('True', 'true')
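A quick behavioral check of `str_to_bool` (the function is restated here so the snippet is self-contained): the comparison is case-sensitive, so only the exact spellings `'True'` and `'true'` map to `True`, and every other string maps to `False`.

```python
def str_to_bool(s: str) -> bool:
    # Equivalent to the original (s == 'True' or s == 'true') or False
    return s in ('True', 'true')

assert str_to_bool('true') is True
assert str_to_bool('True') is True
assert str_to_bool('TRUE') is False  # case-sensitive by design
assert str_to_bool('1') is False     # no numeric coercion
```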
# File: app/tests/crud/test_crud_user.py (repo: wlsouza/cashbackgb, license: MIT)
import pytest
from sqlalchemy.ext.asyncio import AsyncSession
from app import crud, models, schemas
from app.core.security import verify_password
from app.tests.utils.user import fake, random_user_dict
@pytest.mark.asyncio
async def test_create_user_by_schema(db: AsyncSession) -> None:
user_dict = random_user_dict()
user_in = schemas.UserCreate(**user_dict)
new_user = await crud.user.create(db=db, user_in=user_in)
assert new_user.email == user_in.email
@pytest.mark.asyncio
async def test_create_user_by_dict(db: AsyncSession) -> None:
user_dict = random_user_dict()
new_user = await crud.user.create(db=db, user_in=user_dict)
assert new_user.email == user_dict["email"]
@pytest.mark.asyncio
async def test_when_create_user_return_hashed_password(
db: AsyncSession,
) -> None:
user_dict = random_user_dict()
user_in = schemas.UserCreate(**user_dict)
new_user = await crud.user.create(db=db, user_in=user_in)
assert hasattr(new_user, "hashed_password")
@pytest.mark.asyncio
async def test_when_create_user_return_valid_hashed_password(
db: AsyncSession,
) -> None:
user_dict = random_user_dict()
user_in = schemas.UserCreate(**user_dict)
new_user = await crud.user.create(db=db, user_in=user_in)
result = verify_password(user_dict["password"], new_user.hashed_password)
assert result
@pytest.mark.asyncio
async def test_if_get_by_email_return_correct_user(db: AsyncSession) -> None:
new_user = await crud.user.create(db=db, user_in=random_user_dict())
returned_user = await crud.user.get_by_email(db=db, email=new_user.email)
assert returned_user.id == new_user.id
@pytest.mark.asyncio
async def test_if_get_by_id_return_correct_user(db: AsyncSession) -> None:
new_user = await crud.user.create(db=db, user_in=random_user_dict())
returned_user = await crud.user.get_by_id(db=db, id=new_user.id)
assert returned_user.id == new_user.id
@pytest.mark.asyncio
async def test_if_get_by_cpf_return_correct_user(db: AsyncSession) -> None:
new_user = await crud.user.create(db=db, user_in=random_user_dict())
returned_user = await crud.user.get_by_cpf(db=db, cpf=new_user.cpf)
assert returned_user.id == new_user.id
@pytest.mark.asyncio
async def test_if_delete_by_id_really_delete_the_user(db: AsyncSession):
user_dict = random_user_dict()
new_user = await crud.user.create(db=db, user_in=user_dict)
await crud.user.delete_by_id(db=db, id=new_user.id)
returned_user = await crud.user.get_by_id(db=db, id=new_user.id)
assert returned_user is None
@pytest.mark.asyncio
async def test_update_user_by_userupdateput_schema(db: AsyncSession) -> None:
user_dict = random_user_dict()
new_user = await crud.user.create(db=db, user_in=user_dict)
user_update_in = schemas.UserUpdatePUT(**random_user_dict())
updated_user = await crud.user.update(
db=db, db_user=new_user, user_in=user_update_in
)
assert updated_user.email == user_update_in.email
@pytest.mark.asyncio
async def test_update_user_by_userupdatepatch_schema(db: AsyncSession) -> None:
user_dict = random_user_dict()
new_user = await crud.user.create(db=db, user_in=user_dict)
user_update_in = schemas.UserUpdatePATCH(email=fake.free_email())
updated_user = await crud.user.update(
db=db, db_user=new_user, user_in=user_update_in
)
assert updated_user.email == user_update_in.email
@pytest.mark.asyncio
async def test_update_user_by_dict(db: AsyncSession) -> None:
user_dict = random_user_dict()
new_user = await crud.user.create(db=db, user_in=user_dict)
user_update_in = {"email": fake.free_email()}
updated_user = await crud.user.update(
db=db, db_user=new_user, user_in=user_update_in
)
assert updated_user.email == user_update_in["email"]
@pytest.mark.asyncio
async def test_if_get_multi_return_a_list_of_users(db: AsyncSession) -> None:
user_dict = random_user_dict()
await crud.user.create(db=db, user_in=user_dict)
users = await crud.user.get_multi(db=db, limit=1)
assert isinstance(users, list)
@pytest.mark.asyncio
async def test_if_get_multi_return_the_correct_quantity_of_user(
db: AsyncSession,
) -> None:
for _ in range(3):
user_dict = random_user_dict()
await crud.user.create(db=db, user_in=user_dict)
users = await crud.user.get_multi(db=db, limit=2)
assert len(users) == 2
@pytest.mark.asyncio
async def test_if_get_multi_skip_the_correct_quantity_of_user(
db: AsyncSession,
) -> None:
for _ in range(5):
user_dict = random_user_dict()
await crud.user.create(db=db, user_in=user_dict)
db_users = await crud.user.get_multi(db=db, limit=5)
users = await crud.user.get_multi(db=db, skip=2, limit=1)
assert users[0].id == db_users[2].id
@pytest.mark.asyncio
async def test_when_successfully_get_authenticated_user_must_return_user(
db: AsyncSession,
) -> None:
user_dict = random_user_dict()
await crud.user.create(db=db, user_in=user_dict)
result = await crud.user.get_authenticated_user(
db=db,
user_email=user_dict["email"],
user_password=user_dict["password"],
)
assert isinstance(result, models.User)
@pytest.mark.asyncio
async def test_when_getting_authenticated_user_if_invalid_email_must_return_none(
db: AsyncSession,
) -> None:
user_dict = random_user_dict()
await crud.user.create(db=db, user_in=user_dict)
result = await crud.user.get_authenticated_user(
db=db,
user_email="invalid_email@test.com",
user_password=user_dict["password"],
)
assert result is None
@pytest.mark.asyncio
async def test_when_getting_authenticated_user_if_invalid_password_must_return_none(
db: AsyncSession,
) -> None:
user_dict = random_user_dict()
await crud.user.create(db=db, user_in=user_dict)
result = await crud.user.get_authenticated_user(
db=db,
user_email=user_dict["email"],
user_password="invalid_password_test",
)
assert result is None
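The tests above depend on a `db` fixture and on helpers (`fake`, `random_user_dict`) imported from `app.tests.utils.user`, none of which appear in this file. Below is a minimal, hypothetical sketch of what `random_user_dict` might look like; the field names (`email`, `password`, `cpf`) are inferred from the assertions, and the real helper (which uses a Faker instance) may differ:

```python
# Hypothetical sketch of the helper the tests above assume. Field names are
# inferred from the test assertions; the real app.tests.utils.user module
# may generate values differently (e.g. via Faker).
import secrets
import string


def _random_string(length: int = 12) -> str:
    """Return a random lowercase/digit string for emails and passwords."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))


def random_user_dict() -> dict:
    """Build a unique user payload matching the fields the tests assert on."""
    return {
        "email": f"{_random_string(8)}@example.com",
        "password": _random_string(16),
        "cpf": "".join(secrets.choice(string.digits) for _ in range(11)),
    }
```

Because each call draws fresh random values, tests that create several users in one database session do not collide on unique columns such as `email` or `cpf`.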

# File: sdk/remoterendering/azure-mixedreality-remoterendering/azure/mixedreality/remoterendering/_generated/operations/_remote_rendering_operations.py
# Repo: azure-sdk-for-python (MIT license)

# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import TYPE_CHECKING
import warnings
from azure.core.exceptions import HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.paging import ItemPaged
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import HttpRequest, HttpResponse
from .. import models
if TYPE_CHECKING:
# pylint: disable=unused-import,ungrouped-imports
from typing import Any, Callable, Dict, Generic, Iterable, Optional, TypeVar, Union
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, HttpResponse], T, Dict[str, Any]], Any]]
class RemoteRenderingOperations(object):
"""RemoteRenderingOperations operations.
You should not instantiate this class directly. Instead, you should create a Client instance that
instantiates it for you and attaches it as an attribute.
:ivar models: Alias to model classes used in this operation group.
:type models: ~azure.mixedreality.remoterendering._generated.models
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
def create_conversion(
self,
account_id, # type: str
conversion_id, # type: str
body, # type: "models.CreateAssetConversionSettings"
**kwargs # type: Any
):
# type: (...) -> "models.AssetConversion"
"""Creates a conversion using an asset stored in an Azure Blob Storage account.
Creates a conversion using an asset stored in an Azure Blob Storage account.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:param conversion_id: An ID uniquely identifying the conversion for the given account. The ID
is case sensitive, can contain any combination of alphanumeric characters including hyphens and
underscores, and cannot contain more than 256 characters.
:type conversion_id: str
:param body: Request body configuring the settings for an asset conversion.
:type body: ~azure.mixedreality.remoterendering._generated.models.CreateAssetConversionSettings
:keyword callable cls: A custom type or function that will be passed the direct response
:return: AssetConversion, or the result of cls(response)
:rtype: ~azure.mixedreality.remoterendering._generated.models.AssetConversion
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.AssetConversion"]
error_map = {
404: ResourceNotFoundError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
409: lambda response: ResourceExistsError(response=response, model=self._deserialize(models.ErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
content_type = kwargs.pop("content_type", "application/json")
# Construct URL
url = self.create_conversion.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
'conversion_id': self._serialize.url("conversion_id", conversion_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = 'application/json'
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(body, 'CreateAssetConversionSettings')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 201]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
response_headers = {}
if response.status_code == 200:
response_headers['MS-CV']=self._deserialize('str', response.headers.get('MS-CV'))
deserialized = self._deserialize('AssetConversion', pipeline_response)
if response.status_code == 201:
response_headers['MS-CV']=self._deserialize('str', response.headers.get('MS-CV'))
deserialized = self._deserialize('AssetConversion', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, response_headers)
return deserialized
create_conversion.metadata = {'url': '/accounts/{account_id}/conversions/{conversion_id}'} # type: ignore
def get_conversion(
self,
account_id, # type: str
conversion_id, # type: str
**kwargs # type: Any
):
# type: (...) -> "models.AssetConversion"
"""Gets the status of a particular conversion.
Gets the status of a particular conversion.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:param conversion_id: An ID uniquely identifying the conversion for the given account. The ID
is case sensitive, can contain any combination of alphanumeric characters including hyphens and
underscores, and cannot contain more than 256 characters.
:type conversion_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: AssetConversion, or the result of cls(response)
:rtype: ~azure.mixedreality.remoterendering._generated.models.AssetConversion
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.AssetConversion"]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
# Construct URL
url = self.get_conversion.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
'conversion_id': self._serialize.url("conversion_id", conversion_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = 'application/json'
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
response_headers = {}
response_headers['MS-CV']=self._deserialize('str', response.headers.get('MS-CV'))
response_headers['Retry-After']=self._deserialize('int', response.headers.get('Retry-After'))
deserialized = self._deserialize('AssetConversion', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, response_headers)
return deserialized
get_conversion.metadata = {'url': '/accounts/{account_id}/conversions/{conversion_id}'} # type: ignore
def list_conversions(
self,
account_id, # type: str
**kwargs # type: Any
):
# type: (...) -> Iterable["models.ConversionList"]
"""Gets a list of all conversions.
Gets a list of all conversions.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either ConversionList or the result of cls(response)
:rtype: ~azure.core.paging.ItemPaged[~azure.mixedreality.remoterendering._generated.models.ConversionList]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.ConversionList"]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = 'application/json'
if not next_link:
# Construct URL
url = self.list_conversions.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
request = self._client.get(url, query_parameters, header_parameters)
return request
def extract_data(pipeline_response):
deserialized = self._deserialize('ConversionList', pipeline_response)
list_of_elem = deserialized.conversions
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.next_link or None, iter(list_of_elem)
def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
return pipeline_response
return ItemPaged(
get_next, extract_data
)
list_conversions.metadata = {'url': '/accounts/{account_id}/conversions'} # type: ignore
def create_session(
self,
account_id, # type: str
session_id, # type: str
body, # type: "models.CreateRenderingSessionSettings"
**kwargs # type: Any
):
# type: (...) -> "models.RenderingSession"
"""Creates a new rendering session.
Creates a new rendering session.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:param session_id: An ID uniquely identifying the rendering session for the given account. The
ID is case sensitive, can contain any combination of alphanumeric characters including hyphens
and underscores, and cannot contain more than 256 characters.
:type session_id: str
:param body: Settings of the session to be created.
:type body: ~azure.mixedreality.remoterendering._generated.models.CreateRenderingSessionSettings
:keyword callable cls: A custom type or function that will be passed the direct response
:return: RenderingSession, or the result of cls(response)
:rtype: ~azure.mixedreality.remoterendering._generated.models.RenderingSession
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.RenderingSession"]
error_map = {
404: ResourceNotFoundError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
409: lambda response: ResourceExistsError(response=response, model=self._deserialize(models.ErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
content_type = kwargs.pop("content_type", "application/json")
# Construct URL
url = self.create_session.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
'session_id': self._serialize.url("session_id", session_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = 'application/json'
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(body, 'CreateRenderingSessionSettings')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 201]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
response_headers = {}
if response.status_code == 200:
deserialized = self._deserialize('RenderingSession', pipeline_response)
if response.status_code == 201:
response_headers['MS-CV']=self._deserialize('str', response.headers.get('MS-CV'))
deserialized = self._deserialize('RenderingSession', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, response_headers)
return deserialized
create_session.metadata = {'url': '/accounts/{account_id}/sessions/{session_id}'} # type: ignore
def get_session(
self,
account_id, # type: str
session_id, # type: str
**kwargs # type: Any
):
# type: (...) -> "models.RenderingSession"
"""Gets the properties of a particular rendering session.
Gets the properties of a particular rendering session.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:param session_id: An ID uniquely identifying the rendering session for the given account. The
ID is case sensitive, can contain any combination of alphanumeric characters including hyphens
and underscores, and cannot contain more than 256 characters.
:type session_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: RenderingSession, or the result of cls(response)
:rtype: ~azure.mixedreality.remoterendering._generated.models.RenderingSession
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.RenderingSession"]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
# Construct URL
url = self.get_session.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
'session_id': self._serialize.url("session_id", session_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = 'application/json'
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('RenderingSession', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_session.metadata = {'url': '/accounts/{account_id}/sessions/{session_id}'} # type: ignore
def update_session(
self,
account_id, # type: str
session_id, # type: str
body, # type: "models.UpdateSessionSettings"
**kwargs # type: Any
):
# type: (...) -> "models.RenderingSession"
"""Updates the max lease time of a particular rendering session.
Updates the max lease time of a particular rendering session.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:param session_id: An ID uniquely identifying the rendering session for the given account. The
ID is case sensitive, can contain any combination of alphanumeric characters including hyphens
and underscores, and cannot contain more than 256 characters.
:type session_id: str
:param body: Settings used to update the session.
:type body: ~azure.mixedreality.remoterendering._generated.models.UpdateSessionSettings
:keyword callable cls: A custom type or function that will be passed the direct response
:return: RenderingSession, or the result of cls(response)
:rtype: ~azure.mixedreality.remoterendering._generated.models.RenderingSession
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.RenderingSession"]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
422: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
content_type = kwargs.pop("content_type", "application/json")
# Construct URL
url = self.update_session.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
'session_id': self._serialize.url("session_id", session_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = 'application/json'
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(body, 'UpdateSessionSettings')
body_content_kwargs['content'] = body_content
request = self._client.patch(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('RenderingSession', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
update_session.metadata = {'url': '/accounts/{account_id}/sessions/{session_id}'} # type: ignore
def stop_session(
self,
account_id, # type: str
session_id, # type: str
**kwargs # type: Any
):
# type: (...) -> None
"""Stops a particular rendering session.
Stops a particular rendering session.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:param session_id: An ID uniquely identifying the rendering session for the given account. The
ID is case sensitive, can contain any combination of alphanumeric characters including hyphens
and underscores, and cannot contain more than 256 characters.
:type session_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
# Construct URL
url = self.stop_session.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
'session_id': self._serialize.url("session_id", session_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
response_headers = {}
response_headers['MS-CV']=self._deserialize('str', response.headers.get('MS-CV'))
if cls:
return cls(pipeline_response, None, response_headers)
stop_session.metadata = {'url': '/accounts/{account_id}/sessions/{session_id}/:stop'} # type: ignore
def list_sessions(
self,
account_id, # type: str
**kwargs # type: Any
):
# type: (...) -> Iterable["models.SessionsList"]
"""Gets a list of all rendering sessions.
Gets a list of all rendering sessions.
:param account_id: The Azure Remote Rendering account ID.
:type account_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either SessionsList or the result of cls(response)
:rtype: ~azure.core.paging.ItemPaged[~azure.mixedreality.remoterendering._generated.models.SessionsList]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["models.SessionsList"]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
500: lambda response: HttpResponseError(response=response, model=self._deserialize(models.ErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-01-01"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = 'application/json'
if not next_link:
# Construct URL
url = self.list_sessions.metadata['url'] # type: ignore
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
path_format_arguments = {
'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'account_id': self._serialize.url("account_id", account_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
request = self._client.get(url, query_parameters, header_parameters)
return request
def extract_data(pipeline_response):
deserialized = self._deserialize('SessionsList', pipeline_response)
list_of_elem = deserialized.sessions
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.next_link or None, iter(list_of_elem)
def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
return pipeline_response
return ItemPaged(
get_next, extract_data
)
list_sessions.metadata = {'url': '/accounts/{account_id}/sessions'} # type: ignore
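Both `list_conversions` and `list_sessions` end by wiring a `get_next`/`extract_data` pair into `azure.core.paging.ItemPaged`. The control flow that `ItemPaged` drives — fetch a page, yield its items, follow the continuation link until it is exhausted — can be sketched without the Azure pipeline. The names below are illustrative stand-ins for that pattern, not the azure-core implementation:

```python
# Minimal stand-in for the ItemPaged control flow used above: get_next fetches
# one "page" (here just a dict), extract_data returns a (continuation_token,
# item_iterator) pair, and iteration stops when the token is None.
def iter_paged(get_next, extract_data):
    next_link = None
    while True:
        page = get_next(next_link)
        next_link, items = extract_data(page)
        for item in items:
            yield item
        if next_link is None:
            break


# Fake two-page "service": the first page links to page "b", the last page.
_PAGES = {
    None: {"items": [1, 2], "next": "b"},
    "b": {"items": [3], "next": None},
}


def fake_get_next(next_link):
    return _PAGES[next_link]


def fake_extract_data(page):
    return page["next"], iter(page["items"])
```

Iterating `iter_paged(fake_get_next, fake_extract_data)` yields the items of both pages in order, mirroring how `ItemPaged` lazily walks `next_link` values returned by `extract_data`.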

# File: fundamentals/14-advance-python-modules/7-timeit.py
# Repo: davidokun/Python (MIT license)

import timeit
t = timeit.timeit('"-".join(str(n) for n in range(100))', number=10000)
print(t)

t = timeit.timeit('"-".join([str(n) for n in range(100)])', number=10000)
print(t)

t = timeit.timeit('"-".join(map(str, range(100)))', number=10000)
print(t)
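All three timed statements build the same 100-element string; only the argument passed to `str.join` differs (generator expression, list comprehension, `map`). A quick equivalence check — and note that in CPython, `str.join` materializes a non-list argument into a sequence before joining, which is why the list and `map` variants often time slightly faster than the generator:

```python
# The three variants benchmarked above produce identical output; only how
# the argument handed to str.join is constructed differs.
gen = "-".join(str(n) for n in range(100))
lst = "-".join([str(n) for n in range(100)])
mpd = "-".join(map(str, range(100)))
assert gen == lst == mpd
```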
# encoding: utf-8
# module System.Windows.Interop calls itself Interop
# from PresentationFramework, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, WindowsBase, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, PresentationCore, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
# by generator 1.145
# no doc
# no imports
# no functions
# classes
class IKeyboardInputSink:
""" Provides a keyboard sink for components that manages tabbing, accelerators, and mnemonics across interop boundaries and between HWNDs. This interface implements keyboard message management in WPF-Win32 interoperation scenarios. """
def HasFocusWithin(self):
"""
HasFocusWithin(self: IKeyboardInputSink) -> bool
Gets a value that indicates whether the sink or one of its contained components
has focus.
Returns: true if the sink or one of its contained components has focus; otherwise, false.
"""
pass
def OnMnemonic(self, msg, modifiers):
"""
OnMnemonic(self: IKeyboardInputSink, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Called when one of the mnemonics (access keys) for this sink is invoked.
msg: The message for the mnemonic and associated data. Do not modify this message
structure. It is passed by reference for performance reasons only.
modifiers: Modifier keys.
Returns: true if the message was handled; otherwise, false.
"""
pass
def RegisterKeyboardInputSink(self, sink):
"""
RegisterKeyboardInputSink(self: IKeyboardInputSink, sink: IKeyboardInputSink) -> IKeyboardInputSite
Registers the System.Windows.Interop.IKeyboardInputSink interface of a
contained component.
sink: The System.Windows.Interop.IKeyboardInputSink sink of the contained component.
Returns: The System.Windows.Interop.IKeyboardInputSite site of the contained component.
"""
pass
def TabInto(self, request):
"""
TabInto(self: IKeyboardInputSink, request: TraversalRequest) -> bool
Sets focus on either the first tab stop or the last tab stop of the sink.
request: Specifies whether focus should be set to the first or the last tab stop.
Returns: true if the focus has been set as requested; false, if there are no tab stops.
"""
pass
def TranslateAccelerator(self, msg, modifiers):
"""
TranslateAccelerator(self: IKeyboardInputSink, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes keyboard input at the keydown message level.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: true if the message was handled by the method implementation; otherwise, false.
"""
pass
def TranslateChar(self, msg, modifiers):
"""
TranslateChar(self: IKeyboardInputSink, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
  Processes WM_CHAR, WM_SYSCHAR, WM_DEADCHAR, and WM_SYSDEADCHAR input messages
  before System.Windows.Interop.IKeyboardInputSink.OnMnemonic(System.Windows.Interop.MSG@, System.Windows.Input.ModifierKeys) is called.
  msg: The message and associated data. Do not modify this structure. It is passed by
  reference for performance reasons only.
  modifiers: Modifier keys.
  Returns: true if the message was processed and
  System.Windows.Interop.IKeyboardInputSink.OnMnemonic(System.Windows.Interop.MSG@, System.Windows.Input.ModifierKeys) should not be called; otherwise, false.
"""
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
KeyboardInputSite = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a reference to the component's container's System.Windows.Interop.IKeyboardInputSite interface.
Get: KeyboardInputSite(self: IKeyboardInputSink) -> IKeyboardInputSite
Set: KeyboardInputSite(self: IKeyboardInputSink) = value
"""
class IWin32Window:
""" Defines the contract for Win32 window handles. """
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
Handle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the window handle.
Get: Handle(self: IWin32Window) -> IntPtr
"""
class HwndHost(FrameworkElement, IResource, IAnimatable, IInputElement, IFrameworkInputElement, ISupportInitialize, IHaveResources, IQueryAmbient, IDisposable, IWin32Window, IKeyboardInputSink):
""" Hosts a Win32 window as an element within�Windows Presentation Foundation (WPF)�content. """
def AddLogicalChild(self, *args): #cannot find CLR method
"""
AddLogicalChild(self: FrameworkElement, child: object)
Adds the provided object to the logical tree of this element.
child: Child element to be added.
"""
pass
def AddVisualChild(self, *args): #cannot find CLR method
"""
AddVisualChild(self: Visual, child: Visual)
Defines the parent-child relationship between two visuals.
child: The child visual object to add to parent visual.
"""
pass
def ArrangeCore(self, *args): #cannot find CLR method
"""
ArrangeCore(self: FrameworkElement, finalRect: Rect)
Implements System.Windows.UIElement.ArrangeCore(System.Windows.Rect) (defined
as virtual in System.Windows.UIElement) and seals the implementation.
finalRect: The final area within the parent that this element should use to arrange itself
and its children.
"""
pass
def ArrangeOverride(self, *args): #cannot find CLR method
"""
ArrangeOverride(self: FrameworkElement, finalSize: Size) -> Size
When overridden in a derived class, positions child elements and determines a
size for a System.Windows.FrameworkElement derived class.
finalSize: The final area within the parent that this element should use to arrange itself
and its children.
Returns: The actual size used.
"""
pass
def BuildWindowCore(self, *args): #cannot find CLR method
"""
BuildWindowCore(self: HwndHost, hwndParent: HandleRef) -> HandleRef
When overridden in a derived class, creates the window to be hosted.
hwndParent: The window handle of the parent window.
  Returns: The handle to the child Win32 window to create.
"""
pass
def DestroyWindowCore(self, *args): #cannot find CLR method
"""
DestroyWindowCore(self: HwndHost, hwnd: HandleRef)
When overridden in a derived class, destroys the hosted window.
hwnd: A structure that contains the window handle.
"""
pass
def Dispose(self):
"""
Dispose(self: HwndHost)
Immediately frees any system resources that the object might hold.
"""
pass
def GetLayoutClip(self, *args): #cannot find CLR method
"""
GetLayoutClip(self: FrameworkElement, layoutSlotSize: Size) -> Geometry
Returns a geometry for a clipping mask. The mask applies if the layout system
attempts to arrange an element that is larger than the available display space.
layoutSlotSize: The size of the part of the element that does visual presentation.
Returns: The clipping geometry.
"""
pass
def GetTemplateChild(self, *args): #cannot find CLR method
"""
GetTemplateChild(self: FrameworkElement, childName: str) -> DependencyObject
Returns the named element in the visual tree of an instantiated
System.Windows.Controls.ControlTemplate.
childName: Name of the child to find.
Returns: The requested element. May be null if no element of the requested name exists.
"""
pass
def GetUIParentCore(self, *args): #cannot find CLR method
"""
GetUIParentCore(self: FrameworkElement) -> DependencyObject
Returns an alternative logical parent for this element if there is no visual
parent.
Returns: Returns something other than null whenever a WPF framework-level implementation
of this method has a non-visual parent connection.
"""
pass
def GetVisualChild(self, *args): #cannot find CLR method
"""
GetVisualChild(self: FrameworkElement, index: int) -> Visual
Overrides System.Windows.Media.Visual.GetVisualChild(System.Int32), and returns
a child at the specified index from a collection of child elements.
index: The zero-based index of the requested child element in the collection.
Returns: The requested child element. This should not return null; if the provided index
is out of range, an exception is thrown.
"""
pass
def HasFocusWithinCore(self, *args): #cannot find CLR method
"""
HasFocusWithinCore(self: HwndHost) -> bool
Gets a value that indicates whether the sink or one of its contained components
has focus.
Returns: true if the sink or one of its contained components has focus; otherwise, false.
"""
pass
def HitTestCore(self, *args): #cannot find CLR method
"""
  HitTestCore(self: UIElement, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
  Implements System.Windows.Media.Visual.HitTestCore(System.Windows.Media.GeometryHitTestParameters)
  to supply base element hit testing behavior (returning System.Windows.Media.GeometryHitTestResult).
  hitTestParameters: Describes the hit test to perform, including the initial hit point.
  Returns: Results of the test, including the evaluated geometry.
  HitTestCore(self: UIElement, hitTestParameters: PointHitTestParameters) -> HitTestResult
  Implements System.Windows.Media.Visual.HitTestCore(System.Windows.Media.PointHitTestParameters)
  to supply base element hit testing behavior (returning System.Windows.Media.HitTestResult).
  hitTestParameters: Describes the hit test to perform, including the initial hit point.
  Returns: Results of the test, including the evaluated point.
"""
pass
def MeasureCore(self, *args): #cannot find CLR method
"""
MeasureCore(self: FrameworkElement, availableSize: Size) -> Size
Implements basic measure-pass layout system behavior for
System.Windows.FrameworkElement.
availableSize: The available size that the parent element can give to the child elements.
Returns: The desired size of this element in layout.
"""
pass
def MeasureOverride(self, *args): #cannot find CLR method
"""
MeasureOverride(self: HwndHost, constraint: Size) -> Size
Returns the size of the window represented by the
System.Windows.Interop.HwndHost object, as requested by layout engine
operations.
constraint: The size of the System.Windows.Interop.HwndHost object.
Returns: The size of the System.Windows.Interop.HwndHost object.
"""
pass
def OnAccessKey(self, *args): #cannot find CLR method
"""
OnAccessKey(self: UIElement, e: AccessKeyEventArgs)
Provides class handling for when an access key that is meaningful for this
element is invoked.
e: The event data to the access key event. The event data reports which key was
invoked, and indicate whether the System.Windows.Input.AccessKeyManager object
that controls the sending of these events also sent this access key invocation
to other elements.
"""
pass
def OnChildDesiredSizeChanged(self, *args): #cannot find CLR method
"""
OnChildDesiredSizeChanged(self: UIElement, child: UIElement)
Supports layout behavior when a child element is resized.
child: The child element that is being resized.
"""
pass
def OnContextMenuClosing(self, *args): #cannot find CLR method
"""
OnContextMenuClosing(self: FrameworkElement, e: ContextMenuEventArgs)
Invoked whenever an unhandled
System.Windows.FrameworkElement.ContextMenuClosing routed event reaches this
class in its route. Implement this method to add class handling for this event.
e: Provides data about the event.
"""
pass
def OnContextMenuOpening(self, *args): #cannot find CLR method
"""
OnContextMenuOpening(self: FrameworkElement, e: ContextMenuEventArgs)
Invoked whenever an unhandled
System.Windows.FrameworkElement.ContextMenuOpening routed event reaches this
class in its route. Implement this method to add class handling for this event.
e: The System.Windows.RoutedEventArgs that contains the event data.
"""
pass
def OnCreateAutomationPeer(self, *args): #cannot find CLR method
"""
OnCreateAutomationPeer(self: HwndHost) -> AutomationPeer
Creates an System.Windows.Automation.Peers.AutomationPeer for
System.Windows.Interop.HwndHost .
Returns: The type-specific System.Windows.Automation.Peers.AutomationPeer implementation.
"""
pass
def OnDpiChanged(self, *args): #cannot find CLR method
""" OnDpiChanged(self: HwndHost, oldDpi: DpiScale, newDpi: DpiScale) """
pass
def OnDragEnter(self, *args): #cannot find CLR method
"""
OnDragEnter(self: UIElement, e: DragEventArgs)
  Invoked when an unhandled System.Windows.DragDrop.DragEnter attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
"""
pass
def OnDragLeave(self, *args): #cannot find CLR method
"""
OnDragLeave(self: UIElement, e: DragEventArgs)
  Invoked when an unhandled System.Windows.DragDrop.DragLeave attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
"""
pass
def OnDragOver(self, *args): #cannot find CLR method
"""
OnDragOver(self: UIElement, e: DragEventArgs)
  Invoked when an unhandled System.Windows.DragDrop.DragOver attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
"""
pass
def OnDrop(self, *args): #cannot find CLR method
"""
OnDrop(self: UIElement, e: DragEventArgs)
  Invoked when an unhandled System.Windows.DragDrop.Drop attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
"""
pass
def OnGiveFeedback(self, *args): #cannot find CLR method
"""
OnGiveFeedback(self: UIElement, e: GiveFeedbackEventArgs)
  Invoked when an unhandled System.Windows.DragDrop.GiveFeedback attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.GiveFeedbackEventArgs that contains the event data.
"""
pass
def OnGotFocus(self, *args): #cannot find CLR method
"""
OnGotFocus(self: FrameworkElement, e: RoutedEventArgs)
Invoked whenever an unhandled System.Windows.UIElement.GotFocus event reaches
this element in its route.
e: The System.Windows.RoutedEventArgs that contains the event data.
"""
pass
def OnGotKeyboardFocus(self, *args): #cannot find CLR method
"""
OnGotKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
  Invoked when an unhandled System.Windows.Input.Keyboard.GotKeyboardFocus
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains the event
data.
"""
pass
def OnGotMouseCapture(self, *args): #cannot find CLR method
"""
OnGotMouseCapture(self: UIElement, e: MouseEventArgs)
  Invoked when an unhandled System.Windows.Input.Mouse.GotMouseCapture attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
"""
pass
def OnGotStylusCapture(self, *args): #cannot find CLR method
"""
OnGotStylusCapture(self: UIElement, e: StylusEventArgs)
  Invoked when an unhandled System.Windows.Input.Stylus.GotStylusCapture attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
"""
pass
def OnGotTouchCapture(self, *args): #cannot find CLR method
"""
OnGotTouchCapture(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.GotTouchCapture routed
event that occurs when a touch is captured to this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
"""
pass
def OnInitialized(self, *args): #cannot find CLR method
"""
OnInitialized(self: FrameworkElement, e: EventArgs)
Raises the System.Windows.FrameworkElement.Initialized event. This method is
invoked whenever System.Windows.FrameworkElement.IsInitialized is set to true
internally.
e: The System.Windows.RoutedEventArgs that contains the event data.
"""
pass
def OnIsKeyboardFocusedChanged(self, *args): #cannot find CLR method
"""
OnIsKeyboardFocusedChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsKeyboardFocusedChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
"""
pass
def OnIsKeyboardFocusWithinChanged(self, *args): #cannot find CLR method
"""
OnIsKeyboardFocusWithinChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked just before the System.Windows.UIElement.IsKeyboardFocusWithinChanged
event is raised by this element. Implement this method to add class handling
for this event.
e: A System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
"""
pass
def OnIsMouseCapturedChanged(self, *args): #cannot find CLR method
"""
OnIsMouseCapturedChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsMouseCapturedChanged event
is raised on this element. Implement this method to add class handling for this
event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
"""
pass
def OnIsMouseCaptureWithinChanged(self, *args): #cannot find CLR method
"""
OnIsMouseCaptureWithinChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsMouseCaptureWithinChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: A System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
"""
pass
def OnIsMouseDirectlyOverChanged(self, *args): #cannot find CLR method
"""
OnIsMouseDirectlyOverChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsMouseDirectlyOverChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsMouseDirectlyOverChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsStylusCapturedChanged(self, *args): #cannot find CLR method
"""
OnIsStylusCapturedChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsStylusCapturedChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: A System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsStylusCapturedChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsStylusCaptureWithinChanged(self, *args): #cannot find CLR method
"""
OnIsStylusCaptureWithinChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsStylusCaptureWithinChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsStylusCaptureWithinChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsStylusDirectlyOverChanged(self, *args): #cannot find CLR method
"""
OnIsStylusDirectlyOverChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsStylusDirectlyOverChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsStylusDirectlyOverChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnKeyDown(self, *args): #cannot find CLR method
"""
OnKeyDown(self: HwndHost, e: KeyEventArgs)
Called when the hosted window receives a WM_KEYDOWN message.
e: The event data.
"""
pass
def OnKeyUp(self, *args): #cannot find CLR method
"""
OnKeyUp(self: HwndHost, e: KeyEventArgs)
Called when the hosted window receives a WM_KEYUP message.
e: The event data.
"""
pass
def OnLostFocus(self, *args): #cannot find CLR method
"""
OnLostFocus(self: UIElement, e: RoutedEventArgs)
Raises the System.Windows.UIElement.LostFocus routed event by using the event
data that is provided.
e: A System.Windows.RoutedEventArgs that contains event data. This event data must
contain the identifier for the System.Windows.UIElement.LostFocus event.
OnLostFocus(self: Window_16$17, e: RoutedEventArgs)
OnLostFocus(self: Label_17$18, e: RoutedEventArgs)
OnLostFocus(self: TextBox_18$19, e: RoutedEventArgs)
OnLostFocus(self: Button_19$20, e: RoutedEventArgs)
OnLostFocus(self: CheckBox_20$21, e: RoutedEventArgs)
OnLostFocus(self: ComboBox_21$22, e: RoutedEventArgs)
OnLostFocus(self: Separator_22$23, e: RoutedEventArgs)
"""
pass
def OnLostKeyboardFocus(self, *args): #cannot find CLR method
"""
OnLostKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.LostKeyboardFocus
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains event data.
OnLostKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnLostMouseCapture(self, *args): #cannot find CLR method
"""
OnLostMouseCapture(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.LostMouseCapture attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains event data.
OnLostMouseCapture(self: Window_16$17, e: MouseEventArgs)
OnLostMouseCapture(self: Label_17$18, e: MouseEventArgs)
OnLostMouseCapture(self: TextBox_18$19, e: MouseEventArgs)
OnLostMouseCapture(self: Button_19$20, e: MouseEventArgs)
OnLostMouseCapture(self: CheckBox_20$21, e: MouseEventArgs)
OnLostMouseCapture(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnLostStylusCapture(self, *args): #cannot find CLR method
"""
OnLostStylusCapture(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.LostStylusCapture
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains event data.
OnLostStylusCapture(self: Window_16$17, e: StylusEventArgs)
OnLostStylusCapture(self: Label_17$18, e: StylusEventArgs)
OnLostStylusCapture(self: TextBox_18$19, e: StylusEventArgs)
OnLostStylusCapture(self: Button_19$20, e: StylusEventArgs)
OnLostStylusCapture(self: CheckBox_20$21, e: StylusEventArgs)
OnLostStylusCapture(self: ComboBox_21$22, e: StylusEventArgs)
OnLostStylusCapture(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnLostTouchCapture(self, *args): #cannot find CLR method
"""
OnLostTouchCapture(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.LostTouchCapture
routed event that occurs when this element loses a touch capture.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnLostTouchCapture(self: Window_16$17, e: TouchEventArgs)
OnLostTouchCapture(self: Label_17$18, e: TouchEventArgs)
OnLostTouchCapture(self: TextBox_18$19, e: TouchEventArgs)
OnLostTouchCapture(self: Button_19$20, e: TouchEventArgs)
OnLostTouchCapture(self: CheckBox_20$21, e: TouchEventArgs)
OnLostTouchCapture(self: ComboBox_21$22, e: TouchEventArgs)
OnLostTouchCapture(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnManipulationBoundaryFeedback(self, *args): #cannot find CLR method
"""
OnManipulationBoundaryFeedback(self: UIElement, e: ManipulationBoundaryFeedbackEventArgs)
Called when the System.Windows.UIElement.ManipulationBoundaryFeedback event
occurs.
e: The data for the event.
OnManipulationBoundaryFeedback(self: Window_16$17, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: Label_17$18, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: TextBox_18$19, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: Button_19$20, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: CheckBox_20$21, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: ComboBox_21$22, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: Separator_22$23, e: ManipulationBoundaryFeedbackEventArgs)
"""
pass
def OnManipulationCompleted(self, *args): #cannot find CLR method
"""
OnManipulationCompleted(self: UIElement, e: ManipulationCompletedEventArgs)
Called when the System.Windows.UIElement.ManipulationCompleted event occurs.
e: The data for the event.
OnManipulationCompleted(self: Window_16$17, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: Label_17$18, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: TextBox_18$19, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: Button_19$20, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: CheckBox_20$21, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: ComboBox_21$22, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: Separator_22$23, e: ManipulationCompletedEventArgs)
"""
pass
def OnManipulationDelta(self, *args): #cannot find CLR method
"""
OnManipulationDelta(self: UIElement, e: ManipulationDeltaEventArgs)
Called when the System.Windows.UIElement.ManipulationDelta event occurs.
e: The data for the event.
OnManipulationDelta(self: Window_16$17, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: Label_17$18, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: TextBox_18$19, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: Button_19$20, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: CheckBox_20$21, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: ComboBox_21$22, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: Separator_22$23, e: ManipulationDeltaEventArgs)
"""
pass
def OnManipulationInertiaStarting(self, *args): #cannot find CLR method
"""
OnManipulationInertiaStarting(self: UIElement, e: ManipulationInertiaStartingEventArgs)
Called when the System.Windows.UIElement.ManipulationInertiaStarting event
occurs.
e: The data for the event.
OnManipulationInertiaStarting(self: Window_16$17, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: Label_17$18, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: TextBox_18$19, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: Button_19$20, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: CheckBox_20$21, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: ComboBox_21$22, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: Separator_22$23, e: ManipulationInertiaStartingEventArgs)
"""
pass
def OnManipulationStarted(self, *args): #cannot find CLR method
"""
OnManipulationStarted(self: UIElement, e: ManipulationStartedEventArgs)
Called when the System.Windows.UIElement.ManipulationStarted event occurs.
e: The data for the event.
OnManipulationStarted(self: Window_16$17, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: Label_17$18, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: TextBox_18$19, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: Button_19$20, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: CheckBox_20$21, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: ComboBox_21$22, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: Separator_22$23, e: ManipulationStartedEventArgs)
"""
pass
def OnManipulationStarting(self, *args): #cannot find CLR method
"""
OnManipulationStarting(self: UIElement, e: ManipulationStartingEventArgs)
Provides class handling for the System.Windows.UIElement.ManipulationStarting
routed event that occurs when the manipulation processor is first created.
e: A System.Windows.Input.ManipulationStartingEventArgs that contains the event
data.
OnManipulationStarting(self: Window_16$17, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: Label_17$18, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: TextBox_18$19, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: Button_19$20, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: CheckBox_20$21, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: ComboBox_21$22, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: Separator_22$23, e: ManipulationStartingEventArgs)
"""
pass
def OnMnemonicCore(self, *args): #cannot find CLR method
"""
OnMnemonicCore(self: HwndHost, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Called when one of the mnemonics (access keys) for this sink is invoked.
msg: The message for the mnemonic and associated data.
modifiers: Modifier keys.
Returns: Always returns false.
"""
pass
def OnMouseDown(self, *args): #cannot find CLR method
"""
OnMouseDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseDown attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data.
This event data reports details about the mouse button that was pressed and the
handled state.
OnMouseDown(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseDown(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseDown(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
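    # Example (illustrative only, not part of the generated stub): mouse class
    # handlers such as OnMouseDown can be overridden in an IronPython subclass
    # of a WPF control. The class name MyButton below is an assumption for this
    # sketch; calling the base implementation preserves default event routing.
    #
    #     from System.Windows.Controls import Button
    #
    #     class MyButton(Button):
    #         def OnMouseDown(self, e):
    #             # Class handling runs before instance handlers on the
    #             # bubbling route; e.ChangedButton identifies the button.
    #             print('mouse down:', e.ChangedButton)
    #             Button.OnMouseDown(self, e)  # keep default behavior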
def OnMouseEnter(self, *args): #cannot find CLR method
"""
OnMouseEnter(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseEnter attached event
is raised on this element. Implement this method to add class handling for this
event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnMouseEnter(self: Window_16$17, e: MouseEventArgs)
OnMouseEnter(self: Label_17$18, e: MouseEventArgs)
OnMouseEnter(self: TextBox_18$19, e: MouseEventArgs)
OnMouseEnter(self: Button_19$20, e: MouseEventArgs)
OnMouseEnter(self: CheckBox_20$21, e: MouseEventArgs)
OnMouseEnter(self: ComboBox_21$22, e: MouseEventArgs)
OnMouseEnter(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnMouseLeave(self, *args): #cannot find CLR method
"""
OnMouseLeave(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseLeave attached event
is raised on this element. Implement this method to add class handling for this
event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnMouseLeave(self: Window_16$17, e: MouseEventArgs)
OnMouseLeave(self: Label_17$18, e: MouseEventArgs)
OnMouseLeave(self: TextBox_18$19, e: MouseEventArgs)
OnMouseLeave(self: Button_19$20, e: MouseEventArgs)
OnMouseLeave(self: CheckBox_20$21, e: MouseEventArgs)
OnMouseLeave(self: ComboBox_21$22, e: MouseEventArgs)
OnMouseLeave(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnMouseLeftButtonDown(self, *args): #cannot find CLR method
"""
OnMouseLeftButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseLeftButtonDown routed
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was pressed.
OnMouseLeftButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseLeftButtonUp(self, *args): #cannot find CLR method
"""
OnMouseLeftButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseLeftButtonUp routed
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was released.
OnMouseLeftButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseMove(self, *args): #cannot find CLR method
"""
OnMouseMove(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseMove attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnMouseMove(self: Window_16$17, e: MouseEventArgs)
OnMouseMove(self: Label_17$18, e: MouseEventArgs)
OnMouseMove(self: TextBox_18$19, e: MouseEventArgs)
OnMouseMove(self: Button_19$20, e: MouseEventArgs)
OnMouseMove(self: CheckBox_20$21, e: MouseEventArgs)
OnMouseMove(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnMouseRightButtonDown(self, *args): #cannot find CLR method
"""
OnMouseRightButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseRightButtonDown routed
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was pressed.
OnMouseRightButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseRightButtonUp(self, *args): #cannot find CLR method
"""
OnMouseRightButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseRightButtonUp routed
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was released.
OnMouseRightButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseUp(self, *args): #cannot find CLR method
"""
OnMouseUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseUp routed event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the mouse button was released.
OnMouseUp(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseUp(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseUp(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseWheel(self, *args): #cannot find CLR method
"""
OnMouseWheel(self: UIElement, e: MouseWheelEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseWheel attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseWheelEventArgs that contains the event data.
OnMouseWheel(self: Window_16$17, e: MouseWheelEventArgs)
OnMouseWheel(self: Label_17$18, e: MouseWheelEventArgs)
OnMouseWheel(self: TextBox_18$19, e: MouseWheelEventArgs)
OnMouseWheel(self: Button_19$20, e: MouseWheelEventArgs)
OnMouseWheel(self: CheckBox_20$21, e: MouseWheelEventArgs)
OnMouseWheel(self: Separator_22$23, e: MouseWheelEventArgs)
"""
pass
def OnPreviewDragEnter(self, *args): #cannot find CLR method
"""
OnPreviewDragEnter(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDragEnter attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDragEnter(self: Window_16$17, e: DragEventArgs)
OnPreviewDragEnter(self: Label_17$18, e: DragEventArgs)
OnPreviewDragEnter(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDragEnter(self: Button_19$20, e: DragEventArgs)
OnPreviewDragEnter(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDragEnter(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDragEnter(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewDragLeave(self, *args): #cannot find CLR method
"""
OnPreviewDragLeave(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDragLeave attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDragLeave(self: Window_16$17, e: DragEventArgs)
OnPreviewDragLeave(self: Label_17$18, e: DragEventArgs)
OnPreviewDragLeave(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDragLeave(self: Button_19$20, e: DragEventArgs)
OnPreviewDragLeave(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDragLeave(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDragLeave(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewDragOver(self, *args): #cannot find CLR method
"""
OnPreviewDragOver(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDragOver attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDragOver(self: Window_16$17, e: DragEventArgs)
OnPreviewDragOver(self: Label_17$18, e: DragEventArgs)
OnPreviewDragOver(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDragOver(self: Button_19$20, e: DragEventArgs)
OnPreviewDragOver(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDragOver(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDragOver(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewDrop(self, *args): #cannot find CLR method
"""
OnPreviewDrop(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDrop attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDrop(self: Window_16$17, e: DragEventArgs)
OnPreviewDrop(self: Label_17$18, e: DragEventArgs)
OnPreviewDrop(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDrop(self: Button_19$20, e: DragEventArgs)
OnPreviewDrop(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDrop(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDrop(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewGiveFeedback(self, *args): #cannot find CLR method
"""
OnPreviewGiveFeedback(self: UIElement, e: GiveFeedbackEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewGiveFeedback attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.GiveFeedbackEventArgs that contains the event data.
OnPreviewGiveFeedback(self: Window_16$17, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: Label_17$18, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: TextBox_18$19, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: Button_19$20, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: CheckBox_20$21, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: ComboBox_21$22, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: Separator_22$23, e: GiveFeedbackEventArgs)
"""
pass
def OnPreviewGotKeyboardFocus(self, *args): #cannot find CLR method
"""
OnPreviewGotKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewGotKeyboardFocus
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains the event
data.
OnPreviewGotKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnPreviewKeyDown(self, *args): #cannot find CLR method
"""
OnPreviewKeyDown(self: UIElement, e: KeyEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewKeyDown attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyEventArgs that contains the event data.
OnPreviewKeyDown(self: Window_16$17, e: KeyEventArgs)
OnPreviewKeyDown(self: Label_17$18, e: KeyEventArgs)
OnPreviewKeyDown(self: TextBox_18$19, e: KeyEventArgs)
OnPreviewKeyDown(self: Button_19$20, e: KeyEventArgs)
OnPreviewKeyDown(self: CheckBox_20$21, e: KeyEventArgs)
OnPreviewKeyDown(self: ComboBox_21$22, e: KeyEventArgs)
OnPreviewKeyDown(self: Separator_22$23, e: KeyEventArgs)
"""
pass
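    # Example (illustrative only, not part of the generated stub): Preview*
    # handlers run on the tunneling route, before the matching bubbling event
    # (here, before OnKeyDown). Setting e.Handled = True in a preview override
    # stops both routes. The NoSpacesTextBox name below is an assumption for
    # this sketch.
    #
    #     from System.Windows.Controls import TextBox
    #     from System.Windows.Input import Key
    #
    #     class NoSpacesTextBox(TextBox):
    #         def OnPreviewKeyDown(self, e):
    #             if e.Key == Key.Space:
    #                 e.Handled = True  # swallow the key before OnKeyDown
    #             else:
    #                 TextBox.OnPreviewKeyDown(self, e)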
def OnPreviewKeyUp(self, *args): #cannot find CLR method
"""
OnPreviewKeyUp(self: UIElement, e: KeyEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewKeyUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyEventArgs that contains the event data.
OnPreviewKeyUp(self: Window_16$17, e: KeyEventArgs)
OnPreviewKeyUp(self: Label_17$18, e: KeyEventArgs)
OnPreviewKeyUp(self: TextBox_18$19, e: KeyEventArgs)
OnPreviewKeyUp(self: Button_19$20, e: KeyEventArgs)
OnPreviewKeyUp(self: CheckBox_20$21, e: KeyEventArgs)
OnPreviewKeyUp(self: ComboBox_21$22, e: KeyEventArgs)
OnPreviewKeyUp(self: Separator_22$23, e: KeyEventArgs)
"""
pass
def OnPreviewLostKeyboardFocus(self, *args): #cannot find CLR method
"""
OnPreviewLostKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled
System.Windows.Input.Keyboard.PreviewLostKeyboardFocus attached event reaches
an element in its route that is derived from this class. Implement this method
to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains the event
data.
OnPreviewLostKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnPreviewMouseDown(self, *args): #cannot find CLR method
"""
OnPreviewMouseDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseDown attached
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that one or more mouse buttons were pressed.
OnPreviewMouseDown(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseLeftButtonDown(self, *args): #cannot find CLR method
"""
OnPreviewMouseLeftButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseLeftButtonDown
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was pressed.
OnPreviewMouseLeftButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseLeftButtonUp(self, *args): #cannot find CLR method
"""
OnPreviewMouseLeftButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseLeftButtonUp
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was released.
OnPreviewMouseLeftButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseMove(self, *args): #cannot find CLR method
"""
OnPreviewMouseMove(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseMove attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnPreviewMouseMove(self: Window_16$17, e: MouseEventArgs)
OnPreviewMouseMove(self: Label_17$18, e: MouseEventArgs)
OnPreviewMouseMove(self: TextBox_18$19, e: MouseEventArgs)
OnPreviewMouseMove(self: Button_19$20, e: MouseEventArgs)
OnPreviewMouseMove(self: CheckBox_20$21, e: MouseEventArgs)
OnPreviewMouseMove(self: ComboBox_21$22, e: MouseEventArgs)
OnPreviewMouseMove(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnPreviewMouseRightButtonDown(self, *args): #cannot find CLR method
"""
OnPreviewMouseRightButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseRightButtonDown
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was pressed.
OnPreviewMouseRightButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseRightButtonUp(self, *args): #cannot find CLR method
"""
OnPreviewMouseRightButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseRightButtonUp
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was released.
OnPreviewMouseRightButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseUp(self, *args): #cannot find CLR method
"""
OnPreviewMouseUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that one or more mouse buttons were released.
OnPreviewMouseUp(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseWheel(self, *args): #cannot find CLR method
"""
OnPreviewMouseWheel(self: UIElement, e: MouseWheelEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseWheel attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseWheelEventArgs that contains the event data.
OnPreviewMouseWheel(self: Window_16$17, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: Label_17$18, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: TextBox_18$19, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: Button_19$20, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: CheckBox_20$21, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: ComboBox_21$22, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: Separator_22$23, e: MouseWheelEventArgs)
"""
pass
def OnPreviewQueryContinueDrag(self, *args): #cannot find CLR method
"""
OnPreviewQueryContinueDrag(self: UIElement, e: QueryContinueDragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewQueryContinueDrag
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.QueryContinueDragEventArgs that contains the event data.
OnPreviewQueryContinueDrag(self: Window_16$17, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: Label_17$18, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: TextBox_18$19, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: Button_19$20, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: CheckBox_20$21, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: ComboBox_21$22, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: Separator_22$23, e: QueryContinueDragEventArgs)
"""
pass
def OnPreviewStylusButtonDown(self, *args): #cannot find CLR method
"""
OnPreviewStylusButtonDown(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusButtonDown
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnPreviewStylusButtonDown(self: Window_16$17, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: Label_17$18, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: TextBox_18$19, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: Button_19$20, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnPreviewStylusButtonUp(self, *args): #cannot find CLR method
"""
OnPreviewStylusButtonUp(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusButtonUp
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnPreviewStylusButtonUp(self: Window_16$17, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: Label_17$18, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: TextBox_18$19, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: Button_19$20, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnPreviewStylusDown(self, *args): #cannot find CLR method
"""
OnPreviewStylusDown(self: UIElement, e: StylusDownEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusDown
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusDownEventArgs that contains the event data.
OnPreviewStylusDown(self: Window_16$17, e: StylusDownEventArgs)
OnPreviewStylusDown(self: Label_17$18, e: StylusDownEventArgs)
OnPreviewStylusDown(self: TextBox_18$19, e: StylusDownEventArgs)
OnPreviewStylusDown(self: Button_19$20, e: StylusDownEventArgs)
OnPreviewStylusDown(self: CheckBox_20$21, e: StylusDownEventArgs)
OnPreviewStylusDown(self: ComboBox_21$22, e: StylusDownEventArgs)
OnPreviewStylusDown(self: Separator_22$23, e: StylusDownEventArgs)
"""
pass
def OnPreviewStylusInAirMove(self, *args): #cannot find CLR method
"""
OnPreviewStylusInAirMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusInAirMove
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusInAirMove(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusInRange(self, *args): #cannot find CLR method
"""
OnPreviewStylusInRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusInRange
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusInRange(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusInRange(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusInRange(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusInRange(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusInRange(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusInRange(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusInRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusMove(self, *args): #cannot find CLR method
"""
OnPreviewStylusMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusMove
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusMove(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusMove(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusMove(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusMove(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusMove(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusMove(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusOutOfRange(self, *args): #cannot find CLR method
"""
OnPreviewStylusOutOfRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusOutOfRange
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusOutOfRange(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusSystemGesture(self, *args): #cannot find CLR method
"""
OnPreviewStylusSystemGesture(self: UIElement, e: StylusSystemGestureEventArgs)
Invoked when an unhandled
System.Windows.Input.Stylus.PreviewStylusSystemGesture attached event reaches
an element in its route that is derived from this class. Implement this method
to add class handling for this event.
e: The System.Windows.Input.StylusSystemGestureEventArgs that contains the event
data.
OnPreviewStylusSystemGesture(self: Window_16$17, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: Label_17$18, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: TextBox_18$19, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: Button_19$20, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: CheckBox_20$21, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: ComboBox_21$22, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: Separator_22$23, e: StylusSystemGestureEventArgs)
"""
pass
def OnPreviewStylusUp(self, *args): #cannot find CLR method
"""
OnPreviewStylusUp(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusUp(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusUp(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusUp(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusUp(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusUp(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusUp(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusUp(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewTextInput(self, *args): #cannot find CLR method
"""
OnPreviewTextInput(self: UIElement, e: TextCompositionEventArgs)
Invoked when an unhandled
System.Windows.Input.TextCompositionManager.PreviewTextInput attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.TextCompositionEventArgs that contains the event data.
OnPreviewTextInput(self: Window_16$17, e: TextCompositionEventArgs)
OnPreviewTextInput(self: Label_17$18, e: TextCompositionEventArgs)
OnPreviewTextInput(self: TextBox_18$19, e: TextCompositionEventArgs)
OnPreviewTextInput(self: Button_19$20, e: TextCompositionEventArgs)
OnPreviewTextInput(self: CheckBox_20$21, e: TextCompositionEventArgs)
OnPreviewTextInput(self: ComboBox_21$22, e: TextCompositionEventArgs)
OnPreviewTextInput(self: Separator_22$23, e: TextCompositionEventArgs)
"""
pass
def OnPreviewTouchDown(self, *args): #cannot find CLR method
"""
OnPreviewTouchDown(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.PreviewTouchDown
routed event that occurs when a touch presses this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnPreviewTouchDown(self: Window_16$17, e: TouchEventArgs)
OnPreviewTouchDown(self: Label_17$18, e: TouchEventArgs)
OnPreviewTouchDown(self: TextBox_18$19, e: TouchEventArgs)
OnPreviewTouchDown(self: Button_19$20, e: TouchEventArgs)
OnPreviewTouchDown(self: CheckBox_20$21, e: TouchEventArgs)
OnPreviewTouchDown(self: ComboBox_21$22, e: TouchEventArgs)
OnPreviewTouchDown(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnPreviewTouchMove(self, *args): #cannot find CLR method
"""
OnPreviewTouchMove(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.PreviewTouchMove
routed event that occurs when a touch moves while inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnPreviewTouchMove(self: Window_16$17, e: TouchEventArgs)
OnPreviewTouchMove(self: Label_17$18, e: TouchEventArgs)
OnPreviewTouchMove(self: TextBox_18$19, e: TouchEventArgs)
OnPreviewTouchMove(self: Button_19$20, e: TouchEventArgs)
OnPreviewTouchMove(self: CheckBox_20$21, e: TouchEventArgs)
OnPreviewTouchMove(self: ComboBox_21$22, e: TouchEventArgs)
OnPreviewTouchMove(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnPreviewTouchUp(self, *args): #cannot find CLR method
"""
OnPreviewTouchUp(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.PreviewTouchUp routed
event that occurs when a touch is released inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnPreviewTouchUp(self: Window_16$17, e: TouchEventArgs)
OnPreviewTouchUp(self: Label_17$18, e: TouchEventArgs)
OnPreviewTouchUp(self: TextBox_18$19, e: TouchEventArgs)
OnPreviewTouchUp(self: Button_19$20, e: TouchEventArgs)
OnPreviewTouchUp(self: CheckBox_20$21, e: TouchEventArgs)
OnPreviewTouchUp(self: ComboBox_21$22, e: TouchEventArgs)
OnPreviewTouchUp(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnPropertyChanged(self, *args): #cannot find CLR method
"""
OnPropertyChanged(self: FrameworkElement, e: DependencyPropertyChangedEventArgs)
Invoked whenever the effective value of any dependency property on this
System.Windows.FrameworkElement has been updated. The specific dependency
property that changed is reported in the arguments parameter. Overrides
System.Windows.DependencyObject.OnPropertyChanged(System.Windows.DependencyPropertyChangedEventArgs).
e: The event data that describes the property that changed, as well as old and new
values.
OnPropertyChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnPropertyChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnPropertyChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnPropertyChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnPropertyChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnPropertyChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnPropertyChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
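# Note on usage: OnPropertyChanged receives the old and new effective values of
# whichever dependency property changed. The sketch below is a plain-Python
# analogy of that notification pattern (hypothetical names, not the WPF API):
# a descriptor reports (property, old, new) to an on_property_changed hook,
# mirroring what DependencyPropertyChangedEventArgs carries.

```python
class DependencyProperty:
    """Descriptor that notifies its owner when the stored value changes."""
    def __init__(self, name, default):
        self.name = name
        self.default = default

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return obj.__dict__.get(self.name, self.default)

    def __set__(self, obj, value):
        old = obj.__dict__.get(self.name, self.default)
        obj.__dict__[self.name] = value
        # Only fire when the effective value actually changed.
        if old != value:
            obj.on_property_changed(self.name, old, value)

class MockElement:
    width = DependencyProperty("width", 0.0)

    def __init__(self):
        self.changes = []

    def on_property_changed(self, name, old_value, new_value):
        # Analogue of OnPropertyChanged: inspect e.Property, e.OldValue, e.NewValue.
        self.changes.append((name, old_value, new_value))

element = MockElement()
element.width = 120.0
# element.changes now records ("width", 0.0, 120.0).
```

# In WPF, overrides of OnPropertyChanged should stay cheap: this callback runs
# for every dependency-property change on the element.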
def OnQueryContinueDrag(self, *args): #cannot find CLR method
"""
OnQueryContinueDrag(self: UIElement, e: QueryContinueDragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.QueryContinueDrag attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.QueryContinueDragEventArgs that contains the event data.
OnQueryContinueDrag(self: Window_16$17, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: Label_17$18, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: TextBox_18$19, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: Button_19$20, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: CheckBox_20$21, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: ComboBox_21$22, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: Separator_22$23, e: QueryContinueDragEventArgs)
"""
pass
def OnQueryCursor(self, *args): #cannot find CLR method
"""
OnQueryCursor(self: UIElement, e: QueryCursorEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.QueryCursor attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.QueryCursorEventArgs that contains the event data.
OnQueryCursor(self: Window_16$17, e: QueryCursorEventArgs)
OnQueryCursor(self: Label_17$18, e: QueryCursorEventArgs)
OnQueryCursor(self: TextBox_18$19, e: QueryCursorEventArgs)
OnQueryCursor(self: Button_19$20, e: QueryCursorEventArgs)
OnQueryCursor(self: CheckBox_20$21, e: QueryCursorEventArgs)
OnQueryCursor(self: ComboBox_21$22, e: QueryCursorEventArgs)
OnQueryCursor(self: Separator_22$23, e: QueryCursorEventArgs)
"""
pass
def OnRender(self, *args): #cannot find CLR method
"""
OnRender(self: UIElement, drawingContext: DrawingContext)
When overridden in a derived class, participates in rendering operations that
are directed by the layout system. The rendering instructions for this element
are not used directly when this method is invoked, and are instead preserved
for later asynchronous use by layout and drawing.
drawingContext: The drawing instructions for a specific element. This context is provided to
the layout system.
OnRender(self: Window_16$17, drawingContext: DrawingContext)
OnRender(self: Label_17$18, drawingContext: DrawingContext)
OnRender(self: TextBox_18$19, drawingContext: DrawingContext)
OnRender(self: Button_19$20, drawingContext: DrawingContext)
OnRender(self: CheckBox_20$21, drawingContext: DrawingContext)
OnRender(self: ComboBox_21$22, drawingContext: DrawingContext)
OnRender(self: Separator_22$23, drawingContext: DrawingContext)
"""
pass
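# Note on usage: as the docstring above says, OnRender does not draw
# immediately -- it records instructions into a DrawingContext that the
# composition system replays asynchronously. The sketch below is a plain-Python
# analogy of that retained-mode idea (hypothetical names, not the WPF API):

```python
class DrawingContext:
    """Records drawing instructions instead of executing them."""
    def __init__(self):
        self.instructions = []

    def draw_rectangle(self, brush, rect):
        self.instructions.append(("rect", brush, rect))

    def draw_text(self, text, origin):
        self.instructions.append(("text", text, origin))

class MockElement:
    def on_render(self, dc):
        # Record what to draw; rasterization happens later, elsewhere.
        dc.draw_rectangle("blue", (0, 0, 100, 30))
        dc.draw_text("hello", (4, 6))

dc = DrawingContext()
MockElement().on_render(dc)
# dc.instructions now holds the retained drawing list for later playback.
```

# This is why a WPF OnRender override must re-emit everything it wants shown:
# the recorded list replaces, rather than augments, the previous one.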
def OnRenderSizeChanged(self, *args): #cannot find CLR method
"""
OnRenderSizeChanged(self: FrameworkElement, sizeInfo: SizeChangedInfo)
Raises the System.Windows.FrameworkElement.SizeChanged event, using the
specified information as part of the eventual event data.
sizeInfo: Details of the old and new size involved in the change.
OnRenderSizeChanged(self: Window_16$17, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: Label_17$18, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: TextBox_18$19, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: Button_19$20, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: CheckBox_20$21, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: ComboBox_21$22, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: Separator_22$23, sizeInfo: SizeChangedInfo)
"""
pass
def OnStyleChanged(self, *args): #cannot find CLR method
"""
OnStyleChanged(self: FrameworkElement, oldStyle: Style, newStyle: Style)
Invoked when the style in use on this element changes, which will invalidate
the layout.
oldStyle: The old style.
newStyle: The new style.
OnStyleChanged(self: Window_16$17, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: Label_17$18, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: TextBox_18$19, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: Button_19$20, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: CheckBox_20$21, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: ComboBox_21$22, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: Separator_22$23, oldStyle: Style, newStyle: Style)
"""
pass
def OnStylusButtonDown(self, *args): #cannot find CLR method
"""
OnStylusButtonDown(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusButtonDown attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnStylusButtonDown(self: Window_16$17, e: StylusButtonEventArgs)
OnStylusButtonDown(self: Label_17$18, e: StylusButtonEventArgs)
OnStylusButtonDown(self: TextBox_18$19, e: StylusButtonEventArgs)
OnStylusButtonDown(self: Button_19$20, e: StylusButtonEventArgs)
OnStylusButtonDown(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnStylusButtonDown(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnStylusButtonDown(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnStylusButtonUp(self, *args): #cannot find CLR method
"""
OnStylusButtonUp(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusButtonUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnStylusButtonUp(self: Window_16$17, e: StylusButtonEventArgs)
OnStylusButtonUp(self: Label_17$18, e: StylusButtonEventArgs)
OnStylusButtonUp(self: TextBox_18$19, e: StylusButtonEventArgs)
OnStylusButtonUp(self: Button_19$20, e: StylusButtonEventArgs)
OnStylusButtonUp(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnStylusButtonUp(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnStylusButtonUp(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnStylusDown(self, *args): #cannot find CLR method
"""
OnStylusDown(self: UIElement, e: StylusDownEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusDown attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.StylusDownEventArgs that contains the event data.
OnStylusDown(self: Window_16$17, e: StylusDownEventArgs)
OnStylusDown(self: Label_17$18, e: StylusDownEventArgs)
OnStylusDown(self: TextBox_18$19, e: StylusDownEventArgs)
OnStylusDown(self: Button_19$20, e: StylusDownEventArgs)
OnStylusDown(self: CheckBox_20$21, e: StylusDownEventArgs)
OnStylusDown(self: ComboBox_21$22, e: StylusDownEventArgs)
OnStylusDown(self: Separator_22$23, e: StylusDownEventArgs)
"""
pass
def OnStylusEnter(self, *args): #cannot find CLR method
"""
OnStylusEnter(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusEnter attached
event is raised by this element. Implement this method to add class handling
for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusEnter(self: Window_16$17, e: StylusEventArgs)
OnStylusEnter(self: Label_17$18, e: StylusEventArgs)
OnStylusEnter(self: TextBox_18$19, e: StylusEventArgs)
OnStylusEnter(self: Button_19$20, e: StylusEventArgs)
OnStylusEnter(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusEnter(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusEnter(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusInAirMove(self, *args): #cannot find CLR method
"""
OnStylusInAirMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusInAirMove attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusInAirMove(self: Window_16$17, e: StylusEventArgs)
OnStylusInAirMove(self: Label_17$18, e: StylusEventArgs)
OnStylusInAirMove(self: TextBox_18$19, e: StylusEventArgs)
OnStylusInAirMove(self: Button_19$20, e: StylusEventArgs)
OnStylusInAirMove(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusInAirMove(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusInAirMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusInRange(self, *args): #cannot find CLR method
"""
OnStylusInRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusInRange attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusInRange(self: Window_16$17, e: StylusEventArgs)
OnStylusInRange(self: Label_17$18, e: StylusEventArgs)
OnStylusInRange(self: TextBox_18$19, e: StylusEventArgs)
OnStylusInRange(self: Button_19$20, e: StylusEventArgs)
OnStylusInRange(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusInRange(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusInRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusLeave(self, *args): #cannot find CLR method
"""
OnStylusLeave(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusLeave attached
event is raised by this element. Implement this method to add class handling
for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusLeave(self: Window_16$17, e: StylusEventArgs)
OnStylusLeave(self: Label_17$18, e: StylusEventArgs)
OnStylusLeave(self: TextBox_18$19, e: StylusEventArgs)
OnStylusLeave(self: Button_19$20, e: StylusEventArgs)
OnStylusLeave(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusLeave(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusLeave(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusMove(self, *args): #cannot find CLR method
"""
OnStylusMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusMove attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusMove(self: Window_16$17, e: StylusEventArgs)
OnStylusMove(self: Label_17$18, e: StylusEventArgs)
OnStylusMove(self: TextBox_18$19, e: StylusEventArgs)
OnStylusMove(self: Button_19$20, e: StylusEventArgs)
OnStylusMove(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusMove(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusOutOfRange(self, *args): #cannot find CLR method
"""
OnStylusOutOfRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusOutOfRange attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusOutOfRange(self: Window_16$17, e: StylusEventArgs)
OnStylusOutOfRange(self: Label_17$18, e: StylusEventArgs)
OnStylusOutOfRange(self: TextBox_18$19, e: StylusEventArgs)
OnStylusOutOfRange(self: Button_19$20, e: StylusEventArgs)
OnStylusOutOfRange(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusOutOfRange(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusOutOfRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusSystemGesture(self, *args): #cannot find CLR method
"""
OnStylusSystemGesture(self: UIElement, e: StylusSystemGestureEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusSystemGesture
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusSystemGestureEventArgs that contains the event
data.
OnStylusSystemGesture(self: Window_16$17, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: Label_17$18, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: TextBox_18$19, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: Button_19$20, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: CheckBox_20$21, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: ComboBox_21$22, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: Separator_22$23, e: StylusSystemGestureEventArgs)
"""
pass
def OnStylusUp(self, *args): #cannot find CLR method
"""
OnStylusUp(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusUp attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusUp(self: Window_16$17, e: StylusEventArgs)
OnStylusUp(self: Label_17$18, e: StylusEventArgs)
OnStylusUp(self: TextBox_18$19, e: StylusEventArgs)
OnStylusUp(self: Button_19$20, e: StylusEventArgs)
OnStylusUp(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusUp(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusUp(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnTextInput(self, *args): #cannot find CLR method
"""
OnTextInput(self: UIElement, e: TextCompositionEventArgs)
Invoked when an unhandled System.Windows.Input.TextCompositionManager.TextInput
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.TextCompositionEventArgs that contains the event data.
OnTextInput(self: Window_16$17, e: TextCompositionEventArgs)
OnTextInput(self: Label_17$18, e: TextCompositionEventArgs)
OnTextInput(self: TextBox_18$19, e: TextCompositionEventArgs)
OnTextInput(self: Button_19$20, e: TextCompositionEventArgs)
OnTextInput(self: CheckBox_20$21, e: TextCompositionEventArgs)
OnTextInput(self: ComboBox_21$22, e: TextCompositionEventArgs)
OnTextInput(self: Separator_22$23, e: TextCompositionEventArgs)
"""
pass
def OnToolTipClosing(self, *args): #cannot find CLR method
"""
OnToolTipClosing(self: FrameworkElement, e: ToolTipEventArgs)
Invoked whenever an unhandled System.Windows.FrameworkElement.ToolTipClosing
routed event reaches this class in its route. Implement this method to add
class handling for this event.
e: Provides data about the event.
OnToolTipClosing(self: Window_16$17, e: ToolTipEventArgs)
OnToolTipClosing(self: Label_17$18, e: ToolTipEventArgs)
OnToolTipClosing(self: TextBox_18$19, e: ToolTipEventArgs)
OnToolTipClosing(self: Button_19$20, e: ToolTipEventArgs)
OnToolTipClosing(self: CheckBox_20$21, e: ToolTipEventArgs)
OnToolTipClosing(self: ComboBox_21$22, e: ToolTipEventArgs)
OnToolTipClosing(self: Separator_22$23, e: ToolTipEventArgs)
"""
pass
def OnToolTipOpening(self, *args): #cannot find CLR method
"""
OnToolTipOpening(self: FrameworkElement, e: ToolTipEventArgs)
Invoked whenever the System.Windows.FrameworkElement.ToolTipOpening routed
event reaches this class in its route. Implement this method to add class
handling for this event.
e: Provides data about the event.
OnToolTipOpening(self: Window_16$17, e: ToolTipEventArgs)
OnToolTipOpening(self: Label_17$18, e: ToolTipEventArgs)
OnToolTipOpening(self: TextBox_18$19, e: ToolTipEventArgs)
OnToolTipOpening(self: Button_19$20, e: ToolTipEventArgs)
OnToolTipOpening(self: CheckBox_20$21, e: ToolTipEventArgs)
OnToolTipOpening(self: ComboBox_21$22, e: ToolTipEventArgs)
OnToolTipOpening(self: Separator_22$23, e: ToolTipEventArgs)
"""
pass
def OnTouchDown(self, *args): #cannot find CLR method
"""
OnTouchDown(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchDown routed event
that occurs when a touch presses inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchDown(self: Window_16$17, e: TouchEventArgs)
OnTouchDown(self: Label_17$18, e: TouchEventArgs)
OnTouchDown(self: TextBox_18$19, e: TouchEventArgs)
OnTouchDown(self: Button_19$20, e: TouchEventArgs)
OnTouchDown(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchDown(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchDown(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchEnter(self, *args): #cannot find CLR method
"""
OnTouchEnter(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchEnter routed
event that occurs when a touch moves from outside to inside the bounds of this
element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchEnter(self: Window_16$17, e: TouchEventArgs)
OnTouchEnter(self: Label_17$18, e: TouchEventArgs)
OnTouchEnter(self: TextBox_18$19, e: TouchEventArgs)
OnTouchEnter(self: Button_19$20, e: TouchEventArgs)
OnTouchEnter(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchEnter(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchEnter(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchLeave(self, *args): #cannot find CLR method
"""
OnTouchLeave(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchLeave routed
event that occurs when a touch moves from inside to outside the bounds of this
System.Windows.UIElement.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchLeave(self: Window_16$17, e: TouchEventArgs)
OnTouchLeave(self: Label_17$18, e: TouchEventArgs)
OnTouchLeave(self: TextBox_18$19, e: TouchEventArgs)
OnTouchLeave(self: Button_19$20, e: TouchEventArgs)
OnTouchLeave(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchLeave(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchLeave(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchMove(self, *args): #cannot find CLR method
"""
OnTouchMove(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchMove routed event
that occurs when a touch moves while inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchMove(self: Window_16$17, e: TouchEventArgs)
OnTouchMove(self: Label_17$18, e: TouchEventArgs)
OnTouchMove(self: TextBox_18$19, e: TouchEventArgs)
OnTouchMove(self: Button_19$20, e: TouchEventArgs)
OnTouchMove(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchMove(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchMove(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchUp(self, *args): #cannot find CLR method
"""
OnTouchUp(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchUp routed event
that occurs when a touch is released inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchUp(self: Window_16$17, e: TouchEventArgs)
OnTouchUp(self: Label_17$18, e: TouchEventArgs)
OnTouchUp(self: TextBox_18$19, e: TouchEventArgs)
OnTouchUp(self: Button_19$20, e: TouchEventArgs)
OnTouchUp(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchUp(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchUp(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnVisualChildrenChanged(self, *args): #cannot find CLR method
"""
OnVisualChildrenChanged(self: Visual, visualAdded: DependencyObject, visualRemoved: DependencyObject)
Called when the System.Windows.Media.VisualCollection of the visual object is
modified.
visualAdded: The System.Windows.Media.Visual that was added to the collection
visualRemoved: The System.Windows.Media.Visual that was removed from the collection
OnVisualChildrenChanged(self: Window_16$17, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: Label_17$18, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: TextBox_18$19, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: Button_19$20, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: CheckBox_20$21, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: ComboBox_21$22, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: Separator_22$23, visualAdded: DependencyObject, visualRemoved: DependencyObject)
"""
pass
def OnVisualParentChanged(self, *args): #cannot find CLR method
"""
OnVisualParentChanged(self: FrameworkElement, oldParent: DependencyObject)
Invoked when the parent of this element in the visual tree is changed.
Overrides
System.Windows.UIElement.OnVisualParentChanged(System.Windows.DependencyObject).
oldParent: The old parent element. May be null to indicate that the element did not have a
visual parent previously.
OnVisualParentChanged(self: Window_16$17, oldParent: DependencyObject)
OnVisualParentChanged(self: Label_17$18, oldParent: DependencyObject)
OnVisualParentChanged(self: TextBox_18$19, oldParent: DependencyObject)
OnVisualParentChanged(self: Button_19$20, oldParent: DependencyObject)
OnVisualParentChanged(self: CheckBox_20$21, oldParent: DependencyObject)
OnVisualParentChanged(self: ComboBox_21$22, oldParent: DependencyObject)
OnVisualParentChanged(self: Separator_22$23, oldParent: DependencyObject)
"""
pass
def OnWindowPositionChanged(self, *args): #cannot find CLR method
"""
OnWindowPositionChanged(self: HwndHost, rcBoundingBox: Rect)
Called when the hosted window's position changes.
rcBoundingBox: The window's position.
"""
pass
def ParentLayoutInvalidated(self, *args): #cannot find CLR method
"""
ParentLayoutInvalidated(self: FrameworkElement, child: UIElement)
Supports incremental layout implementations in specialized subclasses of
System.Windows.FrameworkElement.
System.Windows.FrameworkElement.ParentLayoutInvalidated(System.Windows.UIElement)
is invoked when a child element has invalidated a property that is marked in
metadata as affecting the parent's measure or arrange passes during layout.
child: The child element reporting the change.
ParentLayoutInvalidated(self: Window_16$17, child: UIElement)
ParentLayoutInvalidated(self: Label_17$18, child: UIElement)
ParentLayoutInvalidated(self: TextBox_18$19, child: UIElement)
ParentLayoutInvalidated(self: Button_19$20, child: UIElement)
ParentLayoutInvalidated(self: CheckBox_20$21, child: UIElement)
ParentLayoutInvalidated(self: ComboBox_21$22, child: UIElement)
ParentLayoutInvalidated(self: Separator_22$23, child: UIElement)
"""
pass
def RegisterKeyboardInputSinkCore(self, *args): #cannot find CLR method
"""
RegisterKeyboardInputSinkCore(self: HwndHost, sink: IKeyboardInputSink) -> IKeyboardInputSite
Registers the System.Windows.Interop.IKeyboardInputSink interface of a
contained component.
sink: The System.Windows.Interop.IKeyboardInputSink sink of the contained component.
Returns: The System.Windows.Interop.IKeyboardInputSite site of the contained component.
"""
pass
def RemoveLogicalChild(self, *args): #cannot find CLR method
"""
RemoveLogicalChild(self: FrameworkElement, child: object)
Removes the provided object from this element's logical tree.
System.Windows.FrameworkElement updates the affected logical tree parent
pointers to keep in sync with this deletion.
child: The element to remove.
RemoveLogicalChild(self: Window_16$17, child: object)
RemoveLogicalChild(self: Label_17$18, child: object)
RemoveLogicalChild(self: TextBox_18$19, child: object)
RemoveLogicalChild(self: Button_19$20, child: object)
RemoveLogicalChild(self: CheckBox_20$21, child: object)
RemoveLogicalChild(self: ComboBox_21$22, child: object)
RemoveLogicalChild(self: Separator_22$23, child: object)
"""
pass
def RemoveVisualChild(self, *args): #cannot find CLR method
"""
RemoveVisualChild(self: Visual, child: Visual)
Removes the parent-child relationship between two visuals.
child: The child visual object to remove from the parent visual.
RemoveVisualChild(self: Window_16$17, child: Window_16$17)
RemoveVisualChild(self: Label_17$18, child: Label_17$18)
RemoveVisualChild(self: TextBox_18$19, child: TextBox_18$19)
RemoveVisualChild(self: Button_19$20, child: Button_19$20)
RemoveVisualChild(self: CheckBox_20$21, child: CheckBox_20$21)
RemoveVisualChild(self: ComboBox_21$22, child: ComboBox_21$22)
RemoveVisualChild(self: Separator_22$23, child: Separator_22$23)
"""
pass
def ShouldSerializeProperty(self, *args): #cannot find CLR method
"""
ShouldSerializeProperty(self: DependencyObject, dp: DependencyProperty) -> bool
Returns a value that indicates whether serialization processes should serialize
the value for the provided dependency property.
dp: The identifier for the dependency property that should be serialized.
Returns: true if the dependency property that is supplied should be value-serialized;
otherwise, false.
ShouldSerializeProperty(self: Window_16$17, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: Label_17$18, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: TextBox_18$19, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: Button_19$20, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: CheckBox_20$21, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: ComboBox_21$22, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: Separator_22$23, dp: DependencyProperty) -> bool
"""
pass
def TabIntoCore(self, *args): #cannot find CLR method
"""
TabIntoCore(self: HwndHost, request: TraversalRequest) -> bool
Sets focus on either the first tab stop or the last tab stop of the sink.
request: Specifies whether focus should be set to the first or the last tab stop.
Returns: Always returns false.
"""
pass
def TranslateAcceleratorCore(self, *args): #cannot find CLR method
"""
TranslateAcceleratorCore(self: HwndHost, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes keyboard input at the keydown message level.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: Always returns false.
"""
pass
def TranslateCharCore(self, *args): #cannot find CLR method
"""
TranslateCharCore(self: HwndHost, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes WM_CHAR, WM_SYSCHAR, WM_DEADCHAR, and WM_SYSDEADCHAR input messages
before the
System.Windows.Interop.IKeyboardInputSink.OnMnemonic(System.Windows.Interop.MSG@,System.Windows.Input.ModifierKeys)
method is called.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: Always returns false.
"""
pass
def UpdateWindowPos(self):
"""
UpdateWindowPos(self: HwndHost)
Updates the child window's size, visibility, and position to reflect the
current state of the element.
"""
pass
def WndProc(self, *args): #cannot find CLR method
"""
WndProc(self: HwndHost, hwnd: IntPtr, msg: int, wParam: IntPtr, lParam: IntPtr, handled: bool) -> (IntPtr, bool)
When overridden in a derived class, accesses the window process (handle) of the
hosted child window.
hwnd: The window handle of the hosted window.
msg: The message to act upon.
wParam: Information that may be relevant to handling the message. This is typically
used to store small pieces of information, such as flags.
lParam: Information that may be relevant to handling the message. This is typically
used to reference an object.
handled: Whether events resulting from this message should be marked handled.
Returns: The window handle of the child window.
"""
pass
def __enter__(self, *args): #cannot find CLR method
""" __enter__(self: IDisposable) -> object """
pass
def __exit__(self, *args): #cannot find CLR method
""" __exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object) """
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
DefaultStyleKey = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the key to use to reference the style for this control, when theme styles are used or defined.
"""
Handle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the window handle of the hosted window.
Get: Handle(self: HwndHost) -> IntPtr
"""
HasEffectiveKeyboardFocus = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
InheritanceBehavior = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the scope limits for property value inheritance, resource key lookup, and RelativeSource FindAncestor lookup.
"""
IsEnabledCore = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that becomes the return value of System.Windows.UIElement.IsEnabled in derived classes.
"""
LogicalChildren = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets an enumerator for logical child elements of this element.
"""
StylusPlugIns = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a collection of all stylus plug-in (customization) objects associated with this element.
"""
VisualBitmapEffect = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Effects.BitmapEffect value for the System.Windows.Media.Visual.
"""
VisualBitmapEffectInput = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Effects.BitmapEffectInput value for the System.Windows.Media.Visual.
"""
VisualBitmapScalingMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.BitmapScalingMode for the System.Windows.Media.Visual.
"""
VisualCacheMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a cached representation of the System.Windows.Media.Visual.
"""
VisualChildrenCount = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the number of visual child elements within this element.
"""
VisualClearTypeHint = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.ClearTypeHint that determines how ClearType is rendered in the System.Windows.Media.Visual.
"""
VisualClip = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the clip region of the System.Windows.Media.Visual as a System.Windows.Media.Geometry value.
"""
VisualEdgeMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the edge mode of the System.Windows.Media.Visual as an System.Windows.Media.EdgeMode value.
"""
VisualEffect = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the bitmap effect to apply to the System.Windows.Media.Visual.
"""
VisualOffset = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the offset value of the visual object.
"""
VisualOpacity = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the opacity of the System.Windows.Media.Visual.
"""
VisualOpacityMask = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Brush value that represents the opacity mask of the System.Windows.Media.Visual.
"""
VisualParent = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the visual tree parent of the visual object.
"""
VisualScrollableAreaClip = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a clipped scrollable area for the System.Windows.Media.Visual.
"""
VisualTextHintingMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.TextHintingMode of the System.Windows.Media.Visual.
"""
VisualTextRenderingMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.TextRenderingMode of the System.Windows.Media.Visual.
"""
VisualTransform = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Transform value for the System.Windows.Media.Visual.
"""
VisualXSnappingGuidelines = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the x-coordinate (vertical) guideline collection.
"""
VisualYSnappingGuidelines = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the y-coordinate (horizontal) guideline collection.
"""
DpiChanged = None
DpiChangedEvent = None
MessageHook = None
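# A minimal sketch (plain CPython, no WPF or CLR required) of the
# property(fget, fset, fdel) pattern these generated stubs use throughout:
# the getter returns a fresh placeholder object, while the setter and
# deleter are no-ops. The class name _StubPatternDemo is hypothetical,
# for illustration only; it is not part of the generated API.
class _StubPatternDemo(object):
    Handle = property(lambda self: object(), lambda self, v: None, lambda self: None)
# Reading such a stub property yields a new placeholder on every access,
# and assigning to or deleting it silently does nothing instead of raising
# AttributeError, which keeps code that exercises these stubs importable.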
class ActiveXHost(HwndHost, IResource, IAnimatable, IInputElement, IFrameworkInputElement, ISupportInitialize, IHaveResources, IQueryAmbient, IDisposable, IWin32Window, IKeyboardInputSink):
""" Hosts an ActiveX control as an element within Windows Presentation Foundation (WPF) content. """
def AddLogicalChild(self, *args): #cannot find CLR method
"""
AddLogicalChild(self: FrameworkElement, child: object)
Adds the provided object to the logical tree of this element.
child: Child element to be added.
AddLogicalChild(self: Window_16$17, child: object)
AddLogicalChild(self: Label_17$18, child: object)
AddLogicalChild(self: TextBox_18$19, child: object)
AddLogicalChild(self: Button_19$20, child: object)
AddLogicalChild(self: CheckBox_20$21, child: object)
AddLogicalChild(self: ComboBox_21$22, child: object)
AddLogicalChild(self: Separator_22$23, child: object)
"""
pass
def AddVisualChild(self, *args): #cannot find CLR method
"""
AddVisualChild(self: Visual, child: Visual)
Defines the parent-child relationship between two visuals.
child: The child visual object to add to parent visual.
AddVisualChild(self: Window_16$17, child: Window_16$17)
AddVisualChild(self: Label_17$18, child: Label_17$18)
AddVisualChild(self: TextBox_18$19, child: TextBox_18$19)
AddVisualChild(self: Button_19$20, child: Button_19$20)
AddVisualChild(self: CheckBox_20$21, child: CheckBox_20$21)
AddVisualChild(self: ComboBox_21$22, child: ComboBox_21$22)
AddVisualChild(self: Separator_22$23, child: Separator_22$23)
"""
pass
def ArrangeCore(self, *args): #cannot find CLR method
"""
ArrangeCore(self: FrameworkElement, finalRect: Rect)
Implements System.Windows.UIElement.ArrangeCore(System.Windows.Rect) (defined
as virtual in System.Windows.UIElement) and seals the implementation.
finalRect: The final area within the parent that this element should use to arrange itself
and its children.
ArrangeCore(self: Window_16$17, finalRect: Rect)
ArrangeCore(self: Label_17$18, finalRect: Rect)
ArrangeCore(self: TextBox_18$19, finalRect: Rect)
ArrangeCore(self: Button_19$20, finalRect: Rect)
ArrangeCore(self: CheckBox_20$21, finalRect: Rect)
ArrangeCore(self: ComboBox_21$22, finalRect: Rect)
ArrangeCore(self: Separator_22$23, finalRect: Rect)
"""
pass
def ArrangeOverride(self, *args): #cannot find CLR method
"""
ArrangeOverride(self: FrameworkElement, finalSize: Size) -> Size
When overridden in a derived class, positions child elements and determines a
size for a System.Windows.FrameworkElement derived class.
finalSize: The final area within the parent that this element should use to arrange itself
and its children.
Returns: The actual size used.
ArrangeOverride(self: Window_16$17, arrangeBounds: Size) -> Size
ArrangeOverride(self: Label_17$18, arrangeBounds: Size) -> Size
ArrangeOverride(self: TextBox_18$19, arrangeBounds: Size) -> Size
ArrangeOverride(self: Button_19$20, arrangeBounds: Size) -> Size
ArrangeOverride(self: CheckBox_20$21, arrangeBounds: Size) -> Size
ArrangeOverride(self: ComboBox_21$22, arrangeBounds: Size) -> Size
ArrangeOverride(self: Separator_22$23, arrangeBounds: Size) -> Size
"""
pass
def BuildWindowCore(self, *args): #cannot find CLR method
"""
BuildWindowCore(self: ActiveXHost, hwndParent: HandleRef) -> HandleRef
Creates the System.Windows.Interop.ActiveXHost window and assigns it to a
parent.
hwndParent: The parent window.
Returns: A System.Runtime.InteropServices.HandleRef to the
System.Windows.Interop.ActiveXHost window.
"""
pass
def DestroyWindowCore(self, *args): #cannot find CLR method
"""
DestroyWindowCore(self: ActiveXHost, hwnd: HandleRef)
Destroys the hosted window.
hwnd: A structure that contains the window handle.
"""
pass
def Dispose(self):
"""
Dispose(self: ActiveXHost, disposing: bool)
Releases the unmanaged resources that are used by the
System.Windows.Interop.ActiveXHost and optionally releases the managed
resources.
disposing: true to release both managed and unmanaged resources; false to release only
unmanaged resources.
"""
pass
def GetLayoutClip(self, *args): #cannot find CLR method
"""
GetLayoutClip(self: FrameworkElement, layoutSlotSize: Size) -> Geometry
Returns a geometry for a clipping mask. The mask applies if the layout system
attempts to arrange an element that is larger than the available display space.
layoutSlotSize: The size of the part of the element that does visual presentation.
Returns: The clipping geometry.
GetLayoutClip(self: Window_16$17, layoutSlotSize: Size) -> Geometry
GetLayoutClip(self: Label_17$18, layoutSlotSize: Size) -> Geometry
GetLayoutClip(self: TextBox_18$19, layoutSlotSize: Size) -> Geometry
GetLayoutClip(self: Button_19$20, layoutSlotSize: Size) -> Geometry
GetLayoutClip(self: CheckBox_20$21, layoutSlotSize: Size) -> Geometry
GetLayoutClip(self: ComboBox_21$22, layoutSlotSize: Size) -> Geometry
GetLayoutClip(self: Separator_22$23, layoutSlotSize: Size) -> Geometry
"""
pass
def GetTemplateChild(self, *args): #cannot find CLR method
"""
GetTemplateChild(self: FrameworkElement, childName: str) -> DependencyObject
Returns the named element in the visual tree of an instantiated
System.Windows.Controls.ControlTemplate.
childName: Name of the child to find.
Returns: The requested element. May be null if no element of the requested name exists.
GetTemplateChild(self: Window_16$17, childName: str) -> DependencyObject
GetTemplateChild(self: Label_17$18, childName: str) -> DependencyObject
GetTemplateChild(self: TextBox_18$19, childName: str) -> DependencyObject
GetTemplateChild(self: Button_19$20, childName: str) -> DependencyObject
GetTemplateChild(self: CheckBox_20$21, childName: str) -> DependencyObject
GetTemplateChild(self: ComboBox_21$22, childName: str) -> DependencyObject
GetTemplateChild(self: Separator_22$23, childName: str) -> DependencyObject
"""
pass
def GetUIParentCore(self, *args): #cannot find CLR method
"""
GetUIParentCore(self: FrameworkElement) -> DependencyObject
Returns an alternative logical parent for this element if there is no visual
parent.
Returns: Returns something other than null whenever a WPF framework-level implementation
of this method has a non-visual parent connection.
GetUIParentCore(self: Window_16$17) -> DependencyObject
GetUIParentCore(self: Label_17$18) -> DependencyObject
GetUIParentCore(self: TextBox_18$19) -> DependencyObject
GetUIParentCore(self: Button_19$20) -> DependencyObject
GetUIParentCore(self: CheckBox_20$21) -> DependencyObject
GetUIParentCore(self: ComboBox_21$22) -> DependencyObject
GetUIParentCore(self: Separator_22$23) -> DependencyObject
"""
pass
def GetVisualChild(self, *args): #cannot find CLR method
"""
GetVisualChild(self: FrameworkElement, index: int) -> Visual
Overrides System.Windows.Media.Visual.GetVisualChild(System.Int32), and returns
a child at the specified index from a collection of child elements.
index: The zero-based index of the requested child element in the collection.
Returns: The requested child element. This should not return null; if the provided index
is out of range, an exception is thrown.
GetVisualChild(self: Window_16$17, index: int) -> Visual
GetVisualChild(self: Label_17$18, index: int) -> Visual
GetVisualChild(self: TextBox_18$19, index: int) -> Visual
GetVisualChild(self: Button_19$20, index: int) -> Visual
GetVisualChild(self: CheckBox_20$21, index: int) -> Visual
GetVisualChild(self: ComboBox_21$22, index: int) -> Visual
GetVisualChild(self: Separator_22$23, index: int) -> Visual
"""
pass
def HasFocusWithinCore(self, *args): #cannot find CLR method
"""
HasFocusWithinCore(self: HwndHost) -> bool
Gets a value that indicates whether the sink or one of its contained components
has focus.
Returns: true if the sink or one of its contained components has focus; otherwise, false.
"""
pass
def HitTestCore(self, *args): #cannot find CLR method
"""
HitTestCore(self: UIElement, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
Implements
System.Windows.Media.Visual.HitTestCore(System.Windows.Media.GeometryHitTestParameters)
to supply base element hit testing behavior (returning
System.Windows.Media.GeometryHitTestResult).
hitTestParameters: Describes the hit test to perform, including the initial hit point.
Returns: Results of the test, including the evaluated geometry.
HitTestCore(self: UIElement, hitTestParameters: PointHitTestParameters) -> HitTestResult
Implements
System.Windows.Media.Visual.HitTestCore(System.Windows.Media.PointHitTestParameters)
to supply base element hit testing behavior (returning
System.Windows.Media.HitTestResult).
hitTestParameters: Describes the hit test to perform, including the initial hit point.
Returns: Results of the test, including the evaluated point.
HitTestCore(self: Window_16$17, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: Window_16$17, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
HitTestCore(self: Label_17$18, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: Label_17$18, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
HitTestCore(self: TextBox_18$19, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: TextBox_18$19, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
HitTestCore(self: Button_19$20, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: Button_19$20, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
HitTestCore(self: CheckBox_20$21, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: CheckBox_20$21, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
HitTestCore(self: ComboBox_21$22, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: ComboBox_21$22, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
HitTestCore(self: Separator_22$23, hitTestParameters: PointHitTestParameters) -> HitTestResult
HitTestCore(self: Separator_22$23, hitTestParameters: GeometryHitTestParameters) -> GeometryHitTestResult
"""
pass
def MeasureCore(self, *args): #cannot find CLR method
"""
MeasureCore(self: FrameworkElement, availableSize: Size) -> Size
Implements basic measure-pass layout system behavior for
System.Windows.FrameworkElement.
availableSize: The available size that the parent element can give to the child elements.
Returns: The desired size of this element in layout.
MeasureCore(self: Window_16$17, availableSize: Size) -> Size
MeasureCore(self: Label_17$18, availableSize: Size) -> Size
MeasureCore(self: TextBox_18$19, availableSize: Size) -> Size
MeasureCore(self: Button_19$20, availableSize: Size) -> Size
MeasureCore(self: CheckBox_20$21, availableSize: Size) -> Size
MeasureCore(self: ComboBox_21$22, availableSize: Size) -> Size
MeasureCore(self: Separator_22$23, availableSize: Size) -> Size
"""
pass
def MeasureOverride(self, *args): #cannot find CLR method
"""
MeasureOverride(self: ActiveXHost, swConstraint: Size) -> Size
Returns the size of the window represented by the
System.Windows.Interop.HwndHost object, as requested by layout engine
operations.
swConstraint: The size of the System.Windows.Interop.HwndHost object.
Returns: The size of the System.Windows.Interop.HwndHost object.
"""
pass
def OnAccessKey(self, *args): #cannot find CLR method
"""
OnAccessKey(self: ActiveXHost, args: AccessKeyEventArgs)
Provides class handling for when an access key that is meaningful for this
element is invoked.
args: The event data to the access key event. The event data reports which key was
invoked, and indicates whether the System.Windows.Input.AccessKeyManager object
that controls the sending of these events also sent this access key invocation
to other elements.
"""
pass
def OnChildDesiredSizeChanged(self, *args): #cannot find CLR method
"""
OnChildDesiredSizeChanged(self: UIElement, child: UIElement)
Supports layout behavior when a child element is resized.
child: The child element that is being resized.
OnChildDesiredSizeChanged(self: Window_16$17, child: Window_16$17)
OnChildDesiredSizeChanged(self: Label_17$18, child: Label_17$18)
OnChildDesiredSizeChanged(self: TextBox_18$19, child: TextBox_18$19)
OnChildDesiredSizeChanged(self: Button_19$20, child: Button_19$20)
OnChildDesiredSizeChanged(self: CheckBox_20$21, child: CheckBox_20$21)
OnChildDesiredSizeChanged(self: ComboBox_21$22, child: ComboBox_21$22)
OnChildDesiredSizeChanged(self: Separator_22$23, child: Separator_22$23)
"""
pass
def OnContextMenuClosing(self, *args): #cannot find CLR method
"""
OnContextMenuClosing(self: FrameworkElement, e: ContextMenuEventArgs)
Invoked whenever an unhandled
System.Windows.FrameworkElement.ContextMenuClosing routed event reaches this
class in its route. Implement this method to add class handling for this event.
e: Provides data about the event.
OnContextMenuClosing(self: Window_16$17, e: ContextMenuEventArgs)
OnContextMenuClosing(self: Label_17$18, e: ContextMenuEventArgs)
OnContextMenuClosing(self: TextBox_18$19, e: ContextMenuEventArgs)
OnContextMenuClosing(self: Button_19$20, e: ContextMenuEventArgs)
OnContextMenuClosing(self: CheckBox_20$21, e: ContextMenuEventArgs)
OnContextMenuClosing(self: ComboBox_21$22, e: ContextMenuEventArgs)
OnContextMenuClosing(self: Separator_22$23, e: ContextMenuEventArgs)
"""
pass
def OnContextMenuOpening(self, *args): #cannot find CLR method
"""
OnContextMenuOpening(self: FrameworkElement, e: ContextMenuEventArgs)
Invoked whenever an unhandled
System.Windows.FrameworkElement.ContextMenuOpening routed event reaches this
class in its route. Implement this method to add class handling for this event.
e: The System.Windows.Controls.ContextMenuEventArgs that contains the event data.
OnContextMenuOpening(self: Window_16$17, e: ContextMenuEventArgs)
OnContextMenuOpening(self: Label_17$18, e: ContextMenuEventArgs)
OnContextMenuOpening(self: TextBox_18$19, e: ContextMenuEventArgs)
OnContextMenuOpening(self: Button_19$20, e: ContextMenuEventArgs)
OnContextMenuOpening(self: CheckBox_20$21, e: ContextMenuEventArgs)
OnContextMenuOpening(self: ComboBox_21$22, e: ContextMenuEventArgs)
OnContextMenuOpening(self: Separator_22$23, e: ContextMenuEventArgs)
"""
pass
def OnCreateAutomationPeer(self, *args): #cannot find CLR method
"""
OnCreateAutomationPeer(self: HwndHost) -> AutomationPeer
Creates a System.Windows.Automation.Peers.AutomationPeer for
System.Windows.Interop.HwndHost.
Returns: The type-specific System.Windows.Automation.Peers.AutomationPeer implementation.
"""
pass
def OnDpiChanged(self, *args): #cannot find CLR method
""" OnDpiChanged(self: HwndHost, oldDpi: DpiScale, newDpi: DpiScale) """
pass
def OnDragEnter(self, *args): #cannot find CLR method
"""
OnDragEnter(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.DragEnter attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnDragEnter(self: Window_16$17, e: DragEventArgs)
OnDragEnter(self: Label_17$18, e: DragEventArgs)
OnDragEnter(self: TextBox_18$19, e: DragEventArgs)
OnDragEnter(self: Button_19$20, e: DragEventArgs)
OnDragEnter(self: CheckBox_20$21, e: DragEventArgs)
OnDragEnter(self: ComboBox_21$22, e: DragEventArgs)
OnDragEnter(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnDragLeave(self, *args): #cannot find CLR method
"""
OnDragLeave(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.DragLeave attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnDragLeave(self: Window_16$17, e: DragEventArgs)
OnDragLeave(self: Label_17$18, e: DragEventArgs)
OnDragLeave(self: TextBox_18$19, e: DragEventArgs)
OnDragLeave(self: Button_19$20, e: DragEventArgs)
OnDragLeave(self: CheckBox_20$21, e: DragEventArgs)
OnDragLeave(self: ComboBox_21$22, e: DragEventArgs)
OnDragLeave(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnDragOver(self, *args): #cannot find CLR method
"""
OnDragOver(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.DragOver attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnDragOver(self: Window_16$17, e: DragEventArgs)
OnDragOver(self: Label_17$18, e: DragEventArgs)
OnDragOver(self: TextBox_18$19, e: DragEventArgs)
OnDragOver(self: Button_19$20, e: DragEventArgs)
OnDragOver(self: CheckBox_20$21, e: DragEventArgs)
OnDragOver(self: ComboBox_21$22, e: DragEventArgs)
OnDragOver(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnDrop(self, *args): #cannot find CLR method
"""
OnDrop(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.Drop attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnDrop(self: Window_16$17, e: DragEventArgs)
OnDrop(self: Label_17$18, e: DragEventArgs)
OnDrop(self: TextBox_18$19, e: DragEventArgs)
OnDrop(self: Button_19$20, e: DragEventArgs)
OnDrop(self: CheckBox_20$21, e: DragEventArgs)
OnDrop(self: ComboBox_21$22, e: DragEventArgs)
OnDrop(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnGiveFeedback(self, *args): #cannot find CLR method
"""
OnGiveFeedback(self: UIElement, e: GiveFeedbackEventArgs)
Invoked when an unhandled System.Windows.DragDrop.GiveFeedback attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.GiveFeedbackEventArgs that contains the event data.
OnGiveFeedback(self: Window_16$17, e: GiveFeedbackEventArgs)
OnGiveFeedback(self: Label_17$18, e: GiveFeedbackEventArgs)
OnGiveFeedback(self: TextBox_18$19, e: GiveFeedbackEventArgs)
OnGiveFeedback(self: Button_19$20, e: GiveFeedbackEventArgs)
OnGiveFeedback(self: CheckBox_20$21, e: GiveFeedbackEventArgs)
OnGiveFeedback(self: ComboBox_21$22, e: GiveFeedbackEventArgs)
OnGiveFeedback(self: Separator_22$23, e: GiveFeedbackEventArgs)
"""
pass
def OnGotFocus(self, *args): #cannot find CLR method
"""
OnGotFocus(self: FrameworkElement, e: RoutedEventArgs)
Invoked whenever an unhandled System.Windows.UIElement.GotFocus event reaches
this element in its route.
e: The System.Windows.RoutedEventArgs that contains the event data.
OnGotFocus(self: Window_16$17, e: RoutedEventArgs)
OnGotFocus(self: Label_17$18, e: RoutedEventArgs)
OnGotFocus(self: TextBox_18$19, e: RoutedEventArgs)
OnGotFocus(self: Button_19$20, e: RoutedEventArgs)
OnGotFocus(self: CheckBox_20$21, e: RoutedEventArgs)
OnGotFocus(self: Separator_22$23, e: RoutedEventArgs)
"""
pass
def OnGotKeyboardFocus(self, *args): #cannot find CLR method
"""
OnGotKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.GotKeyboardFocus
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains the event
data.
OnGotKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnGotKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnGotKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnGotKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnGotKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnGotKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnGotKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnGotMouseCapture(self, *args): #cannot find CLR method
"""
OnGotMouseCapture(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.GotMouseCapture attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnGotMouseCapture(self: Window_16$17, e: MouseEventArgs)
OnGotMouseCapture(self: Label_17$18, e: MouseEventArgs)
OnGotMouseCapture(self: TextBox_18$19, e: MouseEventArgs)
OnGotMouseCapture(self: Button_19$20, e: MouseEventArgs)
OnGotMouseCapture(self: CheckBox_20$21, e: MouseEventArgs)
OnGotMouseCapture(self: ComboBox_21$22, e: MouseEventArgs)
OnGotMouseCapture(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnGotStylusCapture(self, *args): #cannot find CLR method
"""
OnGotStylusCapture(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.GotStylusCapture attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnGotStylusCapture(self: Window_16$17, e: StylusEventArgs)
OnGotStylusCapture(self: Label_17$18, e: StylusEventArgs)
OnGotStylusCapture(self: TextBox_18$19, e: StylusEventArgs)
OnGotStylusCapture(self: Button_19$20, e: StylusEventArgs)
OnGotStylusCapture(self: CheckBox_20$21, e: StylusEventArgs)
OnGotStylusCapture(self: ComboBox_21$22, e: StylusEventArgs)
OnGotStylusCapture(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnGotTouchCapture(self, *args): #cannot find CLR method
"""
OnGotTouchCapture(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.GotTouchCapture routed
event that occurs when a touch is captured to this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnGotTouchCapture(self: Window_16$17, e: TouchEventArgs)
OnGotTouchCapture(self: Label_17$18, e: TouchEventArgs)
OnGotTouchCapture(self: TextBox_18$19, e: TouchEventArgs)
OnGotTouchCapture(self: Button_19$20, e: TouchEventArgs)
OnGotTouchCapture(self: CheckBox_20$21, e: TouchEventArgs)
OnGotTouchCapture(self: ComboBox_21$22, e: TouchEventArgs)
OnGotTouchCapture(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnInitialized(self, *args): #cannot find CLR method
"""
OnInitialized(self: FrameworkElement, e: EventArgs)
Raises the System.Windows.FrameworkElement.Initialized event. This method is
invoked whenever System.Windows.FrameworkElement.IsInitialized is set to true
internally.
e: The System.EventArgs that contains the event data.
OnInitialized(self: Window_16$17, e: EventArgs)
OnInitialized(self: Label_17$18, e: EventArgs)
OnInitialized(self: TextBox_18$19, e: EventArgs)
OnInitialized(self: Button_19$20, e: EventArgs)
OnInitialized(self: CheckBox_20$21, e: EventArgs)
OnInitialized(self: ComboBox_21$22, e: EventArgs)
OnInitialized(self: Separator_22$23, e: EventArgs)
"""
pass
def OnIsKeyboardFocusedChanged(self, *args): #cannot find CLR method
"""
OnIsKeyboardFocusedChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsKeyboardFocusedChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsKeyboardFocusedChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusedChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusedChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusedChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusedChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusedChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusedChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsKeyboardFocusWithinChanged(self, *args): #cannot find CLR method
"""
OnIsKeyboardFocusWithinChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked just before the System.Windows.UIElement.IsKeyboardFocusWithinChanged
event is raised by this element. Implement this method to add class handling
for this event.
e: A System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsKeyboardFocusWithinChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusWithinChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusWithinChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusWithinChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusWithinChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusWithinChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsKeyboardFocusWithinChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsMouseCapturedChanged(self, *args): #cannot find CLR method
"""
OnIsMouseCapturedChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsMouseCapturedChanged event
is raised on this element. Implement this method to add class handling for this
event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsMouseCapturedChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsMouseCapturedChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsMouseCapturedChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsMouseCapturedChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsMouseCapturedChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsMouseCapturedChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsMouseCapturedChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsMouseCaptureWithinChanged(self, *args): #cannot find CLR method
"""
OnIsMouseCaptureWithinChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsMouseCaptureWithinChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: A System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsMouseCaptureWithinChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsMouseCaptureWithinChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsMouseCaptureWithinChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsMouseCaptureWithinChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsMouseCaptureWithinChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsMouseCaptureWithinChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsMouseCaptureWithinChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsMouseDirectlyOverChanged(self, *args): #cannot find CLR method
"""
OnIsMouseDirectlyOverChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsMouseDirectlyOverChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsMouseDirectlyOverChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsMouseDirectlyOverChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsStylusCapturedChanged(self, *args): #cannot find CLR method
"""
OnIsStylusCapturedChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsStylusCapturedChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: A System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsStylusCapturedChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsStylusCapturedChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsStylusCaptureWithinChanged(self, *args): #cannot find CLR method
"""
OnIsStylusCaptureWithinChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsStylusCaptureWithinChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsStylusCaptureWithinChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsStylusCaptureWithinChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnIsStylusDirectlyOverChanged(self, *args): #cannot find CLR method
"""
OnIsStylusDirectlyOverChanged(self: UIElement, e: DependencyPropertyChangedEventArgs)
Invoked when an unhandled System.Windows.UIElement.IsStylusDirectlyOverChanged
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.DependencyPropertyChangedEventArgs that contains the event
data.
OnIsStylusDirectlyOverChanged(self: Window_16$17, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: Label_17$18, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: TextBox_18$19, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: Button_19$20, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: CheckBox_20$21, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: ComboBox_21$22, e: DependencyPropertyChangedEventArgs)
OnIsStylusDirectlyOverChanged(self: Separator_22$23, e: DependencyPropertyChangedEventArgs)
"""
pass
def OnKeyDown(self, *args): #cannot find CLR method
"""
OnKeyDown(self: HwndHost, e: KeyEventArgs)
Called when the hosted window receives a WM_KEYDOWN message.
e: The event data.
"""
pass
def OnKeyUp(self, *args): #cannot find CLR method
"""
OnKeyUp(self: HwndHost, e: KeyEventArgs)
Called when the hosted window receives a WM_KEYUP message.
e: The event data.
"""
pass
def OnLostFocus(self, *args): #cannot find CLR method
"""
OnLostFocus(self: UIElement, e: RoutedEventArgs)
Raises the System.Windows.UIElement.LostFocus routed event by using the event
data that is provided.
e: A System.Windows.RoutedEventArgs that contains event data. This event data must
contain the identifier for the System.Windows.UIElement.LostFocus event.
OnLostFocus(self: Window_16$17, e: RoutedEventArgs)
OnLostFocus(self: Label_17$18, e: RoutedEventArgs)
OnLostFocus(self: TextBox_18$19, e: RoutedEventArgs)
OnLostFocus(self: Button_19$20, e: RoutedEventArgs)
OnLostFocus(self: CheckBox_20$21, e: RoutedEventArgs)
OnLostFocus(self: ComboBox_21$22, e: RoutedEventArgs)
OnLostFocus(self: Separator_22$23, e: RoutedEventArgs)
"""
pass
def OnLostKeyboardFocus(self, *args): #cannot find CLR method
"""
OnLostKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.LostKeyboardFocus
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains event data.
OnLostKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnLostKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnLostMouseCapture(self, *args): #cannot find CLR method
"""
OnLostMouseCapture(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.LostMouseCapture attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains event data.
OnLostMouseCapture(self: Window_16$17, e: MouseEventArgs)
OnLostMouseCapture(self: Label_17$18, e: MouseEventArgs)
OnLostMouseCapture(self: TextBox_18$19, e: MouseEventArgs)
OnLostMouseCapture(self: Button_19$20, e: MouseEventArgs)
OnLostMouseCapture(self: CheckBox_20$21, e: MouseEventArgs)
OnLostMouseCapture(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnLostStylusCapture(self, *args): #cannot find CLR method
"""
OnLostStylusCapture(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.LostStylusCapture
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains event data.
OnLostStylusCapture(self: Window_16$17, e: StylusEventArgs)
OnLostStylusCapture(self: Label_17$18, e: StylusEventArgs)
OnLostStylusCapture(self: TextBox_18$19, e: StylusEventArgs)
OnLostStylusCapture(self: Button_19$20, e: StylusEventArgs)
OnLostStylusCapture(self: CheckBox_20$21, e: StylusEventArgs)
OnLostStylusCapture(self: ComboBox_21$22, e: StylusEventArgs)
OnLostStylusCapture(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnLostTouchCapture(self, *args): #cannot find CLR method
"""
OnLostTouchCapture(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.LostTouchCapture
routed event that occurs when this element loses a touch capture.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnLostTouchCapture(self: Window_16$17, e: TouchEventArgs)
OnLostTouchCapture(self: Label_17$18, e: TouchEventArgs)
OnLostTouchCapture(self: TextBox_18$19, e: TouchEventArgs)
OnLostTouchCapture(self: Button_19$20, e: TouchEventArgs)
OnLostTouchCapture(self: CheckBox_20$21, e: TouchEventArgs)
OnLostTouchCapture(self: ComboBox_21$22, e: TouchEventArgs)
OnLostTouchCapture(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnManipulationBoundaryFeedback(self, *args): #cannot find CLR method
"""
OnManipulationBoundaryFeedback(self: UIElement, e: ManipulationBoundaryFeedbackEventArgs)
Called when the System.Windows.UIElement.ManipulationBoundaryFeedback event
occurs.
e: The data for the event.
OnManipulationBoundaryFeedback(self: Window_16$17, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: Label_17$18, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: TextBox_18$19, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: Button_19$20, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: CheckBox_20$21, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: ComboBox_21$22, e: ManipulationBoundaryFeedbackEventArgs)
OnManipulationBoundaryFeedback(self: Separator_22$23, e: ManipulationBoundaryFeedbackEventArgs)
"""
pass
def OnManipulationCompleted(self, *args): #cannot find CLR method
"""
OnManipulationCompleted(self: UIElement, e: ManipulationCompletedEventArgs)
Called when the System.Windows.UIElement.ManipulationCompleted event occurs.
e: The data for the event.
OnManipulationCompleted(self: Window_16$17, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: Label_17$18, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: TextBox_18$19, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: Button_19$20, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: CheckBox_20$21, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: ComboBox_21$22, e: ManipulationCompletedEventArgs)
OnManipulationCompleted(self: Separator_22$23, e: ManipulationCompletedEventArgs)
"""
pass
def OnManipulationDelta(self, *args): #cannot find CLR method
"""
OnManipulationDelta(self: UIElement, e: ManipulationDeltaEventArgs)
Called when the System.Windows.UIElement.ManipulationDelta event occurs.
e: The data for the event.
OnManipulationDelta(self: Window_16$17, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: Label_17$18, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: TextBox_18$19, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: Button_19$20, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: CheckBox_20$21, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: ComboBox_21$22, e: ManipulationDeltaEventArgs)
OnManipulationDelta(self: Separator_22$23, e: ManipulationDeltaEventArgs)
"""
pass
def OnManipulationInertiaStarting(self, *args): #cannot find CLR method
"""
OnManipulationInertiaStarting(self: UIElement, e: ManipulationInertiaStartingEventArgs)
Called when the System.Windows.UIElement.ManipulationInertiaStarting event
occurs.
e: The data for the event.
OnManipulationInertiaStarting(self: Window_16$17, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: Label_17$18, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: TextBox_18$19, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: Button_19$20, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: CheckBox_20$21, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: ComboBox_21$22, e: ManipulationInertiaStartingEventArgs)
OnManipulationInertiaStarting(self: Separator_22$23, e: ManipulationInertiaStartingEventArgs)
"""
pass
def OnManipulationStarted(self, *args): #cannot find CLR method
"""
OnManipulationStarted(self: UIElement, e: ManipulationStartedEventArgs)
Called when the System.Windows.UIElement.ManipulationStarted event occurs.
e: The data for the event.
OnManipulationStarted(self: Window_16$17, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: Label_17$18, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: TextBox_18$19, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: Button_19$20, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: CheckBox_20$21, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: ComboBox_21$22, e: ManipulationStartedEventArgs)
OnManipulationStarted(self: Separator_22$23, e: ManipulationStartedEventArgs)
"""
pass
def OnManipulationStarting(self, *args): #cannot find CLR method
"""
OnManipulationStarting(self: UIElement, e: ManipulationStartingEventArgs)
Provides class handling for the System.Windows.UIElement.ManipulationStarting
routed event that occurs when the manipulation processor is first created.
e: A System.Windows.Input.ManipulationStartingEventArgs that contains the event
data.
OnManipulationStarting(self: Window_16$17, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: Label_17$18, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: TextBox_18$19, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: Button_19$20, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: CheckBox_20$21, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: ComboBox_21$22, e: ManipulationStartingEventArgs)
OnManipulationStarting(self: Separator_22$23, e: ManipulationStartingEventArgs)
"""
pass
def OnMnemonicCore(self, *args): #cannot find CLR method
"""
OnMnemonicCore(self: HwndHost, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Called when one of the mnemonics (access keys) for this sink is invoked.
msg: The message for the mnemonic and associated data.
modifiers: Modifier keys.
Returns: Always returns false.
"""
pass
def OnMouseDown(self, *args): #cannot find CLR method
"""
OnMouseDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseDown attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data.
This event data reports details about the mouse button that was pressed and the
handled state.
OnMouseDown(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseDown(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseDown(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseEnter(self, *args): #cannot find CLR method
"""
OnMouseEnter(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseEnter attached event
is raised on this element. Implement this method to add class handling for this
event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnMouseEnter(self: Window_16$17, e: MouseEventArgs)
OnMouseEnter(self: Label_17$18, e: MouseEventArgs)
OnMouseEnter(self: TextBox_18$19, e: MouseEventArgs)
OnMouseEnter(self: Button_19$20, e: MouseEventArgs)
OnMouseEnter(self: CheckBox_20$21, e: MouseEventArgs)
OnMouseEnter(self: ComboBox_21$22, e: MouseEventArgs)
OnMouseEnter(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnMouseLeave(self, *args): #cannot find CLR method
"""
OnMouseLeave(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseLeave attached event
is raised on this element. Implement this method to add class handling for this
event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnMouseLeave(self: Window_16$17, e: MouseEventArgs)
OnMouseLeave(self: Label_17$18, e: MouseEventArgs)
OnMouseLeave(self: TextBox_18$19, e: MouseEventArgs)
OnMouseLeave(self: Button_19$20, e: MouseEventArgs)
OnMouseLeave(self: CheckBox_20$21, e: MouseEventArgs)
OnMouseLeave(self: ComboBox_21$22, e: MouseEventArgs)
OnMouseLeave(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnMouseLeftButtonDown(self, *args): #cannot find CLR method
"""
OnMouseLeftButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseLeftButtonDown routed
event is raised on this element. Implement this method to add class handling
for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was pressed.
OnMouseLeftButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseLeftButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseLeftButtonUp(self, *args): #cannot find CLR method
"""
OnMouseLeftButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseLeftButtonUp routed
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was released.
OnMouseLeftButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseLeftButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
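# Each On* stub above is a virtual-method hook: overriding it in a subclass adds
# class handling that runs before instance-level handlers attached to the event.
# A minimal, hedged IronPython sketch (assumes a WPF runtime is loadable; the
# subclass name and the printed message are illustrative, not part of this stub):
#
#   import clr
#   clr.AddReference('PresentationFramework')
#   from System.Windows.Controls import Button
#
#   class ClickTracingButton(Button):
#       def OnMouseLeftButtonDown(self, e):
#           # class handling fires before any instance handlers for this event
#           print('left button down at', e.GetPosition(self))
#           # forward to the base class so default button behavior is preserved
#           Button.OnMouseLeftButtonDown(self, e)
#
# Setting e.Handled = True in the override instead would mark the routed event
# as handled before instance handlers on this element run.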
def OnMouseMove(self, *args): #cannot find CLR method
"""
OnMouseMove(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseMove attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnMouseMove(self: Window_16$17, e: MouseEventArgs)
OnMouseMove(self: Label_17$18, e: MouseEventArgs)
OnMouseMove(self: TextBox_18$19, e: MouseEventArgs)
OnMouseMove(self: Button_19$20, e: MouseEventArgs)
OnMouseMove(self: CheckBox_20$21, e: MouseEventArgs)
OnMouseMove(self: ComboBox_21$22, e: MouseEventArgs)
OnMouseMove(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnMouseRightButtonDown(self, *args): #cannot find CLR method
"""
OnMouseRightButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseRightButtonDown routed
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was pressed.
OnMouseRightButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseRightButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseRightButtonUp(self, *args): #cannot find CLR method
"""
OnMouseRightButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.MouseRightButtonUp routed
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was released.
OnMouseRightButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseRightButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseUp(self, *args): #cannot find CLR method
"""
OnMouseUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseUp routed event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the mouse button was released.
OnMouseUp(self: Window_16$17, e: MouseButtonEventArgs)
OnMouseUp(self: Label_17$18, e: MouseButtonEventArgs)
OnMouseUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnMouseUp(self: Button_19$20, e: MouseButtonEventArgs)
OnMouseUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnMouseUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnMouseUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnMouseWheel(self, *args): #cannot find CLR method
"""
OnMouseWheel(self: UIElement, e: MouseWheelEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.MouseWheel attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.MouseWheelEventArgs that contains the event data.
OnMouseWheel(self: Window_16$17, e: MouseWheelEventArgs)
OnMouseWheel(self: Label_17$18, e: MouseWheelEventArgs)
OnMouseWheel(self: TextBox_18$19, e: MouseWheelEventArgs)
OnMouseWheel(self: Button_19$20, e: MouseWheelEventArgs)
OnMouseWheel(self: CheckBox_20$21, e: MouseWheelEventArgs)
OnMouseWheel(self: ComboBox_21$22, e: MouseWheelEventArgs)
OnMouseWheel(self: Separator_22$23, e: MouseWheelEventArgs)
"""
pass
def OnPreviewDragEnter(self, *args): #cannot find CLR method
"""
OnPreviewDragEnter(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDragEnter attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDragEnter(self: Window_16$17, e: DragEventArgs)
OnPreviewDragEnter(self: Label_17$18, e: DragEventArgs)
OnPreviewDragEnter(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDragEnter(self: Button_19$20, e: DragEventArgs)
OnPreviewDragEnter(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDragEnter(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDragEnter(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewDragLeave(self, *args): #cannot find CLR method
"""
OnPreviewDragLeave(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDragLeave attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDragLeave(self: Window_16$17, e: DragEventArgs)
OnPreviewDragLeave(self: Label_17$18, e: DragEventArgs)
OnPreviewDragLeave(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDragLeave(self: Button_19$20, e: DragEventArgs)
OnPreviewDragLeave(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDragLeave(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDragLeave(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewDragOver(self, *args): #cannot find CLR method
"""
OnPreviewDragOver(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDragOver attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDragOver(self: Window_16$17, e: DragEventArgs)
OnPreviewDragOver(self: Label_17$18, e: DragEventArgs)
OnPreviewDragOver(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDragOver(self: Button_19$20, e: DragEventArgs)
OnPreviewDragOver(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDragOver(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDragOver(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewDrop(self, *args): #cannot find CLR method
"""
OnPreviewDrop(self: UIElement, e: DragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewDrop attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.DragEventArgs that contains the event data.
OnPreviewDrop(self: Window_16$17, e: DragEventArgs)
OnPreviewDrop(self: Label_17$18, e: DragEventArgs)
OnPreviewDrop(self: TextBox_18$19, e: DragEventArgs)
OnPreviewDrop(self: Button_19$20, e: DragEventArgs)
OnPreviewDrop(self: CheckBox_20$21, e: DragEventArgs)
OnPreviewDrop(self: ComboBox_21$22, e: DragEventArgs)
OnPreviewDrop(self: Separator_22$23, e: DragEventArgs)
"""
pass
def OnPreviewGiveFeedback(self, *args): #cannot find CLR method
"""
OnPreviewGiveFeedback(self: UIElement, e: GiveFeedbackEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewGiveFeedback attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.GiveFeedbackEventArgs that contains the event data.
OnPreviewGiveFeedback(self: Window_16$17, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: Label_17$18, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: TextBox_18$19, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: Button_19$20, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: CheckBox_20$21, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: ComboBox_21$22, e: GiveFeedbackEventArgs)
OnPreviewGiveFeedback(self: Separator_22$23, e: GiveFeedbackEventArgs)
"""
pass
def OnPreviewGotKeyboardFocus(self, *args): #cannot find CLR method
"""
OnPreviewGotKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewGotKeyboardFocus
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains the event
data.
OnPreviewGotKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnPreviewGotKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnPreviewKeyDown(self, *args): #cannot find CLR method
"""
OnPreviewKeyDown(self: UIElement, e: KeyEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewKeyDown attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyEventArgs that contains the event data.
OnPreviewKeyDown(self: Window_16$17, e: KeyEventArgs)
OnPreviewKeyDown(self: Label_17$18, e: KeyEventArgs)
OnPreviewKeyDown(self: TextBox_18$19, e: KeyEventArgs)
OnPreviewKeyDown(self: Button_19$20, e: KeyEventArgs)
OnPreviewKeyDown(self: CheckBox_20$21, e: KeyEventArgs)
OnPreviewKeyDown(self: ComboBox_21$22, e: KeyEventArgs)
OnPreviewKeyDown(self: Separator_22$23, e: KeyEventArgs)
"""
pass
def OnPreviewKeyUp(self, *args): #cannot find CLR method
"""
OnPreviewKeyUp(self: UIElement, e: KeyEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewKeyUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyEventArgs that contains the event data.
OnPreviewKeyUp(self: Window_16$17, e: KeyEventArgs)
OnPreviewKeyUp(self: Label_17$18, e: KeyEventArgs)
OnPreviewKeyUp(self: TextBox_18$19, e: KeyEventArgs)
OnPreviewKeyUp(self: Button_19$20, e: KeyEventArgs)
OnPreviewKeyUp(self: CheckBox_20$21, e: KeyEventArgs)
OnPreviewKeyUp(self: ComboBox_21$22, e: KeyEventArgs)
OnPreviewKeyUp(self: Separator_22$23, e: KeyEventArgs)
"""
pass
def OnPreviewLostKeyboardFocus(self, *args): #cannot find CLR method
"""
OnPreviewLostKeyboardFocus(self: UIElement, e: KeyboardFocusChangedEventArgs)
Invoked when an unhandled System.Windows.Input.Keyboard.PreviewLostKeyboardFocus
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.KeyboardFocusChangedEventArgs that contains the event
data.
OnPreviewLostKeyboardFocus(self: Window_16$17, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: Label_17$18, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: TextBox_18$19, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: Button_19$20, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: CheckBox_20$21, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: ComboBox_21$22, e: KeyboardFocusChangedEventArgs)
OnPreviewLostKeyboardFocus(self: Separator_22$23, e: KeyboardFocusChangedEventArgs)
"""
pass
def OnPreviewMouseDown(self, *args): #cannot find CLR method
"""
OnPreviewMouseDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseDown attached
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that one or more mouse buttons were pressed.
OnPreviewMouseDown(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseLeftButtonDown(self, *args): #cannot find CLR method
"""
OnPreviewMouseLeftButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseLeftButtonDown
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was pressed.
OnPreviewMouseLeftButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseLeftButtonUp(self, *args): #cannot find CLR method
"""
OnPreviewMouseLeftButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseLeftButtonUp
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the left mouse button was released.
OnPreviewMouseLeftButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseLeftButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseMove(self, *args): #cannot find CLR method
"""
OnPreviewMouseMove(self: UIElement, e: MouseEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseMove attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseEventArgs that contains the event data.
OnPreviewMouseMove(self: Window_16$17, e: MouseEventArgs)
OnPreviewMouseMove(self: Label_17$18, e: MouseEventArgs)
OnPreviewMouseMove(self: TextBox_18$19, e: MouseEventArgs)
OnPreviewMouseMove(self: Button_19$20, e: MouseEventArgs)
OnPreviewMouseMove(self: CheckBox_20$21, e: MouseEventArgs)
OnPreviewMouseMove(self: ComboBox_21$22, e: MouseEventArgs)
OnPreviewMouseMove(self: Separator_22$23, e: MouseEventArgs)
"""
pass
def OnPreviewMouseRightButtonDown(self, *args): #cannot find CLR method
"""
OnPreviewMouseRightButtonDown(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseRightButtonDown
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was pressed.
OnPreviewMouseRightButtonDown(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonDown(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseRightButtonUp(self, *args): #cannot find CLR method
"""
OnPreviewMouseRightButtonUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.UIElement.PreviewMouseRightButtonUp
routed event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that the right mouse button was released.
OnPreviewMouseRightButtonUp(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseRightButtonUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseUp(self, *args): #cannot find CLR method
"""
OnPreviewMouseUp(self: UIElement, e: MouseButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseButtonEventArgs that contains the event data. The
event data reports that one or more mouse buttons were released.
OnPreviewMouseUp(self: Window_16$17, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: Label_17$18, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: TextBox_18$19, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: Button_19$20, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: CheckBox_20$21, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: ComboBox_21$22, e: MouseButtonEventArgs)
OnPreviewMouseUp(self: Separator_22$23, e: MouseButtonEventArgs)
"""
pass
def OnPreviewMouseWheel(self, *args): #cannot find CLR method
"""
OnPreviewMouseWheel(self: UIElement, e: MouseWheelEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.PreviewMouseWheel attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.MouseWheelEventArgs that contains the event data.
OnPreviewMouseWheel(self: Window_16$17, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: Label_17$18, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: TextBox_18$19, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: Button_19$20, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: CheckBox_20$21, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: ComboBox_21$22, e: MouseWheelEventArgs)
OnPreviewMouseWheel(self: Separator_22$23, e: MouseWheelEventArgs)
"""
pass
def OnPreviewQueryContinueDrag(self, *args): #cannot find CLR method
"""
OnPreviewQueryContinueDrag(self: UIElement, e: QueryContinueDragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.PreviewQueryContinueDrag
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.QueryContinueDragEventArgs that contains the event data.
OnPreviewQueryContinueDrag(self: Window_16$17, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: Label_17$18, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: TextBox_18$19, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: Button_19$20, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: CheckBox_20$21, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: ComboBox_21$22, e: QueryContinueDragEventArgs)
OnPreviewQueryContinueDrag(self: Separator_22$23, e: QueryContinueDragEventArgs)
"""
pass
def OnPreviewStylusButtonDown(self, *args): #cannot find CLR method
"""
OnPreviewStylusButtonDown(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusButtonDown
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnPreviewStylusButtonDown(self: Window_16$17, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: Label_17$18, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: TextBox_18$19, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: Button_19$20, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnPreviewStylusButtonDown(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnPreviewStylusButtonUp(self, *args): #cannot find CLR method
"""
OnPreviewStylusButtonUp(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusButtonUp
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnPreviewStylusButtonUp(self: Window_16$17, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: Label_17$18, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: TextBox_18$19, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: Button_19$20, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnPreviewStylusButtonUp(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnPreviewStylusDown(self, *args): #cannot find CLR method
"""
OnPreviewStylusDown(self: UIElement, e: StylusDownEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusDown
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusDownEventArgs that contains the event data.
OnPreviewStylusDown(self: Window_16$17, e: StylusDownEventArgs)
OnPreviewStylusDown(self: Label_17$18, e: StylusDownEventArgs)
OnPreviewStylusDown(self: TextBox_18$19, e: StylusDownEventArgs)
OnPreviewStylusDown(self: Button_19$20, e: StylusDownEventArgs)
OnPreviewStylusDown(self: CheckBox_20$21, e: StylusDownEventArgs)
OnPreviewStylusDown(self: ComboBox_21$22, e: StylusDownEventArgs)
OnPreviewStylusDown(self: Separator_22$23, e: StylusDownEventArgs)
"""
pass
def OnPreviewStylusInAirMove(self, *args): #cannot find CLR method
"""
OnPreviewStylusInAirMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusInAirMove
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusInAirMove(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusInAirMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusInRange(self, *args): #cannot find CLR method
"""
OnPreviewStylusInRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusInRange
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusInRange(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusInRange(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusInRange(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusInRange(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusInRange(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusInRange(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusInRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusMove(self, *args): #cannot find CLR method
"""
OnPreviewStylusMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusMove
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusMove(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusMove(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusMove(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusMove(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusMove(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusMove(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusOutOfRange(self, *args): #cannot find CLR method
"""
OnPreviewStylusOutOfRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusOutOfRange
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusOutOfRange(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusOutOfRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewStylusSystemGesture(self, *args): #cannot find CLR method
"""
OnPreviewStylusSystemGesture(self: UIElement, e: StylusSystemGestureEventArgs)
Invoked when an unhandled
System.Windows.Input.Stylus.PreviewStylusSystemGesture attached event reaches
an element in its route that is derived from this class. Implement this method
to add class handling for this event.
e: The System.Windows.Input.StylusSystemGestureEventArgs that contains the event
data.
OnPreviewStylusSystemGesture(self: Window_16$17, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: Label_17$18, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: TextBox_18$19, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: Button_19$20, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: CheckBox_20$21, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: ComboBox_21$22, e: StylusSystemGestureEventArgs)
OnPreviewStylusSystemGesture(self: Separator_22$23, e: StylusSystemGestureEventArgs)
"""
pass
def OnPreviewStylusUp(self, *args): #cannot find CLR method
"""
OnPreviewStylusUp(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.PreviewStylusUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnPreviewStylusUp(self: Window_16$17, e: StylusEventArgs)
OnPreviewStylusUp(self: Label_17$18, e: StylusEventArgs)
OnPreviewStylusUp(self: TextBox_18$19, e: StylusEventArgs)
OnPreviewStylusUp(self: Button_19$20, e: StylusEventArgs)
OnPreviewStylusUp(self: CheckBox_20$21, e: StylusEventArgs)
OnPreviewStylusUp(self: ComboBox_21$22, e: StylusEventArgs)
OnPreviewStylusUp(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnPreviewTextInput(self, *args): #cannot find CLR method
"""
OnPreviewTextInput(self: UIElement, e: TextCompositionEventArgs)
Invoked when an unhandled
System.Windows.Input.TextCompositionManager.PreviewTextInput attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.TextCompositionEventArgs that contains the event data.
OnPreviewTextInput(self: Window_16$17, e: TextCompositionEventArgs)
OnPreviewTextInput(self: Label_17$18, e: TextCompositionEventArgs)
OnPreviewTextInput(self: TextBox_18$19, e: TextCompositionEventArgs)
OnPreviewTextInput(self: Button_19$20, e: TextCompositionEventArgs)
OnPreviewTextInput(self: CheckBox_20$21, e: TextCompositionEventArgs)
OnPreviewTextInput(self: ComboBox_21$22, e: TextCompositionEventArgs)
OnPreviewTextInput(self: Separator_22$23, e: TextCompositionEventArgs)
"""
pass
def OnPreviewTouchDown(self, *args): #cannot find CLR method
"""
OnPreviewTouchDown(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.PreviewTouchDown
routed event that occurs when a touch presses this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnPreviewTouchDown(self: Window_16$17, e: TouchEventArgs)
OnPreviewTouchDown(self: Label_17$18, e: TouchEventArgs)
OnPreviewTouchDown(self: TextBox_18$19, e: TouchEventArgs)
OnPreviewTouchDown(self: Button_19$20, e: TouchEventArgs)
OnPreviewTouchDown(self: CheckBox_20$21, e: TouchEventArgs)
OnPreviewTouchDown(self: ComboBox_21$22, e: TouchEventArgs)
OnPreviewTouchDown(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnPreviewTouchMove(self, *args): #cannot find CLR method
"""
OnPreviewTouchMove(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.PreviewTouchMove
routed event that occurs when a touch moves while inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnPreviewTouchMove(self: Window_16$17, e: TouchEventArgs)
OnPreviewTouchMove(self: Label_17$18, e: TouchEventArgs)
OnPreviewTouchMove(self: TextBox_18$19, e: TouchEventArgs)
OnPreviewTouchMove(self: Button_19$20, e: TouchEventArgs)
OnPreviewTouchMove(self: CheckBox_20$21, e: TouchEventArgs)
OnPreviewTouchMove(self: ComboBox_21$22, e: TouchEventArgs)
OnPreviewTouchMove(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnPreviewTouchUp(self, *args): #cannot find CLR method
"""
OnPreviewTouchUp(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.PreviewTouchUp routed
event that occurs when a touch is released inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnPreviewTouchUp(self: Window_16$17, e: TouchEventArgs)
OnPreviewTouchUp(self: Label_17$18, e: TouchEventArgs)
OnPreviewTouchUp(self: TextBox_18$19, e: TouchEventArgs)
OnPreviewTouchUp(self: Button_19$20, e: TouchEventArgs)
OnPreviewTouchUp(self: CheckBox_20$21, e: TouchEventArgs)
OnPreviewTouchUp(self: ComboBox_21$22, e: TouchEventArgs)
OnPreviewTouchUp(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnPropertyChanged(self, *args): #cannot find CLR method
"""
OnPropertyChanged(self: ActiveXHost, e: DependencyPropertyChangedEventArgs)
Invoked whenever the effective value of any dependency property on this
System.Windows.FrameworkElement has been updated. The specific dependency
property that changed is reported in the arguments parameter. Overrides
System.Windows.DependencyObject.OnPropertyChanged(System.Windows.DependencyPropertyChangedEventArgs).
e: The event data that describes the property that changed, as well as old and new
values.
"""
pass
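# OnPropertyChanged above fires only for *effective* value changes and carries the
# old and new values in its event args. A hedged pure-Python stand-in (all names
# below are hypothetical, not the real dependency-property system) of that contract:

```python
class DependencyPropertyChangedEventArgs:
    """Stand-in: identifies the property plus its old and new values."""
    def __init__(self, prop, old, new):
        self.Property, self.OldValue, self.NewValue = prop, old, new

class Element:
    def __init__(self):
        self._values = {}
        self.changes = []

    def SetValue(self, prop, value):
        # Notify only when the effective value actually changes.
        old = self._values.get(prop)
        if old != value:
            self._values[prop] = value
            self.OnPropertyChanged(
                DependencyPropertyChangedEventArgs(prop, old, value))

    def OnPropertyChanged(self, e):
        # A derived class would react here; this one just records the change.
        self.changes.append((e.Property, e.OldValue, e.NewValue))

el = Element()
el.SetValue("Width", 100)
el.SetValue("Width", 100)  # no effective change, so no notification
el.SetValue("Width", 120)
```

# The real WPF property engine resolves the effective value from many sources
# (local value, styles, inheritance); the sketch keeps only the change-detection
# behavior the docstring describes.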
def OnQueryContinueDrag(self, *args): #cannot find CLR method
"""
OnQueryContinueDrag(self: UIElement, e: QueryContinueDragEventArgs)
Invoked when an unhandled System.Windows.DragDrop.QueryContinueDrag attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.QueryContinueDragEventArgs that contains the event data.
OnQueryContinueDrag(self: Window_16$17, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: Label_17$18, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: TextBox_18$19, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: Button_19$20, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: CheckBox_20$21, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: ComboBox_21$22, e: QueryContinueDragEventArgs)
OnQueryContinueDrag(self: Separator_22$23, e: QueryContinueDragEventArgs)
"""
pass
def OnQueryCursor(self, *args): #cannot find CLR method
"""
OnQueryCursor(self: UIElement, e: QueryCursorEventArgs)
Invoked when an unhandled System.Windows.Input.Mouse.QueryCursor attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.QueryCursorEventArgs that contains the event data.
OnQueryCursor(self: Window_16$17, e: QueryCursorEventArgs)
OnQueryCursor(self: Label_17$18, e: QueryCursorEventArgs)
OnQueryCursor(self: TextBox_18$19, e: QueryCursorEventArgs)
OnQueryCursor(self: Button_19$20, e: QueryCursorEventArgs)
OnQueryCursor(self: CheckBox_20$21, e: QueryCursorEventArgs)
OnQueryCursor(self: ComboBox_21$22, e: QueryCursorEventArgs)
OnQueryCursor(self: Separator_22$23, e: QueryCursorEventArgs)
"""
pass
def OnRender(self, *args): #cannot find CLR method
"""
OnRender(self: UIElement, drawingContext: DrawingContext)
When overridden in a derived class, participates in rendering operations that
are directed by the layout system. The rendering instructions for this element
are not used directly when this method is invoked, and are instead preserved
for later asynchronous use by layout and drawing.
drawingContext: The drawing instructions for a specific element. This context is provided to
the layout system.
OnRender(self: Window_16$17, drawingContext: DrawingContext)
OnRender(self: Label_17$18, drawingContext: DrawingContext)
OnRender(self: TextBox_18$19, drawingContext: DrawingContext)
OnRender(self: Button_19$20, drawingContext: DrawingContext)
OnRender(self: CheckBox_20$21, drawingContext: DrawingContext)
OnRender(self: ComboBox_21$22, drawingContext: DrawingContext)
OnRender(self: Separator_22$23, drawingContext: DrawingContext)
"""
pass
def OnRenderSizeChanged(self, *args): #cannot find CLR method
"""
OnRenderSizeChanged(self: FrameworkElement, sizeInfo: SizeChangedInfo)
Raises the System.Windows.FrameworkElement.SizeChanged event, using the
specified information as part of the eventual event data.
sizeInfo: Details of the old and new size involved in the change.
OnRenderSizeChanged(self: Window_16$17, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: Label_17$18, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: TextBox_18$19, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: Button_19$20, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: CheckBox_20$21, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: ComboBox_21$22, sizeInfo: SizeChangedInfo)
OnRenderSizeChanged(self: Separator_22$23, sizeInfo: SizeChangedInfo)
"""
pass
def OnStyleChanged(self, *args): #cannot find CLR method
"""
OnStyleChanged(self: FrameworkElement, oldStyle: Style, newStyle: Style)
Invoked when the style in use on this element changes, which will invalidate
the layout.
oldStyle: The old style.
newStyle: The new style.
OnStyleChanged(self: Window_16$17, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: Label_17$18, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: TextBox_18$19, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: Button_19$20, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: CheckBox_20$21, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: ComboBox_21$22, oldStyle: Style, newStyle: Style)
OnStyleChanged(self: Separator_22$23, oldStyle: Style, newStyle: Style)
"""
pass
def OnStylusButtonDown(self, *args): #cannot find CLR method
"""
OnStylusButtonDown(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusButtonDown attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnStylusButtonDown(self: Window_16$17, e: StylusButtonEventArgs)
OnStylusButtonDown(self: Label_17$18, e: StylusButtonEventArgs)
OnStylusButtonDown(self: TextBox_18$19, e: StylusButtonEventArgs)
OnStylusButtonDown(self: Button_19$20, e: StylusButtonEventArgs)
OnStylusButtonDown(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnStylusButtonDown(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnStylusButtonDown(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnStylusButtonUp(self, *args): #cannot find CLR method
"""
OnStylusButtonUp(self: UIElement, e: StylusButtonEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusButtonUp attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusButtonEventArgs that contains the event data.
OnStylusButtonUp(self: Window_16$17, e: StylusButtonEventArgs)
OnStylusButtonUp(self: Label_17$18, e: StylusButtonEventArgs)
OnStylusButtonUp(self: TextBox_18$19, e: StylusButtonEventArgs)
OnStylusButtonUp(self: Button_19$20, e: StylusButtonEventArgs)
OnStylusButtonUp(self: CheckBox_20$21, e: StylusButtonEventArgs)
OnStylusButtonUp(self: ComboBox_21$22, e: StylusButtonEventArgs)
OnStylusButtonUp(self: Separator_22$23, e: StylusButtonEventArgs)
"""
pass
def OnStylusDown(self, *args): #cannot find CLR method
"""
OnStylusDown(self: UIElement, e: StylusDownEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusDown attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.StylusDownEventArgs that contains the event data.
OnStylusDown(self: Window_16$17, e: StylusDownEventArgs)
OnStylusDown(self: Label_17$18, e: StylusDownEventArgs)
OnStylusDown(self: TextBox_18$19, e: StylusDownEventArgs)
OnStylusDown(self: Button_19$20, e: StylusDownEventArgs)
OnStylusDown(self: CheckBox_20$21, e: StylusDownEventArgs)
OnStylusDown(self: ComboBox_21$22, e: StylusDownEventArgs)
OnStylusDown(self: Separator_22$23, e: StylusDownEventArgs)
"""
pass
def OnStylusEnter(self, *args): #cannot find CLR method
"""
OnStylusEnter(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusEnter attached
event is raised by this element. Implement this method to add class handling
for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusEnter(self: Window_16$17, e: StylusEventArgs)
OnStylusEnter(self: Label_17$18, e: StylusEventArgs)
OnStylusEnter(self: TextBox_18$19, e: StylusEventArgs)
OnStylusEnter(self: Button_19$20, e: StylusEventArgs)
OnStylusEnter(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusEnter(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusEnter(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusInAirMove(self, *args): #cannot find CLR method
"""
OnStylusInAirMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusInAirMove attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusInAirMove(self: Window_16$17, e: StylusEventArgs)
OnStylusInAirMove(self: Label_17$18, e: StylusEventArgs)
OnStylusInAirMove(self: TextBox_18$19, e: StylusEventArgs)
OnStylusInAirMove(self: Button_19$20, e: StylusEventArgs)
OnStylusInAirMove(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusInAirMove(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusInAirMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusInRange(self, *args): #cannot find CLR method
"""
OnStylusInRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusInRange attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusInRange(self: Window_16$17, e: StylusEventArgs)
OnStylusInRange(self: Label_17$18, e: StylusEventArgs)
OnStylusInRange(self: TextBox_18$19, e: StylusEventArgs)
OnStylusInRange(self: Button_19$20, e: StylusEventArgs)
OnStylusInRange(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusInRange(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusInRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusLeave(self, *args): #cannot find CLR method
"""
OnStylusLeave(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusLeave attached
event is raised by this element. Implement this method to add class handling
for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusLeave(self: Window_16$17, e: StylusEventArgs)
OnStylusLeave(self: Label_17$18, e: StylusEventArgs)
OnStylusLeave(self: TextBox_18$19, e: StylusEventArgs)
OnStylusLeave(self: Button_19$20, e: StylusEventArgs)
OnStylusLeave(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusLeave(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusLeave(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusMove(self, *args): #cannot find CLR method
"""
OnStylusMove(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusMove attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusMove(self: Window_16$17, e: StylusEventArgs)
OnStylusMove(self: Label_17$18, e: StylusEventArgs)
OnStylusMove(self: TextBox_18$19, e: StylusEventArgs)
OnStylusMove(self: Button_19$20, e: StylusEventArgs)
OnStylusMove(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusMove(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusMove(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusOutOfRange(self, *args): #cannot find CLR method
"""
OnStylusOutOfRange(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusOutOfRange attached
event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusOutOfRange(self: Window_16$17, e: StylusEventArgs)
OnStylusOutOfRange(self: Label_17$18, e: StylusEventArgs)
OnStylusOutOfRange(self: TextBox_18$19, e: StylusEventArgs)
OnStylusOutOfRange(self: Button_19$20, e: StylusEventArgs)
OnStylusOutOfRange(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusOutOfRange(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusOutOfRange(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnStylusSystemGesture(self, *args): #cannot find CLR method
"""
OnStylusSystemGesture(self: UIElement, e: StylusSystemGestureEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusSystemGesture
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.StylusSystemGestureEventArgs that contains the event
data.
OnStylusSystemGesture(self: Window_16$17, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: Label_17$18, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: TextBox_18$19, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: Button_19$20, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: CheckBox_20$21, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: ComboBox_21$22, e: StylusSystemGestureEventArgs)
OnStylusSystemGesture(self: Separator_22$23, e: StylusSystemGestureEventArgs)
"""
pass
def OnStylusUp(self, *args): #cannot find CLR method
"""
OnStylusUp(self: UIElement, e: StylusEventArgs)
Invoked when an unhandled System.Windows.Input.Stylus.StylusUp attached event
reaches an element in its route that is derived from this class. Implement this
method to add class handling for this event.
e: The System.Windows.Input.StylusEventArgs that contains the event data.
OnStylusUp(self: Window_16$17, e: StylusEventArgs)
OnStylusUp(self: Label_17$18, e: StylusEventArgs)
OnStylusUp(self: TextBox_18$19, e: StylusEventArgs)
OnStylusUp(self: Button_19$20, e: StylusEventArgs)
OnStylusUp(self: CheckBox_20$21, e: StylusEventArgs)
OnStylusUp(self: ComboBox_21$22, e: StylusEventArgs)
OnStylusUp(self: Separator_22$23, e: StylusEventArgs)
"""
pass
def OnTextInput(self, *args): #cannot find CLR method
"""
OnTextInput(self: UIElement, e: TextCompositionEventArgs)
Invoked when an unhandled System.Windows.Input.TextCompositionManager.TextInput
attached event reaches an element in its route that is derived from this class.
Implement this method to add class handling for this event.
e: The System.Windows.Input.TextCompositionEventArgs that contains the event data.
OnTextInput(self: Window_16$17, e: TextCompositionEventArgs)
OnTextInput(self: Label_17$18, e: TextCompositionEventArgs)
OnTextInput(self: TextBox_18$19, e: TextCompositionEventArgs)
OnTextInput(self: Button_19$20, e: TextCompositionEventArgs)
OnTextInput(self: CheckBox_20$21, e: TextCompositionEventArgs)
OnTextInput(self: ComboBox_21$22, e: TextCompositionEventArgs)
OnTextInput(self: Separator_22$23, e: TextCompositionEventArgs)
"""
pass
def OnToolTipClosing(self, *args): #cannot find CLR method
"""
OnToolTipClosing(self: FrameworkElement, e: ToolTipEventArgs)
Invoked whenever an unhandled System.Windows.FrameworkElement.ToolTipClosing
routed event reaches this class in its route. Implement this method to add
class handling for this event.
e: Provides data about the event.
OnToolTipClosing(self: Window_16$17, e: ToolTipEventArgs)
OnToolTipClosing(self: Label_17$18, e: ToolTipEventArgs)
OnToolTipClosing(self: TextBox_18$19, e: ToolTipEventArgs)
OnToolTipClosing(self: Button_19$20, e: ToolTipEventArgs)
OnToolTipClosing(self: CheckBox_20$21, e: ToolTipEventArgs)
OnToolTipClosing(self: ComboBox_21$22, e: ToolTipEventArgs)
OnToolTipClosing(self: Separator_22$23, e: ToolTipEventArgs)
"""
pass
def OnToolTipOpening(self, *args): #cannot find CLR method
"""
OnToolTipOpening(self: FrameworkElement, e: ToolTipEventArgs)
Invoked whenever the System.Windows.FrameworkElement.ToolTipOpening routed
event reaches this class in its route. Implement this method to add class
handling for this event.
e: Provides data about the event.
OnToolTipOpening(self: Window_16$17, e: ToolTipEventArgs)
OnToolTipOpening(self: Label_17$18, e: ToolTipEventArgs)
OnToolTipOpening(self: TextBox_18$19, e: ToolTipEventArgs)
OnToolTipOpening(self: Button_19$20, e: ToolTipEventArgs)
OnToolTipOpening(self: CheckBox_20$21, e: ToolTipEventArgs)
OnToolTipOpening(self: ComboBox_21$22, e: ToolTipEventArgs)
OnToolTipOpening(self: Separator_22$23, e: ToolTipEventArgs)
"""
pass
def OnTouchDown(self, *args): #cannot find CLR method
"""
OnTouchDown(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchDown routed event
that occurs when a touch presses inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchDown(self: Window_16$17, e: TouchEventArgs)
OnTouchDown(self: Label_17$18, e: TouchEventArgs)
OnTouchDown(self: TextBox_18$19, e: TouchEventArgs)
OnTouchDown(self: Button_19$20, e: TouchEventArgs)
OnTouchDown(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchDown(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchDown(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchEnter(self, *args): #cannot find CLR method
"""
OnTouchEnter(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchEnter routed
event that occurs when a touch moves from outside to inside the bounds of this
element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchEnter(self: Window_16$17, e: TouchEventArgs)
OnTouchEnter(self: Label_17$18, e: TouchEventArgs)
OnTouchEnter(self: TextBox_18$19, e: TouchEventArgs)
OnTouchEnter(self: Button_19$20, e: TouchEventArgs)
OnTouchEnter(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchEnter(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchEnter(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchLeave(self, *args): #cannot find CLR method
"""
OnTouchLeave(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchLeave routed
event that occurs when a touch moves from inside to outside the bounds of this
System.Windows.UIElement.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchLeave(self: Window_16$17, e: TouchEventArgs)
OnTouchLeave(self: Label_17$18, e: TouchEventArgs)
OnTouchLeave(self: TextBox_18$19, e: TouchEventArgs)
OnTouchLeave(self: Button_19$20, e: TouchEventArgs)
OnTouchLeave(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchLeave(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchLeave(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchMove(self, *args): #cannot find CLR method
"""
OnTouchMove(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchMove routed event
that occurs when a touch moves while inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchMove(self: Window_16$17, e: TouchEventArgs)
OnTouchMove(self: Label_17$18, e: TouchEventArgs)
OnTouchMove(self: TextBox_18$19, e: TouchEventArgs)
OnTouchMove(self: Button_19$20, e: TouchEventArgs)
OnTouchMove(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchMove(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchMove(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnTouchUp(self, *args): #cannot find CLR method
"""
OnTouchUp(self: UIElement, e: TouchEventArgs)
Provides class handling for the System.Windows.UIElement.TouchUp routed event
that occurs when a touch is released inside this element.
e: A System.Windows.Input.TouchEventArgs that contains the event data.
OnTouchUp(self: Window_16$17, e: TouchEventArgs)
OnTouchUp(self: Label_17$18, e: TouchEventArgs)
OnTouchUp(self: TextBox_18$19, e: TouchEventArgs)
OnTouchUp(self: Button_19$20, e: TouchEventArgs)
OnTouchUp(self: CheckBox_20$21, e: TouchEventArgs)
OnTouchUp(self: ComboBox_21$22, e: TouchEventArgs)
OnTouchUp(self: Separator_22$23, e: TouchEventArgs)
"""
pass
def OnVisualChildrenChanged(self, *args): #cannot find CLR method
"""
OnVisualChildrenChanged(self: Visual, visualAdded: DependencyObject, visualRemoved: DependencyObject)
Called when the System.Windows.Media.VisualCollection of the visual object is
modified.
visualAdded: The System.Windows.Media.Visual that was added to the collection
visualRemoved: The System.Windows.Media.Visual that was removed from the collection
OnVisualChildrenChanged(self: Window_16$17, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: Label_17$18, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: TextBox_18$19, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: Button_19$20, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: CheckBox_20$21, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: ComboBox_21$22, visualAdded: DependencyObject, visualRemoved: DependencyObject)
OnVisualChildrenChanged(self: Separator_22$23, visualAdded: DependencyObject, visualRemoved: DependencyObject)
"""
pass
def OnVisualParentChanged(self, *args): #cannot find CLR method
"""
OnVisualParentChanged(self: FrameworkElement, oldParent: DependencyObject)
Invoked when the parent of this element in the visual tree is changed.
Overrides
System.Windows.UIElement.OnVisualParentChanged(System.Windows.DependencyObject).
oldParent: The old parent element. May be null to indicate that the element did not have a
visual parent previously.
OnVisualParentChanged(self: Window_16$17, oldParent: DependencyObject)
OnVisualParentChanged(self: Label_17$18, oldParent: DependencyObject)
OnVisualParentChanged(self: TextBox_18$19, oldParent: DependencyObject)
OnVisualParentChanged(self: Button_19$20, oldParent: DependencyObject)
OnVisualParentChanged(self: CheckBox_20$21, oldParent: DependencyObject)
OnVisualParentChanged(self: ComboBox_21$22, oldParent: DependencyObject)
OnVisualParentChanged(self: Separator_22$23, oldParent: DependencyObject)
"""
pass
def OnWindowPositionChanged(self, *args): #cannot find CLR method
"""
OnWindowPositionChanged(self: ActiveXHost, bounds: Rect)
Called when the hosted window's position changes.
bounds: The window's position.
"""
pass
def ParentLayoutInvalidated(self, *args): #cannot find CLR method
"""
ParentLayoutInvalidated(self: FrameworkElement, child: UIElement)
Supports incremental layout implementations in specialized subclasses of
System.Windows.FrameworkElement.
System.Windows.FrameworkElement.ParentLayoutInvalidated(System.Windows.UIElement)
is invoked when a child element has invalidated a property that is marked in
metadata as affecting the parent's measure or arrange passes during layout.
child: The child element reporting the change.
ParentLayoutInvalidated(self: Window_16$17, child: UIElement)
ParentLayoutInvalidated(self: Label_17$18, child: UIElement)
ParentLayoutInvalidated(self: TextBox_18$19, child: UIElement)
ParentLayoutInvalidated(self: Button_19$20, child: UIElement)
ParentLayoutInvalidated(self: CheckBox_20$21, child: UIElement)
ParentLayoutInvalidated(self: ComboBox_21$22, child: UIElement)
ParentLayoutInvalidated(self: Separator_22$23, child: UIElement)
"""
pass
def RegisterKeyboardInputSinkCore(self, *args): #cannot find CLR method
"""
RegisterKeyboardInputSinkCore(self: HwndHost, sink: IKeyboardInputSink) -> IKeyboardInputSite
Registers the System.Windows.Interop.IKeyboardInputSink interface of a
contained component.
sink: The System.Windows.Interop.IKeyboardInputSink sink of the contained component.
Returns: The System.Windows.Interop.IKeyboardInputSite site of the contained component.
"""
pass
def RemoveLogicalChild(self, *args): #cannot find CLR method
"""
RemoveLogicalChild(self: FrameworkElement, child: object)
Removes the provided object from this element's logical tree.
System.Windows.FrameworkElement updates the affected logical tree parent
pointers to keep in sync with this deletion.
child: The element to remove.
RemoveLogicalChild(self: Window_16$17, child: object)
RemoveLogicalChild(self: Label_17$18, child: object)
RemoveLogicalChild(self: TextBox_18$19, child: object)
RemoveLogicalChild(self: Button_19$20, child: object)
RemoveLogicalChild(self: CheckBox_20$21, child: object)
RemoveLogicalChild(self: ComboBox_21$22, child: object)
RemoveLogicalChild(self: Separator_22$23, child: object)
"""
pass
def RemoveVisualChild(self, *args): #cannot find CLR method
"""
RemoveVisualChild(self: Visual, child: Visual)
Removes the parent-child relationship between two visuals.
child: The child visual object to remove from the parent visual.
RemoveVisualChild(self: Window_16$17, child: Window_16$17)
RemoveVisualChild(self: Label_17$18, child: Label_17$18)
RemoveVisualChild(self: TextBox_18$19, child: TextBox_18$19)
RemoveVisualChild(self: Button_19$20, child: Button_19$20)
RemoveVisualChild(self: CheckBox_20$21, child: CheckBox_20$21)
RemoveVisualChild(self: ComboBox_21$22, child: ComboBox_21$22)
RemoveVisualChild(self: Separator_22$23, child: Separator_22$23)
"""
pass
def ShouldSerializeProperty(self, *args): #cannot find CLR method
"""
ShouldSerializeProperty(self: DependencyObject, dp: DependencyProperty) -> bool
Returns a value that indicates whether serialization processes should serialize
the value for the provided dependency property.
dp: The identifier for the dependency property that should be serialized.
Returns: true if the dependency property that is supplied should be value-serialized;
otherwise, false.
ShouldSerializeProperty(self: Window_16$17, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: Label_17$18, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: TextBox_18$19, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: Button_19$20, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: CheckBox_20$21, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: ComboBox_21$22, dp: DependencyProperty) -> bool
ShouldSerializeProperty(self: Separator_22$23, dp: DependencyProperty) -> bool
"""
pass
def TabIntoCore(self, *args): #cannot find CLR method
"""
TabIntoCore(self: HwndHost, request: TraversalRequest) -> bool
Sets focus on either the first tab stop or the last tab stop of the sink.
request: Specifies whether focus should be set to the first or the last tab stop.
Returns: Always returns false.
"""
pass
def TranslateAcceleratorCore(self, *args): #cannot find CLR method
"""
TranslateAcceleratorCore(self: HwndHost, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes keyboard input at the keydown message level.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: Always returns false.
"""
pass
def TranslateCharCore(self, *args): #cannot find CLR method
"""
TranslateCharCore(self: HwndHost, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes WM_CHAR, WM_SYSCHAR, WM_DEADCHAR, and WM_SYSDEADCHAR input messages
before the
System.Windows.Interop.IKeyboardInputSink.OnMnemonic(System.Windows.Interop.MSG@,
System.Windows.Input.ModifierKeys) method is called.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: Always returns false.
"""
pass
def WndProc(self, *args): #cannot find CLR method
"""
WndProc(self: HwndHost, hwnd: IntPtr, msg: int, wParam: IntPtr, lParam: IntPtr, handled: bool) -> (IntPtr, bool)
When overridden in a derived class, accesses the window process (handle) of the
hosted child window.
hwnd: The window handle of the hosted window.
msg: The message to act upon.
wParam: Information that may be relevant to handling the message. This is typically
used to store small pieces of information, such as flags.
lParam: Information that may be relevant to handling the message. This is typically
used to reference an object.
handled: Whether the resulting events should be marked handled.
Returns: The window handle of the child window.
"""
pass
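# WndProc's (IntPtr, bool) return illustrates how IronPython surfaces .NET ref/out
# parameters: they come back as extra members of a result tuple rather than being
# mutated in place. A hedged pure-Python sketch (all names below are hypothetical
# stand-ins for the HwndHost machinery; the WM_* constants are the standard Win32
# message values):

```python
WM_PAINT = 0x000F
WM_CLOSE = 0x0010

class LoggingHost:
    """Stand-in for a derived host that watches raw window messages."""
    def __init__(self):
        self.closed = False

    def WndProc(self, hwnd, msg, wParam, lParam, handled):
        # Inspect the raw message; set handled for the ones we consume.
        if msg == WM_CLOSE:
            self.closed = True
            handled = True
        # The IntPtr result and the by-ref `handled` flag travel back
        # together as a tuple, mirroring the stub's (IntPtr, bool) return.
        return hwnd, handled

host = LoggingHost()
result, handled = host.WndProc(0x1234, WM_CLOSE, 0, 0, False)
```

# The same tuple convention explains the (bool, MSG) returns documented above for
# TranslateAcceleratorCore and TranslateCharCore: the ref MSG comes back alongside
# the boolean result.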
def __enter__(self, *args): #cannot find CLR method
""" __enter__(self: IDisposable) -> object """
pass
def __exit__(self, *args): #cannot find CLR method
""" __exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object) """
pass
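# The __enter__/__exit__ pair above means an IDisposable host can be used in a
# `with` statement, with Dispose called on exit. A hedged pure-Python stand-in
# (DisposableHost is hypothetical, not the real interop type) of that mapping:

```python
class DisposableHost:
    """Stand-in: maps IDisposable.Dispose onto the context-manager protocol."""
    def __init__(self):
        self.IsDisposed = False

    def Dispose(self):
        self.IsDisposed = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, exc_back):
        # Dispose runs whether or not the block raised, like a finally clause.
        self.Dispose()

with DisposableHost() as host:
    assert not host.IsDisposed   # still live inside the block
assert host.IsDisposed           # disposed on exit
```

# Using `with` this way guarantees the native window resources are released even
# when the body raises, which is why the stubs expose the protocol on IDisposable.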
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
DefaultStyleKey = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the key to use to reference the style for this control, when theme styles are used or defined.
"""
HasEffectiveKeyboardFocus = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
InheritanceBehavior = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the scope limits for property value inheritance, resource key lookup, and RelativeSource FindAncestor lookup.
"""
IsDisposed = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that indicates whether the System.Windows.Interop.ActiveXHost.Dispose(System.Boolean) method has been called on the System.Windows.Interop.ActiveXHost instance.
"""
IsEnabledCore = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that becomes the return value of System.Windows.UIElement.IsEnabled in derived classes.
"""
LogicalChildren = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets an enumerator for logical child elements of this element.
"""
StylusPlugIns = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a collection of all stylus plug-in (customization) objects associated with this element.
"""
VisualBitmapEffect = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Effects.BitmapEffect value for the System.Windows.Media.Visual.
"""
VisualBitmapEffectInput = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Effects.BitmapEffectInput value for the System.Windows.Media.Visual.
"""
VisualBitmapScalingMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.BitmapScalingMode for the System.Windows.Media.Visual.
"""
VisualCacheMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a cached representation of the System.Windows.Media.Visual.
"""
VisualChildrenCount = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the number of visual child elements within this element.
"""
VisualClearTypeHint = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.ClearTypeHint that determines how ClearType is rendered in the System.Windows.Media.Visual.
"""
VisualClip = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the clip region of the System.Windows.Media.Visual as a System.Windows.Media.Geometry value.
"""
VisualEdgeMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the edge mode of the System.Windows.Media.Visual as an System.Windows.Media.EdgeMode value.
"""
VisualEffect = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the bitmap effect to apply to the System.Windows.Media.Visual.
"""
VisualOffset = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the offset value of the visual object.
"""
VisualOpacity = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the opacity of the System.Windows.Media.Visual.
"""
VisualOpacityMask = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Brush value that represents the opacity mask of the System.Windows.Media.Visual.
"""
VisualParent = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the visual tree parent of the visual object.
"""
VisualScrollableAreaClip = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a clipped scrollable area for the System.Windows.Media.Visual.
"""
VisualTextHintingMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.TextHintingMode of the System.Windows.Media.Visual.
"""
VisualTextRenderingMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.TextRenderingMode of the System.Windows.Media.Visual.
"""
VisualTransform = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.Transform value for the System.Windows.Media.Visual.
"""
VisualXSnappingGuidelines = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the x-coordinate (vertical) guideline collection.
"""
VisualYSnappingGuidelines = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the y-coordinate (horizontal) guideline collection.
"""
class BrowserInteropHelper(object):
""" A helper class that provides information about the browser environment in which a XAML browser application (XBAP) application is hosted. """
ClientSite = None
HostScript = None
IsBrowserHosted = False
Source = None
__all__ = []
class ComponentDispatcher(object):
""" Enables shared control of the message pump between Win32 and WPF in interoperation scenarios. """
@staticmethod
def PopModal():
"""
PopModal()
Called to indicate that a modal thread is no longer modal.
"""
pass
@staticmethod
def PushModal():
"""
PushModal()
Called to indicate that the thread is modal.
"""
pass
@staticmethod
def RaiseIdle():
"""
RaiseIdle()
Called to indicate that a thread is idle.
"""
pass
@staticmethod
def RaiseThreadMessage(msg):
"""
RaiseThreadMessage(msg: MSG) -> (bool, MSG)
Indicates that a new message is available for possible handling.
msg: The message and its associated data.
Returns: true, if one of the modules listening to the message loop processed the
message. The owner of the message loop should ignore the message. false, if the
message was not processed. In this case, the owner of the message pump should
call the Win32 function TranslateMessage followed by DispatchMessage.
"""
pass
CurrentKeyboardMessage = None
EnterThreadModal = None
IsThreadModal = True
LeaveThreadModal = None
ThreadFilterMessage = None
ThreadIdle = None
ThreadPreprocessMessage = None
__all__ = [
'EnterThreadModal',
'LeaveThreadModal',
'PopModal',
'PushModal',
'RaiseIdle',
'RaiseThreadMessage',
'ThreadFilterMessage',
'ThreadIdle',
'ThreadPreprocessMessage',
]
class CursorInteropHelper(object):
""" Provides a static helper class for WPF/Win32 interoperation with one method, which is used to obtain a Windows Presentation Foundation (WPF)�System.Windows.Input.Cursor object based on a provided Win32 cursor handle. """
@staticmethod
def Create(cursorHandle):
"""
Create(cursorHandle: SafeHandle) -> Cursor
Returns a Windows Presentation Foundation (WPF) System.Windows.Input.Cursor
object based on a provided Win32 cursor handle.
cursorHandle: Cursor reference to use for interoperation.
Returns: The Windows Presentation Foundation (WPF) cursor object based on the provided
Win32 cursor handle.
"""
pass
__all__ = [
'Create',
]
class D3DImage(ImageSource, ISealable, IAnimatable, IResource, IFormattable, IAppDomainShutdownListener):
"""
An System.Windows.Media.ImageSource that displays a user-created Direct3D surface.
D3DImage()
D3DImage(dpiX: float, dpiY: float)
"""
def AddDirtyRect(self, dirtyRect):
"""
AddDirtyRect(self: D3DImage, dirtyRect: Int32Rect)
Specifies the area of the back buffer that changed.
dirtyRect: An System.Windows.Int32Rect that represents the area that changed.
"""
pass
def Clone(self):
"""
Clone(self: D3DImage) -> D3DImage
Creates a modifiable clone of this System.Windows.Interop.D3DImage object,
making deep copies of this object's values. When copying dependency properties,
this method copies resource references and data bindings (which may no longer
resolve), but not animations or their current values.
Returns: A modifiable clone of the current object. The cloned object's
System.Windows.Freezable.IsFrozen property will be false even if the source's
System.Windows.Freezable.IsFrozen property was true.
"""
pass
def CloneCore(self, *args): #cannot find CLR method
"""
CloneCore(self: D3DImage, sourceFreezable: Freezable)
sourceFreezable: The object to clone.
"""
pass
def CloneCurrentValue(self):
"""
CloneCurrentValue(self: D3DImage) -> D3DImage
Creates a modifiable clone of this System.Windows.Interop.D3DImage object,
making deep copies of this object's current values. Resource references, data
bindings, and animations are not copied, but their current values are copied.
Returns: A modifiable clone of the current object. The cloned object's
System.Windows.Freezable.IsFrozen property will be false even if the source's
System.Windows.Freezable.IsFrozen property was true.
"""
pass
def CloneCurrentValueCore(self, *args): #cannot find CLR method
"""
CloneCurrentValueCore(self: D3DImage, sourceFreezable: Freezable)
sourceFreezable: The System.Windows.Freezable to be cloned.
"""
pass
def CopyBackBuffer(self, *args): #cannot find CLR method
"""
CopyBackBuffer(self: D3DImage) -> BitmapSource
Creates a software copy of the System.Windows.Interop.D3DImage.
Returns: A System.Windows.Media.Imaging.BitmapSource that is a software copy of the
current state of the back buffer; otherwise, null if the back buffer cannot be
read.
"""
pass
def CreateInstance(self, *args): #cannot find CLR method
"""
CreateInstance(self: Freezable) -> Freezable
Initializes a new instance of the System.Windows.Freezable class.
Returns: The new instance.
"""
pass
def CreateInstanceCore(self, *args): #cannot find CLR method
"""
CreateInstanceCore(self: D3DImage) -> Freezable
When implemented in a derived class, creates a new instance of the
System.Windows.Interop.D3DImage derived class.
Returns: The new instance.
"""
pass
def FreezeCore(self, *args): #cannot find CLR method
"""
FreezeCore(self: D3DImage, isChecking: bool) -> bool
Makes the System.Windows.Interop.D3DImage unmodifiable or determines whether it
can be made unmodifiable.
isChecking: Has no effect.
Returns: false in all cases.
"""
pass
def GetAsFrozenCore(self, *args): #cannot find CLR method
"""
GetAsFrozenCore(self: D3DImage, sourceFreezable: Freezable)
sourceFreezable: The instance to copy.
"""
pass
def GetCurrentValueAsFrozenCore(self, *args): #cannot find CLR method
"""
GetCurrentValueAsFrozenCore(self: D3DImage, sourceFreezable: Freezable)
sourceFreezable: The System.Windows.Freezable to copy and freeze.
"""
pass
def Lock(self):
"""
Lock(self: D3DImage)
Locks the System.Windows.Interop.D3DImage and enables operations on the back
buffer.
"""
pass
def OnChanged(self, *args): #cannot find CLR method
"""
OnChanged(self: Freezable)
Called when the current System.Windows.Freezable object is modified.
"""
pass
def OnFreezablePropertyChanged(self, *args): #cannot find CLR method
"""
OnFreezablePropertyChanged(self: Freezable, oldValue: DependencyObject, newValue: DependencyObject, property: DependencyProperty)
This member supports the Windows Presentation Foundation (WPF) infrastructure
and is not intended to be used directly from your code.
oldValue: The previous value of the data member.
newValue: The current value of the data member.
property: The property that changed.
OnFreezablePropertyChanged(self: Freezable, oldValue: DependencyObject, newValue: DependencyObject)
Ensures that appropriate context pointers are established for a
System.Windows.DependencyObjectType data member that has just been set.
oldValue: The previous value of the data member.
newValue: The current value of the data member.
"""
pass
def OnPropertyChanged(self, *args): #cannot find CLR method
"""
OnPropertyChanged(self: Freezable, e: DependencyPropertyChangedEventArgs)
Overrides the System.Windows.DependencyObject implementation of
System.Windows.DependencyObject.OnPropertyChanged(System.Windows.DependencyPropertyChangedEventArgs)
to also invoke any System.Windows.Freezable.Changed handlers in response to a
changing dependency property of type System.Windows.Freezable.
e: Event data that contains information about which property changed, and its old
and new values.
"""
pass
def ReadPreamble(self, *args): #cannot find CLR method
"""
ReadPreamble(self: Freezable)
Ensures that the System.Windows.Freezable is being accessed from a valid
thread. Inheritors of System.Windows.Freezable must call this method at the
beginning of any API that reads data members that are not dependency
properties.
"""
pass
def SetBackBuffer(self, backBufferType, backBuffer, enableSoftwareFallback=None):
"""
SetBackBuffer(self: D3DImage, backBufferType: D3DResourceType, backBuffer: IntPtr, enableSoftwareFallback: bool)
SetBackBuffer(self: D3DImage, backBufferType: D3DResourceType, backBuffer: IntPtr)
Assigns a Direct3D surface as the source of the back buffer.
backBufferType: The type of Direct3D surface. Must be a valid
System.Windows.Interop.D3DResourceType.
backBuffer: The Direct3D surface to assign as the back buffer.
"""
pass
def ShouldSerializeProperty(self, *args): #cannot find CLR method
"""
ShouldSerializeProperty(self: DependencyObject, dp: DependencyProperty) -> bool
Returns a value that indicates whether serialization processes should serialize
the value for the provided dependency property.
dp: The identifier for the dependency property that should be serialized.
Returns: true if the dependency property that is supplied should be value-serialized;
otherwise, false.
"""
pass
def TryLock(self, timeout):
"""
TryLock(self: D3DImage, timeout: Duration) -> bool
Attempts to lock the System.Windows.Interop.D3DImage and waits for the
specified duration.
timeout: The duration to wait for the lock to be acquired.
Returns: true if the lock was acquired; otherwise, false.
"""
pass
def Unlock(self):
"""
Unlock(self: D3DImage)
Decrements the lock count for the System.Windows.Interop.D3DImage.
"""
pass
def WritePostscript(self, *args): #cannot find CLR method
"""
WritePostscript(self: Freezable)
Raises the System.Windows.Freezable.Changed event for the
System.Windows.Freezable and invokes its System.Windows.Freezable.OnChanged
method. Classes that derive from System.Windows.Freezable should call this
method at the end of any API that modifies class members that are not stored as
dependency properties.
"""
pass
def WritePreamble(self, *args): #cannot find CLR method
"""
WritePreamble(self: Freezable)
Verifies that the System.Windows.Freezable is not frozen and that it is being
accessed from a valid threading context. System.Windows.Freezable inheritors
should call this method at the beginning of any API that writes to data members
that are not dependency properties.
"""
pass
def __format__(self, *args): #cannot find CLR method
""" __format__(formattable: IFormattable, format: str) -> str """
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, dpiX=None, dpiY=None):
"""
__new__(cls: type)
__new__(cls: type, dpiX: float, dpiY: float)
"""
pass
def __str__(self, *args): #cannot find CLR method
pass
Height = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the height of the System.Windows.Interop.D3DImage.
Get: Height(self: D3DImage) -> float
"""
IsFrontBufferAvailable = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that indicates whether a front buffer exists.
Get: IsFrontBufferAvailable(self: D3DImage) -> bool
"""
Metadata = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the metadata associated with the image source.
Get: Metadata(self: D3DImage) -> ImageMetadata
"""
PixelHeight = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the height of the System.Windows.Interop.D3DImage, in pixels.
Get: PixelHeight(self: D3DImage) -> int
"""
PixelWidth = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the width of the System.Windows.Interop.D3DImage, in pixels.
Get: PixelWidth(self: D3DImage) -> int
"""
Width = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the width of the System.Windows.Interop.D3DImage.
Get: Width(self: D3DImage) -> float
"""
IsFrontBufferAvailableChanged = None
IsFrontBufferAvailableProperty = None
class D3DResourceType(Enum, IComparable, IFormattable, IConvertible):
"""
Specifies the Direct3D surface types that are compatible with the System.Windows.Interop.D3DImage class.
enum D3DResourceType, values: IDirect3DSurface9 (0)
"""
def __eq__(self, *args): #cannot find CLR method
""" x.__eq__(y) <==> x==yx.__eq__(y) <==> x==yx.__eq__(y) <==> x==y """
pass
def __format__(self, *args): #cannot find CLR method
""" __format__(formattable: IFormattable, format: str) -> str """
pass
def __ge__(self, *args): #cannot find CLR method
pass
def __gt__(self, *args): #cannot find CLR method
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __le__(self, *args): #cannot find CLR method
pass
def __lt__(self, *args): #cannot find CLR method
pass
def __ne__(self, *args): #cannot find CLR method
pass
def __reduce_ex__(self, *args): #cannot find CLR method
pass
def __str__(self, *args): #cannot find CLR method
pass
IDirect3DSurface9 = None
value__ = None
class DocObjHost(MarshalByRefObject, IServiceProvider, IHostService, IBrowserHostServices, IByteRangeDownloaderService):
"""
This type or member supports the Windows Presentation Foundation (WPF) infrastructure and is not intended to be used directly from your code.
DocObjHost()
"""
def InitializeLifetimeService(self):
"""
InitializeLifetimeService(self: DocObjHost) -> object
This type or member supports the Windows Presentation Foundation (WPF)
infrastructure and is not intended to be used directly from your code.
Returns: A new System.Runtime.Remoting.Lifetime.ILease object.
"""
pass
def MemberwiseClone(self, *args): #cannot find CLR method
"""
MemberwiseClone(self: MarshalByRefObject, cloneIdentity: bool) -> MarshalByRefObject
Creates a shallow copy of the current System.MarshalByRefObject object.
cloneIdentity: false to delete the current System.MarshalByRefObject object's identity, which
will cause the object to be assigned a new identity when it is marshaled across
a remoting boundary. A value of false is usually appropriate. true to copy the
current System.MarshalByRefObject object's identity to its clone, which will
cause remoting client calls to be routed to the remote server object.
Returns: A shallow copy of the current System.MarshalByRefObject object.
MemberwiseClone(self: object) -> object
Creates a shallow copy of the current System.Object.
Returns: A shallow copy of the current System.Object.
"""
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
class DynamicScriptObject(DynamicObject, IDynamicMetaObjectProvider):
""" Enables calls from a XAML browser application (XBAP) to the HTML window that hosts the application. """
def ToString(self):
"""
ToString(self: DynamicScriptObject) -> str
Attempts to convert the script object to a string representation.
Returns: A string representation of the script object, if the object can be converted;
otherwise, a string representation of the object's default property or method.
"""
pass
def TryGetIndex(self, binder, indexes, result):
"""
TryGetIndex(self: DynamicScriptObject, binder: GetIndexBinder, indexes: Array[object]) -> (bool, object)
Gets an indexed value from the script object by using the first index value
from the indexes collection.
binder: The binder provided by the call site.
indexes: The index to be retrieved.
Returns: Always returns true.
"""
pass
def TryGetMember(self, binder, result):
"""
TryGetMember(self: DynamicScriptObject, binder: GetMemberBinder) -> (bool, object)
Gets a member value from the script object.
binder: The binder provided by the call site.
Returns: Always returns true.
"""
pass
def TryInvoke(self, binder, args, result):
"""
TryInvoke(self: DynamicScriptObject, binder: InvokeBinder, args: Array[object]) -> (bool, object)
Calls the default script method.
binder: The binder provided by the call site.
args: The arguments to pass to the default method.
Returns: Always returns true.
"""
pass
def TryInvokeMember(self, binder, args, result):
"""
TryInvokeMember(self: DynamicScriptObject, binder: InvokeMemberBinder, args: Array[object]) -> (bool, object)
Calls a method on the script object.
binder: The binder provided by the call site.
args: The arguments to pass to the default method.
Returns: Always returns true.
"""
pass
def TrySetIndex(self, binder, indexes, value):
"""
TrySetIndex(self: DynamicScriptObject, binder: SetIndexBinder, indexes: Array[object], value: object) -> bool
Sets a member on the script object by using the first index specified in the
indexes collection.
binder: The binder provided by the call site.
indexes: The index to be retrieved.
value: The value to set.
Returns: Always returns true.
"""
pass
def TrySetMember(self, binder, value):
"""
TrySetMember(self: DynamicScriptObject, binder: SetMemberBinder, value: object) -> bool
Sets a member on the script object to the specified value.
binder: The binder provided by the call site.
value: The value to set for the member.
Returns: Always returns true.
"""
pass
def __dir__(self, *args): #cannot find CLR method
""" __dir__(self: IDynamicMetaObjectProvider) -> list """
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __str__(self, *args): #cannot find CLR method
pass
class HwndSource(PresentationSource, IDisposable, IWin32Window, IKeyboardInputSink):
"""
Presents Windows Presentation Foundation (WPF) content in a Win32 window.
HwndSource(classStyle: int, style: int, exStyle: int, x: int, y: int, name: str, parent: IntPtr)
HwndSource(classStyle: int, style: int, exStyle: int, x: int, y: int, width: int, height: int, name: str, parent: IntPtr, adjustSizingForNonClientArea: bool)
HwndSource(classStyle: int, style: int, exStyle: int, x: int, y: int, width: int, height: int, name: str, parent: IntPtr)
HwndSource(parameters: HwndSourceParameters)
"""
def AddHook(self, hook):
"""
AddHook(self: HwndSource, hook: HwndSourceHook)
Adds an event handler that receives all window messages.
hook: The handler implementation (based on the System.Windows.Interop.HwndSourceHook
delegate) that receives the window messages.
"""
pass
def AddSource(self, *args): #cannot find CLR method
"""
AddSource(self: PresentationSource)
Adds a System.Windows.PresentationSource derived class instance to the list of
known presentation sources.
"""
pass
def ClearContentRenderedListeners(self, *args): #cannot find CLR method
"""
ClearContentRenderedListeners(self: PresentationSource)
Sets the list of listeners for the
System.Windows.PresentationSource.ContentRendered event to null.
"""
pass
def CreateHandleRef(self):
"""
CreateHandleRef(self: HwndSource) -> HandleRef
Gets the window handle for the System.Windows.Interop.HwndSource. The window
handle is packaged as part of a System.Runtime.InteropServices.HandleRef
structure.
Returns: A structure that contains the window handle for this
System.Windows.Interop.HwndSource.
"""
pass
def Dispose(self):
"""
Dispose(self: HwndSource)
Releases all managed resources that are used by the
System.Windows.Interop.HwndSource, and raises the
System.Windows.Interop.HwndSource.Disposed event.
"""
pass
@staticmethod
def FromHwnd(hwnd):
"""
FromHwnd(hwnd: IntPtr) -> HwndSource
Returns the System.Windows.Interop.HwndSource object of the specified window.
hwnd: The provided window handle.
Returns: The System.Windows.Interop.HwndSource object for the window that is specified
by the hwnd window handle.
"""
pass
def GetCompositionTargetCore(self, *args): #cannot find CLR method
"""
GetCompositionTargetCore(self: HwndSource) -> CompositionTarget
Gets the visual target of the window.
Returns: Returns the visual target of the window.
"""
pass
def HasFocusWithinCore(self, *args): #cannot find CLR method
"""
HasFocusWithinCore(self: HwndSource) -> bool
Gets a value that indicates whether the sink or one of its contained components
has focus.
Returns: true if the sink or one of its contained components has focus; otherwise, false.
"""
pass
def OnDpiChanged(self, *args): #cannot find CLR method
""" OnDpiChanged(self: HwndSource, e: HwndDpiChangedEventArgs) """
pass
def OnMnemonicCore(self, *args): #cannot find CLR method
"""
OnMnemonicCore(self: HwndSource, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Called when one of the mnemonics (access keys) for this sink is invoked.
msg: The message for the mnemonic and associated data.
modifiers: Modifier keys.
Returns: true if the message was handled; otherwise, false.
"""
pass
def RegisterKeyboardInputSinkCore(self, *args): #cannot find CLR method
"""
RegisterKeyboardInputSinkCore(self: HwndSource, sink: IKeyboardInputSink) -> IKeyboardInputSite
Registers the System.Windows.Interop.IKeyboardInputSink interface of a
contained component.
sink: The System.Windows.Interop.IKeyboardInputSink sink of the contained component.
Returns: The System.Windows.Interop.IKeyboardInputSite site of the contained component.
"""
pass
def RemoveHook(self, hook):
"""
RemoveHook(self: HwndSource, hook: HwndSourceHook)
Removes the event handlers that were added by
System.Windows.Interop.HwndSource.AddHook(System.Windows.Interop.HwndSourceHook).
hook: The event handler to remove.
"""
pass
def RemoveSource(self, *args): #cannot find CLR method
"""
RemoveSource(self: PresentationSource)
Removes a System.Windows.PresentationSource derived class instance from the
list of known presentation sources.
"""
pass
def RootChanged(self, *args): #cannot find CLR method
"""
RootChanged(self: PresentationSource, oldRoot: Visual, newRoot: Visual)
Provides notification that the root System.Windows.Media.Visual has changed.
oldRoot: The old root System.Windows.Media.Visual.
newRoot: The new root System.Windows.Media.Visual.
"""
pass
def TabIntoCore(self, *args): #cannot find CLR method
"""
TabIntoCore(self: HwndSource, request: TraversalRequest) -> bool
Sets focus on either the first tab stop or the last tab stop of the sink.
request: Specifies whether focus should be set to the first or the last tab stop.
Returns: true if the focus has been set as requested; false, if there are no tab stops.
"""
pass
def TranslateAcceleratorCore(self, *args): #cannot find CLR method
"""
TranslateAcceleratorCore(self: HwndSource, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes keyboard input at the key-down message level.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: true if the message was handled by the method implementation; otherwise, false.
"""
pass
def TranslateCharCore(self, *args): #cannot find CLR method
"""
TranslateCharCore(self: HwndSource, msg: MSG, modifiers: ModifierKeys) -> (bool, MSG)
Processes WM_CHAR, WM_SYSCHAR, WM_DEADCHAR, and WM_SYSDEADCHAR input messages
before the
System.Windows.Interop.IKeyboardInputSink.OnMnemonic(System.Windows.Interop.MSG@,System.Windows.Input.ModifierKeys)
method is called.
msg: The message and associated data. Do not modify this structure. It is passed by
reference for performance reasons only.
modifiers: Modifier keys.
Returns: true if the message was processed and
System.Windows.Interop.IKeyboardInputSink.OnMnemonic(System.Windows.Interop.MSG@,System.Windows.Input.ModifierKeys)
should not be called; otherwise, false.
"""
pass
def __enter__(self, *args): #cannot find CLR method
""" __enter__(self: IDisposable) -> object """
pass
def __exit__(self, *args): #cannot find CLR method
""" __exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object) """
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, *__args):
"""
__new__(cls: type, classStyle: int, style: int, exStyle: int, x: int, y: int, name: str, parent: IntPtr)
__new__(cls: type, classStyle: int, style: int, exStyle: int, x: int, y: int, width: int, height: int, name: str, parent: IntPtr, adjustSizingForNonClientArea: bool)
__new__(cls: type, classStyle: int, style: int, exStyle: int, x: int, y: int, width: int, height: int, name: str, parent: IntPtr)
__new__(cls: type, parameters: HwndSourceParameters)
"""
pass
AcquireHwndFocusInMenuMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the value that determines whether to acquire Win32 focus for the WPF containing window for this System.Windows.Interop.HwndSource.
Get: AcquireHwndFocusInMenuMode(self: HwndSource) -> bool
"""
ChildKeyboardInputSinks = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a sequence of registered input sinks.
Get: ChildKeyboardInputSinks(self: HwndSource) -> IEnumerable[IKeyboardInputSink]
"""
CompositionTarget = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the visual manager for the hosted window.
Get: CompositionTarget(self: HwndSource) -> HwndTarget
"""
Handle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the window handle for this System.Windows.Interop.HwndSource.
Get: Handle(self: HwndSource) -> IntPtr
"""
IsDisposed = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that indicates whether System.Windows.Interop.HwndSource.Dispose has been called on this System.Windows.Interop.HwndSource.
Get: IsDisposed(self: HwndSource) -> bool
"""
KeyboardInputSiteCore = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a reference to the component's container's System.Windows.Interop.IKeyboardInputSite interface.
"""
RestoreFocusMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the System.Windows.Input.RestoreFocusMode for the window.
Get: RestoreFocusMode(self: HwndSource) -> RestoreFocusMode
"""
RootVisual = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Windows.Media.CompositionTarget.RootVisual of the window.
Get: RootVisual(self: HwndSource) -> Visual
Set: RootVisual(self: HwndSource) = value
"""
SizeToContent = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Get or sets whether and how the window is sized to its content.
Get: SizeToContent(self: HwndSource) -> SizeToContent
Set: SizeToContent(self: HwndSource) = value
"""
UsesPerPixelOpacity = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that declares whether the per-pixel opacity of the source window content is respected.
Get: UsesPerPixelOpacity(self: HwndSource) -> bool
"""
AutoResized = None
DefaultAcquireHwndFocusInMenuMode = True
Disposed = None
DpiChanged = None
SizeToContentChanged = None
class HwndSourceHook(MulticastDelegate, ICloneable, ISerializable):
"""
Represents the method that handles Win32 window messages.
HwndSourceHook(object: object, method: IntPtr)
"""
def BeginInvoke(self, hwnd, msg, wParam, lParam, handled, callback, object):
""" BeginInvoke(self: HwndSourceHook, hwnd: IntPtr, msg: int, wParam: IntPtr, lParam: IntPtr, handled: bool, callback: AsyncCallback, object: object) -> (IAsyncResult, bool) """
pass
def CombineImpl(self, *args): #cannot find CLR method
"""
CombineImpl(self: MulticastDelegate, follow: Delegate) -> Delegate
Combines this System.Delegate with the specified System.Delegate to form a new
delegate.
follow: The delegate to combine with this delegate.
Returns: A delegate that is the new root of the System.MulticastDelegate invocation list.
"""
pass
def DynamicInvokeImpl(self, *args): #cannot find CLR method
"""
DynamicInvokeImpl(self: Delegate, args: Array[object]) -> object
Dynamically invokes (late-bound) the method represented by the current delegate.
args: An array of objects that are the arguments to pass to the method represented by
the current delegate.-or- null, if the method represented by the current
delegate does not require arguments.
Returns: The object returned by the method represented by the delegate.
"""
pass
def EndInvoke(self, handled, result):
""" EndInvoke(self: HwndSourceHook, handled: bool, result: IAsyncResult) -> (IntPtr, bool) """
pass
def GetMethodImpl(self, *args): #cannot find CLR method
"""
GetMethodImpl(self: MulticastDelegate) -> MethodInfo
Returns a static method represented by the current System.MulticastDelegate.
Returns: A static method represented by the current System.MulticastDelegate.
"""
pass
def Invoke(self, hwnd, msg, wParam, lParam, handled):
""" Invoke(self: HwndSourceHook, hwnd: IntPtr, msg: int, wParam: IntPtr, lParam: IntPtr, handled: bool) -> (IntPtr, bool) """
pass
def RemoveImpl(self, *args): #cannot find CLR method
"""
RemoveImpl(self: MulticastDelegate, value: Delegate) -> Delegate
Removes an element from the invocation list of this System.MulticastDelegate
that is equal to the specified delegate.
value: The delegate to search for in the invocation list.
Returns: If value is found in the invocation list for this instance, then a new
System.Delegate without value in its invocation list; otherwise, this instance
with its original invocation list.
"""
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, object, method):
""" __new__(cls: type, object: object, method: IntPtr) """
pass
def __reduce_ex__(self, *args): #cannot find CLR method
pass
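In IronPython, the by-ref `handled` parameter of an HwndSourceHook surfaces as an extra return value, so a hook effectively returns a `(result, handled)` tuple. The following is a pure-Python sketch of that calling convention and of a hook chain that stops once a message is claimed; the message constant and handle value are illustrative, not taken from a live WPF session.

```python
# Pure-Python emulation of the HwndSourceHook calling convention.
# Each hook returns (result, handled); once a hook reports the message
# as handled, later hooks in the chain are skipped.

WM_CLOSE = 0x0010  # standard Win32 message identifier

def run_hooks(hooks, hwnd, msg, wParam, lParam):
    """Call each hook in order until one reports the message as handled."""
    for hook in hooks:
        result, handled = hook(hwnd, msg, wParam, lParam, False)
        if handled:
            return result, True
    return 0, False

def swallow_close(hwnd, msg, wParam, lParam, handled):
    # Mark WM_CLOSE as handled so default processing would be skipped.
    if msg == WM_CLOSE:
        return 0, True
    return 0, False

result, handled = run_hooks([swallow_close], hwnd=1234, msg=WM_CLOSE,
                            wParam=0, lParam=0)
```

A real hook attached through `HwndSource.AddHook` follows the same shape, with `IntPtr` values in place of the plain integers used here.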
class HwndSourceParameters(object):
"""
Contains the parameters that are used to create an System.Windows.Interop.HwndSource object using the System.Windows.Interop.HwndSource.#ctor(System.Windows.Interop.HwndSourceParameters) constructor.
HwndSourceParameters(name: str)
HwndSourceParameters(name: str, width: int, height: int)
"""
def Equals(self, obj):
"""
Equals(self: HwndSourceParameters, obj: HwndSourceParameters) -> bool
Determines whether this structure is equal to a specified
System.Windows.Interop.HwndSourceParameters structure.
obj: The structure to be tested for equality.
Returns: true if the structures are equal; otherwise, false.
Equals(self: HwndSourceParameters, obj: object) -> bool
Determines whether this structure is equal to a specified object.
obj: The object to be tested for equality.
Returns: true if the comparison is equal; otherwise, false.
"""
pass
def GetHashCode(self):
"""
GetHashCode(self: HwndSourceParameters) -> int
Returns the hash code for this System.Windows.Interop.HwndSourceParameters
instance.
Returns: A 32-bit signed integer hash code.
"""
pass
def SetPosition(self, x, y):
"""
SetPosition(self: HwndSourceParameters, x: int, y: int)
Sets the values that are used for the screen position of the window for the
System.Windows.Interop.HwndSource.
x: The position of the left edge of the window.
y: The position of the upper edge of the window.
"""
pass
def SetSize(self, width, height):
"""
SetSize(self: HwndSourceParameters, width: int, height: int)
Sets the values that are used for the window size of the
System.Windows.Interop.HwndSource.
width: The width of the window, in device pixels.
height: The height of the window, in device pixels.
"""
pass
def __eq__(self, *args): #cannot find CLR method
""" x.__eq__(y) <==> x==y """
pass
@staticmethod # known case of __new__
def __new__(self, name, width=None, height=None):
"""
__new__[HwndSourceParameters]() -> HwndSourceParameters
__new__(cls: type, name: str)
__new__(cls: type, name: str, width: int, height: int)
"""
pass
def __ne__(self, *args): #cannot find CLR method
pass
AcquireHwndFocusInMenuMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the value that determines whether to acquire Win32 focus for the WPF containing window when an System.Windows.Interop.HwndSource is created.
Get: AcquireHwndFocusInMenuMode(self: HwndSourceParameters) -> bool
Set: AcquireHwndFocusInMenuMode(self: HwndSourceParameters) = value
"""
AdjustSizingForNonClientArea = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a value that indicates whether to include the nonclient area for sizing.
Get: AdjustSizingForNonClientArea(self: HwndSourceParameters) -> bool
Set: AdjustSizingForNonClientArea(self: HwndSourceParameters) = value
"""
ExtendedWindowStyle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the extended Microsoft Windows styles for the window.
Get: ExtendedWindowStyle(self: HwndSourceParameters) -> int
Set: ExtendedWindowStyle(self: HwndSourceParameters) = value
"""
HasAssignedSize = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that indicates whether a size was assigned.
Get: HasAssignedSize(self: HwndSourceParameters) -> bool
"""
Height = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a value that indicates the height of the window.
Get: Height(self: HwndSourceParameters) -> int
Set: Height(self: HwndSourceParameters) = value
"""
HwndSourceHook = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the message hook for the window.
Get: HwndSourceHook(self: HwndSourceParameters) -> HwndSourceHook
Set: HwndSourceHook(self: HwndSourceParameters) = value
"""
ParentWindow = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the window handle (HWND) of the parent for the created window.
Get: ParentWindow(self: HwndSourceParameters) -> IntPtr
Set: ParentWindow(self: HwndSourceParameters) = value
"""
PositionX = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the left-edge position of the window.
Get: PositionX(self: HwndSourceParameters) -> int
Set: PositionX(self: HwndSourceParameters) = value
"""
PositionY = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the upper-edge position of the window.
Get: PositionY(self: HwndSourceParameters) -> int
Set: PositionY(self: HwndSourceParameters) = value
"""
RestoreFocusMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets how WPF handles restoring focus to the window.
Get: RestoreFocusMode(self: HwndSourceParameters) -> RestoreFocusMode
Set: RestoreFocusMode(self: HwndSourceParameters) = value
"""
TreatAncestorsAsNonClientArea = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Get: TreatAncestorsAsNonClientArea(self: HwndSourceParameters) -> bool
Set: TreatAncestorsAsNonClientArea(self: HwndSourceParameters) = value
"""
TreatAsInputRoot = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Get: TreatAsInputRoot(self: HwndSourceParameters) -> bool
Set: TreatAsInputRoot(self: HwndSourceParameters) = value
"""
UsesPerPixelOpacity = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that declares whether the per-pixel opacity of the source window content is respected.
Get: UsesPerPixelOpacity(self: HwndSourceParameters) -> bool
Set: UsesPerPixelOpacity(self: HwndSourceParameters) = value
"""
UsesPerPixelTransparency = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Get: UsesPerPixelTransparency(self: HwndSourceParameters) -> bool
Set: UsesPerPixelTransparency(self: HwndSourceParameters) = value
"""
Width = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a value that indicates the width of the window.
Get: Width(self: HwndSourceParameters) -> int
Set: Width(self: HwndSourceParameters) = value
"""
WindowClassStyle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the Microsoft Windows class style for the window.
Get: WindowClassStyle(self: HwndSourceParameters) -> int
Set: WindowClassStyle(self: HwndSourceParameters) = value
"""
WindowName = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the name of the window.
Get: WindowName(self: HwndSourceParameters) -> str
Set: WindowName(self: HwndSourceParameters) = value
"""
WindowStyle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the style for the window.
Get: WindowStyle(self: HwndSourceParameters) -> int
Set: WindowStyle(self: HwndSourceParameters) = value
"""
class HwndTarget(CompositionTarget, IDisposable, ICompositionTarget):
"""
Represents a binding to a window handle that supports visual composition.
HwndTarget(hwnd: IntPtr)
"""
def Dispose(self):
"""
Dispose(self: HwndTarget)
Releases all resources used by the System.Windows.Interop.HwndTarget.
"""
pass
def __enter__(self, *args): #cannot find CLR method
""" __enter__(self: IDisposable) -> object """
pass
def __exit__(self, *args): #cannot find CLR method
""" __exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object) """
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, hwnd):
""" __new__(cls: type, hwnd: IntPtr) """
pass
BackgroundColor = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the background color of the window referenced by this System.Windows.Interop.HwndTarget.
Get: BackgroundColor(self: HwndTarget) -> Color
Set: BackgroundColor(self: HwndTarget) = value
"""
RenderMode = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the rendering mode for the window referenced by this System.Windows.Interop.HwndTarget.
Get: RenderMode(self: HwndTarget) -> RenderMode
Set: RenderMode(self: HwndTarget) = value
"""
RootVisual = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the root visual object of the page that is hosted by the window.
Set: RootVisual(self: HwndTarget) = value
"""
TransformFromDevice = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a matrix that transforms the coordinates of the device that is associated with the rendering destination of this target.
Get: TransformFromDevice(self: HwndTarget) -> Matrix
"""
TransformToDevice = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a matrix that transforms the coordinates of this target to the device that is associated with the rendering destination.
Get: TransformToDevice(self: HwndTarget) -> Matrix
"""
UsesPerPixelOpacity = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets a value that declares whether the per-pixel opacity value of the source window content is used for rendering.
Get: UsesPerPixelOpacity(self: HwndTarget) -> bool
"""
class IErrorPage:
""" Defines the interaction between Windows Presentation Foundation (WPF) applications that are hosting interoperation content and interpreted by the Windows Presentation Foundation (WPF) executable, and a host supplied error page. """
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
DeploymentPath = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the path to an application's deployment manifest.
Get: DeploymentPath(self: IErrorPage) -> Uri
Set: DeploymentPath(self: IErrorPage) = value
"""
ErrorFlag = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a value that indicates whether this represents an error or some other condition such as a warning. true denotes an error.
Get: ErrorFlag(self: IErrorPage) -> bool
Set: ErrorFlag(self: IErrorPage) = value
"""
ErrorText = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a verbose description of the error.
Get: ErrorText(self: IErrorPage) -> str
Set: ErrorText(self: IErrorPage) = value
"""
ErrorTitle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the string title of the error page.
Get: ErrorTitle(self: IErrorPage) -> str
Set: ErrorTitle(self: IErrorPage) = value
"""
GetWinFxCallback = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a reference to a System.Windows.Threading.DispatcherOperationCallback handler, which can handle requests for Microsoft .NET Framework runtime downloads.
Get: GetWinFxCallback(self: IErrorPage) -> DispatcherOperationCallback
Set: GetWinFxCallback(self: IErrorPage) = value
"""
LogFilePath = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the string path to the error's log file, if any.
Get: LogFilePath(self: IErrorPage) -> str
Set: LogFilePath(self: IErrorPage) = value
"""
RefreshCallback = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a reference to a System.Windows.Threading.DispatcherOperationCallback handler, that can handle refresh of the error page.
Get: RefreshCallback(self: IErrorPage) -> DispatcherOperationCallback
Set: RefreshCallback(self: IErrorPage) = value
"""
SupportUri = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a uniform resource identifier (URI) for support information associated with the error.
Get: SupportUri(self: IErrorPage) -> Uri
Set: SupportUri(self: IErrorPage) = value
"""
class IKeyboardInputSite:
""" Manages keyboard focus within the container. This interface implements keyboard message management in WPF-Win32 interoperation scenarios. """
def OnNoMoreTabStops(self, request):
"""
OnNoMoreTabStops(self: IKeyboardInputSite, request: TraversalRequest) -> bool
Called by a contained component when it has reached its last tab stop and has
no further items to tab to.
request: Specifies whether focus should be set to the first or the last tab stop.
Returns: If this method returns true, the site has shifted focus to another component.
If this method returns false, focus is still within the calling component. The
component should "wrap around" and set focus to its first contained tab stop.
"""
pass
def Unregister(self):
"""
Unregister(self: IKeyboardInputSite)
Unregisters a child keyboard input sink from this site.
"""
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
Sink = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the keyboard sink associated with this site.
Get: Sink(self: IKeyboardInputSite) -> IKeyboardInputSink
"""
class Imaging(object):
""" Provides managed to unmanaged interoperation support for creating image objects. """
@staticmethod
def CreateBitmapSourceFromHBitmap(bitmap, palette, sourceRect, sizeOptions):
"""
CreateBitmapSourceFromHBitmap(bitmap: IntPtr, palette: IntPtr, sourceRect: Int32Rect, sizeOptions: BitmapSizeOptions) -> BitmapSource
Returns a managed System.Windows.Media.Imaging.BitmapSource, based on the
provided pointer to an unmanaged bitmap and palette information.
bitmap: A pointer to the unmanaged bitmap.
palette: A pointer to the bitmap's palette map.
sourceRect: The size of the source image.
sizeOptions: A value of the enumeration that specifies how to handle conversions.
Returns: The created System.Windows.Media.Imaging.BitmapSource.
"""
pass
@staticmethod
def CreateBitmapSourceFromHIcon(icon, sourceRect, sizeOptions):
"""
CreateBitmapSourceFromHIcon(icon: IntPtr, sourceRect: Int32Rect, sizeOptions: BitmapSizeOptions) -> BitmapSource
Returns a managed System.Windows.Media.Imaging.BitmapSource, based on the
provided pointer to an unmanaged icon image.
icon: A pointer to the unmanaged icon source.
sourceRect: The size of the source image.
sizeOptions: A value of the enumeration that specifies how to handle conversions.
Returns: The created System.Windows.Media.Imaging.BitmapSource.
"""
pass
@staticmethod
def CreateBitmapSourceFromMemorySection(section, pixelWidth, pixelHeight, format, stride, offset):
"""
CreateBitmapSourceFromMemorySection(section: IntPtr, pixelWidth: int, pixelHeight: int, format: PixelFormat, stride: int, offset: int) -> BitmapSource
Returns a managed System.Windows.Media.Imaging.BitmapSource, based on the
provided unmanaged memory location.
section: A pointer to a memory section.
pixelWidth: An integer that specifies the width, in pixels, of the bitmap.
pixelHeight: An integer that specifies the height, in pixels, of the bitmap.
format: A value of the enumeration.
stride: The stride of the bitmap.
offset: The byte offset into the memory stream where the image starts.
Returns: The created System.Windows.Media.Imaging.BitmapSource.
"""
pass
__all__ = [
'CreateBitmapSourceFromHBitmap',
'CreateBitmapSourceFromHIcon',
'CreateBitmapSourceFromMemorySection',
]
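`CreateBitmapSourceFromMemorySection` interprets raw pixel memory using the stride (bytes per row, including any padding) and a byte offset to the first pixel. The usual Win32 convention pads each row to a 4-byte (DWORD) boundary; a pure-Python sketch of that arithmetic:

```python
# Stride and offset arithmetic for raw bitmap memory. These helpers are
# illustrative; the real work happens inside the unmanaged imaging code.

def row_stride(pixel_width, bits_per_pixel):
    """Bytes per row, rounded up to the next 4-byte (DWORD) boundary."""
    return ((pixel_width * bits_per_pixel + 31) // 32) * 4

def pixel_offset(x, y, stride, bytes_per_pixel, base_offset=0):
    """Byte offset of pixel (x, y) inside the memory section."""
    return base_offset + y * stride + x * bytes_per_pixel
```

For example, a 10-pixel-wide 24-bpp row occupies 30 bytes of pixel data but a 32-byte stride after DWORD padding.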
class InteropBitmap(BitmapSource, ISealable, IAnimatable, IResource, IFormattable):
""" System.Windows.Interop.InteropBitmap enables developers to improve rendering performance of non-WPF�UIs that are hosted by WPF in interoperability scenarios. """
def CheckIfSiteOfOrigin(self, *args): #cannot find CLR method
"""
CheckIfSiteOfOrigin(self: BitmapSource)
Checks whether the bitmap source content is from a known site of origin. This
method is used to make sure that pixel copying operations are safe.
"""
pass
def CloneCore(self, *args): #cannot find CLR method
""" CloneCore(self: InteropBitmap, sourceFreezable: Freezable) """
pass
def CloneCurrentValueCore(self, *args): #cannot find CLR method
""" CloneCurrentValueCore(self: InteropBitmap, sourceFreezable: Freezable) """
pass
def CreateInstance(self, *args): #cannot find CLR method
"""
CreateInstance(self: Freezable) -> Freezable
Initializes a new instance of the System.Windows.Freezable class.
Returns: The new instance.
"""
pass
def CreateInstanceCore(self, *args): #cannot find CLR method
""" CreateInstanceCore(self: InteropBitmap) -> Freezable """
pass
def FreezeCore(self, *args): #cannot find CLR method
"""
FreezeCore(self: BitmapSource, isChecking: bool) -> bool
Makes an instance of System.Windows.Media.Imaging.BitmapSource or a derived
class immutable.
isChecking: true if this instance should actually freeze itself when this method is called;
otherwise, false.
Returns: If isChecking is true, this method returns true if this
System.Windows.Media.Animation.Animatable can be made unmodifiable, or false if
it cannot be made unmodifiable. If isChecking is false, this method returns
true if the if this System.Windows.Media.Animation.Animatable is now
unmodifiable, or false if it cannot be made unmodifiable, with the side effect
of having begun to change the frozen status of this object.
"""
pass
def GetAsFrozenCore(self, *args): #cannot find CLR method
""" GetAsFrozenCore(self: InteropBitmap, sourceFreezable: Freezable) """
pass
def GetCurrentValueAsFrozenCore(self, *args): #cannot find CLR method
""" GetCurrentValueAsFrozenCore(self: InteropBitmap, sourceFreezable: Freezable) """
pass
def Invalidate(self, dirtyRect=None):
"""
Invalidate(self: InteropBitmap, dirtyRect: Nullable[Int32Rect])
Invalidate(self: InteropBitmap)
Forces the hosted non-WPF UI to be rendered.
"""
pass
def OnChanged(self, *args): #cannot find CLR method
"""
OnChanged(self: Freezable)
Called when the current System.Windows.Freezable object is modified.
"""
pass
def OnFreezablePropertyChanged(self, *args): #cannot find CLR method
"""
OnFreezablePropertyChanged(self: Freezable, oldValue: DependencyObject, newValue: DependencyObject, property: DependencyProperty)
This member supports the Windows Presentation Foundation (WPF) infrastructure
and is not intended to be used directly from your code.
oldValue: The previous value of the data member.
newValue: The current value of the data member.
property: The property that changed.
OnFreezablePropertyChanged(self: Freezable, oldValue: DependencyObject, newValue: DependencyObject)
Ensures that appropriate context pointers are established for a
System.Windows.DependencyObjectType data member that has just been set.
oldValue: The previous value of the data member.
newValue: The current value of the data member.
"""
pass
def OnPropertyChanged(self, *args): #cannot find CLR method
"""
OnPropertyChanged(self: Freezable, e: DependencyPropertyChangedEventArgs)
Overrides the System.Windows.DependencyObject implementation of
System.Windows.DependencyObject.OnPropertyChanged(System.Windows.DependencyPrope
rtyChangedEventArgs) to also invoke any System.Windows.Freezable.Changed
handlers in response to a changing dependency property of type
System.Windows.Freezable.
e: Event data that contains information about which property changed, and its old
and new values.
"""
pass
def ReadPreamble(self, *args): #cannot find CLR method
"""
ReadPreamble(self: Freezable)
Ensures that the System.Windows.Freezable is being accessed from a valid
thread. Inheritors of System.Windows.Freezable must call this method at the
beginning of any API that reads data members that are not dependency
properties.
"""
pass
def ShouldSerializeProperty(self, *args): #cannot find CLR method
"""
ShouldSerializeProperty(self: DependencyObject, dp: DependencyProperty) -> bool
Returns a value that indicates whether serialization processes should serialize
the value for the provided dependency property.
dp: The identifier for the dependency property that should be serialized.
Returns: true if the dependency property that is supplied should be value-serialized;
otherwise, false.
"""
pass
def WritePostscript(self, *args): #cannot find CLR method
"""
WritePostscript(self: Freezable)
Raises the System.Windows.Freezable.Changed event for the
System.Windows.Freezable and invokes its System.Windows.Freezable.OnChanged
method. Classes that derive from System.Windows.Freezable should call this
method at the end of any API that modifies class members that are not stored as
dependency properties.
"""
pass
def WritePreamble(self, *args): #cannot find CLR method
"""
WritePreamble(self: Freezable)
Verifies that the System.Windows.Freezable is not frozen and that it is being
accessed from a valid threading context. System.Windows.Freezable inheritors
should call this method at the beginning of any API that writes to data members
that are not dependency properties.
"""
pass
def __format__(self, *args): #cannot find CLR method
""" __format__(formattable: IFormattable, format: str) -> str """
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __str__(self, *args): #cannot find CLR method
pass
class IProgressPage:
""" Defines the interaction between Windows Presentation Foundation (WPF) applications that are hosting interoperation content, and a host supplied progress page. """
def UpdateProgress(self, bytesDownloaded, bytesTotal):
"""
UpdateProgress(self: IProgressPage, bytesDownloaded: Int64, bytesTotal: Int64)
Provides upload progress numeric information that can be used to update the
progress indicators.
bytesDownloaded: Total bytes downloaded thus far.
bytesTotal: Total bytes that need to be downloaded for the application.
"""
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
ApplicationName = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the application's name.
Get: ApplicationName(self: IProgressPage) -> str
Set: ApplicationName(self: IProgressPage) = value
"""
DeploymentPath = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the System.Uri path to the application deployment manifest.
Get: DeploymentPath(self: IProgressPage) -> Uri
Set: DeploymentPath(self: IProgressPage) = value
"""
PublisherName = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the application's publisher.
Get: PublisherName(self: IProgressPage) -> str
Set: PublisherName(self: IProgressPage) = value
"""
RefreshCallback = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a reference to a System.Windows.Threading.DispatcherOperationCallback handler, that can handle the case of a user-initiated Refresh command.
Get: RefreshCallback(self: IProgressPage) -> DispatcherOperationCallback
Set: RefreshCallback(self: IProgressPage) = value
"""
StopCallback = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets a reference to a System.Windows.Threading.DispatcherOperationCallback handler, that can handle the case of a user-initiated Stop command.
Get: StopCallback(self: IProgressPage) -> DispatcherOperationCallback
Set: StopCallback(self: IProgressPage) = value
"""
class MSG(object):
""" Contains message information from a thread's message queue. """
hwnd = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the window handle (HWND) to the window whose window procedure receives the message.
Get: hwnd(self: MSG) -> IntPtr
Set: hwnd(self: MSG) = value
"""
lParam = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the lParam value that specifies additional information about the message. The exact meaning depends on the value of the System.Windows.Interop.MSG.message member.
Get: lParam(self: MSG) -> IntPtr
Set: lParam(self: MSG) = value
"""
message = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the message identifier.
Get: message(self: MSG) -> int
Set: message(self: MSG) = value
"""
pt_x = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the x coordinate of the cursor position on the screen, when the message was posted.
Get: pt_x(self: MSG) -> int
Set: pt_x(self: MSG) = value
"""
pt_y = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the y coordinate of the cursor position on the screen, when the message was posted.
Get: pt_y(self: MSG) -> int
Set: pt_y(self: MSG) = value
"""
time = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the time at which the message was posted.
Get: time(self: MSG) -> int
Set: time(self: MSG) = value
"""
wParam = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the wParam value for the message, which specifies additional information about the message. The exact meaning depends on the value of the message.
Get: wParam(self: MSG) -> IntPtr
Set: wParam(self: MSG) = value
"""
class RenderMode(Enum, IComparable, IFormattable, IConvertible):
"""
Specifies the rendering preference.
enum RenderMode, values: Default (0), SoftwareOnly (1)
"""
def __eq__(self, *args): #cannot find CLR method
""" x.__eq__(y) <==> x==yx.__eq__(y) <==> x==yx.__eq__(y) <==> x==y """
pass
def __format__(self, *args): #cannot find CLR method
""" __format__(formattable: IFormattable, format: str) -> str """
pass
def __ge__(self, *args): #cannot find CLR method
pass
def __gt__(self, *args): #cannot find CLR method
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __le__(self, *args): #cannot find CLR method
pass
def __lt__(self, *args): #cannot find CLR method
pass
def __ne__(self, *args): #cannot find CLR method
pass
def __reduce_ex__(self, *args): #cannot find CLR method
pass
def __str__(self, *args): #cannot find CLR method
pass
Default = None
SoftwareOnly = None
value__ = None
class ThreadMessageEventHandler(MulticastDelegate, ICloneable, ISerializable):
"""
Represents the method that handles the System.Windows.Interop.ComponentDispatcher.ThreadFilterMessage and System.Windows.Interop.ComponentDispatcher.ThreadPreprocessMessage events.
ThreadMessageEventHandler(object: object, method: IntPtr)
"""
def BeginInvoke(self, msg, handled, callback, object):
""" BeginInvoke(self: ThreadMessageEventHandler, msg: MSG, handled: bool, callback: AsyncCallback, object: object) -> (IAsyncResult, MSG, bool) """
pass
def CombineImpl(self, *args): #cannot find CLR method
"""
CombineImpl(self: MulticastDelegate, follow: Delegate) -> Delegate
Combines this System.Delegate with the specified System.Delegate to form a new
delegate.
follow: The delegate to combine with this delegate.
Returns: A delegate that is the new root of the System.MulticastDelegate invocation list.
"""
pass
def DynamicInvokeImpl(self, *args): #cannot find CLR method
"""
DynamicInvokeImpl(self: Delegate, args: Array[object]) -> object
Dynamically invokes (late-bound) the method represented by the current delegate.
args: An array of objects that are the arguments to pass to the method represented by
the current delegate.-or- null, if the method represented by the current
delegate does not require arguments.
Returns: The object returned by the method represented by the delegate.
"""
pass
def EndInvoke(self, msg, handled, result):
""" EndInvoke(self: ThreadMessageEventHandler, msg: MSG, handled: bool, result: IAsyncResult) -> (MSG, bool) """
pass
def GetMethodImpl(self, *args): #cannot find CLR method
"""
GetMethodImpl(self: MulticastDelegate) -> MethodInfo
Returns a static method represented by the current System.MulticastDelegate.
Returns: A static method represented by the current System.MulticastDelegate.
"""
pass
def Invoke(self, msg, handled):
""" Invoke(self: ThreadMessageEventHandler, msg: MSG, handled: bool) -> (MSG, bool) """
pass
def RemoveImpl(self, *args): #cannot find CLR method
"""
RemoveImpl(self: MulticastDelegate, value: Delegate) -> Delegate
Removes an element from the invocation list of this System.MulticastDelegate
that is equal to the specified delegate.
value: The delegate to search for in the invocation list.
Returns: If value is found in the invocation list for this instance, then a new
System.Delegate without value in its invocation list; otherwise, this instance
with its original invocation list.
"""
pass
def __init__(self, *args): #cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, object, method):
""" __new__(cls: type, object: object, method: IntPtr) """
pass
def __reduce_ex__(self, *args): #cannot find CLR method
pass
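ThreadFilterMessage handlers receive the message and a by-ref `handled` flag; in IronPython that surfaces as `Invoke` returning an `(msg, handled)` tuple. Unlike an HwndSource hook chain, every subscribed handler is raised; well-behaved handlers check the flag themselves. A pure-Python sketch of that pattern (message value and handler name are illustrative):

```python
# All handlers run; each sees (and may set) the shared `handled` flag,
# mirroring the (MSG, bool) tuple convention shown in Invoke above.

def filter_message(handlers, msg):
    """Raise the message to every handler, threading the handled flag."""
    handled = False
    for handler in handlers:
        msg, handled = handler(msg, handled)
    return msg, handled

def claim_keydown(msg, handled):
    # Only act if no earlier handler claimed the message.
    if not handled and msg == 0x0100:  # WM_KEYDOWN
        return msg, True
    return msg, handled
```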
class WindowInteropHelper(object):
"""
Assists interoperation between Windows Presentation Foundation (WPF) and Win32 code.
WindowInteropHelper(window: Window)
"""
def EnsureHandle(self):
"""
EnsureHandle(self: WindowInteropHelper) -> IntPtr
Creates the HWND of the window if the HWND has not been created yet.
Returns: An System.IntPtr that represents the HWND.
"""
pass
@staticmethod # known case of __new__
def __new__(self, window):
""" __new__(cls: type, window: Window) """
pass
Handle = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets the window handle for a Windows Presentation Foundation (WPF) window�that is used to create this System.Windows.Interop.WindowInteropHelper.
Get: Handle(self: WindowInteropHelper) -> IntPtr
"""
Owner = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""Gets or sets the handle of the Windows Presentation Foundation (WPF)�owner window.
Get: Owner(self: WindowInteropHelper) -> IntPtr
Set: Owner(self: WindowInteropHelper) = value
"""
| 57.707765 | 765 | 0.68773 | 36,070 | 342,611 | 6.461713 | 0.037621 | 0.034972 | 0.021984 | 0.028266 | 0.919661 | 0.911127 | 0.902705 | 0.898659 | 0.896732 | 0.891768 | 0.000403 | 0.026284 | 0.238658 | 342,611 | 5,936 | 766 | 57.717487 | 0.866725 | 0.725572 | 0 | 0.838863 | 0 | 0 | 0.005377 | 0.002724 | 0 | 0 | 0 | 0 | 0 | 1 | 0.403791 | false | 0.403791 | 0 | 0 | 0.565877 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
021d8bd598b7a65e47e0e8e3e158a811c8e4da90 | 13,188 | py | Python | tests/test_field_equality.py | AppliedAcousticsChalmers/levitate | c6ad1038327abfc82a5402b42019b69e52f7f5c4 | [
"MIT"
] | 11 | 2019-07-13T13:09:16.000Z | 2021-11-13T09:23:34.000Z | tests/test_field_equality.py | AppliedAcousticsChalmers/levitate | c6ad1038327abfc82a5402b42019b69e52f7f5c4 | [
"MIT"
] | 4 | 2019-03-08T09:15:08.000Z | 2019-03-08T09:15:51.000Z | tests/test_field_equality.py | AppliedAcousticsChalmers/levitate | c6ad1038327abfc82a5402b42019b69e52f7f5c4 | [
"MIT"
] | 5 | 2020-09-14T16:23:35.000Z | 2021-11-14T16:19:32.000Z | import numpy as np
import levitate
import pickle
# Tests created with these air properties
from levitate.materials import air
air.c = 343
air.rho = 1.2
pos = np.array([0.1, 0.2, 0.3])
pos_b = np.array([-0.15, 1.27, 0.001])
array = levitate.arrays.RectangularArray(shape=(4, 5))
array_b = levitate.arrays.RectangularArray(shape=(5, 4))
def test_spherical_harmonics_parameters():
assert levitate.fields.SphericalHarmonicsExpansion(array, orders=3) == levitate.fields.SphericalHarmonicsExpansion(array, orders=3)
assert levitate.fields.SphericalHarmonicsExpansion(array, orders=3) == pickle.loads(pickle.dumps(levitate.fields.SphericalHarmonicsExpansion(array, orders=3)))
assert levitate.fields.SphericalHarmonicsExpansion(array, orders=3) != levitate.fields.SphericalHarmonicsExpansion(array, orders=4)
assert levitate.fields.SphericalHarmonicsExpansion(array, orders=3) != levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=3)
assert levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=3) == levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=3)
assert levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=3) == pickle.loads(pickle.dumps(levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=3)))
assert levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=3) != levitate.fields.SphericalHarmonicsExpansionGradient(array, orders=4)
def test_gorkov_parameters():
assert levitate.fields.GorkovPotential(array) == levitate.fields.GorkovPotential(array)
assert levitate.fields.GorkovPotential(array) == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array)))
assert levitate.fields.GorkovPotential(array, radius=1e-3) != levitate.fields.GorkovPotential(array, radius=1.1e-3)
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array, material=levitate.materials.air)
assert levitate.fields.GorkovGradient(array) == levitate.fields.GorkovGradient(array)
assert levitate.fields.GorkovGradient(array) == pickle.loads(pickle.dumps(levitate.fields.GorkovGradient(array)))
assert levitate.fields.GorkovGradient(array, radius=1e-3) != levitate.fields.GorkovGradient(array, radius=1.1e-3)
assert levitate.fields.GorkovGradient(array) != levitate.fields.GorkovGradient(array, material=levitate.materials.air)
assert levitate.fields.GorkovLaplacian(array) == levitate.fields.GorkovLaplacian(array)
assert levitate.fields.GorkovLaplacian(array) == pickle.loads(pickle.dumps(levitate.fields.GorkovLaplacian(array)))
assert levitate.fields.GorkovLaplacian(array, radius=1e-3) != levitate.fields.GorkovLaplacian(array, radius=1.1e-3)
assert levitate.fields.GorkovLaplacian(array) != levitate.fields.GorkovLaplacian(array, material=levitate.materials.air)
def test_radiation_force_parameters():
assert levitate.fields.RadiationForce(array) == levitate.fields.RadiationForce(array)
assert levitate.fields.RadiationForce(array) == pickle.loads(pickle.dumps(levitate.fields.RadiationForce(array)))
assert levitate.fields.RadiationForce(array, radius=1e-3) != levitate.fields.RadiationForce(array, radius=1.1e-3)
assert levitate.fields.RadiationForce(array) != levitate.fields.RadiationForce(array, material=levitate.materials.air)
assert levitate.fields.RadiationForceStiffness(array) == levitate.fields.RadiationForceStiffness(array)
assert levitate.fields.RadiationForceStiffness(array) == pickle.loads(pickle.dumps(levitate.fields.RadiationForceStiffness(array)))
assert levitate.fields.RadiationForceStiffness(array, radius=1e-3) != levitate.fields.RadiationForceStiffness(array, radius=1.1e-3)
assert levitate.fields.RadiationForceStiffness(array) != levitate.fields.RadiationForceStiffness(array, material=levitate.materials.air)
assert levitate.fields.RadiationForceCurl(array) == levitate.fields.RadiationForceCurl(array)
assert levitate.fields.RadiationForceCurl(array) == pickle.loads(pickle.dumps(levitate.fields.RadiationForceCurl(array)))
assert levitate.fields.RadiationForceCurl(array, radius=1e-3) != levitate.fields.RadiationForceCurl(array, radius=1.1e-3)
assert levitate.fields.RadiationForceCurl(array) != levitate.fields.RadiationForceCurl(array, material=levitate.materials.air)
assert levitate.fields.RadiationForceGradient(array) == levitate.fields.RadiationForceGradient(array)
assert levitate.fields.RadiationForceGradient(array) == pickle.loads(pickle.dumps(levitate.fields.RadiationForceGradient(array)))
assert levitate.fields.RadiationForceGradient(array, radius=1e-3) != levitate.fields.RadiationForceGradient(array, radius=1.1e-3)
assert levitate.fields.RadiationForceGradient(array) != levitate.fields.RadiationForceGradient(array, material=levitate.materials.air)
def test_spherical_harmonics_force_parameters():
assert levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2) == levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2)
assert levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2) == pickle.loads(pickle.dumps(levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2)))
assert levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2) != levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=3)
assert levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2) != levitate.fields.SphericalHarmonicsForce(array, orders=2, radius=1.1e-3)
assert levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2) != levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2, scattering_model='compressible')
assert levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2, scattering_model='compressible') != levitate.fields.SphericalHarmonicsForce(array, radius=1e-3, orders=2, material=levitate.materials.air, scattering_model='compressible')
def test_direct_params():
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array_b)
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array, weight=1)
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array, position=pos)
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array, weight=1, position=pos)
def test_simple_types():
# Field, should diff if array, field, or type is different.
assert levitate.fields.GorkovPotential(array) == levitate.fields.GorkovPotential(array)
assert levitate.fields.GorkovPotential(array) == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array)))
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array_b)
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovGradient(array)
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array) * 1
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array) @ pos
assert levitate.fields.GorkovPotential(array) != levitate.fields.GorkovPotential(array) * 1 @ pos
# CostField, should also diff if weight is different
assert levitate.fields.GorkovPotential(array) * 1 == levitate.fields.GorkovPotential(array) * 1
assert levitate.fields.GorkovPotential(array) * 1 == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) * 1))
assert levitate.fields.GorkovPotential(array) * 1 != levitate.fields.GorkovPotential(array) * 2
assert levitate.fields.GorkovPotential(array) * 1 != levitate.fields.GorkovPotential(array)
assert levitate.fields.GorkovPotential(array) * 1 != levitate.fields.GorkovPotential(array) @ pos
assert levitate.fields.GorkovPotential(array) * 1 != levitate.fields.GorkovPotential(array) * 1 @ pos
# FieldPoint, should also diff if position is different
assert levitate.fields.GorkovPotential(array) @ pos == levitate.fields.GorkovPotential(array) @ pos
assert levitate.fields.GorkovPotential(array) @ pos == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) @ pos))
assert levitate.fields.GorkovPotential(array) @ pos != levitate.fields.GorkovPotential(array) * 4
assert levitate.fields.GorkovPotential(array) @ pos != levitate.fields.GorkovPotential(array)
assert levitate.fields.GorkovPotential(array) @ pos != levitate.fields.GorkovPotential(array) * 1
assert levitate.fields.GorkovPotential(array) @ pos != levitate.fields.GorkovPotential(array) @ pos * 1
# CostFieldPoint, should diff if position or weight is different
assert levitate.fields.GorkovPotential(array) * 1 @ pos == levitate.fields.GorkovPotential(array) * 1 @ pos
assert levitate.fields.GorkovPotential(array) * 1 @ pos == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) * 1 @ pos))
assert levitate.fields.GorkovPotential(array) * 1 @ pos != levitate.fields.GorkovPotential(array) * 1 @ pos_b
assert levitate.fields.GorkovPotential(array) * 1 @ pos != levitate.fields.GorkovPotential(array) * 2 @ pos
assert levitate.fields.GorkovPotential(array) * 1 @ pos != levitate.fields.GorkovPotential(array) * 2 @ pos_b
assert levitate.fields.GorkovPotential(array) * 1 @ pos != levitate.fields.GorkovPotential(array)
assert levitate.fields.GorkovPotential(array) * 1 @ pos != levitate.fields.GorkovPotential(array) * 1
assert levitate.fields.GorkovPotential(array) * 1 @ pos != levitate.fields.GorkovPotential(array) @ pos
def test_squared_types():
# These should diff if the field is different, or if the target "vector" is different.
# SquaredField
assert levitate.fields.GorkovPotential(array) - 0 == levitate.fields.GorkovPotential(array) - 0
assert levitate.fields.GorkovPotential(array) - 0 == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) - 0))
assert levitate.fields.GorkovPotential(array) - 0 != levitate.fields.GorkovGradient(array) - 0
assert levitate.fields.GorkovPotential(array) - 0 != levitate.fields.GorkovPotential(array) - 1
# SquaredCostField
assert levitate.fields.GorkovPotential(array) * 1 - 0 == levitate.fields.GorkovPotential(array) * 1 - 0
assert levitate.fields.GorkovPotential(array) * 1 - 0 == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) * 1 - 0))
assert levitate.fields.GorkovPotential(array) * 1 - 0 != levitate.fields.GorkovPotential(array) * 1 - 1
# SquaredFieldPoint
assert levitate.fields.GorkovPotential(array) @ pos - 0 == levitate.fields.GorkovPotential(array) @ pos - 0
assert levitate.fields.GorkovPotential(array) @ pos - 0 == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) @ pos - 0))
assert levitate.fields.GorkovPotential(array) @ pos - 0 != levitate.fields.GorkovPotential(array) @ pos - 1
# SquaredCostFieldPoint
assert levitate.fields.GorkovPotential(array) * 1 @ pos - 0 == levitate.fields.GorkovPotential(array) * 1 @ pos - 0
assert levitate.fields.GorkovPotential(array) * 1 @ pos - 0 == pickle.loads(pickle.dumps(levitate.fields.GorkovPotential(array) * 1 @ pos - 0))
assert levitate.fields.GorkovPotential(array) * 1 @ pos - 0 != levitate.fields.GorkovPotential(array) * 1 @ pos - 1
def test_multis():
# Should diff if any one field is different, or if they have a different order, or if the type is different.
field_a = levitate.fields.GorkovPotential(array)
field_b = levitate.fields.GorkovGradient(array)
# MultiField
assert field_a + field_a == field_a + field_a
assert field_a + field_a == pickle.loads(pickle.dumps(field_a + field_a))
assert field_a + field_b == field_a + field_b
assert field_a + field_a != field_a + field_b
assert field_a + field_b != field_b + field_a
# MultiFieldPoint / MultiPoint
assert field_a @ pos + field_a @ pos == (field_a + field_a) @ pos
assert field_a @ pos + field_a @ pos_b == field_a @ pos + field_a @ pos_b
assert field_a @ pos + field_a @ pos != field_a @ pos_b + field_a @ pos_b
assert field_a @ pos + field_a @ pos != field_a @ pos_b + field_a @ pos
assert field_a @ pos + field_a @ pos != field_a @ pos + field_b @ pos
# MultiCostField
assert field_a * 2 + field_a * 2 == (field_a + field_a) * 2
assert field_a * 2 + field_a * 2 != field_a * 4 + field_a * 4
assert field_a * 2 + field_a * 2 != field_a * 2 + field_b * 2
# MultiCostFieldPoint / MultiPoint
assert (field_a * 2 + field_a * 2) @ pos == field_a * 2 @ pos + field_a * 2 @ pos
assert (field_a @ pos + field_a @ pos_b) * 2 == field_a * 2 @ pos + field_a * 2 @ pos_b
assert (field_a @ pos + field_a @ pos_b) * 2 == pickle.loads(pickle.dumps(field_a * 2 @ pos + field_a * 2 @ pos_b))
assert (field_a @ pos + field_a @ pos_b) * 2 != field_b * 2 @ pos + field_a * 2 @ pos
assert (field_a * 2 + field_a * 2) @ pos != (field_a * 2 + field_a * 2) @ pos_b
assert (field_a * 2 + field_a * 2) @ pos != (field_a * 2 + field_b * 2) @ pos_b
assert (field_a * 2 + field_a * 2) @ pos != (field_a * 2 + field_a * 4) @ pos
| 74.508475 | 252 | 0.759554 | 1,598 | 13,188 | 6.192115 | 0.062578 | 0.243355 | 0.278423 | 0.326427 | 0.900354 | 0.880748 | 0.855382 | 0.830723 | 0.790905 | 0.574432 | 0 | 0.018661 | 0.122308 | 13,188 | 176 | 253 | 74.931818 | 0.836199 | 0.046633 | 0 | 0.048 | 0 | 0 | 0.002867 | 0 | 0 | 0 | 0 | 0 | 0.84 | 1 | 0.064 | false | 0 | 0.032 | 0 | 0.096 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
022d4022a1150b81c29d2bab86f497e35d71dd9a | 79 | py | Python | torchsketch/data/__init__.py | songyzh/torchsketch | 42bca1b31ab9699d9b6d77a102b1f46bba82fb33 | [
"MIT"
] | 182 | 2020-03-25T01:59:11.000Z | 2022-03-29T08:58:47.000Z | torchsketch/data/__init__.py | songyzh/torchsketch | 42bca1b31ab9699d9b6d77a102b1f46bba82fb33 | [
"MIT"
] | 5 | 2020-03-25T13:16:50.000Z | 2022-02-19T09:51:39.000Z | torchsketch/data/__init__.py | songyzh/torchsketch | 42bca1b31ab9699d9b6d77a102b1f46bba82fb33 | [
"MIT"
] | 17 | 2020-03-25T12:40:49.000Z | 2022-03-28T06:34:40.000Z | from torchsketch.data import dataloaders
from torchsketch.data import datasets | 39.5 | 41 | 0.873418 | 10 | 79 | 6.9 | 0.6 | 0.434783 | 0.550725 | 0.724638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101266 | 79 | 2 | 42 | 39.5 | 0.971831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
0289fb3883b86a4de5c8de12abfb21c678d21ec8 | 2,053 | py | Python | staff/migrations/0004_auto_20200311_1944.py | mamalmaleki/maktab-community | 8ce25053ea0f6f0a6c082617c9ff306d1ada9707 | [
"MIT"
] | null | null | null | staff/migrations/0004_auto_20200311_1944.py | mamalmaleki/maktab-community | 8ce25053ea0f6f0a6c082617c9ff306d1ada9707 | [
"MIT"
] | null | null | null | staff/migrations/0004_auto_20200311_1944.py | mamalmaleki/maktab-community | 8ce25053ea0f6f0a6c082617c9ff306d1ada9707 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.3 on 2020-03-11 19:44
from django.db import migrations, models
import uuid
class Migration(migrations.Migration):
dependencies = [
('staff', '0003_auto_20200310_2228'),
]
operations = [
migrations.AddField(
model_name='instructor',
name='history',
field=models.TextField(blank=True, null=True, verbose_name='history'),
),
migrations.AddField(
model_name='instructor',
name='integration_code',
field=models.CharField(blank=True, max_length=255, verbose_name='integration code'),
),
migrations.AddField(
model_name='instructor',
name='status',
field=models.IntegerField(choices=[(0, 'draft'), (1, 'hidden'), (2, 'published'), (3, 'deleted'), (4, 'archive')], db_index=True, default=0, verbose_name='status'),
),
migrations.AddField(
model_name='instructor',
name='unique_id',
            field=models.UUIDField(default=uuid.uuid4, editable=False, unique=True),  # callable default; a fixed UUID would collide under unique=True
),
migrations.AddField(
model_name='student',
name='history',
field=models.TextField(blank=True, null=True, verbose_name='history'),
),
migrations.AddField(
model_name='student',
name='integration_code',
field=models.CharField(blank=True, max_length=255, verbose_name='integration code'),
),
migrations.AddField(
model_name='student',
name='status',
field=models.IntegerField(choices=[(0, 'draft'), (1, 'hidden'), (2, 'published'), (3, 'deleted'), (4, 'archive')], db_index=True, default=0, verbose_name='status'),
),
migrations.AddField(
model_name='student',
name='unique_id',
            field=models.UUIDField(default=uuid.uuid4, editable=False, unique=True),  # callable default; a fixed UUID would collide under unique=True
),
]
| 37.327273 | 176 | 0.586946 | 209 | 2,053 | 5.645933 | 0.320574 | 0.122034 | 0.155932 | 0.183051 | 0.866102 | 0.866102 | 0.762712 | 0.762712 | 0.762712 | 0.762712 | 0 | 0.065868 | 0.267901 | 2,053 | 54 | 177 | 38.018519 | 0.719228 | 0.021919 | 0 | 0.833333 | 1 | 0 | 0.184447 | 0.047358 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.104167 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
5a06829a117abb16827df4b18a8f79edcc3d7a9a | 1,050 | py | Python | ark_nlp/model/re/prgc_bert/__init__.py | confstantine/nlp-task | cb152e885bc6f6f1243a12ad90b1c715eb548736 | [
"Apache-2.0"
] | 1 | 2021-12-27T04:48:40.000Z | 2021-12-27T04:48:40.000Z | ark_nlp/model/re/prgc_bert/__init__.py | confstantine/nlp-task | cb152e885bc6f6f1243a12ad90b1c715eb548736 | [
"Apache-2.0"
] | null | null | null | ark_nlp/model/re/prgc_bert/__init__.py | confstantine/nlp-task | cb152e885bc6f6f1243a12ad90b1c715eb548736 | [
"Apache-2.0"
] | 1 | 2021-12-27T04:49:35.000Z | 2021-12-27T04:49:35.000Z | from ark_nlp.model.re.prgc_bert.prgc_relation_extraction_dataset import PRGCREDataset
from ark_nlp.model.re.prgc_bert.prgc_relation_extraction_dataset import PRGCREDataset as Dataset
from ark_nlp.processor.tokenizer.transfomer import SpanTokenizer as Tokenizer
from ark_nlp.processor.tokenizer.transfomer import SpanTokenizer as PRGCRETokenizer
from ark_nlp.nn import BertConfig as PRGCBertConfig
from ark_nlp.model.re.prgc_bert.prgc_bert import PRGCBert
from ark_nlp.factory.optimizer import get_default_bert_optimizer as get_default_model_optimizer
from ark_nlp.factory.optimizer import get_default_bert_optimizer as get_default_prgc_bert_optimizer
from ark_nlp.model.re.prgc_bert.prgc_relation_extraction_task import PRGCRETask as Task
from ark_nlp.model.re.prgc_bert.prgc_relation_extraction_task import PRGCRETask as PRGCRETask
from ark_nlp.model.re.prgc_bert.prgc_relation_extraction_predictor import PRGCREPredictor as Predictor
from ark_nlp.model.re.prgc_bert.prgc_relation_extraction_predictor import PRGCREPredictor as PRGCREPredictor | 61.764706 | 108 | 0.893333 | 158 | 1,050 | 5.613924 | 0.189873 | 0.094701 | 0.135287 | 0.118377 | 0.815107 | 0.815107 | 0.815107 | 0.815107 | 0.782413 | 0.64938 | 0 | 0 | 0.069524 | 1,050 | 17 | 108 | 61.764706 | 0.907881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
5a0ae7997ee7847409104cd0306337fc5a475fa4 | 2,166 | py | Python | src/movement/all.py | Quanta-Robotics/Robot-Blueberry | 7b7e77e09ac5e9ec5afd947e0db1ecc8773e56da | [
"MIT"
] | 25 | 2021-06-08T07:09:30.000Z | 2021-12-30T06:28:35.000Z | src/movement/all.py | ICT-CoU/Robot-Blueberry | d19fd1be037df9d67de64df57a87006d74cd6c43 | [
"MIT"
] | 2 | 2021-05-23T12:54:51.000Z | 2021-06-07T17:47:56.000Z | src/movement/all.py | ICT-CoU/Robot-Blueberry | d19fd1be037df9d67de64df57a87006d74cd6c43 | [
"MIT"
] | 14 | 2021-06-08T13:02:28.000Z | 2021-12-30T20:07:18.000Z | import os
import time
duration=0
os.system('python3 /home/pi/Robot-Blueberry/robot-control/bothHand.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/hand_shake.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/left-right.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/touchHeadL.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/touchHeadR.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/circle.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/hayHay.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/no.py')
time.sleep(duration)
#os.system('python3 /home/pi/Robot-Blueberry/robot-control/touchNose.py')
#time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/hello.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/turnLeft.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/rightHand.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/turnRight.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/duldul.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/hug.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/yes.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/goForward.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/goBack.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/introduction.py')
time.sleep(duration)
#os.system('python3 /home/pi/Robot-Blueberry/robot-control/touchEad.py')
#time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/LeftHand.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/salute.py')
time.sleep(duration)
os.system('python3 /home/pi/Robot-Blueberry/robot-control/touchEye.py')
time.sleep(duration)
| 40.867925 | 75 | 0.791782 | 330 | 2,166 | 5.193939 | 0.118182 | 0.107351 | 0.201284 | 0.254959 | 0.874562 | 0.874562 | 0.874562 | 0.874562 | 0.874562 | 0.847141 | 0 | 0.011489 | 0.035549 | 2,166 | 52 | 76 | 41.653846 | 0.809 | 0.084488 | 0 | 0.466667 | 0 | 0 | 0.609312 | 0.524292 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.044444 | 0 | 0.044444 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
5a2379b1df7a0d84f50bbe1b11ddfc51adbdf731 | 29,666 | py | Python | opensilexClientToolsPython/api/documents_api.py | OpenSILEX/opensilexClientToolsPython | 41b1e7e707670ecf1b2c06d79bdd9749945788cb | [
"RSA-MD"
] | null | null | null | opensilexClientToolsPython/api/documents_api.py | OpenSILEX/opensilexClientToolsPython | 41b1e7e707670ecf1b2c06d79bdd9749945788cb | [
"RSA-MD"
] | 7 | 2021-05-25T14:06:04.000Z | 2021-11-05T15:42:14.000Z | opensilexClientToolsPython/api/documents_api.py | OpenSILEX/opensilexClientToolsPython | 41b1e7e707670ecf1b2c06d79bdd9749945788cb | [
"RSA-MD"
] | null | null | null | # coding: utf-8
"""
OpenSilex API
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen) # noqa: E501
OpenAPI spec version: INSTANCE-SNAPSHOT
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from opensilexClientToolsPython.api_client import ApiClient
class DocumentsApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_document(self, description, **kwargs): # noqa: E501
"""Add a document # noqa: E501
{ uri: http://opensilex.dev/set/documents#ProtocolExperimental, identifier: doi:10.1340/309registries, rdf_type: http://www.opensilex.org/vocabulary/oeso#ScientificDocument, title: title, date: 2020-06-01, description: description, targets: http://opensilex.dev/opensilex/id/variables/v001, authors: Author name, language: fr, format: jpg, deprecated: false, keywords: keywords} # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_document(description, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str description: File description with metadata (required)
:param str authorization: Authentication token (required)
:param file file: file
:param str accept_language: Request accepted language
:return: ObjectUriResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_document_with_http_info(description, **kwargs) # noqa: E501
else:
(data) = self.create_document_with_http_info(description, **kwargs) # noqa: E501
return data
def create_document_with_http_info(self, description, **kwargs): # noqa: E501
"""Add a document # noqa: E501
{ uri: http://opensilex.dev/set/documents#ProtocolExperimental, identifier: doi:10.1340/309registries, rdf_type: http://www.opensilex.org/vocabulary/oeso#ScientificDocument, title: title, date: 2020-06-01, description: description, targets: http://opensilex.dev/opensilex/id/variables/v001, authors: Author name, language: fr, format: jpg, deprecated: false, keywords: keywords} # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_document_with_http_info(description, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str description: File description with metadata (required)
:param str authorization: Authentication token (required)
:param file file: file
:param str accept_language: Request accepted language
:return: ObjectUriResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['description', 'file', ] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_document" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'description' is set
if ('description' not in params or
params['description'] is None):
raise ValueError("Missing the required parameter `description` when calling `create_document`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
#if 'authorization' in params:
# header_params['Authorization'] = params['authorization'] # noqa: E501
#if 'accept_language' in params:
# header_params['Accept-Language'] = params['accept_language'] # noqa: E501
form_params = []
local_var_files = {}
if 'description' in params:
form_params.append(('description', params['description'])) # noqa: E501
if 'file' in params:
local_var_files['file'] = params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/core/documents', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ObjectUriResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_document(self, uri, **kwargs): # noqa: E501
"""Delete a document # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_document(uri, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str uri: Document URI (required)
:param str authorization: Authentication token (required)
:param str accept_language: Request accepted language
:return: ObjectUriResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_document_with_http_info(uri, **kwargs) # noqa: E501
else:
(data) = self.delete_document_with_http_info(uri, **kwargs) # noqa: E501
return data
    def delete_document_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Delete a document  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_document_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Document URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_document" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `delete_document`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['multipart/form-data'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/documents/{uri}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_document_file(self, uri, **kwargs):  # noqa: E501
        """Get document  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_document_file(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Document URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_document_file_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_document_file_with_http_info(uri, **kwargs)  # noqa: E501
            return data
    def get_document_file_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get document  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_document_file_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Document URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_document_file" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_document_file`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/octet-stream'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/documents/{uri}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_document_metadata(self, uri, **kwargs):  # noqa: E501
        """Get document's description  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_document_metadata(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Document URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DocumentGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_document_metadata_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_document_metadata_with_http_info(uri, **kwargs)  # noqa: E501
            return data
    def get_document_metadata_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get document's description  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_document_metadata_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Document URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DocumentGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_document_metadata" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_document_metadata`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/documents/{uri}/description', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DocumentGetDTO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def search_documents(self, **kwargs):  # noqa: E501
        """Search documents  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.search_documents(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str rdf_type: Search by type
        :param str title: Regex pattern for filtering list by title
        :param str _date: Regex pattern for filtering list by date
        :param str targets: Search by targets
        :param str authors: Regex pattern for filtering list by author
        :param str keyword: Regex pattern for filtering list by keyword
        :param str multiple: Regex pattern for filtering list by keyword or title
        :param str deprecated: Search deprecated file
        :param list[str] order_by: List of fields to sort as an array of fieldTitle=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[DocumentGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.search_documents_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.search_documents_with_http_info(**kwargs)  # noqa: E501
            return data
    def search_documents_with_http_info(self, **kwargs):  # noqa: E501
        """Search documents  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.search_documents_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str rdf_type: Search by type
        :param str title: Regex pattern for filtering list by title
        :param str _date: Regex pattern for filtering list by date
        :param str targets: Search by targets
        :param str authors: Regex pattern for filtering list by author
        :param str keyword: Regex pattern for filtering list by keyword
        :param str multiple: Regex pattern for filtering list by keyword or title
        :param str deprecated: Search deprecated file
        :param list[str] order_by: List of fields to sort as an array of fieldTitle=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[DocumentGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['rdf_type', 'title', '_date', 'targets', 'authors', 'keyword', 'multiple', 'deprecated', 'order_by', 'page', 'page_size', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method search_documents" % key
                )
            params[key] = val
        del params['kwargs']

        if 'page' in params and params['page'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `page` when calling `search_documents`, must be a value greater than or equal to `0`")  # noqa: E501
        if 'page_size' in params and params['page_size'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `page_size` when calling `search_documents`, must be a value greater than or equal to `0`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'rdf_type' in params:
            query_params.append(('rdf_type', params['rdf_type']))  # noqa: E501
        if 'title' in params:
            query_params.append(('title', params['title']))  # noqa: E501
        if '_date' in params:
            query_params.append(('date', params['_date']))  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
        if 'authors' in params:
            query_params.append(('authors', params['authors']))  # noqa: E501
        if 'keyword' in params:
            query_params.append(('keyword', params['keyword']))  # noqa: E501
        if 'multiple' in params:
            query_params.append(('multiple', params['multiple']))  # noqa: E501
        if 'deprecated' in params:
            query_params.append(('deprecated', params['deprecated']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
            collection_formats['order_by'] = 'multi'  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'page_size' in params:
            query_params.append(('pageSize', params['page_size']))  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['multipart/form-data'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/documents', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[DocumentGetDTO]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_document(self, description, **kwargs):  # noqa: E501
        """Update document's description  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_document(description, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str description: description (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_document_with_http_info(description, **kwargs)  # noqa: E501
        else:
            (data) = self.update_document_with_http_info(description, **kwargs)  # noqa: E501
            return data
    def update_document_with_http_info(self, description, **kwargs):  # noqa: E501
        """Update document's description  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_document_with_http_info(description, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str description: description (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['description', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_document" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'description' is set
        if ('description' not in params or
                params['description'] is None):
            raise ValueError("Missing the required parameter `description` when calling `update_document`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}
        if 'description' in params:
            form_params.append(('description', params['description']))  # noqa: E501

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['multipart/form-data'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/documents', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
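Every generated wrapper above follows the same shape: whitelist the keyword arguments against `all_params`, check the required parameters, then dispatch to `api_client.call_api`. A minimal self-contained sketch of that validation pattern (the function name and its return value are illustrative, not part of the generated client):

```python
def delete_document_sketch(uri, **kwargs):
    """Mimic the kwargs whitelist and required-parameter checks used above."""
    all_params = ['uri', 'async_req', '_return_http_data_only',
                  '_preload_content', '_request_timeout']
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method delete_document" % key
            )
    if uri is None:
        raise ValueError("Missing the required parameter `uri` when calling `delete_document`")
    # A real client would now build path/header params and issue the request.
    return {'resource': '/core/documents/{uri}'.format(uri=uri), 'method': 'DELETE'}


print(delete_document_sketch('doc-42'))
# {'resource': '/core/documents/doc-42', 'method': 'DELETE'}
```

Because unknown kwargs raise `TypeError` eagerly, a typo such as `asynch_req=True` fails fast instead of being silently ignored.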
# NSEDataMining/PE_Earning_Analysis.py
# (repo: m4ni5h/PythonScripts, licence: MIT)

from nsepy import get_history
from nsepy import get_index_pe_history
from datetime import date
import pandas as pd


def indexhistory(indexsymbol):
    # Fetch the index close on the last trading day of each financial year.
    index_history = get_history(symbol=indexsymbol, start=date(2009, 3, 31), end=date(2009, 3, 31), index=True)
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2010, 3, 31), end=date(2010, 3, 31), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2011, 3, 31), end=date(2011, 3, 31), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2012, 3, 30), end=date(2012, 3, 30), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2013, 3, 28), end=date(2013, 3, 28), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2014, 3, 31), end=date(2014, 3, 31), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2015, 3, 31), end=date(2015, 3, 31), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2016, 3, 31), end=date(2016, 3, 31), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2017, 3, 31), end=date(2017, 3, 31), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2018, 3, 28), end=date(2018, 3, 28), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2019, 3, 29), end=date(2019, 3, 29), index=True))
    index_history = index_history.append(get_history(symbol=indexsymbol, start=date(2020, 3, 31), end=date(2020, 3, 31), index=True))
    print(index_history)
    return index_history


def PEhistory(indexsymbol):
    # Fetch the index P/E on the same year-end dates.
    pe_history = get_index_pe_history(symbol=indexsymbol, start=date(2009, 3, 31), end=date(2009, 3, 31))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2010, 3, 31), end=date(2010, 3, 31)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2011, 3, 31), end=date(2011, 3, 31)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2012, 3, 30), end=date(2012, 3, 30)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2013, 3, 28), end=date(2013, 3, 28)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2014, 3, 31), end=date(2014, 3, 31)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2015, 3, 31), end=date(2015, 3, 31)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2016, 3, 31), end=date(2016, 3, 31)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2017, 3, 31), end=date(2017, 3, 31)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2018, 3, 28), end=date(2018, 3, 28)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2019, 3, 29), end=date(2019, 3, 29)))
    pe_history = pe_history.append(get_index_pe_history(symbol=indexsymbol, start=date(2020, 3, 31), end=date(2020, 3, 31)))
    print(pe_history)
    return pe_history


pe_history = PEhistory("NIFTY ENERGY")
index_history = indexhistory("NIFTY ENERGY")

pe_analysis = pd.merge(pe_history, index_history, on='Date')
earnings = (pe_analysis['Close'] / pe_analysis['P/E']).rename("Earnings")
earnings = pd.DataFrame(earnings)
pe_analysis = pd.merge(pe_analysis, earnings, on='Date')
pe_analysis.to_csv("NIFTY ENERGY_peanalysis.csv")
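The implied-earnings step above is simply Close divided by P/E after aligning the two frames on Date; a toy pandas sketch of that calculation (made-up numbers, not real NIFTY ENERGY data):

```python
import pandas as pd

# Two small frames standing in for the nsepy P/E and price histories.
pe = pd.DataFrame({'Date': ['2019-03-29', '2020-03-31'], 'P/E': [10.0, 8.0]})
prices = pd.DataFrame({'Date': ['2019-03-29', '2020-03-31'], 'Close': [15000.0, 12000.0]})

# Align on Date, then back out the implied index earnings.
analysis = pd.merge(pe, prices, on='Date')
analysis['Earnings'] = analysis['Close'] / analysis['P/E']
print(analysis['Earnings'].tolist())  # [1500.0, 1500.0]
```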
# {{cookiecutter.project_name}}/server/src/common/exceptions.py
# (repo: gradam/full-docker-django-cookiecutter, licence: MIT)

class {{cookiecutter.project_name_camel_case}}Exception(Exception):
    pass


class OwnerError({{cookiecutter.project_name_camel_case}}Exception):
    pass
# miwell-flask-app/tests/functional_tests/page_objects/main_page_objects/patient_register_page_object.py
# (repo: joshuahigginson1/DevOps-Assessment-1, licence: MIT)

# Contains the objects found on our patient register page.
# Imports -------------------------------------------------------------------------------------------------

from tests.functional_tests.page_objects.common_page_objects import CommonPageObject, PatientNavBar

# Page Objects --------------------------------------------------------------------------------------------


class PatientRegisterPageObject(CommonPageObject, PatientNavBar):

    # Default Page Variables.
    username = 'DefaultPatient'
    user_email = 'Email@gmail.com'
    user_password = 'Default Password'
    user_first_name = 'Default'
    user_last_name = 'Patient'
    user_phone_number = '07777777777'
    user_postcode = 'L1 1AA'
    user_medical_conditions = 'Robotitis. I require constant love and attention. Prone to flashing.'
    def get_username_field(self):  # A function to return the attributes of our username register field.
        get_field_element = self.client.find_element_by_xpath('//*[@id="username"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[1]/label')

        username_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return username_field_attributes

    def type_in_username_form(self, input_to_type=username):  # A function to type text into our username form box.
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_username_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")

    def get_email_field(self):  # A function to return the attributes of our email field.
        get_field_element = self.client.find_element_by_xpath('//*[@id="email"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[2]/label')

        email_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return email_field_attributes

    def type_in_email_form(self, input_to_type=user_email):  # A function to type text into our email form box.
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_email_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")

    def get_new_password_field(self):  # A function to return the attributes of our new password field.
        get_field_element = self.client.find_element_by_xpath('//*[@id="password"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[3]/label')

        new_password_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return new_password_field_attributes

    def type_in_new_password_form(self,
                                  input_to_type=user_password):  # A function to type text into our password form box.
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_new_password_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")

    def get_confirm_password_field(self):  # A function to return the attributes of our confirm password field.
        get_field_element = self.client.find_element_by_xpath('//*[@id="confirm_password"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[4]/label')

        confirm_password_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return confirm_password_field_attributes

    def type_in_confirm_password_form(self, input_to_type=user_password):
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_confirm_password_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")
    def get_first_name_field(self):
        get_field_element = self.client.find_element_by_xpath('//*[@id="first_name"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[5]/label')

        first_name_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return first_name_field_attributes

    def type_in_first_name_form(self, input_to_type=user_first_name):
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_first_name_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")

    def get_last_name_field(self):
        get_field_element = self.client.find_element_by_xpath('//*[@id="last_name"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[6]/label')

        last_name_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return last_name_field_attributes

    def type_in_last_name_form(self, input_to_type=user_last_name):
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_last_name_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")

    def get_phone_number_field(self):
        get_field_element = self.client.find_element_by_xpath('//*[@id="phone_number"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[7]/label')

        phone_number_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return phone_number_field_attributes

    def type_in_phone_number_form(self, input_to_type=user_phone_number):
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_phone_number_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")

    def get_postcode_field(self):
        get_field_element = self.client.find_element_by_xpath('//*[@id="postcode"]')
        get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[8]/label')

        postcode_field_attributes = {
            'field element': get_field_element,
            'label name': get_label_element.get_attribute('innerHTML'),
            'label element': get_label_element
        }
        return postcode_field_attributes

    def type_in_postcode_form(self, input_to_type=user_postcode):
        # Retrieve our form attributes.
        get_field_attributes = PatientRegisterPageObject.get_postcode_field(self)
        get_field_element = get_field_attributes['field element']
        get_field_label = get_field_attributes['label name']

        # After retrieving the field element, simulate typing into a form box.
        get_field_element.send_keys(input_to_type)
        print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")
def get_medical_conditions_field(self):
get_field_element = self.client.find_element_by_xpath('//*[@id="medical_conditions"]')
get_label_element = self.client.find_element_by_xpath('/html/body/div[2]/form/div[9]/label')
medical_conditions_field_attributes = {
'field element': get_field_element,
'label name': get_label_element.get_attribute('innerHTML'),
'label element': get_label_element
}
return medical_conditions_field_attributes
def type_in_medical_conditions_form(self, input_to_type=user_medical_conditions):
# Retrieve our form attributes.
get_field_attributes = PatientRegisterPageObject.get_medical_conditions_field(self)
get_field_element = get_field_attributes['field element']
get_field_label = get_field_attributes['label name']
# After retrieving the field element, simulate typing into a form box.
get_field_element.send_keys(input_to_type)
print(f"Running Simulation: Currently typing '{input_to_type}' in the {get_field_label} field.")
def get_submit_button(self):
get_button_element = self.client.find_element_by_xpath('//*[@id="submit"]')
submit_button_attributes = {
'button label': get_button_element.get_attribute('innerHTML'),
'button element': get_button_element
}
return submit_button_attributes
def click_submit_button(self):
get_submit_button_element = self.get_submit_button()['button element']
get_submit_button_element.click()
def get_already_registered_button(self):
get_button_element = self.client.find_element_by_xpath('/html/body/a[1]')
already_registered_button_attributes = {
'button label': get_button_element.get_attribute('innerHTML'),
'button element': get_button_element
}
return already_registered_button_attributes
def click_already_registered_button(self):
get_already_registered_button_element = self.get_already_registered_button()['button element']
get_already_registered_button_element.click()
def get_register_as_psych_button(self):
get_button_element = self.client.find_element_by_xpath('/html/body/a[2]')
register_as_psychiatrist_button_attributes = {
'button label': get_button_element.get_attribute('innerHTML'),
'button element': get_button_element
}
return register_as_psychiatrist_button_attributes
def click_register_as_psychiatrist_button(self):
get_register_as_psych_button_element = self.get_register_as_psych_button()['button element']
get_register_as_psych_button_element.click()
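# Hypothetical usage sketch (the driver setup and the sample phone number below
# are assumptions, not taken from this excerpt): the page object wraps a
# Selenium client, so a registration test might drive it like this:
#     page = PatientRegisterPageObject(client)
#     page.type_in_phone_number_form('01234 567890')
#     page.click_submit_button()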
# --- setup_python_package/utils/get_default_package_name.py (LucaCappelletti94/setup_python_package, MIT) ---
from .load_repository import load_repository_name
from .normalize_package_name import normalize_package_name_for_code


def get_default_package_name() -> str:
    """Return default package name based on repo name."""
    return normalize_package_name_for_code(load_repository_name())
# --- telingo/tests/scheduler_test.py (DerHunger/telingo, MIT) ---
import unittest
import sys
import copy

import scheduler as _sd


class TestCase(unittest.TestCase):

    def assertRaisesRegex(self, *args, **kwargs):
        return (self.assertRaisesRegexp(*args, **kwargs)
                if sys.version_info[0] < 3
                else unittest.TestCase.assertRaisesRegex(self, *args, **kwargs))


class SolveResult():
    """mock clingo SolveResult. """

    def __init__(self):
        self.unknown = False
        self.satisfiable = False
        self.unsatisfiable = False

    def set_unknown(self):
        """ set SolveResult to unknown. """
        self.unknown = True
        self.satisfiable = False
        self.unsatisfiable = False

    def set_satisfiable(self):
        """ set SolveResult to SAT. """
        self.unknown = False
        self.satisfiable = True
        self.unsatisfiable = False

    def set_unsatisfiable(self):
        """ set SolveResult to UNSAT. """
        self.unknown = False
        self.satisfiable = False
        self.unsatisfiable = True

    def __str__(self):
        ret = "error"
        if self.unknown:
            ret = "UNKNOWN"
        elif self.satisfiable:
            ret = "SAT"
        elif self.unsatisfiable:
            ret = "UNSAT"
        return ret


def string_to_result(s="NONE"):
    """ convert a string to a SolveResult. """
    ret = SolveResult()
    if s.upper() == "SAT":
        ret.set_satisfiable()
    elif s.upper() == "UNSAT":
        ret.set_unsatisfiable()
    elif s.upper() == "UKN" or s.upper() == "UNKNOWN":
        ret.set_unknown()
    elif s.upper() == "NONE":
        ret = None
    else:
        sys.stdout.write("wrong result string given\n")
    return ret


def list_exp(s):
    """ expand array of strings by multiplying the strings with the leading number. """
    ret = []
    i = 0
    while i < len(s):
        for j in range(0, s[i]):
            ret.append(s[i + 1])
        i += 2
    return ret


def schedule(scheduler, results, repeat=True, imax=10):
    """ mock schedule part of the smain function. """
    scheduler = copy.deepcopy(scheduler)
    iteration = 1
    ret = []
    n = scheduler.next(string_to_result("None"))
    if n is None:
        return ret
    ret.append(n)
    while True:
        for r in results:
            n = scheduler.next(string_to_result(r))
            if iteration >= imax or n is None:
                repeat = False
                break
            ret.append(n)
            iteration += 1
        if not repeat:
            break
    return ret
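# A quick sanity check of list_exp's run-length expansion (illustrative values
# only, not part of the original suite): the leading count repeats the string
# that follows it, so
#     list_exp([3, "UKN", 1, "UNSAT"]) == ["UKN", "UKN", "UKN", "UNSAT"]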
class TestSchedulerA(TestCase):
    """ class containing all tests for scheduler A. """

    def test_A_result(self):
        """ test for result parameter. """
        start, inc, limit, size, propagate_unsat, verbose = 0, 5, 30, 4, True, 0
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 5, 10, 15, 0, 5, 10, 15, 0, 5])
        self.assertEqual(schedule(scheduler, ["NONE"]), [0, 5, 10, 15])

    def test_A_start_inc_limit(self):
        """ tests for start, inc and limit parameters. """
        size, propagate_unsat, verbose = 4, True, 0

        start, inc, limit = 0, 5, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 5, 10, 15, 0, 5, 10, 15, 0, 5])

        start, inc, limit = 30, 5, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [30, 30, 30, 30, 30, 30, 30, 30, 30, 30])

        start, inc, limit = 25, 5, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [25, 30, 25, 30, 25, 30, 25, 30, 25, 30])

        start, inc, limit = -5, 5, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 35, 5, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 5, 0
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

        start, inc, limit = 0, 5, 5
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 5, 0, 5, 0, 5, 0, 5, 0, 5])

        start, inc, limit = 0, 5, -5
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 0, 5
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 11, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 11, 22])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 11, 22])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 11, 22, 0, 11, 22, 0, 11, 22, 0])

        start, inc, limit = 0, -11, 30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, -11, -30
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

    def test_A_size(self):
        """ test for the size parameter. """
        start, inc, propagate_unsat, verbose = 0, 5, True, 0

        limit, size = 30, 4
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 5, 10, 15, 0, 5, 10, 15, 0, 5])

        limit, size = 30, 0
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        limit, size = 30, -4
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        limit, size = 10, 4
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 5, 10, 0, 5, 10, 0, 5, 10, 0])

    def test_A_propagate_unsat(self):
        """ tests for the propagate_unsat parameter. """
        start, inc, limit, size, verbose = 0, 1, 30, 4, 0

        propagate_unsat = True
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UNSAT"]), [0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
        self.assertEqual(schedule(scheduler, list_exp([3, "UKN", 1, "UNSAT", 9, "UKN"]), imax=13),
                         [0, 1, 2, 3, 4, 5, 6, 7, 4, 5, 6, 7, 4])
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"]), imax=17),
                         [0, 1, 2, 3, 0, 1, 2, 3, 4, 5, 2, 3, 4, 5, 6, 7, 4])

        propagate_unsat = False
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UNSAT"]), [0, 1, 2, 3, 0, 4, 2, 5, 0, 6])
        self.assertEqual(schedule(scheduler, list_exp([3, "UKN", 1, "UNSAT", 9, "UKN"]), imax=13),
                         [0, 1, 2, 3, 0, 1, 2, 4, 0, 1, 2, 4, 0])

        start, inc, limit, size, verbose = 0, 5, 30, 4, 0
        propagate_unsat = True
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, list_exp([3, "UKN", 1, "UNSAT", 8, "UKN"])),
                         [0, 5, 10, 15, 20, 25, 30, 20, 25, 30])

        start, inc, limit, size, verbose = 0, 5, 10, 4, 0
        propagate_unsat = True
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UKN", "UNSAT"]), [0, 5, 10])

        propagate_unsat = False
        scheduler = _sd.A_Scheduler(start, inc, limit, size, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, list_exp([2, "UKN", 1, "UNSAT", 7, "UKN"])),
                         [0, 5, 10, 0, 5, 0, 5, 0, 5, 0])
        self.assertEqual(schedule(scheduler, ["UKN", "UKN", "UNSAT"]), [0, 5, 10, 0, 5, 0, 5, 5, 5])
class TestSchedulerB(TestCase):
    """ class containing all tests for scheduler B. """

    def test_B_result(self):
        """ test for result parameter. """
        start, inc, limit, processes, propagate_unsat, gamma, verbose = 0, 5, 30, 5, True, 0.5, 0
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 5, 0, 5, 10, 0, 5, 10, 0, 15, 0, 5, 15, 0])
        self.assertEqual(schedule(scheduler, ["NONE"]), [0])

    def test_B_start_inc_limit(self):
        """ test for start, inc and limit parameters. """
        processes, propagate_unsat, gamma, verbose = 4, True, 0.5, 0

        start, inc, limit = 0, 5, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 5, 0, 5, 10, 0, 5, 10, 0])

        start, inc, limit = 30, 5, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [30, 30, 30, 30, 30, 30, 30, 30, 30, 30])

        start, inc, limit = 25, 5, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [25, 25, 30, 25, 30, 25, 30, 25, 25, 30])

        start, inc, limit = -5, 5, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 35, 5, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 5, 0
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

        start, inc, limit = 0, 5, 5
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 5, 0, 5, 0, 5, 0, 0, 5])

        start, inc, limit = 0, 5, -5
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 0, 5
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 11, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 11, 22])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 11, 22])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 11, 0, 11, 22, 0, 11, 22, 0])

        start, inc, limit = 0, -11, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, -11, -30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

    def test_B_processes(self):
        """ tests for processes parameter. """
        start, inc, propagate_unsat, gamma, verbose = 0, 5, True, 0.5, 0

        processes, limit = 4, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10, 15, 20, 25, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 5, 0, 5, 10, 0, 5, 10, 0])

        processes, limit = 0, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        processes, limit = -4, 30
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        processes, limit = 4, 10
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 5, 10])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 5, 10])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 5, 0, 5, 10, 0, 5, 10, 0, 0, 5, 0, 10, 0])

    def test_B_gamma(self):
        """ tests for gamma parameter. """
        start, inc, limit, processes, propagate_unsat, verbose = 0, 5, 30, 5, True, 0

        gamma = -2
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

        gamma = -0.5
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

        gamma = 0
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

        gamma = 0.1
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 0, 0, 0, 0, 5, 0, 5, 0, 0, 0, 0, 0, 0])

        gamma = 0.5
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 5, 0, 5, 10, 0, 5, 10, 0, 15, 0, 5, 15, 0])

        gamma = 0.25
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 0, 0, 5, 0, 5, 0, 0, 0, 5, 0, 0, 10, 0, 10])

        gamma = 0.75
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 5, 10, 0, 5, 10, 15, 20, 0, 5, 10, 15, 20, 0, 5])

        gamma = 1
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 5, 10, 15, 20, 0, 5, 10, 15, 20, 0, 5, 10, 15, 20])

        processes, limit = 10, 100
        gamma = 1
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 0, 5, 10, 15, 20])

        gamma = 2
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UNKNOWN"], imax=15),
                         [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 0, 5, 10, 15, 20])

    def test_B_propagate_unsat(self):
        """ tests for propagate_unsat parameter. """
        start, inc, limit, processes, gamma, verbose = 0, 1, 30, 4, 0.5, 0

        propagate_unsat = True
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UNSAT"]), [0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
        self.assertEqual(schedule(scheduler, ["UKN", "UKN", "UNSAT"]), [0, 0, 1, 2, 2, 3, 4, 4, 5, 6])
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT", 6, "UKN"]), imax=12),
                         [0, 0, 1, 0, 1, 2, 3, 3, 4, 3, 4, 5])
        self.assertEqual(schedule(scheduler, list_exp([11, "UKN", 1, "UNSAT", 8, "UKN"]), imax=20),
                         [0, 0, 1, 0, 1, 2, 0, 1, 2, 0, 3, 0, 1, 3, 1, 2, 4, 1, 2, 4])

        propagate_unsat = False
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UNSAT"]), [0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
        self.assertEqual(schedule(scheduler, ["UKN", "UKN", "UNSAT"]), [0, 0, 1, 0, 2, 0, 2, 2, 3, 4])
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT", 6, "UKN"]), imax=12),
                         [0, 0, 1, 0, 1, 2, 0, 1, 0, 3, 0, 1])
        self.assertEqual(schedule(scheduler, list_exp([11, "UKN", 1, "UNSAT", 8, "UKN"]), imax=20),
                         [0, 0, 1, 0, 1, 2, 0, 1, 2, 0, 3, 0, 1, 3, 1, 2, 4, 1, 2, 4])

        inc, limit = 5, 10
        propagate_unsat = True
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"])), [0, 0, 5, 0, 5, 10])

        propagate_unsat = False
        scheduler = _sd.B_Scheduler(start, inc, limit, processes, propagate_unsat, gamma, verbose)
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"])),
                         [0, 0, 5, 0, 5, 10, 0, 5, 0, 0])
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 2, "UNSAT"])),
                         [0, 0, 5, 0, 5, 10, 0, 5, 5, 5])
class TestSchedulerC(TestCase):
    """ class containing all tests for scheduler C. """

    def test_C_result(self):
        """ tests for result parameter. """
        start, inc, limit, propagate_unsat, verbose = 0, 1.5, 30, True, 0
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 1, 2, 3, 4, 6, 10, 15, 22])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 1, 2, 3, 4, 6, 10, 15, 22])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 1, 0, 2, 1, 3, 0, 4, 2, 6])
        self.assertEqual(schedule(scheduler, ["NONE"]), [0])

    def test_C_start_inc_limit(self):
        """ tests for start, inc and limit parameters. """
        propagate_unsat, verbose = True, 0

        start, inc, limit = 0, 1.5, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 1, 2, 3, 4, 6, 10, 15, 22])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 1, 2, 3, 4, 6, 10, 15, 22])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 1, 0, 2, 1, 3, 0, 4, 2, 6])

        start, inc, limit = 4, 1.5, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [4, 6, 9, 13, 20, 30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [4, 6, 9, 13, 20, 30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [4, 6, 4, 9, 6, 13, 4, 20, 9, 30])

        start, inc, limit = 30, 1.5, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [30])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [30])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [30, 30, 30, 30, 30, 30, 30, 30, 30, 30])

        start, inc, limit = -5, 1.5, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 35, 1.5, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 1.5, 0
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

        start, inc, limit = 0, 1.5, 5
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 1, 2, 3, 4])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 1, 2, 3, 4])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 1, 0, 2, 1, 3, 0, 4, 2, 1])

        start, inc, limit = 0, 1.5, 1
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 1])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 1])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 1, 0, 1, 0, 1, 0, 1, 0, 1])

        start, inc, limit = 0, 1.5, -5
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 1, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 1, 0, 2, 1, 3, 0, 4, 2, 5])

        start, inc, limit = 4, 1, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [4, 5, 6, 7, 8, 9, 10, 11, 12, 13])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [4, 5, 6, 7, 8, 9, 10, 11, 12, 13])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [4, 5, 4, 6, 5, 7, 4, 8, 6, 9])

        start, inc, limit = 0, 11, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [0, 1, 11])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [0, 1, 11])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [0, 1, 0, 11, 1, 0, 11, 1, 0, 11])

        start, inc, limit = 0, 11, -30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, -11, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, -11, -30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 0.5, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

        start, inc, limit = 0, 0, 30
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["SAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNSAT"]), [])
        self.assertEqual(schedule(scheduler, ["UNKNOWN"]), [])

    def test_C_propagate_unsat(self):
        """ tests for propagate_unsat parameter. """
        start, inc, limit, verbose = 0, 1.5, 30, 0

        propagate_unsat = True
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UNSAT"], imax=11),
                         [0, 1, 2, 3, 4, 6, 10, 15, 22, 22])
        self.assertEqual(schedule(scheduler, ["UKN", "UKN", "UNSAT"]),
                         [0, 1, 0, 2, 1, 3, 4, 6, 10, 15])
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"]), imax=12),
                         [0, 1, 0, 2, 1, 3, 4, 6, 10, 15, 4, 22])

        propagate_unsat = False
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, ["UKN", "UNSAT"], imax=11),
                         [0, 1, 0, 2, 3, 0, 4, 6, 3, 10, 15])
        self.assertEqual(schedule(scheduler, ["UKN", "UKN", "UNSAT"]),
                         [0, 1, 0, 2, 1, 3, 4, 2, 6, 1])
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"]), imax=12),
                         [0, 1, 0, 2, 1, 3, 0, 4, 2, 6, 1, 10])

        start, inc, limit, verbose = 0, 1.5, 3, 0
        propagate_unsat = True
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"])),
                         [0, 1, 0, 2, 1, 3])

        propagate_unsat = False
        scheduler = _sd.C_Scheduler(start, inc, limit, propagate_unsat, verbose)
        self.assertEqual(schedule(scheduler, list_exp([5, "UKN", 1, "UNSAT"])),
                         [0, 1, 0, 2, 1, 3, 0, 2, 1, 0])
class TestSchedulerConfig(TestCase):
    """ class containing all tests for scheduler config. """

    def test_build(self):
        """ tests for building a scheduler. """
        config = _sd.Scheduler_Config()
        self.assertEqual(config.single_scheduler(), False)
        config.A = 5
        self.assertEqual(config.single_scheduler(), True)
        config.B = 0.5
        with self.assertRaises(Exception):
            config.single_scheduler()
        config.A = None
        self.assertEqual(config.single_scheduler(), True)
        config.C = 1
        with self.assertRaises(Exception):
            config.single_scheduler()
        config.B = None
        self.assertEqual(config.single_scheduler(), True)
        config.A = 5
        with self.assertRaises(Exception):
            config.single_scheduler()
        config.B = 0.5
        with self.assertRaises(Exception):
            config.single_scheduler()


if __name__ == '__main__':
    unittest.main()
# --- skia/skia.gyp (1065672644894730302/Chromium, BSD-3-Clause) ---
# Copyright (c) 2012 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
{
'targets': [
{
'target_name': 'skia',
'type': '<(component)',
'variables': {
'optimize': 'max',
},
'sources': [
#'../third_party/skia/src/animator/SkAnimate.h',
#'../third_party/skia/src/animator/SkAnimateActive.cpp',
#'../third_party/skia/src/animator/SkAnimateActive.h',
#'../third_party/skia/src/animator/SkAnimateBase.cpp',
#'../third_party/skia/src/animator/SkAnimateBase.h',
#'../third_party/skia/src/animator/SkAnimateField.cpp',
#'../third_party/skia/src/animator/SkAnimateMaker.cpp',
#'../third_party/skia/src/animator/SkAnimateMaker.h',
#'../third_party/skia/src/animator/SkAnimateProperties.h',
#'../third_party/skia/src/animator/SkAnimateSchema.xsd',
#'../third_party/skia/src/animator/SkAnimateSchema.xsx',
#'../third_party/skia/src/animator/SkAnimateSet.cpp',
#'../third_party/skia/src/animator/SkAnimateSet.h',
#'../third_party/skia/src/animator/SkAnimator.cpp',
#'../third_party/skia/src/animator/SkAnimatorScript.cpp',
#'../third_party/skia/src/animator/SkAnimatorScript.h',
#'../third_party/skia/src/animator/SkAnimatorScript2.cpp',
#'../third_party/skia/src/animator/SkAnimatorScript2.h',
#'../third_party/skia/src/animator/SkBase64.cpp',
#'../third_party/skia/src/animator/SkBase64.h',
#'../third_party/skia/src/animator/SkBoundable.cpp',
#'../third_party/skia/src/animator/SkBoundable.h',
#'../third_party/skia/src/animator/SkBuildCondensedInfo.cpp',
#'../third_party/skia/src/animator/SkCondensedDebug.cpp',
#'../third_party/skia/src/animator/SkCondensedRelease.cpp',
#'../third_party/skia/src/animator/SkDisplayAdd.cpp',
#'../third_party/skia/src/animator/SkDisplayAdd.h',
#'../third_party/skia/src/animator/SkDisplayApply.cpp',
#'../third_party/skia/src/animator/SkDisplayApply.h',
#'../third_party/skia/src/animator/SkDisplayBounds.cpp',
#'../third_party/skia/src/animator/SkDisplayBounds.h',
#'../third_party/skia/src/animator/SkDisplayEvent.cpp',
#'../third_party/skia/src/animator/SkDisplayEvent.h',
#'../third_party/skia/src/animator/SkDisplayEvents.cpp',
#'../third_party/skia/src/animator/SkDisplayEvents.h',
#'../third_party/skia/src/animator/SkDisplayInclude.cpp',
#'../third_party/skia/src/animator/SkDisplayInclude.h',
#'../third_party/skia/src/animator/SkDisplayInput.cpp',
#'../third_party/skia/src/animator/SkDisplayInput.h',
#'../third_party/skia/src/animator/SkDisplayList.cpp',
#'../third_party/skia/src/animator/SkDisplayList.h',
#'../third_party/skia/src/animator/SkDisplayMath.cpp',
#'../third_party/skia/src/animator/SkDisplayMath.h',
#'../third_party/skia/src/animator/SkDisplayMovie.cpp',
#'../third_party/skia/src/animator/SkDisplayMovie.h',
#'../third_party/skia/src/animator/SkDisplayNumber.cpp',
#'../third_party/skia/src/animator/SkDisplayNumber.h',
#'../third_party/skia/src/animator/SkDisplayPost.cpp',
#'../third_party/skia/src/animator/SkDisplayPost.h',
#'../third_party/skia/src/animator/SkDisplayRandom.cpp',
#'../third_party/skia/src/animator/SkDisplayRandom.h',
#'../third_party/skia/src/animator/SkDisplayScreenplay.cpp',
#'../third_party/skia/src/animator/SkDisplayScreenplay.h',
#'../third_party/skia/src/animator/SkDisplayType.cpp',
#'../third_party/skia/src/animator/SkDisplayType.h',
#'../third_party/skia/src/animator/SkDisplayTypes.cpp',
#'../third_party/skia/src/animator/SkDisplayTypes.h',
#'../third_party/skia/src/animator/SkDisplayXMLParser.cpp',
#'../third_party/skia/src/animator/SkDisplayXMLParser.h',
#'../third_party/skia/src/animator/SkDisplayable.cpp',
#'../third_party/skia/src/animator/SkDisplayable.h',
#'../third_party/skia/src/animator/SkDraw3D.cpp',
#'../third_party/skia/src/animator/SkDraw3D.h',
#'../third_party/skia/src/animator/SkDrawBitmap.cpp',
#'../third_party/skia/src/animator/SkDrawBitmap.h',
#'../third_party/skia/src/animator/SkDrawBlur.cpp',
#'../third_party/skia/src/animator/SkDrawBlur.h',
#'../third_party/skia/src/animator/SkDrawClip.cpp',
#'../third_party/skia/src/animator/SkDrawClip.h',
#'../third_party/skia/src/animator/SkDrawColor.cpp',
#'../third_party/skia/src/animator/SkDrawColor.h',
#'../third_party/skia/src/animator/SkDrawDash.cpp',
#'../third_party/skia/src/animator/SkDrawDash.h',
#'../third_party/skia/src/animator/SkDrawDiscrete.cpp',
#'../third_party/skia/src/animator/SkDrawDiscrete.h',
#'../third_party/skia/src/animator/SkDrawEmboss.cpp',
#'../third_party/skia/src/animator/SkDrawEmboss.h',
#'../third_party/skia/src/animator/SkDrawExtraPathEffect.cpp',
#'../third_party/skia/src/animator/SkDrawFull.cpp',
#'../third_party/skia/src/animator/SkDrawFull.h',
#'../third_party/skia/src/animator/SkDrawGradient.cpp',
#'../third_party/skia/src/animator/SkDrawGradient.h',
#'../third_party/skia/src/animator/SkDrawGroup.cpp',
#'../third_party/skia/src/animator/SkDrawGroup.h',
#'../third_party/skia/src/animator/SkDrawLine.cpp',
#'../third_party/skia/src/animator/SkDrawLine.h',
#'../third_party/skia/src/animator/SkDrawMatrix.cpp',
#'../third_party/skia/src/animator/SkDrawMatrix.h',
#'../third_party/skia/src/animator/SkDrawOval.cpp',
#'../third_party/skia/src/animator/SkDrawOval.h',
#'../third_party/skia/src/animator/SkDrawPaint.cpp',
#'../third_party/skia/src/animator/SkDrawPaint.h',
#'../third_party/skia/src/animator/SkDrawPath.cpp',
#'../third_party/skia/src/animator/SkDrawPath.h',
#'../third_party/skia/src/animator/SkDrawPoint.cpp',
#'../third_party/skia/src/animator/SkDrawPoint.h',
#'../third_party/skia/src/animator/SkDrawRectangle.cpp',
#'../third_party/skia/src/animator/SkDrawRectangle.h',
#'../third_party/skia/src/animator/SkDrawSaveLayer.cpp',
#'../third_party/skia/src/animator/SkDrawSaveLayer.h',
#'../third_party/skia/src/animator/SkDrawShader.cpp',
#'../third_party/skia/src/animator/SkDrawShader.h',
#'../third_party/skia/src/animator/SkDrawText.cpp',
#'../third_party/skia/src/animator/SkDrawText.h',
#'../third_party/skia/src/animator/SkDrawTextBox.cpp',
#'../third_party/skia/src/animator/SkDrawTextBox.h',
#'../third_party/skia/src/animator/SkDrawTo.cpp',
#'../third_party/skia/src/animator/SkDrawTo.h',
#'../third_party/skia/src/animator/SkDrawTransparentShader.cpp',
#'../third_party/skia/src/animator/SkDrawTransparentShader.h',
#'../third_party/skia/src/animator/SkDrawable.cpp',
#'../third_party/skia/src/animator/SkDrawable.h',
#'../third_party/skia/src/animator/SkDump.cpp',
#'../third_party/skia/src/animator/SkDump.h',
#'../third_party/skia/src/animator/SkExtraPathEffects.xsd',
#'../third_party/skia/src/animator/SkExtras.h',
#'../third_party/skia/src/animator/SkGetCondensedInfo.cpp',
#'../third_party/skia/src/animator/SkHitClear.cpp',
#'../third_party/skia/src/animator/SkHitClear.h',
#'../third_party/skia/src/animator/SkHitTest.cpp',
#'../third_party/skia/src/animator/SkHitTest.h',
#'../third_party/skia/src/animator/SkIntArray.h',
#'../third_party/skia/src/animator/SkMatrixParts.cpp',
#'../third_party/skia/src/animator/SkMatrixParts.h',
#'../third_party/skia/src/animator/SkMemberInfo.cpp',
#'../third_party/skia/src/animator/SkMemberInfo.h',
#'../third_party/skia/src/animator/SkOpArray.cpp',
#'../third_party/skia/src/animator/SkOpArray.h',
#'../third_party/skia/src/animator/SkOperand.h',
#'../third_party/skia/src/animator/SkOperand2.h',
#'../third_party/skia/src/animator/SkOperandInterpolator.h',
#'../third_party/skia/src/animator/SkOperandIterpolator.cpp',
#'../third_party/skia/src/animator/SkPaintParts.cpp',
#'../third_party/skia/src/animator/SkPaintParts.h',
#'../third_party/skia/src/animator/SkParseSVGPath.cpp',
#'../third_party/skia/src/animator/SkPathParts.cpp',
#'../third_party/skia/src/animator/SkPathParts.h',
#'../third_party/skia/src/animator/SkPostParts.cpp',
#'../third_party/skia/src/animator/SkPostParts.h',
#'../third_party/skia/src/animator/SkScript.cpp',
#'../third_party/skia/src/animator/SkScript.h',
#'../third_party/skia/src/animator/SkScript2.h',
#'../third_party/skia/src/animator/SkScriptCallBack.h',
#'../third_party/skia/src/animator/SkScriptDecompile.cpp',
#'../third_party/skia/src/animator/SkScriptRuntime.cpp',
#'../third_party/skia/src/animator/SkScriptRuntime.h',
#'../third_party/skia/src/animator/SkScriptTokenizer.cpp',
#'../third_party/skia/src/animator/SkSnapshot.cpp',
#'../third_party/skia/src/animator/SkSnapshot.h',
#'../third_party/skia/src/animator/SkTDArray_Experimental.h',
#'../third_party/skia/src/animator/SkTextOnPath.cpp',
#'../third_party/skia/src/animator/SkTextOnPath.h',
#'../third_party/skia/src/animator/SkTextToPath.cpp',
#'../third_party/skia/src/animator/SkTextToPath.h',
#'../third_party/skia/src/animator/SkTypedArray.cpp',
#'../third_party/skia/src/animator/SkTypedArray.h',
#'../third_party/skia/src/animator/SkXMLAnimatorWriter.cpp',
#'../third_party/skia/src/animator/SkXMLAnimatorWriter.h',
'../third_party/skia/src/animator/SkTime.cpp',
'../third_party/skia/src/core/ARGB32_Clamp_Bilinear_BitmapShader.h',
'../third_party/skia/src/core/Sk64.cpp',
'../third_party/skia/src/core/SkAAClip.cpp',
'../third_party/skia/src/core/SkAdvancedTypefaceMetrics.cpp',
'../third_party/skia/src/core/SkAlphaRuns.cpp',
'../third_party/skia/src/core/SkAntiRun.h',
'../third_party/skia/src/core/SkBitmap.cpp',
'../third_party/skia/src/core/SkBitmapProcShader.cpp',
'../third_party/skia/src/core/SkBitmapProcShader.h',
'../third_party/skia/src/core/SkBitmapProcState.cpp',
'../third_party/skia/src/core/SkBitmapProcState.h',
'../third_party/skia/src/core/SkBitmapProcState_matrix.h',
'../third_party/skia/src/core/SkBitmapProcState_matrixProcs.cpp',
'../third_party/skia/src/core/SkBitmapProcState_sample.h',
'../third_party/skia/src/core/SkBitmapSampler.cpp',
'../third_party/skia/src/core/SkBitmapSampler.h',
'../third_party/skia/src/core/SkBitmapSamplerTemplate.h',
'../third_party/skia/src/core/SkBitmapShader16BilerpTemplate.h',
'../third_party/skia/src/core/SkBitmapShaderTemplate.h',
'../third_party/skia/src/core/SkBitmap_scroll.cpp',
'../third_party/skia/src/core/SkBlitBWMaskTemplate.h',
'../third_party/skia/src/core/SkBlitMask_D32.cpp',
'../third_party/skia/src/core/SkBlitRow_D16.cpp',
'../third_party/skia/src/core/SkBlitRow_D32.cpp',
'../third_party/skia/src/core/SkBlitRow_D4444.cpp',
'../third_party/skia/src/core/SkBlitter.cpp',
'../third_party/skia/src/core/SkBlitter_4444.cpp',
'../third_party/skia/src/core/SkBlitter_A1.cpp',
'../third_party/skia/src/core/SkBlitter_A8.cpp',
'../third_party/skia/src/core/SkBlitter_ARGB32.cpp',
'../third_party/skia/src/core/SkBlitter_RGB16.cpp',
'../third_party/skia/src/core/SkBlitter_Sprite.cpp',
'../third_party/skia/src/core/SkBuffer.cpp',
'../third_party/skia/src/core/SkCanvas.cpp',
'../third_party/skia/src/core/SkChunkAlloc.cpp',
'../third_party/skia/src/core/SkClipStack.cpp',
'../third_party/skia/src/core/SkColor.cpp',
'../third_party/skia/src/core/SkColorFilter.cpp',
'../third_party/skia/src/core/SkColorTable.cpp',
'../third_party/skia/src/core/SkComposeShader.cpp',
'../third_party/skia/src/core/SkConcaveToTriangles.cpp',
'../third_party/skia/src/core/SkConcaveToTriangles.h',
'../third_party/skia/src/core/SkConfig8888.cpp',
'../third_party/skia/src/core/SkConfig8888.h',
'../third_party/skia/src/core/SkCordic.cpp',
'../third_party/skia/src/core/SkCordic.h',
'../third_party/skia/src/core/SkCoreBlitters.h',
'../third_party/skia/src/core/SkCubicClipper.cpp',
'../third_party/skia/src/core/SkCubicClipper.h',
'../third_party/skia/src/core/SkData.cpp',
'../third_party/skia/src/core/SkDebug.cpp',
#'../third_party/skia/src/core/SkDebug_stdio.cpp',
'../third_party/skia/src/core/SkDeque.cpp',
'../third_party/skia/src/core/SkDevice.cpp',
'../third_party/skia/src/core/SkDither.cpp',
'../third_party/skia/src/core/SkDraw.cpp',
'../third_party/skia/src/core/SkDrawProcs.h',
#'../third_party/skia/src/core/SkDrawing.cpp',
'../third_party/skia/src/core/SkEdgeBuilder.cpp',
'../third_party/skia/src/core/SkEdgeClipper.cpp',
'../third_party/skia/src/core/SkEdge.cpp',
'../third_party/skia/src/core/SkEdge.h',
'../third_party/skia/src/core/SkFP.h',
'../third_party/skia/src/core/SkFilterProc.cpp',
'../third_party/skia/src/core/SkFilterProc.h',
'../third_party/skia/src/core/SkFlate.cpp',
'../third_party/skia/src/core/SkFlattenable.cpp',
'../third_party/skia/src/core/SkFloat.cpp',
'../third_party/skia/src/core/SkFloat.h',
'../third_party/skia/src/core/SkFloatBits.cpp',
'../third_party/skia/src/core/SkFontHost.cpp',
'../third_party/skia/src/core/SkGeometry.cpp',
'../third_party/skia/src/core/SkGlyphCache.cpp',
'../third_party/skia/src/core/SkGlyphCache.h',
'../third_party/skia/src/core/SkGraphics.cpp',
'../third_party/skia/src/core/SkLineClipper.cpp',
'../third_party/skia/src/core/SkMMapStream.cpp',
'../third_party/skia/src/core/SkMallocPixelRef.cpp',
'../third_party/skia/src/core/SkMask.cpp',
'../third_party/skia/src/core/SkMaskFilter.cpp',
'../third_party/skia/src/core/SkMath.cpp',
'../third_party/skia/src/core/SkMatrix.cpp',
'../third_party/skia/src/core/SkMetaData.cpp',
'../third_party/skia/src/core/SkOrderedReadBuffer.cpp',
'../third_party/skia/src/core/SkOrderedWriteBuffer.cpp',
'../third_party/skia/src/core/SkPackBits.cpp',
'../third_party/skia/src/core/SkPaint.cpp',
'../third_party/skia/src/core/SkPath.cpp',
'../third_party/skia/src/core/SkPathEffect.cpp',
'../third_party/skia/src/core/SkPathHeap.cpp',
'../third_party/skia/src/core/SkPathHeap.h',
'../third_party/skia/src/core/SkPathMeasure.cpp',
'../third_party/skia/src/core/SkPicture.cpp',
'../third_party/skia/src/core/SkPictureFlat.cpp',
'../third_party/skia/src/core/SkPictureFlat.h',
'../third_party/skia/src/core/SkPicturePlayback.cpp',
'../third_party/skia/src/core/SkPicturePlayback.h',
'../third_party/skia/src/core/SkPictureRecord.cpp',
'../third_party/skia/src/core/SkPictureRecord.h',
'../third_party/skia/src/core/SkPixelRef.cpp',
'../third_party/skia/src/core/SkPoint.cpp',
'../third_party/skia/src/core/SkProcSpriteBlitter.cpp',
'../third_party/skia/src/core/SkPtrRecorder.cpp',
'../third_party/skia/src/core/SkQuadClipper.cpp',
'../third_party/skia/src/core/SkQuadClipper.h',
'../third_party/skia/src/core/SkRasterClip.cpp',
'../third_party/skia/src/core/SkRasterizer.cpp',
'../third_party/skia/src/core/SkRect.cpp',
'../third_party/skia/src/core/SkRefDict.cpp',
'../third_party/skia/src/core/SkRegion.cpp',
'../third_party/skia/src/core/SkRegionPriv.h',
'../third_party/skia/src/core/SkRegion_path.cpp',
'../third_party/skia/src/core/SkScalar.cpp',
'../third_party/skia/src/core/SkScalerContext.cpp',
'../third_party/skia/src/core/SkScan.cpp',
'../third_party/skia/src/core/SkScanPriv.h',
'../third_party/skia/src/core/SkScan_AntiPath.cpp',
'../third_party/skia/src/core/SkScan_Antihair.cpp',
'../third_party/skia/src/core/SkScan_Hairline.cpp',
'../third_party/skia/src/core/SkScan_Path.cpp',
'../third_party/skia/src/core/SkShader.cpp',
'../third_party/skia/src/core/SkShape.cpp',
'../third_party/skia/src/core/SkSpriteBlitter_ARGB32.cpp',
'../third_party/skia/src/core/SkSpriteBlitter_RGB16.cpp',
'../third_party/skia/src/core/SkSinTable.h',
'../third_party/skia/src/core/SkSpriteBlitter.h',
'../third_party/skia/src/core/SkSpriteBlitterTemplate.h',
'../third_party/skia/src/core/SkStream.cpp',
'../third_party/skia/src/core/SkString.cpp',
'../third_party/skia/src/core/SkStroke.cpp',
'../third_party/skia/src/core/SkStrokerPriv.cpp',
'../third_party/skia/src/core/SkStrokerPriv.h',
'../third_party/skia/src/core/SkTextFormatParams.h',
'../third_party/skia/src/core/SkTLS.cpp',
'../third_party/skia/src/core/SkTSearch.cpp',
'../third_party/skia/src/core/SkTSort.h',
'../third_party/skia/src/core/SkTemplatesPriv.h',
'../third_party/skia/src/core/SkTypeface.cpp',
'../third_party/skia/src/core/SkTypefaceCache.cpp',
'../third_party/skia/src/core/SkUnPreMultiply.cpp',
'../third_party/skia/src/core/SkUtils.cpp',
'../third_party/skia/src/core/SkWriter32.cpp',
'../third_party/skia/src/core/SkXfermode.cpp',
'../third_party/skia/src/effects/Sk1DPathEffect.cpp',
'../third_party/skia/src/effects/Sk2DPathEffect.cpp',
'../third_party/skia/src/effects/SkAvoidXfermode.cpp',
'../third_party/skia/src/effects/SkBitmapCache.cpp',
'../third_party/skia/src/effects/SkBitmapCache.h',
'../third_party/skia/src/effects/SkBlurDrawLooper.cpp',
'../third_party/skia/src/effects/SkBlurImageFilter.cpp',
'../third_party/skia/src/effects/SkBlurMask.cpp',
'../third_party/skia/src/effects/SkBlurMask.h',
'../third_party/skia/src/effects/SkBlurMaskFilter.cpp',
'../third_party/skia/src/effects/SkClampRange.cpp',
'../third_party/skia/src/effects/SkColorFilters.cpp',
'../third_party/skia/src/effects/SkColorMatrixFilter.cpp',
'../third_party/skia/src/effects/SkCornerPathEffect.cpp',
'../third_party/skia/src/effects/SkDashPathEffect.cpp',
'../third_party/skia/src/effects/SkDiscretePathEffect.cpp',
'../third_party/skia/src/effects/SkEmbossMask.cpp',
'../third_party/skia/src/effects/SkEmbossMask.h',
'../third_party/skia/src/effects/SkEmbossMask_Table.h',
'../third_party/skia/src/effects/SkEmbossMaskFilter.cpp',
'../third_party/skia/src/effects/SkGradientShader.cpp',
'../third_party/skia/src/effects/SkKernel33MaskFilter.cpp',
'../third_party/skia/src/effects/SkLayerDrawLooper.cpp',
'../third_party/skia/src/effects/SkLayerRasterizer.cpp',
'../third_party/skia/src/effects/SkMorphologyImageFilter.cpp',
'../third_party/skia/src/effects/SkPaintFlagsDrawFilter.cpp',
'../third_party/skia/src/effects/SkPorterDuff.cpp',
'../third_party/skia/src/effects/SkPixelXorXfermode.cpp',
'../third_party/skia/src/effects/SkRadialGradient_Table.h',
'../third_party/skia/src/effects/SkTableColorFilter.cpp',
'../third_party/skia/src/effects/SkTransparentShader.cpp',
'../third_party/skia/src/gpu/GrAAConvexPathRenderer.cpp',
'../third_party/skia/src/gpu/GrAAConvexPathRenderer.h',
'../third_party/skia/src/gpu/GrAAHairLinePathRenderer.cpp',
'../third_party/skia/src/gpu/GrAAHairLinePathRenderer.h',
'../third_party/skia/src/gpu/GrAddPathRenderers_default.cpp',
'../third_party/skia/src/gpu/GrAllocPool.cpp',
'../third_party/skia/src/gpu/GrAllocPool.h',
'../third_party/skia/src/gpu/GrAllocator.h',
'../third_party/skia/src/gpu/GrAtlas.cpp',
'../third_party/skia/src/gpu/GrAtlas.h',
'../third_party/skia/src/gpu/GrBatchedTextContext.cpp',
'../third_party/skia/src/gpu/GrBatchedTextContext.h',
'../third_party/skia/src/gpu/GrBinHashKey.h',
'../third_party/skia/src/gpu/GrBufferAllocPool.cpp',
'../third_party/skia/src/gpu/GrBufferAllocPool.h',
'../third_party/skia/src/gpu/GrClip.cpp',
'../third_party/skia/src/gpu/GrClipMaskManager.cpp',
'../third_party/skia/src/gpu/GrClipMaskManager.h',
'../third_party/skia/src/gpu/GrContext.cpp',
'../third_party/skia/src/gpu/GrCustomStage.cpp',
'../third_party/skia/src/gpu/GrDefaultPathRenderer.cpp',
'../third_party/skia/src/gpu/GrDefaultPathRenderer.h',
'../third_party/skia/src/gpu/GrDefaultTextContext.cpp',
'../third_party/skia/src/gpu/GrDefaultTextContext.h',
'../third_party/skia/src/gpu/GrDrawTarget.cpp',
'../third_party/skia/src/gpu/GrDrawTarget.h',
'../third_party/skia/src/gpu/GrGeometryBuffer.h',
'../third_party/skia/src/gpu/GrGpu.cpp',
'../third_party/skia/src/gpu/GrGpu.h',
'../third_party/skia/src/gpu/GrGpuFactory.cpp',
'../third_party/skia/src/gpu/GrInOrderDrawBuffer.cpp',
'../third_party/skia/src/gpu/GrInOrderDrawBuffer.h',
'../third_party/skia/src/gpu/GrIndexBuffer.h',
'../third_party/skia/src/gpu/GrMatrix.cpp',
'../third_party/skia/src/gpu/GrMemory.cpp',
'../third_party/skia/src/gpu/GrPathRenderer.cpp',
'../third_party/skia/src/gpu/GrPathRenderer.h',
'../third_party/skia/src/gpu/GrPathRendererChain.cpp',
'../third_party/skia/src/gpu/GrPathRendererChain.h',
'../third_party/skia/src/gpu/GrSoftwarePathRenderer.cpp',
'../third_party/skia/src/gpu/GrSoftwarePathRenderer.h',
'../third_party/skia/src/gpu/GrPathUtils.cpp',
'../third_party/skia/src/gpu/GrPlotMgr.h',
'../third_party/skia/src/gpu/GrRandom.h',
'../third_party/skia/src/gpu/GrRectanizer.h',
'../third_party/skia/src/gpu/GrRectanizer_fifo.cpp',
'../third_party/skia/src/gpu/GrRenderTarget.cpp',
'../third_party/skia/src/gpu/GrResource.cpp',
'../third_party/skia/src/gpu/GrResourceCache.cpp',
'../third_party/skia/src/gpu/GrResourceCache.h',
'../third_party/skia/src/gpu/GrStencil.cpp',
'../third_party/skia/src/gpu/GrStencil.h',
'../third_party/skia/src/gpu/GrStencilBuffer.cpp',
'../third_party/skia/src/gpu/GrStencilBuffer.h',
'../third_party/skia/src/gpu/GrStringBuilder.h',
'../third_party/skia/src/gpu/GrTBSearch.h',
'../third_party/skia/src/gpu/GrTDArray.h',
'../third_party/skia/src/gpu/GrTHashCache.h',
'../third_party/skia/src/gpu/GrTLList.h',
'../third_party/skia/src/gpu/GrTextStrike.cpp',
'../third_party/skia/src/gpu/GrTextStrike.h',
'../third_party/skia/src/gpu/GrTextStrike_impl.h',
'../third_party/skia/src/gpu/GrTexture.cpp',
'../third_party/skia/src/gpu/GrVertexBuffer.h',
'../third_party/skia/src/gpu/SkGpuCanvas.cpp',
'../third_party/skia/src/gpu/SkGpuDevice.cpp',
'../third_party/skia/src/gpu/SkGr.cpp',
'../third_party/skia/src/gpu/SkGrFontScaler.cpp',
'../third_party/skia/src/gpu/SkGrTexturePixelRef.cpp',
'../third_party/skia/src/gpu/effects/Gr1DKernelEffect.h',
'../third_party/skia/src/gpu/effects/GrConvolutionEffect.cpp',
'../third_party/skia/src/gpu/effects/GrConvolutionEffect.h',
'../third_party/skia/src/gpu/effects/GrMorphologyEffect.cpp',
'../third_party/skia/src/gpu/effects/GrMorphologyEffect.h',
'../third_party/skia/src/gpu/gl/GrGLCaps.cpp',
'../third_party/skia/src/gpu/gl/GrGLCaps.h',
'../third_party/skia/src/gpu/gl/GrGLContextInfo.cpp',
'../third_party/skia/src/gpu/gl/GrGLContextInfo.h',
'../third_party/skia/src/gpu/gl/GrGLCreateNativeInterface_none.cpp',
'../third_party/skia/src/gpu/gl/GrGLDefaultInterface_none.cpp',
'../third_party/skia/src/gpu/gl/GrGLDefines.h',
'../third_party/skia/src/gpu/gl/GrGLIRect.h',
'../third_party/skia/src/gpu/gl/GrGLIndexBuffer.cpp',
'../third_party/skia/src/gpu/gl/GrGLIndexBuffer.h',
'../third_party/skia/src/gpu/gl/GrGLInterface.cpp',
'../third_party/skia/src/gpu/gl/GrGLProgram.cpp',
'../third_party/skia/src/gpu/gl/GrGLProgram.h',
'../third_party/skia/src/gpu/gl/GrGLProgramStage.cpp',
'../third_party/skia/src/gpu/gl/GrGLProgramStage.h',
'../third_party/skia/src/gpu/gl/GrGLRenderTarget.cpp',
'../third_party/skia/src/gpu/gl/GrGLRenderTarget.h',
'../third_party/skia/src/gpu/gl/GrGLSL.cpp',
'../third_party/skia/src/gpu/gl/GrGLSL.h',
'../third_party/skia/src/gpu/gl/GrGLShaderBuilder.cpp',
'../third_party/skia/src/gpu/gl/GrGLShaderBuilder.h',
'../third_party/skia/src/gpu/gl/GrGLStencilBuffer.cpp',
'../third_party/skia/src/gpu/gl/GrGLTexture.cpp',
'../third_party/skia/src/gpu/gl/GrGLTexture.h',
'../third_party/skia/src/gpu/gl/GrGLUtil.cpp',
'../third_party/skia/src/gpu/gl/GrGLUtil.h',
'../third_party/skia/src/gpu/gl/GrGLVertexBuffer.cpp',
'../third_party/skia/src/gpu/gl/GrGLVertexBuffer.h',
'../third_party/skia/src/gpu/gl/GrGpuGL.cpp',
'../third_party/skia/src/gpu/gl/GrGpuGL.h',
'../third_party/skia/src/gpu/gl/GrGpuGL_program.cpp',
'../third_party/skia/src/images/bmpdecoderhelper.cpp',
'../third_party/skia/src/images/bmpdecoderhelper.h',
#'../third_party/skia/src/images/SkFDStream.cpp',
#'../third_party/skia/src/images/SkFlipPixelRef.cpp',
'../third_party/skia/src/images/SkImageDecoder.cpp',
'../third_party/skia/src/images/SkImageDecoder_Factory.cpp',
#'../third_party/skia/src/images/SkImageDecoder_fpdfemb.cpp',
#'../third_party/skia/src/images/SkImageDecoder_libbmp.cpp',
#'../third_party/skia/src/images/SkImageDecoder_libgif.cpp',
#'../third_party/skia/src/images/SkImageDecoder_libico.cpp',
#'../third_party/skia/src/images/SkImageDecoder_libjpeg.cpp',
#'../third_party/skia/src/images/SkImageDecoder_libpng.cpp',
#'../third_party/skia/src/images/SkImageDecoder_libpvjpeg.cpp',
#'../third_party/skia/src/images/SkImageDecoder_wbmp.cpp',
#'../third_party/skia/src/images/SkImageEncoder.cpp',
#'../third_party/skia/src/images/SkImageEncoder_Factory.cpp',
#'../third_party/skia/src/images/SkImageRef.cpp',
#'../third_party/skia/src/images/SkImageRefPool.cpp',
#'../third_party/skia/src/images/SkImageRefPool.h',
#'../third_party/skia/src/images/SkImageRef_GlobalPool.cpp',
#'../third_party/skia/src/images/SkMovie.cpp',
#'../third_party/skia/src/images/SkMovie_gif.cpp',
'../third_party/skia/src/images/SkScaledBitmapSampler.cpp',
'../third_party/skia/src/images/SkScaledBitmapSampler.h',
'../third_party/skia/src/opts/opts_check_SSE2.cpp',
'../third_party/skia/src/pdf/SkPDFCatalog.cpp',
'../third_party/skia/src/pdf/SkPDFCatalog.h',
'../third_party/skia/src/pdf/SkPDFDevice.cpp',
'../third_party/skia/src/pdf/SkPDFDocument.cpp',
'../third_party/skia/src/pdf/SkPDFFont.cpp',
'../third_party/skia/src/pdf/SkPDFFont.h',
'../third_party/skia/src/pdf/SkPDFFormXObject.cpp',
'../third_party/skia/src/pdf/SkPDFFormXObject.h',
'../third_party/skia/src/pdf/SkPDFGraphicState.cpp',
'../third_party/skia/src/pdf/SkPDFGraphicState.h',
'../third_party/skia/src/pdf/SkPDFImage.cpp',
'../third_party/skia/src/pdf/SkPDFImage.h',
'../third_party/skia/src/pdf/SkPDFPage.cpp',
'../third_party/skia/src/pdf/SkPDFPage.h',
'../third_party/skia/src/pdf/SkPDFShader.cpp',
'../third_party/skia/src/pdf/SkPDFShader.h',
'../third_party/skia/src/pdf/SkPDFStream.cpp',
'../third_party/skia/src/pdf/SkPDFStream.h',
'../third_party/skia/src/pdf/SkPDFTypes.cpp',
'../third_party/skia/src/pdf/SkPDFTypes.h',
'../third_party/skia/src/pdf/SkPDFUtils.cpp',
'../third_party/skia/src/pdf/SkPDFUtils.h',
'../third_party/skia/src/ports/FontHostConfiguration_android.cpp',
#'../third_party/skia/src/ports/SkFontHost_FONTPATH.cpp',
'../third_party/skia/src/ports/SkFontHost_FreeType.cpp',
'../third_party/skia/src/ports/SkFontHost_android.cpp',
#'../third_party/skia/src/ports/SkFontHost_ascender.cpp',
'../third_party/skia/src/ports/SkFontHost_tables.cpp',
'../third_party/skia/src/ports/SkFontHost_gamma.cpp',
'../third_party/skia/src/ports/SkFontHost_gamma_none.cpp',
#'../third_party/skia/src/ports/SkFontHost_linux.cpp',
'../third_party/skia/src/ports/SkFontHost_mac.cpp',
#'../third_party/skia/src/ports/SkFontHost_none.cpp',
'../third_party/skia/src/ports/SkFontHost_sandbox_none.cpp',
'../third_party/skia/src/ports/SkFontHost_win.cpp',
'../third_party/skia/src/ports/SkGlobalInitialization_chromium.cpp',
#'../third_party/skia/src/ports/SkImageDecoder_CG.cpp',
#'../third_party/skia/src/ports/SkImageDecoder_empty.cpp',
#'../third_party/skia/src/ports/SkImageRef_ashmem.cpp',
#'../third_party/skia/src/ports/SkImageRef_ashmem.h',
#'../third_party/skia/src/ports/SkOSEvent_android.cpp',
#'../third_party/skia/src/ports/SkOSEvent_dummy.cpp',
'../third_party/skia/src/ports/SkOSFile_stdio.cpp',
#'../third_party/skia/src/ports/SkThread_none.cpp',
'../third_party/skia/src/ports/SkThread_pthread.cpp',
'../third_party/skia/src/ports/SkThread_win.cpp',
'../third_party/skia/src/ports/SkTime_Unix.cpp',
#'../third_party/skia/src/ports/SkXMLParser_empty.cpp',
#'../third_party/skia/src/ports/SkXMLParser_expat.cpp',
#'../third_party/skia/src/ports/SkXMLParser_tinyxml.cpp',
#'../third_party/skia/src/ports/SkXMLPullParser_expat.cpp',
'../third_party/skia/src/ports/sk_predefined_gamma.h',
'../third_party/skia/src/sfnt/SkOTUtils.cpp',
'../third_party/skia/src/sfnt/SkOTUtils.h',
'../third_party/skia/include/utils/mac/SkCGUtils.h',
'../third_party/skia/include/utils/SkDeferredCanvas.h',
'../third_party/skia/include/utils/SkMatrix44.h',
'../third_party/skia/src/utils/mac/SkCreateCGImageRef.cpp',
'../third_party/skia/src/utils/SkBase64.cpp',
'../third_party/skia/src/utils/SkBase64.h',
'../third_party/skia/src/utils/SkBitSet.cpp',
'../third_party/skia/src/utils/SkBitSet.h',
'../third_party/skia/src/utils/SkDeferredCanvas.cpp',
'../third_party/skia/src/utils/SkMatrix44.cpp',
'../third_party/skia/include/utils/SkNWayCanvas.h',
'../third_party/skia/src/utils/SkNWayCanvas.cpp',
'../third_party/skia/include/core/Sk64.h',
'../third_party/skia/include/core/SkAdvancedTypefaceMetrics.h',
'../third_party/skia/include/core/SkAutoKern.h',
'../third_party/skia/include/core/SkBitmap.h',
'../third_party/skia/include/core/SkBlitRow.h',
'../third_party/skia/include/core/SkBlitter.h',
'../third_party/skia/include/core/SkBounder.h',
'../third_party/skia/include/core/SkBuffer.h',
'../third_party/skia/include/core/SkCanvas.h',
'../third_party/skia/include/core/SkChunkAlloc.h',
'../third_party/skia/include/core/SkClipStack.h',
'../third_party/skia/include/core/SkColor.h',
'../third_party/skia/include/core/SkColorFilter.h',
'../third_party/skia/include/core/SkColorPriv.h',
'../third_party/skia/include/core/SkColorShader.h',
'../third_party/skia/include/core/SkComposeShader.h',
'../third_party/skia/include/core/SkData.h',
'../third_party/skia/include/core/SkDeque.h',
'../third_party/skia/include/core/SkDescriptor.h',
'../third_party/skia/include/core/SkDevice.h',
'../third_party/skia/include/core/SkDither.h',
'../third_party/skia/include/core/SkDraw.h',
'../third_party/skia/include/core/SkDrawFilter.h',
'../third_party/skia/include/core/SkDrawLooper.h',
#'../third_party/skia/include/core/SkDrawing.h',
'../third_party/skia/include/core/SkEndian.h',
'../third_party/skia/include/core/SkFDot6.h',
'../third_party/skia/include/core/SkFixed.h',
'../third_party/skia/include/core/SkFlate.h',
'../third_party/skia/include/core/SkFlattenable.h',
'../third_party/skia/include/core/SkFloatBits.h',
'../third_party/skia/include/core/SkFloatingPoint.h',
'../third_party/skia/include/core/SkFontHost.h',
'../third_party/skia/include/core/SkGeometry.h',
'../third_party/skia/include/core/SkGraphics.h',
'../third_party/skia/include/core/SkMMapStream.h',
'../third_party/skia/include/core/SkMallocPixelRef.h',
'../third_party/skia/include/core/SkMask.h',
'../third_party/skia/include/core/SkMaskFilter.h',
'../third_party/skia/include/core/SkMath.h',
'../third_party/skia/include/core/SkMatrix.h',
'../third_party/skia/include/core/SkOSFile.h',
'../third_party/skia/include/core/SkPackBits.h',
'../third_party/skia/include/core/SkPaint.h',
'../third_party/skia/include/core/SkPath.h',
'../third_party/skia/include/core/SkPathEffect.h',
'../third_party/skia/include/core/SkPathMeasure.h',
'../third_party/skia/include/core/SkPerspIter.h',
'../third_party/skia/include/core/SkPicture.h',
'../third_party/skia/include/core/SkPixelRef.h',
'../third_party/skia/include/core/SkPoint.h',
'../third_party/skia/include/core/SkPtrRecorder.h',
'../third_party/skia/include/core/SkRandom.h',
'../third_party/skia/include/core/SkRasterizer.h',
'../third_party/skia/include/core/SkReader32.h',
'../third_party/skia/include/core/SkRect.h',
'../third_party/skia/include/core/SkRefCnt.h',
'../third_party/skia/include/core/SkRefDict.h',
'../third_party/skia/include/core/SkRegion.h',
'../third_party/skia/include/core/SkScalar.h',
'../third_party/skia/include/core/SkScalarCompare.h',
'../third_party/skia/include/core/SkScalerContext.h',
'../third_party/skia/include/core/SkScan.h',
'../third_party/skia/include/core/SkShader.h',
'../third_party/skia/include/core/SkStream.h',
'../third_party/skia/include/core/SkString.h',
'../third_party/skia/include/core/SkTArray.h',
'../third_party/skia/include/core/SkTDArray.h',
'../third_party/skia/include/core/SkTDStack.h',
'../third_party/skia/include/core/SkTDict.h',
'../third_party/skia/include/core/SkTRegistry.h',
'../third_party/skia/include/core/SkTScopedPtr.h',
'../third_party/skia/include/core/SkTSearch.h',
'../third_party/skia/include/core/SkTemplates.h',
'../third_party/skia/include/core/SkThread.h',
'../third_party/skia/include/core/SkThread_platform.h',
'../third_party/skia/include/core/SkTime.h',
'../third_party/skia/include/core/SkTypeface.h',
'../third_party/skia/include/core/SkTypes.h',
'../third_party/skia/include/core/SkUnPreMultiply.h',
'../third_party/skia/include/core/SkUnitMapper.h',
'../third_party/skia/include/core/SkUtils.h',
'../third_party/skia/include/core/SkWriter32.h',
'../third_party/skia/include/core/SkXfermode.h',
'../third_party/skia/include/effects/Sk1DPathEffect.h',
'../third_party/skia/include/effects/Sk2DPathEffect.h',
'../third_party/skia/include/effects/SkAvoidXfermode.h',
'../third_party/skia/include/effects/SkBlurDrawLooper.h',
'../third_party/skia/include/effects/SkBlurImageFilter.h',
'../third_party/skia/include/effects/SkBlurMaskFilter.h',
'../third_party/skia/include/effects/SkColorMatrix.h',
'../third_party/skia/include/effects/SkColorMatrixFilter.h',
'../third_party/skia/include/effects/SkCornerPathEffect.h',
'../third_party/skia/include/effects/SkDashPathEffect.h',
'../third_party/skia/include/effects/SkDiscretePathEffect.h',
'../third_party/skia/include/effects/SkDrawExtraPathEffect.h',
'../third_party/skia/include/effects/SkEmbossMaskFilter.h',
'../third_party/skia/include/effects/SkGradientShader.h',
'../third_party/skia/include/effects/SkKernel33MaskFilter.h',
'../third_party/skia/include/effects/SkLayerDrawLooper.h',
'../third_party/skia/include/effects/SkLayerRasterizer.h',
'../third_party/skia/include/effects/SkMorphologyImageFilter.h',
'../third_party/skia/include/effects/SkPaintFlagsDrawFilter.h',
'../third_party/skia/include/effects/SkPixelXorXfermode.h',
'../third_party/skia/include/effects/SkPorterDuff.h',
'../third_party/skia/include/effects/SkTransparentShader.h',
'../third_party/skia/include/gpu/GrClip.h',
'../third_party/skia/include/gpu/GrClipIterator.h',
'../third_party/skia/include/gpu/GrColor.h',
'../third_party/skia/include/gpu/GrConfig.h',
'../third_party/skia/include/gpu/GrContext.h',
'../third_party/skia/include/gpu/GrCustomStage.h',
'../third_party/skia/include/gpu/GrFontScaler.h',
'../third_party/skia/include/gpu/gl/GrGLConfig.h',
'../third_party/skia/include/gpu/gl/GrGLConfig_chrome.h',
'../third_party/skia/include/gpu/gl/GrGLFunctions.h',
'../third_party/skia/include/gpu/gl/GrGLInterface.h',
'../third_party/skia/include/gpu/GrGlyph.h',
'../third_party/skia/include/gpu/GrInstanceCounter.h',
'../third_party/skia/include/gpu/GrKey.h',
'../third_party/skia/include/gpu/GrMatrix.h',
'../third_party/skia/include/gpu/GrNoncopyable.h',
'../third_party/skia/include/gpu/GrPaint.h',
'../third_party/skia/include/gpu/GrPoint.h',
'../third_party/skia/include/gpu/GrProgramStageFactory.h',
'../third_party/skia/include/gpu/GrRect.h',
'../third_party/skia/include/gpu/GrRefCnt.h',
'../third_party/skia/include/gpu/GrRenderTarget.h',
'../third_party/skia/include/gpu/GrSamplerState.h',
'../third_party/skia/include/gpu/GrScalar.h',
'../third_party/skia/include/gpu/GrTextContext.h',
'../third_party/skia/include/gpu/GrTexture.h',
'../third_party/skia/include/gpu/GrTypes.h',
'../third_party/skia/include/gpu/GrUserConfig.h',
'../third_party/skia/include/gpu/SkGpuCanvas.h',
'../third_party/skia/include/gpu/SkGpuDevice.h',
'../third_party/skia/include/gpu/SkGr.h',
'../third_party/skia/include/gpu/SkGrTexturePixelRef.h',
'../third_party/skia/include/pdf/SkPDFDevice.h',
'../third_party/skia/include/pdf/SkPDFDocument.h',
'../third_party/skia/include/ports/SkStream_Win.h',
'../third_party/skia/include/ports/SkTypeface_win.h',
'../third_party/skia/include/images/SkFlipPixelRef.h',
'../third_party/skia/include/images/SkImageDecoder.h',
'../third_party/skia/include/images/SkImageEncoder.h',
'../third_party/skia/include/images/SkImageRef.h',
'../third_party/skia/include/images/SkImageRef_GlobalPool.h',
'../third_party/skia/include/images/SkMovie.h',
'../third_party/skia/include/images/SkPageFlipper.h',
'ext/bitmap_platform_device.h',
'ext/bitmap_platform_device_android.cc',
'ext/bitmap_platform_device_android.h',
'ext/bitmap_platform_device_data.h',
'ext/bitmap_platform_device_linux.cc',
'ext/bitmap_platform_device_linux.h',
'ext/bitmap_platform_device_mac.cc',
'ext/bitmap_platform_device_mac.h',
'ext/bitmap_platform_device_win.cc',
'ext/bitmap_platform_device_win.h',
'ext/canvas_paint.h',
'ext/canvas_paint_common.h',
'ext/canvas_paint_gtk.h',
'ext/canvas_paint_mac.h',
'ext/canvas_paint_win.h',
'ext/convolver.cc',
'ext/convolver.h',
'ext/google_logging.cc',
'ext/image_operations.cc',
'ext/image_operations.h',
'ext/SkThread_chrome.cc',
'ext/platform_canvas.cc',
'ext/platform_canvas.h',
'ext/platform_canvas_linux.cc',
'ext/platform_canvas_mac.cc',
'ext/platform_canvas_skia.cc',
'ext/platform_canvas_win.cc',
'ext/platform_device.cc',
'ext/platform_device.h',
'ext/platform_device_linux.cc',
'ext/platform_device_mac.cc',
'ext/platform_device_win.cc',
'ext/SkMemory_new_handler.cpp',
'ext/skia_sandbox_support_win.h',
'ext/skia_sandbox_support_win.cc',
'ext/skia_trace_shim.h',
'ext/skia_utils_mac.mm',
'ext/skia_utils_mac.h',
'ext/skia_utils_win.cc',
'ext/skia_utils_win.h',
'ext/vector_canvas.cc',
'ext/vector_canvas.h',
'ext/vector_platform_device_emf_win.cc',
'ext/vector_platform_device_emf_win.h',
'ext/vector_platform_device_skia.cc',
'ext/vector_platform_device_skia.h',
],
'include_dirs': [
'..',
'config',
'../third_party/skia/include/config',
'../third_party/skia/include/core',
'../third_party/skia/include/effects',
'../third_party/skia/include/gpu',
'../third_party/skia/include/gpu/gl',
'../third_party/skia/include/images',
'../third_party/skia/include/pdf',
'../third_party/skia/include/ports',
'../third_party/skia/include/utils',
'../third_party/skia/src/core',
'../third_party/skia/src/gpu',
'../third_party/skia/src/sfnt',
'../third_party/skia/src/utils',
],
'msvs_disabled_warnings': [4244, 4267, 4341, 4345, 4390, 4554, 4800],
'mac_framework_dirs': [
'$(SDKROOT)/System/Library/Frameworks/ApplicationServices.framework/Frameworks',
],
'defines': [
'SK_BUILD_NO_IMAGE_ENCODE',
'GR_GL_CUSTOM_SETUP_HEADER="GrGLConfig_chrome.h"',
'GR_STATIC_RECT_VB=1',
'GR_AGGRESSIVE_SHADER_OPTS=1',
'SK_DISABLE_FAST_AA_STROKE_RECT',
'SK_DEFAULT_FONT_CACHE_LIMIT=(20*1024*1024)',
# temporary for landing Skia rev 3077 with minimal layout test breakage
'SK_SIMPLE_TWOCOLOR_VERTICAL_GRADIENTS',
# skia uses static initializers to initialize the serialization logic
# of its "pictures" library. This is currently not used in chrome; if
      # it ever gets used, the processes that use it need to call
# SkGraphics::Init().
'SK_ALLOW_STATIC_GLOBAL_INITIALIZERS=0',
# Temporarily disable the Skia fix in
# http://code.google.com/p/skia/source/detail?r=3037 ; enabling that
# fix will require substantial rebaselining.
'SK_DRAW_POS_TEXT_IGNORE_SUBPIXEL_LEFT_ALIGN_FIX',
# Temporarily ignore fix to antialias coverage, until we can rebaseline
'SK_USE_LEGACY_AA_COVERAGE',
],
'sources!': [
'../third_party/skia/include/core/SkTypes.h',
],
'conditions': [
['order_profiling != 0', {
'target_conditions' : [
['_toolset=="target"', {
'cflags!': [ '-finstrument-functions' ],
}],
],
}],
# For POSIX platforms, prefer the Mutex implementation provided by Skia
# since it does not generate static initializers.
[ 'OS == "android" or OS == "linux" or OS == "mac"', {
'defines+': [
'SK_USE_POSIX_THREADS',
],
'sources!': [
'ext/SkThread_chrome.cc',
],
}],
[ 'OS != "android"', {
'sources/': [
['exclude', '_android\\.(cc|cpp)$'],
],
'sources!': [
        # These files are only used by Android.
'../third_party/skia/src/ports/SkFontHost_gamma.cpp',
],
}],
[ 'OS != "mac"', {
'sources/': [
['exclude', '_mac\\.(cc|cpp|mm?)$'],
['exclude', '/mac/']
],
}],
[ 'OS != "win"', {
'sources/': [ ['exclude', '_win\\.(cc|cpp)$'] ],
}],
[ 'armv7 == 1', {
'defines': [
'__ARM_ARCH__=7',
],
}],
[ 'armv7 == 1 and arm_neon == 1', {
'defines': [
'__ARM_HAVE_NEON',
],
}],
[ 'target_arch == "arm"', {
'sources!': [
'../third_party/skia/src/opts/opts_check_SSE2.cpp'
],
}],
[ 'use_glib == 1', {
'dependencies': [
'../build/linux/system.gyp:fontconfig',
'../build/linux/system.gyp:freetype2',
'../build/linux/system.gyp:pangocairo',
'../third_party/harfbuzz/harfbuzz.gyp:harfbuzz',
'../third_party/icu/icu.gyp:icuuc',
],
'cflags': [
'-Wno-unused',
'-Wno-unused-function',
],
'sources': [
'ext/SkFontHost_fontconfig.cpp',
'ext/SkFontHost_fontconfig_direct.cpp',
],
'defines': [
# 'SK_USE_COLOR_LUMINANCE',
],
}],
[ 'use_glib == 0 and OS != "android"', {
'sources/': [ ['exclude', '_linux\\.(cc|cpp)$'] ],
'sources!': [
'../third_party/skia/src/ports/SkFontHost_FreeType.cpp',
        '../third_party/skia/src/ports/SkFontHost_TrueType_Tables.cpp',
'../third_party/skia/src/ports/SkFontHost_gamma_none.cpp',
],
}],
[ 'OS == "android"', {
'sources/': [
['exclude', '_linux\\.(cc|cpp)$'],
['include', 'ext/platform_device_linux\\.cc$'],
['include', 'ext/platform_canvas_linux\\.cc$'],
],
}],
[ 'use_aura == 1 and use_canvas_skia == 1', {
'sources/': [
['exclude', 'ext/platform_canvas_mac\\.cc$'],
['exclude', 'ext/platform_canvas_linux\\.cc$'],
['exclude', 'ext/platform_canvas_win\\.cc$'],
],
      }, { # use_aura == 0 or use_canvas_skia == 0
'sources/': [ ['exclude', 'ext/platform_canvas_skia\\.cc$'] ],
}],
[ 'toolkit_uses_gtk == 1', {
'dependencies': [
'../build/linux/system.gyp:gdk',
],
}, { # toolkit_uses_gtk == 0
'sources/': [ ['exclude', '_gtk\\.(cc|cpp)$'] ],
}],
[ 'OS == "android"', {
'defines': [
'SK_BUILD_FOR_ANDROID_NDK',
],
'conditions': [
[ '_toolset == "target"', {
'defines': [
'HAVE_ENDIAN_H',
'HAVE_PTHREADS',
'OS_ANDROID',
'USE_CHROMIUM_SKIA',
],
'dependencies': [
'../third_party/freetype/freetype.gyp:ft2',
'../third_party/harfbuzz/harfbuzz.gyp:harfbuzz',
'../third_party/expat/expat.gyp:expat',
'skia_opts'
],
'dependencies!': [
# Android doesn't use Skia's PDF generation, which is what uses
# sfntly.
'../third_party/sfntly/sfntly.gyp:sfntly',
],
# This exports a hard dependency because it needs to run its
# symlink action in order to expose the skia header files.
'hard_dependency': 1,
'include_dirs': [
'../third_party/expat/files/lib',
],
'sources!': [
'ext/vector_platform_device_skia.cc',
'../third_party/skia/src/ports/SkFontHost_gamma_none.cpp',
],
'export_dependent_settings': [
'../third_party/harfbuzz/harfbuzz.gyp:harfbuzz',
],
}],
[ '_toolset=="host" and host_os=="linux"', {
'sources': [
'ext/platform_device_linux.cc',
'ext/platform_canvas_linux.cc',
],
}],
],
}],
[ 'OS == "mac"', {
'defines': [
'SK_BUILD_FOR_MAC',
],
'include_dirs': [
'../third_party/skia/include/utils/mac',
],
'link_settings': {
'libraries': [
'$(SDKROOT)/System/Library/Frameworks/AppKit.framework',
],
},
'sources': [
'../third_party/skia/src/utils/mac/SkStream_mac.cpp',
],
'sources!': [
# The mac's fonthost implements the table methods natively,
# so no need for these generic versions.
'../third_party/skia/src/ports/SkFontHost_tables.cpp',
],
'conditions': [
[ 'use_skia == 0', {
'sources/': [
['exclude', '/pdf/'],
['exclude', 'ext/vector_platform_device_skia\\.(cc|h)'],
],
},
{ # use_skia
'defines': [
'SK_USE_MAC_CORE_TEXT',
# 'SK_USE_COLOR_LUMINANCE',
],
}],
],
}],
[ 'OS == "win"', {
'sources!': [
'../third_party/skia/src/core/SkMMapStream.cpp',
'../third_party/skia/src/ports/SkFontHost_sandbox_none.cpp',
'../third_party/skia/src/ports/SkThread_pthread.cpp',
'../third_party/skia/src/ports/SkTime_Unix.cpp',
'ext/SkThread_chrome.cc',
],
'include_dirs': [
'config/win',
],
'direct_dependent_settings': {
'include_dirs': [
'config/win',
],
},
}],
['component=="shared_library"', {
'defines': [
'GR_DLL=1',
'GR_IMPLEMENTATION=1',
'SKIA_DLL',
'SKIA_IMPLEMENTATION=1',
],
'dependencies': [
'../base/base.gyp:base',
],
'direct_dependent_settings': {
'defines': [
'GR_DLL',
'SKIA_DLL',
],
},
}],
],
'dependencies': [
'skia_opts',
'../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations',
'../third_party/sfntly/sfntly.gyp:sfntly',
'../third_party/zlib/zlib.gyp:zlib',
],
'direct_dependent_settings': {
'include_dirs': [
'config',
'../third_party/skia/include/config',
'../third_party/skia/include/core',
'../third_party/skia/include/effects',
'../third_party/skia/include/pdf',
'../third_party/skia/include/gpu',
'../third_party/skia/include/gpu/gl',
'../third_party/skia/include/ports',
'../third_party/skia/include/utils',
'ext',
],
'mac_framework_dirs': [
'$(SDKROOT)/System/Library/Frameworks/ApplicationServices.framework/Frameworks',
],
'defines': [
'SK_BUILD_NO_IMAGE_ENCODE',
'GR_GL_CUSTOM_SETUP_HEADER="GrGLConfig_chrome.h"',
'GR_AGGRESSIVE_SHADER_OPTS=1',
],
'conditions': [
['OS=="android"', {
'defines': [
'SK_BUILD_FOR_ANDROID_NDK',
],
'conditions': [
[ '_toolset == "target"', {
'defines': [
'HAVE_ENDIAN_H',
'SK_RELEASE', # Assume platform has a release build.
],
'dependencies!': [
'skia_opts',
'../third_party/zlib/zlib.gyp:zlib',
],
}],
],
}],
['OS=="mac"', {
'include_dirs': [
'../third_party/skia/include/utils/mac',
],
}],
],
},
},
# Due to an unfortunate intersection of lameness between gcc and gyp,
# we have to build the *_SSE2.cpp files in a separate target. The
# gcc lameness is that, in order to compile SSE2 intrinsics code, it
# must be passed the -msse2 flag. However, with this flag, it may
# emit SSE2 instructions even for scalar code, such as the CPUID
    # test used to check for the presence of SSE2. So that code, and all other
    # code, must be compiled *without* -msse2. The gyp lameness is that it
# does not allow file-specific CFLAGS, so we must create this extra
# target for those files to be compiled with -msse2.
#
# This is actually only a problem on 32-bit Linux (all Intel Macs have
# SSE2, Linux x86_64 has SSE2 by definition, and MSC will happily emit
    # SSE2 from intrinsics, while generating plain ol' 386 for everything
# else). However, to keep the .gyp file simple and avoid platform-specific
# build breakage, we do this on all platforms.
# For about the same reason, we need to compile the ARM opts files
# separately as well.
{
'target_name': 'skia_opts',
'type': 'static_library',
'variables': {
'optimize': 'max',
},
'include_dirs': [
'..',
'config',
'../third_party/skia/include/config',
'../third_party/skia/include/core',
'../third_party/skia/include/effects',
'../third_party/skia/include/images',
'../third_party/skia/include/utils',
'../third_party/skia/src/core',
],
'conditions': [
['order_profiling != 0', {
'target_conditions' : [
['_toolset=="target"', {
'cflags!': [ '-finstrument-functions' ],
}],
],
}],
[ 'os_posix == 1 and OS != "mac" and OS != "android" and target_arch != "arm"', {
'cflags': [
'-msse2',
],
}],
[ 'OS == "android"', {
'defines': [
'SK_BUILD_FOR_ANDROID_NDK',
],
}],
[ 'target_arch != "arm"', {
'sources': [
'../third_party/skia/src/opts/SkBitmapProcState_opts_SSE2.cpp',
'../third_party/skia/src/opts/SkBlitRect_opts_SSE2.cpp',
'../third_party/skia/src/opts/SkBlitRow_opts_SSE2.cpp',
'../third_party/skia/src/opts/SkUtils_opts_SSE2.cpp',
],
'conditions': [
# x86 Android doesn't support SSSE3 instructions.
[ 'OS != "android"', {
'dependencies': [
'skia_opts_ssse3',
],
}],
],
},
{ # arm
'conditions': [
['order_profiling != 0', {
'target_conditions' : [
['_toolset=="target"', {
'cflags!': [ '-finstrument-functions' ],
}],
],
}],
[ 'armv7 == 1', {
'defines': [
'__ARM_ARCH__=7',
],
}],
[ 'armv7 == 1 and arm_neon == 1', {
'defines': [
'__ARM_HAVE_NEON',
],
'cflags': [
# The neon assembly contains conditional instructions which
# aren't enclosed in an IT block. The assembler complains
# without this option.
# See #86592.
'-Wa,-mimplicit-it=always',
],
}],
],
        # The assembly uses the frame pointer register (r7 in Thumb/r11 in
        # ARM), which the compiler doesn't like. Explicitly remove the
# -fno-omit-frame-pointer flag for Android, as that gets added to all
# targets via common.gypi.
'cflags!': [
'-fno-omit-frame-pointer',
],
'cflags': [
'-fomit-frame-pointer',
],
'sources': [
'../third_party/skia/src/opts/SkBitmapProcState_opts_arm.cpp',
'../third_party/skia/src/opts/SkBlitRow_opts_arm.cpp',
'../third_party/skia/src/opts/opts_check_arm.cpp',
],
}],
[ 'armv7 == 1 and arm_neon == 0', {
'sources': [
'../third_party/skia/src/opts/memset.arm.S',
],
}],
[ 'armv7 == 1 and arm_neon == 1', {
'sources': [
'../third_party/skia/src/opts/memset16_neon.S',
'../third_party/skia/src/opts/memset32_neon.S',
],
}],
[ 'target_arch == "arm" and armv7 != 1', {
'sources': [
'../third_party/skia/src/opts/SkBlitRow_opts_none.cpp',
],
'sources!': [
'../third_party/skia/src/opts/SkBlitRow_opts_arm.cpp',
],
}],
],
},
    # For the same reasons as skia_opts above, we have to create another
    # target specifically for SSSE3 code, since we would not want to compile
    # the SSE2 files with -mssse3, which would potentially allow gcc to
    # generate SSSE3 code for them.
{
'target_name': 'skia_opts_ssse3',
'type': 'static_library',
'variables': {
'optimize': 'max',
},
'include_dirs': [
'..',
'config',
'../third_party/skia/include/config',
'../third_party/skia/include/core',
'../third_party/skia/src/core',
],
'conditions': [
[ 'OS in ["linux", "freebsd", "openbsd", "solaris"]', {
'cflags': [
'-mssse3',
],
}],
['order_profiling != 0', {
'target_conditions' : [
['_toolset=="target"', {
'cflags!': [ '-finstrument-functions' ],
}],
],
}],
[ 'OS == "mac"', {
'xcode_settings': {
'GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS': 'YES',
},
}],
[ 'OS == "win"', {
'include_dirs': [
'config/win',
],
'direct_dependent_settings': {
'include_dirs': [
'config/win',
],
},
}],
[ 'target_arch != "arm"', {
'sources': [
'../third_party/skia/src/opts/SkBitmapProcState_opts_SSSE3.cpp',
],
}],
],
},
{
'target_name': 'image_operations_bench',
'type': 'executable',
'dependencies': [
'../base/base.gyp:base',
'skia',
],
'include_dirs': [
'..',
],
'sources': [
'ext/image_operations_bench.cc',
],
},
],
}
import pandas as pd
import pytest
from genomics_data_index.storage.io.mutation.VcfSnpEffAnnotationParser import VcfSnpEffAnnotationParser, \
InvalidSnpEffVcfError
@pytest.fixture
def vcf_snpeff_annotation_parser() -> VcfSnpEffAnnotationParser:
return VcfSnpEffAnnotationParser()
@pytest.fixture
def mock_snpeff_infos():
class MockAnn():
def __init__(self):
pass
desc = ("Functional annotations: 'Allele | Annotation | Annotation_Impact | Gene_Name | Gene_ID"
" | Feature_Type | Feature_ID | Transcript_BioType | Rank | HGVS.c | HGVS.p"
" | cDNA.pos / cDNA.length | CDS.pos / CDS.length | AA.pos / AA.length | Distance"
" | ERRORS / WARNINGS / INFO'")
return {
'ANN': MockAnn()
}
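The `desc` string in the fixture above mirrors snpEff's quoted, pipe-delimited `ANN` header description. As a rough sketch of what a header parser has to do with it (an illustration only, not the actual `VcfSnpEffAnnotationParser` implementation), the field names can be pulled out like this:

```python
# Illustrative sketch only: extract snpEff ANN field names from the
# quoted, pipe-delimited description string. Not the real parser code.
def split_ann_headers(desc: str) -> list:
    quoted = desc.split("'")[1]  # text between the single quotes
    return [field.strip() for field in quoted.split('|')]  # pipe-delimited

headers = split_ann_headers(
    "Functional annotations: 'Allele | Annotation | Gene_Name'"
)
# headers == ['Allele', 'Annotation', 'Gene_Name']
```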
@pytest.fixture
def mock_vcf_df_with_ann_single() -> pd.DataFrame:
return pd.DataFrame([
['NC_011083', 140658, 'C', 'A',
{'ANN': ('A|missense_variant|MODERATE|murF|SEHA_RS01180|transcript|SEHA_RS01180|'
'protein_coding|1/1|c.497C>A|p.Ala166Glu|497/1359|497/1359|166/452||')}
],
], columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_vcf_df_without_ann_single() -> pd.DataFrame:
return pd.DataFrame([
['NC_011083', 140658, 'C', 'A', {}],
], columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_vcf_df_without_ann_multiple() -> pd.DataFrame:
return pd.DataFrame([
['NC_011083', 140658, 'C', 'A', {}],
['NC_011083', 203200, 'C', 'T', {}],
], columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_vcf_df_with_and_without_ann() -> pd.DataFrame:
return pd.DataFrame([
['NC_011083', 140658, 'C', 'A',
{'ANN': [('A|missense_variant|MODERATE|murF|SEHA_RS01180|transcript|SEHA_RS01180|'
'protein_coding|1/1|c.497C>A|p.Ala166Glu|497/1359|497/1359|166/452||'),
('A|upstream_gene_variant|MODIFIER|mraY|SEHA_RS01185|transcript|SEHA_RS01185|'
'protein_coding||c.-856C>A|||||856|'),
('A|upstream_gene_variant|MODIFIER|murD|SEHA_RS01190|transcript|SEHA_RS01190|'
'protein_coding||c.-1941C>A|||||1941|')]}
],
['NC_011083', 203200, 'C', 'T', {'ANN': []}],
], columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_vcf_df_empty() -> pd.DataFrame:
return pd.DataFrame(columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_snpeff_infos_empty():
return {}
@pytest.fixture
def mock_vcf_df_with_ann_multiple_entries_single_sample() -> pd.DataFrame:
return pd.DataFrame([
['NC_011083', 140658, 'C', 'A',
{'ANN': [('A|missense_variant|MODERATE|murF|SEHA_RS01180|transcript|SEHA_RS01180|'
'protein_coding|1/1|c.497C>A|p.Ala166Glu|497/1359|497/1359|166/452||'),
('A|upstream_gene_variant|MODIFIER|mraY|SEHA_RS01185|transcript|SEHA_RS01185|'
'protein_coding||c.-856C>A|||||856|'),
('A|upstream_gene_variant|MODIFIER|murD|SEHA_RS01190|transcript|SEHA_RS01190|'
'protein_coding||c.-1941C>A|||||1941|')]}
],
], columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_vcf_df_single_sample_annotations() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359'],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA]
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length',
])
@pytest.fixture
def mock_vcf_df_single_sample_annotations_some_na_values() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', pd.NA,
'c.497C>A', 'p.Ala166Glu', '497/1359'],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA]
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length',
])
@pytest.fixture
def mock_vcf_df_multiple_sample_annotations() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359'],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf', 'NC_011083:203200:C:T',
'T', 'missense_variant', 'MODERATE', 'SEHA_RS01460', 'SEHA_RS01460', 'transcript',
'protein_coding', 'c.602C>T', 'p.Thr201Met', '602/927'],
['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf', 'NC_011083:203200:C:T',
'T', 'upstream_gene_variant', 'MODIFIER', 'SEHA_RS01455', 'SEHA_RS01455',
'transcript', 'protein_coding', 'c.-2172G>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf', 'NC_011083:203200:C:T',
'T', 'upstream_gene_variant', 'MODIFIER', 'SEHA_RS01455', 'SEHA_RS01455',
'transcript', 'protein_coding', 'c.-710G>A', pd.NA, pd.NA],
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length',
])
@pytest.fixture
def mock_vcf_df_single_sample_annotations_sars_cov_2() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.3683A>G', 'p.D1228G', '3683/21291'],
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.3683A>G', 'p.D1228G', '3683/13218'],
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.1229A>G', 'p.D410G', '1229/5835'],
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.1229A>G', 'p.D410G', '1229/5835'],
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length',
])
@pytest.fixture
def mock_vcf_df_single_sample_annotations_rv_sars_cov_2() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.1229A>G', 'p.D410G', '1229/5835'],
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.1229A>G', 'p.D410G', '1229/5835'],
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.3683A>G', 'p.D1228G', '3683/13218'],
['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
'c.3683A>G', 'p.D1228G', '3683/21291'],
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length',
])
@pytest.fixture
def mock_vcf_df_multiple_sample_one_empty() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359'],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf', 'NC_011083:203200:C:T'] + [pd.NA] * 10,
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length',
])
@pytest.fixture
def mock_vcf_df_multiple_sample_one_invalid() -> pd.DataFrame:
return pd.DataFrame([
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359'],
['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA],
# This one is invalid because snpeff 'Allele' refers to some other variant (indicating it's a compound variant)
# I want to ignore all compound variants because it would be very difficult to index them in my software.
['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf', 'NC_011083:203200:C:T',
'T:123456_A>T', 'missense_variant', 'MODERATE', 'SEHA_RS01460', 'SEHA_RS01460', 'transcript',
'protein_coding', 'c.602C>T', 'p.Thr201Met', '602/927'],
], columns=['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p',
'ANN.cDNA.pos / cDNA.length'
])
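The `'T:123456_A>T'` allele in the fixture above is what a compound variant looks like in snpEff output. A simple heuristic for flagging such rows — assumed here for illustration, not necessarily the parser's exact rule — is to check for the extra `:` separator in the allele field:

```python
# Heuristic sketch (illustration only): snpEff writes compound-variant
# alleles as '<alt>:<pos>_<ref>><alt>', so a ':' in the allele flags them.
def is_compound_allele(allele: str) -> bool:
    return ':' in allele

flags = [is_compound_allele(a) for a in ['T', 'A', 'T:123456_A>T']]
# flags == [False, False, True]
```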
@pytest.fixture
def mock_vcf_df_with_ann_multiple_entries_multiple_samples() -> pd.DataFrame:
return pd.DataFrame([
['NC_011083', 140658, 'C', 'A',
{'ANN': [('A|missense_variant|MODERATE|murF|SEHA_RS01180|transcript|SEHA_RS01180|'
'protein_coding|1/1|c.497C>A|p.Ala166Glu|497/1359|497/1359|166/452||'),
('A|upstream_gene_variant|MODIFIER|mraY|SEHA_RS01185|transcript|SEHA_RS01185|'
'protein_coding||c.-856C>A|||||856|'),
('A|upstream_gene_variant|MODIFIER|murD|SEHA_RS01190|transcript|SEHA_RS01190|'
'protein_coding||c.-1941C>A|||||1941|')]}
],
['NC_011083', 203200, 'C', 'T',
{'ANN': [('T|missense_variant|MODERATE|SEHA_RS01460|SEHA_RS01460|transcript|SEHA_RS01460|'
'protein_coding|1/1|c.602C>T|p.Thr201Met|602/927|602/927|201/308||'),
('T|upstream_gene_variant|MODIFIER|SEHA_RS01445|SEHA_RS01445|transcript|SEHA_RS01445|'
'protein_coding||c.-2172G>A|||||2172|'),
('T|upstream_gene_variant|MODIFIER|can|SEHA_RS01455|transcript|SEHA_RS01455|'
'protein_coding||c.-710G>A|||||710|')]}
],
], columns=[
'CHROM', 'POS', 'REF', 'ALT', 'INFO',
])
@pytest.fixture
def mock_snpeff_infos_invalid():
class MockAnn():
def __init__(self):
pass
desc = 'invalid'
return {
'ANN': MockAnn()
}
def test_parse_annotation_headers(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser, mock_snpeff_infos):
headers_list = vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos)
assert ['Allele', 'Annotation', 'Annotation_Impact', 'Gene_Name', 'Gene_ID', 'Feature_Type', 'Feature_ID',
'Transcript_BioType', 'Rank', 'HGVS.c', 'HGVS.p', 'cDNA.pos / cDNA.length', 'CDS.pos / CDS.length',
            'AA.pos / AA.length', 'Distance', 'ERRORS / WARNINGS / INFO'] == headers_list


def test_parse_annotation_headers_invalid(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_snpeff_infos_invalid):
with pytest.raises(InvalidSnpEffVcfError) as execinfo:
vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos_invalid)
assert "Found 'ANN' in VCF information but description" in str(execinfo.value)
def test_parse_annotation_headers_no_annotation(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_snpeff_infos_empty):
headers_list = vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos_empty)
    assert [] == headers_list


def test_parse_annotation_entries_single(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_snpeff_infos, mock_vcf_df_with_ann_single: pd.DataFrame):
headers_list = vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos)
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=headers_list,
vcf_df=mock_vcf_df_with_ann_single)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
assert 1 == len(ann_entries_df)
assert [0] == list(ann_entries_df.index)
assert ['A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
            'c.497C>A', 'p.Ala166Glu', '497/1359', 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[0])


def test_parse_annotation_entries_multiple_entries_single_sample(
vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_snpeff_infos, mock_vcf_df_with_ann_multiple_entries_single_sample: pd.DataFrame):
headers_list = vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos)
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=headers_list,
vcf_df=mock_vcf_df_with_ann_multiple_entries_single_sample)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
assert 3 == len(ann_entries_df)
ann_entries_df = ann_entries_df.sort_values(['original_index', 'ANN.Gene_ID'])
assert [0, 0, 0] == list(ann_entries_df.index)
assert ['A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359', 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[0])
assert ['A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA, 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[1])
assert ['A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
            'c.-1941C>A', pd.NA, pd.NA, 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[2])


def test_parse_annotation_entries_multiple_entries_multiple_samples(
vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_snpeff_infos, mock_vcf_df_with_ann_multiple_entries_multiple_samples: pd.DataFrame):
headers_list = vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos)
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=headers_list,
vcf_df=mock_vcf_df_with_ann_multiple_entries_multiple_samples)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
assert 6 == len(ann_entries_df)
ann_entries_df = ann_entries_df.sort_values(['original_index', 'ANN.Gene_ID'])
assert [0, 0, 0, 1, 1, 1] == list(ann_entries_df.index)
assert ['A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359', 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[0])
assert ['A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA, 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[1])
assert ['A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA, 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[2])
assert ['T', 'upstream_gene_variant', 'MODIFIER', 'SEHA_RS01445', 'SEHA_RS01445', 'transcript', 'protein_coding',
'c.-2172G>A', pd.NA, pd.NA, 'NC_011083:203200:C:T'] == list(ann_entries_df.iloc[3])
assert ['T', 'upstream_gene_variant', 'MODIFIER', 'can', 'SEHA_RS01455', 'transcript', 'protein_coding',
'c.-710G>A', pd.NA, pd.NA, 'NC_011083:203200:C:T'] == list(ann_entries_df.iloc[4])
assert ['T', 'missense_variant', 'MODERATE', 'SEHA_RS01460', 'SEHA_RS01460', 'transcript', 'protein_coding',
            'c.602C>T', 'p.Thr201Met', '602/927', 'NC_011083:203200:C:T'] == list(ann_entries_df.iloc[5])


def test_parse_annotation_entries_no_annotation_single_sample(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_without_ann_single: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=[],
vcf_df=mock_vcf_df_without_ann_single)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
assert 1 == len(ann_entries_df)
assert [0] == list(ann_entries_df.index)
assert {True} == set(ann_entries_df.drop('VARIANT_ID', axis='columns').iloc[0].isna())
    assert 'NC_011083:140658:C:A' == ann_entries_df['VARIANT_ID'].iloc[0]


def test_parse_annotation_entries_no_annotation_multiple_sample(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_without_ann_multiple: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=[],
vcf_df=mock_vcf_df_without_ann_multiple)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
assert 2 == len(ann_entries_df)
assert [0, 1] == list(ann_entries_df.index)
assert {True} == set(ann_entries_df.drop('VARIANT_ID', axis='columns').iloc[0].isna())
assert 'NC_011083:140658:C:A' == ann_entries_df['VARIANT_ID'].iloc[0]
assert {True} == set(ann_entries_df.drop('VARIANT_ID', axis='columns').iloc[1].isna())
    assert 'NC_011083:203200:C:T' == ann_entries_df['VARIANT_ID'].iloc[1]


def test_parse_annotation_entries_some_with_some_without_annotations(
vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_snpeff_infos,
mock_vcf_df_with_and_without_ann: pd.DataFrame):
headers_list = vcf_snpeff_annotation_parser.parse_annotation_headers(mock_snpeff_infos)
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=headers_list,
vcf_df=mock_vcf_df_with_and_without_ann)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
assert 4 == len(ann_entries_df)
ann_entries_df = ann_entries_df.sort_values(['original_index', 'ANN.Gene_ID'])
assert [0, 0, 0, 1] == list(ann_entries_df.index)
assert ['A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu', '497/1359', 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[0])
assert ['A', 'upstream_gene_variant', 'MODIFIER', 'mraY', 'SEHA_RS01185', 'transcript', 'protein_coding',
'c.-856C>A', pd.NA, pd.NA, 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[1])
assert ['A', 'upstream_gene_variant', 'MODIFIER', 'murD', 'SEHA_RS01190', 'transcript', 'protein_coding',
'c.-1941C>A', pd.NA, pd.NA, 'NC_011083:140658:C:A'] == list(ann_entries_df.iloc[2])
assert {True} == set(ann_entries_df.drop('VARIANT_ID', axis='columns').iloc[3].isna())
    assert 'NC_011083:203200:C:T' == ann_entries_df['VARIANT_ID'].iloc[3]


def test_parse_annotation_entries_empty(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_empty: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.parse_annotation_entries(vcf_ann_headers=[],
vcf_df=mock_vcf_df_empty)
assert ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p', 'ANN.cDNA.pos / cDNA.length',
'VARIANT_ID'] == list(
ann_entries_df.columns)
    assert 0 == len(ann_entries_df)


def test_select_variant_annotations_single_sample(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_single_sample_annotations: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(mock_vcf_df_single_sample_annotations)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 1 == len(ann_entries_df)
assert ['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
            'c.497C>A', 'p.Ala166Glu'] == list(ann_entries_df.iloc[0])


def test_select_variant_annotations_some_na_values(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_single_sample_annotations_some_na_values: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(
mock_vcf_df_single_sample_annotations_some_na_values)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 1 == len(ann_entries_df)
assert ['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'NA',
            'c.497C>A', 'p.Ala166Glu'] == list(ann_entries_df.iloc[0].fillna('NA'))


def test_select_variant_annotations_single_sample_sars_cov_2(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_single_sample_annotations_sars_cov_2: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(
mock_vcf_df_single_sample_annotations_sars_cov_2)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 1 == len(ann_entries_df)
assert ['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
            'c.3683A>G', 'p.D1228G'] == list(ann_entries_df.iloc[0])


def test_select_variant_annotations_rv_single_sample_sars_cov_2(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_single_sample_annotations_rv_sars_cov_2: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(
mock_vcf_df_single_sample_annotations_rv_sars_cov_2)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 1 == len(ann_entries_df)
assert ['SampleA', 'NC_045512.2', 3948, 'A', 'G', 'SNP', 'SampleA.vcf', 'NC_045512.2:3948:A:G',
'G', 'missense_variant', 'MODERATE', 'ORF1ab', 'GU280_gp01', 'transcript', 'protein_coding',
            'c.3683A>G', 'p.D1228G'] == list(ann_entries_df.iloc[0])


def test_select_variant_annotations_multiple_sample(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_multiple_sample_annotations: pd.DataFrame):
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(mock_vcf_df_multiple_sample_annotations)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 2 == len(ann_entries_df)
assert ['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu'] == list(ann_entries_df.iloc[0])
assert ['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf', 'NC_011083:203200:C:T',
'T', 'missense_variant', 'MODERATE', 'SEHA_RS01460', 'SEHA_RS01460', 'transcript', 'protein_coding',
            'c.602C>T', 'p.Thr201Met'] == list(ann_entries_df.iloc[1])


def test_select_variant_annotations_one_no_annotation(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_multiple_sample_one_empty: pd.DataFrame):
annotation_columns = ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p']
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(mock_vcf_df_multiple_sample_one_empty)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 2 == len(ann_entries_df)
assert ['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu'] == list(ann_entries_df.iloc[0])
assert ['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf',
'NC_011083:203200:C:T'] == list(ann_entries_df.drop(annotation_columns, axis='columns').iloc[1])
    assert {True} == set(ann_entries_df[annotation_columns].iloc[1].isna().tolist())


def test_select_variant_annotations_one_invalid_annotation(vcf_snpeff_annotation_parser: VcfSnpEffAnnotationParser,
mock_vcf_df_multiple_sample_one_invalid: pd.DataFrame):
annotation_columns = ['ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p']
ann_entries_df = vcf_snpeff_annotation_parser.select_variant_annotations(mock_vcf_df_multiple_sample_one_invalid)
assert ['SAMPLE', 'CHROM', 'POS', 'REF', 'ALT', 'TYPE', 'FILE', 'VARIANT_ID',
'ANN.Allele', 'ANN.Annotation', 'ANN.Annotation_Impact', 'ANN.Gene_Name', 'ANN.Gene_ID',
'ANN.Feature_Type', 'ANN.Transcript_BioType', 'ANN.HGVS.c', 'ANN.HGVS.p'] == list(
ann_entries_df.columns)
assert 2 == len(ann_entries_df)
assert ['SampleA', 'NC_011083', 140658, 'C', 'A', 'snp', 'SampleA.vcf', 'NC_011083:140658:C:A',
'A', 'missense_variant', 'MODERATE', 'murF', 'SEHA_RS01180', 'transcript', 'protein_coding',
'c.497C>A', 'p.Ala166Glu'] == list(ann_entries_df.iloc[0])
assert ['SampleA', 'NC_011083', 203200, 'C', 'T', 'snp', 'SampleA.vcf',
'NC_011083:203200:C:T'] == list(ann_entries_df.drop(annotation_columns, axis='columns').iloc[1])
assert {True} == set(ann_entries_df[annotation_columns].iloc[1].isna().tolist())
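The fixtures and assertions above all revolve around snpEff's pipe-delimited `ANN` INFO entries. As a minimal, self-contained sketch of the parsing they exercise (the header list is the one asserted in `test_parse_annotation_headers`; the `split_ann_entry` helper is a hypothetical illustration, not the real `VcfSnpEffAnnotationParser` API):

```python
# Header names for snpEff's 'ANN' VCF annotation field, as asserted in the
# tests above. Illustrative helper only (assumption, not the parser's code).
ANN_HEADERS = ['Allele', 'Annotation', 'Annotation_Impact', 'Gene_Name',
               'Gene_ID', 'Feature_Type', 'Feature_ID', 'Transcript_BioType',
               'Rank', 'HGVS.c', 'HGVS.p', 'cDNA.pos / cDNA.length',
               'CDS.pos / CDS.length', 'AA.pos / AA.length', 'Distance',
               'ERRORS / WARNINGS / INFO']


def split_ann_entry(entry: str) -> dict:
    """Split one pipe-delimited ANN entry; empty fields become None."""
    fields = entry.split('|')
    return {header: (value or None)
            for header, value in zip(ANN_HEADERS, fields)}


ann = ('A|missense_variant|MODERATE|murF|SEHA_RS01180|transcript|'
       'SEHA_RS01180|protein_coding|1/1|c.497C>A|p.Ala166Glu|497/1359|'
       '497/1359|166/452||')
parsed = split_ann_entry(ann)
# parsed['Annotation'] == 'missense_variant'; parsed['Distance'] is None
```

This mirrors why the fixtures above carry trailing empty pipe fields: positions like `Distance` are present but blank for coding variants, which the tests then observe as `pd.NA` in the assembled dataframe.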
0b9140b98d0a37d96ddaccb8296f488e032c48b3 | 14,635 bytes | .py | Python | nova/tests/unit/virt/test_imagecache.py | bopopescu/nova-token @ ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | licenses: ["Apache-2.0"] | 2 forks (2017-07-20T17:31:34Z .. 2020-07-24T02:42:19Z)
# Copyright 2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from nova import block_device
from nova.compute import vm_states
import nova.conf
from nova import context
from nova import objects
from nova.objects import block_device as block_device_obj
from nova import test
from nova.tests.unit import fake_instance
from nova.virt import imagecache

CONF = nova.conf.CONF

swap_bdm_128 = [block_device.BlockDeviceDict(
    {'id': 1, 'instance_uuid': 'fake-instance',
     'device_name': '/dev/sdb1',
     'source_type': 'blank',
     'destination_type': 'local',
     'delete_on_termination': True,
     'guest_format': 'swap',
     'disk_bus': 'scsi',
     'volume_size': 128,
     'boot_index': -1})]

swap_bdm_256 = [block_device.BlockDeviceDict(
    {'id': 1, 'instance_uuid': 'fake-instance',
     'device_name': '/dev/sdb1',
     'source_type': 'blank',
     'destination_type': 'local',
     'delete_on_termination': True,
     'guest_format': 'swap',
     'disk_bus': 'scsi',
     'volume_size': 256,
     'boot_index': -1})]


class ImageCacheManagerTests(test.NoDBTestCase):

    def test_configurationi_defaults(self):
        self.assertEqual(2400, CONF.image_cache_manager_interval)
        self.assertEqual('_base', CONF.image_cache_subdirectory_name)
        self.assertTrue(CONF.remove_unused_base_images)
        self.assertEqual(24 * 3600,
                         CONF.remove_unused_original_minimum_age_seconds)

    def test_cache_manager(self):
        cache_manager = imagecache.ImageCacheManager()
        self.assertTrue(cache_manager.remove_unused_base_images)
        self.assertRaises(NotImplementedError,
                          cache_manager.update, None, [])
        self.assertRaises(NotImplementedError,
                          cache_manager._get_base)
        base_images = cache_manager._list_base_images(None)
        self.assertEqual([], base_images['unexplained_images'])
        self.assertEqual([], base_images['originals'])
        self.assertRaises(NotImplementedError,
                          cache_manager._age_and_verify_cached_images,
                          None, [], None)

    def test_list_running_instances(self):
        instances = [{'image_ref': '1',
                      'host': CONF.host,
                      'id': '1',
                      'uuid': '123',
                      'vm_state': '',
                      'task_state': ''},
                     {'image_ref': '2',
                      'host': CONF.host,
                      'id': '2',
                      'uuid': '456',
                      'vm_state': '',
                      'task_state': ''},
                     {'image_ref': '2',
                      'kernel_id': '21',
                      'ramdisk_id': '22',
                      'host': 'remotehost',
                      'id': '3',
                      'uuid': '789',
                      'vm_state': '',
                      'task_state': ''}]

        all_instances = [fake_instance.fake_instance_obj(None, **instance)
                         for instance in instances]

        image_cache_manager = imagecache.ImageCacheManager()

        self.mox.StubOutWithMock(objects.block_device.BlockDeviceMappingList,
                                 'bdms_by_instance_uuid')

        ctxt = context.get_admin_context()
        swap_bdm_256_list = block_device_obj.block_device_make_list_from_dicts(
            ctxt, swap_bdm_256)
        swap_bdm_128_list = block_device_obj.block_device_make_list_from_dicts(
            ctxt, swap_bdm_128)
        objects.block_device.BlockDeviceMappingList.bdms_by_instance_uuid(
            ctxt, ['123', '456', '789']).AndReturn({'123': swap_bdm_256_list,
                                                    '456': swap_bdm_128_list,
                                                    '789': swap_bdm_128_list})
        self.mox.ReplayAll()

        # The argument here should be a context, but it's mocked out
        running = image_cache_manager._list_running_instances(ctxt,
                                                              all_instances)

        self.assertEqual(4, len(running['used_images']))
        self.assertEqual((1, 0, ['instance-00000001']),
                         running['used_images']['1'])
        self.assertEqual((1, 1, ['instance-00000002',
                                 'instance-00000003']),
                         running['used_images']['2'])
        self.assertEqual((0, 1, ['instance-00000003']),
                         running['used_images']['21'])
        self.assertEqual((0, 1, ['instance-00000003']),
                         running['used_images']['22'])

        self.assertIn('instance-00000001', running['instance_names'])
        self.assertIn('123', running['instance_names'])

        self.assertEqual(4, len(running['image_popularity']))
        self.assertEqual(1, running['image_popularity']['1'])
        self.assertEqual(2, running['image_popularity']['2'])
        self.assertEqual(1, running['image_popularity']['21'])
        self.assertEqual(1, running['image_popularity']['22'])

        self.assertEqual(len(running['used_swap_images']), 2)
        self.assertIn('swap_128', running['used_swap_images'])
        self.assertIn('swap_256', running['used_swap_images'])

    def test_list_resizing_instances(self):
        instances = [{'image_ref': '1',
                      'host': CONF.host,
                      'id': '1',
                      'uuid': '123',
                      'vm_state': vm_states.RESIZED,
                      'task_state': None}]

        all_instances = [fake_instance.fake_instance_obj(None, **instance)
                         for instance in instances]

        image_cache_manager = imagecache.ImageCacheManager()
        self.mox.StubOutWithMock(objects.block_device.BlockDeviceMappingList,
                                 'bdms_by_instance_uuid')

        ctxt = context.get_admin_context()
        bdms = block_device_obj.block_device_make_list_from_dicts(
            ctxt, swap_bdm_256)
        objects.block_device.BlockDeviceMappingList.bdms_by_instance_uuid(
            ctxt, ['123']).AndReturn({'123': bdms})

        self.mox.ReplayAll()
        running = image_cache_manager._list_running_instances(ctxt,
                                                              all_instances)

        self.assertEqual(1, len(running['used_images']))
        self.assertEqual((1, 0, ['instance-00000001']),
                         running['used_images']['1'])
        self.assertEqual(set(['instance-00000001', '123',
                              'instance-00000001_resize', '123_resize']),
                         running['instance_names'])

        self.assertEqual(1, len(running['image_popularity']))
        self.assertEqual(1, running['image_popularity']['1'])
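The stubbing in the nova tests above uses the legacy mox record/replay pattern (`StubOutWithMock` + `AndReturn` + `ReplayAll`). A rough stdlib equivalent, assuming a migration to `unittest.mock` (the `Mapper` class here is a stand-in for `objects.block_device.BlockDeviceMappingList`, not nova code):

```python
from unittest import mock


class Mapper:
    """Stand-in for the class being stubbed in the tests above."""
    @staticmethod
    def bdms_by_instance_uuid(ctxt, uuids):
        raise RuntimeError('real DB lookup not available in unit tests')


# mox's record/replay collapses into patch + return_value + assert_called_*:
with mock.patch.object(Mapper, 'bdms_by_instance_uuid',
                       return_value={'123': ['swap_bdm']}) as stub:
    result = Mapper.bdms_by_instance_uuid(None, ['123'])

stub.assert_called_once_with(None, ['123'])
```

The call expectation that mox verifies implicitly on `VerifyAll` becomes an explicit `assert_called_once_with` after the patch context exits.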
0babd5d99e5de009ecb0233dadd309cb5f1c6506 | 4,510 | py | Python | OnePy/builtin_module/plotters/by_matplotlib.py | Chandlercjy/OnePyfx | 9bd43b721d3f7352495b6ccab76bd533a3d2e8f2 | [
"MIT"
] | 321 | 2017-07-09T09:25:45.000Z | 2022-03-29T16:51:35.000Z | OnePy/builtin_module/plotters/by_matplotlib.py | sunzhouhong/OnePy | 4e225945de297ba1211035a7b95b5094cdddc2a7 | [
"MIT"
] | 7 | 2017-08-23T12:10:29.000Z | 2020-03-26T12:56:09.000Z | OnePy/builtin_module/plotters/by_matplotlib.py | sunzhouhong/OnePy | 4e225945de297ba1211035a7b95b5094cdddc2a7 | [
"MIT"
] | 134 | 2017-07-26T22:29:18.000Z | 2022-03-23T09:22:10.000Z | import matplotlib.pyplot as plt
import matplotlib.style as style
import statsmodels.api as sm
from matplotlib.widgets import MultiCursor
import OnePy.custom_module.analysis as analysis
from OnePy.builtin_module.plotters.by_plotly import PlotBase
class Matplotlib(PlotBase):
def setting(self):
style.use('ggplot')
plt.rcParams['lines.linewidth'] = 1.4
plt.rcParams['figure.figsize'] = 6, 10
def close_df(self, ticker):
dataframe = self.ohlc_df(ticker)[['close']]
dataframe.rename(columns=dict(close=ticker), inplace=True)
return dataframe
def plot(self, ticker):
self.setting()
fig = plt.figure(tight_layout=True)
ax1 = fig.add_subplot(5, 2, 1)
ax2 = fig.add_subplot(5, 2, 2)
ax3 = fig.add_subplot(5, 2, 3)
ax4 = fig.add_subplot(5, 2, 4)
ax5 = fig.add_subplot(5, 2, 5)
ax6 = fig.add_subplot(5, 2, 6)
ax7 = fig.add_subplot(5, 2, 7)
ax8 = fig.add_subplot(5, 2, 8)
ax9 = fig.add_subplot(5, 2, 9)
ax10 = fig.add_subplot(5, 2, 10)
# 左边
self.close_df(ticker).plot(ax=ax1, sharex=ax5)
self.balance_df.plot(ax=ax3)
self.cash_df.plot(ax=ax5, sharex=ax5)
analysis.get_drawdown_df(self.balance_df).plot(ax=ax9, sharex=ax5)
holding_pnl = self.env.recorder.holding_pnl.single_dataframe()
holding_pnl.rename(columns=dict(
value='holding_pnl'), inplace=True)
holding_pnl.plot(ax=ax7, sharex=ax5)
# for i in self.holding_pnl_df:
# i.plot(ax=ax7, sharex=ax5)
# 右边
market_value = self.env.recorder.market_value.single_dataframe()
market_value.rename(columns=dict(
value='market_value'), inplace=True)
market_value.plot(ax=ax2, sharex=ax5)
margin = self.env.recorder.margin.single_dataframe()
margin.rename(columns=dict(
value='margin'), inplace=True)
margin.plot(ax=ax4, sharex=ax5)
# for i in self.positions_df:
# i.plot(ax=ax2, sharex=ax5)
# for i in self.margin_df:
# i.plot(ax=ax4, sharex=ax5)
self.realized_pnl_df.plot(ax=ax6, sharex=ax5, kind='bar')
sm.qqplot(self.returns_df['returns'],
dist='norm', line='s', ax=ax8, marker='.')
self.returns_df[self.returns_df != 0].hist(bins=100, ax=ax10)
MultiCursor(fig.canvas, (ax1, ax2, ax3, ax4, ax5, ax6,
ax7, ax9), color='r', lw=1)
plt.show()
def plot_A_share(self, ticker):
self.setting()
fig = plt.figure(tight_layout=True)
ax1 = fig.add_subplot(5, 2, 1)
ax2 = fig.add_subplot(5, 2, 2)
ax3 = fig.add_subplot(5, 2, 3)
ax4 = fig.add_subplot(5, 2, 4)
ax5 = fig.add_subplot(5, 2, 5)
ax6 = fig.add_subplot(5, 2, 6)
ax7 = fig.add_subplot(5, 2, 7)
ax8 = fig.add_subplot(5, 2, 8)
ax9 = fig.add_subplot(5, 2, 9)
ax10 = fig.add_subplot(5, 2, 10)
# 左边
self.close_df(ticker).plot(ax=ax1, sharex=ax5)
self.balance_df.plot(ax=ax3)
self.cash_df.plot(ax=ax5, sharex=ax5)
analysis.get_drawdown_df(self.balance_df).plot(ax=ax9, sharex=ax5)
holding_pnl = self.env.recorder.holding_pnl.single_dataframe()
holding_pnl.rename(columns=dict(
value='holding_pnl'), inplace=True)
holding_pnl.plot(ax=ax7, sharex=ax5)
# for i in self.holding_pnl_df:
# i.plot(ax=ax7, sharex=ax5)
# 右边
market_value = self.env.recorder.market_value.single_dataframe()
market_value.rename(columns=dict(
value='market_value'), inplace=True)
market_value.plot(ax=ax2, sharex=ax5)
margin = self.env.recorder.margin.single_dataframe()
margin.rename(columns=dict(
value='margin'), inplace=True)
margin.plot(ax=ax4, sharex=ax5)
# for i in self.positions_df:
# i.plot(ax=ax2, sharex=ax5)
# for i in self.margin_df:
# i.plot(ax=ax4, sharex=ax5)
self.realized_pnl_df.plot(ax=ax6, sharex=ax5, kind='bar')
sm.qqplot(self.returns_df['returns'],
dist='norm', line='s', ax=ax8, marker='.')
self.returns_df[self.returns_df != 0].hist(bins=100, ax=ax10)
MultiCursor(fig.canvas, (ax1, ax2, ax3, ax4, ax5, ax6,
ax7, ax9), color='r', lw=1)
plt.show()
| 31.760563 | 74 | 0.596231 | 650 | 4,510 | 4.006154 | 0.170769 | 0.050691 | 0.099846 | 0.107527 | 0.827957 | 0.827957 | 0.827957 | 0.827957 | 0.827957 | 0.827957 | 0 | 0.049424 | 0.268736 | 4,510 | 141 | 75 | 31.985816 | 0.740146 | 0.075166 | 0 | 0.804598 | 0 | 0 | 0.031777 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045977 | false | 0 | 0.068966 | 0 | 0.137931 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0bb41cdd70f207eb6a3abf362949d69d1f018268 | 2,652 | py | Python | parser/team04/Interpreter/Expression/relational.py | mr8ug/tytus | a09abe4095e49d333a8ed9ca81cb3d88f90872ba | [
"MIT"
] | 1 | 2021-01-09T05:32:35.000Z | 2021-01-09T05:32:35.000Z | parser/team04/Interpreter/Expression/relational.py | XiomRB/tytus | 0873e4bdce5c110bee6ef2aa98240be6a93ae024 | [
"MIT"
] | null | null | null | parser/team04/Interpreter/Expression/relational.py | XiomRB/tytus | 0873e4bdce5c110bee6ef2aa98240be6a93ae024 | [
"MIT"
] | null | null | null | from Interpreter.Expressions.expression import Expression
class Relational(Expression):
def __init__(self, left, right):
self.left = left
self.right = right
def getValue(self, env):
pass
def isNumeric(self, value):
return isinstance(value, int) or isinstance(value, float)
class MayorQue(Relational):
def __init__(self, left, right):
Relational.__init__(self, left, right)
def getValue(self, env):
leftValue = self.left.getValue(env)
rightValue = self.right.getValue(env)
areNums = self.isNumeric(leftValue) and self.isNumeric(rightValue)
if areNums:
return leftValue > rightValue
class MenorQue(Relational):
def __init__(self, left, right):
Relational.__init__(self, left, right)
def getValue(self, env):
leftValue = self.left.getValue(env)
rightValue = self.right.getValue(env)
areNums = self.isNumeric(leftValue) and self.isNumeric(rightValue)
if areNums:
return leftValue < rightValue
class MayorIgual(Relational):
def __init__(self, left, right):
Relational.__init__(self, left, right)
def getValue(self, env):
leftValue = self.left.getValue(env)
rightValue = self.right.getValue(env)
areNums = self.isNumeric(leftValue) and self.isNumeric(rightValue)
if areNums:
return leftValue >= rightValue
class MenorIgual(Relational):
def __init__(self, left, right):
Relational.__init__(self, left, right)
def getValue(self, env):
leftValue = self.left.getValue(env)
rightValue = self.right.getValue(env)
areNums = self.isNumeric(leftValue) and self.isNumeric(rightValue)
if areNums:
return leftValue <= rightValue
class IgualQue(Relational):
def __init__(self, left, right):
Relational.__init__(self, left, right)
def getValue(self, env):
leftValue = self.left.getValue(env)
rightValue = self.right.getValue(env)
areNums = self.isNumeric(leftValue) and self.isNumeric(rightValue)
if areNums:
return leftValue == rightValue
class Distinto(Relational):
def __init__(self, left, right):
Relational.__init__(self, left, right)
def getValue(self, env):
leftValue = self.left.getValue(env)
rightValue = self.right.getValue(env)
areNums = self.isNumeric(leftValue) and self.isNumeric(rightValue)
if areNums:
return leftValue != rightValue
| 28.826087 | 75 | 0.63273 | 277 | 2,652 | 5.870036 | 0.119134 | 0.098401 | 0.095941 | 0.135916 | 0.861009 | 0.834563 | 0.834563 | 0.834563 | 0.834563 | 0.834563 | 0 | 0 | 0.273756 | 2,652 | 91 | 76 | 29.142857 | 0.844237 | 0 | 0 | 0.698413 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.238095 | false | 0.015873 | 0.015873 | 0.015873 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f02c933acb58989445f218a9a62611df9ee45f16 | 15,603 | py | Python | mastermind_py/mastermind/test.py | dominguezvazquezdavid/inari-mastermind | 76aa0186aaaec2950c61f294ec6776e248797950 | [
"MIT"
] | null | null | null | mastermind_py/mastermind/test.py | dominguezvazquezdavid/inari-mastermind | 76aa0186aaaec2950c61f294ec6776e248797950 | [
"MIT"
] | null | null | null | mastermind_py/mastermind/test.py | dominguezvazquezdavid/inari-mastermind | 76aa0186aaaec2950c61f294ec6776e248797950 | [
"MIT"
] | null | null | null | from typing import Any, Dict, List
from django.test import TestCase
from rest_framework.test import APIClient
from rest_framework import status
from mastermind_py.mastermind.domain import Game
from mastermind_py.mastermind.repo import Games
class UserTestCase(TestCase):
def setUp(self):
self.client = APIClient()
## Utils
@staticmethod
def __createGame(num_slots: int, num_colors: int, max_guesses: int, reference: str,
status: str, colors: List[str], secret_code: List[str]) -> Game:
game = Game(
id = None,
num_slots = num_slots,
num_colors = num_colors,
max_guesses = max_guesses,
reference = reference,
status = status,
secret_code = secret_code,
guesses = []
)
game.colors = colors
game = Games().save(game)
return game
def __assertGuess(self, response: Any, expected_white_peg: int, expected_black_peg: int):
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(response.json()["guesses"][0]["white_pegs"], expected_white_peg)
self.assertEqual(response.json()["guesses"][0]["black_pegs"], expected_black_peg)
def test_get_games(self):
"""Check if retrieve all games correctly"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "red", "green", "yellow"])
response = self.client.get('/api/games/')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.json()["results"][0]), 8)
def test_get_game(self):
"""Check if retrieve a game correctly"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "red", "green", "yellow"])
response = self.client.get(f'/api/games/{game.id}/')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.json()), 8)
def test_create_game(self):
"""Check if a game is created correctly"""
response = self.client.post('/api/games/', '{ "num_slots": 4, "num_colors": 4, "max_guesses": 2 }', content_type='application/json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(len(response.json()), 8)
def test_create_guess(self):
"""Check if guess create correctly"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "orange", "orange", "orange"] }', content_type='application/json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(len(response.json()["guesses"]), 1)
def test_retrieve_guesses(self):
"""Check if guesses are retrieved correctly"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "orange", "orange", "orange"] }', content_type='application/json')
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "orange", "orange", "orange"] }', content_type='application/json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(len(response.json()["guesses"]), 2)
def test_none_white_peg(self):
"""Check if return none white peg"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["red", "blue", "yellow", "blue"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "orange", "orange", "orange"] }', content_type='application/json')
self.__assertGuess(response, 0, 0)
def test_one_white_peg(self):
"""Check if return one white peg"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "blue", "yellow", "blue"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 1, 0)
def test_two_white_peg(self):
"""Check if return two white peg"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 2, 0)
def test_three_white_peg(self):
"""Check if return three white peg"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "green", "blue"] }', content_type='application/json')
self.__assertGuess(response, 3, 0)
def test_four_white_peg(self):
"""Check if return four white peg"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "red", "green", "blue"] }', content_type='application/json')
self.__assertGuess(response, 4, 0)
def test_one_black_peg(self):
"""Check if return one black peg"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["green", "orange", "orange", "orange"] }', content_type='application/json')
self.__assertGuess(response, 0, 1)
def test_two_black_peg(self):
"""Check if return two black peg"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["green", "blue", "orange", "orange"] }', content_type='application/json')
self.__assertGuess(response, 0, 2)
def test_three_black_peg(self):
"""Check if return three black peg"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["green", "blue", "yellow", "orange"] }', content_type='application/json')
self.__assertGuess(response, 0, 3)
def test_won_game(self):
"""Check if returned status is won when reach four black peg"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["green", "blue", "yellow", "red"] }', content_type='application/json')
self.__assertGuess(response, 0, 4)
self.assertEqual(response.json()["status"], "won")
def test_lost_game(self):
"""Check if returned status is lost when reach max_guesses"""
game = self.__createGame(4, 5, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange"], ["green", "blue", "yellow", "red"])
self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "orange", "orange", "orange"] }', content_type='application/json')
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "orange", "orange", "orange"] }', content_type='application/json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(response.json()["status"], "lost")
def test_one_black_peg_same_color_guess(self):
"""Check if return only one black peg with same color guess"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "red", "green", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "yellow", "yellow"] }', content_type='application/json')
self.__assertGuess(response, 0, 1)
def test_two_black_peg_same_color_guess(self):
"""Check if return two black peg with same color guess"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "yellow", "green", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "yellow", "yellow"] }', content_type='application/json')
self.__assertGuess(response, 0, 2)
def test_three_black_peg_same_color_guess(self):
"""Check if return three black peg with same color guess"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["yellow", "yellow", "green", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "yellow", "yellow"] }', content_type='application/json')
self.__assertGuess(response, 0, 3)
def test_one_white_peg_same_color(self):
"""Check if return one white peg with same color"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "blue", "blue", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 1, 0)
def test_two_white_peg_same_color(self):
"""Check if return two white peg with same color"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "blue", "yellow", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 2, 0)
def test_one_white_peg_one_black_peg_same_color(self):
"""Check if return one white peg one black peg with same color"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "yellow", "yellow", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 1, 1)
def test_one_white_peg_two_black_peg_same_color(self):
"""Check if return one white peg two black peg with same color"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "yellow", "yellow", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "yellow", "yellow", "green"] }', content_type='application/json')
self.__assertGuess(response, 1, 2)
def test_one_white_peg_same_color(self):
"""Check if return one white peg when secret code if formed by the same color"""
game = self.__createGame(4, 4, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow"], ["red", "yellow", "yellow", "yellow"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["yellow", "green", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 1, 0)
#Feedback Unit Test
def test_One(self):
"""RGGB | RGGB | 4 | 0"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["red", "green", "green", "blue"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["red", "green", "green", "blue"] }', content_type='application/json')
self.__assertGuess(response, 0, 4)
def test_Two(self):
"""RRRR | BYOB | 0 | 0"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["red", "red", "red", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["blue", "yellow", "orange", "blue"] }', content_type='application/json')
self.__assertGuess(response, 0, 0)
def test_Three(self):
"""GBBR | GBRB | 2 | 2"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["green", "blue", "blue", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["green", "blue", "red", "blue"] }', content_type='application/json')
self.__assertGuess(response, 2, 2)
def test_Four(self):
"""BBBR | RBGG | 1 | 1"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["blue", "blue", "blue", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["red", "blue", "green", "green"] }', content_type='application/json')
self.__assertGuess(response, 1, 1)
def test_Five(self):
"""RBGG | BBBR | 1 | 1"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["red", "blue", "green", "green"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["blue", "blue", "blue", "red"] }', content_type='application/json')
self.__assertGuess(response, 1, 1)
def test_Six(self):
"""BBBR | BBRB | 4 | 0"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["blue", "blue", "blue", "red"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["blue", "blue", "red", "blue"] }', content_type='application/json')
self.assertNotEqual(response.json()["guesses"][0]["white_pegs"], 0)
self.assertNotEqual(response.json()["guesses"][0]["black_pegs"], 4)
self.__assertGuess(response, 2, 2)
def test_Seven(self):
"""WBWB | BWBW | 0 | 4"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["white", "blue", "white", "blue"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["blue", "white", "blue", "white"] }', content_type='application/json')
self.__assertGuess(response, 4, 0)
def test_Eight(self):
"""OOOW | OWWW | 2 | 0"""
game = self.__createGame(4, 6, 2, "3DB2C149E8", "running", ["red", "blue", "green", "yellow", "orange", "white"], ["orange", "orange", "orange", "white"])
response = self.client.post(f'/api/games/{game.id}/guesses/', '{ "code": ["orange", "white", "white", "white"] }', content_type='application/json')
self.__assertGuess(response, 0, 2) | 53.071429 | 162 | 0.598346 | 1,907 | 15,603 | 4.742003 | 0.066072 | 0.029415 | 0.042464 | 0.044565 | 0.850492 | 0.843083 | 0.808581 | 0.787018 | 0.771978 | 0.725755 | 0 | 0.029536 | 0.192783 | 15,603 | 294 | 163 | 53.071429 | 0.688448 | 0.083125 | 0 | 0.404908 | 0 | 0 | 0.336914 | 0.062973 | 0 | 0 | 0 | 0 | 0.269939 | 1 | 0.208589 | false | 0 | 0.03681 | 0 | 0.257669 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b2c39dbd8f7df1f9ab115059395a03aeacfc9de8 | 3,613 | py | Python | sped_correcao/_old/rule13.py | teocrono/scripts | 2970192e3184c9e1d3dd67390e544d767b809c23 | [
"MIT"
] | null | null | null | sped_correcao/_old/rule13.py | teocrono/scripts | 2970192e3184c9e1d3dd67390e544d767b809c23 | [
"MIT"
] | null | null | null | sped_correcao/_old/rule13.py | teocrono/scripts | 2970192e3184c9e1d3dd67390e544d767b809c23 | [
"MIT"
] | null | null | null | ##
##
##
def exec(conexao):
cursor = conexao.cursor()
  print("RULE 13 - Initializing", end=' ')
select = " SELECT r0 FROM principal WHERE r1 in (\"C501\") AND r2 = \"00\" AND r4 = \"04\" "
select = cursor.execute(select)
select = select.fetchall()
c501s = [i[0] for i in select]
for i in c501s:
print('-',end=' ')
update = " UPDATE principal SET "
update = update + " r2 = \"50\", "
update = update + " r6 = \"1,6500\", "
update = update + " r5 = r3, "
update = update + " r7 = REPLACE(CAST( ROUND((CAST(replace(r3,',','.') AS FLOAT) * 1.65 / 100),2) AS TEXT),'.',',') "
update = update + " WHERE 1=1 "
update = update + " AND r0 = " + str(i) + " "
cursor.execute(update)
conexao.commit()
select = " SELECT max(r0) FROM principal WHERE r1 in (\"C500\") AND r0 < " + str(i) + " "
select = cursor.execute(select)
r0 = select.fetchone()[0]
    ## check that the last C500 is not null
select = " SELECT min(r0)-1 FROM principal where r1 in (\"C500\") AND r0 > " + str(r0) + " "
select = cursor.execute(select)
r01 = select.fetchone()[0]
r01 = r01 if r01 != None else r0 + 100
update = " UPDATE principal SET "
update = update + " r13 = "
update = update + " (SELECT REPLACE(CAST( ROUND( SUM(CAST(replace(r7,',','.') AS FLOAT)),2) AS TEXT),'.',',') "
update = update + " FROM principal WHERE "
update = update + " r1 = \"C501\" "
update = update + " AND r0 BETWEEN " + str(r0) + " AND " + str(r01) + " "
update = update + " ) "
update = update + " WHERE 1=1 "
update = update + " AND r0 = " + str(r0) + " "
cursor.execute(update)
conexao.commit()
select = " SELECT r0 FROM principal WHERE r1 in (\"C505\") AND r2 = \"00\" AND r4 = \"04\" "
select = cursor.execute(select)
select = select.fetchall()
c501s = [i[0] for i in select]
for i in c501s:
print('-',end=' ')
update = " UPDATE principal SET "
update = update + " r2 = \"50\", "
update = update + " r6 = \"7,6000\", "
update = update + " r5 = r3, "
update = update + " r7 = REPLACE(CAST( ROUND((CAST(replace(r3,',','.') AS FLOAT) * 7.6 / 100),2) AS TEXT),'.',',') "
update = update + " WHERE 1=1 "
update = update + " AND r0 = " + str(i) + " "
cursor.execute(update)
conexao.commit()
select = " SELECT max(r0) FROM principal WHERE r1 in (\"C500\") AND r0 < " + str(i) + " "
select = cursor.execute(select)
r0 = select.fetchone()[0]
    ## check that the last C500 is not null
select = " SELECT min(r0)-1 FROM principal where r1 in (\"C500\") AND r0 > " + str(r0) + " "
select = cursor.execute(select)
r01 = select.fetchone()[0]
r01 = r01 if r01 != None else r0 + 100
update = " UPDATE principal SET "
update = update + " r13 = "
update = update + " (SELECT REPLACE(CAST( ROUND( SUM(CAST(replace(r7,',','.') AS FLOAT)),2) AS TEXT),'.',',') "
update = update + " FROM principal WHERE "
update = update + " r1 = \"C505\" "
update = update + " AND r0 BETWEEN " + str(r0) + " AND " + str(r01) + " "
update = update + " ) "
update = update + " WHERE 1=1 "
update = update + " AND r0 = " + str(r0) + " "
cursor.execute(update)
conexao.commit()
  print("Finished") | 37.635417 | 127 | 0.499308 | 423 | 3,613 | 4.264775 | 0.167849 | 0.226164 | 0.079823 | 0.066519 | 0.941242 | 0.941242 | 0.941242 | 0.941242 | 0.90133 | 0.90133 | 0 | 0.074151 | 0.331857 | 3,613 | 96 | 128 | 37.635417 | 0.673157 | 0.019928 | 0 | 0.828571 | 0 | 0.057143 | 0.335222 | 0.033437 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014286 | false | 0 | 0 | 0 | 0.014286 | 0.057143 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6539777d2fd395d145298fffb5feac9aa6915766 | 48 | py | Python | utils/__init__.py | kanttouchthis/clip-search | 463c3f2849a6f5ae7ebc6bfe7a932ec82f2ab0c1 | [
"MIT"
] | 1 | 2021-10-12T12:15:00.000Z | 2021-10-12T12:15:00.000Z | utils/__init__.py | kanttouchthis/clip-search | 463c3f2849a6f5ae7ebc6bfe7a932ec82f2ab0c1 | [
"MIT"
] | null | null | null | utils/__init__.py | kanttouchthis/clip-search | 463c3f2849a6f5ae7ebc6bfe7a932ec82f2ab0c1 | [
"MIT"
] | 1 | 2021-11-20T14:51:11.000Z | 2021-11-20T14:51:11.000Z | from utils.util import *
from utils.cli import * | 24 | 24 | 0.770833 | 8 | 48 | 4.625 | 0.625 | 0.486486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 48 | 2 | 25 | 24 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
332009b87bc82685a84397448cb18a296f8cfa3e | 6,761 | py | Python | day04_giant-squid/day04.py | notromanramirez/advent-of-code_2021 | 067c2f0597b0123ed1f4406b1c336b6982afc563 | [
"MIT"
] | null | null | null | day04_giant-squid/day04.py | notromanramirez/advent-of-code_2021 | 067c2f0597b0123ed1f4406b1c336b6982afc563 | [
"MIT"
] | null | null | null | day04_giant-squid/day04.py | notromanramirez/advent-of-code_2021 | 067c2f0597b0123ed1f4406b1c336b6982afc563 | [
"MIT"
] | null | null | null | # Roman Ramirz, rr8rk@virignia,edu
# Advent of Code, DAY 04
#%% LONG INPUT
my_input = []
with open('input.txt', 'r') as f:
for line in f:
my_input.append(line.strip('\n'))
#%% EXAMPLE INPUT
my_input = [
'7,4,9,5,11,17,23,2,0,14,21,24,10,16,13,6,15,25,12,22,18,20,8,19,3,26,1',
'', #1
'22 13 17 11 0',
' 8 2 23 4 24',
'21 9 14 16 7',
' 6 10 3 18 5',
' 1 12 20 15 19',
'', #7
' 3 15 0 2 22',
' 9 18 13 17 5',
'19 8 7 25 23',
'20 11 10 24 4',
'14 21 16 12 6',
'',
'14 21 17 24 4',
'10 16 15 9 19',
'18 8 23 26 20',
'22 11 13 6 5',
' 2 0 12 3 7'
]
#%% PART 1 CODE
bingo_nums = [int(n) for n in my_input[0].split(',')]
board_init = [[int(i) for i in n.split(' ') if i != ''] for n in my_input[1:] if n != '']
board_nums = []
board_guess = []
for i in range(0, len(board_init), 5):
nums_temp = [
board_init[i+0],
board_init[i+1],
board_init[i+2],
board_init[i+3],
board_init[i+4]
]
guess_temp = [
[False for _ in range(len(board_init[i+0]))],
[False for _ in range(len(board_init[i+1]))],
[False for _ in range(len(board_init[i+2]))],
[False for _ in range(len(board_init[i+3]))],
[False for _ in range(len(board_init[i+4]))]
]
[False for _ in range(len(board_init[0]))]
board_guess.append(guess_temp)
board_nums.append(nums_temp)
# iterate through the bingo_nums and do stuff
winning_board_num = False
last_num_called = False
for bingo_num in bingo_nums:
# change the boards according to the next guess
for i in range(len(board_nums)): # iterate through each board
for j in range(len(board_nums[i])): # iterate through each board row
for k in range(len(board_nums[i][j])): # iterate through each board row element
if ((bingo_num == board_nums[i][j][k]) and (not winning_board_num)):
board_guess[i][j][k] = True
# horizontal checks
for i in range(len(board_nums)): # iterate through each board
for j in range(len(board_nums[i])): # iterate through each board row
if (all(board_guess[i][j])):
# print(f"Board number {i} at row {j}, horizontal check from {bingo_num}")
if (type(winning_board_num) == bool):
winning_board_num = i
last_num_called = bingo_num
# vertical checks
for i in range(len(board_nums)): # iterate through each board
for j in range(len(board_nums[i])): # iterate through each board row
check_list = []
for k in range(len(board_nums[i][j])): # iterate through each board row element
check_list.append(board_guess[i][k][j])
if (all(check_list)):
# print(f"Board number {i} at column {j}, vertical check from {bingo_num}")
if (type(winning_board_num) == bool):
winning_board_num = i
last_num_called = bingo_num
sum_accum = 0
# for row in board_nums[winning_board_num]: print(row)
# for row in board_guess[winning_board_num]: print(row)
for j in range(len(board_nums[winning_board_num])):
for k in range(len(board_nums[winning_board_num][j])):
if board_guess[winning_board_num][j][k] == False:
# print(board_nums[winning_board_num][j][k], board_guess[winning_board_num][j][k])
sum_accum += board_nums[winning_board_num][j][k]
score = sum_accum * last_num_called
print(score)
#%% PART 2 CODE
bingo_nums = [int(n) for n in my_input[0].split(',')]
board_init = [[int(i) for i in n.split(' ') if i != ''] for n in my_input[1:] if n != '']
board_nums = []
board_guess = []
for i in range(0, len(board_init), 5): # each board is five consecutive rows
    nums_temp = board_init[i:i + 5]
    guess_temp = [[False] * len(row) for row in nums_temp]
    board_guess.append(guess_temp)
    board_nums.append(nums_temp)
# iterate through the bingo_nums and track the order in which boards win
winners = []
last_num_called = False
for bingo_num in bingo_nums:
    # mark the boards according to the next drawn number
    for i in range(len(board_nums)): # iterate through each board
        for j in range(len(board_nums[i])): # iterate through each board row
            for k in range(len(board_nums[i][j])): # iterate through each board row element
                if bingo_num == board_nums[i][j][k] and last_num_called is False:
                    board_guess[i][j][k] = True
    # horizontal checks
    for i in range(len(board_nums)): # iterate through each board
        for j in range(len(board_nums[i])): # iterate through each board row
            if all(board_guess[i][j]):
                # print(f"Board number {i} at row {j}, horizontal check from {bingo_num}")
                if len(winners) < len(board_nums) and i not in winners:
                    winners.append(i)
                if len(winners) == len(board_nums) and last_num_called is False:
                    last_num_called = bingo_num
    # vertical checks
    for i in range(len(board_nums)): # iterate through each board
        for j in range(len(board_nums[i])): # iterate through each board column
            check_list = []
            for k in range(len(board_nums[i][j])): # iterate through each column element
                check_list.append(board_guess[i][k][j])
            if all(check_list):
                # print(f"Board number {i} at column {j}, vertical check from {bingo_num}")
                if len(winners) < len(board_nums) and i not in winners:
                    winners.append(i)
                if len(winners) == len(board_nums) and last_num_called is False:
                    last_num_called = bingo_num
# score: sum of unmarked numbers on the last board to win, times the last number called
sum_accum = 0
# for row in board_nums[winners[-1]]: print(row)
# for row in board_guess[winners[-1]]: print(row)
for j in range(len(board_nums[winners[-1]])):
    for k in range(len(board_nums[winners[-1]][j])):
        if not board_guess[winners[-1]][j][k]:
            sum_accum += board_nums[winners[-1]][j][k]
score = sum_accum * last_num_called
print(score)

# File: tests/conftest.py (repo: kant/textar, license: MIT)
import os
import os.path
import sys
sys.path.append(os.path.join(os.getcwd(), '.'))
sys.path.append(os.path.join(os.getcwd(), '..'))

# File: test/requirements/test_container.py (repo: denz/ldp, license: BSD-3-Clause)
"""
### 5.2 Container
The following section contains normative clauses for Linked Data Platform
Container.
#### 5.2.1 General
The Linked Data Platform does not define how clients discover LDPCs.
##### 5.2.1.1 Each Linked Data Platform Container _MUST_ also be a conforming
Linked Data Platform RDF Source. LDP clients _MAY_ infer the following triple:
one whose subject is the LDPC, whose predicate is `rdf:type`, and whose object
is `ldp:RDFSource`, but there is no requirement to materialize this triple in
the LDPC representation.
##### 5.2.1.2 The representation of a LDPC _MAY_ have an `rdf:type` of
`ldp:Container` for Linked Data Platform Container. Non-normative note: LDPCs
might have additional types, like any LDP-RS.
##### 5.2.1.3 LDPC representations _SHOULD NOT_ use RDF container types
`rdf:Bag`, `rdf:Seq` or `rdf:List`.
Feature At Risk
The LDP Working Group proposes the REMOVAL of indirect containers, unless more
implementation reports arrive shortly, which would change the contents of the
list below.
##### 5.2.1.4 LDP servers exposing LDPCs _MUST_ advertise their LDP support by
exposing a HTTP `Link` header with a target URI matching the type of container
(see below) the server supports, and a link relation type of `type` (that is,
`rel='type'`) in all responses to requests made to the LDPC's HTTP
`Request-URI`. LDP servers _MAY_ provide additional HTTP `Link: rel='type'` headers.
The notes on the corresponding LDPR constraint apply equally to LDPCs.
> Valid container type URIs for `rel='type'` defined by this document are:
>
> * `http://www.w3.org/ns/ldp#BasicContainer` \- for LDP Basic Containers
> * `http://www.w3.org/ns/ldp#DirectContainer` \- for LDP Direct Containers
> * `http://www.w3.org/ns/ldp#IndirectContainer` \- for LDP Indirect Containers
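
As a concrete illustration (ours, not part of the specification), a server advertising a basic container could assemble the required header value like this:

```python
LDP_NS = "http://www.w3.org/ns/ldp#"

def type_link_header(container_kind):
    # container_kind is e.g. "BasicContainer", "DirectContainer" or "IndirectContainer"
    return '<%s%s>; rel="type"' % (LDP_NS, container_kind)
```

The resulting string is the value of the HTTP `Link` response header.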
##### 5.2.1.5 LDP servers _SHOULD_ respect all of a client's LDP-defined
hints, for example which subsets of LDP-defined state the client is interested
in processing, to influence the set of triples returned in representations of
a LDPC, particularly for large LDPCs. See also [LDP-PAGING].
#### 5.2.2 HTTP GET
Per section 4.2.2 HTTP GET the HTTP GET method is required and additional
requirements can be found in section 5.2.1 General.
#### 5.2.3 HTTP POST
Per [RFC7231], this HTTP method is optional and this specification does not
require LDP servers to support it. When a LDP server supports this method,
this specification imposes the following new requirements for LDPCs.
Any server-imposed constraints on creation or update must be advertised to
clients.
##### 5.2.3.1 LDP clients _SHOULD_ create member resources by submitting a
representation as the entity body of the HTTP `POST` to a known LDPC. If the
resource was created successfully, LDP servers _MUST_ respond with status code
201 (Created) and the `Location` header set to the new resource's URL. Clients
shall not expect any representation in the response entity body on a 201
(Created) response.
##### 5.2.3.2 When a successful HTTP `POST` request to a LDPC results in the
creation of a LDPR, a containment triple _MUST_ be added to the state of the
LDPC whose subject is the LDPC URI, whose predicate is `ldp:contains` and
whose object is the URI for the newly created document (LDPR). Other triples
may be added as well. The newly created LDPR appears as a contained resource
of the LDPC until the newly created document is deleted or removed by other
methods.
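
Modelling container state as a plain set of subject/predicate/object tuples, the required update amounts to one added triple; a hedged sketch (the helper name and URIs are ours):

```python
LDP_CONTAINS = "http://www.w3.org/ns/ldp#contains"

def add_containment_triple(state, ldpc_uri, new_ldpr_uri):
    # state is a set of (subject, predicate, object) tuples
    state.add((ldpc_uri, LDP_CONTAINS, new_ldpr_uri))
    return state
```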
##### 5.2.3.3 LDP servers _MAY_ accept an HTTP `POST` of non-RDF
representations (LDP-NRs) for creation of any kind of resource, for example
binary resources. See the Accept-Post section for details on how clients can
discover whether a LDPC supports this behavior.
##### 5.2.3.4 LDP servers that successfully create a resource from a RDF
representation in the request entity body _MUST_ honor the client's requested
interaction model(s). If any requested interaction model cannot be honored,
the server _MUST_ fail the request.
> * If the request header specifies a LDPR interaction model, then the
server _MUST_ handle subsequent requests to the newly created resource's URI
as if it is a LDPR (even if the content contains an `rdf:type` triple
indicating a type of LDPC).
> * If the request header specifies a LDPC interaction model, then the
server _MUST_ handle subsequent requests to the newly created resource's URI
as if it is a LDPC.
> * This specification does not constrain the server's behavior in other
cases.
>
> Clients use the same syntax, that is `HTTP Link` headers, to specify the
desired interaction model when creating a resource as servers use to advertise
it on responses.
>
> Note: A consequence of this is that LDPCs can be used to create LDPCs, if
the server supports doing so.
##### 5.2.3.5 LDP servers that allow creation of LDP-RSs via POST _MUST_ allow
clients to create new members by enclosing a request entity body with a
`Content-Type` request header whose value is `text/turtle` [turtle].
##### 5.2.3.6 LDP servers _SHOULD_ use the `Content-Type` request header to
determine the request representation's format when the request has an entity
body.
##### 5.2.3.7 LDP servers creating a LDP-RS via POST _MUST_ interpret the null
relative URI for the subject of triples in the LDP-RS representation in the
request entity body as identifying the entity in the request body. Commonly,
that entity is the model for the "to be created" LDPR, so triples whose
subject is the null relative URI result in triples in the created resource
whose subject is the created resource.
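
The null-relative-URI behaviour follows ordinary RFC 3986 reference resolution: an empty reference resolves to the base URI itself, so `<>` in the request body names the to-be-created document. Illustrated with the standard library (the URI is a hypothetical example):

```python
from urllib.parse import urljoin

base = "http://example.org/container/member1"
# the null relative URI <> resolves to the entity's base URI
assert urljoin(base, "") == base
```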
##### 5.2.3.8 LDP servers _SHOULD_ assign the URI for the resource to be
created using server application specific rules in the absence of a client
hint.
##### 5.2.3.9 LDP servers _SHOULD_ allow clients to create new resources
without requiring detailed knowledge of application-specific constraints. This
is a consequence of the requirement to enable simple creation and modification
of LDPRs. LDP servers expose these application-specific constraints as
described in section 4.2.1 General.
##### 5.2.3.10 LDP servers _MAY_ allow clients to suggest the URI for a
resource created through `POST`, using the HTTP `Slug` header as defined in
[RFC5023]. LDP adds no new requirements to this usage, so its presence
functions as a client hint to the server providing a desired string to be
incorporated into the server's final choice of resource URI.
##### 5.2.3.11 LDP servers that allow member creation via `POST` _SHOULD NOT_
re-use URIs.
##### 5.2.3.12 Upon successful creation of an LDP-NR (HTTP status code of
201-Created and URI indicated by `Location` response header), LDP servers
_MAY_ create an associated LDP-RS to contain data about the newly created
LDP-NR. If a LDP server creates this associated LDP-RS, it _MUST_ indicate its
location in the response by adding a HTTP `Link` header with a context URI
identifying the newly created LDP-NR (instead of the effective request URI), a
link relation value of `describedby`, and a target URI identifying the
associated LDP-RS resource [RFC5988].
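
Such a header combines a target URI, the `describedby` relation, and an `anchor` attribute carrying the context URI (RFC 5988); a sketch of building it, with hypothetical URIs and a helper name of our choosing:

```python
def describedby_header(ldp_nr_uri, ldp_rs_uri):
    # target = the describing LDP-RS; anchor = the newly created LDP-NR
    return '<%s>; rel="describedby"; anchor="%s"' % (ldp_rs_uri, ldp_nr_uri)
```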
##### 5.2.3.13 LDP servers that support `POST` _MUST_ include an `Accept-Post`
response header on HTTP `OPTIONS` responses, listing `POST` request media
type(s) supported by the server. LDP only specifies the use of `POST` for the
purpose of creating new resources, but a server can accept `POST` requests
with other semantics. While "POST to create" is a common interaction pattern,
LDP clients are not guaranteed, even when making requests to a LDP server,
that every successful `POST` request will result in the creation of a new
resource; they must rely on out of band information for knowledge of which
`POST` requests, if any, will have the "create new resource" semantics. This
requirement on LDP servers is intentionally stronger than the one levied in
the header registration; it is unrealistic to expect all existing resources
that support `POST` to suddenly return a new header or for all new
specifications constraining `POST` to be aware of its existence and require
it, but it is a reasonable requirement for new specifications such as LDP.
Feature At Risk
The LDP Working Group proposes incorporation of the following clause requiring
JSON-LD support.
##### 5.2.3.14 LDP servers that allow creation of LDP-RSs via POST _MUST_
allow clients to create new members by enclosing a request entity body with a
`Content-Type` request header whose value is `application/ld+json` [JSON-LD].
#### 5.2.4 HTTP PUT
Per [RFC7231], this HTTP method is optional and this specification does not
require LDP servers to support it. When a LDP server supports this method,
this specification imposes the following new requirements for LDPCs.
Any server-imposed constraints on creation or update must be advertised to
clients.
##### 5.2.4.1 LDP servers _SHOULD NOT_ allow HTTP `PUT` to update a LDPC's
containment triples; if the server receives such a request, it _SHOULD_
respond with a 409 (Conflict) status code.
##### 5.2.4.2 LDP servers that allow LDPR creation via `PUT` _SHOULD NOT_
re-use URIs.
#### 5.2.5 HTTP DELETE
Per [RFC7231], this HTTP method is optional and this specification does not
require LDP servers to support it. When a LDP server supports this method,
this specification imposes the following new requirements for LDPCs.
##### 5.2.5.1 When a contained LDPR is deleted, the LDPC server _MUST_ also
remove the corresponding containment triple, which has the effect of removing
the deleted LDPR from the containing LDPC.
> Non-normative note: The LDP server might perform additional actions, as
described in the normative references like [RFC7231]. For example, the server
could remove membership triples referring to the deleted LDPR, perform
additional cleanup tasks for resources it knows are no longer referenced or
have not been accessed for some period of time, and so on.
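
Against the same tuple-set model of container state, deletion is the inverse of the POST-time update; a sketch (names and URIs are ours):

```python
LDP_CONTAINS = "http://www.w3.org/ns/ldp#contains"

def remove_containment_triple(state, ldpc_uri, ldpr_uri):
    # dropping the containment triple removes the LDPR from the container's state
    state.discard((ldpc_uri, LDP_CONTAINS, ldpr_uri))
    return state
```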
##### 5.2.5.2 When a contained LDPR is deleted, and the LDPC server created
an associated LDP-RS (see the LDPC POST section), the LDPC server _MUST_ also
delete the associated LDP-RS it created.
#### 5.2.6 HTTP HEAD
Note that certain LDP mechanisms rely on HTTP headers, and HTTP recommends
that `HEAD` responses include the same headers as `GET` responses. LDP servers
must also include HTTP headers on responses to `OPTIONS`, see section 4.2.8
HTTP OPTIONS. Thus, implementers supporting `HEAD` should also carefully read
the section 5.2.2 HTTP GET and section 5.2.8 HTTP OPTIONS.
#### 5.2.7 HTTP PATCH
Per [RFC5789], this HTTP method is optional and this specification does not
require LDP servers to support it. When a LDP server supports this method,
this specification imposes the following new requirements for LDPCs.
Any server-imposed constraints on LDPR creation or update must be advertised
to clients.
##### 5.2.7.1 LDP servers are _RECOMMENDED_ to support HTTP `PATCH` as the
preferred method for updating a LDPC's minimal-container triples.
#### 5.2.8 HTTP OPTIONS
This specification imposes the following new requirements on HTTP `OPTIONS`
for LDPCs. Note that support for this method is required for LDPCs, since it
is required for LDPRs and LDPCs adhere to LDP-RS requirements.
##### 5.2.8.1 When responding to requests whose `request-URI` is a LDP-NR
with an associated LDP-RS, a LDPC server _MUST_ provide the same HTTP `Link`
response header as is required in the create response.
*[LDPRs]: Linked Data Platform Resources
*[LDPCs]: Linked Data Platform Containers
*[LDPR]: Linked Data Platform Resource
*[LDPC]: Linked Data Platform Container
*[LDP-RS]: Linked Data Platform RDF Source
*[RDF]: Resource Description Framework
*[LDP-NR]: Linked Data Platform Non-RDF Source
"""
from rdflib.namespace import Namespace, RDF
from rdflib import URIRef
from rdflib import Graph
from flask import render_template
from test.base import LDPTest, CONTINENTS, GN, PUT, AF, AS
from ldp import NS as LDP
from ldp.helpers import URL
BOB = URIRef('http://example.org/bob')
class LdpcGeneral(LDPTest):
    DATASET_DESCRIPTORS = {'continents': {'source': 'test/continents.rdf',
                                          'publicID': CONTINENTS}}

    def test_5_2_1_1(self):
        """
        5.2.1.1 Each Linked Data Platform Container MUST also be
        a conforming Linked Data Platform RDF Source. LDP clients MAY infer the following triple: one
        whose subject is the LDPC,
        whose predicate is rdf:type,
        and whose object is ldp:RDFSource,
        but there is no requirement to materialize this triple in the LDPC representation.
        """
        pass

    def test_5_2_1_4(self):
        """
        5.2.1.4 LDP servers
        exposing LDPCs
        MUST advertise their LDP support by exposing a HTTP Link header
        with a target URI matching the type of container (see below) the
        server supports, and
        a link relation type of type (that is, rel='type')
        in all responses to requests made to the LDPC's HTTP Request-URI.
        LDP servers MAY provide additional HTTP Link: rel='type' headers.
        The notes on the corresponding LDPR constraint apply
        equally to LDPCs.
        Valid container type URIs for rel='type' defined by this document are:
        http://www.w3.org/ns/ldp#BasicContainer - for LDP Basic Containers
        http://www.w3.org/ns/ldp#DirectContainer - for LDP Direct Containers
        http://www.w3.org/ns/ldp#IndirectContainer - for LDP Indirect Containers
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        response = self.client.open('/x/AF', method='OPTIONS')
        self.assertIn('Container', response.headers['Link'])

    def test_5_2_1_5(self):
        """
        5.2.1.5 LDP servers
        SHOULD respect all of a client's LDP-defined hints, for example
        which subsets of LDP-defined state
        the client is interested in processing,
        to influence the set of triples returned in representations of a LDPC,
        particularly for large LDPCs. See also [LDP-PAGING].
        """
        pass
POST = '''@prefix dcterms: <http://purl.org/dc/terms/>.
@prefix ldp: <http://www.w3.org/ns/ldp#>.
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#>.
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix cv: <http://purl.org/captsolo/resume-rdf/0.2/cv#> .
@base <{base}> .
<{name}> a foaf:Person;
dcterms:title '{name}’s data storage on the Web' .
'''
POST_JSON = '''[
{{
"@id": "{base}{name}",
"@type": [
"http://xmlns.com/foaf/0.1/Person"
],
"http://purl.org/dc/terms/title": [
{{
"@value": "bob\u2019s data storage on the Web"
}}
]
}}
]'''
class LdpcHttpPost(LDPTest):
    DATASET_DESCRIPTORS = {'continents': {'source': 'test/continents.rdf',
                                          'publicID': CONTINENTS}}

    def test_5_2_3_1(self):
        """
        5.2.3.1 LDP clients SHOULD create member resources by submitting a representation as
        the entity body of the HTTP POST to a known LDPC. If the resource was created successfully, LDP servers MUST
        respond with status code 201 (Created) and the Location
        header set to the new resource’s URL. Clients shall not expect any representation in the response
        entity body on a 201 (Created) response.
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        @self.app.route('/person/<nick>')
        @self.app.bind('person', 'http://example.org/<nick>',
                       types=(LDP.RDFSource,))
        @self.app.bind('continent', AF,
                       types=(LDP.RDFSource,))
        def person(person, continent):
            return '%s %s' % (continent, person)

        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST.format(base='http://example.org/',
                                                     name='bob'),
                                    headers={'Content-Type': 'text/turtle'})
        self.assertEqual(response.status_code, 201)
        self.assertEqual(response.headers['Location'], 'http://localhost/person/bob')

        # posting the same resource again conflicts with the existing member
        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST.format(base='http://example.org/',
                                                     name='bob'),
                                    headers={'Content-Type': 'text/turtle'})
        self.assertEqual(response.status_code, 409)

        # a subject outside any bound URI space cannot be processed
        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST.format(base='http://example.orgXX/',
                                                     name='bob'),
                                    headers={'Content-Type': 'text/turtle'})
        self.assertEqual(response.status_code, 422)

    def test_5_2_3_2(self):
        """5.2.3.2 When a successful HTTP POST request to a LDPC results in the creation of a LDPR, a
        containment triple MUST be added to the state of the LDPC
        whose subject is the LDPC URI,
        whose predicate is ldp:contains and whose object is the URI for the newly created document (LDPR). Other triples may be added as well.
        The newly created LDPR appears as a contained resource of the LDPC until the
        newly created document is deleted or removed by other methods.
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        @self.app.route('/person/<nick>')
        @self.app.bind('person', 'http://example.org/<nick>',
                       types=(LDP.RDFSource,))
        def person(person):
            return '%s' % person

        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST.format(
                                        base='http://example.org/',
                                        name='bob'),
                                    headers={'Content-Type': 'text/turtle'})
        self.assertEqual(response.status_code, 201)

        g = Graph().parse(data=self.client.get('/x/AF',
                                               headers={'Accept': 'text/turtle'}).data.decode(),
                          format='turtle')
        triples = list(g[:LDP.contains:BOB])
        self.assertTrue(triples)

    # def test_5_2_3_4(self):
    #     """
    #     5.2.3.4 LDP servers that successfully create a resource from a
    #     RDF representation in the request entity body MUST honor the client's requested interaction model(s).
    #     If any requested interaction model cannot be honored, the server MUST fail the request.
    #     If the request header specifies a LDPR interaction model, then the server MUST handle subsequent
    #     requests to the newly created resource's URI as if it is a LDPR
    #     (even if the content contains an rdf:type triple indicating a type of LDPC).
    #     If the request header specifies a LDPC interaction model, then the server MUST handle subsequent
    #     requests to the newly created resource's URI as if it is a LDPC.
    #     This specification does not constrain the server's behavior in other cases.
    #     Clients use the same syntax, that is HTTP Link headers, to specify the desired interaction model
    #     when creating a resource as servers use to advertise it on responses.
    #     Note: A consequence of this is that LDPCs can be used to create LDPCs, if the server supports doing so.
    #     """
    #     pass

    # def test_5_2_3_5(self):
    #     """
    #     5.2.3.5 LDP servers
    #     that allow creation of LDP-RSs via POST MUST
    #     allow clients to create new members by enclosing a request entity body with a
    #     Content-Type request header whose value is text/turtle [turtle].
    #     """
    #     pass

    # def test_5_2_3_6(self):
    #     """
    #     5.2.3.6 LDP servers SHOULD use the Content-Type request header
    #     to determine the request representation's format when the request has an entity body.
    #     """
    #     pass

    # def test_5_2_3_7(self):
    #     """
    #     5.2.3.7 LDP servers
    #     creating a LDP-RS via POST MUST
    #     interpret the null relative
    #     URI for the subject of triples in the LDP-RS representation in the
    #     request entity body as identifying the entity in the request body.
    #     Commonly, that entity is the model for the "to be created" LDPR, so
    #     triples whose subject is the null relative URI result in
    #     triples in the created resource whose subject is the created
    #     resource.
    #     """
    #     pass

    # def test_5_2_3_8(self):
    #     """
    #     5.2.3.8 LDP servers SHOULD assign the URI for the resource to be
    #     created using server application specific rules in the absence of a client hint.
    #     """
    #     pass

    # def test_5_2_3_9(self):
    #     """
    #     5.2.3.9 LDP servers SHOULD allow clients to create new resources without
    #     requiring detailed knowledge of application-specific constraints.
    #     This is a consequence of the requirement to enable simple creation and modification of LDPRs. LDP servers
    #     expose these application-specific constraints as described in section 4.2.1 General.
    #     """
    #     pass

    def test_5_2_3_12(self):
        """
        5.2.3.12 Upon successful creation of an
        LDP-NR (HTTP status code of
        201-Created and URI indicated by Location response header), LDP servers MAY create an associated
        LDP-RS
        to contain data about the newly created LDP-NR.
        If a LDP server creates this associated LDP-RS, it MUST indicate
        its location in the response by adding a HTTP Link header with
        a context URI identifying the newly created LDP-NR (instead of the effective request URI),
        a link relation value of describedby,
        and a target URI identifying the associated LDP-RS resource [RFC5988].
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        @self.app.route('/person/<nick>')
        @self.app.bind('nick', 'http://example.org/<nick>',
                       types=(LDP.RDFSource,))
        def person(nick):
            return '%s' % nick

        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST.format(
                                        base='http://example.org/',
                                        name='bob'),
                                    headers={'Content-Type': 'text/turtle'})
        response = self.client.get(URL(response.headers['Location']).path,
                                   headers={'Accept': 'text/turtle'})
        self.assertIn('data storage on the Web', response.data.decode())

    def test_5_2_3_13(self):
        """
        5.2.3.13 LDP servers that support POST MUST
        include an Accept-Post response header on HTTP OPTIONS
        responses, listing POST request media type(s) supported by the server.
        LDP only specifies the use of POST for the purpose of creating new resources, but a server
        can accept POST requests with other semantics.
        While "POST to create" is a common interaction pattern, LDP clients are not guaranteed, even when
        making requests to a LDP server, that every successful POST request will result in the
        creation of a new resource; they must rely on out of band information for knowledge of which POST requests,
        if any, will have the "create new resource" semantics.
        This requirement on LDP servers is intentionally stronger than the one levied in the
        header registration; it is unrealistic to expect all existing resources
        that support POST to suddenly return a new header or for all new specifications constraining
        POST to be aware of its existence and require it, but it is a reasonable requirement for new
        specifications such as LDP.
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        @self.app.route('/person/<nick>')
        @self.app.bind('nick', 'http://example.org/<nick>',
                       types=(LDP.RDFSource,))
        def person(nick):
            return '%s' % nick

        response = self.client.open('/x/AF', method='OPTIONS')
        self.assertIn('Accept-Post', response.headers)

    def test_5_2_3_14(self):
        """
        5.2.3.14 LDP servers
        that allow creation of LDP-RSs via POST MUST
        allow clients to create new members by enclosing a request entity body with a
        Content-Type request header whose value is application/ld+json [JSON-LD].
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        @self.app.route('/person/<nick>')
        @self.app.bind('person', 'http://example.org/<nick>',
                       types=(LDP.RDFSource,))
        def person(person):
            return '%s' % person

        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST_JSON.format(
                                        base='http://example.org/',
                                        name='bob'),
                                    headers={'Content-Type': 'application/ld+json'})
        self.assertEqual(response.status_code, 201)

        g = Graph().parse(data=self.client.get('/x/AF',
                                               headers={'Accept': 'text/turtle'}).data.decode(),
                          format='turtle')
        triples = list(g[:LDP.contains:BOB])
        self.assertTrue(triples)
PUT_CONFLICT = '''@prefix dc: <http://purl.org/dc/terms/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix gn: <http://www.geonames.org/ontology#> .
<http://www.telegraphis.net/data/continents/{0}#{0}> a foaf:PersonalProfileDocument;
foaf:primaryTopic <#me> ;
gn:population "922011001" ;
dc:title "Alice’s FOAF file" .
'''
PUT_NON_CONFLICT = '''@prefix dc: <http://purl.org/dc/terms/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix gn: <http://www.geonames.org/ontology#> .
<http://www.telegraphis.net/data/continents/{0}#{0}> a foaf:PersonalProfileDocument;
foaf:primaryTopic <#me> ;
gn:population "922011001" ;
<http://www.w3.org/ns/ldp#contains> <http://example.org/bob> ;
dc:title "Alice’s FOAF file" .
'''
class LdpcHttpPut(LDPTest):
    DATASET_DESCRIPTORS = {'continents': {'source': 'test/continents.rdf',
                                          'publicID': CONTINENTS}}

    def test_5_2_4_1(self):
        """
        5.2.4.1 LDP servers SHOULD NOT allow HTTP PUT to update a LDPC’s containment triples;
        if the server receives such a request, it SHOULD respond with a 409
        (Conflict) status code.
        """
        @self.app.route('/x/<c>')
        @self.app.bind('c', CONTINENTS['<c>#<c>'],
                       types=(LDP.BasicContainer, ))
        def population(c):
            return render_template('test.html', c=c, GN=GN)

        @self.app.route('/person/<nick>')
        @self.app.bind('person', 'http://example.org/<nick>',
                       types=(LDP.RDFSource,))
        def person(person):
            return '%s' % person

        response = self.client.open('/x/AF',
                                    method='POST',
                                    data=POST_JSON.format(base='http://example.org/',
                                                          name='bob'),
                                    headers={'Content-Type': 'application/ld+json'})

        response = self.client.open('/x/AF',
                                    method='PUT',
                                    data=PUT_CONFLICT.format('AF'),
                                    headers={'Content-Type': 'text/turtle'})
        self.assertEqual(response.status_code, 409)

        response = self.client.open('/x/AF',
                                    method='PUT',
                                    data=PUT_NON_CONFLICT.format('AF'),
                                    headers={'Content-Type': 'text/turtle'})
        self.assertEqual(response.status_code, 204)
class LdpcHttpDelete(LDPTest):
    def test_5_2_5_1(self):
        """
        5.2.5.1
        When a contained LDPR is deleted, the LDPC server MUST also remove
        the corresponding containment triple, which has the effect of removing the deleted LDPR from the containing LDPC.
        Non-normative note: The LDP server might perform additional actions,
        as described in the normative references like [RFC7231].
        For example, the server could remove membership triples referring to the deleted LDPR,
        perform additional cleanup tasks for resources it knows are no longer referenced or have not
        been accessed for some period of time, and so on.
        """
        pass

    def test_5_2_5_2(self):
        """
        5.2.5.2
        When a contained LDPR is deleted, and the LDPC server
        created an associated LDP-RS (see the LDPC POST section), the LDPC server MUST also delete the associated LDP-RS it created.
        """
        pass
class LdpcHttpOptions(LDPTest):
    def test_5_2_8_1(self):
        """
        5.2.8.1
        When responding to requests whose request-URI is a
        LDP-NR with an associated LDP-RS,
        a LDPC server MUST provide the same HTTP Link
        response header as is required in the create response.
        """
        pass

# File: src/reixs/add_subtract.py (repo: pmb399/REIXSAnalysis, license: MIT)
import numpy as np
from scipy.interpolate import interp1d
from .sca import loadSCAscans
from .simplemath import apply_offset
def ScanAddition(basedir, file, x_stream, y_stream, *args, avg=True, norm=False, is_XAS=False, background=None, xoffset=None, xcoffset=None, yoffset=None, ycoffset=None, deriv=None,energyloss=None,grid_x=[None,None,None]):
class added_object:
def __init__(self):
pass
for i in args:
if args.count(i) > 1:
raise ValueError("Cannot add the same scan to itself")
# Get the appropriate data first
Scandata = loadSCAscans(basedir, file, x_stream, y_stream, *args,
norm=False, is_XAS=is_XAS, background=background, deriv=deriv,energyloss=None,grid_x=grid_x)
for i, (k, v) in enumerate(Scandata.items()):
if i == 0:
MASTER_x_stream = v.x_stream
MASTER_y_stream = v.y_stream
name = str(k)+'+'
else:
if y_stream == 'XES' or y_stream.startswith('rXES'):
if not np.array_equal(MASTER_x_stream, v.x_stream):
raise ValueError(
"Cannot add emission spectra with different energy scales.")
else:
MASTER_y_stream += v.y_stream
else:
interp = interp1d(v.x_stream, v.y_stream,
fill_value='extrapolate')(MASTER_x_stream)
MASTER_y_stream += interp
name += "_" + str(k)
    # Average rather than sum if requested
    if avg:
        MASTER_y_stream = MASTER_y_stream / (i + 1)
    data = dict()
    data[0] = added_object()
    data[0].x_stream = MASTER_x_stream
    data[0].y_stream = MASTER_y_stream
    data[0].scan = name
    # Normalize the combined signal to [0, 1]
    if norm:
        data[0].y_stream = np.interp(
            data[0].y_stream, (data[0].y_stream.min(), data[0].y_stream.max()), (0, 1))
    data[0].x_stream = apply_offset(data[0].x_stream, xoffset, xcoffset)
    data[0].y_stream = apply_offset(data[0].y_stream, yoffset, ycoffset)
    # Convert emission energy to energy loss if an excitation energy is given
    if energyloss is not None:
        data[0].x_stream = energyloss - data[0].x_stream
    return data
def ScanSubtraction(basedir, file, x_stream, y_stream, *args, avg=True, norm=False, is_XAS=False, background=None, xoffset=None, xcoffset=None, yoffset=None, ycoffset=None, deriv=None,energyloss=None,grid_x=[None,None,None]):
class added_object:
def __init__(self):
pass
    for i in args:
        if args.count(i) > 1:
            raise ValueError("Cannot subtract the same scan from itself")
# Get the appropriate data first
Scandata = loadSCAscans(basedir, file, x_stream, y_stream, *args,
norm=False, is_XAS=is_XAS, background=background, deriv=deriv,energyloss=None,grid_x=grid_x,)
for i, (k, v) in enumerate(Scandata.items()):
if i == 0:
MASTER_x_stream = v.x_stream
MASTER_y_stream = v.y_stream
name = str(k) + '-'
else:
if y_stream == 'XES' or y_stream.startswith('rXES'):
if not np.array_equal(MASTER_x_stream, v.x_stream):
raise ValueError(
"Cannot subtract emission spectra with different energy scales.")
else:
MASTER_y_stream -= v.y_stream
else:
interp = interp1d(v.x_stream, v.y_stream,
fill_value='extrapolate')(MASTER_x_stream)
MASTER_y_stream -= interp
name += "_" + str(k)
    # Average rather than accumulate the difference if requested
    if avg:
        MASTER_y_stream = MASTER_y_stream / (i + 1)
    data = dict()
    data[0] = added_object()
    data[0].x_stream = MASTER_x_stream
    data[0].y_stream = MASTER_y_stream
    data[0].scan = name
    # Normalize the combined signal to [0, 1]
    if norm:
        data[0].y_stream = np.interp(
            data[0].y_stream, (data[0].y_stream.min(), data[0].y_stream.max()), (0, 1))
    data[0].x_stream = apply_offset(data[0].x_stream, xoffset, xcoffset)
    data[0].y_stream = apply_offset(data[0].y_stream, yoffset, ycoffset)
    # Convert emission energy to energy loss if an excitation energy is given
    if energyloss is not None:
        data[0].x_stream = energyloss - data[0].x_stream
    return data
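Both functions share one core idea: every scan after the first is resampled onto the first scan's x-grid before being combined. That step can be sketched in isolation with synthetic data (no `loadSCAscans` needed; `np.interp` stands in for `interp1d(..., fill_value='extrapolate')` on in-range points):

```python
import numpy as np

# Two synthetic scans recorded on different x-grids.
x1 = np.linspace(0.0, 10.0, 11)
y1 = 2.0 * x1
x2 = np.linspace(0.0, 10.0, 21)
y2 = 3.0 * x2

# Resample the second scan onto the first scan's grid, then combine,
# mirroring what ScanAddition does with avg=True.
y2_on_x1 = np.interp(x1, x2, y2)
combined = (y1 + y2_on_x1) / 2.0   # pointwise average of the two scans
```

Resampling first is what makes it safe to add or subtract scans whose detectors were read out on different energy grids; only the strict `XES`/`rXES` branch requires identical grids.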
# rubyenv/__init__.py (repo: schinckel/rubyenv, license: MIT)
import six
if six.PY2:
from _version import __version__
from app import main
elif six.PY3:
from ._version import __version__
from .app import main
# SUAVE/Analyses/Atmospheric/__init__.py (repo: Vinicius-Tanigawa/Undergraduate-Research-Project, license: MIT)
# explicit relative imports (the implicit Python 2 form fails on Python 3)
from .Atmospheric import Atmospheric
from .US_Standard_1976 import US_Standard_1976
# cars.py (repo: chalo2812/cars, license: Apache-2.0)
import random
import sys
import pygame
from pygame.locals import *
FPS = 30
SCREENWIDTH = 846   # 282 * 3
SCREENHEIGHT = 358  # 179 * 2

# image, sound and hitmask dicts
IMAGES, SOUNDS, HITMASKS = {}, {}, {}

# default background, shown at startup and restored with Enter
BACKGROUNDS = 'naipe_doble.jpg'
BACKGROUNDS_LIST = (
'imagenesCars/dinaco.jpg',
'imagenesCars/fly.jpg',
'imagenesCars/hippy.jpg',
'imagenesCars/luigi.jpg',
'imagenesCars/ramon.jpg',
'imagenesCars/sargento.jpg',
'imagenesCars/Doc.jpg',
'imagenesCars/sheriff.jpg',
)
def main():
    global SCREEN, FPSCLOCK
    pygame.init()
    FPSCLOCK = pygame.time.Clock()
    SCREEN = pygame.display.set_mode((SCREENWIDTH, SCREENHEIGHT))
    pygame.display.set_caption('Cars')
    IMAGES['background'] = pygame.image.load(BACKGROUNDS).convert()
    posx = 0
    posy = 0
    SCREEN.blit(IMAGES['background'], (0, 0))
    # Each of these keys swaps in a specific background image; a dict keeps
    # the event loop from repeating the same load-and-blit block dozens of
    # times. Behavior is unchanged from the original if-chain.
    key_backgrounds = {
        K_RETURN: BACKGROUNDS,
        K_r: 'imagenesCars/rayo.jpg',
        K_d: 'imagenesCars/derby1.jpg',
        K_w: 'imagenesCars/Wingo.png',
        K_o: 'imagenesCars/Miss_Fritter.jpg',
        K_s: 'imagenesCars/sally.jpg',
        K_1: 'imagenesPacman/Pac_Man.png',
        K_2: 'imagenesPacman/Pacman 240.png',
        K_f: 'imagenesCars/franchesco.jpg',
        K_t: 'imagenesCars/mate.jpg',
        K_m: 'imagenesMario/mario.jpg',
        K_c: 'imagenesCars/cruz.jpg',
        K_j: 'imagenesMario/Wario.png',
        K_n: 'imagenesMario/Yoshi.png',
        K_b: 'imagenesMario/interrogacionjj.jpg',
        K_y: 'imagenesMario/shroom.jpg',
        K_z: 'imagenesCars/rayo lodo.jpg',
        K_x: 'imagenesPacman/bola.jpg',
        K_0: 'imagenesPacman/pink.png',
        K_3: 'imagenesPacman/blinky.jpg',
        K_4: 'imagenesCars/ambulancia.jpg',
        K_5: 'imagenesCars/cruz 95.jpg',
        K_e: 'imagenesCars/boost.jpg',
        K_a: 'imagenesCars/apb.jpg',
        K_8: 'imagenesCars/Chick.jpg',
        K_q: 'imagenesCars/cruz final.jpg',
        K_k: 'imagenesCars/mack.jpg',
        K_g: 'imagenesCars/guido.jpg',
        K_h: 'imagenesCars/Holley_Shiftwell.png',
        K_6: 'imagenesCars/doc final.jpg',
        K_7: 'imagenesCars/Doc.jpg',
        K_9: 'imagenesCars/Snot_rod_side.jpg',
        K_v: 'imagenesPacman/Clyde.png',
        K_p: 'imagenesPacman/inky.png',
        K_i: 'imagenesPacman/fantas.png',
        K_u: 'imagenesPacman/pacman izq.jpg',
    }

    while True:
        for event in pygame.event.get():
            if event.type == QUIT:
                pygame.quit()
                sys.exit()
            if event.type == KEYDOWN:
                if event.key == K_ESCAPE:
                    pygame.quit()
                    sys.exit()
                elif event.key == K_SPACE:
                    # pick a random background from the rotation list
                    posx = posy = 0
                    rand_bg = random.randint(0, len(BACKGROUNDS_LIST) - 1)
                    IMAGES['background'] = pygame.image.load(BACKGROUNDS_LIST[rand_bg]).convert()
                    SCREEN.blit(IMAGES['background'], (0, 0))
                elif event.key in key_backgrounds:
                    posx = posy = 0
                    IMAGES['background'] = pygame.image.load(key_backgrounds[event.key]).convert()
                    SCREEN.blit(IMAGES['background'], (0, 0))
                elif event.key == K_RIGHT:
                    if posx < SCREENWIDTH:
                        posx += 20
                    SCREEN.blit(IMAGES['background'], (posx, posy))
                elif event.key == K_LEFT:
                    if posx > 0:
                        posx -= 20
                    SCREEN.blit(IMAGES['background'], (posx, posy))
                elif event.key == K_DOWN:
                    if posy < SCREENHEIGHT:
                        posy += 30
                    SCREEN.blit(IMAGES['background'], (posx, posy))
                elif event.key == K_UP:
                    if posy > 0:
                        posy -= 30
                    SCREEN.blit(IMAGES['background'], (posx, posy))
        pygame.display.flip()
        FPSCLOCK.tick(FPS)
if __name__ == '__main__':
main()
# xnmt/rl/__init__.py (repo: philip30/xnmt, license: Apache-2.0)
import xnmt.rl.policy_gradient
import xnmt.rl.policy_network
import xnmt.rl.policy_action
# src/extensible_locks/__init__.py (repo: QuiNovas/extensible-locks, license: Apache-2.0)
from threading import Lock as _Lock, RLock as _RLock
class Lock(object):
    """Subclassable wrapper around threading.Lock."""
def __init__(self):
self.__lock = _Lock()
def __enter__(self):
return self.__lock.__enter__()
def __exit__(self, *args):
return self.__lock.__exit__(*args)
def __repr__(self):
return self.__lock.__repr__()
    def acquire(self, *args):
        # propagate threading.Lock.acquire's boolean result
        return self.__lock.acquire(*args)
def release(self):
self.__lock.release()
def locked(self):
return self.__lock.locked()
class RLock(object):
    """Subclassable wrapper around threading.RLock."""
def __init__(self):
self.__lock = _RLock()
def __enter__(self):
return self.__lock.__enter__()
def __exit__(self, *args):
return self.__lock.__exit__(*args)
def __repr__(self):
return self.__lock.__repr__()
    def acquire(self, *args):
        # propagate threading.RLock.acquire's boolean result
        return self.__lock.acquire(*args)
def release(self):
self.__lock.release()
__all__ = ["Lock", "RLock"]
#!/usr/bin/env python
# function/stickers.py (repo: Jianghuchengphilip/Master-art-punk, license: Apache-2.0)
# -*- coding: UTF-8 -*-
"""=================================================
@Author :蒋虎成
@Date :2021/9/22 17:04
@Desc :贴纸元素数据
=================================================="""
# 香烟贴纸
cigarette = {
'colors': [0, '000000', 'dddddd', 'c6c6c6', 'e25b26'],
'data': [
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 3, 3, 3, 3, 3, 4, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
]
}
# Men's hair sticker
hairman = {
'colors': [0, 'ed93f0'],
'data': [
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
]
}
# Women's hair sticker
hairwoman = {
'colors': [0, 'ed93f0'],
'data': [
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
]
}
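Each sticker is a palette plus an index grid: `colors[0]` marks transparency and every non-zero cell indexes a hex color string. A small helper can expand a sticker into concrete pixels (the helper and the tiny demo sticker are illustrative, not part of this module):

```python
def sticker_pixels(sticker):
    """Expand a sticker dict into a list of (x, y, '#rrggbb') tuples.

    Index 0 marks a transparent cell; any other index points into the
    sticker's color palette.
    """
    pixels = []
    for y, row in enumerate(sticker['data']):
        for x, color_index in enumerate(row):
            if color_index:
                pixels.append((x, y, '#' + sticker['colors'][color_index]))
    return pixels


# A 2x2 demo sticker using the cigarette tip color.
demo = {'colors': [0, 'e25b26'], 'data': [[0, 1], [1, 0]]}
print(sticker_pixels(demo))   # [(1, 0, '#e25b26'), (0, 1, '#e25b26')]
```

The same loop works on the full 19x24 grids above, which is all a renderer needs to draw each sticker cell by cell.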
# django_ocr_server/management/commands/ttl.py (repo: shmakovpn/django_ocr_server, license: Apache-2.0)
"""
django_ocr_server/management/commands/ttl.py
Removes all instances of OCRedFile whose OCRedFile.uploaded+OCR_TTL is lower than the current datetime,
unless OCR_TTL is 0 (NOTE: if OCR_TTL<0, all instances of OCRedFile will be removed; use only for tests).
Removes all OCRedFile.files whose OCRedFile.uploaded+OCR_FILES_TTL is lower than the current datetime,
unless OCR_FILES_TTL is 0
(NOTE: if OCR_FILES_TTL<0, all OCRedFile.files will be removed; use only for tests).
Removes all OCRedFile.ocred_pdfs whose OCRedFile.uploaded+OCR_PDF_TTL is lower than the current datetime,
unless OCR_PDF_TTL is 0
(NOTE: if OCR_PDF_TTL<0, all OCRedFile.ocred_pdfs will be removed; use only for tests). 2019-04-13
"""
__author__ = 'shmakovpn <shmakovpn@yandex.ru>'
__date__ = '2019-04-18'
from django.core.management.base import BaseCommand
from django_ocr_server.models import OCRedFile
class Command(BaseCommand):
    """Management command that applies the TTL cleanup described in the module docstring."""
    help = (
        'Removes expired OCRedFile instances, OCRedFile.files and OCRedFile.ocred_pdfs '
        'according to the OCR_TTL, OCR_FILES_TTL and OCR_PDF_TTL settings '
        '(see the module docstring for details).'
    )

    def handle(self, *args, **options):
        """
        Runs OCRedFile.ttl() and reports how many models, files and pdfs were removed.
        :param args: not used
        :param options: not used
        :return: None
        """
        ttl_result = OCRedFile.ttl()
        self.stdout.write(self.style.SUCCESS('Total models removed: %s' % str(ttl_result[0])))
        self.stdout.write(self.style.SUCCESS('Total files removed: %s' % str(ttl_result[1])))
        self.stdout.write(self.style.SUCCESS('Total pdf removed: %s' % str(ttl_result[2])))
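A TTL cleanup like this is meant to run periodically rather than once; a typical deployment wires the management command into cron (the paths, virtualenv location and log file below are hypothetical examples, not part of the package):

```shell
# Example crontab entry: run the TTL cleanup every night at 03:15.
# Adjust the interpreter and project paths to your deployment.
15 3 * * * /opt/venv/bin/python /opt/project/manage.py ttl >> /var/log/ocr_ttl.log 2>&1
```

Running it via `manage.py` ensures Django settings (and therefore the OCR_TTL, OCR_FILES_TTL and OCR_PDF_TTL values) are loaded the same way as in the web process.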
# sdk/python/pulumi_alicloud/kvstore/instance.py (repo: pulumi/pulumi-alicloud, licenses: ECL-2.0 / Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['InstanceArgs', 'Instance']
@pulumi.input_type
class InstanceArgs:
def __init__(__self__, *,
auto_renew: Optional[pulumi.Input[bool]] = None,
auto_renew_period: Optional[pulumi.Input[int]] = None,
auto_use_coupon: Optional[pulumi.Input[bool]] = None,
availability_zone: Optional[pulumi.Input[str]] = None,
backup_id: Optional[pulumi.Input[str]] = None,
backup_periods: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_time: Optional[pulumi.Input[str]] = None,
business_info: Optional[pulumi.Input[str]] = None,
capacity: Optional[pulumi.Input[int]] = None,
config: Optional[pulumi.Input[Mapping[str, Any]]] = None,
connection_string_prefix: Optional[pulumi.Input[str]] = None,
coupon_no: Optional[pulumi.Input[str]] = None,
db_instance_name: Optional[pulumi.Input[str]] = None,
dedicated_host_group_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
enable_backup_log: Optional[pulumi.Input[int]] = None,
enable_public: Optional[pulumi.Input[bool]] = None,
engine_version: Optional[pulumi.Input[str]] = None,
force_upgrade: Optional[pulumi.Input[bool]] = None,
global_instance: Optional[pulumi.Input[bool]] = None,
global_instance_id: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
instance_name: Optional[pulumi.Input[str]] = None,
instance_release_protection: Optional[pulumi.Input[bool]] = None,
instance_type: Optional[pulumi.Input[str]] = None,
kms_encrypted_password: Optional[pulumi.Input[str]] = None,
kms_encryption_context: Optional[pulumi.Input[Mapping[str, Any]]] = None,
maintain_end_time: Optional[pulumi.Input[str]] = None,
maintain_start_time: Optional[pulumi.Input[str]] = None,
modify_mode: Optional[pulumi.Input[int]] = None,
node_type: Optional[pulumi.Input[str]] = None,
order_type: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]]] = None,
password: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
private_connection_port: Optional[pulumi.Input[str]] = None,
private_connection_prefix: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
restore_time: Optional[pulumi.Input[str]] = None,
secondary_zone_id: Optional[pulumi.Input[str]] = None,
security_group_id: Optional[pulumi.Input[str]] = None,
security_ip_group_attribute: Optional[pulumi.Input[str]] = None,
security_ip_group_name: Optional[pulumi.Input[str]] = None,
security_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
srcdb_instance_id: Optional[pulumi.Input[str]] = None,
ssl_enable: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_auth_mode: Optional[pulumi.Input[str]] = None,
vswitch_id: Optional[pulumi.Input[str]] = None,
zone_id: Optional[pulumi.Input[str]] = None):
"""
        The set of arguments for constructing an Instance resource.
:param pulumi.Input[bool] auto_renew: Whether to renewal a KVStore DBInstance automatically or not. It is valid when payment_type is `PrePaid`. Default to `false`.
:param pulumi.Input[int] auto_renew_period: Auto-renewal period of an KVStore DBInstance, in the unit of the month. It is valid when payment_type is `PrePaid`. Valid value: [1~12], Default to `1`.
:param pulumi.Input[bool] auto_use_coupon: Specifies whether to use a coupon. Default to: `false`.
:param pulumi.Input[str] availability_zone: It has been deprecated since provider version 1.101.0. Use `zone_id` instead.
:param pulumi.Input[str] backup_id: The ID of the backup file of the source instance.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_periods: Backup period.
:param pulumi.Input[str] backup_time: Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
:param pulumi.Input[str] business_info: The ID of the event or the business information.
:param pulumi.Input[int] capacity: The storage capacity of the KVStore DBInstance. Unit: MB.
:param pulumi.Input[Mapping[str, Any]] config: The configuration of the KVStore DBInstance. For available parameters, see the latest [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
:param pulumi.Input[str] connection_string_prefix: It has been deprecated since provider version 1.101.0. Use resource `kvstore.Connection` instead.
:param pulumi.Input[str] coupon_no: The coupon code. Default to: `youhuiquan_promotion_option_id_for_blank`.
:param pulumi.Input[str] db_instance_name: The name of the KVStore DBInstance. It is a string of 2 to 256 characters.
:param pulumi.Input[str] dedicated_host_group_id: The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
:param pulumi.Input[int] enable_backup_log: Turn on or off incremental backup. Valid values: `1`, `0`. Default to `0`.
:param pulumi.Input[bool] enable_public: It has been deprecated since provider version 1.101.0. Use resource `kvstore.Connection` instead.
:param pulumi.Input[str] engine_version: The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
:param pulumi.Input[bool] force_upgrade: Specifies whether to forcibly change the type. Default to: `true`.
:param pulumi.Input[bool] global_instance: Whether to create a distributed cache. Default to: `false`.
:param pulumi.Input[str] global_instance_id: The ID of the distributed cache.
:param pulumi.Input[str] instance_charge_type: It has been deprecated since provider version 1.101.0. Use `payment_type` instead.
:param pulumi.Input[str] instance_name: It has been deprecated since provider version 1.101.0. Use `db_instance_name` instead.
:param pulumi.Input[bool] instance_release_protection: Whether to enable release protection.
:param pulumi.Input[str] instance_type: The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
:param pulumi.Input[str] kms_encrypted_password: A KMS-encrypted password used for the instance. If `password` is set, this field will be ignored.
:param pulumi.Input[Mapping[str, Any]] kms_encryption_context: A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
:param pulumi.Input[str] maintain_end_time: The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[str] maintain_start_time: The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[int] modify_mode: The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Default to `0`. `0` overwrites the original whitelist, `1` adds the IP addresses to the whitelist, and `2` deletes the IP addresses from the whitelist.
:param pulumi.Input[str] node_type: Field `node_type` has been deprecated since version 1.120.1. This parameter is determined by the `instance_class`.
:param pulumi.Input[str] order_type: Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Default to `UPGRADE`. `UPGRADE` upgrades the configuration of a subscription instance; `DOWNGRADE` downgrades it.
:param pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]] parameters: It has been deprecated since provider version 1.101.0. Use `config` instead.
:param pulumi.Input[str] password: The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
:param pulumi.Input[str] payment_type: The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
:param pulumi.Input[str] period: The subscription duration of the KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: `1` to `9`, `12`, `24`, `36`.
:param pulumi.Input[int] port: It has been deprecated since provider version 1.101.0. Use resource `kvstore.Connection` instead.
:param pulumi.Input[str] private_connection_port: The private network connection port. Used to modify the private connection port of the instance.
:param pulumi.Input[str] private_connection_prefix: The private network connection prefix, used to modify the private network connection address. Updating the private connection is only supported for existing instances.
:param pulumi.Input[str] private_ip: The internal IP address of the instance.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the resource belongs.
:param pulumi.Input[str] restore_time: The point in time of a backup file.
:param pulumi.Input[str] secondary_zone_id: The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
:param pulumi.Input[str] security_group_id: The ID of the security group.
:param pulumi.Input[str] security_ip_group_attribute: The attribute of the whitelist group. It is empty by default. The console does not display whitelist groups whose value of this parameter is `hidden`.
:param pulumi.Input[str] security_ip_group_name: The name of the whitelist group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_ips: The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
:param pulumi.Input[str] srcdb_instance_id: The ID of the source instance.
:param pulumi.Input[str] ssl_enable: Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
Note: This functionality is supported in Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
:param pulumi.Input[Mapping[str, Any]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] vpc_auth_mode: Only meaningful if `instance_type` is `Redis` and the network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the Redis instance can be accessed without authentication; `Open` means authentication is required.
:param pulumi.Input[str] vswitch_id: The ID of the VSwitch.
:param pulumi.Input[str] zone_id: The ID of the zone.
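
The arguments above can be combined when constructing the resource. A minimal, illustrative sketch (the resource name, instance class, zone, and VSwitch ID below are hypothetical placeholders, not values from this module; provisioning also requires configured Alibaba Cloud credentials):

```python
import pulumi_alicloud as alicloud

# Hypothetical PostPaid Redis 5.0 instance; replace the zone, class,
# and vswitch values with ones valid for your account.
example = alicloud.kvstore.Instance("example",
    db_instance_name="example-redis",
    instance_type="Redis",
    engine_version="5.0",
    instance_class="redis.master.small.default",
    payment_type="PostPaid",
    zone_id="cn-hangzhou-h",
    vswitch_id="vsw-example",
    security_ips=["10.0.0.0/24"])
```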
"""
if auto_renew is not None:
pulumi.set(__self__, "auto_renew", auto_renew)
if auto_renew_period is not None:
pulumi.set(__self__, "auto_renew_period", auto_renew_period)
if auto_use_coupon is not None:
pulumi.set(__self__, "auto_use_coupon", auto_use_coupon)
if availability_zone is not None:
warnings.warn("""Field 'availability_zone' has been deprecated from version 1.101.0. Use 'zone_id' instead.""", DeprecationWarning)
pulumi.log.warn("""availability_zone is deprecated: Field 'availability_zone' has been deprecated from version 1.101.0. Use 'zone_id' instead.""")
if availability_zone is not None:
pulumi.set(__self__, "availability_zone", availability_zone)
if backup_id is not None:
pulumi.set(__self__, "backup_id", backup_id)
if backup_periods is not None:
pulumi.set(__self__, "backup_periods", backup_periods)
if backup_time is not None:
pulumi.set(__self__, "backup_time", backup_time)
if business_info is not None:
pulumi.set(__self__, "business_info", business_info)
if capacity is not None:
pulumi.set(__self__, "capacity", capacity)
if config is not None:
pulumi.set(__self__, "config", config)
if connection_string_prefix is not None:
warnings.warn("""Field 'connection_string_prefix' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""connection_string_prefix is deprecated: Field 'connection_string_prefix' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
if connection_string_prefix is not None:
pulumi.set(__self__, "connection_string_prefix", connection_string_prefix)
if coupon_no is not None:
pulumi.set(__self__, "coupon_no", coupon_no)
if db_instance_name is not None:
pulumi.set(__self__, "db_instance_name", db_instance_name)
if dedicated_host_group_id is not None:
pulumi.set(__self__, "dedicated_host_group_id", dedicated_host_group_id)
if dry_run is not None:
pulumi.set(__self__, "dry_run", dry_run)
if enable_backup_log is not None:
pulumi.set(__self__, "enable_backup_log", enable_backup_log)
if enable_public is not None:
warnings.warn("""Field 'enable_public' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""enable_public is deprecated: Field 'enable_public' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
if enable_public is not None:
pulumi.set(__self__, "enable_public", enable_public)
if engine_version is not None:
pulumi.set(__self__, "engine_version", engine_version)
if force_upgrade is not None:
pulumi.set(__self__, "force_upgrade", force_upgrade)
if global_instance is not None:
pulumi.set(__self__, "global_instance", global_instance)
if global_instance_id is not None:
pulumi.set(__self__, "global_instance_id", global_instance_id)
if instance_charge_type is not None:
warnings.warn("""Field 'instance_charge_type' has been deprecated from version 1.101.0. Use 'payment_type' instead.""", DeprecationWarning)
pulumi.log.warn("""instance_charge_type is deprecated: Field 'instance_charge_type' has been deprecated from version 1.101.0. Use 'payment_type' instead.""")
if instance_charge_type is not None:
pulumi.set(__self__, "instance_charge_type", instance_charge_type)
if instance_class is not None:
pulumi.set(__self__, "instance_class", instance_class)
if instance_name is not None:
warnings.warn("""Field 'instance_name' has been deprecated from version 1.101.0. Use 'db_instance_name' instead.""", DeprecationWarning)
pulumi.log.warn("""instance_name is deprecated: Field 'instance_name' has been deprecated from version 1.101.0. Use 'db_instance_name' instead.""")
if instance_name is not None:
pulumi.set(__self__, "instance_name", instance_name)
if instance_release_protection is not None:
pulumi.set(__self__, "instance_release_protection", instance_release_protection)
if instance_type is not None:
pulumi.set(__self__, "instance_type", instance_type)
if kms_encrypted_password is not None:
pulumi.set(__self__, "kms_encrypted_password", kms_encrypted_password)
if kms_encryption_context is not None:
pulumi.set(__self__, "kms_encryption_context", kms_encryption_context)
if maintain_end_time is not None:
pulumi.set(__self__, "maintain_end_time", maintain_end_time)
if maintain_start_time is not None:
pulumi.set(__self__, "maintain_start_time", maintain_start_time)
if modify_mode is not None:
pulumi.set(__self__, "modify_mode", modify_mode)
if node_type is not None:
warnings.warn("""Field 'node_type' has been deprecated from version 1.120.1""", DeprecationWarning)
pulumi.log.warn("""node_type is deprecated: Field 'node_type' has been deprecated from version 1.120.1""")
if node_type is not None:
pulumi.set(__self__, "node_type", node_type)
if order_type is not None:
pulumi.set(__self__, "order_type", order_type)
if parameters is not None:
warnings.warn("""Field 'parameters' has been deprecated from version 1.101.0. Use 'config' instead.""", DeprecationWarning)
pulumi.log.warn("""parameters is deprecated: Field 'parameters' has been deprecated from version 1.101.0. Use 'config' instead.""")
if parameters is not None:
pulumi.set(__self__, "parameters", parameters)
if password is not None:
pulumi.set(__self__, "password", password)
if payment_type is not None:
pulumi.set(__self__, "payment_type", payment_type)
if period is not None:
pulumi.set(__self__, "period", period)
if port is not None:
pulumi.set(__self__, "port", port)
if private_connection_port is not None:
pulumi.set(__self__, "private_connection_port", private_connection_port)
if private_connection_prefix is not None:
pulumi.set(__self__, "private_connection_prefix", private_connection_prefix)
if private_ip is not None:
pulumi.set(__self__, "private_ip", private_ip)
if resource_group_id is not None:
pulumi.set(__self__, "resource_group_id", resource_group_id)
if restore_time is not None:
pulumi.set(__self__, "restore_time", restore_time)
if secondary_zone_id is not None:
pulumi.set(__self__, "secondary_zone_id", secondary_zone_id)
if security_group_id is not None:
pulumi.set(__self__, "security_group_id", security_group_id)
if security_ip_group_attribute is not None:
pulumi.set(__self__, "security_ip_group_attribute", security_ip_group_attribute)
if security_ip_group_name is not None:
pulumi.set(__self__, "security_ip_group_name", security_ip_group_name)
if security_ips is not None:
pulumi.set(__self__, "security_ips", security_ips)
if srcdb_instance_id is not None:
pulumi.set(__self__, "srcdb_instance_id", srcdb_instance_id)
if ssl_enable is not None:
pulumi.set(__self__, "ssl_enable", ssl_enable)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if vpc_auth_mode is not None:
pulumi.set(__self__, "vpc_auth_mode", vpc_auth_mode)
if vswitch_id is not None:
pulumi.set(__self__, "vswitch_id", vswitch_id)
if zone_id is not None:
pulumi.set(__self__, "zone_id", zone_id)
@property
@pulumi.getter(name="autoRenew")
def auto_renew(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to automatically renew the KVStore DBInstance. It is valid when payment_type is `PrePaid`. Default to `false`.
"""
return pulumi.get(self, "auto_renew")
@auto_renew.setter
def auto_renew(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_renew", value)
@property
@pulumi.getter(name="autoRenewPeriod")
def auto_renew_period(self) -> Optional[pulumi.Input[int]]:
"""
Auto-renewal period of a KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: `1` to `12`. Default to `1`.
"""
return pulumi.get(self, "auto_renew_period")
@auto_renew_period.setter
def auto_renew_period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "auto_renew_period", value)
@property
@pulumi.getter(name="autoUseCoupon")
def auto_use_coupon(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to use a coupon. Default to: `false`.
"""
return pulumi.get(self, "auto_use_coupon")
@auto_use_coupon.setter
def auto_use_coupon(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_use_coupon", value)
@property
@pulumi.getter(name="availabilityZone")
def availability_zone(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use `zone_id` instead.
"""
return pulumi.get(self, "availability_zone")
@availability_zone.setter
def availability_zone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "availability_zone", value)
@property
@pulumi.getter(name="backupId")
def backup_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the backup file of the source instance.
"""
return pulumi.get(self, "backup_id")
@backup_id.setter
def backup_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_id", value)
@property
@pulumi.getter(name="backupPeriods")
def backup_periods(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Backup period.
"""
return pulumi.get(self, "backup_periods")
@backup_periods.setter
def backup_periods(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "backup_periods", value)
@property
@pulumi.getter(name="backupTime")
def backup_time(self) -> Optional[pulumi.Input[str]]:
"""
Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
"""
return pulumi.get(self, "backup_time")
@backup_time.setter
def backup_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_time", value)
@property
@pulumi.getter(name="businessInfo")
def business_info(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the event or the business information.
"""
return pulumi.get(self, "business_info")
@business_info.setter
def business_info(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "business_info", value)
@property
@pulumi.getter
def capacity(self) -> Optional[pulumi.Input[int]]:
"""
The storage capacity of the KVStore DBInstance. Unit: MB.
"""
return pulumi.get(self, "capacity")
@capacity.setter
def capacity(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "capacity", value)
@property
@pulumi.getter
def config(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The configuration of the KVStore DBInstance. For available parameters, see the latest [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
"""
return pulumi.get(self, "config")
@config.setter
def config(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "config", value)
@property
@pulumi.getter(name="connectionStringPrefix")
def connection_string_prefix(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "connection_string_prefix")
@connection_string_prefix.setter
def connection_string_prefix(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_string_prefix", value)
@property
@pulumi.getter(name="couponNo")
def coupon_no(self) -> Optional[pulumi.Input[str]]:
"""
The coupon code. Default to: `youhuiquan_promotion_option_id_for_blank`.
"""
return pulumi.get(self, "coupon_no")
@coupon_no.setter
def coupon_no(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "coupon_no", value)
@property
@pulumi.getter(name="dbInstanceName")
def db_instance_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the KVStore DBInstance. It is a string of 2 to 256 characters.
"""
return pulumi.get(self, "db_instance_name")
@db_instance_name.setter
def db_instance_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "db_instance_name", value)
@property
@pulumi.getter(name="dedicatedHostGroupId")
def dedicated_host_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
"""
return pulumi.get(self, "dedicated_host_group_id")
@dedicated_host_group_id.setter
def dedicated_host_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dedicated_host_group_id", value)
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
"""
return pulumi.get(self, "dry_run")
@dry_run.setter
def dry_run(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "dry_run", value)
@property
@pulumi.getter(name="enableBackupLog")
def enable_backup_log(self) -> Optional[pulumi.Input[int]]:
"""
Turn on or off incremental backup. Valid values: `1`, `0`. Default to `0`.
"""
return pulumi.get(self, "enable_backup_log")
@enable_backup_log.setter
def enable_backup_log(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "enable_backup_log", value)
@property
@pulumi.getter(name="enablePublic")
def enable_public(self) -> Optional[pulumi.Input[bool]]:
"""
It has been deprecated since provider version 1.101.0. Use resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "enable_public")
@enable_public.setter
def enable_public(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enable_public", value)
@property
@pulumi.getter(name="engineVersion")
def engine_version(self) -> Optional[pulumi.Input[str]]:
"""
The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
"""
return pulumi.get(self, "engine_version")
@engine_version.setter
def engine_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "engine_version", value)
@property
@pulumi.getter(name="forceUpgrade")
def force_upgrade(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to forcibly change the type. Default to: `true`.
"""
return pulumi.get(self, "force_upgrade")
@force_upgrade.setter
def force_upgrade(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_upgrade", value)
@property
@pulumi.getter(name="globalInstance")
def global_instance(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to create a distributed cache. Default to: `false`.
"""
return pulumi.get(self, "global_instance")
@global_instance.setter
def global_instance(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "global_instance", value)
@property
@pulumi.getter(name="globalInstanceId")
def global_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the distributed cache.
"""
return pulumi.get(self, "global_instance_id")
@global_instance_id.setter
def global_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "global_instance_id", value)
@property
@pulumi.getter(name="instanceChargeType")
def instance_charge_type(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use `payment_type` instead.
"""
return pulumi.get(self, "instance_charge_type")
@instance_charge_type.setter
def instance_charge_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_charge_type", value)
@property
@pulumi.getter(name="instanceClass")
def instance_class(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "instance_class")
@instance_class.setter
def instance_class(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_class", value)
@property
@pulumi.getter(name="instanceName")
def instance_name(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use `db_instance_name` instead.
"""
return pulumi.get(self, "instance_name")
@instance_name.setter
def instance_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_name", value)
@property
@pulumi.getter(name="instanceReleaseProtection")
def instance_release_protection(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to enable release protection.
"""
return pulumi.get(self, "instance_release_protection")
@instance_release_protection.setter
def instance_release_protection(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "instance_release_protection", value)
@property
@pulumi.getter(name="instanceType")
def instance_type(self) -> Optional[pulumi.Input[str]]:
"""
The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
"""
return pulumi.get(self, "instance_type")
@instance_type.setter
def instance_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_type", value)
@property
@pulumi.getter(name="kmsEncryptedPassword")
def kms_encrypted_password(self) -> Optional[pulumi.Input[str]]:
"""
A KMS-encrypted password used for the instance. If `password` is set, this field will be ignored.
"""
return pulumi.get(self, "kms_encrypted_password")
@kms_encrypted_password.setter
def kms_encrypted_password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "kms_encrypted_password", value)
@property
@pulumi.getter(name="kmsEncryptionContext")
def kms_encryption_context(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
"""
return pulumi.get(self, "kms_encryption_context")
@kms_encryption_context.setter
def kms_encryption_context(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "kms_encryption_context", value)
@property
@pulumi.getter(name="maintainEndTime")
def maintain_end_time(self) -> Optional[pulumi.Input[str]]:
"""
The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
"""
return pulumi.get(self, "maintain_end_time")
@maintain_end_time.setter
def maintain_end_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "maintain_end_time", value)
@property
@pulumi.getter(name="maintainStartTime")
def maintain_start_time(self) -> Optional[pulumi.Input[str]]:
"""
The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
"""
return pulumi.get(self, "maintain_start_time")
@maintain_start_time.setter
def maintain_start_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "maintain_start_time", value)
@property
@pulumi.getter(name="modifyMode")
def modify_mode(self) -> Optional[pulumi.Input[int]]:
"""
The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Default to `0`. `0` overwrites the original whitelist, `1` adds the IP addresses to the whitelist, and `2` deletes the IP addresses from the whitelist.
"""
return pulumi.get(self, "modify_mode")
@modify_mode.setter
def modify_mode(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "modify_mode", value)
@property
@pulumi.getter(name="nodeType")
def node_type(self) -> Optional[pulumi.Input[str]]:
"""
Field `node_type` has been deprecated since version 1.120.1. This parameter is determined by the `instance_class`.
"""
return pulumi.get(self, "node_type")
@node_type.setter
def node_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "node_type", value)
@property
@pulumi.getter(name="orderType")
def order_type(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Default to `UPGRADE`. `UPGRADE` upgrades the configuration of a subscription instance; `DOWNGRADE` downgrades it.
"""
return pulumi.get(self, "order_type")
@order_type.setter
def order_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "order_type", value)
@property
@pulumi.getter
def parameters(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]]]:
"""
It has been deprecated since provider version 1.101.0. Use `config` instead.
"""
return pulumi.get(self, "parameters")
@parameters.setter
def parameters(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]]]):
pulumi.set(self, "parameters", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter(name="paymentType")
def payment_type(self) -> Optional[pulumi.Input[str]]:
"""
The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
"""
return pulumi.get(self, "payment_type")
@payment_type.setter
def payment_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "payment_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[str]]:
"""
The subscription duration of the KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: `1` to `9`, `12`, `24`, `36`.
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
It has been deprecated since provider version 1.101.0. Use resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="privateConnectionPort")
def private_connection_port(self) -> Optional[pulumi.Input[str]]:
"""
The private network connection port. Used to modify the private connection port of the instance.
"""
return pulumi.get(self, "private_connection_port")
@private_connection_port.setter
def private_connection_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_connection_port", value)
@property
@pulumi.getter(name="privateConnectionPrefix")
def private_connection_prefix(self) -> Optional[pulumi.Input[str]]:
"""
The private network connection prefix, used to modify the private network connection address. Updating the private connection is only supported for existing instances.
"""
return pulumi.get(self, "private_connection_prefix")
@private_connection_prefix.setter
def private_connection_prefix(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_connection_prefix", value)
@property
@pulumi.getter(name="privateIp")
def private_ip(self) -> Optional[pulumi.Input[str]]:
"""
The internal IP address of the instance.
"""
return pulumi.get(self, "private_ip")
@private_ip.setter
def private_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_ip", value)
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the resource group to which the resource belongs.
"""
return pulumi.get(self, "resource_group_id")
@resource_group_id.setter
def resource_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_id", value)
@property
@pulumi.getter(name="restoreTime")
def restore_time(self) -> Optional[pulumi.Input[str]]:
"""
The point in time of a backup file.
"""
return pulumi.get(self, "restore_time")
@restore_time.setter
def restore_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "restore_time", value)
@property
@pulumi.getter(name="secondaryZoneId")
def secondary_zone_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
"""
return pulumi.get(self, "secondary_zone_id")
@secondary_zone_id.setter
def secondary_zone_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secondary_zone_id", value)
@property
@pulumi.getter(name="securityGroupId")
def security_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the security group.
"""
return pulumi.get(self, "security_group_id")
@security_group_id.setter
def security_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "security_group_id", value)
@property
@pulumi.getter(name="securityIpGroupAttribute")
def security_ip_group_attribute(self) -> Optional[pulumi.Input[str]]:
"""
The attribute of the whitelist group. The value of this parameter is empty by default. The console does not display whitelist groups whose attribute is `hidden`.
"""
return pulumi.get(self, "security_ip_group_attribute")
@security_ip_group_attribute.setter
def security_ip_group_attribute(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "security_ip_group_attribute", value)
@property
@pulumi.getter(name="securityIpGroupName")
def security_ip_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the whitelist group.
"""
return pulumi.get(self, "security_ip_group_name")
@security_ip_group_name.setter
def security_ip_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "security_ip_group_name", value)
@property
@pulumi.getter(name="securityIps")
def security_ips(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
"""
return pulumi.get(self, "security_ips")
@security_ips.setter
def security_ips(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "security_ips", value)
@property
@pulumi.getter(name="srcdbInstanceId")
def srcdb_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the source instance.
"""
return pulumi.get(self, "srcdb_instance_id")
@srcdb_instance_id.setter
def srcdb_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "srcdb_instance_id", value)
@property
@pulumi.getter(name="sslEnable")
def ssl_enable(self) -> Optional[pulumi.Input[str]]:
"""
Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
Note: This functionality is supported by Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
"""
return pulumi.get(self, "ssl_enable")
@ssl_enable.setter
def ssl_enable(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_enable", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="vpcAuthMode")
def vpc_auth_mode(self) -> Optional[pulumi.Input[str]]:
"""
Only meaningful if instance_type is `Redis` and network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the Redis instance can be accessed without authentication. `Open` means authentication is required.
"""
return pulumi.get(self, "vpc_auth_mode")
@vpc_auth_mode.setter
def vpc_auth_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "vpc_auth_mode", value)
@property
@pulumi.getter(name="vswitchId")
def vswitch_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the VSwitch.
"""
return pulumi.get(self, "vswitch_id")
@vswitch_id.setter
def vswitch_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "vswitch_id", value)
@property
@pulumi.getter(name="zoneId")
def zone_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the zone.
"""
return pulumi.get(self, "zone_id")
@zone_id.setter
def zone_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "zone_id", value)
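# A minimal usage sketch (kept as comments so this generated module is not
# affected at import time): the input properties above surface as keyword
# arguments on the `kvstore.Instance` resource. All literal values below
# (instance class, zone, CIDR, VSwitch ID) are hypothetical examples, not
# defaults:
#
#   import pulumi_alicloud as alicloud
#
#   example = alicloud.kvstore.Instance("example",
#       db_instance_name="tf-example",
#       instance_class="redis.master.small.default",
#       engine_version="5.0",
#       payment_type="PostPaid",
#       security_ips=["10.0.0.0/24"],
#       vswitch_id="vsw-abc123example",
#       zone_id="cn-hangzhou-h")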
@pulumi.input_type
class _InstanceState:
def __init__(__self__, *,
auto_renew: Optional[pulumi.Input[bool]] = None,
auto_renew_period: Optional[pulumi.Input[int]] = None,
auto_use_coupon: Optional[pulumi.Input[bool]] = None,
availability_zone: Optional[pulumi.Input[str]] = None,
backup_id: Optional[pulumi.Input[str]] = None,
backup_periods: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_time: Optional[pulumi.Input[str]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
business_info: Optional[pulumi.Input[str]] = None,
capacity: Optional[pulumi.Input[int]] = None,
config: Optional[pulumi.Input[Mapping[str, Any]]] = None,
connection_domain: Optional[pulumi.Input[str]] = None,
connection_string: Optional[pulumi.Input[str]] = None,
connection_string_prefix: Optional[pulumi.Input[str]] = None,
coupon_no: Optional[pulumi.Input[str]] = None,
db_instance_name: Optional[pulumi.Input[str]] = None,
dedicated_host_group_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
enable_backup_log: Optional[pulumi.Input[int]] = None,
enable_public: Optional[pulumi.Input[bool]] = None,
end_time: Optional[pulumi.Input[str]] = None,
engine_version: Optional[pulumi.Input[str]] = None,
force_upgrade: Optional[pulumi.Input[bool]] = None,
global_instance: Optional[pulumi.Input[bool]] = None,
global_instance_id: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
instance_name: Optional[pulumi.Input[str]] = None,
instance_release_protection: Optional[pulumi.Input[bool]] = None,
instance_type: Optional[pulumi.Input[str]] = None,
kms_encrypted_password: Optional[pulumi.Input[str]] = None,
kms_encryption_context: Optional[pulumi.Input[Mapping[str, Any]]] = None,
maintain_end_time: Optional[pulumi.Input[str]] = None,
maintain_start_time: Optional[pulumi.Input[str]] = None,
modify_mode: Optional[pulumi.Input[int]] = None,
node_type: Optional[pulumi.Input[str]] = None,
order_type: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]]] = None,
password: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
private_connection_port: Optional[pulumi.Input[str]] = None,
private_connection_prefix: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
qps: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
restore_time: Optional[pulumi.Input[str]] = None,
secondary_zone_id: Optional[pulumi.Input[str]] = None,
security_group_id: Optional[pulumi.Input[str]] = None,
security_ip_group_attribute: Optional[pulumi.Input[str]] = None,
security_ip_group_name: Optional[pulumi.Input[str]] = None,
security_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
srcdb_instance_id: Optional[pulumi.Input[str]] = None,
ssl_enable: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_auth_mode: Optional[pulumi.Input[str]] = None,
vswitch_id: Optional[pulumi.Input[str]] = None,
zone_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Instance resources.
:param pulumi.Input[bool] auto_renew: Whether to renew a KVStore DBInstance automatically. It is valid when payment_type is `PrePaid`. Default to `false`.
:param pulumi.Input[int] auto_renew_period: Auto-renewal period of a KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: [1~12]. Default to `1`.
:param pulumi.Input[bool] auto_use_coupon: Specifies whether to use a coupon. Default to: `false`.
:param pulumi.Input[str] availability_zone: It has been deprecated since provider version 1.101.0. Use `zone_id` instead.
:param pulumi.Input[str] backup_id: The ID of the backup file of the source instance.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_periods: Backup period.
:param pulumi.Input[str] backup_time: Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
:param pulumi.Input[int] bandwidth: The bandwidth.
:param pulumi.Input[str] business_info: The ID of the event or the business information.
:param pulumi.Input[int] capacity: The storage capacity of the KVStore DBInstance. Unit: MB.
:param pulumi.Input[Mapping[str, Any]] config: The configuration of the KVStore DBInstance. For available parameters, refer to the latest docs: [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
:param pulumi.Input[str] connection_string_prefix: It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] coupon_no: The coupon code. Default to: `youhuiquan_promotion_option_id_for_blank`.
:param pulumi.Input[str] db_instance_name: The name of the KVStore DBInstance. It is a string of 2 to 256 characters.
:param pulumi.Input[str] dedicated_host_group_id: The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
:param pulumi.Input[int] enable_backup_log: Whether to turn on incremental backup. Valid values: `1` (on), `0` (off). Default to `0`.
:param pulumi.Input[bool] enable_public: It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] end_time: The expiration time of the prepaid instance.
:param pulumi.Input[str] engine_version: The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
:param pulumi.Input[bool] force_upgrade: Specifies whether to forcibly change the type. Default to: `true`.
:param pulumi.Input[bool] global_instance: Whether to create a distributed cache. Default to: `false`.
:param pulumi.Input[str] global_instance_id: The ID of the distributed cache.
:param pulumi.Input[str] instance_charge_type: It has been deprecated since provider version 1.101.0. Use `payment_type` instead.
:param pulumi.Input[str] instance_name: It has been deprecated since provider version 1.101.0. Use `db_instance_name` instead.
:param pulumi.Input[bool] instance_release_protection: Whether to enable release protection for the instance.
:param pulumi.Input[str] instance_type: The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
:param pulumi.Input[str] kms_encrypted_password: A KMS-encrypted password used for the instance. If `password` is set, this field is ignored.
:param pulumi.Input[Mapping[str, Any]] kms_encryption_context: A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
:param pulumi.Input[str] maintain_end_time: The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[str] maintain_start_time: The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[int] modify_mode: The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Default to `0`. `0` means overwrites the original whitelist. `1` means adds the IP addresses to the whitelist. `2` means deletes the IP addresses from the whitelist.
:param pulumi.Input[str] node_type: Field 'node_type' has been deprecated since provider version 1.120.1. This parameter is determined by the `instance_class`.
:param pulumi.Input[str] order_type: Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Default to `UPGRADE`. `UPGRADE` means upgrades the configuration of a subscription instance. `DOWNGRADE` means downgrades the configuration of a subscription instance.
:param pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]] parameters: It has been deprecated since provider version 1.101.0. Use `config` instead.
:param pulumi.Input[str] password: The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
:param pulumi.Input[str] payment_type: The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
:param pulumi.Input[str] period: The duration that you will buy the KVStore DBInstance (in months). It is valid when payment_type is `PrePaid`. Valid values: `[1~9]`, `12`, `24`, `36`.
:param pulumi.Input[int] port: It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] private_connection_port: Private network connection port, used to modify the private network connection port.
:param pulumi.Input[str] private_connection_prefix: Private network connection prefix, used to modify the private network connection address. Only updating private network connections of existing instances is supported.
:param pulumi.Input[str] private_ip: The internal IP address of the instance.
:param pulumi.Input[int] qps: Theoretical maximum QPS value.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the resource belongs.
:param pulumi.Input[str] restore_time: The point in time of a backup file.
:param pulumi.Input[str] secondary_zone_id: The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
:param pulumi.Input[str] security_group_id: The ID of security groups.
:param pulumi.Input[str] security_ip_group_attribute: The attribute of the whitelist group. The value of this parameter is empty by default. The console does not display whitelist groups whose attribute is `hidden`.
:param pulumi.Input[str] security_ip_group_name: The name of the whitelist group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_ips: The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
:param pulumi.Input[str] srcdb_instance_id: The ID of the source instance.
:param pulumi.Input[str] ssl_enable: Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
Note: This functionality is supported by Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
:param pulumi.Input[str] status: The status of the KVStore DBInstance.
* `connection_domain` - Intranet connection address of the KVStore instance.
:param pulumi.Input[Mapping[str, Any]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] vpc_auth_mode: Only meaningful if instance_type is `Redis` and network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the Redis instance can be accessed without authentication. `Open` means authentication is required.
:param pulumi.Input[str] vswitch_id: The ID of the VSwitch.
:param pulumi.Input[str] zone_id: The ID of the zone.
"""
if auto_renew is not None:
pulumi.set(__self__, "auto_renew", auto_renew)
if auto_renew_period is not None:
pulumi.set(__self__, "auto_renew_period", auto_renew_period)
if auto_use_coupon is not None:
pulumi.set(__self__, "auto_use_coupon", auto_use_coupon)
if availability_zone is not None:
warnings.warn("""Field 'availability_zone' has been deprecated from version 1.101.0. Use 'zone_id' instead.""", DeprecationWarning)
pulumi.log.warn("""availability_zone is deprecated: Field 'availability_zone' has been deprecated from version 1.101.0. Use 'zone_id' instead.""")
if availability_zone is not None:
pulumi.set(__self__, "availability_zone", availability_zone)
if backup_id is not None:
pulumi.set(__self__, "backup_id", backup_id)
if backup_periods is not None:
pulumi.set(__self__, "backup_periods", backup_periods)
if backup_time is not None:
pulumi.set(__self__, "backup_time", backup_time)
if bandwidth is not None:
pulumi.set(__self__, "bandwidth", bandwidth)
if business_info is not None:
pulumi.set(__self__, "business_info", business_info)
if capacity is not None:
pulumi.set(__self__, "capacity", capacity)
if config is not None:
pulumi.set(__self__, "config", config)
if connection_domain is not None:
pulumi.set(__self__, "connection_domain", connection_domain)
if connection_string is not None:
warnings.warn("""Field 'connection_string' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""connection_string is deprecated: Field 'connection_string' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
if connection_string is not None:
pulumi.set(__self__, "connection_string", connection_string)
if connection_string_prefix is not None:
warnings.warn("""Field 'connection_string_prefix' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""connection_string_prefix is deprecated: Field 'connection_string_prefix' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
if connection_string_prefix is not None:
pulumi.set(__self__, "connection_string_prefix", connection_string_prefix)
if coupon_no is not None:
pulumi.set(__self__, "coupon_no", coupon_no)
if db_instance_name is not None:
pulumi.set(__self__, "db_instance_name", db_instance_name)
if dedicated_host_group_id is not None:
pulumi.set(__self__, "dedicated_host_group_id", dedicated_host_group_id)
if dry_run is not None:
pulumi.set(__self__, "dry_run", dry_run)
if enable_backup_log is not None:
pulumi.set(__self__, "enable_backup_log", enable_backup_log)
if enable_public is not None:
warnings.warn("""Field 'enable_public' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""enable_public is deprecated: Field 'enable_public' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
if enable_public is not None:
pulumi.set(__self__, "enable_public", enable_public)
if end_time is not None:
pulumi.set(__self__, "end_time", end_time)
if engine_version is not None:
pulumi.set(__self__, "engine_version", engine_version)
if force_upgrade is not None:
pulumi.set(__self__, "force_upgrade", force_upgrade)
if global_instance is not None:
pulumi.set(__self__, "global_instance", global_instance)
if global_instance_id is not None:
pulumi.set(__self__, "global_instance_id", global_instance_id)
if instance_charge_type is not None:
warnings.warn("""Field 'instance_charge_type' has been deprecated from version 1.101.0. Use 'payment_type' instead.""", DeprecationWarning)
pulumi.log.warn("""instance_charge_type is deprecated: Field 'instance_charge_type' has been deprecated from version 1.101.0. Use 'payment_type' instead.""")
if instance_charge_type is not None:
pulumi.set(__self__, "instance_charge_type", instance_charge_type)
if instance_class is not None:
pulumi.set(__self__, "instance_class", instance_class)
if instance_name is not None:
warnings.warn("""Field 'instance_name' has been deprecated from version 1.101.0. Use 'db_instance_name' instead.""", DeprecationWarning)
pulumi.log.warn("""instance_name is deprecated: Field 'instance_name' has been deprecated from version 1.101.0. Use 'db_instance_name' instead.""")
if instance_name is not None:
pulumi.set(__self__, "instance_name", instance_name)
if instance_release_protection is not None:
pulumi.set(__self__, "instance_release_protection", instance_release_protection)
if instance_type is not None:
pulumi.set(__self__, "instance_type", instance_type)
if kms_encrypted_password is not None:
pulumi.set(__self__, "kms_encrypted_password", kms_encrypted_password)
if kms_encryption_context is not None:
pulumi.set(__self__, "kms_encryption_context", kms_encryption_context)
if maintain_end_time is not None:
pulumi.set(__self__, "maintain_end_time", maintain_end_time)
if maintain_start_time is not None:
pulumi.set(__self__, "maintain_start_time", maintain_start_time)
if modify_mode is not None:
pulumi.set(__self__, "modify_mode", modify_mode)
if node_type is not None:
warnings.warn("""Field 'node_type' has been deprecated from version 1.120.1""", DeprecationWarning)
pulumi.log.warn("""node_type is deprecated: Field 'node_type' has been deprecated from version 1.120.1""")
if node_type is not None:
pulumi.set(__self__, "node_type", node_type)
if order_type is not None:
pulumi.set(__self__, "order_type", order_type)
if parameters is not None:
warnings.warn("""Field 'parameters' has been deprecated from version 1.101.0. Use 'config' instead.""", DeprecationWarning)
pulumi.log.warn("""parameters is deprecated: Field 'parameters' has been deprecated from version 1.101.0. Use 'config' instead.""")
if parameters is not None:
pulumi.set(__self__, "parameters", parameters)
if password is not None:
pulumi.set(__self__, "password", password)
if payment_type is not None:
pulumi.set(__self__, "payment_type", payment_type)
if period is not None:
pulumi.set(__self__, "period", period)
if port is not None:
pulumi.set(__self__, "port", port)
if private_connection_port is not None:
pulumi.set(__self__, "private_connection_port", private_connection_port)
if private_connection_prefix is not None:
pulumi.set(__self__, "private_connection_prefix", private_connection_prefix)
if private_ip is not None:
pulumi.set(__self__, "private_ip", private_ip)
if qps is not None:
pulumi.set(__self__, "qps", qps)
if resource_group_id is not None:
pulumi.set(__self__, "resource_group_id", resource_group_id)
if restore_time is not None:
pulumi.set(__self__, "restore_time", restore_time)
if secondary_zone_id is not None:
pulumi.set(__self__, "secondary_zone_id", secondary_zone_id)
if security_group_id is not None:
pulumi.set(__self__, "security_group_id", security_group_id)
if security_ip_group_attribute is not None:
pulumi.set(__self__, "security_ip_group_attribute", security_ip_group_attribute)
if security_ip_group_name is not None:
pulumi.set(__self__, "security_ip_group_name", security_ip_group_name)
if security_ips is not None:
pulumi.set(__self__, "security_ips", security_ips)
if srcdb_instance_id is not None:
pulumi.set(__self__, "srcdb_instance_id", srcdb_instance_id)
if ssl_enable is not None:
pulumi.set(__self__, "ssl_enable", ssl_enable)
if status is not None:
pulumi.set(__self__, "status", status)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if vpc_auth_mode is not None:
pulumi.set(__self__, "vpc_auth_mode", vpc_auth_mode)
if vswitch_id is not None:
pulumi.set(__self__, "vswitch_id", vswitch_id)
if zone_id is not None:
pulumi.set(__self__, "zone_id", zone_id)
@property
@pulumi.getter(name="autoRenew")
def auto_renew(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to renew a KVStore DBInstance automatically. It is valid when payment_type is `PrePaid`. Default to `false`.
"""
return pulumi.get(self, "auto_renew")
@auto_renew.setter
def auto_renew(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_renew", value)
@property
@pulumi.getter(name="autoRenewPeriod")
def auto_renew_period(self) -> Optional[pulumi.Input[int]]:
"""
Auto-renewal period of a KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: [1~12]. Default to `1`.
"""
return pulumi.get(self, "auto_renew_period")
@auto_renew_period.setter
def auto_renew_period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "auto_renew_period", value)
@property
@pulumi.getter(name="autoUseCoupon")
def auto_use_coupon(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to use a coupon. Default to: `false`.
"""
return pulumi.get(self, "auto_use_coupon")
@auto_use_coupon.setter
def auto_use_coupon(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_use_coupon", value)
@property
@pulumi.getter(name="availabilityZone")
def availability_zone(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use `zone_id` instead.
"""
return pulumi.get(self, "availability_zone")
@availability_zone.setter
def availability_zone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "availability_zone", value)
@property
@pulumi.getter(name="backupId")
def backup_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the backup file of the source instance.
"""
return pulumi.get(self, "backup_id")
@backup_id.setter
def backup_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_id", value)
@property
@pulumi.getter(name="backupPeriods")
def backup_periods(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Backup period.
"""
return pulumi.get(self, "backup_periods")
@backup_periods.setter
def backup_periods(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "backup_periods", value)
@property
@pulumi.getter(name="backupTime")
def backup_time(self) -> Optional[pulumi.Input[str]]:
"""
Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
"""
return pulumi.get(self, "backup_time")
@backup_time.setter
def backup_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_time", value)
@property
@pulumi.getter
def bandwidth(self) -> Optional[pulumi.Input[int]]:
"""
The bandwidth.
"""
return pulumi.get(self, "bandwidth")
@bandwidth.setter
def bandwidth(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "bandwidth", value)
@property
@pulumi.getter(name="businessInfo")
def business_info(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the event or the business information.
"""
return pulumi.get(self, "business_info")
@business_info.setter
def business_info(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "business_info", value)
@property
@pulumi.getter
def capacity(self) -> Optional[pulumi.Input[int]]:
"""
The storage capacity of the KVStore DBInstance. Unit: MB.
"""
return pulumi.get(self, "capacity")
@capacity.setter
def capacity(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "capacity", value)
@property
@pulumi.getter
def config(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The configuration of the KVStore DBInstance. For available parameters, refer to the latest docs: [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
"""
return pulumi.get(self, "config")
@config.setter
def config(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "config", value)
@property
@pulumi.getter(name="connectionDomain")
def connection_domain(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "connection_domain")
@connection_domain.setter
def connection_domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_domain", value)
@property
@pulumi.getter(name="connectionString")
def connection_string(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "connection_string")
@connection_string.setter
def connection_string(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_string", value)
@property
@pulumi.getter(name="connectionStringPrefix")
def connection_string_prefix(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "connection_string_prefix")
@connection_string_prefix.setter
def connection_string_prefix(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_string_prefix", value)
@property
@pulumi.getter(name="couponNo")
def coupon_no(self) -> Optional[pulumi.Input[str]]:
"""
The coupon code. Default to: `youhuiquan_promotion_option_id_for_blank`.
"""
return pulumi.get(self, "coupon_no")
@coupon_no.setter
def coupon_no(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "coupon_no", value)
@property
@pulumi.getter(name="dbInstanceName")
def db_instance_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the KVStore DBInstance. It is a string of 2 to 256 characters.
"""
return pulumi.get(self, "db_instance_name")
@db_instance_name.setter
def db_instance_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "db_instance_name", value)
@property
@pulumi.getter(name="dedicatedHostGroupId")
def dedicated_host_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
"""
return pulumi.get(self, "dedicated_host_group_id")
@dedicated_host_group_id.setter
def dedicated_host_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dedicated_host_group_id", value)
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
"""
return pulumi.get(self, "dry_run")
@dry_run.setter
def dry_run(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "dry_run", value)
@property
@pulumi.getter(name="enableBackupLog")
def enable_backup_log(self) -> Optional[pulumi.Input[int]]:
"""
Whether to turn on incremental backup. Valid values: `1` (on), `0` (off). Default to `0`.
"""
return pulumi.get(self, "enable_backup_log")
@enable_backup_log.setter
def enable_backup_log(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "enable_backup_log", value)
@property
@pulumi.getter(name="enablePublic")
def enable_public(self) -> Optional[pulumi.Input[bool]]:
"""
It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "enable_public")
@enable_public.setter
def enable_public(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enable_public", value)
@property
@pulumi.getter(name="endTime")
def end_time(self) -> Optional[pulumi.Input[str]]:
"""
The expiration time of the prepaid instance.
"""
return pulumi.get(self, "end_time")
@end_time.setter
def end_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "end_time", value)
@property
@pulumi.getter(name="engineVersion")
def engine_version(self) -> Optional[pulumi.Input[str]]:
"""
The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
"""
return pulumi.get(self, "engine_version")
@engine_version.setter
def engine_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "engine_version", value)
@property
@pulumi.getter(name="forceUpgrade")
def force_upgrade(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to forcibly change the type. Default to: `true`.
"""
return pulumi.get(self, "force_upgrade")
@force_upgrade.setter
def force_upgrade(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_upgrade", value)
@property
@pulumi.getter(name="globalInstance")
def global_instance(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to create a distributed cache. Default to: `false`.
"""
return pulumi.get(self, "global_instance")
@global_instance.setter
def global_instance(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "global_instance", value)
@property
@pulumi.getter(name="globalInstanceId")
def global_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the distributed cache.
"""
return pulumi.get(self, "global_instance_id")
@global_instance_id.setter
def global_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "global_instance_id", value)
@property
@pulumi.getter(name="instanceChargeType")
def instance_charge_type(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use `payment_type` instead.
"""
return pulumi.get(self, "instance_charge_type")
@instance_charge_type.setter
def instance_charge_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_charge_type", value)
@property
@pulumi.getter(name="instanceClass")
def instance_class(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "instance_class")
@instance_class.setter
def instance_class(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_class", value)
@property
@pulumi.getter(name="instanceName")
def instance_name(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated since provider version 1.101.0. Use `db_instance_name` instead.
"""
return pulumi.get(self, "instance_name")
@instance_name.setter
def instance_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_name", value)
@property
@pulumi.getter(name="instanceReleaseProtection")
def instance_release_protection(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to enable release protection for the instance.
"""
return pulumi.get(self, "instance_release_protection")
@instance_release_protection.setter
def instance_release_protection(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "instance_release_protection", value)
@property
@pulumi.getter(name="instanceType")
def instance_type(self) -> Optional[pulumi.Input[str]]:
"""
The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
"""
return pulumi.get(self, "instance_type")
@instance_type.setter
def instance_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_type", value)
@property
@pulumi.getter(name="kmsEncryptedPassword")
def kms_encrypted_password(self) -> Optional[pulumi.Input[str]]:
"""
A KMS-encrypted password used for the instance. If `password` is set, this field is ignored.
"""
return pulumi.get(self, "kms_encrypted_password")
@kms_encrypted_password.setter
def kms_encrypted_password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "kms_encrypted_password", value)
@property
@pulumi.getter(name="kmsEncryptionContext")
def kms_encryption_context(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
"""
return pulumi.get(self, "kms_encryption_context")
@kms_encryption_context.setter
def kms_encryption_context(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "kms_encryption_context", value)
@property
@pulumi.getter(name="maintainEndTime")
def maintain_end_time(self) -> Optional[pulumi.Input[str]]:
"""
The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
"""
return pulumi.get(self, "maintain_end_time")
@maintain_end_time.setter
def maintain_end_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "maintain_end_time", value)
@property
@pulumi.getter(name="maintainStartTime")
def maintain_start_time(self) -> Optional[pulumi.Input[str]]:
"""
The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
"""
return pulumi.get(self, "maintain_start_time")
@maintain_start_time.setter
def maintain_start_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "maintain_start_time", value)
@property
@pulumi.getter(name="modifyMode")
def modify_mode(self) -> Optional[pulumi.Input[int]]:
"""
The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Default to `0`. `0` overwrites the original whitelist, `1` adds the IP addresses to the whitelist, and `2` deletes the IP addresses from the whitelist.
"""
return pulumi.get(self, "modify_mode")
@modify_mode.setter
def modify_mode(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "modify_mode", value)
@property
@pulumi.getter(name="nodeType")
def node_type(self) -> Optional[pulumi.Input[str]]:
"""
Field 'node_type' has been deprecated since version 1.120.1. This parameter is determined by the `instance_class`.
"""
return pulumi.get(self, "node_type")
@node_type.setter
def node_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "node_type", value)
@property
@pulumi.getter(name="orderType")
def order_type(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Default to `UPGRADE`. `UPGRADE` upgrades the configuration of a subscription instance; `DOWNGRADE` downgrades it.
"""
return pulumi.get(self, "order_type")
@order_type.setter
def order_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "order_type", value)
@property
@pulumi.getter
def parameters(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]]]:
"""
It has been deprecated since provider version 1.101.0. Use `config` instead.
"""
return pulumi.get(self, "parameters")
@parameters.setter
def parameters(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['InstanceParameterArgs']]]]):
pulumi.set(self, "parameters", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter(name="paymentType")
def payment_type(self) -> Optional[pulumi.Input[str]]:
"""
The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
"""
return pulumi.get(self, "payment_type")
@payment_type.setter
def payment_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "payment_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[str]]:
"""
The purchase duration of the KVStore DBInstance, in months. It is valid when `payment_type` is `PrePaid`. Valid values: `[1~9]`, `12`, `24`, `36`.
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="privateConnectionPort")
def private_connection_port(self) -> Optional[pulumi.Input[str]]:
"""
The private network connection port, used to modify the port of the private connection string.
"""
return pulumi.get(self, "private_connection_port")
@private_connection_port.setter
def private_connection_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_connection_port", value)
@property
@pulumi.getter(name="privateConnectionPrefix")
def private_connection_prefix(self) -> Optional[pulumi.Input[str]]:
"""
The private network connection prefix, used to modify the private network connection address. Only updating private network connections for existing instances is supported.
"""
return pulumi.get(self, "private_connection_prefix")
@private_connection_prefix.setter
def private_connection_prefix(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_connection_prefix", value)
@property
@pulumi.getter(name="privateIp")
def private_ip(self) -> Optional[pulumi.Input[str]]:
"""
The internal IP address of the instance.
"""
return pulumi.get(self, "private_ip")
@private_ip.setter
def private_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_ip", value)
@property
@pulumi.getter
def qps(self) -> Optional[pulumi.Input[int]]:
"""
Theoretical maximum QPS value.
"""
return pulumi.get(self, "qps")
@qps.setter
def qps(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "qps", value)
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the resource group to which the resource belongs.
"""
return pulumi.get(self, "resource_group_id")
@resource_group_id.setter
def resource_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_id", value)
@property
@pulumi.getter(name="restoreTime")
def restore_time(self) -> Optional[pulumi.Input[str]]:
"""
The point in time of a backup file.
"""
return pulumi.get(self, "restore_time")
@restore_time.setter
def restore_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "restore_time", value)
@property
@pulumi.getter(name="secondaryZoneId")
def secondary_zone_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
"""
return pulumi.get(self, "secondary_zone_id")
@secondary_zone_id.setter
def secondary_zone_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secondary_zone_id", value)
@property
@pulumi.getter(name="securityGroupId")
def security_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of security groups.
"""
return pulumi.get(self, "security_group_id")
@security_group_id.setter
def security_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "security_group_id", value)
@property
@pulumi.getter(name="securityIpGroupAttribute")
def security_ip_group_attribute(self) -> Optional[pulumi.Input[str]]:
"""
The attribute of the whitelist group. This parameter is empty by default. The console does not display whitelist groups whose attribute is `hidden`.
"""
return pulumi.get(self, "security_ip_group_attribute")
@security_ip_group_attribute.setter
def security_ip_group_attribute(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "security_ip_group_attribute", value)
@property
@pulumi.getter(name="securityIpGroupName")
def security_ip_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the whitelist group.
"""
return pulumi.get(self, "security_ip_group_name")
@security_ip_group_name.setter
def security_ip_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "security_ip_group_name", value)
@property
@pulumi.getter(name="securityIps")
def security_ips(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
"""
return pulumi.get(self, "security_ips")
@security_ips.setter
def security_ips(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "security_ips", value)
@property
@pulumi.getter(name="srcdbInstanceId")
def srcdb_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the source instance.
"""
return pulumi.get(self, "srcdb_instance_id")
@srcdb_instance_id.setter
def srcdb_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "srcdb_instance_id", value)
@property
@pulumi.getter(name="sslEnable")
def ssl_enable(self) -> Optional[pulumi.Input[str]]:
"""
Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
Note: This functionality is supported by Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
"""
return pulumi.get(self, "ssl_enable")
@ssl_enable.setter
def ssl_enable(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_enable", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
The status of KVStore DBInstance.
* `connection_domain` - Intranet connection address of the KVStore instance.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="vpcAuthMode")
def vpc_auth_mode(self) -> Optional[pulumi.Input[str]]:
"""
Only meaningful if `instance_type` is `Redis` and the network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the Redis instance can be accessed without authentication. `Open` means authentication is required.
"""
return pulumi.get(self, "vpc_auth_mode")
@vpc_auth_mode.setter
def vpc_auth_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "vpc_auth_mode", value)
@property
@pulumi.getter(name="vswitchId")
def vswitch_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the VSwitch.
"""
return pulumi.get(self, "vswitch_id")
@vswitch_id.setter
def vswitch_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "vswitch_id", value)
@property
@pulumi.getter(name="zoneId")
def zone_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the zone.
"""
return pulumi.get(self, "zone_id")
@zone_id.setter
def zone_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "zone_id", value)
class Instance(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_renew: Optional[pulumi.Input[bool]] = None,
auto_renew_period: Optional[pulumi.Input[int]] = None,
auto_use_coupon: Optional[pulumi.Input[bool]] = None,
availability_zone: Optional[pulumi.Input[str]] = None,
backup_id: Optional[pulumi.Input[str]] = None,
backup_periods: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_time: Optional[pulumi.Input[str]] = None,
business_info: Optional[pulumi.Input[str]] = None,
capacity: Optional[pulumi.Input[int]] = None,
config: Optional[pulumi.Input[Mapping[str, Any]]] = None,
connection_string_prefix: Optional[pulumi.Input[str]] = None,
coupon_no: Optional[pulumi.Input[str]] = None,
db_instance_name: Optional[pulumi.Input[str]] = None,
dedicated_host_group_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
enable_backup_log: Optional[pulumi.Input[int]] = None,
enable_public: Optional[pulumi.Input[bool]] = None,
engine_version: Optional[pulumi.Input[str]] = None,
force_upgrade: Optional[pulumi.Input[bool]] = None,
global_instance: Optional[pulumi.Input[bool]] = None,
global_instance_id: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
instance_name: Optional[pulumi.Input[str]] = None,
instance_release_protection: Optional[pulumi.Input[bool]] = None,
instance_type: Optional[pulumi.Input[str]] = None,
kms_encrypted_password: Optional[pulumi.Input[str]] = None,
kms_encryption_context: Optional[pulumi.Input[Mapping[str, Any]]] = None,
maintain_end_time: Optional[pulumi.Input[str]] = None,
maintain_start_time: Optional[pulumi.Input[str]] = None,
modify_mode: Optional[pulumi.Input[int]] = None,
node_type: Optional[pulumi.Input[str]] = None,
order_type: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['InstanceParameterArgs']]]]] = None,
password: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
private_connection_port: Optional[pulumi.Input[str]] = None,
private_connection_prefix: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
restore_time: Optional[pulumi.Input[str]] = None,
secondary_zone_id: Optional[pulumi.Input[str]] = None,
security_group_id: Optional[pulumi.Input[str]] = None,
security_ip_group_attribute: Optional[pulumi.Input[str]] = None,
security_ip_group_name: Optional[pulumi.Input[str]] = None,
security_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
srcdb_instance_id: Optional[pulumi.Input[str]] = None,
ssl_enable: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_auth_mode: Optional[pulumi.Input[str]] = None,
vswitch_id: Optional[pulumi.Input[str]] = None,
zone_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides an ApsaraDB Redis / Memcache instance resource. A DB instance is an isolated database environment in the cloud. It can be associated with IP whitelists and backup configurations, which are provided as separate resources. For more information about the Alicloud KVStore DBInstance and how to use it, see [What is Resource Alicloud KVStore DBInstance](https://www.alibabacloud.com/help/doc-detail/60873.htm).
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.kvstore.Instance("example",
config={
"appendonly": "yes",
"lazyfree-lazy-eviction": "yes",
},
db_instance_name="tf-test-basic",
engine_version="4.0",
instance_class="redis.master.large.default",
instance_type="Redis",
resource_group_id="rg-123456",
security_ips=["10.23.12.24"],
tags={
"Created": "TF",
"For": "Test",
},
vswitch_id="vsw-123456",
zone_id="cn-beijing-h")
```
Transform To PrePaid
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.kvstore.Instance("example",
config={
"appendonly": "yes",
"lazyfree-lazy-eviction": "yes",
},
db_instance_name="tf-test-basic",
engine_version="4.0",
instance_class="redis.master.large.default",
instance_type="Redis",
payment_type="PrePaid",
period="12",
resource_group_id="rg-123456",
security_ips=["10.23.12.24"],
tags={
"Created": "TF",
"For": "Test",
},
vswitch_id="vsw-123456",
zone_id="cn-beijing-h")
```
Modify Private Connection String
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.kvstore.Instance("example",
config={
"appendonly": "yes",
"lazyfree-lazy-eviction": "yes",
},
db_instance_name="tf-test-basic",
engine_version="4.0",
instance_class="redis.master.large.default",
instance_type="Redis",
private_connection_prefix="privateconnectionstringprefix",
resource_group_id="rg-123456",
security_ips=["10.23.12.24"],
tags={
"Created": "TF",
"For": "Test",
},
vswitch_id="vsw-123456",
zone_id="cn-beijing-h")
```
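Set Maintenance Window And SSL (a sketch only; the IDs and zone below are placeholders, and `ssl_enable` requires a supported engine version and architecture, see the `ssl_enable` argument)
```python
import pulumi
import pulumi_alicloud as alicloud
# Placeholder values throughout; maintain_start_time/maintain_end_time use the HH:mmZ (UTC) format.
example = alicloud.kvstore.Instance("example",
    db_instance_name="tf-test-basic",
    engine_version="5.0",
    instance_class="redis.master.large.default",
    instance_type="Redis",
    maintain_start_time="02:00Z",
    maintain_end_time="03:00Z",
    ssl_enable="Enable",
    security_ips=["10.23.12.24"],
    vswitch_id="vsw-123456",
    zone_id="cn-beijing-h")
```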
## Import
KVStore instance can be imported using the id, e.g.
```sh
$ pulumi import alicloud:kvstore/instance:Instance example r-abc12345678
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_renew: Whether to renew the KVStore DBInstance automatically. It is valid when `payment_type` is `PrePaid`. Default to `false`.
:param pulumi.Input[int] auto_renew_period: Auto-renewal period of the KVStore DBInstance, in months. It is valid when `payment_type` is `PrePaid`. Valid values: [1~12]. Default to `1`.
:param pulumi.Input[bool] auto_use_coupon: Specifies whether to use a coupon. Default to: `false`.
:param pulumi.Input[str] availability_zone: It has been deprecated since provider version 1.101.0. Use `zone_id` instead.
:param pulumi.Input[str] backup_id: The ID of the backup file of the source instance.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_periods: Backup period.
:param pulumi.Input[str] backup_time: Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
:param pulumi.Input[str] business_info: The ID of the event or the business information.
:param pulumi.Input[int] capacity: The storage capacity of the KVStore DBInstance. Unit: MB.
:param pulumi.Input[Mapping[str, Any]] config: The configuration of the KVStore DBInstance. For available parameters, refer to the latest docs [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
:param pulumi.Input[str] connection_string_prefix: It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] coupon_no: The coupon code. Default to: `youhuiquan_promotion_option_id_for_blank`.
:param pulumi.Input[str] db_instance_name: The name of KVStore DBInstance. It is a string of 2 to 256 characters.
:param pulumi.Input[str] dedicated_host_group_id: The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
:param pulumi.Input[int] enable_backup_log: Turn on or off incremental backup. Valid values: `1`, `0`. Default to `0`
:param pulumi.Input[bool] enable_public: It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] engine_version: The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
:param pulumi.Input[bool] force_upgrade: Specifies whether to forcibly change the type. Default to: `true`.
:param pulumi.Input[bool] global_instance: Whether to create a distributed cache. Default to: `false`.
:param pulumi.Input[str] global_instance_id: The ID of distributed cache.
:param pulumi.Input[str] instance_charge_type: It has been deprecated since provider version 1.101.0. Use `payment_type` instead.
:param pulumi.Input[str] instance_name: It has been deprecated since provider version 1.101.0. Use `db_instance_name` instead.
:param pulumi.Input[bool] instance_release_protection: Whether to enable release protection for the instance.
:param pulumi.Input[str] instance_type: The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
:param pulumi.Input[str] kms_encrypted_password: A KMS-encrypted password used for the instance. If `password` is set, this field is ignored.
:param pulumi.Input[Mapping[str, Any]] kms_encryption_context: A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
:param pulumi.Input[str] maintain_end_time: The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[str] maintain_start_time: The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[int] modify_mode: The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Default to `0`. `0` overwrites the original whitelist, `1` adds the IP addresses to the whitelist, and `2` deletes the IP addresses from the whitelist.
:param pulumi.Input[str] node_type: Field 'node_type' has been deprecated since version 1.120.1. This parameter is determined by the `instance_class`.
:param pulumi.Input[str] order_type: Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Default to `UPGRADE`. `UPGRADE` upgrades the configuration of a subscription instance; `DOWNGRADE` downgrades it.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['InstanceParameterArgs']]]] parameters: It has been deprecated since provider version 1.101.0. Use `config` instead.
:param pulumi.Input[str] password: The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
:param pulumi.Input[str] payment_type: The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
:param pulumi.Input[str] period: The purchase duration of the KVStore DBInstance, in months. It is valid when `payment_type` is `PrePaid`. Valid values: `[1~9]`, `12`, `24`, `36`.
:param pulumi.Input[int] port: It has been deprecated since provider version 1.101.0. Use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] private_connection_port: The private network connection port, used to modify the port of the private connection string.
:param pulumi.Input[str] private_connection_prefix: The private network connection prefix, used to modify the private network connection address. Only updating private network connections for existing instances is supported.
:param pulumi.Input[str] private_ip: The internal IP address of the instance.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the resource belongs.
:param pulumi.Input[str] restore_time: The point in time of a backup file.
:param pulumi.Input[str] secondary_zone_id: The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
:param pulumi.Input[str] security_group_id: The ID of security groups.
:param pulumi.Input[str] security_ip_group_attribute: The attribute of the whitelist group. This parameter is empty by default. The console does not display whitelist groups whose attribute is `hidden`.
:param pulumi.Input[str] security_ip_group_name: The name of the whitelist group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_ips: The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
:param pulumi.Input[str] srcdb_instance_id: The ID of the source instance.
:param pulumi.Input[str] ssl_enable: Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
Note: This functionality is supported by Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
:param pulumi.Input[Mapping[str, Any]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] vpc_auth_mode: Only meaningful if `instance_type` is `Redis` and the network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the Redis instance can be accessed without authentication. `Open` means authentication is required.
:param pulumi.Input[str] vswitch_id: The ID of the VSwitch.
:param pulumi.Input[str] zone_id: The ID of the zone.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[InstanceArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides an ApsaraDB Redis / Memcache instance resource. A DB instance is an isolated database environment in the cloud. It can be associated with IP whitelists and backup configurations, which are provided as separate resources. For more information about the Alicloud KVStore DBInstance and how to use it, see [What is Resource Alicloud KVStore DBInstance](https://www.alibabacloud.com/help/doc-detail/60873.htm).
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.kvstore.Instance("example",
config={
"appendonly": "yes",
"lazyfree-lazy-eviction": "yes",
},
db_instance_name="tf-test-basic",
engine_version="4.0",
instance_class="redis.master.large.default",
instance_type="Redis",
resource_group_id="rg-123456",
security_ips=["10.23.12.24"],
tags={
"Created": "TF",
"For": "Test",
},
vswitch_id="vsw-123456",
zone_id="cn-beijing-h")
```
Transform To PrePaid
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.kvstore.Instance("example",
config={
"appendonly": "yes",
"lazyfree-lazy-eviction": "yes",
},
db_instance_name="tf-test-basic",
engine_version="4.0",
instance_class="redis.master.large.default",
instance_type="Redis",
payment_type="PrePaid",
period="12",
resource_group_id="rg-123456",
security_ips=["10.23.12.24"],
tags={
"Created": "TF",
"For": "Test",
},
vswitch_id="vsw-123456",
zone_id="cn-beijing-h")
```
Modify Private Connection String
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.kvstore.Instance("example",
config={
"appendonly": "yes",
"lazyfree-lazy-eviction": "yes",
},
db_instance_name="tf-test-basic",
engine_version="4.0",
instance_class="redis.master.large.default",
instance_type="Redis",
private_connection_prefix="privateconnectionstringprefix",
resource_group_id="rg-123456",
security_ips=["10.23.12.24"],
tags={
"Created": "TF",
"For": "Test",
},
vswitch_id="vsw-123456",
zone_id="cn-beijing-h")
```
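Set Maintenance Window And SSL (a sketch only; the IDs and zone below are placeholders, and `ssl_enable` requires a supported engine version and architecture, see the `ssl_enable` argument)
```python
import pulumi
import pulumi_alicloud as alicloud
# Placeholder values throughout; maintain_start_time/maintain_end_time use the HH:mmZ (UTC) format.
example = alicloud.kvstore.Instance("example",
    db_instance_name="tf-test-basic",
    engine_version="5.0",
    instance_class="redis.master.large.default",
    instance_type="Redis",
    maintain_start_time="02:00Z",
    maintain_end_time="03:00Z",
    ssl_enable="Enable",
    security_ips=["10.23.12.24"],
    vswitch_id="vsw-123456",
    zone_id="cn-beijing-h")
```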
## Import
KVStore instance can be imported using the id, e.g.
```sh
$ pulumi import alicloud:kvstore/instance:Instance example r-abc12345678
```
:param str resource_name: The name of the resource.
:param InstanceArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(InstanceArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_renew: Optional[pulumi.Input[bool]] = None,
auto_renew_period: Optional[pulumi.Input[int]] = None,
auto_use_coupon: Optional[pulumi.Input[bool]] = None,
availability_zone: Optional[pulumi.Input[str]] = None,
backup_id: Optional[pulumi.Input[str]] = None,
backup_periods: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_time: Optional[pulumi.Input[str]] = None,
business_info: Optional[pulumi.Input[str]] = None,
capacity: Optional[pulumi.Input[int]] = None,
config: Optional[pulumi.Input[Mapping[str, Any]]] = None,
connection_string_prefix: Optional[pulumi.Input[str]] = None,
coupon_no: Optional[pulumi.Input[str]] = None,
db_instance_name: Optional[pulumi.Input[str]] = None,
dedicated_host_group_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
enable_backup_log: Optional[pulumi.Input[int]] = None,
enable_public: Optional[pulumi.Input[bool]] = None,
engine_version: Optional[pulumi.Input[str]] = None,
force_upgrade: Optional[pulumi.Input[bool]] = None,
global_instance: Optional[pulumi.Input[bool]] = None,
global_instance_id: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
instance_name: Optional[pulumi.Input[str]] = None,
instance_release_protection: Optional[pulumi.Input[bool]] = None,
instance_type: Optional[pulumi.Input[str]] = None,
kms_encrypted_password: Optional[pulumi.Input[str]] = None,
kms_encryption_context: Optional[pulumi.Input[Mapping[str, Any]]] = None,
maintain_end_time: Optional[pulumi.Input[str]] = None,
maintain_start_time: Optional[pulumi.Input[str]] = None,
modify_mode: Optional[pulumi.Input[int]] = None,
node_type: Optional[pulumi.Input[str]] = None,
order_type: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['InstanceParameterArgs']]]]] = None,
password: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
private_connection_port: Optional[pulumi.Input[str]] = None,
private_connection_prefix: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
restore_time: Optional[pulumi.Input[str]] = None,
secondary_zone_id: Optional[pulumi.Input[str]] = None,
security_group_id: Optional[pulumi.Input[str]] = None,
security_ip_group_attribute: Optional[pulumi.Input[str]] = None,
security_ip_group_name: Optional[pulumi.Input[str]] = None,
security_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
srcdb_instance_id: Optional[pulumi.Input[str]] = None,
ssl_enable: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_auth_mode: Optional[pulumi.Input[str]] = None,
vswitch_id: Optional[pulumi.Input[str]] = None,
zone_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = InstanceArgs.__new__(InstanceArgs)
__props__.__dict__["auto_renew"] = auto_renew
__props__.__dict__["auto_renew_period"] = auto_renew_period
__props__.__dict__["auto_use_coupon"] = auto_use_coupon
if availability_zone is not None and not opts.urn:
warnings.warn("""Field 'availability_zone' has been deprecated from version 1.101.0. Use 'zone_id' instead.""", DeprecationWarning)
pulumi.log.warn("""availability_zone is deprecated: Field 'availability_zone' has been deprecated from version 1.101.0. Use 'zone_id' instead.""")
__props__.__dict__["availability_zone"] = availability_zone
__props__.__dict__["backup_id"] = backup_id
__props__.__dict__["backup_periods"] = backup_periods
__props__.__dict__["backup_time"] = backup_time
__props__.__dict__["business_info"] = business_info
__props__.__dict__["capacity"] = capacity
__props__.__dict__["config"] = config
if connection_string_prefix is not None and not opts.urn:
warnings.warn("""Field 'connection_string_prefix' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""connection_string_prefix is deprecated: Field 'connection_string_prefix' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
__props__.__dict__["connection_string_prefix"] = connection_string_prefix
__props__.__dict__["coupon_no"] = coupon_no
__props__.__dict__["db_instance_name"] = db_instance_name
__props__.__dict__["dedicated_host_group_id"] = dedicated_host_group_id
__props__.__dict__["dry_run"] = dry_run
__props__.__dict__["enable_backup_log"] = enable_backup_log
if enable_public is not None and not opts.urn:
warnings.warn("""Field 'enable_public' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""", DeprecationWarning)
pulumi.log.warn("""enable_public is deprecated: Field 'enable_public' has been deprecated from version 1.101.0. Please use resource 'alicloud_kvstore_connection' instead.""")
__props__.__dict__["enable_public"] = enable_public
__props__.__dict__["engine_version"] = engine_version
__props__.__dict__["force_upgrade"] = force_upgrade
__props__.__dict__["global_instance"] = global_instance
__props__.__dict__["global_instance_id"] = global_instance_id
if instance_charge_type is not None and not opts.urn:
warnings.warn("""Field 'instance_charge_type' has been deprecated from version 1.101.0. Use 'payment_type' instead.""", DeprecationWarning)
pulumi.log.warn("""instance_charge_type is deprecated: Field 'instance_charge_type' has been deprecated from version 1.101.0. Use 'payment_type' instead.""")
__props__.__dict__["instance_charge_type"] = instance_charge_type
__props__.__dict__["instance_class"] = instance_class
if instance_name is not None and not opts.urn:
warnings.warn("""Field 'instance_name' has been deprecated from version 1.101.0. Use 'db_instance_name' instead.""", DeprecationWarning)
pulumi.log.warn("""instance_name is deprecated: Field 'instance_name' has been deprecated from version 1.101.0. Use 'db_instance_name' instead.""")
__props__.__dict__["instance_name"] = instance_name
__props__.__dict__["instance_release_protection"] = instance_release_protection
__props__.__dict__["instance_type"] = instance_type
__props__.__dict__["kms_encrypted_password"] = kms_encrypted_password
__props__.__dict__["kms_encryption_context"] = kms_encryption_context
__props__.__dict__["maintain_end_time"] = maintain_end_time
__props__.__dict__["maintain_start_time"] = maintain_start_time
__props__.__dict__["modify_mode"] = modify_mode
if node_type is not None and not opts.urn:
warnings.warn("""Field 'node_type' has been deprecated from version 1.120.1""", DeprecationWarning)
pulumi.log.warn("""node_type is deprecated: Field 'node_type' has been deprecated from version 1.120.1""")
__props__.__dict__["node_type"] = node_type
__props__.__dict__["order_type"] = order_type
if parameters is not None and not opts.urn:
warnings.warn("""Field 'parameters' has been deprecated from version 1.101.0. Use 'config' instead.""", DeprecationWarning)
pulumi.log.warn("""parameters is deprecated: Field 'parameters' has been deprecated from version 1.101.0. Use 'config' instead.""")
__props__.__dict__["parameters"] = parameters
__props__.__dict__["password"] = password
__props__.__dict__["payment_type"] = payment_type
__props__.__dict__["period"] = period
__props__.__dict__["port"] = port
__props__.__dict__["private_connection_port"] = private_connection_port
__props__.__dict__["private_connection_prefix"] = private_connection_prefix
__props__.__dict__["private_ip"] = private_ip
__props__.__dict__["resource_group_id"] = resource_group_id
__props__.__dict__["restore_time"] = restore_time
__props__.__dict__["secondary_zone_id"] = secondary_zone_id
__props__.__dict__["security_group_id"] = security_group_id
__props__.__dict__["security_ip_group_attribute"] = security_ip_group_attribute
__props__.__dict__["security_ip_group_name"] = security_ip_group_name
__props__.__dict__["security_ips"] = security_ips
__props__.__dict__["srcdb_instance_id"] = srcdb_instance_id
__props__.__dict__["ssl_enable"] = ssl_enable
__props__.__dict__["tags"] = tags
__props__.__dict__["vpc_auth_mode"] = vpc_auth_mode
__props__.__dict__["vswitch_id"] = vswitch_id
__props__.__dict__["zone_id"] = zone_id
__props__.__dict__["bandwidth"] = None
__props__.__dict__["connection_domain"] = None
__props__.__dict__["connection_string"] = None
__props__.__dict__["end_time"] = None
__props__.__dict__["qps"] = None
__props__.__dict__["status"] = None
super(Instance, __self__).__init__(
'alicloud:kvstore/instance:Instance',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
auto_renew: Optional[pulumi.Input[bool]] = None,
auto_renew_period: Optional[pulumi.Input[int]] = None,
auto_use_coupon: Optional[pulumi.Input[bool]] = None,
availability_zone: Optional[pulumi.Input[str]] = None,
backup_id: Optional[pulumi.Input[str]] = None,
backup_periods: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_time: Optional[pulumi.Input[str]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
business_info: Optional[pulumi.Input[str]] = None,
capacity: Optional[pulumi.Input[int]] = None,
config: Optional[pulumi.Input[Mapping[str, Any]]] = None,
connection_domain: Optional[pulumi.Input[str]] = None,
connection_string: Optional[pulumi.Input[str]] = None,
connection_string_prefix: Optional[pulumi.Input[str]] = None,
coupon_no: Optional[pulumi.Input[str]] = None,
db_instance_name: Optional[pulumi.Input[str]] = None,
dedicated_host_group_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
enable_backup_log: Optional[pulumi.Input[int]] = None,
enable_public: Optional[pulumi.Input[bool]] = None,
end_time: Optional[pulumi.Input[str]] = None,
engine_version: Optional[pulumi.Input[str]] = None,
force_upgrade: Optional[pulumi.Input[bool]] = None,
global_instance: Optional[pulumi.Input[bool]] = None,
global_instance_id: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
instance_name: Optional[pulumi.Input[str]] = None,
instance_release_protection: Optional[pulumi.Input[bool]] = None,
instance_type: Optional[pulumi.Input[str]] = None,
kms_encrypted_password: Optional[pulumi.Input[str]] = None,
kms_encryption_context: Optional[pulumi.Input[Mapping[str, Any]]] = None,
maintain_end_time: Optional[pulumi.Input[str]] = None,
maintain_start_time: Optional[pulumi.Input[str]] = None,
modify_mode: Optional[pulumi.Input[int]] = None,
node_type: Optional[pulumi.Input[str]] = None,
order_type: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['InstanceParameterArgs']]]]] = None,
password: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
private_connection_port: Optional[pulumi.Input[str]] = None,
private_connection_prefix: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
qps: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
restore_time: Optional[pulumi.Input[str]] = None,
secondary_zone_id: Optional[pulumi.Input[str]] = None,
security_group_id: Optional[pulumi.Input[str]] = None,
security_ip_group_attribute: Optional[pulumi.Input[str]] = None,
security_ip_group_name: Optional[pulumi.Input[str]] = None,
security_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
srcdb_instance_id: Optional[pulumi.Input[str]] = None,
ssl_enable: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_auth_mode: Optional[pulumi.Input[str]] = None,
vswitch_id: Optional[pulumi.Input[str]] = None,
zone_id: Optional[pulumi.Input[str]] = None) -> 'Instance':
"""
Get an existing Instance resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
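A minimal lookup sketch (the resource name and instance ID below are hypothetical placeholders, not real values):

```python
import pulumi
import pulumi_alicloud as alicloud

# Adopt an existing KVStore instance into this stack's state by its
# provider-assigned ID ("r-abc123example" is a placeholder, not a real ID).
existing = alicloud.kvstore.Instance.get("imported-redis", id="r-abc123example")

pulumi.export("engineVersion", existing.engine_version)
```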
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_renew: Whether to renew the KVStore DBInstance automatically. It is valid when payment_type is `PrePaid`. Defaults to `false`.
:param pulumi.Input[int] auto_renew_period: Auto-renewal period of a KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: `1` to `12`. Defaults to `1`.
:param pulumi.Input[bool] auto_use_coupon: Specifies whether to use a coupon. Defaults to `false`.
:param pulumi.Input[str] availability_zone: It has been deprecated since provider version 1.101.0; use `zone_id` instead.
:param pulumi.Input[str] backup_id: The ID of the backup file of the source instance.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_periods: Backup period.
:param pulumi.Input[str] backup_time: Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
:param pulumi.Input[int] bandwidth: The bandwidth.
:param pulumi.Input[str] business_info: The ID of the event or the business information.
:param pulumi.Input[int] capacity: The storage capacity of the KVStore DBInstance. Unit: MB.
:param pulumi.Input[Mapping[str, Any]] config: The configuration of the KVStore DBInstance. For available parameters, refer to the latest [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
:param pulumi.Input[str] connection_string_prefix: It has been deprecated since provider version 1.101.0; use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] coupon_no: The coupon code. Defaults to `youhuiquan_promotion_option_id_for_blank`.
:param pulumi.Input[str] db_instance_name: The name of the KVStore DBInstance. It is a string of 2 to 256 characters.
:param pulumi.Input[str] dedicated_host_group_id: The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
:param pulumi.Input[int] enable_backup_log: Turns incremental backup on or off. Valid values: `1`, `0`. Defaults to `0`.
:param pulumi.Input[bool] enable_public: It has been deprecated since provider version 1.101.0; use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] end_time: The expiration time of the prepaid instance.
:param pulumi.Input[str] engine_version: The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
:param pulumi.Input[bool] force_upgrade: Specifies whether to forcibly change the type. Defaults to `true`.
:param pulumi.Input[bool] global_instance: Whether to create a distributed cache. Defaults to `false`.
:param pulumi.Input[str] global_instance_id: The ID of the distributed cache.
:param pulumi.Input[str] instance_charge_type: It has been deprecated since provider version 1.101.0; use `payment_type` instead.
:param pulumi.Input[str] instance_name: It has been deprecated since provider version 1.101.0; use `db_instance_name` instead.
:param pulumi.Input[bool] instance_release_protection: Whether to enable release protection.
:param pulumi.Input[str] instance_type: The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
:param pulumi.Input[str] kms_encrypted_password: A KMS-encrypted password used for the instance. If `password` is set, this field is ignored.
:param pulumi.Input[Mapping[str, Any]] kms_encryption_context: A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
:param pulumi.Input[str] maintain_end_time: The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[str] maintain_start_time: The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
:param pulumi.Input[int] modify_mode: The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Defaults to `0`. `0` overwrites the original whitelist, `1` adds the IP addresses to the whitelist, and `2` deletes the IP addresses from the whitelist.
:param pulumi.Input[str] node_type: It has been deprecated since provider version 1.120.1; the value is determined by `instance_class`.
:param pulumi.Input[str] order_type: Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Defaults to `UPGRADE`. `UPGRADE` upgrades the configuration of a subscription instance; `DOWNGRADE` downgrades it.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['InstanceParameterArgs']]]] parameters: It has been deprecated since provider version 1.101.0; use `config` instead.
:param pulumi.Input[str] password: The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
:param pulumi.Input[str] payment_type: The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
:param pulumi.Input[str] period: The subscription duration of the KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: `1` to `9`, `12`, `24`, `36`.
:param pulumi.Input[int] port: It has been deprecated since provider version 1.101.0; use the resource `kvstore.Connection` instead.
:param pulumi.Input[str] private_connection_port: The private network connection port, used to modify the instance's private network connection port.
:param pulumi.Input[str] private_connection_prefix: The private network connection prefix, used to modify the private network connection address. Only supports updating private network connections for existing instances.
:param pulumi.Input[str] private_ip: The internal IP address of the instance.
:param pulumi.Input[int] qps: Theoretical maximum QPS value.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the resource belongs.
:param pulumi.Input[str] restore_time: The point in time of a backup file.
:param pulumi.Input[str] secondary_zone_id: The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
:param pulumi.Input[str] security_group_id: The ID of the security group.
:param pulumi.Input[str] security_ip_group_attribute: The attribute of the whitelist group. Empty by default. The console does not display whitelist groups whose value for this parameter is `hidden`.
:param pulumi.Input[str] security_ip_group_name: The name of the whitelist group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_ips: The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
:param pulumi.Input[str] srcdb_instance_id: The ID of the source instance.
:param pulumi.Input[str] ssl_enable: Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
Note: This functionality is supported by Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
:param pulumi.Input[str] status: The status of the KVStore DBInstance.
* `connection_domain` - Intranet connection address of the KVStore instance.
:param pulumi.Input[Mapping[str, Any]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] vpc_auth_mode: Only meaningful if instance_type is `Redis` and network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the redis instance can be accessed without authentication. `Open` means authentication is required.
:param pulumi.Input[str] vswitch_id: The ID of the VSwitch.
:param pulumi.Input[str] zone_id: The ID of the zone.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _InstanceState.__new__(_InstanceState)
__props__.__dict__["auto_renew"] = auto_renew
__props__.__dict__["auto_renew_period"] = auto_renew_period
__props__.__dict__["auto_use_coupon"] = auto_use_coupon
__props__.__dict__["availability_zone"] = availability_zone
__props__.__dict__["backup_id"] = backup_id
__props__.__dict__["backup_periods"] = backup_periods
__props__.__dict__["backup_time"] = backup_time
__props__.__dict__["bandwidth"] = bandwidth
__props__.__dict__["business_info"] = business_info
__props__.__dict__["capacity"] = capacity
__props__.__dict__["config"] = config
__props__.__dict__["connection_domain"] = connection_domain
__props__.__dict__["connection_string"] = connection_string
__props__.__dict__["connection_string_prefix"] = connection_string_prefix
__props__.__dict__["coupon_no"] = coupon_no
__props__.__dict__["db_instance_name"] = db_instance_name
__props__.__dict__["dedicated_host_group_id"] = dedicated_host_group_id
__props__.__dict__["dry_run"] = dry_run
__props__.__dict__["enable_backup_log"] = enable_backup_log
__props__.__dict__["enable_public"] = enable_public
__props__.__dict__["end_time"] = end_time
__props__.__dict__["engine_version"] = engine_version
__props__.__dict__["force_upgrade"] = force_upgrade
__props__.__dict__["global_instance"] = global_instance
__props__.__dict__["global_instance_id"] = global_instance_id
__props__.__dict__["instance_charge_type"] = instance_charge_type
__props__.__dict__["instance_class"] = instance_class
__props__.__dict__["instance_name"] = instance_name
__props__.__dict__["instance_release_protection"] = instance_release_protection
__props__.__dict__["instance_type"] = instance_type
__props__.__dict__["kms_encrypted_password"] = kms_encrypted_password
__props__.__dict__["kms_encryption_context"] = kms_encryption_context
__props__.__dict__["maintain_end_time"] = maintain_end_time
__props__.__dict__["maintain_start_time"] = maintain_start_time
__props__.__dict__["modify_mode"] = modify_mode
__props__.__dict__["node_type"] = node_type
__props__.__dict__["order_type"] = order_type
__props__.__dict__["parameters"] = parameters
__props__.__dict__["password"] = password
__props__.__dict__["payment_type"] = payment_type
__props__.__dict__["period"] = period
__props__.__dict__["port"] = port
__props__.__dict__["private_connection_port"] = private_connection_port
__props__.__dict__["private_connection_prefix"] = private_connection_prefix
__props__.__dict__["private_ip"] = private_ip
__props__.__dict__["qps"] = qps
__props__.__dict__["resource_group_id"] = resource_group_id
__props__.__dict__["restore_time"] = restore_time
__props__.__dict__["secondary_zone_id"] = secondary_zone_id
__props__.__dict__["security_group_id"] = security_group_id
__props__.__dict__["security_ip_group_attribute"] = security_ip_group_attribute
__props__.__dict__["security_ip_group_name"] = security_ip_group_name
__props__.__dict__["security_ips"] = security_ips
__props__.__dict__["srcdb_instance_id"] = srcdb_instance_id
__props__.__dict__["ssl_enable"] = ssl_enable
__props__.__dict__["status"] = status
__props__.__dict__["tags"] = tags
__props__.__dict__["vpc_auth_mode"] = vpc_auth_mode
__props__.__dict__["vswitch_id"] = vswitch_id
__props__.__dict__["zone_id"] = zone_id
return Instance(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="autoRenew")
def auto_renew(self) -> pulumi.Output[Optional[bool]]:
"""
Whether to renew the KVStore DBInstance automatically. It is valid when payment_type is `PrePaid`. Defaults to `false`.
"""
return pulumi.get(self, "auto_renew")
@property
@pulumi.getter(name="autoRenewPeriod")
def auto_renew_period(self) -> pulumi.Output[Optional[int]]:
"""
Auto-renewal period of a KVStore DBInstance, in months. It is valid when payment_type is `PrePaid`. Valid values: `1` to `12`. Defaults to `1`.
"""
return pulumi.get(self, "auto_renew_period")
@property
@pulumi.getter(name="autoUseCoupon")
def auto_use_coupon(self) -> pulumi.Output[Optional[bool]]:
"""
Specifies whether to use a coupon. Defaults to `false`.
"""
return pulumi.get(self, "auto_use_coupon")
@property
@pulumi.getter(name="availabilityZone")
def availability_zone(self) -> pulumi.Output[str]:
"""
It has been deprecated since provider version 1.101.0; use `zone_id` instead.
"""
return pulumi.get(self, "availability_zone")
@property
@pulumi.getter(name="backupId")
def backup_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the backup file of the source instance.
"""
return pulumi.get(self, "backup_id")
@property
@pulumi.getter(name="backupPeriods")
def backup_periods(self) -> pulumi.Output[Sequence[str]]:
"""
Backup period.
"""
return pulumi.get(self, "backup_periods")
@property
@pulumi.getter(name="backupTime")
def backup_time(self) -> pulumi.Output[str]:
"""
Backup time, the format is HH:mmZ-HH:mmZ (UTC time).
"""
return pulumi.get(self, "backup_time")
@property
@pulumi.getter
def bandwidth(self) -> pulumi.Output[int]:
"""
The bandwidth.
"""
return pulumi.get(self, "bandwidth")
@property
@pulumi.getter(name="businessInfo")
def business_info(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the event or the business information.
"""
return pulumi.get(self, "business_info")
@property
@pulumi.getter
def capacity(self) -> pulumi.Output[int]:
"""
The storage capacity of the KVStore DBInstance. Unit: MB.
"""
return pulumi.get(self, "capacity")
@property
@pulumi.getter
def config(self) -> pulumi.Output[Optional[Mapping[str, Any]]]:
"""
The configuration of the KVStore DBInstance. For available parameters, refer to the latest [Instance configurations table](https://www.alibabacloud.com/help/doc-detail/61209.htm).
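For illustration, a hypothetical `config` value (parameter names follow the linked table; the values here are examples only):

```python
config={
    "maxmemory-policy": "volatile-lru",
    "appendonly": "yes",
}
```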
"""
return pulumi.get(self, "config")
@property
@pulumi.getter(name="connectionDomain")
def connection_domain(self) -> pulumi.Output[str]:
return pulumi.get(self, "connection_domain")
@property
@pulumi.getter(name="connectionString")
def connection_string(self) -> pulumi.Output[str]:
return pulumi.get(self, "connection_string")
@property
@pulumi.getter(name="connectionStringPrefix")
def connection_string_prefix(self) -> pulumi.Output[Optional[str]]:
"""
It has been deprecated since provider version 1.101.0; use the resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "connection_string_prefix")
@property
@pulumi.getter(name="couponNo")
def coupon_no(self) -> pulumi.Output[Optional[str]]:
"""
The coupon code. Defaults to `youhuiquan_promotion_option_id_for_blank`.
"""
return pulumi.get(self, "coupon_no")
@property
@pulumi.getter(name="dbInstanceName")
def db_instance_name(self) -> pulumi.Output[str]:
"""
The name of the KVStore DBInstance. It is a string of 2 to 256 characters.
"""
return pulumi.get(self, "db_instance_name")
@property
@pulumi.getter(name="dedicatedHostGroupId")
def dedicated_host_group_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the dedicated cluster. This parameter is required when you create an ApsaraDB for Redis instance in a dedicated cluster.
"""
return pulumi.get(self, "dedicated_host_group_id")
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> pulumi.Output[Optional[bool]]:
"""
Specifies whether to precheck the request. Valid values:
* true: prechecks the request without creating an instance. The system prechecks the required parameters, request format, service limits, and available resources. If the request fails the precheck, the corresponding error message is returned. If the request passes the precheck, the DryRunOperation error code is returned.
* false: checks the request. After the request passes the check, an instance is created.
"""
return pulumi.get(self, "dry_run")
@property
@pulumi.getter(name="enableBackupLog")
def enable_backup_log(self) -> pulumi.Output[Optional[int]]:
"""
Turns incremental backup on or off. Valid values: `1`, `0`. Defaults to `0`.
"""
return pulumi.get(self, "enable_backup_log")
@property
@pulumi.getter(name="enablePublic")
def enable_public(self) -> pulumi.Output[bool]:
"""
It has been deprecated since provider version 1.101.0; use the resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "enable_public")
@property
@pulumi.getter(name="endTime")
def end_time(self) -> pulumi.Output[str]:
"""
The expiration time of the prepaid instance.
"""
return pulumi.get(self, "end_time")
@property
@pulumi.getter(name="engineVersion")
def engine_version(self) -> pulumi.Output[Optional[str]]:
"""
The engine version of the KVStore DBInstance. Valid values: `2.8`, `4.0` and `5.0`. Default to `5.0`.
"""
return pulumi.get(self, "engine_version")
@property
@pulumi.getter(name="forceUpgrade")
def force_upgrade(self) -> pulumi.Output[Optional[bool]]:
"""
Specifies whether to forcibly change the type. Defaults to `true`.
"""
return pulumi.get(self, "force_upgrade")
@property
@pulumi.getter(name="globalInstance")
def global_instance(self) -> pulumi.Output[Optional[bool]]:
"""
Whether to create a distributed cache. Defaults to `false`.
"""
return pulumi.get(self, "global_instance")
@property
@pulumi.getter(name="globalInstanceId")
def global_instance_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the distributed cache.
"""
return pulumi.get(self, "global_instance_id")
@property
@pulumi.getter(name="instanceChargeType")
def instance_charge_type(self) -> pulumi.Output[str]:
"""
It has been deprecated since provider version 1.101.0; use `payment_type` instead.
"""
return pulumi.get(self, "instance_charge_type")
@property
@pulumi.getter(name="instanceClass")
def instance_class(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "instance_class")
@property
@pulumi.getter(name="instanceName")
def instance_name(self) -> pulumi.Output[str]:
"""
It has been deprecated since provider version 1.101.0; use `db_instance_name` instead.
"""
return pulumi.get(self, "instance_name")
@property
@pulumi.getter(name="instanceReleaseProtection")
def instance_release_protection(self) -> pulumi.Output[bool]:
"""
Whether to enable release protection.
"""
return pulumi.get(self, "instance_release_protection")
@property
@pulumi.getter(name="instanceType")
def instance_type(self) -> pulumi.Output[Optional[str]]:
"""
The engine type of the KVStore DBInstance. Valid values: `Redis` or `Memcache`. Defaults to `Redis`.
"""
return pulumi.get(self, "instance_type")
@property
@pulumi.getter(name="kmsEncryptedPassword")
def kms_encrypted_password(self) -> pulumi.Output[Optional[str]]:
"""
A KMS-encrypted password used for the instance. If `password` is set, this field is ignored.
"""
return pulumi.get(self, "kms_encrypted_password")
@property
@pulumi.getter(name="kmsEncryptionContext")
def kms_encryption_context(self) -> pulumi.Output[Optional[Mapping[str, Any]]]:
"""
A KMS encryption context used to decrypt `kms_encrypted_password` before creating or updating an instance with `kms_encrypted_password`. See [Encryption Context](https://www.alibabacloud.com/help/doc-detail/42975.htm). It is valid when `kms_encrypted_password` is set.
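For illustration, a hypothetical encryption context (keys and values are placeholders; they must match the context used when the password was encrypted):

```python
kms_encryption_context={
    "example-context-key": "example-context-value",
}
```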
"""
return pulumi.get(self, "kms_encryption_context")
@property
@pulumi.getter(name="maintainEndTime")
def maintain_end_time(self) -> pulumi.Output[str]:
"""
The end time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
"""
return pulumi.get(self, "maintain_end_time")
@property
@pulumi.getter(name="maintainStartTime")
def maintain_start_time(self) -> pulumi.Output[str]:
"""
The start time of the operation and maintenance time period of the KVStore DBInstance, in the format of HH:mmZ (UTC time).
"""
return pulumi.get(self, "maintain_start_time")
@property
@pulumi.getter(name="modifyMode")
def modify_mode(self) -> pulumi.Output[Optional[int]]:
"""
The method of modifying the whitelist. Valid values: `0`, `1` and `2`. Defaults to `0`. `0` overwrites the original whitelist, `1` adds the IP addresses to the whitelist, and `2` deletes the IP addresses from the whitelist.
"""
return pulumi.get(self, "modify_mode")
@property
@pulumi.getter(name="nodeType")
def node_type(self) -> pulumi.Output[str]:
"""
It has been deprecated since provider version 1.120.1; the value is determined by `instance_class`.
"""
return pulumi.get(self, "node_type")
@property
@pulumi.getter(name="orderType")
def order_type(self) -> pulumi.Output[Optional[str]]:
"""
Specifies a change type when you change the configuration of a subscription instance. Valid values: `UPGRADE`, `DOWNGRADE`. Defaults to `UPGRADE`. `UPGRADE` upgrades the configuration of a subscription instance; `DOWNGRADE` downgrades it.
"""
return pulumi.get(self, "order_type")
@property
@pulumi.getter
def parameters(self) -> pulumi.Output[Sequence['outputs.InstanceParameter']]:
"""
It has been deprecated since provider version 1.101.0; use `config` instead.
"""
return pulumi.get(self, "parameters")
@property
@pulumi.getter
def password(self) -> pulumi.Output[Optional[str]]:
"""
The password of the KVStore DBInstance. The password is a string of 8 to 30 characters and must contain uppercase letters, lowercase letters, and numbers.
"""
return pulumi.get(self, "password")
@property
@pulumi.getter(name="paymentType")
def payment_type(self) -> pulumi.Output[str]:
"""
The billing method of the KVStore DBInstance. Valid values: `PrePaid`, `PostPaid`. Default to `PostPaid`.
"""
return pulumi.get(self, "payment_type")
@property
@pulumi.getter
def period(self) -> pulumi.Output[Optional[str]]:
"""
        The duration for which you purchase the KVStore DBInstance, in months. Only valid when `payment_type` is `PrePaid`. Valid values: `1` to `9`, `12`, `24`, `36`.
"""
return pulumi.get(self, "period")
@property
@pulumi.getter
def port(self) -> pulumi.Output[Optional[int]]:
"""
        Field `port` has been deprecated from provider version 1.101.0. Use the resource `kvstore.Connection` instead.
"""
return pulumi.get(self, "port")
@property
@pulumi.getter(name="privateConnectionPort")
def private_connection_port(self) -> pulumi.Output[str]:
"""
        The port of the private network connection; set this to modify the private connection port.
"""
return pulumi.get(self, "private_connection_port")
@property
@pulumi.getter(name="privateConnectionPrefix")
def private_connection_prefix(self) -> pulumi.Output[Optional[str]]:
"""
        The prefix of the private network connection address; set this to modify the address. Updating the private network connection is supported only for existing instances.
"""
return pulumi.get(self, "private_connection_prefix")
@property
@pulumi.getter(name="privateIp")
def private_ip(self) -> pulumi.Output[str]:
"""
The internal IP address of the instance.
"""
return pulumi.get(self, "private_ip")
@property
@pulumi.getter
def qps(self) -> pulumi.Output[int]:
"""
Theoretical maximum QPS value.
"""
return pulumi.get(self, "qps")
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> pulumi.Output[str]:
"""
        The ID of the resource group to which the resource belongs.
"""
return pulumi.get(self, "resource_group_id")
@property
@pulumi.getter(name="restoreTime")
def restore_time(self) -> pulumi.Output[Optional[str]]:
"""
The point in time of a backup file.
"""
return pulumi.get(self, "restore_time")
@property
@pulumi.getter(name="secondaryZoneId")
def secondary_zone_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the secondary zone to which you want to migrate the ApsaraDB for Redis instance.
"""
return pulumi.get(self, "secondary_zone_id")
@property
@pulumi.getter(name="securityGroupId")
def security_group_id(self) -> pulumi.Output[Optional[str]]:
"""
        The ID of the security group.
"""
return pulumi.get(self, "security_group_id")
@property
@pulumi.getter(name="securityIpGroupAttribute")
def security_ip_group_attribute(self) -> pulumi.Output[Optional[str]]:
"""
        The attribute of the whitelist group. This parameter is empty by default. The console does not display whitelist groups whose attribute is set to `hidden`.
"""
return pulumi.get(self, "security_ip_group_attribute")
@property
@pulumi.getter(name="securityIpGroupName")
def security_ip_group_name(self) -> pulumi.Output[str]:
"""
The name of the whitelist group.
"""
return pulumi.get(self, "security_ip_group_name")
@property
@pulumi.getter(name="securityIps")
def security_ips(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
The IP addresses in the whitelist group. The maximum number of IP addresses in the whitelist group is 1000.
"""
return pulumi.get(self, "security_ips")
@property
@pulumi.getter(name="srcdbInstanceId")
def srcdb_instance_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the source instance.
"""
return pulumi.get(self, "srcdb_instance_id")
@property
@pulumi.getter(name="sslEnable")
def ssl_enable(self) -> pulumi.Output[Optional[str]]:
"""
Modifies the SSL status. Valid values: `Disable`, `Enable` and `Update`.
        Note: This functionality is supported by Cluster mode (Redis 2.8, 4.0, 5.0) and Standard mode (Redis 2.8 only).
"""
return pulumi.get(self, "ssl_enable")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
The status of KVStore DBInstance.
        * `connection_domain` - Intranet connection address of the KVStore instance.
"""
return pulumi.get(self, "status")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, Any]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="vpcAuthMode")
def vpc_auth_mode(self) -> pulumi.Output[Optional[str]]:
"""
        Only meaningful if `instance_type` is `Redis` and the network type is VPC. Valid values: `Close`, `Open`. Defaults to `Open`. `Close` means the Redis instance can be accessed without authentication; `Open` means authentication is required.
"""
return pulumi.get(self, "vpc_auth_mode")
@property
@pulumi.getter(name="vswitchId")
def vswitch_id(self) -> pulumi.Output[Optional[str]]:
"""
        The ID of the VSwitch.
"""
return pulumi.get(self, "vswitch_id")
@property
@pulumi.getter(name="zoneId")
def zone_id(self) -> pulumi.Output[str]:
"""
The ID of the zone.
"""
return pulumi.get(self, "zone_id")
| 52.473546 | 418 | 0.668518 | 19,987 | 159,677 | 5.130085 | 0.023065 | 0.082928 | 0.069908 | 0.072951 | 0.979256 | 0.974418 | 0.967826 | 0.964344 | 0.960267 | 0.941581 | 0 | 0.008708 | 0.226138 | 159,677 | 3,042 | 419 | 52.490796 | 0.821078 | 0.34633 | 0 | 0.91188 | 1 | 0.021194 | 0.151795 | 0.033894 | 0 | 0 | 0 | 0 | 0 | 1 | 0.164529 | false | 0.027886 | 0.003904 | 0.003904 | 0.267708 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
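The `modify_mode` docstring above distinguishes three whitelist operations (`0` overwrite, `1` add, `2` delete). As a provider-independent illustration of those semantics (the helper name `apply_modify_mode` is ours, not part of the SDK):

```python
def apply_modify_mode(whitelist, ips, modify_mode=0):
    """Apply a whitelist change according to the documented modify_mode values."""
    if modify_mode == 0:
        # 0: overwrite the original whitelist
        return set(ips)
    if modify_mode == 1:
        # 1: add the IP addresses to the whitelist
        return set(whitelist) | set(ips)
    if modify_mode == 2:
        # 2: delete the IP addresses from the whitelist
        return set(whitelist) - set(ips)
    raise ValueError("modify_mode must be 0, 1 or 2")


current = {"10.0.0.1", "10.0.0.2"}
print(apply_modify_mode(current, {"10.0.0.3"}, modify_mode=1))
```

This mirrors why `0` is a safe default only when the full whitelist is supplied each time: an overwrite with a partial list silently drops the other entries.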
041d2c90649cb898a75c3f69546eed49fe56d718 | 22,642 | py | Python | saleor/graphql/payment/tests/mutations/test_checkout_payment_create.py | DevPoke/saleor | ced3a2249a18031f9f593e71d1d18aa787ec1060 | [
"CC-BY-4.0"
] | null | null | null | saleor/graphql/payment/tests/mutations/test_checkout_payment_create.py | DevPoke/saleor | ced3a2249a18031f9f593e71d1d18aa787ec1060 | [
"CC-BY-4.0"
] | null | null | null | saleor/graphql/payment/tests/mutations/test_checkout_payment_create.py | DevPoke/saleor | ced3a2249a18031f9f593e71d1d18aa787ec1060 | [
"CC-BY-4.0"
] | null | null | null | from decimal import Decimal
from unittest.mock import patch
import graphene
import pytest
from .....checkout import calculations
from .....checkout.error_codes import CheckoutErrorCode
from .....checkout.fetch import fetch_checkout_info, fetch_checkout_lines
from .....checkout.utils import add_variant_to_checkout
from .....payment.error_codes import PaymentErrorCode
from .....payment.interface import StorePaymentMethodEnum
from .....payment.models import ChargeStatus, Payment
from .....plugins.manager import get_plugins_manager
from ....core.utils import to_global_id_or_none
from ....tests.utils import get_graphql_content
DUMMY_GATEWAY = "mirumee.payments.dummy"
CREATE_PAYMENT_MUTATION = """
mutation CheckoutPaymentCreate(
$id: ID,
$input: PaymentInput!,
) {
checkoutPaymentCreate(
id: $id,
input: $input,
) {
payment {
chargeStatus
}
errors {
code
field
variants
}
}
}
"""
def test_checkout_add_payment_without_shipping_method_and_not_shipping_required(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert not data["errors"]
payment = Payment.objects.get()
assert payment.checkout == checkout
assert payment.is_active
assert payment.token == "sample-token"
assert payment.total == total.gross.amount
assert payment.currency == total.gross.currency
assert payment.charge_status == ChargeStatus.NOT_CHARGED
assert payment.billing_address_1 == checkout.billing_address.street_address_1
assert payment.billing_first_name == checkout.billing_address.first_name
assert payment.billing_last_name == checkout.billing_address.last_name
def test_checkout_add_payment_without_shipping_method_with_shipping_required(
user_api_client, checkout_with_shipping_required, address
):
checkout = checkout_with_shipping_required
checkout.billing_address = address
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert data["errors"][0]["code"] == "SHIPPING_METHOD_NOT_SET"
assert data["errors"][0]["field"] == "shippingMethod"
def test_checkout_add_payment_with_shipping_method_and_shipping_required(
user_api_client, checkout_with_shipping_required, other_shipping_method, address
):
checkout = checkout_with_shipping_required
checkout.billing_address = address
checkout.shipping_address = address
checkout.shipping_method = other_shipping_method
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert not data["errors"]
payment = Payment.objects.get()
assert payment.checkout == checkout
assert payment.is_active
assert payment.token == "sample-token"
assert payment.total == total.gross.amount
assert payment.currency == total.gross.currency
assert payment.charge_status == ChargeStatus.NOT_CHARGED
assert payment.billing_address_1 == checkout.billing_address.street_address_1
assert payment.billing_first_name == checkout.billing_address.first_name
assert payment.billing_last_name == checkout.billing_address.last_name
def test_checkout_add_payment(
user_api_client, checkout_without_shipping_required, address, customer_user
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.email = "old@example"
checkout.user = customer_user
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
return_url = "https://www.example.com"
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"returnUrl": return_url,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert not data["errors"]
payment = Payment.objects.get()
assert payment.checkout == checkout
assert payment.is_active
assert payment.token == "sample-token"
assert payment.total == total.gross.amount
assert payment.currency == total.gross.currency
assert payment.charge_status == ChargeStatus.NOT_CHARGED
assert payment.billing_address_1 == checkout.billing_address.street_address_1
assert payment.billing_first_name == checkout.billing_address.first_name
assert payment.billing_last_name == checkout.billing_address.last_name
assert payment.return_url == return_url
assert payment.billing_email == customer_user.email
def test_checkout_add_payment_default_amount(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {"gateway": DUMMY_GATEWAY, "token": "sample-token"},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert not data["errors"]
payment = Payment.objects.get()
assert payment.checkout == checkout
assert payment.is_active
assert payment.token == "sample-token"
assert payment.total == total.gross.amount
assert payment.currency == total.gross.currency
assert payment.charge_status == ChargeStatus.NOT_CHARGED
def test_checkout_add_payment_bad_amount(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": str(total.gross.amount + Decimal(1)),
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert (
data["errors"][0]["code"] == PaymentErrorCode.PARTIAL_PAYMENT_NOT_ALLOWED.name
)
def test_checkout_add_payment_no_checkout_email(
user_api_client, checkout_without_shipping_required, address, customer_user
):
checkout = checkout_without_shipping_required
checkout.email = None
checkout.save(update_fields=["email"])
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
return_url = "https://www.example.com"
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"returnUrl": return_url,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert len(data["errors"]) == 1
assert data["errors"][0]["code"] == PaymentErrorCode.CHECKOUT_EMAIL_NOT_SET.name
@patch(
"saleor.payment.gateways.dummy.plugin.DummyGatewayPlugin.CONFIGURATION_PER_CHANNEL",
False,
)
def test_checkout_add_payment_not_supported_currency(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.currency = "EUR"
checkout.save(update_fields=["billing_address", "currency"])
variables = {
"id": to_global_id_or_none(checkout),
"input": {"gateway": DUMMY_GATEWAY, "token": "sample-token", "amount": "10.0"},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert data["errors"][0]["code"] == PaymentErrorCode.NOT_SUPPORTED_GATEWAY.name
assert data["errors"][0]["field"] == "gateway"
def test_checkout_add_payment_not_existing_gateway(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save(update_fields=["billing_address", "currency"])
variables = {
"id": to_global_id_or_none(checkout),
"input": {"gateway": "not.existing", "token": "sample-token", "amount": "10.0"},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert data["errors"][0]["code"] == PaymentErrorCode.NOT_SUPPORTED_GATEWAY.name
assert data["errors"][0]["field"] == "gateway"
@patch("saleor.payment.gateways.dummy.plugin.DummyGatewayPlugin.DEFAULT_ACTIVE", False)
def test_checkout_add_payment_gateway_inactive(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save(update_fields=["billing_address", "currency"])
variables = {
"id": to_global_id_or_none(checkout),
"input": {"gateway": DUMMY_GATEWAY, "token": "sample-token", "amount": "10.0"},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
assert data["errors"][0]["code"] == PaymentErrorCode.NOT_SUPPORTED_GATEWAY.name
assert data["errors"][0]["field"] == "gateway"
def test_use_checkout_billing_address_as_payment_billing(
user_api_client, checkout_without_shipping_required, address
):
checkout = checkout_without_shipping_required
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
# check if proper error is returned if address is missing
assert data["errors"][0]["field"] == "billingAddress"
assert data["errors"][0]["code"] == PaymentErrorCode.BILLING_ADDRESS_NOT_SET.name
# assign the address and try again
address.street_address_1 = "spanish-inqusition"
address.save()
checkout.billing_address = address
checkout.save()
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
get_graphql_content(response)
checkout.refresh_from_db()
assert checkout.payments.count() == 1
payment = checkout.payments.first()
assert payment.billing_address_1 == address.street_address_1
def test_create_payment_for_checkout_with_active_payments(
checkout_with_payments, user_api_client, address, product_without_shipping
):
# given
checkout = checkout_with_payments
address.street_address_1 = "spanish-inqusition"
address.save()
checkout.billing_address = address
manager = get_plugins_manager()
variant = product_without_shipping.variants.get()
checkout_info = fetch_checkout_info(checkout, [], [], manager)
add_variant_to_checkout(checkout_info, variant, 1)
checkout.save()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
},
}
payments_count = checkout.payments.count()
previous_active_payments = checkout.payments.filter(is_active=True)
previous_active_payments_ids = list(
previous_active_payments.values_list("pk", flat=True)
)
assert len(previous_active_payments_ids) > 0
# when
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
# then
data = content["data"]["checkoutPaymentCreate"]
assert not data["errors"]
checkout.refresh_from_db()
assert checkout.payments.all().count() == payments_count + 1
active_payments = checkout.payments.all().filter(is_active=True)
assert active_payments.count() == 1
assert active_payments.first().pk not in previous_active_payments_ids
@pytest.mark.parametrize(
"store",
[
StorePaymentMethodEnum.NONE,
StorePaymentMethodEnum.ON_SESSION,
StorePaymentMethodEnum.OFF_SESSION,
],
)
def test_create_payment_with_store(
user_api_client, checkout_without_shipping_required, address, store
):
# given
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"storePaymentMethod": store,
},
}
# when
user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
# then
checkout.refresh_from_db()
payment = checkout.payments.first()
assert payment.store_payment_method == store.lower()
@pytest.mark.parametrize(
"metadata", [[{"key": f"key{i}", "value": f"value{i}"} for i in range(5)], [], None]
)
def test_create_payment_with_metadata(
user_api_client, checkout_without_shipping_required, address, metadata
):
# given
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.save()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"metadata": metadata,
},
}
# when
user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
# then
checkout.refresh_from_db()
payment = checkout.payments.first()
assert payment.metadata == {m["key"]: m["value"] for m in metadata or {}}
def test_checkout_add_payment_no_variant_channel_listings(
user_api_client, checkout_without_shipping_required, address, customer_user
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.email = "old@example"
checkout.user = customer_user
checkout.save()
variant = checkout.lines.first().variant
variant.product.channel_listings.filter(channel=checkout.channel_id).delete()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
return_url = "https://www.example.com"
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"returnUrl": return_url,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
errors = data["errors"]
assert len(errors) == 1
assert errors[0]["code"] == CheckoutErrorCode.UNAVAILABLE_VARIANT_IN_CHANNEL.name
assert errors[0]["field"] == "token"
assert errors[0]["variants"] == [
graphene.Node.to_global_id("ProductVariant", variant.pk)
]
def test_checkout_add_payment_no_product_channel_listings(
user_api_client, checkout_without_shipping_required, address, customer_user
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.email = "old@example"
checkout.user = customer_user
checkout.save()
variant = checkout.lines.first().variant
variant.channel_listings.filter(channel=checkout.channel_id).delete()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
return_url = "https://www.example.com"
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"returnUrl": return_url,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
errors = data["errors"]
assert len(errors) == 1
assert errors[0]["code"] == CheckoutErrorCode.UNAVAILABLE_VARIANT_IN_CHANNEL.name
assert errors[0]["field"] == "token"
assert errors[0]["variants"] == [
graphene.Node.to_global_id("ProductVariant", variant.pk)
]
def test_checkout_add_payment_checkout_without_lines(
user_api_client, checkout_without_shipping_required, address, customer_user
):
checkout = checkout_without_shipping_required
checkout.billing_address = address
checkout.email = "old@example"
checkout.user = customer_user
checkout.save()
checkout.lines.all().delete()
manager = get_plugins_manager()
lines, _ = fetch_checkout_lines(checkout)
checkout_info = fetch_checkout_info(checkout, lines, [], manager)
total = calculations.checkout_total(
manager=manager, checkout_info=checkout_info, lines=lines, address=address
)
return_url = "https://www.example.com"
variables = {
"id": to_global_id_or_none(checkout),
"input": {
"gateway": DUMMY_GATEWAY,
"token": "sample-token",
"amount": total.gross.amount,
"returnUrl": return_url,
},
}
response = user_api_client.post_graphql(CREATE_PAYMENT_MUTATION, variables)
content = get_graphql_content(response)
data = content["data"]["checkoutPaymentCreate"]
errors = data["errors"]
assert len(errors) == 1
assert errors[0]["field"] == "lines"
assert errors[0]["code"] == PaymentErrorCode.NO_CHECKOUT_LINES.name
| 35.99682 | 88 | 0.704576 | 2,506 | 22,642 | 6.041899 | 0.070231 | 0.047553 | 0.030051 | 0.057328 | 0.841886 | 0.828809 | 0.80741 | 0.794465 | 0.779671 | 0.772142 | 0 | 0.002729 | 0.190928 | 22,642 | 628 | 89 | 36.05414 | 0.82379 | 0.006007 | 0 | 0.703154 | 0 | 0 | 0.104028 | 0.024673 | 0 | 0 | 0 | 0 | 0.133581 | 1 | 0.03154 | false | 0 | 0.025974 | 0 | 0.057514 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
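`test_create_payment_for_checkout_with_active_payments` above asserts an invariant: creating a new payment leaves exactly one active payment on the checkout, and it is the new one. A library-free sketch of that invariant (the class names here are illustrative stand-ins, not Saleor's actual models):

```python
from dataclasses import dataclass, field


@dataclass
class Payment:
    token: str
    is_active: bool = True


@dataclass
class Checkout:
    payments: list = field(default_factory=list)

    def create_payment(self, token):
        # Deactivate any previously active payments before adding the new one,
        # mirroring the invariant the test asserts.
        for p in self.payments:
            p.is_active = False
        payment = Payment(token=token)
        self.payments.append(payment)
        return payment


checkout = Checkout()
checkout.create_payment("sample-token-1")
checkout.create_payment("sample-token-2")
active = [p for p in checkout.payments if p.is_active]
print(len(active))  # → 1
```

The design choice is to keep superseded payments in the table (for auditing and gateway reconciliation) rather than delete them, which is why the test counts both total and active payments.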
04491150bf5dce1f24514d0a22640f84bcb690e2 | 2,419 | py | Python | tests/schemas/test_nested_schemas.py | maciejjaskowski/dataclasses-avroschema | 3dae8c435dc23f11b00a58a81885dce933430655 | [
"MIT"
] | 94 | 2019-09-01T08:05:37.000Z | 2022-03-24T07:36:25.000Z | tests/schemas/test_nested_schemas.py | maciejjaskowski/dataclasses-avroschema | 3dae8c435dc23f11b00a58a81885dce933430655 | [
"MIT"
] | 145 | 2019-09-02T13:25:53.000Z | 2022-03-28T00:39:45.000Z | tests/schemas/test_nested_schemas.py | maciejjaskowski/dataclasses-avroschema | 3dae8c435dc23f11b00a58a81885dce933430655 | [
"MIT"
] | 24 | 2019-09-20T05:43:55.000Z | 2022-03-27T05:57:29.000Z | import json
import typing
from dataclasses_avroschema import AvroModel
def test_one_to_one_relationship(user_one_address_schema):
"""
Test schema relationship one-to-one
"""
class Address(AvroModel):
"An Address"
street: str
street_number: int
class User(AvroModel):
"An User with Address"
name: str
age: int
address: Address
assert User.avro_schema() == json.dumps(user_one_address_schema)
def test_one_to_many_relationship(user_many_address_schema):
"""
Test schema relationship one-to-many
"""
class Address(AvroModel):
"An Address"
street: str
street_number: int
class User(AvroModel):
"User with multiple Address"
name: str
age: int
addresses: typing.List[Address]
assert User.avro_schema() == json.dumps(user_many_address_schema)
def test_one_to_many_map_relationship(user_many_address_map_schema):
"""
Test schema relationship one-to-many using a map
"""
class Address(AvroModel):
"An Address"
street: str
street_number: int
class User(AvroModel):
"User with multiple Address"
name: str
age: int
addresses: typing.Dict[str, Address]
assert User.avro_schema() == json.dumps(user_many_address_map_schema)
def test_one_to_many_map_relationship_with_alias(user_many_address_map_schema_alias_item):
"""
Test schema relationship one-to-many using a map
"""
class Address(AvroModel):
"An Address"
street: str
street_number: int
class User(AvroModel):
"User with multiple Address"
name: str
age: int
addresses: typing.Dict[str, Address]
class Meta:
alias_nested_items = {"address": "Address"}
assert User.avro_schema() == json.dumps(user_many_address_map_schema_alias_item)
def test_alias_nested_item(user_one_address_alias_item):
"""
Test schema relationship one-to-one
"""
class Address(AvroModel):
"An Address"
street: str
street_number: int
class User(AvroModel):
"An User with Address"
name: str
age: int
address: Address
class Meta:
alias_nested_items = {"address": "Address"}
assert User.avro_schema() == json.dumps(user_one_address_alias_item)
| 22.820755 | 90 | 0.646548 | 295 | 2,419 | 5.037288 | 0.135593 | 0.030283 | 0.036339 | 0.084118 | 0.898385 | 0.870794 | 0.870794 | 0.779273 | 0.733513 | 0.733513 | 0 | 0 | 0.271186 | 2,419 | 105 | 91 | 23.038095 | 0.842881 | 0.158743 | 0 | 0.774194 | 0 | 0 | 0.091717 | 0 | 0 | 0 | 0 | 0 | 0.080645 | 1 | 0.080645 | false | 0 | 0.048387 | 0 | 0.725806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
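The fixtures such as `user_one_address_schema` are not shown in this chunk; for orientation, the dict below is a hand-built illustration of what a one-to-one nested Avro record schema generally looks like for the `Address`/`User` models above (the exact output of dataclasses-avroschema may differ in field order and extra attributes):

```python
import json

# Nested record: the Address schema is embedded as the type of the
# "address" field on User (one-to-one relationship).
address_schema = {
    "type": "record",
    "name": "Address",
    "doc": "An Address",
    "fields": [
        {"name": "street", "type": "string"},
        {"name": "street_number", "type": "long"},
    ],
}

user_schema = {
    "type": "record",
    "name": "User",
    "doc": "An User with Address",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "age", "type": "long"},
        {"name": "address", "type": address_schema},  # nested record type
    ],
}

print(json.dumps(user_schema, indent=2))
```

Note that the class docstrings (`"An Address"`, `"An User with Address"`) become the `doc` attribute of each record, which is why the tests above leave them exactly as written.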
f0f5efba4217d5d2419d9c83f8ebd538faae558c | 160 | py | Python | md_davis/landscape/__init__.py | djmaity/md-davis | 5ddc446a31366ca242b81a603ff4d09b4368f0f2 | [
"MIT"
] | 2 | 2020-05-06T04:56:13.000Z | 2020-08-31T18:29:08.000Z | md_davis/landscape/__init__.py | djmaity/md_davis | 040f3128f20310f21788110f61c6fc7317b1dcf2 | [
"MIT"
] | null | null | null | md_davis/landscape/__init__.py | djmaity/md_davis | 040f3128f20310f21788110f61c6fc7317b1dcf2 | [
"MIT"
] | null | null | null | import md_davis.landscape.landscape
import md_davis.landscape.landscape_xvg
import md_davis.landscape.landscape_hdf
import md_davis.landscape.landscape_animate
| 32 | 43 | 0.9 | 23 | 160 | 5.956522 | 0.304348 | 0.233577 | 0.379562 | 0.642336 | 0.905109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 160 | 4 | 44 | 40 | 0.901316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
9bcd399a0b04a8a4bfc190a1ac3c23f5de3cca0a | 145 | py | Python | app_python/__init__.py | mari1647iv/devops | 005ddb88160d674927a82e6b1935d82fae51920d | [
"MIT"
] | 1 | 2021-08-24T14:13:47.000Z | 2021-08-24T14:13:47.000Z | app_python/__init__.py | mari1647iv/devops | 005ddb88160d674927a82e6b1935d82fae51920d | [
"MIT"
] | null | null | null | app_python/__init__.py | mari1647iv/devops | 005ddb88160d674927a82e6b1935d82fae51920d | [
"MIT"
] | null | null | null | from .main import moscow_time
from .main import index
from .main import mem_visit
from .main import get_visits_list
from .main import visits_page | 29 | 33 | 0.834483 | 25 | 145 | 4.64 | 0.48 | 0.344828 | 0.603448 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131034 | 145 | 5 | 34 | 29 | 0.920635 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
502c870a82139b7130518a52ee792f5857a71e78 | 7,213 | py | Python | backupkeeper/calc_test.py | ErikWegner/backupkeeper | fd0eff5dd7baaa25d20cbbfe7578b0810c56a8df | [
"BSD-2-Clause"
] | null | null | null | backupkeeper/calc_test.py | ErikWegner/backupkeeper | fd0eff5dd7baaa25d20cbbfe7578b0810c56a8df | [
"BSD-2-Clause"
] | null | null | null | backupkeeper/calc_test.py | ErikWegner/backupkeeper | fd0eff5dd7baaa25d20cbbfe7578b0810c56a8df | [
"BSD-2-Clause"
] | null | null | null | from .calc import calc_keep_date
from .backupmetadata import BackupMetadata
def test_keep_first_backup_for_10_years():
"""
If no other backup exists, the first backup is kept for 10 years.
"""
backupdate = '2018-05-05' # Saturday
existing_backups = []
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2028-05-05'
def test_keep_second_backup_for_1_year():
"""
If no other monthly backup exists, the backup is kept for 1 year.
"""
backupdate = '2018-06-06' # Wednesday
existing_backups = []
existing_backups.append(
BackupMetadata('2018-05-05', '2028-05-05'),
)
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2019-06-06'
def test_keep_monday_backup_for_1_week():
"""
If other backups exist, backups from Monday to Thursday are kept for 1 week.
"""
backupdate = '2018-05-07' # Monday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2018-05-14'
def test_keep_tuesday_backup_for_1_week():
"""
If other backups exist, backups from Monday to Thursday are kept for 1 week.
"""
backupdate = '2018-05-08' # Tuesday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2018-05-15'
def test_keep_wednesday_backup_for_1_week():
"""
If other backups exist, backups from Monday to Thursday are kept for 1 week.
"""
backupdate = '2018-05-09' # Wednesday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2018-05-16'
def test_keep_thursday_backup_for_1_week():
"""
If other backups exist, backups from Monday to Thursday are kept for 1 week.
"""
backupdate = '2018-05-10' # Thursday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2018-05-17'
def test_keep_friday_backup_for_2_weeks():
"""
If other backups exist, the Friday backup is kept for 2 weeks.
"""
backupdate = '2018-05-11' # Friday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2018-05-25'
def test_carry_over_to_next_month():
"""
The keep date carries over into the next month.
"""
backupdate = '2018-05-18' # Friday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
BackupMetadata('2018-05-11', '2018-05-25'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2018-06-01'
def test_keep_first_day_of_month_for_1_year_even_on_friday():
"""
A backup on the first day of the month is kept for 1 year, even on a Friday.
"""
backupdate = '2018-06-01' # Friday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
BackupMetadata('2018-05-11', '2018-05-25'),
BackupMetadata('2018-05-18', '2018-06-01'),
BackupMetadata('2018-05-25', '2018-06-08'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2019-06-01'
def test_keep_first_day_of_month_for_1_year_even_on_monday():
"""
A backup on the first day of the month is kept for 1 year, even on a Monday.
"""
backupdate = '2018-07-01' # Monday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
BackupMetadata('2018-05-11', '2018-05-25'),
BackupMetadata('2018-05-18', '2018-06-01'),
BackupMetadata('2018-05-25', '2018-06-08'),
BackupMetadata('2018-06-01', '2019-06-01'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2019-07-01'
def test_keep_first_day_of_year_for_10_years_even_on_tuesday():
"""
On the first day of a new year, the backup is kept for 10 years.
"""
backupdate = '2019-01-01' # Tuesday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
BackupMetadata('2018-05-11', '2018-05-25'),
BackupMetadata('2018-05-18', '2018-06-01'),
BackupMetadata('2018-05-25', '2018-06-08'),
BackupMetadata('2018-06-01', '2019-06-01'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2029-01-01'
def test_keep_first_backup_of_year_for_10_years_even_on_tuesday():
"""
If no other backup exists for the year, the first backup is kept for 10 years.
"""
backupdate = '2019-01-01' # Tuesday
existing_backups = [
BackupMetadata('2018-05-05', '2028-05-05'),
BackupMetadata('2018-05-06', '2019-05-06'),
BackupMetadata('2018-05-07', '2018-05-14'),
BackupMetadata('2018-05-08', '2018-05-15'),
BackupMetadata('2018-05-09', '2018-05-16'),
BackupMetadata('2018-05-11', '2018-05-25'),
BackupMetadata('2018-05-18', '2018-06-01'),
BackupMetadata('2018-05-25', '2018-06-08'),
BackupMetadata('2018-06-01', '2019-06-01'),
]
keep_date = calc_keep_date(backupdate, existing_backups)
assert keep_date == '2029-01-01'
| 32.345291 | 85 | 0.629696 | 989 | 7,213 | 4.424671 | 0.066734 | 0.143967 | 0.278793 | 0.043876 | 0.884598 | 0.855804 | 0.855804 | 0.84415 | 0.830896 | 0.817185 | 0 | 0.219852 | 0.213642 | 7,213 | 222 | 86 | 32.490991 | 0.551657 | 0.127825 | 0 | 0.666667 | 0 | 0 | 0.250701 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 1 | 0.086957 | false | 0 | 0.014493 | 0 | 0.101449 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
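A minimal sketch of a `calc_keep_date` that satisfies the `calc_test.py` expectations above. This is a reconstruction inferred from the tests alone, not the actual backupkeeper implementation: the `BackupMetadata` stand-in and the one-year-span heuristic used to detect a month's "monthly" backup are assumptions.

```python
from datetime import date, timedelta

class BackupMetadata:
    """Hypothetical stand-in; backupkeeper's real class may differ."""
    def __init__(self, backup_date, keep_date):
        self.backup_date = backup_date
        self.keep_date = keep_date

def calc_keep_date(backupdate, existing_backups):
    d = date.fromisoformat(backupdate)
    # First backup ever, or the first day of a new year: keep for 10 years.
    if not existing_backups or (d.month == 1 and d.day == 1):
        return d.replace(year=d.year + 10).isoformat()
    # Assumption: a one-year retention span marks a month's "monthly" backup.
    def is_monthly(b):
        return (b.backup_date[:7] == backupdate[:7]
                and date.fromisoformat(b.keep_date).year
                - date.fromisoformat(b.backup_date).year == 1)
    # First day of the month, or no monthly backup yet this month: keep 1 year.
    if d.day == 1 or not any(is_monthly(b) for b in existing_backups):
        return d.replace(year=d.year + 1).isoformat()
    # Friday backups are kept for 2 weeks, Monday-Thursday for 1 week.
    return (d + timedelta(days=14 if d.weekday() == 4 else 7)).isoformat()
```

Under these rules the carry-over fixture (a Friday backup on 2018-05-18) lands on 2018-06-01, matching the test assertion.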
5048332428a5531ad195418c8ff1d915dbae3476 | 146 | py | Python | packages/libgdiplus-2-10.py | mhutch/bockbuild | 0d989e2d0259d17d41a195f8d28b3844a4652e7b | [
"MIT"
] | null | null | null | packages/libgdiplus-2-10.py | mhutch/bockbuild | 0d989e2d0259d17d41a195f8d28b3844a4652e7b | [
"MIT"
] | null | null | null | packages/libgdiplus-2-10.py | mhutch/bockbuild | 0d989e2d0259d17d41a195f8d28b3844a4652e7b | [
"MIT"
] | null | null | null | GitHubTarballPackage('mono', 'libgdiplus', '2.10.8', '851dea0c79bf26ab3a42ef953617ab5684801e8e', configure = './autogen.sh --prefix="%{prefix}"')
| 73 | 145 | 0.739726 | 12 | 146 | 9 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210145 | 0.054795 | 146 | 1 | 146 | 146 | 0.572464 | 0 | 0 | 0 | 1 | 0 | 0.636986 | 0.273973 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
acab89eaf049a54073defd480628bd509eee598c | 139 | py | Python | phytorch/units/_si/coherent.py | emaballarin/phytorch | 68cf0a630e2fee9dd98f08639edcceb2389adf35 | [
"MIT"
] | 1 | 2022-01-21T06:59:20.000Z | 2022-01-21T06:59:20.000Z | phytorch/units/_si/coherent.py | emaballarin/phytorch | 68cf0a630e2fee9dd98f08639edcceb2389adf35 | [
"MIT"
] | null | null | null | phytorch/units/_si/coherent.py | emaballarin/phytorch | 68cf0a630e2fee9dd98f08639edcceb2389adf35 | [
"MIT"
] | 1 | 2021-04-27T00:45:47.000Z | 2021-04-27T00:45:47.000Z | from ._coherent_unit_map import coherent_unit_map
from .._utils import register_unit_map
register_unit_map(coherent_unit_map, globals())
| 23.166667 | 49 | 0.856115 | 21 | 139 | 5.095238 | 0.380952 | 0.327103 | 0.420561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086331 | 139 | 5 | 50 | 27.8 | 0.84252 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ace861d659213e735db106d8c840f5116ff88dcd | 84 | py | Python | mod2.py | chidanandpujar/Python_scripts | 0ee70e07ef4ab4d8c04955466ea9b305bdac0a53 | [
"Unlicense"
] | null | null | null | mod2.py | chidanandpujar/Python_scripts | 0ee70e07ef4ab4d8c04955466ea9b305bdac0a53 | [
"Unlicense"
] | null | null | null | mod2.py | chidanandpujar/Python_scripts | 0ee70e07ef4ab4d8c04955466ea9b305bdac0a53 | [
"Unlicense"
] | null | null | null | from mod1 import A
from mod1 import mfn
from mod1 import a
print(A.b)
A.fn(1)
mfn()
| 12 | 20 | 0.72619 | 19 | 84 | 3.210526 | 0.473684 | 0.393443 | 0.688525 | 0.491803 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057971 | 0.178571 | 84 | 6 | 21 | 14 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.166667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
c583c3080c102ebdcb7ae0b1ebc7c42409cdd8a2 | 10,929 | py | Python | NimFunctions.py | m-kosik/square-nim-game-analysis | cce8f4cf3442ffe4f231e37a655f0e77dad36ea0 | [
"MIT"
] | null | null | null | NimFunctions.py | m-kosik/square-nim-game-analysis | cce8f4cf3442ffe4f231e37a655f0e77dad36ea0 | [
"MIT"
] | null | null | null | NimFunctions.py | m-kosik/square-nim-game-analysis | cce8f4cf3442ffe4f231e37a655f0e77dad36ea0 | [
"MIT"
] | null | null | null | import numpy as np
def _board_str(order, board):
    """Serialize a board as its move number followed by rows of 0/1 digits."""
    return str(order) + '\n' + '\n'.join(
        ''.join(str(int(elem)) for elem in board_line) for board_line in board)

def _register(board, order, str_possibility,
              new_possibilities, new_possibilities_nodes, new_possibilities_edges):
    """Add an edge to every rotation of `board` already recorded; if no
    rotation matches, record `board` as a new node with an edge from its parent."""
    seen = False
    for k in range(4):
        rotated = np.rot90(board, k)
        if any(np.array_equal(rotated, x) for x in new_possibilities):
            new_possibilities_edges.append([str_possibility, _board_str(order, rotated)])
            seen = True
    if not seen:
        str_board = _board_str(order, board)
        new_possibilities.append(board)
        new_possibilities_nodes.append(str_board)
        new_possibilities_edges.append([str_possibility, str_board])

def generate_possible_boards(prev_possibilities, b_size, order):
    new_possibilities = []
    new_possibilities_nodes = []
    new_possibilities_edges = []
    for possibility in prev_possibilities:
        str_possibility = _board_str(order - 1, possibility)
        # Remove a single stone.
        for i in range(b_size):
            for j in range(b_size):
                if possibility[i][j]:
                    board = np.copy(possibility)
                    board[i][j] = 0
                    _register(board, order, str_possibility, new_possibilities,
                              new_possibilities_nodes, new_possibilities_edges)
        # Remove a vertically adjacent pair.
        for i in range(b_size - 1):
            for j in range(b_size):
                if possibility[i, j] and possibility[i + 1, j]:
                    board = np.copy(possibility)
                    board[i][j] = 0
                    board[i + 1][j] = 0
                    _register(board, order, str_possibility, new_possibilities,
                              new_possibilities_nodes, new_possibilities_edges)
        # Remove a horizontally adjacent pair.
        for i in range(b_size):
            for j in range(b_size - 1):
                if possibility[i, j] and possibility[i, j + 1]:
                    board = np.copy(possibility)
                    board[i][j] = 0
                    board[i][j + 1] = 0
                    _register(board, order, str_possibility, new_possibilities,
                              new_possibilities_nodes, new_possibilities_edges)
        # Remove a complete row.
        for i in range(b_size):
            if np.sum(possibility[i, :]) == b_size:
                board = np.copy(possibility)
                board[i, :] = np.zeros(b_size)
                _register(board, order, str_possibility, new_possibilities,
                          new_possibilities_nodes, new_possibilities_edges)
        # Remove a complete column.
        for i in range(b_size):
            if np.sum(possibility[:, i]) == b_size:
                board = np.copy(possibility)
                board[:, i] = np.zeros(b_size)
                _register(board, order, str_possibility, new_possibilities,
                          new_possibilities_nodes, new_possibilities_edges)
    return new_possibilities, new_possibilities_nodes, new_possibilities_edges | 78.064286 | 164 | 0.582944 | 1,532 | 10,929 | 3.969321 | 0.031332 | 0.103601 | 0.088801 | 0.103601 | 0.975991 | 0.974511 | 0.969906 | 0.969906 | 0.95083 | 0.95083 | 0 | 0.034988 | 0.288681 | 10,929 | 140 | 165 | 78.064286 | 0.747234 | 0 | 0 | 0.851563 | 1 | 0 | 0.007685 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007813 | false | 0 | 0.007813 | 0 | 0.023438 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
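The repeated deduplication logic in `NimFunctions.py` boils down to one symmetry test: a candidate board counts as new only if none of its four 90-degree rotations has already been recorded. A self-contained illustration of that check (the function name and sample boards here are illustrative, not taken from the file above):

```python
import numpy as np

def is_rotation_of_any(board, seen):
    """True if some 90-degree rotation of `board` already appears in `seen`."""
    return any(np.array_equal(np.rot90(board, k), s)
               for k in range(4) for s in seen)

seen = [np.array([[1, 0], [0, 0]])]
rotated = np.array([[0, 1], [0, 0]])   # the same position, rotated 90 degrees
distinct = np.array([[1, 1], [0, 0]])  # no rotation matches the seen board
```

`is_rotation_of_any(rotated, seen)` is true while `is_rotation_of_any(distinct, seen)` is false, which is why only the latter kind of board gets appended as a new node.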
a8446a17df82932900831dadf3e512c90d6c8c51 | 13,251 | py | Python | sdk/python/pulumi_databricks/grants.py | pulumi/pulumi-databricks | 43580d4adbd04b72558f368ff0eef3d03432ebc1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_databricks/grants.py | pulumi/pulumi-databricks | 43580d4adbd04b72558f368ff0eef3d03432ebc1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_databricks/grants.py | pulumi/pulumi-databricks | 43580d4adbd04b72558f368ff0eef3d03432ebc1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['GrantsArgs', 'Grants']
@pulumi.input_type
class GrantsArgs:
def __init__(__self__, *,
grants: pulumi.Input[Sequence[pulumi.Input['GrantsGrantArgs']]],
catalog: Optional[pulumi.Input[str]] = None,
external_location: Optional[pulumi.Input[str]] = None,
schema: Optional[pulumi.Input[str]] = None,
storage_credential: Optional[pulumi.Input[str]] = None,
table: Optional[pulumi.Input[str]] = None,
view: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Grants resource.
"""
pulumi.set(__self__, "grants", grants)
if catalog is not None:
pulumi.set(__self__, "catalog", catalog)
if external_location is not None:
pulumi.set(__self__, "external_location", external_location)
if schema is not None:
pulumi.set(__self__, "schema", schema)
if storage_credential is not None:
pulumi.set(__self__, "storage_credential", storage_credential)
if table is not None:
pulumi.set(__self__, "table", table)
if view is not None:
pulumi.set(__self__, "view", view)
@property
@pulumi.getter
def grants(self) -> pulumi.Input[Sequence[pulumi.Input['GrantsGrantArgs']]]:
return pulumi.get(self, "grants")
@grants.setter
def grants(self, value: pulumi.Input[Sequence[pulumi.Input['GrantsGrantArgs']]]):
pulumi.set(self, "grants", value)
@property
@pulumi.getter
def catalog(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "catalog")
@catalog.setter
def catalog(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "catalog", value)
@property
@pulumi.getter(name="externalLocation")
def external_location(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "external_location")
@external_location.setter
def external_location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "external_location", value)
@property
@pulumi.getter
def schema(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "schema")
@schema.setter
def schema(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "schema", value)
@property
@pulumi.getter(name="storageCredential")
def storage_credential(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "storage_credential")
@storage_credential.setter
def storage_credential(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "storage_credential", value)
@property
@pulumi.getter
def table(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "table")
@table.setter
def table(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "table", value)
@property
@pulumi.getter
def view(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "view")
@view.setter
def view(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "view", value)
@pulumi.input_type
class _GrantsState:
def __init__(__self__, *,
catalog: Optional[pulumi.Input[str]] = None,
external_location: Optional[pulumi.Input[str]] = None,
grants: Optional[pulumi.Input[Sequence[pulumi.Input['GrantsGrantArgs']]]] = None,
schema: Optional[pulumi.Input[str]] = None,
storage_credential: Optional[pulumi.Input[str]] = None,
table: Optional[pulumi.Input[str]] = None,
view: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Grants resources.
"""
if catalog is not None:
pulumi.set(__self__, "catalog", catalog)
if external_location is not None:
pulumi.set(__self__, "external_location", external_location)
if grants is not None:
pulumi.set(__self__, "grants", grants)
if schema is not None:
pulumi.set(__self__, "schema", schema)
if storage_credential is not None:
pulumi.set(__self__, "storage_credential", storage_credential)
if table is not None:
pulumi.set(__self__, "table", table)
if view is not None:
pulumi.set(__self__, "view", view)
@property
@pulumi.getter
def catalog(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "catalog")
@catalog.setter
def catalog(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "catalog", value)
@property
@pulumi.getter(name="externalLocation")
def external_location(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "external_location")
@external_location.setter
def external_location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "external_location", value)
@property
@pulumi.getter
def grants(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['GrantsGrantArgs']]]]:
return pulumi.get(self, "grants")
@grants.setter
def grants(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['GrantsGrantArgs']]]]):
pulumi.set(self, "grants", value)
@property
@pulumi.getter
def schema(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "schema")
@schema.setter
def schema(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "schema", value)
@property
@pulumi.getter(name="storageCredential")
def storage_credential(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "storage_credential")
@storage_credential.setter
def storage_credential(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "storage_credential", value)
@property
@pulumi.getter
def table(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "table")
@table.setter
def table(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "table", value)
@property
@pulumi.getter
def view(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "view")
@view.setter
def view(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "view", value)
class Grants(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
catalog: Optional[pulumi.Input[str]] = None,
external_location: Optional[pulumi.Input[str]] = None,
grants: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['GrantsGrantArgs']]]]] = None,
schema: Optional[pulumi.Input[str]] = None,
storage_credential: Optional[pulumi.Input[str]] = None,
table: Optional[pulumi.Input[str]] = None,
view: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Create a Grants resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: GrantsArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Create a Grants resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param GrantsArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(GrantsArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
catalog: Optional[pulumi.Input[str]] = None,
external_location: Optional[pulumi.Input[str]] = None,
grants: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['GrantsGrantArgs']]]]] = None,
schema: Optional[pulumi.Input[str]] = None,
storage_credential: Optional[pulumi.Input[str]] = None,
table: Optional[pulumi.Input[str]] = None,
view: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = GrantsArgs.__new__(GrantsArgs)
__props__.__dict__["catalog"] = catalog
__props__.__dict__["external_location"] = external_location
if grants is None and not opts.urn:
raise TypeError("Missing required property 'grants'")
__props__.__dict__["grants"] = grants
__props__.__dict__["schema"] = schema
__props__.__dict__["storage_credential"] = storage_credential
__props__.__dict__["table"] = table
__props__.__dict__["view"] = view
super(Grants, __self__).__init__(
'databricks:index/grants:Grants',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
catalog: Optional[pulumi.Input[str]] = None,
external_location: Optional[pulumi.Input[str]] = None,
grants: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['GrantsGrantArgs']]]]] = None,
schema: Optional[pulumi.Input[str]] = None,
storage_credential: Optional[pulumi.Input[str]] = None,
table: Optional[pulumi.Input[str]] = None,
view: Optional[pulumi.Input[str]] = None) -> 'Grants':
"""
Get an existing Grants resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _GrantsState.__new__(_GrantsState)
__props__.__dict__["catalog"] = catalog
__props__.__dict__["external_location"] = external_location
__props__.__dict__["grants"] = grants
__props__.__dict__["schema"] = schema
__props__.__dict__["storage_credential"] = storage_credential
__props__.__dict__["table"] = table
__props__.__dict__["view"] = view
return Grants(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def catalog(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "catalog")
@property
@pulumi.getter(name="externalLocation")
def external_location(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "external_location")
@property
@pulumi.getter
def grants(self) -> pulumi.Output[Sequence['outputs.GrantsGrant']]:
return pulumi.get(self, "grants")
@property
@pulumi.getter
def schema(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "schema")
@property
@pulumi.getter(name="storageCredential")
def storage_credential(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "storage_credential")
@property
@pulumi.getter
def table(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "table")
@property
@pulumi.getter
def view(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "view")
| 38.973529 | 134 | 0.632481 | 1,460 | 13,251 | 5.488356 | 0.09589 | 0.10433 | 0.142269 | 0.148259 | 0.797205 | 0.780107 | 0.748409 | 0.724073 | 0.705603 | 0.656184 | 0 | 0.000101 | 0.250245 | 13,251 | 339 | 135 | 39.088496 | 0.806442 | 0.082786 | 0 | 0.779468 | 1 | 0 | 0.088857 | 0.002508 | 0 | 0 | 0 | 0 | 0 | 1 | 0.159696 | false | 0.003802 | 0.026616 | 0.079848 | 0.281369 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |