hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3e75f59ad7183d85bc560750f48467d116d39d82 | 14,061 | py | Python | tmapi/tests/indices/test_literal_index.py | ajenhl/django-tmapi | 02f009e1b508218cf330ca7748c3a1dd110f3e8d | [
"Apache-2.0"
] | 2 | 2015-03-22T03:23:36.000Z | 2017-01-08T10:57:18.000Z | tmapi/tests/indices/test_literal_index.py | ajenhl/django-tmapi | 02f009e1b508218cf330ca7748c3a1dd110f3e8d | [
"Apache-2.0"
] | null | null | null | tmapi/tests/indices/test_literal_index.py | ajenhl/django-tmapi | 02f009e1b508218cf330ca7748c3a1dd110f3e8d | [
"Apache-2.0"
] | 1 | 2020-12-28T04:40:34.000Z | 2020-12-28T04:40:34.000Z | # Copyright 2011 Jamie Norrish (jamie@artefact.org.nz)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Module containing tests against the `LiteralIndex` interface.
Most if not all of these tests are ported from the public domain tests
that come with the TMAPI 2.0 distribution (http://www.tmapi.org/2.0/).
"""
from tmapi.constants import XSD_ANY_URI, XSD_STRING
from tmapi.exceptions import IllegalArgumentException
from tmapi.indices.literal_index import LiteralIndex
from tmapi.tests.models.tmapi_test_case import TMAPITestCase
class LiteralIndexTest (TMAPITestCase):
def setUp (self):
super(LiteralIndexTest, self).setUp()
self._index = self.tm.get_index(LiteralIndex)
self._index.open()
self._XSD_ANY_URI = self.create_locator(XSD_ANY_URI)
self._XSD_STRING = self.create_locator(XSD_STRING)
def tearDown (self):
super(LiteralIndexTest, self).tearDown()
self._index.close()
def _update_index (self):
if not self._index.is_auto_updated():
self._index.reindex()
def test_name (self):
value = 'Value'
value2 = 'Value2'
self._update_index()
self.assertEqual(0, self._index.get_names(value).count())
name = self.create_topic().create_name(value)
self._update_index()
self.assertEqual(1, self._index.get_names(value).count())
self.assertTrue(name in self._index.get_names(value))
name.set_value(value2)
self._update_index()
self.assertEqual(0, self._index.get_names(value).count())
self.assertEqual(1, self._index.get_names(value2).count())
self.assertTrue(name in self._index.get_names(value2))
name.remove()
self._update_index()
self.assertEqual(0, self._index.get_names(value).count())
self.assertEqual(0, self._index.get_names(value2).count())
def test_name_illegal_string (self):
self.assertRaises(IllegalArgumentException, self._index.get_names,
None)
def test_occurrence_string (self):
value = 'Value'
value2 = 'Value2'
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value, self._XSD_STRING).count())
type = self.create_topic()
occurrence = self.create_topic().create_occurrence(type, value)
self._update_index()
self.assertEqual(1, self._index.get_occurrences(value).count())
self.assertTrue(occurrence in self._index.get_occurrences(value))
self.assertEqual(1, self._index.get_occurrences(
value, self._XSD_STRING).count())
self.assertTrue(occurrence in self._index.get_occurrences(
value, self._XSD_STRING))
occurrence.set_value(value2)
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value, self._XSD_STRING).count())
self.assertEqual(1, self._index.get_occurrences(value2).count())
self.assertTrue(occurrence in self._index.get_occurrences(value2))
self.assertEqual(1, self._index.get_occurrences(
value2, self._XSD_STRING).count())
self.assertTrue(occurrence in self._index.get_occurrences(
value2, self._XSD_STRING))
occurrence.remove()
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value, self._XSD_STRING).count())
self.assertEqual(0, self._index.get_occurrences(value2).count())
self.assertEqual(0, self._index.get_occurrences(
value2, self._XSD_STRING).count())
def test_occurrence_uri (self):
value = self.create_locator('http://www.example.org/1')
value2 = self.create_locator('http://www.example.org/2')
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value.get_reference(), self._XSD_ANY_URI).count())
type = self.create_topic()
occurrence = self.create_topic().create_occurrence(type, value)
self._update_index()
self.assertEqual(1, self._index.get_occurrences(value).count())
self.assertTrue(occurrence in self._index.get_occurrences(value))
self.assertEqual(1, self._index.get_occurrences(
value.get_reference(), self._XSD_ANY_URI).count())
self.assertTrue(occurrence in self._index.get_occurrences(
value.get_reference(), self._XSD_ANY_URI))
occurrence.set_value(value2)
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value.get_reference(), self._XSD_ANY_URI).count())
self.assertEqual(1, self._index.get_occurrences(value2).count())
self.assertTrue(occurrence in self._index.get_occurrences(value2))
self.assertEqual(1, self._index.get_occurrences(
value2.get_reference(), self._XSD_ANY_URI).count())
self.assertTrue(occurrence in self._index.get_occurrences(
value2.get_reference(), self._XSD_ANY_URI))
occurrence.remove()
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value.get_reference(), self._XSD_ANY_URI).count())
self.assertEqual(0, self._index.get_occurrences(value2).count())
self.assertEqual(0, self._index.get_occurrences(
value2.get_reference(), self._XSD_ANY_URI).count())
def test_occurrence_explicit_datatype (self):
value = 'http://www.example.org/1'
value2 = 'http://www.example.org/2'
datatype = self.create_locator('http://www.example.org/datatype')
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value, datatype).count())
type = self.create_topic()
occurrence = self.create_topic().create_occurrence(
type, value, datatype=datatype)
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(1, self._index.get_occurrences(
value, datatype).count())
self.assertTrue(occurrence in self._index.get_occurrences(
value, datatype))
occurrence.set_value(value2, datatype)
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value).count())
self.assertEqual(0, self._index.get_occurrences(
value, datatype).count())
self.assertEqual(0, self._index.get_occurrences(value2).count())
self.assertEqual(1, self._index.get_occurrences(
value2, datatype).count())
self.assertTrue(occurrence in self._index.get_occurrences(
value2, datatype))
occurrence.remove()
self._update_index()
self.assertEqual(0, self._index.get_occurrences(value2).count())
self.assertEqual(0, self._index.get_occurrences(
value2, datatype).count())
def test_occurrence_illegal_string (self):
self.assertRaises(IllegalArgumentException, self._index.get_occurrences,
None)
def test_occurrence_illegal_uri (self):
self.assertRaises(IllegalArgumentException, self._index.get_occurrences,
None)
def test_occurrence_illegal_datatype (self):
# This test is not applicable to this implementation.
pass
def test_variant_string (self):
value = 'Value'
value2 = 'Value2'
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(
value, self._XSD_STRING).count())
theme = self.create_topic()
variant = self.create_name().create_variant(value, theme)
self._update_index()
self.assertEqual(1, self._index.get_variants(value).count())
self.assertTrue(variant in self._index.get_variants(value))
self.assertEqual(1, self._index.get_variants(
value, self._XSD_STRING).count())
self.assertTrue(variant in self._index.get_variants(
value, self._XSD_STRING))
variant.set_value(value2)
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(
value, self._XSD_STRING).count())
self.assertEqual(1, self._index.get_variants(value2).count())
self.assertTrue(variant in self._index.get_variants(value2))
self.assertEqual(1, self._index.get_variants(
value2, self._XSD_STRING).count())
self.assertTrue(variant in self._index.get_variants(
value2, self._XSD_STRING))
variant.remove()
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(
value, self._XSD_STRING).count())
self.assertEqual(0, self._index.get_variants(value2).count())
self.assertEqual(0, self._index.get_variants(
value2, self._XSD_STRING).count())
def test_variant_uri (self):
value = self.create_locator('http://www.example.org/1')
value2 = self.create_locator('http://www.example.org/2')
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(
value.get_reference(), self._XSD_ANY_URI).count())
theme = self.create_topic()
variant = self.create_name().create_variant(value, theme)
self._update_index()
self.assertEqual(1, self._index.get_variants(value).count())
self.assertTrue(variant in self._index.get_variants(value))
self.assertEqual(1, self._index.get_variants(
value.get_reference(), self._XSD_ANY_URI).count())
self.assertTrue(variant in self._index.get_variants(
value.get_reference(), self._XSD_ANY_URI))
variant.set_value(value2)
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(
value.get_reference(), self._XSD_ANY_URI).count())
self.assertEqual(1, self._index.get_variants(value2).count())
self.assertTrue(variant in self._index.get_variants(value2))
self.assertEqual(1, self._index.get_variants(
value2.get_reference(), self._XSD_ANY_URI).count())
self.assertTrue(variant in self._index.get_variants(
value2.get_reference(), self._XSD_ANY_URI))
variant.remove()
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(
value.get_reference(), self._XSD_ANY_URI).count())
self.assertEqual(0, self._index.get_variants(value2).count())
self.assertEqual(0, self._index.get_variants(
value2.get_reference(), self._XSD_ANY_URI).count())
def test_variant_explicit_datatype (self):
value = 'http://www.example.org/1'
value2 = 'http://www.example.org/2'
datatype = self.create_locator('http://www.example.org/datatype')
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(value, datatype).count())
theme = self.create_topic()
variant = self.create_name().create_variant(value, theme, datatype)
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(1, self._index.get_variants(value, datatype).count())
self.assertTrue(variant in self._index.get_variants(value, datatype))
variant.set_value(value2, datatype)
self._update_index()
self.assertEqual(0, self._index.get_variants(value).count())
self.assertEqual(0, self._index.get_variants(value, datatype).count())
self.assertEqual(0, self._index.get_variants(value2).count())
self.assertEqual(1, self._index.get_variants(value2, datatype).count())
self.assertTrue(variant in self._index.get_variants(value2, datatype))
variant.remove()
self._update_index()
self.assertEqual(0, self._index.get_variants(value2).count())
self.assertEqual(0, self._index.get_variants(value2, datatype).count())
def test_variant_illegal_string (self):
self.assertRaises(IllegalArgumentException, self._index.get_variants,
None)
def test_variant_illegal_uri (self):
self.assertRaises(IllegalArgumentException, self._index.get_variants,
None)
def test_variant_illegal_datatype (self):
# This test is not applicable to this implementation.
pass
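The tests above all follow one lookup pattern: query the index by value (optionally qualified by datatype), mutate a construct, then re-query. A minimal in-memory sketch of that keying scheme follows; `SimpleLiteralIndex` is a hypothetical illustration, not the Django-backed `LiteralIndex` implementation under test:

```python
class SimpleLiteralIndex:
    """In-memory sketch of an index keyed by (value, datatype).

    Hypothetical stand-in for illustration only; the real LiteralIndex
    is backed by Django ORM queries and returns QuerySets."""

    def __init__(self):
        self._entries = {}

    def add(self, value, datatype, construct):
        self._entries.setdefault((value, datatype), set()).add(construct)

    def remove(self, value, datatype, construct):
        self._entries.get((value, datatype), set()).discard(construct)

    def get(self, value, datatype=None):
        # Mirrors the tested contract: a None value is illegal.
        if value is None:
            raise ValueError('value must not be None')
        if datatype is None:
            # Gather matches across all datatypes for this value.
            return {c for (v, d), cs in self._entries.items()
                    if v == value for c in cs}
        return set(self._entries.get((value, datatype), ()))
```

The tests' `get_names(value)` / `get_occurrences(value, datatype)` calls map onto `get` here, with `count()` corresponding to `len()` on the returned set.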
| 47.826531 | 80 | 0.665671 | 1,687 | 14,061 | 5.282158 | 0.08358 | 0.107059 | 0.136012 | 0.11671 | 0.851644 | 0.848839 | 0.84334 | 0.836606 | 0.823477 | 0.788015 | 0 | 0.013453 | 0.217623 | 14,061 | 293 | 81 | 47.989761 | 0.796564 | 0.066638 | 0 | 0.792 | 0 | 0 | 0.021907 | 0 | 0 | 0 | 0 | 0 | 0.404 | 1 | 0.068 | false | 0.008 | 0.016 | 0 | 0.088 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3e77be9357aaeea968906b4746858c9965dc151f | 27 | py | Python | Python/School/C1/Q4C1.py | abdalrhmanyasser/Abdalrhman_Rep | e0fc3caa2cc04e92f591ccd7934586986d194000 | [
"CC0-1.0"
] | null | null | null | Python/School/C1/Q4C1.py | abdalrhmanyasser/Abdalrhman_Rep | e0fc3caa2cc04e92f591ccd7934586986d194000 | [
"CC0-1.0"
] | null | null | null | Python/School/C1/Q4C1.py | abdalrhmanyasser/Abdalrhman_Rep | e0fc3caa2cc04e92f591ccd7934586986d194000 | [
"CC0-1.0"
] | null | null | null | print((512-282)/(47*78+5))
| 13.5 | 26 | 0.592593 | 6 | 27 | 2.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.423077 | 0.037037 | 27 | 1 | 27 | 27 | 0.192308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
e41ce93504e8b40c72151550d056fb7ec2e2aa18 | 3,378 | py | Python | test/pyaz/sf/managed_service/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | test/pyaz/sf/managed_service/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | 9 | 2021-09-24T16:37:24.000Z | 2021-12-24T00:39:19.000Z | test/pyaz/sf/managed_service/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | import json, subprocess
from ... pyaz_utils import get_cli_name, get_params
def list(resource_group, cluster_name, application):
params = get_params(locals())
command = "az sf managed-service list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def delete(resource_group, cluster_name, application, name):
params = get_params(locals())
command = "az sf managed-service delete " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def show(resource_group, cluster_name, application, name):
params = get_params(locals())
command = "az sf managed-service show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def create(resource_group, cluster_name, application, name, type, state, default_move_cost=None, placement_constraints=None, service_package_activation_mode=None, target_replica_set_size=None, min_replica_set_size=None, has_persisted_state=None, service_placement_time_limit=None, stand_by_replica_keep_duration=None, quorum_loss_wait_duration=None, replica_restart_wait_duration=None, instance_count=None, min_instance_count=None, min_instance_percentage=None, partition_scheme=None, partition_count=None, low_key=None, high_key=None, partition_names=None, tags=None):
params = get_params(locals())
command = "az sf managed-service create " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def update(resource_group, cluster_name, application, name, default_move_cost=None, placement_constraints=None, target_replica_set_size=None, min_replica_set_size=None, service_placement_time_limit=None, stand_by_replica_keep_duration=None, quorum_loss_wait_duration=None, replica_restart_wait_duration=None, instance_count=None, min_instance_count=None, min_instance_percentage=None, tags=None):
params = get_params(locals())
command = "az sf managed-service update " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
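Every wrapper in this file repeats the same run-and-parse sequence: build a command string, run it through the shell, decode the streams, and either parse stdout as JSON or raise with stderr. That shared logic can be factored into one helper; `run_cli` is a hypothetical sketch of the pattern, and the usage below assumes a POSIX shell is available:

```python
import json
import subprocess

def run_cli(command):
    # Run a shell command, decode its output streams, and parse stdout
    # as JSON; sketch of the pattern used by the wrappers above.
    output = subprocess.run(command, shell=True,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout = output.stdout.decode("utf-8")
    stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    raise Exception(stderr)
```

Each generated function would then reduce to a one-liner such as `run_cli("az sf managed-service list " + params)`. Note that `shell=True` with string concatenation leaves command construction open to injection if parameters are not sanitized upstream.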
| 45.648649 | 569 | 0.720249 | 434 | 3,378 | 5.414747 | 0.186636 | 0.059574 | 0.042553 | 0.051064 | 0.898298 | 0.883404 | 0.850213 | 0.815319 | 0.815319 | 0.795745 | 0 | 0.003589 | 0.175252 | 3,378 | 73 | 570 | 46.273973 | 0.839914 | 0 | 0 | 0.820896 | 0 | 0 | 0.056542 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074627 | false | 0 | 0.029851 | 0 | 0.179104 | 0.223881 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e434b0f85049a9a3662017d2f558a0410cfa06c7 | 102 | py | Python | geographnet/geographnet/model/__init__.py | lspatial/geographnet | 83590f5e9da4fc2274c7590c076e7dc4edcea649 | [
"MIT"
] | null | null | null | geographnet/geographnet/model/__init__.py | lspatial/geographnet | 83590f5e9da4fc2274c7590c076e7dc4edcea649 | [
"MIT"
] | null | null | null | geographnet/geographnet/model/__init__.py | lspatial/geographnet | 83590f5e9da4fc2274c7590c076e7dc4edcea649 | [
"MIT"
] | null | null | null |
from geographnet.model.geographpnet import GeoGraphPNet
from geographnet.model.geogcon import GEOGCon | 34 | 55 | 0.882353 | 12 | 102 | 7.5 | 0.5 | 0.333333 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078431 | 102 | 3 | 56 | 34 | 0.957447 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e44bde04874e1cd4bb570220b2df40d201e5a032 | 49 | py | Python | uedinst/merlin_drivers/__init__.py | LaurentRDC/uedinst | b6ab38c32e1889209a95be60fd7177b3f9e9200e | [
"BSD-3-Clause"
] | null | null | null | uedinst/merlin_drivers/__init__.py | LaurentRDC/uedinst | b6ab38c32e1889209a95be60fd7177b3f9e9200e | [
"BSD-3-Clause"
] | 1 | 2022-03-02T20:51:32.000Z | 2022-03-02T20:51:32.000Z | uedinst/merlin_drivers/__init__.py | LaurentRDC/uedinst | b6ab38c32e1889209a95be60fd7177b3f9e9200e | [
"BSD-3-Clause"
] | 2 | 2019-11-05T20:30:10.000Z | 2022-03-02T20:44:41.000Z | from .MERLIN_connection import MERLIN_connection
| 24.5 | 48 | 0.897959 | 6 | 49 | 7 | 0.666667 | 0.761905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 49 | 1 | 49 | 49 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e4927eb70cce61f3b81e2df3b5e4e38c97eab678 | 22,081 | py | Python | src/python_pachyderm/proto/admin/v1_9/pps/pps_pb2_grpc.py | barretthinson/python-pachyderm | 82cea22d1105d70833a5522ccac750ca521694ff | [
"Apache-2.0"
] | null | null | null | src/python_pachyderm/proto/admin/v1_9/pps/pps_pb2_grpc.py | barretthinson/python-pachyderm | 82cea22d1105d70833a5522ccac750ca521694ff | [
"Apache-2.0"
] | null | null | null | src/python_pachyderm/proto/admin/v1_9/pps/pps_pb2_grpc.py | barretthinson/python-pachyderm | 82cea22d1105d70833a5522ccac750ca521694ff | [
"Apache-2.0"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from python_pachyderm.proto.admin.v1_9.pps import pps_pb2 as client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
class APIStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.CreateJob = channel.unary_unary(
'/pps_1_9.API/CreateJob',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.CreateJobRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.Job.FromString,
)
self.InspectJob = channel.unary_unary(
'/pps_1_9.API/InspectJob',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.InspectJobRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfo.FromString,
)
self.ListJob = channel.unary_unary(
'/pps_1_9.API/ListJob',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListJobRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfos.FromString,
)
self.ListJobStream = channel.unary_stream(
'/pps_1_9.API/ListJobStream',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListJobRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfo.FromString,
)
self.FlushJob = channel.unary_stream(
'/pps_1_9.API/FlushJob',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.FlushJobRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfo.FromString,
)
self.DeleteJob = channel.unary_unary(
'/pps_1_9.API/DeleteJob',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.DeleteJobRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.StopJob = channel.unary_unary(
'/pps_1_9.API/StopJob',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.StopJobRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.InspectDatum = channel.unary_unary(
'/pps_1_9.API/InspectDatum',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.InspectDatumRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.DatumInfo.FromString,
)
self.ListDatum = channel.unary_unary(
'/pps_1_9.API/ListDatum',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumResponse.FromString,
)
self.ListDatumStream = channel.unary_stream(
'/pps_1_9.API/ListDatumStream',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumStreamResponse.FromString,
)
self.RestartDatum = channel.unary_unary(
'/pps_1_9.API/RestartDatum',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.RestartDatumRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.CreatePipeline = channel.unary_unary(
'/pps_1_9.API/CreatePipeline',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.CreatePipelineRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.InspectPipeline = channel.unary_unary(
'/pps_1_9.API/InspectPipeline',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.InspectPipelineRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.PipelineInfo.FromString,
)
self.ListPipeline = channel.unary_unary(
'/pps_1_9.API/ListPipeline',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListPipelineRequest.SerializeToString,
response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.PipelineInfos.FromString,
)
self.DeletePipeline = channel.unary_unary(
'/pps_1_9.API/DeletePipeline',
request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.DeletePipelineRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.StartPipeline = channel.unary_unary(
            '/pps_1_9.API/StartPipeline',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.StartPipelineRequest.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
        self.StopPipeline = channel.unary_unary(
            '/pps_1_9.API/StopPipeline',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.StopPipelineRequest.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
        self.RunPipeline = channel.unary_unary(
            '/pps_1_9.API/RunPipeline',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.RunPipelineRequest.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
        self.RunCron = channel.unary_unary(
            '/pps_1_9.API/RunCron',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.RunCronRequest.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
        self.DeleteAll = channel.unary_unary(
            '/pps_1_9.API/DeleteAll',
            request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
        self.GetLogs = channel.unary_stream(
            '/pps_1_9.API/GetLogs',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.GetLogsRequest.SerializeToString,
            response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.LogMessage.FromString,
        )
        self.GarbageCollect = channel.unary_unary(
            '/pps_1_9.API/GarbageCollect',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.GarbageCollectRequest.SerializeToString,
            response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.GarbageCollectResponse.FromString,
        )
        self.ActivateAuth = channel.unary_unary(
            '/pps_1_9.API/ActivateAuth',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ActivateAuthRequest.SerializeToString,
            response_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ActivateAuthResponse.FromString,
        )
        self.UpdateJobState = channel.unary_unary(
            '/pps_1_9.API/UpdateJobState',
            request_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.UpdateJobStateRequest.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
class APIServicer(object):
    # missing associated documentation comment in .proto file

    def CreateJob(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def InspectJob(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListJob(self, request, context):
        """ListJob returns information about current and past Pachyderm jobs. This is
        deprecated in favor of ListJobStream
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListJobStream(self, request, context):
        """ListJobStream returns information about current and past Pachyderm jobs.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def FlushJob(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def DeleteJob(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def StopJob(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def InspectDatum(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListDatum(self, request, context):
        """ListDatum returns information about each datum fed to a Pachyderm job. This
        is deprecated in favor of ListDatumStream
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListDatumStream(self, request, context):
        """ListDatumStream returns information about each datum fed to a Pachyderm job
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def RestartDatum(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def CreatePipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def InspectPipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListPipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def DeletePipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def StartPipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def StopPipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def RunPipeline(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def RunCron(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def DeleteAll(self, request, context):
        """DeleteAll deletes everything
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def GetLogs(self, request, context):
        # missing associated documentation comment in .proto file
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def GarbageCollect(self, request, context):
        """Garbage collection
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ActivateAuth(self, request, context):
        """An internal call that causes PPS to put itself into an auth-enabled state
        (all pipeline have tokens, correct permissions, etcd)
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def UpdateJobState(self, request, context):
        """An internal call used to move a job from one state to another
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')
def add_APIServicer_to_server(servicer, server):
    rpc_method_handlers = {
        'CreateJob': grpc.unary_unary_rpc_method_handler(
            servicer.CreateJob,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.CreateJobRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.Job.SerializeToString,
        ),
        'InspectJob': grpc.unary_unary_rpc_method_handler(
            servicer.InspectJob,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.InspectJobRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfo.SerializeToString,
        ),
        'ListJob': grpc.unary_unary_rpc_method_handler(
            servicer.ListJob,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListJobRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfos.SerializeToString,
        ),
        'ListJobStream': grpc.unary_stream_rpc_method_handler(
            servicer.ListJobStream,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListJobRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfo.SerializeToString,
        ),
        'FlushJob': grpc.unary_stream_rpc_method_handler(
            servicer.FlushJob,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.FlushJobRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.JobInfo.SerializeToString,
        ),
        'DeleteJob': grpc.unary_unary_rpc_method_handler(
            servicer.DeleteJob,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.DeleteJobRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'StopJob': grpc.unary_unary_rpc_method_handler(
            servicer.StopJob,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.StopJobRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'InspectDatum': grpc.unary_unary_rpc_method_handler(
            servicer.InspectDatum,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.InspectDatumRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.DatumInfo.SerializeToString,
        ),
        'ListDatum': grpc.unary_unary_rpc_method_handler(
            servicer.ListDatum,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumResponse.SerializeToString,
        ),
        'ListDatumStream': grpc.unary_stream_rpc_method_handler(
            servicer.ListDatumStream,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListDatumStreamResponse.SerializeToString,
        ),
        'RestartDatum': grpc.unary_unary_rpc_method_handler(
            servicer.RestartDatum,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.RestartDatumRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'CreatePipeline': grpc.unary_unary_rpc_method_handler(
            servicer.CreatePipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.CreatePipelineRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'InspectPipeline': grpc.unary_unary_rpc_method_handler(
            servicer.InspectPipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.InspectPipelineRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.PipelineInfo.SerializeToString,
        ),
        'ListPipeline': grpc.unary_unary_rpc_method_handler(
            servicer.ListPipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ListPipelineRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.PipelineInfos.SerializeToString,
        ),
        'DeletePipeline': grpc.unary_unary_rpc_method_handler(
            servicer.DeletePipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.DeletePipelineRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'StartPipeline': grpc.unary_unary_rpc_method_handler(
            servicer.StartPipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.StartPipelineRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'StopPipeline': grpc.unary_unary_rpc_method_handler(
            servicer.StopPipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.StopPipelineRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'RunPipeline': grpc.unary_unary_rpc_method_handler(
            servicer.RunPipeline,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.RunPipelineRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'RunCron': grpc.unary_unary_rpc_method_handler(
            servicer.RunCron,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.RunCronRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'DeleteAll': grpc.unary_unary_rpc_method_handler(
            servicer.DeleteAll,
            request_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'GetLogs': grpc.unary_stream_rpc_method_handler(
            servicer.GetLogs,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.GetLogsRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.LogMessage.SerializeToString,
        ),
        'GarbageCollect': grpc.unary_unary_rpc_method_handler(
            servicer.GarbageCollect,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.GarbageCollectRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.GarbageCollectResponse.SerializeToString,
        ),
        'ActivateAuth': grpc.unary_unary_rpc_method_handler(
            servicer.ActivateAuth,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ActivateAuthRequest.FromString,
            response_serializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.ActivateAuthResponse.SerializeToString,
        ),
        'UpdateJobState': grpc.unary_unary_rpc_method_handler(
            servicer.UpdateJobState,
            request_deserializer=client_dot_admin_dot_v1__9_dot_pps_dot_pps__pb2.UpdateJobStateRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'pps_1_9.API', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))
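The registration above boils down to a plain name-to-handler mapping handed to a generic gRPC handler, which looks up the method by name and invokes it. Below is a rough, dependency-free sketch of that dispatch pattern; `EchoServicer`, `handlers`, and `dispatch` are hypothetical names, and the real server additionally applies the request/response serializers shown above.

```python
# Stdlib-only sketch of the dict-based dispatch that
# add_APIServicer_to_server sets up. No real gRPC machinery
# (channels, serializers, contexts) is involved.
class EchoServicer:
    def DeleteAll(self, request, context):
        # A stand-in for a real servicer method override.
        return "deleted:" + request

# Analogous to rpc_method_handlers: RPC name -> bound handler.
handlers = {"DeleteAll": EchoServicer().DeleteAll}

def dispatch(method, request, context=None):
    # The generic handler resolves the method name, then invokes it.
    return handlers[method](request, context)

print(dispatch("DeleteAll", "everything"))  # deleted:everything
```

In the generated module, an application subclasses `APIServicer`, overrides the methods it implements, and passes the instance to `add_APIServicer_to_server`; anything left unoverridden responds with `UNIMPLEMENTED`.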

# --- File: vectorian/metrics.py | repo: poke1024/vectorian-2021 @ 5b8a23ca473dd5e6b0ad4baaca75c8a8bf7cc12b | license: MIT ---
from vectorian.sim.span import *
from vectorian.sim.token import *
from vectorian.sim.modifier import *
from vectorian.sim.kernel import *
from vectorian.sim.vector import *

# --- File: events/tests/test_views.py | repo: Akash1S/meethub @ 3517ec7b4e03ca1fe50e7053fd46349f0b31740c | license: MIT ---
from django.test import TestCase
from django.urls import reverse
from comments.models import Comment
from comments.forms import CommentForm
from events.models import Event, Category, User
class EventListViewTest(TestCase):

    def setUp(self):
        user = User.objects.create_user(username='iyanu', password=12345, email='iyanu@gmail.com')
        user.save()
        category = Category.objects.create(name='Technology', description='This is the future')
        number_of_events = 15
        for event_num in range(number_of_events):
            Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                 venue='Mapo Hall',
                                 date='2018-05-18', time='12:25:00', category=category, creator=user)

    def test_view_url_exists_at_desired_location(self):
        self.client.login(username='iyanu', password=12345)
        resp = self.client.get('')
        # Check the user is logged in
        self.assertEqual(str(resp.context['user']), 'iyanu')
        # Check that we got a response 'success'
        self.assertEqual(resp.status_code, 200)

    def test_view_url_accessible_by_name(self):
        self.client.login(username='iyanu', password='12345')
        resp = self.client.get(reverse('events:event-list'))
        self.assertEqual(resp.status_code, 200)

    def test_view_uses_correct_template(self):
        self.client.login(username='iyanu', password='12345')
        resp = self.client.get(reverse('events:event-list'))
        self.assertEqual(resp.status_code, 200)
        self.assertTemplateUsed(resp, 'events/list_of_events.html')

    def test_pagination_is_ten(self):
        self.client.login(username='iyanu', password='12345')
        resp = self.client.get(reverse('events:event-list'))
        self.assertEqual(resp.status_code, 200)
        self.assertTrue('is_paginated' in resp.context)
        self.assertTrue(resp.context['is_paginated'] == True)
        self.assertTrue(len(resp.context['events']) == 10)

    def test_list_all_events_on_second_page(self):
        self.client.login(username='iyanu', password='12345')
        resp = self.client.get(reverse('events:event-list') + '?page=2')
        self.assertEqual(resp.status_code, 200)
        self.assertTrue('is_paginated' in resp.context)
        self.assertTrue(resp.context['is_paginated'] == True)
        self.assertTrue(len(resp.context['events']) == 5)
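The two pagination tests above depend on simple arithmetic: 15 events with a page size of 10 leave 10 items on page 1 and 5 on page 2. A stdlib-only sketch of that calculation follows; the `page_items` helper is illustrative only, not part of the project.

```python
# Sketch of the pagination arithmetic the tests assert:
# 15 events, 10 per page -> 10 on page 1, 5 on page 2.
def page_items(total, per_page, page):
    # Number of items shown on a given 1-indexed page.
    start = (page - 1) * per_page
    return max(0, min(per_page, total - start))

assert page_items(15, 10, 1) == 10
assert page_items(15, 10, 2) == 5
assert page_items(15, 10, 3) == 0  # past the last page
```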
class EventDetailViewTest(TestCase):

    def setUp(self):
        self.user = User.objects.create_user(username='iyanu', password=12345, email='iyanu@gmail.com')
        self.attendee = User.objects.create_user(username='tobi', password=56789)
        self.attendee.save()
        self.user.save()
        self.category = Category.objects.create(name='Technology', description='This is the future')
        self.event = Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                          venue='Mapo Hall',
                                          date='2018-05-18', time='12:25:00', category=self.category, creator=self.user)

    def test_detail_of_event(self):
        self.client.login(username='iyanu', password=12345)
        response = self.client.get(self.event.get_absolute_url())
        # Check the user is logged in
        self.assertEqual(str(response.context['user']), 'iyanu')
        # Check that we got a response 'success'
        self.assertEqual(response.status_code, 200)

    def test_view_url_accessible_by_name(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-detail', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)

    def test_view_uses_correct_template(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-detail', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'events/detail.html')

    def test_name_of_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(self.event.get_absolute_url())
        self.assertContains(response, self.event.name, html=True)

    def test_details_of_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(self.event.get_absolute_url())
        self.assertContains(response, self.event.details, html=True)

    def test_category_of_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(self.event.get_absolute_url())
        self.assertEqual(self.category, self.event.category)

    def test_creator_of_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(self.event.get_absolute_url())
        self.assertContains(response, self.event.creator.username)

    def test_list_attendees_for_a_detail_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-detail', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertIsNotNone(response.context['attending'])

    def test_list_of_comments_for_a_detail_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-detail', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertEquals(len(response.context['comments']), self.event.comments.all().count())

    def test_comment_form_on_event_detail_page(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-detail', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertIsNotNone(response.context['form'])

    def test_create_comment_form_displays_comment_on_event_detail_page(self):
        self.client.login(username='iyanu', password='12345')
        comment = Comment.objects.create(comment='Wow, how far', created_by=self.user, event=self.event)
        response = self.client.get(self.event.get_absolute_url())
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, 'Wow')

    def test_valid_create_comment_form_on_event_detail_page_can_post_comment(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.post(self.event.get_absolute_url(),
                                    data={'comment': 'Wow, how far tobi', 'event': 'self.event.pk',
                                          'created_by': 'self.user'})
        new_response = self.client.get(self.event.get_absolute_url())
        self.assertEqual(new_response.status_code, 200)
        self.assertEqual(Comment.objects.last().comment, 'Wow, how far tobi')
class EventCreateViewTest(TestCase):

    def setUp(self):
        self.user = User.objects.create_user(username='iyanu', password=12345, email='iyanu@gmail.com')
        self.attendee = User.objects.create_user(username='tobi', password=56789)
        self.attendee.save()
        self.user.save()
        self.category = Category.objects.create(name='Technology', description='This is the future')
        self.event = Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                          venue='Mapo Hall',
                                          date='2018-05-18', time='12:25:00', category=self.category, creator=self.user)

    def test_create_event_url_exists_at_desired_location(self):
        self.client.login(username='iyanu', password=12345)
        response = self.client.get('/events/new/')
        # Check the user is logged in
        self.assertEqual(str(response.context['user']), 'iyanu')
        # Check that we got a response 'success'
        self.assertEqual(response.status_code, 200)

    def test_view_url_accessible_by_name(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-create'))
        self.assertEqual(response.status_code, 200)

    def test_view_uses_correct_template(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-create'))
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'events/create_form.html')

    def test_event_form_is_on_create_event_page(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-create'))
        self.assertEqual(response.status_code, 200)
        self.assertIsNotNone(response.context['form'])

    def test_create_event_form_displays_event_on_event_detail_page(self):
        self.client.login(username='iyanu', password='12345')
        event = Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                     venue='Mapo Hall',
                                     date='2018-05-18', time='12:25:00', category=self.category, creator=self.user)
        response = self.client.get(event.get_absolute_url())
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, 'party is gonna')

    def test_valid_create_event_form_on_can_post_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.post(reverse('events:event-create'),
                                    data={'name': 'Party Outside', 'details': 'This party is gonna be banging again',
                                          'venue': 'Mapo Hall', 'date': '2018-05-18', 'time': '12:25:00',
                                          'category': self.category, 'creator': self.user})
        new_response = self.client.get(self.event.get_absolute_url())
        self.assertEqual(new_response.status_code, 200)
        self.assertEqual(Event.objects.last().details, 'This party is gonna be banging again')
class EventUpdateViewTest(TestCase):

    def setUp(self):
        self.user = User.objects.create_user(username='iyanu', password=12345, email='iyanu@gmail.com')
        self.attendee = User.objects.create_user(username='tobi', password=56789)
        self.attendee.save()
        self.user.save()
        self.category = Category.objects.create(name='Technology', description='This is the future')
        self.event = Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                          venue='Mapo Hall',
                                          date='2018-05-18', time='12:25:00', category=self.category, creator=self.user)

    def test_update_event_url_exists_at_desired_location(self):
        event = Event.objects.create(name='Google io', details='Android is coming for you',
                                     venue='Google Plex',
                                     date='2018-11-18', time='10:25:00', category=self.category, creator=self.user)
        self.client.login(username='iyanu', password=12345)
        response = self.client.get(reverse('events:event-update', kwargs={'pk': self.event.pk}))
        # Check the user is logged in
        # self.assertEqual(str(response.context['user']), 'iyanu')
        # Check that we got a response 'success'
        self.assertEqual(response.status_code, 200)

    def test_view_url_accessible_by_name(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-update', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)

    def test_view_uses_correct_template(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-update', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'events/update_form.html')

    def test_event_form_is_on_update_event_page(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-update', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertIsNotNone(response.context['form'])

    def test_valid_update_event_form_on_can_update_event(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.post(reverse('events:event-update', kwargs={'pk': self.event.pk}),
                                    data={'name': 'Party Outside', 'details': 'This party is gonna be banging again',
                                          'venue': 'Mapo Hall', 'date': '2018-05-18', 'time': '12:25:00',
                                          'category': self.category, 'creator': self.user})
        new_response = self.client.get(self.event.get_absolute_url())
        self.assertEqual(new_response.status_code, 200)
        self.assertEqual(Event.objects.last().details, 'This party is gonna be banging again')
class EventDeleteViewTest(TestCase):

    def setUp(self):
        self.user = User.objects.create_user(username='iyanu', password=12345, email='iyanu@gmail.com')
        self.attendee = User.objects.create_user(username='tobi', password=56789)
        self.attendee.save()
        self.user.save()
        self.category = Category.objects.create(name='Technology', description='This is the future')
        self.event = Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                          venue='Mapo Hall',
                                          date='2018-05-18', time='12:25:00', category=self.category, creator=self.user)

    def test_delete_event_url_exists_at_desired_location(self):
        event = Event.objects.create(name='Google io', details='Android is coming for you',
                                     venue='Google Plex',
                                     date='2018-11-18', time='10:25:00', category=self.category, creator=self.user)
        self.client.login(username='iyanu', password=12345)
        response = self.client.get(reverse('events:event-delete', kwargs={'pk': self.event.pk}))
        # Check the user is logged in
        # self.assertEqual(str(response.context['user']), 'iyanu')
        # Check that we got a response 'success'
        self.assertEqual(response.status_code, 200)

    def test_view_url_accessible_by_name(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-delete', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)

    def test_view_uses_correct_template(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.get(reverse('events:event-delete', kwargs={'pk': self.event.pk}))
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'events/delete.html')

    def test_delete_view_redirect_to_list_view(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.post(reverse('events:event-delete', kwargs={'pk': self.event.pk}))
        # self.assertEqual(response.status_code, 200)
        self.assertRedirects(response, '/')
class AttendEventTestView(TestCase):

    def setUp(self):
        self.user = User.objects.create_user(username='iyanu', password=12345, email='iyanu@gmail.com')
        self.attendee = User.objects.create_user(username='tobi', password=56789)
        self.attendee.save()
        self.user.save()
        self.category = Category.objects.create(name='Technology', description='This is the future')
        self.event = Event.objects.create(name='Party Outside', details='This party is gonna be banging again',
                                          venue='Mapo Hall',
                                          date='2018-05-18', time='12:25:00', category=self.category, creator=self.user)

    def test_attend_event_url_exists_at_desired_location(self):
        event = Event.objects.create(name='Google io', details='Android is coming for you',
                                     venue='Google Plex',
                                     date='2018-11-18', time='10:25:00', category=self.category, creator=self.user)
        self.client.login(username='iyanu', password=12345)
        response = self.client.post(reverse('events:attend_event', kwargs={'event_id': self.event.id}))
        self.assertNotEqual(response.status_code, 200)

    def test_view_url_accessible_by_name(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.post(reverse('events:attend_event', kwargs={'event_id': self.event.id}))
        self.assertNotEqual(response.status_code, 200)

    def test_attend_event_view_redirect_to_event_detail(self):
        self.client.login(username='iyanu', password='12345')
        response = self.client.post(reverse('events:attend_event', kwargs={'event_id': self.event.id}))
        self.assertRedirects(response, reverse('events:event-detail', kwargs={'pk': self.event.pk}))

# --- File: utils/__init__.py | repo: masszhou/lane_detector @ e28fe4adbd4c804e45c9bd86743739196bc30105 | license: MIT ---
# public API summary
from .pinet_utils import convert_to_original_size
from .pinet_utils import find_target
from .pinet_utils import write_result_json

# --- File: pysal/explore/giddy/__init__.py | repo: martinfleis/pysal @ d2e0667d825d403efe7182ecda210dc152ec206d | license: BSD-3-Clause ---
from giddy import directional
from giddy import ergodic
from giddy import markov
from giddy import mobility
from giddy import rank
from giddy import util
from giddy import sequence
# src/minml/components/visualization/__init__.py (timhannifan/minml, MIT)
from .metrics import (save_fig,
                      plot_predicted_scores, plot_auc_roc,
                      plot_feature_importances, plot_decision_tree)
from .charts import ChartMaker

__all__ = ("ChartMaker", "save_fig", "plot_predicted_scores",
           "plot_auc_roc", "plot_feature_importances", "plot_decision_tree")
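These `__init__.py` files use `__all__` to pin down exactly which names a star-import exposes. A minimal, self-contained sketch of that behavior (the module name `viz_demo` and its functions are hypothetical, built in memory so the snippet needs no package on disk):

```python
# Hypothetical module "viz_demo": how __all__ limits `from ... import *`.
import sys
import types

mod = types.ModuleType("viz_demo")
exec(
    "def save_fig():\n"
    "    return 'saved'\n"
    "def _private():\n"
    "    return 'hidden'\n"
    "__all__ = ['save_fig']\n",
    mod.__dict__,
)
sys.modules["viz_demo"] = mod  # register so the star-import below resolves

ns = {}
exec("from viz_demo import *", ns)  # binds only the names listed in __all__
```

Without `__all__`, a star-import would still skip underscore-prefixed names, but listing the names makes the public surface explicit.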
# wechat/models/__init__.py (nahualventure/pos-addons, MIT)
# License MIT (https://opensource.org/licenses/MIT).
from . import wechat_micropay
from . import wechat_order
from . import wechat_refund
from . import ir_config_parameter
from . import account_journal
# halonsupport.py (fullphat/redsquare, MIT)
def Version():
return "2.51"
# front_end/handlers/ViewHelpRequestsHandler.py (zacheliason/CodeBuddy, MIT)
from BaseUserHandler import *

class ViewHelpRequestsHandler(BaseUserHandler):
def get(self, course, assignment, exercise, student_id):
try:
if self.is_administrator() or self.is_instructor_for_course(course) or self.is_assistant_for_course(course):
self.render("view_request.html", courses=self.content.get_courses(), course_basics=self.content.get_course_basics(course), assignments=self.content.get_assignments(course), assignment_basics=self.content.get_assignment_basics(course, assignment), exercises=self.content.get_exercises(course, assignment), exercise_basics=self.content.get_exercise_basics(course, assignment, exercise), exercise_details=self.content.get_exercise_details(course, assignment, exercise), help_request=self.content.get_help_request(course, assignment, exercise, student_id), exercise_help_requests=self.content.get_exercise_help_requests(course, assignment, exercise, student_id), similar_requests=self.content.compare_help_requests(course, assignment, exercise, student_id), result=None, user_info=self.get_user_info(), is_administrator=self.is_administrator(), is_instructor=self.is_instructor_for_course(course), is_assistant=self.is_assistant_for_course(course))
else:
self.render("permissions.html")
        except Exception:
render_error(self, traceback.format_exc())

    def post(self, course, assignment, exercise, student_id):
try:
if self.is_administrator() or self.is_instructor_for_course(course) or self.is_assistant_for_course(course):
suggestion = self.get_body_argument("suggestion")
more_info_needed = self.get_argument("more_info_needed", None) == "More info needed"
user_id = self.get_user_id()
if self.is_assistant_for_course(course):
self.content.save_help_request_suggestion(course, assignment, exercise, student_id, suggestion, 0, user_id, None, more_info_needed)
result = "Success: suggestion submitted for approval"
else:
help_request = self.content.get_help_request(course, assignment, exercise, student_id)
if help_request["suggester_id"]:
suggester_id = help_request["suggester_id"]
else:
suggester_id = user_id
self.content.save_help_request_suggestion(course, assignment, exercise, student_id, suggestion, 1, suggester_id, user_id, more_info_needed)
result = "Success: suggestion saved"
self.render("view_request.html", courses=self.content.get_courses(), course_basics=self.content.get_course_basics(course), assignments=self.content.get_assignments(course), assignment_basics=self.content.get_assignment_basics(course, assignment), exercises=self.content.get_exercises(course, assignment), exercise_basics=self.content.get_exercise_basics(course, assignment, exercise), exercise_details=self.content.get_exercise_details(course, assignment, exercise), help_request=self.content.get_help_request(course, assignment, exercise, student_id), exercise_help_requests=self.content.get_exercise_help_requests(course, assignment, exercise, student_id), similar_requests=self.content.compare_help_requests(course, assignment, exercise, student_id), result=result, user_info=self.get_user_info(), is_administrator=self.is_administrator(), is_instructor=self.is_instructor_for_course(course), is_assistant=self.is_assistant_for_course(course))
else:
self.render("permissions.html")
        except Exception:
render_error(self, traceback.format_exc())
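Both `get` and `post` above gate access on the same three role checks. A standalone sketch of that predicate (a hypothetical helper, not part of CodeBuddy):

```python
def can_view_help_requests(is_administrator: bool,
                           is_instructor: bool,
                           is_assistant: bool) -> bool:
    # Mirrors the condition both handlers repeat:
    # administrator OR instructor OR assistant for the course.
    return is_administrator or is_instructor or is_assistant
```

Factoring the check into one helper would keep the two handlers from drifting apart if the permission rules change.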
# 03_indentation_error_example.py (ajeyln/W3_School_Python_Coding, MIT)
if 110 < 220:
    print("110 is lesser than 220")  # indentation example
# Without the indented block, running the script raises:
# d:\Python_Learning_Repository\Scripts_W3_School>Indentation_Example.py
#   File "D:\Python_Learning_Repository\Scripts_W3_School\Indentation_Example.py", line 2
#     print ("110 is lesser than 220")
#     ^
# IndentationError: expected an indented block
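For contrast with the commented traceback, the correctly indented form runs cleanly:

```python
# Correct version: the statement under `if` is indented, so it executes.
if 110 < 220:
    print("110 is lesser than 220")
```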
# pyHMT2D/Hydraulic_Models_Data/__init__.py (ali-mahdavi-mazdeh/pyHMT2D, MIT)
from .Hydraulic_Models_Data_Base import *
from .Backwater_1D import *
from .SRH_2D import *
from .RAS_2D import *
__all__ = ["Hydraulic_Models_Data_Base", "Backwater_1D", "SRH_2D", "RAS_2D"]
# dingtalk/python/alibabacloud_dingtalk/crm_1_0/models.py (yndu13/dingtalk-sdk, Apache-2.0)
# -*- coding: utf-8 -*-
# This file is auto-generated, don't edit it. Thanks.
from Tea.model import TeaModel
from typing import Dict, List, Any
class GetOfficialAccountContactsHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
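Every generated model below repeats one serialization pattern: `to_map` emits only non-None fields under camelCase keys, and `from_map` mirrors it back onto snake_case attributes. A standalone sketch of that pattern without the Tea SDK base class (field names borrowed from the request model; `MiniModel` is illustrative only):

```python
class MiniModel:
    """Stripped-down version of the to_map/from_map round trip."""

    def __init__(self, next_token=None, max_results=None):
        self.next_token = next_token
        self.max_results = max_results

    def to_map(self):
        # Emit only fields that are set, under the wire (camelCase) keys.
        result = {}
        if self.next_token is not None:
            result['nextToken'] = self.next_token
        if self.max_results is not None:
            result['maxResults'] = self.max_results
        return result

    def from_map(self, m=None):
        # Inverse mapping: wire keys back onto snake_case attributes.
        m = m or {}
        if m.get('nextToken') is not None:
            self.next_token = m.get('nextToken')
        if m.get('maxResults') is not None:
            self.max_results = m.get('maxResults')
        return self


# Round trip: serialize, then rebuild an equivalent instance.
original = MiniModel(next_token="0", max_results=10)
restored = MiniModel().from_map(original.to_map())
```

Nested models extend the same idea by calling `to_map`/`from_map` recursively, as the `permission` and `contacts` fields do below.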
class GetOfficialAccountContactsRequest(TeaModel):
def __init__(
self,
next_token: str = None,
max_results: int = None,
):
        # Pagination cursor; pass 0 on the first call
        self.next_token = next_token
        # Page size; at most 10
        self.max_results = max_results
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.max_results is not None:
result['maxResults'] = self.max_results
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
return self
class GetOfficialAccountContactsResponseBodyValuesContactsPermission(TeaModel):
def __init__(
self,
participant_staff_ids: List[str] = None,
owner_staff_ids: List[str] = None,
):
        # List of collaborator user IDs
        self.participant_staff_ids = participant_staff_ids
        # List of owner user IDs
        self.owner_staff_ids = owner_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
return self
class GetOfficialAccountContactsResponseBodyValuesContacts(TeaModel):
def __init__(
self,
creator_nick: str = None,
modify_time: str = None,
create_time: str = None,
creator_user_id: str = None,
instance_id: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
permission: GetOfficialAccountContactsResponseBodyValuesContactsPermission = None,
):
        # Nickname of the user who created the record
        self.creator_nick = creator_nick
        # Record modification time
        self.modify_time = modify_time
        # Record creation time
        self.create_time = create_time
        # ID of the user who created the record
        self.creator_user_id = creator_user_id
        # Data instance ID
        self.instance_id = instance_id
        # Data payload
        self.data = data
        # Extended data payload
        self.extend_data = extend_data
        # Data permission info
        self.permission = permission
def validate(self):
if self.permission:
self.permission.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.creator_nick is not None:
result['creatorNick'] = self.creator_nick
if self.modify_time is not None:
result['modifyTime'] = self.modify_time
if self.create_time is not None:
result['createTime'] = self.create_time
if self.creator_user_id is not None:
result['creatorUserId'] = self.creator_user_id
if self.instance_id is not None:
result['instanceId'] = self.instance_id
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
if self.permission is not None:
result['permission'] = self.permission.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('creatorNick') is not None:
self.creator_nick = m.get('creatorNick')
if m.get('modifyTime') is not None:
self.modify_time = m.get('modifyTime')
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('creatorUserId') is not None:
self.creator_user_id = m.get('creatorUserId')
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
if m.get('permission') is not None:
temp_model = GetOfficialAccountContactsResponseBodyValuesContactsPermission()
self.permission = temp_model.from_map(m['permission'])
return self
class GetOfficialAccountContactsResponseBodyValues(TeaModel):
def __init__(
self,
user_id: str = None,
contacts: List[GetOfficialAccountContactsResponseBodyValuesContacts] = None,
):
        # The user's userId
        self.user_id = user_id
        # The user's contact records
        self.contacts = contacts
def validate(self):
if self.contacts:
for k in self.contacts:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.user_id is not None:
result['userId'] = self.user_id
result['contacts'] = []
if self.contacts is not None:
for k in self.contacts:
result['contacts'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('userId') is not None:
self.user_id = m.get('userId')
self.contacts = []
if m.get('contacts') is not None:
for k in m.get('contacts'):
temp_model = GetOfficialAccountContactsResponseBodyValuesContacts()
self.contacts.append(temp_model.from_map(k))
return self
class GetOfficialAccountContactsResponseBody(TeaModel):
def __init__(
self,
next_token: str = None,
max_results: int = None,
values: List[GetOfficialAccountContactsResponseBodyValues] = None,
):
        # Cursor for the next page; null means there is no more data
        self.next_token = next_token
        # Page size
        self.max_results = max_results
        # Customer data entries
        self.values = values
def validate(self):
if self.values:
for k in self.values:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.max_results is not None:
result['maxResults'] = self.max_results
result['values'] = []
if self.values is not None:
for k in self.values:
result['values'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
self.values = []
if m.get('values') is not None:
for k in m.get('values'):
temp_model = GetOfficialAccountContactsResponseBodyValues()
self.values.append(temp_model.from_map(k))
return self
class GetOfficialAccountContactsResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetOfficialAccountContactsResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetOfficialAccountContactsResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ServiceWindowMessageBatchPushHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ServiceWindowMessageBatchPushRequestDetailMessageBodyText(TeaModel):
def __init__(
self,
content: str = None,
):
self.content = content
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.content is not None:
result['content'] = self.content
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('content') is not None:
self.content = m.get('content')
return self
class ServiceWindowMessageBatchPushRequestDetailMessageBodyMarkdown(TeaModel):
def __init__(
self,
title: str = None,
text: str = None,
):
self.title = title
self.text = text
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.title is not None:
result['title'] = self.title
if self.text is not None:
result['text'] = self.text
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('title') is not None:
self.title = m.get('title')
if m.get('text') is not None:
self.text = m.get('text')
return self
class ServiceWindowMessageBatchPushRequestDetailMessageBodyLink(TeaModel):
def __init__(
self,
pic_url: str = None,
message_url: str = None,
title: str = None,
text: str = None,
):
self.pic_url = pic_url
self.message_url = message_url
self.title = title
self.text = text
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pic_url is not None:
result['picUrl'] = self.pic_url
if self.message_url is not None:
result['messageUrl'] = self.message_url
if self.title is not None:
result['title'] = self.title
if self.text is not None:
result['text'] = self.text
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('picUrl') is not None:
self.pic_url = m.get('picUrl')
if m.get('messageUrl') is not None:
self.message_url = m.get('messageUrl')
if m.get('title') is not None:
self.title = m.get('title')
if m.get('text') is not None:
self.text = m.get('text')
return self
class ServiceWindowMessageBatchPushRequestDetailMessageBodyActionCardButtonList(TeaModel):
def __init__(
self,
title: str = None,
action_url: str = None,
):
self.title = title
self.action_url = action_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.title is not None:
result['title'] = self.title
if self.action_url is not None:
result['actionUrl'] = self.action_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('title') is not None:
self.title = m.get('title')
if m.get('actionUrl') is not None:
self.action_url = m.get('actionUrl')
return self
class ServiceWindowMessageBatchPushRequestDetailMessageBodyActionCard(TeaModel):
def __init__(
self,
button_orientation: str = None,
single_url: str = None,
single_title: str = None,
markdown: str = None,
title: str = None,
button_list: List[ServiceWindowMessageBatchPushRequestDetailMessageBodyActionCardButtonList] = None,
):
self.button_orientation = button_orientation
self.single_url = single_url
self.single_title = single_title
self.markdown = markdown
self.title = title
self.button_list = button_list
def validate(self):
if self.button_list:
for k in self.button_list:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.button_orientation is not None:
result['buttonOrientation'] = self.button_orientation
if self.single_url is not None:
result['singleUrl'] = self.single_url
if self.single_title is not None:
result['singleTitle'] = self.single_title
if self.markdown is not None:
result['markdown'] = self.markdown
if self.title is not None:
result['title'] = self.title
result['buttonList'] = []
if self.button_list is not None:
for k in self.button_list:
result['buttonList'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('buttonOrientation') is not None:
self.button_orientation = m.get('buttonOrientation')
if m.get('singleUrl') is not None:
self.single_url = m.get('singleUrl')
if m.get('singleTitle') is not None:
self.single_title = m.get('singleTitle')
if m.get('markdown') is not None:
self.markdown = m.get('markdown')
if m.get('title') is not None:
self.title = m.get('title')
self.button_list = []
if m.get('buttonList') is not None:
for k in m.get('buttonList'):
temp_model = ServiceWindowMessageBatchPushRequestDetailMessageBodyActionCardButtonList()
self.button_list.append(temp_model.from_map(k))
return self
class ServiceWindowMessageBatchPushRequestDetailMessageBody(TeaModel):
def __init__(
self,
text: ServiceWindowMessageBatchPushRequestDetailMessageBodyText = None,
markdown: ServiceWindowMessageBatchPushRequestDetailMessageBodyMarkdown = None,
link: ServiceWindowMessageBatchPushRequestDetailMessageBodyLink = None,
action_card: ServiceWindowMessageBatchPushRequestDetailMessageBodyActionCard = None,
):
self.text = text
self.markdown = markdown
self.link = link
self.action_card = action_card
def validate(self):
if self.text:
self.text.validate()
if self.markdown:
self.markdown.validate()
if self.link:
self.link.validate()
if self.action_card:
self.action_card.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.text is not None:
result['text'] = self.text.to_map()
if self.markdown is not None:
result['markdown'] = self.markdown.to_map()
if self.link is not None:
result['link'] = self.link.to_map()
if self.action_card is not None:
result['actionCard'] = self.action_card.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('text') is not None:
temp_model = ServiceWindowMessageBatchPushRequestDetailMessageBodyText()
self.text = temp_model.from_map(m['text'])
if m.get('markdown') is not None:
temp_model = ServiceWindowMessageBatchPushRequestDetailMessageBodyMarkdown()
self.markdown = temp_model.from_map(m['markdown'])
if m.get('link') is not None:
temp_model = ServiceWindowMessageBatchPushRequestDetailMessageBodyLink()
self.link = temp_model.from_map(m['link'])
if m.get('actionCard') is not None:
temp_model = ServiceWindowMessageBatchPushRequestDetailMessageBodyActionCard()
self.action_card = temp_model.from_map(m['actionCard'])
return self
class ServiceWindowMessageBatchPushRequestDetail(TeaModel):
def __init__(
self,
msg_type: str = None,
uuid: str = None,
biz_request_id: str = None,
user_id_list: List[str] = None,
message_body: ServiceWindowMessageBatchPushRequestDetailMessageBody = None,
send_to_all: bool = None,
):
self.msg_type = msg_type
self.uuid = uuid
self.biz_request_id = biz_request_id
self.user_id_list = user_id_list
self.message_body = message_body
self.send_to_all = send_to_all
def validate(self):
if self.message_body:
self.message_body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.msg_type is not None:
result['msgType'] = self.msg_type
if self.uuid is not None:
result['uuid'] = self.uuid
if self.biz_request_id is not None:
result['bizRequestId'] = self.biz_request_id
if self.user_id_list is not None:
result['userIdList'] = self.user_id_list
if self.message_body is not None:
result['messageBody'] = self.message_body.to_map()
if self.send_to_all is not None:
result['sendToAll'] = self.send_to_all
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('msgType') is not None:
self.msg_type = m.get('msgType')
if m.get('uuid') is not None:
self.uuid = m.get('uuid')
if m.get('bizRequestId') is not None:
self.biz_request_id = m.get('bizRequestId')
if m.get('userIdList') is not None:
self.user_id_list = m.get('userIdList')
if m.get('messageBody') is not None:
temp_model = ServiceWindowMessageBatchPushRequestDetailMessageBody()
self.message_body = temp_model.from_map(m['messageBody'])
if m.get('sendToAll') is not None:
self.send_to_all = m.get('sendToAll')
return self
class ServiceWindowMessageBatchPushRequest(TeaModel):
def __init__(
self,
detail: ServiceWindowMessageBatchPushRequestDetail = None,
biz_id: str = None,
ding_isv_org_id: int = None,
ding_org_id: int = None,
ding_token_grant_type: int = None,
ding_suite_key: str = None,
):
self.detail = detail
self.biz_id = biz_id
self.ding_isv_org_id = ding_isv_org_id
self.ding_org_id = ding_org_id
self.ding_token_grant_type = ding_token_grant_type
self.ding_suite_key = ding_suite_key
def validate(self):
if self.detail:
self.detail.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.detail is not None:
result['detail'] = self.detail.to_map()
if self.biz_id is not None:
result['bizId'] = self.biz_id
if self.ding_isv_org_id is not None:
result['dingIsvOrgId'] = self.ding_isv_org_id
if self.ding_org_id is not None:
result['dingOrgId'] = self.ding_org_id
if self.ding_token_grant_type is not None:
result['dingTokenGrantType'] = self.ding_token_grant_type
if self.ding_suite_key is not None:
result['dingSuiteKey'] = self.ding_suite_key
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('detail') is not None:
temp_model = ServiceWindowMessageBatchPushRequestDetail()
self.detail = temp_model.from_map(m['detail'])
if m.get('bizId') is not None:
self.biz_id = m.get('bizId')
if m.get('dingIsvOrgId') is not None:
self.ding_isv_org_id = m.get('dingIsvOrgId')
if m.get('dingOrgId') is not None:
self.ding_org_id = m.get('dingOrgId')
if m.get('dingTokenGrantType') is not None:
self.ding_token_grant_type = m.get('dingTokenGrantType')
if m.get('dingSuiteKey') is not None:
self.ding_suite_key = m.get('dingSuiteKey')
return self
class ServiceWindowMessageBatchPushResponseBodyResult(TeaModel):
def __init__(
self,
open_push_id: str = None,
):
self.open_push_id = open_push_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.open_push_id is not None:
result['openPushId'] = self.open_push_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('openPushId') is not None:
self.open_push_id = m.get('openPushId')
return self
class ServiceWindowMessageBatchPushResponseBody(TeaModel):
def __init__(
self,
result: ServiceWindowMessageBatchPushResponseBodyResult = None,
request_id: str = None,
):
# result
self.result = result
self.request_id = request_id
def validate(self):
if self.result:
self.result.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.result is not None:
result['result'] = self.result.to_map()
if self.request_id is not None:
result['requestId'] = self.request_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('result') is not None:
temp_model = ServiceWindowMessageBatchPushResponseBodyResult()
self.result = temp_model.from_map(m['result'])
if m.get('requestId') is not None:
self.request_id = m.get('requestId')
return self
class ServiceWindowMessageBatchPushResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ServiceWindowMessageBatchPushResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ServiceWindowMessageBatchPushResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class DeleteCrmFormInstanceHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class DeleteCrmFormInstanceRequest(TeaModel):
def __init__(
self,
current_operator_user_id: str = None,
name: str = None,
):
# User ID of the current operator
self.current_operator_user_id = current_operator_user_id
# Template name
self.name = name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.current_operator_user_id is not None:
result['currentOperatorUserId'] = self.current_operator_user_id
if self.name is not None:
result['name'] = self.name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('currentOperatorUserId') is not None:
self.current_operator_user_id = m.get('currentOperatorUserId')
if m.get('name') is not None:
self.name = m.get('name')
return self
class DeleteCrmFormInstanceResponseBody(TeaModel):
def __init__(
self,
instance_id: str = None,
):
# ID of the deleted instance
self.instance_id = instance_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.instance_id is not None:
result['instanceId'] = self.instance_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
return self
class DeleteCrmFormInstanceResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: DeleteCrmFormInstanceResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = DeleteCrmFormInstanceResponseBody()
self.body = temp_model.from_map(m['body'])
return self
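All of the generated models share the same serialization contract: `to_map` emits only non-None attributes under their camelCase wire keys, and `from_map` mutates the instance in place and returns `self`. A self-contained sketch of that round trip, using a simplified stand-in class (not the SDK's TeaModel base):

```python
class DemoModel:
    """Simplified stand-in mirroring the generated to_map/from_map contract."""

    def __init__(self, instance_id: str = None):
        self.instance_id = instance_id

    def to_map(self) -> dict:
        result = dict()
        if self.instance_id is not None:
            # snake_case attribute -> camelCase wire key; None fields are skipped
            result['instanceId'] = self.instance_id
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('instanceId') is not None:
            self.instance_id = m.get('instanceId')
        return self  # returning self lets construction and hydration be chained


round_trip = DemoModel().from_map(DemoModel(instance_id='abc').to_map())
```

Because unset fields are skipped on serialization and on parse, `None` never appears in the wire dict and absent keys never overwrite attributes.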
class BatchSendOfficialAccountOTOMessageHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyText(TeaModel):
def __init__(
self,
content: str = None,
):
# Message content; 500 characters or fewer is recommended.
self.content = content
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.content is not None:
result['content'] = self.content
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('content') is not None:
self.content = m.get('content')
return self
class BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyMarkdown(TeaModel):
def __init__(
self,
title: str = None,
text: str = None,
):
# Content shown in the first-screen conversation preview.
self.title = title
# Message in markdown format; 500 characters or fewer is recommended.
self.text = text
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.title is not None:
result['title'] = self.title
if self.text is not None:
result['text'] = self.text
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('title') is not None:
self.title = m.get('title')
if m.get('text') is not None:
self.text = m.get('text')
return self
class BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyLink(TeaModel):
def __init__(
self,
pic_url: str = None,
message_url: str = None,
title: str = None,
text: str = None,
):
# Image URL
self.pic_url = pic_url
# URL opened when the message is clicked; mini-app links are supported when sending a mini-app message.
self.message_url = message_url
# Message title; 100 characters or fewer is recommended.
self.title = title
# Message description; 500 characters or fewer is recommended.
self.text = text
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pic_url is not None:
result['picUrl'] = self.pic_url
if self.message_url is not None:
result['messageUrl'] = self.message_url
if self.title is not None:
result['title'] = self.title
if self.text is not None:
result['text'] = self.text
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('picUrl') is not None:
self.pic_url = m.get('picUrl')
if m.get('messageUrl') is not None:
self.message_url = m.get('messageUrl')
if m.get('title') is not None:
self.title = m.get('title')
if m.get('text') is not None:
self.text = m.get('text')
return self
class BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyActionCardButtonList(TeaModel):
def __init__(
self,
title: str = None,
action_url: str = None,
):
# Button title when using the independent-jump ActionCard style; at most 20 characters.
self.title = title
# Jump URL when using the independent-jump ActionCard style.
self.action_url = action_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.title is not None:
result['title'] = self.title
if self.action_url is not None:
result['actionUrl'] = self.action_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('title') is not None:
self.title = m.get('title')
if m.get('actionUrl') is not None:
self.action_url = m.get('actionUrl')
return self
class BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyActionCard(TeaModel):
def __init__(
self,
button_orientation: str = None,
single_url: str = None,
single_title: str = None,
markdown: str = None,
title: str = None,
button_list: List[BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyActionCardButtonList] = None,
):
# Button layout: 0 for vertical, 1 for horizontal; must be set together with buttonList.
self.button_orientation = button_orientation
# URL opened when the message is clicked; mini-app links are supported when sending a mini-app message; at most 500 characters.
self.single_url = single_url
# Title when using the whole-card-jump ActionCard style; must be set together with singleUrl; at most 20 characters.
self.single_title = single_title
# Message content; supports standard markdown syntax; 1000 characters or fewer.
self.markdown = markdown
# Text shown in the conversation list and in notifications
self.title = title
# Button list when using the independent-jump ActionCard style; must be set together with buttonOrientation, and at most 1000 characters long.
self.button_list = button_list
def validate(self):
if self.button_list:
for k in self.button_list:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.button_orientation is not None:
result['buttonOrientation'] = self.button_orientation
if self.single_url is not None:
result['singleUrl'] = self.single_url
if self.single_title is not None:
result['singleTitle'] = self.single_title
if self.markdown is not None:
result['markdown'] = self.markdown
if self.title is not None:
result['title'] = self.title
result['buttonList'] = []
if self.button_list is not None:
for k in self.button_list:
result['buttonList'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('buttonOrientation') is not None:
self.button_orientation = m.get('buttonOrientation')
if m.get('singleUrl') is not None:
self.single_url = m.get('singleUrl')
if m.get('singleTitle') is not None:
self.single_title = m.get('singleTitle')
if m.get('markdown') is not None:
self.markdown = m.get('markdown')
if m.get('title') is not None:
self.title = m.get('title')
self.button_list = []
if m.get('buttonList') is not None:
for k in m.get('buttonList'):
temp_model = BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyActionCardButtonList()
self.button_list.append(temp_model.from_map(k))
return self
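The list-valued `buttonList` field above shows how these models serialize lists of submodels: `to_map` emits the list key unconditionally (so an unset list still serializes as `[]`), and `from_map` resets the list and rebuilds each element into a fresh submodel. A self-contained sketch with simplified stand-in classes:

```python
class Button:
    """Stand-in for the generated ...ActionCardButtonList element model."""

    def __init__(self, title: str = None):
        self.title = title

    def to_map(self) -> dict:
        result = dict()
        if self.title is not None:
            result['title'] = self.title
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('title') is not None:
            self.title = m.get('title')
        return self


class Card:
    """Stand-in for a model carrying a list of submodels."""

    def __init__(self, button_list=None):
        self.button_list = button_list

    def to_map(self) -> dict:
        result = dict()
        result['buttonList'] = []  # list key is emitted even when unset
        if self.button_list is not None:
            for k in self.button_list:
                result['buttonList'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        self.button_list = []
        if m.get('buttonList') is not None:
            for k in m.get('buttonList'):
                # each dict element is rebuilt into a fresh submodel
                self.button_list.append(Button().from_map(k))
        return self


card = Card().from_map({'buttonList': [{'title': 'OK'}, {'title': 'Cancel'}]})
```

Note the asymmetry with scalar fields: a scalar that is `None` is omitted from the map, while an unset list still produces an empty `buttonList` key.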
class BatchSendOfficialAccountOTOMessageRequestDetailMessageBody(TeaModel):
def __init__(
self,
text: BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyText = None,
markdown: BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyMarkdown = None,
link: BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyLink = None,
action_card: BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyActionCard = None,
):
# Text message body; required when the message type is text
self.text = text
# Markdown message; effective only when the message type is markdown
self.markdown = markdown
# Link message
self.link = link
# ActionCard message
self.action_card = action_card
def validate(self):
if self.text:
self.text.validate()
if self.markdown:
self.markdown.validate()
if self.link:
self.link.validate()
if self.action_card:
self.action_card.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.text is not None:
result['text'] = self.text.to_map()
if self.markdown is not None:
result['markdown'] = self.markdown.to_map()
if self.link is not None:
result['link'] = self.link.to_map()
if self.action_card is not None:
result['actionCard'] = self.action_card.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('text') is not None:
temp_model = BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyText()
self.text = temp_model.from_map(m['text'])
if m.get('markdown') is not None:
temp_model = BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyMarkdown()
self.markdown = temp_model.from_map(m['markdown'])
if m.get('link') is not None:
temp_model = BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyLink()
self.link = temp_model.from_map(m['link'])
if m.get('actionCard') is not None:
temp_model = BatchSendOfficialAccountOTOMessageRequestDetailMessageBodyActionCard()
self.action_card = temp_model.from_map(m['actionCard'])
return self
class BatchSendOfficialAccountOTOMessageRequestDetail(TeaModel):
def __init__(
self,
msg_type: str = None,
uuid: str = None,
biz_request_id: str = None,
user_id_list: List[str] = None,
message_body: BatchSendOfficialAccountOTOMessageRequestDetailMessageBody = None,
send_to_all: bool = None,
):
# Message type
self.msg_type = msg_type
# Unique ID of the message request
self.uuid = uuid
# Business request identifier; set it when one business request needs several send-API calls, to simplify later tracking.
self.biz_request_id = biz_request_id
# Message recipient list; at most 1000 users
self.user_id_list = user_id_list
# Message body
self.message_body = message_body
# Whether to send to all followers
self.send_to_all = send_to_all
def validate(self):
if self.message_body:
self.message_body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.msg_type is not None:
result['msgType'] = self.msg_type
if self.uuid is not None:
result['uuid'] = self.uuid
if self.biz_request_id is not None:
result['bizRequestId'] = self.biz_request_id
if self.user_id_list is not None:
result['userIdList'] = self.user_id_list
if self.message_body is not None:
result['messageBody'] = self.message_body.to_map()
if self.send_to_all is not None:
result['sendToAll'] = self.send_to_all
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('msgType') is not None:
self.msg_type = m.get('msgType')
if m.get('uuid') is not None:
self.uuid = m.get('uuid')
if m.get('bizRequestId') is not None:
self.biz_request_id = m.get('bizRequestId')
if m.get('userIdList') is not None:
self.user_id_list = m.get('userIdList')
if m.get('messageBody') is not None:
temp_model = BatchSendOfficialAccountOTOMessageRequestDetailMessageBody()
self.message_body = temp_model.from_map(m['messageBody'])
if m.get('sendToAll') is not None:
self.send_to_all = m.get('sendToAll')
return self
class BatchSendOfficialAccountOTOMessageRequest(TeaModel):
def __init__(
self,
detail: BatchSendOfficialAccountOTOMessageRequestDetail = None,
biz_id: str = None,
account_id: str = None,
ding_isv_org_id: int = None,
ding_org_id: int = None,
ding_token_grant_type: int = None,
ding_suite_key: str = None,
):
# Message detail
self.detail = detail
# Caller identifier authorized by the official account; may be empty
self.biz_id = biz_id
# Official account ID
self.account_id = account_id
self.ding_isv_org_id = ding_isv_org_id
self.ding_org_id = ding_org_id
self.ding_token_grant_type = ding_token_grant_type
self.ding_suite_key = ding_suite_key
def validate(self):
if self.detail:
self.detail.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.detail is not None:
result['detail'] = self.detail.to_map()
if self.biz_id is not None:
result['bizId'] = self.biz_id
if self.account_id is not None:
result['accountId'] = self.account_id
if self.ding_isv_org_id is not None:
result['dingIsvOrgId'] = self.ding_isv_org_id
if self.ding_org_id is not None:
result['dingOrgId'] = self.ding_org_id
if self.ding_token_grant_type is not None:
result['dingTokenGrantType'] = self.ding_token_grant_type
if self.ding_suite_key is not None:
result['dingSuiteKey'] = self.ding_suite_key
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('detail') is not None:
temp_model = BatchSendOfficialAccountOTOMessageRequestDetail()
self.detail = temp_model.from_map(m['detail'])
if m.get('bizId') is not None:
self.biz_id = m.get('bizId')
if m.get('accountId') is not None:
self.account_id = m.get('accountId')
if m.get('dingIsvOrgId') is not None:
self.ding_isv_org_id = m.get('dingIsvOrgId')
if m.get('dingOrgId') is not None:
self.ding_org_id = m.get('dingOrgId')
if m.get('dingTokenGrantType') is not None:
self.ding_token_grant_type = m.get('dingTokenGrantType')
if m.get('dingSuiteKey') is not None:
self.ding_suite_key = m.get('dingSuiteKey')
return self
class BatchSendOfficialAccountOTOMessageResponseBodyResult(TeaModel):
def __init__(
self,
open_push_id: str = None,
):
self.open_push_id = open_push_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.open_push_id is not None:
result['openPushId'] = self.open_push_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('openPushId') is not None:
self.open_push_id = m.get('openPushId')
return self
class BatchSendOfficialAccountOTOMessageResponseBody(TeaModel):
def __init__(
self,
result: BatchSendOfficialAccountOTOMessageResponseBodyResult = None,
request_id: str = None,
):
# result
self.result = result
# OpenAPI request ID
self.request_id = request_id
def validate(self):
if self.result:
self.result.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.result is not None:
result['result'] = self.result.to_map()
if self.request_id is not None:
result['requestId'] = self.request_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('result') is not None:
temp_model = BatchSendOfficialAccountOTOMessageResponseBodyResult()
self.result = temp_model.from_map(m['result'])
if m.get('requestId') is not None:
self.request_id = m.get('requestId')
return self
class BatchSendOfficialAccountOTOMessageResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: BatchSendOfficialAccountOTOMessageResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = BatchSendOfficialAccountOTOMessageResponseBody()
self.body = temp_model.from_map(m['body'])
return self
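Putting the nested request models together: for a simple text push, the wire payload that `BatchSendOfficialAccountOTOMessageRequest.to_map()` would emit corresponds to the hand-built dict below (values are illustrative; keys are taken from the field mappings above):

```python
# Hand-built payload mirroring the nested to_map() output for a text message
# (illustrative IDs and user list, not real data).
payload = {
    'detail': {
        'msgType': 'text',
        'uuid': 'msg-0001',
        'userIdList': ['user1', 'user2'],
        'messageBody': {
            'text': {'content': 'hello'},  # text body; required for msgType 'text'
        },
        'sendToAll': False,
    },
    'accountId': 'account-123',
}
```

Only one of `text`, `markdown`, `link`, or `actionCard` should be populated inside `messageBody`, matching the `msgType` value.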
class GetOfficialAccountContactInfoHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetOfficialAccountContactInfoResponseBody(TeaModel):
def __init__(
self,
corp_name: str = None,
mobile: str = None,
state_code: str = None,
union_id: str = None,
auth_items: List[str] = None,
user_infos: List[str] = None,
):
# Name of the contact's primary organization
self.corp_name = corp_name
# Mobile number
self.mobile = mobile
# Country code of the mobile number
self.state_code = state_code
# unionId of the contact
self.union_id = union_id
# Authorized fields
self.auth_items = auth_items
# Authorized fields
self.user_infos = user_infos
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.corp_name is not None:
result['corpName'] = self.corp_name
if self.mobile is not None:
result['mobile'] = self.mobile
if self.state_code is not None:
result['stateCode'] = self.state_code
if self.union_id is not None:
result['unionId'] = self.union_id
if self.auth_items is not None:
result['authItems'] = self.auth_items
if self.user_infos is not None:
result['userInfos'] = self.user_infos
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('corpName') is not None:
self.corp_name = m.get('corpName')
if m.get('mobile') is not None:
self.mobile = m.get('mobile')
if m.get('stateCode') is not None:
self.state_code = m.get('stateCode')
if m.get('unionId') is not None:
self.union_id = m.get('unionId')
if m.get('authItems') is not None:
self.auth_items = m.get('authItems')
if m.get('userInfos') is not None:
self.user_infos = m.get('userInfos')
return self
class GetOfficialAccountContactInfoResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetOfficialAccountContactInfoResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetOfficialAccountContactInfoResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class QueryAllCustomerHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class QueryAllCustomerRequest(TeaModel):
def __init__(
self,
ding_isv_org_id: int = None,
ding_org_id: int = None,
ding_token_grant_type: int = None,
ding_corp_id: str = None,
ding_suite_key: str = None,
operator_user_id: str = None,
max_results: int = None,
next_token: str = None,
object_type: str = None,
):
self.ding_isv_org_id = ding_isv_org_id
self.ding_org_id = ding_org_id
self.ding_token_grant_type = ding_token_grant_type
self.ding_corp_id = ding_corp_id
self.ding_suite_key = ding_suite_key
# User ID
self.operator_user_id = operator_user_id
# Page size
self.max_results = max_results
# Pagination cursor; pass empty or null on the first call
self.next_token = next_token
# Data type
self.object_type = object_type
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_isv_org_id is not None:
result['dingIsvOrgId'] = self.ding_isv_org_id
if self.ding_org_id is not None:
result['dingOrgId'] = self.ding_org_id
if self.ding_token_grant_type is not None:
result['dingTokenGrantType'] = self.ding_token_grant_type
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.ding_suite_key is not None:
result['dingSuiteKey'] = self.ding_suite_key
if self.operator_user_id is not None:
result['operatorUserId'] = self.operator_user_id
if self.max_results is not None:
result['maxResults'] = self.max_results
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.object_type is not None:
result['objectType'] = self.object_type
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingIsvOrgId') is not None:
self.ding_isv_org_id = m.get('dingIsvOrgId')
if m.get('dingOrgId') is not None:
self.ding_org_id = m.get('dingOrgId')
if m.get('dingTokenGrantType') is not None:
self.ding_token_grant_type = m.get('dingTokenGrantType')
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('dingSuiteKey') is not None:
self.ding_suite_key = m.get('dingSuiteKey')
if m.get('operatorUserId') is not None:
self.operator_user_id = m.get('operatorUserId')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('objectType') is not None:
self.object_type = m.get('objectType')
return self
class QueryAllCustomerResponseBodyResultValuesPermission(TeaModel):
def __init__(
self,
participant_staff_ids: List[str] = None,
owner_staff_ids: List[str] = None,
):
# User IDs of collaborators
self.participant_staff_ids = participant_staff_ids
# User IDs of owners
self.owner_staff_ids = owner_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
return self
class QueryAllCustomerResponseBodyResultValues(TeaModel):
def __init__(
self,
creator_nick: str = None,
modify_time: str = None,
creator_user_id: str = None,
instance_id: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
create_time: str = None,
org_id: int = None,
object_type: str = None,
permission: QueryAllCustomerResponseBodyResultValuesPermission = None,
process_out_result: str = None,
process_instance_status: str = None,
):
# Nickname of the user who created the record
self.creator_nick = creator_nick
# Record modification time
self.modify_time = modify_time
# ID of the user who created the record
self.creator_user_id = creator_user_id
# Data ID
self.instance_id = instance_id
# Data content
self.data = data
# Extended data content
self.extend_data = extend_data
# Record creation time
self.create_time = create_time
# Auto-generated by the system
self.org_id = org_id
# Data type
self.object_type = object_type
# Data permission info
self.permission = permission
# Approval result
self.process_out_result = process_out_result
# Approval status
self.process_instance_status = process_instance_status
def validate(self):
if self.permission:
self.permission.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.creator_nick is not None:
result['creatorNick'] = self.creator_nick
if self.modify_time is not None:
result['modifyTime'] = self.modify_time
if self.creator_user_id is not None:
result['creatorUserId'] = self.creator_user_id
if self.instance_id is not None:
result['instanceId'] = self.instance_id
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
if self.create_time is not None:
result['createTime'] = self.create_time
if self.org_id is not None:
result['orgId'] = self.org_id
if self.object_type is not None:
result['objectType'] = self.object_type
if self.permission is not None:
result['permission'] = self.permission.to_map()
if self.process_out_result is not None:
result['processOutResult'] = self.process_out_result
if self.process_instance_status is not None:
result['processInstanceStatus'] = self.process_instance_status
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('creatorNick') is not None:
self.creator_nick = m.get('creatorNick')
if m.get('modifyTime') is not None:
self.modify_time = m.get('modifyTime')
if m.get('creatorUserId') is not None:
self.creator_user_id = m.get('creatorUserId')
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('orgId') is not None:
self.org_id = m.get('orgId')
if m.get('objectType') is not None:
self.object_type = m.get('objectType')
if m.get('permission') is not None:
temp_model = QueryAllCustomerResponseBodyResultValuesPermission()
self.permission = temp_model.from_map(m['permission'])
if m.get('processOutResult') is not None:
self.process_out_result = m.get('processOutResult')
if m.get('processInstanceStatus') is not None:
self.process_instance_status = m.get('processInstanceStatus')
return self
class QueryAllCustomerResponseBodyResult(TeaModel):
def __init__(
self,
next_token: str = None,
values: List[QueryAllCustomerResponseBodyResultValues] = None,
max_results: int = None,
):
# Cursor for the next page; null means no more data
self.next_token = next_token
# Customer data entries
self.values = values
# Page size
self.max_results = max_results
def validate(self):
if self.values:
for k in self.values:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.next_token is not None:
result['nextToken'] = self.next_token
result['values'] = []
if self.values is not None:
for k in self.values:
result['values'].append(k.to_map() if k else None)
if self.max_results is not None:
result['maxResults'] = self.max_results
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
self.values = []
if m.get('values') is not None:
for k in m.get('values'):
temp_model = QueryAllCustomerResponseBodyResultValues()
self.values.append(temp_model.from_map(k))
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
return self
class QueryAllCustomerResponseBody(TeaModel):
def __init__(
self,
result: QueryAllCustomerResponseBodyResult = None,
):
# Paged result
self.result = result
def validate(self):
if self.result:
self.result.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.result is not None:
result['result'] = self.result.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('result') is not None:
temp_model = QueryAllCustomerResponseBodyResult()
self.result = temp_model.from_map(m['result'])
return self
class QueryAllCustomerResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: QueryAllCustomerResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = QueryAllCustomerResponseBody()
self.body = temp_model.from_map(m['body'])
return self
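The `nextToken`/`maxResults`/`values` fields in `QueryAllCustomerRequest` and `QueryAllCustomerResponseBodyResult` implement cursor pagination: pass an empty cursor on the first call and stop when the returned cursor is null. A sketch of the draining loop, with a hypothetical `fetch_page` callable standing in for the actual API call:

```python
def fetch_all(fetch_page):
    """Drain a nextToken-style cursor.

    fetch_page is a hypothetical callable taking the current cursor (None on
    the first call) and returning a dict shaped like
    QueryAllCustomerResponseBodyResult ('values', 'nextToken').
    """
    values, next_token = [], None
    while True:
        page = fetch_page(next_token)
        values.extend(page['values'])
        next_token = page.get('nextToken')
        if next_token is None:  # a null cursor signals the last page
            return values


# Two fake pages keyed by cursor value, simulating the paged API.
_pages = {
    None: {'values': ['cust-1', 'cust-2'], 'nextToken': 'cursor-1'},
    'cursor-1': {'values': ['cust-3'], 'nextToken': None},
}
all_values = fetch_all(lambda token: _pages[token])
```

In a real client, `fetch_page` would issue the queryAllCustomer request with `nextToken` and `maxResults` set on the `QueryAllCustomerRequest` model.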
class SendOfficialAccountOTOMessageHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class SendOfficialAccountOTOMessageRequestDetailMessageBodyText(TeaModel):
def __init__(
self,
content: str = None,
):
# Message content; 500 characters or fewer is recommended.
self.content = content
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.content is not None:
result['content'] = self.content
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('content') is not None:
self.content = m.get('content')
return self
class SendOfficialAccountOTOMessageRequestDetailMessageBodyMarkdown(TeaModel):
def __init__(
self,
title: str = None,
text: str = None,
):
# Content shown in the first-screen conversation preview.
self.title = title
# Message in markdown format; 500 characters or fewer is recommended.
self.text = text
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.title is not None:
result['title'] = self.title
if self.text is not None:
result['text'] = self.text
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('title') is not None:
self.title = m.get('title')
if m.get('text') is not None:
self.text = m.get('text')
return self
class SendOfficialAccountOTOMessageRequestDetailMessageBodyLink(TeaModel):
def __init__(
self,
pic_url: str = None,
message_url: str = None,
title: str = None,
text: str = None,
):
# Image URL
self.pic_url = pic_url
# URL opened when the message is clicked; mini-app links are supported when sending a mini-app message.
self.message_url = message_url
# Message title; 100 characters or fewer is recommended.
self.title = title
# Message description; 500 characters or fewer is recommended.
self.text = text
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pic_url is not None:
result['picUrl'] = self.pic_url
if self.message_url is not None:
result['messageUrl'] = self.message_url
if self.title is not None:
result['title'] = self.title
if self.text is not None:
result['text'] = self.text
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('picUrl') is not None:
self.pic_url = m.get('picUrl')
if m.get('messageUrl') is not None:
self.message_url = m.get('messageUrl')
if m.get('title') is not None:
self.title = m.get('title')
if m.get('text') is not None:
self.text = m.get('text')
return self
class SendOfficialAccountOTOMessageRequestDetailMessageBodyActionCardButtonList(TeaModel):
def __init__(
self,
title: str = None,
action_url: str = None,
):
# Button title when using the multi-button (independent-jump) ActionCard style; at most 20 characters.
self.title = title
# Jump URL of the button when using the multi-button (independent-jump) ActionCard style.
self.action_url = action_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.title is not None:
result['title'] = self.title
if self.action_url is not None:
result['actionUrl'] = self.action_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('title') is not None:
self.title = m.get('title')
if m.get('actionUrl') is not None:
self.action_url = m.get('actionUrl')
return self
class SendOfficialAccountOTOMessageRequestDetailMessageBodyActionCard(TeaModel):
def __init__(
self,
button_orientation: str = None,
single_url: str = None,
single_title: str = None,
markdown: str = None,
title: str = None,
button_list: List[SendOfficialAccountOTOMessageRequestDetailMessageBodyActionCardButtonList] = None,
):
# Button layout: 0 for vertical, 1 for horizontal. Must be set together with buttonList.
self.button_orientation = button_orientation
# URL opened when the message is clicked; mini-app deep links are supported when the message targets a mini app. At most 500 characters.
self.single_url = single_url
# Title when using the single-jump (whole-card) ActionCard style. Must be set together with singleUrl; at most 20 characters.
self.single_title = single_title
# Message content in standard markdown syntax; at most 1000 characters.
self.markdown = markdown
# Text surfaced in the conversation list and notifications.
self.title = title
# Button list when using the multi-button (independent-jump) ActionCard style; must be set together with buttonOrientation, and the total length must not exceed 1000 characters.
self.button_list = button_list
def validate(self):
if self.button_list:
for k in self.button_list:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.button_orientation is not None:
result['buttonOrientation'] = self.button_orientation
if self.single_url is not None:
result['singleUrl'] = self.single_url
if self.single_title is not None:
result['singleTitle'] = self.single_title
if self.markdown is not None:
result['markdown'] = self.markdown
if self.title is not None:
result['title'] = self.title
result['buttonList'] = []
if self.button_list is not None:
for k in self.button_list:
result['buttonList'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('buttonOrientation') is not None:
self.button_orientation = m.get('buttonOrientation')
if m.get('singleUrl') is not None:
self.single_url = m.get('singleUrl')
if m.get('singleTitle') is not None:
self.single_title = m.get('singleTitle')
if m.get('markdown') is not None:
self.markdown = m.get('markdown')
if m.get('title') is not None:
self.title = m.get('title')
self.button_list = []
if m.get('buttonList') is not None:
for k in m.get('buttonList'):
temp_model = SendOfficialAccountOTOMessageRequestDetailMessageBodyActionCardButtonList()
self.button_list.append(temp_model.from_map(k))
return self
class SendOfficialAccountOTOMessageRequestDetailMessageBody(TeaModel):
def __init__(
self,
text: SendOfficialAccountOTOMessageRequestDetailMessageBodyText = None,
markdown: SendOfficialAccountOTOMessageRequestDetailMessageBodyMarkdown = None,
link: SendOfficialAccountOTOMessageRequestDetailMessageBodyLink = None,
action_card: SendOfficialAccountOTOMessageRequestDetailMessageBodyActionCard = None,
):
# Text message body; required when the message type is text.
self.text = text
# Markdown message body; effective only when the message type is markdown.
self.markdown = markdown
# Link message body.
self.link = link
# Action card message body.
self.action_card = action_card
def validate(self):
if self.text:
self.text.validate()
if self.markdown:
self.markdown.validate()
if self.link:
self.link.validate()
if self.action_card:
self.action_card.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.text is not None:
result['text'] = self.text.to_map()
if self.markdown is not None:
result['markdown'] = self.markdown.to_map()
if self.link is not None:
result['link'] = self.link.to_map()
if self.action_card is not None:
result['actionCard'] = self.action_card.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('text') is not None:
temp_model = SendOfficialAccountOTOMessageRequestDetailMessageBodyText()
self.text = temp_model.from_map(m['text'])
if m.get('markdown') is not None:
temp_model = SendOfficialAccountOTOMessageRequestDetailMessageBodyMarkdown()
self.markdown = temp_model.from_map(m['markdown'])
if m.get('link') is not None:
temp_model = SendOfficialAccountOTOMessageRequestDetailMessageBodyLink()
self.link = temp_model.from_map(m['link'])
if m.get('actionCard') is not None:
temp_model = SendOfficialAccountOTOMessageRequestDetailMessageBodyActionCard()
self.action_card = temp_model.from_map(m['actionCard'])
return self
class SendOfficialAccountOTOMessageRequestDetail(TeaModel):
def __init__(
self,
msg_type: str = None,
uuid: str = None,
user_id: str = None,
message_body: SendOfficialAccountOTOMessageRequestDetailMessageBody = None,
):
# Message type.
self.msg_type = msg_type
# Unique request ID.
self.uuid = uuid
# User ID of the message recipient.
self.user_id = user_id
# Message body.
self.message_body = message_body
def validate(self):
if self.message_body:
self.message_body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.msg_type is not None:
result['msgType'] = self.msg_type
if self.uuid is not None:
result['uuid'] = self.uuid
if self.user_id is not None:
result['userId'] = self.user_id
if self.message_body is not None:
result['messageBody'] = self.message_body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('msgType') is not None:
self.msg_type = m.get('msgType')
if m.get('uuid') is not None:
self.uuid = m.get('uuid')
if m.get('userId') is not None:
self.user_id = m.get('userId')
if m.get('messageBody') is not None:
temp_model = SendOfficialAccountOTOMessageRequestDetailMessageBody()
self.message_body = temp_model.from_map(m['messageBody'])
return self
class SendOfficialAccountOTOMessageRequest(TeaModel):
def __init__(
self,
detail: SendOfficialAccountOTOMessageRequestDetail = None,
biz_id: str = None,
ding_token_grant_type: int = None,
ding_isv_org_id: int = None,
ding_org_id: int = None,
ding_suite_key: str = None,
account_id: str = None,
):
# Message detail.
self.detail = detail
# API call identifier; optional.
self.biz_id = biz_id
self.ding_token_grant_type = ding_token_grant_type
self.ding_isv_org_id = ding_isv_org_id
self.ding_org_id = ding_org_id
self.ding_suite_key = ding_suite_key
# Official (service window) account ID.
self.account_id = account_id
def validate(self):
if self.detail:
self.detail.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.detail is not None:
result['detail'] = self.detail.to_map()
if self.biz_id is not None:
result['bizId'] = self.biz_id
if self.ding_token_grant_type is not None:
result['dingTokenGrantType'] = self.ding_token_grant_type
if self.ding_isv_org_id is not None:
result['dingIsvOrgId'] = self.ding_isv_org_id
if self.ding_org_id is not None:
result['dingOrgId'] = self.ding_org_id
if self.ding_suite_key is not None:
result['dingSuiteKey'] = self.ding_suite_key
if self.account_id is not None:
result['accountId'] = self.account_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('detail') is not None:
temp_model = SendOfficialAccountOTOMessageRequestDetail()
self.detail = temp_model.from_map(m['detail'])
if m.get('bizId') is not None:
self.biz_id = m.get('bizId')
if m.get('dingTokenGrantType') is not None:
self.ding_token_grant_type = m.get('dingTokenGrantType')
if m.get('dingIsvOrgId') is not None:
self.ding_isv_org_id = m.get('dingIsvOrgId')
if m.get('dingOrgId') is not None:
self.ding_org_id = m.get('dingOrgId')
if m.get('dingSuiteKey') is not None:
self.ding_suite_key = m.get('dingSuiteKey')
if m.get('accountId') is not None:
self.account_id = m.get('accountId')
return self
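# Illustrative usage (not part of the generated SDK): building a text message
# request from the models above and serializing it to the camelCase wire
# format. The IDs shown here are hypothetical placeholders.
#
#     body = SendOfficialAccountOTOMessageRequestDetailMessageBody(
#         text=SendOfficialAccountOTOMessageRequestDetailMessageBodyText(content='Hello'),
#     )
#     detail = SendOfficialAccountOTOMessageRequestDetail(
#         msg_type='text', user_id='user_123', message_body=body,
#     )
#     request = SendOfficialAccountOTOMessageRequest(detail=detail, account_id='account_001')
#     request.validate()
#     payload = request.to_map()  # nested dict with camelCase keys,
#                                 # e.g. payload['detail']['msgType'] == 'text'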
class SendOfficialAccountOTOMessageResponseBodyResult(TeaModel):
def __init__(
self,
open_push_id: str = None,
):
# Push ID.
self.open_push_id = open_push_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.open_push_id is not None:
result['openPushId'] = self.open_push_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('openPushId') is not None:
self.open_push_id = m.get('openPushId')
return self
class SendOfficialAccountOTOMessageResponseBody(TeaModel):
def __init__(
self,
request_id: str = None,
result: SendOfficialAccountOTOMessageResponseBodyResult = None,
):
# ID of the request.
self.request_id = request_id
# Push result.
self.result = result
def validate(self):
if self.result:
self.result.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.request_id is not None:
result['requestId'] = self.request_id
if self.result is not None:
result['result'] = self.result.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('requestId') is not None:
self.request_id = m.get('requestId')
if m.get('result') is not None:
temp_model = SendOfficialAccountOTOMessageResponseBodyResult()
self.result = temp_model.from_map(m['result'])
return self
class SendOfficialAccountOTOMessageResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: SendOfficialAccountOTOMessageResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = SendOfficialAccountOTOMessageResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetOfficialAccountOTOMessageResultHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetOfficialAccountOTOMessageResultRequest(TeaModel):
def __init__(
self,
open_push_id: str = None,
account_id: str = None,
):
# Push ID.
self.open_push_id = open_push_id
self.account_id = account_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.open_push_id is not None:
result['openPushId'] = self.open_push_id
if self.account_id is not None:
result['accountId'] = self.account_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('openPushId') is not None:
self.open_push_id = m.get('openPushId')
if m.get('accountId') is not None:
self.account_id = m.get('accountId')
return self
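# Illustrative round trip (not part of the generated SDK): to_map() and
# from_map() are symmetric, so a request can be rebuilt from its serialized
# form. The IDs are hypothetical placeholders.
#
#     req = GetOfficialAccountOTOMessageResultRequest(
#         open_push_id='push_123', account_id='account_001',
#     )
#     rebuilt = GetOfficialAccountOTOMessageResultRequest().from_map(req.to_map())
#     # rebuilt.open_push_id == 'push_123' and rebuilt.account_id == 'account_001'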
class GetOfficialAccountOTOMessageResultResponseBodyResult(TeaModel):
def __init__(
self,
status: int = None,
read_user_id_list: List[str] = None,
):
# Execution status: 0 = not started, 1 = processing, 2 = finished.
self.status = status
# List of user IDs that have read the message.
self.read_user_id_list = read_user_id_list
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.status is not None:
result['status'] = self.status
if self.read_user_id_list is not None:
result['readUserIdList'] = self.read_user_id_list
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('status') is not None:
self.status = m.get('status')
if m.get('readUserIdList') is not None:
self.read_user_id_list = m.get('readUserIdList')
return self
class GetOfficialAccountOTOMessageResultResponseBody(TeaModel):
def __init__(
self,
request_id: str = None,
result: GetOfficialAccountOTOMessageResultResponseBodyResult = None,
):
# ID of the request.
self.request_id = request_id
# Query result.
self.result = result
def validate(self):
if self.result:
self.result.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.request_id is not None:
result['requestId'] = self.request_id
if self.result is not None:
result['result'] = self.result.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('requestId') is not None:
self.request_id = m.get('requestId')
if m.get('result') is not None:
temp_model = GetOfficialAccountOTOMessageResultResponseBodyResult()
self.result = temp_model.from_map(m['result'])
return self
class GetOfficialAccountOTOMessageResultResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetOfficialAccountOTOMessageResultResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetOfficialAccountOTOMessageResultResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class AddCrmPersonalCustomerHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class AddCrmPersonalCustomerRequestPermission(TeaModel):
def __init__(
self,
owner_staff_ids: List[str] = None,
participant_staff_ids: List[str] = None,
):
# User IDs of the owners.
self.owner_staff_ids = owner_staff_ids
# User IDs of the collaborators.
self.participant_staff_ids = participant_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
return self
class AddCrmPersonalCustomerRequest(TeaModel):
def __init__(
self,
creator_user_id: str = None,
creator_nick: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
permission: AddCrmPersonalCustomerRequestPermission = None,
skip_duplicate_check: bool = None,
):
# User ID of the record creator.
self.creator_user_id = creator_user_id
# Nickname of the record creator.
self.creator_nick = creator_nick
# Data content.
self.data = data
# Extended data content.
self.extend_data = extend_data
# Permission settings.
self.permission = permission
# Whether to skip the unique-key (uk) duplicate check.
self.skip_duplicate_check = skip_duplicate_check
def validate(self):
if self.permission:
self.permission.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.creator_user_id is not None:
result['creatorUserId'] = self.creator_user_id
if self.creator_nick is not None:
result['creatorNick'] = self.creator_nick
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
if self.permission is not None:
result['permission'] = self.permission.to_map()
if self.skip_duplicate_check is not None:
result['skipDuplicateCheck'] = self.skip_duplicate_check
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('creatorUserId') is not None:
self.creator_user_id = m.get('creatorUserId')
if m.get('creatorNick') is not None:
self.creator_nick = m.get('creatorNick')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
if m.get('permission') is not None:
temp_model = AddCrmPersonalCustomerRequestPermission()
self.permission = temp_model.from_map(m['permission'])
if m.get('skipDuplicateCheck') is not None:
self.skip_duplicate_check = m.get('skipDuplicateCheck')
return self
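# Illustrative usage (not part of the generated SDK): creating a personal
# customer record with owner permissions. The user IDs and field names inside
# `data` are hypothetical placeholders; the actual field names come from the
# object metadata (see DescribeCrmPersonalCustomerObjectMeta below).
#
#     permission = AddCrmPersonalCustomerRequestPermission(
#         owner_staff_ids=['staff_001'],
#     )
#     request = AddCrmPersonalCustomerRequest(
#         creator_user_id='staff_001',
#         data={'customer_name': 'Acme'},
#         permission=permission,
#         skip_duplicate_check=False,
#     )
#     request.validate()
#     payload = request.to_map()  # e.g. payload['permission']['ownerStaffIds']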
class AddCrmPersonalCustomerResponseBody(TeaModel):
def __init__(
self,
instance_id: str = None,
):
# Instance ID of the customer record.
self.instance_id = instance_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.instance_id is not None:
result['instanceId'] = self.instance_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
return self
class AddCrmPersonalCustomerResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: AddCrmPersonalCustomerResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = AddCrmPersonalCustomerResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class RecallOfficialAccountOTOMessageHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class RecallOfficialAccountOTOMessageRequest(TeaModel):
def __init__(
self,
ding_suite_key: str = None,
ding_org_id: int = None,
ding_isv_org_id: int = None,
ding_token_grant_type: int = None,
account_id: str = None,
open_push_id: str = None,
):
self.ding_suite_key = ding_suite_key
self.ding_org_id = ding_org_id
self.ding_isv_org_id = ding_isv_org_id
self.ding_token_grant_type = ding_token_grant_type
# Account ID; may be empty.
self.account_id = account_id
# Push ID returned when the message was sent.
self.open_push_id = open_push_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_suite_key is not None:
result['dingSuiteKey'] = self.ding_suite_key
if self.ding_org_id is not None:
result['dingOrgId'] = self.ding_org_id
if self.ding_isv_org_id is not None:
result['dingIsvOrgId'] = self.ding_isv_org_id
if self.ding_token_grant_type is not None:
result['dingTokenGrantType'] = self.ding_token_grant_type
if self.account_id is not None:
result['accountId'] = self.account_id
if self.open_push_id is not None:
result['openPushId'] = self.open_push_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingSuiteKey') is not None:
self.ding_suite_key = m.get('dingSuiteKey')
if m.get('dingOrgId') is not None:
self.ding_org_id = m.get('dingOrgId')
if m.get('dingIsvOrgId') is not None:
self.ding_isv_org_id = m.get('dingIsvOrgId')
if m.get('dingTokenGrantType') is not None:
self.ding_token_grant_type = m.get('dingTokenGrantType')
if m.get('accountId') is not None:
self.account_id = m.get('accountId')
if m.get('openPushId') is not None:
self.open_push_id = m.get('openPushId')
return self
class RecallOfficialAccountOTOMessageResponseBody(TeaModel):
def __init__(
self,
request_id: str = None,
):
# ID of the request.
self.request_id = request_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.request_id is not None:
result['requestId'] = self.request_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('requestId') is not None:
self.request_id = m.get('requestId')
return self
class RecallOfficialAccountOTOMessageResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: RecallOfficialAccountOTOMessageResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = RecallOfficialAccountOTOMessageResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class DescribeCrmPersonalCustomerObjectMetaHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsSelectOptions(TeaModel):
def __init__(
self,
key: str = None,
value: str = None,
):
self.key = key
self.value = value
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.key is not None:
result['key'] = self.key
if self.value is not None:
result['value'] = self.value
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('key') is not None:
self.key = m.get('key')
if m.get('value') is not None:
self.value = m.get('value')
return self
class DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsReferenceFieldsSelectOptions(TeaModel):
def __init__(
self,
key: str = None,
value: str = None,
):
self.key = key
self.value = value
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.key is not None:
result['key'] = self.key
if self.value is not None:
result['value'] = self.value
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('key') is not None:
self.key = m.get('key')
if m.get('value') is not None:
self.value = m.get('value')
return self
class DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsReferenceFields(TeaModel):
def __init__(
self,
label: str = None,
type: str = None,
nillable: bool = None,
unit: str = None,
format: str = None,
select_options: List[DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsReferenceFieldsSelectOptions] = None,
name: str = None,
):
self.label = label
self.type = type
self.nillable = nillable
self.unit = unit
self.format = format
self.select_options = select_options
self.name = name
def validate(self):
if self.select_options:
for k in self.select_options:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.label is not None:
result['label'] = self.label
if self.type is not None:
result['type'] = self.type
if self.nillable is not None:
result['nillable'] = self.nillable
if self.unit is not None:
result['unit'] = self.unit
if self.format is not None:
result['format'] = self.format
result['selectOptions'] = []
if self.select_options is not None:
for k in self.select_options:
result['selectOptions'].append(k.to_map() if k else None)
if self.name is not None:
result['name'] = self.name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('label') is not None:
self.label = m.get('label')
if m.get('type') is not None:
self.type = m.get('type')
if m.get('nillable') is not None:
self.nillable = m.get('nillable')
if m.get('unit') is not None:
self.unit = m.get('unit')
if m.get('format') is not None:
self.format = m.get('format')
self.select_options = []
if m.get('selectOptions') is not None:
for k in m.get('selectOptions'):
temp_model = DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsReferenceFieldsSelectOptions()
self.select_options.append(temp_model.from_map(k))
if m.get('name') is not None:
self.name = m.get('name')
return self
class DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsRollUpSummaryFields(TeaModel):
def __init__(
self,
name: str = None,
aggregator: str = None,
):
self.name = name
self.aggregator = aggregator
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.name is not None:
result['name'] = self.name
if self.aggregator is not None:
result['aggregator'] = self.aggregator
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('name') is not None:
self.name = m.get('name')
if m.get('aggregator') is not None:
self.aggregator = m.get('aggregator')
return self
class DescribeCrmPersonalCustomerObjectMetaResponseBodyFields(TeaModel):
def __init__(
self,
name: str = None,
customized: bool = None,
label: str = None,
type: str = None,
nillable: bool = None,
format: str = None,
unit: str = None,
select_options: List[DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsSelectOptions] = None,
quote: bool = None,
reference_to: str = None,
reference_fields: List[DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsReferenceFields] = None,
roll_up_summary_fields: List[DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsRollUpSummaryFields] = None,
):
self.name = name
self.customized = customized
self.label = label
self.type = type
self.nillable = nillable
self.format = format
self.unit = unit
self.select_options = select_options
self.quote = quote
self.reference_to = reference_to
self.reference_fields = reference_fields
self.roll_up_summary_fields = roll_up_summary_fields
def validate(self):
if self.select_options:
for k in self.select_options:
if k:
k.validate()
if self.reference_fields:
for k in self.reference_fields:
if k:
k.validate()
if self.roll_up_summary_fields:
for k in self.roll_up_summary_fields:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.name is not None:
result['name'] = self.name
if self.customized is not None:
result['customized'] = self.customized
if self.label is not None:
result['label'] = self.label
if self.type is not None:
result['type'] = self.type
if self.nillable is not None:
result['nillable'] = self.nillable
if self.format is not None:
result['format'] = self.format
if self.unit is not None:
result['unit'] = self.unit
result['selectOptions'] = []
if self.select_options is not None:
for k in self.select_options:
result['selectOptions'].append(k.to_map() if k else None)
if self.quote is not None:
result['quote'] = self.quote
if self.reference_to is not None:
result['referenceTo'] = self.reference_to
result['referenceFields'] = []
if self.reference_fields is not None:
for k in self.reference_fields:
result['referenceFields'].append(k.to_map() if k else None)
result['rollUpSummaryFields'] = []
if self.roll_up_summary_fields is not None:
for k in self.roll_up_summary_fields:
result['rollUpSummaryFields'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('name') is not None:
self.name = m.get('name')
if m.get('customized') is not None:
self.customized = m.get('customized')
if m.get('label') is not None:
self.label = m.get('label')
if m.get('type') is not None:
self.type = m.get('type')
if m.get('nillable') is not None:
self.nillable = m.get('nillable')
if m.get('format') is not None:
self.format = m.get('format')
if m.get('unit') is not None:
self.unit = m.get('unit')
self.select_options = []
if m.get('selectOptions') is not None:
for k in m.get('selectOptions'):
temp_model = DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsSelectOptions()
self.select_options.append(temp_model.from_map(k))
if m.get('quote') is not None:
self.quote = m.get('quote')
if m.get('referenceTo') is not None:
self.reference_to = m.get('referenceTo')
self.reference_fields = []
if m.get('referenceFields') is not None:
for k in m.get('referenceFields'):
temp_model = DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsReferenceFields()
self.reference_fields.append(temp_model.from_map(k))
self.roll_up_summary_fields = []
if m.get('rollUpSummaryFields') is not None:
for k in m.get('rollUpSummaryFields'):
temp_model = DescribeCrmPersonalCustomerObjectMetaResponseBodyFieldsRollUpSummaryFields()
self.roll_up_summary_fields.append(temp_model.from_map(k))
return self
class DescribeCrmPersonalCustomerObjectMetaResponseBody(TeaModel):
def __init__(
self,
name: str = None,
customized: bool = None,
fields: List[DescribeCrmPersonalCustomerObjectMetaResponseBodyFields] = None,
):
# Object name.
self.name = name
# Whether this is a custom object.
self.customized = customized
# Field list.
self.fields = fields
def validate(self):
if self.fields:
for k in self.fields:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.name is not None:
result['name'] = self.name
if self.customized is not None:
result['customized'] = self.customized
result['fields'] = []
if self.fields is not None:
for k in self.fields:
result['fields'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('name') is not None:
self.name = m.get('name')
if m.get('customized') is not None:
self.customized = m.get('customized')
self.fields = []
if m.get('fields') is not None:
for k in m.get('fields'):
temp_model = DescribeCrmPersonalCustomerObjectMetaResponseBodyFields()
self.fields.append(temp_model.from_map(k))
return self
class DescribeCrmPersonalCustomerObjectMetaResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: DescribeCrmPersonalCustomerObjectMetaResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = DescribeCrmPersonalCustomerObjectMetaResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class DeleteCrmPersonalCustomerHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class DeleteCrmPersonalCustomerRequest(TeaModel):
def __init__(
self,
current_operator_user_id: str = None,
):
# User ID of the operator
self.current_operator_user_id = current_operator_user_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.current_operator_user_id is not None:
result['currentOperatorUserId'] = self.current_operator_user_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('currentOperatorUserId') is not None:
self.current_operator_user_id = m.get('currentOperatorUserId')
return self
class DeleteCrmPersonalCustomerResponseBody(TeaModel):
def __init__(
self,
instance_id: str = None,
):
# Customer data instance ID
self.instance_id = instance_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.instance_id is not None:
result['instanceId'] = self.instance_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
return self
class DeleteCrmPersonalCustomerResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: DeleteCrmPersonalCustomerResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = DeleteCrmPersonalCustomerResponseBody()
self.body = temp_model.from_map(m['body'])
return self
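# The models above all follow one to_map()/from_map() convention:
# camelCase keys on the wire, snake_case attributes in Python, unset
# fields skipped. An illustrative, self-contained sketch (MiniModel is a
# hypothetical stand-in, not the real Tea base class) of the round-trip:

```python
class MiniModel:
    def __init__(self, instance_id=None):
        self.instance_id = instance_id

    def to_map(self):
        # Serialize to the wire-format dict, skipping unset fields.
        result = {}
        if self.instance_id is not None:
            result['instanceId'] = self.instance_id
        return result

    def from_map(self, m=None):
        # Populate snake_case attributes from a camelCase dict.
        m = m or {}
        if m.get('instanceId') is not None:
            self.instance_id = m.get('instanceId')
        return self


model = MiniModel().from_map({'instanceId': 'cust_001'})
assert model.to_map() == {'instanceId': 'cust_001'}
```

# Because from_map() returns self, hydration chains into a one-liner, as
# the generated `temp_model.from_map(m['body'])` calls above rely on.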
class UpdateCrmPersonalCustomerHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class UpdateCrmPersonalCustomerRequestPermission(TeaModel):
def __init__(
self,
owner_staff_ids: List[str] = None,
participant_staff_ids: List[str] = None,
):
self.owner_staff_ids = owner_staff_ids
self.participant_staff_ids = participant_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
return self
class UpdateCrmPersonalCustomerRequest(TeaModel):
def __init__(
self,
instance_id: str = None,
modifier_user_id: str = None,
modifier_nick: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
permission: UpdateCrmPersonalCustomerRequestPermission = None,
skip_duplicate_check: bool = None,
):
self.instance_id = instance_id
self.modifier_user_id = modifier_user_id
self.modifier_nick = modifier_nick
self.data = data
self.extend_data = extend_data
self.permission = permission
# Skip unique-key (uk) duplicate check
self.skip_duplicate_check = skip_duplicate_check
def validate(self):
if self.permission:
self.permission.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.instance_id is not None:
result['instanceId'] = self.instance_id
if self.modifier_user_id is not None:
result['modifierUserId'] = self.modifier_user_id
if self.modifier_nick is not None:
result['modifierNick'] = self.modifier_nick
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
if self.permission is not None:
result['permission'] = self.permission.to_map()
if self.skip_duplicate_check is not None:
result['skipDuplicateCheck'] = self.skip_duplicate_check
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
if m.get('modifierUserId') is not None:
self.modifier_user_id = m.get('modifierUserId')
if m.get('modifierNick') is not None:
self.modifier_nick = m.get('modifierNick')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
if m.get('permission') is not None:
temp_model = UpdateCrmPersonalCustomerRequestPermission()
self.permission = temp_model.from_map(m['permission'])
if m.get('skipDuplicateCheck') is not None:
self.skip_duplicate_check = m.get('skipDuplicateCheck')
return self
class UpdateCrmPersonalCustomerResponseBody(TeaModel):
def __init__(
self,
instance_id: str = None,
):
# Customer data instance ID
self.instance_id = instance_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.instance_id is not None:
result['instanceId'] = self.instance_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
return self
class UpdateCrmPersonalCustomerResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: UpdateCrmPersonalCustomerResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = UpdateCrmPersonalCustomerResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class QueryCrmPersonalCustomerHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class QueryCrmPersonalCustomerRequest(TeaModel):
def __init__(
self,
current_operator_user_id: str = None,
next_token: str = None,
max_results: int = None,
query_dsl: str = None,
):
# User ID
self.current_operator_user_id = current_operator_user_id
# Pagination cursor
self.next_token = next_token
# Page size
self.max_results = max_results
# Query condition (DSL)
self.query_dsl = query_dsl
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.current_operator_user_id is not None:
result['currentOperatorUserId'] = self.current_operator_user_id
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.max_results is not None:
result['maxResults'] = self.max_results
if self.query_dsl is not None:
result['queryDsl'] = self.query_dsl
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('currentOperatorUserId') is not None:
self.current_operator_user_id = m.get('currentOperatorUserId')
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
if m.get('queryDsl') is not None:
self.query_dsl = m.get('queryDsl')
return self
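# QueryCrmPersonalCustomerRequest/ResponseBody expose nextToken/maxResults
# cursor pagination with a hasMore flag. A hedged sketch of the driving
# loop; fake_query is a made-up stand-in for the real API call, not part
# of this SDK:

```python
def fake_query(next_token, max_results):
    # Simulated backend returning one page per call.
    data = ['a', 'b', 'c', 'd', 'e']
    start = int(next_token or 0)
    page = data[start:start + max_results]
    return {
        'values': page,
        'hasMore': start + len(page) < len(data),
        'nextToken': str(start + len(page)),
    }


def fetch_all(max_results=2):
    # Keep requesting pages until the server reports no more data.
    token, out = None, []
    while True:
        body = fake_query(token, max_results)
        out.extend(body['values'])
        if not body['hasMore']:
            return out
        token = body['nextToken']


assert fetch_all() == ['a', 'b', 'c', 'd', 'e']
```

# The real client would build a QueryCrmPersonalCustomerRequest per page,
# carrying the previous response's next_token forward.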
class QueryCrmPersonalCustomerResponseBodyValuesPermission(TeaModel):
def __init__(
self,
owner_staff_ids: List[str] = None,
participant_staff_ids: List[str] = None,
):
# User IDs of the owners
self.owner_staff_ids = owner_staff_ids
# User IDs of the collaborators
self.participant_staff_ids = participant_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
return self
class QueryCrmPersonalCustomerResponseBodyValues(TeaModel):
def __init__(
self,
instance_id: str = None,
object_type: str = None,
creator_user_id: str = None,
creator_nick: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
permission: QueryCrmPersonalCustomerResponseBodyValuesPermission = None,
proc_out_result: str = None,
proc_inst_status: str = None,
gmt_create: str = None,
gmt_modified: str = None,
):
# Data instance ID
self.instance_id = instance_id
# Data object type
self.object_type = object_type
# User ID of the record creator
self.creator_user_id = creator_user_id
# Nickname of the record creator
self.creator_nick = creator_nick
# Form data
self.data = data
# Extended form data
self.extend_data = extend_data
# Data permission info
self.permission = permission
# Approval result
self.proc_out_result = proc_out_result
# Approval status
self.proc_inst_status = proc_inst_status
# Record creation time
self.gmt_create = gmt_create
# Record modification time
self.gmt_modified = gmt_modified
def validate(self):
if self.permission:
self.permission.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.instance_id is not None:
result['instanceId'] = self.instance_id
if self.object_type is not None:
result['objectType'] = self.object_type
if self.creator_user_id is not None:
result['creatorUserId'] = self.creator_user_id
if self.creator_nick is not None:
result['creatorNick'] = self.creator_nick
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
if self.permission is not None:
result['permission'] = self.permission.to_map()
if self.proc_out_result is not None:
result['procOutResult'] = self.proc_out_result
if self.proc_inst_status is not None:
result['procInstStatus'] = self.proc_inst_status
if self.gmt_create is not None:
result['gmtCreate'] = self.gmt_create
if self.gmt_modified is not None:
result['gmtModified'] = self.gmt_modified
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
if m.get('objectType') is not None:
self.object_type = m.get('objectType')
if m.get('creatorUserId') is not None:
self.creator_user_id = m.get('creatorUserId')
if m.get('creatorNick') is not None:
self.creator_nick = m.get('creatorNick')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
if m.get('permission') is not None:
temp_model = QueryCrmPersonalCustomerResponseBodyValuesPermission()
self.permission = temp_model.from_map(m['permission'])
if m.get('procOutResult') is not None:
self.proc_out_result = m.get('procOutResult')
if m.get('procInstStatus') is not None:
self.proc_inst_status = m.get('procInstStatus')
if m.get('gmtCreate') is not None:
self.gmt_create = m.get('gmtCreate')
if m.get('gmtModified') is not None:
self.gmt_modified = m.get('gmtModified')
return self
class QueryCrmPersonalCustomerResponseBody(TeaModel):
def __init__(
self,
values: List[QueryCrmPersonalCustomerResponseBodyValues] = None,
has_more: bool = None,
next_token: str = None,
max_results: int = None,
):
self.values = values
self.has_more = has_more
self.next_token = next_token
self.max_results = max_results
def validate(self):
if self.values:
for k in self.values:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['values'] = []
if self.values is not None:
for k in self.values:
result['values'].append(k.to_map() if k else None)
if self.has_more is not None:
result['hasMore'] = self.has_more
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.max_results is not None:
result['maxResults'] = self.max_results
return result
def from_map(self, m: dict = None):
m = m or dict()
self.values = []
if m.get('values') is not None:
for k in m.get('values'):
temp_model = QueryCrmPersonalCustomerResponseBodyValues()
self.values.append(temp_model.from_map(k))
if m.get('hasMore') is not None:
self.has_more = m.get('hasMore')
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
return self
class QueryCrmPersonalCustomerResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: QueryCrmPersonalCustomerResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = QueryCrmPersonalCustomerResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ListCrmPersonalCustomersHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ListCrmPersonalCustomersRequest(TeaModel):
def __init__(
self,
current_operator_user_id: str = None,
body: List[str] = None,
):
# User ID of the operator
self.current_operator_user_id = current_operator_user_id
# Customer data list
self.body = body
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.current_operator_user_id is not None:
result['currentOperatorUserId'] = self.current_operator_user_id
if self.body is not None:
result['body'] = self.body
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('currentOperatorUserId') is not None:
self.current_operator_user_id = m.get('currentOperatorUserId')
if m.get('body') is not None:
self.body = m.get('body')
return self
class ListCrmPersonalCustomersResponseBodyResultPermission(TeaModel):
def __init__(
self,
owner_staff_ids: List[str] = None,
participant_staff_ids: List[str] = None,
):
self.owner_staff_ids = owner_staff_ids
self.participant_staff_ids = participant_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
return self
class ListCrmPersonalCustomersResponseBodyResult(TeaModel):
def __init__(
self,
org_id: int = None,
instance_id: str = None,
object_type: str = None,
creator_user_id: str = None,
creator_nick: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
permission: ListCrmPersonalCustomersResponseBodyResultPermission = None,
app_uuid: str = None,
form_code: str = None,
proc_out_result: str = None,
proc_inst_status: str = None,
gmt_create: str = None,
gmt_modified: str = None,
):
self.org_id = org_id
self.instance_id = instance_id
self.object_type = object_type
self.creator_user_id = creator_user_id
self.creator_nick = creator_nick
self.data = data
self.extend_data = extend_data
self.permission = permission
self.app_uuid = app_uuid
self.form_code = form_code
self.proc_out_result = proc_out_result
self.proc_inst_status = proc_inst_status
self.gmt_create = gmt_create
self.gmt_modified = gmt_modified
def validate(self):
if self.permission:
self.permission.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.org_id is not None:
result['orgId'] = self.org_id
if self.instance_id is not None:
result['instanceId'] = self.instance_id
if self.object_type is not None:
result['objectType'] = self.object_type
if self.creator_user_id is not None:
result['creatorUserId'] = self.creator_user_id
if self.creator_nick is not None:
result['creatorNick'] = self.creator_nick
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
if self.permission is not None:
result['permission'] = self.permission.to_map()
if self.app_uuid is not None:
result['appUuid'] = self.app_uuid
if self.form_code is not None:
result['formCode'] = self.form_code
if self.proc_out_result is not None:
result['procOutResult'] = self.proc_out_result
if self.proc_inst_status is not None:
result['procInstStatus'] = self.proc_inst_status
if self.gmt_create is not None:
result['gmtCreate'] = self.gmt_create
if self.gmt_modified is not None:
result['gmtModified'] = self.gmt_modified
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('orgId') is not None:
self.org_id = m.get('orgId')
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
if m.get('objectType') is not None:
self.object_type = m.get('objectType')
if m.get('creatorUserId') is not None:
self.creator_user_id = m.get('creatorUserId')
if m.get('creatorNick') is not None:
self.creator_nick = m.get('creatorNick')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
if m.get('permission') is not None:
temp_model = ListCrmPersonalCustomersResponseBodyResultPermission()
self.permission = temp_model.from_map(m['permission'])
if m.get('appUuid') is not None:
self.app_uuid = m.get('appUuid')
if m.get('formCode') is not None:
self.form_code = m.get('formCode')
if m.get('procOutResult') is not None:
self.proc_out_result = m.get('procOutResult')
if m.get('procInstStatus') is not None:
self.proc_inst_status = m.get('procInstStatus')
if m.get('gmtCreate') is not None:
self.gmt_create = m.get('gmtCreate')
if m.get('gmtModified') is not None:
self.gmt_modified = m.get('gmtModified')
return self
class ListCrmPersonalCustomersResponseBody(TeaModel):
def __init__(
self,
result: List[ListCrmPersonalCustomersResponseBodyResult] = None,
):
self.result = result
def validate(self):
if self.result:
for k in self.result:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['result'] = []
if self.result is not None:
for k in self.result:
result['result'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
self.result = []
if m.get('result') is not None:
for k in m.get('result'):
temp_model = ListCrmPersonalCustomersResponseBodyResult()
self.result.append(temp_model.from_map(k))
return self
class ListCrmPersonalCustomersResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ListCrmPersonalCustomersResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ListCrmPersonalCustomersResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class CreateCustomerHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class CreateCustomerRequestContacts(TeaModel):
def __init__(
self,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
):
# Contact form data
self.data = data
# Contact extended data
self.extend_data = extend_data
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
return self
class CreateCustomerRequestPermission(TeaModel):
def __init__(
self,
owner_staff_ids: List[str] = None,
participant_staff_ids: List[str] = None,
):
# Owners
self.owner_staff_ids = owner_staff_ids
# Collaborators
self.participant_staff_ids = participant_staff_ids
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.owner_staff_ids is not None:
result['ownerStaffIds'] = self.owner_staff_ids
if self.participant_staff_ids is not None:
result['participantStaffIds'] = self.participant_staff_ids
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('ownerStaffIds') is not None:
self.owner_staff_ids = m.get('ownerStaffIds')
if m.get('participantStaffIds') is not None:
self.participant_staff_ids = m.get('participantStaffIds')
return self
class CreateCustomerRequestSaveOption(TeaModel):
def __init__(
self,
subscribe_policy: int = None,
throw_exception_while_saving_contact_failed: bool = None,
customer_existed_policy: str = None,
skip_duplicate_check: bool = None,
):
# Subscribe policy: 0 = no action, 1 = auto-subscribe (requires a separate whitelist application)
self.subscribe_policy = subscribe_policy
# Whether to abort when saving a contact fails
self.throw_exception_while_saving_contact_failed = throw_exception_while_saving_contact_failed
# Policy when the customer already exists: APPEND_CONTACT_FORCE appends the contacts directly; REJECT returns a failure
self.customer_existed_policy = customer_existed_policy
# Skip unique-key (uk) duplicate check
self.skip_duplicate_check = skip_duplicate_check
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.subscribe_policy is not None:
result['subscribePolicy'] = self.subscribe_policy
if self.throw_exception_while_saving_contact_failed is not None:
result['throwExceptionWhileSavingContactFailed'] = self.throw_exception_while_saving_contact_failed
if self.customer_existed_policy is not None:
result['customerExistedPolicy'] = self.customer_existed_policy
if self.skip_duplicate_check is not None:
result['skipDuplicateCheck'] = self.skip_duplicate_check
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('subscribePolicy') is not None:
self.subscribe_policy = m.get('subscribePolicy')
if m.get('throwExceptionWhileSavingContactFailed') is not None:
self.throw_exception_while_saving_contact_failed = m.get('throwExceptionWhileSavingContactFailed')
if m.get('customerExistedPolicy') is not None:
self.customer_existed_policy = m.get('customerExistedPolicy')
if m.get('skipDuplicateCheck') is not None:
self.skip_duplicate_check = m.get('skipDuplicateCheck')
return self
class CreateCustomerRequest(TeaModel):
def __init__(
self,
object_type: str = None,
instance_id: str = None,
creator_user_id: str = None,
data: Dict[str, Any] = None,
extend_data: Dict[str, Any] = None,
contacts: List[CreateCustomerRequestContacts] = None,
permission: CreateCustomerRequestPermission = None,
save_option: CreateCustomerRequestSaveOption = None,
):
# Customer object type: crm_customer_personal for personal customers; crm_customer for corporate customers
self.object_type = object_type
# When the customer already exists, pass its instanceId here so the new contacts are bound to it
self.instance_id = instance_id
# userId of the creator
self.creator_user_id = creator_user_id
# Customer instance data (form data)
self.data = data
# Customer instance extended data
self.extend_data = extend_data
# Associated contact data
self.contacts = contacts
# Permissions
self.permission = permission
# Save options
self.save_option = save_option
def validate(self):
if self.contacts:
for k in self.contacts:
if k:
k.validate()
if self.permission:
self.permission.validate()
if self.save_option:
self.save_option.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.object_type is not None:
result['objectType'] = self.object_type
if self.instance_id is not None:
result['instanceId'] = self.instance_id
if self.creator_user_id is not None:
result['creatorUserId'] = self.creator_user_id
if self.data is not None:
result['data'] = self.data
if self.extend_data is not None:
result['extendData'] = self.extend_data
result['contacts'] = []
if self.contacts is not None:
for k in self.contacts:
result['contacts'].append(k.to_map() if k else None)
if self.permission is not None:
result['permission'] = self.permission.to_map()
if self.save_option is not None:
result['saveOption'] = self.save_option.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('objectType') is not None:
self.object_type = m.get('objectType')
if m.get('instanceId') is not None:
self.instance_id = m.get('instanceId')
if m.get('creatorUserId') is not None:
self.creator_user_id = m.get('creatorUserId')
if m.get('data') is not None:
self.data = m.get('data')
if m.get('extendData') is not None:
self.extend_data = m.get('extendData')
self.contacts = []
if m.get('contacts') is not None:
for k in m.get('contacts'):
temp_model = CreateCustomerRequestContacts()
self.contacts.append(temp_model.from_map(k))
if m.get('permission') is not None:
temp_model = CreateCustomerRequestPermission()
self.permission = temp_model.from_map(m['permission'])
if m.get('saveOption') is not None:
temp_model = CreateCustomerRequestSaveOption()
self.save_option = temp_model.from_map(m['saveOption'])
return self
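# CreateCustomerRequest.from_map above hydrates nested structures
# recursively: each dict in a list field becomes its own sub-model. A
# self-contained sketch of that pattern; MiniContact/MiniRequest are
# illustrative stand-ins, not the SDK types:

```python
class MiniContact:
    def __init__(self):
        self.data = None

    def from_map(self, m):
        self.data = (m or {}).get('data')
        return self


class MiniRequest:
    def __init__(self):
        self.contacts = []

    def from_map(self, m=None):
        m = m or {}
        # Reset, then rebuild the list so repeated calls don't accumulate.
        self.contacts = []
        if m.get('contacts') is not None:
            for k in m.get('contacts'):
                # Each dict in the list is hydrated into its own sub-model.
                self.contacts.append(MiniContact().from_map(k))
        return self


req = MiniRequest().from_map({'contacts': [{'data': {'name': 'Ada'}}]})
assert req.contacts[0].data == {'name': 'Ada'}
```

# The generated code uses the same shape for permission and saveOption,
# just with a single sub-model instead of a list.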
class CreateCustomerResponseBodyContacts(TeaModel):
def __init__(
self,
contact_instance_id: str = None,
):
# Contact instance ID
self.contact_instance_id = contact_instance_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.contact_instance_id is not None:
result['contactInstanceId'] = self.contact_instance_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('contactInstanceId') is not None:
self.contact_instance_id = m.get('contactInstanceId')
return self
class CreateCustomerResponseBody(TeaModel):
def __init__(
self,
customer_instance_id: str = None,
object_type: str = None,
contacts: List[CreateCustomerResponseBodyContacts] = None,
):
# Customer instance ID
self.customer_instance_id = customer_instance_id
# Saved customer object type
self.object_type = object_type
# Contact save results
self.contacts = contacts
def validate(self):
if self.contacts:
for k in self.contacts:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.customer_instance_id is not None:
result['customerInstanceId'] = self.customer_instance_id
if self.object_type is not None:
result['objectType'] = self.object_type
result['contacts'] = []
if self.contacts is not None:
for k in self.contacts:
result['contacts'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('customerInstanceId') is not None:
self.customer_instance_id = m.get('customerInstanceId')
if m.get('objectType') is not None:
self.object_type = m.get('objectType')
self.contacts = []
if m.get('contacts') is not None:
for k in m.get('contacts'):
temp_model = CreateCustomerResponseBodyContacts()
self.contacts.append(temp_model.from_map(k))
return self

class CreateCustomerResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: CreateCustomerResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = CreateCustomerResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self
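
# The generated classes above all follow the same to_map()/from_map()
# round-trip pattern. A hedged, self-contained sketch of that pattern
# follows; `_TeaModelStub` stands in for the real Tea SDK base class
# (only its to_map() fallthrough matters here), and the names are ours,
# not part of the SDK.

```python
class _TeaModelStub:
    def to_map(self):
        # The real base class may return a cached map; returning None
        # makes subclasses fall through to building `result` themselves.
        return None


class _ContactSketch(_TeaModelStub):
    def __init__(self, contact_instance_id: str = None):
        self.contact_instance_id = contact_instance_id

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.contact_instance_id is not None:
            # snake_case attribute maps to the camelCase wire key
            result['contactInstanceId'] = self.contact_instance_id
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('contactInstanceId') is not None:
            self.contact_instance_id = m.get('contactInstanceId')
        return self


# Round trip: model -> wire-format dict -> model.
wire = _ContactSketch(contact_instance_id='c-123').to_map()
restored = _ContactSketch().from_map(wire)
print(wire, restored.contact_instance_id)
```

The `None`-guards mean unset fields are simply omitted from the wire map rather than serialized as nulls.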

# === tests/strategies/test_dataresource.py (EMMC-ASBL/otelib, MIT) ===
"""Tests for `otelib.strategies.dataresource`."""
from typing import TYPE_CHECKING

import pytest

if TYPE_CHECKING:
    from typing import Callable, Union

    from tests.conftest import OTEResponse, ResourceType


def test_create(
    mock_ote_response: "OTEResponse",
    ids: "Callable[[Union[ResourceType, str]], str]",
    server_url: str,
) -> None:
    """Test `DataResource.create()`."""
    from otelib.strategies.dataresource import DataResource

    mock_ote_response(
        method="post",
        endpoint="/dataresource",
        return_json={"resource_id": ids("dataresource")},
    )

    data_resource = DataResource(server_url)
    assert data_resource.id is None
    data_resource.create(
        downloadUrl="https://filesamples.com/samples/code/json/sample2.json",
        mediaType="application/json",
    )
    assert data_resource.id


def test_create_fails(
    mock_ote_response: "OTEResponse",
    server_url: str,
) -> None:
    """Check `DataResource.create()` raises `ApiError` upon request failure."""
    from otelib.exceptions import ApiError
    from otelib.strategies.dataresource import DataResource

    mock_ote_response(
        method="post",
        endpoint="/dataresource",
        status_code=500,
        return_content=b"Internal Server Error",
    )

    data_resource = DataResource(server_url)
    assert data_resource.id is None
    with pytest.raises(ApiError, match="APIError"):
        # `session_id` has a wrong type, the request should fail.
        data_resource.create(
            downloadUrl="https://filesamples.com/samples/code/json/sample2.json",
            mediaType="application/json",
            session_id=123,
        )
    assert data_resource.id is None


def test_fetch(
    mock_ote_response: "OTEResponse",
    ids: "Callable[[Union[ResourceType, str]], str]",
    server_url: str,
    testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
    """Test `DataResource.fetch()`."""
    import json

    from otelib.strategies.dataresource import DataResource

    mock_ote_response(
        method="post",
        endpoint="/dataresource",
        return_json={"resource_id": ids("dataresource")},
    )

    mock_ote_response(
        method="get",
        endpoint=f"/dataresource/{ids('dataresource')}",
        return_json=testdata("dataresource"),
    )

    data_resource = DataResource(server_url)

    # We must first create the resource - getting a resource ID
    data_resource.create(
        downloadUrl="https://filesamples.com/samples/code/json/sample2.json",
        mediaType="application/json",
    )

    content = data_resource.fetch(session_id=None)
    assert json.loads(content) == testdata("dataresource")


def test_fetch_fails(
    mock_ote_response: "OTEResponse",
    ids: "Callable[[Union[ResourceType, str]], str]",
    server_url: str,
) -> None:
    """Check `DataResource.fetch()` raises `ApiError` upon request failure."""
    from otelib.exceptions import ApiError
    from otelib.strategies.dataresource import DataResource

    mock_ote_response(
        method="post",
        endpoint="/dataresource",
        return_json={"resource_id": ids("dataresource")},
    )

    mock_ote_response(
        method="get",
        endpoint=f"/dataresource/{ids('dataresource')}",
        status_code=500,
        return_content=b"Internal Server Error",
    )

    data_resource = DataResource(server_url)

    # We must first create the resource - getting a resource ID
    data_resource.create(
        downloadUrl="https://filesamples.com/samples/code/json/sample2.json",
        mediaType="application/json",
    )

    with pytest.raises(ApiError, match="APIError"):
        # `session_id` has a wrong type, the request should fail.
        data_resource.fetch(session_id=123)


def test_initialize(
    mock_ote_response: "OTEResponse",
    ids: "Callable[[Union[ResourceType, str]], str]",
    server_url: str,
) -> None:
    """Test `DataResource.initialize()`."""
    import json

    from otelib.strategies.dataresource import DataResource

    mock_ote_response(
        method="post",
        endpoint="/dataresource",
        return_json={"resource_id": ids("dataresource")},
    )

    mock_ote_response(
        method="post",
        endpoint=f"/dataresource/{ids('dataresource')}/initialize",
        return_json={},
    )

    data_resource = DataResource(server_url)

    # We must first create the resource - getting a resource ID
    data_resource.create(
        downloadUrl="https://filesamples.com/samples/code/json/sample2.json",
        mediaType="application/json",
    )

    content = data_resource.initialize(session_id=None)
    assert json.loads(content) == {}


def test_initialize_fails(
    mock_ote_response: "OTEResponse",
    ids: "Callable[[Union[ResourceType, str]], str]",
    server_url: str,
) -> None:
    """Check `DataResource.initialize()` raises `ApiError` upon request failure."""
    from otelib.exceptions import ApiError
    from otelib.strategies.dataresource import DataResource

    mock_ote_response(
        method="post",
        endpoint="/dataresource",
        return_json={"resource_id": ids("dataresource")},
    )

    mock_ote_response(
        method="post",
        endpoint=f"/dataresource/{ids('dataresource')}/initialize",
        status_code=500,
        return_content=b"Internal Server Error",
    )

    data_resource = DataResource(server_url)

    # We must first create the resource - getting a resource ID
    data_resource.create(
        downloadUrl="https://filesamples.com/samples/code/json/sample2.json",
        mediaType="application/json",
    )

    with pytest.raises(ApiError, match="APIError"):
        # `session_id` has a wrong type, the request should fail.
        data_resource.initialize(session_id=123)

# === homeworks/ilya_nilov/lesson10/test_lev01.py (tgrx/Z22, Apache-2.0) ===
from homeworks.ilya_nilov.lesson10.level01 import big_summa


def test():
    assert big_summa(1, 2, 3) == 6
    assert big_summa(i for i in range(10)) == 45

# === deepnet/lr_finder/__init__.py (rvk007/Monocular-Depth-Estimation, MIT) ===
from deepnet.lr_finder.lr_finder import LRFinder

# === creationdate/command_line.py (SageHourihan/creationdate, MIT) ===
import creationdate


def main():
    return creationdate.date()

# === tests/test_logging_config.py (zveronics/zveronics-python-server, MIT) ===
import pkg_resources


def test_logging_config():
    assert pkg_resources.resource_exists('zveronics', 'etc/logging.yaml')
    assert not pkg_resources.resource_isdir('zveronics', 'etc/logging.yaml')

# === Developer/ModelLibrary/ModelFunctionsTRISTAN.py (IMI-TRISTAN/FERRET, Apache-2.0) ===
"""This module contains functions that calculate the variation
of concentration or MR signal with time according to a tracer kinetic model.
"""
import MathsTools as tools
import ExceptionHandling as exceptionHandler
import numpy as np
from scipy.optimize import fsolve
from joblib import Parallel, delayed
import logging
logger = logging.getLogger(__name__)


def TRISTAN_Rat_Model_v2_0_4_7T(xData2DArray, Kbh, Khe,
                                constantsString):
    """This function contains the algorithm for calculating
    how the MR signal from a 3D scan varies with time using the
    TRISTAN Rat Model v2.0 at 4.7T.

    Input Parameters
    ----------------
    xData2DArray - time (sec) and spleen signal 1D arrays
        stacked into one 2D array.
    Khe - Hepatocyte Uptake Rate (mL/min/mL)
    Kbh - Biliary Efflux Rate (mL/min/mL)
    constantsString - String representation of a dictionary
        of constant name:value pairs used to convert concentrations
        predicted by this model to MR signal values.

    Returns
    -------
    Sl - list of calculated MR signals at each of the
        time points in array 'time'.
    """
    try:
        exceptionHandler.modelFunctionInfoLogger()
        t = xData2DArray[:, 0]
        Ss = xData2DArray[:, 1]

        TR = 0.0058
        baseline = 4
        FA = 20
        r1p = 6.4
        r1h = 7.6
        R10_s = 0.7458
        R10_l = 1.3203
        ve_s = 0.314
        ve_l = 0.230

        # Convert to concentrations.
        # n_jobs is set to 1 to turn off parallel processing,
        # because parallel processing caused a segmentation
        # fault in the compiled version of this application.
        # This is not a problem in the uncompiled script.
        R1_s = [Parallel(n_jobs=1)(delayed(fsolve)
                (tools.spgr3d_func, x0=0,
                 args=(FA, TR, R10_s, baseline, Ss[p]))
                for p in np.arange(0, len(t)))]
        R1_s = np.squeeze(R1_s)
        DR1_s = R1_s - R10_s

        Th = (1 - ve_l) / (Kbh / 60)
        DR1_l = (ve_l/ve_s)*DR1_s + (r1h/r1p)*((Khe/60)/ve_s)*Th*tools.expconv(Th, t, DR1_s, 'TRISTAN_Rat_Model_v2_0_4_7T')

        # Convert to signal
        c = np.cos(FA*np.pi/180)
        R1_l = R10_l + DR1_l
        E1 = np.exp(-TR*R1_l)
        Sl = (1-E1)/(1-c*E1)

        return Sl  # Returns tissue signal
    except ZeroDivisionError as zde:
        exceptionHandler.handleDivByZeroException(zde)
    except Exception as e:
        exceptionHandler.handleGeneralException(e)
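
# Both rat models convert the fitted liver relaxation rate back to
# signal with the spoiled gradient-echo (SPGR) steady-state equation
# S = (1 - E1) / (1 - cos(FA) * E1), with E1 = exp(-TR * R1). A
# standalone numeric sketch of that last step; the helper name and the
# example R1 values are illustrative only, not part of the module.

```python
import numpy as np


def spgr_signal(R1, TR=0.0058, FA_deg=20):
    """SPGR steady-state signal for longitudinal relaxation rate R1 (1/s)."""
    c = np.cos(np.deg2rad(FA_deg))
    E1 = np.exp(-TR * R1)
    return (1 - E1) / (1 - c * E1)


# At these settings the signal grows monotonically with R1,
# i.e. shorter T1 gives a brighter liver.
print(spgr_signal(1.0) < spgr_signal(2.0))  # → True
```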


def TRISTAN_Rat_Model_v2_0_7T(xData2DArray, Kbh, Khe,
                              constantsString):
    """This function contains the algorithm for calculating
    how the MR signal from a 3D scan varies with time using the
    TRISTAN Rat Model v2.0 at 7T.

    Input Parameters
    ----------------
    xData2DArray - time (sec) and spleen signal 1D arrays
        stacked into one 2D array.
    Khe - Hepatocyte Uptake Rate (mL/min/mL)
    Kbh - Biliary Efflux Rate (mL/min/mL)
    constantsString - String representation of a dictionary
        of constant name:value pairs used to convert concentrations
        predicted by this model to MR signal values.

    Returns
    -------
    Sl - list of calculated MR signals at each of the
        time points in array 'time'.
    """
    try:
        exceptionHandler.modelFunctionInfoLogger()
        t = xData2DArray[:, 0]
        Ss = xData2DArray[:, 1]

        TR = 0.0058
        baseline = 4
        FA = 20
        r1p = 6.4
        r1h = 7.6
        R10_s = 0.7458
        R10_l = 1.3203
        ve_s = 0.314
        ve_l = 0.230

        # Convert to concentrations.
        # n_jobs is set to 1 to turn off parallel processing,
        # because parallel processing caused a segmentation
        # fault in the compiled version of this application.
        # This is not a problem in the uncompiled script.
        R1_s = [Parallel(n_jobs=1)(delayed(fsolve)
                (tools.spgr3d_func, x0=0,
                 args=(FA, TR, R10_s, baseline, Ss[p]))
                for p in np.arange(0, len(t)))]
        R1_s = np.squeeze(R1_s)
        DR1_s = R1_s - R10_s

        Th = (1 - ve_l) / (Kbh / 60)
        DR1_l = (ve_l/ve_s)*DR1_s + (r1h/r1p)*((Khe/60)/ve_s)*Th*tools.expconv(Th, t, DR1_s, 'TRISTAN_Rat_Model_v2_0_7T')

        # Convert to signal
        c = np.cos(FA*np.pi/180)
        R1_l = R10_l + DR1_l
        E1 = np.exp(-TR*R1_l)
        Sl = (1-E1)/(1-c*E1)

        # Returns tissue signal relative to the baseline St/St_baseline
        return Sl
    except ZeroDivisionError as zde:
        exceptionHandler.handleDivByZeroException(zde)
    except Exception as e:
        exceptionHandler.handleGeneralException(e)

# === tensorflow/python/keras/engine/training_test.py (tensorflow, Apache-2.0) ===
# Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for training routines."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import unittest
import numpy as np
from tensorflow.python import keras
from tensorflow.python.data.ops import dataset_ops
from tensorflow.python.framework import ops
from tensorflow.python.framework import tensor_shape
from tensorflow.python.framework import test_util as tf_test_util
from tensorflow.python.keras import testing_utils
from tensorflow.python.keras.engine.training_utils import weighted_masked_objective
from tensorflow.python.keras.utils.generic_utils import slice_arrays
from tensorflow.python.ops import array_ops
from tensorflow.python.platform import test
from tensorflow.python.platform import tf_logging as logging
from tensorflow.python.training.rmsprop import RMSPropOptimizer
try:
  import scipy.sparse as scipy_sparse  # pylint: disable=g-import-not-at-top
except ImportError:
  scipy_sparse = None


class TrainingTest(test.TestCase):

  def test_fit_on_arrays(self):
    with self.test_session():
      a = keras.layers.Input(shape=(3,), name='input_a')
      b = keras.layers.Input(shape=(3,), name='input_b')

      dense = keras.layers.Dense(4, name='dense')
      c = dense(a)
      d = dense(b)
      e = keras.layers.Dropout(0.5, name='dropout')(c)

      model = keras.models.Model([a, b], [d, e])

      optimizer = 'rmsprop'
      loss = 'mse'
      loss_weights = [1., 0.5]
      metrics = ['mae']
      model.compile(optimizer, loss, metrics=metrics, loss_weights=loss_weights)

      input_a_np = np.random.random((10, 3))
      input_b_np = np.random.random((10, 3))

      output_d_np = np.random.random((10, 4))
      output_e_np = np.random.random((10, 4))

      # Test fit at different verbosity
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          epochs=1,
          batch_size=5,
          verbose=0)
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          epochs=1,
          batch_size=5,
          verbose=1)
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          epochs=2,
          batch_size=5,
          verbose=2)
      model.train_on_batch([input_a_np, input_b_np], [output_d_np, output_e_np])

      # Test model with input data as a list of lists
      model.fit(
          [np.ndarray.tolist(input_a_np), np.ndarray.tolist(input_b_np)],
          [output_d_np, output_e_np],
          epochs=2,
          batch_size=5,
          verbose=2)

      # Test with validation data
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          validation_data=([input_a_np, input_b_np], [output_d_np,
                                                      output_e_np]),
          epochs=1,
          batch_size=5,
          verbose=0)
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          validation_data=([input_a_np, input_b_np], [output_d_np,
                                                      output_e_np]),
          epochs=2,
          batch_size=5,
          verbose=1)
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          validation_data=([input_a_np, input_b_np], [output_d_np,
                                                      output_e_np]),
          epochs=2,
          batch_size=5,
          verbose=2)

      # Test with validation split
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          epochs=2,
          batch_size=5,
          verbose=0,
          validation_split=0.2)

      # Test with dictionary inputs
      model.fit(
          {
              'input_a': input_a_np,
              'input_b': input_b_np
          }, {
              'dense': output_d_np,
              'dropout': output_e_np
          },
          epochs=1,
          batch_size=5,
          verbose=0)
      model.fit(
          {
              'input_a': input_a_np,
              'input_b': input_b_np
          }, {
              'dense': output_d_np,
              'dropout': output_e_np
          },
          epochs=1,
          batch_size=5,
          verbose=1)
      model.fit(
          {
              'input_a': input_a_np,
              'input_b': input_b_np
          }, {
              'dense': output_d_np,
              'dropout': output_e_np
          },
          validation_data=({
              'input_a': input_a_np,
              'input_b': input_b_np
          }, {
              'dense': output_d_np,
              'dropout': output_e_np
          }),
          epochs=1,
          batch_size=5,
          verbose=0)
      model.train_on_batch({
          'input_a': input_a_np,
          'input_b': input_b_np
      }, {
          'dense': output_d_np,
          'dropout': output_e_np
      })

      # Test with lists for loss, metrics
      loss = ['mae', 'mse']
      metrics = ['acc', 'mae']
      model.compile(optimizer, loss, metrics=metrics)
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          epochs=1,
          batch_size=5,
          verbose=0)

      # Test with dictionaries for loss, metrics, loss weights
      loss = {'dense': 'mse', 'dropout': 'mae'}
      loss_weights = {'dense': 1., 'dropout': 0.5}
      metrics = {'dense': 'mse', 'dropout': 'mae'}
      model.compile(optimizer, loss, metrics=metrics, loss_weights=loss_weights)
      model.fit(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          epochs=1,
          batch_size=5,
          verbose=0)

      # Invalid use cases
      with self.assertRaises(ValueError):
        model.train_on_batch({'input_a': input_a_np},
                             [output_d_np, output_e_np])
      with self.assertRaises(AttributeError):
        model.fit(
            [input_a_np, input_b_np], [output_d_np, output_e_np],
            epochs=1,
            validation_data=([input_a_np, input_b_np], 0, 0),
            verbose=0)
      with self.assertRaises(ValueError):
        model.train_on_batch([input_a_np], [output_d_np, output_e_np])
      with self.assertRaises(AttributeError):
        model.train_on_batch(1, [output_d_np, output_e_np])
      with self.assertRaises(ValueError):
        model.train_on_batch(input_a_np, [output_d_np, output_e_np])
      with self.assertRaises(ValueError):
        bad_input = np.random.random((11, 3))
        model.train_on_batch([bad_input, input_b_np],
                             [output_d_np, output_e_np])
      with self.assertRaises(ValueError):
        bad_target = np.random.random((11, 4))
        model.train_on_batch([input_a_np, input_b_np],
                             [bad_target, output_e_np])

      # Build single-input model
      x = keras.layers.Input(shape=(3,), name='input_a')
      y = keras.layers.Dense(4)(x)
      model = keras.models.Model(x, y)
      model.compile(optimizer='rmsprop', loss='mse')
      # This will work
      model.fit([input_a_np], output_d_np, epochs=1)
      with self.assertRaises(ValueError):
        model.fit([input_a_np, input_a_np], output_d_np, epochs=1)

      # Test model on a list of floats
      input_a_np = np.random.random((10, 3))
      input_b_np = np.random.random((10, 4))

      model.fit([np.ndarray.tolist(input_a_np)],
                [np.ndarray.tolist(input_b_np)],
                epochs=2,
                batch_size=5,
                verbose=2)

  def test_evaluate_predict_on_arrays(self):
    with self.test_session():
      a = keras.layers.Input(shape=(3,), name='input_a')
      b = keras.layers.Input(shape=(3,), name='input_b')

      dense = keras.layers.Dense(4, name='dense')
      c = dense(a)
      d = dense(b)
      e = keras.layers.Dropout(0.5, name='dropout')(c)

      model = keras.models.Model([a, b], [d, e])

      optimizer = 'rmsprop'
      loss = 'mse'
      loss_weights = [1., 0.5]
      metrics = ['mae']
      model.compile(
          optimizer,
          loss,
          metrics=metrics,
          loss_weights=loss_weights,
          sample_weight_mode=None)

      input_a_np = np.random.random((10, 3))
      input_b_np = np.random.random((10, 3))

      output_d_np = np.random.random((10, 4))
      output_e_np = np.random.random((10, 4))

      # Test evaluate at different verbosity
      out = model.evaluate(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          batch_size=5,
          verbose=0)
      self.assertEqual(len(out), 5)
      out = model.evaluate(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          batch_size=5,
          verbose=1)
      self.assertEqual(len(out), 5)
      out = model.evaluate(
          [input_a_np, input_b_np], [output_d_np, output_e_np],
          batch_size=5,
          verbose=2)
      self.assertEqual(len(out), 5)
      out = model.test_on_batch([input_a_np, input_b_np],
                                [output_d_np, output_e_np])
      self.assertEqual(len(out), 5)

      # Test evaluate with dictionary inputs
      model.evaluate(
          {
              'input_a': input_a_np,
              'input_b': input_b_np
          }, {
              'dense': output_d_np,
              'dropout': output_e_np
          },
          batch_size=5,
          verbose=0)
      model.evaluate(
          {
              'input_a': input_a_np,
              'input_b': input_b_np
          }, {
              'dense': output_d_np,
              'dropout': output_e_np
          },
          batch_size=5,
          verbose=1)

      # Test predict
      out = model.predict([input_a_np, input_b_np], batch_size=5)
      self.assertEqual(len(out), 2)
      out = model.predict({'input_a': input_a_np, 'input_b': input_b_np})
      self.assertEqual(len(out), 2)
      out = model.predict_on_batch({
          'input_a': input_a_np,
          'input_b': input_b_np
      })
      self.assertEqual(len(out), 2)

  def test_invalid_loss_or_metrics(self):
    num_classes = 5
    train_samples = 1000
    test_samples = 1000
    input_dim = 5

    with self.test_session():
      model = keras.models.Sequential()
      model.add(keras.layers.Dense(10, input_shape=(input_dim,)))
      model.add(keras.layers.Activation('relu'))
      model.add(keras.layers.Dense(num_classes))
      model.add(keras.layers.Activation('softmax'))
      model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

      np.random.seed(1337)
      (x_train, y_train), (_, _) = testing_utils.get_test_data(
          train_samples=train_samples,
          test_samples=test_samples,
          input_shape=(input_dim,),
          num_classes=num_classes)

      with self.assertRaises(ValueError):
        model.fit(x_train, y_train)

      with self.assertRaises(ValueError):
        model.fit(x_train, np.concatenate([y_train, y_train], axis=-1))

      with self.assertRaises(TypeError):
        model.compile(loss='categorical_crossentropy',
                      optimizer='rmsprop',
                      metrics=set(0))

      with self.assertRaises(ValueError):
        model.compile(loss=None,
                      optimizer='rmsprop')

  def test_training_on_sparse_data_with_dense_placeholders(self):
    if scipy_sparse is None:
      return

    with self.test_session():
      test_inputs = [
          scipy_sparse.random(6, 3, density=0.25).tocsr() for _ in range(2)
      ]
      test_outputs = [
          scipy_sparse.random(6, i, density=0.25).tocsr() for i in range(3, 5)
      ]
      in1 = keras.layers.Input(shape=(3,))
      in2 = keras.layers.Input(shape=(3,))
      out1 = keras.layers.Dropout(0.5, name='dropout')(in1)
      out2 = keras.layers.Dense(4, name='dense_1')(in2)
      model = keras.Model([in1, in2], [out1, out2])

      model.predict(test_inputs, batch_size=2)
      model.compile('rmsprop', 'mse')
      model.fit(test_inputs, test_outputs,
                epochs=1, batch_size=2, validation_split=0.5)
      model.evaluate(test_inputs, test_outputs, batch_size=2)

  def test_that_trainable_disables_updates(self):
    val_a = np.random.random((10, 4))
    val_out = np.random.random((10, 4))

    with self.test_session():
      a = keras.layers.Input(shape=(4,))
      layer = keras.layers.BatchNormalization(input_shape=(4,))
      b = layer(a)
      model = keras.Model(a, b)

      model.trainable = False
      assert not model.updates

      model.compile('sgd', 'mse')
      assert not model.updates

      x1 = model.predict(val_a)
      model.train_on_batch(val_a, val_out)
      x2 = model.predict(val_a)
      self.assertAllClose(x1, x2, atol=1e-7)

      model.trainable = True
      model.compile('sgd', 'mse')
      assert model.updates

      model.train_on_batch(val_a, val_out)
      x2 = model.predict(val_a)
      assert np.abs(np.sum(x1 - x2)) > 1e-5

      layer.trainable = False
      model.compile('sgd', 'mse')
      assert not model.updates

      x1 = model.predict(val_a)
      model.train_on_batch(val_a, val_out)
      x2 = model.predict(val_a)
      self.assertAllClose(x1, x2, atol=1e-7)
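
# The LossWeightingTest cases below pair a `class_weight` dict with an
# equivalent per-sample `sample_weight` array built by hand. A minimal
# sketch of that expansion; the helper name is ours, not part of Keras.

```python
import numpy as np


def class_weight_to_sample_weight(y, class_weight):
  """Expand a {class_index: weight} mapping into per-sample weights."""
  return np.asarray([class_weight[int(label)] for label in y], dtype=float)


# Mirrors the tests: class 3 is weighted 2x, all other classes 1x.
y = np.array([0, 3, 3, 1])
weights = class_weight_to_sample_weight(y, {0: 1., 1: 1., 2: 1., 3: 2.})
print(weights.tolist())  # → [1.0, 2.0, 2.0, 1.0]
```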


class LossWeightingTest(test.TestCase):

  def test_class_weights(self):
    num_classes = 5
    batch_size = 5
    epochs = 5
    weighted_class = 3
    train_samples = 1000
    test_samples = 1000
    input_dim = 5

    with self.test_session():
      model = keras.models.Sequential()
      model.add(keras.layers.Dense(10, input_shape=(input_dim,)))
      model.add(keras.layers.Activation('relu'))
      model.add(keras.layers.Dense(num_classes))
      model.add(keras.layers.Activation('softmax'))
      model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

      np.random.seed(1337)
      (x_train, y_train), (x_test, y_test) = testing_utils.get_test_data(
          train_samples=train_samples,
          test_samples=test_samples,
          input_shape=(input_dim,),
          num_classes=num_classes)
      int_y_test = y_test.copy()
      int_y_train = y_train.copy()
      # convert class vectors to binary class matrices
      y_train = keras.utils.to_categorical(y_train, num_classes)
      y_test = keras.utils.to_categorical(y_test, num_classes)
      test_ids = np.where(int_y_test == np.array(weighted_class))[0]

      class_weight = dict([(i, 1.) for i in range(num_classes)])
      class_weight[weighted_class] = 2.

      sample_weight = np.ones((y_train.shape[0]))
      sample_weight[int_y_train == weighted_class] = 2.

      model.fit(
          x_train,
          y_train,
          batch_size=batch_size,
          epochs=epochs // 3,
          verbose=0,
          class_weight=class_weight,
          validation_data=(x_train, y_train, sample_weight))
      model.fit(
          x_train,
          y_train,
          batch_size=batch_size,
          epochs=epochs // 2,
          verbose=0,
          class_weight=class_weight)
      model.fit(
          x_train,
          y_train,
          batch_size=batch_size,
          epochs=epochs // 2,
          verbose=0,
          class_weight=class_weight,
          validation_split=0.1)

      model.train_on_batch(
          x_train[:batch_size], y_train[:batch_size], class_weight=class_weight)
      ref_score = model.evaluate(x_test, y_test, verbose=0)
      score = model.evaluate(
          x_test[test_ids, :], y_test[test_ids, :], verbose=0)
      self.assertLess(score, ref_score)

  def test_sample_weights(self):
    num_classes = 5
    batch_size = 5
    epochs = 5
    weighted_class = 3
    train_samples = 1000
    test_samples = 1000
    input_dim = 5

    with self.test_session():
      model = keras.models.Sequential()
      model.add(keras.layers.Dense(10, input_shape=(input_dim,)))
      model.add(keras.layers.Activation('relu'))
      model.add(keras.layers.Dense(num_classes))
      model.add(keras.layers.Activation('softmax'))
      model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

      np.random.seed(43)
      (x_train, y_train), (x_test, y_test) = testing_utils.get_test_data(
          train_samples=train_samples,
          test_samples=test_samples,
          input_shape=(input_dim,),
          num_classes=num_classes)
      int_y_test = y_test.copy()
      int_y_train = y_train.copy()
      # convert class vectors to binary class matrices
      y_train = keras.utils.to_categorical(y_train, num_classes)
      y_test = keras.utils.to_categorical(y_test, num_classes)
      test_ids = np.where(int_y_test == np.array(weighted_class))[0]

      class_weight = dict([(i, 1.) for i in range(num_classes)])
      class_weight[weighted_class] = 2.

      sample_weight = np.ones((y_train.shape[0]))
      sample_weight[int_y_train == weighted_class] = 2.

      model.fit(
          x_train,
          y_train,
          batch_size=batch_size,
          epochs=epochs // 3,
          verbose=0,
          sample_weight=sample_weight)
      model.fit(
          x_train,
          y_train,
          batch_size=batch_size,
          epochs=epochs // 3,
          verbose=0,
          sample_weight=sample_weight,
          validation_split=0.1)
      model.train_on_batch(
          x_train[:batch_size],
          y_train[:batch_size],
          sample_weight=sample_weight[:batch_size])
      model.test_on_batch(
          x_train[:batch_size],
          y_train[:batch_size],
          sample_weight=sample_weight[:batch_size])
      ref_score = model.evaluate(x_test, y_test, verbose=0)
      score = model.evaluate(
          x_test[test_ids, :], y_test[test_ids, :], verbose=0)
      self.assertLess(score, ref_score)
def test_temporal_sample_weights(self):
num_classes = 5
batch_size = 5
epochs = 5
weighted_class = 3
train_samples = 1000
test_samples = 1000
input_dim = 5
timesteps = 3
with self.test_session():
model = keras.models.Sequential()
model.add(
keras.layers.TimeDistributed(
keras.layers.Dense(num_classes),
input_shape=(timesteps, input_dim)))
model.add(keras.layers.Activation('softmax'))
np.random.seed(1337)
(x_train, y_train), (x_test, y_test) = testing_utils.get_test_data(
train_samples=train_samples,
test_samples=test_samples,
input_shape=(input_dim,),
num_classes=num_classes)
int_y_test = y_test.copy()
int_y_train = y_train.copy()
# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
test_ids = np.where(int_y_test == weighted_class)[0]
class_weight = {i: 1. for i in range(num_classes)}
class_weight[weighted_class] = 2.
sample_weight = np.ones((y_train.shape[0],))
sample_weight[int_y_train == weighted_class] = 2.
temporal_x_train = np.reshape(x_train, (len(x_train), 1,
x_train.shape[1]))
temporal_x_train = np.repeat(temporal_x_train, timesteps, axis=1)
temporal_x_test = np.reshape(x_test, (len(x_test), 1, x_test.shape[1]))
temporal_x_test = np.repeat(temporal_x_test, timesteps, axis=1)
temporal_y_train = np.reshape(y_train, (len(y_train), 1,
y_train.shape[1]))
temporal_y_train = np.repeat(temporal_y_train, timesteps, axis=1)
temporal_y_test = np.reshape(y_test, (len(y_test), 1, y_test.shape[1]))
temporal_y_test = np.repeat(temporal_y_test, timesteps, axis=1)
temporal_sample_weight = np.reshape(sample_weight, (len(sample_weight),
1))
temporal_sample_weight = np.repeat(
temporal_sample_weight, timesteps, axis=1)
model.compile(
loss='binary_crossentropy',
optimizer='rmsprop',
sample_weight_mode='temporal')
model.fit(
temporal_x_train,
temporal_y_train,
batch_size=batch_size,
epochs=epochs // 3,
verbose=0,
sample_weight=temporal_sample_weight)
model.fit(
temporal_x_train,
temporal_y_train,
batch_size=batch_size,
epochs=epochs // 3,
verbose=0,
sample_weight=temporal_sample_weight,
validation_split=0.1)
model.train_on_batch(
temporal_x_train[:batch_size],
temporal_y_train[:batch_size],
sample_weight=temporal_sample_weight[:batch_size])
model.test_on_batch(
temporal_x_train[:batch_size],
temporal_y_train[:batch_size],
sample_weight=temporal_sample_weight[:batch_size])
ref_score = model.evaluate(temporal_x_test, temporal_y_test, verbose=0)
score = model.evaluate(
temporal_x_test[test_ids], temporal_y_test[test_ids], verbose=0)
self.assertLess(score, ref_score)
def test_class_weight_invalid_use_case(self):
num_classes = 5
train_samples = 1000
test_samples = 1000
input_dim = 5
timesteps = 3
with self.test_session():
model = keras.models.Sequential()
model.add(
keras.layers.TimeDistributed(
keras.layers.Dense(num_classes),
input_shape=(timesteps, input_dim)))
model.add(keras.layers.Activation('softmax'))
model.compile(
loss='binary_crossentropy',
optimizer='rmsprop')
(x_train, y_train), _ = testing_utils.get_test_data(
train_samples=train_samples,
test_samples=test_samples,
input_shape=(input_dim,),
num_classes=num_classes)
# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, num_classes)
class_weight = {i: 1. for i in range(num_classes)}
del class_weight[1]
with self.assertRaises(ValueError):
model.fit(x_train, y_train,
epochs=0, verbose=0, class_weight=class_weight)
with self.assertRaises(ValueError):
model.compile(
loss='binary_crossentropy',
optimizer='rmsprop',
sample_weight_mode=[])
# Build multi-output model
x = keras.Input((3,))
y1 = keras.layers.Dense(4, name='1')(x)
y2 = keras.layers.Dense(4, name='2')(x)
model = keras.models.Model(x, [y1, y2])
model.compile(optimizer='rmsprop', loss='mse')
x_np = np.random.random((10, 3))
y_np = np.random.random((10, 4))
w_np = np.random.random((10,))
# This will work
model.fit(x_np, [y_np, y_np], epochs=1,
sample_weight={'1': w_np})
# These will not
with self.assertRaises(ValueError):
model.fit(x_np, [y_np, y_np], epochs=1,
sample_weight=[w_np])
with self.assertRaises(TypeError):
model.fit(x_np, [y_np, y_np], epochs=1,
sample_weight=w_np)
with self.assertRaises(ValueError):
bad_w_np = np.random.random((11,))
model.fit(x_np, [y_np, y_np], epochs=1,
sample_weight={'1': bad_w_np})
with self.assertRaises(ValueError):
bad_w_np = np.random.random((10, 2))
model.fit(x_np, [y_np, y_np], epochs=1,
sample_weight={'1': bad_w_np})
with self.assertRaises(ValueError):
bad_w_np = np.random.random((10, 2, 2))
model.fit(x_np, [y_np, y_np], epochs=1,
sample_weight={'1': bad_w_np})
class LossMaskingTest(test.TestCase):
def test_masking(self):
with self.test_session():
np.random.seed(1337)
x = np.array([[[1], [1]], [[0], [0]]])
model = keras.models.Sequential()
model.add(keras.layers.Masking(mask_value=0, input_shape=(2, 1)))
model.add(
keras.layers.TimeDistributed(
keras.layers.Dense(1, kernel_initializer='one')))
model.compile(loss='mse', optimizer='sgd')
y = np.array([[[1], [1]], [[1], [1]]])
loss = model.train_on_batch(x, y)
self.assertEqual(loss, 0)
def test_loss_masking(self):
with self.test_session():
weighted_loss = weighted_masked_objective(keras.losses.get('mae'))
shape = (3, 4, 2)
x = np.arange(24).reshape(shape)
y = 2 * x
# Normally the trailing 1 is added by standardize_weights
weights = np.ones((3,))
mask = np.ones((3, 4))
mask[1, 0] = 0
keras.backend.eval(
weighted_loss(
keras.backend.variable(x),
keras.backend.variable(y),
keras.backend.variable(weights), keras.backend.variable(mask)))
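# Illustrative sketch (not part of the original test suite): the masked,
# weighted reduction that `weighted_masked_objective` wraps around a base
# loss, approximated with plain NumPy. `demo_masked_mae` is a hypothetical
# name and only mirrors the general idea, not the exact implementation.
import numpy as np

def demo_masked_mae(y_true, y_pred, weights, mask):
  # Per-timestep absolute error, averaged over the feature axis.
  err = np.mean(np.abs(y_true - y_pred), axis=-1)  # shape (batch, timesteps)
  err = err * mask  # masked timesteps contribute nothing
  # Average each sample over its unmasked timesteps only.
  per_sample = err.sum(axis=-1) / np.maximum(mask.sum(axis=-1), 1.)
  # Weighted average over the batch.
  return np.sum(per_sample * weights) / np.sum(weights)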
class TestDynamicTrainability(test.TestCase):
def test_trainable_warning(self):
with self.test_session():
x = np.random.random((5, 3))
y = np.random.random((5, 2))
model = keras.models.Sequential()
model.add(keras.layers.Dense(2, input_dim=3))
model.trainable = False
model.compile('rmsprop', 'mse')
model.trainable = True
model.train_on_batch(x, y)
# NOTE: a bare assertRaises call returns a context manager and does not
# itself verify that a Warning was raised.
self.assertRaises(Warning)
def test_trainable_argument(self):
with self.test_session():
x = np.random.random((5, 3))
y = np.random.random((5, 2))
model = keras.models.Sequential()
model.add(keras.layers.Dense(2, input_dim=3, trainable=False))
model.compile('rmsprop', 'mse')
out = model.predict(x)
model.train_on_batch(x, y)
out_2 = model.predict(x)
self.assertAllClose(out, out_2)
# test with nesting
inputs = keras.layers.Input(shape=(3,))
output = model(inputs)
model = keras.models.Model(inputs, output)
model.compile('rmsprop', 'mse')
out = model.predict(x)
model.train_on_batch(x, y)
out_2 = model.predict(x)
self.assertAllClose(out, out_2)
def test_layer_trainability_switch(self):
with self.test_session():
# with constructor argument, in Sequential
model = keras.models.Sequential()
model.add(keras.layers.Dense(2, trainable=False, input_dim=1))
self.assertListEqual(model.trainable_weights, [])
# by setting the `trainable` argument, in Sequential
model = keras.models.Sequential()
layer = keras.layers.Dense(2, input_dim=1)
model.add(layer)
self.assertListEqual(model.trainable_weights, layer.trainable_weights)
layer.trainable = False
self.assertListEqual(model.trainable_weights, [])
# with constructor argument, in Model
x = keras.layers.Input(shape=(1,))
y = keras.layers.Dense(2, trainable=False)(x)
model = keras.models.Model(x, y)
self.assertListEqual(model.trainable_weights, [])
# by setting the `trainable` argument, in Model
x = keras.layers.Input(shape=(1,))
layer = keras.layers.Dense(2)
y = layer(x)
model = keras.models.Model(x, y)
self.assertListEqual(model.trainable_weights, layer.trainable_weights)
layer.trainable = False
self.assertListEqual(model.trainable_weights, [])
def test_model_trainability_switch(self):
with self.test_session():
# a non-trainable model has no trainable weights
x = keras.layers.Input(shape=(1,))
y = keras.layers.Dense(2)(x)
model = keras.models.Model(x, y)
model.trainable = False
self.assertListEqual(model.trainable_weights, [])
# same for Sequential
model = keras.models.Sequential()
model.add(keras.layers.Dense(2, input_dim=1))
model.trainable = False
self.assertListEqual(model.trainable_weights, [])
def test_nested_model_trainability(self):
with self.test_session():
# a Sequential inside a Model
inner_model = keras.models.Sequential()
inner_model.add(keras.layers.Dense(2, input_dim=1))
x = keras.layers.Input(shape=(1,))
y = inner_model(x)
outer_model = keras.models.Model(x, y)
self.assertListEqual(outer_model.trainable_weights,
inner_model.trainable_weights)
inner_model.trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
inner_model.trainable = True
inner_model.layers[-1].trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
# a Sequential inside a Sequential
inner_model = keras.models.Sequential()
inner_model.add(keras.layers.Dense(2, input_dim=1))
outer_model = keras.models.Sequential()
outer_model.add(inner_model)
self.assertListEqual(outer_model.trainable_weights,
inner_model.trainable_weights)
inner_model.trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
inner_model.trainable = True
inner_model.layers[-1].trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
# a Model inside a Model
x = keras.layers.Input(shape=(1,))
y = keras.layers.Dense(2)(x)
inner_model = keras.models.Model(x, y)
x = keras.layers.Input(shape=(1,))
y = inner_model(x)
outer_model = keras.models.Model(x, y)
self.assertListEqual(outer_model.trainable_weights,
inner_model.trainable_weights)
inner_model.trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
inner_model.trainable = True
inner_model.layers[-1].trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
# a Model inside a Sequential
x = keras.layers.Input(shape=(1,))
y = keras.layers.Dense(2)(x)
inner_model = keras.models.Model(x, y)
outer_model = keras.models.Sequential()
outer_model.add(inner_model)
self.assertListEqual(outer_model.trainable_weights,
inner_model.trainable_weights)
inner_model.trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
inner_model.trainable = True
inner_model.layers[-1].trainable = False
self.assertListEqual(outer_model.trainable_weights, [])
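# Illustrative sketch (not part of the original test suite) of the
# trainable-weight aggregation rule the assertions above rely on: a
# container exposes a layer's weights only while both the container and
# the layer are trainable. `DemoLayer` and `DemoContainer` are
# hypothetical stand-ins, not real Keras classes.
class DemoLayer(object):

  def __init__(self, weights):
    self.weights = list(weights)
    self.trainable = True

  @property
  def trainable_weights(self):
    return self.weights if self.trainable else []


class DemoContainer(object):

  def __init__(self, layers):
    self.layers = list(layers)
    self.trainable = True

  @property
  def trainable_weights(self):
    # A frozen container hides all nested weights; otherwise it
    # concatenates whatever each child currently exposes.
    if not self.trainable:
      return []
    weights = []
    for layer in self.layers:
      weights += layer.trainable_weights
    return weights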
class TestGeneratorMethods(test.TestCase):
@unittest.skipIf(
os.name == 'nt',
'use_multiprocessing=True does not work properly on Windows.')
def test_generator_methods(self):
arr_data = np.random.random((50, 2))
arr_labels = np.random.random((50,))
def custom_generator():
batch_size = 10
num_samples = 50
while True:
batch_index = np.random.randint(0, num_samples - batch_size)
start = batch_index
end = start + batch_size
x = arr_data[start: end]
y = arr_labels[start: end]
yield x, y
with self.test_session():
x = keras.Input((2,))
y = keras.layers.Dense(1)(x)
fn_model = keras.models.Model(x, y)
fn_model.compile(loss='mse', optimizer='sgd')
seq_model = keras.models.Sequential()
seq_model.add(keras.layers.Dense(1, input_shape=(2,)))
seq_model.compile(loss='mse', optimizer='sgd')
for model in [fn_model, seq_model]:
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
workers=4,
use_multiprocessing=True)
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
use_multiprocessing=False)
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
use_multiprocessing=False,
validation_data=custom_generator(),
validation_steps=10)
model.fit_generator(custom_generator(),
steps_per_epoch=5,
validation_data=custom_generator(),
validation_steps=1,
workers=0)
model.predict_generator(custom_generator(),
steps=5,
max_queue_size=10,
workers=2,
use_multiprocessing=True)
model.predict_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False)
model.predict_generator(custom_generator(),
steps=5,
max_queue_size=10,
workers=0)
model.evaluate_generator(custom_generator(),
steps=5,
max_queue_size=10,
workers=2,
verbose=1,
use_multiprocessing=True)
model.evaluate_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False)
model.evaluate_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False,
workers=0)
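# Illustrative sketch (not part of the original test suite): the infinite
# random-window batch generator pattern used by `custom_generator` above,
# written with the standard library only. `demo_batches` is a hypothetical
# name.
import random

def demo_batches(data, labels, batch_size):
  num_samples = len(data)
  while True:
    # Pick a random window start so that a full batch always fits.
    start = random.randrange(0, num_samples - batch_size)
    yield data[start:start + batch_size], labels[start:start + batch_size]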
def test_generator_methods_with_sample_weights(self):
arr_data = np.random.random((50, 2))
arr_labels = np.random.random((50,))
arr_sample_weights = np.random.random((50,))
def custom_generator():
batch_size = 10
num_samples = 50
while True:
batch_index = np.random.randint(0, num_samples - batch_size)
start = batch_index
end = start + batch_size
x = arr_data[start: end]
y = arr_labels[start: end]
w = arr_sample_weights[start: end]
yield x, y, w
with self.test_session():
model = keras.models.Sequential()
model.add(keras.layers.Dense(1, input_shape=(2,)))
model.compile(loss='mse', optimizer='sgd')
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
use_multiprocessing=False)
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
use_multiprocessing=False,
validation_data=custom_generator(),
validation_steps=10)
model.predict_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False)
model.evaluate_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False)
def test_generator_methods_invalid_use_case(self):
def custom_generator():
while True:
yield 0
with self.test_session():
model = keras.models.Sequential()
model.add(keras.layers.Dense(1, input_shape=(2,)))
model.compile(loss='mse', optimizer='sgd')
with self.assertRaises(ValueError):
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
use_multiprocessing=False)
with self.assertRaises(ValueError):
model.fit_generator(custom_generator(),
steps_per_epoch=5,
epochs=1,
verbose=1,
max_queue_size=10,
use_multiprocessing=False,
validation_data=custom_generator(),
validation_steps=10)
with self.assertRaises(AttributeError):
model.predict_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False)
with self.assertRaises(ValueError):
model.evaluate_generator(custom_generator(),
steps=5,
max_queue_size=10,
use_multiprocessing=False)
def test_training_with_sequences(self):
class DummySequence(keras.utils.Sequence):
def __getitem__(self, idx):
return np.zeros([10, 2]), np.ones([10])
def __len__(self):
return 10
arr_data = np.random.random((50, 2))
arr_labels = np.random.random((50,))
arr_sample_weights = np.random.random((50,))
def custom_generator():
batch_size = 10
num_samples = 50
while True:
batch_index = np.random.randint(0, num_samples - batch_size)
start = batch_index
end = start + batch_size
x = arr_data[start: end]
y = arr_labels[start: end]
w = arr_sample_weights[start: end]
yield x, y, w
with self.test_session():
model = keras.models.Sequential()
model.add(keras.layers.Dense(1, input_shape=(2,)))
model.compile(loss='mse', optimizer='sgd')
model.fit_generator(DummySequence(),
steps_per_epoch=10,
validation_data=custom_generator(),
validation_steps=1,
max_queue_size=10,
workers=0,
use_multiprocessing=True)
model.fit_generator(DummySequence(),
steps_per_epoch=10,
validation_data=custom_generator(),
validation_steps=1,
max_queue_size=10,
workers=0,
use_multiprocessing=False)
class TestTrainingUtils(test.TestCase):
def test_check_array_lengths(self):
keras.engine.training_utils.check_array_lengths(None, None, None)
a_np = np.random.random((4, 3, 3))
keras.engine.training_utils.check_array_lengths(a_np, a_np, a_np)
keras.engine.training_utils.check_array_lengths(
[a_np, a_np], [a_np, a_np], [a_np, a_np])
keras.engine.training_utils.check_array_lengths([None], [None], [None])
b_np = np.random.random((3, 4))
with self.assertRaises(ValueError):
keras.engine.training_utils.check_array_lengths([a_np], [b_np], None)
def test_slice_arrays(self):
input_a = np.random.random((10, 3))
slice_arrays(input_a, 0)
slice_arrays(None)
slice_arrays(input_a, 0, 1)
slice_arrays(input_a, stop=2)
input_a = [None, [1, 1], None, [1, 1]]
slice_arrays(input_a, 0)
slice_arrays(input_a, 0, 1)
slice_arrays(input_a, stop=2)
input_a = [None]
slice_arrays(input_a, 0)
slice_arrays(input_a, 0, 1)
slice_arrays(input_a, stop=2)
input_a = None
slice_arrays(input_a, 0)
slice_arrays(input_a, 0, 1)
slice_arrays(input_a, stop=2)
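# Illustrative sketch (not part of the original test suite): a simplified
# reimplementation of the slicing behaviour exercised above. The real
# helper is keras.engine.training_utils.slice_arrays; `demo_slice_arrays`
# is a hypothetical stand-in that handles only the cases in this test.
def demo_slice_arrays(arrays, start=None, stop=None):
  if arrays is None:
    return [None]
  if isinstance(arrays, list):
    # Slice each element, passing None entries through unchanged.
    return [None if x is None else x[start:stop] for x in arrays]
  return arrays[start:stop]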
class TestTrainingWithDataTensors(test.TestCase):
def test_training_and_eval_methods_on_symbolic_tensors_single_io(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = 'rmsprop'
loss = 'mse'
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics)
inputs = keras.backend.zeros(shape=(10, 3))
targets = keras.backend.zeros(shape=(10, 4))
model.fit(inputs, targets, epochs=1, steps_per_epoch=2, verbose=0)
model.evaluate(inputs, targets, steps=2, verbose=0)
model.predict(inputs, steps=2)
model.train_on_batch(inputs, targets)
model.test_on_batch(inputs, targets)
model.fit(inputs, targets,
epochs=1, steps_per_epoch=2, verbose=0,
validation_data=(inputs, targets), validation_steps=2)
# Test with dynamic shape
inputs = array_ops.placeholder_with_default(
np.zeros((2, 3)), shape=tensor_shape.TensorShape([None, 3]))
targets = array_ops.placeholder_with_default(
np.zeros((2, 4)), shape=tensor_shape.TensorShape([None, 4]))
self.assertEqual(inputs.shape[0].value, None)
model.fit(inputs, targets, epochs=1, steps_per_epoch=2, verbose=0)
model.evaluate(inputs, targets, steps=2, verbose=0)
model.predict(inputs, steps=2)
model.train_on_batch(inputs, targets)
model.test_on_batch(inputs, targets)
model.fit(inputs, targets,
epochs=1, steps_per_epoch=2, verbose=0,
validation_data=(inputs, targets), validation_steps=2)
def test_training_and_eval_methods_on_symbolic_tensors_multi_io(self):
with self.test_session():
a = keras.layers.Input(shape=(3,), name='input_a')
b = keras.layers.Input(shape=(3,), name='input_b')
dense = keras.layers.Dense(4, name='dense')
c = dense(a)
d = dense(b)
e = keras.layers.Dropout(0.5, name='dropout')(c)
model = keras.models.Model([a, b], [d, e])
optimizer = 'rmsprop'
loss = 'mse'
loss_weights = [1., 0.5]
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics, loss_weights=loss_weights)
input_a_tf = keras.backend.zeros(shape=(10, 3))
input_b_tf = keras.backend.zeros(shape=(10, 3))
output_d_tf = keras.backend.zeros(shape=(10, 4))
output_e_tf = keras.backend.zeros(shape=(10, 4))
model.fit(
[input_a_tf, input_b_tf], [output_d_tf, output_e_tf],
epochs=1,
steps_per_epoch=2,
verbose=0)
with self.assertRaisesRegexp(ValueError,
'should specify the `steps_per_epoch`'):
model.fit(
[input_a_tf, input_b_tf], [output_d_tf, output_e_tf],
epochs=1,
batch_size=5,
verbose=0)
model.train_on_batch([input_a_tf, input_b_tf], [output_d_tf, output_e_tf])
# Test with dictionary inputs
model.fit(
{'input_a': input_a_tf,
'input_b': input_b_tf},
{'dense': output_d_tf,
'dropout': output_e_tf},
epochs=1,
steps_per_epoch=2,
verbose=0)
model.fit(
{'input_a': input_a_tf,
'input_b': input_b_tf},
{'dense': output_d_tf,
'dropout': output_e_tf},
validation_data=({'input_a': input_a_tf,
'input_b': input_b_tf},
{'dense': output_d_tf,
'dropout': output_e_tf}),
epochs=1,
steps_per_epoch=2,
validation_steps=2,
verbose=0)
model.train_on_batch(
{'input_a': input_a_tf,
'input_b': input_b_tf},
{'dense': output_d_tf,
'dropout': output_e_tf})
# Test with validation data
model.fit(
[input_a_tf, input_b_tf], [output_d_tf, output_e_tf],
validation_data=([input_a_tf, input_b_tf],
[output_d_tf, output_e_tf]),
epochs=1,
steps_per_epoch=2,
validation_steps=2,
verbose=0)
# Test with validation split
with self.assertRaisesRegexp(ValueError,
'you cannot use `validation_split`'):
model.fit(
[input_a_tf, input_b_tf], [output_d_tf, output_e_tf],
epochs=2,
steps_per_epoch=2,
verbose=0,
validation_split=0.2,
validation_steps=2)
# Test evaluation / prediction methods
model.evaluate([input_a_tf, input_b_tf], [output_d_tf, output_e_tf],
steps=2, verbose=0)
model.predict([input_a_tf, input_b_tf], steps=2)
model.test_on_batch([input_a_tf, input_b_tf], [output_d_tf, output_e_tf])
def test_model_with_input_feed_tensor(self):
"""We test building a model with a TF variable as input.
We should be able to call fit, evaluate, predict,
by only passing them data for the placeholder inputs
in the model.
"""
with self.test_session():
input_a_np = np.random.random((10, 3))
input_b_np = np.random.random((10, 3))
output_a_np = np.random.random((10, 4))
output_b_np = np.random.random((10, 3))
a = keras.Input(
tensor=keras.backend.variables_module.Variable(input_a_np,
dtype='float32'))
b = keras.Input(shape=(3,), name='input_b')
a_2 = keras.layers.Dense(4, name='dense_1')(a)
dp = keras.layers.Dropout(0.5, name='dropout')
b_2 = dp(b)
model = keras.models.Model([a, b], [a_2, b_2])
model.summary()
optimizer = 'rmsprop'
loss = 'mse'
loss_weights = [1., 0.5]
model.compile(optimizer, loss, metrics=['mean_squared_error'],
loss_weights=loss_weights,
sample_weight_mode=None)
# test train_on_batch
out = model.train_on_batch(input_b_np,
[output_a_np, output_b_np])
out = model.train_on_batch({'input_b': input_b_np},
[output_a_np, output_b_np])
out = model.test_on_batch({'input_b': input_b_np},
[output_a_np, output_b_np])
out = model.predict_on_batch({'input_b': input_b_np})
# test fit
out = model.fit({'input_b': input_b_np},
[output_a_np, output_b_np], epochs=1, batch_size=10)
out = model.fit(input_b_np,
[output_a_np, output_b_np], epochs=1, batch_size=10)
# test evaluate
out = model.evaluate({'input_b': input_b_np},
[output_a_np, output_b_np], batch_size=10)
out = model.evaluate(input_b_np,
[output_a_np, output_b_np], batch_size=10)
# test predict
out = model.predict({'input_b': input_b_np}, batch_size=10)
out = model.predict(input_b_np, batch_size=10)
self.assertEqual(len(out), 2)
# Now test a model with a single input
# i.e. we don't pass any data to fit the model.
a = keras.Input(
tensor=keras.backend.variables_module.Variable(input_a_np,
dtype='float32'))
a_2 = keras.layers.Dense(4, name='dense_1')(a)
a_2 = keras.layers.Dropout(0.5, name='dropout')(a_2)
model = keras.models.Model(a, a_2)
model.summary()
optimizer = 'rmsprop'
loss = 'mse'
model.compile(optimizer, loss, metrics=['mean_squared_error'])
# test train_on_batch
out = model.train_on_batch(None,
output_a_np)
out = model.train_on_batch(None,
output_a_np)
out = model.test_on_batch(None,
output_a_np)
out = model.predict_on_batch(None)
out = model.train_on_batch([],
output_a_np)
out = model.train_on_batch({},
output_a_np)
# test fit
_ = model.fit(None, output_a_np, epochs=1, steps_per_epoch=3)
_ = model.fit(None, output_a_np, epochs=1, steps_per_epoch=3)
# test evaluate
_ = model.evaluate(None, output_a_np, steps=3)
_ = model.evaluate(None, output_a_np, steps=3)
# test predict
out = model.predict(None, steps=3)
out = model.predict(None, steps=3)
self.assertEqual(out.shape, (10 * 3, 4))
# Same, without learning phase
# i.e. we don't pass any data to fit the model.
a = keras.Input(
tensor=keras.backend.variables_module.Variable(input_a_np,
dtype='float32'))
a_2 = keras.layers.Dense(4, name='dense_1')(a)
model = keras.models.Model(a, a_2)
model.summary()
optimizer = 'rmsprop'
loss = 'mse'
model.compile(optimizer, loss, metrics=['mean_squared_error'])
# test train_on_batch
out = model.train_on_batch(None,
output_a_np)
out = model.train_on_batch(None,
output_a_np)
out = model.test_on_batch(None,
output_a_np)
out = model.predict_on_batch(None)
out = model.train_on_batch([],
output_a_np)
out = model.train_on_batch({},
output_a_np)
# test fit
_ = model.fit(None, output_a_np, epochs=1, steps_per_epoch=10)
_ = model.fit(None, output_a_np, epochs=1, steps_per_epoch=10)
# test evaluate
_ = model.evaluate(None, output_a_np, steps=10)
_ = model.evaluate(None, output_a_np, steps=10)
# test predict
out = model.predict(None, steps=3)
out = model.predict(None, steps=3)
self.assertEqual(out.shape, (10 * 3, 4))
def test_model_with_partial_loss(self):
with self.test_session():
a = keras.Input(shape=(3,), name='input_a')
a_2 = keras.layers.Dense(4, name='dense_1')(a)
dp = keras.layers.Dropout(0.5, name='dropout')
a_3 = dp(a_2)
model = keras.models.Model(a, [a_2, a_3])
optimizer = 'rmsprop'
loss = {'dropout': 'mse'}
model.compile(optimizer, loss, metrics=['mae'])
input_a_np = np.random.random((10, 3))
output_a_np = np.random.random((10, 4))
# test train_on_batch
_ = model.train_on_batch(input_a_np, output_a_np)
_ = model.test_on_batch(input_a_np, output_a_np)
# fit
_ = model.fit(input_a_np, [output_a_np])
# evaluate
_ = model.evaluate(input_a_np, [output_a_np])
# Same without dropout.
a = keras.Input(shape=(3,), name='input_a')
a_2 = keras.layers.Dense(4, name='dense_1')(a)
a_3 = keras.layers.Dense(4, name='dense_2')(a_2)
model = keras.models.Model(a, [a_2, a_3])
optimizer = 'rmsprop'
loss = {'dense_2': 'mse'}
model.compile(optimizer, loss, metrics={'dense_1': 'mae'})
# test train_on_batch
_ = model.train_on_batch(input_a_np, output_a_np)
_ = model.test_on_batch(input_a_np, output_a_np)
# fit
_ = model.fit(input_a_np, [output_a_np])
# evaluate
_ = model.evaluate(input_a_np, [output_a_np])
def test_model_with_external_loss(self):
with self.test_session():
# None loss, only regularization loss.
a = keras.Input(shape=(3,), name='input_a')
a_2 = keras.layers.Dense(4, name='dense_1',
kernel_regularizer='l1',
bias_regularizer='l2')(a)
dp = keras.layers.Dropout(0.5, name='dropout')
a_3 = dp(a_2)
model = keras.models.Model(a, [a_2, a_3])
optimizer = 'rmsprop'
loss = None
model.compile(optimizer, loss, metrics=['mae'])
input_a_np = np.random.random((10, 3))
# test train_on_batch
out = model.train_on_batch(input_a_np, None)
out = model.test_on_batch(input_a_np, None)
# fit
out = model.fit(input_a_np, None)
# evaluate
out = model.evaluate(input_a_np, None)
# No dropout, external loss.
a = keras.Input(shape=(3,), name='input_a')
a_2 = keras.layers.Dense(4, name='dense_1')(a)
a_3 = keras.layers.Dense(4, name='dense_2')(a)
model = keras.models.Model(a, [a_2, a_3])
model.add_loss(keras.backend.mean(a_3 + a_2))
optimizer = 'rmsprop'
loss = None
model.compile(optimizer, loss, metrics=['mae'])
# test train_on_batch
out = model.train_on_batch(input_a_np, None)
out = model.test_on_batch(input_a_np, None)
# fit
out = model.fit(input_a_np, None)
# evaluate
out = model.evaluate(input_a_np, None)
# Test model with no external data at all.
a = keras.Input(
tensor=keras.backend.variables_module.Variable(input_a_np,
dtype='float32'))
a_2 = keras.layers.Dense(4, name='dense_1')(a)
a_2 = keras.layers.Dropout(0.5, name='dropout')(a_2)
model = keras.models.Model(a, a_2)
model.add_loss(keras.backend.mean(a_2))
model.compile(optimizer='rmsprop',
loss=None,
metrics=['mean_squared_error'])
# test train_on_batch
out = model.train_on_batch(None, None)
out = model.test_on_batch(None, None)
out = model.predict_on_batch(None)
# test fit
with self.assertRaises(ValueError):
out = model.fit(None, None, epochs=1, batch_size=10)
out = model.fit(None, None, epochs=1, steps_per_epoch=1)
# test fit with validation data
with self.assertRaises(ValueError):
out = model.fit(None, None, epochs=1,
steps_per_epoch=None,
validation_steps=2)
out = model.fit(None, None, epochs=1,
steps_per_epoch=2,
validation_steps=2)
# test evaluate
with self.assertRaises(ValueError):
out = model.evaluate(None, None, batch_size=10)
out = model.evaluate(None, None, steps=3)
# test predict
with self.assertRaises(ValueError):
out = model.predict(None, batch_size=10)
out = model.predict(None, steps=3)
self.assertEqual(out.shape, (10 * 3, 4))
# Test multi-output model with no external data at all.
a = keras.Input(
tensor=keras.backend.variables_module.Variable(input_a_np,
dtype='float32'))
a_1 = keras.layers.Dense(4, name='dense_1')(a)
a_2 = keras.layers.Dropout(0.5, name='dropout')(a_1)
model = keras.models.Model(a, [a_1, a_2])
model.add_loss(keras.backend.mean(a_2))
model.compile(optimizer='rmsprop',
loss=None,
metrics=['mean_squared_error'])
# test train_on_batch
out = model.train_on_batch(None, None)
out = model.test_on_batch(None, None)
out = model.predict_on_batch(None)
# test fit
with self.assertRaises(ValueError):
out = model.fit(None, None, epochs=1, batch_size=10)
out = model.fit(None, None, epochs=1, steps_per_epoch=1)
# test fit with validation data
out = model.fit(None, None, epochs=1,
steps_per_epoch=2,
validation_steps=2)
# test evaluate
with self.assertRaises(ValueError):
out = model.evaluate(None, None, batch_size=10)
out = model.evaluate(None, None, steps=3)
# test predict
with self.assertRaises(ValueError):
out = model.predict(None, batch_size=10, verbose=1)
out = model.predict(None, steps=3)
self.assertEqual(len(out), 2)
self.assertEqual(out[0].shape, (10 * 3, 4))
self.assertEqual(out[1].shape, (10 * 3, 4))
def test_target_tensors(self):
with self.test_session():
# single-output, as list
model = keras.models.Sequential()
model.add(keras.layers.Dense(4, input_shape=(4,), name='dense'))
input_val = np.random.random((10, 4))
target_val = np.random.random((10, 4))
target = keras.backend.variable(target_val)
model.compile(optimizer='rmsprop', loss='mse', target_tensors=[target])
model.train_on_batch(input_val, None)
# single-output, as dict
model.compile(optimizer='rmsprop', loss='mse',
target_tensors={'dense': target})
model.train_on_batch(input_val, None)
# test invalid arguments
with self.assertRaises(TypeError):
model.compile(optimizer='rmsprop', loss='mse',
target_tensors=set())
with self.assertRaises(ValueError):
model.compile(optimizer='rmsprop', loss='mse',
target_tensors=[target, target])
with self.assertRaises(ValueError):
model.compile(optimizer='rmsprop', loss='mse',
target_tensors={'dense2': None})
with self.assertRaises(ValueError):
model.compile(optimizer='rmsprop', loss='mse',
target_tensors=[target])
model.train_on_batch(input_val, target_val)
# multi-output, as list
input_val = np.random.random((10, 4))
target_val_a = np.random.random((10, 4))
target_val_b = np.random.random((10, 4))
target_a = keras.backend.variable(target_val_a)
target_b = keras.backend.variable(target_val_b)
inputs = keras.layers.Input(shape=(4,))
output_a = keras.layers.Dense(4, name='dense_a')(inputs)
output_b = keras.layers.Dense(4, name='dense_b')(inputs)
model = keras.models.Model(inputs, [output_a, output_b])
model.compile(optimizer='rmsprop', loss='mse',
target_tensors=[target_a, target_b])
model.train_on_batch(input_val, None)
# multi-output, as dict
model.compile(optimizer='rmsprop', loss='mse',
target_tensors={'dense_a': target_a,
'dense_b': target_b})
model.train_on_batch(input_val, None)
# test with sample weights
model.compile(optimizer='rmsprop', loss='mse',
target_tensors=[target_a, target_b])
model.train_on_batch(input_val, None,
sample_weight={'dense_a': np.random.random((10,))})
def test_model_custom_target_tensors(self):
with self.test_session():
a = keras.Input(shape=(3,), name='input_a')
b = keras.Input(shape=(3,), name='input_b')
a_2 = keras.layers.Dense(4, name='dense_1')(a)
dp = keras.layers.Dropout(0.5, name='dropout')
b_2 = dp(b)
y = keras.backend.placeholder([10, 4], name='y')
y1 = keras.backend.placeholder([10, 3], name='y1')
y2 = keras.backend.placeholder([7, 5], name='y2')
model = keras.models.Model([a, b], [a_2, b_2])
optimizer = 'rmsprop'
loss = 'mse'
loss_weights = [1., 0.5]
# test list of target tensors
with self.assertRaises(ValueError):
model.compile(optimizer, loss, metrics=[], loss_weights=loss_weights,
sample_weight_mode=None, target_tensors=[y, y1, y2])
model.compile(optimizer, loss, metrics=[], loss_weights=loss_weights,
sample_weight_mode=None, target_tensors=[y, y1])
input_a_np = np.random.random((10, 3))
input_b_np = np.random.random((10, 3))
output_a_np = np.random.random((10, 4))
output_b_np = np.random.random((10, 3))
_ = model.train_on_batch([input_a_np, input_b_np],
[output_a_np, output_b_np],
{y: np.random.random((10, 4)),
y1: np.random.random((10, 3))})
# test dictionary of target_tensors
with self.assertRaises(ValueError):
model.compile(optimizer, loss,
metrics=[],
loss_weights=loss_weights,
sample_weight_mode=None,
target_tensors={'does_not_exist': y2})
# test dictionary of target_tensors
model.compile(optimizer, loss,
metrics=[],
loss_weights=loss_weights,
sample_weight_mode=None,
target_tensors={'dense_1': y, 'dropout': y1})
_ = model.train_on_batch([input_a_np, input_b_np],
[output_a_np, output_b_np],
{y: np.random.random((10, 4)),
y1: np.random.random((10, 3))})
# test with custom TF placeholder as target
pl_target_a = keras.backend.array_ops.placeholder('float32',
shape=(None, 4))
model.compile(optimizer='rmsprop', loss='mse',
target_tensors={'dense_1': pl_target_a})
model.train_on_batch([input_a_np, input_b_np],
[output_a_np, output_b_np])
@tf_test_util.run_in_graph_and_eager_modes
def test_metric_names_are_identical_in_graph_and_eager(self):
a = keras.layers.Input(shape=(3,), name='input_a')
b = keras.layers.Input(shape=(3,), name='input_b')
dense = keras.layers.Dense(4, name='dense')
c = dense(a)
d = dense(b)
e = keras.layers.Dropout(0.5, name='dropout')(c)
model = keras.models.Model([a, b], [d, e])
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
loss_weights = [1., 0.5]
metrics = ['mae', 'acc']
model.compile(optimizer, loss, metrics=metrics, loss_weights=loss_weights)
reference_metric_names = ['loss', 'dense_loss', 'dropout_loss',
'dense_mean_absolute_error',
'dense_acc',
'dropout_mean_absolute_error',
'dropout_acc']
self.assertEqual(reference_metric_names, model.metrics_names)
class TestTrainingWithDatasetIterators(test.TestCase):
@tf_test_util.run_in_graph_and_eager_modes
def test_training_and_eval_methods_on_iterators_single_io(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics)
inputs = np.zeros((10, 3))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(100)
dataset = dataset.batch(10)
iterator = dataset.make_one_shot_iterator()
model.fit(iterator, epochs=1, steps_per_epoch=2, verbose=1)
model.evaluate(iterator, steps=2, verbose=1)
model.predict(iterator, steps=2)
model.train_on_batch(iterator)
model.test_on_batch(iterator)
model.predict_on_batch(iterator)
# Test with validation data
model.fit(iterator,
epochs=1, steps_per_epoch=2, verbose=0,
validation_data=iterator, validation_steps=2)
# Test with validation split
with self.assertRaisesRegexp(
ValueError, '`validation_split` argument is not supported '
'when input `x` is a dataset or a dataset iterator'):
model.fit(iterator,
epochs=1, steps_per_epoch=2, verbose=0,
validation_split=0.5, validation_steps=2)
# Test with sample weight.
sample_weight = np.random.random((10,))
with self.assertRaisesRegexp(
ValueError, '`sample_weight` argument is not supported '
'when input `x` is a dataset or a dataset iterator'):
model.fit(
iterator,
epochs=1,
steps_per_epoch=2,
verbose=0,
sample_weight=sample_weight)
# Test invalid usage
with self.assertRaisesRegexp(ValueError,
'you should not specify a target'):
model.fit(iterator, iterator,
epochs=1, steps_per_epoch=2, verbose=0)
with self.assertRaisesRegexp(
ValueError, 'you should specify the `steps_per_epoch` argument'):
model.fit(iterator, epochs=1, verbose=0)
with self.assertRaisesRegexp(ValueError,
'you should specify the `steps` argument'):
model.evaluate(iterator, verbose=0)
with self.assertRaisesRegexp(ValueError,
'you should specify the `steps` argument'):
model.predict(iterator, verbose=0)
def test_get_next_op_created_once(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics)
inputs = np.zeros((10, 3))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(100)
dataset = dataset.batch(10)
iterator = dataset.make_one_shot_iterator()
model.fit(iterator, epochs=1, steps_per_epoch=2, verbose=1)
# Finalize graph to make sure we are not appending another iterator
# get_next op in the graph.
ops.get_default_graph().finalize()
model.fit(iterator, epochs=1, steps_per_epoch=2, verbose=1)
@tf_test_util.run_in_graph_and_eager_modes
def test_iterators_running_out_of_data(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics)
inputs = np.zeros((10, 3))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(2)
dataset = dataset.batch(10)
iterator = dataset.make_one_shot_iterator()
with test.mock.patch.object(logging, 'warning') as mock_log:
model.fit(iterator, epochs=1, steps_per_epoch=3, verbose=0)
self.assertRegexpMatches(
str(mock_log.call_args),
'dataset iterator ran out of data')
class TestTrainingWithDataset(test.TestCase):
def test_calling_model_on_same_dataset(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics)
inputs = np.zeros((10, 3))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(100)
dataset = dataset.batch(10)
# Call fit with validation data
model.fit(dataset, epochs=1, steps_per_epoch=2, verbose=0,
validation_data=dataset, validation_steps=2)
# Finalize the graph to make sure new ops aren't added when calling on the
# same dataset
ops.get_default_graph().finalize()
model.fit(dataset, epochs=1, steps_per_epoch=2, verbose=0,
validation_data=dataset, validation_steps=2)
@tf_test_util.run_in_graph_and_eager_modes
def test_training_and_eval_methods_on_dataset(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
metrics = ['mae']
model.compile(optimizer, loss, metrics=metrics)
inputs = np.zeros((10, 3))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(100)
dataset = dataset.batch(10)
model.fit(dataset, epochs=1, steps_per_epoch=2, verbose=1)
model.evaluate(dataset, steps=2, verbose=1)
model.predict(dataset, steps=2)
model.train_on_batch(dataset)
model.predict_on_batch(dataset)
# Test with validation data
model.fit(dataset, epochs=1, steps_per_epoch=2, verbose=0,
validation_data=dataset, validation_steps=2)
# Test with validation split
with self.assertRaisesRegexp(
ValueError, '`validation_split` argument is not supported '
'when input `x` is a dataset or a dataset iterator'):
model.fit(dataset,
epochs=1, steps_per_epoch=2, verbose=0,
validation_split=0.5, validation_steps=2)
# Test with sample weight.
sample_weight = np.random.random((10,))
with self.assertRaisesRegexp(
ValueError, '`sample_weight` argument is not supported '
'when input `x` is a dataset or a dataset iterator'):
model.fit(
dataset,
epochs=1,
steps_per_epoch=2,
verbose=0,
sample_weight=sample_weight)
# Test invalid usage
with self.assertRaisesRegexp(ValueError,
'you should not specify a target'):
model.fit(dataset, dataset,
epochs=1, steps_per_epoch=2, verbose=0)
with self.assertRaisesRegexp(
ValueError, 'you should specify the `steps_per_epoch` argument'):
model.fit(dataset, epochs=1, verbose=0)
with self.assertRaisesRegexp(ValueError,
'you should specify the `steps` argument'):
model.evaluate(dataset, verbose=0)
with self.assertRaisesRegexp(ValueError,
'you should specify the `steps` argument'):
model.predict(dataset, verbose=0)
def test_dataset_input_shape_validation(self):
with self.test_session():
x = keras.layers.Input(shape=(3,), name='input')
y = keras.layers.Dense(4, name='dense')(x)
model = keras.Model(x, y)
optimizer = RMSPropOptimizer(learning_rate=0.001)
loss = 'mse'
model.compile(optimizer, loss)
# User forgets to batch the dataset
inputs = np.zeros((10, 3))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(100)
with self.assertRaisesRegexp(ValueError,
'expected input to have 2 dimensions'):
model.train_on_batch(dataset)
# Wrong input shape
inputs = np.zeros((10, 5))
targets = np.zeros((10, 4))
dataset = dataset_ops.Dataset.from_tensor_slices((inputs, targets))
dataset = dataset.repeat(100)
dataset = dataset.batch(10)
with self.assertRaisesRegexp(ValueError,
'expected input to have shape'):
model.train_on_batch(dataset)
if __name__ == '__main__':
test.main()
# -*- coding: utf-8 -*-
#
# Copyright 2012 James Thornton (http://jamesthornton.com)
# BSD License (see LICENSE for details)
#
"""
Interface for interacting with a graph database through Rexster.
"""
# Python 3
import six
import sys
if sys.version > '3':
long = int
unicode = str
import datetime
import dateutil.parser
from numbers import Number
from . import utils
from .utils import get_logger, to_datetime, to_date
log = get_logger(__name__)
class Property(object):
"""
Abstract base class for database property types used in Models.
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
def __init__(self, fget=None, name=None, default=None, \
nullable=True, unique=False, indexed=False):
self.fget = fget
self.name = name
self.default = default
self.nullable = nullable
# These aren't implemented yet.
# TODO: unique creates an index
self.indexed = indexed
self.unique = unique
#self.constraint = constraint
def validate(self, key, value):
"""
Validates the Property value before saving it to the database.
:param key: Property key.
:type key: str
:param value: Property value.
:type value: object
:rtype: None
"""
# Do null checks first so you can ignore None values in check_datatype()
self._check_null(key, value)
self._check_datatype(key, value)
def _check_null(self,key,value):
# TODO: should this be checking that the value is True to catch empties?
if self.nullable is False and value is None:
log.error("Null Property Error: '%s' cannot be set to '%s'",
key, value)
raise ValueError
def _check_datatype(self, key, value):
if value is not None and isinstance(value, self.python_type) is False:
log.error("Type Error: '%s' is set to %s with type %s, but must be a %s.",
key, value, type(value), self.python_type)
raise TypeError
def convert_to_db(self, type_system, key, value):
"""
Converts a Property value from its Python type to its database representation.
:param type_system: TypeSystem object.
:type type_system: TypeSystem
:param key: Property key.
:type key: str
:param value: Property value.
:type value: object
:rtype: object
"""
value = self.to_db(type_system,value)
return value
def convert_to_python(self, type_system, key, value):
"""
Converts a Property value from its database representation to its Python type.
:param type_system: TypeSystem object.
:type type_system: TypeSystem
:param key: Property key.
:type key: str
:param value: Property value.
:type value: object
:rtype: object
"""
try:
value = self.to_python(type_system, value)
except Exception as e:
log.exception("Property Type Mismatch: '%s' with value '%s': %s",
key, value, e)
value = None
return value
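# The log-and-fall-back pattern in ``convert_to_python`` above can be
# sketched standalone. This is an illustration only, not the bulbs API;
# the ``convert_to_python`` function and logger name here are invented
# for the example:
#
# ```python
# import logging
#
# logging.basicConfig()
# log = logging.getLogger("property_sketch")
#
#
# def convert_to_python(to_python, key, value):
#     """Apply a converter; log and return None on any mismatch."""
#     try:
#         return to_python(value)
#     except Exception as exc:
#         # Mirrors the fallback above: a bad stored value yields None
#         # instead of aborting the whole object load.
#         log.exception("Property Type Mismatch: '%s' with value '%s': %s",
#                       key, value, exc)
#         return None
#
#
# print(convert_to_python(int, "age", "42"))       # 42
# print(convert_to_python(int, "age", "not-int"))  # None
# ```

import logging

logging.basicConfig()
_sketch_log = logging.getLogger("property_sketch")


def _sketch_convert_to_python(to_python, key, value):
    """Sketch (not bulbs API): apply a converter; log and return None on mismatch."""
    try:
        return to_python(value)
    except Exception as exc:
        # Mirrors the fallback above: a bad stored value yields None
        # instead of aborting the whole object load.
        _sketch_log.exception(
            "Property Type Mismatch: '%s' with value '%s': %s", key, value, exc
        )
        return None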
def coerce(self, key, value):
"""
Coerces a Property value to its Python type.
:param key: Property key.
:type key: str
:param value: Property value.
:type value: object
:rtype: object
"""
initial_datatype = type(value)
try:
value = self._coerce(value)
return value
except ValueError:
log.exception("'%s' is not a valid value for %s, must be %s.",
value, key, self.python_type)
raise
except AttributeError:
log.exception("Can't set attribute '%s' to value '%s with type %s'",
key, value, initial_datatype)
raise
def _coerce(self, value):
# overload coerce for special types like DateTime
return self.python_type(value)
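# The validate/coerce contract above -- null check first, then type check,
# with ``_coerce`` overloaded by subclasses -- can be sketched with a
# minimal standalone class. ``MiniProperty`` is invented for this
# illustration and is not part of the bulbs API:


class MiniProperty:
    """Sketch of Property: null check, type check, default coercion."""

    python_type = int

    def __init__(self, nullable=True):
        self.nullable = nullable

    def validate(self, key, value):
        # Null check first, so the type check can ignore None values.
        if self.nullable is False and value is None:
            raise ValueError("'%s' cannot be set to None" % key)
        if value is not None and not isinstance(value, self.python_type):
            raise TypeError("'%s' must be a %s" % (key, self.python_type))

    def _coerce(self, value):
        # Subclasses would overload this for special cases such as DateTime.
        return self.python_type(value)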
class String(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = unicode
def to_db(self,type_system,value):
return type_system.database.to_string(value)
def to_python(self,type_system,value):
return type_system.python.to_string(value)
def _coerce(self, value):
return utils.u(value)
class Integer(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = int
def to_db(self,type_system,value):
return type_system.database.to_integer(value)
def to_python(self,type_system,value):
return type_system.python.to_integer(value)
class Long(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = long
def to_db(self,type_system,value):
return type_system.database.to_long(value)
def to_python(self,type_system,value):
return type_system.python.to_long(value)
class Float(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = float
def to_db(self,type_system,value):
return type_system.database.to_float(value)
def to_python(self,type_system,value):
return type_system.python.to_float(value)
class Bool(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, bool, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = bool
def to_db(self,type_system,value):
return type_system.database.to_bool(value)
def to_python(self,type_system,value):
return type_system.python.to_bool(value)
class Null(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = None
def to_db(self,type_system,value):
return type_system.database.to_null(value)
def to_python(self,type_system,value):
return type_system.python.to_null(value)
class List(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = list
def to_db(self,type_system,value):
return type_system.database.to_list(value)
def to_python(self,type_system,value):
return type_system.python.to_list(value)
class Dictionary(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = dict
def to_db(self,type_system,value):
return type_system.database.to_dictionary(value)
def to_python(self,type_system,value):
return type_system.python.to_dictionary(value)
class Document(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = dict
def to_db(self,type_system,value):
return type_system.database.to_document(value)
def to_python(self,type_system,value):
return type_system.python.to_dictionary(value)
class DateTime(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = datetime.datetime
def to_db(self, type_system, value):
return type_system.database.to_datetime(value)
def to_python(self, type_system, value):
return type_system.python.to_datetime(value)
def is_valid(self, key, value):
# how do you assert it's UTC?
#Don't use assert except for sanity check during development
# (it gets turned to a no-op when you run with python -o), and
# don't raise the wrong kind of exception (such as, an AssertionError
# when a TypeError is clearly what you mean here).
#return type(value) is datetime.datetime
return isinstance(value, datetime.datetime)
def _coerce(self, value):
# Coerce user input to the Python type
# Overloaded from Property since this is a special case
# http://labix.org/python-dateutil#head-a23e8ae0a661d77b89dfb3476f85b26f0b30349c
# return dateutils.parse(value)
# Not using parse -- let the client code do that. Expect a UTC datetime object here.
# How are you going to handle asserts? It's easy with ints.
if isinstance(value, Number):
# catches unix timestamps
dt = to_datetime(value)
elif isinstance(value, datetime.datetime):
# value passed in was already in proper form
dt = value
else:
# Python 3 unicode/str catchall
dt = dateutil.parser.parse(value)
#if dt.tzinfo is None:
# tz = pytz.timezone('UTC')
# dt.replace(tzinfo = tz)
return dt
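# The dispatch in ``_coerce`` above -- Unix timestamp, datetime passthrough,
# or string parse -- can be sketched with the standard library alone. This
# is a simplified illustration, not the bulbs implementation: it uses
# ``datetime.datetime.fromisoformat`` in place of ``dateutil.parser.parse``,
# which accepts far fewer string formats.

import datetime as _dt_sketch
from numbers import Number as _Number_sketch


def _sketch_coerce_datetime(value):
    """Sketch: coerce a timestamp, datetime, or ISO string to a datetime."""
    if isinstance(value, _Number_sketch):
        # Catches Unix timestamps; interpreted as UTC here.
        return _dt_sketch.datetime.utcfromtimestamp(value)
    if isinstance(value, _dt_sketch.datetime):
        # Value passed in was already in the proper form.
        return value
    # str catchall; dateutil.parser.parse is far more forgiving than this.
    return _dt_sketch.datetime.fromisoformat(value)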
class Date(Property):
"""
:param fget: Method name that returns a calculated value. Defaults to None.
:type fget: str
:param name: Database property name. Defaults to the Property key.
:type name: str
:param default: Default property value. Defaults to None.
:type default: str, int, long, float, list, dict, or Callable
:param nullable: If True, the Property can be null. Defaults to True.
:type nullable: bool
:param indexed: If True, index the Property in the DB. Defaults to False.
:type indexed: bool
:ivar fget: Name of the method that gets the calculated Property value.
:ivar name: Database property name. Defaults to the Property key.
:ivar default: Default property value. Defaults to None.
:ivar nullable: If True, the Property can be null. Defaults to True.
:ivar indexed: If True, index the Property in the DB. Defaults to False.
.. note:: If no Properties have index=True, all Properties are indexed.
"""
#: Python type
python_type = datetime.date
def to_db(self, type_system, value):
return type_system.database.to_date(value)
def to_python(self, type_system, value):
return type_system.python.to_date(value)
def is_valid(self, key, value):
# how do you assert it's UTC?
#Don't use assert except for sanity check during development
# (it gets turned to a no-op when you run with python -o), and
# don't raise the wrong kind of exception (such as, an AssertionError
# when a TypeError is clearly what you mean here).
return isinstance(value, datetime.date)
def _coerce(self, value):
# Coerce user input to the Python type
# Overloaded from Property since this is a special case
# http://labix.org/python-dateutil#head-a23e8ae0a661d77b89dfb3476f85b26f0b30349c
# return dateutils.parse(value)
# Not using parse -- let the client code do that. Expect a UTC datetime object here.
# How are you going to handle asserts? It's easy with ints.
if isinstance(value, Number):
# catches unix timestamps
d = to_date(value)
elif isinstance(value, datetime.date):
# value passed in was already in proper form
d = value
else:
# Python 3 unicode/str catchall
d = dateutil.parser.parse(value).date()
return d
# uci_to_pgn_test.py
# Copyright 2015 Roger Marsh
# Licence: See LICENCE (BSD licence)
"""uci_to_pgn tests"""
import unittest
from pgn_read.core.constants import (
FEN_WHITE_KING,
FEN_WHITE_QUEEN,
FEN_WHITE_ROOK,
FEN_WHITE_BISHOP,
FEN_WHITE_KNIGHT,
FEN_WHITE_PAWN,
FEN_BLACK_KING,
FEN_BLACK_QUEEN,
FEN_BLACK_ROOK,
FEN_BLACK_BISHOP,
FEN_BLACK_KNIGHT,
FEN_BLACK_PAWN,
PGN_QUEEN,
PGN_ROOK,
PGN_BISHOP,
PGN_KNIGHT,
PGN_PAWN,
PGN_KING,
)
from ..uci_to_pgn import (
_PIECE_TO_PGN,
_PROMOTE,
_CASTLES,
_CASTLEKEY,
generate_pgn_for_uci_moves_in_position,
)
from ..constants import NOPIECE
class ModuleAssumptions(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test__assumptions(self):
msg = "Failure of this test invalidates all other tests"
self.assertEqual(NOPIECE, "", msg)
self.assertEqual(FEN_WHITE_KING, "K", msg)
self.assertEqual(FEN_WHITE_QUEEN, "Q", msg)
self.assertEqual(FEN_WHITE_ROOK, "R", msg)
self.assertEqual(FEN_WHITE_BISHOP, "B", msg)
self.assertEqual(FEN_WHITE_KNIGHT, "N", msg)
self.assertEqual(FEN_WHITE_PAWN, "P", msg)
self.assertEqual(FEN_BLACK_KING, "k", msg)
self.assertEqual(FEN_BLACK_QUEEN, "q", msg)
self.assertEqual(FEN_BLACK_ROOK, "r", msg)
self.assertEqual(FEN_BLACK_BISHOP, "b", msg)
self.assertEqual(FEN_BLACK_KNIGHT, "n", msg)
self.assertEqual(FEN_BLACK_PAWN, "p", msg)
self.assertEqual(PGN_KING, "K", msg)
self.assertEqual(PGN_QUEEN, "Q", msg)
self.assertEqual(PGN_ROOK, "R", msg)
self.assertEqual(PGN_BISHOP, "B", msg)
self.assertEqual(PGN_KNIGHT, "N", msg)
self.assertEqual(PGN_PAWN, "", msg)
class ModuleConstants(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test____constants(self):
self.assertEqual(
_PIECE_TO_PGN,
{
"K": "K",
"Q": "Q",
"R": "R",
"B": "B",
"N": "N",
"P": "",
"k": "K",
"q": "Q",
"r": "R",
"b": "B",
"n": "N",
"p": "",
},
)
self.assertEqual(
_PROMOTE,
{
"q": "=Q",
"r": "=R",
"b": "=B",
"n": "=N",
"": "",
},
)
self.assertEqual(
_CASTLES,
{"e1g1": "O-O", "e8g8": "O-O", "e1c1": "O-O-O", "e8c8": "O-O-O"},
)
self.assertEqual(
_CASTLEKEY, {"e1g1": "K", "e8g8": "k", "e1c1": "K", "e8c8": "k"}
)
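# The castling tables checked above are plain dict lookups: a UCI king
# move such as 'e1g1' maps to its PGN form, and _CASTLEKEY records which
# piece must be making the move for it to count as castling. A standalone
# sketch of that lookup (the dicts are copied from the constants asserted
# above; the castling_pgn helper is invented for this illustration and is
# not part of the module under test):

CASTLES_SKETCH = {"e1g1": "O-O", "e8g8": "O-O", "e1c1": "O-O-O", "e8c8": "O-O-O"}
CASTLEKEY_SKETCH = {"e1g1": "K", "e8g8": "k", "e1c1": "K", "e8c8": "k"}


def castling_pgn(uci_move, moving_piece):
    """Sketch: return the PGN castling token, or None for a non-castling move."""
    # A move such as e1g1 only castles when the mover is the matching king,
    # hence the CASTLEKEY check alongside the CASTLES lookup.
    if CASTLEKEY_SKETCH.get(uci_move) == moving_piece:
        return CASTLES_SKETCH[uci_move]
    return None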
class Generate_pgn_for_uci_illegal_moves(unittest.TestCase):
def setUp(self):
self.fen = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"
def tearDown(self):
pass
def test_generate_pgn_for_uci_illegal_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position([], self.fen),
"{'[]' cannot be a move, 'Yz0' inserted.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("", self.fen),
"{'' is not a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position(
"zzzz",
"rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKRRR w KQkq - 0 1",
),
"".join(
(
"{'Forsyth-Edwards Notation sets an illegal position. ",
"Move 'Yz0' inserted.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("zzzz", self.fen),
"{'zzzz' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("zzzz e7e5", self.fen),
"".join(
(
"{'zzzz' cannot be a move, 'Yz0' inserted. Rest 'e7e5' ",
"ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("x9", self.fen),
"{'x9' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_07(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("x9y0", self.fen),
"{'x9y0' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_08(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("2e4e", self.fen),
"{'2e4e' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_09(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f3f4", self.fen),
"".join(
(
"{'f3f4' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_10(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7g8", self.fen),
"".join(
(
"{'e7g8' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_11(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e2g1q", self.fen),
"{'e2g1q' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
# *_12 to *_21 are acceptable pawn move specifications, although only *_20
# and *_21 are legal in the position used in these tests.
def test_generate_pgn_for_uci_illegal_moves_12(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7e6", self.fen),
"".join(
(
"{'e7e6' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_13(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7e5", self.fen),
"".join(
(
"{'e7e5' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_14(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7e8", self.fen),
"".join(
(
"{'e7e8' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_15(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7f8", self.fen),
"".join(
(
"{'e7f8' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_16(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e2e1", self.fen),
"{'e2e1' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_17(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e2f1", self.fen),
"{'e2f1' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_18(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7f8q", self.fen),
"".join(
(
"{'e7f8q' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_19(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e2f1r", self.fen),
"{'e2f1r' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_20(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e2e3", self.fen), "e3"
)
def test_generate_pgn_for_uci_illegal_moves_21(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e2e4", self.fen), "e4"
)
# *_22 to *_27 are impossible non-pawn piece moves.
def test_generate_pgn_for_uci_illegal_moves_22(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f1f5", self.fen),
"{'f1f5' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_23(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e8e6", self.fen),
"".join(
(
"{'e8e6' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_24(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d8c6", self.fen),
"".join(
(
"{'d8c6' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_25(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b1b3", self.fen),
"{'b1b3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_26(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h1e4", self.fen),
"{'h1e4' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_27(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h1f1", self.fen),
"{'h1f1' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
# *_28 to *_36 are unambiguous, but illegal, non-pawn piece moves.
# In *_30, Rh3 is unambiguous because h1 contains a white rook and h8 a
# black rook; but if the pawns on h2 and h7 are removed, the PGN move
# becomes legal whichever side has the move and remains unambiguous.
# Finding a move to be illegal is a side-effect of disambiguating the move
# in this function.
def test_generate_pgn_for_uci_illegal_moves_28(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h1g1", self.fen),
"{'h1g1' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_29(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h1h2", self.fen),
"{'h1h2' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_30(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h1h3", self.fen),
"{'h1h3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_31(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h1h7", self.fen),
"{'h1h7' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_32(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d8f8", self.fen),
"".join(
(
"{'d8f8' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_33(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d8c8", self.fen),
"".join(
(
"{'d8c8' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
def test_generate_pgn_for_uci_illegal_moves_34(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d1d8", self.fen),
"{'d1d8' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_35(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c1b2", self.fen),
"{'c1b2' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_illegal_moves_36(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b8a6", self.fen),
"".join(
(
"{'b8a6' does not refer to a piece of the active side, ",
"'Yz0' inserted. Rest '' ignored.}Yz0",
)
),
)
# *_37 is one of the four legal non-pawn moves in the start position.
def test_generate_pgn_for_uci_illegal_moves_37(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g1f3", self.fen), "Nf3"
)
class Generate_pgn_for_uci_file_moves(unittest.TestCase):
def setUp(self):
self.fenw = "4r3/8/4r3/k7/4R3/K7/8/4R3 w - - 0 1"
self.fenb = "4r3/8/4r3/k7/4R3/K7/8/4R3 b - - 0 1"
def tearDown(self):
pass
def test_generate_pgn_for_uci_file_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e1e2", self.fenw), "R1e2"
)
def test_generate_pgn_for_uci_file_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e2", self.fenw), "R4e2"
)
def test_generate_pgn_for_uci_file_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e5", self.fenw), "Re5"
)
def test_generate_pgn_for_uci_file_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e6e5", self.fenb), "Re5"
)
def test_generate_pgn_for_uci_file_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e6e7", self.fenb), "R6e7"
)
def test_generate_pgn_for_uci_file_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e8e7", self.fenb), "R8e7"
)
class Generate_pgn_for_uci_rank_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/1K4k1/8/8/8/r2r1R1R/8/8 w - - 0 1"
self.fenb = "8/1K4k1/8/8/8/r2r1R1R/8/8 b - - 0 1"
def tearDown(self):
pass
def test_generate_pgn_for_uci_rank_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("a3b3", self.fenb), "Rab3"
)
def test_generate_pgn_for_uci_rank_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d3c3", self.fenb), "Rdc3"
)
def test_generate_pgn_for_uci_rank_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d3e3", self.fenb), "Re3"
)
def test_generate_pgn_for_uci_rank_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f3e3", self.fenw), "Re3"
)
def test_generate_pgn_for_uci_rank_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f3g3", self.fenw), "Rfg3"
)
def test_generate_pgn_for_uci_rank_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h3g3", self.fenw), "Rhg3"
)
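The file and rank disambiguation exercised by the two classes above follows the usual SAN rule: prefer the file letter, fall back to the rank digit, and use the full from-square only when both clash. A standalone sketch of that rule (a hypothetical helper, ignoring the legality filtering that the real generator also performs, as the pinned-queen tests below show):

```python
def disambiguator_sketch(from_sq, other_from_sqs):
    # from_sq is the moving piece's square, e.g. "e1"; other_from_sqs are
    # the squares of same-type pieces that could also reach the target.
    if not other_from_sqs:
        return ""
    file_clash = any(sq[0] == from_sq[0] for sq in other_from_sqs)
    rank_clash = any(sq[1] == from_sq[1] for sq in other_from_sqs)
    if not file_clash:
        return from_sq[0]  # file letter is unique: "Rab3"
    if not rank_clash:
        return from_sq[1]  # rank digit is unique: "R1e2"
    return from_sq  # neither is unique: "Bc3e5"
```

Against the cases above: rooks on e1 and e4 share a file, so `disambiguator_sketch("e1", ["e4"])` gives `"1"` (R1e2); rooks on a3 and d3 share a rank, so the file letter `"a"` is used (Rab3).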
class Generate_pgn_for_uci_diagonal_moves_square(unittest.TestCase):
def setUp(self):
self.fenw = "8/2k4B/8/5B2/8/3b2K1/8/1b6 w - - 0 1"
self.fenb = "8/2k4B/8/5B2/8/3b2K1/8/1b6 b - - 0 1"
def tearDown(self):
pass
def test_generate_pgn_for_uci_diagonal_moves_square_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b1c2", self.fenb), "Bbc2"
)
def test_generate_pgn_for_uci_diagonal_moves_square_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d3c2", self.fenb), "Bdc2"
)
def test_generate_pgn_for_uci_diagonal_moves_square_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d3e4", self.fenb), "Be4"
)
def test_generate_pgn_for_uci_diagonal_moves_square_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f5e4", self.fenw), "Be4"
)
def test_generate_pgn_for_uci_diagonal_moves_square_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f5g6", self.fenw), "Bfg6"
)
def test_generate_pgn_for_uci_diagonal_moves_square_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("h7g6", self.fenw), "Bhg6"
)
class Generate_pgn_for_uci_diagonal_moves_stretched_square(unittest.TestCase):
def setUp(self):
self.fenw = "8/2b3B1/8/8/8/2b3B1/8/1k5K w - - 0 1"
self.fenb = "8/2b3B1/8/8/8/2b3B1/8/1k5K b - - 0 1"
def tearDown(self):
pass
def test_generate_pgn_for_uci_diagonal_moves_stretched_square_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c3e5", self.fenb), "B3e5"
)
def test_generate_pgn_for_uci_diagonal_moves_stretched_square_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3e5", self.fenw), "B3e5"
)
def test_generate_pgn_for_uci_diagonal_moves_stretched_square_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c7e5", self.fenb), "B7e5"
)
def test_generate_pgn_for_uci_diagonal_moves_stretched_square_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7e5", self.fenw), "B7e5"
)
class Generate_two_token_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/2B3B1/8/8/8/2B3B1/8/1k5K w - - 0 1"
def tearDown(self):
pass
def test_generate_two_token_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c3e5", self.fenw), "Bc3e5"
)
def test_generate_two_token_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3e5", self.fenw), "Bg3e5"
)
def test_generate_two_token_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c7e5", self.fenw), "Bc7e5"
)
def test_generate_two_token_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7e5", self.fenw), "Bg7e5"
)
class Generate_queen_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/2Q3Q1/8/8/8/2Q3Q1/8/1k5K w - - 0 1"
def tearDown(self):
pass
def test_generate_queen_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c3e5", self.fenw), "Qc3e5"
)
def test_generate_queen_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3e5", self.fenw), "Qg3e5"
)
def test_generate_queen_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c7e5", self.fenw), "Qc7e5"
)
def test_generate_queen_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7e5", self.fenw), "Qg7e5"
)
class Generate_bishop_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/2B3B1/8/8/8/6B1/1B6/1k5K w - - 0 1"
def tearDown(self):
pass
def test_generate_bishop_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2e5", self.fenw), "Bbe5"
)
def test_generate_bishop_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3e5", self.fenw), "B3e5"
)
def test_generate_bishop_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c7e5", self.fenw), "Bce5"
)
def test_generate_bishop_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7e5", self.fenw), "Bg7e5"
)
class Generate_knight_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/8/1N6/4n1n1/1N6/8/8/1k5K w - - 0 1"
self.fenb = "8/8/1N6/4n1n1/1N6/8/8/1k5K b - - 0 1"
def tearDown(self):
pass
def test_generate_knight_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b4d5", self.fenw), "N4d5"
)
def test_generate_knight_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6d5", self.fenw), "N6d5"
)
def test_generate_knight_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e5f3", self.fenb), "Nef3"
)
def test_generate_knight_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g5f3", self.fenb), "Ngf3"
)
def test_generate_knight_move_pgn_for_uci_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6c8", self.fenw), "Nc8"
)
def test_generate_knight_move_pgn_for_uci_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g5h3", self.fenb), "Nh3"
)
class Generate_three_queen_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenb = "8/8/2q5/8/2q1q3/8/7K/5k2 b - - 0 1"
def tearDown(self):
pass
def test_generate_three_queen_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4e6", self.fenb), "Qc4e6"
)
def test_generate_three_queen_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c6e6", self.fenb), "Q6e6"
)
def test_generate_three_queen_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e6", self.fenb), "Qee6"
)
def test_generate_three_queen_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4a2", self.fenb), "Qa2"
)
class Generate_three_queen_pinned_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenb = "8/8/B1q5/8/2q1q3/8/7K/5k2 b - - 0 1"
def tearDown(self):
pass
def test_generate_three_queen_pinned_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4e6", self.fenb),
"{'c4e6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_three_queen_pinned_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c6e6", self.fenb), "Qce6"
)
def test_generate_three_queen_pinned_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e6", self.fenb), "Qee6"
)
def test_generate_three_queen_pinned_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4a2", self.fenb),
"{'c4a2' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
class Generate_three_queen_block_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenb = "8/8/2q5/3r4/2q1q3/8/7K/5k2 b - - 0 1"
def tearDown(self):
pass
def test_generate_three_queen_block_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4e6", self.fenb),
"{'c4e6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_three_queen_block_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c6e6", self.fenb), "Qce6"
)
def test_generate_three_queen_block_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e6", self.fenb), "Qee6"
)
def test_generate_three_queen_block_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4a2", self.fenb), "Qa2"
)
class Generate_pawn_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/8/8/8/1NPPP3/8/7K/1k6 w - - 0 1"
def tearDown(self):
pass
def test_generate_pawn_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4d5", self.fenw),
"{'c4d5' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pawn_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d4d5", self.fenw), "d5"
)
def test_generate_pawn_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4d5", self.fenw),
"{'e4d5' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pawn_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b4d5", self.fenw), "Nd5"
)
class Generate_pawn_capture_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "8/8/8/3p4/1NPPP3/8/7K/1k6 w - - 0 1"
def tearDown(self):
pass
def test_generate_pawn_capture_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4d5", self.fenw), "cxd5"
)
def test_generate_pawn_capture_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d4d5", self.fenw),
"{'d4d5' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pawn_capture_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4d5", self.fenw), "exd5"
)
def test_generate_pawn_capture_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b4d5", self.fenw), "Nxd5"
)
class Generate_castles_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "r3k2r/8/8/8/8/8/8/R3K2R w KQkq - 0 1"
self.fenb = "r3k2r/8/8/8/8/8/8/R3K2R b KQkq - 0 1"
def tearDown(self):
pass
def test_generate_castles_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e1g1", self.fenw), "O-O"
)
def test_generate_castles_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e8g8", self.fenb), "O-O"
)
def test_generate_castles_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e1c1", self.fenw), "O-O-O"
)
def test_generate_castles_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e8c8", self.fenb), "O-O-O"
)
def test_generate_castles_pgn_for_uci_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e1f1", self.fenw), "Kf1"
)
def test_generate_castles_pgn_for_uci_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e8f8", self.fenb), "Kf8"
)
class Generate_king_move_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenw = "r2k3r/8/8/8/8/8/8/R4K1R w - - 0 1"
self.fenb = "r2k3r/8/8/8/8/8/8/R4K1R b - - 0 1"
def tearDown(self):
pass
def test_generate_king_move_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f1g1", self.fenw), "Kg1"
)
def test_generate_king_move_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f1f2", self.fenw), "Kf2"
)
def test_generate_king_move_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d8c8", self.fenb), "Kc8"
)
def test_generate_king_move_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d8e8", self.fenb), "Ke8"
)
class Generate_three_queen_capture_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenb = "8/8/2q1P3/8/2q1q3/8/N6K/1k6 b - - 0 1"
def tearDown(self):
pass
def test_generate_three_queen_capture_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4e6", self.fenb), "Qc4xe6"
)
def test_generate_three_queen_capture_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c6e6", self.fenb), "Q6xe6"
)
def test_generate_three_queen_capture_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e6", self.fenb), "Qexe6"
)
def test_generate_three_queen_capture_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4a2", self.fenb), "Qxa2"
)
class Generate_three_queen_block_capture_pgn_for_uci_moves(unittest.TestCase):
def setUp(self):
self.fenb = "8/8/2q1P3/3r4/2q1q3/8/N6K/1k6 b - - 0 1"
def tearDown(self):
pass
def test_generate_three_queen_block_capture_pgn_for_uci_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4e6", self.fenb),
"{'c4e6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_three_queen_block_capture_pgn_for_uci_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c6e6", self.fenb), "Qcxe6"
)
def test_generate_three_queen_block_capture_pgn_for_uci_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e4e6", self.fenb), "Qexe6"
)
def test_generate_three_queen_block_capture_pgn_for_uci_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4a2", self.fenb), "Qxa2"
)
class Generate_pgn_for_uci_move_sequence(unittest.TestCase):
def setUp(self):
self.fen = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"
def tearDown(self):
pass
def test_generate_pgn_for_uci_move_sequence_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position(
"".join(
(
"e2e4 c7c6 d2d4 d7d5 e4d5 c6d5 g1f3 g8f6 f1d3 e7e6 ",
"e1g1 f8e7 c2c3 e8g8",
)
),
self.fen,
),
"e4 c6 d4 d5 exd5 cxd5 Nf3 Nf6 Bd3 e6 O-O Be7 c3 O-O",
)
class Generate_pgn_for_uci_white_pawn_moves(unittest.TestCase):
def setUp(self):
self.fenw = "k4n2/4p1P1/1p6/2P5/5p2/6P1/1p1P4/2N4K w - - 0 60"
def tearDown(self):
pass
def test_generate_pgn_for_uci_white_pawn_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7h6", self.fenw),
"{'g7h6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7g6", self.fenw),
"{'g7g6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7g8", self.fenw),
"{'g7g8' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7h8b", self.fenw),
"{'g7h8b' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7f8q", self.fenw),
"gxf8=Q",
)
def test_generate_pgn_for_uci_white_pawn_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g7g8r", self.fenw), "g8=R"
)
def test_generate_pgn_for_uci_white_pawn_moves_07(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3g4b", self.fenw),
"{'g3g4b' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_08(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3g6", self.fenw),
"{'g3g6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_09(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3g5", self.fenw),
"{'g3g5' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_10(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3g4", self.fenw), "g4"
)
def test_generate_pgn_for_uci_white_pawn_moves_11(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c5d6", self.fenw),
"{'c5d6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_12(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c5c6", self.fenw), "c6"
)
def test_generate_pgn_for_uci_white_pawn_moves_13(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c5b6", self.fenw), "cxb6"
)
def test_generate_pgn_for_uci_white_pawn_moves_14(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g3f4", self.fenw), "gxf4"
)
def test_generate_pgn_for_uci_white_pawn_moves_15(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d2d5", self.fenw),
"{'d2d5' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_16(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d2d4", self.fenw), "d4"
)
def test_generate_pgn_for_uci_white_pawn_moves_17(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d2d3", self.fenw), "d3"
)
def test_generate_pgn_for_uci_white_pawn_moves_18(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d2c3", self.fenw),
"{'d2c3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_moves_19(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("d2e3", self.fenw),
"{'d2e3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
class Generate_pgn_for_uci_black_pawn_moves(unittest.TestCase):
def setUp(self):
self.fenb = "k4n2/4p1P1/1p6/2P5/5p2/6P1/1p1P4/2N4K b - - 0 60"
def tearDown(self):
pass
def test_generate_pgn_for_uci_black_pawn_moves_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2a3", self.fenb),
"{'b2a3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2b3", self.fenb),
"{'b2b3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2b1", self.fenb),
"{'b2b1' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2a1b", self.fenb),
"{'b2a1b' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_05(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2c1q", self.fenb),
"bxc1=Q",
)
def test_generate_pgn_for_uci_black_pawn_moves_06(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b2b1r", self.fenb), "b1=R"
)
def test_generate_pgn_for_uci_black_pawn_moves_07(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6b5b", self.fenb),
"{'b6b5b' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_08(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6b3", self.fenb),
"{'b6b3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_09(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6b4", self.fenb),
"{'b6b4' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_10(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6b5", self.fenb), "b5"
)
def test_generate_pgn_for_uci_black_pawn_moves_11(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f4e3", self.fenb),
"{'f4e3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_12(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f4f3", self.fenb), "f3"
)
def test_generate_pgn_for_uci_black_pawn_moves_13(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("f4g3", self.fenb), "fxg3"
)
def test_generate_pgn_for_uci_black_pawn_moves_14(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("b6c5", self.fenb), "bxc5"
)
def test_generate_pgn_for_uci_black_pawn_moves_15(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7e4", self.fenb),
"{'e7e4' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_16(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7e5", self.fenb), "e5"
)
def test_generate_pgn_for_uci_black_pawn_moves_17(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7e6", self.fenb), "e6"
)
def test_generate_pgn_for_uci_black_pawn_moves_18(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7f6", self.fenb),
"{'e7f6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_moves_19(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("e7d6", self.fenb),
"{'e7d6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
class Generate_pgn_for_uci_white_pawn_en_passant(unittest.TestCase):
def setUp(self):
self.fenw = "k7/8/1p6/1pPp2P1/8/8/8/7K w - d6 0 60"
def tearDown(self):
pass
def test_generate_pgn_for_uci_white_pawn_en_passant_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g5f6", self.fenw),
"{'g5f6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_white_pawn_en_passant_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c5d6", self.fenw), "cxd6"
)
def test_generate_pgn_for_uci_white_pawn_en_passant_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c5b6", self.fenw), "cxb6"
)
def test_generate_pgn_for_uci_white_pawn_en_passant_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position(
"c5d6", "k7/8/1p6/1pPp2P1/8/8/8/7K w - - 0 60"
),
"{'c5d6' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
class Generate_pgn_for_uci_black_pawn_en_passant(unittest.TestCase):
def setUp(self):
self.fenb = "k7/8/8/8/1PpP2p1/1P6/8/7K b - d3 0 60"
def tearDown(self):
pass
def test_generate_pgn_for_uci_black_pawn_en_passant_01(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("g4f3", self.fenb),
"{'g4f3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
def test_generate_pgn_for_uci_black_pawn_en_passant_02(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4d3", self.fenb), "cxd3"
)
def test_generate_pgn_for_uci_black_pawn_en_passant_03(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position("c4b3", self.fenb), "cxb3"
)
def test_generate_pgn_for_uci_black_pawn_en_passant_04(self):
self.assertEqual(
generate_pgn_for_uci_moves_in_position(
"c4d3", "k7/8/8/8/1PpP2p1/1P6/8/7K b - - 0 60"
),
"{'c4d3' cannot be a move, 'Yz0' inserted. Rest '' ignored.}Yz0",
)
class Generate_pgn_non_castle_moves_like_e1g1(unittest.TestCase):
    # e1g1, e1c1, e8g8, and e8c8 were always treated as O-O or O-O-O until
    # the identity of the piece on e1 or e8 was taken into account.
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_generate_pgn_non_castle_move_e1g1(self):
        self.assertEqual(
            generate_pgn_for_uci_moves_in_position(
                "e1g1", "4r3/k7/8/8/8/8/7K/4R3 w - - 0 1"
            ),
            "Rg1",
        )

    def test_generate_pgn_non_castle_move_e1c1(self):
        self.assertEqual(
            generate_pgn_for_uci_moves_in_position(
                "e1c1", "4r3/k7/8/8/8/8/7K/4R3 w - - 0 1"
            ),
            "Rc1",
        )

    def test_generate_pgn_non_castle_move_e8g8(self):
        self.assertEqual(
            generate_pgn_for_uci_moves_in_position(
                "e8g8", "4r3/k7/8/8/8/8/7K/4R3 b - - 0 1"
            ),
            "Rg8",
        )

    def test_generate_pgn_non_castle_move_e8c8(self):
        self.assertEqual(
            generate_pgn_for_uci_moves_in_position(
                "e8c8", "4r3/k7/8/8/8/8/7K/4R3 b - - 0 1"
            ),
            "Rc8",
        )


class Generate_pgn_for_uci_real_position(unittest.TestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_generate_pgn_for_uci_real_position_01(self):
        # Bf5 was treated as an illegal move because PIECE_MOVE_MAP was used
        # in one place instead of PIECE_CAPTURE_MAP, causing a king blockading
        # a pawn to be 'in check'.
        self.assertEqual(
            generate_pgn_for_uci_moves_in_position(
                "g4h5 f5f4 h5h6 g7h8 g3e4 g8h7 d1b3 d7f5",
                "".join(
                    (
                        "1r2qrk1/pn1bn1b1/1p4p1/1P2ppBp/N1Pp2P1/P2P1PNP/",
                        "6B1/1R1Q1RK1 w - - 4 21",
                    )
                ),
            ),
            "gxh5 f4 h6 Bh8 Ne4 Kh7 Qb3 Bf5",
        )


if __name__ == "__main__":
    runner = unittest.TextTestRunner
    loader = unittest.defaultTestLoader.loadTestsFromTestCase
    runner().run(loader(ModuleAssumptions))
    runner().run(loader(ModuleConstants))
    runner().run(loader(Generate_pgn_for_uci_illegal_moves))
    runner().run(loader(Generate_pgn_for_uci_file_moves))
    runner().run(loader(Generate_pgn_for_uci_rank_moves))
    runner().run(loader(Generate_pgn_for_uci_diagonal_moves_square))
    runner().run(loader(Generate_pgn_for_uci_diagonal_moves_stretched_square))
    runner().run(loader(Generate_two_token_pgn_for_uci_moves))
    runner().run(loader(Generate_queen_move_pgn_for_uci_moves))
    runner().run(loader(Generate_bishop_move_pgn_for_uci_moves))
    runner().run(loader(Generate_knight_move_pgn_for_uci_moves))
    runner().run(loader(Generate_three_queen_move_pgn_for_uci_moves))
    runner().run(loader(Generate_three_queen_pinned_move_pgn_for_uci_moves))
    runner().run(loader(Generate_three_queen_block_move_pgn_for_uci_moves))
    runner().run(loader(Generate_pawn_move_pgn_for_uci_moves))
    runner().run(loader(Generate_pawn_capture_pgn_for_uci_moves))
    runner().run(loader(Generate_castles_pgn_for_uci_moves))
    runner().run(loader(Generate_king_move_pgn_for_uci_moves))
    runner().run(loader(Generate_three_queen_capture_pgn_for_uci_moves))
    runner().run(loader(Generate_three_queen_block_capture_pgn_for_uci_moves))
    runner().run(loader(Generate_pgn_for_uci_move_sequence))
    runner().run(loader(Generate_pgn_for_uci_white_pawn_moves))
    runner().run(loader(Generate_pgn_for_uci_black_pawn_moves))
    runner().run(loader(Generate_pgn_for_uci_white_pawn_en_passant))
    runner().run(loader(Generate_pgn_for_uci_black_pawn_en_passant))
    runner().run(loader(Generate_pgn_non_castle_moves_like_e1g1))
    runner().run(loader(Generate_pgn_for_uci_real_position))
| 34.645441 | 79 | 0.634461 | 6,397 | 47,880 | 4.320306 | 0.075504 | 0.082281 | 0.123422 | 0.18269 | 0.864964 | 0.853855 | 0.835691 | 0.819047 | 0.777436 | 0.708073 | 0 | 0.047814 | 0.263972 | 47,880 | 1,381 | 80 | 34.670529 | 0.736415 | 0.020614 | 0 | 0.318801 | 0 | 0 | 0.156283 | 0.022168 | 0 | 0 | 0 | 0 | 0.17257 | 1 | 0.202543 | false | 0.039055 | 0.003633 | 0 | 0.230699 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
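The comment in `Generate_pgn_non_castle_moves_like_e1g1` above notes that UCI strings like `e1g1` are only castling when a king actually makes the move. A minimal standalone sketch of that check (hedged: `piece_at` is a hypothetical callable mapping a square name to a FEN piece letter, not the tested module's real API):

```python
# Hypothetical sketch -- not part of the suite above. `piece_at` is an assumed
# helper; the real module's internals may differ.
CASTLE_UCI = {"e1g1": "O-O", "e1c1": "O-O-O", "e8g8": "O-O", "e8c8": "O-O-O"}


def castle_pgn_or_none(uci, piece_at):
    """Return castling PGN only when a king stands on the from-square."""
    if uci in CASTLE_UCI and piece_at(uci[:2]) in ("K", "k"):
        return CASTLE_UCI[uci]
    # Anything else (e.g. a rook sliding e1-g1) must fall through to the
    # normal PGN-generation path, matching the Rg1/Rc1 expectations above.
    return None
```

With a rook on e1, `castle_pgn_or_none("e1g1", ...)` returns None, which is exactly the distinction the fixed bug required.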
8a276eb836d7b5fa505f944e4b2d810bd0bb4f72 | 181 | py | Python | academicstoday_project/gunicorn_config.py | LeeDoona/EasyGrading | 8a3b7a95e328a5b710bd98934dcde7556731bd72 | [
"Apache-2.0"
] | 146 | 2017-02-04T11:14:50.000Z | 2021-12-30T20:54:50.000Z | academicstoday_project/gunicorn_config.py | LeeDoona/EasyGrading | 8a3b7a95e328a5b710bd98934dcde7556731bd72 | [
"Apache-2.0"
] | 139 | 2015-02-21T21:40:34.000Z | 2016-02-20T13:34:25.000Z | academicstoday_project/gunicorn_config.py | topsit143/acda | c2a20ffd1dcf8668d1fe401d114d32d9e686f1fd | [
"Apache-2.0"
] | 88 | 2017-01-20T20:32:44.000Z | 2022-02-07T05:32:44.000Z | command = '/usr/home/django/academicstoday-django/env/bin/gunicorn'
pythonpath = '/usr/home/django/academicstoday-django/academicstoday_project'
bind = '127.0.0.1:8001'
workers = 3
| 36.2 | 76 | 0.779006 | 25 | 181 | 5.6 | 0.68 | 0.428571 | 0.185714 | 0.385714 | 0.471429 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065089 | 0.066298 | 181 | 4 | 77 | 45.25 | 0.763314 | 0 | 0 | 0 | 0 | 0 | 0.718232 | 0.640884 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
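Gunicorn consumes a config file like the one above as plain Python (`gunicorn -c gunicorn_config.py <wsgi_app>`), so the settings can be sanity-checked by executing the same source and inspecting the resulting namespace. A minimal sketch; the inline string stands in for the file on disk:

```python
# The inline string mirrors gunicorn_config.py above; gunicorn itself
# evaluates the file in much the same way to pick up these names.
config_source = """
command = '/usr/home/django/academicstoday-django/env/bin/gunicorn'
pythonpath = '/usr/home/django/academicstoday-django/academicstoday_project'
bind = '127.0.0.1:8001'
workers = 3
"""

settings = {}
exec(config_source, settings)  # populate the namespace as gunicorn would

host, port = settings["bind"].split(":")
```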
8aabc4f58bb6bd3717efdc036774280a74d56a62 | 8,271 | py | Python | tests/test_ai.py | DanKatzuv/tic-tac-toe | 607cda9f1939b5c97f13ee875a925eb4d1116f3e | [
"MIT"
] | 1 | 2021-12-23T06:20:22.000Z | 2021-12-23T06:20:22.000Z | tests/test_ai.py | DanKatzuv/tic-tac-toe | 607cda9f1939b5c97f13ee875a925eb4d1116f3e | [
"MIT"
] | null | null | null | tests/test_ai.py | DanKatzuv/tic-tac-toe | 607cda9f1939b5c97f13ee875a925eb4d1116f3e | [
"MIT"
] | 2 | 2021-04-06T19:46:42.000Z | 2021-05-20T18:04:00.000Z | from itertools import product
from pytest import mark

from board.board import Board
from game import Game
from players import AI


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_first_turn(player_mark):
    ai = AI(player_mark)
    board = Board()
    assert ai._first_turn(board.representation()) in product((0, 2), (0, 2))


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_second_turn_corner_opening(player_mark):
    other_mark = Game.FIRST_PLAYER_MARK
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', ' '],
                   [' ', ' ', ' '],
                   [' ', ' ', ' ']]
    assert ai._second_turn(board.representation()) == (1, 1)
    board._rows = [[' ', ' ', other_mark],
                   [' ', ' ', ' '],
                   [' ', ' ', ' ']]
    assert ai._second_turn(board.representation()) == (1, 1)
    board._rows = [[' ', ' ', ' '],
                   [' ', ' ', ' '],
                   [' ', ' ', other_mark]]
    assert ai._second_turn(board.representation()) == (1, 1)
    board._rows = [[' ', ' ', ' '],
                   [' ', ' ', ' '],
                   [other_mark, ' ', ' ']]
    assert ai._second_turn(board.representation()) == (1, 1)


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_second_turn_center_opening(player_mark):
    other_mark = Game.FIRST_PLAYER_MARK
    ai = AI(player_mark)
    board = Board()
    board._rows = [[' ', ' ', ' '],
                   [' ', other_mark, ' '],
                   [' ', ' ', ' ']]
    for _ in range(100):
        assert ai._second_turn(board.representation()) in product((0, 2), (0, 2))


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_win(player_mark):
    ai = AI(player_mark)
    board = Board()
    board._rows = [[player_mark, ' ', player_mark],
                   [' ', ' ', ' '],
                   [' ', ' ', ' ']]
    assert ai.turn(board.representation()) == (0, 1)
    board._rows = [[' ', player_mark, ' '],
                   [' ', player_mark, ' '],
                   [' ', ' ', ' ']]
    assert ai.turn(board.representation()) == (2, 1)
    board._rows = [[' ', ' ', ' '],
                   [' ', player_mark, ' '],
                   [' ', ' ', player_mark]]
    assert ai.turn(board.representation()) == (0, 0)
    board._rows = [[' ', ' ', player_mark],
                   [' ', ' ', ' '],
                   [player_mark, ' ', ' ']]
    assert ai.turn(board.representation()) == (1, 1)


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_block(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', other_mark],
                   [' ', ' ', ' '],
                   [' ', ' ', ' ']]
    assert ai.turn(board.representation()) == (0, 1)
    board._rows = [[' ', other_mark, ' '],
                   [' ', other_mark, ' '],
                   [' ', ' ', ' ']]
    assert ai.turn(board.representation()) == (2, 1)
    board._rows = [[' ', ' ', ' '],
                   [' ', other_mark, ' '],
                   [' ', ' ', other_mark]]
    assert ai.turn(board.representation()) == (0, 0)
    board._rows = [[' ', ' ', other_mark],
                   [' ', ' ', ' '],
                   [other_mark, ' ', ' ']]
    assert ai.turn(board.representation()) == (1, 1)
    board._rows = [[player_mark, ' ', ' '],
                   [other_mark, other_mark, ' '],
                   [player_mark, ' ', other_mark]]
    assert ai.turn(board.representation()) == (1, 2)


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_create_fork(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[player_mark, ' ', ' '],
                   [' ', other_mark, ' '],
                   [' ', ' ', player_mark]]
    assert ai.turn(board.representation()) in ((2, 0), (0, 2))
    board._rows = [[player_mark, ' ', ' '],
                   [player_mark, ' ', ' '],
                   [' ', ' ', ' ']]
    assert ai._fork(board.representation()) == (1, 1)


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_block_fork(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', ' '],
                   [' ', player_mark, ' '],
                   [' ', ' ', other_mark]]
    assert ai.turn(board.representation()) in ((0, 2), (2, 0))
    board._rows = [[player_mark, ' ', ' '],
                   [' ', other_mark, ' '],
                   [' ', ' ', other_mark]]
    assert ai.turn(board.representation()) in ((0, 2), (2, 0))


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_center(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', ' '],
                   [' ', player_mark, ' '],
                   [' ', ' ', other_mark]]
    assert ai._center(board.representation()) is None
    board._rows = [[player_mark, ' ', ' '],
                   [' ', ' ', ' '],
                   [other_mark, ' ', other_mark]]
    assert ai._center(board.representation()) == (1, 1)


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_opposite_corner(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', ' '],
                   [' ', player_mark, ' '],
                   [' ', ' ', ' ']]
    assert ai._opposite_corner(board.representation()) == (2, 2)
    board._rows = [[player_mark, ' ', ' '],
                   [' ', ' ', ' '],
                   [other_mark, ' ', other_mark]]
    assert ai._opposite_corner(board.representation()) == (0, 2)
    board._rows = [[player_mark, ' ', player_mark],
                   [' ', ' ', ' '],
                   [other_mark, ' ', other_mark]]
    assert ai._opposite_corner(board.representation()) is None


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_empty_corner(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', ' '],
                   [' ', player_mark, ' '],
                   [' ', ' ', ' ']]
    assert ai._empty_corner(board.representation()) in ((0, 2), (2, 2), (2, 0))
    board._rows = [[player_mark, ' ', ' '],
                   [' ', ' ', ' '],
                   [other_mark, ' ', other_mark]]
    assert ai._empty_corner(board.representation()) == (0, 2)
    board._rows = [[player_mark, ' ', player_mark],
                   [' ', ' ', ' '],
                   [other_mark, ' ', other_mark]]
    assert ai._empty_corner(board.representation()) is None


@mark.parametrize('player_mark', (Game.FIRST_PLAYER_MARK, Game.SECOND_PLAYER_MARK))
def test_empty_edge(player_mark):
    other_mark = other_player(player_mark)
    ai = AI(player_mark)
    board = Board()
    board._rows = [[other_mark, ' ', ' '],
                   [' ', player_mark, ' '],
                   [' ', player_mark, ' ']]
    assert ai._empty_edge(board.representation()) in ((0, 1), (1, 2), (1, 0))
    board._rows = [[player_mark, other_mark, ' '],
                   [' ', ' ', ' '],
                   [other_mark, player_mark, other_mark]]
    assert ai._empty_edge(board.representation()) in ((1, 2), (1, 0))
    board._rows = [[player_mark, player_mark, player_mark],
                   [other_mark, ' ', player_mark],
                   [other_mark, other_mark, other_mark]]
    assert ai._empty_edge(board.representation()) is None


def other_player(player_mark):
"""
Return the other player's mark, based on the given mark.
:param player_mark: mark of this player
:type player_mark:
:return: mark of the other player
:rtype: str
"""
    return Game.FIRST_PLAYER_MARK if player_mark == Game.SECOND_PLAYER_MARK else Game.SECOND_PLAYER_MARK
| 35.650862 | 104 | 0.522186 | 862 | 8,271 | 4.687935 | 0.062645 | 0.259837 | 0.122247 | 0.112843 | 0.900767 | 0.90052 | 0.884187 | 0.858698 | 0.802029 | 0.782232 | 0 | 0.012834 | 0.293435 | 8,271 | 231 | 105 | 35.805195 | 0.678645 | 0.019587 | 0 | 0.724719 | 0 | 0 | 0.037762 | 0 | 0 | 0 | 0 | 0 | 0.168539 | 1 | 0.067416 | false | 0 | 0.02809 | 0 | 0.101124 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
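Every test above is parametrized over both marks, and `other_player()` toggles between them. A self-contained sketch of the same toggle, with `'X'`/`'O'` standing in for `Game.FIRST_PLAYER_MARK`/`Game.SECOND_PLAYER_MARK` (the real values live in the game module and are assumptions here):

```python
# Assumed stand-in values; the actual marks are defined on the Game class.
FIRST_PLAYER_MARK, SECOND_PLAYER_MARK = "X", "O"


def other_player(player_mark):
    """Return the opposing mark, mirroring the helper in the suite above."""
    if player_mark == SECOND_PLAYER_MARK:
        return FIRST_PLAYER_MARK
    return SECOND_PLAYER_MARK
```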
8ab145a7529affc9d0a14f7b856799a61fa48bab | 4,748 | py | Python | cumulusci/salesforce_api/tests/metadata_test_strings.py | davisagli/CumulusCI | fd74c324ad3ff662484b159395c639879011e711 | [
"BSD-3-Clause"
] | 163 | 2018-09-13T18:49:34.000Z | 2022-03-25T08:37:15.000Z | cumulusci/salesforce_api/tests/metadata_test_strings.py | davisagli/CumulusCI | fd74c324ad3ff662484b159395c639879011e711 | [
"BSD-3-Clause"
] | 1,280 | 2018-09-11T20:09:37.000Z | 2022-03-31T18:40:21.000Z | cumulusci/salesforce_api/tests/metadata_test_strings.py | davisagli/CumulusCI | fd74c324ad3ff662484b159395c639879011e711 | [
"BSD-3-Clause"
] | 125 | 2015-01-17T16:05:39.000Z | 2018-09-06T19:05:00.000Z | list_metadata_start_envelope = '<?xml version="1.0" encoding="utf-8"?>\n<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">\n <soap:Header>\n <SessionHeader xmlns="http://soap.sforce.com/2006/04/metadata">\n <sessionId>###SESSION_ID###</sessionId>\n </SessionHeader>\n </soap:Header>\n <soap:Body>\n <listMetadata xmlns="http://soap.sforce.com/2006/04/metadata">\n <queries>\n <type>CustomObject</type>\n </queries>\n <asOfVersion>{api_version}</asOfVersion>\n </listMetadata>\n </soap:Body>\n</soap:Envelope>'
retrieve_packaged_start_envelope = '<?xml version="1.0" encoding="utf-8"?>\n<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">\n <soap:Header>\n <SessionHeader xmlns="http://soap.sforce.com/2006/04/metadata">\n <sessionId>###SESSION_ID###</sessionId>\n </SessionHeader>\n </soap:Header>\n <soap:Body>\n <retrieve xmlns="http://soap.sforce.com/2006/04/metadata">\n <retrieveRequest>\n <apiVersion>{api_version}</apiVersion>\n <packageNames>{package_name}</packageNames>\n </retrieveRequest>\n </retrieve>\n </soap:Body>\n</soap:Envelope>'
retrieve_unpackaged_start_envelope = '<?xml version="1.0" encoding="utf-8"?>\n<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">\n <soap:Header>\n <SessionHeader xmlns="http://soap.sforce.com/2006/04/metadata">\n <sessionId>###SESSION_ID###</sessionId>\n </SessionHeader>\n </soap:Header>\n <soap:Body>\n <retrieve xmlns="http://soap.sforce.com/2006/04/metadata">\n <retrieveRequest>\n <apiVersion>{api_version}</apiVersion>\n <unpackaged>\n <version>41.0</version>\n </unpackaged>\n </retrieveRequest>\n </retrieve>\n </soap:Body>\n</soap:Envelope>'
result_envelope = '<?xml version="1.0" encoding="utf-8"?>\n<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">\n <soap:Header>\n <SessionHeader xmlns="http://soap.sforce.com/2006/04/metadata">\n <sessionId>###SESSION_ID###</sessionId>\n </SessionHeader>\n </soap:Header>\n <soap:Body>\n <checkRetrieveStatus xmlns="http://soap.sforce.com/2006/04/metadata">\n <asyncProcessId>{process_id}</asyncProcessId>\n </checkRetrieveStatus>\n </soap:Body>\n</soap:Envelope>'
deploy_result = '<?xml version="1.0" encoding="utf-8"?>\n<testing>\n <status>{status}</status>\n{extra}</testing>'
deploy_result_failure = '<?xml version="1.0" encoding="utf-8"?>\n<result>\n <status>Failed</status>\n <details>\n {details}\n </details>\n</result>'
list_metadata_result = '<?xml version="1.0" encoding="utf-8"?>\n<result><fullName>Test__c</fullName><type>CustomObject</type><createdDate>2018-08-07T16:31:57.000+0000</createdDate></result>'
list_metadata_result_bad_val = '<?xml version="1.0" encoding="utf-8"?>\n<result><fullName>Test__c</fullName><type>CustomObject</type><createdDate>2018-08-07T16:31:57.000+200</createdDate></result>'
retrieve_result = '<?xml version="1.0" encoding="utf-8"?>\n<testing>\n <zipFile>{zip}</zipFile>\n{extra}</testing>'
status_envelope = '<?xml version="1.0" encoding="utf-8"?>\n<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">\n <soap:Header>\n <SessionHeader xmlns="http://soap.sforce.com/2006/04/metadata">\n <sessionId>###SESSION_ID###</sessionId>\n </SessionHeader>\n </soap:Header>\n <soap:Body>\n <checkStatus xmlns="http://soap.sforce.com/2006/04/metadata">\n <asyncProcessId>{process_id}</asyncProcessId>\n </checkStatus>\n </soap:Body>\n</soap:Envelope>'
deploy_status_envelope = '<?xml version="1.0" encoding="utf-8"?>\n<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">\n <soap:Header>\n <SessionHeader xmlns="http://soap.sforce.com/2006/04/metadata">\n <sessionId>###SESSION_ID###</sessionId>\n </SessionHeader>\n </soap:Header>\n <soap:Body>\n <checkDeployStatus xmlns="http://soap.sforce.com/2006/04/metadata">\n <asyncProcessId>{process_id}</asyncProcessId>\n <includeDetails>true</includeDetails>\n </checkDeployStatus>\n </soap:Body>\n</soap:Envelope>'
| 215.818182 | 730 | 0.694187 | 677 | 4,748 | 4.809453 | 0.119645 | 0.055283 | 0.047912 | 0.044226 | 0.839988 | 0.83231 | 0.825553 | 0.796683 | 0.796683 | 0.773956 | 0 | 0.048212 | 0.086984 | 4,748 | 21 | 731 | 226.095238 | 0.702884 | 0 | 1 | 0 | 0 | 1 | 0.932814 | 0.334667 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
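The module above stores each SOAP envelope as a `str.format` template, with `###SESSION_ID###` left as a literal sentinel. Presumably the callers fill the named fields with `.format()` and swap in the session token with `str.replace()`; a sketch on a shortened stand-in template (the substitution pattern is an assumption, not something this file confirms):

```python
# Shortened stand-in for one of the envelope templates above; both the
# process id value and the two-step substitution are illustrative only.
template = ("<sessionId>###SESSION_ID###</sessionId>"
            "<asyncProcessId>{process_id}</asyncProcessId>")

body = template.format(process_id="04s000000000001")
body = body.replace("###SESSION_ID###", "session-token")
```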
6a825e43f51e876d156c484c54979a88cc5ea3b7 | 13,931 | py | Python | sugarplot/test/test_normalization.py | edmundsj/sugarplot | 4e25eebfc6045482b0a3add978a1047d8fc696db | [
"MIT"
] | null | null | null | sugarplot/test/test_normalization.py | edmundsj/sugarplot | 4e25eebfc6045482b0a3add978a1047d8fc696db | [
"MIT"
] | 4 | 2021-03-26T15:54:53.000Z | 2021-06-23T16:51:40.000Z | sugarplot/test/test_normalization.py | edmundsj/sugarplot | 4e25eebfc6045482b0a3add978a1047d8fc696db | [
"MIT"
] | null | null | null | import pytest
import pandas as pd
import numpy as np
from sugarplot import interpolate, normalize_pandas, normalize_reflectance, ureg
from pandas.testing import assert_frame_equal


def test_normalize_pandas_simple_multiply():
    data1 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    data2 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [
            0,
            0.16666666666666669,
            0.33333333333333337,
            0.5,
            0.6666666666666666,
            0.8333333333333335]})
    multiplied_data_desired = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'power (nA ** 2)': [
            0,
            0.016666666666666669,
            0.06666666666666668,
            0.15,
            0.26666666666666666,
            0.41666666666666674]})
    multiplied_data_actual = normalize_pandas(data1, data2)
    assert_frame_equal(multiplied_data_actual, multiplied_data_desired)


def test_normalize_mul_integration():
    data1 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    data2 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    multiplied_data_desired = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'power (nA ** 2)': [
            0,
            0.016666666666666669,
            0.06666666666666668,
            0.15,
            0.26666666666666666,
            0.41666666666666674]})
    multiplied_data_actual = normalize_pandas(data1, data2)
    assert_frame_equal(multiplied_data_actual, multiplied_data_desired)


def test_normalize_div_integration():
    data1 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    data2 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    multiplied_data_desired = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'rel': [
            np.NaN,
            0.6,
            0.6,
            0.6,
            0.6,
            0.6]})
    multiplied_data_actual = normalize_pandas(data1, data2, operation=np.divide, new_name='rel')
    assert_frame_equal(multiplied_data_actual, multiplied_data_desired)


def test_normalize_reflectance():
    photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'Photocurrent (nA)': [1, 1, 1, 1, 1, 1]})
    reference_photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'Photocurrent (nA)': [2, 2, 2.0, 2, 2, 2]})
    reference_reflectance = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R ()': [0.3, 0.4, 0.5, 0.4, 0.3, 0.2]})
    reflectance_desired = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R': 0.5*np.array([0.3, 0.4, 0.5, 0.4, 0.3, 0.2])})
    reflectance_actual = normalize_reflectance(
        photocurrent, reference_photocurrent, reference_reflectance)
    assert_frame_equal(reflectance_actual, reflectance_desired)


def test_normalize_reflectance_unsorted():
    photocurrent = pd.DataFrame({
        'Wavelength (nm)': [3.5, 1.5, 2, 2.5, 3, 1],
        'Photocurrent (nA)': [1, 1, 1, 1, 1, 1]})
    reference_photocurrent = pd.DataFrame({
        'Wavelength (nm)': [2, 1.5, 2, 2.5, 3, 3.5],
        'Photocurrent (nA)': [2, 2, 2.0, 2, 2, 2]})
    reference_reflectance = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R': [0.3, 0.4, 0.5, 0.4, 0.3, 0.2]})
    reflectance_desired = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R': 0.5*np.array([0.3, 0.4, 0.5, 0.4, 0.3, 0.2])})
    reflectance_actual = normalize_reflectance(
        photocurrent, reference_photocurrent, reference_reflectance)
    assert_frame_equal(reflectance_actual, reflectance_desired)


def test_normalize_reflectance_units():
    photocurrent = pd.DataFrame({
        'Boolergs (pA)': [5, 6.5, 0, 2.5, 4, 3.5],
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'Photocurrent (nA)': [1, 1, 1, 1, 1, 1]})
    reference_photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'Photocurrent (nA)': [2, 2, 2.0, 2, 2, 2]})
    reference_reflectance = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'Reflectance ()': [0.3, 0.4, 0.5, 0.4, 0.3, 0.2]})
    reflectance_desired = pd.DataFrame({
        'Boolergs (pA)': [5, 6.5, 0, 2.5, 4, 3.5],
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R': 0.5*np.array([0.3, 0.4, 0.5, 0.4, 0.3, 0.2])})
    reflectance_actual = normalize_reflectance(
        photocurrent, reference_photocurrent, reference_reflectance,
        column_units=ureg.nm)
    assert_frame_equal(reflectance_actual, reflectance_desired)


def test_normalize_reflectance_extra_data():
    photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1, 2, 2, 3, 3],
        'Amplitude': [0.1, 1, 0.1, 1, 0.1, 1],
        'Photocurrent (nA)': [1, 1, 1, 1, 1, 1]})
    reference_photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'Photocurrent (nA)': [2, 2, 2.0, 2, 2, 2]})
    reference_reflectance = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R': [0.3, 0.4, 0.5, 0.4, 0.3, 0.2]})
    reflectance_desired = pd.DataFrame({
        'Wavelength (nm)': [1, 1, 2, 2, 3, 3],
        'Amplitude': [0.1, 1, 0.1, 1, 0.1, 1],
        'R': 0.5*np.array([0.3, 0.3, 0.5, 0.5, 0.3, 0.3])})
    reflectance_actual = normalize_reflectance(
        photocurrent, reference_photocurrent, reference_reflectance)
    assert_frame_equal(reflectance_actual, reflectance_desired)


def test_normalize_reflectance_extra_data_both():
    photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1, 2, 2, 3, 3],
        'Amplitude': [0.1, 1, 0.1, 1, 0.1, 1],
        'Photocurrent (nA)': [1, 1, 1, 1, 1, 1]})
    reference_photocurrent = pd.DataFrame({
        'Wavelength (nm)': [1, 1, 2, 2, 3, 3],
        'Amplitude': [0.1, 1, 0.1, 1, 0.1, 1],
        'Photocurrent (nA)': [2, 2, 2.0, 2, 2, 2]})
    reference_reflectance = pd.DataFrame({
        'Wavelength (nm)': [1, 1.5, 2, 2.5, 3, 3.5],
        'R': [0.3, 0.4, 0.5, 0.4, 0.3, 0.2]})
    reflectance_desired = pd.DataFrame({
        'Wavelength (nm)': [1, 1, 2, 2, 3, 3],
        'Amplitude': [0.1, 1, 0.1, 1, 0.1, 1],
        'R': 0.5*np.array([0.3, 0.3, 0.5, 0.5, 0.3, 0.3])})
    reflectance_actual = normalize_reflectance(
        photocurrent, reference_photocurrent, reference_reflectance)
    assert_frame_equal(reflectance_actual, reflectance_desired)


def test_interpolate():
    data1 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    data2 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    interpolated_desired = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [
            0,
            0.16666666666666669,
            0.33333333333333337,
            0.5,
            0.6666666666666666,
            0.8333333333333335]})
    interpolated_actual = interpolate(data1, data2)
    assert_frame_equal(interpolated_actual, interpolated_desired)


def test_interpolate_units():
    data1 = pd.DataFrame({
        'Phlargen (mV)': [5, 4, 2, 4, 5, 6],
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    data2 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    interpolated_desired = pd.DataFrame({
        'Phlargen (mV)': [5, 4, 2, 4, 5, 6],
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [
            0,
            0.16666666666666669,
            0.33333333333333337,
            0.5,
            0.6666666666666666,
            0.8333333333333335]})
    interpolated_actual = interpolate(data1, data2, column_units=ureg.ms)
    assert_frame_equal(interpolated_actual, interpolated_desired)


def test_interpolate_big_data1():
"""
Checks that we can handle data1 which is larger in length than data2
"""
    data1 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    data2 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    interpolated_desired = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.06, 0.12, 0.18, 0.24, 0.3, 0.36, 0.42, 0.48, 0.5]})
    interpolated_actual = interpolate(data1, data2)
    assert_frame_equal(interpolated_actual, interpolated_desired)


def test_interpolate_with_offset():
"""
Checks that we can handle data1 which is larger in length than data2
"""
    data1 = pd.DataFrame({
        'Time (ms)': [5, 5.6, 6.2, 6.8, 7.4, 8.0, 8.6, 9.2, 9.8, 10.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    data2 = pd.DataFrame({
        'Time (ms)': np.array([5, 6, 7, 8, 9, 10]),
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5]})
    interpolated_desired = pd.DataFrame({
        'Time (ms)': [5, 5.6, 6.2, 6.8, 7.4, 8.0, 8.6, 9.2, 9.8, 10.4],
        'Current (nA)': np.array([0, 0.06, 0.12, 0.18, 0.24, 0.3, 0.36, 0.42, 0.48, 0.5])})
    interpolated_actual = interpolate(data1, data2)
    assert_frame_equal(interpolated_actual, interpolated_desired)


def test_interpolate_with_yoffset():
"""
Checks that we can handle data1 which is larger in length than data2
"""
    data1 = pd.DataFrame({
        'Time (ms)': [5, 5.6, 6.2, 6.8, 7.4, 8.0, 8.6, 9.2, 9.8, 10.4],
        'Current (nA)': [1.0, 0.95, 0.9, 0.85, 0.8, 0.75, 0.7, 0.65, 0.6, 0.65]})
    data2 = pd.DataFrame({
        'Time (ms)': np.array([5, 6, 7, 8, 9, 10]),
        'Current (nA)': [0.5, 0.45, 0.4, 0.35, 0.3, 0.25]})
    interpolated_desired = pd.DataFrame({
        'Time (ms)': [5, 5.6, 6.2, 6.8, 7.4, 8.0, 8.6, 9.2, 9.8, 10.4],
        'Current (nA)': np.array([0.5, 0.47, 0.44, 0.41, 0.38, 0.35, 0.32, 0.29, 0.26, 0.25])})
    interpolated_actual = interpolate(data1, data2)
    assert_frame_equal(interpolated_actual, interpolated_desired)


def test_interpolate_numpy():
    R_ref = pd.DataFrame({
        'Wavelength (nm)': np.arange(100, 151, 1),
        'Reflectance ()': np.linspace(0, 1, 51)})
    I_ref = pd.DataFrame({
        'Wavelength (nm)': np.arange(100, 150, 5),
        'Photocurrent (nA)': np.linspace(2, 2, 10)})
    I_meas = pd.DataFrame({
        'Wavelength (nm)': np.linspace(110, 140, 30),
        'Photocurrent (nA)': np.linspace(1, 1, 30)})
    R_1 = normalize_pandas(I_meas, I_ref, np.divide,
                           new_name='Reflectance')
    R_actual = normalize_pandas(R_1, R_ref, np.multiply,
                                new_name='Reflectance')
    R_desired = pd.DataFrame({
        'Wavelength (nm)': np.linspace(110, 140, 30),
        'Reflectance': 0.5*np.linspace(0.2, 0.8, 30)})
    assert_frame_equal(R_actual, R_desired)


def test_interpolate_target_units():
    data1 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5],
        'Phlargen (mV)': [5, 4, 2, 4, 5, 6],
    })
    data2 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]})
    interpolated_desired = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [
            0,
            0.16666666666666669,
            0.33333333333333337,
            0.5,
            0.6666666666666666,
            0.8333333333333335],
        'Phlargen (mV)': [5, 4, 2, 4, 5, 6],
    })
    interpolated_actual = interpolate(data1, data2,
                                      column_units=ureg.ms, target_units=ureg.nA)
    assert_frame_equal(interpolated_actual, interpolated_desired)


def test_normalize_target_units():
    data1 = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5],
        'Phlargen (kV)': 1 + np.array([0, 0.1, 0.2, 0.3, 0.4, 0.5])
    })
    data2 = pd.DataFrame({
        'Time (ms)': [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4],
        'Current (nA)': [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9],
        'Phlargen (kV)': 1 + np.array([0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
    })
    multiplied_data_desired = pd.DataFrame({
        'Time (ms)': [0, 1, 2, 3, 4, 5],
        'Phlargen (kV)': 1 + np.array([0, 0.1, 0.2, 0.3, 0.4, 0.5]),
        'power (nA ** 2)': [
            0,
            0.016666666666666669,
            0.06666666666666668,
            0.15,
            0.26666666666666666,
            0.41666666666666674],
    })
    multiplied_data_actual = normalize_pandas(data1, data2,
                                              target_units=ureg.nA)
    assert_frame_equal(multiplied_data_actual, multiplied_data_desired)
| 43.534375 | 99 | 0.517479 | 2,168 | 13,931 | 3.228321 | 0.059502 | 0.01686 | 0.019288 | 0.06801 | 0.900843 | 0.89327 | 0.892413 | 0.881126 | 0.865695 | 0.826832 | 0 | 0.186499 | 0.291795 | 13,931 | 319 | 100 | 43.670846 | 0.522907 | 0.014787 | 0 | 0.795222 | 0 | 0 | 0.104686 | 0 | 0 | 0 | 0 | 0 | 0.05802 | 1 | 0.054608 | false | 0 | 0.017065 | 0 | 0.071672 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
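The interpolation expectations above all reduce to piecewise-linear resampling of data2's y-values onto data1's x-axis, then combining the y-columns. A dependency-free sketch of that resampling, which mirrors the tested behavior rather than the library's actual implementation:

```python
# Hedged sketch: pure-Python linear interpolation reproducing the expected
# values in test_interpolate above (sugarplot itself may implement this
# differently, e.g. via numpy).
def lin_interp(x, xs, ys):
    """Piecewise-linear interpolation over sorted xs, clamped at both ends."""
    if x <= xs[0]:
        return ys[0]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return ys[-1]


# data2's axes from test_interpolate, resampled onto data1's time points.
xs = [0, 0.6, 1.2, 1.8, 2.4, 3.0, 3.6, 4.2, 4.8, 5.4]
ys = [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
resampled = [lin_interp(t, xs, ys) for t in [0, 1, 2, 3, 4, 5]]
```

`resampled` reproduces the `interpolated_desired` column in `test_interpolate` (0.1666… at t=1, 0.8333… at t=5), and multiplying it elementwise by data1's current column reproduces the `'power (nA ** 2)'` expectations.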
6a89b3511dc85b0428965768ee5bc3d80e3c227b | 21,556 | py | Python | tests/unit/test_snap.py | jnpr-bowen/jsnapy | 8463151fdf4a4f551ba00400ce849c59e300ab2b | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | tests/unit/test_snap.py | jnpr-bowen/jsnapy | 8463151fdf4a4f551ba00400ce849c59e300ab2b | [
"Apache-2.0",
"BSD-3-Clause"
] | 1 | 2018-03-19T08:52:21.000Z | 2018-03-19T08:52:21.000Z | tests/unit/test_snap.py | jnpr-bowen/jsnapy | 8463151fdf4a4f551ba00400ce849c59e300ab2b | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | import unittest
import yaml
import os
from jnpr.jsnapy.snap import Parser
from jnpr.jsnapy import SnapAdmin
import jnpr.junos.device
from jnpr.junos.device import Device
from mock import patch, mock_open, ANY, call, MagicMock
#from contextlib import nested
from nose.plugins.attrib import attr
@attr('unit')
class TestSnap(unittest.TestCase):
def setUp(self):
self.diff = False
self.hostname = "1.1.1.1"
self.db = dict()
self.db['store_in_sqlite'] = False
self.db['check_from_sqlite'] = False
self.db['db_name'] = ""
self.db['first_snap_id'] = None
self.db['second_snap_id'] = None
self.output_file = "abc"
self.logger_snap = MagicMock()
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
@patch('jnpr.jsnapy.snap.logging.getLogger')
def test_snap(self, mock_log, mock_etree, mock_dev):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
m_op = mock_open()
with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
self.assertEqual(prs.command_list, ['show chassis fpc'])
self.assertEqual(prs.rpc_list, [])
self.assertEqual(prs.test_included, ['check_chassis_fpc'])
dev.close()
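The test above patches the module-level `open` with `mock_open` so `generate_reply` never writes a real snapshot file. A minimal stdlib-only sketch of that pattern, using a hypothetical `save_note` function in place of the code under test:

```python
from unittest.mock import mock_open, patch

def save_note(path, text):
    # Hypothetical function under test: writes text to a file.
    with open(path, 'w') as fh:
        fh.write(text)

m = mock_open()
# Patch the builtin open so save_note never touches the disk.
with patch('builtins.open', m):
    save_note('notes.txt', 'hello')

# The mock records both the open() call and the data written to the fake handle.
m.assert_called_once_with('notes.txt', 'w')
m.return_value.write.assert_called_once_with('hello')
```

In the real tests the target is `'jnpr.jsnapy.snap.open'` with `create=True`, because on Python 2 the builtin is not an attribute of the module until it is referenced.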
@patch('jnpr.jsnapy.jsnapy.get_path')
@patch('sys.exit')
@patch('argparse.ArgumentParser.print_help')
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
@patch('jnpr.jsnapy.snap.logging.getLogger')
def test_snap_2(self, mock_log, mock_etree, mock_dev, mock_parser, mock_exit, mock_path):
js = SnapAdmin()
conf_file = os.path.join(os.path.dirname(__file__),
'configs', 'main.yml')
config_file = open(conf_file, 'r')
js.main_file = yaml.load(config_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
m_op = mock_open()
        with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
js.generate_rpc_reply(
dev,
self.output_file,
"1.1.1.1",
js.main_file)
self.assertTrue(mock_path.called)
dev.close()
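Note the parameter order in `test_snap_2`: stacked `@patch` decorators are applied bottom-up, so the decorator closest to the function supplies the first mock argument and the topmost decorator supplies the last. A small stdlib-only illustration of that ordering:

```python
import os
from unittest.mock import patch

@patch('os.getcwd')        # outermost decorator -> last mock parameter
@patch('os.path.exists')   # innermost decorator -> first mock parameter
def check(mock_exists, mock_getcwd):
    mock_exists.return_value = 'from exists'
    mock_getcwd.return_value = 'from getcwd'
    # Each call is routed to the mock that actually patched it.
    return os.path.exists('x'), os.getcwd()

result = check()
# result == ('from exists', 'from getcwd') confirms the bottom-up order
```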
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_snap_3(self, mock_etree, mock_parse):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
with patch('jnpr.junos.rpcmeta._RpcMetaExec.cli') as mock_cli:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
mock_cli.assert_called_once_with('show chassis fpc', format='xml')
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_snap_4(self, mock_etree, mock_parse):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta_text.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
with patch('jnpr.junos.rpcmeta._RpcMetaExec.cli') as mock_cli:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
mock_cli.assert_called_once_with('show chassis fpc', format='text')
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_snap_5(self, mock_etree, mock_parse):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta_error.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
with patch('logging.Logger.error') as mock_log:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
c = mock_log.call_args_list[0]
self.assertNotEqual(c[0][0].find("ERROR occurred"), -1)
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
def test_rpc_1(self, mock_etree, mock_dev):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user_mock",
passwd="xyz")
dev.open()
m_op = mock_open()
with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
self.assertEqual(prs.command_list, [])
self.assertEqual(
prs.rpc_list, [
'get-config', 'get-interface-information'])
self.assertEqual(
prs.test_included, [
'test_rpc_version', 'test_interface'])
dev.close()
@patch('jnpr.junos.rpcmeta._RpcMetaExec.get_config')
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_rpc_2(self, mock_etree, mock_parse, mock_config):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
        with patch('jnpr.junos.rpcmeta._RpcMetaExec.__getattr__') as mock_rpc:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
mock_rpc.assert_called_once_with('get_interface_information')
mock_config.assert_called_once_with(
options={
'format': 'xml'},
filter_xml=ANY)
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_rpc_3(self, mock_etree, mock_parse, mock_dev):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc_error.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
with patch('logging.Logger.error') as mock_log:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
c = mock_log.call_args_list[0]
self.assertNotEqual(
c[0][0].find("ERROR!!, filtering rpc works only for 'get-config' rpc"), -1)
dev.close()
@patch('jnpr.junos.rpcmeta._RpcMetaExec.get_config')
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_rpc_4(self, mock_etree, mock_parse, mock_config):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc_2.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
        with patch('jnpr.junos.rpcmeta._RpcMetaExec.__getattr__') as mock_rpc:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
mock_rpc.assert_called_once_with('get_interface_information')
mock_config.assert_called_once_with(options={'format': 'xml'})
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_rpc_5(self, mock_etree, mock_parse):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc_error_2.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
with patch('logging.Logger.error') as mock_log:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
c = mock_log.call_args_list[0]
self.assertNotEqual(c[0][0].find("ERROR occurred"), -1)
@patch('jnpr.jsnapy.snap.Parser._write_file')
@patch('jnpr.jsnapy.snap.etree')
def test_rpc_6(self, mock_etree, mock_parse):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc_2_error.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="xyz",
passwd="abc")
with patch('logging.Logger.error') as mock_log:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
c = mock_log.call_args_list[0]
self.assertNotEqual(c[0][0].find("ERROR occurred"), -1)
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
@patch('jnpr.jsnapy.snap.JsnapSqlite')
def test_snap_sqlite_1(self, mock_sqlite, mock_etree, mock_dev):
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
m_op = mock_open()
self.db['store_in_sqlite'] = True
self.db['db_name'] = "abc.db"
with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"1.1.1.1",
self.db)
mock_sqlite.assert_called_once_with('1.1.1.1', 'abc.db')
dev.close()
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
@patch('jnpr.jsnapy.snap.JsnapSqlite.__init__')
@patch('jnpr.jsnapy.snap.JsnapSqlite.insert_data')
def test_snap_sqlite_2(self, mock_insert, mock_init, mock_etree, mock_dev):
mock_init.return_value = None
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
m_op = mock_open()
with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
prs.generate_reply(
test_file,
dev,
"1.1.1.1_snap_mock",
"01.216.193.114",
self.db)
self.assertFalse(mock_insert.called)
self.assertFalse(mock_init.called)
dev.close()
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
@patch('jnpr.jsnapy.snap.JsnapSqlite.__init__')
@patch('jnpr.jsnapy.snap.Parser._check_reply')
@patch('jnpr.jsnapy.snap.JsnapSqlite.insert_data')
def test_snap_sqlite_3(
self, mock_insert, mock_reply, mock_init, mock_etree, mock_dev):
mock_init.return_value = None
prs = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'delta.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
m_op = mock_open()
self.db['store_in_sqlite'] = True
self.db['db_name'] = "abc.db"
with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
prs.generate_reply(
test_file,
dev,
"snap_mock",
"1.1.1.1",
self.db)
db_dict = dict()
db_dict['cli_command'] = 'show_chassis_fpc'
db_dict['snap_name'] = "snap_mock"
db_dict['filename'] = "1.1.1.1" + \
"_" "snap_mock" + "_" + "show_chassis_fpc" + "." + "xml"
db_dict['format'] = "xml"
db_dict['data'] = mock_reply()
mock_insert.assert_called_once_with(db_dict)
dev.close()
@patch('jnpr.junos.device.Device')
@patch('jnpr.jsnapy.snap.etree')
@patch('jnpr.jsnapy.snap.JsnapSqlite.__init__')
@patch('jnpr.jsnapy.snap.Parser._check_reply')
@patch('jnpr.jsnapy.snap.JsnapSqlite.insert_data')
def test_snap_sqlite_4(
self, mock_insert, mock_reply, mock_init, mock_etree, mock_dev):
mock_init.return_value = None
prs = Parser()
calls = []
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'test_rpc.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = jnpr.junos.device.Device(
host="1.1.1.1",
user="user",
passwd="xyz")
dev.open()
m_op = mock_open()
self.db['store_in_sqlite'] = True
self.db['db_name'] = "abc.db"
with patch('jnpr.jsnapy.snap.open', m_op, create=True) as m_open:
prs.generate_reply(
test_file,
dev,
"snap_mock",
"1.1.1.1",
self.db)
db_dict = dict()
db_dict['cli_command'] = 'get-config'
db_dict['snap_name'] = "snap_mock_tcxj9UUkDu6z-Jv5vBTBtA=="
db_dict['filename'] = "1.1.1.1" + "_" + \
"snap_mock_tcxj9UUkDu6z-Jv5vBTBtA==" + "_" + "get-config" + "." + "xml"
db_dict['format'] = 'xml'
db_dict['data'] = mock_reply()
calls.append(call(db_dict))
db_dict2 = db_dict.copy()
db_dict2['snap_name'] = "snap_mock_8E_z-UpxqzKuorLyN66fYA=="
db_dict2['cli_command'] = 'get-interface-information'
db_dict2['filename'] = "1.1.1.1" + "_" + \
"snap_mock_8E_z-UpxqzKuorLyN66fYA==" + "_" + "get-interface-information" + "." + "xml"
calls.append(call(db_dict2))
mock_insert.assert_has_calls(calls)
dev.close()
@patch('logging.Logger.info')
def test_write_file(self, mock_info):
par = Parser()
res = par._check_reply(True, 'xml')
self.assertEqual(res, '')
mock_info.assert_called()
@patch('logging.Logger.error')
@patch('ncclient.manager.connect')
def test_generate_reply_error_1(self, mock_dev, mock_err):
par = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'bogus_testfile_1.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = Device(user='1.1.1.1', host='abc', passwd='xyz')
dev.open()
par.generate_reply(test_file, dev, '1.1.1.1_snap_mock', self.hostname, self.db)
mock_err.assert_called()
@patch('logging.Logger.error')
@patch('ncclient.manager.connect')
def test_generate_reply_error_2(self, mock_dev, mock_err):
par = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'bogus_testfile_2.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = Device(user='1.1.1.1', host='abc', passwd='xyz')
dev.open()
par.generate_reply(test_file, dev, '1.1.1.1_snap_mock', self.hostname, self.db)
mock_err.assert_called()
@patch('jnpr.jsnapy.snap.Parser._write_warning')
@patch('jnpr.junos.rpcmeta._RpcMetaExec.cli')
@patch('logging.Logger.error')
@patch('lxml.etree.tostring')
@patch('ncclient.manager.connect')
def test_generate_reply_error_3(self, mock_dev, mock_tostring, mock_err, mock_cli, mock_write_warn):
from jnpr.junos.exception import RpcError
mock_cli.side_effect = RpcError
par = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'bogus_testfile_3.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = Device(host='10.221.136.250', user='abc', passwd='xyz')
dev.open()
par.generate_reply(test_file, dev, 'mock.xml', self.hostname, self.db)
mock_err.assert_called()
mock_write_warn.assert_called()
@patch('jnpr.jsnapy.snap.Parser._write_warning')
@patch('jnpr.junos.rpcmeta._RpcMetaExec.__getattr__')
@patch('logging.Logger.error')
@patch('lxml.etree.tostring')
@patch('ncclient.manager.connect')
def test_generate_reply_rpc_error_4(self, mock_dev, mock_tostring, mock_err, mock_rpc, mock_write_warn):
from jnpr.junos.exception import RpcError
mock_rpc.side_effect = RpcError
par = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'bogus_testfile_4.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = Device(host='10.221.136.250', user='abc', passwd='xyz')
dev.open()
par.generate_reply(test_file, dev, 'mock.xml', self.hostname, self.db)
mock_err.assert_called()
mock_write_warn.assert_called()
@patch('jnpr.jsnapy.snap.Parser._write_warning')
@patch('jnpr.junos.rpcmeta._RpcMetaExec.__getattr__')
@patch('logging.Logger.error')
@patch('lxml.etree.tostring')
@patch('ncclient.manager.connect')
def test_generate_reply_rpc_error_5(self, mock_dev, mock_tostring, mock_err, mock_rpc, mock_write_warn):
from jnpr.junos.exception import RpcError
mock_rpc.side_effect = RpcError
par = Parser()
test_file = os.path.join(os.path.dirname(__file__),
'configs', 'bogus_testfile_5.yml')
test_file = open(test_file, 'r')
test_file = yaml.load(test_file)
dev = Device(host='10.221.136.250', user='abc', passwd='xyz')
dev.open()
par.generate_reply(test_file, dev, 'mock.xml', self.hostname, self.db)
mock_err.assert_called()
mock_write_warn.assert_called()
@patch('logging.Logger.info')
def test_write_file_rpc_reply_true(self, mock_log):
par = Parser()
par._write_file(True, 'xml', 'mock.xml')
c = mock_log.call_args_list[0]
self.assertNotEqual(
c[0][0].find("Output of requested Command/RPC is empty"), -1)
@patch('jnpr.jsnapy.snap.Parser.store_in_sqlite')
def test_write_warning(self, mock_store_data):
self.db['store_in_sqlite'] = True
par = Parser()
        par._write_warning("mock_reply", self.db, 'mock.xml', self.hostname,
                           'mock_cmd', 'text', 'mock_output')
mock_store_data.assert_called()
# with nested(
# patch('jnpr.jsnapy.snap.logging.getLogger'),
# patch('logging.Logger'),
# patch('jnpr.jsnapy.snap.logging.getLogger')
# )as (mock_logger, mock_log, mock_log1):
# if __name__ == "__main__":
# suite = unittest.TestLoader().loadTestsFromTestCase(TestSnap)
# unittest.TextTestRunner(verbosity=2).run(suite)
#!/usr/bin/env python
# coding: utf-8
# In[ ]:
# View dataset directory. This directory will be recovered automatically after resetting environment.
get_ipython().system('ls /home/aistudio/data')
# In[ ]:
# View personal work directory. All changes under this directory will be kept even after reset. Please clean unnecessary files in time to speed up environment loading.
get_ipython().system('ls /home/aistudio/work')
# Please click [here](https://ai.baidu.com/docs#/AIStudio_Project_Notebook/a38e5576) for more detailed instructions.
# # Assignment 5
# ---
# 2017326603075 Chen Haojun
#
# ## Problem 1
# + After adding two more Linear layers and a ReLU activation, large gaps between the regression fit and the actual values became less frequent. With the single-layer model, since the training batches are drawn randomly, the prediction sometimes differed from the actual value by more than 50%; after adding the layers such large losses still occur, but only very rarely.
# + 13 -> 8 -> 4 -> 1 (FC)
# In[1]:
# Two problems in total:
# 1 Change the house-price prediction into a multi-layer fully connected model with activation functions, and compare how it performs against the linear regression model
import paddle
import paddle.fluid as fluid
import paddle.fluid.dygraph as dygraph
from paddle.fluid.dygraph import Linear
import numpy as np
import os
import random
def load_data():
    # Load the data from the file
datafile = './work/housing.data.csv'
data = np.fromfile(datafile, sep=' ')
    # Each record has 14 fields: the first 13 are feature columns, the 14th is the median house price
feature_names = [ 'CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD', 'TAX', 'PTRATIO', 'B', 'LSTAT', 'MEDV' ]
feature_num = len(feature_names)
    # Reshape the raw data into shape [N, 14]
data = data.reshape([data.shape[0] // feature_num, feature_num])
    # Split the original dataset into training and test sets
    # Use 80% of the data for training and 20% for testing
    # The training and test sets must not overlap
ratio = 0.8
offset = int(data.shape[0] * ratio)
training_data = data[:offset]
    # Compute the max, min and mean of the training set
maximums, minimums, avgs = training_data.max(axis=0), training_data.min(axis=0), training_data.sum(axis=0) / training_data.shape[0]
    # Record the normalization parameters so the data can be normalized the same way at inference time
global max_values
global min_values
global avg_values
max_values = maximums
min_values = minimums
avg_values = avgs
    # Normalize the data
for i in range(feature_num):
#print(maximums[i], minimums[i], avgs[i])
data[:, i] = (data[:, i] - avgs[i]) / (maximums[i] - minimums[i])
    # Train/test split
training_data = data[:offset]
test_data = data[offset:]
return training_data, test_data
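The per-feature normalization above maps each column to `(x - mean) / (max - min)` using statistics computed from the training split only, and inference later inverts it with `x * (max - min) + mean`. A plain-Python sketch of that formula on one toy column:

```python
values = [10.0, 20.0, 60.0]
mx, mn = max(values), min(values)
avg = sum(values) / len(values)  # 30.0
# Forward normalization, as in load_data()
normalized = [(v - avg) / (mx - mn) for v in values]
# (10-30)/50 = -0.4, (20-30)/50 = -0.2, (60-30)/50 = 0.6
# Inverse transform, as applied to the model output at inference time
restored = [n * (mx - mn) + avg for n in normalized]
```

Note this is not standard min-max scaling to [0, 1]: centering on the mean gives a range that straddles zero, which is what the saved `avg_values`/`max_values`/`min_values` globals reproduce at prediction time.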
print('Single Layer FC Linear Model')
class Regressor(fluid.dygraph.Layer):
def __init__(self, name_scope):
super(Regressor, self).__init__(name_scope)
name_scope = self.full_name()
        # Define one fully connected layer with output dimension 1 and no activation function
self.fc = Linear(input_dim=13, output_dim=1, act=None)
    # Forward computation of the network
def forward(self, inputs):
x = self.fc(inputs)
return x
# Set up the PaddlePaddle dynamic-graph working environment
with fluid.dygraph.guard():
    # Instantiate the linear regression model defined above
model = Regressor("Regressor")
    # Switch the model to training mode
model.train()
    # Load the data
training_data, test_data = load_data()
    # Define the optimizer; stochastic gradient descent (SGD) is used here
    # The learning rate is set to 0.01
opt = fluid.optimizer.SGD(learning_rate=0.01, parameter_list=model.parameters())
# Start training
with dygraph.guard(fluid.CPUPlace()):
    EPOCH_NUM = 10   # number of epochs (outer loop)
    BATCH_SIZE = 10  # batch size
    # Outer loop over epochs
for epoch_id in range(EPOCH_NUM):
        # Shuffle the training data before each epoch
np.random.shuffle(training_data)
        # Split the training data into mini-batches of 10 records each
mini_batches = [training_data[k:k+BATCH_SIZE] for k in range(0, len(training_data), BATCH_SIZE)]
        # Inner loop over mini-batches
for iter_id, mini_batch in enumerate(mini_batches):
            x = np.array(mini_batch[:, :-1]).astype('float32')  # features of the current batch
            y = np.array(mini_batch[:, -1:]).astype('float32')  # labels (actual prices) of the current batch
            # Convert the numpy data into dygraph variables
house_features = dygraph.to_variable(x)
prices = dygraph.to_variable(y)
            # Forward pass
predicts = model(house_features)
            # Compute the loss
loss = fluid.layers.square_error_cost(predicts, label=prices)
avg_loss = fluid.layers.mean(loss)
if iter_id%20==0:
# print("epoch: {}, iter: {}, loss is: {}".format(epoch_id, iter_id, avg_loss.numpy()))
pass
            # Backpropagation
avg_loss.backward()
            # Minimize the loss and update the parameters
opt.minimize(avg_loss)
            # Clear the gradients
model.clear_gradients()
    # Save the model
fluid.save_dygraph(model.state_dict(), 'LR_model')
def load_one_example(data_dir):
f = open(data_dir, 'r')
datas = f.readlines()
    # Use the 10th record from the end for testing
tmp = datas[-10]
tmp = tmp.strip().split()
one_data = [float(v) for v in tmp]
    # Normalize the data
for i in range(len(one_data)-1):
one_data[i] = (one_data[i] - avg_values[i]) / (max_values[i] - min_values[i])
data = np.reshape(np.array(one_data[:-1]), [1, -1]).astype(np.float32)
label = one_data[-1]
return data, label
with dygraph.guard():
    # The argument is the path of the saved model parameters
model_dict, _ = fluid.load_dygraph('LR_model')
model.load_dict(model_dict)
model.eval()
    # The argument is the path of the dataset file
test_data, label = load_one_example('./work/housing.data.csv')
    # Convert the data to dygraph variable format
test_data = dygraph.to_variable(test_data)
results = model(test_data)
    # De-normalize the result
results = results * (max_values[-1] - min_values[-1]) + avg_values[-1]
print("Inference result is {}, the corresponding label is {}".format(results.numpy(), label))
# In[3]:
import paddle
import paddle.fluid as fluid
import paddle.fluid.dygraph as dygraph
from paddle.fluid.dygraph import Linear
import numpy as np
import os
import random
def load_data():
    # Load the data from the file
datafile = './work/housing.data.csv'
data = np.fromfile(datafile, sep=' ')
    # Each record has 14 fields: the first 13 are feature columns, the 14th is the median house price
feature_names = [ 'CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD', 'TAX', 'PTRATIO', 'B', 'LSTAT', 'MEDV' ]
feature_num = len(feature_names)
    # Reshape the raw data into shape [N, 14]
data = data.reshape([data.shape[0] // feature_num, feature_num])
    # Split the original dataset into training and test sets
    # Use 80% of the data for training and 20% for testing
    # The training and test sets must not overlap
ratio = 0.8
offset = int(data.shape[0] * ratio)
training_data = data[:offset]
    # Compute the max, min and mean of the training set
maximums, minimums, avgs = training_data.max(axis=0), training_data.min(axis=0), training_data.sum(axis=0) / training_data.shape[0]
    # Record the normalization parameters so the data can be normalized the same way at inference time
global max_values
global min_values
global avg_values
max_values = maximums
min_values = minimums
avg_values = avgs
    # Normalize the data
for i in range(feature_num):
#print(maximums[i], minimums[i], avgs[i])
data[:, i] = (data[:, i] - avgs[i]) / (maximums[i] - minimums[i])
    # Train/test split
training_data = data[:offset]
test_data = data[offset:]
return training_data, test_data
print('Multi Layer FC Linear Model')
class Regressor(fluid.dygraph.Layer):
def __init__(self, name_scope):
super(Regressor, self).__init__(name_scope)
name_scope = self.full_name()
        # Three fully connected layers (13 -> 8 -> 4 -> 1), with ReLU on the two hidden layers and no activation on the output
self.fc = Linear(input_dim=13, output_dim=8, act='relu')
self.sc = Linear(input_dim=8, output_dim=4, act='relu')
self.tc = Linear(input_dim=4, output_dim=1, act=None)
    # Forward computation of the network
def forward(self, inputs):
x = self.fc(inputs)
x = self.sc(x)
x = self.tc(x)
return x
# Set up the PaddlePaddle dynamic-graph working environment
with fluid.dygraph.guard():
    # Instantiate the regression model defined above
model = Regressor("Regressor")
    # Switch the model to training mode
model.train()
    # Load the data
training_data, test_data = load_data()
    # Define the optimizer; stochastic gradient descent (SGD) is used here
    # The learning rate is set to 0.01
opt = fluid.optimizer.SGD(learning_rate=0.01, parameter_list=model.parameters())
# Start training
with dygraph.guard(fluid.CPUPlace()):
    EPOCH_NUM = 10   # number of epochs (outer loop)
    BATCH_SIZE = 10  # batch size
    # Outer loop over epochs
for epoch_id in range(EPOCH_NUM):
        # Shuffle the training data before each epoch
np.random.shuffle(training_data)
        # Split the training data into mini-batches of 10 records each
mini_batches = [training_data[k:k+BATCH_SIZE] for k in range(0, len(training_data), BATCH_SIZE)]
        # Inner loop over mini-batches
for iter_id, mini_batch in enumerate(mini_batches):
            x = np.array(mini_batch[:, :-1]).astype('float32')  # features of the current batch
            y = np.array(mini_batch[:, -1:]).astype('float32')  # labels (actual prices) of the current batch
            # Convert the numpy data into dygraph variables
house_features = dygraph.to_variable(x)
prices = dygraph.to_variable(y)
            # Forward pass
predicts = model(house_features)
            # Compute the loss
loss = fluid.layers.square_error_cost(predicts, label=prices)
avg_loss = fluid.layers.mean(loss)
if iter_id%20==0:
# print("epoch: {}, iter: {}, loss is: {}".format(epoch_id, iter_id, avg_loss.numpy()))
pass
            # Backpropagation
avg_loss.backward()
            # Minimize the loss and update the parameters
opt.minimize(avg_loss)
            # Clear the gradients
model.clear_gradients()
    # Save the model
fluid.save_dygraph(model.state_dict(), 'LR_model')
def load_one_example(data_dir):
f = open(data_dir, 'r')
datas = f.readlines()
    # Use the 10th record from the end for testing
tmp = datas[-10]
tmp = tmp.strip().split()
one_data = [float(v) for v in tmp]
    # Normalize the data
for i in range(len(one_data)-1):
one_data[i] = (one_data[i] - avg_values[i]) / (max_values[i] - min_values[i])
data = np.reshape(np.array(one_data[:-1]), [1, -1]).astype(np.float32)
label = one_data[-1]
return data, label
with dygraph.guard():
    # The argument is the path of the saved model parameters
model_dict, _ = fluid.load_dygraph('LR_model')
model.load_dict(model_dict)
model.eval()
    # The argument is the path of the dataset file
test_data, label = load_one_example('./work/housing.data.csv')
    # Convert the data to dygraph variable format
test_data = dygraph.to_variable(test_data)
results = model(test_data)
    # De-normalize the result
results = results * (max_values[-1] - min_values[-1]) + avg_values[-1]
print("Inference result is {}, the corresponding label is {}".format(results.numpy(), label))
# ## Problem 2
# + Change the `fluid.CPUPlace()` passed to guard to `fluid.CUDAPlace(0)`
# <br/>
# The environment's accelerator is an Nvidia card, so the acceleration units are CUDA cores; 0 is the device index, which is 0 on a single-card machine
# ---
# Time (seconds) dropped from 258.6152288913727 to 28.782493352890015
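The CPU/GPU numbers above come from wrapping the training loop in `time()` calls, exactly as the code below does with `start = time()` / `end = time()`. A minimal sketch of that wall-clock measurement pattern, with a cheap loop standing in for the training run:

```python
from time import time

start = time()
total = sum(i * i for i in range(100000))  # stand-in workload
end = time()
elapsed = end - start
print('Time', elapsed)
```

`time()` measures wall-clock seconds, so the difference includes everything that happened between the two calls (data loading, host-device transfers, etc.), which is what makes it a fair end-to-end CPU-vs-GPU comparison here.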
# In[4]:
#2 Watch the video and fix the code below so that it runs correctly (some code is missing); then find the hyperparameters you think work best by changing the number of convolution kernels, adding convolution layers, or changing the activation functions
#2.2 Modify the code so it runs on GPU and compare the run time with running on CPU
import os
import random
import paddle
import paddle.fluid as fluid
from paddle.fluid.dygraph.nn import Conv2D, Pool2D, Linear
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image
from time import *
import gzip
import json
# Define the dataset reader
def load_data(mode='train'):
    # Data file
datafile = './work/mnist.json.gz'
print('loading mnist dataset from {} ......'.format(datafile))
data = json.load(gzip.open(datafile))
train_set, val_set, eval_set = data
    # Dataset parameters: image height IMG_ROWS, image width IMG_COLS
IMG_ROWS = 28
IMG_COLS = 28
if mode == 'train':
imgs = train_set[0]
labels = train_set[1]
elif mode == 'valid':
imgs = val_set[0]
labels = val_set[1]
elif mode == 'eval':
imgs = eval_set[0]
labels = eval_set[1]
imgs_length = len(imgs)
assert len(imgs) == len(labels), "length of train_imgs({}) should be the same as train_labels({})".format(
len(imgs), len(labels))
index_list = list(range(imgs_length))
    # Batch size used when reading the data
BATCHSIZE = 100
    # Define the data generator
def data_generator():
if mode == 'train':
random.shuffle(index_list)
imgs_list = []
labels_list = []
for i in index_list:
img = np.reshape(imgs[i], [1, IMG_ROWS, IMG_COLS]).astype('float32')
label = np.reshape(labels[i], [1]).astype('float32')
imgs_list.append(img)
labels_list.append(label)
if len(imgs_list) == BATCHSIZE:
yield np.array(imgs_list), np.array(labels_list)
imgs_list = []
labels_list = []
        # If fewer than BATCHSIZE records remain,
        # emit them together as one mini-batch of size len(imgs_list)
if len(imgs_list) > 0:
yield np.array(imgs_list), np.array(labels_list)
return data_generator
# Multi-layer convolutional neural network
class MNIST(fluid.dygraph.Layer):
def __init__(self, name_scope):
super(MNIST, self).__init__(name_scope)
        # Convolution layer: num_filters=20 output channels, filter_size=5, stride=1, padding=2
        # ReLU activation
self.conv1 = Conv2D(num_channels=1, num_filters=20, filter_size=5, stride=1, padding=2, act='relu')
        # Pooling layer: pool_size=2, pool stride 2, average pooling
self.pool1 = Pool2D(pool_size=2, pool_stride=2, pool_type='avg')
        # Convolution layer: num_filters=20 output channels, filter_size=5, stride=1, padding=2
self.conv2 = Conv2D(num_channels=20, num_filters=20, filter_size=5, stride=1, padding=2, act='relu')
        # Pooling layer: pool_size=2, pool stride 2, max pooling
self.pool2 = Pool2D(pool_size=2, pool_stride=2, pool_type='max')
        # Fully connected layer with output dimension 1, no activation
self.fc = Linear(input_dim=980, output_dim=1, act=None)
    # Forward pass: each convolution is followed by pooling, and a fully connected layer produces the final output
def forward(self, inputs):
x = self.conv1(inputs)
x = self.pool1(x)
x = self.conv2(x)
x = self.pool2(x)
x = fluid.layers.reshape(x, [x.shape[0], -1])
x = self.fc(x)
return x
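The `input_dim=980` of the final Linear layer follows directly from the shapes: each 5x5 convolution with padding 2 and stride 1 preserves the 28x28 size, each 2x2 pooling with stride 2 halves it, and conv2 keeps 20 channels, so the flattened feature vector is 20 * 7 * 7 = 980. The arithmetic:

```python
h = w = 28              # MNIST image size (the 5x5 convs with padding 2 keep it unchanged)
h, w = h // 2, w // 2   # after pool1: 14 x 14
h, w = h // 2, w // 2   # after pool2: 7 x 7
channels = 20           # num_filters of conv2
fc_input_dim = channels * h * w  # 980
```

Any change to the kernel count or number of conv/pool layers when tuning the hyperparameters must be matched by recomputing this value for `self.fc`.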
start = time()
# The code after the network definition stays unchanged
with fluid.dygraph.guard(fluid.CPUPlace()):
model = MNIST("mnist")
model.train()
    # Call the data-loading function
train_loader = load_data('train')
optimizer = fluid.optimizer.SGDOptimizer(learning_rate=0.01, parameter_list=model.parameters())
EPOCH_NUM = 5
for epoch_id in range(EPOCH_NUM):
for batch_id, data in enumerate(train_loader()):
            # Prepare the data
image_data, label_data = data
image = fluid.dygraph.to_variable(image_data)
label = fluid.dygraph.to_variable(label_data)
            # Forward pass
predict = model(image)
            # Compute the loss, averaged over the batch
loss = fluid.layers.square_error_cost(predict, label)
avg_loss = fluid.layers.mean(loss)
            # Print the current loss every 200 batches
if batch_id % 200 == 0:
print("epoch: {}, batch: {}, loss is: {}".format(epoch_id, batch_id, avg_loss.numpy()))
            # Backpropagation and parameter update
avg_loss.backward()
optimizer.minimize(avg_loss)
model.clear_gradients()
    # Save the model parameters
end = time()
print('Time', end-start)
fluid.save_dygraph(model.state_dict(), 'mnist')
# In[13]:
#2 Watch the video and fix the code below so that it runs correctly (some code is missing); then find the hyperparameters you think work best by changing the number of convolution kernels, adding convolution layers, or changing the activation functions
#2.2 Modify the code so it runs on GPU and compare the run time with running on CPU
import os
import random
import paddle
import paddle.fluid as fluid
from paddle.fluid.dygraph.nn import Conv2D, Pool2D, Linear
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image
from time import *
import gzip
import json
# Define the dataset reader
def load_data(mode='train'):
    # Data file
datafile = './work/mnist.json.gz'
print('loading mnist dataset from {} ......'.format(datafile))
data = json.load(gzip.open(datafile))
train_set, val_set, eval_set = data
    # Dataset parameters: image height IMG_ROWS, image width IMG_COLS
IMG_ROWS = 28
IMG_COLS = 28
if mode == 'train':
imgs = train_set[0]
labels = train_set[1]
elif mode == 'valid':
imgs = val_set[0]
labels = val_set[1]
elif mode == 'eval':
imgs = eval_set[0]
labels = eval_set[1]
imgs_length = len(imgs)
assert len(imgs) == len(labels), "length of train_imgs({}) should be the same as train_labels({})".format(
len(imgs), len(labels))
index_list = list(range(imgs_length))
    # Batch size used when reading the data
BATCHSIZE = 100
    # Define the data generator
def data_generator():
if mode == 'train':
random.shuffle(index_list)
imgs_list = []
labels_list = []
for i in index_list:
img = np.reshape(imgs[i], [1, IMG_ROWS, IMG_COLS]).astype('float32')
label = np.reshape(labels[i], [1]).astype('float32')
imgs_list.append(img)
labels_list.append(label)
if len(imgs_list) == BATCHSIZE:
yield np.array(imgs_list), np.array(labels_list)
imgs_list = []
labels_list = []
        # If fewer than BATCHSIZE records remain,
        # emit them together as one mini-batch of size len(imgs_list)
if len(imgs_list) > 0:
yield np.array(imgs_list), np.array(labels_list)
return data_generator
# Multi-layer convolutional neural network
class MNIST(fluid.dygraph.Layer):
def __init__(self, name_scope):
super(MNIST, self).__init__(name_scope)
        # Convolution layer: num_filters=20 output channels, filter_size=5, stride=1, padding=2
        # ReLU activation
self.conv1 = Conv2D(num_channels=1, num_filters=20, filter_size=5, stride=1, padding=2, act='relu')
        # Pooling layer: pool_size=2, pool stride 2, average pooling
self.pool1 = Pool2D(pool_size=2, pool_stride=2, pool_type='avg')
        # Convolution layer: num_filters=20 output channels, filter_size=5, stride=1, padding=2
self.conv2 = Conv2D(num_channels=20, num_filters=20, filter_size=5, stride=1, padding=2, act='relu')
        # Pooling layer: pool_size=2, pool stride 2, max pooling
self.pool2 = Pool2D(pool_size=2, pool_stride=2, pool_type='max')
        # Fully connected layer with output dimension 1, no activation
self.fc = Linear(input_dim=980, output_dim=1, act=None)
    # Forward pass: each convolution is followed by pooling, and a fully connected layer produces the final output
def forward(self, inputs):
x = self.conv1(inputs)
x = self.pool1(x)
x = self.conv2(x)
x = self.pool2(x)
x = fluid.layers.reshape(x, [x.shape[0], -1])
x = self.fc(x)
return x
start = time()
#网络结构部分之后的代码,保持不变
with fluid.dygraph.guard(fluid.CUDAPlace(0)):
model = MNIST("mnist")
model.train()
#调用加载数据的函数
train_loader = load_data('train')
optimizer = fluid.optimizer.SGDOptimizer(learning_rate=0.01, parameter_list=model.parameters())
EPOCH_NUM = 5
for epoch_id in range(EPOCH_NUM):
for batch_id, data in enumerate(train_loader()):
#准备数据
image_data, label_data = data
image = fluid.dygraph.to_variable(image_data)
label = fluid.dygraph.to_variable(label_data)
#前向计算的过程
predict = model(image)
#计算损失,取一个批次样本损失的平均值
loss = fluid.layers.square_error_cost(predict, label)
avg_loss = fluid.layers.mean(loss)
#每训练了100批次的数据,打印下当前Loss的情况
if batch_id % 200 == 0:
print("epoch: {}, batch: {}, loss is: {}".format(epoch_id, batch_id, avg_loss.numpy()))
#后向传播,更新参数的过程
avg_loss.backward()
optimizer.minimize(avg_loss)
model.clear_gradients()
#保存模型参数
end = time()
print('Time', end-start)
fluid.save_dygraph(model.state_dict(), 'mnist')
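
# A minimal sketch (assuming PaddlePaddle 1.x and the MNIST class above) of
# how the saved 'mnist' parameters could be loaded back for evaluation; the
# 'valid' loader reuses the load_data() modes defined earlier.
with fluid.dygraph.guard():
    model = MNIST("mnist")
    # fluid.load_dygraph returns (parameter dict, optimizer state dict)
    model_dict, _ = fluid.load_dygraph('mnist')
    model.load_dict(model_dict)
    model.eval()
    valid_loader = load_data('valid')
    for batch_id, (image_data, label_data) in enumerate(valid_loader()):
        image = fluid.dygraph.to_variable(image_data)
        predict = model(image)
        break  # inspect a single batch of predictions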
# coding: utf-8
"""
    Isilon SDK

    Isilon SDK - Language bindings for the OneFS API  # noqa: E501

    OpenAPI spec version: 10
    Contact: sdk@isilon.com
    Generated by: https://github.com/swagger-api/swagger-codegen.git
"""

from __future__ import absolute_import

import re  # noqa: F401

# python 2 and python 3 compatibility library
import six

from isi_sdk_9_0_0.api_client import ApiClient
class FsaResultsApi(object):
    """NOTE: This class is auto generated by the swagger code generator program.

    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def get_histogram_stat_by(self, id, stat, **kwargs):  # noqa: E501
        """get_histogram_stat_by  # noqa: E501

        This resource retrieves a histogram breakout for an individual FSA result set. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_histogram_stat_by(id, stat, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param str stat: (required)
        :return: HistogramStatBy
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_histogram_stat_by_with_http_info(id, stat, **kwargs)  # noqa: E501
        else:
            (data) = self.get_histogram_stat_by_with_http_info(id, stat, **kwargs)  # noqa: E501
            return data
    def get_histogram_stat_by_with_http_info(self, id, stat, **kwargs):  # noqa: E501
        """get_histogram_stat_by  # noqa: E501

        This resource retrieves a histogram breakout for an individual FSA result set. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_histogram_stat_by_with_http_info(id, stat, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param str stat: (required)
        :return: HistogramStatBy
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'stat']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_histogram_stat_by" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_histogram_stat_by`")  # noqa: E501
        # verify the required parameter 'stat' is set
        if ('stat' not in params or
                params['stat'] is None):
            raise ValueError("Missing the required parameter `stat` when calling `get_histogram_stat_by`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['Id'] = params['id']  # noqa: E501
        if 'stat' in params:
            path_params['Stat'] = params['stat']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/histogram/{Stat}/by', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='HistogramStatBy',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_histogram_stat_by_breakout(self, histogram_stat_by_breakout, id, stat, **kwargs):  # noqa: E501
        """get_histogram_stat_by_breakout  # noqa: E501

        This resource retrieves a histogram breakout for an individual FSA result set. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_histogram_stat_by_breakout(histogram_stat_by_breakout, id, stat, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str histogram_stat_by_breakout: This resource retrieves a histogram breakout for an individual FSA result set. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str stat: (required)
        :param str directory_filter: Filter according to a specific directory, which includes all of its subdirectories.
        :param str attribute_filter: Filter according to the name of a file user attribute.
        :param str node_pool_filter: Filter according to the name of a node pool, which is a set of disk pools that belong to nodes of the same equivalence class.
        :param str disk_pool_filter: Filter according to the name of a disk pool, which is a set of drives that represent an independent failure domain.
        :param str tier_filter: Filter according to the name of a storage tier, which is a user-created set of node pools.
        :param int comp_report: Result set identifier for comparison of database results.
        :param int log_size_filter: Filter according to file logical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by logical size. The list of valid log_size filter values may be found by performing a histogram breakout by log_size and viewing the resulting key values.
        :param int phys_size_filter: Filter according to file physical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by physical size. The list of valid phys_size filter values may be found by performing a histogram breakout by phys_size and viewing the resulting key values.
        :param int limit: Limit the number of breakout results.
        :param str path_ext_filter: Filter according to the name of a single file extension.
        :param int ctime_filter: Filter according to file modified time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid ctime filter values may be found by performing a histogram breakout by ctime and viewing the resulting key values.
        :param int atime_filter: Filter according to file accessed time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid atime filter values may be found by performing a histogram breakout by atime and viewing the resulting key values.
        :return: HistogramStatBy
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_histogram_stat_by_breakout_with_http_info(histogram_stat_by_breakout, id, stat, **kwargs)  # noqa: E501
        else:
            (data) = self.get_histogram_stat_by_breakout_with_http_info(histogram_stat_by_breakout, id, stat, **kwargs)  # noqa: E501
            return data
    def get_histogram_stat_by_breakout_with_http_info(self, histogram_stat_by_breakout, id, stat, **kwargs):  # noqa: E501
        """get_histogram_stat_by_breakout  # noqa: E501

        This resource retrieves a histogram breakout for an individual FSA result set. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_histogram_stat_by_breakout_with_http_info(histogram_stat_by_breakout, id, stat, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str histogram_stat_by_breakout: This resource retrieves a histogram breakout for an individual FSA result set. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str stat: (required)
        :param str directory_filter: Filter according to a specific directory, which includes all of its subdirectories.
        :param str attribute_filter: Filter according to the name of a file user attribute.
        :param str node_pool_filter: Filter according to the name of a node pool, which is a set of disk pools that belong to nodes of the same equivalence class.
        :param str disk_pool_filter: Filter according to the name of a disk pool, which is a set of drives that represent an independent failure domain.
        :param str tier_filter: Filter according to the name of a storage tier, which is a user-created set of node pools.
        :param int comp_report: Result set identifier for comparison of database results.
        :param int log_size_filter: Filter according to file logical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by logical size. The list of valid log_size filter values may be found by performing a histogram breakout by log_size and viewing the resulting key values.
        :param int phys_size_filter: Filter according to file physical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by physical size. The list of valid phys_size filter values may be found by performing a histogram breakout by phys_size and viewing the resulting key values.
        :param int limit: Limit the number of breakout results.
        :param str path_ext_filter: Filter according to the name of a single file extension.
        :param int ctime_filter: Filter according to file modified time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid ctime filter values may be found by performing a histogram breakout by ctime and viewing the resulting key values.
        :param int atime_filter: Filter according to file accessed time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid atime filter values may be found by performing a histogram breakout by atime and viewing the resulting key values.
        :return: HistogramStatBy
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['histogram_stat_by_breakout', 'id', 'stat', 'directory_filter', 'attribute_filter', 'node_pool_filter', 'disk_pool_filter', 'tier_filter', 'comp_report', 'log_size_filter', 'phys_size_filter', 'limit', 'path_ext_filter', 'ctime_filter', 'atime_filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_histogram_stat_by_breakout" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'histogram_stat_by_breakout' is set
        if ('histogram_stat_by_breakout' not in params or
                params['histogram_stat_by_breakout'] is None):
            raise ValueError("Missing the required parameter `histogram_stat_by_breakout` when calling `get_histogram_stat_by_breakout`")  # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_histogram_stat_by_breakout`")  # noqa: E501
        # verify the required parameter 'stat' is set
        if ('stat' not in params or
                params['stat'] is None):
            raise ValueError("Missing the required parameter `stat` when calling `get_histogram_stat_by_breakout`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'histogram_stat_by_breakout' in params:
            path_params['HistogramStatByBreakout'] = params['histogram_stat_by_breakout']  # noqa: E501
        if 'id' in params:
            path_params['Id'] = params['id']  # noqa: E501
        if 'stat' in params:
            path_params['Stat'] = params['stat']  # noqa: E501

        query_params = []
        if 'directory_filter' in params:
            query_params.append(('directory_filter', params['directory_filter']))  # noqa: E501
        if 'attribute_filter' in params:
            query_params.append(('attribute_filter', params['attribute_filter']))  # noqa: E501
        if 'node_pool_filter' in params:
            query_params.append(('node_pool_filter', params['node_pool_filter']))  # noqa: E501
        if 'disk_pool_filter' in params:
            query_params.append(('disk_pool_filter', params['disk_pool_filter']))  # noqa: E501
        if 'tier_filter' in params:
            query_params.append(('tier_filter', params['tier_filter']))  # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report']))  # noqa: E501
        if 'log_size_filter' in params:
            query_params.append(('log_size_filter', params['log_size_filter']))  # noqa: E501
        if 'phys_size_filter' in params:
            query_params.append(('phys_size_filter', params['phys_size_filter']))  # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit']))  # noqa: E501
        if 'path_ext_filter' in params:
            query_params.append(('path_ext_filter', params['path_ext_filter']))  # noqa: E501
        if 'ctime_filter' in params:
            query_params.append(('ctime_filter', params['ctime_filter']))  # noqa: E501
        if 'atime_filter' in params:
            query_params.append(('atime_filter', params['atime_filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/histogram/{Stat}/by/{HistogramStatByBreakout}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='HistogramStatBy',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_result_dir_pools_usage(self, id, **kwargs):  # noqa: E501
        """get_result_dir_pools_usage  # noqa: E501

        View pool usage information of a directory, classified by storage pools in response \"usage_data\". The storage pool type can be specified by query parameter \"storage_pool_type\". The directory is \"path\" query parameter. The response \"dir_usage\" is total disk usage of directory, over all pools at a given storage pool level. When path cannot be found within result, status code 404 and error message will be returned.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_dir_pools_usage(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param str path: Directory absolute path to report usage information. Path should be UTF8 percent encoded, should be within \"/ifs\". Defaults to \"/ifs\".
        :param int comp_report: Result set identifier for comparison of database results.
        :param str storage_pool_type: The type of the storage pool.
        :return: ResultDirPoolsUsage
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_dir_pools_usage_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_result_dir_pools_usage_with_http_info(id, **kwargs)  # noqa: E501
            return data
    def get_result_dir_pools_usage_with_http_info(self, id, **kwargs):  # noqa: E501
        """get_result_dir_pools_usage  # noqa: E501

        View pool usage information of a directory, classified by storage pools in response \"usage_data\". The storage pool type can be specified by query parameter \"storage_pool_type\". The directory is \"path\" query parameter. The response \"dir_usage\" is total disk usage of directory, over all pools at a given storage pool level. When path cannot be found within result, status code 404 and error message will be returned.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_dir_pools_usage_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param str path: Directory absolute path to report usage information. Path should be UTF8 percent encoded, should be within \"/ifs\". Defaults to \"/ifs\".
        :param int comp_report: Result set identifier for comparison of database results.
        :param str storage_pool_type: The type of the storage pool.
        :return: ResultDirPoolsUsage
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'path', 'comp_report', 'storage_pool_type']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_dir_pools_usage" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_dir_pools_usage`")  # noqa: E501

        if ('path' in params and
                len(params['path']) > 4096):
            raise ValueError("Invalid value for parameter `path` when calling `get_result_dir_pools_usage`, length must be less than or equal to `4096`")  # noqa: E501
        if ('path' in params and
                len(params['path']) < 4):
            raise ValueError("Invalid value for parameter `path` when calling `get_result_dir_pools_usage`, length must be greater than or equal to `4`")  # noqa: E501
        if 'path' in params and not re.search(r'^\/ifs|^\/ifs\/.*', params['path']):  # noqa: E501
            raise ValueError("Invalid value for parameter `path` when calling `get_result_dir_pools_usage`, must conform to the pattern `/^\/ifs|^\/ifs\/.*/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['Id'] = params['id']  # noqa: E501

        query_params = []
        if 'path' in params:
            query_params.append(('path', params['path']))  # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report']))  # noqa: E501
        if 'storage_pool_type' in params:
            query_params.append(('storage_pool_type', params['storage_pool_type']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/9/fsa/results/{Id}/dir_pools_usage', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultDirPoolsUsage',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_result_dir_pools_usage_lin(self, result_dir_pools_usage_lin, id, **kwargs):  # noqa: E501
        """get_result_dir_pools_usage_lin  # noqa: E501

        View pool usage information of a directory, classified by storage pools in response \"usage_data\". The storage pool type can be specified by query parameter \"storage_pool_type\". The directory is LIN token of URI. The response \"dir_usage\" is total disk usage of directory, over all pools at a given storage pool level. When LIN cannot be found within result, status code 404 and error message will be returned.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_dir_pools_usage_lin(result_dir_pools_usage_lin, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int result_dir_pools_usage_lin: View pool usage information of a directory, classified by storage pools in response \"usage_data\". The storage pool type can be specified by query parameter \"storage_pool_type\". The directory is LIN token of URI. The response \"dir_usage\" is total disk usage of directory, over all pools at a given storage pool level. When LIN cannot be found within result, status code 404 and error message will be returned. (required)
        :param str id: (required)
        :param int comp_report: Result set identifier for comparison of database results.
        :param str storage_pool_type: The type of the storage pool.
        :return: ResultDirPoolsUsage
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_dir_pools_usage_lin_with_http_info(result_dir_pools_usage_lin, id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_result_dir_pools_usage_lin_with_http_info(result_dir_pools_usage_lin, id, **kwargs)  # noqa: E501
            return data
    def get_result_dir_pools_usage_lin_with_http_info(self, result_dir_pools_usage_lin, id, **kwargs):  # noqa: E501
        """get_result_dir_pools_usage_lin  # noqa: E501

        View pool usage information of a directory, classified by storage pools in response \"usage_data\". The storage pool type can be specified by query parameter \"storage_pool_type\". The directory is LIN token of URI. The response \"dir_usage\" is total disk usage of directory, over all pools at a given storage pool level. When LIN cannot be found within result, status code 404 and error message will be returned.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_dir_pools_usage_lin_with_http_info(result_dir_pools_usage_lin, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int result_dir_pools_usage_lin: View pool usage information of a directory, classified by storage pools in response \"usage_data\". The storage pool type can be specified by query parameter \"storage_pool_type\". The directory is LIN token of URI. The response \"dir_usage\" is total disk usage of directory, over all pools at a given storage pool level. When LIN cannot be found within result, status code 404 and error message will be returned. (required)
        :param str id: (required)
        :param int comp_report: Result set identifier for comparison of database results.
        :param str storage_pool_type: The type of the storage pool.
        :return: ResultDirPoolsUsage
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['result_dir_pools_usage_lin', 'id', 'comp_report', 'storage_pool_type']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_dir_pools_usage_lin" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'result_dir_pools_usage_lin' is set
        if ('result_dir_pools_usage_lin' not in params or
                params['result_dir_pools_usage_lin'] is None):
            raise ValueError("Missing the required parameter `result_dir_pools_usage_lin` when calling `get_result_dir_pools_usage_lin`")  # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_dir_pools_usage_lin`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'result_dir_pools_usage_lin' in params:
            path_params['ResultDirPoolsUsageLin'] = params['result_dir_pools_usage_lin']  # noqa: E501
        if 'id' in params:
            path_params['Id'] = params['id']  # noqa: E501

        query_params = []
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report']))  # noqa: E501
        if 'storage_pool_type' in params:
            query_params.append(('storage_pool_type', params['storage_pool_type']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/9/fsa/results/{Id}/dir_pools_usage/{ResultDirPoolsUsageLin}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultDirPoolsUsage',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_result_directories(self, id, **kwargs):  # noqa: E501
        """get_result_directories  # noqa: E501

        This resource retrieves directory information. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_directories(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param str path: Primary directory path to report usage information, which may be specified instead of a LIN.
        :param int limit: Limit the number of reported subdirectories.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultDirectoriesExtended
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_directories_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_result_directories_with_http_info(id, **kwargs)  # noqa: E501
            return data
    def get_result_directories_with_http_info(self, id, **kwargs):  # noqa: E501
        """get_result_directories  # noqa: E501

        This resource retrieves directory information. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_directories_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param str path: Primary directory path to report usage information, which may be specified instead of a LIN.
        :param int limit: Limit the number of reported subdirectories.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultDirectoriesExtended
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'sort', 'path', 'limit', 'comp_report', 'dir']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_directories" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_directories`")  # noqa: E501

        if ('sort' in params and
                len(params['sort']) > 255):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_directories`, length must be less than or equal to `255`")  # noqa: E501
        if ('sort' in params and
                len(params['sort']) < 0):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_directories`, length must be greater than or equal to `0`")  # noqa: E501
        if ('dir' in params and
                len(params['dir']) < 0):
            raise ValueError("Invalid value for parameter `dir` when calling `get_result_directories`, length must be greater than or equal to `0`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['Id'] = params['id']  # noqa: E501

        query_params = []
        if 'sort' in params:
            query_params.append(('sort', params['sort']))  # noqa: E501
        if 'path' in params:
            query_params.append(('path', params['path']))  # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit']))  # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report']))  # noqa: E501
        if 'dir' in params:
            query_params.append(('dir', params['dir']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/directories', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultDirectoriesExtended',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_result_directory(self, result_directory_id, id, **kwargs): # noqa: E501
        """get_result_directory # noqa: E501

        This resource retrieves directory information. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_directory(result_directory_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int result_directory_id: This resource retrieves directory information. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param int limit: Limit the number of reported subdirectories.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultDirectories
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_directory_with_http_info(result_directory_id, id, **kwargs) # noqa: E501
        else:
            (data) = self.get_result_directory_with_http_info(result_directory_id, id, **kwargs) # noqa: E501
            return data

    def get_result_directory_with_http_info(self, result_directory_id, id, **kwargs): # noqa: E501
        """get_result_directory # noqa: E501

        This resource retrieves directory information. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_directory_with_http_info(result_directory_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int result_directory_id: This resource retrieves directory information. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param int limit: Limit the number of reported subdirectories.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultDirectories
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['result_directory_id', 'id', 'sort', 'limit', 'comp_report', 'dir'] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_directory" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'result_directory_id' is set
        if ('result_directory_id' not in params or
                params['result_directory_id'] is None):
            raise ValueError("Missing the required parameter `result_directory_id` when calling `get_result_directory`") # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_directory`") # noqa: E501

        if ('sort' in params and
                len(params['sort']) > 255):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_directory`, length must be less than or equal to `255`") # noqa: E501
        if ('sort' in params and
                len(params['sort']) < 0):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_directory`, length must be greater than or equal to `0`") # noqa: E501
        if ('dir' in params and
                len(params['dir']) < 0):
            raise ValueError("Invalid value for parameter `dir` when calling `get_result_directory`, length must be greater than or equal to `0`") # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'result_directory_id' in params:
            path_params['ResultDirectoryId'] = params['result_directory_id'] # noqa: E501
        if 'id' in params:
            path_params['Id'] = params['id'] # noqa: E501

        query_params = []
        if 'sort' in params:
            query_params.append(('sort', params['sort'])) # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit'])) # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report'])) # noqa: E501
        if 'dir' in params:
            query_params.append(('dir', params['dir'])) # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth'] # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/directories/{ResultDirectoryId}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultDirectories', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
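# The methods in this module all follow the same two-layer pattern: a public
# wrapper that forces `_return_http_data_only` and delegates to a
# `*_with_http_info` twin, which either runs the request synchronously or
# hands it to a thread pool when `async_req=True` is passed. The following is
# a condensed, self-contained sketch of that dispatch pattern (the `MiniApi`
# class and its canned response are hypothetical stand-ins, not part of this
# SDK), useful for seeing why `thread.get()` in the docstring examples yields
# the same value as a plain synchronous call:

```python
from multiprocessing.pool import ThreadPool


class MiniApi:
    """Hypothetical stand-in that mirrors the generated wrapper pattern."""

    def __init__(self):
        self._pool = ThreadPool(1)  # the real ApiClient owns a pool like this

    def get_result_histogram(self, id, **kwargs):
        # public wrapper: always return deserialized data, delegate to twin
        kwargs['_return_http_data_only'] = True
        return self.get_result_histogram_with_http_info(id, **kwargs)

    def get_result_histogram_with_http_info(self, id, **kwargs):
        if kwargs.get('async_req'):
            # async path: return an AsyncResult; caller uses .get()
            return self._pool.apply_async(self._call, (id,))
        return self._call(id)

    def _call(self, id):
        # canned response standing in for api_client.call_api(...)
        return {'id': id, 'histogram': []}


api = MiniApi()
sync_result = api.get_result_histogram('42')
thread = api.get_result_histogram('42', async_req=True)
async_result = thread.get()  # blocks until the worker thread finishes
```

# Both paths produce the same deserialized payload; only the blocking
# behavior differs.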

    def get_result_histogram(self, id, **kwargs): # noqa: E501
        """get_result_histogram # noqa: E501

        This resource retrieves a histogram of file counts for an individual FSA result set. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_histogram(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResultHistogram
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_histogram_with_http_info(id, **kwargs) # noqa: E501
        else:
            (data) = self.get_result_histogram_with_http_info(id, **kwargs) # noqa: E501
            return data

    def get_result_histogram_with_http_info(self, id, **kwargs): # noqa: E501
        """get_result_histogram # noqa: E501

        This resource retrieves a histogram of file counts for an individual FSA result set. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_histogram_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResultHistogram
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id'] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_histogram" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_histogram`") # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['Id'] = params['id'] # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth'] # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/histogram', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultHistogram', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_result_histogram_stat(self, result_histogram_stat, id, **kwargs): # noqa: E501
        """get_result_histogram_stat # noqa: E501

        This resource retrieves a histogram of file counts for an individual FSA result set. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_histogram_stat(result_histogram_stat, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str result_histogram_stat: This resource retrieves a histogram of file counts for an individual FSA result set. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str directory_filter: Filter according to a specific directory, which includes all of its subdirectories.
        :param str attribute_filter: Filter according to the name of a file user attribute.
        :param str node_pool_filter: Filter according to the name of a node pool, which is a set of disk pools that belong to nodes of the same equivalence class.
        :param str disk_pool_filter: Filter according to the name of a disk pool, which is a set of drives that represent an independent failure domain.
        :param str tier_filter: Filter according to the name of a storage tier, which is a user-created set of node pools.
        :param int comp_report: Result set identifier for comparison of database results.
        :param int log_size_filter: Filter according to file logical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by logical size. The list of valid log_size filter values may be found by performing a histogram breakout by log_size and viewing the resulting key values.
        :param int phys_size_filter: Filter according to file physical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by physical size. The list of valid phys_size filter values may be found by performing a histogram breakout by phys_size and viewing the resulting key values.
        :param str path_ext_filter: Filter according to the name of a single file extension.
        :param int ctime_filter: Filter according to file modified time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid ctime filter values may be found by performing a histogram breakout by ctime and viewing the resulting key values.
        :param int atime_filter: Filter according to file accessed time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid atime filter values may be found by performing a histogram breakout by atime and viewing the resulting key values.
        :return: ResultHistogram
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_histogram_stat_with_http_info(result_histogram_stat, id, **kwargs) # noqa: E501
        else:
            (data) = self.get_result_histogram_stat_with_http_info(result_histogram_stat, id, **kwargs) # noqa: E501
            return data

    def get_result_histogram_stat_with_http_info(self, result_histogram_stat, id, **kwargs): # noqa: E501
        """get_result_histogram_stat # noqa: E501

        This resource retrieves a histogram of file counts for an individual FSA result set. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_histogram_stat_with_http_info(result_histogram_stat, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str result_histogram_stat: This resource retrieves a histogram of file counts for an individual FSA result set. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str directory_filter: Filter according to a specific directory, which includes all of its subdirectories.
        :param str attribute_filter: Filter according to the name of a file user attribute.
        :param str node_pool_filter: Filter according to the name of a node pool, which is a set of disk pools that belong to nodes of the same equivalence class.
        :param str disk_pool_filter: Filter according to the name of a disk pool, which is a set of drives that represent an independent failure domain.
        :param str tier_filter: Filter according to the name of a storage tier, which is a user-created set of node pools.
        :param int comp_report: Result set identifier for comparison of database results.
        :param int log_size_filter: Filter according to file logical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by logical size. The list of valid log_size filter values may be found by performing a histogram breakout by log_size and viewing the resulting key values.
        :param int phys_size_filter: Filter according to file physical size, where the filter value specifies the lower bound in bytes to a set of files that have been grouped by physical size. The list of valid phys_size filter values may be found by performing a histogram breakout by phys_size and viewing the resulting key values.
        :param str path_ext_filter: Filter according to the name of a single file extension.
        :param int ctime_filter: Filter according to file modified time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid ctime filter values may be found by performing a histogram breakout by ctime and viewing the resulting key values.
        :param int atime_filter: Filter according to file accessed time, where the filter value specifies a negative number of seconds representing a time before the begin time of the report. The list of valid atime filter values may be found by performing a histogram breakout by atime and viewing the resulting key values.
        :return: ResultHistogram
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['result_histogram_stat', 'id', 'directory_filter', 'attribute_filter', 'node_pool_filter', 'disk_pool_filter', 'tier_filter', 'comp_report', 'log_size_filter', 'phys_size_filter', 'path_ext_filter', 'ctime_filter', 'atime_filter'] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_histogram_stat" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'result_histogram_stat' is set
        if ('result_histogram_stat' not in params or
                params['result_histogram_stat'] is None):
            raise ValueError("Missing the required parameter `result_histogram_stat` when calling `get_result_histogram_stat`") # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_histogram_stat`") # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'result_histogram_stat' in params:
            path_params['ResultHistogramStat'] = params['result_histogram_stat'] # noqa: E501
        if 'id' in params:
            path_params['Id'] = params['id'] # noqa: E501

        query_params = []
        if 'directory_filter' in params:
            query_params.append(('directory_filter', params['directory_filter'])) # noqa: E501
        if 'attribute_filter' in params:
            query_params.append(('attribute_filter', params['attribute_filter'])) # noqa: E501
        if 'node_pool_filter' in params:
            query_params.append(('node_pool_filter', params['node_pool_filter'])) # noqa: E501
        if 'disk_pool_filter' in params:
            query_params.append(('disk_pool_filter', params['disk_pool_filter'])) # noqa: E501
        if 'tier_filter' in params:
            query_params.append(('tier_filter', params['tier_filter'])) # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report'])) # noqa: E501
        if 'log_size_filter' in params:
            query_params.append(('log_size_filter', params['log_size_filter'])) # noqa: E501
        if 'phys_size_filter' in params:
            query_params.append(('phys_size_filter', params['phys_size_filter'])) # noqa: E501
        if 'path_ext_filter' in params:
            query_params.append(('path_ext_filter', params['path_ext_filter'])) # noqa: E501
        if 'ctime_filter' in params:
            query_params.append(('ctime_filter', params['ctime_filter'])) # noqa: E501
        if 'atime_filter' in params:
            query_params.append(('atime_filter', params['atime_filter'])) # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth'] # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/histogram/{ResultHistogramStat}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultHistogram', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
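# get_result_histogram_stat_with_http_info above only adds a filter to
# `query_params` when the caller actually supplied it, building a list of
# (name, value) tuples that the API client later serializes into the URL.
# A minimal standard-library sketch of that assembly and serialization
# (the sample filter values are hypothetical, not taken from a real cluster):

```python
from urllib.parse import urlencode

# caller-supplied kwargs; atime_filter was not provided
params = {'directory_filter': '/ifs/data', 'comp_report': 2, 'atime_filter': None}

query_params = []
for name in ('directory_filter', 'comp_report', 'atime_filter'):
    if params.get(name) is not None:  # only send filters the caller supplied
        query_params.append((name, params[name]))

# the (name, value) tuples serialize directly into the request query string
query = urlencode(query_params)
```

# Here `query` becomes 'directory_filter=%2Fifs%2Fdata&comp_report=2':
# omitted filters never appear in the URL, so the server applies no filter
# for them.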

    def get_result_top_dir(self, result_top_dir_id, id, **kwargs): # noqa: E501
        """get_result_top_dir # noqa: E501

        This resource retrieves the top directories. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_dir(result_top_dir_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str result_top_dir_id: This resource retrieves the top directories. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param int start: Starting index for results. Default value of 0.
        :param int limit: Number of results from start index. Default value of 1000.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultTopDirs
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_top_dir_with_http_info(result_top_dir_id, id, **kwargs) # noqa: E501
        else:
            (data) = self.get_result_top_dir_with_http_info(result_top_dir_id, id, **kwargs) # noqa: E501
            return data

    def get_result_top_dir_with_http_info(self, result_top_dir_id, id, **kwargs): # noqa: E501
        """get_result_top_dir # noqa: E501

        This resource retrieves the top directories. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_dir_with_http_info(result_top_dir_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str result_top_dir_id: This resource retrieves the top directories. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param int start: Starting index for results. Default value of 0.
        :param int limit: Number of results from start index. Default value of 1000.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultTopDirs
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['result_top_dir_id', 'id', 'sort', 'start', 'limit', 'comp_report', 'dir'] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_top_dir" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'result_top_dir_id' is set
        if ('result_top_dir_id' not in params or
                params['result_top_dir_id'] is None):
            raise ValueError("Missing the required parameter `result_top_dir_id` when calling `get_result_top_dir`") # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_top_dir`") # noqa: E501

        if ('sort' in params and
                len(params['sort']) > 255):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_top_dir`, length must be less than or equal to `255`") # noqa: E501
        if ('sort' in params and
                len(params['sort']) < 0):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_top_dir`, length must be greater than or equal to `0`") # noqa: E501
        if ('dir' in params and
                len(params['dir']) < 0):
            raise ValueError("Invalid value for parameter `dir` when calling `get_result_top_dir`, length must be greater than or equal to `0`") # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'result_top_dir_id' in params:
            path_params['ResultTopDirId'] = params['result_top_dir_id'] # noqa: E501
        if 'id' in params:
            path_params['Id'] = params['id'] # noqa: E501

        query_params = []
        if 'sort' in params:
            query_params.append(('sort', params['sort'])) # noqa: E501
        if 'start' in params:
            query_params.append(('start', params['start'])) # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit'])) # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report'])) # noqa: E501
        if 'dir' in params:
            query_params.append(('dir', params['dir'])) # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth'] # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/top-dirs/{ResultTopDirId}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultTopDirs', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_result_top_dirs(self, id, **kwargs): # noqa: E501
        """get_result_top_dirs # noqa: E501

        This resource retrieves the top directories. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_dirs(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResultTopDirs
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_top_dirs_with_http_info(id, **kwargs) # noqa: E501
        else:
            (data) = self.get_result_top_dirs_with_http_info(id, **kwargs) # noqa: E501
            return data

    def get_result_top_dirs_with_http_info(self, id, **kwargs): # noqa: E501
        """get_result_top_dirs # noqa: E501

        This resource retrieves the top directories. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_dirs_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResultTopDirs
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id'] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_top_dirs" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_top_dirs`") # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['Id'] = params['id'] # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth'] # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/top-dirs', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultTopDirs', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_result_top_file(self, result_top_file_id, id, **kwargs): # noqa: E501
        """get_result_top_file # noqa: E501

        This resource retrieves the top files. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_file(result_top_file_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str result_top_file_id: This resource retrieves the top files. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param int start: Starting index for results. Default value of 0.
        :param int limit: Number of results from start index. Default value of 1000.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultTopFiles
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_top_file_with_http_info(result_top_file_id, id, **kwargs) # noqa: E501
        else:
            (data) = self.get_result_top_file_with_http_info(result_top_file_id, id, **kwargs) # noqa: E501
            return data

    def get_result_top_file_with_http_info(self, result_top_file_id, id, **kwargs): # noqa: E501
        """get_result_top_file # noqa: E501

        This resource retrieves the top files. ID in the resource path is the result set ID. # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_file_with_http_info(result_top_file_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str result_top_file_id: This resource retrieves the top files. ID in the resource path is the result set ID. (required)
        :param str id: (required)
        :param str sort: The field that will be used for sorting.
        :param int start: Starting index for results. Default value of 0.
        :param int limit: Number of results from start index. Default value of 1000.
        :param int comp_report: Result set identifier for comparison of database results.
        :param str dir: The direction of the sort.
        :return: ResultTopFiles
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['result_top_file_id', 'id', 'sort', 'start', 'limit', 'comp_report', 'dir'] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_top_file" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'result_top_file_id' is set
        if ('result_top_file_id' not in params or
                params['result_top_file_id'] is None):
            raise ValueError("Missing the required parameter `result_top_file_id` when calling `get_result_top_file`") # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_top_file`") # noqa: E501

        if ('sort' in params and
                len(params['sort']) > 255):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_top_file`, length must be less than or equal to `255`") # noqa: E501
        if ('sort' in params and
                len(params['sort']) < 0):
            raise ValueError("Invalid value for parameter `sort` when calling `get_result_top_file`, length must be greater than or equal to `0`") # noqa: E501
        if ('dir' in params and
                len(params['dir']) < 0):
            raise ValueError("Invalid value for parameter `dir` when calling `get_result_top_file`, length must be greater than or equal to `0`") # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'result_top_file_id' in params:
            path_params['ResultTopFileId'] = params['result_top_file_id'] # noqa: E501
        if 'id' in params:
            path_params['Id'] = params['id'] # noqa: E501

        query_params = []
        if 'sort' in params:
            query_params.append(('sort', params['sort'])) # noqa: E501
        if 'start' in params:
            query_params.append(('start', params['start'])) # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit'])) # noqa: E501
        if 'comp_report' in params:
            query_params.append(('comp_report', params['comp_report'])) # noqa: E501
        if 'dir' in params:
            query_params.append(('dir', params['dir'])) # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth'] # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/top-files/{ResultTopFileId}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultTopFiles', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_result_top_files(self, id, **kwargs):  # noqa: E501
        """get_result_top_files  # noqa: E501

        This resource retrieves the top files. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_files(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResultTopFiles
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_result_top_files_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_result_top_files_with_http_info(id, **kwargs)  # noqa: E501
            return data
    def get_result_top_files_with_http_info(self, id, **kwargs):  # noqa: E501
        """get_result_top_files  # noqa: E501

        This resource retrieves the top files. ID in the resource path is the result set ID.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_result_top_files_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResultTopFiles
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_result_top_files" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_result_top_files`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['Id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/fsa/results/{Id}/top-files', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResultTopFiles',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
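The generated methods above all follow the same two-step shape: a thin wrapper that dispatches on `async_req`, and a `*_with_http_info` body that rejects unknown keyword arguments and enforces required, non-None parameters before building the request. A minimal, self-contained sketch of that validation step (the `validate_params` helper is illustrative only, not part of the real client):

```python
def validate_params(all_params, required, **kwargs):
    """Reject unknown kwargs and enforce required, non-None parameters,
    mirroring the checks in the generated *_with_http_info methods."""
    for key in kwargs:
        if key not in all_params:
            # Same failure mode as the generated client: unknown kwargs
            # raise TypeError rather than being silently ignored.
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
    for name in required:
        if kwargs.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s`" % name)
    return kwargs


params = validate_params(
    ['id', 'sort', 'limit'], required=['id'], id='abc', sort='name')
print(params['id'])  # -> abc
```

Passing `bogus=1` here raises `TypeError`, and omitting `id` raises `ValueError`, matching the behavior of the generated methods.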
# tests/website_mock.py (FriendsOfGalaxy/galaxy-integration-battlenet, MIT license)
import json
from tests.async_mock import AsyncMock
from aiohttp.client_exceptions import ClientResponseError


class WebsiteClientMock(AsyncMock):
    def is_authenticated(self):
        return True

    async def create_session(self, auth_data):
        return None

    async def validate_access_token(self, access_token):
        return json.loads(
            """{"exp": 1550846897, "user_name": "123", "authorities": ["IS_AUTHENTICATED_FULLY", "ROLE_USER"], "client_id": "90942d6d2c8c4a308a441b6fca477fe8", "scope": ["wow.profile", "sc2.profile"]}"""
        )

    async def refresh_cookies(self):
        return None

    async def get_user_info(self):
        return json.loads("""{"sub": "420", "id": 420, "battletag": "MOCK"}""")
    async def get_account_details(self):
        return json.loads(
            """{
                "metadata": {
                    "fieldMetadata": {
                        "lastName": {
                            "readOnly": true,
                            "hidden": false
                        },
                        "cancelAccountComponent": {
                            "readOnly": false,
                            "hidden": true
                        },
                        "parentalControlsComponent": {
                            "readOnly": false,
                            "hidden": false
                        },
                        "addressBookComponent": {
                            "readOnly": false,
                            "hidden": false
                        },
                        "communicationPreferencesComponent": {
                            "readOnly": false,
                            "hidden": false
                        }
                    },
                    "disableEditing": false
                },
                "accountId": 420,
                "accountName": "MrMock@gmail.com",
                "firstName": "Mr",
                "lastName": "Mock",
                "battleTag": "MOCK",
                "countryId": 616,
                "countryCodeAlpha3": "POL",
                "availableCountries": [
                    616
                ],
                "identities": [],
                "realIdEnabled": true,
                "emailVerified": true
            }"""
        )

    async def get_owned_classic_games(self):
        return json.loads("""
            {
                "classicGames": []
            }
        """)
    async def get_owned_games(self):
        return json.loads(
            """{"gameAccounts":[
{
"titleId": 5730135,
"localizedGameName": "World of Warcraft\u00ae",
"gameAccountName": "WoW1",
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 5730135
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "world-of-warcraft.svg",
"gameAccountStatus": "Inactive",
"lastPlayedDateMillis": 1532872671000,
"titleHasSubscriptions": true,
"titleHasGameTime": true,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 1
},
{
"titleId": 1329875278,
"localizedGameName": "Call of Duty: Modern Warfare",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 1329875278
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "cod-mw.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": 1529516420000,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 1000
},
{
"titleId": 17459,
"localizedGameName": "Diablo\u00ae\u00a0III",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 17459
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "diablo-iii.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": null,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 2
},
{
"titleId": 1465140039,
"localizedGameName": "Hearthstone\u00ae",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 1465140039
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "hearthstone.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": 1537722232000,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 3
},
{
"titleId": 1214607983,
"localizedGameName": "Heroes of the Storm\u00ae",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 1214607983
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "heroes-of-the-storm.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": null,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 4
},
{
"titleId": 5272175,
"localizedGameName": "Overwatch\u00ae",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 5272175
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "overwatch.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": null,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 5
},
{
"titleId": 21298,
"localizedGameName": "StarCraft\u00ae\u00a0II",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 21298
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "starcraft-ii.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": null,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 6
},
{
"titleId": 21297,
"localizedGameName": "StarCraft\u00ae Remastered",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 21297
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "starcraft-remastered.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": 1550489642000,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 7
},
{
"titleId": 1146311730,
"localizedGameName": "Destiny 2",
"gameAccountName": null,
"gameAccountUniqueId": {
"gameAccountId": 123,
"gameServiceRegionId": 2,
"programId": 1146311730
},
"gameAccountRegion": "EU",
"regionalGameFranchiseIconFilename": "destiny-2.svg",
"gameAccountStatus": "Good",
"lastPlayedDateMillis": 1529516420000,
"titleHasSubscriptions": false,
"titleHasGameTime": false,
"accountSubscriptionView": null,
"gameTimeView": null,
"displayOrder": 1000
}
            ]}"""
        )
    async def get_sc2_player_data(self, account_id):
        return json.loads(
            """[{"name": "MOCK", "profileUrl": "https://www.starcraft2.com/profile/2/1/123", "avatarUrl": "https://static.starcraft2.com/starport/eadc1041-1c53-4c27-bf45-303ac5c6e33f/portraits/13-4.jpg", "profileId": "420", "regionId": 2, "realmId": 1}]"""
        )

    async def get_sc2_profile_data(self, region_id, realm_id, account_id):
        if account_id == "420":
            return json.loads(
"""{"summary": {"id": "6546825", "realm": 1, "displayName": "Mock", "portrait": "https://static.starcraft2.com/starport/eadc1041-1c53-4c27-bf45-303ac5c6e33f/portraits/13-4.jpg", "decalTerran": "https://static.starcraft2.com/starport/eadc1041-1c53-4c27-bf45-303ac5c6e33f/decals/1-0.jpg", "decalProtoss": "https://static.starcraft2.com/starport/eadc1041-1c53-4c27-bf45-303ac5c6e33f/decals/1-25.jpg", "decalZerg": "https://static.starcraft2.com/starport/eadc1041-1c53-4c27-bf45-303ac5c6e33f/decals/1-50.jpg", "totalSwarmLevel": 13, "totalAchievementPoints": 1135}, "snapshot": {"seasonSnapshot": {"1v1": {"rank": -1, "leagueName": null, "totalGames": 0, "totalWins": 0}, "2v2": {"rank": -1, "leagueName": null, "totalGames": 0, "totalWins": 0}, "3v3": {"rank": -1, "leagueName": null, "totalGames": 0, "totalWins": 0}, "4v4": {"rank": -1, "leagueName": null, "totalGames": 0, "totalWins": 0}, "Archon": {"rank": -1, "leagueName": null, "totalGames": 0, "totalWins": 0}}, "totalRankedSeasonGamesPlayed": 0}, "career": {"terranWins": 0, "zergWins": 0, "protossWins": 0, "totalCareerGames": 21, "totalGamesThisSeason": 0, "current1v1LeagueName": null, "currentBestTeamLeagueName": null, "best1v1Finish": {"leagueName": null, "timesAchieved": 0}, "bestTeamFinish": {"leagueName": "PLATINUM", "timesAchieved": 1}}, "swarmLevels": {"level": 13, "terran": {"level": 8, "maxLevelPoints": 145000, "currentLevelPoints": 87586}, "zerg": {"level": 0, "maxLevelPoints": 5000, "currentLevelPoints": 0}, "protoss": {"level": 5, "maxLevelPoints": 125000, "currentLevelPoints": 46643}}, "campaign": {"difficultyCompleted": {"legacy-of-the-void": "HARD"}}, "categoryPointProgress": [{"categoryId": "4325379", "pointsEarned": 0}, {"categoryId": "4325410", "pointsEarned": 0}, {"categoryId": "4330138", "pointsEarned": 885}, {"categoryId": "4364473", "pointsEarned": 0}, {"categoryId": "4386911", "pointsEarned": 0}, {"categoryId": "4325377", "pointsEarned": 180}, {"categoryId": "4325382", "pointsEarned": 0}, 
{"categoryId": "4325408", "pointsEarned": 0}], "achievementShowcase": [], "earnedRewards": [{"rewardId": "3065782512", "selected": false}, {"rewardId": "1244367155", "selected": false}, {"rewardId": "3719319749", "selected": false}, {"rewardId": "595469804", "selected": false}, {"rewardId": "4172710431", "selected": false}, {"rewardId": "301758540", "selected": false}, {"rewardId": "24676187", "selected": false}, {"rewardId": "3788149940", "selected": false}, {"rewardId": "3321648143", "selected": false}, {"rewardId": "175118693", "selected": false}, {"rewardId": "3851663319", "selected": false}, {"rewardId": "2951153716", "selected": false}, {"rewardId": "442861206", "selected": false}, {"rewardId": "1422135942", "selected": false}, {"rewardId": "3721089590", "selected": false}, {"rewardId": "1516476230", "achievementId": "91475327638380", "selected": false}, {"rewardId": "3318718057", "achievementId": "91475332200455", "selected": false}, {"rewardId": "1159285604", "achievementId": "91475320768611", "selected": false}, {"rewardId": "3484380374", "achievementId": "91475320768611", "selected": false}, {"rewardId": "4179780094", "achievementId": "91475320768611", "selected": false}, {"rewardId": "1505747653", "achievementId": "91475329944805", "selected": false}, {"rewardId": "927859820", "achievementId": "91475335633554", "selected": false}, {"rewardId": "367294557", "achievementId": "91475320768465", "selected": false}, {"rewardId": "3625001715", "achievementId": "91475320767718", "selected": false}, {"rewardId": "2212045446", "achievementId": "91475320767725", "selected": false}, {"rewardId": "2184379176", "achievementId": "91475320767623", "selected": false}, {"rewardId": "1467565626", "achievementId": "91475320767632", "selected": false}, {"rewardId": "597272904", "achievementId": "91475328241047", "selected": false}, {"rewardId": "4263649210", "achievementId": "91475325309291", "selected": false}, {"rewardId": "869977899", "achievementId": "91475035554452", 
"selected": false}, {"rewardId": "751224560", "achievementId": "91475035554583", "selected": false}, {"rewardId": "530763862", "achievementId": "91475035554482", "selected": false}, {"rewardId": "631688705", "achievementId": "91475035554562", "selected": false}, {"rewardId": "963483579", "achievementId": "91475035554586", "selected": false}, {"rewardId": "4286743759", "achievementId": "91475035554587", "selected": false}, {"rewardId": "1356696042", "achievementId": "91475035554588", "selected": false}, {"rewardId": "4256166790", "achievementId": "91475035554438", "selected": false}, {"rewardId": "1958621912", "achievementId": "91475035554510", "selected": false}, {"rewardId": "1579757823", "achievementId": "91475035554539", "selected": false}, {"rewardId": "1871255302", "achievementId": "91475035554536", "selected": true}, {"rewardId": "1519882399", "achievementId": "91475330443207", "selected": false}, {"rewardId": "2940888951", "achievementId": "91475329346913", "selected": false}, {"rewardId": "2009110693", "selected": true}, {"rewardId": "2560554373", "selected": false}, {"rewardId": "3168567118", "selected": false}, {"rewardId": "637508413", "achievementId": "91475320767624", "selected": false}, {"rewardId": "4130543639", "achievementId": "91475320767626", "selected": false}, {"rewardId": "2599148713", "achievementId": "91475329944805", "selected": false}, {"rewardId": "3441709157", "selected": false}, {"rewardId": "18730036", "selected": true}, {"rewardId": "985481741", "selected": false}, {"rewardId": "1919498533", "achievementId": "91475320768464", "selected": false}, {"rewardId": "4196367769", "achievementId": "91475320767628", "selected": false}, {"rewardId": "693439517", "achievementId": "91475320767722", "selected": false}, {"rewardId": "2220484966", "achievementId": "91475329944805", "selected": false}, {"rewardId": "1187203361", "selected": false}, {"rewardId": "1103653240", "selected": false}, {"rewardId": "2359737029", "selected": true}, 
{"rewardId": "604349629", "achievementId": "91475329944805", "selected": false}, {"category": "Thor", "rewardId": "3979921667", "selected": true}, {"category": "Overlord", "rewardId": "3717180683", "selected": true}, {"category": "Supply Depot", "rewardId": "2820704051", "selected": true}, {"category": "Ultralisk", "rewardId": "1751841167", "selected": true}, {"category": "Marine", "rewardId": "656055948", "selected": true}, {"category": "Zergling", "rewardId": "4067220888", "selected": true}, {"category": "Zealot", "rewardId": "3673336716", "selected": true}, {"category": "Pylon", "rewardId": "1500394717", "selected": true}, {"category": "Adept", "rewardId": "2428064812", "selected": true}, {"category": "Colossus", "rewardId": "4144653506", "selected": false}, {"category": "Colossus", "rewardId": "1001385194", "achievementId": "91475334888156", "selected": true}, {"category": "Ghost", "rewardId": "15768155", "selected": true}, {"category": "Marauder", "rewardId": "1697980591", "selected": true}, {"category": "Stalker", "rewardId": "2908506816", "selected": true}, {"category": "Roach", "rewardId": "879683935", "selected": true}, {"category": "Viking", "rewardId": "268555963", "selected": true}, {"category": "Ravager", "rewardId": "1896734762", "selected": true}, {"category": "Immortal", "rewardId": "2442224836", "selected": true}, {"category": "Hellion", "rewardId": "535257061", "selected": true}, {"category": "Brood Lord", "rewardId": "3927507442", "selected": true}, {"category": "Carrier", "rewardId": "1701027258", "selected": true}, {"category": "Swarm Host", "rewardId": "648212375", "selected": true}, {"category": "Infestor", "rewardId": "695505991", "selected": true}, {"category": "Hydralisk", "rewardId": "415049813", "selected": true}, {"category": "Widow Mine", "rewardId": "306523769", "selected": true}, {"category": "Overseer", "rewardId": "3881125246", "selected": true}, {"category": "Mutalisk", "rewardId": "1804753350", "selected": true}, {"category": 
"Baneling", "rewardId": "2418337610", "selected": true}, {"category": "Siege Tank", "rewardId": "2798322801", "selected": true}, {"category": "Corruptor", "rewardId": "2265163032", "selected": true}, {"category": "Liberator", "rewardId": "800865563", "selected": true}, {"category": "Warp Prism", "rewardId": "3659500125", "selected": true}, {"category": "Medivac", "rewardId": "2588696470", "selected": true}, {"category": "Viper", "rewardId": "570858554", "selected": true}, {"category": "Lurker", "rewardId": "3662634844", "selected": true}, {"category": "Banshee", "rewardId": "170345607", "selected": true}, {"category": "Tempest", "rewardId": "2104220742", "selected": true}, {"category": "SCV", "rewardId": "3200889017", "selected": true}, {"category": "Reaper", "rewardId": "1664806146", "selected": true}, {"category": "Disruptor", "rewardId": "1819379161", "selected": true}, {"category": "Queen", "rewardId": "3535385893", "selected": true}, {"category": "Observer", "rewardId": "522060623", "selected": true}, {"category": "Void Ray", "rewardId": "4037382619", "selected": true}, {"category": "Cyclone", "rewardId": "2200440608", "selected": true}, {"category": "Battlecruiser", "rewardId": "3312681754", "selected": true}, {"category": "Drone", "rewardId": "2615206337", "selected": true}, {"category": "Sentry", "rewardId": "574205234", "selected": true}, {"category": "Oracle", "rewardId": "2211791455", "selected": true}, {"category": "Probe", "rewardId": "1434315354", "selected": true}, {"category": "Mothership", "rewardId": "861201319", "selected": true}, {"category": "Raven", "rewardId": "3776162522", "selected": true}, {"category": "Phoenix", "rewardId": "2112492252", "selected": true}, {"category": "Dark Templar", "rewardId": "334383271", "selected": true}, {"rewardId": "3733267237", "selected": true}, {"rewardId": "2065745403", "selected": true}, {"rewardId": "1211360877", "selected": true}, {"rewardId": "535063834", "selected": true}, {"rewardId": "923590681", 
"selected": true}, {"rewardId": "4166020264", "selected": true}, {"rewardId": "2926081466", "selected": true}, {"rewardId": "3710533283", "selected": true}, {"rewardId": "92851551", "selected": true}, {"rewardId": "2261818887", "selected": true}, {"rewardId": "2575161619", "selected": true}, {"rewardId": "3862455791", "selected": true}, {"rewardId": "2636101654", "selected": true}, {"rewardId": "2269615009", "selected": true}, {"rewardId": "3573408684", "selected": true}, {"rewardId": "3297901810", "selected": true}, {"rewardId": "1856740790", "selected": true}, {"rewardId": "3844201106", "selected": true}, {"rewardId": "2743830879", "selected": true}, {"rewardId": "2484810407", "selected": true}, {"rewardId": "2816041570", "selected": true}, {"rewardId": "2440135473", "selected": true}, {"rewardId": "957728898", "selected": true}, {"rewardId": "3297635990", "selected": true}, {"rewardId": "3558592749", "selected": true}, {"rewardId": "58624234", "selected": true}, {"rewardId": "3334748015", "selected": true}, {"rewardId": "2546882099", "selected": true}, {"rewardId": "2569643939", "selected": true}, {"rewardId": "1932524020", "selected": true}, {"rewardId": "918219367", "selected": true}, {"rewardId": "1970675092", "selected": true}, {"rewardId": "569055983", "selected": true}, {"rewardId": "3774144990", "selected": true}, {"rewardId": "1185292261", "selected": true}, {"rewardId": "3663192486", "selected": true}, {"rewardId": "3335041079", "selected": true}, {"rewardId": "1465828875", "selected": true}, {"rewardId": "1625603542", "selected": true}, {"rewardId": "1873318506", "selected": true}, {"rewardId": "3356965872", "selected": true}, {"rewardId": "3281192906", "selected": true}, {"rewardId": "3961822947", "selected": true}], "earnedAchievements": [{"achievementId": "91475333092824", "completionDate": 1550677355, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": 
"91475329427946"}]}, {"achievementId": "91475320767578", "completionDate": 1550677346, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 24, "isComplete": true, "inProgress": false, "criteria": []}, {"achievementId": "91475325309291", "completionDate": 1473449043, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475328935513"}]}, {"achievementId": "91475320766707", "completionDate": 1473449043, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475320766915"}]}, {"achievementId": "91475035553824", "completionDate": 1473446653, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553831"}]}, {"achievementId": "91475329346913", "completionDate": 1473443802, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475333699746"}]}, {"achievementId": "91475320768509", "completionDate": 1470366983, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475320767772"}]}, {"achievementId": "91475035553803", "completionDate": 1470347581, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553803"}]}, {"achievementId": "91475320766710", "completionDate": 1470346743, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475320766921"}]}, {"achievementId": "91475035553816", "completionDate": 1470346620, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, 
"criteria": [{"criterionId": "91475035553823"}]}, {"achievementId": "91475035553840", "completionDate": 1470343829, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553874"}]}, {"achievementId": "91475035553823", "completionDate": 1470343804, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553830"}]}, {"achievementId": "91475035553838", "completionDate": 1470343773, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553872"}]}, {"achievementId": "91475035553818", "completionDate": 1470342234, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553825"}]}, {"achievementId": "91475328241047", "completionDate": 1470339356, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475321289247"}]}, {"achievementId": "91475320768475", "completionDate": 1470339356, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475320767709", "earned": {"quantity": 4, "startTime": 0}}, {"criterionId": "91475320767711", "earned": {"quantity": 4, "startTime": 0}}, {"criterionId": "91475320767713", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035553833", "completionDate": 1470338943, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553840"}]}, {"achievementId": "91475334888156", "completionDate": 1470337864, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 7, "isComplete": 
true, "inProgress": false, "criteria": [{"criterionId": "91475325292206", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475336474743", "earned": {"quantity": 7, "startTime": 0}}, {"criterionId": "91475331445599", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475327555908", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475326525233", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475332121737", "earned": {"quantity": 2, "startTime": 0}}, {"criterionId": "91475333990659", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035553847", "completionDate": 1470337600, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553881"}]}, {"achievementId": "91475035553802", "completionDate": 1470336507, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553802"}]}, {"achievementId": "91475320768588", "completionDate": 1470336179, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475320767913"}]}, {"achievementId": "91475035553821", "completionDate": 1470336142, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553828"}]}, {"achievementId": "91475035553829", "completionDate": 1470336096, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553836"}]}, {"achievementId": "91475035553819", "completionDate": 1470336064, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553826"}]}, {"achievementId": "91475035553830", 
"completionDate": 1470335716, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553837"}]}, {"achievementId": "91475035553828", "completionDate": 1470335258, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035553835"}]}, {"achievementId": "91475332200455", "completionDate": 1470334043, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475337379646"}]}, {"achievementId": "91475329944805", "completionDate": 1470334043, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475322799887"}]}, {"achievementId": "91475335633554", "completionDate": 1470334043, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475334202312"}]}, {"achievementId": "91475330443207", "completionDate": 1447976497, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475323160462"}]}, {"achievementId": "91475035554499", "completionDate": 1447714775, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555148"}]}, {"achievementId": "91475035554541", "completionDate": 1447693974, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555221"}]}, {"achievementId": "91475035554557", "completionDate": 1447693260, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": 
"91475035555143"}]}, {"achievementId": "91475035554449", "completionDate": 1447692303, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555118"}]}, {"achievementId": "91475035554443", "completionDate": 1447691403, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555112"}]}, {"achievementId": "91475035554452", "completionDate": 1447690666, "numCompletedAchievementsInSeries": 6, "totalAchievementsInSeries": 6, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555161", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555122", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555123", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555159", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555160", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555121", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554539", "completionDate": 1447690666, "numCompletedAchievementsInSeries": 4, "totalAchievementsInSeries": 4, "isComplete": true, "inProgress": false, "criteria": []}, {"achievementId": "91475035554487", "completionDate": 1447690666, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555136"}]}, {"achievementId": "91475035554564", "completionDate": 1447690666, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555252", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555253", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555602", "earned": {"quantity": 1, "startTime": 1}}]}, 
{"achievementId": "91475035554536", "completionDate": 1447690666, "numCompletedAchievementsInSeries": 4, "totalAchievementsInSeries": 4, "isComplete": true, "inProgress": false, "criteria": []}, {"achievementId": "91475035554510", "completionDate": 1447690666, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555616"}]}, {"achievementId": "91475035554484", "completionDate": 1447676793, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555133"}]}, {"achievementId": "91475035554488", "completionDate": 1447676340, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555137"}]}, {"achievementId": "91475035554440", "completionDate": 1447675702, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555109", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554483", "completionDate": 1447674768, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555132"}]}, {"achievementId": "91475035554486", "completionDate": 1447674205, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555135"}]}, {"achievementId": "91475035554588", "completionDate": 1447672943, "numCompletedAchievementsInSeries": 10, "totalAchievementsInSeries": 10, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555342", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555343", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555344", "earned": 
{"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555345", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555348", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555349", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555346", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555347", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555350", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555341", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554438", "completionDate": 1447672555, "numCompletedAchievementsInSeries": 6, "totalAchievementsInSeries": 6, "isComplete": true, "inProgress": false, "criteria": []}, {"achievementId": "91475035554714", "completionDate": 1447672555, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555310", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555318", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555319", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554583", "completionDate": 1447672555, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555723", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555329", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555722", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554500", "completionDate": 1447672554, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555149"}]}, {"achievementId": "91475035554503", "completionDate": 1447671919, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, 
"isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555152"}]}, {"achievementId": "91475035554502", "completionDate": 1447671764, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555151"}]}, {"achievementId": "91475035554507", "completionDate": 1447633753, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555156"}]}, {"achievementId": "91475035554509", "completionDate": 1447633753, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555158"}]}, {"achievementId": "91475035554585", "completionDate": 1447633753, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555334", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555335", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555333", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554586", "completionDate": 1447633753, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555336", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555337", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555338", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554504", "completionDate": 1447632195, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555153"}]}, {"achievementId": "91475035554447", "completionDate": 1447632069, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, 
"isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555116"}]}, {"achievementId": "91475035554445", "completionDate": 1447632009, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555114"}]}, {"achievementId": "91475035554581", "completionDate": 1447630506, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 2, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555320", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555321", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554587", "completionDate": 1447630506, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 2, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555339", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555340", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554518", "completionDate": 1447630505, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555169"}]}, {"achievementId": "91475035554521", "completionDate": 1447630017, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555601"}]}, {"achievementId": "91475035554497", "completionDate": 1447628744, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555146"}]}, {"achievementId": "91475035554519", "completionDate": 1447628682, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555170"}]}, {"achievementId": "91475035554441", "completionDate": 1447628522, 
"numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555110"}]}, {"achievementId": "91475035554546", "completionDate": 1447626253, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555225"}]}, {"achievementId": "91475035554551", "completionDate": 1447626253, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555230"}]}, {"achievementId": "91475035554562", "completionDate": 1447626253, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555239", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555240", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555238", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554584", "completionDate": 1447626253, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555331", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555332", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555330", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554480", "completionDate": 1447625881, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555128"}]}, {"achievementId": "91475035554571", "completionDate": 1447624894, "numCompletedAchievementsInSeries": 5, "totalAchievementsInSeries": 5, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555269", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": 
"91475035555268", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555266", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555267", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555270", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554542", "completionDate": 1447624715, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555222"}]}, {"achievementId": "91475035554547", "completionDate": 1447623701, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555226"}]}, {"achievementId": "91475035554524", "completionDate": 1447622992, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555174"}]}, {"achievementId": "91475035554549", "completionDate": 1447621054, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555228"}]}, {"achievementId": "91475035554505", "completionDate": 1447621054, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555154"}]}, {"achievementId": "91475035554525", "completionDate": 1447619781, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555175"}]}, {"achievementId": "91475035554577", "completionDate": 1447619781, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 2, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555308", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555309", "earned": {"quantity": 1, 
"startTime": 1}}]}, {"achievementId": "91475035554482", "completionDate": 1447619781, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 2, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555131", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555130", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554444", "completionDate": 1447618616, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555623"}]}, {"achievementId": "91475035554479", "completionDate": 1447610585, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555127"}]}, {"achievementId": "91475035554512", "completionDate": 1447609934, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555163"}]}, {"achievementId": "91475035554513", "completionDate": 1447597159, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555164"}]}, {"achievementId": "91475035554517", "completionDate": 1447597159, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555168"}]}, {"achievementId": "91475035554450", "completionDate": 1447596836, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555119"}]}, {"achievementId": "91475035554516", "completionDate": 1447596617, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555167"}]}, {"achievementId": 
"91475035554493", "completionDate": 1447525664, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555142"}]}, {"achievementId": "91475035554556", "completionDate": 1447523766, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555235"}]}, {"achievementId": "91475035554567", "completionDate": 1447507110, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555262"}]}, {"achievementId": "91475035554554", "completionDate": 1447505479, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555233"}]}, {"achievementId": "91475035554566", "completionDate": 1447504784, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555261"}]}, {"achievementId": "91475035554552", "completionDate": 1447451680, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555231"}]}, {"achievementId": "91475035554555", "completionDate": 1447451476, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555234"}]}, {"achievementId": "91475035554423", "completionDate": 1447449394, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555087"}]}, {"achievementId": "91475035554433", "completionDate": 1447449394, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": 
[{"criterionId": "91475035555107", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555105", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555106", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554432", "completionDate": 1447449394, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 3, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555095", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555098", "earned": {"quantity": 1, "startTime": 1}}, {"criterionId": "91475035555101", "earned": {"quantity": 1, "startTime": 1}}]}, {"achievementId": "91475035554429", "completionDate": 1447449246, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555093"}]}, {"achievementId": "91475035554425", "completionDate": 1447447780, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555089"}]}, {"achievementId": "91475035554424", "completionDate": 1447447723, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555088"}]}, {"achievementId": "91475035554428", "completionDate": 1447446969, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555092"}]}, {"achievementId": "91475035554422", "completionDate": 1447441173, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": [{"criterionId": "91475035555086"}]}, {"achievementId": "91475035554427", "completionDate": 1447440907, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 1, "isComplete": true, "inProgress": false, "criteria": 
[{"criterionId": "91475035555091"}]}, {"achievementId": "91475320766541", "completionDate": 1470339356, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475035554530", "completionDate": 1447597160, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": true, "criteria": [], "nextProgressEarnedQuantity": 37, "nextProgressRequiredQuantity": 1}, {"achievementId": "91475321572241", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475035553808", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035553809"}, {"criterionId": "91475035553811"}, {"criterionId": "91475035553812", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553807"}, {"criterionId": "91475035553808", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553813"}, {"criterionId": "91475035553814"}, {"criterionId": "91475035553815"}, {"criterionId": "91475035553810"}]}, {"achievementId": "91475035554036", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475035553837", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 10, "totalAchievementsInSeries": 26, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035553844"}, {"criterionId": "91475035553845"}, {"criterionId": "91475035553847", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553849", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553851"}, {"criterionId": 
"91475035553852"}, {"criterionId": "91475035553855"}, {"criterionId": "91475035553857"}, {"criterionId": "91475035553858", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553859"}, {"criterionId": "91475035553860", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553863"}, {"criterionId": "91475035553864"}, {"criterionId": "91475035553865"}, {"criterionId": "91475035553866", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553867", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553869", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553870"}, {"criterionId": "91475035553871"}, {"criterionId": "91475035553848", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553853"}, {"criterionId": "91475035553854", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553856", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553861"}, {"criterionId": "91475035553862"}, {"criterionId": "91475035553868"}]}, {"achievementId": "91475322400597", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 90, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475327003879"}, {"criterionId": "91475330600185"}, {"criterionId": "91475334114668"}, {"criterionId": "91475328467099"}, {"criterionId": "91475332599274"}, {"criterionId": "91475323530868"}, {"criterionId": "91475332140686"}, {"criterionId": "91475323932442"}, {"criterionId": "91475324890912"}, {"criterionId": "91475321343888", "earned": {"quantity": 5, "startTime": 0}}, {"criterionId": "91475334205442"}, {"criterionId": "91475321409686"}, {"criterionId": "91475324612983"}, {"criterionId": "91475329020433"}, {"criterionId": "91475326128709"}, {"criterionId": "91475327808098"}, {"criterionId": "91475332354710"}, {"criterionId": "91475329192628"}, {"criterionId": "91475336893263"}, {"criterionId": 
"91475329782624"}, {"criterionId": "91475334861700"}, {"criterionId": "91475323032471"}, {"criterionId": "91475330143469"}, {"criterionId": "91475331552554"}, {"criterionId": "91475327980853"}, {"criterionId": "91475335812638"}, {"criterionId": "91475321820832"}, {"criterionId": "91475337090779"}, {"criterionId": "91475327293209"}, {"criterionId": "91475322230578"}, {"criterionId": "91475321583501"}, {"criterionId": "91475328513953"}, {"criterionId": "91475329685428"}, {"criterionId": "91475337361485"}, {"criterionId": "91475334543480"}, {"criterionId": "91475321714924"}, {"criterionId": "91475325188374"}, {"criterionId": "91475327228304"}, {"criterionId": "91475324049988"}, {"criterionId": "91475326614474"}, {"criterionId": "91475333667807", "earned": {"quantity": 6, "startTime": 0}}, {"criterionId": "91475335298124"}, {"criterionId": "91475334741155"}, {"criterionId": "91475336543614"}, {"criterionId": "91475326827926"}, {"criterionId": "91475334481180"}, {"criterionId": "91475331874606"}, {"criterionId": "91475335151459"}, {"criterionId": "91475334758270"}, {"criterionId": "91475327082368"}, {"criterionId": "91475332947901"}, {"criterionId": "91475330596803"}, {"criterionId": "91475323551975"}, {"criterionId": "91475336634652"}, {"criterionId": "91475336405292"}, {"criterionId": "91475323502948"}, {"criterionId": "91475334291858"}, {"criterionId": "91475323600970"}, {"criterionId": "91475336684078"}, {"criterionId": "91475324052221"}, {"criterionId": "91475329581887"}, {"criterionId": "91475329500246"}, {"criterionId": "91475326682443"}, {"criterionId": "91475327550634"}, {"criterionId": "91475330745591"}, {"criterionId": "91475336226132"}, {"criterionId": "91475334767974"}, {"criterionId": "91475333531000"}, {"criterionId": "91475324044694"}, {"criterionId": "91475321447880"}, {"criterionId": "91475333473761"}, {"criterionId": "91475327026700"}, {"criterionId": "91475324134962"}, {"criterionId": "91475321112172"}, {"criterionId": "91475326404326"}, 
{"criterionId": "91475326158648"}, {"criterionId": "91475323029308"}, {"criterionId": "91475333859283"}, {"criterionId": "91475325315139"}, {"criterionId": "91475335424210"}, {"criterionId": "91475324250404"}, {"criterionId": "91475331901811"}, {"criterionId": "91475331492494"}, {"criterionId": "91475325946516"}, {"criterionId": "91475335219963"}, {"criterionId": "91475330310626"}, {"criterionId": "91475327257647"}, {"criterionId": "91475325682196"}, {"criterionId": "91475332377798"}, {"criterionId": "91475335023820"}]}, {"achievementId": "91475035553849", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 11, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035553890"}, {"criterionId": "91475035553891"}, {"criterionId": "91475035553883"}, {"criterionId": "91475035553884"}, {"criterionId": "91475035553885"}, {"criterionId": "91475035553886", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553887"}, {"criterionId": "91475035553888", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553889", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035553892"}, {"criterionId": "91475035553893"}]}, {"achievementId": "91475336291145", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475325600197", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 8, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475326766331", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475331968811"}, {"criterionId": "91475333846087", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475329414494", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475336370883"}, {"criterionId": "91475331573065", "earned": 
{"quantity": 1, "startTime": 0}}, {"criterionId": "91475326116849", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475324249510", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475332082713", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475329707091", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475320766708", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320766916"}, {"criterionId": "91475320766917"}, {"criterionId": "91475320766918"}, {"criterionId": "91475320766919", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766981"}]}, {"achievementId": "91475335958848", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475337130079", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475334675119"}, {"criterionId": "91475330564404"}]}, {"achievementId": "91475320766956", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320766964", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766982"}, {"criterionId": "91475320766983"}, {"criterionId": "91475320766963"}]}, {"achievementId": "91475325498135", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475320768476", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320767714"}, {"criterionId": "91475320767716"}, {"criterionId": 
"91475320767718"}, {"criterionId": "91475320767720", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475035554582", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 5, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555323", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555322", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555324"}, {"criterionId": "91475035555325", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555326", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555327"}, {"criterionId": "91475035555328", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475035554579", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555317", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555311", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555312"}, {"criterionId": "91475035555314", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555313"}, {"criterionId": "91475035555315"}, {"criterionId": "91475035555316"}]}, {"achievementId": "91475322764393", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475320766633", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320766781"}, {"criterionId": "91475320766785", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766786"}, {"criterionId": "91475320766787"}]}, {"achievementId": "91475035554574", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555286"}, {"criterionId": "91475035555285", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555287"}]}, {"achievementId": "91475035554576", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 6, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555299"}, {"criterionId": "91475035555303", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555305"}, {"criterionId": "91475035555298", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555300"}, {"criterionId": "91475035555301", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555302"}, {"criterionId": "91475035555304", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555306", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555307", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475035554575", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 6, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555291", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555292"}, {"criterionId": "91475035555293"}, {"criterionId": "91475035555297", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555289", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555288", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555290"}, {"criterionId": "91475035555294", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555295", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555296"}]}, {"achievementId": "91475035554572", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 6, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555279"}, {"criterionId": "91475035555280"}, {"criterionId": "91475035555273"}, {"criterionId": "91475035555274", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555275", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555272", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555276"}, {"criterionId": "91475035555271", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555277", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555278", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475035554573", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 8, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555254", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555259", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555255"}, {"criterionId": "91475035555281", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555283", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555256"}, {"criterionId": "91475035555257", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555258", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555282", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555284", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475320766634", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320766788"}, {"criterionId": "91475320766789"}, {"criterionId": "91475320766791", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": 
"91475320766792", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766793", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766790"}]}, {"achievementId": "91475320766635", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 2, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320766794"}, {"criterionId": "91475320766797", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766798"}, {"criterionId": "91475320766800"}, {"criterionId": "91475320766801"}, {"criterionId": "91475320766802"}, {"criterionId": "91475320766803"}, {"criterionId": "91475320766795", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766796"}, {"criterionId": "91475320766799"}]}, {"achievementId": "91475323853471", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475320766557", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475320766575"}, {"criterionId": "91475320766576"}, {"criterionId": "91475320766577"}, {"criterionId": "91475320766578", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475320766579"}, {"criterionId": "91475320766580"}, {"criterionId": "91475320766581"}, {"criterionId": "91475320766582"}, {"criterionId": "91475320766583"}, {"criterionId": "91475320766584"}]}, {"achievementId": "91475035554715", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555251", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555604"}, {"criterionId": "91475035555605"}]}, {"achievementId": 
"91475035553986", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475035554716", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 3, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555607", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555608"}, {"criterionId": "91475035555609"}, {"criterionId": "91475035555610"}, {"criterionId": "91475035555612"}, {"criterionId": "91475035555613"}, {"criterionId": "91475035555614"}, {"criterionId": "91475035555615"}, {"criterionId": "91475035555606", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555611", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475328493540", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475325568242", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475035554589", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 10, "totalAchievementsInSeries": 18, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555725"}, {"criterionId": "91475035555726", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555734", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555736", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555737", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555738", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555731", "earned": {"quantity": 1, "startTime": 0}}, 
{"criterionId": "91475035555724"}, {"criterionId": "91475035555727", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555728"}, {"criterionId": "91475035555729", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555730"}, {"criterionId": "91475035555732", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555733"}, {"criterionId": "91475035555735", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555739"}, {"criterionId": "91475035555740"}, {"criterionId": "91475035555741"}]}, {"achievementId": "91475035554563", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 6, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475035555250", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555242", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555244", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555245", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555247", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555248"}, {"criterionId": "91475035555246"}, {"criterionId": "91475035555249"}, {"criterionId": "91475035555241", "earned": {"quantity": 1, "startTime": 0}}, {"criterionId": "91475035555243"}]}, {"achievementId": "91475322961304", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475320767211", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 28, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475323410831", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, 
{"achievementId": "91475336203783", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475324225400", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475334058022", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475331577281", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475324281236", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475329833803", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475327159984", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475329050964", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": true, "criteria": []}, {"achievementId": "91475333712804", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475333181557"}, {"criterionId": "91475332643036", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475323166580", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 1, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": true, "criteria": [{"criterionId": "91475323594076"}, {"criterionId": "91475337341977", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475035554109", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554477"}]}, {"achievementId": "91475035554777", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555830"}]}, {"achievementId": "91475320766698", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766906"}]}, {"achievementId": "91475035553947", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554057"}]}, {"achievementId": "91475320766636", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766587", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766701", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766908"}]}, {"achievementId": "91475035554655", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 
0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555531", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475320766704", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766912"}]}, {"achievementId": "91475035553850", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553894"}]}, {"achievementId": "91475035554670", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555532", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553812", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553819"}]}, {"achievementId": "91475035554807", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555882"}]}, {"achievementId": "91475320766573", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766655", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554771", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475035555821", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475320766711", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554638", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555399", "earned": {"quantity": 6000, "startTime": 0}}]}, {"achievementId": "91475035554079", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554423"}]}, {"achievementId": "91475035554797", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555866"}]}, {"achievementId": "91475320766559", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554089", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554441"}]}, {"achievementId": "91475035554802", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555874"}]}, {"achievementId": "91475320766666", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": 
"91475035554665", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555426", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554713", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555522", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553960", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554081"}]}, {"achievementId": "91475320766545", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 1, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475325745764", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475331727248"}]}, {"achievementId": "91475035554812", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555890"}]}, {"achievementId": "91475035554823", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555915"}]}, {"achievementId": "91475320768477", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475320767721"}]}, {"achievementId": "91475035553973", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554105"}]}, {"achievementId": "91475320766682", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766875"}]}, {"achievementId": "91475035554453", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555539"}]}, {"achievementId": "91475035554660", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555421", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554136", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554525"}]}, {"achievementId": "91475320766509", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 6, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475035553851", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553895"}]}, {"achievementId": "91475329450186", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475323675377"}]}, {"achievementId": "91475035554451", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555120", "earned": {"quantity": 17, "startTime": 0}}]}, {"achievementId": "91475326410893", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 8, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475035553902", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553973"}]}, {"achievementId": "91475035553931", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554027"}]}, {"achievementId": "91475035553809", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553816"}]}, {"achievementId": "91475035553794", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554254"}, {"criterionId": "91475035554257"}, {"criterionId": "91475035554261"}, {"criterionId": "91475035554256"}, {"criterionId": "91475035554253"}, {"criterionId": "91475035554255"}, {"criterionId": "91475035554258"}, {"criterionId": "91475035554259"}, {"criterionId": "91475035554260"}]}, {"achievementId": "91475035553795", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": 
false, "criteria": [{"criterionId": "91475035554263"}, {"criterionId": "91475035554265"}, {"criterionId": "91475035554262"}, {"criterionId": "91475035554264"}, {"criterionId": "91475035554266"}, {"criterionId": "91475035554267"}, {"criterionId": "91475035554268"}, {"criterionId": "91475035554269"}, {"criterionId": "91475035554270"}]}, {"achievementId": "91475320766709", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766920"}]}, {"achievementId": "91475035554280", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554752"}]}, {"achievementId": "91475035554129", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554513"}]}, {"achievementId": "91475320766465", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554751", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555782"}]}, {"achievementId": "91475035554099", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554459"}]}, {"achievementId": "91475035554680", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555441"}]}, {"achievementId": 
"91475035554729", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555742", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475320766632", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766777"}, {"criterionId": "91475320766778"}, {"criterionId": "91475320766779"}, {"criterionId": "91475320766780"}]}, {"achievementId": "91475035553799", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553799"}]}, {"achievementId": "91475035553915", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553997"}]}, {"achievementId": "91475035553879", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553931"}]}, {"achievementId": "91475035554772", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555822"}]}, {"achievementId": "91475035554818", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555899"}, {"criterionId": "91475035555903"}, {"criterionId": "91475035555904"}, {"criterionId": "91475035555905"}, {"criterionId": "91475035555907"}, {"criterionId": "91475035555900"}, {"criterionId": 
"91475035555901"}, {"criterionId": "91475035555902"}, {"criterionId": "91475035555906"}]}, {"achievementId": "91475035553796", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554274"}, {"criterionId": "91475035554275"}, {"criterionId": "91475035553796"}, {"criterionId": "91475035554273"}, {"criterionId": "91475035554271"}, {"criterionId": "91475035554272"}, {"criterionId": "91475035554276"}, {"criterionId": "91475035554277"}, {"criterionId": "91475035554278"}, {"criterionId": "91475035554279"}]}, {"achievementId": "91475035554368", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554963"}]}, {"achievementId": "91475035554407", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555031"}, {"criterionId": "91475035555033"}, {"criterionId": "91475035555032"}]}, {"achievementId": "91475035553889", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553949"}]}, {"achievementId": "91475035554384", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554993"}]}, {"achievementId": "91475335333585", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553890", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553950"}]}, {"achievementId": "91475035554408", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555034"}, {"criterionId": "91475035555035"}, {"criterionId": "91475035555036"}]}, {"achievementId": "91475035553888", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553942"}, {"criterionId": "91475035553943"}, {"criterionId": "91475035553947"}, {"criterionId": "91475035553940"}, {"criterionId": "91475035553941"}, {"criterionId": "91475035553944"}, {"criterionId": "91475035553945"}, {"criterionId": "91475035553946"}, {"criterionId": "91475035553948"}]}, {"achievementId": "91475035554361", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555084"}, {"criterionId": "91475035555085"}, {"criterionId": "91475035555083"}]}, {"achievementId": "91475035553880", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553932"}]}, {"achievementId": "91475035553797", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553797"}]}, {"achievementId": "91475035553916", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553998"}]}, {"achievementId": 
"91475035554723", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555713"}, {"criterionId": "91475035555714"}, {"criterionId": "91475035555715"}, {"criterionId": "91475035555716"}]}, {"achievementId": "91475035554459", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555622"}]}, {"achievementId": "91475035554281", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554753"}]}, {"achievementId": "91475323881078", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475328305582"}]}, {"achievementId": "91475035553961", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554082"}]}, {"achievementId": "91475320766700", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766907"}]}, {"achievementId": "91475035554514", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555165"}]}, {"achievementId": "91475035553948", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554058"}]}, 
{"achievementId": "91475035554526", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555176"}]}, {"achievementId": "91475035553810", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553817"}]}, {"achievementId": "91475035553903", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553974"}]}, {"achievementId": "91475035553800", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553800"}]}, {"achievementId": "91475035554137", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554526"}]}, {"achievementId": "91475035554455", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555540"}]}, {"achievementId": "91475035554544", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555223"}]}, {"achievementId": "91475035554130", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554514"}]}, {"achievementId": "91475035553974", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554106"}]}, {"achievementId": "91475035554824", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555916"}]}, {"achievementId": "91475035554110", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554478"}]}, {"achievementId": "91475035554808", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555883"}]}, {"achievementId": "91475035554100", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554460"}]}, {"achievementId": "91475035554385", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554994"}]}, {"achievementId": "91475035554803", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555875"}]}, {"achievementId": "91475035554639", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555400"}]}, {"achievementId": "91475035554798", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": 
false, "inProgress": false, "criteria": [{"criterionId": "91475035555867"}]}, {"achievementId": "91475035554080", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554424"}]}, {"achievementId": "91475035554773", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555823"}]}, {"achievementId": "91475035554819", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555908"}]}, {"achievementId": "91475320768479", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767723"}]}, {"achievementId": "91475035554813", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555891"}]}, {"achievementId": "91475035554752", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555783", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554746", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555764"}, {"criterionId": "91475035555766"}, {"criterionId": "91475035555767"}, {"criterionId": "91475035555768"}, {"criterionId": "91475035555771"}, {"criterionId": "91475035555763"}, {"criterionId": 
"91475035555765"}, {"criterionId": "91475035555769"}, {"criterionId": "91475035555770"}, {"criterionId": "91475035555772"}]}, {"achievementId": "91475320766705", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766913"}]}, {"achievementId": "91475035554734", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555751"}]}, {"achievementId": "91475320766702", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766909", "earned": {"quantity": 1, "startTime": 0}}]}, {"achievementId": "91475035554656", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555417", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554730", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555743", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554778", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555831"}]}, {"achievementId": "91475320768511", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767774"}]}, {"achievementId": "91475035554013", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554282"}]}, {"achievementId": "91475320766683", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766876"}]}, {"achievementId": "91475035553932", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554028"}]}, {"achievementId": "91475035554661", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555422", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553852", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553896"}]}, {"achievementId": "91475035554666", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555427", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554671", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555432", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475320766623", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554681", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555510", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554676", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555437", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554369", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554964"}]}, {"achievementId": "91475035554090", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554442"}]}, {"achievementId": "91475320768481", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767725"}, {"criterionId": "91475320767727"}, {"criterionId": "91475320767729"}]}, {"achievementId": "91475035553933", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554029"}]}, {"achievementId": "91475035554779", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555832"}]}, {"achievementId": "91475035554282", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": 
[{"criterionId": "91475035554754"}]}, {"achievementId": "91475035554456", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555541"}]}, {"achievementId": "91475035554774", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555824"}]}, {"achievementId": "91475035553962", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554083"}]}, {"achievementId": "91475035554111", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554479"}]}, {"achievementId": "91475035553839", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553873"}]}, {"achievementId": "91475035554272", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554736"}]}, {"achievementId": "91475035553901", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 12, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553962"}, {"criterionId": "91475035553964"}, {"criterionId": "91475035553961"}, {"criterionId": "91475035553963"}, {"criterionId": "91475035553965"}, {"criterionId": "91475035553966"}, {"criterionId": "91475035553967"}, {"criterionId": "91475035553968"}, {"criterionId": "91475035553969"}, 
{"criterionId": "91475035553970"}, {"criterionId": "91475035553971"}, {"criterionId": "91475035553972"}]}, {"achievementId": "91475035553975", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554107"}]}, {"achievementId": "91475035553853", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553897"}]}, {"achievementId": "91475035554682", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555443"}]}, {"achievementId": "91475035554138", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554527"}]}, {"achievementId": "91475035553801", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553801"}]}, {"achievementId": "91475035554568", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555263"}]}, {"achievementId": "91475035553904", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553975"}]}, {"achievementId": "91475035554804", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475035555876"}]}, {"achievementId": "91475035554131", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554515"}]}, {"achievementId": "91475035554677", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555517", "earned": {"quantity": 342, "startTime": 0}}]}, {"achievementId": "91475035554527", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555177"}]}, {"achievementId": "91475035554506", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555155"}]}, {"achievementId": "91475035553917", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553999"}]}, {"achievementId": "91475035554799", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555868"}]}, {"achievementId": "91475035554515", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555166"}]}, {"achievementId": "91475035554091", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554443"}]}, {"achievementId": 
"91475035554672", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555433"}]}, {"achievementId": "91475035554545", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555224"}]}, {"achievementId": "91475035554081", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554425"}]}, {"achievementId": "91475035554820", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555909"}]}, {"achievementId": "91475035554735", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555752"}]}, {"achievementId": "91475035554825", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555917"}]}, {"achievementId": "91475035554731", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555744"}]}, {"achievementId": "91475035554426", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555090"}]}, {"achievementId": "91475035554814", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555892"}]}, {"achievementId": "91475035554667", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555428"}]}, {"achievementId": "91475035553798", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553798"}]}, {"achievementId": "91475035554409", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555037"}, {"criterionId": "91475035555039"}, {"criterionId": "91475035555038"}]}, {"achievementId": "91475035554657", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555418"}]}, {"achievementId": "91475035553811", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553818"}]}, {"achievementId": "91475035554386", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554995"}]}, {"achievementId": "91475035553949", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554059"}]}, {"achievementId": "91475035554753", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555784"}]}, {"achievementId": "91475035554383", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 15, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554981"}, {"criterionId": "91475035554983"}, {"criterionId": "91475035554985"}, {"criterionId": "91475035554986"}, {"criterionId": "91475035554987"}, {"criterionId": "91475035554989"}, {"criterionId": "91475035554991"}, {"criterionId": "91475035554978"}, {"criterionId": "91475035554979"}, {"criterionId": "91475035554980"}, {"criterionId": "91475035554982"}, {"criterionId": "91475035554984"}, {"criterionId": "91475035554988"}, {"criterionId": "91475035554990"}, {"criterionId": "91475035554992"}]}, {"achievementId": "91475035554014", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554283"}]}, {"achievementId": "91475035553891", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553951"}]}, {"achievementId": "91475320766684", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766877"}]}, {"achievementId": "91475035554662", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555423"}]}, {"achievementId": "91475035554370", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, 
"isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554965"}]}, {"achievementId": "91475035554520", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555171"}]}, {"achievementId": "91475035554809", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555884"}]}, {"achievementId": "91475035554659", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555492"}, {"criterionId": "91475035555493"}, {"criterionId": "91475035555494"}, {"criterionId": "91475035555491"}]}, {"achievementId": "91475035554641", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555402"}]}, {"achievementId": "91475035553881", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553933"}]}, {"achievementId": "91475035554747", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555773"}, {"criterionId": "91475035555774"}, {"criterionId": "91475035555775"}]}, {"achievementId": "91475320768513", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767776"}]}, {"achievementId": "91475325138062", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321108812"}]}, {"achievementId": "91475320766706", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766914"}]}, {"achievementId": "91475035554371", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554966"}]}, {"achievementId": "91475035553950", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554060"}]}, {"achievementId": "91475322370975", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321300503"}, {"criterionId": "91475335997041"}]}, {"achievementId": "91475035554463", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555545"}]}, {"achievementId": "91475035554139", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554528"}]}, {"achievementId": "91475035553905", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553976"}]}, {"achievementId": "91475035554039", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 30, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554340"}, {"criterionId": "91475035554341"}, {"criterionId": "91475035554322"}, {"criterionId": "91475035554327"}, {"criterionId": "91475035554331"}, {"criterionId": "91475035554338"}, {"criterionId": "91475035554332"}, {"criterionId": "91475035554334"}, {"criterionId": "91475035554337"}, {"criterionId": "91475035554324"}, {"criterionId": "91475035554313"}, {"criterionId": "91475035554312"}, {"criterionId": "91475035554315"}, {"criterionId": "91475035554317"}, {"criterionId": "91475035554318"}, {"criterionId": "91475035554319"}, {"criterionId": "91475035554320"}, {"criterionId": "91475035554321"}, {"criterionId": "91475035554323"}, {"criterionId": "91475035554325"}, {"criterionId": "91475035554326"}, {"criterionId": "91475035554314"}, {"criterionId": "91475035554328"}, {"criterionId": "91475035554329"}, {"criterionId": "91475035554330"}, {"criterionId": "91475035554316"}, {"criterionId": "91475035554333"}, {"criterionId": "91475035554335"}, {"criterionId": "91475035554336"}, {"criterionId": "91475035554339"}]}, {"achievementId": "91475035553976", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554108"}]}, {"achievementId": "91475320766688", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766881"}]}, {"achievementId": "91475035553914", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 12, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553985"}, {"criterionId": "91475035553986"}, {"criterionId": "91475035553987"}, {"criterionId": "91475035553988"}, 
{"criterionId": "91475035553989"}, {"criterionId": "91475035553990"}, {"criterionId": "91475035553991"}, {"criterionId": "91475035553992"}, {"criterionId": "91475035553995"}, {"criterionId": "91475035553996"}, {"criterionId": "91475035553993"}, {"criterionId": "91475035553994"}]}, {"achievementId": "91475035554826", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555918"}]}, {"achievementId": "91475320768515", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767778"}]}, {"achievementId": "91475035553813", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553820"}]}, {"achievementId": "91475035554815", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555893"}]}, {"achievementId": "91475035553963", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554084"}]}, {"achievementId": "91475035554810", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555885"}]}, {"achievementId": "91475035554821", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555910"}]}, {"achievementId": "91475035554082", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554426"}]}, {"achievementId": "91475035553892", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553952"}]}, {"achievementId": "91475035554805", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555877"}]}, {"achievementId": "91475035554092", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554444"}]}, {"achievementId": "91475035554780", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555833"}]}, {"achievementId": "91475035554102", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554462"}]}, {"achievementId": "91475035554800", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555869"}]}, {"achievementId": "91475035554112", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554480"}]}, {"achievementId": "91475035554015", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554284"}]}, {"achievementId": "91475035554132", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554516"}]}, {"achievementId": "91475035553918", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554000"}]}, {"achievementId": "91475035554775", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555825"}]}, {"achievementId": "91475324171245", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475335495237", "earned": {"quantity": 19, "startTime": 0}}]}, {"achievementId": "91475035554149", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555042"}, {"criterionId": "91475035555041"}, {"criterionId": "91475035555040"}]}, {"achievementId": "91475035554387", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554996"}]}, {"achievementId": "91475035554754", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555785"}]}, {"achievementId": "91475035554642", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555403", "earned": {"quantity": 6000, "startTime": 0}}]}, {"achievementId": "91475035554748", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555776"}, {"criterionId": "91475035555777"}, {"criterionId": "91475035555778"}]}, {"achievementId": "91475035554736", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555753"}]}, {"achievementId": "91475035554732", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555745"}]}, {"achievementId": "91475035554663", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555509"}]}, {"achievementId": "91475035554658", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555419"}]}, {"achievementId": "91475332500573", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475331029760"}]}, {"achievementId": "91475035553854", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553898"}]}, {"achievementId": "91475035554098", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554453"}, {"criterionId": "91475035554450"}, {"criterionId": "91475035554451"}, {"criterionId": "91475035554452"}, {"criterionId": "91475035554454"}, {"criterionId": "91475035554455"}, {"criterionId": "91475035554456"}, {"criterionId": "91475035554457"}, {"criterionId": "91475035554458"}, {"criterionId": "91475035555016"}]}, {"achievementId": "91475035553934", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554030"}]}, {"achievementId": "91475035554283", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554755"}]}, {"achievementId": "91475035554664", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555498"}, {"criterionId": "91475035555495"}, {"criterionId": "91475035555496"}, {"criterionId": "91475035555497"}]}, {"achievementId": "91475035554683", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555444"}]}, {"achievementId": "91475035554678", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555439", "earned": {"quantity": 4177, "startTime": 0}}]}, {"achievementId": "91475035554553", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555232"}]}, {"achievementId": "91475035554668", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555429"}]}, {"achievementId": "91475035554673", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555434"}]}, {"achievementId": "91475035553882", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553934"}]}, {"achievementId": "91475035553814", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553821"}]}, {"achievementId": "91475035553964", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554085"}]}, {"achievementId": "91475035554501", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555150"}]}, {"achievementId": "91475324364820", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475335823515"}]}, {"achievementId": "91475320766703", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": 
[{"criterionId": "91475320766910"}, {"criterionId": "91475320766911"}, {"criterionId": "91475320766980"}]}, {"achievementId": "91475035554133", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554517"}]}, {"achievementId": "91475035553977", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554109"}]}, {"achievementId": "91475035554478", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555126"}]}, {"achievementId": "91475035554415", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555044"}, {"criterionId": "91475035555043"}, {"criterionId": "91475035555045"}]}, {"achievementId": "91475035554770", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555809"}, {"criterionId": "91475035555810"}, {"criterionId": "91475035555815"}, {"criterionId": "91475035555808"}, {"criterionId": "91475035555811"}, {"criterionId": "91475035555812"}, {"criterionId": "91475035555813"}, {"criterionId": "91475035555814"}, {"criterionId": "91475035555816"}, {"criterionId": "91475035555817"}]}, {"achievementId": "91475035554140", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554529"}]}, {"achievementId": "91475035553906", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553977"}]}, {"achievementId": "91475320766715", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766926"}]}, {"achievementId": "91475035553855", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553899"}]}, {"achievementId": "91475035553930", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 15, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554023"}, {"criterionId": "91475035554015"}, {"criterionId": "91475035554017"}, {"criterionId": "91475035554012"}, {"criterionId": "91475035554014"}, {"criterionId": "91475035554016"}, {"criterionId": "91475035554018"}, {"criterionId": "91475035554020"}, {"criterionId": "91475035554022"}, {"criterionId": "91475035554026"}, {"criterionId": "91475035554019"}, {"criterionId": "91475035554013"}, {"criterionId": "91475035554021"}, {"criterionId": "91475035554024"}, {"criterionId": "91475035554025"}]}, {"achievementId": "91475035554113", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554481"}]}, {"achievementId": "91475035554528", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555178"}]}, {"achievementId": "91475035554548", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555227"}]}, {"achievementId": "91475035554643", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555404"}]}, {"achievementId": "91475035553935", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554031"}]}, {"achievementId": "91475035553893", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553953"}]}, {"achievementId": "91475320766654", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766822"}, {"criterionId": "91475320766823"}, {"criterionId": "91475320766824"}, {"criterionId": "91475320766825"}, {"criterionId": "91475320766827"}, {"criterionId": "91475320766826"}]}, {"achievementId": "91475035554083", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554427"}]}, {"achievementId": "91475035554284", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554756"}]}, {"achievementId": "91475035554388", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554997"}]}, {"achievementId": 
"91475035554669", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555505"}, {"criterionId": "91475035555506"}, {"criterionId": "91475035555507"}, {"criterionId": "91475035555508"}]}, {"achievementId": "91475035554093", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554445"}]}, {"achievementId": "91475035553841", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553875"}]}, {"achievementId": "91475035554737", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555754"}]}, {"achievementId": "91475327956139", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475324355799"}, {"criterionId": "91475324758999"}]}, {"achievementId": "91475320766689", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766882"}]}, {"achievementId": "91475320768519", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767782"}]}, {"achievementId": "91475336371857", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": 
[{"criterionId": "91475324082220"}]}, {"achievementId": "91475320766671", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766854"}]}, {"achievementId": "91475035554454", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555558"}]}, {"achievementId": "91475320766640", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553951", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554061"}]}, {"achievementId": "91475035554289", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554765"}, {"criterionId": "91475035554769"}, {"criterionId": "91475035554761"}, {"criterionId": "91475035554762"}, {"criterionId": "91475035554763"}, {"criterionId": "91475035554764"}, {"criterionId": "91475035554766"}, {"criterionId": "91475035554767"}, {"criterionId": "91475035554768"}, {"criterionId": "91475035555020"}]}, {"achievementId": "91475320766660", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766833"}]}, {"achievementId": "91475035553919", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554001"}]}, 
{"achievementId": "91475035554103", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554463"}]}, {"achievementId": "91475035554040", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554309"}]}, {"achievementId": "91475035554372", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554967"}]}, {"achievementId": "91475035553883", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553935"}]}, {"achievementId": "91475035554464", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555546"}]}, {"achievementId": "91475035553884", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553936"}]}, {"achievementId": "91475035554769", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555805"}, {"criterionId": "91475035555806"}, {"criterionId": "91475035555807"}]}, {"achievementId": "91475328876509", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475326373507"}]}, {"achievementId": 
"91475035553946", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 15, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554044"}, {"criterionId": "91475035554056"}, {"criterionId": "91475035554042"}, {"criterionId": "91475035554043"}, {"criterionId": "91475035554045"}, {"criterionId": "91475035554046"}, {"criterionId": "91475035554047"}, {"criterionId": "91475035554048"}, {"criterionId": "91475035554049"}, {"criterionId": "91475035554050"}, {"criterionId": "91475035554051"}, {"criterionId": "91475035554052"}, {"criterionId": "91475035554053"}, {"criterionId": "91475035554054"}, {"criterionId": "91475035554055"}]}, {"achievementId": "91475035554094", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554446"}]}, {"achievementId": "91475035553936", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554032"}]}, {"achievementId": "91475035554738", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555755"}]}, {"achievementId": "91475035553952", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554062"}]}, {"achievementId": "91475035554645", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555406"}]}, {"achievementId": "91475035554508", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555157"}]}, {"achievementId": "91475035553856", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553900"}]}, {"achievementId": "91475035554151", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555060"}, {"criterionId": "91475035555061"}]}, {"achievementId": "91475035554134", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554518"}]}, {"achievementId": "91475035554114", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554482"}]}, {"achievementId": "91475035554558", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555236"}]}, {"achievementId": "91475035554373", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554968"}]}, {"achievementId": "91475035554104", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554464"}]}, {"achievementId": "91475320766550", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554569", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555264"}]}, {"achievementId": "91475035554529", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555179"}]}, {"achievementId": "91475320767188", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554285", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554757"}]}, {"achievementId": "91475035553842", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553876"}]}, {"achievementId": "91475035554674", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555513"}, {"criterionId": "91475035555514"}, {"criterionId": "91475035555516"}, {"criterionId": "91475035555515"}]}, {"achievementId": "91475320766564", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554389", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475035554998"}]}, {"achievementId": "91475035554279", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554746"}, {"criterionId": "91475035554747"}, {"criterionId": "91475035554749"}, {"criterionId": "91475035554750"}, {"criterionId": "91475035554743"}, {"criterionId": "91475035554744"}, {"criterionId": "91475035554745"}, {"criterionId": "91475035554748"}, {"criterionId": "91475035554751"}, {"criterionId": "91475035555018"}]}, {"achievementId": "91475035553815", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553822"}]}, {"achievementId": "91475035554141", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554530"}]}, {"achievementId": "91475035553894", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553954"}]}, {"achievementId": "91475320766716", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766927"}]}, {"achievementId": "91475035553920", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554002"}]}, {"achievementId": "91475320768517", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475320767780"}]}, {"achievementId": "91475035553804", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553804"}]}, {"achievementId": "91475035554471", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555125"}]}, {"achievementId": "91475320766578", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553907", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553978"}]}, {"achievementId": "91475320766592", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554472", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555553"}]}, {"achievementId": "91475035553965", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554086"}]}, {"achievementId": "91475320766690", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766883"}]}, {"achievementId": "91475035554041", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554310"}]}, {"achievementId": "91475320766672", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766855"}]}, {"achievementId": "91475320766665", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766842"}, {"criterionId": "91475320766843"}, {"criterionId": "91475320766844"}, {"criterionId": "91475320766845"}, {"criterionId": "91475320766846"}, {"criterionId": "91475320766847"}, {"criterionId": "91475320766848"}]}, {"achievementId": "91475035553978", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554110"}]}, {"achievementId": "91475320766659", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766832"}]}, {"achievementId": "91475331767553", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475331685022"}, {"criterionId": "91475329172734"}, {"criterionId": "91475335963107"}]}, {"achievementId": "91475035554084", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554428"}]}, {"achievementId": "91475035554105", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554465"}]}, {"achievementId": "91475035553843", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553877"}]}, {"achievementId": "91475035554679", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555523"}, {"criterionId": "91475035555524"}, {"criterionId": "91475035555525"}, {"criterionId": "91475035555526"}]}, {"achievementId": "91475035554042", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554311"}]}, {"achievementId": "91475035553959", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 12, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554073"}, {"criterionId": "91475035554076"}, {"criterionId": "91475035554070"}, {"criterionId": "91475035554072"}, {"criterionId": "91475035554074"}, {"criterionId": "91475035554077"}, {"criterionId": "91475035554079"}, {"criterionId": "91475035554080"}, {"criterionId": "91475035554069"}, {"criterionId": "91475035554071"}, {"criterionId": "91475035554075"}, {"criterionId": "91475035554078"}]}, {"achievementId": "91475035554088", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554435"}, {"criterionId": "91475035554436"}, {"criterionId": "91475035554432"}, {"criterionId": "91475035554433"}, {"criterionId": "91475035554434"}, {"criterionId": "91475035554437"}, {"criterionId": 
"91475035554438"}, {"criterionId": "91475035554439"}, {"criterionId": "91475035554440"}, {"criterionId": "91475035555022"}]}, {"achievementId": "91475035553805", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553805"}]}, {"achievementId": "91475320767195", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766717", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766928"}]}, {"achievementId": "91475035553979", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554111"}]}, {"achievementId": "91475035554095", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554447"}]}, {"achievementId": "91475035554739", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555756"}]}, {"achievementId": "91475320767176", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766966"}]}, {"achievementId": "91475320766677", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475320766864"}, {"criterionId": "91475320766865"}, {"criterionId": "91475320766867"}, {"criterionId": "91475320766868"}, {"criterionId": "91475320766869"}, {"criterionId": "91475320766870"}, {"criterionId": "91475320766866"}]}, {"achievementId": "91475320766661", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766834"}]}, {"achievementId": "91475035554474", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555554"}]}, {"achievementId": "91475035554477", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555200"}]}, {"achievementId": "91475035554494", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555603"}]}, {"achievementId": "91475035553895", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553955"}]}, {"achievementId": "91475035553885", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553937"}]}, {"achievementId": "91475035553921", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554003"}]}, {"achievementId": "91475334209573", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554142", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554531"}]}, {"achievementId": "91475035553908", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553979"}]}, {"achievementId": "91475035553966", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554087"}]}, {"achievementId": "91475035554646", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555407", "earned": {"quantity": 4000, "startTime": 0}}]}, {"achievementId": "91475330042677", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475329131883"}]}, {"achievementId": "91475035554344", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554895"}, {"criterionId": "91475035554896"}]}, {"achievementId": "91475035554152", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555082"}]}, {"achievementId": "91475035554768", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555803"}, {"criterionId": "91475035555802"}, {"criterionId": "91475035555804"}]}, {"achievementId": "91475320766601", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 1, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475320766673", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766856"}]}, {"achievementId": "91475035553857", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553901"}]}, {"achievementId": "91475324311264", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320875671"}]}, {"achievementId": "91475035554085", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554429"}]}, {"achievementId": "91475035554286", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554758"}]}, {"achievementId": "91475035553953", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554063"}]}, {"achievementId": "91475035554374", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554969"}]}, {"achievementId": "91475035554115", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554483"}]}, {"achievementId": "91475035553937", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554033"}]}, {"achievementId": "91475035554404", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555013"}]}, {"achievementId": "91475035553954", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554064"}]}, {"achievementId": "91475035553886", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553938"}]}, {"achievementId": "91475320767177", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766967"}]}, {"achievementId": "91475035554347", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554902"}]}, {"achievementId": "91475035554116", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": 
false, "inProgress": false, "criteria": [{"criterionId": "91475035554484"}]}, {"achievementId": "91475035554647", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555408"}]}, {"achievementId": "91475035554684", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555529"}, {"criterionId": "91475035555527"}, {"criterionId": "91475035555528"}, {"criterionId": "91475035555530"}]}, {"achievementId": "91475035554796", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555857"}, {"criterionId": "91475035555863"}, {"criterionId": "91475035555865"}, {"criterionId": "91475035555856"}, {"criterionId": "91475035555858"}, {"criterionId": "91475035555859"}, {"criterionId": "91475035555860"}, {"criterionId": "91475035555861"}, {"criterionId": "91475035555862"}, {"criterionId": "91475035555864"}]}, {"achievementId": "91475035553858", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553902"}]}, {"achievementId": "91475035553909", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553980"}]}, {"achievementId": "91475035554135", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555027"}, {"criterionId": "91475035554519"}, {"criterionId": 
"91475035554520"}, {"criterionId": "91475035554521"}, {"criterionId": "91475035554522"}, {"criterionId": "91475035554523"}, {"criterionId": "91475035554524"}]}, {"achievementId": "91475320766721", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766936"}, {"criterionId": "91475320766937"}, {"criterionId": "91475320766938"}, {"criterionId": "91475320766939"}, {"criterionId": "91475320766940"}, {"criterionId": "91475320766941"}, {"criterionId": "91475320766942"}]}, {"achievementId": "91475035554096", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554448"}]}, {"achievementId": "91475035554287", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554759"}]}, {"achievementId": "91475035553980", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554112"}]}, {"achievementId": "91475035553896", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553956"}]}, {"achievementId": "91475320766674", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766857"}]}, {"achievementId": "91475035553844", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475035553878"}]}, {"achievementId": "91475035554740", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555757"}]}, {"achievementId": "91475320766718", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766929"}]}, {"achievementId": "91475320766662", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766835"}]}, {"achievementId": "91475035554086", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554430"}]}, {"achievementId": "91475035554550", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555229"}]}, {"achievementId": "91475035554106", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554466", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553817", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553824"}]}, {"achievementId": "91475035554143", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475035554532"}]}, {"achievementId": "91475035553967", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554088"}]}, {"achievementId": "91475035554468", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555555"}]}, {"achievementId": "91475035553806", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555779"}]}, {"achievementId": "91475320767194", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553972", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 12, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554094"}, {"criterionId": "91475035554100"}, {"criterionId": "91475035554101"}, {"criterionId": "91475035554102"}, {"criterionId": "91475035554103"}, {"criterionId": "91475035554104"}, {"criterionId": "91475035554093"}, {"criterionId": "91475035554095"}, {"criterionId": "91475035554096"}, {"criterionId": "91475035554097"}, {"criterionId": "91475035554098"}, {"criterionId": "91475035554099"}]}, {"achievementId": "91475035554405", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555014"}]}, {"achievementId": "91475035553922", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475035554004"}]}, {"achievementId": "91475035554375", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554970"}]}, {"achievementId": "91475320835952", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475336676362"}]}, {"achievementId": "91475035554016", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554285"}]}, {"achievementId": "91475035553938", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554034"}]}, {"achievementId": "91475320766719", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766930"}, {"criterionId": "91475320766931"}, {"criterionId": "91475320766932"}]}, {"achievementId": "91475035553981", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554115"}, {"criterionId": "91475035554114"}, {"criterionId": "91475035554118"}, {"criterionId": "91475035554117"}, {"criterionId": "91475035554113"}, {"criterionId": "91475035554116"}]}, {"achievementId": "91475320766517", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766663", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766836"}, {"criterionId": "91475320766837"}, {"criterionId": "91475320766838"}]}, {"achievementId": "91475035554011", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554238"}, {"criterionId": "91475035554239"}, {"criterionId": "91475035554240"}, {"criterionId": "91475035554241"}, {"criterionId": "91475035554242"}]}, {"achievementId": "91475035554461", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555559"}]}, {"achievementId": "91475035554145", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554540"}, {"criterionId": "91475035554541"}, {"criterionId": "91475035555029"}, {"criterionId": "91475035554534"}, {"criterionId": "91475035554535"}, {"criterionId": "91475035554536"}, {"criterionId": "91475035554537"}, {"criterionId": "91475035554538"}, {"criterionId": "91475035554539"}, {"criterionId": "91475035554542"}]}, {"achievementId": "91475035553955", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554065"}]}, {"achievementId": "91475035554107", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554467"}]}, {"achievementId": "91475035553845", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553879"}]}, {"achievementId": "91475035554570", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555265"}]}, {"achievementId": "91475035554097", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554449"}]}, {"achievementId": "91475320766553", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 1, "nextProgressRequiredQuantity": 5}, {"achievementId": "91475035554475", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555556"}]}, {"achievementId": "91475035553897", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553957"}]}, {"achievementId": "91475035554117", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554485"}]}, {"achievementId": "91475035553859", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553903"}]}, {"achievementId": "91475035553807", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553806"}]}, {"achievementId": "91475320766567", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554406", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555015", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554733", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555747"}, {"criterionId": "91475035555749"}, {"criterionId": "91475035555750"}, {"criterionId": "91475035555748"}]}, {"achievementId": "91475035554511", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555162"}]}, {"achievementId": "91475035553910", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553981"}]}, {"achievementId": "91475035554017", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554286"}]}, {"achievementId": "91475035554144", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554533"}]}, {"achievementId": "91475320766675", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766859"}, {"criterionId": "91475320766860"}, {"criterionId": "91475320766858"}]}, {"achievementId": "91475035554087", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554431"}]}, {"achievementId": "91475320766581", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475328335041", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321131527"}]}, {"achievementId": "91475320767178", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766968"}]}, {"achievementId": "91475320766475", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553923", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554005"}]}, {"achievementId": "91475320766697", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766891"}, {"criterionId": "91475320766892"}, {"criterionId": "91475320766893"}, {"criterionId": "91475320766897"}, {"criterionId": "91475320766898"}, {"criterionId": 
"91475320766899"}, {"criterionId": "91475320766901"}, {"criterionId": "91475320766969"}, {"criterionId": "91475320766970"}, {"criterionId": "91475320766971"}]}, {"achievementId": "91475035553968", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554089"}]}, {"achievementId": "91475320767193", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553939", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554035"}]}, {"achievementId": "91475035554288", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554760"}]}, {"achievementId": "91475320766644", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554795", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555853"}, {"criterionId": "91475035555854"}]}, {"achievementId": "91475035554741", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555758"}]}, {"achievementId": "91475035554348", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475035554903"}]}, {"achievementId": "91475330365211", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 8, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475035554376", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554971"}]}, {"achievementId": "91475035554649", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555410"}]}, {"achievementId": "91475035553887", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553939"}]}, {"achievementId": "91475320766595", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554755", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555789"}, {"criterionId": "91475035555786"}, {"criterionId": "91475035555787"}, {"criterionId": "91475035555788"}]}, {"achievementId": "91475035553956", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554066"}]}, {"achievementId": "91475035554481", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475035555129", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553846", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553880"}]}, {"achievementId": "91475035554430", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555094"}]}, {"achievementId": "91475035553860", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553904"}]}, {"achievementId": "91475035554350", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554908"}]}, {"achievementId": "91475035554018", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554287"}]}, {"achievementId": "91475320766664", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766839"}, {"criterionId": "91475320766840"}, {"criterionId": "91475320766841"}]}, {"achievementId": "91475320767192", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553982", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475035554119"}]}, {"achievementId": "91475035553940", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554036"}]}, {"achievementId": "91475035553969", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554090"}]}, {"achievementId": "91475035554340", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554884"}, {"criterionId": "91475035554885"}, {"criterionId": "91475035554883"}]}, {"achievementId": "91475035553924", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554006"}]}, {"achievementId": "91475035554742", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555759"}]}, {"achievementId": "91475035554341", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554886"}, {"criterionId": "91475035554888"}, {"criterionId": "91475035554887"}]}, {"achievementId": "91475329589660", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475336696876"}]}, {"achievementId": "91475035554012", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554244"}, {"criterionId": "91475035554243"}]}, {"achievementId": "91475320766676", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766861"}, {"criterionId": "91475320766862"}, {"criterionId": "91475320766863"}]}, {"achievementId": "91475035554794", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555850"}, {"criterionId": "91475035555851"}, {"criterionId": "91475035555852"}]}, {"achievementId": "91475320766720", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766933"}, {"criterionId": "91475320766934"}, {"criterionId": "91475320766935"}]}, {"achievementId": "91475035554342", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554890"}, {"criterionId": "91475035554891"}, {"criterionId": "91475035554889"}]}, {"achievementId": "91475035554343", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554892"}, {"criterionId": "91475035554894"}]}, {"achievementId": "91475035554366", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554960"}, {"criterionId": "91475035554961"}, {"criterionId": "91475035554962"}]}, {"achievementId": 
"91475035554377", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554972"}]}, {"achievementId": "91475035554476", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555557"}]}, {"achievementId": "91475035554560", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555237"}]}, {"achievementId": "91475320766692", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766885"}]}, {"achievementId": "91475035554650", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555411"}]}, {"achievementId": "91475035554393", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555002"}]}, {"achievementId": "91475035553898", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553958"}]}, {"achievementId": "91475035554345", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554898"}, {"criterionId": "91475035554900"}]}, {"achievementId": "91475035553911", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553982"}]}, {"achievementId": "91475035554367", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554486"}, {"criterionId": "91475035554489"}, {"criterionId": "91475035554487"}, {"criterionId": "91475035554488"}, {"criterionId": "91475035554490"}, {"criterionId": "91475035554491"}, {"criterionId": "91475035554492"}, {"criterionId": "91475035554493"}, {"criterionId": "91475035554494"}, {"criterionId": "91475035555054"}]}, {"achievementId": "91475035554394", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555003"}]}, {"achievementId": "91475334760091", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475328726725"}]}, {"achievementId": "91475035553957", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554067"}]}, {"achievementId": "91475035554776", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555826"}, {"criterionId": "91475035555827"}, {"criterionId": "91475035555829"}, {"criterionId": "91475035555828"}]}, {"achievementId": "91475035553899", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475035553959"}]}, {"achievementId": "91475035553912", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553983"}]}, {"achievementId": "91475035554473", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555550"}]}, {"achievementId": "91475035554019", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554288"}]}, {"achievementId": "91475035553861", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553905"}]}, {"achievementId": "91475035554651", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555412"}]}, {"achievementId": "91475035554743", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555760"}]}, {"achievementId": "91475035553925", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554007"}]}, {"achievementId": "91475035553983", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554120"}]}, {"achievementId": "91475035554153", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555046"}, {"criterionId": "91475035555048"}, {"criterionId": "91475035555049"}, {"criterionId": "91475035555051"}, {"criterionId": "91475035555053"}, {"criterionId": "91475035555047"}, {"criterionId": "91475035555050"}, {"criterionId": "91475035555052"}]}, {"achievementId": "91475035553941", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554037"}]}, {"achievementId": "91475035554495", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555144"}]}, {"achievementId": "91475320767191", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553970", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554091"}]}, {"achievementId": "91475035553820", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553827"}]}, {"achievementId": "91475035554378", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554973", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554653", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555414"}]}, {"achievementId": "91475035554781", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555834"}, {"criterionId": "91475035555835"}, {"criterionId": "91475035555836"}, {"criterionId": "91475035555837"}]}, {"achievementId": "91475035553958", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554068"}]}, {"achievementId": "91475035553848", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553882"}]}, {"achievementId": "91475035554349", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554905"}, {"criterionId": "91475035554907"}, {"criterionId": "91475035554904"}]}, {"achievementId": "91475035553900", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553960"}]}, {"achievementId": "91475035553971", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554092"}]}, {"achievementId": "91475320767190", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": 
"91475320766598", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766692"}, {"criterionId": "91475320766693"}, {"criterionId": "91475320766694"}, {"criterionId": "91475320766696"}, {"criterionId": "91475320766697"}, {"criterionId": "91475320766695"}]}, {"achievementId": "91475035553942", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554038"}]}, {"achievementId": "91475320766584", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766652"}, {"criterionId": "91475320766653"}, {"criterionId": "91475320766654"}, {"criterionId": "91475320766655"}, {"criterionId": "91475320766656"}, {"criterionId": "91475320766657"}]}, {"achievementId": "91475035554020", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554289"}]}, {"achievementId": "91475320766606", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554395", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555004"}]}, {"achievementId": "91475035554744", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555761"}]}, {"achievementId": 
"91475320766570", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766612"}, {"criterionId": "91475320766613"}, {"criterionId": "91475320766614"}, {"criterionId": "91475320766615"}, {"criterionId": "91475320766616"}, {"criterionId": "91475320766617"}]}, {"achievementId": "91475035554498", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555147"}]}, {"achievementId": "91475035553984", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554121"}]}, {"achievementId": "91475035553862", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553906"}]}, {"achievementId": "91475035554379", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554974", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553926", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554008"}]}, {"achievementId": "91475332914972", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332252216"}]}, {"achievementId": "91475320766556", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766569"}, {"criterionId": "91475320766570"}, {"criterionId": "91475320766571"}, {"criterionId": "91475320766572"}, {"criterionId": "91475320766573"}, {"criterionId": "91475320766574"}]}, {"achievementId": "91475035554469", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555551"}]}, {"achievementId": "91475035553913", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553984"}]}, {"achievementId": "91475324342495", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766585", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766658"}, {"criterionId": "91475320766659"}, {"criterionId": "91475320766660"}, {"criterionId": "91475320766661"}, {"criterionId": "91475320766662"}, {"criterionId": "91475320766663"}, {"criterionId": "91475320766664"}, {"criterionId": "91475320766665"}, {"criterionId": "91475320766666"}, {"criterionId": "91475320766667"}]}, {"achievementId": "91475329728508", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475331558989"}]}, {"achievementId": "91475320766599", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475320766698"}, {"criterionId": "91475320766699"}, {"criterionId": "91475320766700"}, {"criterionId": "91475320766701"}, {"criterionId": "91475320766702"}, {"criterionId": "91475320766703"}, {"criterionId": "91475320766704"}, {"criterionId": "91475320766705"}, {"criterionId": "91475320766706"}, {"criterionId": "91475320766707"}]}, {"achievementId": "91475035553863", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553907"}]}, {"achievementId": "91475035553985", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554122"}]}, {"achievementId": "91475035554801", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555870"}, {"criterionId": "91475035555871"}, {"criterionId": "91475035555872"}, {"criterionId": "91475035555873"}]}, {"achievementId": "91475035554029", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554298"}]}, {"achievementId": "91475035554401", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555010"}]}, {"achievementId": "91475035554745", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555762"}]}, {"achievementId": "91475035553943", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554039"}]}, {"achievementId": "91475035554485", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555134"}]}, {"achievementId": "91475035554380", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554975", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554462", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555560"}]}, {"achievementId": "91475035554352", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554915"}, {"criterionId": "91475035554916"}, {"criterionId": "91475035554910"}, {"criterionId": "91475035554911"}, {"criterionId": "91475035554914"}, {"criterionId": "91475035554912"}, {"criterionId": "91475035554913"}]}, {"achievementId": "91475320766648", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553927", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554009"}]}, {"achievementId": "91475320767189", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, 
{"achievementId": "91475035554470", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555552"}]}, {"achievementId": "91475035553822", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553829"}]}, {"achievementId": "91475320766571", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 10, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766618"}, {"criterionId": "91475320766620"}, {"criterionId": "91475320766621"}, {"criterionId": "91475320766622"}, {"criterionId": "91475320766623"}, {"criterionId": "91475320766624"}, {"criterionId": "91475320766625"}, {"criterionId": "91475320766626"}, {"criterionId": "91475320766627"}, {"criterionId": "91475320766619"}]}, {"achievementId": "91475035553864", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553908"}]}, {"achievementId": "91475320767187", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554381", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554976", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554030", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554299"}]}, 
{"achievementId": "91475035553929", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554011"}]}, {"achievementId": "91475035554806", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555881"}, {"criterionId": "91475035555878"}, {"criterionId": "91475035555879"}, {"criterionId": "91475035555880"}]}, {"achievementId": "91475035554353", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554921"}, {"criterionId": "91475035554922"}, {"criterionId": "91475035554918"}, {"criterionId": "91475035554920"}, {"criterionId": "91475035554923"}, {"criterionId": "91475035554917"}, {"criterionId": "91475035554919"}]}, {"achievementId": "91475035554767", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555801"}]}, {"achievementId": "91475035554402", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555011"}]}, {"achievementId": "91475035553944", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554040"}]}, {"achievementId": "91475035554465", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555547"}]}, 
{"achievementId": "91475327115980", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475326583500"}]}, {"achievementId": "91475035553945", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554041"}]}, {"achievementId": "91475035554766", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555800"}]}, {"achievementId": "91475035554414", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555064"}, {"criterionId": "91475035555065"}, {"criterionId": "91475035555066"}, {"criterionId": "91475035555067"}, {"criterionId": "91475035555068"}]}, {"achievementId": "91475035554031", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554300"}]}, {"achievementId": "91475035554403", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555012"}]}, {"achievementId": "91475035554811", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555889"}, {"criterionId": "91475035555887"}, {"criterionId": "91475035555886"}, {"criterionId": "91475035555888"}]}, {"achievementId": "91475035554466", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555548"}]}, {"achievementId": "91475035554382", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554977"}]}, {"achievementId": "91475323715407", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475324697255"}]}, {"achievementId": "91475035553928", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554010"}]}, {"achievementId": "91475320767197", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766985"}]}, {"achievementId": "91475035553865", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553909"}]}, {"achievementId": "91475035554717", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555621"}]}, {"achievementId": "91475035553866", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553910"}]}, {"achievementId": "91475336791664", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": 
false, "inProgress": false, "criteria": [{"criterionId": "91475333467524"}]}, {"achievementId": "91475035554354", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554467", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555549"}]}, {"achievementId": "91475035554446", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555115", "earned": {"quantity": 4952, "startTime": 0}}]}, {"achievementId": "91475035554765", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555799"}]}, {"achievementId": "91475035554032", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554301"}]}, {"achievementId": "91475035553825", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553832"}]}, {"achievementId": "91475035554816", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555894"}, {"criterionId": "91475035555895"}, {"criterionId": "91475035555896"}, {"criterionId": "91475035555897"}]}, {"achievementId": "91475035554396", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, 
"totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555005"}]}, {"achievementId": "91475035554489", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555138"}]}, {"achievementId": "91475320766525", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 3, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475320767199", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766986"}]}, {"achievementId": "91475035554033", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554302"}]}, {"achievementId": "91475035553989", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554126"}]}, {"achievementId": "91475035553826", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553833"}]}, {"achievementId": "91475323539522", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475330200666"}]}, {"achievementId": "91475320766611", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": 
false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554822", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555911"}, {"criterionId": "91475035555913"}, {"criterionId": "91475035555914"}, {"criterionId": "91475035555912"}]}, {"achievementId": "91475035554764", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555798"}]}, {"achievementId": "91475035554457", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555542"}]}, {"achievementId": "91475320766652", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766820"}]}, {"achievementId": "91475335028033", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": [], "nextProgressEarnedQuantity": 6, "nextProgressRequiredQuantity": 10}, {"achievementId": "91475035553867", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553911"}]}, {"achievementId": "91475320766489", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554442", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475035555111", "earned": {"quantity": 94, "startTime": 0}}]}, {"achievementId": "91475035554397", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555006"}]}, {"achievementId": "91475035554827", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555920"}, {"criterionId": "91475035555921"}, {"criterionId": "91475035555922"}, {"criterionId": "91475035555919"}]}, {"achievementId": "91475035553990", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554127"}]}, {"achievementId": "91475320766653", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766821"}]}, {"achievementId": "91475035553868", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553912"}]}, {"achievementId": "91475035554034", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554303"}]}, {"achievementId": "91475321075154", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332629713"}]}, {"achievementId": "91475035553827", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553834"}]}, {"achievementId": "91475035554762", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555796"}]}, {"achievementId": "91475035554398", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555007"}]}, {"achievementId": "91475035554458", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555543"}]}, {"achievementId": "91475035554448", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555117"}]}, {"achievementId": "91475035554460", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555544"}]}, {"achievementId": "91475035554399", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555008"}]}, {"achievementId": "91475035554035", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554304"}]}, {"achievementId": "91475035553869", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": 
false, "inProgress": false, "criteria": [{"criterionId": "91475035553913"}]}, {"achievementId": "91475325260808", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475335519841"}]}, {"achievementId": "91475331841770", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554490", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555139"}]}, {"achievementId": "91475320767207", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766988"}]}, {"achievementId": "91475035554763", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555797", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035553991", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554128"}]}, {"achievementId": "91475035554021", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554290"}]}, {"achievementId": "91475035553992", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554129"}]}, 
{"achievementId": "91475035553870", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553914"}]}, {"achievementId": "91475035554400", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555009"}]}, {"achievementId": "91475035554761", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555795"}]}, {"achievementId": "91475330238649", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475327819862"}]}, {"achievementId": "91475035554491", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555140"}]}, {"achievementId": "91475035553871", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553915"}]}, {"achievementId": "91475035554022", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554291"}]}, {"achievementId": "91475035554760", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555794"}]}, {"achievementId": "91475035553993", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554130"}]}, {"achievementId": "91475035553994", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 7, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554131"}, {"criterionId": "91475035554136"}, {"criterionId": "91475035554132"}, {"criterionId": "91475035554133"}, {"criterionId": "91475035554134"}, {"criterionId": "91475035554135"}, {"criterionId": "91475035554137"}]}, {"achievementId": "91475035553872", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553916"}]}, {"achievementId": "91475320766616", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553831", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553838"}]}, {"achievementId": "91475035554759", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555793"}]}, {"achievementId": "91475035554023", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554292"}]}, {"achievementId": "91475035554024", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475035554293"}]}, {"achievementId": "91475035554362", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553873", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553917"}]}, {"achievementId": "91475035553832", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553839"}]}, {"achievementId": "91475035553995", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554138"}]}, {"achievementId": "91475035554758", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555792", "earned": {"quantity": 0, "startTime": 0}}]}, {"achievementId": "91475035554025", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554294"}]}, {"achievementId": "91475035553996", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554139"}]}, {"achievementId": "91475035553874", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553918"}]}, {"achievementId": "91475035554757", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555791"}]}, {"achievementId": "91475320766533", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320766497", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554756", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555790"}]}, {"achievementId": "91475035553997", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475332369454", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 8, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553875", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553919"}]}, {"achievementId": "91475035553834", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553841"}]}, {"achievementId": "91475035554026", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554295"}]}, {"achievementId": 
"91475334973146", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 14, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554817", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555898"}]}, {"achievementId": "91475035554782", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555838"}]}, {"achievementId": "91475035554027", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554296"}]}, {"achievementId": "91475035553876", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553920"}]}, {"achievementId": "91475035554028", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554297"}]}, {"achievementId": "91475035554160", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035553877", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553921"}]}, {"achievementId": "91475035554788", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475035555844"}]}, {"achievementId": "91475035554789", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555845"}]}, {"achievementId": "91475035553878", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 9, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035553927"}, {"criterionId": "91475035553922"}, {"criterionId": "91475035553924"}, {"criterionId": "91475035553923"}, {"criterionId": "91475035553925"}, {"criterionId": "91475035553926"}, {"criterionId": "91475035553928"}, {"criterionId": "91475035553929"}, {"criterionId": "91475035553930"}]}, {"achievementId": "91475035554784", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555840"}]}, {"achievementId": "91475035554002", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554787", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555843"}]}, {"achievementId": "91475035554783", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555839"}]}, {"achievementId": "91475035554156", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554786", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555842"}]}, {"achievementId": "91475035554791", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555847"}]}, {"achievementId": "91475320766505", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475035554793", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555849"}]}, {"achievementId": "91475035554007", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 29, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554154"}, {"criterionId": "91475035554155"}, {"criterionId": "91475035554156"}, {"criterionId": "91475035554162"}, {"criterionId": "91475035554163"}, {"criterionId": "91475035554165"}, {"criterionId": "91475035554167"}, {"criterionId": "91475035554171"}, {"criterionId": "91475035554176"}, {"criterionId": "91475035554178"}, {"criterionId": "91475035554150"}, {"criterionId": "91475035554151"}, {"criterionId": "91475035554152"}, {"criterionId": "91475035554153"}, {"criterionId": "91475035554157"}, {"criterionId": "91475035554158"}, {"criterionId": "91475035554159"}, {"criterionId": "91475035554160"}, {"criterionId": "91475035554161"}, {"criterionId": "91475035554164"}, {"criterionId": "91475035554166"}, {"criterionId": "91475035554168"}, {"criterionId": "91475035554169"}, {"criterionId": "91475035554170"}, {"criterionId": "91475035554172"}, {"criterionId": "91475035554173"}, 
{"criterionId": "91475035554174"}, {"criterionId": "91475035554175"}, {"criterionId": "91475035554177"}]}, {"achievementId": "91475035554785", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555841"}]}, {"achievementId": "91475035554008", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 29, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554179"}, {"criterionId": "91475035554186"}, {"criterionId": "91475035554194"}, {"criterionId": "91475035554196"}, {"criterionId": "91475035554199"}, {"criterionId": "91475035554200"}, {"criterionId": "91475035554202"}, {"criterionId": "91475035554206"}, {"criterionId": "91475035554180"}, {"criterionId": "91475035554181"}, {"criterionId": "91475035554182"}, {"criterionId": "91475035554183"}, {"criterionId": "91475035554184"}, {"criterionId": "91475035554185"}, {"criterionId": "91475035554187"}, {"criterionId": "91475035554188"}, {"criterionId": "91475035554189"}, {"criterionId": "91475035554190"}, {"criterionId": "91475035554191"}, {"criterionId": "91475035554192"}, {"criterionId": "91475035554193"}, {"criterionId": "91475035554195"}, {"criterionId": "91475035554197"}, {"criterionId": "91475035554198"}, {"criterionId": "91475035554201"}, {"criterionId": "91475035554203"}, {"criterionId": "91475035554204"}, {"criterionId": "91475035554205"}, {"criterionId": "91475035554207"}]}, {"achievementId": "91475035554790", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555846"}]}, {"achievementId": "91475035554009", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 29, "isComplete": false, "inProgress": false, 
"criteria": [{"criterionId": "91475035554209"}, {"criterionId": "91475035554229"}, {"criterionId": "91475035554208"}, {"criterionId": "91475035554212"}, {"criterionId": "91475035554214"}, {"criterionId": "91475035554215"}, {"criterionId": "91475035554216"}, {"criterionId": "91475035554217"}, {"criterionId": "91475035554218"}, {"criterionId": "91475035554222"}, {"criterionId": "91475035554223"}, {"criterionId": "91475035554224"}, {"criterionId": "91475035554228"}, {"criterionId": "91475035554230"}, {"criterionId": "91475035554232"}, {"criterionId": "91475035554233"}, {"criterionId": "91475035554235"}, {"criterionId": "91475035554210"}, {"criterionId": "91475035554211"}, {"criterionId": "91475035554213"}, {"criterionId": "91475035554219"}, {"criterionId": "91475035554220"}, {"criterionId": "91475035554221"}, {"criterionId": "91475035554225"}, {"criterionId": "91475035554226"}, {"criterionId": "91475035554227"}, {"criterionId": "91475035554231"}, {"criterionId": "91475035554234"}, {"criterionId": "91475035554236"}]}, {"achievementId": "91475035554792", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035555848"}]}, {"achievementId": "91475035554010", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475035554237"}]}, {"achievementId": "91475320767238", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767590", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767464"}]}, {"achievementId": "91475320767599", 
"completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767591", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767592", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767593", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767594", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767595", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767596", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767597", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767598", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": []}, {"achievementId": "91475320767600", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": 
false, "criteria": [{"criterionId": "91475330085730"}, {"criterionId": "91475320767465"}]}, {"achievementId": "91475320767608", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767473"}]}, {"achievementId": "91475320767775", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767696"}]}, {"achievementId": "91475320767612", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767477"}]}, {"achievementId": "91475320768503", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767766"}]}, {"achievementId": "91475320768505", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767768"}]}, {"achievementId": "91475320768507", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767770"}]}, {"achievementId": "91475320768584", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767909"}]}, {"achievementId": "91475320768585", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475320767908"}]}, {"achievementId": "91475333429423", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475327764279"}]}, {"achievementId": "91475320768589", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767914"}]}, {"achievementId": "91475320768590", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767915"}]}, {"achievementId": "91475320768606", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320934192"}, {"criterionId": "91475320767931"}]}, {"achievementId": "91475320768608", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767933"}]}, {"achievementId": "91475320768609", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767934"}]}, {"achievementId": "91475320768610", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767951"}]}, {"achievementId": "91475320768612", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320767953"}]}, {"achievementId": 
"91475330083013", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475322667243"}]}, {"achievementId": "91475336568755", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475322460188"}]}, {"achievementId": "91475327507083", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475334317622"}]}, {"achievementId": "91475321472496", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321852452"}]}, {"achievementId": "91475337217274", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321230791"}]}, {"achievementId": "91475336485884", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332449926"}]}, {"achievementId": "91475333723367", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475323916545"}]}, {"achievementId": "91475322499604", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475334593083"}]}, {"achievementId": "91475324502671", "completionDate": -9223372036854776000, 
"numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475328822040"}]}, {"achievementId": "91475322743276", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475331640773"}]}, {"achievementId": "91475321520575", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332788950"}]}, {"achievementId": "91475331871854", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332815545"}]}, {"achievementId": "91475324090085", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332539062"}]}, {"achievementId": "91475324794017", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475336904908"}]}, {"achievementId": "91475328620915", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475330155302"}]}, {"achievementId": "91475328698935", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475333975884"}]}, {"achievementId": "91475335163351", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": 
false, "inProgress": false, "criteria": [{"criterionId": "91475327717946"}]}, {"achievementId": "91475327980608", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475326102584"}]}, {"achievementId": "91475330024044", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332066581"}]}, {"achievementId": "91475325099068", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320941638"}]}, {"achievementId": "91475335069420", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475330337505"}]}, {"achievementId": "91475330109208", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475329562213"}]}, {"achievementId": "91475321411323", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475322985702"}]}, {"achievementId": "91475334683502", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475330133682"}, {"criterionId": "91475336958839"}, {"criterionId": "91475327630169"}]}, {"achievementId": "91475337310687", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 2, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475332647033"}, {"criterionId": "91475325421257"}]}, {"achievementId": "91475335847082", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321276162"}]}, {"achievementId": "91475323125219", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475324655303"}]}, {"achievementId": "91475324281736", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475333930495"}]}, {"achievementId": "91475337287182", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475326919831"}]}, {"achievementId": "91475324863824", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475326970843"}]}, {"achievementId": "91475334615466", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475336863381"}]}, {"achievementId": "91475322057131", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475326025377"}]}, {"achievementId": "91475335445758", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": 
"91475333058793"}]}, {"achievementId": "91475320959067", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321088587"}]}, {"achievementId": "91475334807273", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475323865080"}]}, {"achievementId": "91475326724670", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475323072027"}]}, {"achievementId": "91475321797977", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320932375"}]}, {"achievementId": "91475321962218", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475335670099"}]}, {"achievementId": "91475334676766", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475328683669"}]}, {"achievementId": "91475330753538", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475329402606"}]}, {"achievementId": "91475329881936", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475324770594"}]}, {"achievementId": "91475323453060", "completionDate": 
-9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475321545518"}]}, {"achievementId": "91475334044016", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475328500397"}]}, {"achievementId": "91475333836328", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475323841605"}]}, {"achievementId": "91475331645958", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475333520697"}]}, {"achievementId": "91475320766508", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766531"}, {"criterionId": "91475320766532"}, {"criterionId": "91475320766533"}, {"criterionId": "91475320766534"}]}, {"achievementId": "91475320766544", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766554"}, {"criterionId": "91475320766555"}, {"criterionId": "91475320766556"}, {"criterionId": "91475320766557"}]}, {"achievementId": "91475329565821", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475329895214"}]}, {"achievementId": "91475320766738", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, 
"inProgress": false, "criteria": [{"criterionId": "91475320766960"}, {"criterionId": "91475320766962"}, {"criterionId": "91475320766961"}]}, {"achievementId": "91475320766558", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766592"}, {"criterionId": "91475320766595"}, {"criterionId": "91475320766598"}, {"criterionId": "91475320766599"}, {"criterionId": "91475320766600"}]}, {"achievementId": "91475320766572", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766632"}, {"criterionId": "91475320766635"}, {"criterionId": "91475320766638"}, {"criterionId": "91475320766639"}, {"criterionId": "91475320766640"}]}, {"achievementId": "91475320766586", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766672"}, {"criterionId": "91475320766675"}, {"criterionId": "91475320766678"}, {"criterionId": "91475320766679"}, {"criterionId": "91475320766680"}]}, {"achievementId": "91475320766600", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 5, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766712"}, {"criterionId": "91475320766715"}, {"criterionId": "91475320766718"}, {"criterionId": "91475320766719"}, {"criterionId": "91475320766720"}]}, {"achievementId": "91475320766621", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320766750"}, {"criterionId": "91475320766755"}, {"criterionId": "91475320766760"}, {"criterionId": 
"91475320766745"}]}, {"achievementId": "91475321367688", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475331515574"}]}, {"achievementId": "91475332319624", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475332002221"}]}, {"achievementId": "91475327390206", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475328573607"}]}, {"achievementId": "91475327680984", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475330921680"}, {"criterionId": "91475327391102"}, {"criterionId": "91475328818156"}]}, {"achievementId": "91475330714048", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475325519958"}]}, {"achievementId": "91475320999251", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475335166108"}, {"criterionId": "91475337158462"}, {"criterionId": "91475327188041"}]}, {"achievementId": "91475325132337", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475324951483"}]}, {"achievementId": "91475322935195", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 3, "isComplete": 
false, "inProgress": false, "criteria": [{"criterionId": "91475329357019"}, {"criterionId": "91475321034385"}, {"criterionId": "91475322930353"}]}, {"achievementId": "91475329549917", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 1, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475323675964"}]}, {"achievementId": "91475325460243", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 6, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475322712268"}, {"criterionId": "91475321344226"}, {"criterionId": "91475333911648"}, {"criterionId": "91475332945212"}, {"criterionId": "91475320853931"}, {"criterionId": "91475323541417"}]}, {"achievementId": "91475334536921", "completionDate": -9223372036854776000, "numCompletedAchievementsInSeries": 0, "totalAchievementsInSeries": 4, "isComplete": false, "inProgress": false, "criteria": [{"criterionId": "91475320961036"}, {"criterionId": "91475336460677"}, {"criterionId": "91475327957908"}, {"criterionId": "91475330009323"}]}]}"""
)
# Mock of the Battle.net WoW community API character-list endpoint (static test fixture).
async def get_wow_character_data(self):
    return json.loads(
"""{"characters": [{"name": "Thoryxus", "realm": "Doomhammer", "battlegroup": "Embuscade / Hinterhalt", "class": 1, "race": 6, "gender": 0, "level": 2, "achievementPoints": 0, "thumbnail": "doomhammer/40/117791784-avatar.jpg", "lastModified": 0}, {"name": "Thoryx", "realm": "Doomhammer", "battlegroup": "Embuscade / Hinterhalt", "class": 9, "race": 10, "gender": 0, "level": 75, "achievementPoints": 535, "thumbnail": "doomhammer/88/117456216-avatar.jpg", "spec": {"name": "Destruction", "role": "DPS", "backgroundImage": "bg-warlock-destruction", "icon": "spell_shadow_rainoffire", "description": "A master of chaos who calls down fire to burn and demolish enemies. Preferred Weapon: Staff, Wand, Dagger, Sword", "order": 2}, "guild": "Killswitch", "guildRealm": "Doomhammer", "lastModified": 1532872734000}]}"""
)
# Mock of the per-character achievements endpoint; only the fixture character
# "Thoryx" has a canned payload (realm is accepted but unused by the mock).
async def get_wow_character_achievements(self, realm, character_name):
    if character_name == "Thoryx":
        return json.loads(
"""{"lastModified": 1532872734000, "name": "Thoryx", "realm": "Doomhammer", "battlegroup": "Embuscade / Hinterhalt", "class": 9, "race": 10, "gender": 0, "level": 75, "achievementPoints": 705, "thumbnail": "doomhammer/88/117456216-avatar.jpg", "calcClass": "V", "faction": 1, "achievements": {"achievementsCompleted": [6, 7, 8, 9, 10, 11, 12, 116, 477, 478, 480, 481, 483, 484, 487, 503, 504, 505, 506, 507, 522, 547, 621, 641, 647, 648, 649, 650, 652, 653, 655, 658, 659, 666, 731, 766, 775, 782, 852, 858, 889, 890, 891, 964, 1017, 1176, 1263, 1265, 1356, 4516, 4892, 4893, 4894, 4895, 4896, 4897, 4900, 4901, 4904, 4908, 4909, 4910, 4927, 4956, 4957, 5448, 5794, 8345, 10561, 10657, 10689], "achievementsCompletedTimestamp": [1531334880000, 1531523280000, 1531600860000, 1531678140000, 1531935600000, 1532170980000, 1532730900000, 1531660620000, 1532256540000, 1532374740000, 1532782140000, 1532784120000, 1532855520000, 1532787480000, 1532780580000, 1531339440000, 1531524660000, 1531659780000, 1531934040000, 1532783760000, 1531562760000, 1532790180000, 1532790860000, 1532124720000, 1532872440000, 1532857020000, 1532778540000, 1532283960000, 1532789700000, 1532371800000, 1532805960000, 1532804880000, 1532810280000, 1532369760000, 1532103960000, 1532116980000, 1531944240000, 1532103720000, 1532179320000, 1531559100000, 1531683300000, 1532171760000, 1531523280000, 1532032020000, 1531165152000, 1532283300000, 1532276340000, 1532787780000, 1532288700000, 1532811360000, 1531849980000, 1531668060000, 1531687740000, 1531597320000, 1531608360000, 1531657140000, 1531857300000, 1532026800000, 1532103900000, 1531562760000, 1532118780000, 1531937760000, 1532181000000, 1532124780000, 1532374800000, 1532178960000, 1531936800000, 1531045142000, 1532028300000, 1531523280000, 1532727543000], "criteria": [72, 111, 149, 151, 162, 167, 169, 286, 653, 657, 753, 757, 832, 962, 963, 964, 965, 966, 967, 968, 970, 971, 972, 973, 974, 975, 976, 982, 1014, 1016, 1017, 1018, 1020, 1021, 1023, 1028, 
1029, 1030, 1031, 1032, 1033, 1034, 1035, 1036, 1041, 1042, 1043, 1045, 1053, 1054, 1055, 1056, 1057, 1058, 1060, 1061, 1062, 1063, 1064, 1065, 1070, 1071, 1072, 1073, 1074, 1075, 1076, 1077, 1078, 1079, 1080, 1081, 1082, 1083, 1084, 1085, 1086, 1087, 1088, 1089, 1090, 1091, 1092, 1093, 1094, 1095, 1096, 1097, 1098, 1099, 1100, 1101, 1102, 1103, 1104, 1105, 1106, 1108, 1109, 1110, 1111, 1112, 1113, 1114, 1115, 1117, 1118, 1119, 1120, 1121, 1122, 1123, 1124, 1125, 1126, 1127, 1128, 1129, 1130, 1132, 1134, 1135, 1136, 1137, 1138, 1139, 1140, 1141, 1142, 1143, 1144, 1145, 1158, 1174, 1176, 1185, 1186, 1187, 1188, 1189, 1190, 1191, 1192, 1193, 1194, 1195, 1224, 1225, 1226, 1227, 1228, 1229, 1230, 1231, 1232, 1233, 1268, 1269, 1271, 1272, 1273, 1274, 1275, 1280, 1281, 1294, 1409, 1410, 1411, 1412, 1429, 1430, 1432, 1433, 1434, 1435, 1436, 1438, 1439, 1440, 1441, 1442, 1443, 1498, 1504, 1505, 1506, 1507, 1508, 1509, 1510, 1511, 1512, 1513, 1514, 1515, 1516, 1517, 1518, 1519, 1520, 1521, 1524, 1525, 1526, 1527, 1528, 1529, 1530, 1531, 1532, 1536, 1537, 1538, 1539, 1540, 1541, 1542, 1543, 1544, 1545, 1546, 1548, 1551, 1872, 2002, 2020, 2044, 2045, 2048, 2072, 2422, 3239, 3240, 3242, 3243, 3244, 3245, 3247, 3250, 3251, 3631, 3639, 3643, 3647, 3652, 3656, 3667, 3679, 3905, 3906, 3924, 3952, 3964, 3965, 3966, 3967, 3969, 3970, 3973, 3976, 3977, 3980, 4040, 4041, 4043, 4044, 4046, 4047, 4050, 4092, 4093, 4136, 4137, 4138, 4139, 4140, 4141, 4142, 4143, 4144, 4146, 4147, 4148, 4149, 4150, 4151, 4152, 4153, 4154, 4155, 4157, 4158, 4159, 4160, 4162, 4163, 4164, 4165, 4166, 4167, 4168, 4170, 4171, 4172, 4173, 4174, 4175, 4176, 4177, 4178, 4186, 4187, 4192, 4193, 4194, 4196, 4197, 4198, 4199, 4201, 4202, 4203, 4204, 4293, 4294, 4295, 4297, 4299, 4472, 4705, 4731, 4737, 4738, 4741, 4742, 4743, 4744, 4745, 4746, 4751, 4757, 4759, 4763, 4764, 4765, 4774, 4787, 4788, 4943, 4944, 4946, 4948, 4949, 4950, 4951, 4952, 4953, 4954, 4955, 4956, 4957, 4958, 4971, 4984, 4987, 5008, 5049, 5212, 
5217, 5220, 5230, 5289, 5293, 5295, 5296, 5297, 5299, 5300, 5301, 5305, 5313, 5314, 5315, 5316, 5317, 5322, 5323, 5327, 5371, 5372, 5373, 5374, 5375, 5376, 5377, 5378, 5379, 5380, 5381, 5382, 5383, 5384, 5436, 5437, 5438, 5439, 5440, 5441, 5442, 5447, 5448, 5449, 5450, 5451, 5459, 5460, 5461, 5462, 5512, 5529, 5530, 5531, 5532, 5559, 5561, 5567, 5573, 5575, 5696, 5701, 5860, 6140, 6142, 6554, 6568, 6847, 6891, 6928, 7040, 7047, 7114, 7221, 7230, 7314, 7855, 8799, 8801, 8819, 8820, 8821, 8822, 8824, 8996, 9358, 9359, 9361, 9362, 9365, 9366, 9368, 9369, 9370, 9371, 9372, 9818, 10519, 12183, 12678, 12680, 12684, 12685, 12687, 12689, 13166, 13169, 13315, 13316, 13757, 13812, 13872, 13874, 13878, 13922, 14030, 14046, 14154, 14159, 14160, 14161, 14163, 14164, 14165, 14166, 14167, 14168, 14169, 14170, 14281, 14282, 14500, 14590, 14591, 14593, 14624, 15106, 15108, 15109, 15110, 15111, 15112, 15113, 15114, 15115, 15116, 15117, 15118, 15119, 15120, 15121, 15204, 15205, 15206, 15207, 15209, 15210, 15211, 15212, 15214, 15215, 15216, 15217, 15232, 15475, 15624, 15625, 15626, 15627, 15628, 15629, 15630, 15889, 15908, 15919, 15946, 16092, 16093, 16170, 16206, 16207, 16555, 16557, 16561, 16566, 16571, 16578, 16584, 16585, 16586, 16589, 16590, 16591, 16592, 16594, 16597, 16598, 16599, 16825, 17398, 17909, 17910, 17914, 17915, 17920, 17921, 17923, 17924, 17927, 17928, 17931, 17932, 17934, 17936, 17938, 17939, 17940, 17942, 17944, 17945, 17947, 17948, 17950, 17952, 17953, 17954, 17957, 17958, 17959, 17963, 17965, 17967, 17969, 18533, 18536, 19395, 19423, 19426, 19438, 19442, 19451, 19480, 19598, 19734, 19735, 20418, 20466, 20677, 21256, 22927, 23250, 23395, 23418, 23419, 23420, 23444, 23445, 23446, 23447, 23448, 23449, 23450, 23451, 23452, 23453, 23454, 23455, 23456, 23457, 23458, 23459, 23460, 23461, 23462, 23463, 23466, 23469, 23470, 23472, 23473, 23475, 23478, 23479, 23480, 23481, 23482, 23483, 23484, 23485, 23486, 23487, 23488, 23489, 23490, 23491, 23492, 23493, 23494, 23495, 
23496, 23497, 23498, 23499, 23500, 25827, 25830, 26801, 28824, 28825, 28826, 28827, 28828, 28829, 28830, 28831, 28832, 28833, 28834, 28835, 28836, 28837, 28838, 28839, 28841, 28842, 28843, 28845, 28846, 28847, 28848, 28849, 28850, 28851, 28852, 28853, 28854, 28855, 28858, 28859, 28978, 29762, 30402, 30478, 30479, 30565, 30566, 30569, 30570, 30571, 30572, 30573, 30574, 30575, 30576, 30577, 30578, 30579, 30584, 35977, 37604, 38257, 38261, 38264, 38266, 38267, 38270, 38271, 38272, 38273, 38274, 38277, 38278, 38280, 38281, 38283, 38284, 38285, 38286, 38287, 38288, 38289, 38291, 38293, 38295, 38297, 38309, 38310, 38311, 38312, 38313, 38314, 38315, 38317, 38318, 38319, 38320, 38321, 38322, 38323, 38334, 38336, 38338, 38339, 38340, 38341, 38352, 38354, 38356, 38357, 38358, 38361, 38362, 38363, 38364, 38365, 38366, 38370, 38371, 38372, 38373, 38374, 38375, 38376, 38377, 38378, 38379, 38380, 38381, 38382, 38383, 38384, 38385, 38386, 38387, 38388, 38389, 38390, 38391, 38392, 38393, 38394, 38395, 38396, 38397, 38398, 38437, 38438, 38439, 38440, 38441, 38442, 38443, 38444, 38445, 38446, 38447, 38448, 38449, 38450, 38451, 38452, 38453, 38454, 38455, 38456, 38457, 38717, 38718, 38719, 38720, 38721, 38722, 38723, 38724, 38725, 38979, 38980, 38981, 38989, 38993, 38994, 38995, 38996, 38997, 38998, 38999, 39000, 39001, 39002, 39003, 39004, 39005, 39010, 39011, 39012, 39013, 39018, 39019, 39020, 39021, 39022, 39023, 39024, 39083, 39084, 39085, 39086, 39087, 39088, 40057, 40059, 40079, 40081, 40126, 40811, 40813, 41655, 44103, 44104, 44105, 44110], "criteriaQuantity": [2, 61, 3, 1, 167552591, 1, 12, 1, 1, 1, 1, 375, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 1, 21000, 1, 2920, 45, 18246, 23476, 1, 2, 1, 2, 1, 3, 1, 1, 1, 1061, 2, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 12477830, 3242287, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 1, 2, 2, 1, 1, 1, 20341, 13260, 11108, 4556, 1820, 17300, 18758, 17327, 4461, 4388, 4124, 3500, 3852, 4501, 1, 4, 2, 437243, 9617, 6085, 1578, 117, 357, 750, 46, 4247, 49, 1670, 189, 5, 555, 1, 2, 30, 7, 1, 75, 75, 2, 75, 13, 1, 1, 1, 1, 9, 17, 39, 50, 9599, 22979, 8999, 42000, 24565, 3051, 42000, 11745, 6575, 1487429, 131323884, 3652, 658771, 4476, 1, 1, 1, 1, 3, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3727, 6983, 448, 798, 1410, 3, 3, 3, 260, 190, 1, 1, 1, 1, 1, 2, 2, 12, 4, 4, 2, 1, 8, 4, 66, 2, 1, 1, 1, 2360, 1722, 1722, 1722, 190, 1, 8, 6, 21, 4, 22, 2, 84, 27, 30, 40, 94, 10, 11, 0, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 206001, 1, 260, 190, 25, 3, 75, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1680, 4, 555, 1, 1, 1, 75, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1066, 200, 1, 1, 1, 1, 1, 1, 1, 36, 6, 30400, 1, 1, 0, 4, 0, 5, 3, 3, 3, 2, 3, 1, 3, 3, 2, 3, 4, 4, 3, 4, 3, 1, 21471, 42000, 45, 21471, 9599, 2360, 22979, 1722, 24565, 1722, 8999, 1722, 42000, 11108, 1820, 20341, 4556, 4388, 4124, 3852, 4461, 3051, 3500, 4501, 21000, 18758, 23476, 13260, 17327, 17300, 11745, 190, 30400, 42000, 2920, 1, 1, 3000, 1, 2, 2, 3, 9, 8999, 3, 3000, 8999, 1, 0, 75, 10, 3000, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 3000, 
3000, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3000, 1, 1, 1, 1, 20, 22, 25, 25, 22, 59, 27, 96, 19, 2, 2, 21, 28, 1, 3000, 3000, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 260, 190, 14, 5, 1, 1, 1, 3000, 2000, 2000, 3571, 2800], "criteriaTimestamp": [1532781139000, 1532857731000, 1532857731000, 1531938025000, 1532872591000, 1532870859000, 1532804599000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532780918000, 1532870859000, 1532857058000, 1532803705000, 1532789715000, 1532783243000, 1532369760000, 1532859182000, 1532806005000, 1532804884000, 1532810297000, 1532872635000, 1532785245000, 1532782184000, 1532784133000, 1532855541000, 1532787486000, 1532780622000, 1532256565000, 1532870859000, 1532870859000, 1532779303000, 1532870859000, 1532872483000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532872635000, 1532872561000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531659032000, 1531768854000, 1531581783000, 1532270068000, 1531593058000, 1532870859000, 1531167835000, 1531687648000, 1532871526000, 1532870859000, 1532870859000, 1532870859000, 1532871421000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532811099000, 1532870859000, 1532870859000, 1532856037000, 1532872591000, 
1532871899000, 1532872393000, 1532872483000, 1532872591000, 1532871397000, 1532803705000, 1532872581000, 1532809218000, 1532871899000, 1532812429000, 1532803020000, 1532871373000, 1532804652000, 1532781139000, 1532871903000, 1532104055000, 1532870859000, 1532870859000, 1532870859000, 1532102688000, 1532870859000, 1532855765000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532808606000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532872632000, 1532870859000, 1532870859000, 1531854585000, 1532871265000, 1531856634000, 1532811075000, 1532872637000, 1532781030000, 1532255799000, 1532256225000, 1532256225000, 1532256565000, 1532784497000, 1532784766000, 1532785068000, 1532785245000, 1532781805000, 1532781958000, 1532782184000, 1532783763000, 1532783543000, 1532783973000, 1532784133000, 1532855541000, 1532786864000, 1532787247000, 1532787062000, 1532787486000, 1532779777000, 1532780555000, 1532780235000, 1532780622000, 1532118688000, 1532872591000, 1532180763000, 1531562633000, 1532871899000, 1532870859000, 1532870859000, 1531336885000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532811096000, 1532811096000, 1531523127000, 1532178072000, 1532804599000, 1531602428000, 1532368013000, 1531583443000, 1531763833000, 1532125705000, 1532870859000, 1532870859000, 1531599124000, 1532803732000, 1532855054000, 1532855345000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531559750000, 1532858617000, 1532784360000, 1532118217000, 1532278259000, 1532783054000, 1532034473000, 1532859081000, 1532858892000, 1532116898000, 1532871373000, 1532870960000, 1532275322000, 1532870859000, 1531165014000, 1532256565000, 1532780622000, 1532782184000, 1532784133000, 1532787486000, 1532785245000, 1532811083000, 1532811388000, 1532811083000, 1532811388000, 1532871482000, 1532870859000, 1532870859000, 1532870859000, 1531596459000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532730598000, 1531773945000, 1532871373000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532872635000, 1531850244000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531517293000, 1532116262000, 1532803463000, 1532870859000, 1532870859000, 1532870859000, 1531851131000, 1532179169000, 1532179301000, 1532870859000, 1532870859000, 1532870859000, 1532872635000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532871421000, 1532871526000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532872632000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532124744000, 1532872483000, 1532870859000, 1532167441000, 1532164065000, 1532103158000, 1532116692000, 1532274444000, 1532870859000, 1532032426000, 1532870859000, 1532870859000, 1532255640000, 1532449189000, 1532870859000, 
1531165152000, 1532870859000, 1531051101000, 1532870859000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532870859000, 1532870859000, 1532180953000, 1531645144000, 1531610062000, 1531587417000, 1531669041000, 1531338430000, 1531521098000, 1531515737000, 1531513091000, 1531343908000, 1531512510000, 1531519829000, 1531687781000, 1531773304000, 1531937785000, 1531513779000, 1531560556000, 1531658064000, 1532034200000, 1532255241000, 1531668085000, 1531592832000, 1532780004000, 1531687779000, 1532128714000, 1532124441000, 1531687782000, 1531590046000, 1532166708000, 1531860250000, 1531595190000, 1531521101000, 1531562779000, 1532870859000, 1532870859000, 1531523283000, 1531051101000, 1531523283000, 1532859896000, 1532812742000, 1532857101000, 1532854502000, 1532857675000, 1532857115000, 1532787603000, 1532812748000, 1532786058000, 1532790860000, 1531669041000, 1532872633000, 1532811096000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1550483988000, 1550483988000, 1550483988000, 1550483988000], "criteriaCreated": [1532368677000, 1531165014000, 1531578712000, 1531938025000, 1531051269000, 1532870859000, 1531861254000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532032051000, 1532870859000, 1532857058000, 1532778548000, 1532789715000, 1532284019000, 1532369760000, 1532371807000, 1532806005000, 1532804884000, 1532810297000, 1532870859000, 1532374780000, 1532782184000, 1532784133000, 1532855541000, 1532787486000, 1532780622000, 1532256565000, 1532870859000, 1532870859000, 1532779303000, 1532870859000, 1532872483000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531051415000, 1531052874000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531659032000, 1531587468000, 1531581783000, 1532178057000, 1531587407000, 1532870859000, 1531167835000, 1531687648000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532811099000, 1532870859000, 1532870859000, 1531341228000, 1531051277000, 1531051277000, 1531051277000, 1532026324000, 1531851279000, 1531052034000, 1531608102000, 1531052872000, 1531861609000, 1531334396000, 1531586897000, 1532777986000, 1531511901000, 1532804652000, 1532368677000, 1531859560000, 1531052894000, 1532870859000, 1532870859000, 1532870859000, 1532102402000, 1532870859000, 1531596519000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531166616000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531051270000, 
1531051270000, 1531051269000, 1531341228000, 1531341228000, 1531341228000, 1532255799000, 1532256225000, 1532256225000, 1532256565000, 1532280964000, 1532374003000, 1532374494000, 1532374780000, 1532781805000, 1532781958000, 1532782184000, 1532783763000, 1532783543000, 1532783973000, 1532784133000, 1532855541000, 1532786864000, 1532787247000, 1532787062000, 1532787486000, 1532779777000, 1532780555000, 1532780235000, 1532780622000, 1531580363000, 1531051277000, 1532122268000, 1531051277000, 1532183081000, 1532870859000, 1532870859000, 1531336750000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532811096000, 1532811096000, 1531519882000, 1532164914000, 1531861254000, 1531581788000, 1532254098000, 1531341091000, 1531763833000, 1531646091000, 1532870859000, 1532870859000, 1531594200000, 1532803732000, 1532855054000, 1532855345000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531559750000, 1532283165000, 1532280215000, 1531930048000, 1532184709000, 1531515984000, 1532031455000, 1531512319000, 1532178032000, 1531688310000, 1532123322000, 1531517792000, 1531679909000, 1532870859000, 1531165014000, 1532256565000, 1532780622000, 1532782184000, 1532784133000, 1532787486000, 1532374780000, 1532811083000, 1532811388000, 1532811083000, 1532811388000, 1531602608000, 1532870859000, 1532870859000, 1532870859000, 1531596459000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531339345000, 1531168048000, 1531511901000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531051415000, 1531563842000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1531517293000, 1532029780000, 1532283554000, 1532870859000, 1532870859000, 1532870859000, 1531851101000, 1532179160000, 1532179263000, 1532870859000, 1532870859000, 1532870859000, 1532872629000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532124744000, 1532872483000, 1532870859000, 1532167441000, 1532163431000, 1532103074000, 1531939498000, 1531518152000, 1532870859000, 1531165152000, 1532870859000, 1532870859000, 1532255640000, 1532449189000, 1532870859000, 1531165152000, 1532870859000, 1531051101000, 1532870859000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 
1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532449189000, 1532870859000, 1532870859000, 1532180953000, 1531645144000, 1531610062000, 1531587417000, 1531669041000, 1531338430000, 1531521098000, 1531515737000, 1531513091000, 1531343908000, 1531512510000, 1531519829000, 1531687781000, 1531773304000, 1531937785000, 1531513779000, 1531560556000, 1531658064000, 1532034200000, 1532255241000, 1531668085000, 1531592832000, 1532780004000, 1531687779000, 1532128714000, 1532124441000, 1531687782000, 1531590046000, 1532166708000, 1531860250000, 1531595190000, 1531521101000, 1531562779000, 1532870859000, 1532870859000, 1531523283000, 1531051101000, 1531523283000, 1531051101000, 1531338424000, 1531515736000, 1531513089000, 1531051101000, 1531051101000, 1531343900000, 1532727543000, 1532730313000, 1531051101000, 1531051101000, 1531051101000, 1531511574000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 
1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1532870859000, 1550483988000, 1550483988000, 1550483988000, 1550483988000]}, "totalHonorableKills": 0}"""
)
elif character_name == "Thoryxus":
    raise ClientResponseError(status=404, message="Not Found")
else:
    raise ClientResponseError(status=404, message="Not Found")
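The fixture above returns a canned JSON payload for known character names and raises a 404 error for unknown ones. A minimal self-contained sketch of that pattern follows; note that `ClientResponseError` here is a stand-in class, not aiohttp's real one (aiohttp's constructor additionally requires `request_info` and `history` positional arguments, so the two-keyword call style seen above would not construct the real exception unchanged), and the `"Arthas"` entry is purely illustrative:

```python
# Stand-in exception; aiohttp's real ClientResponseError also requires
# request_info and history arguments, which this sketch omits.
class ClientResponseError(Exception):
    def __init__(self, status, message):
        super().__init__(message)
        self.status = status
        self.message = message


# Hypothetical canned responses keyed by character name.
CANNED = {"Arthas": {"totalHonorableKills": 0}}


def fake_fetch_character(character_name):
    """Return canned data for known names, raise a 404 otherwise."""
    if character_name in CANNED:
        return CANNED[character_name]
    raise ClientResponseError(status=404, message="Not Found")
```

The same shape scales to as many canned characters as a test needs: every unknown name falls through to the 404 branch, mirroring the `elif`/`else` tail above.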
# ---- file boundary ----
# hexsha: 0a9305d345a4e719aa3cfcd617f414754da3f79d
# path: tests/unit/states/cron_test.py
# repo: skrobul/salt (license: Apache-2.0)
# -*- coding: utf-8 -*-
'''
    :codeauthor: :email:`Mike Place <mp@saltstack.com>`
'''

# Import Salt Testing libs
from salttesting import TestCase, skipIf
from salttesting.helpers import ensure_in_syspath
from salttesting.mock import NO_MOCK, NO_MOCK_REASON, MagicMock, patch
from StringIO import StringIO

ensure_in_syspath('../../')

from salt.modules import cron as cronmod
from salt.states import cron as cron

STUB_USER = 'root'
STUB_PATH = '/tmp'

STUB_CRON_TIMESTAMP = {
    'minute': '1',
    'hour': '2',
    'daymonth': '3',
    'month': '4',
    'dayweek': '5'}

STUB_SIMPLE_RAW_CRON = '5 0 * * * /tmp/no_script.sh'
STUB_SIMPLE_CRON_DICT = {
    'pre': ['5 0 * * * /tmp/no_script.sh'],
    'crons': [],
    'env': [],
    'special': []}

__low__ = {
    '__id__': 'noo'
}
__grains__ = {
    'os': 'Debian',
    'os_family': 'Debian',
}

cron.__opts__ = {
    'test': False,
}
cronmod.__low__ = cron.__low__ = __low__
cronmod.__grains__ = cron.__grains__ = __grains__
cronmod.__salt__ = cron.__salt__ = {
    'cmd.run_all': MagicMock(return_value={
        'pid': 5,
        'retcode': 0,
        'stderr': '',
        'stdout': ''}),
    'cron.list_tab': cronmod.list_tab,
    'cron.rm_job': cronmod.rm_job,
    'cron.set_job': cronmod.set_job,
}

CRONTAB = StringIO()


def get_crontab(*args, **kw):
    return CRONTAB.getvalue()


def set_crontab(val):
    CRONTAB.truncate(0)
    CRONTAB.write(val)


def write_crontab(*args, **kw):
    set_crontab('\n'.join(
        [a.strip() for a in args[1]]))
    return {
        'retcode': False,
    }
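The `CRONTAB` helpers above back the patched crontab with an in-memory buffer, so the state module can be exercised without touching the real system crontab. The sketch below shows the same fake-crontab pattern in self-contained Python 3 form; `add_line`, `read_fn`, and `write_fn` are illustrative stand-ins for the code under test and the patched `cron.raw_cron` / `cron._write_cron_lines` seams (also note that in Python 3, unlike the Python 2 `StringIO` used above, `truncate()` does not rewind the stream, so an explicit `seek(0)` is needed):

```python
import io
from unittest import mock

# In-memory stand-in for the system crontab.
CRONTAB = io.StringIO()


def get_crontab(*args, **kw):
    return CRONTAB.getvalue()


def set_crontab(val):
    # Python 3's truncate() does not move the position, so rewind first.
    CRONTAB.seek(0)
    CRONTAB.truncate(0)
    CRONTAB.write(val)


def write_crontab(*args, **kw):
    # Like _write_cron_lines above, args[1] carries the lines to persist.
    set_crontab('\n'.join(a.strip() for a in args[1]))
    return {'retcode': False}


def add_line(read_fn, write_fn, line):
    # Hypothetical code under test: read the tab, append a line, write back.
    existing = read_fn().splitlines()
    return write_fn(None, existing + [line])


set_crontab('# managed\n* 1 * * * foo')
read_fn = mock.MagicMock(side_effect=get_crontab)
write_fn = mock.MagicMock(side_effect=write_crontab)
add_line(read_fn, write_fn, '* 2 * * * bar')
```

Because the mocks use `side_effect`, every read and write goes through the buffer, so each test can inspect the resulting crontab text with `get_crontab()` exactly as the assertions in this module do.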
@skipIf(NO_MOCK, NO_MOCK_REASON)
class CronTestCase(TestCase):

    def setUp(self):
        super(CronTestCase, self).setUp()
        set_crontab('')

    def tearDown(self):
        super(CronTestCase, self).tearDown()
        set_crontab('')

    @patch('salt.modules.cron.raw_cron',
           new=MagicMock(side_effect=get_crontab))
    @patch('salt.modules.cron._write_cron_lines',
           new=MagicMock(side_effect=write_crontab))
    def test_present(self):
        cron.present(
            name='foo',
            hour='1',
            identifier='1',
            user='root')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '* 1 * * * foo')
        cron.present(
            name='foo',
            hour='2',
            identifier='1',
            user='root')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '* 2 * * * foo')
        cron.present(
            name='foo',
            hour='2',
            identifier='2',
            user='root')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '* 2 * * * foo\n'
            '# SALT_CRON_IDENTIFIER:2\n'
            '* 2 * * * foo')
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '* 2 * * * foo\n'
            '# SALT_CRON_IDENTIFIER:2\n'
            '* 2 * * * foo\n'
            '* 2 * * * foo\n'
        )
        cron.present(
            name='foo',
            hour='2',
            user='root')
        self.assertEqual(
            get_crontab(),
            ('# Lines below here are managed by Salt, do not edit\n'
             '# SALT_CRON_IDENTIFIER:1\n'
             '* 2 * * * foo\n'
             '# SALT_CRON_IDENTIFIER:2\n'
             '* 2 * * * foo\n'
             '* 2 * * * foo\n'))
    @patch('salt.modules.cron.raw_cron',
           new=MagicMock(side_effect=get_crontab))
    @patch('salt.modules.cron._write_cron_lines',
           new=MagicMock(side_effect=write_crontab))
    def test_remove(self):
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '* 1 * * * foo')
        cron.absent(name='bar', identifier='1')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit'
        )
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '* * * * * foo')
        cron.absent(name='bar', identifier='1')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '* * * * * foo'
        )
        # old behavior: even with an identifier set, the entry is
        # removed when the command matches
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '* * * * * foo')
        cron.absent(name='foo', identifier='1')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit'
        )
        # old behavior: remove when no identifier is given and the
        # command matches
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '* * * * * foo')
        cron.absent(name='foo')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit'
        )
    @patch('salt.modules.cron.raw_cron',
           new=MagicMock(side_effect=get_crontab))
    @patch('salt.modules.cron._write_cron_lines',
           new=MagicMock(side_effect=write_crontab))
    def test_aissue_1072(self):
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '# I have a multi-line comment SALT_CRON_IDENTIFIER:1\n'
            '* 1 * * * foo'
        )
        cron.present(
            name='foo',
            hour='1',
            comment='1I have a multi-line comment\n2about my script here.\n',
            identifier='1',
            user='root')
        cron.present(
            name='foo',
            hour='1',
            comment='3I have a multi-line comment\n3about my script here.\n',
            user='root')
        cron.present(
            name='foo',
            hour='1',
            comment='I have a multi-line comment\nabout my script here.\n',
            identifier='2',
            user='root')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '# 2about my script here. SALT_CRON_IDENTIFIER:1\n'
            '* 1 * * * foo\n'
            '# I have a multi-line comment\n'
            '# about my script here. SALT_CRON_IDENTIFIER:2\n'
            '* 1 * * * foo')
    @patch('salt.modules.cron.raw_cron',
           new=MagicMock(side_effect=get_crontab))
    @patch('salt.modules.cron._write_cron_lines',
           new=MagicMock(side_effect=write_crontab))
    def test_issue_11935(self):
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '0 2 * * * find /var/www -type f '
            '-mtime -7 -print0 | xargs -0 '
            'clamscan -i --no-summary 2>/dev/null'
        )
        cmd = (
            'find /var/www -type f -mtime -7 -print0 '
            '| xargs -0 clamscan -i --no-summary 2>/dev/null'
        )
        self.assertEqual(cron._check_cron('root', cmd, hour='2', minute='0'),
                         'present')
        ret = cron.present(cmd, 'root', minute='0', hour='2')
        self.assertEqual(ret['changes'], {})
        self.assertEqual(
            ret['comment'],
            'Cron find /var/www -type f -mtime -7 -print0 '
            '| xargs -0 clamscan -i --no-summary 2>/dev/null already present')
        self.assertEqual(cron._check_cron('root', cmd, hour='3', minute='0'),
                         'update')
        ret = cron.present(cmd, 'root', minute='0', hour='3')
        self.assertEqual(ret['changes'],
                         {'root': 'find /var/www -type f -mtime -7 -print0 | '
                                  'xargs -0 clamscan -i --no-summary 2>/dev/null'})
        self.assertEqual(
            ret['comment'],
            'Cron find /var/www -type f -mtime -7 -print0 '
            '| xargs -0 clamscan -i --no-summary 2>/dev/null updated')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '0 3 * * * find /var/www -type f -mtime -7 -print0 |'
            ' xargs -0 clamscan -i --no-summary 2>/dev/null')
    @patch('salt.modules.cron.raw_cron',
           new=MagicMock(side_effect=get_crontab))
    @patch('salt.modules.cron._write_cron_lines',
           new=MagicMock(side_effect=write_crontab))
    def test_issue_11935_with_id(self):
        set_crontab(
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '0 2 * * * find /var/www -type f '
            '-mtime -7 -print0 | xargs -0 '
            'clamscan -i --no-summary 2>/dev/null'
        )
        cmd = (
            'find /var/www -type f -mtime -7 -print0 '
            '| xargs -0 clamscan -i --no-summary 2>/dev/null'
        )
        self.assertEqual(cron._check_cron(
            'root', cmd, hour='2', minute='0', identifier=1), 'present')
        ret = cron.present(cmd, 'root', minute='0', hour='2', identifier='1')
        self.assertEqual(ret['changes'], {})
        self.assertEqual(
            ret['comment'],
            'Cron find /var/www -type f -mtime -7 -print0 '
            '| xargs -0 clamscan -i --no-summary 2>/dev/null already present')
        self.assertEqual(cron._check_cron(
            'root', cmd, hour='3', minute='0', identifier='1'), 'update')
        ret = cron.present(cmd, 'root', minute='0', hour='3', identifier='1')
        self.assertEqual(ret['changes'],
                         {'root': 'find /var/www -type f -mtime -7 -print0 | '
                                  'xargs -0 clamscan -i --no-summary 2>/dev/null'})
        self.assertEqual(
            ret['comment'],
            'Cron find /var/www -type f -mtime -7 -print0 '
            '| xargs -0 clamscan -i --no-summary 2>/dev/null updated')
        self.assertEqual(
            get_crontab(),
            '# Lines below here are managed by Salt, do not edit\n'
            '# SALT_CRON_IDENTIFIER:1\n'
            '0 3 * * * find /var/www -type f -mtime -7 -print0 |'
            ' xargs -0 clamscan -i --no-summary 2>/dev/null')
@patch('salt.modules.cron.raw_cron',
new=MagicMock(side_effect=get_crontab))
@patch('salt.modules.cron._write_cron_lines',
new=MagicMock(side_effect=write_crontab))
def test_issue_11935_mixed(self):
set_crontab(
'# Lines below here are managed by Salt, do not edit\n'
'0 2 * * * find /var/www -type f '
'-mtime -7 -print0 | xargs -0 '
'clamscan -i --no-summary 2>/dev/null'
)
cmd = (
'find /var/www -type f -mtime -7 -print0 '
'| xargs -0 clamscan -i --no-summary 2>/dev/null'
)
self.assertEqual(cron._check_cron('root', cmd, hour='2', minute='0'),
'present')
ret = cron.present(cmd, 'root', minute='0', hour='2')
self.assertEqual(ret['changes'], {})
self.assertEqual(
ret['comment'],
'Cron find /var/www -type f -mtime -7 -print0 '
'| xargs -0 clamscan -i --no-summary 2>/dev/null already present')
self.assertEqual(cron._check_cron('root', cmd, hour='3', minute='0'),
'update')
ret = cron.present(cmd, 'root', minute='0', hour='3')
self.assertEqual(ret['changes'],
{'root': 'find /var/www -type f -mtime -7 -print0 | '
'xargs -0 clamscan -i --no-summary 2>/dev/null'})
self.assertEqual(
ret['comment'],
'Cron find /var/www -type f -mtime -7 -print0 '
'| xargs -0 clamscan -i --no-summary 2>/dev/null updated')
self.assertEqual(
get_crontab(),
'# Lines below here are managed by Salt, do not edit\n'
'0 3 * * * find /var/www -type f -mtime -7 -print0 |'
' xargs -0 clamscan -i --no-summary 2>/dev/null')
self.assertEqual(cron._check_cron(
'root', cmd, hour='2', minute='0', identifier='1'), 'update')
ret = cron.present(cmd, 'root', minute='0', hour='2', identifier='1')
self.assertEqual(
ret['changes'],
{'root': 'find /var/www -type f -mtime -7 -print0 | '
'xargs -0 clamscan -i --no-summary 2>/dev/null'})
self.assertEqual(
ret['comment'],
'Cron find /var/www -type f -mtime -7 -print0 '
'| xargs -0 clamscan -i --no-summary 2>/dev/null updated')
self.assertEqual(cron._check_cron(
'root', cmd, hour='3', minute='0', identifier='1'), 'update')
ret = cron.present(cmd, 'root', minute='0', hour='3', identifier='1')
self.assertEqual(ret['changes'],
{'root': 'find /var/www -type f -mtime -7 -print0 | '
'xargs -0 clamscan -i --no-summary 2>/dev/null'})
self.assertEqual(
ret['comment'],
'Cron find /var/www -type f -mtime -7 -print0 '
'| xargs -0 clamscan -i --no-summary 2>/dev/null updated')
self.assertEqual(
get_crontab(),
'# Lines below here are managed by Salt, do not edit\n'
'# SALT_CRON_IDENTIFIER:1\n'
'0 3 * * * find /var/www -type f -mtime -7 -print0 |'
' xargs -0 clamscan -i --no-summary 2>/dev/null')
set_crontab(
'# Lines below here are managed by Salt, do not edit\n'
'0 2 * * * find /var/www -type f '
'-mtime -7 -print0 | xargs -0 '
'clamscan -i --no-summary 2>/dev/null'
)
self.assertEqual(cron._check_cron(
'root', cmd + "a", hour='2', minute='0', identifier='1'), 'absent')
ret = cron.present(
cmd + "a", 'root', minute='0', hour='2', identifier='1')
self.assertEqual(
ret['changes'],
{'root': 'find /var/www -type f -mtime -7 -print0 | '
'xargs -0 clamscan -i --no-summary 2>/dev/nulla'})
self.assertEqual(
ret['comment'],
'Cron find /var/www -type f -mtime -7 -print0 | '
'xargs -0 clamscan -i --no-summary 2>/dev/nulla added '
'to root\'s crontab')
self.assertEqual(
get_crontab(),
'# Lines below here are managed by Salt, do not edit\n'
'0 2 * * *'
' find /var/www -type f -mtime -7 -print0'
' | xargs -0 clamscan -i --no-summary 2>/dev/null\n'
'# SALT_CRON_IDENTIFIER:1\n'
'0 2 * * *'
' find /var/www -type f -mtime -7 -print0'
' | xargs -0 clamscan -i --no-summary 2>/dev/nulla')
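The assertions above exercise identifier-based updates: an entry tagged with a `SALT_CRON_IDENTIFIER` comment is rewritten in place, while an untagged command change is treated as a new entry. A simplified, standalone illustration of that matching logic (not Salt's actual implementation; `update_cron_line` and the sample crontab are made up for the sketch):

```python
def update_cron_line(crontab, identifier, new_line):
    """Replace the entry under SALT_CRON_IDENTIFIER:<identifier>,
    or append a new tagged entry if the identifier is absent."""
    tag = '# SALT_CRON_IDENTIFIER:%s' % identifier
    lines = crontab.splitlines()
    for i, line in enumerate(lines):
        if line.strip() == tag and i + 1 < len(lines):
            lines[i + 1] = new_line  # rewrite the command under the tag
            return '\n'.join(lines)
    return '\n'.join(lines + [tag, new_line])

crontab = ('# Lines below here are managed by Salt, do not edit\n'
           '# SALT_CRON_IDENTIFIER:1\n'
           '0 2 * * * some_command')
updated = update_cron_line(crontab, '1', '0 3 * * * some_command')
```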
if __name__ == '__main__':
from integration import run_tests
run_tests(CronTestCase, needs_daemon=False)
# api/ingame.py (thesky-cdn/bot-dokkan-battle, MIT license)
import requests
import config
import crypto
import json
import time
import random
from random import choice
from random import randint
from string import ascii_uppercase
import base64
# account information
def user(ver, os, token, secret, first):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/user'
auth = crypto.mac(ver, token, secret, 'GET', '/user')
if first == False:
code = '////'
else:
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/user'
auth = crypto.mac(ver, token, secret, 'GET', '/user')
        # client version is the same whether or not this is the first request
        code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
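Every endpoint wrapper in this module repeats the same user-agent, version, and header construction. One possible refactor is a shared helper; the sketch below is self-contained, so `cfg` and `fake_mac` are dummy stand-ins for the real `config` module and `crypto.mac`, not their actual implementations:

```python
from types import SimpleNamespace

# stand-ins for the real config module and crypto.mac (assumptions for this sketch)
cfg = SimpleNamespace(lang='en', gb_code='4.0.0', jp_code='4.0.0',
                      file_ts1='1', file_ts2='2', db_ts1='1', db_ts2='2',
                      device_agent1='android-ua', device_agent2='ios-ua')

def fake_mac(ver, token, secret, method, path):
    return 'MAC token="%s", path="%s"' % (token, path)

def build_headers(ver, os, token, secret, method, path, mac=fake_mac):
    """Build the common request headers for one endpoint call."""
    dua = cfg.device_agent1 if os == 'android' else cfg.device_agent2
    if ver == 'gb':
        code, asset, db = cfg.gb_code, cfg.file_ts1, cfg.db_ts1
    else:
        code, asset, db = cfg.jp_code, cfg.file_ts2, cfg.db_ts2
    return {
        'X-Platform': os,
        'X-Language': cfg.lang,
        'X-ClientVersion': code,
        'X-AssetVersion': asset,
        'X-DatabaseVersion': db,
        'Content-Type': 'application/json',
        'Accept': '*/*',
        'Authorization': mac(ver, token, secret, method, path),
        'User-Agent': dua,
    }

headers = build_headers('gb', 'android', 'tok', 'sec', 'GET', '/user')
```

Each wrapper would then reduce to building its path, calling `build_headers`, and issuing the request.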
# box contents
def cards(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/cards'
auth = crypto.mac(ver, token, secret, 'GET', '/cards')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/cards'
auth = crypto.mac(ver, token, secret, 'GET', '/cards')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# current news
def news(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/announcements?display=home'
auth = crypto.mac(ver, token, secret, 'GET', '/announcements?display=home')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/announcements?display=home'
auth = crypto.mac(ver, token, secret, 'GET', '/announcements?display=home')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# current banners
def banners(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/gashas'
auth = crypto.mac(ver, token, secret, 'GET', '/gashas')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/gashas'
auth = crypto.mac(ver, token, secret, 'GET', '/gashas')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# current events
def events(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/events'
auth = crypto.mac(ver, token, secret, 'GET', '/events')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/events'
auth = crypto.mac(ver, token, secret, 'GET', '/events')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# change account name
def changeName(ver, os, token, secret, name):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/user'
auth = crypto.mac(ver, token, secret, 'PUT', '/user')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/user'
auth = crypto.mac(ver, token, secret, 'PUT', '/user')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
data = {'user':{'name': name}}
r = requests.put(url, data=json.dumps(data), headers=headers)
return r.json()
# increase box size by 5
def capacity(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/user/capacity/card'
auth = crypto.mac(ver, token, secret, 'POST', '/user/capacity/card')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/user/capacity/card'
auth = crypto.mac(ver, token, secret, 'POST', '/user/capacity/card')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.post(url, data=None, headers=headers)
return r.json()
# start dash banner status
def dashStatus(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/start_dash_gasha_status'
auth = crypto.mac(ver, token, secret, 'GET', '/start_dash_gasha_status')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/start_dash_gasha_status'
auth = crypto.mac(ver, token, secret, 'GET', '/start_dash_gasha_status')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# starter banner status (identical request to dashStatus)
def starterStatus(ver, os, token, secret):
    return dashStatus(ver, os, token, secret)
# summon on a banner
def summon(ver, os, token, secret, id, course):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/gashas/' + str(id) + '/courses/' + str(course) + '/draw'
auth = crypto.mac(ver, token, secret, 'POST', '/gashas/' + str(id) + '/courses/' + str(course) + '/draw')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/gashas/' + str(id) + '/courses/' + str(course) + '/draw'
auth = crypto.mac(ver, token, secret, 'POST', '/gashas/' + str(id) + '/courses/' + str(course) + '/draw')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.post(url, data=None, headers=headers)
return r.json()
# list of gifts
def gifts(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/gifts'
auth = crypto.mac(ver, token, secret, 'GET', '/gifts')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/gifts'
auth = crypto.mac(ver, token, secret, 'GET', '/gifts')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# accept gifts
def acceptGifts(ver, os, token, secret, gift):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/gifts/accept'
auth = crypto.mac(ver, token, secret, 'POST', '/gifts/accept')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/gifts/accept'
auth = crypto.mac(ver, token, secret, 'POST', '/gifts/accept')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
data = {'gift_ids': gift}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
# list of missions
def missions(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/missions'
auth = crypto.mac(ver, token, secret, 'GET', '/missions')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/missions'
auth = crypto.mac(ver, token, secret, 'GET', '/missions')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# accept missions
def acceptMissions(ver, os, token, secret, mission):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/missions/accept'
auth = crypto.mac(ver, token, secret, 'POST', '/missions/accept')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/missions/accept'
auth = crypto.mac(ver, token, secret, 'POST', '/missions/accept')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
data = {'mission_ids': mission}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
# stone stamina refill
def actRefill(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/user/recover_act_with_stone'
auth = crypto.mac(ver, token, secret, 'PUT', '/user/recover_act_with_stone')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/user/recover_act_with_stone'
auth = crypto.mac(ver, token, secret, 'PUT', '/user/recover_act_with_stone')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.put(url, data=None, headers=headers)
return r.json()
# sell cards
def sell(ver, os, token, secret, card):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/cards/sell'
auth = crypto.mac(ver, token, secret, 'POST', '/cards/sell')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/cards/sell'
auth = crypto.mac(ver, token, secret, 'POST', '/cards/sell')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
data = {'card_ids': card}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
# set wallpaper
def setWallpaper(ver, os, token, secret, wallpaper):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/user'
auth = crypto.mac(ver, token, secret, 'PUT', '/user')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/user'
auth = crypto.mac(ver, token, secret, 'PUT', '/user')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
data = {'user': {'wallpaper_item_id': wallpaper}}
r = requests.put(url, data=json.dumps(data), headers=headers)
return r.json()
# story stages
def quests(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/user_areas'
auth = crypto.mac(ver, token, secret, 'GET', '/user_areas')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/user_areas'
auth = crypto.mac(ver, token, secret, 'GET', '/user_areas')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# stage supports
def getSupports(ver, os, token, secret, stage, difficulty):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/quests/' + str(stage) + '/supporters'
auth = crypto.mac(ver, token, secret, 'GET', '/quests/' + str(stage) + '/supporters')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/quests/' + str(stage) + '/supporters'
auth = crypto.mac(ver, token, secret, 'GET', '/quests/' + str(stage) + '/supporters')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# get all medals
def getMedals(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/awakening_items'
auth = crypto.mac(ver, token, secret, 'GET', '/awakening_items')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/awakening_items'
auth = crypto.mac(ver, token, secret, 'GET', '/awakening_items')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# get all items (orbs, training, support, treasure, special)
def getItems(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/resources/login?potential_items=true&training_items=true&support_items=true&treasure_items=true&special_items=true'
auth = crypto.mac(ver, token, secret, 'GET', '/resources/login?potential_items=true&training_items=true&support_items=true&treasure_items=true&special_items=true')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/resources/login?potential_items=true&training_items=true&support_items=true&treasure_items=true&special_items=true'
auth = crypto.mac(ver, token, secret, 'GET', '/resources/login?potential_items=true&training_items=true&support_items=true&treasure_items=true&special_items=true')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# list of dragonball locations
def dragonballs(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/dragonball_sets'
auth = crypto.mac(ver, token, secret, 'GET', '/dragonball_sets')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/dragonball_sets'
auth = crypto.mac(ver, token, secret, 'GET', '/dragonball_sets')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# friend list
def friends(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/friendships'
auth = crypto.mac(ver, token, secret, 'GET', '/friendships')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/friendships'
auth = crypto.mac(ver, token, secret, 'GET', '/friendships')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# search friend by user ID
def findFriend(ver, os, token, secret, id):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/users/' + str(id)
auth = crypto.mac(ver, token, secret, 'GET', '/users/' + str(id))
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/users/' + str(id)
auth = crypto.mac(ver, token, secret, 'GET', '/users/' + str(id))
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# add friend
def addFriend(ver, os, token, secret, id):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/users/' + str(id) + '/friendships'
auth = crypto.mac(ver, token, secret, 'POST', '/users/' + str(id) + '/friendships')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/users/' + str(id) + '/friendships'
auth = crypto.mac(ver, token, secret, 'POST', '/users/' + str(id) + '/friendships')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.post(url, headers=headers)
return r.json()
# accept pending friend
def acceptFriend(ver, os, token, secret, id):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/friendships/' + str(id) + '/accept'
auth = crypto.mac(ver, token, secret, 'PUT', '/friendships/' + str(id) + '/accept')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/friendships/' + str(id) + '/accept'
auth = crypto.mac(ver, token, secret, 'PUT', '/friendships/' + str(id) + '/accept')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.put(url, headers=headers)
return r.json()
def getTeams(ver, os, token, secret):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/teams'
auth = crypto.mac(ver, token, secret, 'GET', '/teams')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/teams'
auth = crypto.mac(ver, token, secret, 'GET', '/teams')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
def setTeam(ver, os, token, secret, team, cards):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/teams'
auth = crypto.mac(ver, token, secret, 'POST', '/teams')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/teams'
auth = crypto.mac(ver, token, secret, 'POST', '/teams')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
data = {'selected_team_num': int(team), 'user_card_teams': cards}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
# start a stage by ID & difficulty
def startStage(ver, os, token, secret, stage, difficulty, friend, friend_card):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/quests/' + str(stage) + '/sugoroku_maps/start'
auth = crypto.mac(ver, token, secret, 'POST', '/quests/' + str(stage) + '/sugoroku_maps/start')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/quests/' + str(stage) + '/sugoroku_maps/start'
auth = crypto.mac(ver, token, secret, 'POST', '/quests/' + str(stage) + '/sugoroku_maps/start')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
    APIToken = ''.join(random.choice(list('abcdefghijklmnopqrstuvwxyzBCDEFGHIKLMNOPQRUVWXYZ123456789-_')) for i in range(63))  # note: generated but unused below
decks = getTeams(ver, os, token, secret)
if len(str(friend)) >= 4:
sign = json.dumps({'difficulty': int(difficulty), 'friend_id': int(friend), 'is_playing_script': True, 'selected_team_num': int(decks['selected_team_num']), 'support_leader': {'card_id': int(friend_card), 'exp': 0, 'optimal_awakening_step': 0, 'released_rate': 0}})
enc_sign = crypto.encrypt_sign(sign)
else:
sign = json.dumps({'difficulty': int(difficulty), 'cpu_friend_id': int(friend), 'is_playing_script': True, 'selected_team_num': int(decks['selected_team_num'])})
enc_sign = crypto.encrypt_sign(sign)
headers = {
'User-Agent': dua,
'Accept': '*/*',
'Authorization': auth,
'Content-Type': 'application/json',
'X-Platform': os,
'X-AssetVersion': '////',
'X-DatabaseVersion': '////',
'X-ClientVersion': code
}
data = {'sign': enc_sign}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
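startStage switches between a human friend (a `friend_id` plus a `support_leader` block) and a CPU helper (`cpu_friend_id`) based on the length of the ID. A minimal, self-contained reproduction of that branching, with made-up IDs:

```python
import json

def build_start_sign(difficulty, team_num, friend, friend_card=None):
    """Mirror of the payload branching in startStage: IDs of four or
    more digits are treated as real friends, shorter ones as CPU helpers."""
    if len(str(friend)) >= 4:
        payload = {'difficulty': int(difficulty), 'friend_id': int(friend),
                   'is_playing_script': True, 'selected_team_num': int(team_num),
                   'support_leader': {'card_id': int(friend_card), 'exp': 0,
                                      'optimal_awakening_step': 0, 'released_rate': 0}}
    else:
        payload = {'difficulty': int(difficulty), 'cpu_friend_id': int(friend),
                   'is_playing_script': True, 'selected_team_num': int(team_num)}
    return json.dumps(payload, sort_keys=True)

human = json.loads(build_start_sign(2, 1, 123456789, 1014720))
cpu = json.loads(build_start_sign(2, 1, 7))
```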
# finish stage by ID & difficulty
def finishStage(ver, os, token, secret, stage, difficulty, paces, defeated, stoken):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/quests/' + str(stage) + '/sugoroku_maps/finish'
auth = crypto.mac(ver, token, secret, 'POST', '/quests/' + str(stage) + '/sugoroku_maps/finish')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/quests/' + str(stage) + '/sugoroku_maps/finish'
auth = crypto.mac(ver, token, secret, 'POST', '/quests/' + str(stage) + '/sugoroku_maps/finish')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
    steps = list(paces)
finish = int(round(time.time(), 0) + 90)
start = finish - randint(6200000, 8200000)
damage = randint(500000, 1000000)
# Hercule punching bag event damage
if str(stage)[0:3] in ('711', '185'):
damage = randint(100000000, 101000000)
sign = {
'actual_steps': steps,
'difficulty': difficulty,
'elapsed_time': finish - start,
        'energy_ball_counts_in_boss_battle': [4, 6, 0, 6, 4, 3, 0, 0, 0, 0, 0, 0, 0],
'has_player_been_taken_damage': False,
'is_cheat_user': False,
'is_cleared': True,
'is_defeated_boss': True,
'is_player_special_attack_only': True,
'max_damage_to_boss': damage,
'min_turn_in_boss_battle': len(defeated),
'passed_round_ids': defeated,
'quest_finished_at_ms': finish,
'quest_started_at_ms': start,
'steps': steps,
'token': stoken
}
enc_sign = crypto.encrypt_sign(json.dumps(sign))
headers = {
'User-Agent': dua,
'Accept': '*/*',
'Authorization': auth,
'Content-Type': 'application/json',
'X-Platform': os,
'X-AssetVersion': '////',
'X-DatabaseVersion': '////',
'X-ClientVersion': code
}
data = {'sign': enc_sign}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
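finishStage reports a much larger max-damage figure for the Hercule punching-bag events, which it detects by the first three digits of the stage ID. That check can be isolated as below (the stage IDs used in the example are illustrative):

```python
from random import randint

def pick_damage(stage):
    """Return a plausible max-damage value; Hercule punching-bag events
    (stage IDs starting 711 or 185) report far larger numbers."""
    if str(stage)[0:3] in ('711', '185'):
        return randint(100000000, 101000000)
    return randint(500000, 1000000)

normal = pick_damage(123001)
hercule = pick_damage(711001)
```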
# eza rankings
def zRankings(ver, os, token, secret, eza):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/z_battles/' + str(eza) + '/rankings'
auth = crypto.mac(ver, token, secret, 'GET', '/z_battles/' + str(eza) + '/rankings')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/z_battles/' + str(eza) + '/rankings'
auth = crypto.mac(ver, token, secret, 'GET', '/z_battles/' + str(eza) + '/rankings')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# eza support units
def zSupports(ver, os, token, secret, eza):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/z_battles/' + str(eza) + '/supporters'
auth = crypto.mac(ver, token, secret, 'GET', '/z_battles/' + str(eza) + '/supporters')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/z_battles/' + str(eza) + '/supporters'
auth = crypto.mac(ver, token, secret, 'GET', '/z_battles/' + str(eza) + '/supporters')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
headers = {
'X-Platform': os,
'X-Language': config.lang,
'X-ClientVersion': code,
'X-AssetVersion': asset,
'X-DatabaseVersion': db,
'Content-Type': 'application/json',
'Accept': '*/*',
'Authorization': auth,
'User-Agent': dua
}
r = requests.get(url, data=None, headers=headers)
return r.json()
# start eza by level
def zStart(ver, os, token, secret, eza, level, friend, friend_card):
if os == 'android':
dua = config.device_agent1
else:
dua = config.device_agent2
if ver == 'gb':
url = config.gb_url + '/z_battles/' + str(eza) + '/start'
auth = crypto.mac(ver, token, secret, 'POST', '/z_battles/' + str(eza) + '/start')
code = config.gb_code
asset = config.file_ts1
db = config.db_ts1
else:
url = config.jp_url + '/z_battles/' + str(eza) + '/start'
auth = crypto.mac(ver, token, secret, 'POST', '/z_battles/' + str(eza) + '/start')
code = config.jp_code
asset = config.file_ts2
db = config.db_ts2
    APIToken = ''.join(random.choice(list('abcdefghijklmnopqrstuvwxyzBCDEFGHIKLMNOPQRUVWXYZ123456789-_')) for i in range(63))  # note: generated but unused below
decks = getTeams(ver, os, token, secret)
sign = json.dumps({'friend_id': int(friend), 'level': int(level), 'selected_team_num': int(decks['selected_team_num']), 'support_leader': {'card_id': int(friend_card), 'exp': 0, 'optimal_awakening_step': 0, 'released_rate': 0}})
enc_sign = crypto.encrypt_sign(sign)
headers = {
'User-Agent': dua,
'Accept': '*/*',
'Authorization': auth,
'Content-Type': 'application/json',
'X-Platform': os,
'X-AssetVersion': '////',
'X-DatabaseVersion': '////',
'X-ClientVersion': code
}
data = {'sign': enc_sign}
r = requests.post(url, data=json.dumps(data), headers=headers)
return r.json()
# finish eza by level
def zFinish(ver, os, token, secret, eza, level, stoken, em_atk, em_hp):
    if os == 'android':
        dua = config.device_agent1
    else:
        dua = config.device_agent2
    if ver == 'gb':
        url = config.gb_url + '/z_battles/' + str(eza) + '/finish'
        auth = crypto.mac(ver, token, secret, 'POST', '/z_battles/' + str(eza) + '/finish')
        code = config.gb_code
        asset = config.file_ts1
        db = config.db_ts1
    else:
        url = config.jp_url + '/z_battles/' + str(eza) + '/finish'
        auth = crypto.mac(ver, token, secret, 'POST', '/z_battles/' + str(eza) + '/finish')
        code = config.jp_code
        asset = config.file_ts2
        db = config.db_ts2
    finish = int(round(time.time(), 0) + 90)
    start = finish - randint(6200000, 8200000)
    summary = {
        'summary': {
            'enemy_attack': int(em_atk),
            'enemy_attack_count': 1,
            'enemy_heal_counts': [0],
            'enemy_heals': [0],
            'enemy_max_attack': int(em_atk),
            'enemy_min_attack': int(em_atk),
            'player_attack_counts': [3],
            'player_attacks': em_hp,
            'player_heal': 0,
            'player_heal_count': 0,
            'player_max_attacks': em_hp,
            'player_min_attacks': em_hp,
            'type': 'summary'
        }
    }
    headers = {
        'User-Agent': dua,
        'Accept': '*/*',
        'Authorization': auth,
        'Content-Type': 'application/json',
        'X-Platform': os,
        'X-AssetVersion': '////',
        'X-DatabaseVersion': '////',
        'X-ClientVersion': code
    }
    data = {
        'elapsed_time': finish - start,
        'is_cleared': True,
        'level': int(level),
        'reason': 'win',
        's': 'iwM9xu4mM/7fZyLfKV93JaquLtLzpP35CKBoDiB+X8k=',
        't': base64.b64encode(json.dumps(summary).encode()).decode(),
        'token': str(stoken),
        'used_items': [],
        'z_battle_finished_at_ms': finish,
        'z_battle_started_at_ms': start
    }
    r = requests.post(url, data=json.dumps(data), headers=headers)
    return r.json()
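The `zFinish` call above packs the battle summary as base64-encoded JSON in the `t` field of its payload. A minimal, standalone round-trip sketch of that encoding (no game API involved; the sample summary dict is made up):

```python
import base64
import json

# A toy summary in the same shape zFinish builds (values are made up)
summary = {'summary': {'type': 'summary', 'enemy_attack_count': 1}}

# Encode the way the 't' field is built: JSON -> bytes -> base64 -> str
t = base64.b64encode(json.dumps(summary).encode()).decode()

# Decoding reverses the steps and recovers the original structure
decoded = json.loads(base64.b64decode(t))
assert decoded == summary
```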
# Source: reward_surfaces/experiments/__init__.py
# Repo: weepingwillowben/reward-surfaces (MIT)
from .generate_eval_jobs import generate_eval_jobs
from .generate_plane_jobs import generate_plane_data
# coding=utf-8
# Source: py_entitymatching/tests/_test_debug_matcher_dt.py
# Repo: anhaidgroup/py_entitymatching (MIT, BSD-2-Clause, BSD-3-Clause)
import os
import unittest
from nose.tools import *
import pandas as pd
import py_entitymatching.catalog.catalog_manager as cm
import py_entitymatching.matcher.matcherutils as mu
from py_entitymatching.debugmatcher.debug_gui_decisiontree_matcher import _vis_debug_dt, \
    vis_tuple_debug_dt_matcher
from py_entitymatching.debugmatcher.debug_decisiontree_matcher import visualize_tree, \
    debug_decisiontree_matcher
from py_entitymatching.feature.autofeaturegen import get_features_for_matching
from py_entitymatching.feature.extractfeatures import extract_feature_vecs
from py_entitymatching.io.parsers import read_csv_metadata
from py_entitymatching.matcher.dtmatcher import DTMatcher
from py_entitymatching.utils.generic_helper import get_install_path
datasets_path = os.sep.join([get_install_path(), 'tests', 'test_datasets'])
path_a = os.sep.join([datasets_path, 'A.csv'])
path_b = os.sep.join([datasets_path, 'B.csv'])
path_c = os.sep.join([datasets_path, 'C.csv'])
class VisDTDebugMatcherTestCases(unittest.TestCase):
    def setUp(self):
        cm.del_catalog()

    def tearDown(self):
        cm.del_catalog()

    def test_vis_debug_matcher_dt_valid_1(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        train_test = mu.split_train_test(feature_vectors)
        train = train_test['train']
        test = train_test['test']
        _vis_debug_dt(dt, train, test,
                      exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
                      target_attr='labels', show_window=False)
    def test_vis_tuple_debug_dt_matcher_valid_1(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        dt.fit(table=feature_vectors, exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
               target_attr='labels')
        s = pd.DataFrame(feature_vectors.loc[0])
        s1 = s.T
        vis_tuple_debug_dt_matcher(dt, s1,
                                   exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'])
    def test_vis_tuple_debug_dt_matcher_valid_2(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        dt.fit(table=feature_vectors, exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
               target_attr='labels')
        s = pd.DataFrame(feature_vectors.loc[0])
        s1 = s.T
        vis_tuple_debug_dt_matcher(dt.clf, s1,
                                   exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'])
    def test_vis_tuple_debug_dt_matcher_valid_3(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        dt.fit(table=feature_vectors, exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
               target_attr='labels')
        feature_vectors.drop(['_id', 'ltable_ID', 'rtable_ID', 'labels'], axis=1, inplace=True)
        s = pd.DataFrame(feature_vectors.loc[0])
        s1 = s.T
        vis_tuple_debug_dt_matcher(dt.clf, s1, exclude_attrs=None)
    @raises(AssertionError)
    def test_vis_debug_matcher_dt_invalid_df(self):
        _vis_debug_dt(None, pd.DataFrame(), pd.DataFrame(),
                      exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
                      target_attr='labels', show_window=False)

    @raises(AssertionError)
    def test_vis_debug_matcher_dt_invalid_tar_attr(self):
        _vis_debug_dt(DTMatcher(), pd.DataFrame(), pd.DataFrame(),
                      exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
                      target_attr=None, show_window=False)

    @raises(AssertionError)
    def test_vis_debug_matcher_dt_ex_attrs_notin_train(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        train_test = mu.split_train_test(feature_vectors)
        train = train_test['train']
        test = train_test['test']
        _vis_debug_dt(dt, train, test,
                      exclude_attrs=['_id', 'ltable_ID1', 'rtable_ID', 'labels'],
                      target_attr='labels', show_window=False)

    @raises(AssertionError)
    def test_vis_debug_matcher_dt_tar_attr_notin_train(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        train_test = mu.split_train_test(feature_vectors)
        train = train_test['train']
        test = train_test['test']
        _vis_debug_dt(dt, train, test,
                      exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
                      target_attr='labels1', show_window=False)

    @raises(AssertionError)
    def test_vis_debug_matcher_dt_ex_attrs_notin_test(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        train_test = mu.split_train_test(feature_vectors)
        train = train_test['train']
        test = train_test['test']
        test.drop('_id', inplace=True, axis=1)
        _vis_debug_dt(dt, train, test,
                      exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
                      target_attr='labels', show_window=False)
    # def test_vis_debug_matcher_dt_tar_attrs_notin_exattrs(self):
    #     A = read_csv_metadata(path_a)
    #     B = read_csv_metadata(path_b, key='ID')
    #     C = read_csv_metadata(path_c, ltable=A, rtable=B)
    #     labels = [0] * 7
    #     labels.extend([1] * 8)
    #     C['labels'] = labels
    #
    #     feature_table = get_features_for_matching(A, B)
    #     feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
    #                                            attrs_after='labels')
    #
    #     dt = DTMatcher()
    #     train_test = mu.split_train_test(feature_vectors)
    #
    #     train = train_test['train']
    #     test = train_test['test']
    #     _vis_debug_dt(dt, train, test,
    #                   exclude_attrs=['_id', 'ltable_ID', 'rtable_ID'],
    #                   target_attr='labels', show_window=False)

    # def test_vis_debug_matcher_dt_label_col_wi_sp_name(self):
    #     A = read_csv_metadata(path_a)
    #     B = read_csv_metadata(path_b, key='ID')
    #     C = read_csv_metadata(path_c, ltable=A, rtable=B)
    #     labels = [0] * 7
    #     labels.extend([1] * 8)
    #     C['_predicted'] = labels
    #
    #     feature_table = get_features_for_matching(A, B)
    #     feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
    #                                            attrs_after='_predicted')
    #
    #     dt = DTMatcher()
    #     train_test = mu.split_train_test(feature_vectors)
    #
    #     train = train_test['train']
    #     test = train_test['test']
    #     _vis_debug_dt(dt, train, test,
    #                   exclude_attrs=['_id', 'ltable_ID', 'rtable_ID'],
    #                   target_attr='_predicted', show_window=False)
class DTDebugMatcherTestCases(unittest.TestCase):
    def setUp(self):
        cm.del_catalog()

    def tearDown(self):
        cm.del_catalog()

    def test_visualize_tree_valid(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        dt.fit(table=feature_vectors, exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
               target_attr='labels')
        visualize_tree(dt, feature_vectors.columns, exclude_attrs=['_id', 'ltable_ID',
                                                                   'rtable_ID', 'labels'])

    # @raises(AssertionError)
    def test_visualize_tree_invalid_df(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        dt.fit(table=feature_vectors, exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
               target_attr='labels')
        visualize_tree(dt.clf, feature_vectors.columns, exclude_attrs=['_id', 'ltable_ID',
                                                                       'rtable_ID', 'labels'])

    def test_debug_dt_matcher_valid(self):
        A = read_csv_metadata(path_a)
        B = read_csv_metadata(path_b, key='ID')
        C = read_csv_metadata(path_c, ltable=A, rtable=B)
        labels = [0] * 7
        labels.extend([1] * 8)
        C['labels'] = labels
        feature_table = get_features_for_matching(A, B)
        feature_vectors = extract_feature_vecs(C, feature_table=feature_table,
                                               attrs_after='labels')
        dt = DTMatcher()
        dt.fit(table=feature_vectors, exclude_attrs=['_id', 'ltable_ID', 'rtable_ID', 'labels'],
               target_attr='labels')
        debug_decisiontree_matcher(dt, A.loc[1], B.loc[2], feature_table=feature_table,
                                   table_columns=feature_vectors.columns,
                                   exclude_attrs=['ltable_ID', 'rtable_ID', '_id', 'labels'])
# Source: modified_gym/envs/robotics/__init__.py
# Repo: mk37972/SCAPE (MIT)
from modified_gym.envs.robotics.fetch_env import FetchEnv
from modified_gym.envs.robotics.fetch.block_environment import BlockEnv
from modified_gym.envs.robotics.fetch.chip_environment import ChipEnv
from modified_gym.envs.robotics.fetch.nufingers_environment import NuFingersEnv
from modified_gym.envs.robotics.fetch.block_environment_IL import BlockEnvIL
from modified_gym.envs.robotics.fetch.chip_environment_IL import ChipEnvIL
from modified_gym.envs.robotics.fetch.nufingers_environment_IL import NuFingersEnvIL
# Source: src/mi_est/__init__.py
# Repo: v-i-s-h/ib-dnn (MIT)
from .binning import compute_mi as bin_compute_mi
# -*- coding: utf-8 -*-
# Source: Processing_Scripts/VIIRS/ReprojectVIIRSwithPYTROLL.py
# Repo: tylere/SEBAL (Apache-2.0)
"""
Created on Thu Jul 28 08:12:55 2016
Tim Hessels
To run this tool PYTROLL/mpop must be installed (http://www.pytroll.org/)
Download the VIIRS input data here: https://www.class.ncdc.noaa.gov/
Change line 21 and 22 to define the input and output files and change the
region in line 92 and run the code
"""
import numpy as np
import os
import mpop
from mpop.satellites import PolarFactory
from datetime import datetime
import glob
input_folder = r"E:\Project_2\Tunisia\DATA_RAW\VIIRS"
output_folder = r"E:\Project_2\Tunisia\DATA_RAW\VIIRS_mpop_out_375"
os.chdir(input_folder)
re = glob.glob("GITCO_*.h5")
for filename in re[:]:
    # Define input files and output files
    geofile = os.path.join(input_folder, filename)
    outfile = os.path.join(output_folder, filename.replace("GITCO", "VIIRS_SVI05").replace(".h5", ".tif"))
    if not os.path.exists(outfile):
        try:
            # Collect general data from the name of the input files
            year = np.int((geofile.split(os.sep)[-1]).split('_')[2][1:5])
            month = np.int((geofile.split(os.sep)[-1]).split('_')[2][5:7])
            day = np.int((geofile.split(os.sep)[-1]).split('_')[2][7:9])
            hour = np.int((geofile.split(os.sep)[-1]).split('_')[3][1:3])
            minute = np.int((geofile.split(os.sep)[-1]).split('_')[3][3:5])
            orbit = (geofile.split(os.sep)[-1]).split('_')[5][1:6]
            endHour = np.int((geofile.split(os.sep)[-1]).split('_')[4][1:3])
            endMinute = np.int((geofile.split(os.sep)[-1]).split('_')[4][3:5])
            start = datetime(year, month, day, hour, minute)
            end = datetime(year, month, day, endHour, endMinute)
            # geofile is just your GITCO* file
            time_slot = datetime(year, month, day, hour, minute)
            global_data = PolarFactory.create_scene("npp", "", "viirs",
                                                    time_slot, orbit)
            '''
            import utils
            import osr
            import pyproj
            srs = osr.SpatialReference()
            srs.ImportFromEPSG(3857)
            wgs84 = osr.SpatialReference()
            wgs84.ImportFromEPSG(4326)
            proj4_args = srs.ExportToProj4()
            proj4_args = '%s %s %s %s %s %s %s %s %s' % (proj4_args.split(' ')[0][1:], \
                proj4_args.split(' ')[1][1:], proj4_args.split(' ')[2][1:], proj4_args.split(' ')[3][1:] \
                , proj4_args.split(' ')[4][1:], proj4_args.split(' ')[5][1:], proj4_args.split(' ')[6][1:] \
                , proj4_args.split(' ')[7][1:], proj4_args.split(' ')[8][1:])
            latlim = [12, 37]
            lonlim = [-20, 0]
            osng = osr.SpatialReference()
            osng.ImportFromEPSG(3857)
            wgs84 = osr.SpatialReference()
            wgs84.ImportFromEPSG(4326)
            wgs84 = pyproj.Proj("+init=EPSG:4326")  # UK Ordnance Survey, 1936 datum
            geoProj = pyproj.Proj("+init=EPSG:3857")
            ur = pyproj.transform(wgs84, geoProj, lonlim[1], latlim[1])
            ll = pyproj.transform(wgs84, geoProj, lonlim[0], latlim[0])
            area_extent = (ll[0], ll[1], ur[0], ur[1])
            area_id = 'viirs_data'
            area_name = 'viirs_data'
            proj_id = 'viirs_data'
            area_def = utils.get_area_def(area_id, area_name, proj_id, proj4_args, int(6000), int(8000), area_extent)
            '''
            from mpop.projector import get_area_def
            area_def = get_area_def("TUN375")
            global_data.load(['I05'], time_interval=(start, end))
            # 1: 0.64 2: 0.87 3:1.61 4:3.74 5: 11.5
            # global_data.image.channel_image(11.5)  # .show()
            local_data = global_data.project(area_def, mode='nearest')
            # pick an area_def I have actually created one based on the extent of
            # VIIRS swath. That is advanced so just pick one of the built in area_def
            # for where your swath is located to get the hang of it.
            loclocal_data = local_data['I05']
            img = loclocal_data.as_image(stretched=False)
            img.time_slot = time_slot
            # you can save the image as a geotiff below#
            img.geotiff_save(outfile, compression=0, tags=None, gdal_options=None,
                             blocksize=0, geotransform=None, spatialref=None,
                             floating_point=True)
        except:
            print(filename)
for filename in re[:]:
    # Define input files and output files
    geofile = os.path.join(input_folder, filename)
    outfile = os.path.join(output_folder, filename.replace("GITCO", "VIIRS_SVM07").replace(".h5", ".tif"))
    if not os.path.exists(outfile):
        try:
            # Collect general data from the name of the input files
            year = np.int((geofile.split(os.sep)[-1]).split('_')[2][1:5])
            month = np.int((geofile.split(os.sep)[-1]).split('_')[2][5:7])
            day = np.int((geofile.split(os.sep)[-1]).split('_')[2][7:9])
            hour = np.int((geofile.split(os.sep)[-1]).split('_')[3][1:3])
            minute = np.int((geofile.split(os.sep)[-1]).split('_')[3][3:5])
            orbit = (geofile.split(os.sep)[-1]).split('_')[5][1:6]
            endHour = np.int((geofile.split(os.sep)[-1]).split('_')[4][1:3])
            endMinute = np.int((geofile.split(os.sep)[-1]).split('_')[4][3:5])
            start = datetime(year, month, day, hour, minute)
            end = datetime(year, month, day, endHour, endMinute)
            # geofile is just your GITCO* file
            time_slot = datetime(year, month, day, hour, minute)
            global_data = PolarFactory.create_scene("npp", "", "viirs",
                                                    time_slot, orbit)
            '''
            import utils
            import osr
            import pyproj
            srs = osr.SpatialReference()
            srs.ImportFromEPSG(3857)
            wgs84 = osr.SpatialReference()
            wgs84.ImportFromEPSG(4326)
            proj4_args = srs.ExportToProj4()
            proj4_args = '%s %s %s %s %s %s %s %s %s' % (proj4_args.split(' ')[0][1:], \
                proj4_args.split(' ')[1][1:], proj4_args.split(' ')[2][1:], proj4_args.split(' ')[3][1:] \
                , proj4_args.split(' ')[4][1:], proj4_args.split(' ')[5][1:], proj4_args.split(' ')[6][1:] \
                , proj4_args.split(' ')[7][1:], proj4_args.split(' ')[8][1:])
            latlim = [12, 37]
            lonlim = [-20, 0]
            osng = osr.SpatialReference()
            osng.ImportFromEPSG(3857)
            wgs84 = osr.SpatialReference()
            wgs84.ImportFromEPSG(4326)
            wgs84 = pyproj.Proj("+init=EPSG:4326")  # UK Ordnance Survey, 1936 datum
            geoProj = pyproj.Proj("+init=EPSG:3857")
            ur = pyproj.transform(wgs84, geoProj, lonlim[1], latlim[1])
            ll = pyproj.transform(wgs84, geoProj, lonlim[0], latlim[0])
            area_extent = (ll[0], ll[1], ur[0], ur[1])
            area_id = 'viirs_data'
            area_name = 'viirs_data'
            proj_id = 'viirs_data'
            area_def = utils.get_area_def(area_id, area_name, proj_id, proj4_args, int(6000), int(8000), area_extent)
            '''
            from mpop.projector import get_area_def
            area_def = get_area_def("TUN375")
            global_data.load(['M07'], time_interval=(start, end))
            # 1: 0.64 2: 0.87 3:1.61 4:3.74 5: 11.5
            # global_data.image.channel_image(0.87)  # .show()
            local_data = global_data.project(area_def, mode='nearest')
            # pick an area_def I have actually created one based on the extent of
            # VIIRS swath. That is advanced so just pick one of the built in area_def
            # for where your swath is located to get the hang of it.
            loclocal_data = local_data['M07']
            img = loclocal_data.as_image(stretched=False)
            img.time_slot = time_slot
            # you can save the image as a geotiff below#
            img.geotiff_save(outfile, compression=0, tags=None, gdal_options=None,
                             blocksize=0, geotransform=None, spatialref=None,
                             floating_point=True)
        except:
            print(filename)
for filename in re[:]:
    # Define input files and output files
    geofile = os.path.join(input_folder, filename)
    outfile = os.path.join(output_folder, filename.replace("GITCO", "VIIRS_SVM10").replace(".h5", ".tif"))
    if not os.path.exists(outfile):
        try:
            # Collect general data from the name of the input files
            year = np.int((geofile.split(os.sep)[-1]).split('_')[2][1:5])
            month = np.int((geofile.split(os.sep)[-1]).split('_')[2][5:7])
            day = np.int((geofile.split(os.sep)[-1]).split('_')[2][7:9])
            hour = np.int((geofile.split(os.sep)[-1]).split('_')[3][1:3])
            minute = np.int((geofile.split(os.sep)[-1]).split('_')[3][3:5])
            orbit = (geofile.split(os.sep)[-1]).split('_')[5][1:6]
            endHour = np.int((geofile.split(os.sep)[-1]).split('_')[4][1:3])
            endMinute = np.int((geofile.split(os.sep)[-1]).split('_')[4][3:5])
            start = datetime(year, month, day, hour, minute)
            end = datetime(year, month, day, endHour, endMinute)
            # geofile is just your GITCO* file
            time_slot = datetime(year, month, day, hour, minute)
            global_data = PolarFactory.create_scene("npp", "", "viirs",
                                                    time_slot, orbit)
            '''
            import utils
            import osr
            import pyproj
            srs = osr.SpatialReference()
            srs.ImportFromEPSG(3857)
            wgs84 = osr.SpatialReference()
            wgs84.ImportFromEPSG(4326)
            proj4_args = srs.ExportToProj4()
            proj4_args = '%s %s %s %s %s %s %s %s %s' % (proj4_args.split(' ')[0][1:], \
                proj4_args.split(' ')[1][1:], proj4_args.split(' ')[2][1:], proj4_args.split(' ')[3][1:] \
                , proj4_args.split(' ')[4][1:], proj4_args.split(' ')[5][1:], proj4_args.split(' ')[6][1:] \
                , proj4_args.split(' ')[7][1:], proj4_args.split(' ')[8][1:])
            latlim = [12, 37]
            lonlim = [-20, 0]
            osng = osr.SpatialReference()
            osng.ImportFromEPSG(3857)
            wgs84 = osr.SpatialReference()
            wgs84.ImportFromEPSG(4326)
            wgs84 = pyproj.Proj("+init=EPSG:4326")  # UK Ordnance Survey, 1936 datum
            geoProj = pyproj.Proj("+init=EPSG:3857")
            ur = pyproj.transform(wgs84, geoProj, lonlim[1], latlim[1])
            ll = pyproj.transform(wgs84, geoProj, lonlim[0], latlim[0])
            area_extent = (ll[0], ll[1], ur[0], ur[1])
            area_id = 'viirs_data'
            area_name = 'viirs_data'
            proj_id = 'viirs_data'
            area_def = utils.get_area_def(area_id, area_name, proj_id, proj4_args, int(6000), int(8000), area_extent)
            '''
            from mpop.projector import get_area_def
            area_def = get_area_def("TUN375")
            global_data.load(['M10'], time_interval=(start, end))
            # 1: 0.64 2: 0.87 3:1.61 4:3.74 5: 11.5
            # global_data.image.channel_image(1.6)  # .show()
            local_data = global_data.project(area_def, mode='nearest')
            # pick an area_def I have actually created one based on the extent of
            # VIIRS swath. That is advanced so just pick one of the built in area_def
            # for where your swath is located to get the hang of it.
            loclocal_data = local_data['M10']
            img = loclocal_data.as_image(stretched=False)
            img.time_slot = time_slot
            # you can save the image as a geotiff below#
            img.geotiff_save(outfile, compression=0, tags=None, gdal_options=None,
                             blocksize=0, geotransform=None, spatialref=None,
                             floating_point=True)
        except:
            print(filename)
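Each of the three loops above re-derives the acquisition window by slicing fields out of the GITCO filename. A standalone sketch of that parsing, with the field layout assumed from the slicing done in this script and a hypothetical example filename:

```python
from datetime import datetime

def parse_gitco_name(filename):
    # Layout assumed from the slicing in the loops above:
    # GITCO_npp_dYYYYMMDD_tHHMMSSS_eHHMMSSS_bORBIT_...
    parts = filename.split('_')
    year = int(parts[2][1:5])
    month = int(parts[2][5:7])
    day = int(parts[2][7:9])
    start = datetime(year, month, day, int(parts[3][1:3]), int(parts[3][3:5]))
    end = datetime(year, month, day, int(parts[4][1:3]), int(parts[4][3:5]))
    orbit = parts[5][1:6]
    return start, end, orbit

# Example with a made-up filename
start, end, orbit = parse_gitco_name("GITCO_npp_d20160728_t0812000_e0824000_b24642_c0_all.h5")
print(start, end, orbit)  # 2016-07-28 08:12:00 2016-07-28 08:24:00 24642
```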
#!/usr/bin/env python
# Source: fhirclient/r4models/library_tests.py
# Repo: Healthedata1/Flask-PL (MIT)
# -*- coding: utf-8 -*-
#
# Generated from FHIR 4.0.0-a53ec6ee1b on 2019-05-07.
# 2019, SMART Health IT.
import os
import io
import unittest
import json
from . import library
from .fhirdate import FHIRDate
class LibraryTests(unittest.TestCase):
    def instantiate_from(self, filename):
        datadir = os.environ.get('FHIR_UNITTEST_DATADIR') or ''
        with io.open(os.path.join(datadir, filename), 'r', encoding='utf-8') as handle:
            js = json.load(handle)
            self.assertEqual("Library", js["resourceType"])
        return library.Library(js)

    def testLibrary1(self):
        inst = self.instantiate_from("library-predecessor-example.json")
        self.assertIsNotNone(inst, "Must have instantiated a Library instance")
        self.implLibrary1(inst)

        js = inst.as_json()
        self.assertEqual("Library", js["resourceType"])
        inst2 = library.Library(js)
        self.implLibrary1(inst2)
    def implLibrary1(self, inst):
        self.assertEqual(inst.content[0].contentType, "text/cql")
        self.assertEqual(inst.content[0].title, "FHIR Helpers")
        self.assertEqual(inst.content[0].url, "library-fhir-helpers-content.cql")
        self.assertEqual(inst.date.date, FHIRDate("2016-11-14").date)
        self.assertEqual(inst.date.as_json(), "2016-11-14")
        self.assertEqual(inst.description, "FHIR Helpers")
        self.assertTrue(inst.experimental)
        self.assertEqual(inst.id, "library-fhir-helpers-predecessor")
        self.assertEqual(inst.identifier[0].use, "official")
        self.assertEqual(inst.identifier[0].value, "FHIRHelpers")
        self.assertEqual(inst.meta.tag[0].code, "HTEST")
        self.assertEqual(inst.meta.tag[0].display, "test health data")
        self.assertEqual(inst.meta.tag[0].system, "http://terminology.hl7.org/CodeSystem/v3-ActReason")
        self.assertEqual(inst.relatedArtifact[0].resource, "Library/fhir-model-definition")
        self.assertEqual(inst.relatedArtifact[0].type, "depends-on")
        self.assertEqual(inst.relatedArtifact[1].resource, "Library/library-fhir-helpers")
        self.assertEqual(inst.relatedArtifact[1].type, "successor")
        self.assertEqual(inst.status, "active")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.title, "FHIR Helpers")
        self.assertEqual(inst.topic[0].text, "FHIR Helpers")
        self.assertEqual(inst.type.coding[0].code, "logic-library")
        self.assertEqual(inst.version, "1.6")
    def testLibrary2(self):
        inst = self.instantiate_from("library-cms146-example.json")
        self.assertIsNotNone(inst, "Must have instantiated a Library instance")
        self.implLibrary2(inst)

        js = inst.as_json()
        self.assertEqual("Library", js["resourceType"])
        inst2 = library.Library(js)
        self.implLibrary2(inst2)
    def implLibrary2(self, inst):
        self.assertEqual(inst.content[0].contentType, "text/cql")
        self.assertEqual(inst.content[0].url, "library-cms146-example-content.cql")
        self.assertEqual(inst.dataRequirement[0].type, "Patient")
        self.assertEqual(inst.dataRequirement[1].codeFilter[0].code[0].code, "diagnosis")
        self.assertEqual(inst.dataRequirement[1].codeFilter[0].path, "category")
        self.assertEqual(inst.dataRequirement[1].codeFilter[1].code[0].code, "confirmed")
        self.assertEqual(inst.dataRequirement[1].codeFilter[1].path, "clinicalStatus")
        self.assertEqual(inst.dataRequirement[1].codeFilter[2].path, "code")
        self.assertEqual(inst.dataRequirement[1].codeFilter[2].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.102.12.1011")
        self.assertEqual(inst.dataRequirement[1].type, "Condition")
        self.assertEqual(inst.dataRequirement[2].codeFilter[0].code[0].code, "diagnosis")
        self.assertEqual(inst.dataRequirement[2].codeFilter[0].path, "category")
        self.assertEqual(inst.dataRequirement[2].codeFilter[1].code[0].code, "confirmed")
        self.assertEqual(inst.dataRequirement[2].codeFilter[1].path, "clinicalStatus")
        self.assertEqual(inst.dataRequirement[2].codeFilter[2].path, "code")
        self.assertEqual(inst.dataRequirement[2].codeFilter[2].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.102.12.1012")
        self.assertEqual(inst.dataRequirement[2].type, "Condition")
        self.assertEqual(inst.dataRequirement[3].codeFilter[0].code[0].code, "finished")
        self.assertEqual(inst.dataRequirement[3].codeFilter[0].path, "status")
        self.assertEqual(inst.dataRequirement[3].codeFilter[1].code[0].code, "ambulatory")
        self.assertEqual(inst.dataRequirement[3].codeFilter[1].path, "class")
        self.assertEqual(inst.dataRequirement[3].codeFilter[2].path, "type")
        self.assertEqual(inst.dataRequirement[3].codeFilter[2].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.101.12.1061")
        self.assertEqual(inst.dataRequirement[3].type, "Encounter")
        self.assertEqual(inst.dataRequirement[4].codeFilter[0].path, "diagnosis")
        self.assertEqual(inst.dataRequirement[4].codeFilter[0].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.198.12.1012")
        self.assertEqual(inst.dataRequirement[4].type, "DiagnosticReport")
        self.assertEqual(inst.dataRequirement[5].codeFilter[0].path, "code")
        self.assertEqual(inst.dataRequirement[5].codeFilter[0].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.196.12.1001")
        self.assertEqual(inst.dataRequirement[5].type, "Medication")
        self.assertEqual(inst.dataRequirement[6].codeFilter[0].code[0].code, "active")
        self.assertEqual(inst.dataRequirement[6].codeFilter[0].path, "status")
        self.assertEqual(inst.dataRequirement[6].codeFilter[1].path, "medication.code")
        self.assertEqual(inst.dataRequirement[6].codeFilter[1].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.196.12.1001")
        self.assertEqual(inst.dataRequirement[6].type, "MedicationRequest")
        self.assertEqual(inst.dataRequirement[7].codeFilter[0].code[0].code, "completed")
        self.assertEqual(inst.dataRequirement[7].codeFilter[0].path, "status")
        self.assertEqual(inst.dataRequirement[7].codeFilter[1].path, "medication.code")
        self.assertEqual(inst.dataRequirement[7].codeFilter[1].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.196.12.1001")
        self.assertEqual(inst.dataRequirement[7].type, "MedicationStatement")
        self.assertEqual(inst.date.date, FHIRDate("2015-07-22").date)
        self.assertEqual(inst.date.as_json(), "2015-07-22")
        self.assertEqual(inst.description, "Logic for CMS 146: Appropriate Testing for Children with Pharyngitis")
        self.assertEqual(inst.id, "library-cms146-example")
        self.assertEqual(inst.identifier[0].use, "official")
        self.assertEqual(inst.identifier[0].value, "CMS146")
        self.assertEqual(inst.meta.tag[0].code, "HTEST")
self.assertEqual(inst.meta.tag[0].display, "test health data")
self.assertEqual(inst.meta.tag[0].system, "http://terminology.hl7.org/CodeSystem/v3-ActReason")
self.assertEqual(inst.relatedArtifact[0].resource, "Library/library-quick-model-definition")
self.assertEqual(inst.relatedArtifact[0].type, "depends-on")
self.assertEqual(inst.status, "draft")
self.assertEqual(inst.text.status, "generated")
self.assertEqual(inst.title, "Appropriate Testing for Children with Pharyngitis")
self.assertEqual(inst.type.coding[0].code, "logic-library")
self.assertEqual(inst.version, "2.0.0")
def testLibrary3(self):
inst = self.instantiate_from("library-example.json")
self.assertIsNotNone(inst, "Must have instantiated a Library instance")
self.implLibrary3(inst)
js = inst.as_json()
self.assertEqual("Library", js["resourceType"])
inst2 = library.Library(js)
self.implLibrary3(inst2)
def implLibrary3(self, inst):
self.assertEqual(inst.content[0].contentType, "text/cql")
self.assertEqual(inst.content[0].url, "library-example-content.cql")
self.assertEqual(inst.dataRequirement[0].codeFilter[0].path, "code")
self.assertEqual(inst.dataRequirement[0].codeFilter[0].valueSet, "urn:oid:2.16.840.1.113883.3.464.1003.111.12.1006")
self.assertEqual(inst.dataRequirement[0].type, "Condition")
self.assertEqual(inst.date.date, FHIRDate("2015-07-22").date)
self.assertEqual(inst.date.as_json(), "2015-07-22")
self.assertEqual(inst.description, "Common Logic for adherence to Chlamydia Screening guidelines")
self.assertEqual(inst.id, "example")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "ChalmydiaScreening_Common")
self.assertEqual(inst.meta.tag[0].code, "HTEST")
self.assertEqual(inst.meta.tag[0].display, "test health data")
self.assertEqual(inst.meta.tag[0].system, "http://terminology.hl7.org/CodeSystem/v3-ActReason")
self.assertEqual(inst.relatedArtifact[0].resource, "Library/library-quick-model-definition")
self.assertEqual(inst.relatedArtifact[0].type, "depends-on")
self.assertEqual(inst.status, "draft")
self.assertEqual(inst.text.status, "generated")
self.assertEqual(inst.title, "Chlamydia Screening Common Library")
self.assertEqual(inst.topic[0].text, "Chlamydia Screening")
self.assertEqual(inst.type.coding[0].code, "logic-library")
self.assertEqual(inst.version, "2.0.0")
def testLibrary4(self):
inst = self.instantiate_from("library-composition-example.json")
self.assertIsNotNone(inst, "Must have instantiated a Library instance")
self.implLibrary4(inst)
js = inst.as_json()
self.assertEqual("Library", js["resourceType"])
inst2 = library.Library(js)
self.implLibrary4(inst2)
def implLibrary4(self, inst):
self.assertEqual(inst.date.date, FHIRDate("2017-03-10").date)
self.assertEqual(inst.date.as_json(), "2017-03-10")
self.assertEqual(inst.description, "Artifacts required for implementation of Zika Virus Management")
self.assertEqual(inst.id, "composition-example")
self.assertEqual(inst.identifier[0].system, "http://example.org")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "Zika Artifacts")
self.assertEqual(inst.meta.tag[0].code, "HTEST")
self.assertEqual(inst.meta.tag[0].display, "test health data")
self.assertEqual(inst.meta.tag[0].system, "http://terminology.hl7.org/CodeSystem/v3-ActReason")
self.assertEqual(inst.relatedArtifact[0].resource, "ActivityDefinition/administer-zika-virus-exposure-assessment")
self.assertEqual(inst.relatedArtifact[0].type, "composed-of")
self.assertEqual(inst.relatedArtifact[1].resource, "ActivityDefinition/order-serum-zika-dengue-virus-igm")
self.assertEqual(inst.relatedArtifact[1].type, "composed-of")
self.assertEqual(inst.relatedArtifact[2].resource, "ActivityDefinition/provide-mosquito-prevention-advice")
self.assertEqual(inst.relatedArtifact[2].type, "composed-of")
self.assertEqual(inst.relatedArtifact[3].resource, "Library/zika-virus-intervention-logic")
self.assertEqual(inst.relatedArtifact[3].type, "composed-of")
self.assertEqual(inst.relatedArtifact[4].resource, "PlanDefinition/zika-virus-intervention")
self.assertEqual(inst.relatedArtifact[4].type, "composed-of")
self.assertEqual(inst.relatedArtifact[5].resource, "Questionnaire/zika-virus-exposure-assessment")
self.assertEqual(inst.relatedArtifact[5].type, "composed-of")
self.assertEqual(inst.relatedArtifact[6].type, "derived-from")
self.assertEqual(inst.relatedArtifact[6].url, "https://www.cdc.gov/mmwr/volumes/65/wr/mm6539e1.htm?s_cid=mm6539e1_w")
self.assertEqual(inst.status, "draft")
self.assertEqual(inst.text.status, "generated")
self.assertEqual(inst.title, "Zika Artifacts")
self.assertEqual(inst.topic[0].text, "Zika Virus Management")
self.assertEqual(inst.type.coding[0].code, "asset-collection")
self.assertEqual(inst.version, "1.0.0")
# -*- coding: utf-8 -*-
import yaml
import sys
argvs = sys.argv
argc = len(argvs)
text = ""
if __name__ == "__main__":
if (argc != 3):
print('usage: python sync_generator.py argv[1] argv[2]')
print('\t' + 'argv[1]: input *.yaml file')
print('\t' + 'argv[2]: output *.cpp file')
sys.exit(1)
f_config = open(argvs[1], 'r')
f_generate = open(argvs[2], 'w')
data = yaml.safe_load(f_config)  # safe_load: plain yaml.load is unsafe/deprecated without a Loader
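For reference, the generator derives every C++ identifier from simple string transforms of the config values; a minimal sketch with hypothetical placeholder values (the key names mirror the `data[...]` accesses below):

```python
# Hypothetical config entries; the topic and type names are placeholders,
# but the key names mirror the data[...] accesses in this script.
data = {
    "sub1": "/scan",
    "sub1_header": "sensor_msgs/LaserScan",
}

# topic leaf name, used for generated C++ variable names
print(data["sub1"].split("/")[-1])             # scan
# ROS message path converted to a C++ type name
print(data["sub1_header"].replace("/", "::"))  # sensor_msgs::LaserScan
```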
text = '/* ----header---- */\n'
text += '/* common header */\n'
text += '#include "ros/ros.h"\n'
text += '#include <ros/callback_queue.h>\n'
text += '#include <boost/circular_buffer.hpp>\n'
text += '#include <vector>\n'
text += '#include <stdio.h>\n'
text += '#include <stdlib.h>\n'
text += '#include <string.h>\n'
text += '#include <signal.h>\n'
text += '#include <sys/stat.h>\n'
text += '#include <sys/select.h>\n'
text += '#include <mqueue.h>\n'
text += '#include <fcntl.h>\n'
text += '#include <errno.h>\n'
text += '#include <unistd.h>\n'
text += '#include <pthread.h>\n'
text += '#include "t_sync_message.h"\n'
text += '/* user header */\n'
text += '#include "%s.h"\n' % data['sub1_header']
text += '#include "%s.h"\n' % data['sub2_header']
text += '#include "%s.h"\n' % data['sync_sub1_header']
text += '#include "%s.h"\n' % data['sync_sub2_header']
text += '\n/* ----mode---- */\n'
text += '#define _REQ_PUB %s\n\n' % data['req_pub_mode']
text += '/* ----var---- */\n'
text += '/* common var */\n'
text += 'bool buf_flag;\n'
text += 'pthread_mutex_t mutex;\n'
text += '/* user var */\n'
text += 'boost::circular_buffer<%s> %s_ringbuf(%s);\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1], data['sub1_ringbuf'])
text += 'boost::circular_buffer<%s> %s_ringbuf(%s);\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1], data['sub2_ringbuf'])
text += 'ros::Publisher %s_pub;\n' % data['pub1'].split('/')[-1]
text += 'ros::Publisher %s_pub;\n' % data['pub2'].split('/')[-1]
text += 'bool %s_flag;\n' % data['sync_sub1'].split('/')[-1]
text += 'bool %s_flag;\n\n' % data['sync_sub2'].split('/')[-1]
text += '/* ----function---- */\n'
text += 'double fabs_time_diff(std_msgs::Header *timespec1, std_msgs::Header *timespec2) {\n'
text += ' double time1 = (double)timespec1->stamp.sec + (double)timespec1->stamp.nsec/1000000000L;\n'
text += ' double time2 = (double)timespec2->stamp.sec + (double)timespec2->stamp.nsec/1000000000L;\n\n'
text += ' return fabs(time1 - time2);\n'
text += '}\n\n'
text += 'double get_time(const std_msgs::Header *timespec) {\n'
text += ' return (double)timespec->stamp.sec + (double)timespec->stamp.nsec/1000000000L;\n'
text += '}\n\n\n'
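The two helpers just emitted convert a `std_msgs::Header` stamp to floating-point seconds and compare stamps; the same arithmetic as a quick Python sanity check (values are hypothetical):

```python
NSEC_PER_SEC = 1_000_000_000

def get_time(sec, nsec):
    # mirrors the generated C++ get_time(): sec + nsec / 1e9
    return sec + nsec / NSEC_PER_SEC

def fabs_time_diff(a, b):
    # mirrors the generated C++ fabs_time_diff(): absolute stamp difference
    return abs(get_time(*a) - get_time(*b))

print(fabs_time_diff((10, 500_000_000), (10, 250_000_000)))  # 0.25
```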
text += '#if _REQ_PUB\n'
text += '%s* p_%s_buf;\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += '%s* p_%s_buf;\n\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (data['sub1'].split('/')[-1], data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' pthread_mutex_lock(&mutex);\n'
text += ' %s_ringbuf.push_front(*%s_msg);\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' //%s is empty\n' % data['sub2'].split('/')[-1]
text += ' if (%s_ringbuf.begin() == %s_ringbuf.end()) {\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' ROS_INFO("%s ring buffer is empty");\n' % data['sub2'].split('/')[-1]
text += ' return;\n'
text += ' }\n'
text += ' buf_flag = true;\n'
text += ' pthread_mutex_unlock(&mutex);\n'
text += '}\n\n'
text += 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (data['sub2'].split('/')[-1], data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' pthread_mutex_lock(&mutex);\n'
text += ' %s_ringbuf.push_front(*%s_msg);\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' //%s is empty\n' % data['sub1'].split('/')[-1]
text += ' if (%s_ringbuf.begin() == %s_ringbuf.end()) {\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' ROS_INFO("%s ring buffer is empty");\n' % data['sub1'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' buf_flag = true;\n'
text += ' pthread_mutex_unlock(&mutex);\n'
text += '}\n'
text += '\n'
text += 'void publish_msg(%s* p_%s_buf, %s* p_%s_buf)\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1], data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += '{\n'
text += ' ROS_INFO("publish");\n'
text += ' %s_pub.publish(*p_%s_buf);\n' % (data['pub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' %s_pub.publish(*p_%s_buf);\n' % (data['pub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += '}\n\n'
text += 'bool publish() {\n'
text += ' if (buf_flag) {\n'
text += ' pthread_mutex_lock(&mutex);\n\n'
text += ' //%s is empty\n' % data['sub1'].split('/')[-1]
text += ' if (%s_ringbuf.begin() == %s_ringbuf.end()) {\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' ROS_INFO("%s ring buffer is empty");\n'% data['sub1'].split('/')[-1]
text += ' return false;\n'
text += ' }\n\n'
text += ' //%s is empty\n' % data['sub2'].split('/')[-1]
text += ' if (%s_ringbuf.begin() == %s_ringbuf.end()) {\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' ROS_INFO("%s ring buffer is empty");\n'% data['sub2'].split('/')[-1]
text += ' return false;\n'
text += ' }\n\n'
if data['sched_policy'] == 1:
text += ' // %s > %s\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' if (get_time(&(%s_ringbuf.front().header)) >= get_time(&(%s_ringbuf.front().header))) {\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' p_%s_buf = &(%s_ringbuf.front());\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub1'].split('/')[-1]
text += ' p_%s_buf = &*it;\n' % data['sub1'].split('/')[-1]
text += ' publish_msg(p_%s_buf, p_%s_buf);\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return true;\n'
text += ' } else {\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub1'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['sub2'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['sub2'].split('/')[-1]
text += ' p_%s_buf = &*(it-1);\n' % data['sub1'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub1'].split('/')[-1]
text += ' p_%s_buf = &(%s_ringbuf.back());\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' }\n'
text += ' }\n'
text += ' }\n'
text += ' // %s < %s\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' else {\n'
text += ' p_%s_buf = &(%s_ringbuf.front());\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub2'].split('/')[-1]
text += ' p_%s_buf = &*it;\n' % data['sub2'].split('/')[-1]
text += ' publish_msg(p_%s_buf, p_%s_buf);\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return true;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub2'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['sub1'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['sub1'].split('/')[-1]
text += ' p_%s_buf = &*(it-1);\n' % data['sub2'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub2'].split('/')[-1]
text += ' p_%s_buf = &(%s_ringbuf.back());\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' }\n'
text += ' }\n'
elif data['sched_policy'] == 2:
text += ' p_%s_buf = &(%s_ringbuf.front());\n' % (data['short_rate'].split('/')[-1], data['short_rate'].split('/')[-1])
if data['short_rate'] == data['sub1'] :
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub2'].split('/')[-1]
text += ' p_%s_buf = &*it;\n' % data['sub2'].split('/')[-1]
text += ' publish_msg(p_%s_buf, p_%s_buf);\n' % (data['short_rate'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return true;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub2'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['short_rate'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['short_rate'].split('/')[-1]
text += ' p_%s_buf = &*(it-1);\n' % data['sub2'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub2'].split('/')[-1]
text += ' p_%s_buf = &(%s_ringbuf.back());\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' }\n'
elif data['short_rate'] == data['sub2']:
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub1'].split('/')[-1]
text += ' p_%s_buf = &*it;\n' % data['sub1'].split('/')[-1]
text += ' publish_msg(p_%s_buf, p_%s_buf);\n' % (data['short_rate'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return true;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub1'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['short_rate'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['short_rate'].split('/')[-1]
text += ' p_%s_buf = &*(it-1);\n' % data['sub1'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub1'].split('/')[-1]
text += ' p_%s_buf = &(%s_ringbuf.back());\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' }\n'
else :
print("failed: sched_policy 2 requires short_rate to match sub1 or sub2")
text += ' publish_msg(p_%s_buf, p_%s_buf);\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return true;\n'
text += ' } else {\n'
text += ' return false;\n'
text += ' }\n'
text += '}\n'
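The nearest-neighbor search generated above (front of the "late" ring buffer matched against the other buffer, newest entry first) reduces to the following sketch, shown with hypothetical stamps:

```python
def nearest(target, stamps):
    # stamps are ordered newest-first, like the generated ring buffers;
    # stop as soon as the previous candidate is closer than the next entry,
    # mirroring the fabs_time_diff(it-1) < fabs_time_diff(it) test above
    best = stamps[0]
    for s in stamps[1:]:
        if abs(target - best) < abs(target - s):
            break
        best = s
    return best

print(nearest(10.30, [10.45, 10.32, 10.10, 9.90]))  # 10.32
```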
text += '#else\n'
text += '%s %s_buf;\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += '%s %s_buf;\n\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (data['sub1'].split('/')[-1], data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' pthread_mutex_lock(&mutex);\n'
text += ' %s_ringbuf.push_front(*%s_msg);\n\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' //%s is empty\n' % data['sub2'].split('/')[-1]
text += ' if (%s_ringbuf.begin() == %s_ringbuf.end()) {\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' ROS_INFO("%s ring buffer is empty");\n' % data['sub2'].split('/')[-1]
text += ' return;\n'
text += ' }\n\n'
text += ' buf_flag = true;\n\n'
if data['sched_policy'] == 1:
text += ' // %s > %s\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' if (get_time(&(%s_ringbuf.front().header)) >= get_time(&(%s_ringbuf.front().header))) {\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' %s_buf = %s_ringbuf.front();\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub1'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' } else {\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub1'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['sub2'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub1'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' }\n'
text += ' }\n\n'
text += ' } else {\n'
text += ' %s_buf = %s_ringbuf.front();\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub2'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub2'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['sub1'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub2'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' }\n'
text += ' }\n'
elif data['sched_policy'] == 2:
text += ' %s_buf = %s_ringbuf.front();\n' % (data['short_rate'].split('/')[-1], data['short_rate'].split('/')[-1])
if data['short_rate'] == data['sub1'] :
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub2'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub2'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['short_rate'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['short_rate'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub2'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' }\n'
elif data['short_rate'] == data['sub2']:
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub1'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub1'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['short_rate'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['short_rate'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub1'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' }\n'
else :
print("failed: sched_policy 2 requires short_rate to match sub1 or sub2")
text += ' pthread_mutex_unlock(&mutex);\n'
text += '}\n\n'
text += 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (data['sub2'].split('/')[-1], data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' pthread_mutex_lock(&mutex);\n'
text += ' %s_ringbuf.push_front(*%s_msg);\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' //%s is empty\n' % data['sub1'].split('/')[-1]
text += ' if (%s_ringbuf.begin() == %s_ringbuf.end()) {\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' ROS_INFO("%s ring buffer is empty");\n' % data['sub1'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' buf_flag = true;\n\n'
if data['sched_policy'] == 1:
text += ' // %s > %s\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' if (get_time(&(%s_ringbuf.front().header)) >= get_time(&(%s_ringbuf.front().header))) {\n' % (data['sub1'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' %s_buf = %s_ringbuf.front();\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub1'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' } else {\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub1'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['sub2'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub1'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' }\n'
text += ' }\n\n'
text += ' } else {\n'
text += ' %s_buf = %s_ringbuf.front();\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub2'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub2'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['sub1'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub2'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' }\n'
text += ' }\n'
elif data['sched_policy'] == 2:
text += ' %s_buf = %s_ringbuf.front();\n' % (data['short_rate'].split('/')[-1], data['short_rate'].split('/')[-1])
if data['short_rate'] == data['sub1'] :
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub2_header'].replace('/', '::'), data['sub2'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub2'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub2'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['short_rate'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['short_rate'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub2'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub2'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' }\n'
elif data['short_rate'] == data['sub2']:
text += ' boost::circular_buffer<%s>::iterator it = %s_ringbuf.begin();\n' % (data['sub1_header'].replace('/', '::'), data['sub1'].split('/')[-1])
text += ' if (%s_ringbuf.size() == 1) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = *it;\n' % data['sub1'].split('/')[-1]
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return;\n'
text += ' }\n\n'
text += ' for (it++; it != %s_ringbuf.end(); it++) {\n' % data['sub1'].split('/')[-1]
text += ' if (fabs_time_diff(&(%s_ringbuf.front().header), &((it-1)->header))\n' % data['short_rate'].split('/')[-1]
text += ' < fabs_time_diff(&(%s_ringbuf.front().header), &(it->header))) {\n' % data['short_rate'].split('/')[-1]
text += ' %s_buf = *(it-1);\n' % data['sub1'].split('/')[-1]
text += ' break;\n'
text += ' }\n'
text += ' }\n\n'
text += ' if (it == %s_ringbuf.end()) {\n' % data['sub1'].split('/')[-1]
text += ' %s_buf = %s_ringbuf.back();\n' % (data['sub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' }\n'
else :
print("failed: sched_policy 2 requires short_rate to match sub1 or sub2")
text += ' pthread_mutex_unlock(&mutex);\n'
text += '}\n\n'
text += 'bool publish() {\n'
text += ' if (buf_flag) {\n'
text += ' pthread_mutex_lock(&mutex);\n'
text += ' // scan_ringbuf.clear();\n'
text += ' // image_ringbuf.clear();\n'
text += ' // scan_ringbuf.push_front(scan_buf);\n'
text += ' // image_ringbuf.push_front(image_buf);\n'
text += ' ROS_INFO("publish");\n'
text += ' %s_pub.publish(%s_buf);\n' % (data['pub1'].split('/')[-1], data['sub1'].split('/')[-1])
text += ' %s_pub.publish(%s_buf);\n' % (data['pub2'].split('/')[-1], data['sub2'].split('/')[-1])
text += ' pthread_mutex_unlock(&mutex);\n'
text += ' return true;\n'
text += ' } else {\n'
text += ' ROS_INFO("publish failed");\n'
text += ' return false;\n'
text += ' }\n'
text += '}\n'
text += '#endif\n\n'
text += 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (data['sync_sub1'].split('/')[-1], data['sync_sub1_header'].replace('/', '::'), data['sync_sub1'].split('/')[-1])
text += ' if (%s_flag) {\n' % data['sync_sub1'].split('/')[-1]
text += ' %s_flag = false;\n' % data['sync_sub1'].split('/')[-1]
text += ' %s_flag = false;\n' % data['sync_sub2'].split('/')[-1]
text += ' return;\n'
text += ' }\n\n'
text += ' %s_flag = true;\n' % data['sync_sub1'].split('/')[-1]
text += ' if (%s_flag) {\n' % data['sync_sub2'].split('/')[-1]
text += ' ROS_INFO("catch publish request");\n'
text += ' if(!publish()) {\n'
text += ' /* if publishing fails, retry until it succeeds or shutdown */\n'
text += ' struct timespec sleep_time;\n'
text += ' sleep_time.tv_sec = 0;\n'
text += ' sleep_time.tv_nsec = 200000000; //5Hz\n'
text += ' while (!publish() && ros::ok())\n'
text += ' nanosleep(&sleep_time, NULL);\n'
text += ' }\n'
text += ' %s_flag = false;\n' % data['sync_sub1'].split('/')[-1]
text += ' %s_flag = false;\n' % data['sync_sub2'].split('/')[-1]
text += ' }\n'
text += '}\n'
text += 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (data['sync_sub2'].split('/')[-1], data['sync_sub2_header'].replace('/', '::'), data['sync_sub2'].split('/')[-1])
text += ' if (%s_flag) {\n' % data['sync_sub2'].split('/')[-1]
text += ' %s_flag = false;\n' % data['sync_sub1'].split('/')[-1]
text += ' %s_flag = false;\n' % data['sync_sub2'].split('/')[-1]
text += ' return;\n'
text += ' }\n\n'
text += ' %s_flag = true;\n' % data['sync_sub2'].split('/')[-1]
text += ' if (%s_flag) {\n' % data['sync_sub1'].split('/')[-1]
text += ' ROS_INFO("catch publish request");\n'
text += ' if(!publish()) {\n'
text += ' /* if publishing fails, retry until it succeeds */\n'
text += ' struct timespec sleep_time;\n'
text += ' sleep_time.tv_sec = 0;\n'
text += ' sleep_time.tv_nsec = 200000000; // retry at 5Hz\n'
text += ' while (!publish() && ros::ok())\n'
text += ' nanosleep(&sleep_time, NULL);\n'
text += ' }\n'
text += ' %s_flag = false;\n' % data['sync_sub1'].split('/')[-1]
text += ' %s_flag = false;\n' % data['sync_sub2'].split('/')[-1]
text += ' }\n'
text += '}\n\n'
text += 'void* thread(void* args)\n'
text += '{\n'
text += ' ros::NodeHandle nh_rcv;\n'
text += ' ros::CallbackQueue rcv_callbackqueue;\n'
text += ' nh_rcv.setCallbackQueue(&rcv_callbackqueue);\n'
text += ' ros::Subscriber %s_sub = nh_rcv.subscribe("%s", 5, %s_callback);\n' % (data['sync_sub1'].split('/')[-1], data['sync_sub1'].split('/')[-1], data['sync_sub1'].split('/')[-1])
text += ' ros::Subscriber %s_sub = nh_rcv.subscribe("%s", 5, %s_callback);\n' % (data['sync_sub2'].split('/')[-1], data['sync_sub2'].split('/')[-1], data['sync_sub2'].split('/')[-1])
text += ' while (nh_rcv.ok())\n'
text += ' rcv_callbackqueue.callAvailable(ros::WallDuration(1.0f));\n'
text += ' return NULL;\n'
text += '}\n\n'
text += 'int main(int argc, char **argv) {\n'
text += ' ros::init(argc, argv, "%s");\n' % data['node_name'].split('/')[-1]
text += ' ros::NodeHandle nh;\n\n'
text += ' /* create server thread */\n'
text += ' pthread_t th;\n'
text += ' pthread_create(&th, NULL, thread, (void *)NULL );\n\n'
text += ' ros::Subscriber %s_sub = nh.subscribe("%s", 1, %s_callback);\n' % (data['sub1'].split('/')[-1], data['sub1'], data['sub1'].split('/')[-1])
text += ' ros::Subscriber %s_sub = nh.subscribe("%s", 1, %s_callback);\n' % (data['sub2'].split('/')[-1], data['sub2'], data['sub2'].split('/')[-1])
text += ' %s_pub = nh.advertise<%s>("%s", 5);\n' % (data['pub1'].split('/')[-1], data['sub1_header'].replace('/', '::'), data['pub1'])
text += ' %s_pub = nh.advertise<%s>("%s", 5);\n' % (data['pub2'].split('/')[-1], data['sub2_header'].replace('/', '::'), data['pub2'])
text += ' while (!buf_flag) {\n'
text += ' ros::spinOnce();\n'
text += ' }\n'
text += ' if(!publish()) {\n'
text += ' /* if publishing fails, retry until it succeeds */\n'
text += ' struct timespec sleep_time;\n'
text += ' sleep_time.tv_sec = 0;\n'
text += ' sleep_time.tv_nsec = 200000000; // retry at 5Hz\n'
text += ' while (!publish() && ros::ok())\n'
text += ' nanosleep(&sleep_time, NULL);\n'
text += ' }\n\n'
text += ' ros::spin();\n\n'
text += ' /* shutdown server thread */\n'
text += ' ROS_INFO("wait until shutdown a thread");\n'
text += ' pthread_kill(th, SIGINT);\n'
text += ' pthread_join(th, NULL);\n\n'
text += ' return 0;\n'
text += '}\n'
f_generate.write(text)
f_config.close()
f_generate.close()
print("generated")
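The generator above emits the C++ source by repeated `%s` substitution into string templates: variable names are derived from the last segment of each ROS topic path via `split('/')[-1]`, and message types from the header path with `/` replaced by `::`. A minimal, self-contained sketch of that naming pattern (the topic and header values below are hypothetical examples, not taken from the config):

```python
def callback_decl(topic, header):
    """Build one line of C++ source from a ROS topic path and a message header path."""
    name = topic.split('/')[-1]           # '/sensors/scan' -> 'scan'
    msg_type = header.replace('/', '::')  # 'sensor_msgs/LaserScan' -> 'sensor_msgs::LaserScan'
    return 'void %s_callback(const %s::ConstPtr& %s_msg) {\n' % (name, msg_type, name)

print(callback_decl('/sensors/scan', 'sensor_msgs/LaserScan'))
# void scan_callback(const sensor_msgs::LaserScan::ConstPtr& scan_msg) {
```

The same substitution is used for the subscriber, publisher, and ring-buffer names, which is why every topic in the config must have a unique final path segment.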
# -*- coding: utf-8 -*-
# Source: fhir/resources/STU3/tests/test_valueset.py (mmabey/fhir.resources, BSD-3-Clause)
"""
Profile: http://hl7.org/fhir/StructureDefinition/ValueSet
Release: STU3
Version: 3.0.2
Revision: 11917
Last updated: 2019-10-24T11:53:00+11:00
"""
import io
import json
import os
import unittest
import pytest
from .. import valueset
from ..fhirdate import FHIRDate
from .fixtures import force_bytes
@pytest.mark.usefixtures("base_settings")
class ValueSetTests(unittest.TestCase):
def instantiate_from(self, filename):
datadir = os.environ.get("FHIR_UNITTEST_DATADIR") or ""
with io.open(os.path.join(datadir, filename), "r", encoding="utf-8") as handle:
js = json.load(handle)
self.assertEqual("ValueSet", js["resourceType"])
return valueset.ValueSet(js)
def testValueSet1(self):
inst = self.instantiate_from("valueset-encounter-status.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet1(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet1(inst2)
def implValueSet1(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/encounter-status"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].system), force_bytes("email")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].value),
force_bytes("fhir@lists.hl7.org"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description), force_bytes("Current state of the encounter")
)
self.assertFalse(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 2)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("pa"))
self.assertEqual(force_bytes(inst.id), force_bytes("encounter-status"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.241"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("EncounterStatus"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("HL7 (FHIR Project)"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/encounter-status"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet2(self):
inst = self.instantiate_from("valueset-report-status-codes.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet2(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet2(inst2)
def implValueSet2(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/report-status-codes"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].system), force_bytes("email")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].value),
force_bytes("fhir@lists.hl7.org"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("The current status of the test report."),
)
self.assertFalse(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 0)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("fhir"))
self.assertEqual(force_bytes(inst.id), force_bytes("report-status-codes"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.712"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("TestReportStatus"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("HL7 (FHIR Project)"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/report-status-codes"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet3(self):
inst = self.instantiate_from("valueset-note-type.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet3(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet3(inst2)
def implValueSet3(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/note-type"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].system), force_bytes("email")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].value),
force_bytes("fhir@lists.hl7.org"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("The presentation types of notes."),
)
self.assertFalse(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 2)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("fm"))
self.assertEqual(force_bytes(inst.id), force_bytes("note-type"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.15"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("NoteType"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("HL7 (FHIR Project)"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url), force_bytes("http://hl7.org/fhir/ValueSet/note-type")
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet4(self):
inst = self.instantiate_from("valueset-sequence-quality-method.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet4(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet4(inst2)
def implValueSet4(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("https://precision.fda.gov/apps/"),
)
self.assertEqual(
force_bytes(inst.compose.include[1].system),
force_bytes("https://precision.fda.gov/jobs/"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("This value set includes sequence quality method"),
)
self.assertTrue(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 1)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("cg"))
self.assertEqual(force_bytes(inst.id), force_bytes("sequence-quality-method"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.218"),
)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("FDA-Method"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("FHIR Project team"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/sequence-quality-method"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet5(self):
inst = self.instantiate_from("valueset-issue-severity.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet5(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet5(inst2)
def implValueSet5(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/issue-severity"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].system), force_bytes("email")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].value),
force_bytes("fhir@lists.hl7.org"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("How the issue affects the success of the action."),
)
self.assertFalse(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 5)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("fhir"))
self.assertEqual(force_bytes(inst.id), force_bytes("issue-severity"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.397"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("IssueSeverity"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("HL7 (FHIR Project)"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/issue-severity"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet6(self):
inst = self.instantiate_from("valueset-sequence-referenceSeq.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet6(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet6(inst2)
def implValueSet6(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://www.ensembl.org"),
)
self.assertEqual(
force_bytes(inst.compose.include[1].system),
force_bytes("http://www.ncbi.nlm.nih.gov/nuccore"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("This value set includes all Reference codes"),
)
self.assertTrue(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 1)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("cg"))
self.assertEqual(force_bytes(inst.id), force_bytes("sequence-referenceSeq"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.216"),
)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("ENSEMBL"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("FHIR Project team"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/sequence-referenceSeq"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet7(self):
inst = self.instantiate_from("valueset-process-outcome.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet7(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet7(inst2)
def implValueSet7(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/processoutcomecodes"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.copyright), force_bytes("This is an example set.")
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("This value set includes sample Process Outcome codes."),
)
self.assertTrue(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 1)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("fm"))
self.assertEqual(force_bytes(inst.id), force_bytes("process-outcome"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.677"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("Process Outcome Codes"))
self.assertEqual(
force_bytes(inst.publisher), force_bytes("Financial Management")
)
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/process-outcome"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet8(self):
inst = self.instantiate_from("valueset-claim-exception.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet8(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet8(inst2)
def implValueSet8(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/claim-exception"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.copyright), force_bytes("This is an example set.")
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes("This value set includes sample Exception codes."),
)
self.assertTrue(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 1)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("fm"))
self.assertEqual(force_bytes(inst.id), force_bytes("claim-exception"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.572"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(force_bytes(inst.name), force_bytes("Exception Codes"))
self.assertEqual(
force_bytes(inst.publisher), force_bytes("Financial Management")
)
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/claim-exception"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
def testValueSet9(self):
inst = self.instantiate_from("valueset-object-lifecycle-events.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet9(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet9(inst2)
def implValueSet9(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/dicom-audit-lifecycle"),
)
self.assertEqual(
force_bytes(inst.compose.include[1].system),
force_bytes("http://hl7.org/fhir/iso-21089-lifecycle"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].system), force_bytes("email")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[1].value),
force_bytes("fhir@lists.hl7.org"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-02-19T18:00:00+01:00").date)
self.assertEqual(inst.date.as_json(), "2017-02-19T18:00:00+01:00")
self.assertEqual(
force_bytes(inst.description),
force_bytes(
"This example FHIR value set is comprised of lifecycle event codes. The FHIR Actor value set is based on DICOM Audit Message, ParticipantObjectDataLifeCycle; ISO Standard, TS 21089-2017; "
),
)
self.assertFalse(inst.experimental)
self.assertTrue(inst.extensible)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[0].valueCode), force_bytes("sec"))
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[1].valueString), force_bytes("Trial Use")
)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[2].valueInteger, 3)
self.assertEqual(force_bytes(inst.id), force_bytes("object-lifecycle-events"))
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(force_bytes(inst.name), force_bytes("ObjectLifecycleEvents"))
self.assertEqual(force_bytes(inst.publisher), force_bytes("HL7 (FHIR Project)"))
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(
force_bytes(inst.text.div),
force_bytes(
'<div xmlns="http://www.w3.org/1999/xhtml"> This value set includes codes from multiple codesets. </div>'
),
)
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/object-lifecycle-events"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("1.1.0"))
def testValueSet10(self):
inst = self.instantiate_from("valueset-entformula-additive.json")
self.assertIsNotNone(inst, "Must have instantiated a ValueSet instance")
self.implValueSet10(inst)
js = inst.as_json()
self.assertEqual("ValueSet", js["resourceType"])
inst2 = valueset.ValueSet(js)
self.implValueSet10(inst2)
def implValueSet10(self, inst):
self.assertEqual(
force_bytes(inst.compose.include[0].system),
force_bytes("http://hl7.org/fhir/entformula-additive"),
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].system), force_bytes("url")
)
self.assertEqual(
force_bytes(inst.contact[0].telecom[0].value),
force_bytes("http://hl7.org/fhir"),
)
self.assertEqual(inst.date.date, FHIRDate("2017-04-19T07:44:43+10:00").date)
self.assertEqual(inst.date.as_json(), "2017-04-19T07:44:43+10:00")
self.assertTrue(inst.experimental)
self.assertEqual(
force_bytes(inst.extension[0].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status"
),
)
self.assertEqual(
force_bytes(inst.extension[0].valueString), force_bytes("Informative")
)
self.assertEqual(
force_bytes(inst.extension[1].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm"
),
)
self.assertEqual(inst.extension[1].valueInteger, 1)
self.assertEqual(
force_bytes(inst.extension[2].url),
force_bytes(
"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg"
),
)
self.assertEqual(force_bytes(inst.extension[2].valueCode), force_bytes("oo"))
self.assertEqual(force_bytes(inst.id), force_bytes("entformula-additive"))
self.assertEqual(
force_bytes(inst.identifier[0].system), force_bytes("urn:ietf:rfc:3986")
)
self.assertEqual(
force_bytes(inst.identifier[0].value),
force_bytes("urn:oid:2.16.840.1.113883.4.642.3.379"),
)
self.assertTrue(inst.immutable)
self.assertEqual(
inst.meta.lastUpdated.date, FHIRDate("2017-04-19T07:44:43.294+10:00").date
)
self.assertEqual(
inst.meta.lastUpdated.as_json(), "2017-04-19T07:44:43.294+10:00"
)
self.assertEqual(
force_bytes(inst.meta.profile[0]),
force_bytes("http://hl7.org/fhir/StructureDefinition/shareablevalueset"),
)
self.assertEqual(
force_bytes(inst.name), force_bytes("Enteral Formula Additive Type Code")
)
self.assertEqual(
force_bytes(inst.publisher), force_bytes("FHIR NutritionOrder team")
)
self.assertEqual(force_bytes(inst.status), force_bytes("draft"))
self.assertEqual(force_bytes(inst.text.status), force_bytes("generated"))
self.assertEqual(
force_bytes(inst.url),
force_bytes("http://hl7.org/fhir/ValueSet/entformula-additive"),
)
self.assertEqual(force_bytes(inst.version), force_bytes("3.0.1"))
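Every `testValueSetN` above exercises the same round-trip invariant: parse the JSON fixture, run the field checks, serialize with `as_json()`, re-parse, and run the identical checks again. A minimal stand-alone sketch of that pattern, using a hypothetical `StubResource` stand-in instead of fhirclient's generated `valueset.ValueSet` class:

```python
# Illustrative sketch of the parse -> check -> serialize -> re-check
# pattern used by the tests above. StubResource is a hypothetical
# stand-in for fhirclient's generated model classes.
class StubResource:
    def __init__(self, js):
        self._js = dict(js)
        self.version = js.get("version")

    def as_json(self):
        return dict(self._js)


def assert_roundtrip(js, check):
    """Apply `check` to a parsed instance and to its serialized clone."""
    inst = StubResource(js)
    check(inst)                      # check the freshly parsed instance
    inst2 = StubResource(inst.as_json())
    check(inst2)                     # the re-parsed copy must pass too


def _check(r):
    assert r.version == "3.0.1"


assert_roundtrip({"resourceType": "ValueSet", "version": "3.0.1"}, _check)
```

The value of the second `check` call is that it catches fields that parse correctly but are dropped or mangled during serialization.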
| 41.275281 | 210 | 0.610344 | 4,077 | 36,735 | 5.389257 | 0.062055 | 0.184326 | 0.18387 | 0.229838 | 0.910705 | 0.903741 | 0.889905 | 0.87079 | 0.858957 | 0.854815 | 0 | 0.050974 | 0.255016 | 36,735 | 889 | 211 | 41.32171 | 0.751891 | 0.004492 | 0 | 0.650118 | 0 | 0.013002 | 0.21402 | 0.051037 | 0 | 0 | 0 | 0 | 0.343972 | 1 | 0.024823 | false | 0 | 0.009456 | 0 | 0.036643 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6b3bfbb4270f1eb0f50418504e65be30ea23d10b | 34,845 | py | Python | glance/tests/functional/v1/test_api.py | darren-wang/gl | c5c731c7153d6d46c27260474d2811d504dfac5c | [
"Apache-2.0"
] | null | null | null | glance/tests/functional/v1/test_api.py | darren-wang/gl | c5c731c7153d6d46c27260474d2811d504dfac5c | [
"Apache-2.0"
] | null | null | null | glance/tests/functional/v1/test_api.py | darren-wang/gl | c5c731c7153d6d46c27260474d2811d504dfac5c | [
"Apache-2.0"
] | null | null | null | # Copyright 2011 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Functional test case that utilizes httplib2 against the API server"""
import hashlib
import httplib2
from oslo.serialization import jsonutils
from oslo_utils import units
# NOTE(jokke): simplified transition to py3, behaves like py2 xrange
from six.moves import range
from glance.tests import functional
from glance.tests.utils import minimal_headers
from glance.tests.utils import skip_if_disabled
FIVE_KB = 5 * units.Ki
FIVE_GB = 5 * units.Gi
class TestApi(functional.FunctionalTest):
"""Functional tests using httplib2 against the API server"""
@skip_if_disabled
def test_get_head_simple_post(self):
"""
We test the following sequential series of actions:
0. GET /images
- Verify no public images
1. GET /images/detail
- Verify no public images
2. POST /images with public image named Image1
and no custom properties
- Verify 201 returned
3. HEAD image
- Verify HTTP headers have correct information we just added
4. GET image
- Verify all information on image we just added is correct
5. GET /images
- Verify the image we just added is returned
6. GET /images/detail
- Verify the image we just added is returned
7. PUT image with custom properties of "distro" and "arch"
- Verify 200 returned
8. PUT image with too many custom properties
- Verify 413 returned
9. GET image
- Verify updated information about image was stored
10. PUT image
- Remove a previously existing property.
11. PUT image
- Add a previously deleted property.
12. PUT image/members/member1
- Add member1 to image
13. PUT image/members/member2
- Add member2 to image
14. GET image/members
- List image members
15. DELETE image/members/member1
- Delete image member1
16. PUT image/members
- Attempt to replace members with an overlimit amount
17. PUT image/members/member11
- Attempt to add a member while at limit
18. POST /images with another public image named Image2
- attribute and three custom properties, "distro", "arch" & "foo"
- Verify a 200 OK is returned
19. HEAD image2
- Verify image2 found now
20. GET /images
- Verify 2 public images
21. GET /images with filter on user-defined property "distro".
- Verify both images are returned
22. GET /images with filter on user-defined property 'distro' but
- with non-existent value. Verify no images are returned
23. GET /images with filter on non-existent user-defined property
- "boo". Verify no images are returned
24. GET /images with filter 'arch=i386'
- Verify only image2 is returned
25. GET /images with filter 'arch=x86_64'
- Verify only image1 is returned
26. GET /images with filter 'foo=bar'
- Verify only image2 is returned
27. DELETE image1
- Delete image
28. GET image/members
- List deleted image members
29. PUT image/members/member2
- Update existing member2 of deleted image
30. PUT image/members/member3
- Add member3 to deleted image
31. DELETE image/members/member2
- Delete member2 from deleted image
32. DELETE image2
- Delete image
33. GET /images
- Verify no images are listed
"""
self.cleanup()
self.start_servers(**self.__dict__.copy())
# 0. GET /images
# Verify no public images
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"images": []}', content)
# 1. GET /images/detail
# Verify no public images
path = "http://%s:%d/v1/images/detail" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"images": []}', content)
# 2. POST /images with public image named Image1
# attribute and no custom properties. Verify a 200 OK is returned
image_data = "*" * FIVE_KB
headers = minimal_headers('Image1')
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'POST', headers=headers,
body=image_data)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
image_id = data['image']['id']
self.assertEqual(hashlib.md5(image_data).hexdigest(),
data['image']['checksum'])
self.assertEqual(FIVE_KB, data['image']['size'])
self.assertEqual("Image1", data['image']['name'])
self.assertTrue(data['image']['is_public'])
# 3. HEAD image
# Verify image found now
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Image1", response['x-image-meta-name'])
# 4. GET image
# Verify all information on image we just added is correct
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_image_headers = {
'x-image-meta-id': image_id,
'x-image-meta-name': 'Image1',
'x-image-meta-is_public': 'True',
'x-image-meta-status': 'active',
'x-image-meta-disk_format': 'raw',
'x-image-meta-container_format': 'ovf',
'x-image-meta-size': str(FIVE_KB)}
expected_std_headers = {
'content-length': str(FIVE_KB),
'content-type': 'application/octet-stream'}
for expected_key, expected_value in expected_image_headers.items():
self.assertEqual(expected_value, response[expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
response[expected_key]))
for expected_key, expected_value in expected_std_headers.items():
self.assertEqual(expected_value, response[expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
response[expected_key]))
self.assertEqual("*" * FIVE_KB, content)
self.assertEqual(hashlib.md5("*" * FIVE_KB).hexdigest(),
hashlib.md5(content).hexdigest())
# 5. GET /images
# Verify one public image
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_result = {"images": [
{"container_format": "ovf",
"disk_format": "raw",
"id": image_id,
"name": "Image1",
"checksum": "c2e5db72bd7fd153f53ede5da5a06de3",
"size": 5120}]}
self.assertEqual(expected_result, jsonutils.loads(content))
# 6. GET /images/detail
# Verify image and all its metadata
path = "http://%s:%d/v1/images/detail" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_image = {
"status": "active",
"name": "Image1",
"deleted": False,
"container_format": "ovf",
"disk_format": "raw",
"id": image_id,
"is_public": True,
"deleted_at": None,
"properties": {},
"size": 5120}
image = jsonutils.loads(content)
for expected_key, expected_value in expected_image.items():
self.assertEqual(expected_value, image['images'][0][expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
image['images'][0][expected_key]))
# 7. PUT image with custom properties of "distro" and "arch"
# Verify 200 returned
headers = {'X-Image-Meta-Property-Distro': 'Ubuntu',
'X-Image-Meta-Property-Arch': 'x86_64'}
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual("x86_64", data['image']['properties']['arch'])
self.assertEqual("Ubuntu", data['image']['properties']['distro'])
# 8. PUT image with too many custom properties
# Verify 413 returned
headers = {}
for i in range(11): # configured limit is 10
headers['X-Image-Meta-Property-foo%d' % i] = 'bar'
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(413, response.status)
# 9. GET /images/detail
# Verify image and all its metadata
path = "http://%s:%d/v1/images/detail" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_image = {
"status": "active",
"name": "Image1",
"deleted": False,
"container_format": "ovf",
"disk_format": "raw",
"id": image_id,
"is_public": True,
"deleted_at": None,
"properties": {'distro': 'Ubuntu', 'arch': 'x86_64'},
"size": 5120}
image = jsonutils.loads(content)
for expected_key, expected_value in expected_image.items():
self.assertEqual(expected_value, image['images'][0][expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
image['images'][0][expected_key]))
# 10. PUT image and remove a previously existing property.
headers = {'X-Image-Meta-Property-Arch': 'x86_64'}
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
path = "http://%s:%d/v1/images/detail" % ("127.0.0.1", self.api_port)
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['images'][0]
self.assertEqual(1, len(data['properties']))
self.assertEqual("x86_64", data['properties']['arch'])
# 11. PUT image and add a previously deleted property.
headers = {'X-Image-Meta-Property-Distro': 'Ubuntu',
'X-Image-Meta-Property-Arch': 'x86_64'}
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
path = "http://%s:%d/v1/images/detail" % ("127.0.0.1", self.api_port)
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['images'][0]
self.assertEqual(2, len(data['properties']))
self.assertEqual("x86_64", data['properties']['arch'])
self.assertEqual("Ubuntu", data['properties']['distro'])
self.assertNotEqual(data['created_at'], data['updated_at'])
# 12. Add member to image
path = ("http://%s:%d/v1/images/%s/members/pattieblack" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'PUT')
self.assertEqual(204, response.status)
# 13. Add member to image
path = ("http://%s:%d/v1/images/%s/members/pattiewhite" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'PUT')
self.assertEqual(204, response.status)
# 14. List image members
path = ("http://%s:%d/v1/images/%s/members" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['members']))
self.assertEqual('pattieblack', data['members'][0]['member_id'])
self.assertEqual('pattiewhite', data['members'][1]['member_id'])
# 15. Delete image member
path = ("http://%s:%d/v1/images/%s/members/pattieblack" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'DELETE')
self.assertEqual(204, response.status)
# 16. Attempt to replace members with an overlimit amount
# Adding 11 image members should fail since configured limit is 10
path = ("http://%s:%d/v1/images/%s/members" %
("127.0.0.1", self.api_port, image_id))
memberships = []
for i in range(11):
member_id = "foo%d" % i
memberships.append(dict(member_id=member_id))
http = httplib2.Http()
body = jsonutils.dumps(dict(memberships=memberships))
response, content = http.request(path, 'PUT', body=body)
self.assertEqual(413, response.status)
# 17. Attempt to add a member while at limit
# Adding an 11th member should fail since configured limit is 10
path = ("http://%s:%d/v1/images/%s/members" %
("127.0.0.1", self.api_port, image_id))
memberships = []
for i in range(10):
member_id = "foo%d" % i
memberships.append(dict(member_id=member_id))
http = httplib2.Http()
body = jsonutils.dumps(dict(memberships=memberships))
response, content = http.request(path, 'PUT', body=body)
self.assertEqual(204, response.status)
path = ("http://%s:%d/v1/images/%s/members/fail_me" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'PUT')
self.assertEqual(413, response.status)
# 18. POST /images with another public image named Image2
# attribute and three custom properties, "distro", "arch" & "foo".
# Verify a 200 OK is returned
image_data = "*" * FIVE_KB
headers = minimal_headers('Image2')
headers['X-Image-Meta-Property-Distro'] = 'Ubuntu'
headers['X-Image-Meta-Property-Arch'] = 'i386'
headers['X-Image-Meta-Property-foo'] = 'bar'
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'POST', headers=headers,
body=image_data)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
image2_id = data['image']['id']
self.assertEqual(hashlib.md5(image_data).hexdigest(),
data['image']['checksum'])
self.assertEqual(FIVE_KB, data['image']['size'])
self.assertEqual("Image2", data['image']['name'])
self.assertTrue(data['image']['is_public'])
self.assertEqual('Ubuntu', data['image']['properties']['distro'])
self.assertEqual('i386', data['image']['properties']['arch'])
self.assertEqual('bar', data['image']['properties']['foo'])
# 19. HEAD image2
# Verify image2 found now
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image2_id)
http = httplib2.Http()
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Image2", response['x-image-meta-name'])
# 20. GET /images
# Verify 2 public images
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(2, len(images))
self.assertEqual(image2_id, images[0]['id'])
self.assertEqual(image_id, images[1]['id'])
# 21. GET /images with filter on user-defined property 'distro'.
# Verify both images are returned
path = "http://%s:%d/v1/images?property-distro=Ubuntu" % (
"127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(2, len(images))
self.assertEqual(image2_id, images[0]['id'])
self.assertEqual(image_id, images[1]['id'])
# 22. GET /images with filter on user-defined property 'distro' but
# with non-existent value. Verify no images are returned
path = "http://%s:%d/v1/images?property-distro=fedora" % (
"127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(0, len(images))
# 23. GET /images with filter on non-existent user-defined property
# 'boo'. Verify no images are returned
path = "http://%s:%d/v1/images?property-boo=bar" % ("127.0.0.1",
self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(0, len(images))
# 24. GET /images with filter 'arch=i386'
# Verify only image2 is returned
path = "http://%s:%d/v1/images?property-arch=i386" % ("127.0.0.1",
self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(1, len(images))
self.assertEqual(image2_id, images[0]['id'])
# 25. GET /images with filter 'arch=x86_64'
# Verify only image1 is returned
path = "http://%s:%d/v1/images?property-arch=x86_64" % ("127.0.0.1",
self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(1, len(images))
self.assertEqual(image_id, images[0]['id'])
# 26. GET /images with filter 'foo=bar'
# Verify only image2 is returned
path = "http://%s:%d/v1/images?property-foo=bar" % ("127.0.0.1",
self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(1, len(images))
self.assertEqual(image2_id, images[0]['id'])
# 27. DELETE image1
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'DELETE')
self.assertEqual(200, response.status)
# 28. Try to list members of deleted image
path = ("http://%s:%d/v1/images/%s/members" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(404, response.status)
# 29. Try to update member of deleted image
path = ("http://%s:%d/v1/images/%s/members" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
fixture = [{'member_id': 'pattieblack', 'can_share': 'false'}]
body = jsonutils.dumps(dict(memberships=fixture))
response, content = http.request(path, 'PUT', body=body)
self.assertEqual(404, response.status)
# 30. Try to add member to deleted image
path = ("http://%s:%d/v1/images/%s/members/chickenpattie" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'PUT')
self.assertEqual(404, response.status)
# 31. Try to delete member of deleted image
path = ("http://%s:%d/v1/images/%s/members/pattieblack" %
("127.0.0.1", self.api_port, image_id))
http = httplib2.Http()
response, content = http.request(path, 'DELETE')
self.assertEqual(404, response.status)
# 32. DELETE image2
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image2_id)
http = httplib2.Http()
response, content = http.request(path, 'DELETE')
self.assertEqual(200, response.status)
# 33. GET /images
# Verify no images are listed
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
images = jsonutils.loads(content)['images']
self.assertEqual(0, len(images))
# 34. HEAD /images/detail
path = "http://%s:%d/v1/images/detail" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'HEAD')
self.assertEqual(405, response.status)
self.assertEqual('GET', response.get('allow'))
        self.stop_servers()

    def test_download_non_exists_image_raises_http_forbidden(self):
"""
We test the following sequential series of actions:
0. POST /images with public image named Image1
and no custom properties
- Verify 201 returned
1. HEAD image
- Verify HTTP headers have correct information we just added
2. GET image
- Verify all information on image we just added is correct
3. DELETE image1
- Delete the newly added image
4. GET image
- Verify that 403 HTTPForbidden exception is raised prior to
404 HTTPNotFound
"""
self.cleanup()
self.start_servers(**self.__dict__.copy())
image_data = "*" * FIVE_KB
headers = minimal_headers('Image1')
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'POST', headers=headers,
body=image_data)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
image_id = data['image']['id']
self.assertEqual(hashlib.md5(image_data).hexdigest(),
data['image']['checksum'])
self.assertEqual(FIVE_KB, data['image']['size'])
self.assertEqual("Image1", data['image']['name'])
self.assertTrue(data['image']['is_public'])
# 1. HEAD image
# Verify image found now
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Image1", response['x-image-meta-name'])
# 2. GET /images
# Verify one public image
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_result = {"images": [
{"container_format": "ovf",
"disk_format": "raw",
"id": image_id,
"name": "Image1",
"checksum": "c2e5db72bd7fd153f53ede5da5a06de3",
"size": 5120}]}
self.assertEqual(expected_result, jsonutils.loads(content))
# 3. DELETE image1
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'DELETE')
self.assertEqual(200, response.status)
# 4. GET image
# Verify that 403 HTTPForbidden exception is raised prior to
# 404 HTTPNotFound
rules = {"download_image": '!'}
self.set_policy_rules(rules)
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(403, response.status)
        self.stop_servers()

    def test_download_non_exists_image_raises_http_not_found(self):
"""
We test the following sequential series of actions:
0. POST /images with public image named Image1
and no custom properties
- Verify 201 returned
1. HEAD image
- Verify HTTP headers have correct information we just added
2. GET image
- Verify all information on image we just added is correct
3. DELETE image1
- Delete the newly added image
4. GET image
- Verify that 404 HTTPNotFound exception is raised
"""
self.cleanup()
self.start_servers(**self.__dict__.copy())
image_data = "*" * FIVE_KB
headers = minimal_headers('Image1')
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'POST', headers=headers,
body=image_data)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
image_id = data['image']['id']
self.assertEqual(hashlib.md5(image_data).hexdigest(),
data['image']['checksum'])
self.assertEqual(FIVE_KB, data['image']['size'])
self.assertEqual("Image1", data['image']['name'])
self.assertTrue(data['image']['is_public'])
# 1. HEAD image
# Verify image found now
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Image1", response['x-image-meta-name'])
# 2. GET /images
# Verify one public image
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_result = {"images": [
{"container_format": "ovf",
"disk_format": "raw",
"id": image_id,
"name": "Image1",
"checksum": "c2e5db72bd7fd153f53ede5da5a06de3",
"size": 5120}]}
self.assertEqual(expected_result, jsonutils.loads(content))
# 3. DELETE image1
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'DELETE')
self.assertEqual(200, response.status)
# 4. GET image
# Verify that 404 HTTPNotFound exception is raised
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image_id)
http = httplib2.Http()
response, content = http.request(path, 'GET')
self.assertEqual(404, response.status)
        self.stop_servers()

    def test_status_cannot_be_manipulated_directly(self):
self.cleanup()
self.start_servers(**self.__dict__.copy())
headers = minimal_headers('Image1')
# Create a 'queued' image
http = httplib2.Http()
headers = {'Content-Type': 'application/octet-stream',
'X-Image-Meta-Disk-Format': 'raw',
'X-Image-Meta-Container-Format': 'bare'}
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
response, content = http.request(path, 'POST', headers=headers,
body=None)
self.assertEqual(201, response.status)
image = jsonutils.loads(content)['image']
self.assertEqual('queued', image['status'])
# Ensure status of 'queued' image can't be changed
path = "http://%s:%d/v1/images/%s" % ("127.0.0.1", self.api_port,
image['id'])
http = httplib2.Http()
headers = {'X-Image-Meta-Status': 'active'}
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(403, response.status)
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual('queued', response['x-image-meta-status'])
# We allow 'setting' to the same status
http = httplib2.Http()
headers = {'X-Image-Meta-Status': 'queued'}
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual('queued', response['x-image-meta-status'])
# Make image active
http = httplib2.Http()
headers = {'Content-Type': 'application/octet-stream'}
response, content = http.request(path, 'PUT', headers=headers,
body='data')
self.assertEqual(200, response.status)
image = jsonutils.loads(content)['image']
self.assertEqual('active', image['status'])
# Ensure status of 'active' image can't be changed
http = httplib2.Http()
headers = {'X-Image-Meta-Status': 'queued'}
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(403, response.status)
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual('active', response['x-image-meta-status'])
# We allow 'setting' to the same status
http = httplib2.Http()
headers = {'X-Image-Meta-Status': 'active'}
response, content = http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
response, content = http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual('active', response['x-image-meta-status'])
# Create a 'queued' image, ensure 'status' header is ignored
http = httplib2.Http()
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
headers = {'Content-Type': 'application/octet-stream',
'X-Image-Meta-Status': 'active'}
response, content = http.request(path, 'POST', headers=headers,
body=None)
self.assertEqual(201, response.status)
image = jsonutils.loads(content)['image']
self.assertEqual('queued', image['status'])
# Create an 'active' image, ensure 'status' header is ignored
http = httplib2.Http()
path = "http://%s:%d/v1/images" % ("127.0.0.1", self.api_port)
headers = {'Content-Type': 'application/octet-stream',
'X-Image-Meta-Disk-Format': 'raw',
'X-Image-Meta-Status': 'queued',
'X-Image-Meta-Container-Format': 'bare'}
response, content = http.request(path, 'POST', headers=headers,
body='data')
self.assertEqual(201, response.status)
image = jsonutils.loads(content)['image']
self.assertEqual('active', image['status'])
self.stop_servers()
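The checksum assertions above (`hashlib.md5(image_data).hexdigest()`) rely on a Python 2 `str` being a byte string; under Python 3 the payload must be encoded before hashing. A small compatibility sketch (`md5_hexdigest` is a hypothetical helper, not part of glance):

```python
import hashlib


def md5_hexdigest(payload):
    """MD5 hex digest that accepts str or bytes (Python 3 safe)."""
    if isinstance(payload, str):
        payload = payload.encode("utf-8")
    return hashlib.md5(payload).hexdigest()


# The fixture used throughout the tests above: 5 KiB of "*".
assert md5_hexdigest("*" * 5120) == "c2e5db72bd7fd153f53ede5da5a06de3"
```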
| 43.178439 | 78 | 0.566078 | 4,057 | 34,845 | 4.794676 | 0.083313 | 0.09562 | 0.058606 | 0.080197 | 0.872764 | 0.841507 | 0.82845 | 0.810405 | 0.780331 | 0.771694 | 0 | 0.042296 | 0.300445 | 34,845 | 806 | 79 | 43.23201 | 0.755702 | 0.192797 | 0 | 0.849421 | 0 | 0 | 0.178483 | 0.023413 | 0 | 0 | 0 | 0 | 0.249035 | 1 | 0.007722 | false | 0 | 0.015444 | 0 | 0.025097 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6b463c5adefd56f067f2b69217b219b99862db95 | 2,757 | py | Python | PYTHON/8.py | guikingma/project_euler | e4deffc63f39a304e449c12ee1fa1233c6f70e91 | [
"WTFPL"
] | 1 | 2015-11-06T07:04:22.000Z | 2015-11-06T07:04:22.000Z | PYTHON/8.py | guikingma/project_euler | e4deffc63f39a304e449c12ee1fa1233c6f70e91 | [
"WTFPL"
] | null | null | null | PYTHON/8.py | guikingma/project_euler | e4deffc63f39a304e449c12ee1fa1233c6f70e91 | [
"WTFPL"
] | null | null | null | '''
8:
Find the greatest product of five consecutive digits in the 1000-digit number.
'''
# Project Euler #8: find the five adjacent digits in the 1000-digit number
# below that have the greatest product.
str_number = '7316717653133062491922511967442657474235534919493496983520312774506326239578318016984801869478851843858615607891129494954595017379583319528532088055111254069874715852386305071569329096329522744304355766896648950445244523161731856403098711121722383113622298934233803081353362766142828064444866452387493035890729629049156044077239071381051585930796086670172427121883998797908792274921901699720888093776657273330010533678812202354218097512545405947522435258490771167055601360483958644670632441572215539753697817977846174064955149290862569321978468622482839722413756570560574902614079729686524145351004748216637048440319989000889524345065854122758866688116427171479924442928230863465674813919123162824586178664583591245665294765456828489128831426076900422421902267105562632111110937054421750694165896040807198403850962455444362981230987879927244284909188845801561660979191338754992005240636899125607176060588611646710940507754100225698315520005593572972571636269561882670428252483600823257530420752963450'


def mult5(a, b, c, d, e):
    return a * b * c * d * e


def maxi_mult(p_str_number):
    """Return the five adjacent digits with the greatest product, plus that product."""
    best_digits = (0, 0, 0, 0, 0)
    maxi = 0
    for i in range(len(p_str_number) - 4):
        digits = tuple(int(ch) for ch in p_str_number[i:i + 5])
        mult = mult5(*digits)
        if mult > maxi:
            best_digits = digits
            maxi = mult
    return (*best_digits, maxi)


def execute():
    print(maxi_mult(str_number))


execute()
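# The sliding-window search above can be written more compactly with
# math.prod; a minimal sketch (the function name is illustrative, not from
# the original file):

```python
from math import prod


def max_window_product(digits: str, k: int = 5) -> int:
    """Largest product of k adjacent digits in a digit string."""
    return max(prod(int(ch) for ch in digits[i:i + k])
               for i in range(len(digits) - k + 1))
```

# Calling max_window_product(str_number) should reproduce the maxi value
# returned as the last element by maxi_mult above.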
import unicodedata
import regex
from flashtext import KeywordProcessor
from build_rindex.build_rvindex import load_from_file
from build_rindex.rvindex_scoring import get_query_ngrams, get_query_doc_score
from wiki_util.title_entities_set import get_title_entity_set
import wiki_util.title_entities_set
from wiki_util import wiki_db_tool
from tqdm import tqdm
import config
from utils import common
from evaluation import ext_hotpot_eval
import re
from hotpot_doc_retri import retrieval_utils
from typing import Dict, List, Tuple
import collections
import json
from hotpot_doc_retri.retrieval_utils import RetrievedItem, RetrievedSet
def filter_disamb_doc(input_string):
    return ' (disambiguation)' in input_string


def check_arabic(input_string):
    res = re.findall(
        r'[\U00010E60-\U00010E7F]|[\U0001EE00-\U0001EEFF]|[\u0750-\u077F]|[\u08A0-\u08FF]|[\uFB50-\uFDFF]|[\uFE70-\uFEFF]|[\u0600-\u06FF]',
        input_string)
    return len(res) != 0


def filter_document_id(input_string, remove_disambiguation_doc=True):
    pid_words = input_string.strip().replace('_', ' ')
    match = re.search('[a-zA-Z]', pid_words)
    if match is None:  # filter out ids that contain no alphabetic characters
        return True
    elif check_arabic(pid_words):  # remove ids that contain Arabic characters
        return True
    else:
        if remove_disambiguation_doc and filter_disamb_doc(input_string):
            return True
        return False
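# filter_document_id keeps only titles containing at least one Latin letter
# and no Arabic characters; a dependency-free sketch of the Latin-letter
# check (the function name is illustrative):

```python
import re


def has_latin_letter(title: str) -> bool:
    # Mirror the id filter above: underscores become spaces, then look for a-z/A-Z.
    return re.search('[a-zA-Z]', title.strip().replace('_', ' ')) is not None
```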
STOPWORDS = {
    'i', 'me', 'my', 'myself', 'we', 'our', 'ours', 'ourselves', 'you', 'your',
    'yours', 'yourself', 'yourselves', 'he', 'him', 'his', 'himself', 'she',
    'her', 'hers', 'herself', 'it', 'its', 'itself', 'they', 'them', 'their',
    'theirs', 'themselves', 'what', 'which', 'who', 'whom', 'this', 'that',
    'these', 'those', 'am', 'is', 'are', 'was', 'were', 'be', 'been', 'being',
    'have', 'has', 'had', 'having', 'do', 'does', 'did', 'doing', 'a', 'an',
    'the', 'and', 'but', 'if', 'or', 'because', 'as', 'until', 'while', 'of',
    'at', 'by', 'for', 'with', 'about', 'against', 'between', 'into', 'through',
    'during', 'before', 'after', 'above', 'below', 'to', 'from', 'up', 'down',
    'in', 'out', 'on', 'off', 'over', 'under', 'again', 'further', 'then',
    'once', 'here', 'there', 'when', 'where', 'why', 'how', 'all', 'any',
    'both', 'each', 'few', 'more', 'most', 'other', 'some', 'such', 'no', 'nor',
    'not', 'only', 'own', 'same', 'so', 'than', 'too', 'very', 's', 't', 'can',
    'will', 'just', 'don', 'should', 'now', 'd', 'll', 'm', 'o', 're', 've',
    'y', 'ain', 'aren', 'couldn', 'didn', 'doesn', 'hadn', 'hasn', 'haven',
    'isn', 'ma', 'mightn', 'mustn', 'needn', 'shan', 'shouldn', 'wasn', 'weren',
    'won', 'wouldn', "'ll", "'re", "'ve", "n't", "'s", "'d", "'m", "''", "``"
}
def normalize(text):
    """Resolve different types of unicode encodings."""
    return unicodedata.normalize('NFD', text)


def filter_word(text):
    """Filter out English stopwords, punctuation, and compound endings."""
    text = normalize(text)
    if regex.match(r'^\p{P}+$', text):
        return True
    if text.lower() in STOPWORDS:
        return True
    return False
def toy_init_results():
    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    keyword_processor = KeywordProcessor(case_sensitive=True)
    print("Build Processor")
    for kw in tqdm(ner_set):
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor.add_keyword(kw, {kw})

    doc_pred_dict = {'sp_doc': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']
        finded_keys = keyword_processor.extract_keywords(question)
        finded_keys_set = set()
        if isinstance(finded_keys, list) and len(finded_keys) != 0:
            finded_keys_set = set.union(*finded_keys)

        # Add-on: keep only the top-2 longest retrieved titles.
        finded_keys_set = sorted(finded_keys_set, key=lambda x: len(x), reverse=True)
        top_n = 2
        finded_keys_set = finded_keys_set[:top_n]

        doc_pred_dict['sp_doc'][qid] = list(finded_keys_set)

    common.save_json(doc_pred_dict, "toy_doc_rm_stopword_top2_pred_file.json")
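# Because each keyword was added with a set payload ({kw}), flashtext's
# extract_keywords returns a list of sets, and merging them uses
# set.union(*...). A dependency-free sketch of that merge-and-truncate
# pattern, with hypothetical match data:

```python
# Each match yields a set payload, e.g. {'Barack Obama'}; merge them all:
matches = [{'Barack Obama'}, {'Chicago'}, {'Barack Obama'}]
merged = set.union(*matches) if matches else set()

# Keep only the top-2 longest titles, as in the add-on above.
top2 = sorted(merged, key=len, reverse=True)[:2]
```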
def toy_init_results_v1():
    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    keyword_processor = KeywordProcessor(case_sensitive=True)
    print("Build Processor")
    for kw in tqdm(ner_set):
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor.add_keyword(kw, {kw})

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']
        finded_keys = keyword_processor.extract_keywords(question)
        finded_keys_set = set()
        retrieved_set = retrieval_utils.RetrievedSet()

        # 1. Raw keyword matching.
        if isinstance(finded_keys, list) and len(finded_keys) != 0:
            finded_keys_set = set.union(*finded_keys)

        for page_name in finded_keys_set:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm'))

        # 2. Add disambiguation titles. (Comment this out to disable.)
        for keyword in finded_keys_set:
            if keyword in wiki_util.title_entities_set.disambiguation_group:
                for page_name in wiki_util.title_entities_set.disambiguation_group[keyword]:
                    retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb'))

        # Hyperlinked pages are collected for both keyword matches and the disambiguation group.
        finded_keys_set = set(retrieved_set.to_id_list())

        # 3. Add hyperlinked titles from the first paragraph of each retrieved page.
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        hyperlinked_title = []
        for keyword in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                if potential_title in ner_set and potential_title.lower() not in STOPWORDS \
                        and not filter_document_id(potential_title):
                    hyperlinked_title.append(potential_title)

        for page_name in hyperlinked_title:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))

        retrieved_list = retrieved_set.to_id_list()
        doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_file.json")
def toy_init_results_v2():
    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    keyword_processor = KeywordProcessor(case_sensitive=True)
    keyword_processor_disamb = KeywordProcessor(case_sensitive=True)
    print("Build Processor")
    for kw in tqdm(ner_set):
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor.add_keyword(kw, {kw})

    for kw in wiki_util.title_entities_set.disambiguation_group:
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor_disamb.add_keyword(kw, wiki_util.title_entities_set.disambiguation_group[kw])

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']
        finded_keys = keyword_processor.extract_keywords(question)
        finded_keys_set = set()
        retrieved_set = retrieval_utils.RetrievedSet()

        # 1. Raw keyword matching.
        if isinstance(finded_keys, list) and len(finded_keys) != 0:
            finded_keys_set = set.union(*finded_keys)

        for page_name in finded_keys_set:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm'))

        # 2. Add disambiguation titles. (Comment this out to disable.)
        finded_keys_disamb = keyword_processor_disamb.extract_keywords(question)
        finded_keys_disamb_set = set()
        if isinstance(finded_keys_disamb, list) and len(finded_keys_disamb) != 0:
            finded_keys_disamb_set = set.union(*finded_keys_disamb)

        for page_name in finded_keys_disamb_set:
            # Duplicate pages are simply ignored.
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb'))

        # Hyperlinked pages are collected for both keyword matches and the disambiguation group.
        finded_keys_set = set(retrieved_set.to_id_list())

        # 3. Add hyperlinked titles from the first paragraph of each retrieved page.
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        hyperlinked_title = []
        for keyword in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                if potential_title in ner_set and potential_title.lower() not in STOPWORDS \
                        and not filter_document_id(potential_title):
                    hyperlinked_title.append(potential_title)

        for page_name in hyperlinked_title:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))

        retrieved_list = retrieved_set.to_id_list()
        doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_v2_file.json")
def toy_init_results_v3():
    # 2019-03-27
    # Merge raw keyword matching and the disambiguation group into one processor.
    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    keyword_processor = KeywordProcessor(case_sensitive=False)
    # The payload structure for keyword_processor is {keyword: str -> {title: str -> method: str}}.

    print("Build Processor")
    for kw in tqdm(ner_set):
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor.add_keyword(kw, {kw: 'kwm'})

    for kw in wiki_util.title_entities_set.disambiguation_group:
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            if kw in keyword_processor:
                # If kw already exists in the processor, extend its dict with more disamb items.
                existing_dict: Dict = keyword_processor.get_keyword(kw)
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if disamb_kw not in existing_dict:
                        existing_dict[disamb_kw] = 'kwm_disamb'
            else:
                new_dict = dict()
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    new_dict[disamb_kw] = 'kwm_disamb'
                keyword_processor.add_keyword(kw, new_dict)

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']
        finded_keys: List[Dict[str, str]] = keyword_processor.extract_keywords(question)
        finded_keys_list: List[Tuple[str, str]] = []
        retrieved_set = retrieval_utils.RetrievedSet()

        for finded_key in finded_keys:
            for title, method in finded_key.items():
                finded_keys_list.append((title, method))

        # 1. Raw keyword matches first.
        for title, method in finded_keys_list:
            if method == 'kwm':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm'))

        # 2. Then disambiguation matches. Since the two dictionaries were merged
        # above, no separate processing is needed.
        for title, method in finded_keys_list:
            if method == 'kwm_disamb':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm_disamb'))

        # Hyperlinked pages are collected for both keyword matches and the disambiguation group.
        finded_keys_set = set(retrieved_set.to_id_list())

        # 3. Add hyperlinked titles from the first paragraph of each retrieved page.
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        hyperlinked_title = []
        for keyword in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                if potential_title in ner_set and potential_title.lower() not in STOPWORDS \
                        and not filter_document_id(potential_title):
                    hyperlinked_title.append(potential_title)

        for page_name in hyperlinked_title:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))

        retrieved_list = retrieved_set.to_id_list()
        doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_uncased_v3_file.json")
def toy_init_results_v4():
    # 2019-03-28
    # Raw keyword matching first, then disambiguation; remove disambiguation
    # spans that overlap a keyword-matched span.
    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    keyword_processor = KeywordProcessor(case_sensitive=True)
    keyword_processor_disamb = KeywordProcessor(case_sensitive=True)

    print("Build Processor")
    for kw in tqdm(ner_set):
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor.add_keyword(kw, {kw: 'kwm'})

    for kw in wiki_util.title_entities_set.disambiguation_group:
        if kw.lower() in STOPWORDS or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            if kw in keyword_processor:
                # If kw already exists in the processor, extend its dict with more disamb items.
                existing_dict: Dict = keyword_processor.get_keyword(kw)
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if disamb_kw not in existing_dict:
                        existing_dict[disamb_kw] = 'kwm_disamb'
            else:  # Otherwise add it to keyword_processor_disamb, which has lower priority.
                new_dict = dict()
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    new_dict[disamb_kw] = 'kwm_disamb'
                keyword_processor_disamb.add_keyword(kw, new_dict)

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']

        # 1. Raw keyword matching (with span info).
        finded_keys_kwm: List[Tuple[Dict[str, str], int, int]] = \
            keyword_processor.extract_keywords(question, span_info=True)
        finded_keys_kwm_disamb: List[Tuple[Dict[str, str], int, int]] = \
            keyword_processor_disamb.extract_keywords(question, span_info=True)
        finded_keys_list: List[Tuple[str, str, int, int]] = []
        retrieved_set = retrieval_utils.RetrievedSet()

        whole_span = [False for _ in question]
        for finded_key, start, end in finded_keys_kwm:
            for i in range(start, end):
                whole_span[i] = True  # Mark the span as covered by keyword matching.

            for title, method in finded_key.items():
                finded_keys_list.append((title, method, start, end))

        for finded_key, start, end in finded_keys_kwm_disamb:
            valid = True
            for i in range(start, end):
                if whole_span[i]:
                    valid = False  # Ignore disamb items whose span overlaps a keyword match.
                    break

            if valid:
                for title, method in finded_key.items():
                    finded_keys_list.append((title, method, start, end))

        # 1. Raw keyword matches first.
        for title, method, start, end in finded_keys_list:
            if method == 'kwm':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm'))

        # 2. Then disambiguation matches.
        for title, method, start, end in finded_keys_list:
            if method == 'kwm_disamb':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm_disamb'))

        # Hyperlinked pages are collected for both keyword matches and the disambiguation group.
        finded_keys_set = set(retrieved_set.to_id_list())

        # 3. Add hyperlinked titles from the first paragraph of each retrieved page.
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        hyperlinked_title = []
        for keyword in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                if potential_title in ner_set and potential_title.lower() not in STOPWORDS \
                        and not filter_document_id(potential_title):
                    hyperlinked_title.append(potential_title)

        for page_name in hyperlinked_title:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))

        retrieved_list = retrieved_set.to_id_list()
        doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_v4_file.json")
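# The v4 dedup marks every character covered by a keyword match and drops
# disambiguation spans that touch any marked character. A standalone sketch
# of that character-mask technique (names are illustrative):

```python
def non_overlapping(primary, secondary, text_len):
    """Keep secondary (start, end) spans sharing no character with a primary span."""
    covered = [False] * text_len
    for start, end in primary:
        for i in range(start, end):
            covered[i] = True
    return [(s, e) for s, e in secondary if not any(covered[s:e])]
```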
def toy_init_results_v5():
    # 2019-03-28
    # Raw keyword matching first, then disambiguation; remove disambiguation
    # spans that are contained in a keyword-matched span.
    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    keyword_processor = KeywordProcessor(case_sensitive=True)
    keyword_processor_disamb = KeywordProcessor(case_sensitive=True)

    print("Build Processor")
    for kw in tqdm(ner_set):
        if filter_word(kw) or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            keyword_processor.add_keyword(kw, {kw: 'kwm'})

    for kw in wiki_util.title_entities_set.disambiguation_group:
        if filter_word(kw) or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            if kw in keyword_processor:
                # If kw already exists in the processor, extend its dict with more disamb items.
                existing_dict: Dict = keyword_processor.get_keyword(kw)
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if disamb_kw not in existing_dict:
                        existing_dict[disamb_kw] = 'kwm_disamb'
            else:  # Otherwise add it to keyword_processor_disamb, which has lower priority.
                new_dict = dict()
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    new_dict[disamb_kw] = 'kwm_disamb'
                keyword_processor_disamb.add_keyword(kw, new_dict)

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']

        # 1. Raw keyword matching (with span info).
        finded_keys_kwm: List[Tuple[Dict[str, str], int, int]] = \
            keyword_processor.extract_keywords(question, span_info=True)
        finded_keys_kwm_disamb: List[Tuple[Dict[str, str], int, int]] = \
            keyword_processor_disamb.extract_keywords(question, span_info=True)
        finded_keys_list: List[Tuple[str, str, int, int]] = []
        retrieved_set = retrieval_utils.RetrievedSet()

        all_finded_span = []
        for finded_key, start, end in finded_keys_kwm:
            all_finded_span.append((start, end))

            for title, method in finded_key.items():
                finded_keys_list.append((title, method, start, end))

        for finded_key, start, end in finded_keys_kwm_disamb:
            not_valid = False
            for e_start, e_end in all_finded_span:
                if e_start <= start and e_end >= end:
                    not_valid = True  # Ignore disamb spans contained in a keyword-matched span.
                    break

            if not not_valid:
                for title, method in finded_key.items():
                    finded_keys_list.append((title, method, start, end))

        # 1. Raw keyword matches first.
        for title, method, start, end in finded_keys_list:
            if method == 'kwm':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm'))

        # 2. Then disambiguation matches.
        for title, method, start, end in finded_keys_list:
            if method == 'kwm_disamb':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm_disamb'))

        # Hyperlinked pages are collected for both keyword matches and the disambiguation group.
        finded_keys_set = set(retrieved_set.to_id_list())

        # 3. Add hyperlinked titles from the first paragraph of each retrieved page.
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        hyperlinked_title = []
        for keyword in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                if potential_title in ner_set and potential_title.lower() not in STOPWORDS \
                        and not filter_document_id(potential_title):
                    hyperlinked_title.append(potential_title)

        for page_name in hyperlinked_title:
            retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))

        retrieved_list = retrieved_set.to_id_list()
        doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_withiout_hyperlinked_v5_file.json")
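# v5 relaxes the v4 rule: a disambiguation span is dropped only when it is
# fully contained in a keyword-matched span, rather than on any character
# overlap. A standalone sketch of the containment test (names are
# illustrative):

```python
def drop_contained(primary, secondary):
    """Keep secondary (start, end) spans not fully contained in any primary span."""
    return [(s, e) for s, e in secondary
            if not any(ps <= s and pe >= e for ps, pe in primary)]
```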
def toy_init_results_v6():
    # 2019-03-28
    # Raw keyword matching first, then disambiguation; remove overlapping
    # spans between keyword matches and disambiguation matches.
    match_filtering_k = 3

    ner_set = get_title_entity_set()

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    # Load the tf-idf score table:
    g_score_dict = dict()
    load_from_file(g_score_dict,
                   config.PDATA_ROOT / "reverse_indexing/abs_rindexdb/scored_db/default-tf-idf.score.txt")

    keyword_processor = KeywordProcessor(case_sensitive=True)
    keyword_processor_disamb = KeywordProcessor(case_sensitive=True)

    _MatchedObject = collections.namedtuple(  # pylint: disable=invalid-name
        "MatchedObject", ["matched_key_word", "matched_keywords_info"])
    # The extracted keyword is the keyword in the database; the matched word
    # is the word in the input question.

    print("Build Processor")
    for kw in tqdm(ner_set):
        if filter_word(kw) or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            matched_obj = _MatchedObject(matched_key_word=kw, matched_keywords_info={kw: 'kwm'})
            keyword_processor.add_keyword(kw, matched_obj)

    for kw in wiki_util.title_entities_set.disambiguation_group:
        if filter_word(kw) or filter_document_id(kw):
            continue  # skip keywords that are stopwords or filtered document ids
        else:
            if kw in keyword_processor:
                # If kw already exists in the processor, extend its info dict
                # with the additional disambiguation items.
                existing_matched_obj: _MatchedObject = keyword_processor.get_keyword(kw)
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if filter_document_id(disamb_kw):
                        continue
                    if disamb_kw not in existing_matched_obj.matched_keywords_info:
                        existing_matched_obj.matched_keywords_info[disamb_kw] = 'kwm_disamb'
            else:  # Otherwise add it to keyword_processor_disamb, which has lower priority.
                matched_obj = _MatchedObject(matched_key_word=kw, matched_keywords_info=dict())
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if filter_document_id(disamb_kw):
                        continue
                    matched_obj.matched_keywords_info[disamb_kw] = 'kwm_disamb'
                keyword_processor_disamb.add_keyword(kw, matched_obj)

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']

        query_terms = get_query_ngrams(question)
        valid_query_terms = [term for term in query_terms if term in g_score_dict]

        # 1. Raw keyword matching (with span info).
        finded_keys_kwm: List[Tuple[_MatchedObject, int, int]] = \
            keyword_processor.extract_keywords(question, span_info=True)
        finded_keys_kwm_disamb: List[Tuple[_MatchedObject, int, int]] = \
            keyword_processor_disamb.extract_keywords(question, span_info=True)
        finded_keys_list: List[Tuple[str, str, str, int, int]] = []
        retrieved_set = retrieval_utils.RetrievedSet()

        all_finded_span = []
        all_finded_span_2 = []
        for finded_matched_obj, start, end in finded_keys_kwm:
            all_finded_span.append((start, end))
            all_finded_span_2.append((start, end))

            matched_words = finded_matched_obj.matched_key_word
            for extracted_keyword, method in finded_matched_obj.matched_keywords_info.items():
                finded_keys_list.append((matched_words, extracted_keyword, method, start, end))

        for finded_matched_obj, start, end in finded_keys_kwm_disamb:
            not_valid = False
            for e_start, e_end in all_finded_span:
                if e_start <= start and e_end >= end:
                    not_valid = True  # Ignore disamb spans contained in a keyword-matched span.
                    break

            if not not_valid:
                matched_words = finded_matched_obj.matched_key_word
                for extracted_keyword, method in finded_matched_obj.matched_keywords_info.items():
                    finded_keys_list.append((matched_words, extracted_keyword, method, start, end))
                all_finded_span_2.append((start, end))

        all_raw_matched_word = set()

        # 1. Raw keyword matches first.
        for matched_word, title, method, start, end in finded_keys_list:
            # Added after debug_2:
            not_valid = False
            for e_start, e_end in all_finded_span_2:
                if (e_start < start and e_end >= end) or (e_start <= start and e_end > end):
                    not_valid = True  # Skip matches strictly contained in some other match.
                    break
            if not_valid:
                continue

            if method == 'kwm':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm'))
                score = get_query_doc_score(valid_query_terms, title,
                                            g_score_dict)  # Score the title against the query.
                retrieved_set.score_item(title, score, namespace=matched_word)
                all_raw_matched_word.add(matched_word)

        # 2. Then disambiguation matches.
        for matched_word, title, method, start, end in finded_keys_list:
            # Added after debug_2:
            not_valid = False
            for e_start, e_end in all_finded_span_2:
                if (e_start < start and e_end >= end) or (e_start <= start and e_end > end):
                    not_valid = True  # Skip matches strictly contained in some other match.
                    break
            if not_valid:
                continue

            if method == 'kwm_disamb':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm_disamb'))
                score = get_query_doc_score(valid_query_terms, title,
                                            g_score_dict)  # Score the title against the query.
                retrieved_set.score_item(title, score, namespace=matched_word)
                all_raw_matched_word.add(matched_word)

        for matched_word in all_raw_matched_word:
            retrieved_set.sort_and_filter(matched_word, top_k=match_filtering_k)

        # We don't worry about the hyperlinks so far.

        retrieved_list = retrieved_set.to_id_list()
        doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_withiout_hyperlinked_v6_file_debug_4_redo_0.json")

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    ext_hotpot_eval.eval(doc_pred_dict, dev_fullwiki_list)
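# v6's per-namespace filtering keeps, for each matched question word, only
# the match_filtering_k best-scoring titles (presumably what
# RetrievedSet.sort_and_filter does internally). A dependency-free sketch
# of that group-then-top-k pattern (names and data are illustrative):

```python
from collections import defaultdict


def top_k_per_group(scored, k):
    """scored: iterable of (group, title, score); keep the k best titles per group."""
    groups = defaultdict(list)
    for group, title, score in scored:
        groups[group].append((score, title))
    return {g: [t for _, t in sorted(items, reverse=True)[:k]]
            for g, items in groups.items()}
```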
def toy_init_results_v7_pre():
    # 2019-03-28
    # We first do raw keyword matching, then disambiguation, and
    # remove overlapping spans between the keyword and disambiguation matches.
    # match_filtering_k = 3
    ner_set = get_title_entity_set()

    term_retrieval_top_k = 5
    multihop_retrieval_top_k = None

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    terms_based_results = common.load_jsonl(config.RESULT_PATH / "doc_retri_results/term_based_methods_results/hotpot_tf_idf_dev.jsonl")
    terms_based_results_dict = dict()
    for item in terms_based_results:
        terms_based_results_dict[item['qid']] = item
        # print(item)

    # Load the tf-idf score table:
    g_score_dict = dict()
    load_from_file(g_score_dict,
                   config.PDATA_ROOT / "reverse_indexing/abs_rindexdb/scored_db/default-tf-idf.score.txt")

    # keyword_processor = KeywordProcessor(case_sensitive=True)
    # keyword_processor_disamb = KeywordProcessor(case_sensitive=True)

    # _MatchedObject = collections.namedtuple(  # pylint: disable=invalid-name
    #     "MatchedObject", ["matched_key_word", "matched_keywords_info"])
    # The extracted keyword is the keyword in the database; the matched word is the word in the input question.

    # print("Build Processor")
    # for kw in tqdm(ner_set):
    #     # if kw.lower() in STOPWORDS or filter_document_id(kw):
    #     if filter_word(kw) or filter_document_id(kw):
    #         continue  # if the keyword is filtered by the function above or is a stopword
    #     else:
    #         matched_obj = _MatchedObject(matched_key_word=kw, matched_keywords_info={kw: 'kwm'})
    #         keyword_processor.add_keyword(kw, matched_obj)

    # for kw in wiki_util.title_entities_set.disambiguation_group:
    #     # if kw.lower() in STOPWORDS or filter_document_id(kw):
    #     if filter_word(kw) or filter_document_id(kw):
    #         continue  # if the keyword is filtered by the function above or is a stopword
    #     else:
    #         if kw in keyword_processor:
    #             # if the kw already exists in the kw_processor, we update its dict to add more disamb items
    #             existing_matched_obj: _MatchedObject = keyword_processor.get_keyword(kw)
    #             for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
    #                 if filter_document_id(disamb_kw):
    #                     continue
    #                 if disamb_kw not in existing_matched_obj.matched_keywords_info:
    #                     existing_matched_obj.matched_keywords_info[disamb_kw] = 'kwm_disamb'
    #         else:  # If not, we add it to keyword_processor_disamb, which has lower priority
    #             # new_dict = dict()
    #             matched_obj = _MatchedObject(matched_key_word=kw, matched_keywords_info=dict())
    #             for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
    #                 if filter_document_id(disamb_kw):
    #                     continue
    #                 matched_obj.matched_keywords_info[disamb_kw] = 'kwm_disamb'
    #                 # new_dict[disamb_kw] = 'kwm_disamb'
    #             keyword_processor_disamb.add_keyword(kw, matched_obj)

    # doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}
    # Load some previously obtained results.
    doc_pred_dict = common.load_json(config.RESULT_PATH / "doc_retri_results/doc_retrieval_debug_v6/doc_raw_matching_with_disamb_withiout_hyperlinked_v6_file_debug_4.json")

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']
        retrieved_set = doc_pred_dict['raw_retrieval_set'][qid]
        # print(type(retrieved_set))

        query_terms = get_query_ngrams(question)
        valid_query_terms = [term for term in query_terms if term in g_score_dict]

        new_sent_from_tf_idf = []
        for score, title in sorted(
                terms_based_results_dict[qid]['doc_list'], key=lambda x: x[0], reverse=True)[:term_retrieval_top_k]:
            # doc_pred_dict['sp_doc'][qid].append(title)
            retrieved_set.add_item(RetrievedItem(title, 'tf-idf'))

        finded_keys_set = set(
            retrieved_set.to_id_list())  # for finding hyperlinked pages, we use both the keyword matches and the disambiguation group.

        # .3 We then add some hyperlinked titles
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        # hyperlinked_title = []
        # keyword_group = []
        for keyword_group in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword_group)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                # if potential_title in ner_set and potential_title.lower() not in STOPWORDS or not filter_document_id(
                if potential_title in ner_set and not filter_word(potential_title) or not filter_document_id(
                        potential_title):
                    # hyperlinked_title.append(potential_title)
                    # if not filter_document_id(potential_title):
                    score = get_query_doc_score(valid_query_terms, potential_title, g_score_dict)
                    retrieved_set.add_item(retrieval_utils.RetrievedItem(potential_title, 'kwm_disamb_hlinked'))
                    retrieved_set.score_item(potential_title, score, namespace=keyword_group + '-2-hop')
                    # g_score_dict)  # A function to compute a score between title and query

        for keyword_group in finded_keys_set:  # Group ordering and filtering
            retrieved_set.sort_and_filter(keyword_group + '-2-hop', top_k=multihop_retrieval_top_k)

        # for page_name in hyperlinked_title:
        #     retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))
        doc_pred_dict['sp_doc'][qid] = retrieved_set.to_id_list()

    # for item in tqdm(dev_fullwiki_list):
    #     question = item['question']
    #     qid = item['_id']
    #
    #     query_terms = get_query_ngrams(question)
    #     valid_query_terms = [term for term in query_terms if term in g_score_dict]
    #
    #     # 1. First retrieve raw keyword matches.
    #     finded_keys_kwm: List[Tuple[_MatchedObject, int, int]] = keyword_processor.extract_keywords(question, span_info=True)
    #     finded_keys_kwm_disamb: List[Tuple[_MatchedObject, int, int]] = keyword_processor_disamb.extract_keywords(question,
    #                                                                                                              span_info=True)
    #     finded_keys_list: List[Tuple[str, str, str, int, int]] = []
    #     retrieved_set = retrieval_utils.RetrievedSet()
    #
    #     all_finded_span = []
    #     all_finded_span_2 = []
    #
    #     for finded_matched_obj, start, end in finded_keys_kwm:
    #         for i in range(start, end):
    #             all_finded_span.append((start, end))
    #             all_finded_span_2.append((start, end))
    #
    #         # for matched_obj in finded_matched_obj.:
    #         matched_words = finded_matched_obj.matched_key_word
    #         for extracted_keyword, method in finded_matched_obj.matched_keywords_info.items():
    #             finded_keys_list.append((matched_words, extracted_keyword, method, start, end))
    #
    #     for finded_matched_obj, start, end in finded_keys_kwm_disamb:
    #         not_valid = False
    #         for e_start, e_end in all_finded_span:
    #             if e_start <= start and e_end >= end:
    #                 not_valid = True
    #                 break
    #
    #         if not not_valid:
    #             matched_words = finded_matched_obj.matched_key_word
    #             for extracted_keyword, method in finded_matched_obj.matched_keywords_info.items():
    #                 finded_keys_list.append((matched_words, extracted_keyword, method, start, end))
    #             all_finded_span_2.append((start, end))
    #
    #     all_raw_matched_word = set()
    #     # .1 We first add the raw keyword matches.
    #
    #     for matched_word, title, method, start, end in finded_keys_list:
    #         # added after debug_2
    #         not_valid = False
    #         for e_start, e_end in all_finded_span_2:
    #             if (e_start < start and e_end >= end) or (e_start <= start and e_end > end):
    #                 not_valid = True  # Skip this match because it is already contained in some other match.
    #                 break
    #
    #         if not_valid:
    #             continue
    #         # add finished
    #
    #         if method == 'kwm':
    #             retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm'))
    #             score = get_query_doc_score(valid_query_terms, title,
    #                                         g_score_dict)  # A function to compute a score between title and query
    #             retrieved_set.score_item(title, score, namespace=matched_word)
    #             all_raw_matched_word.add(matched_word)
    #
    #     # .2 Then, we add the disambiguation matches.
    #     for matched_word, title, method, start, end in finded_keys_list:
    #         # added after debug_2
    #         not_valid = False
    #         for e_start, e_end in all_finded_span_2:
    #             if (e_start < start and e_end >= end) or (e_start <= start and e_end > end):
    #                 not_valid = True  # Skip this match because it is already contained in some other match.
    #                 break
    #
    #         if not_valid:
    #             continue
    #         # add finished
    #
    #         if method == 'kwm_disamb':
    #             retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm_disamb'))
    #             score = get_query_doc_score(valid_query_terms, title,
    #                                         g_score_dict)  # A function to compute a score between title and query
    #             retrieved_set.score_item(title, score, namespace=matched_word)
    #             all_raw_matched_word.add(matched_word)
    #
    #     for matched_word in all_raw_matched_word:
    #         retrieved_set.sort_and_filter(matched_word, top_k=match_filtering_k)

    # We don't worry about the hyperlinks so far.
    # finded_keys_set = set(
    #     retrieved_set.to_id_list())  # for finding hyperlinked pages, we use both the keyword matches and the disambiguation group.
    # # .3 We then add some hyperlinked titles
    # db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
    # hyperlinked_title = []
    # for keyword in finded_keys_set:
    #     flatten_hyperlinks = []
    #     hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword)
    #     for hls in hyperlinks:
    #         flatten_hyperlinks.extend(hls)
    #
    #     for hl in flatten_hyperlinks:
    #         potential_title = hl.href
    #         if potential_title in ner_set and potential_title.lower() not in STOPWORDS or not filter_document_id(
    #                 potential_title):
    #             hyperlinked_title.append(potential_title)
    #
    #     for page_name in hyperlinked_title:
    #         retrieved_set.add_item(retrieval_utils.RetrievedItem(page_name, 'kwm_disamb_hlinked'))

    # Add-on: cut the retrieved documents down to the top two.
    # finded_keys_set = sorted(list(finded_keys_set), key=lambda x: len(x), reverse=True)
    # top_n = 2
    # finded_keys_set = finded_keys_set[:top_n]

    # retrieved_list = retrieved_set.to_id_list()
    #
    # doc_pred_dict['sp_doc'][qid] = list(retrieved_list)
    # doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_v7_file_debug_top_none.json")

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    ext_hotpot_eval.eval(doc_pred_dict, dev_fullwiki_list)
def toy_init_results_v7():
    # 2019-04-05
    # The complete v7 version of retrieval.
    # We first do raw keyword matching, then disambiguation, and
    # remove overlapping spans between the keyword and disambiguation matches.
    ner_set = get_title_entity_set()

    match_filtering_k = 3
    term_retrieval_top_k = 5
    multihop_retrieval_top_k = None

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    print(len(dev_fullwiki_list))

    # We load the term-based results
    terms_based_results = common.load_jsonl(config.RESULT_PATH / "doc_retri_results/term_based_methods_results/hotpot_tf_idf_dev.jsonl")
    terms_based_results_dict = dict()
    for item in terms_based_results:
        terms_based_results_dict[item['qid']] = item

    # Load the tf-idf score table:
    g_score_dict = dict()
    load_from_file(g_score_dict,
                   config.PDATA_ROOT / "reverse_indexing/abs_rindexdb/scored_db/default-tf-idf.score.txt")

    keyword_processor = KeywordProcessor(case_sensitive=True)
    keyword_processor_disamb = KeywordProcessor(case_sensitive=True)

    _MatchedObject = collections.namedtuple(  # pylint: disable=invalid-name
        "MatchedObject", ["matched_key_word", "matched_keywords_info"])
    # The extracted keyword is the keyword in the database; the matched word is the word in the input question.

    print("Build Processor")
    for kw in tqdm(ner_set):
        if filter_word(kw) or filter_document_id(kw):
            continue  # if the keyword is filtered by the function above or is a stopword
        else:
            # matched_key_word is the original matched span; we need to save it for group ordering.
            matched_obj = _MatchedObject(matched_key_word=kw, matched_keywords_info={kw: 'kwm'})
            keyword_processor.add_keyword(kw, matched_obj)

    for kw in wiki_util.title_entities_set.disambiguation_group:
        if filter_word(kw) or filter_document_id(kw):
            continue  # if the keyword is filtered by the function above or is a stopword
        else:
            if kw in keyword_processor:
                # if the kw already exists in the kw_processor, we update its dict to add more disamb items
                existing_matched_obj: _MatchedObject = keyword_processor.get_keyword(kw)
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if filter_document_id(disamb_kw):
                        continue
                    if disamb_kw not in existing_matched_obj.matched_keywords_info:
                        existing_matched_obj.matched_keywords_info[disamb_kw] = 'kwm_disamb'
            else:  # If not, we add it to keyword_processor_disamb, which has lower priority
                # new_dict = dict()
                matched_obj = _MatchedObject(matched_key_word=kw, matched_keywords_info=dict())
                for disamb_kw in wiki_util.title_entities_set.disambiguation_group[kw]:
                    if filter_document_id(disamb_kw):
                        continue
                    matched_obj.matched_keywords_info[disamb_kw] = 'kwm_disamb'
                    # new_dict[disamb_kw] = 'kwm_disamb'
                keyword_processor_disamb.add_keyword(kw, matched_obj)

    doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}
    doc_pred_dict_p1 = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for item in tqdm(dev_fullwiki_list):
        question = item['question']
        qid = item['_id']

        query_terms = get_query_ngrams(question)
        valid_query_terms = [term for term in query_terms if term in g_score_dict]

        # 1. First retrieve raw keyword matches.
        finded_keys_kwm: List[Tuple[_MatchedObject, int, int]] = keyword_processor.extract_keywords(question, span_info=True)
        finded_keys_kwm_disamb: List[Tuple[_MatchedObject, int, int]] = keyword_processor_disamb.extract_keywords(question,
                                                                                                                  span_info=True)
        finded_keys_list: List[Tuple[str, str, str, int, int]] = []
        retrieved_set = retrieval_utils.RetrievedSet()

        all_finded_span = []
        all_finded_span_2 = []

        for finded_matched_obj, start, end in finded_keys_kwm:
            for i in range(start, end):
                all_finded_span.append((start, end))
                all_finded_span_2.append((start, end))

            # for matched_obj in finded_matched_obj.:
            matched_words = finded_matched_obj.matched_key_word
            for extracted_keyword, method in finded_matched_obj.matched_keywords_info.items():
                finded_keys_list.append((matched_words, extracted_keyword, method, start, end))

        for finded_matched_obj, start, end in finded_keys_kwm_disamb:
            not_valid = False
            for e_start, e_end in all_finded_span:
                if e_start <= start and e_end >= end:
                    not_valid = True
                    break

            if not not_valid:
                matched_words = finded_matched_obj.matched_key_word
                for extracted_keyword, method in finded_matched_obj.matched_keywords_info.items():
                    finded_keys_list.append((matched_words, extracted_keyword, method, start, end))
                all_finded_span_2.append((start, end))

        all_raw_matched_word = set()

        # .1 We first add the raw keyword matches.
        for matched_word, title, method, start, end in finded_keys_list:
            # added after debug_2
            not_valid = False
            for e_start, e_end in all_finded_span_2:
                if (e_start < start and e_end >= end) or (e_start <= start and e_end > end):
                    not_valid = True  # Skip this match because it is already contained in some other match.
                    break

            if not_valid:
                continue
            # add finished

            if method == 'kwm':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm'))
                score = get_query_doc_score(valid_query_terms, title,
                                            g_score_dict)  # A function to compute a score between title and query
                retrieved_set.score_item(title, score, namespace=matched_word)
                all_raw_matched_word.add(matched_word)

        # .2 Then, we add the disambiguation matches.
        for matched_word, title, method, start, end in finded_keys_list:
            # added after debug_2
            not_valid = False
            for e_start, e_end in all_finded_span_2:
                if (e_start < start and e_end >= end) or (e_start <= start and e_end > end):
                    not_valid = True  # Skip this match because it is already contained in some other match.
                    break

            if not_valid:
                continue
            # add finished

            if method == 'kwm_disamb':
                retrieved_set.add_item(retrieval_utils.RetrievedItem(title, 'kwm_disamb'))
                score = get_query_doc_score(valid_query_terms, title,
                                            g_score_dict)  # A function to compute a score between title and query
                retrieved_set.score_item(title, score, namespace=matched_word)
                all_raw_matched_word.add(matched_word)

        for matched_word in all_raw_matched_word:
            retrieved_set.sort_and_filter(matched_word, top_k=match_filtering_k)

        doc_pred_dict_p1['sp_doc'][qid] = retrieved_set.to_id_list()
        doc_pred_dict_p1['raw_retrieval_set'][qid] = retrieved_set

        # Then we add the term-based matching results
        added_count = 0
        for score, title in sorted(
                terms_based_results_dict[qid]['doc_list'], key=lambda x: x[0], reverse=True)[:term_retrieval_top_k + 3]:
            if not filter_word(title) and not filter_document_id(title):
                retrieved_set.add_item(RetrievedItem(title, 'tf-idf'))
                added_count += 1
            if term_retrieval_top_k is not None and added_count >= term_retrieval_top_k:
                break

        # Add hyperlinked pages:
        finded_keys_set = set(
            retrieved_set.to_id_list())  # for finding hyperlinked pages, we use both the keyword matches and the disambiguation group.

        # .3 We then add some hyperlinked titles
        db_cursor = wiki_db_tool.get_cursor(config.WHOLE_WIKI_DB)
        for keyword_group in finded_keys_set:
            flatten_hyperlinks = []
            hyperlinks = wiki_db_tool.get_first_paragraph_hyperlinks(db_cursor, keyword_group)
            for hls in hyperlinks:
                flatten_hyperlinks.extend(hls)

            for hl in flatten_hyperlinks:
                potential_title = hl.href
                if potential_title in ner_set and not filter_word(potential_title) or not filter_document_id(
                        potential_title):
                    # hyperlinked_title.append(potential_title)
                    # if not filter_document_id(potential_title):
                    score = get_query_doc_score(valid_query_terms, potential_title, g_score_dict)
                    retrieved_set.add_item(retrieval_utils.RetrievedItem(potential_title, 'kwm_disamb_hlinked'))
                    retrieved_set.score_item(potential_title, score, namespace=keyword_group + '-2-hop')

        for keyword_group in finded_keys_set:
            retrieved_set.sort_and_filter(keyword_group + '-2-hop', top_k=multihop_retrieval_top_k)

        doc_pred_dict['sp_doc'][qid] = retrieved_set.to_id_list()
        doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    common.save_json(doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_v7_file_pipeline_top_none_redo_0.json")
    common.save_json(doc_pred_dict_p1, "doc_raw_matching_with_disamb_with_hyperlinked_v7_file_pipeline_top_none_debug_p1.json")

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    ext_hotpot_eval.eval(doc_pred_dict, dev_fullwiki_list)
    ext_hotpot_eval.eval(doc_pred_dict_p1, dev_fullwiki_list)
def toy_init_pos_results_v7():
    hyperlinked_top_k = None
    pred_dict = common.load_json(config.PRO_ROOT / "src/doc_retri/doc_raw_matching_with_disamb_with_hyperlinked_v7_file_debug_top_none.json")
    new_doc_pred_dict = {'sp_doc': dict(), 'raw_retrieval_set': dict()}

    for key in pred_dict['raw_retrieval_set'].keys():
        qid = key
        retrieved_set: RetrievedSet = pred_dict['raw_retrieval_set'][key]
        hyperlinked_keyword_group = set()

        for item in retrieved_set.retrieved_dict.values():
            for keyword_group_name in item.scores_dict.keys():
                if keyword_group_name.endswith('-2-hop'):
                    # The currently scored item comes from 2-hop retrieval
                    hyperlinked_keyword_group.add(keyword_group_name)

        for keyword_group in hyperlinked_keyword_group:  # The group name already ends with '-2-hop'
            # retrieved_set.sort_and_filter(keyword_group + '-2-hop', top_k=hyperlinked_top_k)
            retrieved_set.sort_and_filter(keyword_group, top_k=hyperlinked_top_k)

        new_doc_pred_dict['sp_doc'][qid] = retrieved_set.to_id_list()
        new_doc_pred_dict['raw_retrieval_set'][qid] = retrieved_set

    # common.save_json(new_doc_pred_dict, "doc_raw_matching_with_disamb_with_hyperlinked_v7_file_pipeline_top_none.json")

    dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    ext_hotpot_eval.eval(new_doc_pred_dict, dev_fullwiki_list)
if __name__ == '__main__':
    # toy_init_results_v1()
    # toy_init_results_v2()
    # toy_init_results_v3()
    # toy_init_results_v4()
    # toy_init_results_v5()
    # toy_init_results_v6()
    # toy_init_results_v7_pre()
    toy_init_results_v7()
    # toy_init_pos_results_v7()

    # get_title_entity_set()
    # print(wiki_util.title_entities_set.disambiguation_group)

    # pred_dev = common.load_json(config.RESULT_PATH / "doc_retri_results/toy_doc_rm_stopword_pred_file.json")
    # pred_dev = common.load_json(config.RESULT_PATH / "doc_retri_results/doc_raw_matching_with_disamb_file.json")
    # pred_dev = common.load_json(
    #     config.RESULT_PATH / "doc_retri_results/doc_raw_matching_with_disamb_with_hyperlinked_v2_file.json")
    # pred_dev = common.load_json(
    #     config.RESULT_PATH / "doc_retri_results/doc_raw_matching_with_disamb_with_hyperlinked_v3_file.json")
    # pred_dev = common.load_json(
    #     config.RESULT_PATH / "doc_retri_results/doc_raw_matching_with_disamb_with_hyperlinked_v4_file.json")
    # pred_dev = common.load_json(
    #     config.RESULT_PATH / "doc_retri_results/doc_raw_matching_with_disamb_with_hyperlinked_v5_file.json")
    # pred_dev = common.load_json(
    #     "/Users/yixin/projects/extinguishHotpot/src/doc_retri/doc_raw_matching_with_disamb_with_hyperlinked_v5_file.json")
    # pred_dev = common.load_json(
    #     "/Users/yixin/projects/extinguishHotpot/src/doc_retri/doc_raw_matching_with_disamb_withiout_hyperlinked_v6_file_debug_4.json")
    # pred_dev = common.load_json(config.RESULT_PATH / "doc_retri_results/doc_raw_matching_with_disamb_with_hyperlinked_file.json")
    # pred_dev = common.load_json(config.RESULT_PATH / "doc_retri_results/doc_raw_matching_file.json")
    # pred_dev = common.load_json(config.RESULT_PATH / "doc_retri_results/toy_doc_rm_stopword_top2_pred_file.json")
    # print(len(pred_dev))
    # dev_fullwiki_list = common.load_json(config.DEV_FULLWIKI_FILE)
    # ext_hotpot_eval.eval(pred_dev, dev_fullwiki_list)

    # Reference numbers from earlier runs:
    # 'doc_em': 0.04577987846049966, 'doc_f1': 0.35197722376656215, 'doc_prec': 0.33270779074627865, 'doc_recall': 0.412018906144497
    # 'doc_em': 0.1489534098582039, 'doc_f1': 0.4042764580306798, 'doc_prec': 0.43380716590034823, 'doc_recall': 0.412018906144497
    # 'doc_em': 0.15192437542201215, 'doc_f1': 0.4141590789733938, 'doc_prec': 0.4542443436974177, 'doc_recall': 0.41188386225523294
    # 'doc_em': 0.18230925050641458, 'doc_f1': 0.4293720459149168, 'doc_prec': 0.4835246455097907, 'doc_recall': 0.4022957461174882

    # upperbound 2-hop 'doc_recall': 0.8951384199864956

    # Pipeline expected results.
    # 'doc_f1': 0.07500399833405762, 'doc_prec': 0.039512565895725084, 'doc_recall': 0.8951384199864956

    # V7 cut
    # {'em': 0.0, 'f1': 0.0, 'prec': 0.0, 'recall': 0.0, 'doc_em': 0.12032410533423363, 'doc_f1': 0.40915052664529045, 'doc_prec': 0.42250535090360103, 'doc_recall': 0.4812288993923025
    # 'doc_em': 0.12032410533423363, 'doc_f1': 0.40921579785843476, 'doc_prec': 0.42256161919079444, 'doc_recall': 0.4812964213369345
    # 'doc_f1': 0.07481397826629528, 'doc_prec': 0.039406719378171556, 'doc_recall': 0.8950033760972316

    # V6 reran
    # 'doc_em': 0.12032410533423363, 'doc_f1': 0.40908020833729497, 'doc_prec': 0.4224800569687865, 'doc_recall': 0.48095881161377446
# 'doc_em': 0.12032410533423363, 'doc_f1': 0.4092293022473612, 'doc_prec': 0.4225616191907944, 'doc_recall': 0.4813639432815665 | 48.29246 | 180 | 0.653246 | 8,228 | 63,408 | 4.694458 | 0.059431 | 0.039611 | 0.025579 | 0.014265 | 0.894061 | 0.884611 | 0.87379 | 0.859318 | 0.85357 | 0.850438 | 0 | 0.020377 | 0.263973 | 63,408 | 1,313 | 181 | 48.29246 | 0.807264 | 0.328917 | 0 | 0.814763 | 0 | 0.001393 | 0.076702 | 0.032082 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020891 | false | 0 | 0.02507 | 0 | 0.062674 | 0.023677 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
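The v6/v7 functions in this record repeatedly discard a keyword match whose character span is strictly contained in another match's span (the `not_valid` checks). A minimal, self-contained sketch of that containment filter — `filter_contained_spans` and the sample matches are hypothetical names for illustration, not part of the original project:

```python
# Sketch of the span-containment filtering used in the retrieval code above:
# a match (start, end) is dropped when some other match's span strictly
# contains it, mirroring the `not_valid` checks in the original loops.

def filter_contained_spans(matches):
    """Keep matches whose span is not strictly contained in another span."""
    kept = []
    for word, start, end in matches:
        contained = any(
            (s < start and e >= end) or (s <= start and e > end)
            for w, s, e in matches
            if w != word
        )
        if not contained:
            kept.append((word, start, end))
    return kept


matches = [("New York City", 0, 13), ("New York", 0, 8), ("York", 4, 8)]
print(filter_contained_spans(matches))  # [('New York City', 0, 13)]
```

Both shorter matches are dropped because the longest match strictly contains their spans, which is why the original code prefers the widest keyword hit per region.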
862a69a6fb280cddf698d4a9882cfaabd8570ebd | 10,237 | py | Python | app/data_service/test_data_service.py | sergiofenoll/project-databases | a3a3c77691c7b0af9cd202e509cdc4a319acd714 | [
"MIT"
] | 2 | 2018-02-28T14:23:10.000Z | 2018-03-16T16:24:57.000Z | app/data_service/test_data_service.py | sergiofenoll/project-databases | a3a3c77691c7b0af9cd202e509cdc4a319acd714 | [
"MIT"
] | 73 | 2018-02-21T12:51:05.000Z | 2018-05-27T10:42:32.000Z | app/data_service/test_data_service.py | sergiofenoll/project-databases | a3a3c77691c7b0af9cd202e509cdc4a319acd714 | [
"MIT"
] | 3 | 2018-05-22T14:23:52.000Z | 2018-10-25T10:44:45.000Z | import unittest
from app import user_data_access, data_loader, database as db
from app.user_service.models import User
from app.data_service.models import Dataset, Column, Table, _cv, _ci
username = "test_username"
password = "test_pass"
firstname = "test_fname"
lastname = "test_lname"
email = "test_email@test.com"
status = "user"
active = True
class TestDataService(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        user_obj = User(username=username, password=password, firstname=firstname, lastname=lastname, email=email,
                        status=status, active=active)
        # Add user to db using UserDataAccess class
        user_data_access.add_user(user_obj)

    @classmethod
    def tearDownClass(cls):
        user_data_access.delete_user(data_loader, username)

    def test_create_dataset(self):
        name = 'test_dataset'
        owner_id = username
        schema_id = 0
        dataset = Dataset(schema_id, name, 'Default description', username)
        try:
            data_loader.create_dataset(name, owner_id)
            self.assertEqual(dataset, data_loader.get_dataset(schema_id, owner_id))
        finally:
            data_loader.delete_dataset(schema_id)

    def test_delete_dataset(self):
        name = 'test_dataset'
        owner_id = username
        schema_id = 0
        try:
            data_loader.create_dataset(name, owner_id)
        finally:
            data_loader.delete_dataset(schema_id)

    def test_get_dataset_id(self):
        name = 'test_dataset'
        owner_id = username
        schema_id = 0
        try:
            data_loader.create_dataset(name, owner_id)
            gotten_id = int(data_loader.get_dataset_id(name)[0].split('-')[1])
            self.assertEqual(0, gotten_id)
        finally:
            data_loader.delete_dataset(schema_id)

    def test_create_table(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        columns = ['test-column']
        schema_id = 0
        table = Table(table_name, '', columns)
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [])
            self.assertEqual(table, data_loader.get_table(schema_id, table_name))
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_delete_table(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [])
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_get_table(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        columns = ['test-column']
        schema_id = 0
        table = Table(table_name, '', columns)
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [])
            self.assertEqual(table, data_loader.get_table(schema_id, table_name))
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_table_exists(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [])
            data_loader.table_exists(table_name, schema_id)
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_create_row(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        columns = ['test-column']
        values = {'test-column': 'test'}
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, columns)
            data_loader.insert_row(table_name, schema_id, columns, values)
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_delete_row(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        columns = ['test-column']
        values = {'test-column': 'test'}
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, columns)
            data_loader.insert_row(table_name, schema_id, columns, values)
        finally:
            data_loader.delete_row(schema_id, table_name, [1])
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_create_column(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        new_column = 'test-column-2'
        new_column_type = 'VARCHAR(255)'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [])
            data_loader.insert_column(schema_id, table_name, new_column, new_column_type)
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_delete_column(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        new_column = 'test-column'
        new_column_type = 'VARCHAR(255)'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [])
            data_loader.insert_column(schema_id, table_name, new_column, new_column_type)
        finally:
            data_loader.delete_column(schema_id, table_name, new_column)
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_rename_column(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        column_name = 'test-column'
        new_column_name = 'new-test-column'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [column_name])
            data_loader.rename_column(schema_id, table_name, column_name, new_column_name)
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_update_column_type(self):
        schema_name = 'test-schema'
        table_name = 'test-table'
        column_name = 'test-column'
        column_type = 'INT'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.create_table(table_name, schema_id, [column_name])
            data_loader.update_column_type(schema_id, table_name, column_name, column_type)
        finally:
            data_loader.delete_table(table_name, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_grant_access(self):
        contrib_username = "contrib_test_username"
        contrib_password = "contrib_test_pass"
        contrib_firstname = "contrib_test_fname"
        contrib_lastname = "contrib_test_lname"
        contrib_email = "contrib_test_email@test.com"
        contrib_status = "user"
        contrib_active = True
        user_data_access.add_user(
            User(contrib_username, contrib_password, contrib_firstname, contrib_lastname, contrib_email, contrib_status,
                 contrib_active))
        schema_name = 'test-schema'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.grant_access(contrib_username, schema_id)
            self.assertTrue(data_loader.has_access(contrib_username, schema_id))
        finally:
            user_data_access.delete_user(data_loader, contrib_username)
            data_loader.delete_dataset(schema_id)

    def test_remove_access(self):
        contrib_username = "contrib_test_username"
        contrib_password = "contrib_test_pass"
        contrib_firstname = "contrib_test_fname"
        contrib_lastname = "contrib_test_lname"
        contrib_email = "contrib_test_email@test.com"
        contrib_status = "user"
        contrib_active = True
        user_data_access.add_user(
            User(contrib_username, contrib_password, contrib_firstname, contrib_lastname, contrib_email, contrib_status,
                 contrib_active))
        schema_name = 'test-schema'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.grant_access(contrib_username, schema_id)
            data_loader.remove_access(contrib_username, schema_id)
        finally:
            user_data_access.delete_user(data_loader, contrib_username)
            data_loader.remove_access(contrib_username, schema_id)
            data_loader.delete_dataset(schema_id)

    def test_has_access(self):
        contrib_username = "contrib_test_username"
        contrib_password = "contrib_test_pass"
        contrib_firstname = "contrib_test_fname"
        contrib_lastname = "contrib_test_lname"
        contrib_email = "contrib_test_email@test.com"
        contrib_status = "user"
        contrib_active = True
        user_data_access.add_user(
            User(contrib_username, contrib_password, contrib_firstname, contrib_lastname, contrib_email, contrib_status,
                 contrib_active))
        schema_name = 'test-schema'
        schema_id = 0
        try:
            data_loader.create_dataset(schema_name, username)
            data_loader.grant_access(contrib_username, schema_id)
            data_loader.remove_access(contrib_username, schema_id)
        finally:
            user_data_access.delete_user(data_loader, contrib_username)
            data_loader.delete_dataset(schema_id)


if __name__ == '__main__':
    unittest.main()
| 37.361314 | 120 | 0.655172 | 1,221 | 10,237 | 5.091728 | 0.066339 | 0.123854 | 0.07206 | 0.062892 | 0.856844 | 0.854271 | 0.841563 | 0.830947 | 0.819527 | 0.816632 | 0 | 0.003584 | 0.264042 | 10,237 | 273 | 121 | 37.498169 | 0.821609 | 0.004005 | 0 | 0.789916 | 0 | 0 | 0.083284 | 0.014126 | 0 | 0 | 0 | 0 | 0.021008 | 1 | 0.07563 | false | 0.033613 | 0.016807 | 0 | 0.096639 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
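Every test in the file above follows the same shape: create the fixture inside `try`, assert, then tear everything down in `finally` so a failed assertion cannot leak state into the next test. A condensed, self-contained illustration of that pattern with a stand-in resource store (the names here are illustrative, not from the project):

```python
import unittest


class CleanupPatternExample(unittest.TestCase):
    """Condensed version of the create / assert / always-clean-up pattern."""

    def test_resource_lifecycle(self):
        resources = []                      # stand-in for the dataset store
        resources.append("test-schema")     # create, like create_dataset(...)
        try:
            # Exercise and assert; if this fails, cleanup below still runs.
            self.assertIn("test-schema", resources)
        finally:
            resources.remove("test-schema")  # like delete_dataset(...)
        self.assertEqual(resources, [])
```

A design note: `unittest` also offers `addCleanup`, which registers teardown callbacks that run even when setup partially fails, and can be less error-prone than nested `try`/`finally` blocks when a test creates several resources.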
864131c7efae9cf5c26b99d224f5e6207f59d794 | 118 | py | Python | 1_languages/python/src/assignment.py | praisetompane/3_programming | dd3e2e89a36a613d895fdbdd9c03845cb648fddf | [
"MIT"
] | null | null | null | 1_languages/python/src/assignment.py | praisetompane/3_programming | dd3e2e89a36a613d895fdbdd9c03845cb648fddf | [
"MIT"
] | null | null | null | 1_languages/python/src/assignment.py | praisetompane/3_programming | dd3e2e89a36a613d895fdbdd9c03845cb648fddf | [
"MIT"
] | null | null | null | a = 0
print(a)
a += 10
print(a)
a -= 2
print(a)
a *= 3
print(a)
a /= 2
print(a)
a //=2
print(a)
a **=2
print(a)
# Source: samples/src/main/resources/datasets/python/123.py (sritchie/kotlingrad, Apache-2.0 license)
def byteString5():
return b"Test"
# Source: pyconcz/proposals/migrations/0009_auto_20190312_1818.py (martinpucala/cz.pycon.org-2019, MIT license)
# -*- coding: utf-8 -*-
# Generated by Django 1.11.17 on 2019-03-12 17:18
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('proposals', '0008_auto_20190225_2101'),
]
operations = [
migrations.AlterField(
model_name='talk',
name='photo',
field=models.ImageField(help_text='If you don’t have a photo according to specs below, we will ask you for one if your talk is selected.\nIdeal photo is:\n– as large as possible (please no 128 × 128 px, there is no upper limit, even 1000 × 1000 px isn’t too much),\n– as uncompressed as possible (JPEGs are ok),\n– doesn’t show other people\n– is a head shot (not you in front of a pyramid)\n– is not black and white and has no “creative filters” applied.\nWe might crop it and change contrast, brightness etc. to fit PyCon CZ visual style.', upload_to='proposals/pyconcz2019/talks/', verbose_name='Your photo (not an\xa0illustration nor\xa0avatar)'),
),
migrations.AlterField(
model_name='workshop',
name='photo',
field=models.ImageField(help_text='If you don’t have a photo according to specs below, we will ask you for one if your workshop is selected.\nIdeal photo is:\n– as large as possible (please no 128 × 128 px, there is no upper limit, even 1000 × 1000 px isn’t too much),\n– as uncompressed as possible (JPEGs are ok),\n– doesn’t show other people\n– is a head shot (not you in front of a pyramid)\n– is not black and white and has no “creative filters” applied.\nWe might crop it and change contrast, brightness etc. to fit PyCon CZ visual style.', upload_to='proposals/pyconcz2019/talks/', verbose_name='Your photo (not an\xa0illustration nor\xa0avatar)'),
),
]
# Source: examples/system/efuse/example_test.py (iPlon-org/esp-idf, Apache-2.0 license)
from __future__ import unicode_literals
import os
import re
import ttfw_idf
def erase_field_on_emul_efuse(dut, pos_of_bits): # type: (ttfw_idf.TinyFW.Env, list) -> None
emul_efuse_bin_path = os.path.join(dut.app.binary_path, 'emul_efuse.bin')
dut.dump_flash(emul_efuse_bin_path, partition='emul_efuse')
def erase_bit(pos_of_bit): # type: (int) -> None
nbytes, nbits = divmod(pos_of_bit, 8)
with open(emul_efuse_bin_path, 'r+b') as f:
f.seek(nbytes)
data = ord(f.read(1))
data &= ~(1 << nbits)
f.seek(-1, os.SEEK_CUR)
f.write(bytes([data]))
for pos_of_bit in sorted(pos_of_bits):
erase_bit(pos_of_bit)
offs = dut.app.partition_table['emul_efuse']['offset']
flash_files = [(offs, emul_efuse_bin_path)]
dut.write_flash(flash_files)
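The bit-erase logic inside `erase_field_on_emul_efuse` can be exercised without hardware; a minimal sketch on an in-memory buffer (the helper name `clear_bits` is hypothetical) using the same `divmod` arithmetic as `erase_bit` above:

```python
def clear_bits(buf: bytearray, positions) -> None:
    """Clear the given absolute bit positions in buf (LSB-first within
    each byte), mirroring the divmod arithmetic used by erase_bit."""
    for pos in sorted(positions):
        nbytes, nbits = divmod(pos, 8)
        buf[nbytes] &= 0xFF & ~(1 << nbits)

buf = bytearray(b'\xff' * 4)
clear_bits(buf, [10])          # bit 10 lives in byte 1, bit 2
assert buf[1] == 0xFB          # 0b11111011
assert buf[0] == 0xFF          # other bytes untouched
```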
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32', 'esp32c3'])
def test_examples_efuse(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse')
dut.start_app()
dut.expect_all(re.compile(r'example: Coding Scheme (3/4)|(NONE)|(REPEAT)|(RS \(Reed-Solomon coding\))'),
'example: read efuse fields',
re.compile(r'example: 1. read MAC address: {}'.format(r':'.join((r'[0-9a-f]{2}',) * 6))),
'example: 2. read secure_version: 0',
'example: 3. read custom fields',
'example: module_version = 0',
'example: device_role = None',
'example: setting_1 = 0',
'example: setting_2 = 0',
'example: custom_secure_version = 0',
'example: This example does not burn any efuse in reality only virtually',
'example: Write operations in efuse fields are performed virtually',
'example: write custom efuse fields',
'efuse: Virtual efuses enabled: Not really burning eFuses',
'example: module_version = 1',
'example: device_role = Slave',
'example: setting_1 = 3',
'example: setting_2 = 4',
'example: custom_secure_version = 5',
'example: Done',
timeout=30)
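The MAC-address pattern built inline at the `read MAC address` expectation can be verified in isolation; this sketch reproduces the same `':'.join(...)` construction and checks it against a sample address:

```python
import re

# Same construction as in test_examples_efuse: six hex pairs joined by colons.
mac_re = re.compile(r':'.join((r'[0-9a-f]{2}',) * 6))
assert mac_re.fullmatch('ab:cd:ef:01:23:45') is not None
assert mac_re.fullmatch('ab:cd:ef:01:23') is None   # only five groups
```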
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32', 'esp32s2', 'esp32c3'])
def test_examples_efuse_with_virt_flash_enc(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_flash_enc')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
print(' - Erase flash')
dut.erase_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app_no_enc()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Checking flash encryption...')
dut.expect('Generating new flash encryption key...')
if dut.TARGET == 'esp32':
dut.expect('Writing EFUSE_BLK_KEY0 with purpose 2')
dut.expect('Setting CRYPT_CONFIG efuse to 0xF')
dut.expect('Not disabling UART bootloader encryption')
dut.expect('Disable UART bootloader decryption...')
dut.expect('Disable UART bootloader MMU cache...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
else:
dut.expect('Writing EFUSE_BLK_KEY0 with purpose 4')
dut.expect('Not disabling UART bootloader encryption')
dut.expect('Disable UART bootloader cache...')
dut.expect('Disable JTAG...')
dut.expect('bootloader encrypted successfully')
dut.expect('partition table encrypted and loaded successfully')
dut.expect('Flash encryption completed', timeout=90)
dut.expect('Resetting with flash encryption enabled...')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Checking flash encryption...')
if dut.TARGET == 'esp32':
dut.expect('flash encryption is enabled (3 plaintext flashes left)')
else:
dut.expect('flash encryption is enabled (1 plaintext flashes left)')
dut.expect('Flash encryption mode is DEVELOPMENT (not secure)')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32s2'])
def test_examples_efuse_with_virt_flash_enc_aes_256(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
    # Only ESP32-S2 supports an AES-256 flash encryption key
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_flash_enc_aes_256')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
print(' - Erase flash')
dut.erase_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app_no_enc()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Checking flash encryption...')
dut.expect('Generating new flash encryption key...')
dut.expect('Writing EFUSE_BLK_KEY0 with purpose 2')
dut.expect('Writing EFUSE_BLK_KEY1 with purpose 3')
dut.expect('Not disabling UART bootloader encryption')
dut.expect('Disable UART bootloader cache...')
dut.expect('Disable JTAG...')
dut.expect('bootloader encrypted successfully')
dut.expect('partition table encrypted and loaded successfully')
dut.expect('Flash encryption completed', timeout=90)
dut.expect('Resetting with flash encryption enabled...')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Checking flash encryption...')
dut.expect('flash encryption is enabled (1 plaintext flashes left)')
dut.expect('Flash encryption mode is DEVELOPMENT (not secure)')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32', 'esp32s2', 'esp32c3'])
def test_examples_efuse_with_virt_flash_enc_pre_loaded(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_flash_enc')
print(' - Erase flash')
dut.erase_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app_no_enc()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Flash encryption completed', timeout=90)
dut.expect('Resetting with flash encryption enabled...')
dut.expect('Flash encryption mode is DEVELOPMENT (not secure)')
dut.expect('Start eFuse example')
dut.expect('example: Done')
if dut.TARGET == 'esp32':
print(' - Flash emul_efuse with pre-loaded efuses (FLASH_CRYPT_CNT 1 -> 0)')
# offset of this eFuse is taken from components/efuse/esp32/esp_efuse_table.csv
FLASH_CRYPT_CNT = 20
# Resets eFuse, which enables Flash encryption feature
erase_field_on_emul_efuse(dut, [FLASH_CRYPT_CNT])
else:
# offset of this eFuse is taken from components/efuse/{target}/esp_efuse_table.csv
print(' - Flash emul_efuse with pre-loaded efuses (SPI_BOOT_CRYPT_CNT 1 -> 0)')
SPI_BOOT_CRYPT_CNT = 82
# Resets eFuse, which enables Flash encryption feature
erase_field_on_emul_efuse(dut, [SPI_BOOT_CRYPT_CNT])
print(' - Start app (flash partition_table and app)')
dut.start_app_no_enc()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Checking flash encryption...')
dut.expect('Using pre-loaded flash encryption key in efuse')
if dut.TARGET == 'esp32':
dut.expect('Setting CRYPT_CONFIG efuse to 0xF')
dut.expect('Not disabling UART bootloader encryption')
dut.expect('Disable UART bootloader decryption...')
dut.expect('Disable UART bootloader MMU cache...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
else:
dut.expect('Not disabling UART bootloader encryption')
dut.expect('Disable UART bootloader cache...')
dut.expect('Disable JTAG...')
dut.expect('bootloader encrypted successfully')
dut.expect('partition table encrypted and loaded successfully')
dut.expect('Flash encryption completed', timeout=90)
dut.expect('Resetting with flash encryption enabled...')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Checking flash encryption...')
if dut.TARGET == 'esp32':
dut.expect('flash encryption is enabled (3 plaintext flashes left)')
else:
dut.expect('flash encryption is enabled (1 plaintext flashes left)')
dut.expect('Flash encryption mode is DEVELOPMENT (not secure)')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32', 'esp32s2', 'esp32c3'])
def test_examples_efuse_with_virt_flash_enc_release(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_flash_enc_release')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
print(' - Erase flash')
dut.erase_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app_no_enc()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Checking flash encryption...')
dut.expect('Generating new flash encryption key...')
if dut.TARGET == 'esp32':
dut.expect('Writing EFUSE_BLK_KEY0 with purpose 2')
dut.expect('Setting CRYPT_CONFIG efuse to 0xF')
dut.expect('Disable UART bootloader encryption...')
dut.expect('Disable UART bootloader decryption...')
dut.expect('Disable UART bootloader MMU cache...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
else:
dut.expect('Writing EFUSE_BLK_KEY0 with purpose 4')
dut.expect('Disable UART bootloader encryption')
dut.expect('Disable UART bootloader cache...')
dut.expect('Disable JTAG...')
dut.expect('bootloader encrypted successfully')
dut.expect('partition table encrypted and loaded successfully')
dut.expect('Setting CRYPT_CNT for permanent encryption', timeout=90)
dut.expect('Flash encryption completed')
dut.expect('Resetting with flash encryption enabled...')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Checking flash encryption...')
dut.expect('flash encryption is enabled (0 plaintext flashes left)')
dut.expect('Flash encryption mode is RELEASE')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32'])
def test_examples_efuse_with_virt_secure_boot_v1(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
# only for ESP32
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_secure_boot_v1')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
print(' - Erase flash')
dut.erase_flash()
print(' - Flash bootloader')
dut.bootloader_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v1: Generating new secure boot key...')
dut.expect('secure_boot_v1: Generating secure boot digest...')
dut.expect('secure_boot_v1: Digest generation complete')
dut.expect('Checking secure boot...')
dut.expect('secure_boot_v1: blowing secure boot efuse...')
dut.expect('Read & write protecting new key...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
dut.expect('secure_boot_v1: secure boot is now enabled for bootloader image')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
dut.reset()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v1: bootloader secure boot is already enabled. No need to generate digest. continuing..')
dut.expect('boot: Checking secure boot...')
dut.expect('secure_boot_v1: bootloader secure boot is already enabled, continuing..')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32'])
def test_examples_efuse_with_virt_secure_boot_v1_pre_loaded(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
# only for ESP32
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_secure_boot_v1')
print(' - Erase flash')
dut.erase_flash()
dut.bootloader_flash()
dut.start_app()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
print(' - Flash emul_efuse with pre-loaded efuses (ABS_DONE_0 1 -> 0)')
# offset of this eFuse is taken from components/efuse/esp32/esp_efuse_table.csv
ABS_DONE_0 = 196
# Resets eFuse, which enables Secure boot (V1) feature
erase_field_on_emul_efuse(dut, [ABS_DONE_0])
print(' - Start app (flash partition_table and app)')
dut.start_app()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v1: Using pre-loaded secure boot key in EFUSE block 2')
dut.expect('secure_boot_v1: Generating secure boot digest...')
dut.expect('secure_boot_v1: Digest generation complete')
dut.expect('Checking secure boot...')
dut.expect('secure_boot_v1: blowing secure boot efuse...')
dut.expect('Read & write protecting new key...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
dut.expect('secure_boot_v1: secure boot is now enabled for bootloader image')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
dut.reset()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v1: bootloader secure boot is already enabled. No need to generate digest. continuing..')
dut.expect('Checking secure boot...')
dut.expect('secure_boot_v1: bootloader secure boot is already enabled, continuing..')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_EthKitV12', target=['esp32'])
def test_examples_efuse_with_virt_secure_boot_v2(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
# only for ESP32 ECO3
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_secure_boot_v2')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
print(' - Erase flash')
dut.erase_flash()
print(' - Flash bootloader')
dut.bootloader_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: Secure boot digests absent, generating..')
dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
dut.expect('secure_boot_v2: 1 signature block(s) found appended to the bootloader')
dut.expect('Writing EFUSE_BLK_KEY1 with purpose 3')
dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
dut.expect('secure_boot_v2: 1 signature block(s) found appended to the app')
dut.expect('secure_boot_v2: Application key(0) matches with bootloader key(0)')
dut.expect('secure_boot_v2: blowing secure boot efuse...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
dut.expect('UART ROM Download mode kept enabled - SECURITY COMPROMISED')
dut.expect('Prevent read disabling of additional efuses...')
dut.expect('secure_boot_v2: Secure boot permanently enabled')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
dut.reset()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('secure_boot_v2: secure boot v2 is already enabled, continuing..')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_EthKitV12', target=['esp32'])
def test_examples_efuse_with_virt_secure_boot_v2_pre_loaded(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
# only for ESP32 ECO3
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_secure_boot_v2')
print(' - Erase flash')
dut.erase_flash()
print(' - Flash bootloader and app')
dut.bootloader_flash()
dut.start_app()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
print(' - Flash emul_efuse with pre-loaded efuses (ABS_DONE_1 1 -> 0)')
# offset of this eFuse is taken from components/efuse/esp32/esp_efuse_table.csv
ABS_DONE_1 = 197
# Resets eFuse, which enables Secure boot (V2) feature
erase_field_on_emul_efuse(dut, [ABS_DONE_1])
print(' - Start app (flash partition_table and app)')
dut.start_app()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: Secure boot digests already present')
dut.expect('secure_boot_v2: Using pre-loaded public key digest in eFuse')
dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
dut.expect('secure_boot_v2: 1 signature block(s) found appended to the app')
dut.expect('secure_boot_v2: Application key(0) matches with bootloader key(0)')
dut.expect('secure_boot_v2: blowing secure boot efuse...')
dut.expect('Disable JTAG...')
dut.expect('Disable ROM BASIC interpreter fallback...')
dut.expect('UART ROM Download mode kept enabled - SECURITY COMPROMISED')
dut.expect('Prevent read disabling of additional efuses...')
dut.expect('secure_boot_v2: Secure boot permanently enabled')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
dut.reset()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('secure_boot_v2: secure boot v2 is already enabled, continuing..')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32s2', 'esp32c3'])
def test_examples_efuse_with_virt_secure_boot_v2_esp32xx(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_secure_boot_v2')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
print(' - Erase flash')
dut.erase_flash()
print(' - Flash bootloader')
dut.bootloader_flash()
print(' - Start app (flash partition_table and app)')
dut.start_app()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: Secure boot digests absent, generating..')
dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
dut.expect('secure_boot_v2: 1 signature block(s) found appended to the bootloader')
dut.expect('Writing EFUSE_BLK_KEY0 with purpose 9')
dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
dut.expect('secure_boot_v2: 1 signature block(s) found appended to the app')
dut.expect('secure_boot_v2: Application key(0) matches with bootloader key(0)')
dut.expect('secure_boot_v2: Revoking empty key digest slot (1)...')
dut.expect('secure_boot_v2: Revoking empty key digest slot (2)...')
dut.expect('secure_boot_v2: blowing secure boot efuse...')
dut.expect('UART ROM Download mode kept enabled - SECURITY COMPROMISED')
dut.expect('Disable hardware & software JTAG...')
dut.expect('secure_boot_v2: Secure boot permanently enabled')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
dut.reset()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('secure_boot_v2: secure boot v2 is already enabled, continuing..')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32s2', 'esp32c3'])
def test_examples_efuse_with_virt_secure_boot_v2_esp32xx_pre_loaded(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_secure_boot_v2')
print(' - Erase flash')
dut.erase_flash()
print(' - Flash bootloader and app')
dut.bootloader_flash()
dut.start_app()
dut.expect('Loading virtual efuse blocks from real efuses')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
print(' - Flash emul_efuse with pre-loaded efuses (SECURE_BOOT_EN 1 -> 0, SECURE_BOOT_KEY_REVOKE[0..2] -> 0)')
# offsets of eFuses are taken from components/efuse/{target}/esp_efuse_table.csv
SECURE_BOOT_EN = 116
SECURE_BOOT_KEY_REVOKE0 = 85
SECURE_BOOT_KEY_REVOKE1 = 86
SECURE_BOOT_KEY_REVOKE2 = 87
# Resets eFuse, which enables Secure boot feature
# Resets eFuses, which control digest slots
erase_field_on_emul_efuse(dut, [SECURE_BOOT_EN, SECURE_BOOT_KEY_REVOKE0, SECURE_BOOT_KEY_REVOKE1, SECURE_BOOT_KEY_REVOKE2])
print(' - Start app (flash partition_table and app)')
dut.start_app()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: Secure boot digests already present')
dut.expect('secure_boot_v2: Using pre-loaded public key digest in eFuse')
dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
dut.expect('secure_boot_v2: 1 signature block(s) found appended to the app')
dut.expect('secure_boot_v2: Application key(0) matches with bootloader key(0)')
dut.expect('secure_boot_v2: Revoking empty key digest slot (1)...')
dut.expect('secure_boot_v2: Revoking empty key digest slot (2)...')
dut.expect('secure_boot_v2: blowing secure boot efuse...')
dut.expect('UART ROM Download mode kept enabled - SECURITY COMPROMISED')
dut.expect('Disable hardware & software JTAG...')
dut.expect('secure_boot_v2: Secure boot permanently enabled')
dut.expect('cpu_start: Pro cpu up')
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Start eFuse example')
dut.expect('example: Done')
dut.reset()
dut.expect('Loading virtual efuse blocks from flash')
dut.expect('Verifying image signature...')
dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
dut.expect('secure_boot_v2: Signature verified successfully!')
dut.expect('secure_boot_v2: enabling secure boot v2...')
dut.expect('secure_boot_v2: secure boot v2 is already enabled, continuing..')
dut.expect('Start eFuse example')
dut.expect('example: Done')
@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32'])
def test_examples_efuse_with_virt_sb_v1_and_fe(env, _): # type: (ttfw_idf.TinyFW.Env, None) -> None
dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_sb_v1_and_fe')
# check and log bin size
binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
bin_size = os.path.getsize(binary_file)
    ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
    print(' - Erase flash')
    dut.erase_flash()
    print(' - Flash bootloader')
    dut.bootloader_flash()
    print(' - Start app (flash partition_table and app)')
    dut.start_app_no_enc()
    dut.expect('Loading virtual efuse blocks from real efuses')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v1: Generating new secure boot key...')
    dut.expect('secure_boot_v1: Generating secure boot digest...')
    dut.expect('secure_boot_v1: Digest generation complete')
    dut.expect('Checking flash encryption...')
    dut.expect('flash_encrypt: Generating new flash encryption key...')
    dut.expect('Writing EFUSE_BLK_KEY0 with purpose 2')
    dut.expect('flash_encrypt: Setting CRYPT_CONFIG efuse to 0xF')
    dut.expect('flash_encrypt: Not disabling UART bootloader encryption')
    dut.expect('flash_encrypt: Disable UART bootloader decryption...')
    dut.expect('flash_encrypt: Disable UART bootloader MMU cache...')
    dut.expect('flash_encrypt: Disable JTAG...')
    dut.expect('flash_encrypt: Disable ROM BASIC interpreter fallback...')
    dut.expect('flash_encrypt: bootloader encrypted successfully')
    dut.expect('flash_encrypt: partition table encrypted and loaded successfully')
    dut.expect('Verifying image signature...')
    dut.expect('flash_encrypt: Flash encryption completed', timeout=90)
    dut.expect('Checking secure boot...')
    dut.expect('secure_boot_v1: blowing secure boot efuse...')
    dut.expect('Read & write protecting new key...')
    dut.expect('Disable JTAG...')
    dut.expect('Disable ROM BASIC interpreter fallback...')
    dut.expect('secure_boot_v1: secure boot is now enabled for bootloader image')
    dut.expect('Resetting with flash encryption enabled...')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v1: bootloader secure boot is already enabled. No need to generate digest. continuing..')
    dut.expect('Checking flash encryption...')
    dut.expect('flash_encrypt: flash encryption is enabled (3 plaintext flashes left)')
    dut.expect('Checking secure boot...')
    dut.expect('secure_boot_v1: bootloader secure boot is already enabled, continuing..')
    dut.expect('cpu_start: Pro cpu up')
    dut.expect('Loading virtual efuse blocks from flash')
    dut.expect('flash_encrypt: Flash encryption mode is DEVELOPMENT (not secure)')
    dut.expect('Start eFuse example')
    dut.expect('example: Done')


@ttfw_idf.idf_example_test(env_tag='Example_EthKitV12', target=['esp32'])
def test_examples_efuse_with_virt_sb_v2_and_fe(env, _):  # type: (ttfw_idf.TinyFW.Env, None) -> None
    # only for ESP32 ECO3
    dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_sb_v2_and_fe')
    # check and log bin size
    binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
    bin_size = os.path.getsize(binary_file)
    ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
    print(' - Erase flash')
    dut.erase_flash()
    print(' - Flash bootloader')
    dut.bootloader_flash()
    print(' - Start app (flash partition_table and app)')
    dut.start_app_no_enc()
    dut.expect('Loading virtual efuse blocks from real efuses')
    dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('secure_boot_v2: enabling secure boot v2...')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully')
    dut.expect('secure_boot_v2: Secure boot digests absent, generating..')
    dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
    dut.expect('secure_boot_v2: 1 signature block(s) found appended to the bootloader')
    dut.expect('Writing EFUSE_BLK_KEY1 with purpose 3')
    dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
    dut.expect('secure_boot_v2: 1 signature block(s) found appended to the app')
    dut.expect('secure_boot_v2: Application key(0) matches with bootloader key(0)')
    dut.expect('secure_boot_v2: blowing secure boot efuse...')
    dut.expect('Disable JTAG...')
    dut.expect('Disable ROM BASIC interpreter fallback...')
    dut.expect('UART ROM Download mode kept enabled - SECURITY COMPROMISED')
    dut.expect('secure_boot_v2: Secure boot permanently enabled')
    dut.expect('Checking flash encryption...')
    dut.expect('flash_encrypt: Generating new flash encryption key...')
    dut.expect('Writing EFUSE_BLK_KEY0 with purpose 2')
    dut.expect('flash_encrypt: Setting CRYPT_CONFIG efuse to 0xF')
    dut.expect('flash_encrypt: Not disabling UART bootloader encryption')
    dut.expect('flash_encrypt: Disable UART bootloader decryption...')
    dut.expect('flash_encrypt: Disable UART bootloader MMU cache...')
    dut.expect('flash_encrypt: Disable JTAG...')
    dut.expect('flash_encrypt: Disable ROM BASIC interpreter fallback...')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('flash_encrypt: bootloader encrypted successfully')
    dut.expect('flash_encrypt: partition table encrypted and loaded successfully')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('flash_encrypt: Flash encryption completed', timeout=90)
    dut.expect('Resetting with flash encryption enabled...')
    dut.expect('Loading virtual efuse blocks from flash')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('secure_boot_v2: enabling secure boot v2...')
    dut.expect('secure_boot_v2: secure boot v2 is already enabled, continuing..')
    dut.expect('flash_encrypt: flash encryption is enabled (3 plaintext flashes left)')
    dut.expect('cpu_start: Pro cpu up')
    dut.expect('Loading virtual efuse blocks from flash')
    dut.expect('flash_encrypt: Flash encryption mode is DEVELOPMENT (not secure)')
    dut.expect('Start eFuse example')
    dut.expect('example: Done')


@ttfw_idf.idf_example_test(env_tag='Example_GENERIC', target=['esp32s2', 'esp32c3'])
def test_examples_efuse_with_virt_sb_v2_and_fe_esp32xx(env, _):  # type: (ttfw_idf.TinyFW.Env, None) -> None
    dut = env.get_dut('efuse', 'examples/system/efuse', app_config_name='virt_sb_v2_and_fe')
    # check and log bin size
    binary_file = os.path.join(dut.app.binary_path, 'bootloader', 'bootloader.bin')
    bin_size = os.path.getsize(binary_file)
    ttfw_idf.log_performance('{}_bootloader_{}_bin_size'.format(dut.app.target, dut.app.config_name), '{}KB'.format(bin_size // 1024))
    print(' - Erase flash')
    dut.erase_flash()
    print(' - Flash bootloader')
    dut.bootloader_flash()
    print(' - Start app (flash partition_table and app)')
    dut.start_app_no_enc()
    dut.expect('Loading virtual efuse blocks from real efuses')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('secure_boot_v2: enabling secure boot v2...')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Secure boot V2 is not enabled yet and eFuse digest keys are not set')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('secure_boot_v2: Secure boot digests absent, generating..')
    dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
    dut.expect('secure_boot_v2: 1 signature block(s) found appended to the bootloader')
    dut.expect('Writing EFUSE_BLK_KEY0 with purpose 9')
    dut.expect('secure_boot_v2: Digests successfully calculated, 1 valid signatures')
    dut.expect('secure_boot_v2: 1 signature block(s) found appended to the app')
    dut.expect('secure_boot_v2: Application key(0) matches with bootloader key(0)')
    dut.expect('secure_boot_v2: Revoking empty key digest slot (1)...')
    dut.expect('secure_boot_v2: Revoking empty key digest slot (2)...')
    dut.expect('secure_boot_v2: blowing secure boot efuse...')
    dut.expect('UART ROM Download mode kept enabled - SECURITY COMPROMISED')
    dut.expect('Disable hardware & software JTAG...')
    dut.expect('secure_boot_v2: Secure boot permanently enabled')
    dut.expect('Checking flash encryption...')
    dut.expect('flash_encrypt: Generating new flash encryption key...')
    dut.expect('Writing EFUSE_BLK_KEY1 with purpose 4')
    dut.expect('Not disabling UART bootloader encryption')
    dut.expect('Disable UART bootloader cache...')
    dut.expect('Disable JTAG...')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('flash_encrypt: bootloader encrypted successfully')
    dut.expect('flash_encrypt: partition table encrypted and loaded successfully')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('flash_encrypt: Flash encryption completed', timeout=90)
    dut.expect('Resetting with flash encryption enabled...')
    dut.expect('Loading virtual efuse blocks from flash')
    dut.expect('Verifying image signature...')
    dut.expect('secure_boot_v2: Verifying with RSA-PSS...')
    dut.expect('secure_boot_v2: Signature verified successfully!')
    dut.expect('secure_boot_v2: enabling secure boot v2...')
    dut.expect('secure_boot_v2: secure boot v2 is already enabled, continuing..')
    dut.expect('flash_encrypt: flash encryption is enabled (1 plaintext flashes left)')
    dut.expect('cpu_start: Pro cpu up')
    dut.expect('Loading virtual efuse blocks from flash')
    dut.expect('flash_encrypt: Flash encryption mode is DEVELOPMENT (not secure)')
    dut.expect('Start eFuse example')
    dut.expect('example: Done')


if __name__ == '__main__':
    test_examples_efuse()
    test_examples_efuse_with_virt_flash_enc()
    test_examples_efuse_with_virt_flash_enc_pre_loaded()
    test_examples_efuse_with_virt_flash_enc_aes_256()
    test_examples_efuse_with_virt_flash_enc_release()
    test_examples_efuse_with_virt_secure_boot_v1()
    test_examples_efuse_with_virt_secure_boot_v1_pre_loaded()
    test_examples_efuse_with_virt_secure_boot_v2()
    test_examples_efuse_with_virt_secure_boot_v2_pre_loaded()
    test_examples_efuse_with_virt_secure_boot_v2_esp32xx()
    test_examples_efuse_with_virt_secure_boot_v2_esp32xx_pre_loaded()
    test_examples_efuse_with_virt_sb_v1_and_fe()
    test_examples_efuse_with_virt_sb_v2_and_fe()
    test_examples_efuse_with_virt_sb_v2_and_fe_esp32xx()
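The `dut.expect(...)` calls in the tests above assert that each log line appears on the device console in the given order. A minimal pure-Python sketch of that in-order matching idea (the `expect_in_order` helper and the sample log below are illustrative, not part of `ttfw_idf`):

```python
def expect_in_order(log, patterns):
    # Scan the captured log once, requiring each pattern to match
    # after the end of the previous match (illustrative helper).
    pos = 0
    for pat in patterns:
        idx = log.find(pat, pos)
        if idx == -1:
            raise AssertionError('pattern not found in order: %r' % pat)
        pos = idx + len(pat)
    return True


log = ('Loading virtual efuse blocks from real efuses\n'
       'Verifying image signature...\n'
       'flash_encrypt: Flash encryption completed\n')
assert expect_in_order(log, ['Loading virtual efuse blocks from real efuses',
                             'flash_encrypt: Flash encryption completed'])
```

Unlike the real `dut.expect`, this has no timeout handling or regex support; it only illustrates the ordered-substring contract the tests rely on.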


# tests/test_list.py (yxtay/data-structures-algorithms, MIT license)
from src.list import OrderedList, UnorderedList


def test_unordered():
    my_list = UnorderedList()
    assert my_list.is_empty() is True
    my_list.add(31)
    assert my_list.is_empty() is False
    assert my_list.size() == 1
    my_list.add(77)
    my_list.add(17)
    my_list.add(93)
    my_list.add(26)
    my_list.add(54)
    assert my_list.is_empty() is False
    assert my_list.size() == 6
    assert my_list.search(17) is True
    my_list.remove(17)
    assert my_list.is_empty() is False
    assert my_list.size() == 5
    assert my_list.search(17) is False


def test_ordered():
    my_list = OrderedList()
    assert my_list.is_empty() is True
    my_list.add(31)
    assert my_list.is_empty() is False
    assert my_list.size() == 1
    my_list.add(77)
    my_list.add(17)
    my_list.add(93)
    my_list.add(26)
    my_list.add(54)
    assert my_list.is_empty() is False
    assert my_list.size() == 6
    assert my_list.search(17) == 0
    items = list(my_list)
    assert items == [17, 26, 31, 54, 77, 93]
    my_list.remove(17)
    assert my_list.is_empty() is False
    assert my_list.size() == 5
    assert my_list.search(17) == -1
    items = list(my_list)
    assert items == [26, 31, 54, 77, 93]
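The tests above pin down the list API: `add`, `size`, `search`, `remove`, and, for `OrderedList`, iteration in sorted order with `search` returning an index or -1. The actual `src.list` implementation is not shown here; `OrderedListSketch` below is a hypothetical minimal singly linked list that satisfies the behavior `test_ordered` checks:

```python
class _Node:
    def __init__(self, value):
        self.value = value
        self.next = None


class OrderedListSketch:
    """Sorted singly linked list (illustrative; the real src.list may differ)."""

    def __init__(self):
        self.head = None
        self._count = 0

    def is_empty(self):
        return self.head is None

    def size(self):
        return self._count

    def add(self, item):
        # walk until the insertion point that keeps the list sorted
        prev, cur = None, self.head
        while cur is not None and cur.value < item:
            prev, cur = cur, cur.next
        node = _Node(item)
        node.next = cur
        if prev is None:
            self.head = node
        else:
            prev.next = node
        self._count += 1

    def search(self, item):
        # return the index of item, or -1 when absent
        idx, cur = 0, self.head
        while cur is not None:
            if cur.value == item:
                return idx
            cur = cur.next
            idx += 1
        return -1

    def remove(self, item):
        prev, cur = None, self.head
        while cur is not None and cur.value != item:
            prev, cur = cur, cur.next
        if cur is None:
            return
        if prev is None:
            self.head = cur.next
        else:
            prev.next = cur.next
        self._count -= 1

    def __iter__(self):
        cur = self.head
        while cur is not None:
            yield cur.value
            cur = cur.next
```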


# src/clusto/drivers/locations/__init__.py (thekad/clusto, BSD-3-Clause license)
from clusto.drivers.locations.datacenters import *
from clusto.drivers.locations.racks import *
from clusto.drivers.locations.zones import *


# modules/prototypical_loss.py (OatmealLiu/DTC, MIT license)
# coding=utf-8
import torch
from torch.nn import functional as F
from torch.nn.modules import Module
import numpy as np


class PrototypicalLoss(Module):
    '''
    Loss class deriving from Module for the prototypical loss function defined below
    '''
    def __init__(self, n_support):
        super(PrototypicalLoss, self).__init__()
        self.n_support = n_support

    def forward(self, input, target):
        # _assert_no_grad(target)
        return prototypical_loss(input, target, self.n_support)


def euclidean_dist(x, y):
    '''
    Compute the pairwise squared euclidean distance between two tensors
    '''
    # x: N x D
    # y: M x D
    n = x.size(0)
    m = y.size(0)
    d = x.size(1)
    if d != y.size(1):
        raise ValueError('x and y must have the same feature dimension')

    x = x.unsqueeze(1).expand(n, m, d)
    y = y.unsqueeze(0).expand(n, m, d)

    return torch.pow(x - y, 2).sum(2)
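The same pairwise-distance computation can be written with NumPy broadcasting, which is what the `unsqueeze`/`expand` pair emulates (an illustrative re-implementation, not used by this module):

```python
import numpy as np


def euclidean_dist_np(x, y):
    # (N, 1, D) - (1, M, D) broadcasts to (N, M, D);
    # summing squares over D yields the (N, M) distance matrix.
    diff = x[:, None, :] - y[None, :, :]
    return (diff ** 2).sum(axis=2)


x = np.array([[0.0, 0.0], [3.0, 4.0]])
y = np.array([[0.0, 0.0]])
# squared distances: [[0.0], [25.0]]
assert np.allclose(euclidean_dist_np(x, y), [[0.0], [25.0]])
```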


def prototypical_loss(input, target, n_support):
    '''
    Inspired by https://github.com/jakesnell/prototypical-networks/blob/master/protonets/models/few_shot.py

    Compute the barycentres by averaging the features of the n_support
    samples of each class in target, then compute the distances from each
    sample's features to each barycentre and the log-probability of each of
    the n_query samples belonging to each current class c; loss and
    accuracy are then computed and returned.

    Args:
    - input: the model output for a batch of samples
    - target: ground truth for the above batch of samples
    - n_support: number of samples to keep in account when computing
      barycentres, for each one of the current classes
    '''
    target_cpu = target.to('cpu')
    input_cpu = input.to('cpu')

    def supp_idxs(c):
        # FIXME when torch will support where as np
        return target_cpu.eq(c).nonzero()[:n_support].squeeze(1)

    # FIXME when torch.unique will be available on cuda too
    classes = torch.unique(target_cpu)
    n_classes = len(classes)
    # FIXME when torch will support where as np
    # assuming n_query, n_target constants
    n_query = target_cpu.eq(classes[0].item()).sum().item() - n_support

    support_idxs = list(map(supp_idxs, classes))
    # print('support_idxs', support_idxs)
    prototypes = torch.stack([input_cpu[idx_list].mean(0) for idx_list in support_idxs])
    # FIXME when torch will support where as np
    query_idxs = torch.stack(list(map(lambda c: target_cpu.eq(c).nonzero()[n_support:], classes))).view(-1)

    query_samples = input.to('cpu')[query_idxs]
    dists = euclidean_dist(query_samples, prototypes)
    # print('dist', F.log_softmax(-dists, dim=1).shape)
    log_p_y = F.log_softmax(-dists, dim=1).view(n_classes, n_query, -1)
    # print('log_p_y', log_p_y.shape)
    target_inds = torch.arange(0, n_classes)
    target_inds = target_inds.view(n_classes, 1, 1)
    target_inds = target_inds.expand(n_classes, n_query, 1).long()
    # print(target_inds.shape)
    # print(log_p_y)
    loss_val = -log_p_y.gather(2, target_inds).squeeze().view(-1).mean()
    _, y_hat = log_p_y.max(2)
    acc_val = y_hat.eq(target_inds.squeeze(2)).float().mean()

    return loss_val, acc_val
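The episode logic above (support means as prototypes, queries classified by distance to the nearest prototype) can be sketched without PyTorch. `proto_accuracy_np` is a hypothetical NumPy helper that reproduces only the nearest-prototype accuracy, not the log-softmax loss:

```python
import numpy as np


def proto_accuracy_np(features, labels, n_support):
    # For each class: first n_support samples form the prototype
    # (mean feature vector), the rest become queries.
    classes = np.unique(labels)
    protos, queries, targets = [], [], []
    for k, c in enumerate(classes):
        idx = np.flatnonzero(labels == c)
        protos.append(features[idx[:n_support]].mean(axis=0))
        queries.append(features[idx[n_support:]])
        targets.extend([k] * (len(idx) - n_support))
    protos = np.stack(protos)
    queries = np.concatenate(queries)
    # squared distances from every query to every prototype
    d = ((queries[:, None, :] - protos[None, :, :]) ** 2).sum(axis=2)
    pred = d.argmin(axis=1)
    return float((pred == np.array(targets)).mean())
```

With two well-separated 1-D clusters and one support sample per class, every query lands nearest its own prototype and the accuracy is 1.0.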


def prototypical_loss_pair(input, target, n_support, normalize=False):
    '''
    Inspired by https://github.com/jakesnell/prototypical-networks/blob/master/protonets/models/few_shot.py

    Compute the barycentres by averaging the features of the n_support
    samples of each class in target, then compute the distances from each
    sample's features to each barycentre and the log-probability of each of
    the n_query samples belonging to each current class c; loss and
    accuracy are then computed and returned.

    Args:
    - input: the model output for a batch of samples
    - target: ground truth for the above batch of samples
    - n_support: number of samples to keep in account when computing
      barycentres, for each one of the current classes
    '''
    if normalize:
        radius = 30.0
        input = F.normalize(input) * radius
    target_cpu = target.to('cpu')
    input_cpu = input.to('cpu')

    def supp_idxs(c):
        # FIXME when torch will support where as np
        return target_cpu.eq(c).nonzero()[:n_support].squeeze(1)

    # FIXME when torch.unique will be available on cuda too
    classes = torch.unique(target_cpu)
    n_classes = len(classes)
    # FIXME when torch will support where as np
    # assuming n_query, n_target constants
    n_query = target_cpu.eq(classes[0].item()).sum().item() - n_support

    support_idxs = list(map(supp_idxs, classes))
    # print('support_idxs', support_idxs)
    prototypes = torch.stack([input_cpu[idx_list].mean(0) for idx_list in support_idxs])
    if normalize:
        prototypes = F.normalize(prototypes)
        prototypes = prototypes * radius
    # print('prototypes', prototypes.shape, prototypes.norm(dim=1))
    # FIXME when torch will support where as np
    query_idxs = torch.stack(list(map(lambda c: target_cpu.eq(c).nonzero()[n_support:], classes))).view(-1)

    query_samples = input.to('cpu')[query_idxs]
    dists = euclidean_dist(query_samples, prototypes)
    # print('dists', dists.shape)
    # print('dist', F.log_softmax(-dists, dim=1).shape)
    log_p_y = F.log_softmax(-dists, dim=1).view(n_classes, n_query, -1)
    # print('log_p_y', log_p_y.shape)
    target_inds = torch.arange(0, n_classes)
    target_inds = target_inds.view(n_classes, 1, 1)
    target_inds = target_inds.expand(n_classes, n_query, 1).long()
    # print(target_inds.shape)
    # print(log_p_y)
    loss_val = -log_p_y.gather(2, target_inds).squeeze().view(-1).mean()
    _, y_hat = log_p_y.max(2)
    acc_val = y_hat.eq(target_inds.squeeze(2)).float().mean()
    # print('target, y_hat', target_inds.squeeze(2).shape, y_hat.shape)

    return loss_val, acc_val


def prototypical_loss_pair_cyc(input, target, n_support, normalize=False):
    '''
    Inspired by https://github.com/jakesnell/prototypical-networks/blob/master/protonets/models/few_shot.py

    Compute the barycentres by averaging the features of the n_support
    samples of each class in target, then compute the distances from each
    sample's features to each barycentre and the log-probability of each of
    the n_query samples belonging to each current class c; loss and
    accuracy are then computed and returned. The loss is symmetrized over
    both softmax directions of the distance matrix.

    Args:
    - input: the model output for a batch of samples
    - target: ground truth for the above batch of samples
    - n_support: number of samples to keep in account when computing
      barycentres, for each one of the current classes
    '''
    if normalize:
        radius = 30.0
        input = F.normalize(input) * radius
    target_cpu = target.to('cpu')
    input_cpu = input.to('cpu')

    def supp_idxs(c):
        # FIXME when torch will support where as np
        return target_cpu.eq(c).nonzero()[:n_support].squeeze(1)

    # FIXME when torch.unique will be available on cuda too
    classes = torch.unique(target_cpu)
    n_classes = len(classes)
    # FIXME when torch will support where as np
    # assuming n_query, n_target constants
    n_query = target_cpu.eq(classes[0].item()).sum().item() - n_support

    support_idxs = list(map(supp_idxs, classes))
    # print('support_idxs', support_idxs)
    prototypes = torch.stack([input_cpu[idx_list].mean(0) for idx_list in support_idxs])
    if normalize:
        prototypes = F.normalize(prototypes)
        prototypes = prototypes * radius
    # print('prototypes', prototypes.shape, prototypes.norm(dim=1))
    # FIXME when torch will support where as np
    query_idxs = torch.stack(list(map(lambda c: target_cpu.eq(c).nonzero()[n_support:], classes))).view(-1)

    query_samples = input.to('cpu')[query_idxs]
    dists = euclidean_dist(query_samples, prototypes)
    # print('dists', dists.shape)
    # print('dist', F.log_softmax(-dists, dim=1).shape)
    log_p_y_AB = F.log_softmax(-dists, dim=1).view(n_classes, n_query, -1)
    log_p_y_BA = F.log_softmax(-dists.transpose(0, 1), dim=1).view(n_classes, n_query, -1)
    # print('log_p_y', log_p_y.shape)
    target_inds = torch.arange(0, n_classes)
    target_inds = target_inds.view(n_classes, 1, 1)
    target_inds = target_inds.expand(n_classes, n_query, 1).long()
    # print(target_inds.shape)
    # print(log_p_y)
    loss_val_AB = -log_p_y_AB.gather(2, target_inds).squeeze().view(-1).mean()
    _, y_hat_AB = log_p_y_AB.max(2)
    acc_val_AB = y_hat_AB.eq(target_inds.squeeze(2)).float().mean()

    loss_val_BA = -log_p_y_BA.gather(2, target_inds).squeeze().view(-1).mean()
    _, y_hat_BA = log_p_y_BA.max(2)
    acc_val_BA = y_hat_BA.eq(target_inds.squeeze(2)).float().mean()

    loss_val = (loss_val_AB + loss_val_BA) / 2.0
    acc_val = (acc_val_AB + acc_val_BA) / 2.0

    return loss_val, acc_val


def prototypical_center_loss(input, target, n_support=None):
    target_cpu = target.to('cpu')
    input_cpu = input.to('cpu')

    def supp_idxs(c):
        # FIXME when torch will support where as np
        return target_cpu.eq(c).nonzero().squeeze(1)

    # FIXME when torch.unique will be available on cuda too
    classes = torch.unique(target_cpu, sorted=True)
    n_classes = len(classes)
    n_query = target_cpu.eq(classes[0].item()).sum().item()
    support_idxs = list(map(supp_idxs, classes))
    prototypes = torch.stack([input_cpu[idx_list].mean(0) for idx_list in support_idxs])
    query_idxs = torch.cat(list(map(lambda c: target_cpu.eq(c).nonzero(), classes))).view(-1)
    class_to_idx = {classes.numpy()[i]: i for i in range(len(classes))}
    query_samples = input.to('cpu')[query_idxs]
    target_cpu = target_cpu[query_idxs]  # sort targets according to query_idxs
    dists = euclidean_dist(query_samples, prototypes)
    log_p_y = F.log_softmax(-dists, dim=1)
    target_indx = [class_to_idx[i] for i in target_cpu.numpy()]
    target_indx = torch.Tensor(target_indx).long()
    loss_val = -log_p_y.gather(1, target_indx.unsqueeze(1)).view(-1).mean()
    _, y_hat = log_p_y.max(1)
    acc_val = y_hat.eq(target_indx).float().mean()

    return loss_val, acc_val


def prototypical_mpair_loss(input, target, n_support=None):
    target_cpu = target.to('cpu')
    input_cpu = input.to('cpu')
    # print('target', target_cpu)
    classes = torch.unique(target_cpu)
    n_classes = len(classes)
    dists = euclidean_dist(input_cpu, input_cpu)
    n_per_cls = len(target) // n_classes  # integer division (Python 3)
    dists_np = dists.detach().numpy()
    # print('dist', dists)
    block_mask = np.kron(np.eye(n_classes), np.ones([n_per_cls, n_per_cls]))
    dists_np = dists_np - 2 * dists_np * block_mask
    # print('dist np', dists_np)
    ind_tuple = np.zeros([len(target), n_classes])
    for i in range(n_classes):
        ind_tuple[:, i] = np.argmin(dists_np[:, i * n_per_cls:(i + 1) * n_per_cls], axis=1) + i * n_per_cls
    ind_tuple = torch.from_numpy(ind_tuple).long()
    # print(ind_tuple)
    dists = dists.gather(1, ind_tuple)
    # print('dist after', dists)
    # class_to_idx = {classes.numpy()[i]: i for i in range(len(classes))}
    log_p_y = F.log_softmax(-dists, dim=1)
    # target_indx = [class_to_idx[i] for i in target_cpu.numpy()]
    # target_indx = torch.Tensor(target_indx).long()
    target_inds = torch.arange(0, n_classes)
    target_inds = target_inds.view(n_classes, 1)
    target_inds = target_inds.expand(n_classes, n_per_cls).long().contiguous().view(-1, 1)
    # print('target_inds', target_inds)
    loss_val = -log_p_y.gather(1, target_inds).view(-1).mean()
    _, y_hat = log_p_y.max(1)
    # print('y_hat, target_inds', y_hat, target_inds.squeeze())
    acc_val = y_hat.eq(target_inds.squeeze()).float().mean()

    return loss_val, acc_val
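The `np.kron(np.eye(n_classes), np.ones([n_per_cls, n_per_cls]))` call above builds a block-diagonal mask of same-class pairs; subtracting `2 * dists * mask` negates same-class entries, so the per-block `argmin` selects the farthest same-class sample (never the sample itself, whose flipped distance is 0) and, in other blocks, the nearest other-class sample. A small NumPy sketch of just that selection step (`hardest_indices` is an illustrative name, not part of this module):

```python
import numpy as np


def hardest_indices(dists, n_classes, n_per_cls):
    # Flip same-class distances negative, then take the
    # per-class-block argmin for every row.
    mask = np.kron(np.eye(n_classes), np.ones([n_per_cls, n_per_cls]))
    flipped = dists - 2 * dists * mask
    out = np.zeros([dists.shape[0], n_classes], dtype=int)
    for i in range(n_classes):
        out[:, i] = flipped[:, i * n_per_cls:(i + 1) * n_per_cls].argmin(axis=1) + i * n_per_cls
    return out


# samples ordered [a0, a1, b0, b1]: two classes, two samples each
dists = np.array([[0.0, 1.0, 5.0, 4.0],
                  [1.0, 0.0, 6.0, 7.0],
                  [5.0, 6.0, 0.0, 2.0],
                  [4.0, 7.0, 2.0, 0.0]])
# row 0 picks a1 (farthest positive) and b1 (nearest negative)
assert hardest_indices(dists, 2, 2).tolist() == [[1, 3], [0, 2], [0, 3], [0, 2]]
```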


def prototypical_mpair_l2_loss(input, target, radius=None):
    target_cpu = target.to('cpu')
    input_cpu = input.to('cpu')
    # print('target', target_cpu)
    if radius is not None:
        input = F.normalize(input) * radius
    classes = torch.unique(target_cpu)
    n_classes = len(classes)
    dists = euclidean_dist(input_cpu, input_cpu)
    n_per_cls = len(target) // n_classes  # integer division (Python 3)
    dists_np = dists.detach().numpy()
    # print('dist', dists)
    block_mask = np.kron(np.eye(n_classes), np.ones([n_per_cls, n_per_cls]))
    dists_np = dists_np - 2 * dists_np * block_mask
    # print('dist np', dists_np)
    ind_tuple = np.zeros([len(target), n_classes])
    for i in range(n_classes):
        ind_tuple[:, i] = np.argmin(dists_np[:, i * n_per_cls:(i + 1) * n_per_cls], axis=1) + i * n_per_cls
    ind_tuple = torch.from_numpy(ind_tuple).long()
    # print(ind_tuple)
    dists = dists.gather(1, ind_tuple)
    # print('dist after', dists)
    # class_to_idx = {classes.numpy()[i]: i for i in range(len(classes))}
    log_p_y = F.log_softmax(-dists, dim=1)
    # target_indx = [class_to_idx[i] for i in target_cpu.numpy()]
    # target_indx = torch.Tensor(target_indx).long()
    target_inds = torch.arange(0, n_classes)
    target_inds = target_inds.view(n_classes, 1)
    target_inds = target_inds.expand(n_classes, n_per_cls).long().contiguous().view(-1, 1)
    # print('target_inds', target_inds)
    loss_val = -log_p_y.gather(1, target_inds).view(-1).mean()
    _, y_hat = log_p_y.max(1)
    # print('y_hat, target_inds', y_hat, target_inds.squeeze())
    acc_val = y_hat.eq(target_inds.squeeze()).float().mean()

    return loss_val, acc_val


if __name__ == "__main__":
    x = torch.randn(10, 256)
    y = torch.Tensor([10, 10, 2, 2, 3, 3, 4, 4, 6, 6])
    l, a = prototypical_mpair_loss(x, y)
    # print(l, a)

    # classes = torch.unique(y)
    # print('classes', classes)
    # class_to_idx = {classes[i]: i for i in range(len(classes))}
    # print('class_to_idx', class_to_idx)

    n_classes = 5
    n_query = 2
    target_inds = torch.arange(0, n_classes)
    target_inds = target_inds.view(n_classes, 1, 1)
    target_inds = target_inds.expand(n_classes, n_query, 1).long()
    # print(target_inds.contiguous().view(-1))
    # print(target_inds.contiguous().view(-1, 1))


# mlrun/api/utils/singletons/k8s.py (Hedingber/mlrun, Apache-2.0 license)
from mlrun.k8s_utils import K8sHelper, get_k8s_helper


def get_k8s() -> K8sHelper:
    return get_k8s_helper(silent=True)


# src/dataload/sources/exac/exac_upload.py (SuLab/myvariant.info, Apache-2.0 license)
import glob
import os

from .exac_parser import load_data
import biothings.dataload.uploader as uploader
from dataload.uploader import SnpeffPostUpdateUploader


class ExacBaseUploader(SnpeffPostUpdateUploader):

    __metadata__ = {
        "mapper": 'observed',
        "assembly": "hg19",
        "src_meta": {
            "url": "http://exac.broadinstitute.org/",
            "license": "ODbL",
            "license_url": "http://exac.broadinstitute.org/terms",
            "license_url_short": "https://goo.gl/MH8b34"
        }
    }

    @classmethod
    def get_mapping(klass):
        mapping = {
"exac": {
"properties": {
"chrom": {
"type": "string",
"analyzer": "string_lowercase"
},
"pos": {
"type": "long"
},
"ref": {
"type": "string",
"analyzer": "string_lowercase"
},
"alt": {
"type": "string",
"analyzer": "string_lowercase"
},
"multi-allelic": {
"type": "string",
"analyzer": "string_lowercase"
},
"alleles": {
"type": "string",
"analyzer": "string_lowercase"
},
"type": {
"type": "string",
"analyzer": "string_lowercase"
},
"qual": {
"type": "float"
},
"filter": {
"type": "string",
"analyzer": "string_lowercase"
},
"ac": {
"properties": {
"ac": {
"type": "integer"
},
"ac_afr": {
"type": "integer"
},
"ac_amr": {
"type": "integer"
},
"ac_adj": {
"type": "integer"
},
"ac_eas": {
"type": "integer"
},
"ac_fin": {
"type": "integer"
},
"ac_nfe": {
"type": "integer"
},
"ac_oth": {
"type": "integer"
},
"ac_sas": {
"type": "integer"
},
"ac_male": {
"type": "integer"
},
"ac_female": {
"type": "integer"
}
}
},
"af": {
"type": "float"
},
"an": {
"properties": {
"an": {
"type": "integer"
},
"an_afr": {
"type": "integer"
},
"an_amr": {
"type": "integer"
},
"an_adj": {
"type": "integer"
},
"an_eas": {
"type": "integer"
},
"an_fin": {
"type": "integer"
},
"an_nfe": {
"type": "integer"
},
"an_oth": {
"type": "integer"
},
"an_sas": {
"type": "integer"
},
"an_female": {
"type": "integer"
},
"an_male": {
"type": "integer"
}
}
},
"baseqranksum": {
"type": "float"
},
"clippingranksum": {
"type": "float"
},
"fs": {
"type": "float"
},
"dp": {
"type": "long"
},
"het": {
"properties": {
"het_afr": {
"type": "integer"
},
"het_amr": {
"type": "integer"
},
"het_eas": {
"type": "integer"
},
"het_fin": {
"type": "integer"
},
"het_nfe": {
"type": "integer"
},
"het_oth": {
"type": "integer"
},
"het_sas": {
"type": "integer"
},
"ac_het": {
"type": "integer"
}
}
},
"hom": {
"properties": {
"hom_afr": {
"type": "integer"
},
"hom_amr": {
"type": "integer"
},
"hom_eas": {
"type": "integer"
},
"hom_fin": {
"type": "integer"
},
"hom_nfe": {
"type": "integer"
},
"hom_oth": {
"type": "integer"
},
"hom_sas": {
"type": "integer"
},
"ac_hom": {
"type": "integer"
}
}
},
"inbreedingcoeff": {
"type": "float"
},
"mq": {
"properties": {
"mq": {
"type": "float"
},
"mq0": {
"type": "integer"
},
"mqranksum": {
"type": "float"
}
}
},
"ncc": {
"type": "long"
},
"qd": {
"type": "float"
},
"readposranksum": {
"type": "float"
},
"vqslod": {
"type": "float"
},
"culprit": {
"type": "string",
"analyzer": "string_lowercase"
}
}
},
"exac_nontcga": {
"properties": {
"chrom": {
"type": "string",
"analyzer": "string_lowercase"
},
"pos": {
"type": "long"
},
"ref": {
"type": "string",
"analyzer": "string_lowercase"
},
"alt": {
"type": "string",
"analyzer": "string_lowercase"
},
"multi-allelic": {
"type": "string",
"analyzer": "string_lowercase"
},
"alleles": {
"type": "string",
"analyzer": "string_lowercase"
},
"type": {
"type": "string",
"analyzer": "string_lowercase"
},
"qual": {
"type": "float"
},
"filter": {
"type": "string",
"analyzer": "string_lowercase"
},
"ac": {
"properties": {
"ac": {
"type": "integer"
},
"ac_afr": {
"type": "integer"
},
"ac_amr": {
"type": "integer"
},
"ac_adj": {
"type": "integer"
},
"ac_eas": {
"type": "integer"
},
"ac_fin": {
"type": "integer"
},
"ac_nfe": {
"type": "integer"
},
"ac_oth": {
"type": "integer"
},
"ac_sas": {
"type": "integer"
},
"ac_male": {
"type": "integer"
},
"ac_female": {
"type": "integer"
}
}
},
"af": {
"type": "float"
},
"an": {
"properties": {
"an": {
"type": "integer"
},
"an_afr": {
"type": "integer"
},
"an_amr": {
"type": "integer"
},
"an_adj": {
"type": "integer"
},
"an_eas": {
"type": "integer"
},
"an_fin": {
"type": "integer"
},
"an_nfe": {
"type": "integer"
},
"an_oth": {
"type": "integer"
},
"an_sas": {
"type": "integer"
},
"an_female": {
"type": "integer"
},
"an_male": {
"type": "integer"
}
}
},
"baseqranksum": {
"type": "float"
},
"clippingranksum": {
"type": "float"
},
"fs": {
"type": "float"
},
"dp": {
"type": "long"
},
"het": {
"properties": {
"het_afr": {
"type": "integer"
},
"het_amr": {
"type": "integer"
},
"het_eas": {
"type": "integer"
},
"het_fin": {
"type": "integer"
},
"het_nfe": {
"type": "integer"
},
"het_oth": {
"type": "integer"
},
"het_sas": {
"type": "integer"
},
"ac_het": {
"type": "integer"
}
}
},
"hom": {
"properties": {
"hom_afr": {
"type": "integer"
},
"hom_amr": {
"type": "integer"
},
"hom_eas": {
"type": "integer"
},
"hom_fin": {
"type": "integer"
},
"hom_nfe": {
"type": "integer"
},
"hom_oth": {
"type": "integer"
},
"hom_sas": {
"type": "integer"
},
"ac_hom": {
"type": "integer"
}
}
},
"inbreedingcoeff": {
"type": "float"
},
"mq": {
"properties": {
"mq": {
"type": "float"
},
"mq0": {
"type": "integer"
},
"mqranksum": {
"type": "float"
}
}
},
"ncc": {
"type": "long"
},
"qd": {
"type": "float"
},
"readposranksum": {
"type": "float"
},
"vqslod": {
"type": "float"
},
"culprit": {
"type": "string",
"analyzer": "string_lowercase"
}
}
}
}
return mapping
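The mapping above is ordinary nested Python dicts; a small helper (illustrative only, not part of the BioThings or Elasticsearch client APIs) can flatten such a mapping to dotted field paths for inspection:

```python
def leaf_types(mapping, prefix=""):
    """Walk an Elasticsearch-style mapping fragment and return a dict of
    dotted field paths to their declared types."""
    out = {}
    for key, value in mapping.items():
        path = prefix + key
        if "properties" in value:
            # Nested object: recurse into its sub-fields.
            out.update(leaf_types(value["properties"], path + "."))
        else:
            out[path] = value["type"]
    return out


# A tiny fragment shaped like the mapping above.
fragment = {
    "an": {"properties": {"an_afr": {"type": "integer"}}},
    "qd": {"type": "float"},
}
print(leaf_types(fragment))  # {'an.an_afr': 'integer', 'qd': 'float'}
```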
class ExacUploader(ExacBaseUploader):

    name = "exac"
    main_source = "exac"

    def load_data(self, data_folder):
        content = glob.glob(os.path.join(data_folder, "ExAC.r*.vcf"))
        if len(content) != 1:
            raise uploader.ResourceError("Expecting one single vcf file, got: %s" % repr(content))
        input_file = content.pop()
        self.logger.info("Load data from file '%s'" % input_file)
        return load_data(self.__class__.name, input_file)


class ExacNonTCGAUploader(ExacBaseUploader):

    name = "exac_nontcga"
    main_source = "exac"

    def load_data(self, data_folder):
        content = glob.glob(os.path.join(data_folder, "ExAC_nonTCGA.r*.vcf"))
        if len(content) != 1:
            raise uploader.ResourceError("Expecting one single vcf file, got: %s" % repr(content))
        input_file = content.pop()
        self.logger.info("Load data from file '%s'" % input_file)
        return load_data(self.__class__.name, input_file)
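Both uploaders share the same glob-and-validate pattern: expect exactly one VCF matching a release-prefixed pattern, otherwise fail loudly. A minimal standalone sketch of that pattern (`find_single_vcf` and the plain `ValueError` are illustrative stand-ins for the real `uploader.ResourceError` API):

```python
import glob
import os
import tempfile


def find_single_vcf(data_folder, pattern):
    """Return the one file matching `pattern`, raising if zero or
    several do -- the same check both load_data methods perform."""
    content = glob.glob(os.path.join(data_folder, pattern))
    if len(content) != 1:
        raise ValueError("Expecting one single vcf file, got: %s" % repr(content))
    return content.pop()


# Demonstrate the success path in a scratch folder.
folder = tempfile.mkdtemp()
open(os.path.join(folder, "ExAC.r1.0.vcf"), "w").close()
found = find_single_vcf(folder, "ExAC.r*.vcf")
print(os.path.basename(found))  # ExAC.r1.0.vcf
```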
07c6b757fb01e292c703bde835ce69b1abdeffc6 | 13096 | py | Python | tests/pytests/unit/beacons/test_adb.py | babs/salt | Apache-2.0
"""
    tests.pytests.unit.beacons.test_adb
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    ADB beacon test cases
"""
import pytest

import salt.beacons.adb as adb
from tests.support.mock import Mock, patch


@pytest.fixture
def configure_loader_modules():
    return {adb: {"last_state": {}, "last_state_extra": {"no_devices": False}}}


def test_no_adb_command():
    with patch("salt.utils.path.which") as mock:
        mock.return_value = None

        ret = adb.__virtual__()
        mock.assert_called_once_with("adb")
        assert not ret


def test_with_adb_command():
    with patch("salt.utils.path.which") as mock:
        mock.return_value = "/usr/bin/adb"

        ret = adb.__virtual__()
        mock.assert_called_once_with("adb")
        assert ret == "adb"


def test_non_list_config():
    config = {}

    ret = adb.validate(config)
    assert ret == (False, "Configuration for adb beacon must be a list.")


def test_empty_config():
    config = [{}]

    ret = adb.validate(config)
    assert ret == (False, "Configuration for adb beacon must include a states array.")


def test_invalid_states():
    config = [{"states": ["Random", "Failings"]}]

    ret = adb.validate(config)
    assert ret == (
        False,
        "Need a one of the following adb states: offline, bootloader, device, host, recovery, no permissions, sideload, unauthorized, unknown, missing",
    )
def test_device_state():
    config = [{"states": ["device"]}]

    mock = Mock(return_value="List of devices attached\nHTC\tdevice")
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]


def test_device_state_change():
    config = [{"states": ["offline"]}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "List of devices attached\nHTC\toffline",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == []

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "offline", "tag": "offline"}]


def test_multiple_devices():
    config = [{"states": ["offline", "device"]}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "List of devices attached\nHTC\toffline\nNexus\tdevice",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "offline", "tag": "offline"},
            {"device": "Nexus", "state": "device", "tag": "device"},
        ]
def test_no_devices_with_different_states():
    config = [{"states": ["offline"], "no_devices_event": True}]

    mock = Mock(return_value="List of devices attached\nHTC\tdevice")
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == []


def test_no_devices_no_repeat():
    config = [{"states": ["offline", "device"], "no_devices_event": True}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "List of devices attached",
        "List of devices attached",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]

        ret = adb.beacon(config)
        assert ret == [{"tag": "no_devices"}]

        ret = adb.beacon(config)
        assert ret == []


def test_no_devices():
    config = [{"states": ["offline", "device"], "no_devices_event": True}]

    out = ["List of devices attached", "List of devices attached"]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"tag": "no_devices"}]

        ret = adb.beacon(config)
        assert ret == []


def test_device_missing():
    config = [{"states": ["device", "missing"]}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "List of devices attached",
        "List of devices attached\nHTC\tdevice",
        "List of devices attached\nHTC\tdevice",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "missing", "tag": "missing"}]

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]

        ret = adb.beacon(config)
        assert ret == []
def test_with_startup():
    config = [{"states": ["device"]}]

    mock = Mock(
        return_value="* daemon started successfully *\nList of devices attached\nHTC\tdevice",
    )
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]


def test_with_user():
    config = [{"states": ["device"], "user": "fred"}]

    mock = Mock(
        return_value="* daemon started successfully *\nList of devices attached\nHTC\tdevice"
    )
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        mock.assert_called_once_with("adb devices", runas="fred")
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]


def test_device_low_battery():
    config = [{"states": ["device"], "battery_low": 30}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "device", "tag": "device"},
            {"device": "HTC", "battery_level": 25, "tag": "battery_low"},
        ]


def test_device_no_repeat():
    config = [{"states": ["device"], "battery_low": 30}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25",
        "List of devices attached\nHTC\tdevice",
        "25",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "device", "tag": "device"},
            {"device": "HTC", "battery_level": 25, "tag": "battery_low"},
        ]

        ret = adb.beacon(config)
        assert ret == []
def test_device_no_repeat_capacity_increase():
    config = [{"states": ["device"], "battery_low": 75}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25",
        "List of devices attached\nHTC\tdevice",
        "30",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "device", "tag": "device"},
            {"device": "HTC", "battery_level": 25, "tag": "battery_low"},
        ]

        ret = adb.beacon(config)
        assert ret == []


def test_device_no_repeat_with_not_found_state():
    config = [{"states": ["offline"], "battery_low": 30}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25",
        "List of devices attached\nHTC\tdevice",
        "25",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "battery_level": 25, "tag": "battery_low"}]

        ret = adb.beacon(config)
        assert ret == []


def test_device_battery_charged():
    config = [{"states": ["device"], "battery_low": 30}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "100",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]


def test_device_low_battery_equal():
    config = [{"states": ["device"], "battery_low": 25}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "device", "tag": "device"},
            {"device": "HTC", "battery_level": 25, "tag": "battery_low"},
        ]


def test_device_battery_not_found():
    config = [{"states": ["device"], "battery_low": 25}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "/system/bin/sh: cat: /sys/class/power_supply/*/capacity: No such file or directory",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]
def test_device_repeat_multi():
    config = [{"states": ["offline"], "battery_low": 35}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25",
        "List of devices attached\nHTC\tdevice",
        "40",
        "List of devices attached\nHTC\tdevice",
        "25",
        "List of devices attached\nHTC\tdevice",
        "80",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "battery_level": 25, "tag": "battery_low"}]

        ret = adb.beacon(config)
        assert ret == []

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "battery_level": 25, "tag": "battery_low"}]

        ret = adb.beacon(config)
        assert ret == []


def test_weird_batteries():
    config = [{"states": ["device"], "battery_low": 25}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "-9000",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [{"device": "HTC", "state": "device", "tag": "device"}]


def test_multiple_batteries():
    config = [{"states": ["device"], "battery_low": 30}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25\n40",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "device", "tag": "device"},
            {"device": "HTC", "battery_level": 25, "tag": "battery_low"},
        ]


def test_multiple_low_batteries():
    config = [{"states": ["device"], "battery_low": 30}]

    out = [
        "List of devices attached\nHTC\tdevice",
        "25\n14",
    ]
    mock = Mock(side_effect=out)
    with patch.dict(adb.__salt__, {"cmd.run": mock}):
        ret = adb.validate(config)
        assert ret == (True, "Valid beacon configuration")

        ret = adb.beacon(config)
        assert ret == [
            {"device": "HTC", "state": "device", "tag": "device"},
            {"device": "HTC", "battery_level": 25, "tag": "battery_low"},
        ]
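Most of the tests above drive `adb.beacon` through `Mock(side_effect=[...])`: each successive `cmd.run` call returns the next canned output, simulating adb state changing between beacon polls. The mechanism in isolation:

```python
from unittest.mock import Mock

# side_effect makes each successive call return the next canned string,
# which is how the tests simulate a device going from attached to offline.
cmd_run = Mock(side_effect=[
    "List of devices attached\nHTC\tdevice",
    "List of devices attached\nHTC\toffline",
])

first = cmd_run("adb devices")
second = cmd_run("adb devices")
print(first.endswith("device"), second.endswith("offline"))  # True True
```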
07d5016da02a24e6c0a6bb5e54626cc0b891b614 | 3057 | py | Python | sightings/migrations/0002_auto_20210410_2311.py | zc2575/4501_Squirrel_Tracker | BSD-3-Clause
# Generated by Django 3.1.7 on 2021-04-10 23:11
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('sightings', '0001_initial'),
    ]

    # Every behaviour/sound column is altered to the same optional TRUE/FALSE
    # CharField, so the thirteen identical AlterField operations are built in
    # a comprehension instead of being spelled out one by one.
    operations = [
        migrations.AlterField(
            model_name='squirrel',
            name=field_name,
            field=models.CharField(blank=True, choices=[('TRUE', 'TRUE'), ('FALSE', 'FALSE')], max_length=100),
        )
        for field_name in (
            'approaches', 'chasing', 'climbing', 'eating', 'foraging',
            'indifferent', 'kuks', 'moans', 'quaas', 'running',
            'runs_from', 'tail_flags', 'tail_twitches',
        )
    ]
07dc34dc3464d6489bf70aee0bc1e1eec0c017f5 | 2942 | py | Python | conans/test/functional/remote/server_error_test.py | matthiasng/conan | MIT
import unittest
from conans.test.utils.tools import TestClient, TestServer
from collections import namedtuple


class Error200NoJson(unittest.TestCase):

    def test_error_no_json(self):
        class RequesterMock(object):
            def __init__(self, *args, **kwargs):
                pass

            def get(self, *args, **kwargs):  # @UnusedVariable
                # Response must be binary, it is decoded in RestClientCommon
                return namedtuple("Response", "status_code headers content ok")(
                    200, {}, b'<>', True)

        # https://github.com/conan-io/conan/issues/3432
        client = TestClient(servers={"default": TestServer()},
                            requester_class=RequesterMock,
                            users={"default": [("lasote", "mypass")]})
        client.run("install pkg/ref@user/testing", assert_error=True)
        self.assertIn("ERROR: <>", client.out)
        self.assertIn("Response from remote is not json, but 'None'", client.out)

    def test_error_broken_json(self):
        class RequesterMock(object):
            def __init__(self, *args, **kwargs):
                pass

            def get(self, *args, **kwargs):  # @UnusedVariable
                # Response must be binary, it is decoded in RestClientCommon
                headers = {"Content-Type": "application/json"}
                return namedtuple("Response", "status_code headers content ok")(
                    200, headers, b'<>', True)

        # https://github.com/conan-io/conan/issues/3432
        client = TestClient(servers={"default": TestServer()},
                            requester_class=RequesterMock,
                            users={"default": [("lasote", "mypass")]})
        client.run("install pkg/ref@user/testing", assert_error=True)
        self.assertIn("Remote responded with broken json: <>", client.out)

    def test_error_json(self):
        class RequesterMock(object):
            def __init__(self, *args, **kwargs):
                pass

            def get(self, *args, **kwargs):  # @UnusedVariable
                # Response must be binary, it is decoded in RestClientCommon
                headers = {"Content-Type": "application/json"}
                return namedtuple("Response", "status_code headers content ok")(
                    200, headers, b'[1, 2, 3]', True)

        # https://github.com/conan-io/conan/issues/3432
        client = TestClient(servers={"default": TestServer()},
                            requester_class=RequesterMock,
                            users={"default": [("lasote", "mypass")]})
        client.run("install pkg/ref@user/testing", assert_error=True)
        self.assertIn("ERROR: Unexpected server response [1, 2, 3]", client.out)
6af43f4930883859b3639234f643a1561f28bbb8 | 813 | py | Python | complete/1.py | HaraDev001/ScheduleWork | MIT
import pandas as pd
import os
df = pd.DataFrame({'name': ['Raphael', 'Donatello'] * 10,
                   'mask': ['red', 'purple'] * 10,
                   'weapon': ['sai', 'bo staff'] * 10})

df.to_csv(r"file1.txt", index=False, sep='\t', mode='a', header=False, encoding='utf-8-sig')
os.system('python 2.py')
ed0602c1c5470a11808c29d9b17b281c00ab0af6 | 2984 | py | Python | education/services.py | Amirsorouri00/neolej | MIT
'''
.d8888b. 888 888 888 8888888 888
d88P Y88b 888 888 888 888 888
888 888 888 888 888 888 888
888 888 .d88b. 88888b. 8888b. 888 888 88888b.d88b. 88888b. .d88b. 888d888 888888 .d8888b
888 88888 888 d88""88b 888 "88b "88b 888 888 888 "888 "88b 888 "88b d88""88b 888P" 888 88K
888 888 888 888 888 888 888 .d888888 888 888 888 888 888 888 888 888 888 888 888 "Y8888b.
Y88b d88P 888 Y88..88P 888 d88P 888 888 888 888 888 888 888 888 d88P Y88..88P 888 Y88b. X88
"Y8888P88 888 "Y88P" 88888P" "Y888888 888 8888888 888 888 888 88888P" "Y88P" 888 "Y888 88888P'
888
888
888
'''
from django.db import models
from django.conf import settings
from django.contrib.auth import get_user_model
'''
.d8888b. 888 8888888b. d8b
d88P Y88b 888 888 Y88b Y8P
888 888 888 888 888
888 .d88b. 888888 888 d88P 888d888 888 .d8888b .d88b.
888 88888 d8P Y8b 888 8888888P" 888P" 888 d88P" d8P Y8b
888 888 88888888 888 888 888 888 888 88888888
Y88b d88P Y8b. Y88b. 888 888 888 Y88b. Y8b.
"Y8888P88 "Y8888 "Y888 888 888 888 "Y8888P "Y8888
'''
from education.models import WorkshopDiscount
def get_price(user, workshop):
    cost = workshop.price.get_price('rial')
    discounts = WorkshopDiscount.objects.filter(workshops__id=workshop.id)
    user_discount = None
    discount_type = None
    for discount in discounts:
        if hasattr(discount, 'limit'):
            if discount.used_count < discount.limit:
                user_discount = discount
                discount_type = 'race'
            else:
                user_discount = None
            print('race discount')
            break
        elif hasattr(discount, 'person'):
            print('personal discount')
            break
        else:
            print('date discount')
            break
    response = {'cost': cost, 'discount': user_discount, 'discount_type': discount_type}
    return response
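The loop above only ever returns a usage-capped ("race") discount with remaining capacity; hitting any other discount kind stops the scan with no discount applied. A hedged sketch of that selection rule as a pure function over plain dicts (illustrative only — the real code works on `WorkshopDiscount` model instances):

```python
def pick_discount(discounts):
    """Mirror the selection loop in get_price: a capped discount with
    remaining uses wins as a 'race' discount; anything else (exhausted
    cap, personal or date discount) ends the scan with no discount."""
    for discount in discounts:
        if 'limit' in discount:
            if discount['used_count'] < discount['limit']:
                return discount, 'race'
            return None, None
        return None, None
    return None, None


print(pick_discount([{'limit': 10, 'used_count': 3}]))
# ({'limit': 10, 'used_count': 3}, 'race')
```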
ed5154e0f719721daf5c3c6c01866d4d2756e222 | 79 | py | Python | game/__init__.py | isaaclepes/pypboy | MIT
from .core import Entity
from .core import EntityGroup
from .core import Engine
ed5653b60a31aa7aba0b49406edba8b25a18ca0c | 18888 | py | Python | modin/pandas/test/test_groupby.py | frreiss/modin | Apache-2.0
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import pytest
import sys

import pandas
import numpy as np
import modin.pandas as pd
from modin.pandas.utils import (from_pandas, to_pandas)

PY2 = False
if sys.version_info.major < 3:
    PY2 = True


@pytest.fixture
def ray_df_equals_pandas(ray_df, pandas_df):
    assert isinstance(ray_df, pd.DataFrame)
    assert to_pandas(ray_df).equals(pandas_df)


@pytest.fixture
def ray_df_almost_equals_pandas(ray_df, pandas_df):
    assert isinstance(ray_df, pd.DataFrame)
    difference = to_pandas(ray_df) - pandas_df
    diff_max = difference.max().max()
    assert to_pandas(ray_df).equals(pandas_df) or diff_max < 0.0001


@pytest.fixture
def ray_series_equals_pandas(ray_df, pandas_df):
    assert ray_df.equals(pandas_df)


@pytest.fixture
def ray_df_equals(ray_df1, ray_df2):
    assert to_pandas(ray_df1).equals(to_pandas(ray_df2))


@pytest.fixture
def ray_groupby_equals_pandas(ray_groupby, pandas_groupby):
    for g1, g2 in zip(ray_groupby, pandas_groupby):
        assert g1[0] == g2[0]
        ray_df_equals_pandas(g1[1], g2[1])
def test_simple_row_groupby():
    pandas_df = pandas.DataFrame({
        'col1': [0, 1, 2, 3],
        'col2': [4, 5, 6, 7],
        'col3': [3, 8, 12, 10],
        'col4': [17, 13, 16, 15],
        'col5': [-4, -5, -6, -7]
    })

    ray_df = from_pandas(pandas_df, 2)

    by = [1, 2, 1, 2]
    n = 1

    ray_groupby = ray_df.groupby(by=by)
    pandas_groupby = pandas_df.groupby(by=by)

    ray_groupby_equals_pandas(ray_groupby, pandas_groupby)
    test_ngroups(ray_groupby, pandas_groupby)
    test_skew(ray_groupby, pandas_groupby)
    test_ffill(ray_groupby, pandas_groupby)
    test_sem(ray_groupby, pandas_groupby)
    test_mean(ray_groupby, pandas_groupby)
    test_any(ray_groupby, pandas_groupby)
    test_min(ray_groupby, pandas_groupby)
    test_idxmax(ray_groupby, pandas_groupby)
    test_ndim(ray_groupby, pandas_groupby)
    test_cumsum(ray_groupby, pandas_groupby)
    test_pct_change(ray_groupby, pandas_groupby)
    test_cummax(ray_groupby, pandas_groupby)

    apply_functions = [lambda df: df.sum(), lambda df: -df]
    for func in apply_functions:
        test_apply(ray_groupby, pandas_groupby, func)

    test_dtypes(ray_groupby, pandas_groupby)
    test_first(ray_groupby, pandas_groupby)
    test_backfill(ray_groupby, pandas_groupby)
    test_cummin(ray_groupby, pandas_groupby)
    test_bfill(ray_groupby, pandas_groupby)
    test_idxmin(ray_groupby, pandas_groupby)
    test_prod(ray_groupby, pandas_groupby)
    test_std(ray_groupby, pandas_groupby)

    agg_functions = ['min', 'max']
    for func in agg_functions:
        test_agg(ray_groupby, pandas_groupby, func)
        test_aggregate(ray_groupby, pandas_groupby, func)

    test_last(ray_groupby, pandas_groupby)
    test_mad(ray_groupby, pandas_groupby)
    test_rank(ray_groupby, pandas_groupby)
    test_max(ray_groupby, pandas_groupby)
    test_var(ray_groupby, pandas_groupby)
    test_len(ray_groupby, pandas_groupby)
    test_sum(ray_groupby, pandas_groupby)
    test_ngroup(ray_groupby, pandas_groupby)
    test_nunique(ray_groupby, pandas_groupby)
    test_median(ray_groupby, pandas_groupby)
    test_head(ray_groupby, pandas_groupby, n)
    test_cumprod(ray_groupby, pandas_groupby)
    test_cov(ray_groupby, pandas_groupby)

    transform_functions = [lambda df: df + 4, lambda df: -df - 10]
    for func in transform_functions:
        test_transform(ray_groupby, pandas_groupby, func)

    pipe_functions = [lambda dfgb: dfgb.sum()]
    for func in pipe_functions:
        test_pipe(ray_groupby, pandas_groupby, func)

    test_corr(ray_groupby, pandas_groupby)
    test_fillna(ray_groupby, pandas_groupby)
    test_count(ray_groupby, pandas_groupby)
    test_tail(ray_groupby, pandas_groupby, n)
    test_quantile(ray_groupby, pandas_groupby)
    test_take(ray_groupby, pandas_groupby)
def test_single_group_row_groupby():
    pandas_df = pandas.DataFrame({
        'col1': [0, 1, 2, 3],
        'col2': [4, 5, 36, 7],
        'col3': [3, 8, 12, 10],
        'col4': [17, 3, 16, 15],
        'col5': [-4, 5, -6, -7]
    })

    ray_df = from_pandas(pandas_df, 2)

    by = [1, 1, 1, 1]
    n = 6

    ray_groupby = ray_df.groupby(by=by)
    pandas_groupby = pandas_df.groupby(by=by)

    ray_groupby_equals_pandas(ray_groupby, pandas_groupby)
    test_ngroups(ray_groupby, pandas_groupby)
    test_skew(ray_groupby, pandas_groupby)
    test_ffill(ray_groupby, pandas_groupby)
    test_sem(ray_groupby, pandas_groupby)
    test_mean(ray_groupby, pandas_groupby)
    test_any(ray_groupby, pandas_groupby)
    test_min(ray_groupby, pandas_groupby)
    test_idxmax(ray_groupby, pandas_groupby)
    test_ndim(ray_groupby, pandas_groupby)
    test_cumsum(ray_groupby, pandas_groupby)
    test_pct_change(ray_groupby, pandas_groupby)
    test_cummax(ray_groupby, pandas_groupby)

    apply_functions = [lambda df: df.sum(), lambda df: -df]
    for func in apply_functions:
        test_apply(ray_groupby, pandas_groupby, func)

    test_dtypes(ray_groupby, pandas_groupby)
    test_first(ray_groupby, pandas_groupby)
    test_backfill(ray_groupby, pandas_groupby)
    test_cummin(ray_groupby, pandas_groupby)
    test_bfill(ray_groupby, pandas_groupby)
    test_idxmin(ray_groupby, pandas_groupby)
    test_prod(ray_groupby, pandas_groupby)
    test_std(ray_groupby, pandas_groupby)

    agg_functions = ['min', 'max']
    for func in agg_functions:
        test_agg(ray_groupby, pandas_groupby, func)
        test_aggregate(ray_groupby, pandas_groupby, func)

    test_last(ray_groupby, pandas_groupby)
    test_mad(ray_groupby, pandas_groupby)
    test_rank(ray_groupby, pandas_groupby)
    test_max(ray_groupby, pandas_groupby)
    test_var(ray_groupby, pandas_groupby)
    test_len(ray_groupby, pandas_groupby)
    test_sum(ray_groupby, pandas_groupby)
    test_ngroup(ray_groupby, pandas_groupby)
    test_nunique(ray_groupby, pandas_groupby)
    test_median(ray_groupby, pandas_groupby)
    test_head(ray_groupby, pandas_groupby, n)
    test_cumprod(ray_groupby, pandas_groupby)
    test_cov(ray_groupby, pandas_groupby)

    transform_functions = [lambda df: df + 4, lambda df: -df - 10]
    for func in transform_functions:
        test_transform(ray_groupby, pandas_groupby, func)

    pipe_functions = [lambda dfgb: dfgb.sum()]
    for func in pipe_functions:
        test_pipe(ray_groupby, pandas_groupby, func)

    test_corr(ray_groupby, pandas_groupby)
    test_fillna(ray_groupby, pandas_groupby)
    test_count(ray_groupby, pandas_groupby)
    test_tail(ray_groupby, pandas_groupby, n)
    test_quantile(ray_groupby, pandas_groupby)
    test_take(ray_groupby, pandas_groupby)


@pytest.mark.skip(reason="See Modin issue #21.")
def test_large_row_groupby():
    pandas_df = pandas.DataFrame(
        np.random.randint(0, 8, size=(100, 4)), columns=list('ABCD'))
    ray_df = from_pandas(pandas_df, 2)
    by = [str(i) for i in pandas_df['A'].tolist()]
    n = 4
    ray_groupby = ray_df.groupby(by=by)
    pandas_groupby = pandas_df.groupby(by=by)
    ray_groupby_equals_pandas(ray_groupby, pandas_groupby)
    test_ngroups(ray_groupby, pandas_groupby)
    test_skew(ray_groupby, pandas_groupby)
    test_ffill(ray_groupby, pandas_groupby)
    test_sem(ray_groupby, pandas_groupby)
    test_mean(ray_groupby, pandas_groupby)
    test_any(ray_groupby, pandas_groupby)
    test_min(ray_groupby, pandas_groupby)
    test_idxmax(ray_groupby, pandas_groupby)
    test_ndim(ray_groupby, pandas_groupby)
    test_cumsum(ray_groupby, pandas_groupby)
    test_pct_change(ray_groupby, pandas_groupby)
    test_cummax(ray_groupby, pandas_groupby)
    apply_functions = [lambda df: df.sum(), lambda df: -df]
    for func in apply_functions:
        test_apply(ray_groupby, pandas_groupby, func)
    test_dtypes(ray_groupby, pandas_groupby)
    test_first(ray_groupby, pandas_groupby)
    test_backfill(ray_groupby, pandas_groupby)
    test_cummin(ray_groupby, pandas_groupby)
    test_bfill(ray_groupby, pandas_groupby)
    test_idxmin(ray_groupby, pandas_groupby)
    # test_prod(ray_groupby, pandas_groupby) causes overflows
    test_std(ray_groupby, pandas_groupby)
    agg_functions = ['min', 'max']
    for func in agg_functions:
        test_agg(ray_groupby, pandas_groupby, func)
        test_aggregate(ray_groupby, pandas_groupby, func)
    test_last(ray_groupby, pandas_groupby)
    test_mad(ray_groupby, pandas_groupby)
    test_rank(ray_groupby, pandas_groupby)
    test_max(ray_groupby, pandas_groupby)
    test_var(ray_groupby, pandas_groupby)
    test_len(ray_groupby, pandas_groupby)
    test_sum(ray_groupby, pandas_groupby)
    test_ngroup(ray_groupby, pandas_groupby)
    test_nunique(ray_groupby, pandas_groupby)
    test_median(ray_groupby, pandas_groupby)
    test_head(ray_groupby, pandas_groupby, n)
    # test_cumprod(ray_groupby, pandas_groupby) causes overflows
    test_cov(ray_groupby, pandas_groupby)
    transform_functions = [lambda df: df + 4, lambda df: -df - 10]
    for func in transform_functions:
        test_transform(ray_groupby, pandas_groupby, func)
    pipe_functions = [lambda dfgb: dfgb.sum()]
    for func in pipe_functions:
        test_pipe(ray_groupby, pandas_groupby, func)
    test_corr(ray_groupby, pandas_groupby)
    test_fillna(ray_groupby, pandas_groupby)
    test_count(ray_groupby, pandas_groupby)
    test_tail(ray_groupby, pandas_groupby, n)
    test_quantile(ray_groupby, pandas_groupby)
    test_take(ray_groupby, pandas_groupby)


def test_simple_col_groupby():
    pandas_df = pandas.DataFrame({
        'col1': [0, 3, 2, 3],
        'col2': [4, 1, 6, 7],
        'col3': [3, 8, 2, 10],
        'col4': [1, 13, 6, 15],
        'col5': [-4, 5, 6, -7]
    })
    ray_df = from_pandas(pandas_df, 2)
    by = [1, 2, 3, 2, 1]
    ray_groupby = ray_df.groupby(axis=1, by=by)
    pandas_groupby = pandas_df.groupby(axis=1, by=by)
    ray_groupby_equals_pandas(ray_groupby, pandas_groupby)
    test_ngroups(ray_groupby, pandas_groupby)
    test_skew(ray_groupby, pandas_groupby)
    test_ffill(ray_groupby, pandas_groupby)
    test_sem(ray_groupby, pandas_groupby)
    test_mean(ray_groupby, pandas_groupby)
    test_any(ray_groupby, pandas_groupby)
    test_min(ray_groupby, pandas_groupby)
    test_ndim(ray_groupby, pandas_groupby)
    if not PY2:
        # idxmax and idxmin fail on column groupby in pandas with python2
        test_idxmax(ray_groupby, pandas_groupby)
        test_idxmin(ray_groupby, pandas_groupby)
    test_rank(ray_groupby, pandas_groupby)
    test_quantile(ray_groupby, pandas_groupby)
    # https://github.com/pandas-dev/pandas/issues/21127
    # test_cumsum(ray_groupby, pandas_groupby)
    # test_cummax(ray_groupby, pandas_groupby)
    # test_cummin(ray_groupby, pandas_groupby)
    # test_cumprod(ray_groupby, pandas_groupby)
    test_pct_change(ray_groupby, pandas_groupby)
    apply_functions = [lambda df: -df, lambda df: df.sum(axis=1)]
    for func in apply_functions:
        test_apply(ray_groupby, pandas_groupby, func)
    test_first(ray_groupby, pandas_groupby)
    test_backfill(ray_groupby, pandas_groupby)
    test_bfill(ray_groupby, pandas_groupby)
    test_prod(ray_groupby, pandas_groupby)
    test_std(ray_groupby, pandas_groupby)
    test_last(ray_groupby, pandas_groupby)
    test_mad(ray_groupby, pandas_groupby)
    test_max(ray_groupby, pandas_groupby)
    test_var(ray_groupby, pandas_groupby)
    test_len(ray_groupby, pandas_groupby)
    test_sum(ray_groupby, pandas_groupby)
    # Pandas fails on this case with ValueError
    # test_ngroup(ray_groupby, pandas_groupby)
    # test_nunique(ray_groupby, pandas_groupby)
    test_median(ray_groupby, pandas_groupby)
    test_cov(ray_groupby, pandas_groupby)
    transform_functions = [lambda df: df + 4, lambda df: -df - 10]
    for func in transform_functions:
        test_transform(ray_groupby, pandas_groupby, func)
    pipe_functions = [lambda dfgb: dfgb.sum()]
    for func in pipe_functions:
        test_pipe(ray_groupby, pandas_groupby, func)
    test_corr(ray_groupby, pandas_groupby)
    test_fillna(ray_groupby, pandas_groupby)
    test_count(ray_groupby, pandas_groupby)
    test_take(ray_groupby, pandas_groupby)


@pytest.fixture
def test_ngroups(ray_groupby, pandas_groupby):
    assert ray_groupby.ngroups == pandas_groupby.ngroups


@pytest.fixture
def test_skew(ray_groupby, pandas_groupby):
    ray_df_almost_equals_pandas(ray_groupby.skew(), pandas_groupby.skew())


@pytest.fixture
def test_ffill(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.ffill(), pandas_groupby.ffill())


@pytest.fixture
def test_sem(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.sem()


@pytest.fixture
def test_mean(ray_groupby, pandas_groupby):
    ray_df_almost_equals_pandas(ray_groupby.mean(), pandas_groupby.mean())


@pytest.fixture
def test_any(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.any(), pandas_groupby.any())


@pytest.fixture
def test_min(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.min(), pandas_groupby.min())


@pytest.fixture
def test_idxmax(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_df_equals_pandas(ray_groupby.idxmax(), pandas_groupby.idxmax())


@pytest.fixture
def test_ndim(ray_groupby, pandas_groupby):
    assert ray_groupby.ndim == pandas_groupby.ndim


@pytest.fixture
def test_cumsum(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.cumsum(), pandas_groupby.cumsum())
    ray_df_equals_pandas(
        ray_groupby.cumsum(axis=1), pandas_groupby.cumsum(axis=1))


@pytest.fixture
def test_pct_change(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.pct_change()


@pytest.fixture
def test_cummax(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.cummax(), pandas_groupby.cummax())
    ray_df_equals_pandas(
        ray_groupby.cummax(axis=1), pandas_groupby.cummax(axis=1))


@pytest.fixture
def test_apply(ray_groupby, pandas_groupby, func):
    ray_df_equals_pandas(ray_groupby.apply(func), pandas_groupby.apply(func))


@pytest.fixture
def test_dtypes(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.dtypes, pandas_groupby.dtypes)


@pytest.fixture
def test_first(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.first()


@pytest.fixture
def test_backfill(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.backfill(), pandas_groupby.backfill())


@pytest.fixture
def test_cummin(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.cummin(), pandas_groupby.cummin())
    ray_df_equals_pandas(
        ray_groupby.cummin(axis=1), pandas_groupby.cummin(axis=1))


@pytest.fixture
def test_bfill(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.bfill(), pandas_groupby.bfill())


@pytest.fixture
def test_idxmin(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_df_equals_pandas(ray_groupby.idxmin(), pandas_groupby.idxmin())


@pytest.fixture
def test_prod(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.prod(), pandas_groupby.prod())


@pytest.fixture
def test_std(ray_groupby, pandas_groupby):
    ray_df_almost_equals_pandas(ray_groupby.std(), pandas_groupby.std())


@pytest.fixture
def test_aggregate(ray_groupby, pandas_groupby, func):
    ray_df_equals_pandas(
        ray_groupby.aggregate(func), pandas_groupby.aggregate(func))


@pytest.fixture
def test_agg(ray_groupby, pandas_groupby, func):
    ray_df_equals_pandas(ray_groupby.agg(func), pandas_groupby.agg(func))


@pytest.fixture
def test_last(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.last()


@pytest.fixture
def test_mad(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.mad()


@pytest.fixture
def test_rank(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.rank(), pandas_groupby.rank())


@pytest.fixture
def test_max(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.max(), pandas_groupby.max())


@pytest.fixture
def test_var(ray_groupby, pandas_groupby):
    ray_df_almost_equals_pandas(ray_groupby.var(), pandas_groupby.var())


@pytest.fixture
def test_len(ray_groupby, pandas_groupby):
    assert len(ray_groupby) == len(pandas_groupby)


@pytest.fixture
def test_sum(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.sum(), pandas_groupby.sum())


@pytest.fixture
def test_ngroup(ray_groupby, pandas_groupby):
    ray_series_equals_pandas(ray_groupby.ngroup(), pandas_groupby.ngroup())


@pytest.fixture
def test_nunique(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.nunique(), pandas_groupby.nunique())


@pytest.fixture
def test_median(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.median(), pandas_groupby.median())


@pytest.fixture
def test_head(ray_groupby, pandas_groupby, n):
    with pytest.raises(NotImplementedError):
        ray_df_equals_pandas(ray_groupby.head(n=n), pandas_groupby.head(n=n))


@pytest.fixture
def test_cumprod(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.cumprod(), pandas_groupby.cumprod())
    ray_df_equals_pandas(
        ray_groupby.cumprod(axis=1), pandas_groupby.cumprod(axis=1))


@pytest.fixture
def test_cov(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.cov()


@pytest.fixture
def test_transform(ray_groupby, pandas_groupby, func):
    ray_df_equals_pandas(
        ray_groupby.transform(func), pandas_groupby.transform(func))


@pytest.fixture
def test_corr(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.corr()


@pytest.fixture
def test_fillna(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(
        ray_groupby.fillna(method="ffill"),
        pandas_groupby.fillna(method="ffill"))


@pytest.fixture
def test_count(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(ray_groupby.count(), pandas_groupby.count())


@pytest.fixture
def test_pipe(ray_groupby, pandas_groupby, func):
    ray_df_equals_pandas(ray_groupby.pipe(func), pandas_groupby.pipe(func))


@pytest.fixture
def test_tail(ray_groupby, pandas_groupby, n):
    with pytest.raises(NotImplementedError):
        ray_df_equals_pandas(ray_groupby.tail(n=n), pandas_groupby.tail(n=n))


@pytest.fixture
def test_quantile(ray_groupby, pandas_groupby):
    ray_df_equals_pandas(
        ray_groupby.quantile(q=0.4), pandas_groupby.quantile(q=0.4))


@pytest.fixture
def test_take(ray_groupby, pandas_groupby):
    with pytest.raises(NotImplementedError):
        ray_groupby.take(indices=[1])
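The `ray_df_equals_pandas`-style comparison helpers called throughout are defined elsewhere in this test module. As a rough, hypothetical sketch of what such helpers might look like (the bodies below are assumptions built on pandas' own testing utilities, not Modin's actual code):

```python
import pandas
from pandas.testing import assert_frame_equal


def df_equals(df1, df2):
    # Hypothetical stand-in for ray_df_equals_pandas: exact equality on
    # values, index and columns.
    assert_frame_equal(df1, df2)
    return True


def df_almost_equals(df1, df2):
    # Hypothetical stand-in for ray_df_almost_equals_pandas: tolerant
    # comparison for floating-point aggregations such as std/var/mean.
    assert_frame_equal(df1, df2, check_exact=False)
    return True
```

The real helpers additionally convert the Modin frame to pandas before comparing; the split between exact and "almost" equality mirrors the way the fixtures above treat integer versus floating-point results.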
# === chainercv/functions/__init__.py (tn1031/chainercv, MIT) ===
from chainercv.functions.ps_roi_average_pooling_2d import ps_roi_average_pooling_2d  # NOQA

# === questionnaire/migrations/0001_initial.py (affan2/ed-questionnaire, BSD-3-Clause) ===
# -*- coding: utf-8 -*-
from south.utils import datetime_utils as datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models


class Migration(SchemaMigration):

    def forwards(self, orm):
        # Adding model 'Subject'
        db.create_table(u'questionnaire_subject', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('state', self.gf('django.db.models.fields.CharField')(default='inactive', max_length=16)),
            ('surname', self.gf('django.db.models.fields.CharField')(max_length=64, null=True, blank=True)),
            ('givenname', self.gf('django.db.models.fields.CharField')(max_length=64, null=True, blank=True)),
            ('email', self.gf('django.db.models.fields.EmailField')(max_length=75, null=True, blank=True)),
            ('gender', self.gf('django.db.models.fields.CharField')(default='unset', max_length=8, blank=True)),
            ('nextrun', self.gf('django.db.models.fields.DateField')(null=True, blank=True)),
            ('formtype', self.gf('django.db.models.fields.CharField')(default='email', max_length=16)),
            ('language', self.gf('django.db.models.fields.CharField')(default='en', max_length=2)),
        ))
        db.send_create_signal(u'questionnaire', ['Subject'])

        # Adding model 'Questionnaire'
        db.create_table(u'questionnaire_questionnaire', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('name', self.gf('django.db.models.fields.CharField')(max_length=128)),
            ('redirect_url', self.gf('django.db.models.fields.CharField')(default='/static/complete.html', max_length=128)),
        ))
        db.send_create_signal(u'questionnaire', ['Questionnaire'])

        # Adding model 'QuestionSet'
        db.create_table(u'questionnaire_questionset', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('questionnaire', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Questionnaire'])),
            ('sortid', self.gf('django.db.models.fields.IntegerField')()),
            ('heading', self.gf('django.db.models.fields.CharField')(max_length=64)),
            ('checks', self.gf('django.db.models.fields.CharField')(max_length=256, blank=True)),
            ('text_en', self.gf('django.db.models.fields.TextField')()),
        ))
        db.send_create_signal(u'questionnaire', ['QuestionSet'])

        # Adding model 'RunInfo'
        db.create_table(u'questionnaire_runinfo', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('subject', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Subject'])),
            ('random', self.gf('django.db.models.fields.CharField')(max_length=32)),
            ('runid', self.gf('django.db.models.fields.CharField')(max_length=32)),
            ('questionset', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.QuestionSet'], null=True, blank=True)),
            ('emailcount', self.gf('django.db.models.fields.IntegerField')(default=0)),
            ('created', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, blank=True)),
            ('emailsent', self.gf('django.db.models.fields.DateTimeField')(null=True, blank=True)),
            ('lastemailerror', self.gf('django.db.models.fields.CharField')(max_length=64, null=True, blank=True)),
            ('state', self.gf('django.db.models.fields.CharField')(max_length=16, null=True, blank=True)),
            ('cookies', self.gf('django.db.models.fields.TextField')(null=True, blank=True)),
            ('tags', self.gf('django.db.models.fields.TextField')(blank=True)),
            ('skipped', self.gf('django.db.models.fields.TextField')(blank=True)),
        ))
        db.send_create_signal(u'questionnaire', ['RunInfo'])

        # Adding model 'RunInfoHistory'
        db.create_table(u'questionnaire_runinfohistory', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('subject', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Subject'])),
            ('runid', self.gf('django.db.models.fields.CharField')(max_length=32)),
            ('completed', self.gf('django.db.models.fields.DateField')()),
            ('tags', self.gf('django.db.models.fields.TextField')(blank=True)),
            ('skipped', self.gf('django.db.models.fields.TextField')(blank=True)),
            ('questionnaire', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Questionnaire'])),
        ))
        db.send_create_signal(u'questionnaire', ['RunInfoHistory'])

        # Adding model 'Question'
        db.create_table(u'questionnaire_question', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('questionset', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.QuestionSet'])),
            ('number', self.gf('django.db.models.fields.CharField')(max_length=8)),
            ('sort_id', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('text_en', self.gf('django.db.models.fields.TextField')(blank=True)),
            ('type', self.gf('django.db.models.fields.CharField')(max_length=32)),
            ('extra_en', self.gf('django.db.models.fields.CharField')(max_length=512, null=True, blank=True)),
            ('checks', self.gf('django.db.models.fields.CharField')(max_length=512, null=True, blank=True)),
            ('footer_en', self.gf('django.db.models.fields.TextField')(blank=True)),
        ))
        db.send_create_signal(u'questionnaire', ['Question'])

        # Adding model 'Choice'
        db.create_table(u'questionnaire_choice', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('question', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Question'])),
            ('sortid', self.gf('django.db.models.fields.IntegerField')()),
            ('value', self.gf('django.db.models.fields.CharField')(max_length=64)),
            ('text_en', self.gf('django.db.models.fields.CharField')(max_length=200)),
            ('tags', self.gf('django.db.models.fields.CharField')(max_length=64, blank=True)),
        ))
        db.send_create_signal(u'questionnaire', ['Choice'])

        # Adding model 'Answer'
        db.create_table(u'questionnaire_answer', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('subject', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Subject'])),
            ('question', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['questionnaire.Question'])),
            ('runid', self.gf('django.db.models.fields.CharField')(max_length=32)),
            ('answer', self.gf('django.db.models.fields.TextField')()),
        ))
        db.send_create_signal(u'questionnaire', ['Answer'])

    def backwards(self, orm):
        # Deleting model 'Subject'
        db.delete_table(u'questionnaire_subject')

        # Deleting model 'Questionnaire'
        db.delete_table(u'questionnaire_questionnaire')

        # Deleting model 'QuestionSet'
        db.delete_table(u'questionnaire_questionset')

        # Deleting model 'RunInfo'
        db.delete_table(u'questionnaire_runinfo')

        # Deleting model 'RunInfoHistory'
        db.delete_table(u'questionnaire_runinfohistory')

        # Deleting model 'Question'
        db.delete_table(u'questionnaire_question')

        # Deleting model 'Choice'
        db.delete_table(u'questionnaire_choice')

        # Deleting model 'Answer'
        db.delete_table(u'questionnaire_answer')

    models = {
        u'questionnaire.answer': {
            'Meta': {'object_name': 'Answer'},
            'answer': ('django.db.models.fields.TextField', [], {}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'question': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Question']"}),
            'runid': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
            'subject': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Subject']"})
        },
        u'questionnaire.choice': {
            'Meta': {'object_name': 'Choice'},
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'question': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Question']"}),
            'sortid': ('django.db.models.fields.IntegerField', [], {}),
            'tags': ('django.db.models.fields.CharField', [], {'max_length': '64', 'blank': 'True'}),
            'text_en': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
            'value': ('django.db.models.fields.CharField', [], {'max_length': '64'})
        },
        u'questionnaire.question': {
            'Meta': {'object_name': 'Question'},
            'checks': ('django.db.models.fields.CharField', [], {'max_length': '512', 'null': 'True', 'blank': 'True'}),
            'extra_en': ('django.db.models.fields.CharField', [], {'max_length': '512', 'null': 'True', 'blank': 'True'}),
            'footer_en': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'number': ('django.db.models.fields.CharField', [], {'max_length': '8'}),
            'questionset': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.QuestionSet']"}),
            'sort_id': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'text_en': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
            'type': ('django.db.models.fields.CharField', [], {'max_length': '32'})
        },
        u'questionnaire.questionnaire': {
            'Meta': {'object_name': 'Questionnaire'},
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'name': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
            'redirect_url': ('django.db.models.fields.CharField', [], {'default': "'/static/complete.html'", 'max_length': '128'})
        },
        u'questionnaire.questionset': {
            'Meta': {'object_name': 'QuestionSet'},
            'checks': ('django.db.models.fields.CharField', [], {'max_length': '256', 'blank': 'True'}),
            'heading': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'questionnaire': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Questionnaire']"}),
            'sortid': ('django.db.models.fields.IntegerField', [], {}),
            'text_en': ('django.db.models.fields.TextField', [], {})
        },
        u'questionnaire.runinfo': {
            'Meta': {'object_name': 'RunInfo'},
            'cookies': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
            'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
            'emailcount': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
            'emailsent': ('django.db.models.fields.DateTimeField', [], {'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'lastemailerror': ('django.db.models.fields.CharField', [], {'max_length': '64', 'null': 'True', 'blank': 'True'}),
            'questionset': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.QuestionSet']", 'null': 'True', 'blank': 'True'}),
            'random': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
            'runid': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
            'skipped': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
            'state': ('django.db.models.fields.CharField', [], {'max_length': '16', 'null': 'True', 'blank': 'True'}),
            'subject': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Subject']"}),
            'tags': ('django.db.models.fields.TextField', [], {'blank': 'True'})
        },
        u'questionnaire.runinfohistory': {
            'Meta': {'object_name': 'RunInfoHistory'},
            'completed': ('django.db.models.fields.DateField', [], {}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'questionnaire': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Questionnaire']"}),
            'runid': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
            'skipped': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
            'subject': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['questionnaire.Subject']"}),
            'tags': ('django.db.models.fields.TextField', [], {'blank': 'True'})
        },
        u'questionnaire.subject': {
            'Meta': {'object_name': 'Subject'},
            'email': ('django.db.models.fields.EmailField', [], {'max_length': '75', 'null': 'True', 'blank': 'True'}),
            'formtype': ('django.db.models.fields.CharField', [], {'default': "'email'", 'max_length': '16'}),
            'gender': ('django.db.models.fields.CharField', [], {'default': "'unset'", 'max_length': '8', 'blank': 'True'}),
            'givenname': ('django.db.models.fields.CharField', [], {'max_length': '64', 'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'language': ('django.db.models.fields.CharField', [], {'default': "'en'", 'max_length': '2'}),
            'nextrun': ('django.db.models.fields.DateField', [], {'null': 'True', 'blank': 'True'}),
            'state': ('django.db.models.fields.CharField', [], {'default': "'inactive'", 'max_length': '16'}),
            'surname': ('django.db.models.fields.CharField', [], {'max_length': '64', 'null': 'True', 'blank': 'True'})
        }
    }

    complete_apps = ['questionnaire']
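Each `db.create_table` call in this migration is a (table name, field tuple) pair that South turns into a `CREATE TABLE` statement. As a rough illustration only (a toy renderer with hand-written SQL types, not South's actual API, which derives the column type from the Django field class and its keyword arguments):

```python
def create_table_sql(table, columns):
    # Toy illustration of how a (table, columns) pair maps to DDL.
    # `columns` is a list of (name, sql_type) pairs.
    body = ', '.join('%s %s' % (name, sql_type) for name, sql_type in columns)
    return 'CREATE TABLE %s (%s);' % (table, body)


sql = create_table_sql('questionnaire_subject', [
    ('id', 'serial PRIMARY KEY'),
    ('state', 'varchar(16)'),
    ('email', 'varchar(75)'),
])
```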

# === mcpipy/vehicles/TIE_Fighter.py (sprintingkiwi/pycraft_mod, MIT) ===
baseAngle,highWater,baseVehicle=(-91.205322265625, -9223372036854775808L, {(-2, -5, 1): Block(35, 7), (-2, -2, 4): Block(35, 7), (1, -3, 1): Block(95, 15), (-2, -3, -2): Block(35, 7), (-1, -1, -1): Block(35, 7), (-1, -5, 4): Block(35, 7), (-2, -3, -4): Block(35, 7), (-1, -1, -4): Block(35, 7), (0, -2, -4): Block(35, 7), (1, -4, -1): Block(35, 7), (0, -3, 2): Block(35, 7), (-3, -4, 0): Block(35, 7), (-1, -1, 4): Block(35, 7), (1, -4, -4): Block(35, 7), (1, -2, 4): Block(35, 7), (0, -2, 4): Block(35, 7), (-3, -4, 4): Block(35, 7), (-3, -3, -4): Block(35, 7), (0, -1, -4): Block(35, 7), (-3, -3, -1): Block(35, 7), (-1, -3, -2): Block(35, 7), (1, -2, 0): Block(95, 15), (-2, -2, -2): Block(35, 7), (-2, -4, -4): Block(35, 7), (0, -4, -2): Block(35, 7), (-2, -1, -1): Block(35, 7), (-2, -5, 4): Block(35, 7), (0, -4, -4): Block(35, 7), (-1, -2, -2): Block(35, 7), (-3, -3, 1): Block(35, 7), (-1, -4, -4): Block(35, 7), (-1, -4, -2): Block(35, 7), (-1, -3, 4): Block(35, 7), (0, -1, 4): Block(35, 7), (-1, -6, -4): Block(35, 7), (-1, -3, -4): Block(35, 7), (0, -3, 4): Block(35, 7), (1, -4, 1): Block(35, 7), (-1, -4, 4): Block(35, 7), (-1, -2, -4): Block(35, 7), (-1, 0, 4): Block(35, 7), (-1, -3, 3): Block(35, 7), (0, -5, 0): Block(35, 7), (1, -3, -4): Block(35, 7), (0, -2, 2): Block(35, 7), (-1, -1, 1): Block(35, 7), (1, -3, -1): Block(95, 15), (-3, -2, 0): Block(35, 7), (-2, -3, 4): Block(35, 7), (-3, -2, -4): Block(35, 7), (1, -3, 4): Block(35, 7), (0, -5, 4): Block(35, 7), (0, -1, 0): Block(35, 7), (-1, -5, 1): Block(35, 7), (-3, -2, -1): Block(35, 7), (-2, -3, 2): Block(35, 7), (-3, -2, 4): Block(35, 7), (-1, -2, 4): Block(35, 7), (1, -3, 0): Block(95, 15), (0, -5, -1): Block(35, 7), (0, -1, -1): Block(35, 7), (-2, -4, 4): Block(35, 7), (0, -5, -4): Block(35, 7), (-1, -3, -3): Block(35, 7), (0, -2, -2): Block(35, 7), (-1, -6, 4): Block(35, 7), (-1, 0, -4): Block(35, 7), (-3, -4, 1): Block(35, 7), (-1, -4, 2): 
Block(35, 7), (-2, -1, 1): Block(35, 7), (1, -2, 1): Block(95, 15), (0, -4, 2): Block(35, 7), (1, -2, -4): Block(35, 7), (1, -2, -1): Block(95, 15), (-1, -5, -4): Block(35, 7), (-3, -3, 0): Block(35, 7), (-1, -1, 0): Block(35, 7), (1, -4, 4): Block(35, 7), (0, -4, 4): Block(35, 7), (-1, -2, 2): Block(35, 7), (-3, -3, 4): Block(35, 7), (-3, -4, -1): Block(35, 7), (0, -3, -4): Block(35, 7), (1, -4, 0): Block(35, 7), (-2, -1, 4): Block(35, 7), (-2, -1, -4): Block(35, 7), (-2, -4, 2): Block(35, 7), (-3, -4, -4): Block(35, 7), (0, -3, -2): Block(35, 7), (0, -1, 1): Block(35, 7), (-1, -3, 2): Block(35, 7), (-1, -5, 0): Block(35, 7), (-2, -4, -2): Block(35, 7), (-2, -2, 2): Block(35, 7), (-2, -2, -4): Block(35, 7), (0, -5, 1): Block(35, 7), (-2, -5, -1): Block(35, 7), (-2, -5, -4): Block(35, 7), (-2, -1, 0): Block(35, 7), (-3, -2, 1): Block(35, 7), (-2, -5, 0): Block(35, 7), (-1, -5, -1): Block(35, 7)})
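The vehicle above is stored as a dict mapping relative `(x, y, z)` offsets to `Block(id, data)` values (35/7 is gray wool, 95/15 is black stained glass). A small sketch of scanning such a dict; `Block` here is a stand-in namedtuple, not mcpipy's class, and `vehicle` is a tiny toy subset of the data:

```python
from collections import namedtuple

Block = namedtuple('Block', ['id', 'data'])  # stand-in for mcpipy's Block

vehicle = {(-2, -5, 1): Block(35, 7), (1, -3, 1): Block(95, 15),
           (0, -1, 4): Block(35, 7)}


def bounding_box(blocks):
    # Extent of the vehicle in relative coordinates: (min corner, max corner).
    xs, ys, zs = zip(*blocks)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))
```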

# === examples/set2seq/utils/python_layers.py (LisaAnne/set2set, BSD-2-Clause) ===
import sys
sys.path.insert(0, '../../python/')
import caffe
import random
import numpy as np
from threading import Thread
import pdb
from python_utils import *
class sortDataRead(object):
def __init__(self, data, batch_size, max_value, thread_result):
self.data = data
self.n = data.shape[0]
self.len_sequence = self.data.shape[1]
self.iteration = 0
self.thread_result = thread_result
self.batch_size = batch_size
self.max_value = max_value
def __call__(self):
rand_mat = np.zeros((self.batch_size, self.len_sequence))
if self.iteration + self.batch_size >= self.n:
            # wrap around: take the remaining rows from the tail of the data,
            # then fill the rest of the batch from the head
            remaining = self.n - self.iteration
            rand_mat[:remaining, :] = self.data[self.iteration:self.n, :]
            rand_mat[remaining:, :] = self.data[:self.batch_size - remaining, :]
            self.iteration = self.batch_size - remaining
else:
rand_mat = self.data[self.iteration:self.iteration+self.batch_size,:]
self.iteration += self.batch_size
label_mat = np.sort(rand_mat, axis=1)
train_label_mat = np.argsort(rand_mat, axis=1)
rand_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.max_value))
label_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.max_value))
a1_idx = [[i]*self.len_sequence for i in range(self.batch_size)]
a1_idx = [i for j in a1_idx for i in j]
a2_idx = range(self.len_sequence)*self.batch_size
rand_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(rand_mat)] = 1
label_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(label_mat)] = 1
label_one_hot_mat_shift = np.zeros((self.batch_size, self.len_sequence, self.max_value))
label_one_hot_mat_shift[:,1:,:] = label_mat_one_hot[:,:-1,:]
self.thread_result['rand_mat'] = rand_mat_one_hot
self.thread_result['label_mat'] = label_one_hot_mat_shift
self.thread_result['train_label_mat'] = train_label_mat #train_label_mat.reshape((self.batch_size, self.len_sequence, 1))
class sortDataGeneratorOne(object):
def __init__(self, len_sequence, max_value, batch_size, thread_result):
self.len_sequence = len_sequence
self.max_value = max_value
self.batch_size = batch_size
self.thread_result = thread_result
self.write_txt = open('train_generate_sents.txt', 'w')
self.write_txt.writelines('begin\n')
self.write_txt.close()
def __call__(self):
self.write_txt = open('train_generate_sents.txt', 'a')
rand_mat = np.random.rand(self.batch_size, self.len_sequence)
#rand_mat = np.array(rand_mat*1000, dtype=int)
label_mat = np.sort(rand_mat, axis=1)
train_label_mat = np.argsort(rand_mat, axis=1)
label_shift = np.zeros((self.batch_size, self.len_sequence))
label_shift[:,1:] = label_mat[:,:-1]
self.thread_result['rand_mat'] = rand_mat.reshape((self.batch_size, self.len_sequence, 1))
self.thread_result['label_mat'] = label_shift.reshape((self.batch_size, self.len_sequence, 1))
#self.thread_result['train_label_mat'] = train_label_mat_one_hot
self.thread_result['train_label_mat'] = train_label_mat #train_label_mat.reshape((self.batch_size, self.len_sequence, 1))
for i in range(self.batch_size):
self.write_txt.writelines('%s\n' %(' '.join([str(m) for m in np.ndarray.tolist(rand_mat[i,:])])))
self.write_txt.close()
class sortDataGenerator(object):
def __init__(self, len_sequence, max_value, batch_size, thread_result):
self.len_sequence = len_sequence
self.max_value = max_value
self.batch_size = batch_size
self.thread_result = thread_result
def __call__(self):
rand_mat = np.random.rand(self.batch_size, self.len_sequence)
rand_mat = np.array(rand_mat*self.max_value, dtype=int)
label_mat = np.sort(rand_mat, axis=1)
train_label_mat = np.argsort(rand_mat, axis=1)
rand_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.max_value))
label_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.max_value))
train_label_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.len_sequence))
a1_idx = [[i]*self.len_sequence for i in range(self.batch_size)]
a1_idx = [i for j in a1_idx for i in j]
a2_idx = range(self.len_sequence)*self.batch_size
rand_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(rand_mat)] = 1
label_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(label_mat)] = 1
train_label_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(train_label_mat)] = 1
label_one_hot_mat_shift = np.zeros((self.batch_size, self.len_sequence, self.max_value))
label_one_hot_mat_shift[:,1:,:] = label_mat_one_hot[:,:-1,:]
self.thread_result['rand_mat'] = rand_mat_one_hot
self.thread_result['label_mat'] = label_one_hot_mat_shift
#self.thread_result['train_label_mat'] = train_label_mat_one_hot
self.thread_result['train_label_mat'] = train_label_mat #train_label_mat.reshape((self.batch_size, self.len_sequence, 1))
# def __call__(self):
# rand_mat = np.random.rand(self.batch_size, self.len_sequence)
# rand_mat = np.array(rand_mat*self.max_value, dtype=int)
# label_mat = np.sort(rand_mat, axis=1)
# rand_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.max_value))
# label_mat_one_hot = np.zeros((self.batch_size, self.len_sequence, self.max_value))
# a1_idx = [[i]*self.len_sequence for i in range(self.batch_size)]
# a1_idx = [i for j in a1_idx for i in j]
# a2_idx = range(self.len_sequence)*self.batch_size
# rand_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(rand_mat)] = 1
# label_mat_one_hot[a1_idx, a2_idx, np.ndarray.flatten(label_mat)] = 1
#
# self.thread_result['rand_mat'] = rand_mat_one_hot
# self.thread_result['label_mat'] = label_mat_one_hot
class caffeDataLayer(caffe.Layer):
def dispatch_worker(self):
assert self.thread is None
self.thread = Thread(target=self.batchAdvancer)
self.thread.start()
def join_worker(self):
assert self.thread is not None
self.thread.join()
self.thread = None
def forward(self, bottom, top):
if self.thread is not None:
self.join_worker()
for top_index, name in zip(range(len(top)), self.top_names):
top[top_index].data[...] = self.thread_result[name]
self.dispatch_worker()
def reshape(self, bottom, top):
pass
def backward(self, bottom, top):
pass
class generateSortData(caffeDataLayer):
def setup(self, bottom, top):
self.params = eval(self.param_str)
assert 'len_sequence' in self.params.keys()
assert 'max_value' in self.params.keys()
assert 'batch_size' in self.params.keys()
self.len_sequence = self.params['len_sequence']
self.max_value = self.params['max_value']
self.batch_size = self.params['batch_size']
self.thread_result = {}
self.thread = None
self.top_names = ['rand_mat', 'label_mat', 'train_label_mat']
self.batchAdvancer = sortDataGenerator(self.params['len_sequence'], self.params['max_value'],
self.params['batch_size'], self.thread_result)
self.dispatch_worker()
self.join_worker()
print 'Outputs:', self.top_names
if len(top) != len(self.top_names):
raise Exception('Incorrect number of outputs (expected %d, got %d)' %
(len(self.top_names), len(top)))
for top_index, name in enumerate(self.top_names):
if name == 'train_label_mat':
#shape = (self.batch_size, self.len_sequence, 1)
shape = (self.batch_size, self.len_sequence)
else:
shape = (self.batch_size, self.len_sequence, self.max_value)
top[top_index].reshape(*shape)
class generateSortDataOne(caffeDataLayer):
def setup(self, bottom, top):
self.params = eval(self.param_str)
assert 'len_sequence' in self.params.keys()
assert 'max_value' in self.params.keys()
assert 'batch_size' in self.params.keys()
self.len_sequence = self.params['len_sequence']
self.max_value = self.params['max_value']
self.batch_size = self.params['batch_size']
self.thread_result = {}
self.thread = None
self.top_names = ['rand_mat', 'label_mat', 'train_label_mat']
self.batchAdvancer = sortDataGeneratorOne(self.params['len_sequence'],
self.params['max_value'],
self.params['batch_size'],
self.thread_result)
self.dispatch_worker()
self.join_worker()
print 'Outputs:', self.top_names
if len(top) != len(self.top_names):
raise Exception('Incorrect number of outputs (expected %d, got %d)' %
(len(self.top_names), len(top)))
for top_index, name in enumerate(self.top_names):
if name == 'train_label_mat':
#shape = (self.batch_size, self.len_sequence, 1)
shape = (self.batch_size, self.len_sequence)
else:
shape = (self.batch_size, self.len_sequence, 1)
top[top_index].reshape(*shape)
class readSortData(caffeDataLayer):
def setup(self, bottom, top):
self.params = eval(self.param_str)
assert 'len_sequence' in self.params.keys()
assert 'max_value' in self.params.keys()
assert 'batch_size' in self.params.keys()
self.len_sequence = self.params['len_sequence']
self.max_value = self.params['max_value']
self.batch_size = self.params['batch_size']
data_txt = 'utils/data/ls_%d_mv_%d_train.txt' %(self.len_sequence, self.max_value)
self.data = read_data(data_txt)
self.thread_result = {}
self.thread = None
self.top_names = ['rand_mat', 'label_mat', 'train_label_mat']
self.batchAdvancer = sortDataRead(self.data, self.params['batch_size'],
self.params['max_value'], self.thread_result)
self.dispatch_worker()
self.join_worker()
print 'Outputs:', self.top_names
if len(top) != len(self.top_names):
raise Exception('Incorrect number of outputs (expected %d, got %d)' %
(len(self.top_names), len(top)))
for top_index, name in enumerate(self.top_names):
if name == 'train_label_mat':
#shape = (self.batch_size, self.len_sequence, 1)
shape = (self.batch_size, self.len_sequence)
else:
shape = (self.batch_size, self.len_sequence, self.max_value)
top[top_index].reshape(*shape)
| 39.51145 | 125 | 0.689046 | 1,554 | 10,352 | 4.287001 | 0.078507 | 0.079706 | 0.099069 | 0.086761 | 0.853648 | 0.82918 | 0.784899 | 0.781297 | 0.764785 | 0.758781 | 0 | 0.007444 | 0.182477 | 10,352 | 261 | 126 | 39.662835 | 0.779747 | 0.123551 | 0 | 0.664894 | 0 | 0 | 0.086476 | 0.008847 | 0 | 0 | 0 | 0 | 0.058511 | 0 | null | null | 0.010638 | 0.037234 | null | null | 0.015957 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
133c70aa95c5c7cc9867cb0dbfb9b7bc304bbc9e | 6,092 | py | Python | bindings/py_wrappers/pytest/address_test.py | SparrowTek/libdogecoin | ca63b528aff21711063fbbaa75d1607cf44e5cd4 | [
"MIT"
] | null | null | null | bindings/py_wrappers/pytest/address_test.py | SparrowTek/libdogecoin | ca63b528aff21711063fbbaa75d1607cf44e5cd4 | [
"MIT"
] | null | null | null | bindings/py_wrappers/pytest/address_test.py | SparrowTek/libdogecoin | ca63b528aff21711063fbbaa75d1607cf44e5cd4 | [
"MIT"
] | null | null | null | """Testing module for wrappers from address.c"""
import unittest
import ctypes as ct
import sys
sys.path.append("./bindings/py_wrappers/libdogecoin/")
import wrappers as w
lib = w.load_libdogecoin()
class TestGeneratePrivPubKeyPair(unittest.TestCase):
"""Test class for function generate_priv_pub_key_pair()"""
def test_privkey_gen_mainnet(self):
"""Test that function returns private key for mainnet"""
res = w.generate_priv_pub_key_pair()
self.assertIsNotNone(res[0])
def test_privkey_gen_testnet(self):
"""Test function returns private key for testnet"""
res = w.generate_priv_pub_key_pair(chain_code=1)
self.assertIsNotNone(res[0])
def test_privkey_is_valid_mainnet(self):
"""Test function returns valid private key"""
res = w.generate_priv_pub_key_pair(as_bytes=True)
privkey = (ct.c_ubyte * 32)()
ct.memmove(privkey, res[0], 32)
dogecoin_key = w.DogecoinKey(privkey)
lib.dogecoin_ecc_start()
self.assertTrue(lib.dogecoin_privkey_is_valid(ct.byref(dogecoin_key)))
lib.dogecoin_ecc_stop()
def test_privkey_is_valid_testnet(self):
"""Test function returns valid private key"""
res = w.generate_priv_pub_key_pair(chain_code=1, as_bytes=True)
privkey = (ct.c_ubyte * 32)()
ct.memmove(privkey, res[0], 32)
dogecoin_key = w.DogecoinKey(privkey)
lib.dogecoin_ecc_start()
self.assertTrue(lib.dogecoin_privkey_is_valid(ct.byref(dogecoin_key)))
lib.dogecoin_ecc_stop()
def test_pubkey_gen_mainnet(self):
"""Test function returns public key for mainnet"""
res = w.generate_priv_pub_key_pair()
self.assertIsNotNone(res[1])
def test_pubkey_gen_testnet(self):
"""Test function returns public key for testnet"""
res = w.generate_priv_pub_key_pair(chain_code=1)
self.assertIsNotNone(res[1])
def test_p2pkh_addr_format_is_valid_mainnet(self):
"""Test function returns valid address for mainnet"""
res = w.generate_priv_pub_key_pair()
self.assertTrue(w.verify_p2pkh_address(res[1], 0))
def test_p2pkh_addr_format_is_valid_testnet(self):
"""Test function returns valid address for testnet"""
res = w.generate_priv_pub_key_pair(chain_code=1)
self.assertTrue(w.verify_p2pkh_address(res[1], 1))
def test_keypair_is_valid_mainnet(self):
"""Test that the private and public key for mainnet
are valid and associated to each other"""
res = w.generate_priv_pub_key_pair()
self.assertTrue(w.verify_priv_pub_keypair(res[0], res[1]))
def test_keypair_is_valid_testnet(self):
"""Test that the private and public key for testnet
are valid and associated to each other"""
res = w.generate_priv_pub_key_pair(chain_code=1)
self.assertTrue(w.verify_priv_pub_keypair(res[0], res[1], chain_code=1))
class TestGenerateHDMasterPrivPubKeyPair(unittest.TestCase):
"""Test class for function generate_hd_master_pub_key_pair"""
def test_master_privkey_gen_mainnet(self):
"""Test function returns master private key for mainnet"""
res = w.generate_hd_master_pub_key_pair()
self.assertIsNotNone(res[0])
def test_master_privkey_gen_testnet(self):
"""Test function returns amster private key for testnet"""
res = w.generate_hd_master_pub_key_pair(chain_code=1)
self.assertIsNotNone(res[0])
def test_privkey_is_valid_mainnet(self):
"""Test function returns valid master private key for mainnet"""
res = w.generate_hd_master_pub_key_pair(as_bytes=True)
privkey = (ct.c_ubyte * 32)()
# TODO: memmove operation only takes the first 32 bytes and cuts the rest
# should the is_valid check even return true? seems wrong
ct.memmove(privkey, res[0], 32)
dogecoin_key = w.DogecoinKey(privkey)
lib.dogecoin_ecc_start()
self.assertTrue(lib.dogecoin_privkey_is_valid(ct.byref(dogecoin_key)))
lib.dogecoin_ecc_stop()
def test_privkey_is_valid_testnet(self):
"""Test function returns valid private key"""
res = w.generate_priv_pub_key_pair(chain_code=1, as_bytes=True)
privkey = (ct.c_ubyte * 32)()
ct.memmove(privkey, res[0], 32)
dogecoin_key = w.DogecoinKey(privkey)
lib.dogecoin_ecc_start()
self.assertTrue(lib.dogecoin_privkey_is_valid(ct.byref(dogecoin_key)))
lib.dogecoin_ecc_stop()
def test_master_pubkey_gen_mainnet(self):
"""Test function returns master public key for mainnet"""
res = w.generate_hd_master_pub_key_pair()
self.assertIsNotNone(res[1])
def test_master_pubkey_gen_testnet(self):
"""Test function returns master public key for testnet"""
res = w.generate_hd_master_pub_key_pair(chain_code=1)
self.assertIsNotNone(res[1])
def test_master_keypair_is_valid_mainnet(self):
"""Test function verifies a valid hd keypair for mainnet"""
res = w.generate_hd_master_pub_key_pair()
self.assertTrue(w.verify_master_priv_pub_keypair(res[0], res[1], 0))
# TODO: need support for key derivation on testnet
# def test_master_keypair_is_valid_testnet(self):
# """Test function verifies a valid hd keypair for testnet"""
# res = w.generate_hd_master_pub_key_pair()
# self.assertTrue(w.verify_master_priv_pub_keypair(res[0], res[1], 1))
def test_p2pkh_addr_format_is_valid_mainnet(self):
"""Test function returns valid address for mainnet"""
res = w.generate_hd_master_pub_key_pair()
self.assertTrue(w.verify_p2pkh_address(res[1], 0))
# TODO: need support for key derivation on testnet
# def test_p2pkh_addr_format_is_valid_testnet(self):
# """Test function returns valid address for testnet"""
# res = w.generate_hd_master_pub_key_pair(chain_code=1)
# self.assertTrue(w.verify_p2pkh_address(res[1], 1))
if __name__ == "__main__":
unittest.main()
| 41.162162 | 81 | 0.6978 | 860 | 6,092 | 4.627907 | 0.116279 | 0.033166 | 0.055276 | 0.086683 | 0.90402 | 0.880653 | 0.85 | 0.760553 | 0.742965 | 0.696734 | 0 | 0.013608 | 0.203874 | 6,092 | 147 | 82 | 41.442177 | 0.80701 | 0.292351 | 0 | 0.666667 | 0 | 0 | 0.010292 | 0.008377 | 0 | 0 | 0 | 0.006803 | 0.214286 | 1 | 0.214286 | false | 0 | 0.047619 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
139402de483afa923832aad84e2df753588866f2 | 5,040 | py | Python | test/api_test.py | Trondheim-kommune/flod_matrikkel_address_restapi | 484879607b739f09b1ff710f931d485b85d7527f | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2017-01-09T09:03:11.000Z | 2017-01-09T09:03:11.000Z | test/api_test.py | Trondheim-kommune/flod_matrikkel_address_restapi | 484879607b739f09b1ff710f931d485b85d7527f | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2017-10-11T18:46:41.000Z | 2018-02-26T09:57:08.000Z | test/api_test.py | Trondheim-kommune/flod_matrikkel_address_restapi | 484879607b739f09b1ff710f931d485b85d7527f | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | #!/usr/bin/python
# -*- coding: utf-8 -*-
import os
import unittest
import json
from base64 import b64encode
import app
@unittest.skipUnless(os.environ.get("MATRIKKEL_BASE_URL", None), "matrikkel url not set")
@unittest.skipUnless(os.environ.get("MATRIKKEL_USERNAME", None), "matrikkel username not set")
@unittest.skipUnless(os.environ.get("MATRIKKEL_PASSWORD", None), "matrikkel password not set")
class MatrikkelApiAddressTest(unittest.TestCase):
def setUp(self):
username = "test"
password = "test"
self.app = app.create_app(
username,
password,
os.environ["MATRIKKEL_BASE_URL"],
os.environ["MATRIKKEL_USERNAME"],
os.environ["MATRIKKEL_PASSWORD"]
)
self.client = self.app.test_client()
self.headers = {
'Authorization': 'Basic ' + b64encode("{0}:{1}".format(username, password))
}
def test_search_for_address(self):
rv = self.client.get("/api/v1/addresses?query=ravelsveita", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 2)
self.assertEqual(data[0]["name"], "Ravelsveita 4")
def test_handle_special_chars(self):
rv = self.client.get("/api/v1/addresses?query=kjøpmannsgata", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 54)
self.assertEqual(data[0]["name"], u"Kjøpmannsgata 50")
def test_handle_search_on_number(self):
rv = self.client.get("/api/v1/addresses?query=kjøpmannsgata 50", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 1)
self.assertEqual(data[0]["name"], u"Kjøpmannsgata 50")
def test_handle_search_on_number_and_letter(self):
rv = self.client.get("/api/v1/addresses?query=kjøpmannsgata 16B", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 1)
self.assertEqual(data[0]["name"], u"Kjøpmannsgata 16B")
def test_handle_search_on_number_and_letter_lowercase(self):
rv = self.client.get("/api/v1/addresses?query=kjøpmannsgata 16b", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 1)
self.assertEqual(data[0]["name"], u"Kjøpmannsgata 16B")
def test_handle_search_on_number_and_letter_with_space(self):
rv = self.client.get("/api/v1/addresses?query=kjøpmannsgata 16 B", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 1)
self.assertEqual(data[0]["name"], u"Kjøpmannsgata 16B")
def test_handle_search_on_number_and_letter_with_several_spaces(self):
rv = self.client.get("/api/v1/addresses?query=kjøpmannsgata 16 B", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 1)
self.assertEqual(data[0]["name"], u"Kjøpmannsgata 16B")
@unittest.skipUnless(os.environ.get("MATRIKKEL_BASE_URL", None), "matrikkel url not set")
@unittest.skipUnless(os.environ.get("MATRIKKEL_USERNAME", None), "matrikkel username not set")
@unittest.skipUnless(os.environ.get("MATRIKKEL_PASSWORD", None), "matrikkel password not set")
class MatrikkelApiBuildingTest(unittest.TestCase):
def setUp(self):
username = "test"
password = "test"
self.app = app.create_app(
username,
password,
os.environ["MATRIKKEL_BASE_URL"],
os.environ["MATRIKKEL_USERNAME"],
os.environ["MATRIKKEL_PASSWORD"]
)
self.client = self.app.test_client()
self.headers = {
'Authorization': 'Basic ' + b64encode("{0}:{1}".format(username, password))
}
def test_get_points_for_bnr_gnr(self):
rv = self.client.get("/api/v1/buildings?gardsnr=402&bruksnr=188", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 1)
self.assertEqual(data[0]["position"]["lat"], 63.43152178572586)
self.assertEqual(data[0]["position"]["lon"], 10.39263181301638)
def test_should_not_return_bygningsendring(self):
rv = self.client.get("/api/v1/buildings?gardsnr=16&bruksnr=60", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(len(data), 17)
def test_should_get_return_bygningsnummer(self):
rv = self.client.get("/api/v1/buildings?gardsnr=402&bruksnr=188", headers=self.headers)
self.assertEqual(200, rv.status_code)
data = json.loads(rv.data)
self.assertEqual(data[0]["building_number"], 182166081) | 39.069767 | 99 | 0.662897 | 635 | 5,040 | 5.129134 | 0.16063 | 0.133558 | 0.030703 | 0.049125 | 0.876881 | 0.858459 | 0.858459 | 0.858459 | 0.855695 | 0.831747 | 0 | 0.038166 | 0.199405 | 5,040 | 129 | 100 | 39.069767 | 0.769021 | 0.00754 | 0 | 0.67 | 0 | 0 | 0.20196 | 0.075585 | 0 | 0 | 0 | 0 | 0.29 | 1 | 0.12 | false | 0.1 | 0.05 | 0 | 0.19 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
139fbcc365ddbd2b101f330108f10b1274222b40 | 13,520 | py | Python | src/m101j/week03/hw3-2and3-3/validate.py | hemmerling/nosql-mongodb2013 | bd2bb4f76234e0732b738f14cb474f7554c864c1 | [
"Apache-2.0"
] | 1 | 2019-06-29T20:21:15.000Z | 2019-06-29T20:21:15.000Z | hw3-2and3-3/validate.py | boaglio/mongodb-java | 9f3b8dfcb3943d5b5c9df40b5f5a4dee78fe69d2 | [
"Apache-2.0"
] | null | null | null | hw3-2and3-3/validate.py | boaglio/mongodb-java | 9f3b8dfcb3943d5b5c9df40b5f5a4dee78fe69d2 | [
"Apache-2.0"
] | null | null | null | import base64
code="
import pymongo
import urllib2
import urllib
import cookielib
import random
import re
import string
import sys
import getopt

# init the global cookie jar
cj = cookielib.CookieJar()
# declare the variables to connect to db
connection = None
db = None
webhost = "localhost:8082"
mongostr = "mongodb://localhost:27017"
db_name = "blog"

# this script will check that homework 3.2 is correct

# makes a little salt
def make_salt(n):
    salt = ""
    for i in range(n):
        salt = salt + random.choice(string.ascii_letters)
    return salt


# this is a validation program to make sure that the blog works correctly.

def create_user(username, password):
    
    global cj

    try:
        print "Trying to create a test user ", username
        url = "http://{0}/signup".format(webhost)

        data = urllib.urlencode([("email",""),("username",username), ("password",password), ("verify",password)])
        request = urllib2.Request(url=url, data=data)
        opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
        f = opener.open(request)

        users = db.users
        # check that the user is in users collection
        user = users.find_one({'_id':username})
        if (user == None):
            print "Could not find the test user ", username, "in the users collection."
            return False
        print "Found the test user ", username, " in the users collection"

        # check that the user has been built
        result = f.read()
        expr = re.compile("Welcome\s+"+ username)
        if expr.search(result):
            return True
        
        print "When we tried to create a user, here is the output we got\n"
        print result
        
        return False
    except:
        print "the request to ", url, " failed, so your blog may not be running."
        raise
        return False


def try_to_login(username, password):

    try:
        print "Trying to login for test user ", username
        url = "http://{0}/login".format(webhost)

        data = urllib.urlencode([("username",username), ("password",password)])
        request = urllib2.Request(url=url, data=data)
        opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
        f = opener.open(request)

        # check for successful login
        result = f.read()
        expr = re.compile("Welcome\s+"+ username)
        if expr.search(result):
            return True

        print "When we tried to login, here is the output we got\n"
        print result
        return False
    except:
        print "the request to ", url, " failed, so your blog may not be running."
        return False


def add_blog_post(title,post,tags):

    try:
        print "Trying to submit a post with title ", title
        data = urllib.urlencode([("body",post), ("subject",title), ("tags",tags)])
        url = "http://{0}/newpost".format(webhost)
        request = urllib2.Request(url=url, data=data)
        cj.add_cookie_header(request)
        opener = urllib2.build_opener()
        f = opener.open(request)

        # check for successful login
        result = f.read()
        expr = re.compile(title + ".+" + post, re.DOTALL)

        if expr.search(result):
            return True

        print "When we tried to post, here is the output we got\n"
        print result
        return False

    except:
        print "the request to ", url, " failed, so your blog may not be running."
        raise

        return False

def add_blog_comment(title,post):

    try:
        print "Trying to submit a blog comment for post with title", title
        url = "http://{0}/newcomment".format(webhost)
        
        doc = {}
        check_mongo_for_post(title, post, doc)

        permalink = doc['doc']['permalink']

        comment_name = make_salt(12)
        comment_body = make_salt(12)

        data = urllib.urlencode([("commentName",comment_name), ("commentBody",comment_body), ("permalink",permalink)])
        request = urllib2.Request(url=url, data=data)
        cj.add_cookie_header(request)
        opener = urllib2.build_opener()
        f = opener.open(request)

        # check for successful addition of comment on page
        result = f.read()
        expr = re.compile(title + ".+" + post, re.DOTALL)

        if not expr.search(result):
            print "When we tried to find the comment we posted at the  ", url, " here is what we got"
            print result
            return False


        # check for successful addition of comment..retrieve the doc again
        if(not check_mongo_for_post(title, post, doc)):
            print "Could not find comment in database"
            return False
        
        found = False
        if ('comments' in doc['doc']):
            for comment in doc['doc']['comments']:
                if (comment['body'] == comment_body and comment['author'] == comment_name):
                    found = True

        return found

    except:
        print "the request to ", url, " failed, so your blog may not be running."
        raise

        return False


# grabs the blog index and checks that the posts appear in the right order
def check_blog_index(title1, title2):

    try:
        url = "http://{0}/".format(webhost)
        print "Trying to grab the blog home page at url ", url
        request = urllib2.Request(url=url)
        cj.add_cookie_header(request)
        opener = urllib2.build_opener()
        f = opener.open(request)

        # check for successful login
        result = f.read()
        expr = re.compile(title2.lower() + ".+" + title2, re.DOTALL)

        if expr.search(result):
            return True

        print "When we tried to read the blog index at ", url, " here is what we got"
        print result
        return False

    except:
        print "the request to ", url, " failed, so your blog may not be running."
        raise

        return False

# check that a particular blog post is in the collection
def check_mongo_for_post(title, body, document):
    
    posts = db.posts
    try:
        post = posts.find_one({'title':title, 'body':body})
        if (post is None):
            print "Can't find post with title ", title, " in collection"
            return False
        document['doc'] = post
        return True
    except:
        print "can' query MongoDB..is it running?"
        raise

        return False

# command line arg parsing to make folks happy who want to run at mongolabs or mongohq
# this functions uses global vars to communicate. forgive me.
def arg_parsing(argv):

    global webhost
    global mongostr
    global db_name

    try:
        opts, args = getopt.getopt(argv, "-p:-m:-d:")
    except getopt.GetoptError:
        print "usage validate.py -p webhost -m mongoConnectString -d databaseName"
        print "\twebhost defaults to {0}".format(webhost)
        print "\tmongoConnectionString default to {0}".format(mongostr)
        print "\tdatabaseName defaults to {0}".format(db_name)
        sys.exit(2)
    for opt, arg in opts:
        if (opt == '-h'):
            print "usage validate.py -p webhost -m mongoConnectString -d databaseName"
            sys.exit(2)
        elif opt in ("-p"):
            webhost = arg
            print "Overriding HTTP host to be ", webhost
        elif opt in ("-m"):
            mongostr = arg
            print "Overriding MongoDB connection string to be ", mongostr
        elif opt in ("-d"):
            db_name = arg
            print "Overriding MongoDB database to be ", db_name
            


# main section of the code
def main(argv):
            
    arg_parsing(argv)
    global connection
    global db

    print "Welcome to the HW 3.2 and HW 3.3 validation tester"

    # connect to the db (mongostr was set in arg_parsing)
    connection = pymongo.Connection(mongostr, safe=True)
    db = connection[db_name]
        
    username = make_salt(7)
    password = make_salt(8)

     # try to create user

    if (create_user(username, password)):
        print "User creation successful. "
         # try to login
        if (try_to_login(username, password)):
            print "User login successful."
        else:
            print "User login failed"
            print "Odd, this weeks's code should do that as given"
            sys.exit(1)

    else:
        print "Sorry, you have not solved it yet."
        sys.exit(1)


    # try to create a blog post
    post1 = make_salt(30)
    title1 = make_salt(30)
    tags1 = make_salt(5) + ", " + make_salt(5) + ", " + make_salt(5)


    if (add_blog_post(title1, post1,tags1)):
        print "Submission of single post successful"
    else:
        print "Unable to create a post"
        sys.exit(1)


    # try to create a second blog post
    post2 = make_salt(30)
    title2 = make_salt(30)
    tags2 = make_salt(5) + ", " + make_salt(5) + ", " + make_salt(5)

    if (add_blog_post(title2, post2,tags2)):
        print "Submission of second post successful"
    else:
        print "Unable to create second post"
        sys.exit(1)

    # now let's make sure that both posts appear on the home page of the blog, in the correct order

    if (check_blog_index(title1, title2)):
        print "Block index looks good."
    else:
        print "Blog index does not have the posts present, ordered correctly"
        sys.exit(1)


    # check for DB data integrity
    if (not check_mongo_for_post(title1, post1, {})):
        print "Can't find blog post in blog db, posts collection with title ", title
        sys.exit(1)
    else:
        print "Found blog post in posts collection"


    print "Tests Passed for HW 3.2. Your HW 3.2 validation code is njkfd489hj9fhds8934kf23"

    # now check that you can post a comment
    if (not add_blog_comment(title1,post1)):
        print "Can't add blog comments (so HW 3.3 not yet complete)"
        sys.exit(1)
    else:
        print "Successfully added blog comments"


    print "Tests Passed for HW 3.3. Your HW 3.3 validation code is ihfr48nf89jk09309kj0d2d"
    



if __name__ == "__main__":
    main(sys.argv[1:])
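The tester above derives its random usernames, passwords, post bodies and tags from `make_salt(n)`, which is defined earlier in the script and not shown in this section. A minimal sketch of such a helper (a hypothetical reimplementation for illustration, assuming lowercase alphanumeric output) looks like:

```python
import random
import string


def make_salt(n):
    # Build a random n-character lowercase alphanumeric string,
    # suitable for throwaway test usernames, passwords and tags.
    alphabet = string.ascii_lowercase + string.digits
    return "".join(random.choice(alphabet) for _ in range(n))
```

The exact alphabet used by the real helper may differ; only the length contract matters to the tester.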







"
eval(compile(base64.b64decode(code), "<string>", 'exec'))
# tests/unit_tests/test_UpdatedSynDataset_specified_target.py (liyu711/SUAS, MIT)
"""
Running Successfully
import unittest
import math
from PIL import Image
from SyntheticDataset2.ImageCreator.specified_target import SpecifiedTarget
class SpecifiedTargetTestCase(unittest.TestCase):
    def setUp(self):
        self.test_image0 = SpecifiedTarget("circle", "?", "A", 300, 1.5, (255, 255, 255, 255), (0, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image1 = SpecifiedTarget("quarter_circle", "N", "A", 300, 1.5, (255, 255, 255, 255), (0, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image2 = SpecifiedTarget("quarter_circle", "NE", "B", 300, 1.5, (255, 255, 255, 255), (0, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image3 = SpecifiedTarget("quarter_circle", "E", "C", 300, 1.5, (255, 255, 255, 255), (0, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image4 = SpecifiedTarget("quarter_circle", "SE", "D", 300, 1.5, (255, 255, 255, 255), (0, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image5 = SpecifiedTarget("quarter_circle", "S", "E", 400, 1.5, (0, 0, 0, 255), (255, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image6 = SpecifiedTarget("quarter_circle", "SW", "F", 400, 1.5, (0, 0, 0, 255), (255, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image7 = SpecifiedTarget("quarter_circle", "W", "G", 400, 1.5, (0, 0, 0, 255), (255, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image8 = SpecifiedTarget("quarter_circle", "NW", "H", 400, 1.5, (0, 0, 0, 255), (255, 255, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image9 = SpecifiedTarget("semicircle", "N", "I", 500, 1.5, (255, 0, 0, 255), (0, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image10 = SpecifiedTarget("semicircle", "E", "J", 500, 1.5, (255, 0, 0, 255), (0, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image11 = SpecifiedTarget("semicircle", "S", "K", 500, 2, (255, 0, 0, 255), (0, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image12 = SpecifiedTarget("semicircle", "W", "L", 500, 2, (255, 0, 0, 255), (0, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image13 = SpecifiedTarget("cross", "N", "M", 600, 2, (0, 255, 0, 255), (255, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image14 = SpecifiedTarget("cross", "D", "N", 600, 2, (0, 255, 0, 255), (255, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image15 = SpecifiedTarget("triangle", "N", "O", 600, 2, (0, 255, 0, 255), (255, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image16 = SpecifiedTarget("triangle", "S", "P", 600, 2, (0, 255, 0, 255), (255, 0, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image17 = SpecifiedTarget("square", "N", "Q", 700, 2, (0, 0, 255, 255), (0, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image18 = SpecifiedTarget("square", "D", "R", 700, 2, (0, 0, 255, 255), (0, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image19 = SpecifiedTarget("rectangle", "NS", "S", 700, 2, (0, 0, 255, 255), (0, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image20 = SpecifiedTarget("rectangle", "EW", "T", 700, 2, (0, 0, 255, 255), (0, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image21 = SpecifiedTarget("trapezoid", "N", "U", 800, 2, (255, 255, 0, 255), (0, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image22 = SpecifiedTarget("trapezoid", "S", "V", 800, 2.5, (255, 255, 0, 255), (0, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image23 = SpecifiedTarget("pentagon", "N", "W", 800, 2.5, (255, 255, 0, 255), (0, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image24 = SpecifiedTarget("pentagon", "S", "X", 800, 2.5, (255, 255, 0, 255), (0, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image25 = SpecifiedTarget("hexagon", "N", "Y", 900, 2.5, (255, 0, 255, 255), (255, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image26 = SpecifiedTarget("hexagon", "D", "Z", 900, 2.5, (255, 0, 255, 255), (255, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image27 = SpecifiedTarget("heptagon", "N", "A", 900, 2.5, (255, 0, 255, 255), (255, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image28 = SpecifiedTarget("heptagon", "S", "B", 900, 2.5, (255, 0, 255, 255), (255, 255, 0, 255), 0, 0, 0).create_specified_target()
        self.test_image29 = SpecifiedTarget("octagon", "N", "C", 1000, 2.5, (0, 255, 255, 255), (255, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image30 = SpecifiedTarget("octagon", "D", "D", 1000, 2.5, (0, 255, 255, 255), (255, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image31 = SpecifiedTarget("star", "N", "E", 1000, 2.5, (0, 255, 255, 255), (255, 0, 255, 255), 0, 0, 0).create_specified_target()
        self.test_image32 = SpecifiedTarget("star", "S", "F", 1000, 2.5, (0, 255, 255, 255), (255, 0, 255, 255), 0, 0, 0).create_specified_target()
    def test_create_specified_single_target(self):
        self.assertTrue(abs(self.test_image0.width - 300) <= 10)
        self.assertTrue(abs(self.test_image0.height - 300) <= 10)
        self.assertTrue(self.test_image0.load()[1, 1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image0.load()[self.test_image0.width-1, self.test_image0.height-1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image0.load()[self.test_image0.width/2, self.test_image0.height/2] == (255, 255, 255, 255))
        self.assertTrue(abs(self.test_image1.width - (300 * 1.5 * math.sqrt(2))) <= 10)
        self.assertTrue(abs(self.test_image1.height - (300 * 1.5)) <= 10)
        self.assertTrue(self.test_image1.load()[1, 1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image1.load()[self.test_image1.width-1, self.test_image1.height-1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image1.load()[self.test_image1.width/2, self.test_image1.height/2] == (255, 255, 255, 255))
        self.assertTrue(abs(self.test_image2.width - (300 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image2.height - (300 * 1.5)) <= 10)
        self.assertTrue(self.test_image2.load()[1, 1] == (255, 255, 255, 255))
        self.assertTrue(self.test_image2.load()[self.test_image2.width-1, self.test_image2.height-1] == (255, 255, 255, 255))
        self.assertTrue(self.test_image2.load()[self.test_image2.width/2, self.test_image2.height/2] == (255, 255, 255, 255))
        self.assertTrue(abs(self.test_image3.width - (300 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image3.height - (300 * 1.5 * math.sqrt(2))) <= 10)
        self.assertTrue(self.test_image3.load()[1, 1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image3.load()[self.test_image3.width-1, self.test_image3.height-1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image3.load()[self.test_image3.width/2, self.test_image3.height/2] == (255, 255, 255, 255))
        self.assertTrue(abs(self.test_image4.width - (300 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image4.height - (300 * 1.5)) <= 10)
        self.assertTrue(self.test_image4.load()[1, 1] != (255, 255, 255, 255))
        self.assertTrue(self.test_image4.load()[self.test_image4.width-1, self.test_image4.height-1] == (255, 255, 255, 255))
        self.assertTrue(self.test_image4.load()[self.test_image4.width/2, self.test_image4.height/2] == (255, 255, 255, 255))
        self.assertTrue(abs(self.test_image5.width - (400 * 1.5 * math.sqrt(2))) <= 10)
        self.assertTrue(abs(self.test_image5.height - (400 * 1.5)) <= 10)
        self.assertTrue(self.test_image5.load()[1, 1] != (0, 0, 0, 255))
        self.assertTrue(self.test_image5.load()[self.test_image5.width-1, self.test_image5.height-1] != (0, 0, 0, 255))
        self.assertTrue(self.test_image5.load()[self.test_image5.width/2, self.test_image5.height/2] == (255, 255, 255, 255))
        self.assertTrue(abs(self.test_image6.width - (400 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image6.height - (400 * 1.5)) <= 10)
        self.assertTrue(self.test_image6.load()[1, 1] == (0, 0, 0, 255))
        self.assertTrue(self.test_image6.load()[self.test_image6.width-1, self.test_image6.height-1] == (0, 0, 0, 255))
        self.assertTrue(self.test_image6.load()[self.test_image6.width/2, self.test_image6.height/2] == (0, 0, 0, 255))
        self.assertTrue(abs(self.test_image7.width - (400 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image7.height - (400 * 1.5 * math.sqrt(2))) <= 10)
        self.assertTrue(self.test_image7.load()[1, 1] != (0, 0, 0, 255))
        self.assertTrue(self.test_image7.load()[self.test_image7.width-1, self.test_image7.height-1] != (0, 0, 0, 255))
        self.assertTrue(self.test_image7.load()[self.test_image7.width/2, self.test_image7.height/2] == (0, 0, 0, 255))
        self.assertTrue(abs(self.test_image8.width - (400 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image8.height - (400 * 1.5)) <= 10)
        self.assertTrue(self.test_image8.load()[1, 1] == (0, 0, 0, 255))
        self.assertTrue(self.test_image8.load()[self.test_image8.width-1, self.test_image8.height-1] != (0, 0, 0, 255))
        self.assertTrue(self.test_image8.load()[self.test_image8.width/2, self.test_image8.height/2] == (0, 0, 0, 255))
        self.assertTrue(abs(self.test_image9.width - 1000) <= 10)
        self.assertTrue(abs(self.test_image9.height - 500) <= 10)
        self.assertTrue(self.test_image9.load()[1, 1] == (255, 0, 0, 255))
        self.assertTrue(self.test_image9.load()[self.test_image9.width-1, self.test_image9.height-1] != (255, 0, 0, 255))
        self.assertTrue(self.test_image9.load()[self.test_image9.width/2, self.test_image9.height/2] == (0, 0, 0, 255))
        self.assertTrue(abs(self.test_image10.width - 500) <= 10)
        self.assertTrue(abs(self.test_image10.height - 1000) <= 10)
        self.assertTrue(self.test_image10.load()[1, 1] != (255, 0, 0, 255))
        self.assertTrue(self.test_image10.load()[self.test_image10.width-1, self.test_image10.height-1] == (255, 0, 0, 255))
        self.assertTrue(self.test_image10.load()[self.test_image10.width/2, self.test_image10.height/2] == (255, 0, 0, 255))
        self.assertTrue(abs(self.test_image11.width - 1000) <= 10)
        self.assertTrue(abs(self.test_image11.height - 500) <= 10)
        self.assertTrue(self.test_image11.load()[1, 1] != (255, 0, 0, 255))
        self.assertTrue(self.test_image11.load()[self.test_image11.width-1, self.test_image11.height-1] == (255, 0, 0, 255))
        self.assertTrue(self.test_image11.load()[self.test_image11.width/2, self.test_image11.height/2] == (0, 0, 0, 255))
        self.assertTrue(abs(self.test_image12.width - 500) <= 10)
        self.assertTrue(abs(self.test_image12.height - 1000) <= 10)
        self.assertTrue(self.test_image12.load()[1, 1] == (255, 0, 0, 255))
        self.assertTrue(self.test_image12.load()[self.test_image12.width-1, self.test_image12.height-1] != (255, 0, 0, 255))
        self.assertTrue(self.test_image12.load()[self.test_image12.width/2, self.test_image12.height/2] == (255, 0, 0, 255))
        self.assertTrue(abs(self.test_image13.width - (600 * 2)) <= 10)
        self.assertTrue(abs(self.test_image13.height - (600 * 2)) <= 10)
        self.assertTrue(self.test_image13.load()[1, 1] != (0, 255, 0, 255))
        self.assertTrue(self.test_image13.load()[self.test_image13.width-1, self.test_image13.height-1] != (0, 255, 0, 255))
        self.assertTrue(self.test_image13.load()[self.test_image13.width/2, self.test_image13.height/2] == (255, 0, 0, 255))
        self.assertTrue(abs(self.test_image14.width - (800 * math.sqrt(2))) <= 10)
        self.assertTrue(abs(self.test_image14.height - (800 * math.sqrt(2))) <= 10)
        self.assertTrue(self.test_image14.load()[1, 1] != (0, 255, 0, 255))
        self.assertTrue(self.test_image14.load()[self.test_image14.width-1, self.test_image14.height-1] != (0, 255, 0, 255))
        self.assertTrue(self.test_image14.load()[self.test_image14.width/2, self.test_image14.height/2] == (255, 0, 0, 255))
        self.assertTrue(abs(self.test_image15.width - (600 * 1.5 / math.sqrt(3) * 2)) <= 10)
        self.assertTrue(abs(self.test_image15.height - (600 * 1.5)) <= 10)
        self.assertTrue(self.test_image15.load()[1, 1] != (0, 255, 0, 255))
        self.assertTrue(self.test_image15.load()[self.test_image15.width-1, self.test_image15.height-1] == (0, 255, 0, 255))
        self.assertTrue(self.test_image15.load()[self.test_image15.width/2, self.test_image15.height/2] == (255, 0, 0, 255))
        self.assertTrue(abs(self.test_image16.width - (600 * 1.5 / math.sqrt(3) * 2)) <= 10)
        self.assertTrue(abs(self.test_image16.height - (600 * 1.5)) <= 10)
        self.assertTrue(self.test_image16.load()[1, 1] == (0, 255, 0, 255))
        self.assertTrue(self.test_image16.load()[self.test_image16.width-1, self.test_image16.height-1] != (0, 255, 0, 255))
        self.assertTrue(self.test_image16.load()[self.test_image16.width/2, self.test_image16.height/2] == (0, 255, 0, 255))
        self.assertTrue(abs(self.test_image17.width - (700 * math.sqrt(2))) <= 10)
        self.assertTrue(abs(self.test_image17.height - (700 * math.sqrt(2))) <= 10)
        self.assertTrue(self.test_image17.load()[1, 1] != (0, 0, 255, 255))
        self.assertTrue(self.test_image17.load()[self.test_image17.width-1, self.test_image17.height-1] != (0, 0, 255, 255))
        self.assertTrue(self.test_image17.load()[self.test_image17.width/2, self.test_image17.height/2] == (0, 0, 255, 255))
        self.assertTrue(abs(self.test_image18.width - 700) <= 10)
        self.assertTrue(abs(self.test_image18.height - 700) <= 10)
        self.assertTrue(self.test_image18.load()[1, 1] == (0, 0, 255, 255))
        self.assertTrue(self.test_image18.load()[self.test_image18.width-1, self.test_image18.height-1] == (0, 0, 255, 255))
        self.assertTrue(self.test_image18.load()[self.test_image18.width/2, self.test_image18.height/2] == (0, 255, 0, 255))
        self.assertTrue(abs(self.test_image19.width - 700) <= 10)
        self.assertTrue(abs(self.test_image19.height - (700 * 1.5)) <= 10)
        self.assertTrue(self.test_image19.load()[1, 1] == (0, 0, 255, 255))
        self.assertTrue(self.test_image19.load()[self.test_image19.width-1, self.test_image19.height-1] == (0, 0, 255, 255))
        self.assertTrue(self.test_image19.load()[self.test_image19.width/2, self.test_image19.height/2] == (0, 255, 0, 255))
        self.assertTrue(abs(self.test_image20.width - (700 * 1.5)) <= 10)
        self.assertTrue(abs(self.test_image20.height - 700) <= 10)
        self.assertTrue(self.test_image20.load()[1, 1] == (0, 0, 255, 255))
        self.assertTrue(self.test_image20.load()[self.test_image20.width-1, self.test_image20.height-1] == (0, 0, 255, 255))
        self.assertTrue(self.test_image20.load()[self.test_image20.width/2, self.test_image20.height/2] == (0, 255, 0, 255))
        self.assertTrue(abs(self.test_image21.width - 1200) <= 10)
        self.assertTrue(abs(self.test_image21.height - 800) <= 10)
        self.assertTrue(self.test_image21.load()[1, 1] != (255, 255, 0, 255))
        self.assertTrue(self.test_image21.load()[self.test_image21.width-1, self.test_image21.height-1] == (255, 255, 0, 255))
        self.assertTrue(self.test_image21.load()[self.test_image21.width/2, self.test_image21.height/2] == (255, 255, 0, 255))
        self.assertTrue(abs(self.test_image22.width - 1200) <= 10)
        self.assertTrue(abs(self.test_image22.height - 800) <= 10)
        self.assertTrue(self.test_image22.load()[1, 1] == (255, 255, 0, 255))
        self.assertTrue(self.test_image22.load()[self.test_image22.width-1, self.test_image22.height-1] != (255, 255, 0, 255))
        self.assertTrue(self.test_image22.load()[self.test_image22.width/2, self.test_image22.height/2] == (255, 255, 0, 255))
        self.assertTrue(abs(self.test_image23.width - (800 * 2 / 1.7 * math.sin(math.radians(72)))) <= 10)
        self.assertTrue(abs(self.test_image23.height - (400 * 2 / 1.7 + (400 * 2 / 1.7 * math.cos(math.radians(36))))) <= 10)
        self.assertTrue(self.test_image23.load()[1, 1] != (255, 255, 0, 255))
        self.assertTrue(self.test_image23.load()[self.test_image23.width-1, self.test_image23.height-1] != (255, 255, 0, 255))
        self.assertTrue(self.test_image23.load()[self.test_image23.width/2, self.test_image23.height/2] == (0, 0, 255, 255))
        self.assertTrue(abs(self.test_image24.width - (800 * 2 / 1.7 * math.sin(math.radians(72)))) <= 10)
        self.assertTrue(abs(self.test_image24.height - (400 * 2 / 1.7 + (400 * 2 / 1.7 * math.cos(math.radians(36))))) <= 10)
        self.assertTrue(self.test_image24.load()[1, 1] != (255, 255, 0, 255))
        self.assertTrue(self.test_image24.load()[self.test_image24.width-1, self.test_image24.height-1] != (255, 255, 0, 255))
        self.assertTrue(self.test_image24.load()[self.test_image24.width/2, self.test_image24.height/2] == (255, 255, 0, 255))
        self.assertTrue(abs(self.test_image25.width - (900 / 1.8 * math.sqrt(3))) <= 10)
        self.assertTrue(abs(self.test_image25.height - (900 / 1.8 * 2)) <= 10)
        self.assertTrue(self.test_image25.load()[1, 1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image25.load()[self.test_image25.width-1, self.test_image25.height-1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image25.load()[self.test_image25.width/2, self.test_image25.height/2] == (255, 255, 0, 255))
        self.assertTrue(abs(self.test_image26.width - (900 / 1.8 * 2)) <= 10)
        self.assertTrue(abs(self.test_image26.height - (900 / 1.8 * math.sqrt(3))) <= 10)
        self.assertTrue(self.test_image26.load()[1, 1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image26.load()[self.test_image26.width-1, self.test_image26.height-1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image26.load()[self.test_image26.width/2, self.test_image26.height/2] == (255, 255, 0, 255))
        self.assertTrue(abs(self.test_image27.width - (900 * 2 / 1.8 * math.sin(math.radians(540 / 7)))) <= 10)
        self.assertTrue(abs(self.test_image27.height - (900 / 1.8 + (900 / 1.8 * math.cos(math.radians(360 / 14))))) <= 10)
        self.assertTrue(self.test_image27.load()[1, 1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image27.load()[self.test_image27.width-1, self.test_image27.height-1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image27.load()[self.test_image27.width/2, self.test_image27.height/2] == (255, 0, 255, 255))
        self.assertTrue(abs(self.test_image28.width - (900 * 2 / 1.8 * math.sin(math.radians(540 / 7)))) <= 10)
        self.assertTrue(abs(self.test_image28.height - (900 / 1.8 + (900 / 1.8 * math.cos(math.radians(360 / 14))))) <= 10)
        self.assertTrue(self.test_image28.load()[1, 1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image28.load()[self.test_image28.width-1, self.test_image28.height-1] != (255, 0, 255, 255))
        self.assertTrue(self.test_image28.load()[self.test_image28.width/2, self.test_image28.height/2] == (255, 255, 0, 255))
        self.assertTrue(abs(self.test_image29.width - 1000 / 1.8 * 2) <= 10)
        self.assertTrue(abs(self.test_image29.height - 1000 / 1.8 * 2) <= 10)
        self.assertTrue(self.test_image29.load()[1, 1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image29.load()[self.test_image29.width-1, self.test_image29.height-1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image29.load()[self.test_image29.width/2, self.test_image29.height/2] == (0, 255, 255, 255))
        self.assertTrue(abs(self.test_image30.width - (1000 / 1.8 * 2 * math.cos(math.radians(22.5)))) <= 10)
        self.assertTrue(abs(self.test_image30.height - (1000 / 1.8 * 2 * math.cos(math.radians(22.5)))) <= 10)
        self.assertTrue(self.test_image30.load()[1, 1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image30.load()[self.test_image30.width-1, self.test_image30.height-1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image30.load()[self.test_image30.width/2, self.test_image30.height/2] == (0, 255, 255, 255))
        self.assertTrue(abs(self.test_image31.width - (2 * 1000 / 1.1 * math.sin(math.radians(72)))) <= 10)
        self.assertTrue(abs(self.test_image31.height - ((1000 / 1.1) + (1000 / 1.1 * math.cos(math.radians(36))))) <= 10)
        self.assertTrue(self.test_image31.load()[1, 1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image31.load()[self.test_image31.width-1, self.test_image31.height-1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image31.load()[self.test_image31.width/2, self.test_image31.height/2] == (255, 0, 255, 255))
        self.assertTrue(abs(self.test_image32.width - (2 * 1000 / 1.1 * math.sin(math.radians(72)))) <= 10)
        self.assertTrue(abs(self.test_image32.height - ((1000 / 1.1) + (1000 / 1.1 * math.cos(math.radians(36))))) <= 10)
        self.assertTrue(self.test_image32.load()[1, 1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image32.load()[self.test_image32.width-1, self.test_image32.height-1] != (0, 255, 255, 255))
        self.assertTrue(self.test_image32.load()[self.test_image32.width/2, self.test_image32.height/2] == (0, 255, 255, 255))
"""
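The assertions above repeatedly compare a rendered dimension against an expected value with a ±10-pixel tolerance. That pattern can be factored into a small helper (a sketch with a hypothetical name, not part of the SUAS test suite):

```python
def within_tolerance(actual, expected, tol=10):
    # True when |actual - expected| <= tol, mirroring the
    # abs(image.width - expected) <= 10 checks in the tests above.
    return abs(actual - expected) <= tol
```

With unittest, `self.assertTrue(within_tolerance(img.width, 300))` reads the same as the inline form but centralizes the tolerance in one place.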
# backend/api/python_http_client/kfp_server_api/api/job_service_api.py (surajkota/pipelines, Apache-2.0)
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# coding: utf-8
"""
    Kubeflow Pipelines API

    This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.

    Contact: kubeflow-pipelines@google.com
    Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from kfp_server_api.api_client import ApiClient
from kfp_server_api.exceptions import (  # noqa: F401
    ApiTypeError,
    ApiValueError
)
class JobServiceApi(object):
    """NOTE: This class is auto generated by OpenAPI Generator
    Ref: https://openapi-generator.tech

    Do not edit the class manually.
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def create_job(self, body, **kwargs):  # noqa: E501
        """Creates a new job.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_job(body, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param ApiJob body: The job to be created (required)
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: ApiJob
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.create_job_with_http_info(body, **kwargs)  # noqa: E501
    def create_job_with_http_info(self, body, **kwargs):  # noqa: E501
        """Creates a new job.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_job_with_http_info(body, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param ApiJob body: The job to be created (required)
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(ApiJob, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = [
            'body'
        ]
        all_params.extend(
            [
                'async_req',
                '_return_http_data_only',
                '_preload_content',
                '_request_timeout'
            ]
        )

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_job" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'body' is set
        if self.api_client.client_side_validation and ('body' not in local_var_params or  # noqa: E501
                                                       local_var_params['body'] is None):  # noqa: E501
            raise ApiValueError("Missing the required parameter `body` when calling `create_job`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['Bearer']  # noqa: E501

        return self.api_client.call_api(
            '/apis/v1beta1/jobs', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ApiJob',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
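Each generated `*_with_http_info` method above validates its `**kwargs` against a whitelist before building the request. That pattern, extracted into a standalone sketch (hypothetical helper name, not part of kfp_server_api), looks like:

```python
def validate_kwargs(all_params, kwargs):
    # Reject any keyword not in the whitelist, as the generated methods
    # do with ApiTypeError, and return the accepted keywords.
    accepted = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
        accepted[key] = val
    return accepted
```

The generated client raises its own `ApiTypeError` subclass rather than the plain `TypeError` used here.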
    def delete_job(self, id, **kwargs):  # noqa: E501
        """Deletes a job.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_job(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str id: The ID of the job to be deleted (required)
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.delete_job_with_http_info(id, **kwargs)  # noqa: E501
    def delete_job_with_http_info(self, id, **kwargs):  # noqa: E501
        """Deletes a job.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_job_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str id: The ID of the job to be deleted (required)
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(object, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = [
            'id'
        ]
        all_params.extend(
            [
                'async_req',
                '_return_http_data_only',
                '_preload_content',
                '_request_timeout'
            ]
        )

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_job" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in local_var_params or  # noqa: E501
                                                       local_var_params['id'] is None):  # noqa: E501
            raise ApiValueError("Missing the required parameter `id` when calling `delete_job`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in local_var_params:
            path_params['id'] = local_var_params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['Bearer']  # noqa: E501

        return self.api_client.call_api(
            '/apis/v1beta1/jobs/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
def disable_job(self, id, **kwargs): # noqa: E501
"""Stops a job and all its associated runs. The job is not deleted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disable_job(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: The ID of the job to be disabled (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.disable_job_with_http_info(id, **kwargs) # noqa: E501
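The `async_req=True` pattern described in the docstring above (the call returns a thread-like object whose `.get()` blocks until the result is ready) is the standard `AsyncResult` idiom. It can be sketched with the standard library's `ThreadPool`; the `disable_job` stub below is a hypothetical stand-in for the real `POST /apis/v1beta1/jobs/{id}/disable` request:

```python
from multiprocessing.pool import ThreadPool

def disable_job(job_id):
    # Hypothetical stand-in for the real HTTP call made by the client.
    return {"id": job_id, "enabled": False}

pool = ThreadPool(processes=1)
thread = pool.apply_async(disable_job, ("job-1",))  # returns an AsyncResult
result = thread.get()                               # blocks until the call finishes
print(result)  # {'id': 'job-1', 'enabled': False}
pool.close()
pool.join()
```

The generated client wraps its HTTP call the same way when `async_req` is set, which is why `thread.get()` appears in every docstring example.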
def disable_job_with_http_info(self, id, **kwargs): # noqa: E501
"""Stops a job and all its associated runs. The job is not deleted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disable_job_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: The ID of the job to be disabled (required)
:param _return_http_data_only: return response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(object, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method disable_job" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `disable_job`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/apis/v1beta1/jobs/{id}/disable', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
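Every `*_with_http_info` method above opens with the same guard: capture `locals()`, merge `kwargs` in, and raise `ApiTypeError` for any keyword not in `all_params`. Stripped of the generated plumbing, that guard reduces to a small helper (the helper name is illustrative, not part of the generated client; `TypeError` stands in for `ApiTypeError`):

```python
def merge_and_validate_kwargs(method_name, kwargs, all_params):
    """Reject unexpected keyword arguments, mirroring the generated guard."""
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method %s"
                % (key, method_name)
            )
    return dict(kwargs)

params = merge_and_validate_kwargs(
    "disable_job", {"async_req": True}, ["id", "async_req", "_request_timeout"]
)
print(params)  # {'async_req': True}

try:
    merge_and_validate_kwargs("disable_job", {"bogus": 1}, ["id"])
except TypeError as exc:
    print(exc)  # Got an unexpected keyword argument 'bogus' to method disable_job
```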
def enable_job(self, id, **kwargs): # noqa: E501
"""Restarts a job that was previously stopped. All runs associated with the job will continue. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.enable_job(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: The ID of the job to be enabled (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.enable_job_with_http_info(id, **kwargs) # noqa: E501
def enable_job_with_http_info(self, id, **kwargs): # noqa: E501
"""Restarts a job that was previously stopped. All runs associated with the job will continue. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.enable_job_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: The ID of the job to be enabled (required)
:param _return_http_data_only: return response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(object, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method enable_job" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `enable_job`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/apis/v1beta1/jobs/{id}/enable', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_job(self, id, **kwargs): # noqa: E501
"""Finds a specific job by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: The ID of the job to be retrieved (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: ApiJob
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_job_with_http_info(id, **kwargs) # noqa: E501
def get_job_with_http_info(self, id, **kwargs): # noqa: E501
"""Finds a specific job by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: The ID of the job to be retrieved (required)
:param _return_http_data_only: return response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(ApiJob, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_job" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `get_job`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/apis/v1beta1/jobs/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ApiJob', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_jobs(self, **kwargs): # noqa: E501
"""Finds all jobs. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_jobs(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str page_token: A page token to request the next page of results. The token is acquired from the nextPageToken field of the response from the previous ListJobs call or can be omitted when fetching the first page.
:param int page_size: The number of jobs to be listed per page. If there are more jobs than this number, the response message will contain a nextPageToken field you can use to fetch the next page.
:param str sort_by: Can be in the format \"field_name\", \"field_name asc\" or \"field_name desc\". Ascending by default.
:param str resource_reference_key_type: The type of the resource being referred to.
:param str resource_reference_key_id: The ID of the resource being referred to.
:param str filter: A URL-encoded, JSON-serialized Filter protocol buffer (see [filter.proto](https://github.com/kubeflow/pipelines/blob/master/backend/api/filter.proto)).
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: ApiListJobsResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.list_jobs_with_http_info(**kwargs) # noqa: E501
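As the `page_token`/`nextPageToken` description above implies, listing every job means looping until the server stops returning a token. A minimal sketch of that loop, using a hypothetical in-memory page function in place of `api.list_jobs` (field names mirror the docstring, not a live API):

```python
def iter_all_jobs(list_page, page_size=2):
    """Yield jobs across pages by chasing next_page_token until it is empty."""
    token = None
    while True:
        page = list_page(page_token=token, page_size=page_size)
        for job in page["jobs"]:
            yield job
        token = page.get("next_page_token")
        if not token:
            break

def fake_list_jobs(page_token=None, page_size=2):
    # Hypothetical stand-in for api.list_jobs, returning dict pages.
    jobs = ["job-a", "job-b", "job-c", "job-d", "job-e"]
    start = int(page_token) if page_token else 0
    end = start + page_size
    return {
        "jobs": jobs[start:end],
        "next_page_token": str(end) if end < len(jobs) else None,
    }

all_jobs = list(iter_all_jobs(fake_list_jobs))
print(all_jobs)  # ['job-a', 'job-b', 'job-c', 'job-d', 'job-e']
```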
def list_jobs_with_http_info(self, **kwargs): # noqa: E501
"""Finds all jobs. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_jobs_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str page_token: A page token to request the next page of results. The token is acquired from the nextPageToken field of the response from the previous ListJobs call or can be omitted when fetching the first page.
:param int page_size: The number of jobs to be listed per page. If there are more jobs than this number, the response message will contain a nextPageToken field you can use to fetch the next page.
:param str sort_by: Can be in the format \"field_name\", \"field_name asc\" or \"field_name desc\". Ascending by default.
:param str resource_reference_key_type: The type of the resource being referred to.
:param str resource_reference_key_id: The ID of the resource being referred to.
:param str filter: A URL-encoded, JSON-serialized Filter protocol buffer (see [filter.proto](https://github.com/kubeflow/pipelines/blob/master/backend/api/filter.proto)).
:param _return_http_data_only: return response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(ApiListJobsResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'page_token',
'page_size',
'sort_by',
'resource_reference_key_type',
'resource_reference_key_id',
'filter'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_jobs" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'page_token' in local_var_params and local_var_params['page_token'] is not None: # noqa: E501
query_params.append(('page_token', local_var_params['page_token'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page_size', local_var_params['page_size'])) # noqa: E501
if 'sort_by' in local_var_params and local_var_params['sort_by'] is not None: # noqa: E501
query_params.append(('sort_by', local_var_params['sort_by'])) # noqa: E501
if 'resource_reference_key_type' in local_var_params and local_var_params['resource_reference_key_type'] is not None: # noqa: E501
query_params.append(('resource_reference_key.type', local_var_params['resource_reference_key_type'])) # noqa: E501
if 'resource_reference_key_id' in local_var_params and local_var_params['resource_reference_key_id'] is not None: # noqa: E501
query_params.append(('resource_reference_key.id', local_var_params['resource_reference_key_id'])) # noqa: E501
if 'filter' in local_var_params and local_var_params['filter'] is not None: # noqa: E501
query_params.append(('filter', local_var_params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/apis/v1beta1/jobs', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ApiListJobsResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
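The run of `if 'x' in local_var_params and ... is not None` checks above simply translates optional Python arguments into `(wire_name, value)` query tuples, renaming where the wire name differs (e.g. `resource_reference_key_type` becomes `resource_reference_key.type`). A compact sketch of that translation (the helper name is illustrative):

```python
def build_query_params(local_params, name_map):
    """Collect non-None optional params as (wire_name, value) tuples."""
    query = []
    for py_name, wire_name in name_map:
        value = local_params.get(py_name)
        if value is not None:
            query.append((wire_name, value))
    return query

name_map = [
    ("page_token", "page_token"),
    ("page_size", "page_size"),
    ("sort_by", "sort_by"),
    ("resource_reference_key_type", "resource_reference_key.type"),
    ("resource_reference_key_id", "resource_reference_key.id"),
    ("filter", "filter"),
]
q = build_query_params(
    {"page_size": 10, "sort_by": None, "resource_reference_key_type": "EXPERIMENT"},
    name_map,
)
print(q)  # [('page_size', 10), ('resource_reference_key.type', 'EXPERIMENT')]
```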
| 44.420561 | 227 | 0.591536 | 3,849 | 33,271 | 4.887243 | 0.075604 | 0.041678 | 0.064005 | 0.028707 | 0.91489 | 0.904152 | 0.895168 | 0.888895 | 0.881771 | 0.863378 | 0 | 0.013634 | 0.334255 | 33,271 | 748 | 228 | 44.479947 | 0.835621 | 0.478615 | 0 | 0.706553 | 0 | 0 | 0.164794 | 0.049856 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0 | 0.014245 | 0 | 0.088319 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b91e10587d9826b4de9fa5c28cc63c5797893925 | 195 | py | Python | robot-server/robot_server/service/legacy/models/__init__.py | faliester/opentrons | e945d0f72fed39b0f68c0b30b7afd1981644184f | [
"Apache-2.0"
] | 1 | 2022-03-17T20:38:04.000Z | 2022-03-17T20:38:04.000Z | robot-server/robot_server/service/legacy/models/__init__.py | faliester/opentrons | e945d0f72fed39b0f68c0b30b7afd1981644184f | [
"Apache-2.0"
] | null | null | null | robot-server/robot_server/service/legacy/models/__init__.py | faliester/opentrons | e945d0f72fed39b0f68c0b30b7afd1981644184f | [
"Apache-2.0"
] | null | null | null | from pydantic import BaseModel, Field
class V1BasicResponse(BaseModel):
"""A response with a human readable message"""
message: str = Field(..., description="A human-readable message")
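The `V1BasicResponse` model above is a one-field pydantic model; instantiating it validates that `message` is present and is a string. A minimal usage sketch (the message text is an arbitrary example; attribute access works the same under pydantic v1 and v2):

```python
from pydantic import BaseModel, Field

class V1BasicResponse(BaseModel):
    """A response with a human readable message"""
    message: str = Field(..., description="A human-readable message")

resp = V1BasicResponse(message="Robot is ready")
print(resp.message)  # Robot is ready
```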
| 27.857143 | 69 | 0.728205 | 23 | 195 | 6.173913 | 0.652174 | 0.084507 | 0.197183 | 0.295775 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006098 | 0.158974 | 195 | 6 | 70 | 32.5 | 0.859756 | 0.205128 | 0 | 0 | 0 | 0 | 0.161074 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b92b1b9bff58027ea43739a49d3c1bb4c1f53e40 | 7,052 | py | Python | CondTools/L1Trigger/python/L1RSSubsystemParams_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 852 | 2015-01-11T21:03:51.000Z | 2022-03-25T21:14:00.000Z | CondTools/L1Trigger/python/L1RSSubsystemParams_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 30,371 | 2015-01-02T00:14:40.000Z | 2022-03-31T23:26:05.000Z | CondTools/L1Trigger/python/L1RSSubsystemParams_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 3,240 | 2015-01-02T05:53:18.000Z | 2022-03-31T17:24:21.000Z | def initL1RSSubsystems( tagBaseVec = [],
L1MuDTTFMasksRcdKey = 'dummy',
L1MuGMTChannelMaskRcdKey = 'dummy',
L1RCTChannelMaskRcdKey = 'dummy',
L1RCTNoisyChannelMaskRcdKey = 'dummy',
L1GctChannelMaskRcdKey = 'dummy',
L1GtPrescaleFactorsAlgoTrigRcdKey = 'dummy',
L1GtPrescaleFactorsTechTrigRcdKey = 'dummy',
L1GtTriggerMaskAlgoTrigRcdKey = 'dummy',
L1GtTriggerMaskTechTrigRcdKey = 'dummy',
L1GtTriggerMaskVetoTechTrigRcdKey = 'dummy',
includeL1RCTNoisyChannelMask = True):
import FWCore.ParameterSet.Config as cms
from CondTools.L1Trigger.L1CondEnum_cfi import L1CondEnum
if includeL1RCTNoisyChannelMask == True:
initL1RSSubsystems.params = cms.PSet(
recordInfo = cms.VPSet(
cms.PSet(
record = cms.string('L1MuDTTFMasksRcd'),
tag = cms.string('L1MuDTTFMasks_' + tagBaseVec[ L1CondEnum.L1MuDTTFMasks ]),
type = cms.string('L1MuDTTFMasks'),
key = cms.string(L1MuDTTFMasksRcdKey)
),
cms.PSet(
record = cms.string('L1MuGMTChannelMaskRcd'),
tag = cms.string('L1MuGMTChannelMask_' + tagBaseVec[ L1CondEnum.L1MuGMTChannelMask ]),
type = cms.string('L1MuGMTChannelMask'),
key = cms.string(L1MuGMTChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1RCTChannelMaskRcd'),
tag = cms.string('L1RCTChannelMask_' + tagBaseVec[ L1CondEnum.L1RCTChannelMask ]),
type = cms.string('L1RCTChannelMask'),
key = cms.string(L1RCTChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1RCTNoisyChannelMaskRcd'),
tag = cms.string('L1RCTNoisyChannelMask_' + tagBaseVec[ L1CondEnum.L1RCTNoisyChannelMask ]),
type = cms.string('L1RCTNoisyChannelMask'),
key = cms.string(L1RCTNoisyChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1GctChannelMaskRcd'),
tag = cms.string('L1GctChannelMask_' + tagBaseVec[ L1CondEnum.L1GctChannelMask ]),
type = cms.string('L1GctChannelMask'),
key = cms.string(L1GctChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1GtPrescaleFactorsAlgoTrigRcd'),
tag = cms.string('L1GtPrescaleFactorsAlgoTrig_' + tagBaseVec[ L1CondEnum.L1GtPrescaleFactorsAlgoTrig ]),
type = cms.string('L1GtPrescaleFactors'),
key = cms.string(L1GtPrescaleFactorsAlgoTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtPrescaleFactorsTechTrigRcd'),
tag = cms.string('L1GtPrescaleFactorsTechTrig_' + tagBaseVec[ L1CondEnum.L1GtPrescaleFactorsTechTrig ]),
type = cms.string('L1GtPrescaleFactors'),
key = cms.string(L1GtPrescaleFactorsTechTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtTriggerMaskAlgoTrigRcd'),
tag = cms.string('L1GtTriggerMaskAlgoTrig_' + tagBaseVec[ L1CondEnum.L1GtTriggerMaskAlgoTrig ]),
type = cms.string('L1GtTriggerMask'),
key = cms.string(L1GtTriggerMaskAlgoTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtTriggerMaskTechTrigRcd'),
tag = cms.string('L1GtTriggerMaskTechTrig_' + tagBaseVec[ L1CondEnum.L1GtTriggerMaskTechTrig ]),
type = cms.string('L1GtTriggerMask'),
key = cms.string(L1GtTriggerMaskTechTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtTriggerMaskVetoTechTrigRcd'),
tag = cms.string('L1GtTriggerMaskVetoTechTrig_' + tagBaseVec[ L1CondEnum.L1GtTriggerMaskVetoTechTrig ]),
type = cms.string('L1GtTriggerMask'),
key = cms.string(L1GtTriggerMaskVetoTechTrigRcdKey)
))
)
else:
initL1RSSubsystems.params = cms.PSet(
recordInfo = cms.VPSet(
cms.PSet(
record = cms.string('L1MuDTTFMasksRcd'),
tag = cms.string('L1MuDTTFMasks_' + tagBaseVec[ L1CondEnum.L1MuDTTFMasks ]),
type = cms.string('L1MuDTTFMasks'),
key = cms.string(L1MuDTTFMasksRcdKey)
),
cms.PSet(
record = cms.string('L1MuGMTChannelMaskRcd'),
tag = cms.string('L1MuGMTChannelMask_' + tagBaseVec[ L1CondEnum.L1MuGMTChannelMask ]),
type = cms.string('L1MuGMTChannelMask'),
key = cms.string(L1MuGMTChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1RCTChannelMaskRcd'),
tag = cms.string('L1RCTChannelMask_' + tagBaseVec[ L1CondEnum.L1RCTChannelMask ]),
type = cms.string('L1RCTChannelMask'),
key = cms.string(L1RCTChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1GctChannelMaskRcd'),
tag = cms.string('L1GctChannelMask_' + tagBaseVec[ L1CondEnum.L1GctChannelMask ]),
type = cms.string('L1GctChannelMask'),
key = cms.string(L1GctChannelMaskRcdKey)
),
cms.PSet(
record = cms.string('L1GtPrescaleFactorsAlgoTrigRcd'),
tag = cms.string('L1GtPrescaleFactorsAlgoTrig_' + tagBaseVec[ L1CondEnum.L1GtPrescaleFactorsAlgoTrig ]),
type = cms.string('L1GtPrescaleFactors'),
key = cms.string(L1GtPrescaleFactorsAlgoTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtPrescaleFactorsTechTrigRcd'),
tag = cms.string('L1GtPrescaleFactorsTechTrig_' + tagBaseVec[ L1CondEnum.L1GtPrescaleFactorsTechTrig ]),
type = cms.string('L1GtPrescaleFactors'),
key = cms.string(L1GtPrescaleFactorsTechTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtTriggerMaskAlgoTrigRcd'),
tag = cms.string('L1GtTriggerMaskAlgoTrig_' + tagBaseVec[ L1CondEnum.L1GtTriggerMaskAlgoTrig ]),
type = cms.string('L1GtTriggerMask'),
key = cms.string(L1GtTriggerMaskAlgoTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtTriggerMaskTechTrigRcd'),
tag = cms.string('L1GtTriggerMaskTechTrig_' + tagBaseVec[ L1CondEnum.L1GtTriggerMaskTechTrig ]),
type = cms.string('L1GtTriggerMask'),
key = cms.string(L1GtTriggerMaskTechTrigRcdKey)
),
cms.PSet(
record = cms.string('L1GtTriggerMaskVetoTechTrigRcd'),
tag = cms.string('L1GtTriggerMaskVetoTechTrig_' + tagBaseVec[ L1CondEnum.L1GtTriggerMaskVetoTechTrig ]),
type = cms.string('L1GtTriggerMask'),
key = cms.string(L1GtTriggerMaskVetoTechTrigRcdKey)
))
)
| 50.733813 | 116 | 0.597703 | 455 | 7,052 | 9.21978 | 0.131868 | 0.163051 | 0.05888 | 0.072467 | 0.838141 | 0.838141 | 0.838141 | 0.838141 | 0.838141 | 0.838141 | 0 | 0.026862 | 0.303176 | 7,052 | 138 | 117 | 51.101449 | 0.826821 | 0 | 0 | 0.838235 | 0 | 0 | 0.175978 | 0.093165 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007353 | false | 0 | 0.014706 | 0 | 0.022059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b944a104be769e58abf64f1e7b1695656b6af93c | 8,716 | py | Python | rpi_d3m_primitives/featSelect/combineJointDist.py | naiyuyin/rpi_d3m_primitives | f95553bee90916d241885d28fb71c8167116d9fa | [
"MIT"
] | 1 | 2019-05-02T21:05:27.000Z | 2019-05-02T21:05:27.000Z | rpi_d3m_primitives/featSelect/combineJointDist.py | naiyuyin/rpi_d3m_primitives | f95553bee90916d241885d28fb71c8167116d9fa | [
"MIT"
] | 1 | 2021-03-18T15:52:27.000Z | 2021-03-26T17:54:04.000Z | rpi_d3m_primitives/featSelect/combineJointDist.py | naiyuyin/rpi_d3m_primitives | f95553bee90916d241885d28fb71c8167116d9fa | [
"MIT"
] | null | null | null | from rpi_d3m_primitives.featSelect.statsFunctions import get_num_value_HypoTest,getPvalue_Chisquare,getSquaredError
import numpy as np
def combine_joint_dist_table(joint_dist, X, Y, part_X, part_Y, hm_HypoTest, sig_level=0.15):
flag_vertical = 0
hm_sx = len(part_X) - 1
hm_sy = len(part_Y) - 1
joint_table = joint_dist
for i in range(hm_sx):
if hm_sx == max(part_X):
break
if part_X[i+1] == max(part_X) and part_X[-2] != max(part_X):
Len = part_X[i+1] - part_X[i] + 1
prob = np.zeros([Len, 1])
space_x = np.arange(part_X[i], part_X[i+1]+1)
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y >= x]
num_sampx = get_indexes(part_X[i], X)
num_sampx = np.array(num_sampx)
for j in range(max(part_Y)):
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y == x]
num_sampy = get_indexes(j+1, Y)
num_sampy = np.array(num_sampy)
joint_idx = np.intersect1d(num_sampx, num_sampy)
hm_x, hm_y = get_num_value_HypoTest(X, Y, part_X, part_Y)
joint_idx = joint_idx.astype(int)
p_val = getPvalue_Chisquare(X[joint_idx], Y[joint_idx], hm_x, 1)
if p_val > sig_level:
flag_vertical = 1
continue
else:
#hm_HypoTest = hm_HypoTest - 1
prob = np.zeros((1,Len))
for k in range(Len):
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y == x]
temp = get_indexes(part_X[i]+k, X[joint_idx])
prob[0,k] = len(temp)/len(X)
rem_prob = 1 - sum(prob[0])
if rem_prob != 0 :
joint_table[space_x-1, j] = 0
joint_table = rem_prob / sum(joint_table) * joint_table
joint_table[space_x-1, j] = prob
flag_vertical = 1
continue
if part_X[i+1] - part_X[i] >= 2:
Len = part_X[i+1] - part_X[i]
prob = np.zeros([Len, 1])
space_x = np.arange(part_X[i], part_X[i+1]+1)
flat_X = X.flatten()
num_sampx = np.argwhere((flat_X>=part_X[i]) & (flat_X<part_X[i+1])).flatten()
space_x = space_x[:-1]
for j in range(max(part_Y)):
num_sampy = np.argwhere(Y.flatten()==j+1).flatten()
joint_idx = np.intersect1d(num_sampx, num_sampy)
hm_x, hm_y = get_num_value_HypoTest(X, Y, part_X, part_Y)
p_val = getPvalue_Chisquare(X[joint_idx], Y[joint_idx], hm_x, 1)
if p_val > sig_level:
flag_vertical = 1
continue
else:
#hm_HypoTest = hm_HypoTest - 1
prob = np.zeros((1,Len))
flat_joint = X[joint_idx].flatten()
for k in range(Len):
temp = np.argwhere(flat_joint == part_X[i] + k).flatten()
prob[0,k] = len(temp)/len(X)
rem_prob = 1 - sum(prob[0])
if (rem_prob != 0 and any(np.isnan(sum(joint_table))) and any(sum(joint_table)==0)):
joint_table[space_x-1, j] = 0
joint_table = rem_prob/sum(joint_table)*joint_table
joint_table[space_x-1, j] = prob
flag_vertical = 1
#Horizontal search
if flag_vertical :
results = []
results.append(joint_table)
results.append(hm_HypoTest)
results.append(joint_table)
results.append(hm_HypoTest)
return results
for i in range(hm_sy):
if flag_vertical == 1:
break
if part_Y[i+1] == max(part_Y) and part_Y[-2] != max(part_Y):
Len = part_Y[i+1] - part_Y[i] + 1
prob = np.zeros([1, Len])
space_y = np.arange(part_Y[i], part_Y[i+1]+1)
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y >= x]
num_sampy = get_indexes(part_Y[i], Y)
num_sampy = np.array(num_sampy)
for j in range(max(part_X)):
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y == x]
num_sampx = get_indexes(j+1, X)
num_sampx = np.array(num_sampx)
joint_idx = np.intersect1d(num_sampx, num_sampy)
hm_x, hm_y = get_num_value_HypoTest(X,Y,part_X, part_Y)
p_val = getPvalue_Chisquare(X[joint_idx], Y[joint_idx], 1, hm_y)
if p_val > sig_level:
continue
else:
#hm_HypoTest = hm_HypoTest - 1
prob = np.zeros((1,Len))
for k in range(Len):
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y == x]
temp = get_indexes(part_Y[i] + k, Y[joint_idx])
prob[0,k] = len(temp)/len(X)
rem_prob = 1 - sum(prob[0])
if rem_prob != 0:
joint_table[j, space_y-1] = 0
joint_table = rem_prob / sum(joint_table) * joint_table
joint_table[j, space_y-1] = prob
continue
if part_Y[i+1] - part_Y[i] >= 2:
Len = part_Y[i+1] - part_Y[i]
prob = np.zeros((1, Len))
space_y = np.arange(part_Y[i], part_Y[i+1]+1)
get_indexes = lambda x1, x2, xs: [i for (y, i) in zip(xs, range(len(xs))) if y >= x1 and y < x2]
num_sampy = get_indexes(part_Y[i], part_Y[i+1], Y)
num_sampy = np.array(num_sampy)
space_y = space_y[:-1]
for j in range(max(part_X)):
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y == x]
num_sampx = get_indexes(j+1, X)
num_sampx = np.array(num_sampx)
joint_idx = np.intersect1d(num_sampx, num_sampy)
hm_x, hm_y = get_num_value_HypoTest(X,Y,part_X, part_Y)
p_val = getPvalue_Chisquare(X[joint_idx], Y[joint_idx], 1, hm_y)
if p_val > sig_level:
continue
else:
#hm_HypoTest = hm_HypoTest - 1
prob = np.zeros((1,Len))
for k in range(Len):
get_indexes = lambda x, xs: [i for (y, i) in zip(xs, range(len(xs))) if y == x]
temp = get_indexes(part_Y[i] + k, Y[joint_idx])
prob[0,k] = len(temp)/len(X)
rem_prob = 1 - sum(prob[0])
if rem_prob != 0:
joint_table[j, space_y-1] = 0
joint_table = rem_prob / sum(joint_table) * joint_table
joint_table[j, space_y-1] = prob
m = max(part_X)
n = max(part_Y)
flag_vertical = 0
for i in range(m):
index = np.argwhere(X==i+1).T[0]
p_val = getPvalue_Chisquare(X[index], Y[index], 1, n)
if p_val > sig_level:
total_prob = len(index)/len(X)
rem_prob = 1 - total_prob
if rem_prob != 0:
joint_table[i, 1:n] = 0
joint_table = rem_prob/sum(joint_table)*joint_table
joint_table[i, 1:n] = total_prob / n
flag_vertical = 1
if flag_vertical:
results = []
results.append(joint_table)
results.append(hm_HypoTest)
return results
for i in range(n):
index = np.argwhere(Y==i+1).T[0]
p_val = getPvalue_Chisquare(X[index], Y[index], 1, m)
if p_val > sig_level:
total_prob = len(index)/len(Y)
rem_prob = 1 - total_prob
if rem_prob != 0:
joint_table[1:m, i] = 0
joint_table = rem_prob/sum(joint_table)*joint_table
joint_table[1:m, i] = total_prob / m
results = []
results.append(joint_table)
results.append(hm_HypoTest)
return results
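The core indexing trick used throughout `combine_joint_dist_table` selects the samples where X falls in one partition cell and Y takes a given value by intersecting the two index sets with `np.intersect1d`, then estimates the joint probability from the count. A small self-contained illustration with made-up data:

```python
import numpy as np

X = np.array([1, 2, 2, 3, 2, 1])
Y = np.array([1, 1, 2, 2, 2, 1])

idx_x = np.argwhere(X == 2).flatten()     # sample indices with X == 2
idx_y = np.argwhere(Y == 2).flatten()     # sample indices with Y == 2
joint_idx = np.intersect1d(idx_x, idx_y)  # samples with X == 2 and Y == 2
print(joint_idx)                          # [2 4]
print(len(joint_idx) / len(X))            # empirical joint probability, 2 of 6 samples
```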
| 45.633508 | 116 | 0.47648 | 1,210 | 8,716 | 3.191736 | 0.071074 | 0.095805 | 0.024858 | 0.062144 | 0.815381 | 0.790782 | 0.787416 | 0.748576 | 0.719057 | 0.719057 | 0 | 0.022057 | 0.41223 | 8,716 | 190 | 117 | 45.873684 | 0.731798 | 0.015259 | 0 | 0.706587 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005988 | false | 0 | 0.011976 | 0 | 0.035928 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b948f5378907d2d18ac226c427dab98ba52bda9c | 80 | py | Python | escalate/core/models/__init__.py | darkreactions/ESCALATE | 0020da00b81a2dd80d1c9fd72d2edf92b519e605 | [
"MIT"
] | 11 | 2020-09-29T13:59:02.000Z | 2022-03-23T04:57:52.000Z | escalate/core/models/__init__.py | darkreactions/ESCALATE | 0020da00b81a2dd80d1c9fd72d2edf92b519e605 | [
"MIT"
] | 95 | 2019-11-18T20:10:49.000Z | 2022-03-31T17:09:49.000Z | escalate/core/models/__init__.py | darkreactions/ESCALATE | 0020da00b81a2dd80d1c9fd72d2edf92b519e605 | [
"MIT"
] | 2 | 2021-11-26T18:22:08.000Z | 2022-03-31T11:57:10.000Z | from .app_tables import *
from .core_tables import *
from .view_tables import *
# yolov5/__init__.py (Lynda-Starkus/DeepvisionAI_Backend, MIT)
from yolov5.helpers import YOLOv5
from yolov5.helpers import load_model as load
__version__ = "5.0.7"
# sdk/recoveryservices/azure-mgmt-recoveryservicesbackup/azure/mgmt/recoveryservicesbackup/models/_models_py3.py (anuchandy/azure-sdk-for-python, MIT)
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
from msrest.serialization import Model
class FeatureSupportRequest(Model):
    """Base class for feature request.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureBackupGoalFeatureSupportRequest,
    AzureVMResourceFeatureSupportRequest

    All required parameters must be populated in order to send to Azure.

    :param feature_type: Required. Constant filled by server.
    :type feature_type: str
    """

    _validation = {
        'feature_type': {'required': True},
    }

    _attribute_map = {
        'feature_type': {'key': 'featureType', 'type': 'str'},
    }

    _subtype_map = {
        'feature_type': {'AzureBackupGoals': 'AzureBackupGoalFeatureSupportRequest', 'AzureVMResourceBackup': 'AzureVMResourceFeatureSupportRequest'}
    }

    def __init__(self, **kwargs) -> None:
        super(FeatureSupportRequest, self).__init__(**kwargs)
        self.feature_type = None
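`FeatureSupportRequest` relies on msrest's `_subtype_map` for polymorphic deserialization: the server-filled discriminator (`feature_type`) is looked up in the map, and the matching subclass is instantiated. A minimal plain-Python sketch of that dispatch pattern (the `deserialize` helper and registry are illustrative, not the msrest API):

```python
class Base:
    # discriminator value on the wire -> subclass name
    _subtype_map = {}
    # subclass name -> class object
    registry = {}

class Goal(Base):
    pass

Base._subtype_map = {'AzureBackupGoals': 'Goal'}
Base.registry = {'Goal': Goal}

def deserialize(data):
    """Pick the concrete class from the discriminator field,
    the way msrest uses _subtype_map on 'featureType'."""
    name = Base._subtype_map[data['featureType']]
    return Base.registry[name]()

obj = deserialize({'featureType': 'AzureBackupGoals'})
```

This is why the base `__init__` sets `self.feature_type = None` while each subclass pins it to its own constant: the constant is what the serializer writes as the discriminator.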
class AzureBackupGoalFeatureSupportRequest(FeatureSupportRequest):
    """Azure backup goal feature specific request.

    All required parameters must be populated in order to send to Azure.

    :param feature_type: Required. Constant filled by server.
    :type feature_type: str
    """

    _validation = {
        'feature_type': {'required': True},
    }

    _attribute_map = {
        'feature_type': {'key': 'featureType', 'type': 'str'},
    }

    def __init__(self, **kwargs) -> None:
        super(AzureBackupGoalFeatureSupportRequest, self).__init__(**kwargs)
        self.feature_type = 'AzureBackupGoals'
class ProtectionContainer(Model):
    """Base class for container with backup items. Containers with specific
    workloads are derived from this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureSqlContainer, AzureStorageContainer,
    AzureWorkloadContainer, DpmContainer, GenericContainer, IaaSVMContainer,
    MabContainer

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the container.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the
     container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
     'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Status of registration of the container with
     the Recovery Services Vault.
    :type registration_status: str
    :param health_status: Status of health of the container.
    :type health_status: str
    :param container_type: Required. Constant filled by server.
    :type container_type: str
    """

    _validation = {
        'container_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'container_type': {'key': 'containerType', 'type': 'str'},
    }

    _subtype_map = {
        'container_type': {'AzureSqlContainer': 'AzureSqlContainer', 'StorageContainer': 'AzureStorageContainer', 'AzureWorkloadContainer': 'AzureWorkloadContainer', 'DPMContainer': 'DpmContainer', 'GenericContainer': 'GenericContainer', 'IaaSVMContainer': 'IaaSVMContainer', 'Windows': 'MabContainer'}
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, **kwargs) -> None:
        super(ProtectionContainer, self).__init__(**kwargs)
        self.friendly_name = friendly_name
        self.backup_management_type = backup_management_type
        self.registration_status = registration_status
        self.health_status = health_status
        self.container_type = None
class DpmContainer(ProtectionContainer):
    """DPM workload-specific protection container.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureBackupServerContainer

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the container.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the
     container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
     'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Status of registration of the container with
     the Recovery Services Vault.
    :type registration_status: str
    :param health_status: Status of health of the container.
    :type health_status: str
    :param container_type: Required. Constant filled by server.
    :type container_type: str
    :param can_re_register: Specifies whether the container is re-registrable.
    :type can_re_register: bool
    :param container_id: ID of container.
    :type container_id: str
    :param protected_item_count: Number of protected items in the BackupEngine
    :type protected_item_count: long
    :param dpm_agent_version: Backup engine Agent version
    :type dpm_agent_version: str
    :param dpm_servers: List of BackupEngines protecting the container
    :type dpm_servers: list[str]
    :param upgrade_available: To check if upgrade available
    :type upgrade_available: bool
    :param protection_status: Protection status of the container.
    :type protection_status: str
    :param extended_info: Extended Info of the container.
    :type extended_info:
     ~azure.mgmt.recoveryservicesbackup.models.DPMContainerExtendedInfo
    """

    _validation = {
        'container_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'container_type': {'key': 'containerType', 'type': 'str'},
        'can_re_register': {'key': 'canReRegister', 'type': 'bool'},
        'container_id': {'key': 'containerId', 'type': 'str'},
        'protected_item_count': {'key': 'protectedItemCount', 'type': 'long'},
        'dpm_agent_version': {'key': 'dpmAgentVersion', 'type': 'str'},
        'dpm_servers': {'key': 'dpmServers', 'type': '[str]'},
        'upgrade_available': {'key': 'upgradeAvailable', 'type': 'bool'},
        'protection_status': {'key': 'protectionStatus', 'type': 'str'},
        'extended_info': {'key': 'extendedInfo', 'type': 'DPMContainerExtendedInfo'},
    }

    _subtype_map = {
        'container_type': {'AzureBackupServerContainer': 'AzureBackupServerContainer'}
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, can_re_register: bool=None, container_id: str=None, protected_item_count: int=None, dpm_agent_version: str=None, dpm_servers=None, upgrade_available: bool=None, protection_status: str=None, extended_info=None, **kwargs) -> None:
        super(DpmContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
        self.can_re_register = can_re_register
        self.container_id = container_id
        self.protected_item_count = protected_item_count
        self.dpm_agent_version = dpm_agent_version
        self.dpm_servers = dpm_servers
        self.upgrade_available = upgrade_available
        self.protection_status = protection_status
        self.extended_info = extended_info
        self.container_type = 'DPMContainer'
class AzureBackupServerContainer(DpmContainer):
    """AzureBackupServer (DPMVenus) workload-specific protection container.

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the container.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the
     container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
     'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Status of registration of the container with
     the Recovery Services Vault.
    :type registration_status: str
    :param health_status: Status of health of the container.
    :type health_status: str
    :param container_type: Required. Constant filled by server.
    :type container_type: str
    :param can_re_register: Specifies whether the container is re-registrable.
    :type can_re_register: bool
    :param container_id: ID of container.
    :type container_id: str
    :param protected_item_count: Number of protected items in the BackupEngine
    :type protected_item_count: long
    :param dpm_agent_version: Backup engine Agent version
    :type dpm_agent_version: str
    :param dpm_servers: List of BackupEngines protecting the container
    :type dpm_servers: list[str]
    :param upgrade_available: To check if upgrade available
    :type upgrade_available: bool
    :param protection_status: Protection status of the container.
    :type protection_status: str
    :param extended_info: Extended Info of the container.
    :type extended_info:
     ~azure.mgmt.recoveryservicesbackup.models.DPMContainerExtendedInfo
    """

    _validation = {
        'container_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'container_type': {'key': 'containerType', 'type': 'str'},
        'can_re_register': {'key': 'canReRegister', 'type': 'bool'},
        'container_id': {'key': 'containerId', 'type': 'str'},
        'protected_item_count': {'key': 'protectedItemCount', 'type': 'long'},
        'dpm_agent_version': {'key': 'dpmAgentVersion', 'type': 'str'},
        'dpm_servers': {'key': 'dpmServers', 'type': '[str]'},
        'upgrade_available': {'key': 'upgradeAvailable', 'type': 'bool'},
        'protection_status': {'key': 'protectionStatus', 'type': 'str'},
        'extended_info': {'key': 'extendedInfo', 'type': 'DPMContainerExtendedInfo'},
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, can_re_register: bool=None, container_id: str=None, protected_item_count: int=None, dpm_agent_version: str=None, dpm_servers=None, upgrade_available: bool=None, protection_status: str=None, extended_info=None, **kwargs) -> None:
        super(AzureBackupServerContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, can_re_register=can_re_register, container_id=container_id, protected_item_count=protected_item_count, dpm_agent_version=dpm_agent_version, dpm_servers=dpm_servers, upgrade_available=upgrade_available, protection_status=protection_status, extended_info=extended_info, **kwargs)
        self.container_type = 'AzureBackupServerContainer'
class BackupEngineBase(Model):
    """The base backup engine class. All workload specific backup engines derive
    from this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureBackupServerEngine, DpmBackupEngine

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the backup engine.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the backup
     engine. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
     'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Registration status of the backup engine with
     the Recovery Services Vault.
    :type registration_status: str
    :param backup_engine_state: Status of the backup engine with the Recovery
     Services Vault. = {Active/Deleting/DeleteFailed}
    :type backup_engine_state: str
    :param health_status: Backup status of the backup engine.
    :type health_status: str
    :param can_re_register: Flag indicating if the backup engine be
     registered, once already registered.
    :type can_re_register: bool
    :param backup_engine_id: ID of the backup engine.
    :type backup_engine_id: str
    :param dpm_version: Backup engine version
    :type dpm_version: str
    :param azure_backup_agent_version: Backup agent version
    :type azure_backup_agent_version: str
    :param is_azure_backup_agent_upgrade_available: To check if backup agent
     upgrade available
    :type is_azure_backup_agent_upgrade_available: bool
    :param is_dpm_upgrade_available: To check if backup engine upgrade
     available
    :type is_dpm_upgrade_available: bool
    :param extended_info: Extended info of the backupengine
    :type extended_info:
     ~azure.mgmt.recoveryservicesbackup.models.BackupEngineExtendedInfo
    :param backup_engine_type: Required. Constant filled by server.
    :type backup_engine_type: str
    """

    _validation = {
        'backup_engine_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'backup_engine_state': {'key': 'backupEngineState', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'can_re_register': {'key': 'canReRegister', 'type': 'bool'},
        'backup_engine_id': {'key': 'backupEngineId', 'type': 'str'},
        'dpm_version': {'key': 'dpmVersion', 'type': 'str'},
        'azure_backup_agent_version': {'key': 'azureBackupAgentVersion', 'type': 'str'},
        'is_azure_backup_agent_upgrade_available': {'key': 'isAzureBackupAgentUpgradeAvailable', 'type': 'bool'},
        'is_dpm_upgrade_available': {'key': 'isDpmUpgradeAvailable', 'type': 'bool'},
        'extended_info': {'key': 'extendedInfo', 'type': 'BackupEngineExtendedInfo'},
        'backup_engine_type': {'key': 'backupEngineType', 'type': 'str'},
    }

    _subtype_map = {
        'backup_engine_type': {'AzureBackupServerEngine': 'AzureBackupServerEngine', 'DpmBackupEngine': 'DpmBackupEngine'}
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, backup_engine_state: str=None, health_status: str=None, can_re_register: bool=None, backup_engine_id: str=None, dpm_version: str=None, azure_backup_agent_version: str=None, is_azure_backup_agent_upgrade_available: bool=None, is_dpm_upgrade_available: bool=None, extended_info=None, **kwargs) -> None:
        super(BackupEngineBase, self).__init__(**kwargs)
        self.friendly_name = friendly_name
        self.backup_management_type = backup_management_type
        self.registration_status = registration_status
        self.backup_engine_state = backup_engine_state
        self.health_status = health_status
        self.can_re_register = can_re_register
        self.backup_engine_id = backup_engine_id
        self.dpm_version = dpm_version
        self.azure_backup_agent_version = azure_backup_agent_version
        self.is_azure_backup_agent_upgrade_available = is_azure_backup_agent_upgrade_available
        self.is_dpm_upgrade_available = is_dpm_upgrade_available
        self.extended_info = extended_info
        self.backup_engine_type = None
class AzureBackupServerEngine(BackupEngineBase):
    """Backup engine type when Azure Backup Server is used to manage the backups.

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the backup engine.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the backup
     engine. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
     'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Registration status of the backup engine with
     the Recovery Services Vault.
    :type registration_status: str
    :param backup_engine_state: Status of the backup engine with the Recovery
     Services Vault. = {Active/Deleting/DeleteFailed}
    :type backup_engine_state: str
    :param health_status: Backup status of the backup engine.
    :type health_status: str
    :param can_re_register: Flag indicating if the backup engine be
     registered, once already registered.
    :type can_re_register: bool
    :param backup_engine_id: ID of the backup engine.
    :type backup_engine_id: str
    :param dpm_version: Backup engine version
    :type dpm_version: str
    :param azure_backup_agent_version: Backup agent version
    :type azure_backup_agent_version: str
    :param is_azure_backup_agent_upgrade_available: To check if backup agent
     upgrade available
    :type is_azure_backup_agent_upgrade_available: bool
    :param is_dpm_upgrade_available: To check if backup engine upgrade
     available
    :type is_dpm_upgrade_available: bool
    :param extended_info: Extended info of the backupengine
    :type extended_info:
     ~azure.mgmt.recoveryservicesbackup.models.BackupEngineExtendedInfo
    :param backup_engine_type: Required. Constant filled by server.
    :type backup_engine_type: str
    """

    _validation = {
        'backup_engine_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'backup_engine_state': {'key': 'backupEngineState', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'can_re_register': {'key': 'canReRegister', 'type': 'bool'},
        'backup_engine_id': {'key': 'backupEngineId', 'type': 'str'},
        'dpm_version': {'key': 'dpmVersion', 'type': 'str'},
        'azure_backup_agent_version': {'key': 'azureBackupAgentVersion', 'type': 'str'},
        'is_azure_backup_agent_upgrade_available': {'key': 'isAzureBackupAgentUpgradeAvailable', 'type': 'bool'},
        'is_dpm_upgrade_available': {'key': 'isDpmUpgradeAvailable', 'type': 'bool'},
        'extended_info': {'key': 'extendedInfo', 'type': 'BackupEngineExtendedInfo'},
        'backup_engine_type': {'key': 'backupEngineType', 'type': 'str'},
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, backup_engine_state: str=None, health_status: str=None, can_re_register: bool=None, backup_engine_id: str=None, dpm_version: str=None, azure_backup_agent_version: str=None, is_azure_backup_agent_upgrade_available: bool=None, is_dpm_upgrade_available: bool=None, extended_info=None, **kwargs) -> None:
        super(AzureBackupServerEngine, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, backup_engine_state=backup_engine_state, health_status=health_status, can_re_register=can_re_register, backup_engine_id=backup_engine_id, dpm_version=dpm_version, azure_backup_agent_version=azure_backup_agent_version, is_azure_backup_agent_upgrade_available=is_azure_backup_agent_upgrade_available, is_dpm_upgrade_available=is_dpm_upgrade_available, extended_info=extended_info, **kwargs)
        self.backup_engine_type = 'AzureBackupServerEngine'
class BackupRequest(Model):
    """Base class for backup request. Workload-specific backup requests are
    derived from this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureFileShareBackupRequest, AzureWorkloadBackupRequest,
    IaasVMBackupRequest

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
    }

    _subtype_map = {
        'object_type': {'AzureFileShareBackupRequest': 'AzureFileShareBackupRequest', 'AzureWorkloadBackupRequest': 'AzureWorkloadBackupRequest', 'IaasVMBackupRequest': 'IaasVMBackupRequest'}
    }

    def __init__(self, **kwargs) -> None:
        super(BackupRequest, self).__init__(**kwargs)
        self.object_type = None
class AzureFileShareBackupRequest(BackupRequest):
    """AzureFileShare workload-specific backup request.

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    :param recovery_point_expiry_time_in_utc: Backup copy will expire after
     the time specified (UTC).
    :type recovery_point_expiry_time_in_utc: datetime
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
        'recovery_point_expiry_time_in_utc': {'key': 'recoveryPointExpiryTimeInUTC', 'type': 'iso-8601'},
    }

    def __init__(self, *, recovery_point_expiry_time_in_utc=None, **kwargs) -> None:
        super(AzureFileShareBackupRequest, self).__init__(**kwargs)
        self.recovery_point_expiry_time_in_utc = recovery_point_expiry_time_in_utc
        self.object_type = 'AzureFileShareBackupRequest'
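Each model's `_attribute_map` pairs a snake_case Python attribute with its camelCase wire key and a serializer type; msrest walks this map when building the JSON payload. A rough, toy sketch of that key mapping (the `serialize` helper and `Req` class are illustrative, not msrest's implementation):

```python
def serialize(obj, attribute_map):
    """Emit a dict keyed by the wire names in attribute_map,
    skipping attributes that are None (as msrest does)."""
    body = {}
    for attr, meta in attribute_map.items():
        value = getattr(obj, attr, None)
        if value is not None:
            body[meta['key']] = value
    return body

class Req:
    object_type = 'AzureFileShareBackupRequest'
    recovery_point_expiry_time_in_utc = None

payload = serialize(Req(), {
    'object_type': {'key': 'objectType', 'type': 'str'},
    'recovery_point_expiry_time_in_utc': {'key': 'recoveryPointExpiryTimeInUTC', 'type': 'iso-8601'},
})
```

The real serializer also applies the `'type'` entry (e.g. rendering a `datetime` as an ISO-8601 string for `'iso-8601'`); the sketch omits that step.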
class WorkloadProtectableItem(Model):
    """Base class for backup item. Workload-specific backup items are derived from
    this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureFileShareProtectableItem,
    AzureVmWorkloadProtectableItem, IaaSVMProtectableItem

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management to backup an
     item.
    :type backup_management_type: str
    :param workload_type: Type of workload for the backup management
    :type workload_type: str
    :param friendly_name: Friendly name of the backup item.
    :type friendly_name: str
    :param protection_state: State of the back up item. Possible values
     include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
     'ProtectionFailed'
    :type protection_state: str or
     ~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
    :param protectable_item_type: Required. Constant filled by server.
    :type protectable_item_type: str
    """

    _validation = {
        'protectable_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
    }

    _subtype_map = {
        'protectable_item_type': {'AzureFileShare': 'AzureFileShareProtectableItem', 'AzureVmWorkloadProtectableItem': 'AzureVmWorkloadProtectableItem', 'IaaSVMProtectableItem': 'IaaSVMProtectableItem'}
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, **kwargs) -> None:
        super(WorkloadProtectableItem, self).__init__(**kwargs)
        self.backup_management_type = backup_management_type
        self.workload_type = workload_type
        self.friendly_name = friendly_name
        self.protection_state = protection_state
        self.protectable_item_type = None
class AzureFileShareProtectableItem(WorkloadProtectableItem):
    """Protectable item for Azure Fileshare workloads.

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management to backup an
     item.
    :type backup_management_type: str
    :param workload_type: Type of workload for the backup management
    :type workload_type: str
    :param friendly_name: Friendly name of the backup item.
    :type friendly_name: str
    :param protection_state: State of the back up item. Possible values
     include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
     'ProtectionFailed'
    :type protection_state: str or
     ~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
    :param protectable_item_type: Required. Constant filled by server.
    :type protectable_item_type: str
    :param parent_container_fabric_id: Full Fabric ID of container to which
     this protectable item belongs. For example, ARM ID.
    :type parent_container_fabric_id: str
    :param parent_container_friendly_name: Friendly name of container to which
     this protectable item belongs.
    :type parent_container_friendly_name: str
    :param azure_file_share_type: File Share type XSync or XSMB. Possible
     values include: 'Invalid', 'XSMB', 'XSync'
    :type azure_file_share_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.AzureFileShareType
    """

    _validation = {
        'protectable_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
        'parent_container_fabric_id': {'key': 'parentContainerFabricId', 'type': 'str'},
        'parent_container_friendly_name': {'key': 'parentContainerFriendlyName', 'type': 'str'},
        'azure_file_share_type': {'key': 'azureFileShareType', 'type': 'str'},
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_container_fabric_id: str=None, parent_container_friendly_name: str=None, azure_file_share_type=None, **kwargs) -> None:
        super(AzureFileShareProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, **kwargs)
        self.parent_container_fabric_id = parent_container_fabric_id
        self.parent_container_friendly_name = parent_container_friendly_name
        self.azure_file_share_type = azure_file_share_type
        self.protectable_item_type = 'AzureFileShare'
class ProtectedItem(Model):
    """Base class for backup items.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureFileshareProtectedItem, AzureIaaSVMProtectedItem,
    AzureSqlProtectedItem, AzureVmWorkloadProtectedItem, DPMProtectedItem,
    GenericProtectedItem, MabFileFolderProtectedItem

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management for the backed up
     item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
     'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param workload_type: Type of workload this item represents. Possible
     values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
     'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
     'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
     'SAPAseDatabase'
    :type workload_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.DataSourceType
    :param container_name: Unique name of container
    :type container_name: str
    :param source_resource_id: ARM ID of the resource to be backed up.
    :type source_resource_id: str
    :param policy_id: ID of the backup policy with which this item is backed
     up.
    :type policy_id: str
    :param last_recovery_point: Timestamp when the last (latest) backup copy
     was created for this backup item.
    :type last_recovery_point: datetime
    :param backup_set_name: Name of the backup set the backup item belongs to
    :type backup_set_name: str
    :param create_mode: Create mode to indicate recovery of existing soft
     deleted data source or creation of new data source. Possible values
     include: 'Invalid', 'Default', 'Recover'
    :type create_mode: str or
     ~azure.mgmt.recoveryservicesbackup.models.CreateMode
    :param deferred_delete_time_in_utc: Time for deferred deletion in UTC
    :type deferred_delete_time_in_utc: datetime
    :param is_scheduled_for_deferred_delete: Flag to identify whether the DS
     is scheduled for deferred delete
    :type is_scheduled_for_deferred_delete: bool
    :param deferred_delete_time_remaining: Time remaining before the DS marked
     for deferred delete is permanently deleted
    :type deferred_delete_time_remaining: str
    :param is_deferred_delete_schedule_upcoming: Flag to identify whether the
     deferred deleted DS is to be purged soon
    :type is_deferred_delete_schedule_upcoming: bool
    :param is_rehydrate: Flag to identify that deferred deleted DS is to be
     moved into Pause state
    :type is_rehydrate: bool
    :param protected_item_type: Required. Constant filled by server.
    :type protected_item_type: str
    """

    _validation = {
        'protected_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'container_name': {'key': 'containerName', 'type': 'str'},
        'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
        'policy_id': {'key': 'policyId', 'type': 'str'},
        'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
        'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
        'create_mode': {'key': 'createMode', 'type': 'str'},
        'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
        'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
        'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
        'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
        'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
        'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
    }

    _subtype_map = {
        'protected_item_type': {'AzureFileShareProtectedItem': 'AzureFileshareProtectedItem', 'AzureIaaSVMProtectedItem': 'AzureIaaSVMProtectedItem', 'Microsoft.Sql/servers/databases': 'AzureSqlProtectedItem', 'AzureVmWorkloadProtectedItem': 'AzureVmWorkloadProtectedItem', 'DPMProtectedItem': 'DPMProtectedItem', 'GenericProtectedItem': 'GenericProtectedItem', 'MabFileFolderProtectedItem': 'MabFileFolderProtectedItem'}
    }

    def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, **kwargs) -> None:
        super(ProtectedItem, self).__init__(**kwargs)
        self.backup_management_type = backup_management_type
        self.workload_type = workload_type
        self.container_name = container_name
        self.source_resource_id = source_resource_id
        self.policy_id = policy_id
        self.last_recovery_point = last_recovery_point
        self.backup_set_name = backup_set_name
        self.create_mode = create_mode
        self.deferred_delete_time_in_utc = deferred_delete_time_in_utc
        self.is_scheduled_for_deferred_delete = is_scheduled_for_deferred_delete
        self.deferred_delete_time_remaining = deferred_delete_time_remaining
        self.is_deferred_delete_schedule_upcoming = is_deferred_delete_schedule_upcoming
        self.is_rehydrate = is_rehydrate
        self.protected_item_type = None


class AzureFileshareProtectedItem(ProtectedItem):
    """Azure File Share workload-specific backup item.

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management for the backed up
     item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
     'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param workload_type: Type of workload this item represents. Possible
     values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
     'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
     'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
     'SAPAseDatabase'
    :type workload_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.DataSourceType
    :param container_name: Unique name of container
    :type container_name: str
    :param source_resource_id: ARM ID of the resource to be backed up.
    :type source_resource_id: str
    :param policy_id: ID of the backup policy with which this item is backed
     up.
    :type policy_id: str
    :param last_recovery_point: Timestamp when the last (latest) backup copy
     was created for this backup item.
    :type last_recovery_point: datetime
    :param backup_set_name: Name of the backup set the backup item belongs to
    :type backup_set_name: str
    :param create_mode: Create mode to indicate recovery of existing soft
     deleted data source or creation of new data source. Possible values
     include: 'Invalid', 'Default', 'Recover'
    :type create_mode: str or
     ~azure.mgmt.recoveryservicesbackup.models.CreateMode
    :param deferred_delete_time_in_utc: Time for deferred deletion in UTC
    :type deferred_delete_time_in_utc: datetime
    :param is_scheduled_for_deferred_delete: Flag to identify whether the DS
     is scheduled for deferred delete
    :type is_scheduled_for_deferred_delete: bool
    :param deferred_delete_time_remaining: Time remaining before the DS marked
     for deferred delete is permanently deleted
    :type deferred_delete_time_remaining: str
    :param is_deferred_delete_schedule_upcoming: Flag to identify whether the
     deferred deleted DS is to be purged soon
    :type is_deferred_delete_schedule_upcoming: bool
    :param is_rehydrate: Flag to identify that deferred deleted DS is to be
     moved into Pause state
    :type is_rehydrate: bool
    :param protected_item_type: Required. Constant filled by server.
    :type protected_item_type: str
    :param friendly_name: Friendly name of the fileshare represented by this
     backup item.
    :type friendly_name: str
    :param protection_status: Backup status of this backup item.
    :type protection_status: str
    :param protection_state: Backup state of this backup item. Possible values
     include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
     'ProtectionStopped', 'ProtectionPaused'
    :type protection_state: str or
     ~azure.mgmt.recoveryservicesbackup.models.ProtectionState
    :param health_status: Backup running status for this backup item.
     Possible values include: 'Passed', 'ActionRequired', 'ActionSuggested',
     'Invalid'
    :type health_status: str or
     ~azure.mgmt.recoveryservicesbackup.models.HealthStatus
    :param last_backup_status: Last backup operation status. Possible values:
     Healthy, Unhealthy.
    :type last_backup_status: str
    :param last_backup_time: Timestamp of the last backup operation on this
     backup item.
    :type last_backup_time: datetime
    :param extended_info: Additional information with this backup item.
    :type extended_info:
     ~azure.mgmt.recoveryservicesbackup.models.AzureFileshareProtectedItemExtendedInfo
    """

    _validation = {
        'protected_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'container_name': {'key': 'containerName', 'type': 'str'},
        'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
        'policy_id': {'key': 'policyId', 'type': 'str'},
        'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
        'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
        'create_mode': {'key': 'createMode', 'type': 'str'},
        'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
        'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
        'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
        'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
        'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
        'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_status': {'key': 'protectionStatus', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
        'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
        'extended_info': {'key': 'extendedInfo', 'type': 'AzureFileshareProtectedItemExtendedInfo'},
    }

    def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, protection_status: str=None, protection_state=None, health_status=None, last_backup_status: str=None, last_backup_time=None, extended_info=None, **kwargs) -> None:
        super(AzureFileshareProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
        self.friendly_name = friendly_name
        self.protection_status = protection_status
        self.protection_state = protection_state
        self.health_status = health_status
        self.last_backup_status = last_backup_status
        self.last_backup_time = last_backup_time
        self.extended_info = extended_info
        self.protected_item_type = 'AzureFileShareProtectedItem'


class AzureFileshareProtectedItemExtendedInfo(Model):
    """Additional information about Azure File Share backup item.

    Variables are only populated by the server, and will be ignored when
    sending a request.

    :param oldest_recovery_point: The oldest backup copy available for this
     item in the service.
    :type oldest_recovery_point: datetime
    :param recovery_point_count: Number of available backup copies associated
     with this backup item.
    :type recovery_point_count: int
    :param policy_state: Indicates consistency of policy object and policy
     applied to this backup item.
    :type policy_state: str
    :ivar resource_state: Indicates the state of this resource. Possible
     values are from enum ResourceState {Invalid, Active, SoftDeleted, Deleted}
    :vartype resource_state: str
    :ivar resource_state_sync_time: The resource state sync time for this
     backup item.
    :vartype resource_state_sync_time: datetime
    """

    _validation = {
        'resource_state': {'readonly': True},
        'resource_state_sync_time': {'readonly': True},
    }

    _attribute_map = {
        'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
        'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
        'policy_state': {'key': 'policyState', 'type': 'str'},
        'resource_state': {'key': 'resourceState', 'type': 'str'},
        'resource_state_sync_time': {'key': 'resourceStateSyncTime', 'type': 'iso-8601'},
    }

    def __init__(self, *, oldest_recovery_point=None, recovery_point_count: int=None, policy_state: str=None, **kwargs) -> None:
        super(AzureFileshareProtectedItemExtendedInfo, self).__init__(**kwargs)
        self.oldest_recovery_point = oldest_recovery_point
        self.recovery_point_count = recovery_point_count
        self.policy_state = policy_state
        self.resource_state = None
        self.resource_state_sync_time = None
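# Illustrative sketch (not part of the generated client, and not the msrest
# deserializer itself): the `_subtype_map` pattern above maps the value of the
# server-filled discriminator to a concrete model class name. A minimal
# resolver over the file-share subset of that map looks like this; the
# SUBTYPE_MAP and resolve_model_class names are hypothetical helpers.

```python
# Hypothetical resolver mirroring ProtectedItem._subtype_map; the real SDK
# performs this lookup inside msrest's Deserializer.
SUBTYPE_MAP = {
    'AzureFileShareProtectedItem': 'AzureFileshareProtectedItem',
    'AzureIaaSVMProtectedItem': 'AzureIaaSVMProtectedItem',
    'Microsoft.Sql/servers/databases': 'AzureSqlProtectedItem',
}


def resolve_model_class(payload):
    # Fall back to the base class name when the discriminator is missing
    # or unknown.
    return SUBTYPE_MAP.get(payload.get('protectedItemType'), 'ProtectedItem')


resolved = resolve_model_class({'protectedItemType': 'AzureFileShareProtectedItem'})
```

# Note how the wire value 'AzureFileShareProtectedItem' resolves to the
# differently-capitalized Python class AzureFileshareProtectedItem.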


class ProtectionPolicy(Model):
    """Base class for backup policy. Workload-specific backup policies are
    derived from this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureVmWorkloadProtectionPolicy,
    AzureFileShareProtectionPolicy, AzureIaaSVMProtectionPolicy,
    AzureSqlProtectionPolicy, GenericProtectionPolicy, MabProtectionPolicy

    All required parameters must be populated in order to send to Azure.

    :param protected_items_count: Number of items associated with this policy.
    :type protected_items_count: int
    :param backup_management_type: Required. Constant filled by server.
    :type backup_management_type: str
    """

    _validation = {
        'backup_management_type': {'required': True},
    }

    _attribute_map = {
        'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
    }

    _subtype_map = {
        'backup_management_type': {'AzureWorkload': 'AzureVmWorkloadProtectionPolicy', 'AzureStorage': 'AzureFileShareProtectionPolicy', 'AzureIaasVM': 'AzureIaaSVMProtectionPolicy', 'AzureSql': 'AzureSqlProtectionPolicy', 'GenericProtectionPolicy': 'GenericProtectionPolicy', 'MAB': 'MabProtectionPolicy'}
    }

    def __init__(self, *, protected_items_count: int=None, **kwargs) -> None:
        super(ProtectionPolicy, self).__init__(**kwargs)
        self.protected_items_count = protected_items_count
        self.backup_management_type = None


class AzureFileShareProtectionPolicy(ProtectionPolicy):
    """AzureStorage backup policy.

    All required parameters must be populated in order to send to Azure.

    :param protected_items_count: Number of items associated with this policy.
    :type protected_items_count: int
    :param backup_management_type: Required. Constant filled by server.
    :type backup_management_type: str
    :param work_load_type: Type of workload for the backup management.
     Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
     'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
     'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
     'SAPAseDatabase'
    :type work_load_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.WorkloadType
    :param schedule_policy: Backup schedule specified as part of backup
     policy.
    :type schedule_policy:
     ~azure.mgmt.recoveryservicesbackup.models.SchedulePolicy
    :param retention_policy: Retention policy with the details on backup copy
     retention ranges.
    :type retention_policy:
     ~azure.mgmt.recoveryservicesbackup.models.RetentionPolicy
    :param time_zone: TimeZone optional input as string. For example: TimeZone
     = "Pacific Standard Time".
    :type time_zone: str
    """

    _validation = {
        'backup_management_type': {'required': True},
    }

    _attribute_map = {
        'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'work_load_type': {'key': 'workLoadType', 'type': 'str'},
        'schedule_policy': {'key': 'schedulePolicy', 'type': 'SchedulePolicy'},
        'retention_policy': {'key': 'retentionPolicy', 'type': 'RetentionPolicy'},
        'time_zone': {'key': 'timeZone', 'type': 'str'},
    }

    def __init__(self, *, protected_items_count: int=None, work_load_type=None, schedule_policy=None, retention_policy=None, time_zone: str=None, **kwargs) -> None:
        super(AzureFileShareProtectionPolicy, self).__init__(protected_items_count=protected_items_count, **kwargs)
        self.work_load_type = work_load_type
        self.schedule_policy = schedule_policy
        self.retention_policy = retention_policy
        self.time_zone = time_zone
        self.backup_management_type = 'AzureStorage'
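# Illustrative sketch (hypothetical ATTRIBUTE_MAP and to_wire names, not the
# SDK's msrest Serializer): each `_attribute_map` above pairs a snake_case
# Python attribute with its camelCase wire key. A minimal rename pass over a
# populated model's attributes looks like this.

```python
# Subset of AzureFileShareProtectionPolicy._attribute_map, reused here to
# show the attribute-to-wire-key renaming that serialization performs.
ATTRIBUTE_MAP = {
    'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
    'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
    'time_zone': {'key': 'timeZone', 'type': 'str'},
}


def to_wire(attrs, attribute_map):
    # Drop unset (None) attributes and rename the rest to their JSON keys.
    return {
        spec['key']: attrs[name]
        for name, spec in attribute_map.items()
        if attrs.get(name) is not None
    }


wire = to_wire(
    {'protected_items_count': 3, 'backup_management_type': 'AzureStorage',
     'time_zone': 'Pacific Standard Time'},
    ATTRIBUTE_MAP,
)
```

# The 'type' entries ('int', 'str', 'iso-8601', ...) additionally drive value
# conversion in the real serializer, which this sketch omits.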


class ILRRequest(Model):
    """Parameters to Provision ILR API.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureFileShareProvisionILRRequest,
    IaasVMILRRegistrationRequest

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
    }

    _subtype_map = {
        'object_type': {'AzureFileShareProvisionILRRequest': 'AzureFileShareProvisionILRRequest', 'IaasVMILRRegistrationRequest': 'IaasVMILRRegistrationRequest'}
    }

    def __init__(self, **kwargs) -> None:
        super(ILRRequest, self).__init__(**kwargs)
        self.object_type = None


class AzureFileShareProvisionILRRequest(ILRRequest):
    """Update snapshot Uri with the correct friendly Name of the source Azure
    file share.

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    :param recovery_point_id: Recovery point ID.
    :type recovery_point_id: str
    :param source_resource_id: Source Storage account ARM Id
    :type source_resource_id: str
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
        'recovery_point_id': {'key': 'recoveryPointId', 'type': 'str'},
        'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
    }

    def __init__(self, *, recovery_point_id: str=None, source_resource_id: str=None, **kwargs) -> None:
        super(AzureFileShareProvisionILRRequest, self).__init__(**kwargs)
        self.recovery_point_id = recovery_point_id
        self.source_resource_id = source_resource_id
        self.object_type = 'AzureFileShareProvisionILRRequest'


class RecoveryPoint(Model):
    """Base class for backup copies. Workload-specific backup copies are
    derived from this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureFileShareRecoveryPoint, AzureWorkloadRecoveryPoint,
    GenericRecoveryPoint, IaasVMRecoveryPoint

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
    }

    _subtype_map = {
        'object_type': {'AzureFileShareRecoveryPoint': 'AzureFileShareRecoveryPoint', 'AzureWorkloadRecoveryPoint': 'AzureWorkloadRecoveryPoint', 'GenericRecoveryPoint': 'GenericRecoveryPoint', 'IaasVMRecoveryPoint': 'IaasVMRecoveryPoint'}
    }

    def __init__(self, **kwargs) -> None:
        super(RecoveryPoint, self).__init__(**kwargs)
        self.object_type = None


class AzureFileShareRecoveryPoint(RecoveryPoint):
    """Azure File Share workload specific backup copy.

    Variables are only populated by the server, and will be ignored when
    sending a request.

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    :ivar recovery_point_type: Type of the backup copy. Specifies whether it
     is a crash consistent backup or app consistent.
    :vartype recovery_point_type: str
    :ivar recovery_point_time: Time at which this backup copy was created.
    :vartype recovery_point_time: datetime
    :ivar file_share_snapshot_uri: Contains the URL to the snapshot of the
     file share, if applicable
    :vartype file_share_snapshot_uri: str
    :ivar recovery_point_size_in_gb: Contains recovery point size
    :vartype recovery_point_size_in_gb: int
    """

    _validation = {
        'object_type': {'required': True},
        'recovery_point_type': {'readonly': True},
        'recovery_point_time': {'readonly': True},
        'file_share_snapshot_uri': {'readonly': True},
        'recovery_point_size_in_gb': {'readonly': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
        'recovery_point_type': {'key': 'recoveryPointType', 'type': 'str'},
        'recovery_point_time': {'key': 'recoveryPointTime', 'type': 'iso-8601'},
        'file_share_snapshot_uri': {'key': 'fileShareSnapshotUri', 'type': 'str'},
        'recovery_point_size_in_gb': {'key': 'recoveryPointSizeInGB', 'type': 'int'},
    }

    def __init__(self, **kwargs) -> None:
        super(AzureFileShareRecoveryPoint, self).__init__(**kwargs)
        self.recovery_point_type = None
        self.recovery_point_time = None
        self.file_share_snapshot_uri = None
        self.recovery_point_size_in_gb = None
        self.object_type = 'AzureFileShareRecoveryPoint'


class RestoreRequest(Model):
    """Base class for restore request. Workload-specific restore requests are
    derived from this class.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureFileShareRestoreRequest, AzureWorkloadRestoreRequest,
    IaasVMRestoreRequest

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
    }

    _subtype_map = {
        'object_type': {'AzureFileShareRestoreRequest': 'AzureFileShareRestoreRequest', 'AzureWorkloadRestoreRequest': 'AzureWorkloadRestoreRequest', 'IaasVMRestoreRequest': 'IaasVMRestoreRequest'}
    }

    def __init__(self, **kwargs) -> None:
        super(RestoreRequest, self).__init__(**kwargs)
        self.object_type = None


class AzureFileShareRestoreRequest(RestoreRequest):
    """AzureFileShare Restore Request.

    All required parameters must be populated in order to send to Azure.

    :param object_type: Required. Constant filled by server.
    :type object_type: str
    :param recovery_type: Type of this recovery. Possible values include:
     'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
     'Offline'
    :type recovery_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.RecoveryType
    :param source_resource_id: Source storage account ARM Id
    :type source_resource_id: str
    :param copy_options: Options to resolve copy conflicts. Possible values
     include: 'Invalid', 'CreateCopy', 'Skip', 'Overwrite', 'FailOnConflict'
    :type copy_options: str or
     ~azure.mgmt.recoveryservicesbackup.models.CopyOptions
    :param restore_request_type: Restore Type (FullShareRestore or
     ItemLevelRestore). Possible values include: 'Invalid', 'FullShareRestore',
     'ItemLevelRestore'
    :type restore_request_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.RestoreRequestType
    :param restore_file_specs: List of Source Files/Folders(which need to
     recover) and TargetFolderPath details
    :type restore_file_specs:
     list[~azure.mgmt.recoveryservicesbackup.models.RestoreFileSpecs]
    :param target_details: Target File Share Details
    :type target_details:
     ~azure.mgmt.recoveryservicesbackup.models.TargetAFSRestoreInfo
    """

    _validation = {
        'object_type': {'required': True},
    }

    _attribute_map = {
        'object_type': {'key': 'objectType', 'type': 'str'},
        'recovery_type': {'key': 'recoveryType', 'type': 'str'},
        'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
        'copy_options': {'key': 'copyOptions', 'type': 'str'},
        'restore_request_type': {'key': 'restoreRequestType', 'type': 'str'},
        'restore_file_specs': {'key': 'restoreFileSpecs', 'type': '[RestoreFileSpecs]'},
        'target_details': {'key': 'targetDetails', 'type': 'TargetAFSRestoreInfo'},
    }

    def __init__(self, *, recovery_type=None, source_resource_id: str=None, copy_options=None, restore_request_type=None, restore_file_specs=None, target_details=None, **kwargs) -> None:
        super(AzureFileShareRestoreRequest, self).__init__(**kwargs)
        self.recovery_type = recovery_type
        self.source_resource_id = source_resource_id
        self.copy_options = copy_options
        self.restore_request_type = restore_request_type
        self.restore_file_specs = restore_file_specs
        self.target_details = target_details
        self.object_type = 'AzureFileShareRestoreRequest'


class IaaSVMContainer(ProtectionContainer):
    """IaaS VM workload-specific container.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureIaaSClassicComputeVMContainer,
    AzureIaaSComputeVMContainer

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the container.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the
     container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
     'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Status of registration of the container with
     the Recovery Services Vault.
    :type registration_status: str
    :param health_status: Health status of the container.
    :type health_status: str
    :param container_type: Required. Constant filled by server.
    :type container_type: str
    :param virtual_machine_id: Fully qualified ARM URL of the virtual machine
     represented by this Azure IaaS VM container.
    :type virtual_machine_id: str
    :param virtual_machine_version: Specifies whether the container represents
     a Classic or an Azure Resource Manager VM.
    :type virtual_machine_version: str
    :param resource_group: Resource group name of Recovery Services Vault.
    :type resource_group: str
    """

    _validation = {
        'container_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'container_type': {'key': 'containerType', 'type': 'str'},
        'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
        'virtual_machine_version': {'key': 'virtualMachineVersion', 'type': 'str'},
        'resource_group': {'key': 'resourceGroup', 'type': 'str'},
    }

    _subtype_map = {
        'container_type': {'Microsoft.ClassicCompute/virtualMachines': 'AzureIaaSClassicComputeVMContainer', 'Microsoft.Compute/virtualMachines': 'AzureIaaSComputeVMContainer'}
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, virtual_machine_id: str=None, virtual_machine_version: str=None, resource_group: str=None, **kwargs) -> None:
        super(IaaSVMContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
        self.virtual_machine_id = virtual_machine_id
        self.virtual_machine_version = virtual_machine_version
        self.resource_group = resource_group
        self.container_type = 'IaaSVMContainer'


class AzureIaaSClassicComputeVMContainer(IaaSVMContainer):
    """IaaS VM workload-specific backup item representing a classic virtual
    machine.

    All required parameters must be populated in order to send to Azure.

    :param friendly_name: Friendly name of the container.
    :type friendly_name: str
    :param backup_management_type: Type of backup management for the
     container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
     'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param registration_status: Status of registration of the container with
     the Recovery Services Vault.
    :type registration_status: str
    :param health_status: Health status of the container.
    :type health_status: str
    :param container_type: Required. Constant filled by server.
    :type container_type: str
    :param virtual_machine_id: Fully qualified ARM URL of the virtual machine
     represented by this Azure IaaS VM container.
    :type virtual_machine_id: str
    :param virtual_machine_version: Specifies whether the container represents
     a Classic or an Azure Resource Manager VM.
    :type virtual_machine_version: str
    :param resource_group: Resource group name of Recovery Services Vault.
    :type resource_group: str
    """

    _validation = {
        'container_type': {'required': True},
    }

    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'registration_status': {'key': 'registrationStatus', 'type': 'str'},
        'health_status': {'key': 'healthStatus', 'type': 'str'},
        'container_type': {'key': 'containerType', 'type': 'str'},
        'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
        'virtual_machine_version': {'key': 'virtualMachineVersion', 'type': 'str'},
        'resource_group': {'key': 'resourceGroup', 'type': 'str'},
    }

    def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, virtual_machine_id: str=None, virtual_machine_version: str=None, resource_group: str=None, **kwargs) -> None:
        super(AzureIaaSClassicComputeVMContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, virtual_machine_id=virtual_machine_id, virtual_machine_version=virtual_machine_version, resource_group=resource_group, **kwargs)
        self.container_type = 'Microsoft.ClassicCompute/virtualMachines'


class IaaSVMProtectableItem(WorkloadProtectableItem):
    """IaaS VM workload-specific backup item.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureIaaSClassicComputeVMProtectableItem,
    AzureIaaSComputeVMProtectableItem

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management to backup an
     item.
    :type backup_management_type: str
    :param workload_type: Type of workload for the backup management
    :type workload_type: str
    :param friendly_name: Friendly name of the backup item.
    :type friendly_name: str
    :param protection_state: State of the backup item. Possible values
     include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
     'ProtectionFailed'
    :type protection_state: str or
     ~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
    :param protectable_item_type: Required. Constant filled by server.
    :type protectable_item_type: str
    :param virtual_machine_id: Fully qualified ARM ID of the virtual machine.
    :type virtual_machine_id: str
    """

    _validation = {
        'protectable_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
        'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
    }

    _subtype_map = {
        'protectable_item_type': {'Microsoft.ClassicCompute/virtualMachines': 'AzureIaaSClassicComputeVMProtectableItem', 'Microsoft.Compute/virtualMachines': 'AzureIaaSComputeVMProtectableItem'}
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, virtual_machine_id: str=None, **kwargs) -> None:
        super(IaaSVMProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, **kwargs)
        self.virtual_machine_id = virtual_machine_id
        self.protectable_item_type = 'IaaSVMProtectableItem'


class AzureIaaSClassicComputeVMProtectableItem(IaaSVMProtectableItem):
    """IaaS VM workload-specific backup item representing the Classic Compute
    VM.

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management to backup an
     item.
    :type backup_management_type: str
    :param workload_type: Type of workload for the backup management
    :type workload_type: str
    :param friendly_name: Friendly name of the backup item.
    :type friendly_name: str
    :param protection_state: State of the backup item. Possible values
     include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
     'ProtectionFailed'
    :type protection_state: str or
     ~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
    :param protectable_item_type: Required. Constant filled by server.
    :type protectable_item_type: str
    :param virtual_machine_id: Fully qualified ARM ID of the virtual machine.
    :type virtual_machine_id: str
    """

    _validation = {
        'protectable_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
        'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, virtual_machine_id: str=None, **kwargs) -> None:
        super(AzureIaaSClassicComputeVMProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, virtual_machine_id=virtual_machine_id, **kwargs)
        self.protectable_item_type = 'Microsoft.ClassicCompute/virtualMachines'


class AzureIaaSVMProtectedItem(ProtectedItem):
    """IaaS VM workload-specific backup item.

    You probably want to use the sub-classes and not this class directly. Known
    sub-classes are: AzureIaaSClassicComputeVMProtectedItem,
    AzureIaaSComputeVMProtectedItem

    All required parameters must be populated in order to send to Azure.

    :param backup_management_type: Type of backup management for the backed up
     item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
     'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
     'DefaultBackup'
    :type backup_management_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
    :param workload_type: Type of workload this item represents. Possible
     values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
     'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
     'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
     'SAPAseDatabase'
    :type workload_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.DataSourceType
    :param container_name: Unique name of container
    :type container_name: str
    :param source_resource_id: ARM ID of the resource to be backed up.
    :type source_resource_id: str
    :param policy_id: ID of the backup policy with which this item is backed
     up.
    :type policy_id: str
    :param last_recovery_point: Timestamp when the last (latest) backup copy
     was created for this backup item.
    :type last_recovery_point: datetime
    :param backup_set_name: Name of the backup set the backup item belongs to
    :type backup_set_name: str
    :param create_mode: Create mode to indicate recovery of existing soft
     deleted data source or creation of new data source. Possible values
     include: 'Invalid', 'Default', 'Recover'
    :type create_mode: str or
     ~azure.mgmt.recoveryservicesbackup.models.CreateMode
    :param deferred_delete_time_in_utc: Time for deferred deletion in UTC
    :type deferred_delete_time_in_utc: datetime
    :param is_scheduled_for_deferred_delete: Flag to identify whether the DS
     is scheduled for deferred delete
    :type is_scheduled_for_deferred_delete: bool
    :param deferred_delete_time_remaining: Time remaining before the DS marked
     for deferred delete is permanently deleted
    :type deferred_delete_time_remaining: str
    :param is_deferred_delete_schedule_upcoming: Flag to identify whether the
     deferred deleted DS is to be purged soon
    :type is_deferred_delete_schedule_upcoming: bool
    :param is_rehydrate: Flag to identify that deferred deleted DS is to be
     moved into Pause state
    :type is_rehydrate: bool
    :param protected_item_type: Required. Constant filled by server.
    :type protected_item_type: str
    :param friendly_name: Friendly name of the VM represented by this backup
     item.
    :type friendly_name: str
    :param virtual_machine_id: Fully qualified ARM ID of the virtual machine
     represented by this item.
    :type virtual_machine_id: str
    :param protection_status: Backup status of this backup item.
    :type protection_status: str
    :param protection_state: Backup state of this backup item. Possible values
     include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
     'ProtectionStopped', 'ProtectionPaused'
    :type protection_state: str or
     ~azure.mgmt.recoveryservicesbackup.models.ProtectionState
    :param health_status: Health status of protected item. Possible values
     include: 'Passed', 'ActionRequired', 'ActionSuggested', 'Invalid'
    :type health_status: str or
     ~azure.mgmt.recoveryservicesbackup.models.HealthStatus
    :param health_details: Health details on this backup item.
    :type health_details:
     list[~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMHealthDetails]
    :param last_backup_status: Last backup operation status.
    :type last_backup_status: str
    :param last_backup_time: Timestamp of the last backup operation on this
     backup item.
    :type last_backup_time: datetime
    :param protected_item_data_id: Data ID of the protected item.
    :type protected_item_data_id: str
    :param extended_info: Additional information for this backup item.
    :type extended_info:
     ~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMProtectedItemExtendedInfo
:param extended_properties:
:type extended_properties:
~azure.mgmt.recoveryservicesbackup.models.ExtendedProperties
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'health_details': {'key': 'healthDetails', 'type': '[AzureIaaSVMHealthDetails]'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
'protected_item_data_id': {'key': 'protectedItemDataId', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureIaaSVMProtectedItemExtendedInfo'},
'extended_properties': {'key': 'extendedProperties', 'type': 'ExtendedProperties'},
}
_subtype_map = {
'protected_item_type': {'Microsoft.ClassicCompute/virtualMachines': 'AzureIaaSClassicComputeVMProtectedItem', 'Microsoft.Compute/virtualMachines': 'AzureIaaSComputeVMProtectedItem'}
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, virtual_machine_id: str=None, protection_status: str=None, protection_state=None, health_status=None, health_details=None, last_backup_status: str=None, last_backup_time=None, protected_item_data_id: str=None, extended_info=None, extended_properties=None, **kwargs) -> None:
super(AzureIaaSVMProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
self.friendly_name = friendly_name
self.virtual_machine_id = virtual_machine_id
self.protection_status = protection_status
self.protection_state = protection_state
self.health_status = health_status
self.health_details = health_details
self.last_backup_status = last_backup_status
self.last_backup_time = last_backup_time
self.protected_item_data_id = protected_item_data_id
self.extended_info = extended_info
self.extended_properties = extended_properties
self.protected_item_type = 'AzureIaaSVMProtectedItem'
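The `_subtype_map` on this class is what lets the client deserialize a raw `protectedItem` payload into the right subclass: msrest reads the `protectedItemType` discriminator from the response and looks it up in the map. A minimal, hypothetical sketch of that dispatch (the function name and fallback behavior here are illustrative, not part of the generated client):

```python
# Hypothetical sketch of msrest's polymorphic dispatch: pick the concrete
# model class name from the discriminator value the server sends. The map
# mirrors AzureIaaSVMProtectedItem._subtype_map above.
SUBTYPE_MAP = {
    'Microsoft.ClassicCompute/virtualMachines': 'AzureIaaSClassicComputeVMProtectedItem',
    'Microsoft.Compute/virtualMachines': 'AzureIaaSComputeVMProtectedItem',
}

def resolve_protected_item_class(payload: dict) -> str:
    """Return the model class name for a raw protectedItem payload.

    Falls back to the base class when the discriminator is missing or unknown.
    """
    discriminator = payload.get('protectedItemType')
    return SUBTYPE_MAP.get(discriminator, 'AzureIaaSVMProtectedItem')
```

This is why each subclass's `__init__` ends by pinning `self.protected_item_type` to its own discriminator string: the value round-trips as the selector on the wire.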
class AzureIaaSClassicComputeVMProtectedItem(AzureIaaSVMProtectedItem):
"""IaaS VM workload-specific backup item representing the Classic Compute VM.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the VM represented by this backup
item.
:type friendly_name: str
:param virtual_machine_id: Fully qualified ARM ID of the virtual machine
represented by this item.
:type virtual_machine_id: str
:param protection_status: Backup status of this backup item.
:type protection_status: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param health_status: Health status of protected item. Possible values
include: 'Passed', 'ActionRequired', 'ActionSuggested', 'Invalid'
:type health_status: str or
~azure.mgmt.recoveryservicesbackup.models.HealthStatus
:param health_details: Health details on this backup item.
:type health_details:
list[~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMHealthDetails]
:param last_backup_status: Last backup operation status.
:type last_backup_status: str
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param protected_item_data_id: Data ID of the protected item.
:type protected_item_data_id: str
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMProtectedItemExtendedInfo
:param extended_properties:
:type extended_properties:
~azure.mgmt.recoveryservicesbackup.models.ExtendedProperties
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'health_details': {'key': 'healthDetails', 'type': '[AzureIaaSVMHealthDetails]'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
'protected_item_data_id': {'key': 'protectedItemDataId', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureIaaSVMProtectedItemExtendedInfo'},
'extended_properties': {'key': 'extendedProperties', 'type': 'ExtendedProperties'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, virtual_machine_id: str=None, protection_status: str=None, protection_state=None, health_status=None, health_details=None, last_backup_status: str=None, last_backup_time=None, protected_item_data_id: str=None, extended_info=None, extended_properties=None, **kwargs) -> None:
super(AzureIaaSClassicComputeVMProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, friendly_name=friendly_name, virtual_machine_id=virtual_machine_id, protection_status=protection_status, protection_state=protection_state, health_status=health_status, health_details=health_details, last_backup_status=last_backup_status, last_backup_time=last_backup_time, protected_item_data_id=protected_item_data_id, extended_info=extended_info, extended_properties=extended_properties, **kwargs)
self.protected_item_type = 'Microsoft.ClassicCompute/virtualMachines'
class AzureIaaSComputeVMContainer(IaaSVMContainer):
"""IaaS VM workload-specific backup item representing an Azure Resource
Manager virtual machine.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param virtual_machine_id: Fully qualified ARM URL of the virtual machine
represented by this Azure IaaS VM container.
:type virtual_machine_id: str
:param virtual_machine_version: Specifies whether the container represents
a Classic or an Azure Resource Manager VM.
:type virtual_machine_version: str
:param resource_group: Resource group name of Recovery Services Vault.
:type resource_group: str
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
'virtual_machine_version': {'key': 'virtualMachineVersion', 'type': 'str'},
'resource_group': {'key': 'resourceGroup', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, virtual_machine_id: str=None, virtual_machine_version: str=None, resource_group: str=None, **kwargs) -> None:
super(AzureIaaSComputeVMContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, virtual_machine_id=virtual_machine_id, virtual_machine_version=virtual_machine_version, resource_group=resource_group, **kwargs)
self.container_type = 'Microsoft.Compute/virtualMachines'
class AzureIaaSComputeVMProtectableItem(IaaSVMProtectableItem):
"""IaaS VM workload-specific backup item representing the Azure Resource
Manager VM.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param virtual_machine_id: Fully qualified ARM ID of the virtual machine.
:type virtual_machine_id: str
"""
_validation = {
'protectable_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, virtual_machine_id: str=None, **kwargs) -> None:
super(AzureIaaSComputeVMProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, virtual_machine_id=virtual_machine_id, **kwargs)
self.protectable_item_type = 'Microsoft.Compute/virtualMachines'
class AzureIaaSComputeVMProtectedItem(AzureIaaSVMProtectedItem):
"""IaaS VM workload-specific backup item representing the Azure Resource
Manager VM.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the VM represented by this backup
item.
:type friendly_name: str
:param virtual_machine_id: Fully qualified ARM ID of the virtual machine
represented by this item.
:type virtual_machine_id: str
:param protection_status: Backup status of this backup item.
:type protection_status: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param health_status: Health status of protected item. Possible values
include: 'Passed', 'ActionRequired', 'ActionSuggested', 'Invalid'
:type health_status: str or
~azure.mgmt.recoveryservicesbackup.models.HealthStatus
:param health_details: Health details on this backup item.
:type health_details:
list[~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMHealthDetails]
:param last_backup_status: Last backup operation status.
:type last_backup_status: str
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param protected_item_data_id: Data ID of the protected item.
:type protected_item_data_id: str
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMProtectedItemExtendedInfo
:param extended_properties:
:type extended_properties:
~azure.mgmt.recoveryservicesbackup.models.ExtendedProperties
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'health_details': {'key': 'healthDetails', 'type': '[AzureIaaSVMHealthDetails]'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
'protected_item_data_id': {'key': 'protectedItemDataId', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureIaaSVMProtectedItemExtendedInfo'},
'extended_properties': {'key': 'extendedProperties', 'type': 'ExtendedProperties'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, virtual_machine_id: str=None, protection_status: str=None, protection_state=None, health_status=None, health_details=None, last_backup_status: str=None, last_backup_time=None, protected_item_data_id: str=None, extended_info=None, extended_properties=None, **kwargs) -> None:
super(AzureIaaSComputeVMProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, friendly_name=friendly_name, virtual_machine_id=virtual_machine_id, protection_status=protection_status, protection_state=protection_state, health_status=health_status, health_details=health_details, last_backup_status=last_backup_status, last_backup_time=last_backup_time, protected_item_data_id=protected_item_data_id, extended_info=extended_info, extended_properties=extended_properties, **kwargs)
self.protected_item_type = 'Microsoft.Compute/virtualMachines'
class AzureIaaSVMErrorInfo(Model):
"""Azure IaaS VM workload-specific error information.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar error_code: Error code.
:vartype error_code: int
:ivar error_title: Title: typically, the entity that the error pertains
to.
:vartype error_title: str
:ivar error_string: Localized error string.
:vartype error_string: str
:ivar recommendations: List of localized recommendations for above error
code.
:vartype recommendations: list[str]
"""
_validation = {
'error_code': {'readonly': True},
'error_title': {'readonly': True},
'error_string': {'readonly': True},
'recommendations': {'readonly': True},
}
_attribute_map = {
'error_code': {'key': 'errorCode', 'type': 'int'},
'error_title': {'key': 'errorTitle', 'type': 'str'},
'error_string': {'key': 'errorString', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, **kwargs) -> None:
super(AzureIaaSVMErrorInfo, self).__init__(**kwargs)
self.error_code = None
self.error_title = None
self.error_string = None
self.recommendations = None
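Every field of `AzureIaaSVMErrorInfo` is marked `readonly` in `_validation`, which means the serializer drops those keys when building a request body; they are only populated from server responses. A hypothetical sketch of that filtering step (the helper name is illustrative):

```python
# Hypothetical sketch: drop keys whose _validation entry says
# {'readonly': True}, the way msrest omits server-populated fields
# when serializing a request body.
def strip_readonly(body: dict, validation: dict) -> dict:
    """Return a copy of body without the readonly keys."""
    return {
        key: value
        for key, value in body.items()
        if not validation.get(key, {}).get('readonly')
    }
```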
class AzureIaaSVMHealthDetails(Model):
"""Azure IaaS VM workload-specific Health Details.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar code: Health Code
:vartype code: int
:ivar title: Health Title
:vartype title: str
:ivar message: Health Message
:vartype message: str
:ivar recommendations: Health Recommended Actions
:vartype recommendations: list[str]
"""
_validation = {
'code': {'readonly': True},
'title': {'readonly': True},
'message': {'readonly': True},
'recommendations': {'readonly': True},
}
_attribute_map = {
'code': {'key': 'code', 'type': 'int'},
'title': {'key': 'title', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, **kwargs) -> None:
super(AzureIaaSVMHealthDetails, self).__init__(**kwargs)
self.code = None
self.title = None
self.message = None
self.recommendations = None
class Job(Model):
"""Defines workload agnostic properties for a job.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureIaaSVMJob, AzureStorageJob, AzureWorkloadJob, DpmJob,
MabJob
All required parameters must be populated in order to send to Azure.
:param entity_friendly_name: Friendly name of the entity on which the
current job is executing.
:type entity_friendly_name: str
:param backup_management_type: Backup management type to execute the
current job. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: The operation name.
:type operation: str
:param status: Job status.
:type status: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param activity_id: ActivityId of job.
:type activity_id: str
:param job_type: Required. Constant filled by server.
:type job_type: str
"""
_validation = {
'job_type': {'required': True},
}
_attribute_map = {
'entity_friendly_name': {'key': 'entityFriendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'activity_id': {'key': 'activityId', 'type': 'str'},
'job_type': {'key': 'jobType', 'type': 'str'},
}
_subtype_map = {
'job_type': {'AzureIaaSVMJob': 'AzureIaaSVMJob', 'AzureStorageJob': 'AzureStorageJob', 'AzureWorkloadJob': 'AzureWorkloadJob', 'DpmJob': 'DpmJob', 'MabJob': 'MabJob'}
}
def __init__(self, *, entity_friendly_name: str=None, backup_management_type=None, operation: str=None, status: str=None, start_time=None, end_time=None, activity_id: str=None, **kwargs) -> None:
super(Job, self).__init__(**kwargs)
self.entity_friendly_name = entity_friendly_name
self.backup_management_type = backup_management_type
self.operation = operation
self.status = status
self.start_time = start_time
self.end_time = end_time
self.activity_id = activity_id
self.job_type = None
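`Job`'s `_attribute_map` pairs each snake_case Python attribute with its camelCase wire key and a serialization type hint, e.g. `'iso-8601'` for datetimes. A minimal, hypothetical sketch of how such a table drives serialization (the map here is a two-entry excerpt, and the function is illustrative rather than the real msrest serializer):

```python
# Hypothetical sketch: translate snake_case model attributes into the
# camelCase JSON body the service expects, formatting datetimes as
# ISO-8601 per the 'iso-8601' type hint in the attribute map.
import datetime

ATTRIBUTE_MAP = {
    'entity_friendly_name': ('entityFriendlyName', 'str'),
    'start_time': ('startTime', 'iso-8601'),
}

def serialize_job(attrs: dict) -> dict:
    body = {}
    for name, (wire_key, wire_type) in ATTRIBUTE_MAP.items():
        value = attrs.get(name)
        if value is None:
            continue  # unset optional fields are omitted from the body
        if wire_type == 'iso-8601':
            value = value.isoformat()
        body[wire_key] = value
    return body
```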
class AzureIaaSVMJob(Job):
"""Azure IaaS VM workload-specific job object.
All required parameters must be populated in order to send to Azure.
:param entity_friendly_name: Friendly name of the entity on which the
current job is executing.
:type entity_friendly_name: str
:param backup_management_type: Backup management type to execute the
current job. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: The operation name.
:type operation: str
:param status: Job status.
:type status: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param activity_id: ActivityId of job.
:type activity_id: str
:param job_type: Required. Constant filled by server.
:type job_type: str
:param duration: Time elapsed during the execution of this job.
:type duration: timedelta
:param actions_info: Gets or sets the state/actions applicable on this job
like cancel/retry.
:type actions_info: list[str or
~azure.mgmt.recoveryservicesbackup.models.JobSupportedAction]
:param error_details: Error details on execution of this job.
:type error_details:
list[~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMErrorInfo]
:param virtual_machine_version: Specifies whether the backup item is a
Classic or an Azure Resource Manager VM.
:type virtual_machine_version: str
:param extended_info: Additional information for this job.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMJobExtendedInfo
"""
_validation = {
'job_type': {'required': True},
}
_attribute_map = {
'entity_friendly_name': {'key': 'entityFriendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'activity_id': {'key': 'activityId', 'type': 'str'},
'job_type': {'key': 'jobType', 'type': 'str'},
'duration': {'key': 'duration', 'type': 'duration'},
'actions_info': {'key': 'actionsInfo', 'type': '[JobSupportedAction]'},
'error_details': {'key': 'errorDetails', 'type': '[AzureIaaSVMErrorInfo]'},
'virtual_machine_version': {'key': 'virtualMachineVersion', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureIaaSVMJobExtendedInfo'},
}
def __init__(self, *, entity_friendly_name: str=None, backup_management_type=None, operation: str=None, status: str=None, start_time=None, end_time=None, activity_id: str=None, duration=None, actions_info=None, error_details=None, virtual_machine_version: str=None, extended_info=None, **kwargs) -> None:
super(AzureIaaSVMJob, self).__init__(entity_friendly_name=entity_friendly_name, backup_management_type=backup_management_type, operation=operation, status=status, start_time=start_time, end_time=end_time, activity_id=activity_id, **kwargs)
self.duration = duration
self.actions_info = actions_info
self.error_details = error_details
self.virtual_machine_version = virtual_machine_version
self.extended_info = extended_info
self.job_type = 'AzureIaaSVMJob'
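# Illustrative usage sketch (comments only; not part of the generated
# models). The subclass pins the polymorphic discriminator in __init__,
# so callers never pass job_type themselves:
#
#     import datetime
#     job = AzureIaaSVMJob(
#         entity_friendly_name='contoso-vm',  # hypothetical VM name
#         operation='Backup',
#         status='Completed',
#         duration=datetime.timedelta(minutes=12),
#     )
#     # job.job_type is the server constant 'AzureIaaSVMJob'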
class AzureIaaSVMJobExtendedInfo(Model):
"""Azure IaaS VM workload-specific additional information for job.
:param tasks_list: List of tasks associated with this job.
:type tasks_list:
list[~azure.mgmt.recoveryservicesbackup.models.AzureIaaSVMJobTaskDetails]
:param property_bag: Job properties.
:type property_bag: dict[str, str]
:param internal_property_bag: Job internal properties.
:type internal_property_bag: dict[str, str]
:param progress_percentage: Indicates progress of the job. Null if the
job has not started or has already completed.
:type progress_percentage: float
:param estimated_remaining_duration: Time remaining for execution of this
job.
:type estimated_remaining_duration: str
:param dynamic_error_message: Non-localized error message on job
execution.
:type dynamic_error_message: str
"""
_attribute_map = {
'tasks_list': {'key': 'tasksList', 'type': '[AzureIaaSVMJobTaskDetails]'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'internal_property_bag': {'key': 'internalPropertyBag', 'type': '{str}'},
'progress_percentage': {'key': 'progressPercentage', 'type': 'float'},
'estimated_remaining_duration': {'key': 'estimatedRemainingDuration', 'type': 'str'},
'dynamic_error_message': {'key': 'dynamicErrorMessage', 'type': 'str'},
}
def __init__(self, *, tasks_list=None, property_bag=None, internal_property_bag=None, progress_percentage: float=None, estimated_remaining_duration: str=None, dynamic_error_message: str=None, **kwargs) -> None:
super(AzureIaaSVMJobExtendedInfo, self).__init__(**kwargs)
self.tasks_list = tasks_list
self.property_bag = property_bag
self.internal_property_bag = internal_property_bag
self.progress_percentage = progress_percentage
self.estimated_remaining_duration = estimated_remaining_duration
self.dynamic_error_message = dynamic_error_message
class AzureIaaSVMJobTaskDetails(Model):
"""Azure IaaS VM workload-specific job task details.
:param task_id: The task display name.
:type task_id: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param instance_id: The instance ID.
:type instance_id: str
:param duration: Time elapsed for task.
:type duration: timedelta
:param status: The status.
:type status: str
:param progress_percentage: Progress of the task.
:type progress_percentage: float
:param task_execution_details: Details about execution of the task,
e.g. number of bytes transferred.
:type task_execution_details: str
"""
_attribute_map = {
'task_id': {'key': 'taskId', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'instance_id': {'key': 'instanceId', 'type': 'str'},
'duration': {'key': 'duration', 'type': 'duration'},
'status': {'key': 'status', 'type': 'str'},
'progress_percentage': {'key': 'progressPercentage', 'type': 'float'},
'task_execution_details': {'key': 'taskExecutionDetails', 'type': 'str'},
}
def __init__(self, *, task_id: str=None, start_time=None, end_time=None, instance_id: str=None, duration=None, status: str=None, progress_percentage: float=None, task_execution_details: str=None, **kwargs) -> None:
super(AzureIaaSVMJobTaskDetails, self).__init__(**kwargs)
self.task_id = task_id
self.start_time = start_time
self.end_time = end_time
self.instance_id = instance_id
self.duration = duration
self.status = status
self.progress_percentage = progress_percentage
self.task_execution_details = task_execution_details
class AzureIaaSVMProtectedItemExtendedInfo(Model):
"""Additional information on Azure IaaS VM specific backup item.
:param oldest_recovery_point: The oldest backup copy available for this
backup item.
:type oldest_recovery_point: datetime
:param recovery_point_count: Number of backup copies available for this
backup item.
:type recovery_point_count: int
:param policy_inconsistent: Specifies if backup policy associated with the
backup item is inconsistent.
:type policy_inconsistent: bool
"""
_attribute_map = {
'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
'policy_inconsistent': {'key': 'policyInconsistent', 'type': 'bool'},
}
def __init__(self, *, oldest_recovery_point=None, recovery_point_count: int=None, policy_inconsistent: bool=None, **kwargs) -> None:
super(AzureIaaSVMProtectedItemExtendedInfo, self).__init__(**kwargs)
self.oldest_recovery_point = oldest_recovery_point
self.recovery_point_count = recovery_point_count
self.policy_inconsistent = policy_inconsistent
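# Serialization sketch (comments only; assumes the msrest runtime these
# generated models build on). Model.serialize() renames attributes via
# _attribute_map, so snake_case fields go out in their camelCase wire
# form and unset (None) fields are omitted:
#
#     info = AzureIaaSVMProtectedItemExtendedInfo(
#         recovery_point_count=30,
#         policy_inconsistent=False,
#     )
#     info.serialize()
#     # expected shape: {'recoveryPointCount': 30, 'policyInconsistent': False}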
class AzureIaaSVMProtectionPolicy(ProtectionPolicy):
"""IaaS VM workload-specific backup policy.
All required parameters must be populated in order to send to Azure.
:param protected_items_count: Number of items associated with this policy.
:type protected_items_count: int
:param backup_management_type: Required. Constant filled by server.
:type backup_management_type: str
:param instant_rp_details: Instant RP additional details.
:type instant_rp_details:
~azure.mgmt.recoveryservicesbackup.models.InstantRPAdditionalDetails
:param schedule_policy: Backup schedule specified as part of backup
policy.
:type schedule_policy:
~azure.mgmt.recoveryservicesbackup.models.SchedulePolicy
:param retention_policy: Retention policy with the details on backup copy
retention ranges.
:type retention_policy:
~azure.mgmt.recoveryservicesbackup.models.RetentionPolicy
:param instant_rp_retention_range_in_days: Instant RP retention policy
range in days.
:type instant_rp_retention_range_in_days: int
:param time_zone: TimeZone optional input as string. For example: TimeZone
= "Pacific Standard Time".
:type time_zone: str
"""
_validation = {
'backup_management_type': {'required': True},
}
_attribute_map = {
'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'instant_rp_details': {'key': 'instantRPDetails', 'type': 'InstantRPAdditionalDetails'},
'schedule_policy': {'key': 'schedulePolicy', 'type': 'SchedulePolicy'},
'retention_policy': {'key': 'retentionPolicy', 'type': 'RetentionPolicy'},
'instant_rp_retention_range_in_days': {'key': 'instantRpRetentionRangeInDays', 'type': 'int'},
'time_zone': {'key': 'timeZone', 'type': 'str'},
}
def __init__(self, *, protected_items_count: int=None, instant_rp_details=None, schedule_policy=None, retention_policy=None, instant_rp_retention_range_in_days: int=None, time_zone: str=None, **kwargs) -> None:
super(AzureIaaSVMProtectionPolicy, self).__init__(protected_items_count=protected_items_count, **kwargs)
self.instant_rp_details = instant_rp_details
self.schedule_policy = schedule_policy
self.retention_policy = retention_policy
self.instant_rp_retention_range_in_days = instant_rp_retention_range_in_days
self.time_zone = time_zone
self.backup_management_type = 'AzureIaasVM'
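# Note (comments only; an illustrative sketch, not generated code):
# required server-filled constants such as backup_management_type are
# not constructor arguments; the subclass sets them after calling
# super().__init__, which is how the 'required' entry in _validation is
# satisfied:
#
#     policy = AzureIaaSVMProtectionPolicy(
#         instant_rp_retention_range_in_days=2,
#         time_zone='Pacific Standard Time',
#     )
#     # policy.backup_management_type == 'AzureIaasVM'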
class ProtectionIntent(Model):
"""Base class for backup ProtectionIntent.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureRecoveryServiceVaultProtectionIntent,
AzureResourceProtectionIntent
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param item_id: ID of the item which is getting protected; in case of
an Azure VM, it is the ProtectedItemId.
:type item_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protection_intent_item_type: Required. Constant filled by server.
:type protection_intent_item_type: str
"""
_validation = {
'protection_intent_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'item_id': {'key': 'itemId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protection_intent_item_type': {'key': 'protectionIntentItemType', 'type': 'str'},
}
_subtype_map = {
'protection_intent_item_type': {'RecoveryServiceVaultItem': 'AzureRecoveryServiceVaultProtectionIntent', 'AzureResourceItem': 'AzureResourceProtectionIntent'}
}
def __init__(self, *, backup_management_type=None, source_resource_id: str=None, item_id: str=None, policy_id: str=None, protection_state=None, **kwargs) -> None:
super(ProtectionIntent, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.source_resource_id = source_resource_id
self.item_id = item_id
self.policy_id = policy_id
self.protection_state = protection_state
self.protection_intent_item_type = None
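# Deserialization sketch (comments only; relies on msrest's subtype
# dispatch). _subtype_map keys each discriminator value to a subclass
# name, so deserializing through the base class yields the concrete
# intent type:
#
#     payload = {
#         'protectionIntentItemType': 'AzureResourceItem',
#         'sourceResourceId': '/subscriptions/.../vm1',  # hypothetical, truncated
#     }
#     intent = ProtectionIntent.deserialize(payload)
#     # isinstance(intent, AzureResourceProtectionIntent) -> True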
class AzureRecoveryServiceVaultProtectionIntent(ProtectionIntent):
"""Azure Recovery Services Vault specific protection intent item.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadAutoProtectionIntent
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param item_id: ID of the item which is getting protected; in case of
an Azure VM, it is the ProtectedItemId.
:type item_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protection_intent_item_type: Required. Constant filled by server.
:type protection_intent_item_type: str
"""
_validation = {
'protection_intent_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'item_id': {'key': 'itemId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protection_intent_item_type': {'key': 'protectionIntentItemType', 'type': 'str'},
}
_subtype_map = {
'protection_intent_item_type': {'AzureWorkloadAutoProtectionIntent': 'AzureWorkloadAutoProtectionIntent'}
}
def __init__(self, *, backup_management_type=None, source_resource_id: str=None, item_id: str=None, policy_id: str=None, protection_state=None, **kwargs) -> None:
super(AzureRecoveryServiceVaultProtectionIntent, self).__init__(backup_management_type=backup_management_type, source_resource_id=source_resource_id, item_id=item_id, policy_id=policy_id, protection_state=protection_state, **kwargs)
self.protection_intent_item_type = 'RecoveryServiceVaultItem'
class AzureResourceProtectionIntent(ProtectionIntent):
"""IaaS VM specific backup protection intent item.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param item_id: ID of the item which is getting protected; in case of
an Azure VM, it is the ProtectedItemId.
:type item_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protection_intent_item_type: Required. Constant filled by server.
:type protection_intent_item_type: str
:param friendly_name: Friendly name of the VM represented by this backup
item.
:type friendly_name: str
"""
_validation = {
'protection_intent_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'item_id': {'key': 'itemId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protection_intent_item_type': {'key': 'protectionIntentItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, source_resource_id: str=None, item_id: str=None, policy_id: str=None, protection_state=None, friendly_name: str=None, **kwargs) -> None:
super(AzureResourceProtectionIntent, self).__init__(backup_management_type=backup_management_type, source_resource_id=source_resource_id, item_id=item_id, policy_id=policy_id, protection_state=protection_state, **kwargs)
self.friendly_name = friendly_name
self.protection_intent_item_type = 'AzureResourceItem'
class AzureWorkloadContainer(ProtectionContainer):
"""Container for the workloads running inside Azure Compute or Classic
Compute.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureSQLAGWorkloadContainerProtectionContainer,
AzureVMAppContainerProtectionContainer
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param source_resource_id: ARM ID of the virtual machine represented by
this Azure Workload Container
:type source_resource_id: str
:param last_updated_time: Time stamp when this container was updated.
:type last_updated_time: datetime
:param extended_info: Additional details of a workload container.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadContainerExtendedInfo
:param workload_type: Workload type for which registration was sent.
Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param operation_type: Re-do operation. Possible values include:
'Invalid', 'Register', 'Reregister'
:type operation_type: str or
~azure.mgmt.recoveryservicesbackup.models.OperationType
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'last_updated_time': {'key': 'lastUpdatedTime', 'type': 'iso-8601'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureWorkloadContainerExtendedInfo'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'operation_type': {'key': 'operationType', 'type': 'str'},
}
_subtype_map = {
'container_type': {'SQLAGWorkLoadContainer': 'AzureSQLAGWorkloadContainerProtectionContainer', 'VMAppContainer': 'AzureVMAppContainerProtectionContainer'}
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, source_resource_id: str=None, last_updated_time=None, extended_info=None, workload_type=None, operation_type=None, **kwargs) -> None:
super(AzureWorkloadContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
self.source_resource_id = source_resource_id
self.last_updated_time = last_updated_time
self.extended_info = extended_info
self.workload_type = workload_type
self.operation_type = operation_type
self.container_type = 'AzureWorkloadContainer'
class AzureSQLAGWorkloadContainerProtectionContainer(AzureWorkloadContainer):
"""Container for SQL workloads under SQL Availability Group.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param source_resource_id: ARM ID of the virtual machine represented by
this Azure Workload Container
:type source_resource_id: str
:param last_updated_time: Time stamp when this container was updated.
:type last_updated_time: datetime
:param extended_info: Additional details of a workload container.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadContainerExtendedInfo
:param workload_type: Workload type for which registration was sent.
Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param operation_type: Re-do operation. Possible values include:
'Invalid', 'Register', 'Reregister'
:type operation_type: str or
~azure.mgmt.recoveryservicesbackup.models.OperationType
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'last_updated_time': {'key': 'lastUpdatedTime', 'type': 'iso-8601'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureWorkloadContainerExtendedInfo'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'operation_type': {'key': 'operationType', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, source_resource_id: str=None, last_updated_time=None, extended_info=None, workload_type=None, operation_type=None, **kwargs) -> None:
super(AzureSQLAGWorkloadContainerProtectionContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, source_resource_id=source_resource_id, last_updated_time=last_updated_time, extended_info=extended_info, workload_type=workload_type, operation_type=operation_type, **kwargs)
self.container_type = 'SQLAGWorkLoadContainer'
class AzureSqlContainer(ProtectionContainer):
"""Azure Sql workload-specific container.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, **kwargs) -> None:
super(AzureSqlContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
self.container_type = 'AzureSqlContainer'
class AzureSqlProtectedItem(ProtectedItem):
"""Azure SQL workload-specific backup item.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of the container.
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs
to.
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC.
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the
data source (DS) is scheduled for deferred delete.
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS
marked for deferred delete is permanently deleted.
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether
the deferred deleted DS is to be purged soon.
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify whether the deferred deleted DS
is to be moved into the Pause state.
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param protected_item_data_id: Internal ID of a backup item. Used by Azure
SQL Backup engine to contact Recovery Services.
:type protected_item_data_id: str
:param protection_state: Backup state of the backed up item. Possible
values include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectedItemState
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureSqlProtectedItemExtendedInfo
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'protected_item_data_id': {'key': 'protectedItemDataId', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureSqlProtectedItemExtendedInfo'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, protected_item_data_id: str=None, protection_state=None, extended_info=None, **kwargs) -> None:
super(AzureSqlProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
self.protected_item_data_id = protected_item_data_id
self.protection_state = protection_state
self.extended_info = extended_info
self.protected_item_type = 'Microsoft.Sql/servers/databases'
class AzureSqlProtectedItemExtendedInfo(Model):
"""Additional information on Azure Sql specific protected item.
:param oldest_recovery_point: The oldest backup copy available for this
item in the service.
:type oldest_recovery_point: datetime
:param recovery_point_count: Number of available backup copies associated
with this backup item.
:type recovery_point_count: int
:param policy_state: State of the backup policy associated with this
backup item.
:type policy_state: str
"""
_attribute_map = {
'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
'policy_state': {'key': 'policyState', 'type': 'str'},
}
def __init__(self, *, oldest_recovery_point=None, recovery_point_count: int=None, policy_state: str=None, **kwargs) -> None:
super(AzureSqlProtectedItemExtendedInfo, self).__init__(**kwargs)
self.oldest_recovery_point = oldest_recovery_point
self.recovery_point_count = recovery_point_count
self.policy_state = policy_state
class AzureSqlProtectionPolicy(ProtectionPolicy):
"""Azure SQL workload-specific backup policy.
All required parameters must be populated in order to send to Azure.
:param protected_items_count: Number of items associated with this policy.
:type protected_items_count: int
:param backup_management_type: Required. Constant filled by server.
:type backup_management_type: str
:param retention_policy: Retention policy details.
:type retention_policy:
~azure.mgmt.recoveryservicesbackup.models.RetentionPolicy
"""
_validation = {
'backup_management_type': {'required': True},
}
_attribute_map = {
'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'retention_policy': {'key': 'retentionPolicy', 'type': 'RetentionPolicy'},
}
def __init__(self, *, protected_items_count: int=None, retention_policy=None, **kwargs) -> None:
super(AzureSqlProtectionPolicy, self).__init__(protected_items_count=protected_items_count, **kwargs)
self.retention_policy = retention_policy
self.backup_management_type = 'AzureSql'
class AzureStorageContainer(ProtectionContainer):
"""Azure Storage Account workload-specific container.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param source_resource_id: Fully qualified ARM URL.
:type source_resource_id: str
:param storage_account_version: Storage account version.
:type storage_account_version: str
:param resource_group: Resource group name of Recovery Services Vault.
:type resource_group: str
:param protected_item_count: Number of items backed up in this container.
:type protected_item_count: long
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'storage_account_version': {'key': 'storageAccountVersion', 'type': 'str'},
'resource_group': {'key': 'resourceGroup', 'type': 'str'},
'protected_item_count': {'key': 'protectedItemCount', 'type': 'long'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, source_resource_id: str=None, storage_account_version: str=None, resource_group: str=None, protected_item_count: int=None, **kwargs) -> None:
super(AzureStorageContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
self.source_resource_id = source_resource_id
self.storage_account_version = storage_account_version
self.resource_group = resource_group
self.protected_item_count = protected_item_count
self.container_type = 'StorageContainer'
class AzureStorageErrorInfo(Model):
"""Azure storage specific error information.
:param error_code: Error code.
:type error_code: int
:param error_string: Localized error string.
:type error_string: str
:param recommendations: List of localized recommendations for the above
error code.
:type recommendations: list[str]
"""
_attribute_map = {
'error_code': {'key': 'errorCode', 'type': 'int'},
'error_string': {'key': 'errorString', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, *, error_code: int=None, error_string: str=None, recommendations=None, **kwargs) -> None:
super(AzureStorageErrorInfo, self).__init__(**kwargs)
self.error_code = error_code
self.error_string = error_string
self.recommendations = recommendations
class AzureStorageJob(Job):
"""Azure storage specific job.
All required parameters must be populated in order to send to Azure.
:param entity_friendly_name: Friendly name of the entity on which the
current job is executing.
:type entity_friendly_name: str
:param backup_management_type: Backup management type to execute the
current job. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: The operation name.
:type operation: str
:param status: Job status.
:type status: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param activity_id: ActivityId of job.
:type activity_id: str
:param job_type: Required. Constant filled by server.
:type job_type: str
:param duration: Time elapsed during the execution of this job.
:type duration: timedelta
:param actions_info: Gets or sets the state/actions applicable on this job
like cancel/retry.
:type actions_info: list[str or
~azure.mgmt.recoveryservicesbackup.models.JobSupportedAction]
:param error_details: Error details on execution of this job.
:type error_details:
list[~azure.mgmt.recoveryservicesbackup.models.AzureStorageErrorInfo]
:param storage_account_name: Specifies friendly name of the storage
account.
:type storage_account_name: str
:param storage_account_version: Specifies whether the Storage account is a
Classic or an Azure Resource Manager Storage account.
:type storage_account_version: str
:param extended_info: Additional information about the job.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureStorageJobExtendedInfo
"""
_validation = {
'job_type': {'required': True},
}
_attribute_map = {
'entity_friendly_name': {'key': 'entityFriendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'activity_id': {'key': 'activityId', 'type': 'str'},
'job_type': {'key': 'jobType', 'type': 'str'},
'duration': {'key': 'duration', 'type': 'duration'},
'actions_info': {'key': 'actionsInfo', 'type': '[JobSupportedAction]'},
'error_details': {'key': 'errorDetails', 'type': '[AzureStorageErrorInfo]'},
'storage_account_name': {'key': 'storageAccountName', 'type': 'str'},
'storage_account_version': {'key': 'storageAccountVersion', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureStorageJobExtendedInfo'},
}
def __init__(self, *, entity_friendly_name: str=None, backup_management_type=None, operation: str=None, status: str=None, start_time=None, end_time=None, activity_id: str=None, duration=None, actions_info=None, error_details=None, storage_account_name: str=None, storage_account_version: str=None, extended_info=None, **kwargs) -> None:
super(AzureStorageJob, self).__init__(entity_friendly_name=entity_friendly_name, backup_management_type=backup_management_type, operation=operation, status=status, start_time=start_time, end_time=end_time, activity_id=activity_id, **kwargs)
self.duration = duration
self.actions_info = actions_info
self.error_details = error_details
self.storage_account_name = storage_account_name
self.storage_account_version = storage_account_version
self.extended_info = extended_info
self.job_type = 'AzureStorageJob'
class AzureStorageJobExtendedInfo(Model):
"""Azure Storage workload-specific additional information for job.
:param tasks_list: List of tasks for this job.
:type tasks_list:
list[~azure.mgmt.recoveryservicesbackup.models.AzureStorageJobTaskDetails]
:param property_bag: Job properties.
:type property_bag: dict[str, str]
:param dynamic_error_message: Non-localized error message on job
execution.
:type dynamic_error_message: str
"""
_attribute_map = {
'tasks_list': {'key': 'tasksList', 'type': '[AzureStorageJobTaskDetails]'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'dynamic_error_message': {'key': 'dynamicErrorMessage', 'type': 'str'},
}
def __init__(self, *, tasks_list=None, property_bag=None, dynamic_error_message: str=None, **kwargs) -> None:
super(AzureStorageJobExtendedInfo, self).__init__(**kwargs)
self.tasks_list = tasks_list
self.property_bag = property_bag
self.dynamic_error_message = dynamic_error_message
class AzureStorageJobTaskDetails(Model):
"""Azure storage workload specific job task details.
:param task_id: The task display name.
:type task_id: str
:param status: The status.
:type status: str
"""
_attribute_map = {
'task_id': {'key': 'taskId', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
}
def __init__(self, *, task_id: str=None, status: str=None, **kwargs) -> None:
super(AzureStorageJobTaskDetails, self).__init__(**kwargs)
self.task_id = task_id
self.status = status
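Each model above pairs its attributes with an `_attribute_map` that tells the msrest serializer the camelCase wire key and type for every Python attribute. The following is a simplified standalone sketch of that idea, not the real msrest serializer; `SketchModel` and `SketchTaskDetails` are illustrative names:

```python
# Simplified sketch of how an _attribute_map can drive serialization.
# This is NOT the msrest implementation; it only illustrates mapping
# snake_case attributes to camelCase JSON keys, skipping unset values.

class SketchModel:
    _attribute_map = {}  # subclasses override

    def serialize(self):
        body = {}
        for attr, rule in self._attribute_map.items():
            value = getattr(self, attr, None)
            if value is not None:
                body[rule['key']] = value
        return body

class SketchTaskDetails(SketchModel):
    _attribute_map = {
        'task_id': {'key': 'taskId', 'type': 'str'},
        'status': {'key': 'status', 'type': 'str'},
    }

    def __init__(self, *, task_id=None, status=None):
        self.task_id = task_id
        self.status = status

task = SketchTaskDetails(task_id='Backup', status='InProgress')
print(task.serialize())  # {'taskId': 'Backup', 'status': 'InProgress'}
```

The real serializer additionally validates against `_validation` and converts typed values (for example, `iso-8601` datetimes) before emitting JSON.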
class ProtectableContainer(Model):
"""Protectable Container Class.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureStorageProtectableContainer,
AzureVMAppContainerProtectableContainer
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param health_status: Status of health of the container.
:type health_status: str
:param container_id: Fabric ID of the container, such as an ARM ID.
:type container_id: str
:param protectable_container_type: Required. Constant filled by server.
:type protectable_container_type: str
"""
_validation = {
'protectable_container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_id': {'key': 'containerId', 'type': 'str'},
'protectable_container_type': {'key': 'protectableContainerType', 'type': 'str'},
}
_subtype_map = {
'protectable_container_type': {'StorageContainer': 'AzureStorageProtectableContainer', 'VMAppContainer': 'AzureVMAppContainerProtectableContainer'}
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, health_status: str=None, container_id: str=None, **kwargs) -> None:
super(ProtectableContainer, self).__init__(**kwargs)
self.friendly_name = friendly_name
self.backup_management_type = backup_management_type
self.health_status = health_status
self.container_id = container_id
self.protectable_container_type = None
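`ProtectableContainer` declares a `_subtype_map` so the deserializer can pick the concrete subclass from the wire value of `protectableContainerType`. A minimal standalone sketch of that dispatch follows; the real msrest deserializer resolves the class names through its model registry, whereas here the discriminator values map directly to classes for brevity:

```python
# Minimal sketch of discriminator-based dispatch, like msrest's _subtype_map.

class Container:
    def __init__(self, friendly_name=None):
        self.friendly_name = friendly_name
        self.protectable_container_type = None  # discriminator

class StorageContainer(Container):
    def __init__(self, **kw):
        super().__init__(**kw)
        self.protectable_container_type = 'StorageContainer'

class VMAppContainer(Container):
    def __init__(self, **kw):
        super().__init__(**kw)
        self.protectable_container_type = 'VMAppContainer'

_SUBTYPES = {
    'StorageContainer': StorageContainer,
    'VMAppContainer': VMAppContainer,
}

def deserialize(payload):
    # Choose the subclass from the discriminator field, then build it.
    cls = _SUBTYPES[payload['protectableContainerType']]
    return cls(friendly_name=payload.get('friendlyName'))

obj = deserialize({'protectableContainerType': 'StorageContainer',
                   'friendlyName': 'mystorageacct'})
print(type(obj).__name__)  # StorageContainer
```

This is why the base class sets the discriminator to `None` while each subclass overwrites it with its constant in `__init__`.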
class AzureStorageProtectableContainer(ProtectableContainer):
"""Azure Storage-specific protectable containers.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param health_status: Status of health of the container.
:type health_status: str
:param container_id: Fabric ID of the container, such as an ARM ID.
:type container_id: str
:param protectable_container_type: Required. Constant filled by server.
:type protectable_container_type: str
"""
_validation = {
'protectable_container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_id': {'key': 'containerId', 'type': 'str'},
'protectable_container_type': {'key': 'protectableContainerType', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, health_status: str=None, container_id: str=None, **kwargs) -> None:
super(AzureStorageProtectableContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, health_status=health_status, container_id=container_id, **kwargs)
self.protectable_container_type = 'StorageContainer'
class AzureVMAppContainerProtectableContainer(ProtectableContainer):
"""Azure workload-specific container.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param health_status: Status of health of the container.
:type health_status: str
:param container_id: Fabric ID of the container, such as an ARM ID.
:type container_id: str
:param protectable_container_type: Required. Constant filled by server.
:type protectable_container_type: str
"""
_validation = {
'protectable_container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_id': {'key': 'containerId', 'type': 'str'},
'protectable_container_type': {'key': 'protectableContainerType', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, health_status: str=None, container_id: str=None, **kwargs) -> None:
super(AzureVMAppContainerProtectableContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, health_status=health_status, container_id=container_id, **kwargs)
self.protectable_container_type = 'VMAppContainer'
class AzureVMAppContainerProtectionContainer(AzureWorkloadContainer):
"""Container for SQL workloads under Azure Virtual Machines.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param source_resource_id: ARM ID of the virtual machine represented by
this Azure Workload Container.
:type source_resource_id: str
:param last_updated_time: Time stamp when this container was updated.
:type last_updated_time: datetime
:param extended_info: Additional details of a workload container.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadContainerExtendedInfo
:param workload_type: Workload type for which registration was sent.
Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param operation_type: Re-Do Operation. Possible values include:
'Invalid', 'Register', 'Reregister'
:type operation_type: str or
~azure.mgmt.recoveryservicesbackup.models.OperationType
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'last_updated_time': {'key': 'lastUpdatedTime', 'type': 'iso-8601'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureWorkloadContainerExtendedInfo'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'operation_type': {'key': 'operationType', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, source_resource_id: str=None, last_updated_time=None, extended_info=None, workload_type=None, operation_type=None, **kwargs) -> None:
super(AzureVMAppContainerProtectionContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, source_resource_id=source_resource_id, last_updated_time=last_updated_time, extended_info=extended_info, workload_type=workload_type, operation_type=operation_type, **kwargs)
self.container_type = 'VMAppContainer'
class AzureVMResourceFeatureSupportRequest(FeatureSupportRequest):
"""AzureResource(IaaS VM) Specific feature support request.
All required parameters must be populated in order to send to Azure.
:param feature_type: Required. Constant filled by server.
:type feature_type: str
:param vm_size: Size of the resource: VM size (A/D series, etc.) in the
case of an IaaS VM.
:type vm_size: str
:param vm_sku: SKUs (Premium/Managed, etc.) in the case of an IaaS VM.
:type vm_sku: str
"""
_validation = {
'feature_type': {'required': True},
}
_attribute_map = {
'feature_type': {'key': 'featureType', 'type': 'str'},
'vm_size': {'key': 'vmSize', 'type': 'str'},
'vm_sku': {'key': 'vmSku', 'type': 'str'},
}
def __init__(self, *, vm_size: str=None, vm_sku: str=None, **kwargs) -> None:
super(AzureVMResourceFeatureSupportRequest, self).__init__(**kwargs)
self.vm_size = vm_size
self.vm_sku = vm_sku
self.feature_type = 'AzureVMResourceBackup'
class AzureVMResourceFeatureSupportResponse(Model):
"""Response for feature support requests for Azure IaasVm.
:param support_status: Support status of feature. Possible values include:
'Invalid', 'Supported', 'DefaultOFF', 'DefaultON', 'NotSupported'
:type support_status: str or
~azure.mgmt.recoveryservicesbackup.models.SupportStatus
"""
_attribute_map = {
'support_status': {'key': 'supportStatus', 'type': 'str'},
}
def __init__(self, *, support_status=None, **kwargs) -> None:
super(AzureVMResourceFeatureSupportResponse, self).__init__(**kwargs)
self.support_status = support_status
class WorkloadItem(Model):
"""Base class for backup item. Workload-specific backup items are derived from
this class.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureVmWorkloadItem
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
"""
_validation = {
'workload_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
}
_subtype_map = {
'workload_item_type': {'AzureVmWorkloadItem': 'AzureVmWorkloadItem'}
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, **kwargs) -> None:
super(WorkloadItem, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.workload_type = workload_type
self.friendly_name = friendly_name
self.protection_state = protection_state
self.workload_item_type = None
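`WorkloadItem` and its subclasses follow the same construction convention used throughout this module: keyword-only parameters, shared fields forwarded to `super().__init__` (with `**kwargs` passed along), and the discriminator constant assigned last. A standalone sketch of that pattern, with illustrative class names:

```python
# Sketch of the keyword-only constructor chaining used by these models.

class BaseItem:
    def __init__(self, *, friendly_name=None, **kwargs):
        self.friendly_name = friendly_name
        self.workload_item_type = None  # discriminator, set by subclasses

class VmItem(BaseItem):
    def __init__(self, *, friendly_name=None, server_name=None, **kwargs):
        # Shared fields go up the chain; subclass-specific fields stay here.
        super().__init__(friendly_name=friendly_name, **kwargs)
        self.server_name = server_name
        self.workload_item_type = 'AzureVmWorkloadItem'

item = VmItem(friendly_name='db1', server_name='sqlvm01')
print(item.workload_item_type)  # AzureVmWorkloadItem
```

Assigning the discriminator after the `super()` call ensures the subclass constant wins over the base class's `None`.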
class AzureVmWorkloadItem(WorkloadItem):
"""Azure VM workload-specific workload item.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureVmWorkloadSAPAseDatabaseWorkloadItem,
AzureVmWorkloadSAPAseSystemWorkloadItem,
AzureVmWorkloadSAPHanaDatabaseWorkloadItem,
AzureVmWorkloadSAPHanaSystemWorkloadItem,
AzureVmWorkloadSQLDatabaseWorkloadItem,
AzureVmWorkloadSQLInstanceWorkloadItem
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable
:type is_auto_protectable: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present.
:type subinquireditemcount: int
:param sub_workload_item_count: For an instance or AG, indicates the
number of DBs to be protected.
:type sub_workload_item_count: int
"""
_validation = {
'workload_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
}
_subtype_map = {
'workload_item_type': {'SAPAseDatabase': 'AzureVmWorkloadSAPAseDatabaseWorkloadItem', 'SAPAseSystem': 'AzureVmWorkloadSAPAseSystemWorkloadItem', 'SAPHanaDatabase': 'AzureVmWorkloadSAPHanaDatabaseWorkloadItem', 'SAPHanaSystem': 'AzureVmWorkloadSAPHanaSystemWorkloadItem', 'SQLDataBase': 'AzureVmWorkloadSQLDatabaseWorkloadItem', 'SQLInstance': 'AzureVmWorkloadSQLInstanceWorkloadItem'}
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, **kwargs) -> None:
super(AzureVmWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, **kwargs)
self.parent_name = parent_name
self.server_name = server_name
self.is_auto_protectable = is_auto_protectable
self.subinquireditemcount = subinquireditemcount
self.sub_workload_item_count = sub_workload_item_count
self.workload_item_type = 'AzureVmWorkloadItem'
class AzureVmWorkloadProtectableItem(WorkloadProtectableItem):
"""Azure VM workload-specific protectable item.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureVmWorkloadSAPAseSystemProtectableItem,
AzureVmWorkloadSAPHanaDatabaseProtectableItem,
AzureVmWorkloadSAPHanaSystemProtectableItem,
AzureVmWorkloadSQLAvailabilityGroupProtectableItem,
AzureVmWorkloadSQLDatabaseProtectableItem,
AzureVmWorkloadSQLInstanceProtectableItem
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param parent_unique_name: Parent Unique Name is added to provide the
service-formatted URI name of the parent. Only applicable for databases
where the parent would be either an instance or a SQL AG.
:type parent_unique_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected
:type is_auto_protected: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present.
:type subinquireditemcount: int
:param subprotectableitemcount: For an instance or AG, indicates the
number of DBs to be protected.
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
_validation = {
'protectable_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
}
_subtype_map = {
'protectable_item_type': {'SAPAseSystem': 'AzureVmWorkloadSAPAseSystemProtectableItem', 'SAPHanaDatabase': 'AzureVmWorkloadSAPHanaDatabaseProtectableItem', 'SAPHanaSystem': 'AzureVmWorkloadSAPHanaSystemProtectableItem', 'SQLAvailabilityGroupContainer': 'AzureVmWorkloadSQLAvailabilityGroupProtectableItem', 'SQLDataBase': 'AzureVmWorkloadSQLDatabaseProtectableItem', 'SQLInstance': 'AzureVmWorkloadSQLInstanceProtectableItem'}
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
super(AzureVmWorkloadProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, **kwargs)
self.parent_name = parent_name
self.parent_unique_name = parent_unique_name
self.server_name = server_name
self.is_auto_protectable = is_auto_protectable
self.is_auto_protected = is_auto_protected
self.subinquireditemcount = subinquireditemcount
self.subprotectableitemcount = subprotectableitemcount
self.prebackupvalidation = prebackupvalidation
self.protectable_item_type = 'AzureVmWorkloadProtectableItem'
class AzureVmWorkloadProtectedItem(ProtectedItem):
"""Azure VM workload-specific protected item.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureVmWorkloadSAPAseDatabaseProtectedItem,
AzureVmWorkloadSAPHanaDatabaseProtectedItem,
AzureVmWorkloadSQLDatabaseProtectedItem
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the DB represented by this backup
item.
:type friendly_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param parent_name: Parent name of the DB such as Instance or Availability
Group.
:type parent_name: str
:param parent_type: Parent type of the protected item (for example, for a
DB: standalone server or distributed).
:type parent_type: str
:param protection_status: Backup status of this backup item.
:type protection_status: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param last_backup_status: Last backup operation status. Possible values:
Healthy, Unhealthy. Possible values include: 'Invalid', 'Healthy',
'Unhealthy', 'IRPending'
:type last_backup_status: str or
~azure.mgmt.recoveryservicesbackup.models.LastBackupStatus
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param last_backup_error_detail: Error details in last backup
:type last_backup_error_detail:
~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
:param protected_item_data_source_id: Data ID of the protected item.
:type protected_item_data_source_id: str
:param protected_item_health_status: Health status of the backup item,
evaluated based on last heartbeat received. Possible values include:
'Invalid', 'Healthy', 'Unhealthy', 'NotReachable', 'IRPending'
:type protected_item_health_status: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectedItemHealthStatus
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureVmWorkloadProtectedItemExtendedInfo
"""
    _validation = {
        'protected_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'container_name': {'key': 'containerName', 'type': 'str'},
        'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
        'policy_id': {'key': 'policyId', 'type': 'str'},
        'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
        'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
        'create_mode': {'key': 'createMode', 'type': 'str'},
        'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
        'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
        'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
        'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
        'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
        'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
        'parent_name': {'key': 'parentName', 'type': 'str'},
        'parent_type': {'key': 'parentType', 'type': 'str'},
        'protection_status': {'key': 'protectionStatus', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
        'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
        'last_backup_error_detail': {'key': 'lastBackupErrorDetail', 'type': 'ErrorDetail'},
        'protected_item_data_source_id': {'key': 'protectedItemDataSourceId', 'type': 'str'},
        'protected_item_health_status': {'key': 'protectedItemHealthStatus', 'type': 'str'},
        'extended_info': {'key': 'extendedInfo', 'type': 'AzureVmWorkloadProtectedItemExtendedInfo'},
    }

    _subtype_map = {
        'protected_item_type': {'AzureVmWorkloadSAPAseDatabase': 'AzureVmWorkloadSAPAseDatabaseProtectedItem', 'AzureVmWorkloadSAPHanaDatabase': 'AzureVmWorkloadSAPHanaDatabaseProtectedItem', 'AzureVmWorkloadSQLDatabase': 'AzureVmWorkloadSQLDatabaseProtectedItem'}
    }

    def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, server_name: str=None, parent_name: str=None, parent_type: str=None, protection_status: str=None, protection_state=None, last_backup_status=None, last_backup_time=None, last_backup_error_detail=None, protected_item_data_source_id: str=None, protected_item_health_status=None, extended_info=None, **kwargs) -> None:
        super(AzureVmWorkloadProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
        self.friendly_name = friendly_name
        self.server_name = server_name
        self.parent_name = parent_name
        self.parent_type = parent_type
        self.protection_status = protection_status
        self.protection_state = protection_state
        self.last_backup_status = last_backup_status
        self.last_backup_time = last_backup_time
        self.last_backup_error_detail = last_backup_error_detail
        self.protected_item_data_source_id = protected_item_data_source_id
        self.protected_item_health_status = protected_item_health_status
        self.extended_info = extended_info
        self.protected_item_type = 'AzureVmWorkloadProtectedItem'
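The `_subtype_map` above drives msrest's polymorphic deserialization: the server fills the `protectedItemType` discriminator, and the client maps that wire value to a concrete model class. A minimal standalone sketch of that dispatch follows; the class names and the `deserialize` helper are simplified stand-ins for illustration, not the real msrest machinery.

```python
class WorkloadProtectedItemBase:
    # Discriminator value -> subclass name, mirroring _subtype_map above.
    _subtype_map = {
        'AzureVmWorkloadSAPAseDatabase': 'SAPAseDatabaseItem',
        'AzureVmWorkloadSAPHanaDatabase': 'SAPHanaDatabaseItem',
        'AzureVmWorkloadSQLDatabase': 'SQLDatabaseItem',
    }

    def __init__(self, friendly_name=None):
        self.friendly_name = friendly_name


class SAPAseDatabaseItem(WorkloadProtectedItemBase):
    pass


class SAPHanaDatabaseItem(WorkloadProtectedItemBase):
    pass


class SQLDatabaseItem(WorkloadProtectedItemBase):
    pass


# Name -> class lookup so the discriminator string can be resolved.
_REGISTRY = {cls.__name__: cls
             for cls in (SAPAseDatabaseItem, SAPHanaDatabaseItem, SQLDatabaseItem)}


def deserialize(payload):
    """Pick the concrete class from the discriminator in the payload,
    falling back to the base class for unknown values."""
    subclass_name = WorkloadProtectedItemBase._subtype_map.get(
        payload.get('protectedItemType'))
    cls = _REGISTRY.get(subclass_name, WorkloadProtectedItemBase)
    return cls(friendly_name=payload.get('friendlyName'))


item = deserialize({'protectedItemType': 'AzureVmWorkloadSAPHanaDatabase',
                    'friendlyName': 'HXE'})
```

In the real SDK this resolution happens inside the msrest deserializer, which is why `protected_item_type` is documented as "Constant filled by server".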
class AzureVmWorkloadProtectedItemExtendedInfo(Model):
"""Additional information on Azure Workload for SQL specific backup item.
:param oldest_recovery_point: The oldest backup copy available for this
backup item.
:type oldest_recovery_point: datetime
:param recovery_point_count: Number of backup copies available for this
backup item.
:type recovery_point_count: int
:param policy_state: Indicates consistency of policy object and policy
applied to this backup item.
:type policy_state: str
"""
    _attribute_map = {
        'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
        'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
        'policy_state': {'key': 'policyState', 'type': 'str'},
    }

    def __init__(self, *, oldest_recovery_point=None, recovery_point_count: int=None, policy_state: str=None, **kwargs) -> None:
        super(AzureVmWorkloadProtectedItemExtendedInfo, self).__init__(**kwargs)
        self.oldest_recovery_point = oldest_recovery_point
        self.recovery_point_count = recovery_point_count
        self.policy_state = policy_state
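The `_attribute_map` above tells the serializer how to translate snake_case Python attributes into camelCase wire keys, and which converter to apply (e.g. `'iso-8601'` for datetimes). A standalone sketch of that translation; `to_wire` is a hypothetical helper written for illustration, not the real msrest `Serializer`.

```python
from datetime import datetime

# Simplified attribute map: attribute name -> (wire key, type hint),
# modeled on the _attribute_map of AzureVmWorkloadProtectedItemExtendedInfo.
ATTRIBUTE_MAP = {
    'oldest_recovery_point': ('oldestRecoveryPoint', 'iso-8601'),
    'recovery_point_count': ('recoveryPointCount', 'int'),
    'policy_state': ('policyState', 'str'),
}


def to_wire(attrs):
    """Map attribute names to wire keys, ISO-formatting datetimes and
    dropping unset (None) values, as the real serializer does."""
    body = {}
    for attr, (key, kind) in ATTRIBUTE_MAP.items():
        value = attrs.get(attr)
        if value is None:
            continue
        body[key] = value.isoformat() if kind == 'iso-8601' else value
    return body


wire = to_wire({'oldest_recovery_point': datetime(2020, 1, 1),
                'recovery_point_count': 7,
                'policy_state': None})
```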
class AzureVmWorkloadProtectionPolicy(ProtectionPolicy):
"""Azure VM (Mercury) workload-specific backup policy.
All required parameters must be populated in order to send to Azure.
:param protected_items_count: Number of items associated with this policy.
:type protected_items_count: int
:param backup_management_type: Required. Constant filled by server.
:type backup_management_type: str
:param work_load_type: Type of workload for the backup management.
Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type work_load_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param settings: Common settings for the backup management
:type settings: ~azure.mgmt.recoveryservicesbackup.models.Settings
:param sub_protection_policy: List of sub-protection policies which
includes schedule and retention
:type sub_protection_policy:
list[~azure.mgmt.recoveryservicesbackup.models.SubProtectionPolicy]
:param make_policy_consistent: Fix the policy inconsistency
:type make_policy_consistent: bool
"""
    _validation = {
        'backup_management_type': {'required': True},
    }

    _attribute_map = {
        'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'work_load_type': {'key': 'workLoadType', 'type': 'str'},
        'settings': {'key': 'settings', 'type': 'Settings'},
        'sub_protection_policy': {'key': 'subProtectionPolicy', 'type': '[SubProtectionPolicy]'},
        'make_policy_consistent': {'key': 'makePolicyConsistent', 'type': 'bool'},
    }

    def __init__(self, *, protected_items_count: int=None, work_load_type=None, settings=None, sub_protection_policy=None, make_policy_consistent: bool=None, **kwargs) -> None:
        super(AzureVmWorkloadProtectionPolicy, self).__init__(protected_items_count=protected_items_count, **kwargs)
        self.work_load_type = work_load_type
        self.settings = settings
        self.sub_protection_policy = sub_protection_policy
        self.make_policy_consistent = make_policy_consistent
        self.backup_management_type = 'AzureWorkload'
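Note that `backup_management_type` is not a constructor parameter here: the subclass pins the discriminator to `'AzureWorkload'` after calling `super().__init__`, so callers cannot override it. A standalone sketch of that pattern, using simplified class names rather than the SDK models:

```python
class ProtectionPolicyBase:
    def __init__(self, protected_items_count=None, **kwargs):
        self.protected_items_count = protected_items_count
        self.backup_management_type = None  # constant, filled by subclass


class WorkloadPolicy(ProtectionPolicyBase):
    def __init__(self, protected_items_count=None,
                 make_policy_consistent=None, **kwargs):
        super().__init__(protected_items_count=protected_items_count, **kwargs)
        self.make_policy_consistent = make_policy_consistent
        # Discriminator is assigned last, unconditionally, so it always
        # holds the subclass constant regardless of caller input.
        self.backup_management_type = 'AzureWorkload'


policy = WorkloadPolicy(protected_items_count=3)
```

Setting the constant after the `super().__init__` call is what lets the same base class serve every backup-management flavor while keeping the wire discriminator fixed per subclass.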
class AzureVmWorkloadSAPAseDatabaseProtectedItem(AzureVmWorkloadProtectedItem):
"""Azure VM workload-specific protected item representing SAP ASE Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the DB represented by this backup
item.
:type friendly_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param parent_name: Parent name of the DB such as Instance or Availability
Group.
:type parent_name: str
:param parent_type: Parent type of protected item, example: for a DB,
standalone server or distributed
:type parent_type: str
:param protection_status: Backup status of this backup item.
:type protection_status: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param last_backup_status: Last backup operation status. Possible values:
Healthy, Unhealthy. Possible values include: 'Invalid', 'Healthy',
'Unhealthy', 'IRPending'
:type last_backup_status: str or
~azure.mgmt.recoveryservicesbackup.models.LastBackupStatus
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param last_backup_error_detail: Error details in last backup
:type last_backup_error_detail:
~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
:param protected_item_data_source_id: Data ID of the protected item.
:type protected_item_data_source_id: str
:param protected_item_health_status: Health status of the backup item,
evaluated based on last heartbeat received. Possible values include:
'Invalid', 'Healthy', 'Unhealthy', 'NotReachable', 'IRPending'
:type protected_item_health_status: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectedItemHealthStatus
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureVmWorkloadProtectedItemExtendedInfo
"""
    _validation = {
        'protected_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'container_name': {'key': 'containerName', 'type': 'str'},
        'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
        'policy_id': {'key': 'policyId', 'type': 'str'},
        'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
        'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
        'create_mode': {'key': 'createMode', 'type': 'str'},
        'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
        'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
        'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
        'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
        'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
        'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
        'parent_name': {'key': 'parentName', 'type': 'str'},
        'parent_type': {'key': 'parentType', 'type': 'str'},
        'protection_status': {'key': 'protectionStatus', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
        'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
        'last_backup_error_detail': {'key': 'lastBackupErrorDetail', 'type': 'ErrorDetail'},
        'protected_item_data_source_id': {'key': 'protectedItemDataSourceId', 'type': 'str'},
        'protected_item_health_status': {'key': 'protectedItemHealthStatus', 'type': 'str'},
        'extended_info': {'key': 'extendedInfo', 'type': 'AzureVmWorkloadProtectedItemExtendedInfo'},
    }

    def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, server_name: str=None, parent_name: str=None, parent_type: str=None, protection_status: str=None, protection_state=None, last_backup_status=None, last_backup_time=None, last_backup_error_detail=None, protected_item_data_source_id: str=None, protected_item_health_status=None, extended_info=None, **kwargs) -> None:
        super(AzureVmWorkloadSAPAseDatabaseProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, friendly_name=friendly_name, server_name=server_name, parent_name=parent_name, parent_type=parent_type, protection_status=protection_status, protection_state=protection_state, last_backup_status=last_backup_status, last_backup_time=last_backup_time, last_backup_error_detail=last_backup_error_detail, protected_item_data_source_id=protected_item_data_source_id, protected_item_health_status=protected_item_health_status, extended_info=extended_info, **kwargs)
        self.protected_item_type = 'AzureVmWorkloadSAPAseDatabase'
class AzureVmWorkloadSAPAseDatabaseWorkloadItem(AzureVmWorkloadItem):
"""Azure VM workload-specific workload item representing SAP ASE Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backed up item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable
:type is_auto_protectable: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param sub_workload_item_count: For an instance or AG, indicates the
number of DBs to be protected
:type sub_workload_item_count: int
"""
    _validation = {
        'workload_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
        'parent_name': {'key': 'parentName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
        'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
        'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
        'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, **kwargs) -> None:
        super(AzureVmWorkloadSAPAseDatabaseWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, server_name=server_name, is_auto_protectable=is_auto_protectable, subinquireditemcount=subinquireditemcount, sub_workload_item_count=sub_workload_item_count, **kwargs)
        self.workload_item_type = 'SAPAseDatabase'
class AzureVmWorkloadSAPAseSystemProtectableItem(AzureVmWorkloadProtectableItem):
"""Azure VM workload-specific protectable item representing SAP ASE System.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backed up item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param parent_unique_name: Parent unique name, providing the
service-formatted URI name of the parent. Applicable only to databases
whose parent is an instance or a SQL availability group.
:type parent_unique_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected
:type is_auto_protected: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param subprotectableitemcount: For an instance or AG, indicates the
number of DBs to be protected
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
    _validation = {
        'protectable_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
        'parent_name': {'key': 'parentName', 'type': 'str'},
        'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
        'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
        'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
        'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
        'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
        'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
        super(AzureVmWorkloadSAPAseSystemProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, parent_unique_name=parent_unique_name, server_name=server_name, is_auto_protectable=is_auto_protectable, is_auto_protected=is_auto_protected, subinquireditemcount=subinquireditemcount, subprotectableitemcount=subprotectableitemcount, prebackupvalidation=prebackupvalidation, **kwargs)
        self.protectable_item_type = 'SAPAseSystem'
class AzureVmWorkloadSAPAseSystemWorkloadItem(AzureVmWorkloadItem):
"""Azure VM workload-specific workload item representing SAP ASE System.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backed up item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable
:type is_auto_protectable: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param sub_workload_item_count: For an instance or AG, indicates the
number of DBs to be protected
:type sub_workload_item_count: int
"""
    _validation = {
        'workload_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
        'parent_name': {'key': 'parentName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
        'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
        'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
        'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, **kwargs) -> None:
        super(AzureVmWorkloadSAPAseSystemWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, server_name=server_name, is_auto_protectable=is_auto_protectable, subinquireditemcount=subinquireditemcount, sub_workload_item_count=sub_workload_item_count, **kwargs)
        self.workload_item_type = 'SAPAseSystem'
class AzureVmWorkloadSAPHanaDatabaseProtectableItem(AzureVmWorkloadProtectableItem):
"""Azure VM workload-specific protectable item representing SAP HANA Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to backup an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backed up item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param parent_unique_name: Parent unique name, providing the
service-formatted URI name of the parent. Applicable only to databases
whose parent is an instance or a SQL availability group.
:type parent_unique_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected
:type is_auto_protected: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param subprotectableitemcount: For an instance or AG, indicates the
number of DBs to be protected
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
    _validation = {
        'protectable_item_type': {'required': True},
    }

    _attribute_map = {
        'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
        'workload_type': {'key': 'workloadType', 'type': 'str'},
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'protection_state': {'key': 'protectionState', 'type': 'str'},
        'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
        'parent_name': {'key': 'parentName', 'type': 'str'},
        'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
        'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
        'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
        'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
        'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
        'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
    }

    def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
        super(AzureVmWorkloadSAPHanaDatabaseProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, parent_unique_name=parent_unique_name, server_name=server_name, is_auto_protectable=is_auto_protectable, is_auto_protected=is_auto_protected, subinquireditemcount=subinquireditemcount, subprotectableitemcount=subprotectableitemcount, prebackupvalidation=prebackupvalidation, **kwargs)
        self.protectable_item_type = 'SAPHanaDatabase'
class AzureVmWorkloadSAPHanaDatabaseProtectedItem(AzureVmWorkloadProtectedItem):
"""Azure VM workload-specific protected item representing SAP HANA Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the DB represented by this backup
item.
:type friendly_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param parent_name: Parent name of the DB such as Instance or Availability
Group.
:type parent_name: str
:param parent_type: Parent type of the protected item (e.g., for a DB:
standalone server or distributed)
:type parent_type: str
:param protection_status: Backup status of this backup item.
:type protection_status: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param last_backup_status: Last backup operation status. Possible values
include: 'Invalid', 'Healthy', 'Unhealthy', 'IRPending'
:type last_backup_status: str or
~azure.mgmt.recoveryservicesbackup.models.LastBackupStatus
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param last_backup_error_detail: Error details in last backup
:type last_backup_error_detail:
~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
:param protected_item_data_source_id: Data ID of the protected item.
:type protected_item_data_source_id: str
:param protected_item_health_status: Health status of the backup item,
evaluated based on last heartbeat received. Possible values include:
'Invalid', 'Healthy', 'Unhealthy', 'NotReachable', 'IRPending'
:type protected_item_health_status: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectedItemHealthStatus
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureVmWorkloadProtectedItemExtendedInfo
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_type': {'key': 'parentType', 'type': 'str'},
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
'last_backup_error_detail': {'key': 'lastBackupErrorDetail', 'type': 'ErrorDetail'},
'protected_item_data_source_id': {'key': 'protectedItemDataSourceId', 'type': 'str'},
'protected_item_health_status': {'key': 'protectedItemHealthStatus', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureVmWorkloadProtectedItemExtendedInfo'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, server_name: str=None, parent_name: str=None, parent_type: str=None, protection_status: str=None, protection_state=None, last_backup_status=None, last_backup_time=None, last_backup_error_detail=None, protected_item_data_source_id: str=None, protected_item_health_status=None, extended_info=None, **kwargs) -> None:
super(AzureVmWorkloadSAPHanaDatabaseProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, friendly_name=friendly_name, server_name=server_name, parent_name=parent_name, parent_type=parent_type, protection_status=protection_status, protection_state=protection_state, last_backup_status=last_backup_status, last_backup_time=last_backup_time, last_backup_error_detail=last_backup_error_detail, protected_item_data_source_id=protected_item_data_source_id, protected_item_health_status=protected_item_health_status, extended_info=extended_info, **kwargs)
self.protected_item_type = 'AzureVmWorkloadSAPHanaDatabase'
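Each of these generated models pairs its snake_case Python attributes with REST wire keys through `_attribute_map`. The following is a hypothetical, simplified sketch of how such a map can drive serialization; `FakeItem` and `serialize` are illustrative stand-ins, not the SDK's actual msrest machinery:

```python
# Hypothetical sketch (not the SDK's msrest Serializer) of how an
# `_attribute_map` like the ones above can drive snake_case-to-camelCase
# serialization. `FakeItem` is an illustrative stand-in, not a real model.

def serialize(obj, attribute_map):
    """Build a wire-format dict, skipping unset optional attributes."""
    body = {}
    for attr, meta in attribute_map.items():
        value = getattr(obj, attr, None)
        if value is not None:
            body[meta['key']] = value
    return body


class FakeItem:
    _attribute_map = {
        'friendly_name': {'key': 'friendlyName', 'type': 'str'},
        'server_name': {'key': 'serverName', 'type': 'str'},
    }

    def __init__(self, friendly_name=None, server_name=None):
        self.friendly_name = friendly_name
        self.server_name = server_name


item = FakeItem(friendly_name='HDB', server_name='hana-host')
print(serialize(item, FakeItem._attribute_map))
# {'friendlyName': 'HDB', 'serverName': 'hana-host'}
```

Skipping `None` values mirrors how optional model fields are omitted from the request body rather than sent as nulls.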
class AzureVmWorkloadSAPHanaDatabaseWorkloadItem(AzureVmWorkloadItem):
"""Azure VM workload-specific workload item representing SAP HANA Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management used to back up
the item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable
:type is_auto_protectable: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param sub_workload_item_count: For an instance or AG, indicates the
number of DBs to be protected
:type sub_workload_item_count: int
"""
_validation = {
'workload_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, **kwargs) -> None:
super(AzureVmWorkloadSAPHanaDatabaseWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, server_name=server_name, is_auto_protectable=is_auto_protectable, subinquireditemcount=subinquireditemcount, sub_workload_item_count=sub_workload_item_count, **kwargs)
self.workload_item_type = 'SAPHanaDatabase'
class AzureVmWorkloadSAPHanaSystemProtectableItem(AzureVmWorkloadProtectableItem):
"""Azure VM workload-specific protectable item representing SAP HANA System.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management used to back up
the item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param parent_unique_name: Parent Unique Name is added to provide the
service-formatted URI name of the parent.
Only applicable for databases where the parent is either an Instance or
a SQL AG.
:type parent_unique_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected
:type is_auto_protected: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param subprotectableitemcount: For an instance or AG, indicates the
number of DBs to be protected
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
_validation = {
'protectable_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
super(AzureVmWorkloadSAPHanaSystemProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, parent_unique_name=parent_unique_name, server_name=server_name, is_auto_protectable=is_auto_protectable, is_auto_protected=is_auto_protected, subinquireditemcount=subinquireditemcount, subprotectableitemcount=subprotectableitemcount, prebackupvalidation=prebackupvalidation, **kwargs)
self.protectable_item_type = 'SAPHanaSystem'
class AzureVmWorkloadSAPHanaSystemWorkloadItem(AzureVmWorkloadItem):
"""Azure VM workload-specific workload item representing SAP HANA System.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management used to back up
the item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable
:type is_auto_protectable: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param sub_workload_item_count: For an instance or AG, indicates the
number of DBs to be protected
:type sub_workload_item_count: int
"""
_validation = {
'workload_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, **kwargs) -> None:
super(AzureVmWorkloadSAPHanaSystemWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, server_name=server_name, is_auto_protectable=is_auto_protectable, subinquireditemcount=subinquireditemcount, sub_workload_item_count=sub_workload_item_count, **kwargs)
self.workload_item_type = 'SAPHanaSystem'
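The constant each subclass assigns in `__init__` (e.g. `self.workload_item_type = 'SAPHanaSystem'`) is a polymorphic discriminator: when a payload comes back from the service, that field tells the deserializer which subclass to instantiate. Below is a hypothetical registry-based sketch of that dispatch; the class names and `deserialize` helper are stand-ins (the real SDK delegates this to msrest):

```python
# Hypothetical sketch of discriminator-based dispatch. `HanaDbItem`,
# `HanaSystemItem`, and `deserialize` are illustrative stand-ins, not
# the SDK's real deserialization path.

REGISTRY = {}

def register(discriminator):
    """Class decorator mapping a discriminator string to a class."""
    def wrap(cls):
        REGISTRY[discriminator] = cls
        return cls
    return wrap

@register('SAPHanaDatabase')
class HanaDbItem:
    def __init__(self, payload):
        self.friendly_name = payload.get('friendlyName')

@register('SAPHanaSystem')
class HanaSystemItem:
    def __init__(self, payload):
        self.friendly_name = payload.get('friendlyName')

def deserialize(payload):
    """Pick the subclass named by the payload's discriminator field."""
    cls = REGISTRY[payload['workloadItemType']]
    return cls(payload)

obj = deserialize({'workloadItemType': 'SAPHanaDatabase', 'friendlyName': 'HDB'})
print(type(obj).__name__)  # HanaDbItem
```

This is why the docstrings call the field a "Constant filled by server": callers never set it, but it is required on the wire so responses can be routed to the right model.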
class AzureVmWorkloadSQLAvailabilityGroupProtectableItem(AzureVmWorkloadProtectableItem):
"""Azure VM workload-specific protectable item representing SQL Availability
Group.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management used to back up
the item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param parent_unique_name: Parent Unique Name is added to provide the
service-formatted URI name of the parent.
Only applicable for databases where the parent is either an Instance or
a SQL AG.
:type parent_unique_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected
:type is_auto_protected: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param subprotectableitemcount: For an instance or AG, indicates the
number of DBs to be protected
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
_validation = {
'protectable_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
super(AzureVmWorkloadSQLAvailabilityGroupProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, parent_unique_name=parent_unique_name, server_name=server_name, is_auto_protectable=is_auto_protectable, is_auto_protected=is_auto_protected, subinquireditemcount=subinquireditemcount, subprotectableitemcount=subprotectableitemcount, prebackupvalidation=prebackupvalidation, **kwargs)
self.protectable_item_type = 'SQLAvailabilityGroupContainer'
class AzureVmWorkloadSQLDatabaseProtectableItem(AzureVmWorkloadProtectableItem):
"""Azure VM workload-specific protectable item representing SQL Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management used to back up
the item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param parent_unique_name: Parent Unique Name is added to provide the
service-formatted URI name of the parent.
Only applicable for databases where the parent is either an Instance or
a SQL AG.
:type parent_unique_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected
:type is_auto_protected: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param subprotectableitemcount: For an instance or AG, indicates the
number of DBs to be protected
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
_validation = {
'protectable_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
super(AzureVmWorkloadSQLDatabaseProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, parent_unique_name=parent_unique_name, server_name=server_name, is_auto_protectable=is_auto_protectable, is_auto_protected=is_auto_protected, subinquireditemcount=subinquireditemcount, subprotectableitemcount=subprotectableitemcount, prebackupvalidation=prebackupvalidation, **kwargs)
self.protectable_item_type = 'SQLDataBase'
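The `_validation` table marks `protectable_item_type` as required, which lets the client reject a model before any request is sent. A simplified, stdlib-only stand-in for that check is sketched below; `Model` and `validate` are hypothetical (the SDK's msrest serializer raises a `ValidationError` instead of returning a list):

```python
# Simplified sketch of what a `_validation` table like
# {'protectable_item_type': {'required': True}} can enforce client-side.
# `Model` and `validate` are hypothetical stand-ins.

def validate(obj, validation):
    """Return the names of required attributes that are unset."""
    missing = []
    for attr, rules in validation.items():
        if rules.get('required') and getattr(obj, attr, None) is None:
            missing.append(attr)
    return missing

class Model:
    # mirrors the shape of the generated classes' validation tables
    _validation = {'protectable_item_type': {'required': True}}

    def __init__(self, protectable_item_type=None):
        self.protectable_item_type = protectable_item_type

print(validate(Model(), Model._validation))               # ['protectable_item_type']
print(validate(Model('SQLDataBase'), Model._validation))  # []
```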
class AzureVmWorkloadSQLDatabaseProtectedItem(AzureVmWorkloadProtectedItem):
"""Azure VM workload-specific protected item representing SQL Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify whether the deferred-deleted DS is
to be moved into the Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the DB represented by this backup
item.
:type friendly_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param parent_name: Parent name of the DB such as Instance or Availability
Group.
:type parent_name: str
:param parent_type: Parent type of the protected item (e.g., for a DB:
standalone server or distributed)
:type parent_type: str
:param protection_status: Backup status of this backup item.
:type protection_status: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param last_backup_status: Last backup operation status. Possible values
include: 'Invalid', 'Healthy', 'Unhealthy', 'IRPending'
:type last_backup_status: str or
~azure.mgmt.recoveryservicesbackup.models.LastBackupStatus
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param last_backup_error_detail: Error details in last backup
:type last_backup_error_detail:
~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
:param protected_item_data_source_id: Data ID of the protected item.
:type protected_item_data_source_id: str
:param protected_item_health_status: Health status of the backup item,
evaluated based on last heartbeat received. Possible values include:
'Invalid', 'Healthy', 'Unhealthy', 'NotReachable', 'IRPending'
:type protected_item_health_status: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectedItemHealthStatus
:param extended_info: Additional information for this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureVmWorkloadProtectedItemExtendedInfo
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_type': {'key': 'parentType', 'type': 'str'},
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
'last_backup_error_detail': {'key': 'lastBackupErrorDetail', 'type': 'ErrorDetail'},
'protected_item_data_source_id': {'key': 'protectedItemDataSourceId', 'type': 'str'},
'protected_item_health_status': {'key': 'protectedItemHealthStatus', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureVmWorkloadProtectedItemExtendedInfo'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, server_name: str=None, parent_name: str=None, parent_type: str=None, protection_status: str=None, protection_state=None, last_backup_status=None, last_backup_time=None, last_backup_error_detail=None, protected_item_data_source_id: str=None, protected_item_health_status=None, extended_info=None, **kwargs) -> None:
super(AzureVmWorkloadSQLDatabaseProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, friendly_name=friendly_name, server_name=server_name, parent_name=parent_name, parent_type=parent_type, protection_status=protection_status, protection_state=protection_state, last_backup_status=last_backup_status, last_backup_time=last_backup_time, last_backup_error_detail=last_backup_error_detail, protected_item_data_source_id=protected_item_data_source_id, protected_item_health_status=protected_item_health_status, extended_info=extended_info, **kwargs)
self.protected_item_type = 'AzureVmWorkloadSQLDatabase'
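Entries typed `'iso-8601'` in the attribute map (`lastRecoveryPoint`, `deferredDeleteTimeInUTC`, `lastBackupTime`) indicate that `datetime` values travel over the wire as ISO 8601 strings. A minimal stdlib-only sketch of that conversion (`to_wire` is a hypothetical stand-in for the serializer's type handling):

```python
# Minimal stand-in for the serializer's handling of 'iso-8601' typed
# fields: datetimes become ISO 8601 strings; other values pass through.
import datetime

def to_wire(value, type_name):
    if type_name == 'iso-8601' and isinstance(value, datetime.datetime):
        return value.isoformat()
    return value

ts = datetime.datetime(2023, 5, 1, 12, 30, tzinfo=datetime.timezone.utc)
print(to_wire(ts, 'iso-8601'))   # 2023-05-01T12:30:00+00:00
print(to_wire('HDB', 'str'))     # HDB
```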
class AzureVmWorkloadSQLDatabaseWorkloadItem(AzureVmWorkloadItem):
"""Azure VM workload-specific workload item representing SQL Database.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management used to back up
the item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG
:type parent_name: str
:param server_name: Host/Cluster Name for instance or AG
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable
:type is_auto_protectable: bool
:param subinquireditemcount: For an instance or AG, indicates the number
of DBs present
:type subinquireditemcount: int
:param sub_workload_item_count: For an instance or AG, indicates the
number of DBs to be protected
:type sub_workload_item_count: int
"""
_validation = {
'workload_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, **kwargs) -> None:
super(AzureVmWorkloadSQLDatabaseWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, server_name=server_name, is_auto_protectable=is_auto_protectable, subinquireditemcount=subinquireditemcount, sub_workload_item_count=sub_workload_item_count, **kwargs)
self.workload_item_type = 'SQLDataBase'
class AzureVmWorkloadSQLInstanceProtectableItem(AzureVmWorkloadProtectableItem):
"""Azure VM workload-specific protectable item representing SQL Instance.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to back up an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management.
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protectable_item_type: Required. Constant filled by server.
:type protectable_item_type: str
:param parent_name: Name for instance or AG.
:type parent_name: str
:param parent_unique_name: Parent unique name, added to provide the
service-formatted URI name of the parent. Only applicable for databases
where the parent is either an instance or a SQL AG.
:type parent_unique_name: str
:param server_name: Host/Cluster name for instance or AG.
:type server_name: str
:param is_auto_protectable: Indicates if protectable item is
auto-protectable.
:type is_auto_protectable: bool
:param is_auto_protected: Indicates if protectable item is auto-protected.
:type is_auto_protected: bool
:param subinquireditemcount: For instance or AG, indicates number of DBs
present.
:type subinquireditemcount: int
:param subprotectableitemcount: For instance or AG, indicates number of
DBs to be protected.
:type subprotectableitemcount: int
:param prebackupvalidation: Pre-backup validation for protectable objects
:type prebackupvalidation:
~azure.mgmt.recoveryservicesbackup.models.PreBackupValidation
"""
_validation = {
'protectable_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protectable_item_type': {'key': 'protectableItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'parent_unique_name': {'key': 'parentUniqueName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'is_auto_protected': {'key': 'isAutoProtected', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'subprotectableitemcount': {'key': 'subprotectableitemcount', 'type': 'int'},
'prebackupvalidation': {'key': 'prebackupvalidation', 'type': 'PreBackupValidation'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, parent_unique_name: str=None, server_name: str=None, is_auto_protectable: bool=None, is_auto_protected: bool=None, subinquireditemcount: int=None, subprotectableitemcount: int=None, prebackupvalidation=None, **kwargs) -> None:
super(AzureVmWorkloadSQLInstanceProtectableItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, parent_unique_name=parent_unique_name, server_name=server_name, is_auto_protectable=is_auto_protectable, is_auto_protected=is_auto_protected, subinquireditemcount=subinquireditemcount, subprotectableitemcount=subprotectableitemcount, prebackupvalidation=prebackupvalidation, **kwargs)
self.protectable_item_type = 'SQLInstance'
class AzureVmWorkloadSQLInstanceWorkloadItem(AzureVmWorkloadItem):
"""Azure VM workload-specific workload item representing SQL Instance.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management to back up an
item.
:type backup_management_type: str
:param workload_type: Type of workload for the backup management.
:type workload_type: str
:param friendly_name: Friendly name of the backup item.
:type friendly_name: str
:param protection_state: State of the backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param workload_item_type: Required. Constant filled by server.
:type workload_item_type: str
:param parent_name: Name for instance or AG.
:type parent_name: str
:param server_name: Host/Cluster name for instance or AG.
:type server_name: str
:param is_auto_protectable: Indicates if workload item is auto-protectable.
:type is_auto_protectable: bool
:param subinquireditemcount: For instance or AG, indicates number of DBs
present.
:type subinquireditemcount: int
:param sub_workload_item_count: For instance or AG, indicates number of
DBs to be protected.
:type sub_workload_item_count: int
:param data_directory_paths: Data Directory Paths for default directories
:type data_directory_paths:
list[~azure.mgmt.recoveryservicesbackup.models.SQLDataDirectory]
"""
_validation = {
'workload_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'server_name': {'key': 'serverName', 'type': 'str'},
'is_auto_protectable': {'key': 'isAutoProtectable', 'type': 'bool'},
'subinquireditemcount': {'key': 'subinquireditemcount', 'type': 'int'},
'sub_workload_item_count': {'key': 'subWorkloadItemCount', 'type': 'int'},
'data_directory_paths': {'key': 'dataDirectoryPaths', 'type': '[SQLDataDirectory]'},
}
def __init__(self, *, backup_management_type: str=None, workload_type: str=None, friendly_name: str=None, protection_state=None, parent_name: str=None, server_name: str=None, is_auto_protectable: bool=None, subinquireditemcount: int=None, sub_workload_item_count: int=None, data_directory_paths=None, **kwargs) -> None:
super(AzureVmWorkloadSQLInstanceWorkloadItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, friendly_name=friendly_name, protection_state=protection_state, parent_name=parent_name, server_name=server_name, is_auto_protectable=is_auto_protectable, subinquireditemcount=subinquireditemcount, sub_workload_item_count=sub_workload_item_count, **kwargs)
self.data_directory_paths = data_directory_paths
self.workload_item_type = 'SQLInstance'
class AzureWorkloadAutoProtectionIntent(AzureRecoveryServiceVaultProtectionIntent):
"""Azure Recovery Services Vault specific protection intent item.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadSQLAutoProtectionIntent
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param item_id: ID of the item which is getting protected. In case of an
Azure VM, it is the ProtectedItemId.
:type item_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protection_intent_item_type: Required. Constant filled by server.
:type protection_intent_item_type: str
"""
_validation = {
'protection_intent_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'item_id': {'key': 'itemId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protection_intent_item_type': {'key': 'protectionIntentItemType', 'type': 'str'},
}
_subtype_map = {
'protection_intent_item_type': {'AzureWorkloadSQLAutoProtectionIntent': 'AzureWorkloadSQLAutoProtectionIntent'}
}
def __init__(self, *, backup_management_type=None, source_resource_id: str=None, item_id: str=None, policy_id: str=None, protection_state=None, **kwargs) -> None:
super(AzureWorkloadAutoProtectionIntent, self).__init__(backup_management_type=backup_management_type, source_resource_id=source_resource_id, item_id=item_id, policy_id=policy_id, protection_state=protection_state, **kwargs)
self.protection_intent_item_type = 'AzureWorkloadAutoProtectionIntent'
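The `_subtype_map` above is what drives polymorphic deserialization: the discriminator value the service sends under `protectionIntentItemType` selects the concrete model class. A minimal self-contained sketch of that dispatch (hypothetical helper, not the real msrest Deserializer):

```python
# Sketch of msrest-style discriminator dispatch. The wire key matches the
# _attribute_map entry for protection_intent_item_type.
SUBTYPE_MAP = {
    'AzureWorkloadSQLAutoProtectionIntent': 'AzureWorkloadSQLAutoProtectionIntent',
}

def resolve_subtype(payload):
    """Pick the concrete model class name for a raw payload dict."""
    discriminator = payload.get('protectionIntentItemType')
    # Unknown or absent discriminators fall back to the base class.
    return SUBTYPE_MAP.get(discriminator, 'AzureWorkloadAutoProtectionIntent')

print(resolve_subtype({'protectionIntentItemType': 'AzureWorkloadSQLAutoProtectionIntent'}))
# -> AzureWorkloadSQLAutoProtectionIntent
```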
class AzureWorkloadBackupRequest(BackupRequest):
"""AzureWorkload workload-specific backup request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param backup_type: Type of backup, viz. Full, Differential, Log or
CopyOnlyFull. Possible values include: 'Invalid', 'Full', 'Differential',
'Log', 'CopyOnlyFull'
:type backup_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupType
:param enable_compression: Bool for compression setting.
:type enable_compression: bool
:param recovery_point_expiry_time_in_utc: Backup copy will expire after
the time specified (UTC).
:type recovery_point_expiry_time_in_utc: datetime
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'backup_type': {'key': 'backupType', 'type': 'str'},
'enable_compression': {'key': 'enableCompression', 'type': 'bool'},
'recovery_point_expiry_time_in_utc': {'key': 'recoveryPointExpiryTimeInUTC', 'type': 'iso-8601'},
}
def __init__(self, *, backup_type=None, enable_compression: bool=None, recovery_point_expiry_time_in_utc=None, **kwargs) -> None:
super(AzureWorkloadBackupRequest, self).__init__(**kwargs)
self.backup_type = backup_type
self.enable_compression = enable_compression
self.recovery_point_expiry_time_in_utc = recovery_point_expiry_time_in_utc
self.object_type = 'AzureWorkloadBackupRequest'
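Each model's `_attribute_map` translates Python attribute names into the camelCase wire keys (e.g. `enable_compression` -> `enableCompression`); the actual conversion is performed by msrest's Serializer. A rough self-contained sketch of that renaming, with `None` treated as unset (hypothetical helper, simplified from the real serializer):

```python
# Sketch of _attribute_map-driven serialization; the real SDK also handles
# type conversion ('iso-8601', nested models, lists) via msrest.
ATTRIBUTE_MAP = {
    'object_type': 'objectType',
    'backup_type': 'backupType',
    'enable_compression': 'enableCompression',
}

def serialize(model):
    """Rename attributes to wire keys, dropping unset (None) values."""
    return {ATTRIBUTE_MAP[name]: value
            for name, value in model.items() if value is not None}

body = serialize({'object_type': 'AzureWorkloadBackupRequest',
                  'backup_type': 'Full',
                  'enable_compression': None})
print(body)  # {'objectType': 'AzureWorkloadBackupRequest', 'backupType': 'Full'}
```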
class AzureWorkloadContainerExtendedInfo(Model):
"""Extended information of the container.
:param host_server_name: Host OS name in case of a standalone container,
and cluster name in case of a distributed container.
:type host_server_name: str
:param inquiry_info: Inquiry Status for the container.
:type inquiry_info: ~azure.mgmt.recoveryservicesbackup.models.InquiryInfo
:param nodes_list: List of the nodes in case of distributed container.
:type nodes_list:
list[~azure.mgmt.recoveryservicesbackup.models.DistributedNodesInfo]
"""
_attribute_map = {
'host_server_name': {'key': 'hostServerName', 'type': 'str'},
'inquiry_info': {'key': 'inquiryInfo', 'type': 'InquiryInfo'},
'nodes_list': {'key': 'nodesList', 'type': '[DistributedNodesInfo]'},
}
def __init__(self, *, host_server_name: str=None, inquiry_info=None, nodes_list=None, **kwargs) -> None:
super(AzureWorkloadContainerExtendedInfo, self).__init__(**kwargs)
self.host_server_name = host_server_name
self.inquiry_info = inquiry_info
self.nodes_list = nodes_list
class AzureWorkloadErrorInfo(Model):
"""Azure storage specific error information.
:param error_code: Error code.
:type error_code: int
:param error_string: Localized error string.
:type error_string: str
:param error_title: Title: Typically, the entity that the error pertains
to.
:type error_title: str
:param recommendations: List of localized recommendations for above error
code.
:type recommendations: list[str]
:param additional_details: Additional details for above error code.
:type additional_details: str
"""
_attribute_map = {
'error_code': {'key': 'errorCode', 'type': 'int'},
'error_string': {'key': 'errorString', 'type': 'str'},
'error_title': {'key': 'errorTitle', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
'additional_details': {'key': 'additionalDetails', 'type': 'str'},
}
def __init__(self, *, error_code: int=None, error_string: str=None, error_title: str=None, recommendations=None, additional_details: str=None, **kwargs) -> None:
super(AzureWorkloadErrorInfo, self).__init__(**kwargs)
self.error_code = error_code
self.error_string = error_string
self.error_title = error_title
self.recommendations = recommendations
self.additional_details = additional_details
class AzureWorkloadJob(Job):
"""Azure storage specific job.
All required parameters must be populated in order to send to Azure.
:param entity_friendly_name: Friendly name of the entity on which the
current job is executing.
:type entity_friendly_name: str
:param backup_management_type: Backup management type to execute the
current job. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: The operation name.
:type operation: str
:param status: Job status.
:type status: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param activity_id: ActivityId of job.
:type activity_id: str
:param job_type: Required. Constant filled by server.
:type job_type: str
:param workload_type: Workload type of the job
:type workload_type: str
:param duration: Time elapsed during the execution of this job.
:type duration: timedelta
:param actions_info: Gets or sets the state/actions applicable on this job
like cancel/retry.
:type actions_info: list[str or
~azure.mgmt.recoveryservicesbackup.models.JobSupportedAction]
:param error_details: Error details on execution of this job.
:type error_details:
list[~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadErrorInfo]
:param extended_info: Additional information about the job.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadJobExtendedInfo
"""
_validation = {
'job_type': {'required': True},
}
_attribute_map = {
'entity_friendly_name': {'key': 'entityFriendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'activity_id': {'key': 'activityId', 'type': 'str'},
'job_type': {'key': 'jobType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'duration': {'key': 'duration', 'type': 'duration'},
'actions_info': {'key': 'actionsInfo', 'type': '[JobSupportedAction]'},
'error_details': {'key': 'errorDetails', 'type': '[AzureWorkloadErrorInfo]'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureWorkloadJobExtendedInfo'},
}
def __init__(self, *, entity_friendly_name: str=None, backup_management_type=None, operation: str=None, status: str=None, start_time=None, end_time=None, activity_id: str=None, workload_type: str=None, duration=None, actions_info=None, error_details=None, extended_info=None, **kwargs) -> None:
super(AzureWorkloadJob, self).__init__(entity_friendly_name=entity_friendly_name, backup_management_type=backup_management_type, operation=operation, status=status, start_time=start_time, end_time=end_time, activity_id=activity_id, **kwargs)
self.workload_type = workload_type
self.duration = duration
self.actions_info = actions_info
self.error_details = error_details
self.extended_info = extended_info
self.job_type = 'AzureWorkloadJob'
class AzureWorkloadJobExtendedInfo(Model):
"""Azure VM workload-specific additional information for job.
:param tasks_list: List of tasks for this job
:type tasks_list:
list[~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadJobTaskDetails]
:param property_bag: Job properties.
:type property_bag: dict[str, str]
:param dynamic_error_message: Non-localized error message on job
execution.
:type dynamic_error_message: str
"""
_attribute_map = {
'tasks_list': {'key': 'tasksList', 'type': '[AzureWorkloadJobTaskDetails]'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'dynamic_error_message': {'key': 'dynamicErrorMessage', 'type': 'str'},
}
def __init__(self, *, tasks_list=None, property_bag=None, dynamic_error_message: str=None, **kwargs) -> None:
super(AzureWorkloadJobExtendedInfo, self).__init__(**kwargs)
self.tasks_list = tasks_list
self.property_bag = property_bag
self.dynamic_error_message = dynamic_error_message
class AzureWorkloadJobTaskDetails(Model):
"""Azure VM workload specific job task details.
:param task_id: The task display name.
:type task_id: str
:param status: The status.
:type status: str
"""
_attribute_map = {
'task_id': {'key': 'taskId', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
}
def __init__(self, *, task_id: str=None, status: str=None, **kwargs) -> None:
super(AzureWorkloadJobTaskDetails, self).__init__(**kwargs)
self.task_id = task_id
self.status = status
class AzureWorkloadRecoveryPoint(RecoveryPoint):
"""Workload specific recovery point, specifically encapsulates full/diff
recovery point.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadPointInTimeRecoveryPoint,
AzureWorkloadSAPHanaRecoveryPoint, AzureWorkloadSQLRecoveryPoint
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_time_in_utc: UTC time at which recovery point was
created
:vartype recovery_point_time_in_utc: datetime
:ivar type: Type of restore point. Possible values include: 'Invalid',
'Full', 'Log', 'Differential'
:vartype type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointType
"""
_validation = {
'object_type': {'required': True},
'recovery_point_time_in_utc': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_time_in_utc': {'key': 'recoveryPointTimeInUTC', 'type': 'iso-8601'},
'type': {'key': 'type', 'type': 'str'},
}
_subtype_map = {
'object_type': {'AzureWorkloadPointInTimeRecoveryPoint': 'AzureWorkloadPointInTimeRecoveryPoint', 'AzureWorkloadSAPHanaRecoveryPoint': 'AzureWorkloadSAPHanaRecoveryPoint', 'AzureWorkloadSQLRecoveryPoint': 'AzureWorkloadSQLRecoveryPoint'}
}
def __init__(self, **kwargs) -> None:
super(AzureWorkloadRecoveryPoint, self).__init__(**kwargs)
self.recovery_point_time_in_utc = None
self.type = None
self.object_type = 'AzureWorkloadRecoveryPoint'
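In the `_validation` table above, `readonly` marks fields that only the server populates (which is why `__init__` initializes them to `None`), while `required` applies to the discriminator. A hedged sketch of how such a table could be used to strip server-populated fields before a request is sent (the real enforcement lives in msrest, not in this module):

```python
# Sketch of _validation-driven filtering of outgoing request bodies.
VALIDATION = {
    'object_type': {'required': True},
    'recovery_point_time_in_utc': {'readonly': True},
    'type': {'readonly': True},
}

def request_fields(model):
    """Drop readonly (server-populated) attributes before serializing."""
    return {name: value for name, value in model.items()
            if not VALIDATION.get(name, {}).get('readonly')}

outgoing = request_fields({'object_type': 'AzureWorkloadRecoveryPoint',
                           'recovery_point_time_in_utc': None,
                           'type': None})
print(outgoing)  # {'object_type': 'AzureWorkloadRecoveryPoint'}
```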
class AzureWorkloadPointInTimeRecoveryPoint(AzureWorkloadRecoveryPoint):
"""Recovery point specific to PointInTime.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadSAPHanaPointInTimeRecoveryPoint
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_time_in_utc: UTC time at which recovery point was
created
:vartype recovery_point_time_in_utc: datetime
:ivar type: Type of restore point. Possible values include: 'Invalid',
'Full', 'Log', 'Differential'
:vartype type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointType
:param time_ranges: List of log ranges
:type time_ranges:
list[~azure.mgmt.recoveryservicesbackup.models.PointInTimeRange]
"""
_validation = {
'object_type': {'required': True},
'recovery_point_time_in_utc': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_time_in_utc': {'key': 'recoveryPointTimeInUTC', 'type': 'iso-8601'},
'type': {'key': 'type', 'type': 'str'},
'time_ranges': {'key': 'timeRanges', 'type': '[PointInTimeRange]'},
}
_subtype_map = {
'object_type': {'AzureWorkloadSAPHanaPointInTimeRecoveryPoint': 'AzureWorkloadSAPHanaPointInTimeRecoveryPoint'}
}
def __init__(self, *, time_ranges=None, **kwargs) -> None:
super(AzureWorkloadPointInTimeRecoveryPoint, self).__init__(**kwargs)
self.time_ranges = time_ranges
self.object_type = 'AzureWorkloadPointInTimeRecoveryPoint'
class AzureWorkloadRestoreRequest(RestoreRequest):
"""AzureWorkload-specific restore.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadPointInTimeRestoreRequest,
AzureWorkloadSAPHanaRestoreRequest, AzureWorkloadSQLRestoreRequest
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM on which the
workload that was running is being recovered.
:type source_resource_id: str
:param property_bag: Workload specific property bag.
:type property_bag: dict[str, str]
:param target_info: Details of target database
:type target_info:
~azure.mgmt.recoveryservicesbackup.models.TargetRestoreInfo
:param recovery_mode: Defines whether the current recovery mode is file
restore or database restore. Possible values include: 'Invalid',
'FileRecovery', 'WorkloadRecovery'
:type recovery_mode: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryMode
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'target_info': {'key': 'targetInfo', 'type': 'TargetRestoreInfo'},
'recovery_mode': {'key': 'recoveryMode', 'type': 'str'},
}
_subtype_map = {
'object_type': {'AzureWorkloadPointInTimeRestoreRequest': 'AzureWorkloadPointInTimeRestoreRequest', 'AzureWorkloadSAPHanaRestoreRequest': 'AzureWorkloadSAPHanaRestoreRequest', 'AzureWorkloadSQLRestoreRequest': 'AzureWorkloadSQLRestoreRequest'}
}
def __init__(self, *, recovery_type=None, source_resource_id: str=None, property_bag=None, target_info=None, recovery_mode=None, **kwargs) -> None:
super(AzureWorkloadRestoreRequest, self).__init__(**kwargs)
self.recovery_type = recovery_type
self.source_resource_id = source_resource_id
self.property_bag = property_bag
self.target_info = target_info
self.recovery_mode = recovery_mode
self.object_type = 'AzureWorkloadRestoreRequest'
class AzureWorkloadPointInTimeRestoreRequest(AzureWorkloadRestoreRequest):
"""AzureWorkload SAP Hana -specific restore. Specifically for PointInTime/Log
restore.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM on which the
workload that was running is being recovered.
:type source_resource_id: str
:param property_bag: Workload specific property bag.
:type property_bag: dict[str, str]
:param target_info: Details of target database
:type target_info:
~azure.mgmt.recoveryservicesbackup.models.TargetRestoreInfo
:param recovery_mode: Defines whether the current recovery mode is file
restore or database restore. Possible values include: 'Invalid',
'FileRecovery', 'WorkloadRecovery'
:type recovery_mode: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryMode
:param point_in_time: PointInTime value
:type point_in_time: datetime
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'target_info': {'key': 'targetInfo', 'type': 'TargetRestoreInfo'},
'recovery_mode': {'key': 'recoveryMode', 'type': 'str'},
'point_in_time': {'key': 'pointInTime', 'type': 'iso-8601'},
}
def __init__(self, *, recovery_type=None, source_resource_id: str=None, property_bag=None, target_info=None, recovery_mode=None, point_in_time=None, **kwargs) -> None:
super(AzureWorkloadPointInTimeRestoreRequest, self).__init__(recovery_type=recovery_type, source_resource_id=source_resource_id, property_bag=property_bag, target_info=target_info, recovery_mode=recovery_mode, **kwargs)
self.point_in_time = point_in_time
self.object_type = 'AzureWorkloadPointInTimeRestoreRequest'
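`point_in_time` is declared as `'iso-8601'` in the attribute map, so a `datetime` passed here is serialized to an ISO-8601 UTC string on the wire. A small self-contained sketch of that conversion (msrest's own formatter may differ in fractional-second handling):

```python
from datetime import datetime, timezone

def to_iso8601(dt):
    """Render an aware datetime as an ISO-8601 UTC timestamp string."""
    return dt.astimezone(timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ')

stamp = to_iso8601(datetime(2021, 6, 1, 12, 30, 0, tzinfo=timezone.utc))
print(stamp)  # 2021-06-01T12:30:00Z
```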
class AzureWorkloadSAPHanaPointInTimeRecoveryPoint(AzureWorkloadPointInTimeRecoveryPoint):
"""Recovery point specific to PointInTime in SAPHana.
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_time_in_utc: UTC time at which recovery point was
created
:vartype recovery_point_time_in_utc: datetime
:ivar type: Type of restore point. Possible values include: 'Invalid',
'Full', 'Log', 'Differential'
:vartype type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointType
:param time_ranges: List of log ranges
:type time_ranges:
list[~azure.mgmt.recoveryservicesbackup.models.PointInTimeRange]
"""
_validation = {
'object_type': {'required': True},
'recovery_point_time_in_utc': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_time_in_utc': {'key': 'recoveryPointTimeInUTC', 'type': 'iso-8601'},
'type': {'key': 'type', 'type': 'str'},
'time_ranges': {'key': 'timeRanges', 'type': '[PointInTimeRange]'},
}
def __init__(self, *, time_ranges=None, **kwargs) -> None:
super(AzureWorkloadSAPHanaPointInTimeRecoveryPoint, self).__init__(time_ranges=time_ranges, **kwargs)
self.object_type = 'AzureWorkloadSAPHanaPointInTimeRecoveryPoint'
class AzureWorkloadSAPHanaRestoreRequest(AzureWorkloadRestoreRequest):
"""AzureWorkload SAP Hana-specific restore.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadSAPHanaPointInTimeRestoreRequest
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM on which the
workload that was running is being recovered.
:type source_resource_id: str
:param property_bag: Workload specific property bag.
:type property_bag: dict[str, str]
:param target_info: Details of target database
:type target_info:
~azure.mgmt.recoveryservicesbackup.models.TargetRestoreInfo
:param recovery_mode: Defines whether the current recovery mode is file
restore or database restore. Possible values include: 'Invalid',
'FileRecovery', 'WorkloadRecovery'
:type recovery_mode: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryMode
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'target_info': {'key': 'targetInfo', 'type': 'TargetRestoreInfo'},
'recovery_mode': {'key': 'recoveryMode', 'type': 'str'},
}
_subtype_map = {
'object_type': {'AzureWorkloadSAPHanaPointInTimeRestoreRequest': 'AzureWorkloadSAPHanaPointInTimeRestoreRequest'}
}
def __init__(self, *, recovery_type=None, source_resource_id: str=None, property_bag=None, target_info=None, recovery_mode=None, **kwargs) -> None:
super(AzureWorkloadSAPHanaRestoreRequest, self).__init__(recovery_type=recovery_type, source_resource_id=source_resource_id, property_bag=property_bag, target_info=target_info, recovery_mode=recovery_mode, **kwargs)
self.object_type = 'AzureWorkloadSAPHanaRestoreRequest'
class AzureWorkloadSAPHanaPointInTimeRestoreRequest(AzureWorkloadSAPHanaRestoreRequest):
"""AzureWorkload SAP Hana -specific restore. Specifically for PointInTime/Log
restore.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM on which the
workload that was running is being recovered.
:type source_resource_id: str
:param property_bag: Workload specific property bag.
:type property_bag: dict[str, str]
:param target_info: Details of target database
:type target_info:
~azure.mgmt.recoveryservicesbackup.models.TargetRestoreInfo
:param recovery_mode: Defines whether the current recovery mode is file
restore or database restore. Possible values include: 'Invalid',
'FileRecovery', 'WorkloadRecovery'
:type recovery_mode: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryMode
:param point_in_time: PointInTime value
:type point_in_time: datetime
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'target_info': {'key': 'targetInfo', 'type': 'TargetRestoreInfo'},
'recovery_mode': {'key': 'recoveryMode', 'type': 'str'},
'point_in_time': {'key': 'pointInTime', 'type': 'iso-8601'},
}
def __init__(self, *, recovery_type=None, source_resource_id: str=None, property_bag=None, target_info=None, recovery_mode=None, point_in_time=None, **kwargs) -> None:
super(AzureWorkloadSAPHanaPointInTimeRestoreRequest, self).__init__(recovery_type=recovery_type, source_resource_id=source_resource_id, property_bag=property_bag, target_info=target_info, recovery_mode=recovery_mode, **kwargs)
self.point_in_time = point_in_time
self.object_type = 'AzureWorkloadSAPHanaPointInTimeRestoreRequest'
class AzureWorkloadSAPHanaRecoveryPoint(AzureWorkloadRecoveryPoint):
"""SAPHana specific recoverypoint, specifically encapsulates full/diff
recoverypoints.
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_time_in_utc: UTC time at which recovery point was
created
:vartype recovery_point_time_in_utc: datetime
:ivar type: Type of restore point. Possible values include: 'Invalid',
'Full', 'Log', 'Differential'
:vartype type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointType
"""
_validation = {
'object_type': {'required': True},
'recovery_point_time_in_utc': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_time_in_utc': {'key': 'recoveryPointTimeInUTC', 'type': 'iso-8601'},
'type': {'key': 'type', 'type': 'str'},
}
def __init__(self, **kwargs) -> None:
super(AzureWorkloadSAPHanaRecoveryPoint, self).__init__(**kwargs)
self.object_type = 'AzureWorkloadSAPHanaRecoveryPoint'
class AzureWorkloadSQLAutoProtectionIntent(AzureWorkloadAutoProtectionIntent):
"""Azure Workload SQL Auto Protection intent item.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param item_id: ID of the item which is getting protected. In case of an
Azure VM, it is the ProtectedItemId.
:type item_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param protection_intent_item_type: Required. Constant filled by server.
:type protection_intent_item_type: str
:param workload_item_type: Workload item type of the item for which intent
is to be set. Possible values include: 'Invalid', 'SQLInstance',
'SQLDataBase', 'SAPHanaSystem', 'SAPHanaDatabase', 'SAPAseSystem',
'SAPAseDatabase'
:type workload_item_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadItemType
"""
_validation = {
'protection_intent_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'item_id': {'key': 'itemId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protection_intent_item_type': {'key': 'protectionIntentItemType', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, source_resource_id: str=None, item_id: str=None, policy_id: str=None, protection_state=None, workload_item_type=None, **kwargs) -> None:
super(AzureWorkloadSQLAutoProtectionIntent, self).__init__(backup_management_type=backup_management_type, source_resource_id=source_resource_id, item_id=item_id, policy_id=policy_id, protection_state=protection_state, **kwargs)
self.workload_item_type = workload_item_type
self.protection_intent_item_type = 'AzureWorkloadSQLAutoProtectionIntent'
class AzureWorkloadSQLRecoveryPoint(AzureWorkloadRecoveryPoint):
"""SQL specific recoverypoint, specifically encapsulates full/diff
recoverypoint along with extended info.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadSQLPointInTimeRecoveryPoint
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_time_in_utc: UTC time at which recovery point was
created
:vartype recovery_point_time_in_utc: datetime
:ivar type: Type of restore point. Possible values include: 'Invalid',
'Full', 'Log', 'Differential'
:vartype type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointType
:param extended_info: Extended Info that provides data directory details.
Will be populated in two cases: when a specific recovery point is accessed
using GetRecoveryPoint, or when ListRecoveryPoints is called for a Log RP
with the ExtendedInfo query filter.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadSQLRecoveryPointExtendedInfo
"""
_validation = {
'object_type': {'required': True},
'recovery_point_time_in_utc': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_time_in_utc': {'key': 'recoveryPointTimeInUTC', 'type': 'iso-8601'},
'type': {'key': 'type', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureWorkloadSQLRecoveryPointExtendedInfo'},
}
_subtype_map = {
'object_type': {'AzureWorkloadSQLPointInTimeRecoveryPoint': 'AzureWorkloadSQLPointInTimeRecoveryPoint'}
}
def __init__(self, *, extended_info=None, **kwargs) -> None:
super(AzureWorkloadSQLRecoveryPoint, self).__init__(**kwargs)
self.extended_info = extended_info
self.object_type = 'AzureWorkloadSQLRecoveryPoint'
class AzureWorkloadSQLPointInTimeRecoveryPoint(AzureWorkloadSQLRecoveryPoint):
"""Recovery point specific to PointInTime.
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_time_in_utc: UTC time at which recovery point was
created
:vartype recovery_point_time_in_utc: datetime
:ivar type: Type of restore point. Possible values include: 'Invalid',
'Full', 'Log', 'Differential'
:vartype type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointType
:param extended_info: Extended Info that provides data directory details.
Will be populated in two cases: when a specific recovery point is accessed
using GetRecoveryPoint, or when ListRecoveryPoints is called for a Log RP
with the ExtendedInfo query filter.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.AzureWorkloadSQLRecoveryPointExtendedInfo
:param time_ranges: List of log ranges
:type time_ranges:
list[~azure.mgmt.recoveryservicesbackup.models.PointInTimeRange]
"""
_validation = {
'object_type': {'required': True},
'recovery_point_time_in_utc': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_time_in_utc': {'key': 'recoveryPointTimeInUTC', 'type': 'iso-8601'},
'type': {'key': 'type', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'AzureWorkloadSQLRecoveryPointExtendedInfo'},
'time_ranges': {'key': 'timeRanges', 'type': '[PointInTimeRange]'},
}
def __init__(self, *, extended_info=None, time_ranges=None, **kwargs) -> None:
super(AzureWorkloadSQLPointInTimeRecoveryPoint, self).__init__(extended_info=extended_info, **kwargs)
self.time_ranges = time_ranges
self.object_type = 'AzureWorkloadSQLPointInTimeRecoveryPoint'
class AzureWorkloadSQLRestoreRequest(AzureWorkloadRestoreRequest):
"""AzureWorkload SQL -specific restore. Specifically for full/diff restore.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AzureWorkloadSQLPointInTimeRestoreRequest
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM on which the
workload that was running is being recovered.
:type source_resource_id: str
:param property_bag: Workload specific property bag.
:type property_bag: dict[str, str]
:param target_info: Details of target database
:type target_info:
~azure.mgmt.recoveryservicesbackup.models.TargetRestoreInfo
:param recovery_mode: Defines whether the current recovery mode is file
restore or database restore. Possible values include: 'Invalid',
'FileRecovery', 'WorkloadRecovery'
:type recovery_mode: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryMode
:param should_use_alternate_target_location: Default option set to true.
If this is set to false, an alternate data directory must be provided.
:type should_use_alternate_target_location: bool
:param is_non_recoverable: SQL specific property where the user can choose
to set no-recovery when the restore operation is attempted.
:type is_non_recoverable: bool
:param alternate_directory_paths: Data directory details
:type alternate_directory_paths:
list[~azure.mgmt.recoveryservicesbackup.models.SQLDataDirectoryMapping]
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'target_info': {'key': 'targetInfo', 'type': 'TargetRestoreInfo'},
'recovery_mode': {'key': 'recoveryMode', 'type': 'str'},
'should_use_alternate_target_location': {'key': 'shouldUseAlternateTargetLocation', 'type': 'bool'},
'is_non_recoverable': {'key': 'isNonRecoverable', 'type': 'bool'},
'alternate_directory_paths': {'key': 'alternateDirectoryPaths', 'type': '[SQLDataDirectoryMapping]'},
}
_subtype_map = {
'object_type': {'AzureWorkloadSQLPointInTimeRestoreRequest': 'AzureWorkloadSQLPointInTimeRestoreRequest'}
}
def __init__(self, *, recovery_type=None, source_resource_id: str=None, property_bag=None, target_info=None, recovery_mode=None, should_use_alternate_target_location: bool=None, is_non_recoverable: bool=None, alternate_directory_paths=None, **kwargs) -> None:
super(AzureWorkloadSQLRestoreRequest, self).__init__(recovery_type=recovery_type, source_resource_id=source_resource_id, property_bag=property_bag, target_info=target_info, recovery_mode=recovery_mode, **kwargs)
self.should_use_alternate_target_location = should_use_alternate_target_location
self.is_non_recoverable = is_non_recoverable
self.alternate_directory_paths = alternate_directory_paths
self.object_type = 'AzureWorkloadSQLRestoreRequest'
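# The `_subtype_map` above pairs the wire discriminator value (`objectType`)
# with the name of the concrete model class. As a simplified sketch (the real
# msrest dispatch is more involved, and the `*Sketch` names here are
# hypothetical stand-ins, not the SDK classes), a deserializer can use it
# like this:

```python
class AzureWorkloadSQLRestoreRequestSketch:
    # Discriminator value -> name of the concrete subclass.
    _subtype_map = {
        'object_type': {
            'AzureWorkloadSQLPointInTimeRestoreRequest':
                'AzureWorkloadSQLPointInTimeRestoreRequestSketch',
        }
    }

class AzureWorkloadSQLPointInTimeRestoreRequestSketch(
        AzureWorkloadSQLRestoreRequestSketch):
    pass

# Name-to-class registry, standing in for the SDK's model namespace.
REGISTRY = {
    'AzureWorkloadSQLPointInTimeRestoreRequestSketch':
        AzureWorkloadSQLPointInTimeRestoreRequestSketch,
}

def resolve_model(base_cls, payload):
    """Return the concrete class named by the payload's objectType,
    falling back to the base class when the discriminator is unknown."""
    subtypes = base_cls._subtype_map.get('object_type', {})
    name = subtypes.get(payload.get('objectType'))
    return REGISTRY.get(name, base_cls)

cls = resolve_model(
    AzureWorkloadSQLRestoreRequestSketch,
    {'objectType': 'AzureWorkloadSQLPointInTimeRestoreRequest'})
# cls is AzureWorkloadSQLPointInTimeRestoreRequestSketch
```

# This mirrors why the server-filled `object_type` constant is marked
# required: without the discriminator, the polymorphic payload cannot be
# routed to a subclass.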
class AzureWorkloadSQLPointInTimeRestoreRequest(AzureWorkloadSQLRestoreRequest):
"""AzureWorkload SQL -specific restore. Specifically for PointInTime/Log
restore.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM on which the
workload that was running is being recovered.
:type source_resource_id: str
:param property_bag: Workload specific property bag.
:type property_bag: dict[str, str]
:param target_info: Details of target database
:type target_info:
~azure.mgmt.recoveryservicesbackup.models.TargetRestoreInfo
:param recovery_mode: Defines whether the current recovery mode is file
restore or database restore. Possible values include: 'Invalid',
'FileRecovery', 'WorkloadRecovery'
:type recovery_mode: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryMode
:param should_use_alternate_target_location: Default option set to true.
If this is set to false, an alternate data directory must be provided.
:type should_use_alternate_target_location: bool
:param is_non_recoverable: SQL specific property where the user can choose
to set no-recovery when the restore operation is attempted.
:type is_non_recoverable: bool
:param alternate_directory_paths: Data directory details
:type alternate_directory_paths:
list[~azure.mgmt.recoveryservicesbackup.models.SQLDataDirectoryMapping]
:param point_in_time: PointInTime value
:type point_in_time: datetime
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'target_info': {'key': 'targetInfo', 'type': 'TargetRestoreInfo'},
'recovery_mode': {'key': 'recoveryMode', 'type': 'str'},
'should_use_alternate_target_location': {'key': 'shouldUseAlternateTargetLocation', 'type': 'bool'},
'is_non_recoverable': {'key': 'isNonRecoverable', 'type': 'bool'},
'alternate_directory_paths': {'key': 'alternateDirectoryPaths', 'type': '[SQLDataDirectoryMapping]'},
'point_in_time': {'key': 'pointInTime', 'type': 'iso-8601'},
}
def __init__(self, *, recovery_type=None, source_resource_id: str=None, property_bag=None, target_info=None, recovery_mode=None, should_use_alternate_target_location: bool=None, is_non_recoverable: bool=None, alternate_directory_paths=None, point_in_time=None, **kwargs) -> None:
super(AzureWorkloadSQLPointInTimeRestoreRequest, self).__init__(recovery_type=recovery_type, source_resource_id=source_resource_id, property_bag=property_bag, target_info=target_info, recovery_mode=recovery_mode, should_use_alternate_target_location=should_use_alternate_target_location, is_non_recoverable=is_non_recoverable, alternate_directory_paths=alternate_directory_paths, **kwargs)
self.point_in_time = point_in_time
self.object_type = 'AzureWorkloadSQLPointInTimeRestoreRequest'
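# `point_in_time` is declared as 'iso-8601' in the attribute map: the model
# holds a Python datetime, which goes over the wire as an ISO 8601 timestamp
# string. A minimal sketch of that conversion (stdlib only, not the msrest
# serializer itself):

```python
from datetime import datetime, timezone

# A hypothetical restore target time; any timezone-aware datetime works.
point_in_time = datetime(2023, 5, 1, 12, 30, tzinfo=timezone.utc)

# isoformat() produces the ISO 8601 string form expected on the wire.
wire_value = point_in_time.isoformat()
print(wire_value)  # 2023-05-01T12:30:00+00:00
```

# Passing a timezone-aware datetime avoids ambiguity about which instant the
# log restore should target.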
class AzureWorkloadSQLRecoveryPointExtendedInfo(Model):
"""Extended info class details.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar data_directory_time_in_utc: UTC time at which data directory info
was captured
:vartype data_directory_time_in_utc: datetime
:ivar data_directory_paths: List of data directory paths during restore
operation.
:vartype data_directory_paths:
list[~azure.mgmt.recoveryservicesbackup.models.SQLDataDirectory]
"""
_validation = {
'data_directory_time_in_utc': {'readonly': True},
'data_directory_paths': {'readonly': True},
}
_attribute_map = {
'data_directory_time_in_utc': {'key': 'dataDirectoryTimeInUTC', 'type': 'iso-8601'},
'data_directory_paths': {'key': 'dataDirectoryPaths', 'type': '[SQLDataDirectory]'},
}
def __init__(self, **kwargs) -> None:
super(AzureWorkloadSQLRecoveryPointExtendedInfo, self).__init__(**kwargs)
self.data_directory_time_in_utc = None
self.data_directory_paths = None
class Resource(Model):
"""ARM Resource.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, **kwargs) -> None:
super(Resource, self).__init__(**kwargs)
self.id = None
self.name = None
self.type = None
self.location = location
self.tags = tags
self.e_tag = e_tag
class BackupEngineBaseResource(Resource):
"""The base backup engine class. All workload specific backup engines derive
from this class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: BackupEngineBaseResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.BackupEngineBase
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'BackupEngineBase'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(BackupEngineBaseResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class BackupEngineExtendedInfo(Model):
"""Additional information on backup engine.
:param database_name: Database name of backup engine.
:type database_name: str
:param protected_items_count: Number of protected items in the backup
engine.
:type protected_items_count: int
:param protected_servers_count: Number of protected servers in the backup
engine.
:type protected_servers_count: int
:param disk_count: Number of disks in the backup engine.
:type disk_count: int
:param used_disk_space: Disk space used in the backup engine.
:type used_disk_space: float
:param available_disk_space: Disk space currently available in the backup
engine.
:type available_disk_space: float
:param refreshed_at: Last refresh time in the backup engine.
:type refreshed_at: datetime
:param azure_protected_instances: Protected instances in the backup
engine.
:type azure_protected_instances: int
"""
_attribute_map = {
'database_name': {'key': 'databaseName', 'type': 'str'},
'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
'protected_servers_count': {'key': 'protectedServersCount', 'type': 'int'},
'disk_count': {'key': 'diskCount', 'type': 'int'},
'used_disk_space': {'key': 'usedDiskSpace', 'type': 'float'},
'available_disk_space': {'key': 'availableDiskSpace', 'type': 'float'},
'refreshed_at': {'key': 'refreshedAt', 'type': 'iso-8601'},
'azure_protected_instances': {'key': 'azureProtectedInstances', 'type': 'int'},
}
def __init__(self, *, database_name: str=None, protected_items_count: int=None, protected_servers_count: int=None, disk_count: int=None, used_disk_space: float=None, available_disk_space: float=None, refreshed_at=None, azure_protected_instances: int=None, **kwargs) -> None:
super(BackupEngineExtendedInfo, self).__init__(**kwargs)
self.database_name = database_name
self.protected_items_count = protected_items_count
self.protected_servers_count = protected_servers_count
self.disk_count = disk_count
self.used_disk_space = used_disk_space
self.available_disk_space = available_disk_space
self.refreshed_at = refreshed_at
self.azure_protected_instances = azure_protected_instances
class BackupManagementUsage(Model):
"""Backup management usages of a vault.
:param unit: Unit of the usage. Possible values include: 'Count', 'Bytes',
'Seconds', 'Percent', 'CountPerSecond', 'BytesPerSecond'
:type unit: str or ~azure.mgmt.recoveryservicesbackup.models.UsagesUnit
:param quota_period: Quota period of usage.
:type quota_period: str
:param next_reset_time: Next reset time of usage.
:type next_reset_time: datetime
:param current_value: Current value of usage.
:type current_value: long
:param limit: Limit of usage.
:type limit: long
:param name: Name of usage.
:type name: ~azure.mgmt.recoveryservicesbackup.models.NameInfo
"""
_attribute_map = {
'unit': {'key': 'unit', 'type': 'str'},
'quota_period': {'key': 'quotaPeriod', 'type': 'str'},
'next_reset_time': {'key': 'nextResetTime', 'type': 'iso-8601'},
'current_value': {'key': 'currentValue', 'type': 'long'},
'limit': {'key': 'limit', 'type': 'long'},
'name': {'key': 'name', 'type': 'NameInfo'},
}
def __init__(self, *, unit=None, quota_period: str=None, next_reset_time=None, current_value: int=None, limit: int=None, name=None, **kwargs) -> None:
super(BackupManagementUsage, self).__init__(**kwargs)
self.unit = unit
self.quota_period = quota_period
self.next_reset_time = next_reset_time
self.current_value = current_value
self.limit = limit
self.name = name
class BackupRequestResource(Resource):
"""Base class for backup request. Workload-specific backup requests are
derived from this class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: BackupRequestResource properties
:type properties: ~azure.mgmt.recoveryservicesbackup.models.BackupRequest
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'BackupRequest'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(BackupRequestResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class BackupResourceConfig(Model):
"""The resource storage details.
:param storage_model_type: Storage type. Possible values include:
'Invalid', 'GeoRedundant', 'LocallyRedundant'
:type storage_model_type: str or
~azure.mgmt.recoveryservicesbackup.models.StorageType
:param storage_type: Storage type. Possible values include: 'Invalid',
'GeoRedundant', 'LocallyRedundant'
:type storage_type: str or
~azure.mgmt.recoveryservicesbackup.models.StorageType
:param storage_type_state: Locked or Unlocked. Once a machine is
registered against a resource, the storageTypeState is always Locked.
Possible values include: 'Invalid', 'Locked', 'Unlocked'
:type storage_type_state: str or
~azure.mgmt.recoveryservicesbackup.models.StorageTypeState
"""
_attribute_map = {
'storage_model_type': {'key': 'storageModelType', 'type': 'str'},
'storage_type': {'key': 'storageType', 'type': 'str'},
'storage_type_state': {'key': 'storageTypeState', 'type': 'str'},
}
def __init__(self, *, storage_model_type=None, storage_type=None, storage_type_state=None, **kwargs) -> None:
super(BackupResourceConfig, self).__init__(**kwargs)
self.storage_model_type = storage_model_type
self.storage_type = storage_type
self.storage_type_state = storage_type_state
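# The `_attribute_map` entries above pair each Python attribute (snake_case)
# with its camelCase wire key. As a minimal sketch of how such a map drives
# request-body construction (a simplified stand-in, not the actual msrest
# serializer):

```python
# A map in the same shape as the generated models' _attribute_map.
ATTRIBUTE_MAP = {
    'storage_model_type': {'key': 'storageModelType', 'type': 'str'},
    'storage_type': {'key': 'storageType', 'type': 'str'},
    'storage_type_state': {'key': 'storageTypeState', 'type': 'str'},
}

def serialize(attrs, attribute_map):
    """Rename snake_case attributes to their wire keys, dropping Nones."""
    return {
        spec['key']: attrs[name]
        for name, spec in attribute_map.items()
        if attrs.get(name) is not None
    }

body = serialize({'storage_model_type': 'GeoRedundant',
                  'storage_type': None,
                  'storage_type_state': 'Unlocked'},
                 ATTRIBUTE_MAP)
print(body)  # {'storageModelType': 'GeoRedundant', 'storageTypeState': 'Unlocked'}
```

# Dropping None-valued attributes keeps optional fields out of the request
# body, matching how the optional keyword arguments above default to None.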
class BackupResourceConfigResource(Resource):
"""The resource storage details.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: BackupResourceConfigResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.BackupResourceConfig
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'BackupResourceConfig'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(BackupResourceConfigResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class BackupResourceVaultConfig(Model):
"""Backup resource vault config details.
:param storage_model_type: Storage type. Possible values include:
'Invalid', 'GeoRedundant', 'LocallyRedundant'
:type storage_model_type: str or
~azure.mgmt.recoveryservicesbackup.models.StorageType
:param storage_type: Storage type. Possible values include: 'Invalid',
'GeoRedundant', 'LocallyRedundant'
:type storage_type: str or
~azure.mgmt.recoveryservicesbackup.models.StorageType
:param storage_type_state: Locked or Unlocked. Once a machine is
registered against a resource, the storageTypeState is always Locked.
Possible values include: 'Invalid', 'Locked', 'Unlocked'
:type storage_type_state: str or
~azure.mgmt.recoveryservicesbackup.models.StorageTypeState
:param enhanced_security_state: Enabled or Disabled. Possible values
include: 'Invalid', 'Enabled', 'Disabled'
:type enhanced_security_state: str or
~azure.mgmt.recoveryservicesbackup.models.EnhancedSecurityState
:param soft_delete_feature_state: Soft Delete feature state. Possible
values include: 'Invalid', 'Enabled', 'Disabled'
:type soft_delete_feature_state: str or
~azure.mgmt.recoveryservicesbackup.models.SoftDeleteFeatureState
"""
_attribute_map = {
'storage_model_type': {'key': 'storageModelType', 'type': 'str'},
'storage_type': {'key': 'storageType', 'type': 'str'},
'storage_type_state': {'key': 'storageTypeState', 'type': 'str'},
'enhanced_security_state': {'key': 'enhancedSecurityState', 'type': 'str'},
'soft_delete_feature_state': {'key': 'softDeleteFeatureState', 'type': 'str'},
}
def __init__(self, *, storage_model_type=None, storage_type=None, storage_type_state=None, enhanced_security_state=None, soft_delete_feature_state=None, **kwargs) -> None:
super(BackupResourceVaultConfig, self).__init__(**kwargs)
self.storage_model_type = storage_model_type
self.storage_type = storage_type
self.storage_type_state = storage_type_state
self.enhanced_security_state = enhanced_security_state
self.soft_delete_feature_state = soft_delete_feature_state
class BackupResourceVaultConfigResource(Resource):
"""Backup resource vault config details.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: BackupResourceVaultConfigResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.BackupResourceVaultConfig
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'BackupResourceVaultConfig'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(BackupResourceVaultConfigResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class BackupStatusRequest(Model):
"""BackupStatus request.
:param resource_type: Container Type - VM, SQLPaaS, DPM, AzureFileShare.
Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type resource_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param resource_id: Entire ARM resource ID of the resource.
:type resource_id: str
:param po_logical_name: Protectable Item Logical Name
:type po_logical_name: str
"""
_attribute_map = {
'resource_type': {'key': 'resourceType', 'type': 'str'},
'resource_id': {'key': 'resourceId', 'type': 'str'},
'po_logical_name': {'key': 'poLogicalName', 'type': 'str'},
}
def __init__(self, *, resource_type=None, resource_id: str=None, po_logical_name: str=None, **kwargs) -> None:
super(BackupStatusRequest, self).__init__(**kwargs)
self.resource_type = resource_type
self.resource_id = resource_id
self.po_logical_name = po_logical_name
class BackupStatusResponse(Model):
"""BackupStatus response.
:param protection_status: Specifies whether the container is registered or
not. Possible values include: 'Invalid', 'NotProtected', 'Protecting',
'Protected', 'ProtectionFailed'
:type protection_status: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
:param vault_id: Specifies the ARM resource ID of the vault.
:type vault_id: str
:param fabric_name: Specifies the fabric name - Azure or AD. Possible
values include: 'Invalid', 'Azure'
:type fabric_name: str or
~azure.mgmt.recoveryservicesbackup.models.FabricName
:param container_name: Specifies the product specific container name. E.g.
iaasvmcontainer;iaasvmcontainer;csname;vmname.
:type container_name: str
:param protected_item_name: Specifies the product specific ds name. E.g.
vm;iaasvmcontainer;csname;vmname.
:type protected_item_name: str
:param error_code: ErrorCode in case the intent failed.
:type error_code: str
:param error_message: ErrorMessage in case the intent failed.
:type error_message: str
:param policy_name: Specifies the policy name which is used for protection
:type policy_name: str
:param registration_status: Container registration status
:type registration_status: str
"""
_attribute_map = {
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
'vault_id': {'key': 'vaultId', 'type': 'str'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'protected_item_name': {'key': 'protectedItemName', 'type': 'str'},
'error_code': {'key': 'errorCode', 'type': 'str'},
'error_message': {'key': 'errorMessage', 'type': 'str'},
'policy_name': {'key': 'policyName', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
}
def __init__(self, *, protection_status=None, vault_id: str=None, fabric_name=None, container_name: str=None, protected_item_name: str=None, error_code: str=None, error_message: str=None, policy_name: str=None, registration_status: str=None, **kwargs) -> None:
super(BackupStatusResponse, self).__init__(**kwargs)
self.protection_status = protection_status
self.vault_id = vault_id
self.fabric_name = fabric_name
self.container_name = container_name
self.protected_item_name = protected_item_name
self.error_code = error_code
self.error_message = error_message
self.policy_name = policy_name
self.registration_status = registration_status
class BEKDetails(Model):
"""BEK is bitlocker encryption key.
:param secret_url: Secret is BEK.
:type secret_url: str
:param secret_vault_id: ID of the Key Vault where this Secret is stored.
:type secret_vault_id: str
:param secret_data: BEK data.
:type secret_data: str
"""
_attribute_map = {
'secret_url': {'key': 'secretUrl', 'type': 'str'},
'secret_vault_id': {'key': 'secretVaultId', 'type': 'str'},
'secret_data': {'key': 'secretData', 'type': 'str'},
}
def __init__(self, *, secret_url: str=None, secret_vault_id: str=None, secret_data: str=None, **kwargs) -> None:
super(BEKDetails, self).__init__(**kwargs)
self.secret_url = secret_url
self.secret_vault_id = secret_vault_id
self.secret_data = secret_data
class BMSBackupEngineQueryObject(Model):
"""Query parameters to fetch list of backup engines.
:param expand: Attribute to add extended info.
:type expand: str
"""
_attribute_map = {
'expand': {'key': 'expand', 'type': 'str'},
}
def __init__(self, *, expand: str=None, **kwargs) -> None:
super(BMSBackupEngineQueryObject, self).__init__(**kwargs)
self.expand = expand
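These query-object models all follow the same msrest pattern: the `_attribute_map` declares the wire key (camelCase) and wire type for each snake_case Python attribute, and the SDK serializer uses it to build the request payload. A minimal sketch of that mapping, with a simplified stand-in serializer (not the real msrest `Serializer`, which also handles typing, validation, and nesting):

```python
# Simplified stand-in for msrest serialization: walk the attribute map,
# skip unset (None) attributes, and emit values under their wire keys.
def serialize(obj, attribute_map):
    body = {}
    for attr, meta in attribute_map.items():
        value = getattr(obj, attr, None)
        if value is not None:
            body[meta['key']] = value
    return body

class QueryObject:
    """Stand-in model mirroring BMSBackupEngineQueryObject's single field."""
    _attribute_map = {'expand': {'key': 'expand', 'type': 'str'}}

    def __init__(self, expand=None):
        self.expand = expand

print(serialize(QueryObject(expand='extendedinfo'), QueryObject._attribute_map))
# {'expand': 'extendedinfo'}
```

Unset attributes are simply omitted from the payload, which is why every parameter in these constructors defaults to `None`.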
class BMSBackupEnginesQueryObject(Model):
"""Query parameters to fetch list of backup engines.
:param backup_management_type: Backup management type for the backup
engine. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param friendly_name: Friendly name of the backup engine.
:type friendly_name: str
:param expand: Attribute to add extended info.
:type expand: str
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'expand': {'key': 'expand', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, friendly_name: str=None, expand: str=None, **kwargs) -> None:
super(BMSBackupEnginesQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.friendly_name = friendly_name
self.expand = expand
class BMSBackupSummariesQueryObject(Model):
"""Query parameters to fetch backup summaries.
:param type: Backup management type for this container. Possible values
include: 'Invalid', 'BackupProtectedItemCountSummary',
'BackupProtectionContainerCountSummary'
:type type: str or ~azure.mgmt.recoveryservicesbackup.models.Type
"""
_attribute_map = {
'type': {'key': 'type', 'type': 'str'},
}
def __init__(self, *, type=None, **kwargs) -> None:
super(BMSBackupSummariesQueryObject, self).__init__(**kwargs)
self.type = type
class BMSContainerQueryObject(Model):
"""The query filters that can be used with the list containers API.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Required. Backup management type for this
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param container_type: Type of container for filter. Possible values
include: 'Invalid', 'Unknown', 'IaasVMContainer',
'IaasVMServiceContainer', 'DPMContainer', 'AzureBackupServerContainer',
'MABContainer', 'Cluster', 'AzureSqlContainer', 'Windows', 'VCenter',
'VMAppContainer', 'SQLAGWorkLoadContainer', 'StorageContainer',
'GenericContainer'
:type container_type: str or
~azure.mgmt.recoveryservicesbackup.models.ContainerType
:param backup_engine_name: Backup engine name
:type backup_engine_name: str
:param fabric_name: Fabric name for filter
:type fabric_name: str
:param status: Status of registration of this container with the Recovery
Services Vault.
:type status: str
:param friendly_name: Friendly name of this container.
:type friendly_name: str
"""
_validation = {
'backup_management_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'backup_engine_name': {'key': 'backupEngineName', 'type': 'str'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
}
def __init__(self, *, backup_management_type, container_type=None, backup_engine_name: str=None, fabric_name: str=None, status: str=None, friendly_name: str=None, **kwargs) -> None:
super(BMSContainerQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.container_type = container_type
self.backup_engine_name = backup_engine_name
self.fabric_name = fabric_name
self.status = status
self.friendly_name = friendly_name
class BMSContainersInquiryQueryObject(Model):
"""The query filters that can be used with the inquire container API.
:param backup_management_type: Backup management type for this container.
Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Workload type for this container. Possible values
include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB', 'Exchange',
'Sharepoint', 'VMwareVM', 'SystemState', 'Client', 'GenericDataSource',
'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, **kwargs) -> None:
super(BMSContainersInquiryQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.workload_type = workload_type
class BMSPOQueryObject(Model):
"""Filters to list items that can be backed up.
:param backup_management_type: Backup management type. Possible values
include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM', 'AzureBackupServer',
'AzureSql', 'AzureStorage', 'AzureWorkload', 'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Workload type. Possible values include: 'Invalid',
'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB', 'Exchange', 'Sharepoint',
'VMwareVM', 'SystemState', 'Client', 'GenericDataSource', 'SQLDataBase',
'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param container_name: Full name of the container whose Protectable
Objects should be returned.
:type container_name: str
:param status: Backup status query parameter.
:type status: str
:param friendly_name: Friendly name.
:type friendly_name: str
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, status: str=None, friendly_name: str=None, **kwargs) -> None:
super(BMSPOQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.workload_type = workload_type
self.container_name = container_name
self.status = status
self.friendly_name = friendly_name
class BMSRefreshContainersQueryObject(Model):
"""The query filters that can be used with the refresh container API.
:param backup_management_type: Backup management type for this container.
Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, **kwargs) -> None:
super(BMSRefreshContainersQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
class BMSRPQueryObject(Model):
"""Filters to list backup copies.
:param start_date: Backup copies created after this time.
:type start_date: datetime
:param end_date: Backup copies created before this time.
:type end_date: datetime
:param restore_point_query_type: RestorePoint type. Possible values
include: 'Invalid', 'Full', 'Log', 'Differential', 'FullAndDifferential',
'All'
:type restore_point_query_type: str or
~azure.mgmt.recoveryservicesbackup.models.RestorePointQueryType
:param extended_info: In Get Recovery Point, indicates whether extended
information about the recovery point is requested.
:type extended_info: bool
"""
_attribute_map = {
'start_date': {'key': 'startDate', 'type': 'iso-8601'},
'end_date': {'key': 'endDate', 'type': 'iso-8601'},
'restore_point_query_type': {'key': 'restorePointQueryType', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'bool'},
}
def __init__(self, *, start_date=None, end_date=None, restore_point_query_type=None, extended_info: bool=None, **kwargs) -> None:
super(BMSRPQueryObject, self).__init__(**kwargs)
self.start_date = start_date
self.end_date = end_date
self.restore_point_query_type = restore_point_query_type
self.extended_info = extended_info
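Fields declared with the `'iso-8601'` wire type, such as `start_date` and `end_date` above, are transmitted as ISO 8601 strings. A sketch of the shape such a query takes on the wire, using `datetime.isoformat()` as an approximation of what the real serializer emits (the SDK itself also normalizes timezone and precision):

```python
# Sketch: ISO 8601 serialization of the BMSRPQueryObject date fields.
# The wire keys (startDate, endDate, ...) come from the attribute map above.
from datetime import datetime, timezone

start = datetime(2024, 1, 1, tzinfo=timezone.utc)
end = datetime(2024, 1, 31, tzinfo=timezone.utc)

query = {
    'startDate': start.isoformat(),
    'endDate': end.isoformat(),
    'restorePointQueryType': 'FullAndDifferential',
    'extendedInfo': True,
}
print(query['startDate'])  # 2024-01-01T00:00:00+00:00
```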
class BMSWorkloadItemQueryObject(Model):
"""Filters to list items that can be backed up.
:param backup_management_type: Backup management type. Possible values
include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM', 'AzureBackupServer',
'AzureSql', 'AzureStorage', 'AzureWorkload', 'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_item_type: Workload Item type. Possible values include:
'Invalid', 'SQLInstance', 'SQLDataBase', 'SAPHanaSystem',
'SAPHanaDatabase', 'SAPAseSystem', 'SAPAseDatabase'
:type workload_item_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadItemType
:param workload_type: Workload type. Possible values include: 'Invalid',
'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB', 'Exchange', 'Sharepoint',
'VMwareVM', 'SystemState', 'Client', 'GenericDataSource', 'SQLDataBase',
'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param protection_status: Backup status query parameter. Possible values
include: 'Invalid', 'NotProtected', 'Protecting', 'Protected',
'ProtectionFailed'
:type protection_status: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionStatus
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_item_type': {'key': 'workloadItemType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'protection_status': {'key': 'protectionStatus', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, workload_item_type=None, workload_type=None, protection_status=None, **kwargs) -> None:
super(BMSWorkloadItemQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.workload_item_type = workload_item_type
self.workload_type = workload_type
self.protection_status = protection_status
class ClientDiscoveryDisplay(Model):
"""Localized display information of an operation.
:param provider: Name of the provider for display purposes
:type provider: str
:param resource: ResourceType for which this Operation can be performed.
:type resource: str
:param operation: Operations Name itself.
:type operation: str
:param description: Description of the operation having details of what
operation is about.
:type description: str
"""
_attribute_map = {
'provider': {'key': 'provider', 'type': 'str'},
'resource': {'key': 'resource', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
}
def __init__(self, *, provider: str=None, resource: str=None, operation: str=None, description: str=None, **kwargs) -> None:
super(ClientDiscoveryDisplay, self).__init__(**kwargs)
self.provider = provider
self.resource = resource
self.operation = operation
self.description = description
class ClientDiscoveryForLogSpecification(Model):
"""Class to represent shoebox log specification in json client discovery.
:param name: Name for shoebox log specification.
:type name: str
:param display_name: Localized display name
:type display_name: str
:param blob_duration: Blob duration of the shoebox log specification.
:type blob_duration: str
"""
_attribute_map = {
'name': {'key': 'name', 'type': 'str'},
'display_name': {'key': 'displayName', 'type': 'str'},
'blob_duration': {'key': 'blobDuration', 'type': 'str'},
}
def __init__(self, *, name: str=None, display_name: str=None, blob_duration: str=None, **kwargs) -> None:
super(ClientDiscoveryForLogSpecification, self).__init__(**kwargs)
self.name = name
self.display_name = display_name
self.blob_duration = blob_duration
class ClientDiscoveryForProperties(Model):
"""Class to represent shoebox properties in json client discovery.
:param service_specification: Operation properties.
:type service_specification:
~azure.mgmt.recoveryservicesbackup.models.ClientDiscoveryForServiceSpecification
"""
_attribute_map = {
'service_specification': {'key': 'serviceSpecification', 'type': 'ClientDiscoveryForServiceSpecification'},
}
def __init__(self, *, service_specification=None, **kwargs) -> None:
super(ClientDiscoveryForProperties, self).__init__(**kwargs)
self.service_specification = service_specification
class ClientDiscoveryForServiceSpecification(Model):
"""Class to represent shoebox service specification in json client discovery.
:param log_specifications: List of log specifications of this operation.
:type log_specifications:
list[~azure.mgmt.recoveryservicesbackup.models.ClientDiscoveryForLogSpecification]
"""
_attribute_map = {
'log_specifications': {'key': 'logSpecifications', 'type': '[ClientDiscoveryForLogSpecification]'},
}
def __init__(self, *, log_specifications=None, **kwargs) -> None:
super(ClientDiscoveryForServiceSpecification, self).__init__(**kwargs)
self.log_specifications = log_specifications
class ClientDiscoveryValueForSingleApi(Model):
"""Available operation details.
:param name: Name of the Operation.
:type name: str
:param display: Contains the localized display information for this
particular operation
:type display:
~azure.mgmt.recoveryservicesbackup.models.ClientDiscoveryDisplay
:param origin: The intended executor of the operation; governs the display
of the operation in the RBAC UX and the audit logs UX.
:type origin: str
:param properties: ShoeBox properties for the given operation.
:type properties:
~azure.mgmt.recoveryservicesbackup.models.ClientDiscoveryForProperties
"""
_attribute_map = {
'name': {'key': 'name', 'type': 'str'},
'display': {'key': 'display', 'type': 'ClientDiscoveryDisplay'},
'origin': {'key': 'origin', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ClientDiscoveryForProperties'},
}
def __init__(self, *, name: str=None, display=None, origin: str=None, properties=None, **kwargs) -> None:
super(ClientDiscoveryValueForSingleApi, self).__init__(**kwargs)
self.name = name
self.display = display
self.origin = origin
self.properties = properties
class ClientScriptForConnect(Model):
"""Client script details for file / folder restore.
:param script_content: File content of the client script for file / folder
restore.
:type script_content: str
:param script_extension: File extension of the client script for file /
folder restore - .ps1 , .sh , etc.
:type script_extension: str
:param os_type: OS type - Windows, Linux etc. for which this file / folder
restore client script works.
:type os_type: str
:param url: URL of the executable from which to source the content. If
this is not null, ScriptContent should not be used.
:type url: str
:param script_name_suffix: Mandatory suffix that should be added to the
name of the script that is given for download to the user.
If it is null or empty, ignore it.
:type script_name_suffix: str
"""
_attribute_map = {
'script_content': {'key': 'scriptContent', 'type': 'str'},
'script_extension': {'key': 'scriptExtension', 'type': 'str'},
'os_type': {'key': 'osType', 'type': 'str'},
'url': {'key': 'url', 'type': 'str'},
'script_name_suffix': {'key': 'scriptNameSuffix', 'type': 'str'},
}
def __init__(self, *, script_content: str=None, script_extension: str=None, os_type: str=None, url: str=None, script_name_suffix: str=None, **kwargs) -> None:
super(ClientScriptForConnect, self).__init__(**kwargs)
self.script_content = script_content
self.script_extension = script_extension
self.os_type = os_type
self.url = url
self.script_name_suffix = script_name_suffix
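The docstring above specifies a precedence: when `url` is set, the script content is sourced from it and `script_content` is ignored; the mandatory `script_name_suffix` is appended to the download name when present. A sketch of that logic, where `resolve_script` is a hypothetical helper and not part of the SDK:

```python
# Hypothetical helper sketching ClientScriptForConnect's documented
# semantics: url takes precedence over inline content, and the name
# suffix is appended only when given.
def resolve_script(script_content, url, script_name_suffix, base_name):
    source = ('url', url) if url else ('inline', script_content)
    name = base_name + (script_name_suffix or '')
    return source, name

source, name = resolve_script(
    None, 'https://example.invalid/restore.ps1', '_vm1', 'restore')
print(source, name)  # ('url', 'https://example.invalid/restore.ps1') restore_vm1
```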
class CloudError(Model):
"""CloudError.
"""
_attribute_map = {
}
class ContainerIdentityInfo(Model):
"""Container identity information.
:param unique_name: Unique name of the container
:type unique_name: str
:param aad_tenant_id: Protection container identity - AAD Tenant
:type aad_tenant_id: str
:param service_principal_client_id: Protection container identity - AAD
Service Principal
:type service_principal_client_id: str
:param audience: Protection container identity - Audience
:type audience: str
"""
_attribute_map = {
'unique_name': {'key': 'uniqueName', 'type': 'str'},
'aad_tenant_id': {'key': 'aadTenantId', 'type': 'str'},
'service_principal_client_id': {'key': 'servicePrincipalClientId', 'type': 'str'},
'audience': {'key': 'audience', 'type': 'str'},
}
def __init__(self, *, unique_name: str=None, aad_tenant_id: str=None, service_principal_client_id: str=None, audience: str=None, **kwargs) -> None:
super(ContainerIdentityInfo, self).__init__(**kwargs)
self.unique_name = unique_name
self.aad_tenant_id = aad_tenant_id
self.service_principal_client_id = service_principal_client_id
self.audience = audience
class DailyRetentionFormat(Model):
"""Daily retention format.
:param days_of_the_month: List of days of the month.
:type days_of_the_month:
list[~azure.mgmt.recoveryservicesbackup.models.Day]
"""
_attribute_map = {
'days_of_the_month': {'key': 'daysOfTheMonth', 'type': '[Day]'},
}
def __init__(self, *, days_of_the_month=None, **kwargs) -> None:
super(DailyRetentionFormat, self).__init__(**kwargs)
self.days_of_the_month = days_of_the_month
class DailyRetentionSchedule(Model):
"""Daily retention schedule.
:param retention_times: Retention times of retention policy.
:type retention_times: list[datetime]
:param retention_duration: Retention duration of retention Policy.
:type retention_duration:
~azure.mgmt.recoveryservicesbackup.models.RetentionDuration
"""
_attribute_map = {
'retention_times': {'key': 'retentionTimes', 'type': '[iso-8601]'},
'retention_duration': {'key': 'retentionDuration', 'type': 'RetentionDuration'},
}
def __init__(self, *, retention_times=None, retention_duration=None, **kwargs) -> None:
super(DailyRetentionSchedule, self).__init__(**kwargs)
self.retention_times = retention_times
self.retention_duration = retention_duration
class Day(Model):
"""Day of the week.
:param date_property: Date of the month
:type date_property: int
:param is_last: Whether Date is last date of month
:type is_last: bool
"""
_attribute_map = {
'date_property': {'key': 'date', 'type': 'int'},
'is_last': {'key': 'isLast', 'type': 'bool'},
}
def __init__(self, *, date_property: int=None, is_last: bool=None, **kwargs) -> None:
super(Day, self).__init__(**kwargs)
self.date_property = date_property
self.is_last = is_last
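A `Day` with `is_last=True` stands for the month's final date, so resolving it to a concrete day number requires the calendar for that month. A sketch under that reading, where `resolve_day` is a hypothetical helper, not an SDK function:

```python
# Hypothetical helper: turn a Day-style (date_property, is_last) pair
# into a concrete day number for a given year and month.
import calendar

def resolve_day(date_property, is_last, year, month):
    if is_last:
        # monthrange returns (first weekday, number of days in month)
        return calendar.monthrange(year, month)[1]
    return date_property

print(resolve_day(None, True, 2024, 2))   # 29 (leap year)
print(resolve_day(15, False, 2024, 2))    # 15
```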
class DiskExclusionProperties(Model):
"""DiskExclusionProperties.
:param disk_lun_list: List of Disks' Logical Unit Numbers (LUN) to be used
for VM Protection.
:type disk_lun_list: list[int]
:param is_inclusion_list: Flag to indicate whether DiskLunList is to be
included/excluded from backup.
:type is_inclusion_list: bool
"""
_attribute_map = {
'disk_lun_list': {'key': 'diskLunList', 'type': '[int]'},
'is_inclusion_list': {'key': 'isInclusionList', 'type': 'bool'},
}
def __init__(self, *, disk_lun_list=None, is_inclusion_list: bool=None, **kwargs) -> None:
super(DiskExclusionProperties, self).__init__(**kwargs)
self.disk_lun_list = disk_lun_list
self.is_inclusion_list = is_inclusion_list
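Per the docstring, `is_inclusion_list=True` means only the listed LUNs are protected, while `False` means everything except them is. A sketch of that selection rule, where `select_luns` is a hypothetical helper rather than SDK code:

```python
# Hypothetical helper sketching DiskExclusionProperties semantics:
# inclusion list -> keep only listed LUNs; exclusion list -> keep the rest.
def select_luns(all_luns, disk_lun_list, is_inclusion_list):
    listed = set(disk_lun_list or [])
    if is_inclusion_list:
        return [lun for lun in all_luns if lun in listed]
    return [lun for lun in all_luns if lun not in listed]

print(select_luns([0, 1, 2, 3], [1, 2], True))   # [1, 2]
print(select_luns([0, 1, 2, 3], [1, 2], False))  # [0, 3]
```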
class DiskInformation(Model):
"""Disk information.
:param lun:
:type lun: int
:param name:
:type name: str
"""
_attribute_map = {
'lun': {'key': 'lun', 'type': 'int'},
'name': {'key': 'name', 'type': 'str'},
}
def __init__(self, *, lun: int=None, name: str=None, **kwargs) -> None:
super(DiskInformation, self).__init__(**kwargs)
self.lun = lun
self.name = name
class DistributedNodesInfo(Model):
"""This is used to represent the various nodes of the distributed container.
:param node_name: Name of the node under a distributed container.
:type node_name: str
:param status: Status of this Node.
Failed | Succeeded
:type status: str
:param error_detail: Error Details if the Status is non-success.
:type error_detail: ~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
"""
_attribute_map = {
'node_name': {'key': 'nodeName', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'error_detail': {'key': 'errorDetail', 'type': 'ErrorDetail'},
}
def __init__(self, *, node_name: str=None, status: str=None, error_detail=None, **kwargs) -> None:
super(DistributedNodesInfo, self).__init__(**kwargs)
self.node_name = node_name
self.status = status
self.error_detail = error_detail
class DpmBackupEngine(BackupEngineBase):
"""Data Protection Manager (DPM) specific backup engine.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the backup engine.
:type friendly_name: str
:param backup_management_type: Type of backup management for the backup
engine. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Registration status of the backup engine with
the Recovery Services Vault.
:type registration_status: str
:param backup_engine_state: Status of the backup engine with the Recovery
Services Vault. Possible values: Active, Deleting, DeleteFailed.
:type backup_engine_state: str
:param health_status: Backup status of the backup engine.
:type health_status: str
:param can_re_register: Flag indicating whether the backup engine can be
re-registered once it is already registered.
:type can_re_register: bool
:param backup_engine_id: ID of the backup engine.
:type backup_engine_id: str
:param dpm_version: Backup engine version
:type dpm_version: str
:param azure_backup_agent_version: Backup agent version
:type azure_backup_agent_version: str
:param is_azure_backup_agent_upgrade_available: Whether a backup agent
upgrade is available.
:type is_azure_backup_agent_upgrade_available: bool
:param is_dpm_upgrade_available: Whether a backup engine upgrade is
available.
:type is_dpm_upgrade_available: bool
:param extended_info: Extended info of the backup engine.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.BackupEngineExtendedInfo
:param backup_engine_type: Required. Constant filled by server.
:type backup_engine_type: str
"""
_validation = {
'backup_engine_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'backup_engine_state': {'key': 'backupEngineState', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'can_re_register': {'key': 'canReRegister', 'type': 'bool'},
'backup_engine_id': {'key': 'backupEngineId', 'type': 'str'},
'dpm_version': {'key': 'dpmVersion', 'type': 'str'},
'azure_backup_agent_version': {'key': 'azureBackupAgentVersion', 'type': 'str'},
'is_azure_backup_agent_upgrade_available': {'key': 'isAzureBackupAgentUpgradeAvailable', 'type': 'bool'},
'is_dpm_upgrade_available': {'key': 'isDpmUpgradeAvailable', 'type': 'bool'},
'extended_info': {'key': 'extendedInfo', 'type': 'BackupEngineExtendedInfo'},
'backup_engine_type': {'key': 'backupEngineType', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, backup_engine_state: str=None, health_status: str=None, can_re_register: bool=None, backup_engine_id: str=None, dpm_version: str=None, azure_backup_agent_version: str=None, is_azure_backup_agent_upgrade_available: bool=None, is_dpm_upgrade_available: bool=None, extended_info=None, **kwargs) -> None:
super(DpmBackupEngine, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, backup_engine_state=backup_engine_state, health_status=health_status, can_re_register=can_re_register, backup_engine_id=backup_engine_id, dpm_version=dpm_version, azure_backup_agent_version=azure_backup_agent_version, is_azure_backup_agent_upgrade_available=is_azure_backup_agent_upgrade_available, is_dpm_upgrade_available=is_dpm_upgrade_available, extended_info=extended_info, **kwargs)
self.backup_engine_type = 'DpmBackupEngine'
class DPMContainerExtendedInfo(Model):
"""Additional information of the DPMContainer.
:param last_refreshed_at: Last refresh time of the DPMContainer.
:type last_refreshed_at: datetime
"""
_attribute_map = {
'last_refreshed_at': {'key': 'lastRefreshedAt', 'type': 'iso-8601'},
}
def __init__(self, *, last_refreshed_at=None, **kwargs) -> None:
super(DPMContainerExtendedInfo, self).__init__(**kwargs)
self.last_refreshed_at = last_refreshed_at
class DpmErrorInfo(Model):
"""DPM workload-specific error information.
:param error_string: Localized error string.
:type error_string: str
:param recommendations: List of localized recommendations for above error
code.
:type recommendations: list[str]
"""
_attribute_map = {
'error_string': {'key': 'errorString', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, *, error_string: str=None, recommendations=None, **kwargs) -> None:
super(DpmErrorInfo, self).__init__(**kwargs)
self.error_string = error_string
self.recommendations = recommendations
class DpmJob(Job):
"""DPM workload-specific job object.
All required parameters must be populated in order to send to Azure.
:param entity_friendly_name: Friendly name of the entity on which the
current job is executing.
:type entity_friendly_name: str
:param backup_management_type: Backup management type to execute the
current job. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: The operation name.
:type operation: str
:param status: Job status.
:type status: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param activity_id: ActivityId of job.
:type activity_id: str
:param job_type: Required. Constant filled by server.
:type job_type: str
:param duration: Time elapsed for job.
:type duration: timedelta
:param dpm_server_name: DPM server name managing the backup item or backup
job.
:type dpm_server_name: str
:param container_name: Name of cluster/server protecting current backup
item, if any.
:type container_name: str
:param container_type: Type of container.
:type container_type: str
:param workload_type: Type of backup item.
:type workload_type: str
:param actions_info: The state/actions applicable on this job like
cancel/retry.
:type actions_info: list[str or
~azure.mgmt.recoveryservicesbackup.models.JobSupportedAction]
:param error_details: The errors.
:type error_details:
list[~azure.mgmt.recoveryservicesbackup.models.DpmErrorInfo]
:param extended_info: Additional information for this job.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.DpmJobExtendedInfo
"""
_validation = {
'job_type': {'required': True},
}
_attribute_map = {
'entity_friendly_name': {'key': 'entityFriendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'activity_id': {'key': 'activityId', 'type': 'str'},
'job_type': {'key': 'jobType', 'type': 'str'},
'duration': {'key': 'duration', 'type': 'duration'},
'dpm_server_name': {'key': 'dpmServerName', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'actions_info': {'key': 'actionsInfo', 'type': '[JobSupportedAction]'},
'error_details': {'key': 'errorDetails', 'type': '[DpmErrorInfo]'},
'extended_info': {'key': 'extendedInfo', 'type': 'DpmJobExtendedInfo'},
}
def __init__(self, *, entity_friendly_name: str=None, backup_management_type=None, operation: str=None, status: str=None, start_time=None, end_time=None, activity_id: str=None, duration=None, dpm_server_name: str=None, container_name: str=None, container_type: str=None, workload_type: str=None, actions_info=None, error_details=None, extended_info=None, **kwargs) -> None:
super(DpmJob, self).__init__(entity_friendly_name=entity_friendly_name, backup_management_type=backup_management_type, operation=operation, status=status, start_time=start_time, end_time=end_time, activity_id=activity_id, **kwargs)
self.duration = duration
self.dpm_server_name = dpm_server_name
self.container_name = container_name
self.container_type = container_type
self.workload_type = workload_type
self.actions_info = actions_info
self.error_details = error_details
self.extended_info = extended_info
self.job_type = 'DpmJob'
class DpmJobExtendedInfo(Model):
"""Additional information on the DPM workload-specific job.
:param tasks_list: List of tasks associated with this job.
:type tasks_list:
list[~azure.mgmt.recoveryservicesbackup.models.DpmJobTaskDetails]
:param property_bag: The job properties.
:type property_bag: dict[str, str]
:param dynamic_error_message: Non-localized error message on job
execution.
:type dynamic_error_message: str
"""
_attribute_map = {
'tasks_list': {'key': 'tasksList', 'type': '[DpmJobTaskDetails]'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'dynamic_error_message': {'key': 'dynamicErrorMessage', 'type': 'str'},
}
def __init__(self, *, tasks_list=None, property_bag=None, dynamic_error_message: str=None, **kwargs) -> None:
super(DpmJobExtendedInfo, self).__init__(**kwargs)
self.tasks_list = tasks_list
self.property_bag = property_bag
self.dynamic_error_message = dynamic_error_message


class DpmJobTaskDetails(Model):
"""DPM workload-specific job task details.
:param task_id: The task display name.
:type task_id: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param duration: Time elapsed for task.
:type duration: timedelta
:param status: The status.
:type status: str
"""
_attribute_map = {
'task_id': {'key': 'taskId', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'duration': {'key': 'duration', 'type': 'duration'},
'status': {'key': 'status', 'type': 'str'},
}
def __init__(self, *, task_id: str=None, start_time=None, end_time=None, duration=None, status: str=None, **kwargs) -> None:
super(DpmJobTaskDetails, self).__init__(**kwargs)
self.task_id = task_id
self.start_time = start_time
self.end_time = end_time
self.duration = duration
self.status = status
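Each model's `_attribute_map` maps Python snake_case attribute names to the camelCase keys used on the wire, together with a serialization type. The following is a minimal sketch of how such a map drives serialization; `to_wire` is a hypothetical helper for illustration, not the real msrest serializer.

```python
from datetime import datetime, timedelta

def to_wire(obj: dict, attribute_map: dict) -> dict:
    """Rename snake_case attrs to wire keys per an msrest-style
    _attribute_map, dropping None values (sketch only)."""
    wire = {}
    for attr, meta in attribute_map.items():
        value = obj.get(attr)
        if value is None:
            continue
        if meta['type'] == 'iso-8601' and isinstance(value, datetime):
            value = value.isoformat()
        elif meta['type'] == 'duration' and isinstance(value, timedelta):
            value = str(value)  # real msrest emits ISO-8601 durations instead
        wire[meta['key']] = value
    return wire

# Subset of DpmJobTaskDetails._attribute_map, used as data:
task_map = {
    'task_id': {'key': 'taskId', 'type': 'str'},
    'start_time': {'key': 'startTime', 'type': 'iso-8601'},
    'status': {'key': 'status', 'type': 'str'},
}
body = to_wire(
    {'task_id': 'Backup', 'start_time': datetime(2020, 1, 1), 'status': 'Completed'},
    task_map,
)
# body == {'taskId': 'Backup', 'startTime': '2020-01-01T00:00:00', 'status': 'Completed'}
```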


class DPMProtectedItem(ProtectedItem):
"""Additional information on Backup engine specific backup item.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the managed item
:type friendly_name: str
:param backup_engine_name: Backup Management server protecting this backup
item
:type backup_engine_name: str
:param protection_state: Protection state of the backup engine. Possible
values include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectedItemState
:param extended_info: Extended info of the backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.DPMProtectedItemExtendedInfo
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_engine_name': {'key': 'backupEngineName', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'DPMProtectedItemExtendedInfo'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, backup_engine_name: str=None, protection_state=None, extended_info=None, **kwargs) -> None:
super(DPMProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
self.friendly_name = friendly_name
self.backup_engine_name = backup_engine_name
self.protection_state = protection_state
self.extended_info = extended_info
self.protected_item_type = 'DPMProtectedItem'


class DPMProtectedItemExtendedInfo(Model):
"""Additional information of DPM Protected item.
:param protectable_object_load_path: Attribute to provide information on
various DBs.
:type protectable_object_load_path: dict[str, str]
:param protected: To check if backup item is disk protected.
:type protected: bool
:param is_present_on_cloud: To check if backup item is cloud protected.
:type is_present_on_cloud: bool
:param last_backup_status: Last backup status information on backup item.
:type last_backup_status: str
:param last_refreshed_at: Last refresh time on backup item.
:type last_refreshed_at: datetime
:param oldest_recovery_point: Oldest cloud recovery point time.
:type oldest_recovery_point: datetime
:param recovery_point_count: Cloud recovery point count.
:type recovery_point_count: int
:param on_premise_oldest_recovery_point: Oldest disk recovery point time.
:type on_premise_oldest_recovery_point: datetime
:param on_premise_latest_recovery_point: Latest disk recovery point time.
:type on_premise_latest_recovery_point: datetime
:param on_premise_recovery_point_count: Disk recovery point count.
:type on_premise_recovery_point_count: int
:param is_collocated: To check if backup item is collocated.
:type is_collocated: bool
:param protection_group_name: Protection group name of the backup item.
:type protection_group_name: str
:param disk_storage_used_in_bytes: Disk storage used, in bytes.
:type disk_storage_used_in_bytes: str
:param total_disk_storage_size_in_bytes: Total disk storage size, in bytes.
:type total_disk_storage_size_in_bytes: str
"""
_attribute_map = {
'protectable_object_load_path': {'key': 'protectableObjectLoadPath', 'type': '{str}'},
'protected': {'key': 'protected', 'type': 'bool'},
'is_present_on_cloud': {'key': 'isPresentOnCloud', 'type': 'bool'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_refreshed_at': {'key': 'lastRefreshedAt', 'type': 'iso-8601'},
'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
'on_premise_oldest_recovery_point': {'key': 'onPremiseOldestRecoveryPoint', 'type': 'iso-8601'},
'on_premise_latest_recovery_point': {'key': 'onPremiseLatestRecoveryPoint', 'type': 'iso-8601'},
'on_premise_recovery_point_count': {'key': 'onPremiseRecoveryPointCount', 'type': 'int'},
'is_collocated': {'key': 'isCollocated', 'type': 'bool'},
'protection_group_name': {'key': 'protectionGroupName', 'type': 'str'},
'disk_storage_used_in_bytes': {'key': 'diskStorageUsedInBytes', 'type': 'str'},
'total_disk_storage_size_in_bytes': {'key': 'totalDiskStorageSizeInBytes', 'type': 'str'},
}
def __init__(self, *, protectable_object_load_path=None, protected: bool=None, is_present_on_cloud: bool=None, last_backup_status: str=None, last_refreshed_at=None, oldest_recovery_point=None, recovery_point_count: int=None, on_premise_oldest_recovery_point=None, on_premise_latest_recovery_point=None, on_premise_recovery_point_count: int=None, is_collocated: bool=None, protection_group_name: str=None, disk_storage_used_in_bytes: str=None, total_disk_storage_size_in_bytes: str=None, **kwargs) -> None:
super(DPMProtectedItemExtendedInfo, self).__init__(**kwargs)
self.protectable_object_load_path = protectable_object_load_path
self.protected = protected
self.is_present_on_cloud = is_present_on_cloud
self.last_backup_status = last_backup_status
self.last_refreshed_at = last_refreshed_at
self.oldest_recovery_point = oldest_recovery_point
self.recovery_point_count = recovery_point_count
self.on_premise_oldest_recovery_point = on_premise_oldest_recovery_point
self.on_premise_latest_recovery_point = on_premise_latest_recovery_point
self.on_premise_recovery_point_count = on_premise_recovery_point_count
self.is_collocated = is_collocated
self.protection_group_name = protection_group_name
self.disk_storage_used_in_bytes = disk_storage_used_in_bytes
self.total_disk_storage_size_in_bytes = total_disk_storage_size_in_bytes


class EncryptionDetails(Model):
"""Details needed if the VM was encrypted at the time of backup.
:param encryption_enabled: Identifies whether this backup copy represents
an encrypted VM at the time of backup.
:type encryption_enabled: bool
:param kek_url: Key Url.
:type kek_url: str
:param secret_key_url: Secret Url.
:type secret_key_url: str
:param kek_vault_id: ID of Key Vault where KEK is stored.
:type kek_vault_id: str
:param secret_key_vault_id: ID of Key Vault where Secret is stored.
:type secret_key_vault_id: str
"""
_attribute_map = {
'encryption_enabled': {'key': 'encryptionEnabled', 'type': 'bool'},
'kek_url': {'key': 'kekUrl', 'type': 'str'},
'secret_key_url': {'key': 'secretKeyUrl', 'type': 'str'},
'kek_vault_id': {'key': 'kekVaultId', 'type': 'str'},
'secret_key_vault_id': {'key': 'secretKeyVaultId', 'type': 'str'},
}
def __init__(self, *, encryption_enabled: bool=None, kek_url: str=None, secret_key_url: str=None, kek_vault_id: str=None, secret_key_vault_id: str=None, **kwargs) -> None:
super(EncryptionDetails, self).__init__(**kwargs)
self.encryption_enabled = encryption_enabled
self.kek_url = kek_url
self.secret_key_url = secret_key_url
self.kek_vault_id = kek_vault_id
self.secret_key_vault_id = secret_key_vault_id


class ErrorDetail(Model):
"""Error Detail class which encapsulates Code, Message and Recommendations.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar code: Error code.
:vartype code: str
:ivar message: Error Message related to the Code.
:vartype message: str
:ivar recommendations: List of recommendation strings.
:vartype recommendations: list[str]
"""
_validation = {
'code': {'readonly': True},
'message': {'readonly': True},
'recommendations': {'readonly': True},
}
_attribute_map = {
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, **kwargs) -> None:
super(ErrorDetail, self).__init__(**kwargs)
self.code = None
self.message = None
self.recommendations = None
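`ErrorDetail` marks all of its fields `readonly`: the constructor forces them to `None` and only deserialization of a server response populates them. A stand-in sketch of that pattern (`ErrorDetailSketch` and `from_response` are illustrative names, not SDK API):

```python
class ErrorDetailSketch:
    """Minimal stand-in for the msrest readonly-field pattern."""

    _validation = {'code': {'readonly': True}, 'message': {'readonly': True}}

    def __init__(self):
        # Client code cannot meaningfully set these; the server fills them.
        self.code = None
        self.message = None

    @classmethod
    def from_response(cls, payload: dict) -> 'ErrorDetailSketch':
        # Only deserialization of a server payload populates readonly fields.
        obj = cls()
        obj.code = payload.get('code')
        obj.message = payload.get('message')
        return obj

detail = ErrorDetailSketch.from_response(
    {'code': 'UserErrorObjectNotFound', 'message': 'Item not found.'}
)
```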


class OperationResultInfoBase(Model):
"""Base class for operation result info.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: ExportJobsOperationResultInfo, OperationResultInfo
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
}
_subtype_map = {
'object_type': {'ExportJobsOperationResultInfo': 'ExportJobsOperationResultInfo', 'OperationResultInfo': 'OperationResultInfo'}
}
def __init__(self, **kwargs) -> None:
super(OperationResultInfoBase, self).__init__(**kwargs)
self.object_type = None
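The `_subtype_map` above is what makes deserialization polymorphic: the `objectType` discriminator in the response payload selects the concrete subclass. A hypothetical resolver mirroring that dispatch (the real logic lives inside msrest's deserializer):

```python
# Same mapping as OperationResultInfoBase._subtype_map['object_type']:
SUBTYPE_MAP = {
    'ExportJobsOperationResultInfo': 'ExportJobsOperationResultInfo',
    'OperationResultInfo': 'OperationResultInfo',
}

def resolve_subtype(payload: dict, default: str = 'OperationResultInfoBase') -> str:
    """Pick the concrete model name from the wire discriminator (sketch)."""
    return SUBTYPE_MAP.get(payload.get('objectType'), default)

name = resolve_subtype({'objectType': 'ExportJobsOperationResultInfo'})
# name == 'ExportJobsOperationResultInfo'
```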


class ExportJobsOperationResultInfo(OperationResultInfoBase):
"""This class is used to send blob details after exporting jobs.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param blob_url: URL of the blob into which the serialized string of list
of jobs is exported.
:type blob_url: str
:param blob_sas_key: SAS key to access the blob. It expires in 15 minutes.
:type blob_sas_key: str
:param excel_file_blob_url: URL of the blob into which the ExcelFile is
uploaded.
:type excel_file_blob_url: str
:param excel_file_blob_sas_key: SAS key to access the blob. It expires in
15 minutes.
:type excel_file_blob_sas_key: str
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'blob_url': {'key': 'blobUrl', 'type': 'str'},
'blob_sas_key': {'key': 'blobSasKey', 'type': 'str'},
'excel_file_blob_url': {'key': 'excelFileBlobUrl', 'type': 'str'},
'excel_file_blob_sas_key': {'key': 'excelFileBlobSasKey', 'type': 'str'},
}
def __init__(self, *, blob_url: str=None, blob_sas_key: str=None, excel_file_blob_url: str=None, excel_file_blob_sas_key: str=None, **kwargs) -> None:
super(ExportJobsOperationResultInfo, self).__init__(**kwargs)
self.blob_url = blob_url
self.blob_sas_key = blob_sas_key
self.excel_file_blob_url = excel_file_blob_url
self.excel_file_blob_sas_key = excel_file_blob_sas_key
self.object_type = 'ExportJobsOperationResultInfo'


class ExtendedProperties(Model):
"""Extended Properties for Azure IaasVM Backup.
:param disk_exclusion_properties: Extended Properties for Disk Exclusion.
:type disk_exclusion_properties:
~azure.mgmt.recoveryservicesbackup.models.DiskExclusionProperties
"""
_attribute_map = {
'disk_exclusion_properties': {'key': 'diskExclusionProperties', 'type': 'DiskExclusionProperties'},
}
def __init__(self, *, disk_exclusion_properties=None, **kwargs) -> None:
super(ExtendedProperties, self).__init__(**kwargs)
self.disk_exclusion_properties = disk_exclusion_properties


class GenericContainer(ProtectionContainer):
"""Base class for generic container of backup items.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param fabric_name: Name of the container's fabric
:type fabric_name: str
:param extended_information: Extended information (not returned in List
container API calls)
:type extended_information:
~azure.mgmt.recoveryservicesbackup.models.GenericContainerExtendedInfo
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
'extended_information': {'key': 'extendedInformation', 'type': 'GenericContainerExtendedInfo'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, fabric_name: str=None, extended_information=None, **kwargs) -> None:
super(GenericContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
self.fabric_name = fabric_name
self.extended_information = extended_information
self.container_type = 'GenericContainer'


class GenericContainerExtendedInfo(Model):
"""Container extended information.
:param raw_cert_data: Public key of container cert
:type raw_cert_data: str
:param container_identity_info: Container identity information
:type container_identity_info:
~azure.mgmt.recoveryservicesbackup.models.ContainerIdentityInfo
:param service_endpoints: Azure Backup Service Endpoints for the container
:type service_endpoints: dict[str, str]
"""
_attribute_map = {
'raw_cert_data': {'key': 'rawCertData', 'type': 'str'},
'container_identity_info': {'key': 'containerIdentityInfo', 'type': 'ContainerIdentityInfo'},
'service_endpoints': {'key': 'serviceEndpoints', 'type': '{str}'},
}
def __init__(self, *, raw_cert_data: str=None, container_identity_info=None, service_endpoints=None, **kwargs) -> None:
super(GenericContainerExtendedInfo, self).__init__(**kwargs)
self.raw_cert_data = raw_cert_data
self.container_identity_info = container_identity_info
self.service_endpoints = service_endpoints


class GenericProtectedItem(ProtectedItem):
"""Base class for backup items.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of container
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify that deferred deleted DS is to be
moved into Pause state
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param policy_state: Indicates consistency of policy object and policy
applied to this backup item.
:type policy_state: str
:param protection_state: Backup state of this backup item. Possible values
include: 'Invalid', 'IRPending', 'Protected', 'ProtectionError',
'ProtectionStopped', 'ProtectionPaused'
:type protection_state: str or
~azure.mgmt.recoveryservicesbackup.models.ProtectionState
:param protected_item_id: Data Plane Service ID of the protected item.
:type protected_item_id: long
:param source_associations: Loosely coupled (type, value) associations
(example - parent of a protected item)
:type source_associations: dict[str, str]
:param fabric_name: Name of this backup item's fabric.
:type fabric_name: str
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'policy_state': {'key': 'policyState', 'type': 'str'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'protected_item_id': {'key': 'protectedItemId', 'type': 'long'},
'source_associations': {'key': 'sourceAssociations', 'type': '{str}'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, policy_state: str=None, protection_state=None, protected_item_id: int=None, source_associations=None, fabric_name: str=None, **kwargs) -> None:
super(GenericProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
self.friendly_name = friendly_name
self.policy_state = policy_state
self.protection_state = protection_state
self.protected_item_id = protected_item_id
self.source_associations = source_associations
self.fabric_name = fabric_name
self.protected_item_type = 'GenericProtectedItem'


class GenericProtectionPolicy(ProtectionPolicy):
"""Azure VM (Mercury) workload-specific backup policy.
All required parameters must be populated in order to send to Azure.
:param protected_items_count: Number of items associated with this policy.
:type protected_items_count: int
:param backup_management_type: Required. Constant filled by server.
:type backup_management_type: str
:param sub_protection_policy: List of sub-protection policies which
includes schedule and retention
:type sub_protection_policy:
list[~azure.mgmt.recoveryservicesbackup.models.SubProtectionPolicy]
:param time_zone: TimeZone optional input as string. For example: TimeZone
= "Pacific Standard Time".
:type time_zone: str
:param fabric_name: Name of this policy's fabric.
:type fabric_name: str
"""
_validation = {
'backup_management_type': {'required': True},
}
_attribute_map = {
'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'sub_protection_policy': {'key': 'subProtectionPolicy', 'type': '[SubProtectionPolicy]'},
'time_zone': {'key': 'timeZone', 'type': 'str'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
}
def __init__(self, *, protected_items_count: int=None, sub_protection_policy=None, time_zone: str=None, fabric_name: str=None, **kwargs) -> None:
super(GenericProtectionPolicy, self).__init__(protected_items_count=protected_items_count, **kwargs)
self.sub_protection_policy = sub_protection_policy
self.time_zone = time_zone
self.fabric_name = fabric_name
self.backup_management_type = 'GenericProtectionPolicy'


class GenericRecoveryPoint(RecoveryPoint):
"""Generic backup copy.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param friendly_name: Friendly name of the backup copy.
:type friendly_name: str
:param recovery_point_type: Type of the backup copy.
:type recovery_point_type: str
:param recovery_point_time: Time at which this backup copy was created.
:type recovery_point_time: datetime
:param recovery_point_additional_info: Additional information associated
with this backup copy.
:type recovery_point_additional_info: str
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'recovery_point_type': {'key': 'recoveryPointType', 'type': 'str'},
'recovery_point_time': {'key': 'recoveryPointTime', 'type': 'iso-8601'},
'recovery_point_additional_info': {'key': 'recoveryPointAdditionalInfo', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, recovery_point_type: str=None, recovery_point_time=None, recovery_point_additional_info: str=None, **kwargs) -> None:
super(GenericRecoveryPoint, self).__init__(**kwargs)
self.friendly_name = friendly_name
self.recovery_point_type = recovery_point_type
self.recovery_point_time = recovery_point_time
self.recovery_point_additional_info = recovery_point_additional_info
self.object_type = 'GenericRecoveryPoint'


class GetProtectedItemQueryObject(Model):
"""Filters to list backup items.
:param expand: Specifies whether additional information should be provided
for this item.
:type expand: str
"""
_attribute_map = {
'expand': {'key': 'expand', 'type': 'str'},
}
def __init__(self, *, expand: str=None, **kwargs) -> None:
super(GetProtectedItemQueryObject, self).__init__(**kwargs)
self.expand = expand


class IaasVMBackupRequest(BackupRequest):
"""IaaS VM workload-specific backup request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_point_expiry_time_in_utc: Backup copy will expire after
the time specified (UTC).
:type recovery_point_expiry_time_in_utc: datetime
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_expiry_time_in_utc': {'key': 'recoveryPointExpiryTimeInUTC', 'type': 'iso-8601'},
}
def __init__(self, *, recovery_point_expiry_time_in_utc=None, **kwargs) -> None:
super(IaasVMBackupRequest, self).__init__(**kwargs)
self.recovery_point_expiry_time_in_utc = recovery_point_expiry_time_in_utc
self.object_type = 'IaasVMBackupRequest'
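`recovery_point_expiry_time_in_utc` is declared with the `iso-8601` wire type, so the serializer renders the `datetime` as an ISO-8601 UTC string under the `recoveryPointExpiryTimeInUTC` key. A sketch of that rendering (`to_iso8601_utc` is an illustrative helper; the real conversion is done by msrest):

```python
from datetime import datetime, timezone

def to_iso8601_utc(dt: datetime) -> str:
    """Render a datetime as an ISO-8601 UTC string with a 'Z' suffix (sketch)."""
    if dt.tzinfo is None:
        # Assumption: naive datetimes are treated as already being UTC.
        dt = dt.replace(tzinfo=timezone.utc)
    return dt.astimezone(timezone.utc).isoformat().replace('+00:00', 'Z')

expiry = to_iso8601_utc(datetime(2020, 6, 1, 12, 30))
# expiry == '2020-06-01T12:30:00Z'
```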


class IaasVMILRRegistrationRequest(ILRRequest):
"""Restore files/folders from a backup copy of IaaS VM.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_point_id: ID of the IaaS VM backup copy from which the
files/folders have to be restored.
:type recovery_point_id: str
:param virtual_machine_id: Fully qualified ARM ID of the virtual machine
whose files/folders have to be restored.
:type virtual_machine_id: str
:param initiator_name: iSCSI initiator name.
:type initiator_name: str
:param renew_existing_registration: Whether to renew existing registration
with the iSCSI server.
:type renew_existing_registration: bool
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_id': {'key': 'recoveryPointId', 'type': 'str'},
'virtual_machine_id': {'key': 'virtualMachineId', 'type': 'str'},
'initiator_name': {'key': 'initiatorName', 'type': 'str'},
'renew_existing_registration': {'key': 'renewExistingRegistration', 'type': 'bool'},
}
def __init__(self, *, recovery_point_id: str=None, virtual_machine_id: str=None, initiator_name: str=None, renew_existing_registration: bool=None, **kwargs) -> None:
super(IaasVMILRRegistrationRequest, self).__init__(**kwargs)
self.recovery_point_id = recovery_point_id
self.virtual_machine_id = virtual_machine_id
self.initiator_name = initiator_name
self.renew_existing_registration = renew_existing_registration
self.object_type = 'IaasVMILRRegistrationRequest'


class IaasVMRecoveryPoint(RecoveryPoint):
"""IaaS VM workload specific backup copy.
Variables are only populated by the server, and will be ignored when
sending a request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:ivar recovery_point_type: Type of the backup copy.
:vartype recovery_point_type: str
:ivar recovery_point_time: Time at which this backup copy was created.
:vartype recovery_point_time: datetime
:ivar recovery_point_additional_info: Additional information associated
with this backup copy.
:vartype recovery_point_additional_info: str
:ivar source_vm_storage_type: Storage type of the VM whose backup copy is
created.
:vartype source_vm_storage_type: str
:ivar is_source_vm_encrypted: Identifies whether the VM was encrypted when
the backup copy is created.
:vartype is_source_vm_encrypted: bool
:param key_and_secret: Required details for recovering an encrypted VM.
Applicable only when IsSourceVMEncrypted is true.
:type key_and_secret:
~azure.mgmt.recoveryservicesbackup.models.KeyAndSecretDetails
:param is_instant_ilr_session_active: Whether the session to recover items
from this backup copy is still active.
:type is_instant_ilr_session_active: bool
:param recovery_point_tier_details: Recovery point tier information.
:type recovery_point_tier_details:
list[~azure.mgmt.recoveryservicesbackup.models.RecoveryPointTierInformation]
:param is_managed_virtual_machine: Whether the VM uses managed disks.
:type is_managed_virtual_machine: bool
:param virtual_machine_size: Virtual Machine Size
:type virtual_machine_size: str
:param original_storage_account_option: Original Storage Account Option
:type original_storage_account_option: bool
:param os_type: OS type
:type os_type: str
:param recovery_point_disk_configuration: Disk configuration
:type recovery_point_disk_configuration:
~azure.mgmt.recoveryservicesbackup.models.RecoveryPointDiskConfiguration
"""
_validation = {
'object_type': {'required': True},
'recovery_point_type': {'readonly': True},
'recovery_point_time': {'readonly': True},
'recovery_point_additional_info': {'readonly': True},
'source_vm_storage_type': {'readonly': True},
'is_source_vm_encrypted': {'readonly': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_type': {'key': 'recoveryPointType', 'type': 'str'},
'recovery_point_time': {'key': 'recoveryPointTime', 'type': 'iso-8601'},
'recovery_point_additional_info': {'key': 'recoveryPointAdditionalInfo', 'type': 'str'},
'source_vm_storage_type': {'key': 'sourceVMStorageType', 'type': 'str'},
'is_source_vm_encrypted': {'key': 'isSourceVMEncrypted', 'type': 'bool'},
'key_and_secret': {'key': 'keyAndSecret', 'type': 'KeyAndSecretDetails'},
'is_instant_ilr_session_active': {'key': 'isInstantIlrSessionActive', 'type': 'bool'},
'recovery_point_tier_details': {'key': 'recoveryPointTierDetails', 'type': '[RecoveryPointTierInformation]'},
'is_managed_virtual_machine': {'key': 'isManagedVirtualMachine', 'type': 'bool'},
'virtual_machine_size': {'key': 'virtualMachineSize', 'type': 'str'},
'original_storage_account_option': {'key': 'originalStorageAccountOption', 'type': 'bool'},
'os_type': {'key': 'osType', 'type': 'str'},
'recovery_point_disk_configuration': {'key': 'recoveryPointDiskConfiguration', 'type': 'RecoveryPointDiskConfiguration'},
}
def __init__(self, *, key_and_secret=None, is_instant_ilr_session_active: bool=None, recovery_point_tier_details=None, is_managed_virtual_machine: bool=None, virtual_machine_size: str=None, original_storage_account_option: bool=None, os_type: str=None, recovery_point_disk_configuration=None, **kwargs) -> None:
super(IaasVMRecoveryPoint, self).__init__(**kwargs)
self.recovery_point_type = None
self.recovery_point_time = None
self.recovery_point_additional_info = None
self.source_vm_storage_type = None
self.is_source_vm_encrypted = None
self.key_and_secret = key_and_secret
self.is_instant_ilr_session_active = is_instant_ilr_session_active
self.recovery_point_tier_details = recovery_point_tier_details
self.is_managed_virtual_machine = is_managed_virtual_machine
self.virtual_machine_size = virtual_machine_size
self.original_storage_account_option = original_storage_account_option
self.os_type = os_type
self.recovery_point_disk_configuration = recovery_point_disk_configuration
self.object_type = 'IaasVMRecoveryPoint'
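The `_validation` table above mixes two kinds of rules: `required` keys must be present in an outgoing body, while `readonly` keys (like `recovery_point_time`) are populated by the server and set to `None` in `__init__`. The following is a minimal, simplified sketch of how such a table is typically applied before a request; the `prepare_body` helper is a hypothetical stand-in, not the real msrest machinery.

```python
# Simplified stand-in for msrest-style validation: strip server-populated
# (readonly) fields from an outgoing body, then verify required fields.
VALIDATION = {
    'object_type': {'required': True},
    'recovery_point_time': {'readonly': True},
}

def prepare_body(body, validation):
    """Drop readonly fields and check required ones before sending."""
    out = {k: v for k, v in body.items()
           if not validation.get(k, {}).get('readonly')}
    for key, rules in validation.items():
        if rules.get('required') and out.get(key) is None:
            raise ValueError(f'{key} is required')
    return out

body = {'object_type': 'IaasVMRecoveryPoint',
        'recovery_point_time': '2020-01-01T00:00:00Z'}
print(prepare_body(body, VALIDATION))
# → {'object_type': 'IaasVMRecoveryPoint'}
```

This mirrors why the constructor above assigns `None` to every readonly attribute: client code never sends them, it only reads them back from server responses.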
class IaasVMRestoreRequest(RestoreRequest):
"""IaaS VM workload-specific restore.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_point_id: ID of the backup copy to be recovered.
:type recovery_point_id: str
:param recovery_type: Type of this recovery. Possible values include:
'Invalid', 'OriginalLocation', 'AlternateLocation', 'RestoreDisks',
'Offline'
:type recovery_type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryType
:param source_resource_id: Fully qualified ARM ID of the VM which is being
recovered.
:type source_resource_id: str
:param target_virtual_machine_id: This is the complete ARM Id of the VM
that will be created.
For example,
/subscriptions/{subId}/resourcegroups/{rg}/provider/Microsoft.Compute/virtualmachines/{vm}
:type target_virtual_machine_id: str
:param target_resource_group_id: This is the ARM Id of the resource group
that you want to create for this Virtual machine and other artifacts.
For example, /subscriptions/{subId}/resourcegroups/{rg}
:type target_resource_group_id: str
:param storage_account_id: Fully qualified ARM ID of the storage account
to which the VM has to be restored.
:type storage_account_id: str
:param virtual_network_id: This is the virtual network Id of the vnet that
will be attached to the virtual machine.
The user will be validated for join action permissions on the linked
virtual network.
:type virtual_network_id: str
:param subnet_id: Subnet ID of the VM to be restored. For Classic VMs it
would be {VnetID}/Subnet/{SubnetName}, and for Azure Resource Manager VMs
it would be the ARM resource ID of the subnet.
:type subnet_id: str
:param target_domain_name_id: Fully qualified ARM ID of the domain name to
be associated to the VM being restored. This applies only to Classic
Virtual Machines.
:type target_domain_name_id: str
:param region: Region in which the virtual machine is restored.
:type region: str
:param affinity_group: Affinity group associated to VM to be restored.
Used only for Classic Compute Virtual Machines.
:type affinity_group: str
:param create_new_cloud_service: Whether a new cloud service should be
created while restoring the VM. If false, the VM will be restored to the
same cloud service it belonged to at the time of backup.
:type create_new_cloud_service: bool
:param original_storage_account_option: Original Storage Account Option
:type original_storage_account_option: bool
:param encryption_details: Details needed if the VM was encrypted at the
time of backup.
:type encryption_details:
~azure.mgmt.recoveryservicesbackup.models.EncryptionDetails
:param restore_disk_lun_list: List of Disk LUNs for partial restore
:type restore_disk_lun_list: list[int]
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_point_id': {'key': 'recoveryPointId', 'type': 'str'},
'recovery_type': {'key': 'recoveryType', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'target_virtual_machine_id': {'key': 'targetVirtualMachineId', 'type': 'str'},
'target_resource_group_id': {'key': 'targetResourceGroupId', 'type': 'str'},
'storage_account_id': {'key': 'storageAccountId', 'type': 'str'},
'virtual_network_id': {'key': 'virtualNetworkId', 'type': 'str'},
'subnet_id': {'key': 'subnetId', 'type': 'str'},
'target_domain_name_id': {'key': 'targetDomainNameId', 'type': 'str'},
'region': {'key': 'region', 'type': 'str'},
'affinity_group': {'key': 'affinityGroup', 'type': 'str'},
'create_new_cloud_service': {'key': 'createNewCloudService', 'type': 'bool'},
'original_storage_account_option': {'key': 'originalStorageAccountOption', 'type': 'bool'},
'encryption_details': {'key': 'encryptionDetails', 'type': 'EncryptionDetails'},
'restore_disk_lun_list': {'key': 'restoreDiskLunList', 'type': '[int]'},
}
def __init__(self, *, recovery_point_id: str=None, recovery_type=None, source_resource_id: str=None, target_virtual_machine_id: str=None, target_resource_group_id: str=None, storage_account_id: str=None, virtual_network_id: str=None, subnet_id: str=None, target_domain_name_id: str=None, region: str=None, affinity_group: str=None, create_new_cloud_service: bool=None, original_storage_account_option: bool=None, encryption_details=None, restore_disk_lun_list=None, **kwargs) -> None:
super(IaasVMRestoreRequest, self).__init__(**kwargs)
self.recovery_point_id = recovery_point_id
self.recovery_type = recovery_type
self.source_resource_id = source_resource_id
self.target_virtual_machine_id = target_virtual_machine_id
self.target_resource_group_id = target_resource_group_id
self.storage_account_id = storage_account_id
self.virtual_network_id = virtual_network_id
self.subnet_id = subnet_id
self.target_domain_name_id = target_domain_name_id
self.region = region
self.affinity_group = affinity_group
self.create_new_cloud_service = create_new_cloud_service
self.original_storage_account_option = original_storage_account_option
self.encryption_details = encryption_details
self.restore_disk_lun_list = restore_disk_lun_list
self.object_type = 'IaasVMRestoreRequest'
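Every model in this module carries an `_attribute_map` that pairs a snake_case Python attribute with its camelCase wire key (e.g. `recovery_point_id` ↔ `recoveryPointId`). As a rough sketch of the pattern, the serializer below walks such a map and skips unset fields; `FakeRestoreRequest` and `serialize` are simplified, hypothetical stand-ins for the real msrest model and serializer.

```python
# Hypothetical mini-serializer showing how an _attribute_map drives the
# translation from Python attribute names to camelCase JSON keys.
def serialize(obj, attribute_map):
    """Build a wire-format dict from a model's attribute map."""
    body = {}
    for attr, meta in attribute_map.items():
        value = getattr(obj, attr, None)
        if value is not None:  # msrest-style behavior: omit unset fields
            body[meta['key']] = value
    return body

class FakeRestoreRequest:
    _attribute_map = {
        'recovery_point_id': {'key': 'recoveryPointId', 'type': 'str'},
        'region': {'key': 'region', 'type': 'str'},
    }
    def __init__(self, recovery_point_id=None, region=None):
        self.recovery_point_id = recovery_point_id
        self.region = region

req = FakeRestoreRequest(recovery_point_id='12345', region='westus')
print(serialize(req, FakeRestoreRequest._attribute_map))
# → {'recoveryPointId': '12345', 'region': 'westus'}
```

The same map is read in reverse when deserializing a response, which is why the `'type'` entry (e.g. `'iso-8601'`, `'[int]'`) travels alongside the key name.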
class ILRRequestResource(Resource):
"""Parameters to Provision ILR API.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: ILRRequestResource properties
:type properties: ~azure.mgmt.recoveryservicesbackup.models.ILRRequest
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ILRRequest'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(ILRRequestResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class InquiryInfo(Model):
"""Details about inquired protectable items under a given container.
:param status: Inquiry status for this container, such as
InProgress | Failed | Succeeded.
:type status: str
:param error_detail: Error Details if the Status is non-success.
:type error_detail: ~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
:param inquiry_details: Inquiry Details which will have workload specific
details.
For example, for SQL and Oracle this will contain different details.
:type inquiry_details:
list[~azure.mgmt.recoveryservicesbackup.models.WorkloadInquiryDetails]
"""
_attribute_map = {
'status': {'key': 'status', 'type': 'str'},
'error_detail': {'key': 'errorDetail', 'type': 'ErrorDetail'},
'inquiry_details': {'key': 'inquiryDetails', 'type': '[WorkloadInquiryDetails]'},
}
def __init__(self, *, status: str=None, error_detail=None, inquiry_details=None, **kwargs) -> None:
super(InquiryInfo, self).__init__(**kwargs)
self.status = status
self.error_detail = error_detail
self.inquiry_details = inquiry_details
class InquiryValidation(Model):
"""Validation for inquired protectable items under a given container.
Variables are only populated by the server, and will be ignored when
sending a request.
:param status: Status for the Inquiry Validation.
:type status: str
:param error_detail: Error Detail in case the status is non-success.
:type error_detail: ~azure.mgmt.recoveryservicesbackup.models.ErrorDetail
:ivar additional_detail: Error Additional Detail in case the status is
non-success.
:vartype additional_detail: str
"""
_validation = {
'additional_detail': {'readonly': True},
}
_attribute_map = {
'status': {'key': 'status', 'type': 'str'},
'error_detail': {'key': 'errorDetail', 'type': 'ErrorDetail'},
'additional_detail': {'key': 'additionalDetail', 'type': 'str'},
}
def __init__(self, *, status: str=None, error_detail=None, **kwargs) -> None:
super(InquiryValidation, self).__init__(**kwargs)
self.status = status
self.error_detail = error_detail
self.additional_detail = None
class InstantItemRecoveryTarget(Model):
"""Target details for file / folder restore.
:param client_scripts: List of client scripts.
:type client_scripts:
list[~azure.mgmt.recoveryservicesbackup.models.ClientScriptForConnect]
"""
_attribute_map = {
'client_scripts': {'key': 'clientScripts', 'type': '[ClientScriptForConnect]'},
}
def __init__(self, *, client_scripts=None, **kwargs) -> None:
super(InstantItemRecoveryTarget, self).__init__(**kwargs)
self.client_scripts = client_scripts
class InstantRPAdditionalDetails(Model):
"""InstantRPAdditionalDetails.
:param azure_backup_rg_name_prefix:
:type azure_backup_rg_name_prefix: str
:param azure_backup_rg_name_suffix:
:type azure_backup_rg_name_suffix: str
"""
_attribute_map = {
'azure_backup_rg_name_prefix': {'key': 'azureBackupRGNamePrefix', 'type': 'str'},
'azure_backup_rg_name_suffix': {'key': 'azureBackupRGNameSuffix', 'type': 'str'},
}
def __init__(self, *, azure_backup_rg_name_prefix: str=None, azure_backup_rg_name_suffix: str=None, **kwargs) -> None:
super(InstantRPAdditionalDetails, self).__init__(**kwargs)
self.azure_backup_rg_name_prefix = azure_backup_rg_name_prefix
self.azure_backup_rg_name_suffix = azure_backup_rg_name_suffix
class JobQueryObject(Model):
"""Filters to list the jobs.
:param status: Status of the job. Possible values include: 'Invalid',
'InProgress', 'Completed', 'Failed', 'CompletedWithWarnings', 'Cancelled',
'Cancelling'
:type status: str or ~azure.mgmt.recoveryservicesbackup.models.JobStatus
:param backup_management_type: Type of backup management for the job.
Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: Type of operation. Possible values include: 'Invalid',
'Register', 'UnRegister', 'ConfigureBackup', 'Backup', 'Restore',
'DisableBackup', 'DeleteBackupData', 'CrossRegionRestore', 'Undelete'
:type operation: str or
~azure.mgmt.recoveryservicesbackup.models.JobOperationType
:param job_id: Job ID that uniquely identifies the job.
:type job_id: str
:param start_time: Time at which the job started. Value is in UTC.
:type start_time: datetime
:param end_time: Time at which the job ended. Value is in UTC.
:type end_time: datetime
"""
_attribute_map = {
'status': {'key': 'status', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'job_id': {'key': 'jobId', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
}
def __init__(self, *, status=None, backup_management_type=None, operation=None, job_id: str=None, start_time=None, end_time=None, **kwargs) -> None:
super(JobQueryObject, self).__init__(**kwargs)
self.status = status
self.backup_management_type = backup_management_type
self.operation = operation
self.job_id = job_id
self.start_time = start_time
self.end_time = end_time
class JobResource(Resource):
"""Defines workload agnostic properties for a job.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: JobResource properties
:type properties: ~azure.mgmt.recoveryservicesbackup.models.Job
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'Job'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(JobResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class KEKDetails(Model):
"""KEK is encryption key for BEK.
:param key_url: Key URL of the KEK.
:type key_url: str
:param key_vault_id: Key Vault ID where this Key is stored.
:type key_vault_id: str
:param key_backup_data: KEK data.
:type key_backup_data: str
"""
_attribute_map = {
'key_url': {'key': 'keyUrl', 'type': 'str'},
'key_vault_id': {'key': 'keyVaultId', 'type': 'str'},
'key_backup_data': {'key': 'keyBackupData', 'type': 'str'},
}
def __init__(self, *, key_url: str=None, key_vault_id: str=None, key_backup_data: str=None, **kwargs) -> None:
super(KEKDetails, self).__init__(**kwargs)
self.key_url = key_url
self.key_vault_id = key_vault_id
self.key_backup_data = key_backup_data
class KeyAndSecretDetails(Model):
"""BEK is bitlocker key.
KEK is encryption key for BEK
If the VM was encrypted then we will store following details :
1. Secret(BEK) - Url + Backup Data + vaultId.
2. Key(KEK) - Url + Backup Data + vaultId.
3. EncryptionMechanism
BEK and KEK can potentially have different vault ids.
:param kek_details: KEK is the encryption key for the BEK.
:type kek_details: ~azure.mgmt.recoveryservicesbackup.models.KEKDetails
:param bek_details: BEK is the BitLocker encryption key.
:type bek_details: ~azure.mgmt.recoveryservicesbackup.models.BEKDetails
:param encryption_mechanism: Encryption mechanism: None/ SinglePass/
DoublePass
:type encryption_mechanism: str
"""
_attribute_map = {
'kek_details': {'key': 'kekDetails', 'type': 'KEKDetails'},
'bek_details': {'key': 'bekDetails', 'type': 'BEKDetails'},
'encryption_mechanism': {'key': 'encryptionMechanism', 'type': 'str'},
}
def __init__(self, *, kek_details=None, bek_details=None, encryption_mechanism: str=None, **kwargs) -> None:
super(KeyAndSecretDetails, self).__init__(**kwargs)
self.kek_details = kek_details
self.bek_details = bek_details
self.encryption_mechanism = encryption_mechanism
class SchedulePolicy(Model):
"""Base class for backup schedule.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: LogSchedulePolicy, LongTermSchedulePolicy,
SimpleSchedulePolicy
All required parameters must be populated in order to send to Azure.
:param schedule_policy_type: Required. Constant filled by server.
:type schedule_policy_type: str
"""
_validation = {
'schedule_policy_type': {'required': True},
}
_attribute_map = {
'schedule_policy_type': {'key': 'schedulePolicyType', 'type': 'str'},
}
_subtype_map = {
'schedule_policy_type': {'LogSchedulePolicy': 'LogSchedulePolicy', 'LongTermSchedulePolicy': 'LongTermSchedulePolicy', 'SimpleSchedulePolicy': 'SimpleSchedulePolicy'}
}
def __init__(self, **kwargs) -> None:
super(SchedulePolicy, self).__init__(**kwargs)
self.schedule_policy_type = None
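`SchedulePolicy` is a polymorphic base: the `_subtype_map` above tells the deserializer which concrete subclass to build from the `schedulePolicyType` discriminator in a response body. The dispatcher below is a hedged sketch of that lookup; the `resolve_subtype` helper and its error handling are assumptions, not the actual msrest implementation.

```python
# Hypothetical discriminator dispatch mirroring SchedulePolicy._subtype_map:
# map the wire discriminator value to a concrete subclass name.
SUBTYPE_MAP = {
    'LogSchedulePolicy': 'LogSchedulePolicy',
    'LongTermSchedulePolicy': 'LongTermSchedulePolicy',
    'SimpleSchedulePolicy': 'SimpleSchedulePolicy',
}

def resolve_subtype(payload):
    """Return the subclass name selected by the 'schedulePolicyType' key."""
    discriminator = payload.get('schedulePolicyType')
    try:
        return SUBTYPE_MAP[discriminator]
    except KeyError:
        raise ValueError(f'Unknown schedule policy type: {discriminator!r}')

print(resolve_subtype({'schedulePolicyType': 'LogSchedulePolicy',
                       'scheduleFrequencyInMins': 60}))
# → LogSchedulePolicy
```

This is also why each subclass `__init__` pins the discriminator (e.g. `self.schedule_policy_type = 'LogSchedulePolicy'`) instead of accepting it as an argument: the value is a constant that identifies the subtype on the wire.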
class LogSchedulePolicy(SchedulePolicy):
"""Log policy schedule.
All required parameters must be populated in order to send to Azure.
:param schedule_policy_type: Required. Constant filled by server.
:type schedule_policy_type: str
:param schedule_frequency_in_mins: Frequency of the log schedule operation
of this policy in minutes.
:type schedule_frequency_in_mins: int
"""
_validation = {
'schedule_policy_type': {'required': True},
}
_attribute_map = {
'schedule_policy_type': {'key': 'schedulePolicyType', 'type': 'str'},
'schedule_frequency_in_mins': {'key': 'scheduleFrequencyInMins', 'type': 'int'},
}
def __init__(self, *, schedule_frequency_in_mins: int=None, **kwargs) -> None:
super(LogSchedulePolicy, self).__init__(**kwargs)
self.schedule_frequency_in_mins = schedule_frequency_in_mins
self.schedule_policy_type = 'LogSchedulePolicy'
class RetentionPolicy(Model):
"""Base class for retention policy.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: LongTermRetentionPolicy, SimpleRetentionPolicy
All required parameters must be populated in order to send to Azure.
:param retention_policy_type: Required. Constant filled by server.
:type retention_policy_type: str
"""
_validation = {
'retention_policy_type': {'required': True},
}
_attribute_map = {
'retention_policy_type': {'key': 'retentionPolicyType', 'type': 'str'},
}
_subtype_map = {
'retention_policy_type': {'LongTermRetentionPolicy': 'LongTermRetentionPolicy', 'SimpleRetentionPolicy': 'SimpleRetentionPolicy'}
}
def __init__(self, **kwargs) -> None:
super(RetentionPolicy, self).__init__(**kwargs)
self.retention_policy_type = None
class LongTermRetentionPolicy(RetentionPolicy):
"""Long term retention policy.
All required parameters must be populated in order to send to Azure.
:param retention_policy_type: Required. Constant filled by server.
:type retention_policy_type: str
:param daily_schedule: Daily retention schedule of the protection policy.
:type daily_schedule:
~azure.mgmt.recoveryservicesbackup.models.DailyRetentionSchedule
:param weekly_schedule: Weekly retention schedule of the protection
policy.
:type weekly_schedule:
~azure.mgmt.recoveryservicesbackup.models.WeeklyRetentionSchedule
:param monthly_schedule: Monthly retention schedule of the protection
policy.
:type monthly_schedule:
~azure.mgmt.recoveryservicesbackup.models.MonthlyRetentionSchedule
:param yearly_schedule: Yearly retention schedule of the protection
policy.
:type yearly_schedule:
~azure.mgmt.recoveryservicesbackup.models.YearlyRetentionSchedule
"""
_validation = {
'retention_policy_type': {'required': True},
}
_attribute_map = {
'retention_policy_type': {'key': 'retentionPolicyType', 'type': 'str'},
'daily_schedule': {'key': 'dailySchedule', 'type': 'DailyRetentionSchedule'},
'weekly_schedule': {'key': 'weeklySchedule', 'type': 'WeeklyRetentionSchedule'},
'monthly_schedule': {'key': 'monthlySchedule', 'type': 'MonthlyRetentionSchedule'},
'yearly_schedule': {'key': 'yearlySchedule', 'type': 'YearlyRetentionSchedule'},
}
def __init__(self, *, daily_schedule=None, weekly_schedule=None, monthly_schedule=None, yearly_schedule=None, **kwargs) -> None:
super(LongTermRetentionPolicy, self).__init__(**kwargs)
self.daily_schedule = daily_schedule
self.weekly_schedule = weekly_schedule
self.monthly_schedule = monthly_schedule
self.yearly_schedule = yearly_schedule
self.retention_policy_type = 'LongTermRetentionPolicy'
class LongTermSchedulePolicy(SchedulePolicy):
"""Long term policy schedule.
All required parameters must be populated in order to send to Azure.
:param schedule_policy_type: Required. Constant filled by server.
:type schedule_policy_type: str
"""
_validation = {
'schedule_policy_type': {'required': True},
}
_attribute_map = {
'schedule_policy_type': {'key': 'schedulePolicyType', 'type': 'str'},
}
def __init__(self, **kwargs) -> None:
super(LongTermSchedulePolicy, self).__init__(**kwargs)
self.schedule_policy_type = 'LongTermSchedulePolicy'
class MabContainer(ProtectionContainer):
"""Container with items backed up using MAB backup engine.
All required parameters must be populated in order to send to Azure.
:param friendly_name: Friendly name of the container.
:type friendly_name: str
:param backup_management_type: Type of backup management for the
container. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param registration_status: Status of registration of the container with
the Recovery Services Vault.
:type registration_status: str
:param health_status: Status of health of the container.
:type health_status: str
:param container_type: Required. Constant filled by server.
:type container_type: str
:param can_re_register: Whether the container can be registered again.
:type can_re_register: bool
:param container_id: ID that uniquely identifies the container.
:type container_id: long
:param protected_item_count: Number of items backed up in this container.
:type protected_item_count: long
:param agent_version: Agent version of this container.
:type agent_version: str
:param extended_info: Additional information for this container
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.MabContainerExtendedInfo
:param mab_container_health_details: Health details of this MAB container.
:type mab_container_health_details:
list[~azure.mgmt.recoveryservicesbackup.models.MABContainerHealthDetails]
:param container_health_state: Health state of the MAB container.
:type container_health_state: str
"""
_validation = {
'container_type': {'required': True},
}
_attribute_map = {
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'registration_status': {'key': 'registrationStatus', 'type': 'str'},
'health_status': {'key': 'healthStatus', 'type': 'str'},
'container_type': {'key': 'containerType', 'type': 'str'},
'can_re_register': {'key': 'canReRegister', 'type': 'bool'},
'container_id': {'key': 'containerId', 'type': 'long'},
'protected_item_count': {'key': 'protectedItemCount', 'type': 'long'},
'agent_version': {'key': 'agentVersion', 'type': 'str'},
'extended_info': {'key': 'extendedInfo', 'type': 'MabContainerExtendedInfo'},
'mab_container_health_details': {'key': 'mabContainerHealthDetails', 'type': '[MABContainerHealthDetails]'},
'container_health_state': {'key': 'containerHealthState', 'type': 'str'},
}
def __init__(self, *, friendly_name: str=None, backup_management_type=None, registration_status: str=None, health_status: str=None, can_re_register: bool=None, container_id: int=None, protected_item_count: int=None, agent_version: str=None, extended_info=None, mab_container_health_details=None, container_health_state: str=None, **kwargs) -> None:
super(MabContainer, self).__init__(friendly_name=friendly_name, backup_management_type=backup_management_type, registration_status=registration_status, health_status=health_status, **kwargs)
self.can_re_register = can_re_register
self.container_id = container_id
self.protected_item_count = protected_item_count
self.agent_version = agent_version
self.extended_info = extended_info
self.mab_container_health_details = mab_container_health_details
self.container_health_state = container_health_state
self.container_type = 'Windows'
class MabContainerExtendedInfo(Model):
"""Additional information of the container.
:param last_refreshed_at: Time stamp when this container was refreshed.
:type last_refreshed_at: datetime
:param backup_item_type: Type of backup items associated with this
container. Possible values include: 'Invalid', 'VM', 'FileFolder',
'AzureSqlDb', 'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM',
'SystemState', 'Client', 'GenericDataSource', 'SQLDataBase',
'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type backup_item_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupItemType
:param backup_items: List of backup items associated with this container.
:type backup_items: list[str]
:param policy_name: Backup policy associated with this container.
:type policy_name: str
:param last_backup_status: Latest backup status of this container.
:type last_backup_status: str
"""
_attribute_map = {
'last_refreshed_at': {'key': 'lastRefreshedAt', 'type': 'iso-8601'},
'backup_item_type': {'key': 'backupItemType', 'type': 'str'},
'backup_items': {'key': 'backupItems', 'type': '[str]'},
'policy_name': {'key': 'policyName', 'type': 'str'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
}
def __init__(self, *, last_refreshed_at=None, backup_item_type=None, backup_items=None, policy_name: str=None, last_backup_status: str=None, **kwargs) -> None:
super(MabContainerExtendedInfo, self).__init__(**kwargs)
self.last_refreshed_at = last_refreshed_at
self.backup_item_type = backup_item_type
self.backup_items = backup_items
self.policy_name = policy_name
self.last_backup_status = last_backup_status
class MABContainerHealthDetails(Model):
"""MAB workload-specific Health Details.
:param code: Health Code
:type code: int
:param title: Health Title
:type title: str
:param message: Health Message
:type message: str
:param recommendations: Health Recommended Actions
:type recommendations: list[str]
"""
_attribute_map = {
'code': {'key': 'code', 'type': 'int'},
'title': {'key': 'title', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, *, code: int=None, title: str=None, message: str=None, recommendations=None, **kwargs) -> None:
super(MABContainerHealthDetails, self).__init__(**kwargs)
self.code = code
self.title = title
self.message = message
self.recommendations = recommendations
class MabErrorInfo(Model):
"""MAB workload-specific error information.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar error_string: Localized error string.
:vartype error_string: str
:ivar recommendations: List of localized recommendations.
:vartype recommendations: list[str]
"""
_validation = {
'error_string': {'readonly': True},
'recommendations': {'readonly': True},
}
_attribute_map = {
'error_string': {'key': 'errorString', 'type': 'str'},
'recommendations': {'key': 'recommendations', 'type': '[str]'},
}
def __init__(self, **kwargs) -> None:
super(MabErrorInfo, self).__init__(**kwargs)
self.error_string = None
self.recommendations = None
class MabFileFolderProtectedItem(ProtectedItem):
"""MAB workload-specific backup item.
All required parameters must be populated in order to send to Azure.
:param backup_management_type: Type of backup management for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param workload_type: Type of workload this item represents. Possible
values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB',
'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param container_name: Unique name of the container.
:type container_name: str
:param source_resource_id: ARM ID of the resource to be backed up.
:type source_resource_id: str
:param policy_id: ID of the backup policy with which this item is backed
up.
:type policy_id: str
:param last_recovery_point: Timestamp when the last (latest) backup copy
was created for this backup item.
:type last_recovery_point: datetime
:param backup_set_name: Name of the backup set the backup item belongs to.
:type backup_set_name: str
:param create_mode: Create mode to indicate recovery of existing soft
deleted data source or creation of new data source. Possible values
include: 'Invalid', 'Default', 'Recover'
:type create_mode: str or
~azure.mgmt.recoveryservicesbackup.models.CreateMode
:param deferred_delete_time_in_utc: Time for deferred deletion in UTC.
:type deferred_delete_time_in_utc: datetime
:param is_scheduled_for_deferred_delete: Flag to identify whether the DS
is scheduled for deferred delete.
:type is_scheduled_for_deferred_delete: bool
:param deferred_delete_time_remaining: Time remaining before the DS marked
for deferred delete is permanently deleted.
:type deferred_delete_time_remaining: str
:param is_deferred_delete_schedule_upcoming: Flag to identify whether the
deferred deleted DS is to be purged soon.
:type is_deferred_delete_schedule_upcoming: bool
:param is_rehydrate: Flag to identify whether the deferred deleted DS is
to be moved into the Pause state.
:type is_rehydrate: bool
:param protected_item_type: Required. Constant filled by server.
:type protected_item_type: str
:param friendly_name: Friendly name of this backup item.
:type friendly_name: str
:param computer_name: Name of the computer associated with this backup
item.
:type computer_name: str
:param last_backup_status: Status of last backup operation.
:type last_backup_status: str
:param last_backup_time: Timestamp of the last backup operation on this
backup item.
:type last_backup_time: datetime
:param protection_state: Protected, ProtectionStopped, IRPending or
ProtectionError
:type protection_state: str
:param deferred_delete_sync_time_in_utc: Sync time for deferred deletion
in UTC.
:type deferred_delete_sync_time_in_utc: long
:param extended_info: Additional information with this backup item.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.MabFileFolderProtectedItemExtendedInfo
"""
_validation = {
'protected_item_type': {'required': True},
}
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'source_resource_id': {'key': 'sourceResourceId', 'type': 'str'},
'policy_id': {'key': 'policyId', 'type': 'str'},
'last_recovery_point': {'key': 'lastRecoveryPoint', 'type': 'iso-8601'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
'create_mode': {'key': 'createMode', 'type': 'str'},
'deferred_delete_time_in_utc': {'key': 'deferredDeleteTimeInUTC', 'type': 'iso-8601'},
'is_scheduled_for_deferred_delete': {'key': 'isScheduledForDeferredDelete', 'type': 'bool'},
'deferred_delete_time_remaining': {'key': 'deferredDeleteTimeRemaining', 'type': 'str'},
'is_deferred_delete_schedule_upcoming': {'key': 'isDeferredDeleteScheduleUpcoming', 'type': 'bool'},
'is_rehydrate': {'key': 'isRehydrate', 'type': 'bool'},
'protected_item_type': {'key': 'protectedItemType', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'computer_name': {'key': 'computerName', 'type': 'str'},
'last_backup_status': {'key': 'lastBackupStatus', 'type': 'str'},
'last_backup_time': {'key': 'lastBackupTime', 'type': 'iso-8601'},
'protection_state': {'key': 'protectionState', 'type': 'str'},
'deferred_delete_sync_time_in_utc': {'key': 'deferredDeleteSyncTimeInUTC', 'type': 'long'},
'extended_info': {'key': 'extendedInfo', 'type': 'MabFileFolderProtectedItemExtendedInfo'},
}
def __init__(self, *, backup_management_type=None, workload_type=None, container_name: str=None, source_resource_id: str=None, policy_id: str=None, last_recovery_point=None, backup_set_name: str=None, create_mode=None, deferred_delete_time_in_utc=None, is_scheduled_for_deferred_delete: bool=None, deferred_delete_time_remaining: str=None, is_deferred_delete_schedule_upcoming: bool=None, is_rehydrate: bool=None, friendly_name: str=None, computer_name: str=None, last_backup_status: str=None, last_backup_time=None, protection_state: str=None, deferred_delete_sync_time_in_utc: int=None, extended_info=None, **kwargs) -> None:
super(MabFileFolderProtectedItem, self).__init__(backup_management_type=backup_management_type, workload_type=workload_type, container_name=container_name, source_resource_id=source_resource_id, policy_id=policy_id, last_recovery_point=last_recovery_point, backup_set_name=backup_set_name, create_mode=create_mode, deferred_delete_time_in_utc=deferred_delete_time_in_utc, is_scheduled_for_deferred_delete=is_scheduled_for_deferred_delete, deferred_delete_time_remaining=deferred_delete_time_remaining, is_deferred_delete_schedule_upcoming=is_deferred_delete_schedule_upcoming, is_rehydrate=is_rehydrate, **kwargs)
self.friendly_name = friendly_name
self.computer_name = computer_name
self.last_backup_status = last_backup_status
self.last_backup_time = last_backup_time
self.protection_state = protection_state
self.deferred_delete_sync_time_in_utc = deferred_delete_sync_time_in_utc
self.extended_info = extended_info
self.protected_item_type = 'MabFileFolderProtectedItem'
class MabFileFolderProtectedItemExtendedInfo(Model):
"""Additional information on the backed up item.
:param last_refreshed_at: Last time the agent data was synced to the service.
:type last_refreshed_at: datetime
:param oldest_recovery_point: The oldest backup copy available.
:type oldest_recovery_point: datetime
:param recovery_point_count: Number of backup copies associated with the
backup item.
:type recovery_point_count: int
"""
_attribute_map = {
'last_refreshed_at': {'key': 'lastRefreshedAt', 'type': 'iso-8601'},
'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
}
def __init__(self, *, last_refreshed_at=None, oldest_recovery_point=None, recovery_point_count: int=None, **kwargs) -> None:
super(MabFileFolderProtectedItemExtendedInfo, self).__init__(**kwargs)
self.last_refreshed_at = last_refreshed_at
self.oldest_recovery_point = oldest_recovery_point
self.recovery_point_count = recovery_point_count
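# The ``_attribute_map`` tables above pair each Python attribute with its wire
# (camelCase) key and serialization type. As a rough illustration only — this
# is a hand-written sketch, not the msrest serializer the SDK actually uses —
# a payload can be built from such a map like this:

```python
from datetime import datetime

# Attribute map copied from MabFileFolderProtectedItemExtendedInfo above.
_attribute_map = {
    'last_refreshed_at': {'key': 'lastRefreshedAt', 'type': 'iso-8601'},
    'oldest_recovery_point': {'key': 'oldestRecoveryPoint', 'type': 'iso-8601'},
    'recovery_point_count': {'key': 'recoveryPointCount', 'type': 'int'},
}

def sketch_serialize(values, attribute_map):
    """Build a wire payload from attribute values; None values are omitted."""
    body = {}
    for attr, meta in attribute_map.items():
        value = values.get(attr)
        if value is None:
            continue
        # iso-8601 values travel as strings on the wire.
        if meta['type'] == 'iso-8601':
            value = value.isoformat()
        body[meta['key']] = value
    return body

payload = sketch_serialize(
    {'last_refreshed_at': datetime(2020, 1, 1), 'recovery_point_count': 3},
    _attribute_map,
)
# payload == {'lastRefreshedAt': '2020-01-01T00:00:00', 'recoveryPointCount': 3}
```

# The real serializer also validates against ``_validation`` and handles the
# nested model types; this sketch shows only the key-renaming convention.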
class MabJob(Job):
"""MAB workload-specific job.
All required parameters must be populated in order to send to Azure.
:param entity_friendly_name: Friendly name of the entity on which the
current job is executing.
:type entity_friendly_name: str
:param backup_management_type: Backup management type to execute the
current job. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB',
'DPM', 'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param operation: The operation name.
:type operation: str
:param status: Job status.
:type status: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param activity_id: ActivityId of job.
:type activity_id: str
:param job_type: Required. Constant filled by server.
:type job_type: str
:param duration: Time taken by job to run.
:type duration: timedelta
:param actions_info: The state/actions applicable on jobs like
cancel/retry.
:type actions_info: list[str or
~azure.mgmt.recoveryservicesbackup.models.JobSupportedAction]
:param mab_server_name: Name of server protecting the DS.
:type mab_server_name: str
:param mab_server_type: Server type of MAB container. Possible values
include: 'Invalid', 'Unknown', 'IaasVMContainer',
'IaasVMServiceContainer', 'DPMContainer', 'AzureBackupServerContainer',
'MABContainer', 'Cluster', 'AzureSqlContainer', 'Windows', 'VCenter',
'VMAppContainer', 'SQLAGWorkLoadContainer', 'StorageContainer',
'GenericContainer'
:type mab_server_type: str or
~azure.mgmt.recoveryservicesbackup.models.MabServerType
:param workload_type: Workload type of backup item. Possible values
include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB', 'Exchange',
'Sharepoint', 'VMwareVM', 'SystemState', 'Client', 'GenericDataSource',
'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
:param error_details: The errors.
:type error_details:
list[~azure.mgmt.recoveryservicesbackup.models.MabErrorInfo]
:param extended_info: Additional information on the job.
:type extended_info:
~azure.mgmt.recoveryservicesbackup.models.MabJobExtendedInfo
"""
_validation = {
'job_type': {'required': True},
}
_attribute_map = {
'entity_friendly_name': {'key': 'entityFriendlyName', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'operation': {'key': 'operation', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'activity_id': {'key': 'activityId', 'type': 'str'},
'job_type': {'key': 'jobType', 'type': 'str'},
'duration': {'key': 'duration', 'type': 'duration'},
'actions_info': {'key': 'actionsInfo', 'type': '[JobSupportedAction]'},
'mab_server_name': {'key': 'mabServerName', 'type': 'str'},
'mab_server_type': {'key': 'mabServerType', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
'error_details': {'key': 'errorDetails', 'type': '[MabErrorInfo]'},
'extended_info': {'key': 'extendedInfo', 'type': 'MabJobExtendedInfo'},
}
def __init__(self, *, entity_friendly_name: str=None, backup_management_type=None, operation: str=None, status: str=None, start_time=None, end_time=None, activity_id: str=None, duration=None, actions_info=None, mab_server_name: str=None, mab_server_type=None, workload_type=None, error_details=None, extended_info=None, **kwargs) -> None:
super(MabJob, self).__init__(entity_friendly_name=entity_friendly_name, backup_management_type=backup_management_type, operation=operation, status=status, start_time=start_time, end_time=end_time, activity_id=activity_id, **kwargs)
self.duration = duration
self.actions_info = actions_info
self.mab_server_name = mab_server_name
self.mab_server_type = mab_server_type
self.workload_type = workload_type
self.error_details = error_details
self.extended_info = extended_info
self.job_type = 'MabJob'
class MabJobExtendedInfo(Model):
"""Additional information for the MAB workload-specific job.
:param tasks_list: List of tasks for this job.
:type tasks_list:
list[~azure.mgmt.recoveryservicesbackup.models.MabJobTaskDetails]
:param property_bag: The job properties.
:type property_bag: dict[str, str]
:param dynamic_error_message: Non-localized error message specific to this
job.
:type dynamic_error_message: str
"""
_attribute_map = {
'tasks_list': {'key': 'tasksList', 'type': '[MabJobTaskDetails]'},
'property_bag': {'key': 'propertyBag', 'type': '{str}'},
'dynamic_error_message': {'key': 'dynamicErrorMessage', 'type': 'str'},
}
def __init__(self, *, tasks_list=None, property_bag=None, dynamic_error_message: str=None, **kwargs) -> None:
super(MabJobExtendedInfo, self).__init__(**kwargs)
self.tasks_list = tasks_list
self.property_bag = property_bag
self.dynamic_error_message = dynamic_error_message
class MabJobTaskDetails(Model):
"""MAB workload-specific job task details.
:param task_id: The task display name.
:type task_id: str
:param start_time: The start time.
:type start_time: datetime
:param end_time: The end time.
:type end_time: datetime
:param duration: Time elapsed for task.
:type duration: timedelta
:param status: The status.
:type status: str
"""
_attribute_map = {
'task_id': {'key': 'taskId', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'duration': {'key': 'duration', 'type': 'duration'},
'status': {'key': 'status', 'type': 'str'},
}
def __init__(self, *, task_id: str=None, start_time=None, end_time=None, duration=None, status: str=None, **kwargs) -> None:
super(MabJobTaskDetails, self).__init__(**kwargs)
self.task_id = task_id
self.start_time = start_time
self.end_time = end_time
self.duration = duration
self.status = status
class MabProtectionPolicy(ProtectionPolicy):
"""Mab container-specific backup policy.
All required parameters must be populated in order to send to Azure.
:param protected_items_count: Number of items associated with this policy.
:type protected_items_count: int
:param backup_management_type: Required. Constant filled by server.
:type backup_management_type: str
:param schedule_policy: Backup schedule of backup policy.
:type schedule_policy:
~azure.mgmt.recoveryservicesbackup.models.SchedulePolicy
:param retention_policy: Retention policy details.
:type retention_policy:
~azure.mgmt.recoveryservicesbackup.models.RetentionPolicy
"""
_validation = {
'backup_management_type': {'required': True},
}
_attribute_map = {
'protected_items_count': {'key': 'protectedItemsCount', 'type': 'int'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'schedule_policy': {'key': 'schedulePolicy', 'type': 'SchedulePolicy'},
'retention_policy': {'key': 'retentionPolicy', 'type': 'RetentionPolicy'},
}
def __init__(self, *, protected_items_count: int=None, schedule_policy=None, retention_policy=None, **kwargs) -> None:
super(MabProtectionPolicy, self).__init__(protected_items_count=protected_items_count, **kwargs)
self.schedule_policy = schedule_policy
self.retention_policy = retention_policy
self.backup_management_type = 'MAB'
class MonthlyRetentionSchedule(Model):
"""Monthly retention schedule.
:param retention_schedule_format_type: Retention schedule format type for
monthly retention policy. Possible values include: 'Invalid', 'Daily',
'Weekly'
:type retention_schedule_format_type: str or
~azure.mgmt.recoveryservicesbackup.models.RetentionScheduleFormat
:param retention_schedule_daily: Daily retention format for monthly
retention policy.
:type retention_schedule_daily:
~azure.mgmt.recoveryservicesbackup.models.DailyRetentionFormat
:param retention_schedule_weekly: Weekly retention format for monthly
retention policy.
:type retention_schedule_weekly:
~azure.mgmt.recoveryservicesbackup.models.WeeklyRetentionFormat
:param retention_times: Retention times of retention policy.
:type retention_times: list[datetime]
:param retention_duration: Retention duration of retention Policy.
:type retention_duration:
~azure.mgmt.recoveryservicesbackup.models.RetentionDuration
"""
_attribute_map = {
'retention_schedule_format_type': {'key': 'retentionScheduleFormatType', 'type': 'str'},
'retention_schedule_daily': {'key': 'retentionScheduleDaily', 'type': 'DailyRetentionFormat'},
'retention_schedule_weekly': {'key': 'retentionScheduleWeekly', 'type': 'WeeklyRetentionFormat'},
'retention_times': {'key': 'retentionTimes', 'type': '[iso-8601]'},
'retention_duration': {'key': 'retentionDuration', 'type': 'RetentionDuration'},
}
def __init__(self, *, retention_schedule_format_type=None, retention_schedule_daily=None, retention_schedule_weekly=None, retention_times=None, retention_duration=None, **kwargs) -> None:
super(MonthlyRetentionSchedule, self).__init__(**kwargs)
self.retention_schedule_format_type = retention_schedule_format_type
self.retention_schedule_daily = retention_schedule_daily
self.retention_schedule_weekly = retention_schedule_weekly
self.retention_times = retention_times
self.retention_duration = retention_duration
class NameInfo(Model):
"""The name of usage.
:param value: Value of usage.
:type value: str
:param localized_value: Localized value of usage.
:type localized_value: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': 'str'},
'localized_value': {'key': 'localizedValue', 'type': 'str'},
}
def __init__(self, *, value: str=None, localized_value: str=None, **kwargs) -> None:
super(NameInfo, self).__init__(**kwargs)
self.value = value
self.localized_value = localized_value
class OperationResultInfo(OperationResultInfoBase):
"""Operation result info.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param job_list: List of jobs created by this operation.
:type job_list: list[str]
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'job_list': {'key': 'jobList', 'type': '[str]'},
}
def __init__(self, *, job_list=None, **kwargs) -> None:
super(OperationResultInfo, self).__init__(**kwargs)
self.job_list = job_list
self.object_type = 'OperationResultInfo'
class OperationWorkerResponse(Model):
"""This is the base class for operation result responses.
:param status_code: HTTP Status Code of the operation. Possible values
include: 'Continue', 'SwitchingProtocols', 'OK', 'Created', 'Accepted',
'NonAuthoritativeInformation', 'NoContent', 'ResetContent',
'PartialContent', 'MultipleChoices', 'Ambiguous', 'MovedPermanently',
'Moved', 'Found', 'Redirect', 'SeeOther', 'RedirectMethod', 'NotModified',
'UseProxy', 'Unused', 'TemporaryRedirect', 'RedirectKeepVerb',
'BadRequest', 'Unauthorized', 'PaymentRequired', 'Forbidden', 'NotFound',
'MethodNotAllowed', 'NotAcceptable', 'ProxyAuthenticationRequired',
'RequestTimeout', 'Conflict', 'Gone', 'LengthRequired',
'PreconditionFailed', 'RequestEntityTooLarge', 'RequestUriTooLong',
'UnsupportedMediaType', 'RequestedRangeNotSatisfiable',
'ExpectationFailed', 'UpgradeRequired', 'InternalServerError',
'NotImplemented', 'BadGateway', 'ServiceUnavailable', 'GatewayTimeout',
'HttpVersionNotSupported'
:type status_code: str or
~azure.mgmt.recoveryservicesbackup.models.HttpStatusCode
:param headers: HTTP headers associated with this operation.
:type headers: dict[str, list[str]]
"""
_attribute_map = {
'status_code': {'key': 'statusCode', 'type': 'HttpStatusCode'},
'headers': {'key': 'headers', 'type': '{[str]}'},
}
def __init__(self, *, status_code=None, headers=None, **kwargs) -> None:
super(OperationWorkerResponse, self).__init__(**kwargs)
self.status_code = status_code
self.headers = headers
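# The type strings in these maps use a compact container notation: '[T]' for a
# list of T and '{T}' for a dict of str to T, nesting as needed (so the
# '{[str]}' used for ``headers`` above is a dict of str -> list of str). A
# small illustrative interpreter — an assumption about the convention, not the
# SDK's actual parser — makes the notation concrete:

```python
def describe_type(type_str):
    """Expand an attribute-map type string into a readable description."""
    if type_str.startswith('[') and type_str.endswith(']'):
        return 'list of ' + describe_type(type_str[1:-1])
    if type_str.startswith('{') and type_str.endswith('}'):
        return 'dict of str -> ' + describe_type(type_str[1:-1])
    return type_str  # a scalar or model type name

# The headers field of OperationWorkerResponse:
print(describe_type('{[str]}'))  # dict of str -> list of str
```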
class OperationResultInfoBaseResource(OperationWorkerResponse):
"""Base class for operation result info.
:param status_code: HTTP Status Code of the operation. Possible values
include: 'Continue', 'SwitchingProtocols', 'OK', 'Created', 'Accepted',
'NonAuthoritativeInformation', 'NoContent', 'ResetContent',
'PartialContent', 'MultipleChoices', 'Ambiguous', 'MovedPermanently',
'Moved', 'Found', 'Redirect', 'SeeOther', 'RedirectMethod', 'NotModified',
'UseProxy', 'Unused', 'TemporaryRedirect', 'RedirectKeepVerb',
'BadRequest', 'Unauthorized', 'PaymentRequired', 'Forbidden', 'NotFound',
'MethodNotAllowed', 'NotAcceptable', 'ProxyAuthenticationRequired',
'RequestTimeout', 'Conflict', 'Gone', 'LengthRequired',
'PreconditionFailed', 'RequestEntityTooLarge', 'RequestUriTooLong',
'UnsupportedMediaType', 'RequestedRangeNotSatisfiable',
'ExpectationFailed', 'UpgradeRequired', 'InternalServerError',
'NotImplemented', 'BadGateway', 'ServiceUnavailable', 'GatewayTimeout',
'HttpVersionNotSupported'
:type status_code: str or
~azure.mgmt.recoveryservicesbackup.models.HttpStatusCode
:param headers: HTTP headers associated with this operation.
:type headers: dict[str, list[str]]
:param operation: OperationResultInfoBaseResource operation
:type operation:
~azure.mgmt.recoveryservicesbackup.models.OperationResultInfoBase
"""
_attribute_map = {
'status_code': {'key': 'statusCode', 'type': 'HttpStatusCode'},
'headers': {'key': 'headers', 'type': '{[str]}'},
'operation': {'key': 'operation', 'type': 'OperationResultInfoBase'},
}
def __init__(self, *, status_code=None, headers=None, operation=None, **kwargs) -> None:
super(OperationResultInfoBaseResource, self).__init__(status_code=status_code, headers=headers, **kwargs)
self.operation = operation
class OperationStatus(Model):
"""Operation status.
:param id: ID of the operation.
:type id: str
:param name: Name of the operation.
:type name: str
:param status: Operation status. Possible values include: 'Invalid',
'InProgress', 'Succeeded', 'Failed', 'Canceled'
:type status: str or
~azure.mgmt.recoveryservicesbackup.models.OperationStatusValues
:param start_time: Operation start time. Format: ISO-8601.
:type start_time: datetime
:param end_time: Operation end time. Format: ISO-8601.
:type end_time: datetime
:param error: Error information related to this operation.
:type error:
~azure.mgmt.recoveryservicesbackup.models.OperationStatusError
:param properties: Additional information associated with this operation.
:type properties:
~azure.mgmt.recoveryservicesbackup.models.OperationStatusExtendedInfo
"""
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'error': {'key': 'error', 'type': 'OperationStatusError'},
'properties': {'key': 'properties', 'type': 'OperationStatusExtendedInfo'},
}
def __init__(self, *, id: str=None, name: str=None, status=None, start_time=None, end_time=None, error=None, properties=None, **kwargs) -> None:
super(OperationStatus, self).__init__(**kwargs)
self.id = id
self.name = name
self.status = status
self.start_time = start_time
self.end_time = end_time
self.error = error
self.properties = properties
class OperationStatusError(Model):
"""Error information associated with operation status call.
:param code: Error code of the operation failure.
:type code: str
:param message: Error message displayed if the operation fails.
:type message: str
"""
_attribute_map = {
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
}
def __init__(self, *, code: str=None, message: str=None, **kwargs) -> None:
super(OperationStatusError, self).__init__(**kwargs)
self.code = code
self.message = message
class OperationStatusExtendedInfo(Model):
"""Base class for additional information of operation status.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: OperationStatusJobExtendedInfo,
OperationStatusJobsExtendedInfo, OperationStatusProvisionILRExtendedInfo
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
}
_subtype_map = {
'object_type': {'OperationStatusJobExtendedInfo': 'OperationStatusJobExtendedInfo', 'OperationStatusJobsExtendedInfo': 'OperationStatusJobsExtendedInfo', 'OperationStatusProvisionILRExtendedInfo': 'OperationStatusProvisionILRExtendedInfo'}
}
def __init__(self, **kwargs) -> None:
super(OperationStatusExtendedInfo, self).__init__(**kwargs)
self.object_type = None
class OperationStatusJobExtendedInfo(OperationStatusExtendedInfo):
"""Operation status job extended info.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param job_id: ID of the job created for this protected item.
:type job_id: str
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'job_id': {'key': 'jobId', 'type': 'str'},
}
def __init__(self, *, job_id: str=None, **kwargs) -> None:
super(OperationStatusJobExtendedInfo, self).__init__(**kwargs)
self.job_id = job_id
self.object_type = 'OperationStatusJobExtendedInfo'
class OperationStatusJobsExtendedInfo(OperationStatusExtendedInfo):
"""Operation status extended info for list of jobs.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param job_ids: IDs of the jobs created for the protected item.
:type job_ids: list[str]
:param failed_jobs_error: Stores all the failed jobs along with the
corresponding error codes.
:type failed_jobs_error: dict[str, str]
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'job_ids': {'key': 'jobIds', 'type': '[str]'},
'failed_jobs_error': {'key': 'failedJobsError', 'type': '{str}'},
}
def __init__(self, *, job_ids=None, failed_jobs_error=None, **kwargs) -> None:
super(OperationStatusJobsExtendedInfo, self).__init__(**kwargs)
self.job_ids = job_ids
self.failed_jobs_error = failed_jobs_error
self.object_type = 'OperationStatusJobsExtendedInfo'
class OperationStatusProvisionILRExtendedInfo(OperationStatusExtendedInfo):
"""Operation status extended info for ILR provision action.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param recovery_target: Target details for file / folder restore.
:type recovery_target:
~azure.mgmt.recoveryservicesbackup.models.InstantItemRecoveryTarget
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'recovery_target': {'key': 'recoveryTarget', 'type': 'InstantItemRecoveryTarget'},
}
def __init__(self, *, recovery_target=None, **kwargs) -> None:
super(OperationStatusProvisionILRExtendedInfo, self).__init__(**kwargs)
self.recovery_target = recovery_target
self.object_type = 'OperationStatusProvisionILRExtendedInfo'
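# The ``_subtype_map`` on OperationStatusExtendedInfo above drives polymorphic
# deserialization: the server fills the ``objectType`` constant, and the
# client instantiates the matching subclass. A minimal sketch of that
# dispatch, using hypothetical stand-in classes rather than the SDK's models:

```python
class ExtendedInfoBase:
    """Stand-in for OperationStatusExtendedInfo."""
    def __init__(self, **kwargs):
        self.object_type = kwargs.get('objectType')

class JobExtendedInfo(ExtendedInfoBase):
    """Stand-in for OperationStatusJobExtendedInfo."""
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.job_id = kwargs.get('jobId')

# Discriminator value -> concrete class, as in _subtype_map.
_subtype_map = {'OperationStatusJobExtendedInfo': JobExtendedInfo}

def deserialize_extended_info(payload):
    # Unknown discriminator values fall back to the base class.
    cls = _subtype_map.get(payload.get('objectType'), ExtendedInfoBase)
    return cls(**payload)

info = deserialize_extended_info(
    {'objectType': 'OperationStatusJobExtendedInfo', 'jobId': 'job-123'})
# isinstance(info, JobExtendedInfo) is True; info.job_id == 'job-123'
```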
class PointInTimeRange(Model):
"""Provides details for log ranges.
:param start_time: Start time of the time range for log recovery.
:type start_time: datetime
:param end_time: End time of the time range for log recovery.
:type end_time: datetime
"""
_attribute_map = {
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
}
def __init__(self, *, start_time=None, end_time=None, **kwargs) -> None:
super(PointInTimeRange, self).__init__(**kwargs)
self.start_time = start_time
self.end_time = end_time
class PreBackupValidation(Model):
"""Pre-backup validation for Azure VM Workload provider.
:param status: Status of the protectable item, i.e. InProgress, Succeeded
or Failed. Possible values include: 'Invalid', 'Success', 'Failed'
:type status: str or
~azure.mgmt.recoveryservicesbackup.models.InquiryStatus
:param code: Error code of the protectable item.
:type code: str
:param message: Message corresponding to the error code for the
protectable item.
:type message: str
"""
_attribute_map = {
'status': {'key': 'status', 'type': 'str'},
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
}
def __init__(self, *, status=None, code: str=None, message: str=None, **kwargs) -> None:
super(PreBackupValidation, self).__init__(**kwargs)
self.status = status
self.code = code
self.message = message
class PreValidateEnableBackupRequest(Model):
"""Contract to validate if backup can be enabled on the given resource in a
given vault and given configuration.
It will validate followings
1. Vault capacity
2. VM is already protected
3. Any VM related configuration passed in properties.
:param resource_type: ProtectedItem Type- VM, SqlDataBase, AzureFileShare
etc. Possible values include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb',
'SQLDB', 'Exchange', 'Sharepoint', 'VMwareVM', 'SystemState', 'Client',
'GenericDataSource', 'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase',
'SAPAseDatabase'
:type resource_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param resource_id: ARM ID of the virtual machine.
:type resource_id: str
:param vault_id: ARM ID of the Recovery Services vault.
:type vault_id: str
:param properties: Configuration of the VM, if any, that needs to be
validated (e.g. OS type).
:type properties: str
"""
_attribute_map = {
'resource_type': {'key': 'resourceType', 'type': 'str'},
'resource_id': {'key': 'resourceId', 'type': 'str'},
'vault_id': {'key': 'vaultId', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'str'},
}
def __init__(self, *, resource_type=None, resource_id: str=None, vault_id: str=None, properties: str=None, **kwargs) -> None:
super(PreValidateEnableBackupRequest, self).__init__(**kwargs)
self.resource_type = resource_type
self.resource_id = resource_id
self.vault_id = vault_id
self.properties = properties
class PreValidateEnableBackupResponse(Model):
"""Response contract for enable backup validation request.
:param status: Validation Status. Possible values include: 'Invalid',
'Succeeded', 'Failed'
:type status: str or
~azure.mgmt.recoveryservicesbackup.models.ValidationStatus
:param error_code: Response error code.
:type error_code: str
:param error_message: Response error message.
:type error_message: str
:param recommendation: Recommended action for the user.
:type recommendation: str
:param container_name: Specifies the product-specific container name, e.g.
iaasvmcontainer;iaasvmcontainer;rgname;vmname. This is required
for the portal.
:type container_name: str
:param protected_item_name: Specifies the product-specific DS name, e.g.
vm;iaasvmcontainer;rgname;vmname. This is required for the portal.
"""
_attribute_map = {
'status': {'key': 'status', 'type': 'str'},
'error_code': {'key': 'errorCode', 'type': 'str'},
'error_message': {'key': 'errorMessage', 'type': 'str'},
'recommendation': {'key': 'recommendation', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'protected_item_name': {'key': 'protectedItemName', 'type': 'str'},
}
def __init__(self, *, status=None, error_code: str=None, error_message: str=None, recommendation: str=None, container_name: str=None, protected_item_name: str=None, **kwargs) -> None:
super(PreValidateEnableBackupResponse, self).__init__(**kwargs)
self.status = status
self.error_code = error_code
self.error_message = error_message
self.recommendation = recommendation
self.container_name = container_name
self.protected_item_name = protected_item_name
class ProtectableContainerResource(Resource):
"""Protectable Container Class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: ProtectableContainerResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.ProtectableContainer
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ProtectableContainer'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(ProtectableContainerResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class ProtectedItemQueryObject(Model):
"""Filters to list backup items.
:param health_state: Health State for the backed up item. Possible values
include: 'Passed', 'ActionRequired', 'ActionSuggested', 'Invalid'
:type health_state: str or
~azure.mgmt.recoveryservicesbackup.models.HealthState
:param backup_management_type: Backup management type for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param item_type: Type of workload this item represents. Possible values
include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB', 'Exchange',
'Sharepoint', 'VMwareVM', 'SystemState', 'Client', 'GenericDataSource',
'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type item_type: str or
~azure.mgmt.recoveryservicesbackup.models.DataSourceType
:param policy_name: Backup policy name associated with the backup item.
:type policy_name: str
:param container_name: Name of the container.
:type container_name: str
:param backup_engine_name: Backup engine name.
:type backup_engine_name: str
:param friendly_name: Friendly name of the protected item.
:type friendly_name: str
:param fabric_name: Name of the fabric.
:type fabric_name: str
:param backup_set_name: Name of the backup set.
:type backup_set_name: str
"""
_attribute_map = {
'health_state': {'key': 'healthState', 'type': 'str'},
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'item_type': {'key': 'itemType', 'type': 'str'},
'policy_name': {'key': 'policyName', 'type': 'str'},
'container_name': {'key': 'containerName', 'type': 'str'},
'backup_engine_name': {'key': 'backupEngineName', 'type': 'str'},
'friendly_name': {'key': 'friendlyName', 'type': 'str'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
'backup_set_name': {'key': 'backupSetName', 'type': 'str'},
}
def __init__(self, *, health_state=None, backup_management_type=None, item_type=None, policy_name: str=None, container_name: str=None, backup_engine_name: str=None, friendly_name: str=None, fabric_name: str=None, backup_set_name: str=None, **kwargs) -> None:
super(ProtectedItemQueryObject, self).__init__(**kwargs)
self.health_state = health_state
self.backup_management_type = backup_management_type
self.item_type = item_type
self.policy_name = policy_name
self.container_name = container_name
self.backup_engine_name = backup_engine_name
self.friendly_name = friendly_name
self.fabric_name = fabric_name
self.backup_set_name = backup_set_name
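# The _attribute_map above drives msrest serialization: snake_case Python
# attributes map to the camelCase keys sent on the wire, and None values are
# omitted. A minimal sketch of that mapping (an illustrative stand-in with a
# hypothetical policy name, not the SDK's actual serializer):

```python
def to_wire(obj_attrs, attribute_map):
    """Map snake_case attribute values to camelCase wire keys, dropping Nones."""
    return {
        spec["key"]: obj_attrs[name]
        for name, spec in attribute_map.items()
        if obj_attrs.get(name) is not None
    }

# Two entries copied from ProtectedItemQueryObject._attribute_map.
attribute_map = {
    "policy_name": {"key": "policyName", "type": "str"},
    "container_name": {"key": "containerName", "type": "str"},
}
filters = {"policy_name": "DailyPolicy", "container_name": None}
print(to_wire(filters, attribute_map))  # {'policyName': 'DailyPolicy'}
```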
class ProtectedItemResource(Resource):
"""Base class for backup items.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: ProtectedItemResource properties
:type properties: ~azure.mgmt.recoveryservicesbackup.models.ProtectedItem
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ProtectedItem'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(ProtectedItemResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class ProtectionContainerResource(Resource):
"""Base class for container with backup items. Containers with specific
workloads are derived from this class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: ProtectionContainerResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.ProtectionContainer
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ProtectionContainer'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(ProtectionContainerResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class ProtectionIntentQueryObject(Model):
"""Filters to list protection intent.
:param backup_management_type: Backup management type for the backed up
item. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param item_type: Type of workload this item represents. Possible values
include: 'Invalid', 'SQLInstance', 'SQLAvailabilityGroupContainer'
:type item_type: str or
~azure.mgmt.recoveryservicesbackup.models.IntentItemType
:param parent_name: Parent name of the intent
:type parent_name: str
:param item_name: Item name of the intent
:type item_name: str
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'item_type': {'key': 'itemType', 'type': 'str'},
'parent_name': {'key': 'parentName', 'type': 'str'},
'item_name': {'key': 'itemName', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, item_type=None, parent_name: str=None, item_name: str=None, **kwargs) -> None:
super(ProtectionIntentQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.item_type = item_type
self.parent_name = parent_name
self.item_name = item_name
class ProtectionIntentResource(Resource):
"""Base class for backup ProtectionIntent.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: ProtectionIntentResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.ProtectionIntent
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ProtectionIntent'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(ProtectionIntentResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class ProtectionPolicyQueryObject(Model):
"""Filters the list backup policies API.
:param backup_management_type: Backup management type for the backup
policy. Possible values include: 'Invalid', 'AzureIaasVM', 'MAB', 'DPM',
'AzureBackupServer', 'AzureSql', 'AzureStorage', 'AzureWorkload',
'DefaultBackup'
:type backup_management_type: str or
~azure.mgmt.recoveryservicesbackup.models.BackupManagementType
:param fabric_name: Fabric name for filter
:type fabric_name: str
:param workload_type: Workload type for the backup policy. Possible values
include: 'Invalid', 'VM', 'FileFolder', 'AzureSqlDb', 'SQLDB', 'Exchange',
'Sharepoint', 'VMwareVM', 'SystemState', 'Client', 'GenericDataSource',
'SQLDataBase', 'AzureFileShare', 'SAPHanaDatabase', 'SAPAseDatabase'
:type workload_type: str or
~azure.mgmt.recoveryservicesbackup.models.WorkloadType
"""
_attribute_map = {
'backup_management_type': {'key': 'backupManagementType', 'type': 'str'},
'fabric_name': {'key': 'fabricName', 'type': 'str'},
'workload_type': {'key': 'workloadType', 'type': 'str'},
}
def __init__(self, *, backup_management_type=None, fabric_name: str=None, workload_type=None, **kwargs) -> None:
super(ProtectionPolicyQueryObject, self).__init__(**kwargs)
self.backup_management_type = backup_management_type
self.fabric_name = fabric_name
self.workload_type = workload_type
class ProtectionPolicyResource(Resource):
"""Base class for backup policy. Workload-specific backup policies are derived
from this class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: ProtectionPolicyResource properties
:type properties:
~azure.mgmt.recoveryservicesbackup.models.ProtectionPolicy
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'ProtectionPolicy'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(ProtectionPolicyResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class RecoveryPointDiskConfiguration(Model):
"""Disk configuration.
:param number_of_disks_included_in_backup: Number of disks included in
backup
:type number_of_disks_included_in_backup: int
:param number_of_disks_attached_to_vm: Number of disks attached to the VM
:type number_of_disks_attached_to_vm: int
:param included_disk_list: Information of disks included in backup
:type included_disk_list:
list[~azure.mgmt.recoveryservicesbackup.models.DiskInformation]
:param excluded_disk_list: Information of disks excluded from backup
:type excluded_disk_list:
list[~azure.mgmt.recoveryservicesbackup.models.DiskInformation]
"""
_attribute_map = {
'number_of_disks_included_in_backup': {'key': 'numberOfDisksIncludedInBackup', 'type': 'int'},
'number_of_disks_attached_to_vm': {'key': 'numberOfDisksAttachedToVm', 'type': 'int'},
'included_disk_list': {'key': 'includedDiskList', 'type': '[DiskInformation]'},
'excluded_disk_list': {'key': 'excludedDiskList', 'type': '[DiskInformation]'},
}
def __init__(self, *, number_of_disks_included_in_backup: int=None, number_of_disks_attached_to_vm: int=None, included_disk_list=None, excluded_disk_list=None, **kwargs) -> None:
super(RecoveryPointDiskConfiguration, self).__init__(**kwargs)
self.number_of_disks_included_in_backup = number_of_disks_included_in_backup
self.number_of_disks_attached_to_vm = number_of_disks_attached_to_vm
self.included_disk_list = included_disk_list
self.excluded_disk_list = excluded_disk_list
class RecoveryPointResource(Resource):
"""Base class for backup copies. Workload-specific backup copies are derived
from this class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: RecoveryPointResource properties
:type properties: ~azure.mgmt.recoveryservicesbackup.models.RecoveryPoint
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'RecoveryPoint'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(RecoveryPointResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class RecoveryPointTierInformation(Model):
"""Recovery point tier information.
:param type: Recovery point tier type. Possible values include: 'Invalid',
'InstantRP', 'HardenedRP'
:type type: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryPointTierType
:param status: Recovery point tier status. Possible values include:
'Invalid', 'Valid', 'Disabled', 'Deleted'
:type status: str or
~azure.mgmt.recoveryservicesbackup.models.RecoveryPointTierStatus
"""
_attribute_map = {
'type': {'key': 'type', 'type': 'RecoveryPointTierType'},
'status': {'key': 'status', 'type': 'RecoveryPointTierStatus'},
}
def __init__(self, *, type=None, status=None, **kwargs) -> None:
super(RecoveryPointTierInformation, self).__init__(**kwargs)
self.type = type
self.status = status
class ResourceList(Model):
"""Base for all lists of resources.
:param next_link: The URI to fetch the next page of resources. Call
ListNext() to fetch the next page of resources.
:type next_link: str
"""
_attribute_map = {
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(self, *, next_link: str=None, **kwargs) -> None:
super(ResourceList, self).__init__(**kwargs)
self.next_link = next_link
class RestoreFileSpecs(Model):
"""Restore file specs like file path, type and target folder path info.
:param path: Source File/Folder path
:type path: str
:param file_spec_type: Indicates what the Path variable stands for
:type file_spec_type: str
:param target_folder_path: Destination folder path in target FileShare
:type target_folder_path: str
"""
_attribute_map = {
'path': {'key': 'path', 'type': 'str'},
'file_spec_type': {'key': 'fileSpecType', 'type': 'str'},
'target_folder_path': {'key': 'targetFolderPath', 'type': 'str'},
}
def __init__(self, *, path: str=None, file_spec_type: str=None, target_folder_path: str=None, **kwargs) -> None:
super(RestoreFileSpecs, self).__init__(**kwargs)
self.path = path
self.file_spec_type = file_spec_type
self.target_folder_path = target_folder_path
class RestoreRequestResource(Resource):
"""Base class for restore request. Workload-specific restore requests are
derived from this class.
Variables are only populated by the server, and will be ignored when
sending a request.
:ivar id: Resource Id represents the complete path to the resource.
:vartype id: str
:ivar name: Resource name associated with the resource.
:vartype name: str
:ivar type: Resource type represents the complete path of the form
Namespace/ResourceType/ResourceType/...
:vartype type: str
:param location: Resource location.
:type location: str
:param tags: Resource tags.
:type tags: dict[str, str]
:param e_tag: Optional ETag.
:type e_tag: str
:param properties: RestoreRequestResource properties
:type properties: ~azure.mgmt.recoveryservicesbackup.models.RestoreRequest
"""
_validation = {
'id': {'readonly': True},
'name': {'readonly': True},
'type': {'readonly': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'type': {'key': 'type', 'type': 'str'},
'location': {'key': 'location', 'type': 'str'},
'tags': {'key': 'tags', 'type': '{str}'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'properties': {'key': 'properties', 'type': 'RestoreRequest'},
}
def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
super(RestoreRequestResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
self.properties = properties
class RetentionDuration(Model):
"""Retention duration.
:param count: Count of duration types. Retention duration is obtained by
counting the duration type Count times.
For example, when Count = 3 and DurationType = Weeks, retention duration
will be three weeks.
:type count: int
:param duration_type: Retention duration type of retention policy.
Possible values include: 'Invalid', 'Days', 'Weeks', 'Months', 'Years'
:type duration_type: str or
~azure.mgmt.recoveryservicesbackup.models.RetentionDurationType
"""
_attribute_map = {
'count': {'key': 'count', 'type': 'int'},
'duration_type': {'key': 'durationType', 'type': 'str'},
}
def __init__(self, *, count: int=None, duration_type=None, **kwargs) -> None:
super(RetentionDuration, self).__init__(**kwargs)
self.count = count
self.duration_type = duration_type
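# As the docstring notes, the retention window is DurationType counted Count
# times. A hypothetical helper (not part of the SDK) that approximates that
# window as a timedelta, using 30-day months and 365-day years:

```python
from datetime import timedelta

_APPROX_DAYS = {"Days": 1, "Weeks": 7, "Months": 30, "Years": 365}

def approximate_retention_window(count, duration_type):
    """Approximate `count` repetitions of `duration_type` as a timedelta."""
    if duration_type not in _APPROX_DAYS:
        raise ValueError("unsupported duration type: %r" % duration_type)
    return timedelta(days=count * _APPROX_DAYS[duration_type])

# Count = 3 and DurationType = Weeks gives a three-week window.
print(approximate_retention_window(3, "Weeks"))  # 21 days, 0:00:00
```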
class Settings(Model):
"""Common settings field for backup management.
:param time_zone: TimeZone optional input as string. For example: TimeZone
= "Pacific Standard Time".
:type time_zone: str
:param issqlcompression: SQL compression flag
:type issqlcompression: bool
:param is_compression: Workload compression flag. This has been added so
that 'isSqlCompression' can be deprecated once clients upgrade to consider
this flag.
:type is_compression: bool
"""
_attribute_map = {
'time_zone': {'key': 'timeZone', 'type': 'str'},
'issqlcompression': {'key': 'issqlcompression', 'type': 'bool'},
'is_compression': {'key': 'isCompression', 'type': 'bool'},
}
def __init__(self, *, time_zone: str=None, issqlcompression: bool=None, is_compression: bool=None, **kwargs) -> None:
super(Settings, self).__init__(**kwargs)
self.time_zone = time_zone
self.issqlcompression = issqlcompression
self.is_compression = is_compression
class SimpleRetentionPolicy(RetentionPolicy):
"""Simple policy retention.
All required parameters must be populated in order to send to Azure.
:param retention_policy_type: Required. Constant filled by server.
:type retention_policy_type: str
:param retention_duration: Retention duration of the protection policy.
:type retention_duration:
~azure.mgmt.recoveryservicesbackup.models.RetentionDuration
"""
_validation = {
'retention_policy_type': {'required': True},
}
_attribute_map = {
'retention_policy_type': {'key': 'retentionPolicyType', 'type': 'str'},
'retention_duration': {'key': 'retentionDuration', 'type': 'RetentionDuration'},
}
def __init__(self, *, retention_duration=None, **kwargs) -> None:
super(SimpleRetentionPolicy, self).__init__(**kwargs)
self.retention_duration = retention_duration
self.retention_policy_type = 'SimpleRetentionPolicy'
class SimpleSchedulePolicy(SchedulePolicy):
"""Simple policy schedule.
All required parameters must be populated in order to send to Azure.
:param schedule_policy_type: Required. Constant filled by server.
:type schedule_policy_type: str
:param schedule_run_frequency: Frequency of the schedule operation of this
policy. Possible values include: 'Invalid', 'Daily', 'Weekly'
:type schedule_run_frequency: str or
~azure.mgmt.recoveryservicesbackup.models.ScheduleRunType
:param schedule_run_days: List of days of week this schedule has to be
run.
:type schedule_run_days: list[str or
~azure.mgmt.recoveryservicesbackup.models.DayOfWeek]
:param schedule_run_times: List of times of day this schedule has to be
run.
:type schedule_run_times: list[datetime]
:param schedule_weekly_frequency: Interval, in weeks, at which this
schedule has to be run.
:type schedule_weekly_frequency: int
"""
_validation = {
'schedule_policy_type': {'required': True},
}
_attribute_map = {
'schedule_policy_type': {'key': 'schedulePolicyType', 'type': 'str'},
'schedule_run_frequency': {'key': 'scheduleRunFrequency', 'type': 'str'},
'schedule_run_days': {'key': 'scheduleRunDays', 'type': '[DayOfWeek]'},
'schedule_run_times': {'key': 'scheduleRunTimes', 'type': '[iso-8601]'},
'schedule_weekly_frequency': {'key': 'scheduleWeeklyFrequency', 'type': 'int'},
}
def __init__(self, *, schedule_run_frequency=None, schedule_run_days=None, schedule_run_times=None, schedule_weekly_frequency: int=None, **kwargs) -> None:
super(SimpleSchedulePolicy, self).__init__(**kwargs)
self.schedule_run_frequency = schedule_run_frequency
self.schedule_run_days = schedule_run_days
self.schedule_run_times = schedule_run_times
self.schedule_weekly_frequency = schedule_weekly_frequency
self.schedule_policy_type = 'SimpleSchedulePolicy'
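# Following the _attribute_map keys above, a weekly SimpleSchedulePolicy
# serializes to JSON shaped roughly like the dict below (an illustrative
# sketch with hypothetical run days and times; msrest performs the real
# serialization):

```python
from datetime import datetime, timezone

schedule_payload = {
    "schedulePolicyType": "SimpleSchedulePolicy",
    "scheduleRunFrequency": "Weekly",
    "scheduleRunDays": ["Sunday", "Wednesday"],
    "scheduleRunTimes": [
        datetime(2019, 1, 6, 2, 0, tzinfo=timezone.utc).isoformat()
    ],
    "scheduleWeeklyFrequency": 1,
}
print(schedule_payload["scheduleRunTimes"][0])  # 2019-01-06T02:00:00+00:00
```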
class SQLDataDirectory(Model):
"""SQLDataDirectory info.
:param type: Type of data directory mapping. Possible values include:
'Invalid', 'Data', 'Log'
:type type: str or
~azure.mgmt.recoveryservicesbackup.models.SQLDataDirectoryType
:param path: File path
:type path: str
:param logical_name: Logical name of the file
:type logical_name: str
"""
_attribute_map = {
'type': {'key': 'type', 'type': 'str'},
'path': {'key': 'path', 'type': 'str'},
'logical_name': {'key': 'logicalName', 'type': 'str'},
}
def __init__(self, *, type=None, path: str=None, logical_name: str=None, **kwargs) -> None:
super(SQLDataDirectory, self).__init__(**kwargs)
self.type = type
self.path = path
self.logical_name = logical_name
class SQLDataDirectoryMapping(Model):
"""Encapsulates information regarding data directory.
:param mapping_type: Type of data directory mapping. Possible values
include: 'Invalid', 'Data', 'Log'
:type mapping_type: str or
~azure.mgmt.recoveryservicesbackup.models.SQLDataDirectoryType
:param source_logical_name: Restore source logical name path
:type source_logical_name: str
:param source_path: Restore source path
:type source_path: str
:param target_path: Target path
:type target_path: str
"""
_attribute_map = {
'mapping_type': {'key': 'mappingType', 'type': 'str'},
'source_logical_name': {'key': 'sourceLogicalName', 'type': 'str'},
'source_path': {'key': 'sourcePath', 'type': 'str'},
'target_path': {'key': 'targetPath', 'type': 'str'},
}
def __init__(self, *, mapping_type=None, source_logical_name: str=None, source_path: str=None, target_path: str=None, **kwargs) -> None:
super(SQLDataDirectoryMapping, self).__init__(**kwargs)
self.mapping_type = mapping_type
self.source_logical_name = source_logical_name
self.source_path = source_path
self.target_path = target_path
class SubProtectionPolicy(Model):
"""Sub-protection policy which includes schedule and retention.
:param policy_type: Type of backup policy. Possible values include:
'Invalid', 'Full', 'Differential', 'Log', 'CopyOnlyFull'
:type policy_type: str or
~azure.mgmt.recoveryservicesbackup.models.PolicyType
:param schedule_policy: Backup schedule specified as part of backup
policy.
:type schedule_policy:
~azure.mgmt.recoveryservicesbackup.models.SchedulePolicy
:param retention_policy: Retention policy with the details on backup copy
retention ranges.
:type retention_policy:
~azure.mgmt.recoveryservicesbackup.models.RetentionPolicy
"""
_attribute_map = {
'policy_type': {'key': 'policyType', 'type': 'str'},
'schedule_policy': {'key': 'schedulePolicy', 'type': 'SchedulePolicy'},
'retention_policy': {'key': 'retentionPolicy', 'type': 'RetentionPolicy'},
}
def __init__(self, *, policy_type=None, schedule_policy=None, retention_policy=None, **kwargs) -> None:
super(SubProtectionPolicy, self).__init__(**kwargs)
self.policy_type = policy_type
self.schedule_policy = schedule_policy
self.retention_policy = retention_policy
class TargetAFSRestoreInfo(Model):
"""Target Azure File Share Info.
:param name: File share name
:type name: str
:param target_resource_id: Target file share resource ARM ID
:type target_resource_id: str
"""
_attribute_map = {
'name': {'key': 'name', 'type': 'str'},
'target_resource_id': {'key': 'targetResourceId', 'type': 'str'},
}
def __init__(self, *, name: str=None, target_resource_id: str=None, **kwargs) -> None:
super(TargetAFSRestoreInfo, self).__init__(**kwargs)
self.name = name
self.target_resource_id = target_resource_id
class TargetRestoreInfo(Model):
"""Details about target workload during restore operation.
:param overwrite_option: Overwrite option when the target database already
exists. Possible values include: 'Invalid', 'FailOnConflict', 'Overwrite'
:type overwrite_option: str or
~azure.mgmt.recoveryservicesbackup.models.OverwriteOptions
:param container_id: Resource Id of the container in which the target
database resides.
:type container_id: str
:param database_name: Database name, in the form InstanceName/DatabaseName
for SQL or System/DbName for SAP HANA.
:type database_name: str
:param target_directory_for_file_restore: Target directory location for
restore as files.
:type target_directory_for_file_restore: str
"""
_attribute_map = {
'overwrite_option': {'key': 'overwriteOption', 'type': 'str'},
'container_id': {'key': 'containerId', 'type': 'str'},
'database_name': {'key': 'databaseName', 'type': 'str'},
'target_directory_for_file_restore': {'key': 'targetDirectoryForFileRestore', 'type': 'str'},
}
def __init__(self, *, overwrite_option=None, container_id: str=None, database_name: str=None, target_directory_for_file_restore: str=None, **kwargs) -> None:
super(TargetRestoreInfo, self).__init__(**kwargs)
self.overwrite_option = overwrite_option
self.container_id = container_id
self.database_name = database_name
self.target_directory_for_file_restore = target_directory_for_file_restore
class TokenInformation(Model):
"""The token information details.
:param token: Token value.
:type token: str
:param expiry_time_in_utc_ticks: Expiry time of token.
:type expiry_time_in_utc_ticks: long
:param security_pin: Security PIN
:type security_pin: str
"""
_attribute_map = {
'token': {'key': 'token', 'type': 'str'},
'expiry_time_in_utc_ticks': {'key': 'expiryTimeInUtcTicks', 'type': 'long'},
'security_pin': {'key': 'securityPIN', 'type': 'str'},
}
def __init__(self, *, token: str=None, expiry_time_in_utc_ticks: int=None, security_pin: str=None, **kwargs) -> None:
super(TokenInformation, self).__init__(**kwargs)
self.token = token
self.expiry_time_in_utc_ticks = expiry_time_in_utc_ticks
self.security_pin = security_pin
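# expiry_time_in_utc_ticks is a .NET-style tick count: 100-nanosecond
# intervals since 0001-01-01 UTC. A hypothetical converter (not part of the
# SDK) from ticks to a datetime:

```python
from datetime import datetime, timedelta

_DOTNET_EPOCH = datetime(1, 1, 1)

def ticks_to_datetime(ticks):
    """Convert a .NET UTC tick count (100 ns units) to a naive UTC datetime."""
    return _DOTNET_EPOCH + timedelta(microseconds=ticks // 10)

# 621355968000000000 ticks is the Unix epoch.
print(ticks_to_datetime(621355968000000000))  # 1970-01-01 00:00:00
```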
class ValidateOperationRequest(Model):
"""Base class for validate operation request.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: ValidateRestoreOperationRequest
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
}
_subtype_map = {
'object_type': {'ValidateRestoreOperationRequest': 'ValidateRestoreOperationRequest'}
}
def __init__(self, **kwargs) -> None:
super(ValidateOperationRequest, self).__init__(**kwargs)
self.object_type = None
class ValidateRestoreOperationRequest(ValidateOperationRequest):
"""AzureRestoreValidation request.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: ValidateIaasVMRestoreOperationRequest
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param restore_request: Sets the restore request to be validated.
:type restore_request:
~azure.mgmt.recoveryservicesbackup.models.RestoreRequest
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'restore_request': {'key': 'restoreRequest', 'type': 'RestoreRequest'},
}
_subtype_map = {
'object_type': {'ValidateIaasVMRestoreOperationRequest': 'ValidateIaasVMRestoreOperationRequest'}
}
def __init__(self, *, restore_request=None, **kwargs) -> None:
super(ValidateRestoreOperationRequest, self).__init__(**kwargs)
self.restore_request = restore_request
self.object_type = 'ValidateRestoreOperationRequest'
class ValidateIaasVMRestoreOperationRequest(ValidateRestoreOperationRequest):
"""AzureRestoreValidation request.
All required parameters must be populated in order to send to Azure.
:param object_type: Required. Constant filled by server.
:type object_type: str
:param restore_request: Sets the restore request to be validated.
:type restore_request:
~azure.mgmt.recoveryservicesbackup.models.RestoreRequest
"""
_validation = {
'object_type': {'required': True},
}
_attribute_map = {
'object_type': {'key': 'objectType', 'type': 'str'},
'restore_request': {'key': 'restoreRequest', 'type': 'RestoreRequest'},
}
def __init__(self, *, restore_request=None, **kwargs) -> None:
super(ValidateIaasVMRestoreOperationRequest, self).__init__(restore_request=restore_request, **kwargs)
self.object_type = 'ValidateIaasVMRestoreOperationRequest'
class ValidateOperationResponse(Model):
"""Base class for validate operation response.
:param validation_results: Gets the validation result
:type validation_results:
list[~azure.mgmt.recoveryservicesbackup.models.ErrorDetail]
"""
_attribute_map = {
'validation_results': {'key': 'validationResults', 'type': '[ErrorDetail]'},
}
def __init__(self, *, validation_results=None, **kwargs) -> None:
super(ValidateOperationResponse, self).__init__(**kwargs)
self.validation_results = validation_results
class ValidateOperationsResponse(Model):
"""ValidateOperationsResponse.
:param validate_operation_response:
:type validate_operation_response:
~azure.mgmt.recoveryservicesbackup.models.ValidateOperationResponse
"""
_attribute_map = {
'validate_operation_response': {'key': 'validateOperationResponse', 'type': 'ValidateOperationResponse'},
}
def __init__(self, *, validate_operation_response=None, **kwargs) -> None:
super(ValidateOperationsResponse, self).__init__(**kwargs)
self.validate_operation_response = validate_operation_response
class WeeklyRetentionFormat(Model):
"""Weekly retention format.
:param days_of_the_week: List of days of the week.
:type days_of_the_week: list[str or
~azure.mgmt.recoveryservicesbackup.models.DayOfWeek]
:param weeks_of_the_month: List of weeks of month.
:type weeks_of_the_month: list[str or
~azure.mgmt.recoveryservicesbackup.models.WeekOfMonth]
"""
_attribute_map = {
'days_of_the_week': {'key': 'daysOfTheWeek', 'type': '[DayOfWeek]'},
'weeks_of_the_month': {'key': 'weeksOfTheMonth', 'type': '[WeekOfMonth]'},
}
def __init__(self, *, days_of_the_week=None, weeks_of_the_month=None, **kwargs) -> None:
super(WeeklyRetentionFormat, self).__init__(**kwargs)
self.days_of_the_week = days_of_the_week
self.weeks_of_the_month = weeks_of_the_month
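# A WeeklyRetentionFormat spec (day names plus week-of-month labels) can be
# expanded into concrete dates. A hypothetical helper, assuming the standard
# WeekOfMonth values 'First'..'Fourth' plus 'Last':

```python
import calendar

_WEEK_INDEX = {"First": 0, "Second": 1, "Third": 2, "Fourth": 3}

def matching_dates(year, month, days_of_the_week, weeks_of_the_month):
    """Dates in (year, month) matching the given days and weeks-of-month."""
    cal = calendar.Calendar()
    dates = []
    for day_name in days_of_the_week:
        weekday = list(calendar.day_name).index(day_name)
        month_days = [d for d in cal.itermonthdates(year, month)
                      if d.month == month and d.weekday() == weekday]
        for week in weeks_of_the_month:
            if week == "Last":
                dates.append(month_days[-1])
            else:
                dates.append(month_days[_WEEK_INDEX[week]])
    return sorted(dates)

# First and Last Sunday of January 2019.
print(matching_dates(2019, 1, ["Sunday"], ["First", "Last"]))
```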
class WeeklyRetentionSchedule(Model):
"""Weekly retention schedule.
:param days_of_the_week: List of days of week for weekly retention policy.
:type days_of_the_week: list[str or
~azure.mgmt.recoveryservicesbackup.models.DayOfWeek]
:param retention_times: Retention times of retention policy.
:type retention_times: list[datetime]
:param retention_duration: Retention duration of retention Policy.
:type retention_duration:
~azure.mgmt.recoveryservicesbackup.models.RetentionDuration
"""
_attribute_map = {
'days_of_the_week': {'key': 'daysOfTheWeek', 'type': '[DayOfWeek]'},
'retention_times': {'key': 'retentionTimes', 'type': '[iso-8601]'},
'retention_duration': {'key': 'retentionDuration', 'type': 'RetentionDuration'},
}
def __init__(self, *, days_of_the_week=None, retention_times=None, retention_duration=None, **kwargs) -> None:
super(WeeklyRetentionSchedule, self).__init__(**kwargs)
self.days_of_the_week = days_of_the_week
self.retention_times = retention_times
self.retention_duration = retention_duration
class WorkloadInquiryDetails(Model):
"""Details of an inquired protectable item.
:param type: Type of the Workload such as SQL, Oracle etc.
:type type: str
:param item_count: Count of protectable items inside this container.
:type item_count: long
:param inquiry_validation: Inquiry validation such as permissions and
other backup validations.
:type inquiry_validation:
~azure.mgmt.recoveryservicesbackup.models.InquiryValidation
"""
_attribute_map = {
'type': {'key': 'type', 'type': 'str'},
'item_count': {'key': 'itemCount', 'type': 'long'},
'inquiry_validation': {'key': 'inquiryValidation', 'type': 'InquiryValidation'},
}
def __init__(self, *, type: str=None, item_count: int=None, inquiry_validation=None, **kwargs) -> None:
super(WorkloadInquiryDetails, self).__init__(**kwargs)
self.type = type
self.item_count = item_count
self.inquiry_validation = inquiry_validation


class WorkloadItemResource(Resource):
    """Base class for backup item. Workload-specific backup items are derived
    from this class.

    Variables are only populated by the server, and will be ignored when
    sending a request.

    :ivar id: Resource Id represents the complete path to the resource.
    :vartype id: str
    :ivar name: Resource name associated with the resource.
    :vartype name: str
    :ivar type: Resource type represents the complete path of the form
     Namespace/ResourceType/ResourceType/...
    :vartype type: str
    :param location: Resource location.
    :type location: str
    :param tags: Resource tags.
    :type tags: dict[str, str]
    :param e_tag: Optional ETag.
    :type e_tag: str
    :param properties: WorkloadItemResource properties
    :type properties: ~azure.mgmt.recoveryservicesbackup.models.WorkloadItem
    """

    _validation = {
        'id': {'readonly': True},
        'name': {'readonly': True},
        'type': {'readonly': True},
    }

    _attribute_map = {
        'id': {'key': 'id', 'type': 'str'},
        'name': {'key': 'name', 'type': 'str'},
        'type': {'key': 'type', 'type': 'str'},
        'location': {'key': 'location', 'type': 'str'},
        'tags': {'key': 'tags', 'type': '{str}'},
        'e_tag': {'key': 'eTag', 'type': 'str'},
        'properties': {'key': 'properties', 'type': 'WorkloadItem'},
    }

    def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
        super(WorkloadItemResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
        self.properties = properties


class WorkloadProtectableItemResource(Resource):
    """Base class for backup item. Workload-specific backup items are derived
    from this class.

    Variables are only populated by the server, and will be ignored when
    sending a request.

    :ivar id: Resource Id represents the complete path to the resource.
    :vartype id: str
    :ivar name: Resource name associated with the resource.
    :vartype name: str
    :ivar type: Resource type represents the complete path of the form
     Namespace/ResourceType/ResourceType/...
    :vartype type: str
    :param location: Resource location.
    :type location: str
    :param tags: Resource tags.
    :type tags: dict[str, str]
    :param e_tag: Optional ETag.
    :type e_tag: str
    :param properties: WorkloadProtectableItemResource properties
    :type properties:
     ~azure.mgmt.recoveryservicesbackup.models.WorkloadProtectableItem
    """

    _validation = {
        'id': {'readonly': True},
        'name': {'readonly': True},
        'type': {'readonly': True},
    }

    _attribute_map = {
        'id': {'key': 'id', 'type': 'str'},
        'name': {'key': 'name', 'type': 'str'},
        'type': {'key': 'type', 'type': 'str'},
        'location': {'key': 'location', 'type': 'str'},
        'tags': {'key': 'tags', 'type': '{str}'},
        'e_tag': {'key': 'eTag', 'type': 'str'},
        'properties': {'key': 'properties', 'type': 'WorkloadProtectableItem'},
    }

    def __init__(self, *, location: str=None, tags=None, e_tag: str=None, properties=None, **kwargs) -> None:
        super(WorkloadProtectableItemResource, self).__init__(location=location, tags=tags, e_tag=e_tag, **kwargs)
        self.properties = properties


class YearlyRetentionSchedule(Model):
    """Yearly retention schedule.

    :param retention_schedule_format_type: Retention schedule format for
     yearly retention policy. Possible values include: 'Invalid', 'Daily',
     'Weekly'
    :type retention_schedule_format_type: str or
     ~azure.mgmt.recoveryservicesbackup.models.RetentionScheduleFormat
    :param months_of_year: List of months of year of yearly retention policy.
    :type months_of_year: list[str or
     ~azure.mgmt.recoveryservicesbackup.models.MonthOfYear]
    :param retention_schedule_daily: Daily retention format for yearly
     retention policy.
    :type retention_schedule_daily:
     ~azure.mgmt.recoveryservicesbackup.models.DailyRetentionFormat
    :param retention_schedule_weekly: Weekly retention format for yearly
     retention policy.
    :type retention_schedule_weekly:
     ~azure.mgmt.recoveryservicesbackup.models.WeeklyRetentionFormat
    :param retention_times: Retention times of retention policy.
    :type retention_times: list[datetime]
    :param retention_duration: Retention duration of retention Policy.
    :type retention_duration:
     ~azure.mgmt.recoveryservicesbackup.models.RetentionDuration
    """

    _attribute_map = {
        'retention_schedule_format_type': {'key': 'retentionScheduleFormatType', 'type': 'str'},
        'months_of_year': {'key': 'monthsOfYear', 'type': '[MonthOfYear]'},
        'retention_schedule_daily': {'key': 'retentionScheduleDaily', 'type': 'DailyRetentionFormat'},
        'retention_schedule_weekly': {'key': 'retentionScheduleWeekly', 'type': 'WeeklyRetentionFormat'},
        'retention_times': {'key': 'retentionTimes', 'type': '[iso-8601]'},
        'retention_duration': {'key': 'retentionDuration', 'type': 'RetentionDuration'},
    }

    def __init__(self, *, retention_schedule_format_type=None, months_of_year=None, retention_schedule_daily=None, retention_schedule_weekly=None, retention_times=None, retention_duration=None, **kwargs) -> None:
        super(YearlyRetentionSchedule, self).__init__(**kwargs)
        self.retention_schedule_format_type = retention_schedule_format_type
        self.months_of_year = months_of_year
        self.retention_schedule_daily = retention_schedule_daily
        self.retention_schedule_weekly = retention_schedule_weekly
        self.retention_times = retention_times
        self.retention_duration = retention_duration
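The `_attribute_map` dictionaries in these models drive serialization: Python attribute names are renamed to their wire-format keys. The sketch below is a hypothetical, minimal illustration of that pattern; the real msrest `Serializer` additionally interprets the `'type'` strings (`'[iso-8601]'`, nested model names, and so on).

```python
# Hypothetical, minimal illustration of the _attribute_map pattern above:
# each Python attribute name maps to a wire-format key
# ('retention_times' -> 'retentionTimes').
ATTRIBUTE_MAP = {
    'retention_schedule_format_type': {'key': 'retentionScheduleFormatType', 'type': 'str'},
    'retention_times': {'key': 'retentionTimes', 'type': '[iso-8601]'},
}


def serialize(attrs, attribute_map):
    # Rename populated attributes to their wire keys; drop unset ones.
    return {
        meta['key']: attrs[name]
        for name, meta in attribute_map.items()
        if attrs.get(name) is not None
    }


body = serialize({'retention_schedule_format_type': 'Weekly', 'retention_times': None}, ATTRIBUTE_MAP)
print(body)  # {'retentionScheduleFormatType': 'Weekly'}
```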
| 47.8957 | 1,088 | 0.709036 | 56,500 | 500,079 | 6.023204 | 0.025274 | 0.027789 | 0.03009 | 0.039141 | 0.809313 | 0.779211 | 0.755835 | 0.734023 | 0.717467 | 0.703263 | 0 | 0.001026 | 0.17926 | 500,079 | 10,440 | 1,089 | 47.900287 | 0.828121 | 0.441786 | 0 | 0.584091 | 0 | 0 | 0.277432 | 0.078352 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058586 | false | 0 | 0.000253 | 0 | 0.22096 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b9b8b5d503018881e4f1da8f324a225697b976e2 | 3,760 | py | Python | test/test_tradeoff_analytics_v1.py | SeptBlast/python-sdk | 8ba86b8abbff7cd020303b877d730130696ea21d | [
"Apache-2.0"
] | 4 | 2019-03-19T05:07:32.000Z | 2021-08-12T13:11:30.000Z | test/test_tradeoff_analytics_v1.py | SeptBlast/python-sdk | 8ba86b8abbff7cd020303b877d730130696ea21d | [
"Apache-2.0"
] | 2 | 2020-02-12T00:05:29.000Z | 2020-06-05T17:51:24.000Z | test/test_tradeoff_analytics_v1.py | SeptBlast/python-sdk | 8ba86b8abbff7cd020303b877d730130696ea21d | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
import json
import os

import responses

import watson_developer_cloud

dilemmas_url = 'https://gateway.watsonplatform.net/tradeoff-analytics/api/v1/dilemmas'


@responses.activate
def test_visualization_no_preferable_options():
    with open(os.path.join(os.path.dirname(__file__), '../resources/tradeoff-expect1.txt')) as expect_file:
        dilemmas_response = expect_file.read()

    responses.add(responses.POST, dilemmas_url,
                  body=dilemmas_response, status=200,
                  content_type='application/json')

    tradeoff_analytics = watson_developer_cloud.TradeoffAnalyticsV1(
        username="username", password="password")

    with open(os.path.join(os.path.dirname(__file__), '../resources/problem.json')) as data_file:
        tradeoff_analytics.dilemmas(json.load(data_file))

    assert 'generate_visualization=true' in responses.calls[0].request.url
    assert responses.calls[0].response.text == dilemmas_response
    assert len(responses.calls) == 1


@responses.activate
def test_no_visualization_no_preferable_options():
    with open(os.path.join(os.path.dirname(__file__), '../resources/tradeoff-expect2.txt')) as expect_file:
        dilemmas_response = expect_file.read()

    responses.add(responses.POST, dilemmas_url,
                  body=dilemmas_response, status=200,
                  content_type='application/json')

    tradeoff_analytics = watson_developer_cloud.TradeoffAnalyticsV1(
        username="username", password="password")

    with open(os.path.join(os.path.dirname(__file__), '../resources/problem.json')) as data_file:
        tradeoff_analytics.dilemmas(json.load(data_file), generate_visualization=False)

    assert 'generate_visualization=false' in responses.calls[0].request.url
    assert responses.calls[0].response.text == dilemmas_response
    assert len(responses.calls) == 1


@responses.activate
def test_no_visualization_preferable_options():
    with open(os.path.join(os.path.dirname(__file__), '../resources/tradeoff-expect3.txt')) as expect_file:
        dilemmas_response = expect_file.read()

    responses.add(responses.POST, dilemmas_url,
                  body=dilemmas_response, status=200,
                  content_type='application/json')

    tradeoff_analytics = watson_developer_cloud.TradeoffAnalyticsV1(
        username="username", password="password")

    with open(os.path.join(os.path.dirname(__file__), '../resources/problem.json')) as data_file:
        tradeoff_analytics.dilemmas(
            json.load(data_file),
            generate_visualization=False,
            find_preferable_options=True)

    assert 'find_preferable_options=true' in responses.calls[0].request.url
    assert responses.calls[0].response.text == dilemmas_response
    assert len(responses.calls) == 1


@responses.activate
def test_visualization_preferable_options():
    with open(os.path.join(os.path.dirname(__file__), '../resources/tradeoff-expect4.txt')) as expect_file:
        dilemmas_response = expect_file.read()

    responses.add(responses.POST, dilemmas_url,
                  body=dilemmas_response, status=200,
                  content_type='application/json')

    tradeoff_analytics = watson_developer_cloud.TradeoffAnalyticsV1(
        username="username", password="password")

    with open(os.path.join(os.path.dirname(__file__), '../resources/problem.json')) as data_file:
        tradeoff_analytics.dilemmas(
            json.load(data_file),
            find_preferable_options=True)

    assert 'generate_visualization=true' in responses.calls[0].request.url
    assert 'find_preferable_options=true' in responses.calls[0].request.url
    assert responses.calls[0].response.text == dilemmas_response
    assert len(responses.calls) == 1
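The tests above assert on raw substrings of the mocked request URL. Parsing the query string is a more robust variant; the URL below is a hypothetical example in the same shape as the requests those tests capture.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical request URL of the kind the mocked `responses` calls record.
url = ('https://gateway.watsonplatform.net/tradeoff-analytics/api/v1/dilemmas'
       '?generate_visualization=true&find_preferable_options=true')

# parse_qs maps each query parameter to a list of its values.
params = parse_qs(urlparse(url).query)
print(params['generate_visualization'])  # ['true']
```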
| 40 | 107 | 0.723404 | 446 | 3,760 | 5.838565 | 0.154709 | 0.036866 | 0.051843 | 0.043011 | 0.923579 | 0.895161 | 0.895161 | 0.895161 | 0.895161 | 0.895161 | 0 | 0.011118 | 0.162766 | 3,760 | 93 | 108 | 40.430108 | 0.816074 | 0.003191 | 0 | 0.746269 | 0 | 0 | 0.151361 | 0.098772 | 0 | 0 | 0 | 0 | 0.19403 | 1 | 0.059701 | false | 0.059701 | 0.059701 | 0 | 0.119403 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
b9ec180bd3a71b5d930124a822f7f7a0078d3610 | 10,928 | py | Python | airplayer/appletv.py | songww/Airplayer | 63308a5d91d3443074472dc7c915eeb700a6e945 | [
"BSD-4-Clause"
] | 86 | 2015-01-15T14:26:49.000Z | 2021-12-25T11:01:25.000Z | airplayer/appletv.py | songww/Airplayer | 63308a5d91d3443074472dc7c915eeb700a6e945 | [
"BSD-4-Clause"
] | 3 | 2016-06-22T22:17:16.000Z | 2021-06-30T21:53:25.000Z | airplayer/appletv.py | songww/Airplayer | 63308a5d91d3443074472dc7c915eeb700a6e945 | [
"BSD-4-Clause"
] | 16 | 2015-04-26T14:41:41.000Z | 2021-06-30T12:05:37.000Z | DEVICE_INFO = {
    'deviceid': 'FF:FF:FF:FF:FF:FF',
    'features': '0x77',
    'model': 'AppleTV2,1',
    'srcvers': '101.10'
}
SLIDESHOW_FEATURES = '<?xml version="1.0" encoding="UTF-8"?>\
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">\
<plist version="1.0">\
<dict>\
<key>themes</key>\
<array>\
<dict>\
<key>key</key>\
<string>KenBurns</string>\
<key>name</key>\
<string>Ken Burns</string>\
<key>transitions</key>\
<array>\
<dict>\
<key>key</key>\
<string>None</string>\
<key>name</key>\
<string>None</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Cube</string>\
<key>name</key>\
<string>Cube</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Dissolve</string>\
<key>name</key>\
<string>Dissolve</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Droplet</string>\
<key>name</key>\
<string>Droplet</string>\
</dict>\
<dict>\
<key>key</key>\
<string>FadeThruColor</string>\
<key>name</key>\
<string>Fade Through White</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Flip</string>\
<key>name</key>\
<string>Flip</string>\
</dict>\
<dict>\
<key>key</key>\
<string>TileFlip</string>\
<key>name</key>\
<string>Mosaic Flip</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>MoveIn</string>\
<key>name</key>\
<string>Move In</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>PageFlip</string>\
<key>name</key>\
<string>Page Flip</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Push</string>\
<key>name</key>\
<string>Push</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Reveal</string>\
<key>name</key>\
<string>Reveal</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Twirl</string>\
<key>name</key>\
<string>Twirl</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Wipe</string>\
<key>name</key>\
<string>Wipe</string>\
</dict>\
</array>\
</dict>\
<dict>\
<key>key</key>\
<string>Origami</string>\
<key>name</key>\
<string>Origami</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Reflections</string>\
<key>name</key>\
<string>Reflections</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Snapshots</string>\
<key>name</key>\
<string>Snapshots</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Classic</string>\
<key>name</key>\
<string>Classic</string>\
<key>transitions</key>\
<array>\
<dict>\
<key>key</key>\
<string>None</string>\
<key>name</key>\
<string>None</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Cube</string>\
<key>name</key>\
<string>Cube</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Dissolve</string>\
<key>name</key>\
<string>Dissolve</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Droplet</string>\
<key>name</key>\
<string>Droplet</string>\
</dict>\
<dict>\
<key>key</key>\
<string>FadeThruColor</string>\
<key>name</key>\
<string>Fade Through White</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Flip</string>\
<key>name</key>\
<string>Flip</string>\
</dict>\
<dict>\
<key>key</key>\
<string>TileFlip</string>\
<key>name</key>\
<string>Mosaic Flip</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>MoveIn</string>\
<key>name</key>\
<string>Move In</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>PageFlip</string>\
<key>name</key>\
<string>Page Flip</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Push</string>\
<key>name</key>\
<string>Push</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Reveal</string>\
<key>name</key>\
<string>Reveal</string>\
</dict>\
<dict>\
<key>key</key>\
<string>Twirl</string>\
<key>name</key>\
<string>Twirl</string>\
</dict>\
<dict>\
<key>directions</key>\
<array>\
<string>up</string>\
<string>down</string>\
<string>left</string>\
<string>down</string>\
</array>\
<key>key</key>\
<string>Wipe</string>\
<key>name</key>\
<string>Wipe</string>\
</dict>\
</array>\
</dict>\
</array>\
</dict>\
</plist>'
SERVER_INFO = '<?xml version="1.0" encoding="UTF-8"?>\
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">\
<plist version="1.0">\
<dict>\
<key>deviceid</key>\
<string>58:55:CA:06:BD:9E</string>\
<key>features</key>\
<integer>119</integer>\
<key>model</key>\
<string>AppleTV2,1</string>\
<key>protovers</key>\
<string>1.0</string>\
<key>srcvers</key>\
<string>101.10</string>\
</dict>\
</plist>'
PLAYBACK_INFO = '<?xml version="1.0" encoding="UTF-8"?>\
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">\
<plist version="1.0">\
<dict>\
<key>duration</key>\
<real>%f</real>\
<key>loadedTimeRanges</key>\
<array>\
<dict>\
<key>duration</key>\
<real>%f</real>\
<key>start</key>\
<real>0.0</real>\
</dict>\
</array>\
<key>playbackBufferEmpty</key>\
<true/>\
<key>playbackBufferFull</key>\
<false/>\
<key>playbackLikelyToKeepUp</key>\
<true/>\
<key>position</key>\
<real>%f</real>\
<key>rate</key>\
<real>%d</real>\
<key>readyToPlay</key>\
<true/>\
<key>seekableTimeRanges</key>\
<array>\
<dict>\
<key>duration</key>\
<real>%f</real>\
<key>start</key>\
<real>0.0</real>\
</dict>\
</array>\
</dict>\
</plist>' | 30.52514 | 103 | 0.387079 | 917 | 10,928 | 4.608506 | 0.102508 | 0.140558 | 0.06602 | 0.110033 | 0.875296 | 0.839565 | 0.835069 | 0.808093 | 0.80265 | 0.80265 | 0 | 0.00972 | 0.435121 | 10,928 | 358 | 104 | 30.52514 | 0.674874 | 0 | 0 | 0.915493 | 0 | 0.008451 | 0.029005 | 0 | 0 | 0 | 0.000366 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
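`PLAYBACK_INFO` above is a %-style template with five numeric placeholders, in order: duration (`%f`), loaded-range duration (`%f`), position (`%f`), rate (`%d`) and seekable-range duration (`%f`). The stand-in template below uses only a subset of the real plist's tags to show how the server would fill it in:

```python
# Shortened stand-in for PLAYBACK_INFO; the tags are a subset of the real
# plist, kept only to demonstrate the %-interpolation.
TEMPLATE = ('<key>duration</key><real>%f</real>'
            '<key>position</key><real>%f</real>'
            '<key>rate</key><real>%d</real>')

duration, position, rate = 120.0, 42.5, 1
xml = TEMPLATE % (duration, position, rate)
print(xml)
```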
6a02d99004d8504ed6a484336dbbc80d8f732794 | 168 | py | Python | tests/regressiontests/localflavor/tests.py | hafeez3000/django | 08e1175ac8d683b692ec3c67dc31df149f07dc8f | [
"BSD-3-Clause"
] | 4 | 2015-08-27T22:03:47.000Z | 2017-09-04T08:13:44.000Z | tests/regressiontests/localflavor/tests.py | hafeez3000/django | 08e1175ac8d683b692ec3c67dc31df149f07dc8f | [
"BSD-3-Clause"
] | null | null | null | tests/regressiontests/localflavor/tests.py | hafeez3000/django | 08e1175ac8d683b692ec3c67dc31df149f07dc8f | [
"BSD-3-Clause"
] | 1 | 2020-01-04T14:51:18.000Z | 2020-01-04T14:51:18.000Z | from django.test import TestCase
from django.utils import unittest
# just import your tests here
from au.tests import *
from mk.tests import *
from us.tests import *
| 18.666667 | 33 | 0.779762 | 27 | 168 | 4.851852 | 0.518519 | 0.251908 | 0.229008 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 168 | 8 | 34 | 21 | 0.935714 | 0.160714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
6a30620bb66eba37fdef5154c5fed51345953a32 | 140 | py | Python | scripts/z64_ise.py | tweakoz/zed64 | c0231444418999191182d53d9319bf7978422bfb | [
"CC-BY-3.0"
] | 4 | 2015-06-04T01:14:43.000Z | 2018-06-16T05:45:57.000Z | scripts/z64_ise.py | tweakoz/zed64 | c0231444418999191182d53d9319bf7978422bfb | [
"CC-BY-3.0"
] | null | null | null | scripts/z64_ise.py | tweakoz/zed64 | c0231444418999191182d53d9319bf7978422bfb | [
"CC-BY-3.0"
] | null | null | null | #!/usr/bin/env python
import os
import localopts
print localopts.ISE_BIN_DIR()
os.system("%s/ise zed64/zed64.xise"%localopts.ISE_BIN_DIR())
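The `os.system` call above interpolates a path into a shell string. A sketch of the same invocation built as an argument list for `subprocess` avoids shell-quoting issues; the directory value here is a hypothetical placeholder for whatever `localopts.ISE_BIN_DIR()` returns.

```python
import os

# Hypothetical stand-in for localopts.ISE_BIN_DIR().
ise_bin_dir = '/opt/Xilinx/ISE/bin'

# Build the command as an argument list; subprocess.call(cmd) would run it
# without involving the shell.
cmd = [os.path.join(ise_bin_dir, 'ise'), 'zed64/zed64.xise']
print(cmd)
```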
| 23.333333 | 60 | 0.778571 | 24 | 140 | 4.375 | 0.583333 | 0.228571 | 0.285714 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030769 | 0.071429 | 140 | 5 | 61 | 28 | 0.776923 | 0.142857 | 0 | 0 | 0 | 0 | 0.193277 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.5 | null | null | 0.25 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
6a462a9f3447ac8bdb2b0797418075f876f9b6fc | 381 | py | Python | metamoles/cheminform/__init__.py | metamoles/metamoles | 251de6672029566d8becf2538684c0506fc297d0 | [
"MIT"
] | 3 | 2019-04-04T22:44:00.000Z | 2020-07-30T18:16:56.000Z | metamoles/cheminform/__init__.py | metamoles/metamoles | 251de6672029566d8becf2538684c0506fc297d0 | [
"MIT"
] | null | null | null | metamoles/cheminform/__init__.py | metamoles/metamoles | 251de6672029566d8becf2538684c0506fc297d0 | [
"MIT"
] | null | null | null | from .cheminform import input_data
from .cheminform import fingerprint_products
from .cheminform import sim_i_j
from .cheminform import sim_i_all
from .cheminform import sim_metric
from .cheminform import calculate_dist
from .cheminform import count_C, count_H, count_O, count_N, count_P, count_S, count_X
from .cheminform import cpd_inform
from .cheminform import create_cpd_info
| 38.1 | 85 | 0.850394 | 60 | 381 | 5.1 | 0.433333 | 0.411765 | 0.588235 | 0.22549 | 0.156863 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110236 | 381 | 9 | 86 | 42.333333 | 0.902655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.111111 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e04bff2696eede5bac3ae61962952e3739708549 | 83,885 | py | Python | src/openprocurement/tender/cfaua/tests/lot_blanks.py | lttga/op | 12ba785d1b1cdd0201129805c484846da1e44e9d | [
"Apache-2.0"
] | null | null | null | src/openprocurement/tender/cfaua/tests/lot_blanks.py | lttga/op | 12ba785d1b1cdd0201129805c484846da1e44e9d | [
"Apache-2.0"
] | 1 | 2021-03-25T23:34:34.000Z | 2021-03-25T23:34:34.000Z | src/openprocurement/tender/cfaua/tests/lot_blanks.py | lttga/op | 12ba785d1b1cdd0201129805c484846da1e44e9d | [
"Apache-2.0"
] | 1 | 2020-08-20T06:09:14.000Z | 2020-08-20T06:09:14.000Z | # -*- coding: utf-8 -*-
from copy import deepcopy
from email.header import Header
from openprocurement.api.constants import RELEASE_2020_04_19
from openprocurement.tender.core.tests.cancellation import activate_cancellation_after_2020_04_19
from openprocurement.api.utils import get_now
from openprocurement.tender.belowthreshold.tests.base import test_cancellation, test_claim


def get_tender_lot(self):
    response = self.app.get("/tenders/{}".format(self.tender_id))
    lot = response.json["data"]["lots"][0]

    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(
        set(response.json["data"]),
        set([u"status", u"date", u"description", u"title", u"minimalStep", u"auctionPeriod", u"value", u"id"]),
    )

    self.set_status("active.qualification")

    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    api_lot = response.json["data"]
    lot.pop("auctionPeriod")
    api_lot.pop("auctionPeriod")
    self.assertEqual(api_lot, lot)

    response = self.app.get("/tenders/{}/lots/some_id".format(self.tender_id), status=404)
    self.assertEqual(response.status, "404 Not Found")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["status"], "error")
    self.assertEqual(response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"lot_id"}])

    response = self.app.get("/tenders/some_id/lots/some_id", status=404)
    self.assertEqual(response.status, "404 Not Found")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["status"], "error")
    self.assertEqual(
        response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"tender_id"}]
    )


def get_tender_lots(self):
    response = self.app.get("/tenders/{}".format(self.tender_id))
    lot = response.json["data"]["lots"][0]

    response = self.app.get("/tenders/{}/lots".format(self.tender_id))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(
        set(response.json["data"][0]),
        set([u"status", u"description", u"date", u"title", u"minimalStep", u"auctionPeriod", u"value", u"id"]),
    )

    self.set_status("active.qualification")

    response = self.app.get("/tenders/{}/lots".format(self.tender_id))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    api_lot = response.json["data"][0]
    lot.pop("auctionPeriod")
    api_lot.pop("auctionPeriod")
    self.assertEqual(api_lot, lot)

    response = self.app.get("/tenders/some_id/lots", status=404)
    self.assertEqual(response.status, "404 Not Found")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["status"], "error")
    self.assertEqual(
        response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"tender_id"}]
    )


def patch_tender_currency(self):
    response = self.app.get("/tenders/{}".format(self.tender_id))
    lot = response.json["data"]["lots"][0]
    self.assertEqual(lot["value"]["currency"], "UAH")

    # update tender currency without minimalStep currency change
    response = self.app.patch_json(
        "/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
        {"data": {"value": {"currency": "GBP"}}},
        status=422,
    )
    self.assertEqual(response.status, "422 Unprocessable Entity")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["status"], "error")
    self.assertEqual(
        response.json["errors"],
        [
            {
                u"description": [u"currency should be identical to currency of value of tender"],
                u"location": u"body",
                u"name": u"minimalStep",
            }
        ],
    )

    # update tender currency
    response = self.app.patch_json(
        "/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
        {"data": {"value": {"currency": "GBP"}, "minimalStep": {"currency": "GBP"}}},
    )
    self.assertEqual(response.status, "200 OK")
    # lot currency is updated too
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertEqual(lot["value"]["currency"], "GBP")

    # try to update lot currency
    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"value": {"currency": "USD"}}},
    )
    self.assertEqual(response.status, "200 OK")
    # but the value stays unchanged
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertEqual(lot["value"]["currency"], "GBP")

    # try to update minimalStep currency
    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"minimalStep": {"currency": "USD"}}},
    )
    self.assertEqual(response.status, "200 OK")
    # but the value stays unchanged
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertEqual(lot["minimalStep"]["currency"], "GBP")

    # try to update lot minimalStep currency and lot value currency in a single request
    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"value": {"currency": "USD"}, "minimalStep": {"currency": "USD"}}},
    )
    self.assertEqual(response.status, "200 OK")
    # but the value stays unchanged
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertEqual(lot["value"]["currency"], "GBP")
    self.assertEqual(lot["minimalStep"]["currency"], "GBP")


def patch_tender_lot(self):
    response = self.app.get("/tenders/{}".format(self.tender_id))
    lot = response.json["data"]["lots"][0]

    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"title": "new title"}},
    )
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["data"]["title"], "new title")

    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"guarantee": {"amount": 12}}},
    )
    self.assertEqual(response.status, "200 OK")
    self.assertIn("guarantee", response.json["data"])
    self.assertEqual(response.json["data"]["guarantee"]["amount"], 12)
    self.assertEqual(response.json["data"]["guarantee"]["currency"], "UAH")

    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"guarantee": {"currency": "USD"}}},
    )
    self.assertEqual(response.status, "200 OK")
    # Deleted self.assertEqual(response.body, 'null') to make this test pass in other procedures,
    # because a bug with invalidation of bids at openua, openeu and openuadefence makes the body not null

    response = self.app.patch_json(
        "/tenders/{}/lots/some_id?acc_token={}".format(self.tender_id, self.tender_token),
        {"data": {"title": "other title"}},
        status=404,
    )
    self.assertEqual(response.status, "404 Not Found")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["status"], "error")
    self.assertEqual(response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"lot_id"}])

    response = self.app.patch_json("/tenders/some_id/lots/some_id", {"data": {"title": "other title"}}, status=404)
    self.assertEqual(response.status, "404 Not Found")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["status"], "error")
    self.assertEqual(
        response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"tender_id"}]
    )

    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(response.json["data"]["title"], "new title")

    self.set_status("{}".format(self.forbidden_lot_actions_status))

    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"title": "other title"}},
        status=403,
    )
    self.assertEqual(response.status, "403 Forbidden")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(
        response.json["errors"][0]["description"],
        "Can't update lot in current ({}) tender status".format(self.forbidden_lot_actions_status),
    )


def patch_tender_vat(self):
    # set tender VAT
    response = self.app.patch_json(
        "/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
        {"data": {"value": {"valueAddedTaxIncluded": True}}},
    )
    self.assertEqual(response.status, "200 OK")

    # get lot
    response = self.app.get("/tenders/{}/lots".format(self.tender_id))
    lot = response.json["data"][0]
    self.assertTrue(lot["value"]["valueAddedTaxIncluded"])

    # update tender VAT
    response = self.app.patch_json(
        "/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
        {"data": {"value": {"valueAddedTaxIncluded": False}, "minimalStep": {"valueAddedTaxIncluded": False}}},
    )
    self.assertEqual(response.status, "200 OK")
    # lot VAT is updated too
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertFalse(lot["value"]["valueAddedTaxIncluded"])

    # try to update lot VAT
    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"value": {"valueAddedTaxIncluded": True}}},
    )
    self.assertEqual(response.status, "200 OK")
    # but the value stays unchanged
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertFalse(lot["value"]["valueAddedTaxIncluded"])

    # try to update minimalStep VAT
    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"minimalStep": {"valueAddedTaxIncluded": True}}},
    )
    self.assertEqual(response.status, "200 OK")
    # but the value stays unchanged
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertFalse(lot["minimalStep"]["valueAddedTaxIncluded"])

    # try to update minimalStep VAT and value VAT in a single request
    response = self.app.patch_json(
        "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lot["id"], self.tender_token),
        {"data": {"value": {"valueAddedTaxIncluded": True}, "minimalStep": {"valueAddedTaxIncluded": True}}},
    )
    self.assertEqual(response.status, "200 OK")
    # but the value stays unchanged
    response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
    self.assertEqual(response.status, "200 OK")
    self.assertEqual(response.content_type, "application/json")
    lot = response.json["data"]
    self.assertFalse(lot["value"]["valueAddedTaxIncluded"])
    self.assertEqual(lot["minimalStep"]["valueAddedTaxIncluded"], lot["value"]["valueAddedTaxIncluded"])
def two_lot_3bid_3com_3win(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.initial_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for _ in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
self.initial_lots = lots
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.initial_data["items"][0] for i in lots]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [{"relatedLot": i} for i in lots]}},
)
self.assertEqual(response.status, "200 OK")
# create bids
for x in range(self.min_bids_number):
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": self.test_bids_data[0]["value"], "relatedLot": lot_id} for lot_id in lots],
}
},
)
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.assertEqual(len(qualifications), self.min_bids_number * 2)
for qualification in qualifications:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
# switch to active.auction
self.time_shift("active.auction")
self.check_chronograph()
# get auction info
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(tender_id))
auction_bids_data = response.json["data"]["bids"]
for lot_id in lots:
# posting auction urls
response = self.app.patch_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id),
{
"data": {
"lots": [
{"id": i["id"], "auctionUrl": "https://tender.auction.url"}
for i in response.json["data"]["lots"]
],
"bids": [
{
"id": i["id"],
"lotValues": [
{
"relatedLot": j["relatedLot"],
"participationUrl": "https://tender.auction.url/for_bid/{}".format(i["id"]),
}
for j in i["lotValues"]
],
}
for i in auction_bids_data
],
}
},
)
# posting auction results
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.post_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id), {"data": {"bids": auction_bids_data}}
)
# for first lot
lot_id = lots[0]
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == lot_id][0]
# set award as active
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
# get agreement id
response = self.app.get("/tenders/{}".format(tender_id))
agreement_id = response.json["data"]["agreements"][-1]["id"]
# after stand-still period
self.set_status("complete", {"status": "active.awarded"})
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# sign agreement
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/agreements/{}?acc_token={}".format(tender_id, agreement_id, owner_token),
{"data": {"status": "active"}},
)
# for second lot
lot_id = lots[1]
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == lot_id][0]
# set award as unsuccessful
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "unsuccessful"}},
)
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == lot_id][0]
# set award as active
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
# get agreement id
response = self.app.get("/tenders/{}".format(tender_id))
agreement_id = response.json["data"]["agreements"][-1]["id"]
# after stand-still period
self.set_status("complete", {"status": "active.awarded"})
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# sign agreement
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/agreements/{}?acc_token={}".format(tender_id, agreement_id, owner_token),
{"data": {"status": "active"}},
)
# check status
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}".format(tender_id))
self.assertTrue(all([i["status"] == "complete" for i in response.json["data"]["lots"]]))
self.assertEqual(response.json["data"]["status"], "complete")
def one_lot_2bid(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.initial_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
self.initial_lots = [response.json["data"]]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bids
for x in range(self.min_bids_number):
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[x]["tenderers"],
"lotValues": [{"value": self.test_bids_data[x]["value"], "relatedLot": lot_id}],
}
},
)
if x == 0:
bid_id = response.json["data"]["id"]
bid_token = response.json["access"]["token"]
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
for qualification in qualifications:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.get("/tenders/{}?acc_token={}".format(tender_id, owner_token))
self.assertEqual(response.status, "200 OK")
for bid in response.json["data"]["bids"]:
self.assertEqual(bid["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.check_chronograph()
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "200 OK")
self.time_shift("active.auction")
self.check_chronograph()
# get auction info
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(tender_id))
auction_bids_data = response.json["data"]["bids"]
# posting auction urls
response = self.app.patch_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id),
{
"data": {
"lots": [
{"id": i["id"], "auctionUrl": "https://tender.auction.url"} for i in response.json["data"]["lots"]
],
"bids": [
{
"id": i["id"],
"lotValues": [
{
"relatedLot": j["relatedLot"],
"participationUrl": "https://tender.auction.url/for_bid/{}".format(i["id"]),
}
for j in i["lotValues"]
],
}
for i in auction_bids_data
],
}
},
)
# view bid participationUrl
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/bids/{}?acc_token={}".format(tender_id, bid_id, bid_token))
self.assertEqual(
response.json["data"]["lotValues"][0]["participationUrl"],
"https://tender.auction.url/for_bid/{}".format(bid_id),
)
# posting auction results
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.post_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id), {"data": {"bids": auction_bids_data}}
)
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending"][0]
# set award as active
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
# get agreement id
response = self.app.get("/tenders/{}".format(tender_id))
agreement_id = response.json["data"]["agreements"][-1]["id"]
# after stand-still period
self.time_shift("complete")
self.check_chronograph()
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# sign agreement
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/agreements/{}?acc_token={}".format(tender_id, agreement_id, owner_token),
{"data": {"status": "active"}},
)
# check status
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.json["data"]["lots"][0]["status"], "complete")
self.assertEqual(response.json["data"]["status"], "complete")
def one_lot_3bid_1del(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.initial_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
self.initial_lots = [response.json["data"]]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bids
self.app.authorization = ("Basic", ("broker", ""))
bids = []
for i in range(self.min_bids_number + 1):
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": self.test_bids_data[0]["value"], "relatedLot": lot_id}],
}
},
)
bids.append({response.json["data"]["id"]: response.json["access"]["token"]})
response = self.app.delete(
"/tenders/{}/bids/{}?acc_token={}".format(tender_id, list(bids[2].keys())[0], list(bids[2].values())[0])
)
self.assertEqual(response.status, "200 OK")
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
for qualification in qualifications:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.check_chronograph()
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "200 OK")
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.time_shift("active.auction")
self.check_chronograph()
# get auction info
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(tender_id))
auction_bids_data = response.json["data"]["bids"]
# posting auction urls
data = {
"data": {
"lots": [
{"id": i["id"], "auctionUrl": "https://tender.auction.url"} for i in response.json["data"]["lots"]
],
"bids": list(auction_bids_data),
}
}
for bid_index, bid in enumerate(auction_bids_data):
if bid.get("status", "active") == "active":
for lot_index, lot_bid in enumerate(bid["lotValues"]):
if lot_bid["relatedLot"] == lot_id and lot_bid.get("status", "active") == "active":
data["data"]["bids"][bid_index]["lotValues"][lot_index][
"participationUrl"
] = "https://tender.auction.url/for_bid/{}".format(bid["id"])
break
response = self.app.patch_json("/tenders/{}/auction/{}".format(tender_id, lot_id), data)
# view bid participationUrl
self.app.authorization = ("Basic", ("broker", ""))
bid_id = list(bids[0].keys())[0]
bid_token = list(bids[0].values())[0]
response = self.app.get("/tenders/{}/bids/{}?acc_token={}".format(tender_id, bid_id, bid_token))
self.assertEqual(
response.json["data"]["lotValues"][0]["participationUrl"],
"https://tender.auction.url/for_bid/{}".format(bid_id),
)
bid_id = list(bids[2].keys())[0]
bid_token = list(bids[2].values())[0]
response = self.app.get("/tenders/{}/bids/{}?acc_token={}".format(tender_id, bid_id, bid_token))
self.assertEqual(response.json["data"]["status"], "deleted")
# posting auction results
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.post_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id), {"data": {"bids": auction_bids_data}}
)
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending"][0]
# set award as active
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
# get agreement id
response = self.app.get("/tenders/{}".format(tender_id))
agreement_id = response.json["data"]["agreements"][-1]["id"]
# after stand-still period
self.time_shift("complete")
self.check_chronograph()
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# sign agreement
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/agreements/{}?acc_token={}".format(tender_id, agreement_id, owner_token),
{"data": {"status": "active"}},
)
# check status
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.json["data"]["lots"][0]["status"], "complete")
self.assertEqual(response.json["data"]["status"], "complete")
def one_lot_3bid_1un(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.initial_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
self.initial_lots = [response.json["data"]]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bids
self.app.authorization = ("Basic", ("broker", ""))
bids = []
for i in range(self.min_bids_number + 1):
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": self.test_bids_data[0]["value"], "relatedLot": lot_id}],
}
},
)
bids.append({response.json["data"]["id"]: response.json["access"]["token"]})
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
for qualification in qualifications:
if qualification["bidID"] == list(bids[2].keys())[0]:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "unsuccessful"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
else:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.check_chronograph()
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "200 OK")
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.time_shift("active.auction")
self.check_chronograph()
# get auction info
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(tender_id))
auction_bids_data = response.json["data"]["bids"]
# posting auction urls
data = {
"data": {
"lots": [
{"id": i["id"], "auctionUrl": "https://tender.auction.url"} for i in response.json["data"]["lots"]
],
"bids": list(auction_bids_data),
}
}
for bid_index, bid in enumerate(auction_bids_data):
if bid.get("status", "active") == "active":
for lot_index, lot_bid in enumerate(bid["lotValues"]):
if lot_bid["relatedLot"] == lot_id and lot_bid.get("status", "active") == "active":
data["data"]["bids"][bid_index]["lotValues"][lot_index][
"participationUrl"
] = "https://tender.auction.url/for_bid/{}".format(bid["id"])
break
response = self.app.patch_json("/tenders/{}/auction/{}".format(tender_id, lot_id), data)
# view bid participationUrl
self.app.authorization = ("Basic", ("broker", ""))
bid_id = list(bids[0].keys())[0]
bid_token = list(bids[0].values())[0]
response = self.app.get("/tenders/{}/bids/{}?acc_token={}".format(tender_id, bid_id, bid_token))
self.assertEqual(
response.json["data"]["lotValues"][0]["participationUrl"],
"https://tender.auction.url/for_bid/{}".format(bid_id),
)
bid_id = list(bids[2].keys())[0]
bid_token = list(bids[2].values())[0]
response = self.app.get("/tenders/{}/bids/{}?acc_token={}".format(tender_id, bid_id, bid_token))
self.assertNotIn("lotValues", response.json["data"])
# posting auction results
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.post_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id), {"data": {"bids": auction_bids_data}}
)
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending"][0]
# set award as active
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
# get agreement id
response = self.app.get("/tenders/{}".format(tender_id))
agreement_id = response.json["data"]["agreements"][-1]["id"]
# after stand-still period
self.time_shift("complete")
self.check_chronograph()
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# sign agreement
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/agreements/{}?acc_token={}".format(tender_id, agreement_id, owner_token),
{"data": {"status": "active"}},
)
# check status
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.json["data"]["lots"][0]["status"], "complete")
self.assertEqual(response.json["data"]["status"], "complete")
def two_lot_3bid_1win_bug(self):
"""
ref: http://prozorro.worksection.ua/project/141436/3931481/#com9856686
"""
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.initial_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for _ in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
self.initial_lots = lots
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.initial_data["items"][0] for i in lots]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [{"relatedLot": i} for i in lots]}},
)
self.assertEqual(response.status, "200 OK")
# create bids
self.app.authorization = ("Basic", ("broker", ""))
for x in range(self.min_bids_number):
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[x]["tenderers"],
"lotValues": [{"value": self.test_bids_data[x]["value"], "relatedLot": lot_id} for lot_id in lots],
}
},
)
# create last bid
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[self.min_bids_number - 1]["tenderers"],
"lotValues": [
{"value": self.test_bids_data[self.min_bids_number - 1]["value"], "relatedLot": lot_id}
for lot_id in lots
],
}
},
)
bid_id = response.json["data"]["id"]
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.assertEqual(len(qualifications), (self.min_bids_number + 1) * 2)
for qualification in qualifications:
if lots[1] == qualification["lotID"] and bid_id == qualification["bidID"]:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "unsuccessful"}},
)
else:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
if lots[1] == qualification["lotID"] and bid_id == qualification["bidID"]:
self.assertEqual(response.json["data"]["status"], "unsuccessful")
else:
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
# switch to active.auction
self.time_shift("active.auction")
self.check_chronograph()
# get auction info
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(tender_id))
auction_bids_data = response.json["data"]["bids"]
for lot_id in lots:
# posting auction urls
response = self.app.patch_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id),
{
"data": {
"lots": [
{"id": i["id"], "auctionUrl": "https://tender.auction.url"}
for i in response.json["data"]["lots"]
],
"bids": [
{
"id": i["id"],
"lotValues": [
{
"relatedLot": j["relatedLot"],
"participationUrl": "https://tender.auction.url/for_bid/{}".format(i["id"]),
}
for j in i["lotValues"]
],
}
for i in auction_bids_data
],
}
},
)
# posting auction results
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.post_json(
"/tenders/{}/auction/{}".format(tender_id, lot_id), {"data": {"bids": auction_bids_data}}
)
# for first lot
lot_id = lots[0]
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == lot_id][0]
# set award as active
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
# get agreement id
response = self.app.get("/tenders/{}".format(tender_id))
agreement_id = response.json["data"]["agreements"][-1]["id"]
# after stand-still period
self.set_status("complete", {"status": "active.awarded"})
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# sign agreement
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/agreements/{}?acc_token={}".format(tender_id, agreement_id, owner_token),
{"data": {"status": "active"}},
)
# for second lot
lot_id = lots[1]
for x in range(self.min_bids_number):
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == lot_id][0]
# set award as unsuccessful
self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, owner_token),
{"data": {"status": "unsuccessful"}},
)
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
self.assertEqual([i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == lot_id], [])
# after stand-still period
self.set_status("complete", {"status": "active.awarded"})
# time travel
tender = self.db.get(tender_id)
for i in tender.get("awards", []):
i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
self.db.save(tender)
# ping by chronograph
self.check_chronograph()
# check status
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(set([i["status"] for i in response.json["data"]["lots"]]), set(["complete", "unsuccessful"]))
self.assertEqual(response.json["data"]["status"], "complete")
def proc_1lot_1can(self):
response = self.app.get("/tenders/{}".format(self.tender_id))
lot_id = response.json["data"]["lots"][0]["id"]
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"items": [self.initial_data["items"][0]]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"items": [{"relatedLot": lot_id}]}},
)
self.assertEqual(response.status, "200 OK")
# switch to active.tendering
# TODO: set auctionPeriod.startDate for lots
# response = self.set_status('active.tendering', {"lots": [
# {"auctionPeriod": {"startDate": (get_now() + timedelta(days=self.days_till_auction_starts)).isoformat()}}
# ]})
# self.assertTrue(all(["auctionPeriod" in i for i in response.json['data']['lots']]))
# cancel lot
cancellation = dict(**test_cancellation)
cancellation.update({
"status": "active",
"cancellationOf": "lot",
"relatedLot": lot_id,
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": cancellation},
)
cancellation_id = response.json["data"]["id"]
if RELEASE_2020_04_19 < get_now():
activate_cancellation_after_2020_04_19(self, cancellation_id)
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertTrue(all([i["status"] == "cancelled" for i in response.json["data"]["lots"]]))
self.assertEqual(response.json["data"]["status"], "cancelled")
def create_tender_lot(self):
tender_data = deepcopy(self.initial_data)
tender_data["lots"] = deepcopy(self.initial_lots)
tender_data["lots"][0]["guarantee"] = {"amount": 100500, "currency": "USD"}
response = self.app.post_json("/tenders", {"data": tender_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
data = response.json["data"]
self.assertIn("guarantee", data["lots"][0])
self.assertEqual(data["lots"][0]["guarantee"]["amount"], 100500)
self.assertEqual(data["lots"][0]["guarantee"]["currency"], "USD")
self.tender_id = response.json["data"]["id"]
self.tender_token = response.json["access"]["token"]
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 100500)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "USD")
self.assertIn("guarantee", response.json["data"]["lots"][0])
# Create second lot with error
lot2 = deepcopy(self.test_lots_data[0])
lot2["guarantee"] = {"amount": 500, "currency": "UAH"}
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(self.tender_id, self.tender_token), {"data": lot2}, status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"Please provide no more than 1 item."], u"location": u"body", u"name": u"lots"}],
)
lot2["guarantee"] = {"currency": "USD"}
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(self.tender_id, self.tender_token), {"data": lot2}, status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": {u"amount": [u"This field is required."]}, u"location": u"body", u"name": u"guarantee"}],
)
lot2["guarantee"] = {"amount": 100600}
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(self.tender_id, self.tender_token), {"data": lot2}, status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"Please provide no more than 1 item."], u"location": u"body", u"name": u"lots"}],
)
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 100500)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "USD")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"guarantee": {"currency": "EUR"}}},
)
self.assertEqual(response.json["data"]["guarantee"]["amount"], 100500)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "EUR")
self.assertIn("guarantee", response.json["data"]["lots"][0])
self.assertEqual(len(response.json["data"]["lots"]), 1)
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(self.tender_id, self.tender_token),
{"data": self.test_lots_data[0]},
status=422,
)
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(
response.json["errors"],
[{u"description": [u"Please provide no more than 1 item."], u"location": u"body", u"name": u"lots"}],
)
def tender_lot_guarantee(self):
data = deepcopy(self.initial_data)
data["lots"] = deepcopy(self.initial_lots)
data["lots"][0]["guarantee"] = {"amount": 20, "currency": "USD"}
data["guarantee"] = {"amount": 100, "currency": "USD"}
response = self.app.post_json("/tenders", {"data": data})
tender = response.json["data"]
tender_token = response.json["access"]["token"]
self.assertEqual(response.status, "201 Created")
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "USD")
lot = deepcopy(self.test_lots_data[0])
lot_id = response.json["data"]["lots"][0]["id"]
self.assertIn("guarantee", response.json["data"]["lots"][0])
self.assertEqual(response.json["data"]["lots"][0]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["lots"][0]["guarantee"]["currency"], "USD")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], tender_token), {"data": {"guarantee": {"currency": "GBP"}}}
)
self.assertEqual(response.status, "200 OK")
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
lot["guarantee"] = {"amount": 20, "currency": "GBP"}
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender["id"], tender_token), {"data": lot}, status=422
)
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(
response.json["errors"],
[{u"description": [u"Please provide no more than 1 item."], u"location": u"body", u"name": u"lots"}],
)
response = self.app.get("/tenders/{}".format(tender["id"]))
self.assertEqual(response.json["data"]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
response = self.app.get("/tenders/{}".format(tender["id"]))
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], tender_token), {"data": {"guarantee": {"amount": 55}}}
)
self.assertEqual(response.json["data"]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
response = self.app.get("/tenders/{}".format(tender["id"]))
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 20)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
response = self.app.patch_json(
"/tenders/{}/lots/{}?acc_token={}".format(tender["id"], lot_id, tender_token),
{"data": {"guarantee": {"amount": 0, "currency": "GBP"}}},
)
self.assertEqual(response.json["data"]["guarantee"]["amount"], 0)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
response = self.app.get("/tenders/{}".format(tender["id"]))
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 0)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
response = self.app.delete(
"/tenders/{}/lots/{}?acc_token={}".format(tender["id"], lot_id, tender_token), status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(
response.json["errors"],
[{"location": "body", "name": "lots", "description": ["Please provide at least 1 item."]}],
)
response = self.app.get("/tenders/{}".format(tender["id"]))
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 0)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "GBP")
# TenderLotEdgeCasesTest
def question_blocking(self):
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.post_json(
"/tenders/{}/questions".format(self.tender_id),
{
"data": {
"title": "question title",
"description": "question description",
"questionOf": "lot",
"relatedItem": self.initial_lots[0]["id"],
"author": self.test_author,
}
},
)
question = response.json["data"]
self.assertEqual(question["questionOf"], "lot")
self.assertEqual(question["relatedItem"], self.initial_lots[0]["id"])
self.set_status("active.tendering", "end")
response = self.check_chronograph()
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.json["data"]["status"], "active.tendering")
# cancel lot
cancellation = dict(**test_cancellation)
cancellation.update({
"status": "active",
"cancellationOf": "lot",
"relatedLot": self.initial_lots[0]["id"],
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": cancellation},
)
cancellation_id = response.json["data"]["id"]
if RELEASE_2020_04_19 < get_now():
activate_cancellation_after_2020_04_19(self, cancellation_id)
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.json["data"]["status"], "cancelled")
def claim_blocking(self):
self.app.authorization = ("Basic", ("broker", ""))
claim_data = deepcopy(test_claim)
claim_data["relatedLot"] = self.initial_lots[0]["id"]
response = self.app.post_json(
"/tenders/{}/complaints".format(self.tender_id),
{
"data": claim_data
},
)
self.assertEqual(response.status, "201 Created")
complaint = response.json["data"]
self.assertEqual(complaint["relatedLot"], self.initial_lots[0]["id"])
self.set_status("active.tendering", "end")
response = self.check_chronograph()
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.json["data"]["status"], "active.tendering")
# cancel lot
cancellation = dict(**test_cancellation)
cancellation.update({
"status": "active",
"cancellationOf": "lot",
"relatedLot": self.initial_lots[0]["id"],
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": cancellation},
)
cancellation_id = response.json["data"]["id"]
if RELEASE_2020_04_19 < get_now():
activate_cancellation_after_2020_04_19(self, cancellation_id)
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.json["data"]["status"], "cancelled")
# Tender Lot Feature Resource Test
def tender_value(self):
request_path = "/tenders/{}".format(self.tender_id)
response = self.app.get(request_path)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["value"]["amount"], sum([i["value"]["amount"] for i in self.initial_lots]))
self.assertEqual(
response.json["data"]["minimalStep"]["amount"], min([i["minimalStep"]["amount"] for i in self.initial_lots])
)
def tender_features_invalid(self):
request_path = "/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token)
data = self.initial_data.copy()
item = data["items"][0].copy()
item["id"] = "1"
data["items"] = [item]
data["features"] = [
{
"featureOf": "lot",
"relatedItem": self.initial_lots[0]["id"],
"title": u"Потужність всмоктування",
"enum": [{"value": 0.1, "title": u"До 1000 Вт"}, {"value": 0.15, "title": u"Більше 1000 Вт"}],
}
]
response = self.app.patch_json(request_path, {"data": data}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"Features are not allowed for lots"], u"location": u"body", u"name": u"features"}],
)
data["features"] = [
{
"code": "OCDS-123454-POSTPONEMENT",
"featureOf": "tenderer",
"title": u"Відстрочка платежу",
"description": u"Термін відстрочки платежу",
"enum": [
{"value": self.invalid_feature_value, "title": u"До 90 днів"},
{"value": 0.1, "title": u"Більше 90 днів"},
],
}
]
response = self.app.patch_json(request_path, {"data": data}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [
{u"enum": [{u"value": [u"Value should be less than {}.".format(self.max_feature_value)]}]}
],
u"location": u"body",
u"name": u"features",
}
],
)
data["features"][0]["enum"][0]["value"] = 0.3
data["features"].append(data["features"][0].copy())
response = self.app.patch_json(request_path, {"data": data}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [u"Feature code should be uniq for all features"],
u"location": u"body",
u"name": u"features",
}
],
)
data["features"][1]["code"] = "OCDS-123456-POSTPONEMENT" # should be different
response = self.app.patch_json(request_path, {"data": data}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [u"Sum of max value of all features for lot should be less then or equal to 30%"],
u"location": u"body",
u"name": u"features",
}
],
)
tender = self.db.get(self.tender_id)
tender["lots"] = []
del tender["items"][0]["relatedLot"]
self.db.save(tender)
response = self.app.patch_json(request_path, {"data": data}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"Please provide at least 1 item."], u"location": u"body", u"name": u"lots"}],
)
def one_lot_1bid(self):
response = self.app.get("/tenders/{}".format(self.tender_id))
lot_id = response.json["data"]["lots"][0]["id"]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"items": [{"relatedLot": lot_id}]}},
)
self.assertEqual(response.status, "200 OK")
# create bid
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.post_json(
"/tenders/{}/bids".format(self.tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": self.test_bids_data[0]["value"], "relatedLot": lot_id}],
}
},
)
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
# switch to unsuccessful
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token))
self.assertEqual(response.json["data"]["status"], "unsuccessful")
def tender_lot_document(self):
response = self.app.post(
"/tenders/{}/documents?acc_token={}".format(self.tender_id, self.tender_token),
upload_files=[("file", str(Header(u"укр.doc", "utf-8")), "content")],
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
doc_id = response.json["data"]["id"]
# dateModified = response.json["data"]['dateModified']
self.assertIn(doc_id, response.headers["Location"])
self.assertEqual(u"укр.doc", response.json["data"]["title"])
self.assertNotIn("documentType", response.json["data"])
response = self.app.patch_json(
"/tenders/{}/documents/{}?acc_token={}".format(self.tender_id, doc_id, self.tender_token),
{"data": {"documentOf": "lot"}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"This field is required."], u"location": u"body", u"name": u"relatedItem"}],
)
response = self.app.patch_json(
"/tenders/{}/documents/{}?acc_token={}".format(self.tender_id, doc_id, self.tender_token),
{"data": {"documentOf": "lot", "relatedItem": "0" * 32}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"relatedItem should be one of lots"], u"location": u"body", u"name": u"relatedItem"}],
)
# get tender for lot id
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token), status=200)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
# add document with lot_id
lot_id = tender["lots"][0]["id"]
response = self.app.patch_json(
"/tenders/{}/documents/{}?acc_token={}".format(self.tender_id, doc_id, self.tender_token),
{"data": {"documentOf": "lot", "relatedItem": lot_id}},
status=200,
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["relatedItem"], lot_id)
def proc_1lot_0bid(self):
response = self.app.get("/tenders/{}".format(self.tender_id))
lot_id = response.json["data"]["lots"][0]["id"]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"items": [{"relatedLot": lot_id}]}},
)
self.assertEqual(response.status, "200 OK")
# switch to unsuccessful
response = self.set_status("active.tendering", "end")
self.app.authorization = ("Basic", ("chronograph", ""))
response = self.app.patch_json("/tenders/{}".format(self.tender_id), {"data": {"id": self.tender_id}})
self.assertEqual(response.json["data"]["lots"][0]["status"], "unsuccessful")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
def one_lot_2bid_1unqualified(self):
response = self.app.get("/tenders/{}".format(self.tender_id))
lot_id = response.json["data"]["lots"][0]["id"]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"items": [{"relatedLot": lot_id}]}},
)
self.assertEqual(response.status, "200 OK")
# create bid
self.app.authorization = ("Basic", ("broker", ""))
for i in range(self.min_bids_number):
response = self.app.post_json(
"/tenders/{}/bids".format(self.tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[i]["tenderers"],
"lotValues": [{"value": self.test_bids_data[i]["value"], "relatedLot": lot_id}],
}
},
)
# switch to active.pre-qualification
self.set_status("active.tendering", "end")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, self.tender_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.app.authorization = ("Basic", ("token", ""))
for i in range(self.min_bids_number - 1):
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(
self.tender_id, qualifications[i]["id"], self.tender_token
),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(
self.tender_id, qualifications[-1]["id"], self.tender_token
),
{"data": {"status": "unsuccessful"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active.pre-qualification.stand-still")
self.set_status("active.pre-qualification.stand-still", "end")
self.app.authorization = ("Basic", ("chronograph", ""))
response = self.app.patch_json("/tenders/{}".format(self.tender_id), {"data": {"id": self.tender_id}})
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
# TenderLotFeatureBidderResourceTest
def create_tender_feature_bidder(self):
request_path = "/tenders/{}/bids".format(self.tender_id)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
"parameters": [
{"code": "code_item", "value": 0.01},
{"code": "code_tenderer", "value": 0.01},
# {"code": "code_lot", "value": 0.01},
],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
bidder = response.json["data"]
self.assertEqual(bidder["tenderers"][0]["name"], self.initial_data["procuringEntity"]["name"])
self.assertIn("id", bidder)
self.assertIn(bidder["id"], response.headers["Location"])
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
"parameters": [
{"code": "code_item", "value": 0.01},
{"code": "code_tenderer", "value": 0.01},
# {"code": "code_lot", "value": 0.01},
],
}
},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["errors"][0]["description"], "Can't add bid in current (unsuccessful) tender status")
def create_tender_feature_bidder_invalid(self):
request_path = "/tenders/{}/bids".format(self.tender_id)
response = self.app.post_json(
request_path,
{"data": {"selfEligible": True, "selfQualified": True, "tenderers": self.test_bids_data[0]["tenderers"]}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"relatedLot": [u"This field is required."]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": "0" * 32}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"relatedLot": [u"relatedLot should be one of lots"]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 5000000}, "relatedLot": self.lot_id}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"value": [u"value of bid should be less than value of lot"]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500, "valueAddedTaxIncluded": False}, "relatedLot": self.lot_id}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [
{
u"value": [
u"valueAddedTaxIncluded of bid should be identical to valueAddedTaxIncluded of value of lot"
]
}
],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500, "currency": "USD"}, "relatedLot": self.lot_id}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"value": [u"currency of bid should be identical to currency of value of lot"]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"All features parameters is required."], u"location": u"body", u"name": u"parameters"}],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
"parameters": [{"code": "code_item", "value": 0.01}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"All features parameters is required."], u"location": u"body", u"name": u"parameters"}],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
"parameters": [{"code": "code_invalid", "value": 0.01}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"code": [u"code should be one of feature code."]}],
u"location": u"body",
u"name": u"parameters",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.lot_id}],
"parameters": [
{"code": "code_item", "value": 0.01},
{"code": "code_tenderer", "value": 0},
# {"code": "code_lot", "value": 0.01},
],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"value": [u"value should be one of feature value."]}],
u"location": u"body",
u"name": u"parameters",
}
],
)

# File: tests/python/test_grouped.py (repo 447983454/taichi, MIT license)
import taichi as ti
@ti.all_archs
def test_vector_index():
val = ti.var(ti.i32)
n = 4
m = 7
p = 11
ti.root.dense(ti.i, n).dense(ti.j, m).dense(ti.k, p).place(val)
@ti.kernel
def test():
for i in range(n):
for j in range(m):
for k in range(p):
I = ti.Vector([i, j, k])
val[I] = i + j * 2 + k * 3
test()
for i in range(n):
for j in range(m):
for k in range(p):
assert val[i, j, k] == i + j * 2 + k * 3
@ti.all_archs
def test_grouped():
val = ti.var(ti.i32)
n = 4
m = 8
p = 16
ti.root.dense(ti.i, n).dense(ti.j, m).dense(ti.k, p).place(val)
@ti.kernel
def test():
        for I in ti.grouped(val):
            val[I] = I[0] + I[1] * 2 + I[2] * 3

    test()

    for i in range(n):
        for j in range(m):
            for k in range(p):
                assert val[i, j, k] == i + j * 2 + k * 3


@ti.all_archs
def test_grouped_ndrange():
    val = ti.var(ti.i32)

    n = 4
    m = 8

    ti.root.dense(ti.ij, (n, m)).place(val)

    x0 = 2
    y0 = 3
    x1 = 1
    y1 = 6

    @ti.kernel
    def test():
        for I in ti.grouped(ti.ndrange((x0, y0), (x1, y1))):
            val[I] = I[0] + I[1] * 2

    test()

    for i in range(n):
        for j in range(m):
            assert val[i, j] == (i + j * 2 if x0 <= i < y0 and x1 <= j < y1 else 0)


@ti.all_archs
def test_static_grouped_ndrange():
    val = ti.var(ti.i32)

    n = 4
    m = 8

    ti.root.dense(ti.ij, (n, m)).place(val)

    x0 = 2
    y0 = 3
    x1 = 1
    y1 = 6

    @ti.kernel
    def test():
        for I in ti.static(ti.grouped(ti.ndrange((x0, y0), (x1, y1)))):
            val[I] = I[0] + I[1] * 2

    test()

    for i in range(n):
        for j in range(m):
            assert val[i, j] == (i + j * 2 if x0 <= i < y0 and x1 <= j < y1 else 0)


@ti.all_archs
def test_grouped_ndrange_starred():
    val = ti.var(ti.i32)

    n = 4
    m = 8
    p = 16
    dim = 3

    ti.root.dense(ti.ijk, (n, m, p)).place(val)

    @ti.kernel
    def test():
        for I in ti.grouped(ti.ndrange(*(((0, n), ) * dim))):
            val[I] = I[0] + I[1] * 2 + I[2] * 3

    test()

    for i in range(n):
        for j in range(m):
            for k in range(p):
                assert val[i, j, k] == (i + j * 2 + k * 3 if j < n and k < n else 0)


@ti.all_archs
def test_grouped_ndrange_0d():
    val = ti.var(ti.i32, shape=())

    @ti.kernel
    def test():
        for I in ti.grouped(ti.ndrange()):
            val[I] = 42

    test()

    assert val[None] == 42


@ti.all_archs
def test_static_grouped_ndrange_0d():
    val = ti.var(ti.i32, shape=())

    @ti.kernel
    def test():
        for I in ti.static(ti.grouped(ti.ndrange())):
            val[I] = 42

    test()

    assert val[None] == 42


@ti.all_archs
def test_static_grouped_func():
    K = 3
    dim = 2

    v = ti.Vector(K, dt=ti.i32, shape=((K, ) * dim))

    def stencil_range():
        return ti.ndrange(*((K, ) * (dim + 1)))

    @ti.kernel
    def p2g():
        for I in ti.static(ti.grouped(stencil_range())):
            v[I[0], I[1]][I[2]] = I[0] + I[1] * 3 + I[2] * 10

    p2g()

    for i in range(K):
        for j in range(K):
            for k in range(K):
                assert v[i, j][k] == i + j * 3 + k * 10
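The grouped ndrange loops above all iterate a Cartesian product of per-axis ranges, where each `ti.ndrange` argument is either an extent `n` (meaning `range(n)`) or a `(begin, end)` pair. A minimal pure-Python sketch of that index set, using `itertools.product` rather than Taichi itself (the helper name `ndrange_indices` is ours, not a Taichi API):

```python
from itertools import product


def ndrange_indices(*args):
    """Yield index tuples over the same set ti.ndrange describes:
    an int argument n means range(n); a (begin, end) pair means
    range(begin, end)."""
    ranges = []
    for a in args:
        if isinstance(a, tuple):
            begin, end = a
            ranges.append(range(begin, end))
        else:
            ranges.append(range(a))
    return product(*ranges)


# Mirrors test_grouped_ndrange: only indices with
# x0 <= i < y0 and x1 <= j < y1 are visited.
visited = list(ndrange_indices((2, 3), (1, 6)))
```

This also makes clear why the starred form `ti.ndrange(*(((0, n), ) * dim))` visits only indices below `n` on every axis.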


# ---------------------------------------------------------------------------
# set_cache_values.py -- from Lever-age/api (MIT License)
# ---------------------------------------------------------------------------
#!/usr/bin/python3
# coding: utf-8
from leverageapi.database import db_session
# -- -----------------------------------------
# -- Set Committee total donations by year
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
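The same three hard-coded date windows are rebuilt before every section below. They could be derived from the year instead; a sketch under that assumption (the helper name `make_year_windows` is ours, and the windows use Dec 31 / Jan 1 bounds because the queries compare with strict `>` and `<`):

```python
from datetime import date


def make_year_windows(years, total_start_year):
    """Build the {breakdown: {'date_start', 'date_end'}} mapping used below.

    Each per-year window runs from Dec 31 of the prior year to Jan 1 of
    the following year, matching the strict > / < comparisons in the SQL.
    """
    windows = {
        str(y): {
            "date_start": str(date(y - 1, 12, 31)),
            "date_end": str(date(y + 1, 1, 1)),
        }
        for y in years
    }
    windows["total"] = {
        "date_start": str(date(total_start_year - 1, 12, 31)),
        "date_end": str(date(max(years) + 1, 1, 1)),
    }
    return windows
```

With `make_year_windows([2017, 2018], 2017)` this reproduces the dict literals above exactly.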
print('committee -- donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, committee_id, 'committee', 'donations_by_year', '{}', '', SUM(d.donation_amount)
        FROM political_donation d
        WHERE d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
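The statements above interpolate the date bounds with `str.format()`. The same pattern can use bound parameters so the driver handles quoting; a minimal, self-contained sqlite3 sketch of the idea (the `donation` table and its columns are illustrative, not the real schema; with SQLAlchemy, `text()` with named parameters plays the same role):

```python
import sqlite3

# Tiny in-memory stand-in for political_donation.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE donation (committee_id INTEGER, amount REAL, donation_date TEXT)"
)
conn.executemany(
    "INSERT INTO donation VALUES (?, ?, ?)",
    [(1, 100.0, "2017-06-01"), (1, 50.0, "2018-06-01"), (2, 25.0, "2017-03-15")],
)

# One query, reused for every breakdown window, with the dates bound
# as parameters instead of formatted into the SQL string.
sql = """SELECT committee_id, SUM(amount)
         FROM donation
         WHERE donation_date > ? AND donation_date < ?
         GROUP BY committee_id"""
totals_2017 = dict(conn.execute(sql, ("2016-12-31", "2018-01-01")).fetchall())
```

Binding also sidesteps SQL-injection risk if the breakdown labels ever come from untrusted input.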
# -- -----------------------------------------
# -- Set Committee count of donations by year
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee -- count_donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'count_donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, committee_id, 'committee', 'count_donations_by_year', '{}', '', COUNT(d.donation_amount)
        FROM political_donation d
        WHERE d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee in-district contributions by year for State House races
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee (house) -- in_district_donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'in_district_donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'in_district_donations_by_year', '{}', '', SUM(d.donation_amount)
        FROM political_donation d, committee comm, `candidate_committees` cand_comm, `candidate` cand, `candidacy`, `race`,
        `contributor`, `contributor_address`, `contributor_address_cicero_district_set` ad_set, `cicero_district`
        WHERE d.committee_id = comm.id
        AND comm.id = cand_comm.committee_id
        AND cand_comm.candidate_id = cand.id
        AND cand.id = `candidacy`.candidate_id
        AND `candidacy`.race_id = `race`.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.id = ad_set.address_id
        AND ad_set.cicero_district_id = `cicero_district`.id
        AND `race`.race_name = 'REPRESENTATIVE IN THE GENERAL ASSEMBLY'
        AND `cicero_district`.district_type = 'STATE_LOWER'
        AND `race`.race_district = `cicero_district`.district_id
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee in-district contributions by year for State Senate races
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee (senate) -- in_district_donations_by_year')
for breakdown in replacement_dict:
    # Do not delete! These are deleted above!
    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'in_district_donations_by_year', '{}', '', SUM(d.donation_amount)
        FROM political_donation d, committee comm, `candidate_committees` cand_comm, `candidate` cand, `candidacy`, `race`,
        `contributor`, `contributor_address`, `contributor_address_cicero_district_set` ad_set, `cicero_district`
        WHERE d.committee_id = comm.id
        AND comm.id = cand_comm.committee_id
        AND cand_comm.candidate_id = cand.id
        AND cand.id = `candidacy`.candidate_id
        AND `candidacy`.race_id = `race`.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.id = ad_set.address_id
        AND ad_set.cicero_district_id = `cicero_district`.id
        AND `race`.race_name = 'SENATOR IN THE GENERAL ASSEMBLY'
        AND `cicero_district`.district_type = 'STATE_UPPER'
        AND `race`.race_district = `cicero_district`.district_id
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee in-district count of contributions by year for State House races
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee (house) -- in_district_count_donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'in_district_count_donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'in_district_count_donations_by_year', '{}', '', COUNT(d.donation_amount)
        FROM political_donation d, committee comm, `candidate_committees` cand_comm, `candidate` cand, `candidacy`, `race`,
        `contributor`, `contributor_address`, `contributor_address_cicero_district_set` ad_set, `cicero_district`
        WHERE d.committee_id = comm.id
        AND comm.id = cand_comm.committee_id
        AND cand_comm.candidate_id = cand.id
        AND cand.id = `candidacy`.candidate_id
        AND `candidacy`.race_id = `race`.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.id = ad_set.address_id
        AND ad_set.cicero_district_id = `cicero_district`.id
        AND `race`.race_name = 'REPRESENTATIVE IN THE GENERAL ASSEMBLY'
        AND `cicero_district`.district_type = 'STATE_LOWER'
        AND `race`.race_district = `cicero_district`.district_id
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee in-district count of contributions by year for State Senate races
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee (senate) -- in_district_count_donations_by_year')
for breakdown in replacement_dict:
    # Do not delete! These are deleted above!
    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'in_district_count_donations_by_year', '{}', '', COUNT(d.donation_amount)
        FROM political_donation d, committee comm, `candidate_committees` cand_comm, `candidate` cand, `candidacy`, `race`,
        `contributor`, `contributor_address`, `contributor_address_cicero_district_set` ad_set, `cicero_district`
        WHERE d.committee_id = comm.id
        AND comm.id = cand_comm.committee_id
        AND cand_comm.candidate_id = cand.id
        AND cand.id = `candidacy`.candidate_id
        AND `candidacy`.race_id = `race`.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.id = ad_set.address_id
        AND ad_set.cicero_district_id = `cicero_district`.id
        AND `race`.race_name = 'SENATOR IN THE GENERAL ASSEMBLY'
        AND `cicero_district`.district_type = 'STATE_UPPER'
        AND `race`.race_district = `cicero_district`.district_id
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee in-pa contributions by year
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee -- in_pa_donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'in_pa_donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'in_pa_donations_by_year', '{}', '', SUM(d.donation_amount)
        FROM political_donation d, committee comm, `contributor`, `contributor_address`
        WHERE d.committee_id = comm.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.state = 'pa'
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee count of in-pa contributions by year
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee -- in_pa_count_donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'in_pa_count_donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'in_pa_count_donations_by_year', '{}', '', COUNT(d.donation_amount)
        FROM political_donation d, committee comm, `contributor`, `contributor_address`
        WHERE d.committee_id = comm.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.state = 'pa'
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Committee out-of-pa contributions by year
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('committee -- out_of_pa_donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'committee'
        AND breakdown_1 = 'out_of_pa_donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, comm.id, 'committee', 'out_of_pa_donations_by_year', '{}', '', SUM(d.donation_amount)
        FROM political_donation d, committee comm, `contributor`, `contributor_address`
        WHERE d.committee_id = comm.id
        AND d.contributor_id = `contributor`.id
        AND `contributor`.address_id = `contributor_address`.id
        AND `contributor_address`.state != 'pa'
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY d.`committee_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)
# -- -----------------------------------------
# -- Set Race total contributions by year
# -- -----------------------------------------
replacement_dict = {}
replacement_dict['total'] = {'date_start': '2016-12-31', 'date_end': '2019-01-01'}
replacement_dict['2018'] = {'date_start': '2017-12-31', 'date_end': '2019-01-01'}
replacement_dict['2017'] = {'date_start': '2016-12-31', 'date_end': '2018-01-01'}
print('race -- donations_by_year')
for breakdown in replacement_dict:
    sql_delete = """DELETE FROM `cache_value_amount`
        WHERE object_name = 'race'
        AND breakdown_1 = 'donations_by_year'
        AND breakdown_2 = '{}';""".format(breakdown)
    db_session.execute(sql_delete)

    sql_query = """INSERT INTO `cache_value_amount`
        SELECT NULL, candidacy.race_id, 'race', 'donations_by_year', '{}', '', SUM(d.donation_amount)
        FROM political_donation d, committee comm, `candidate_committees` cc, `candidate` cand, candidacy
        WHERE d.committee_id = comm.id
        AND comm.id = cc.committee_id
        AND cc.candidate_id = cand.id
        AND cand.id = candidacy.candidate_id
        AND d.donation_date > '{}'
        AND d.donation_date < '{}'
        GROUP BY `candidacy`.`race_id`;""".format(breakdown, replacement_dict[breakdown]['date_start'], replacement_dict[breakdown]['date_end'])

    # print(sql_query)
    results = db_session.execute(sql_query)


# ---------------------------------------------------------------------------
# libcloud/test/compute/test_opennebula.py -- from rgharris/libcloud
# (Apache License 2.0)
# ---------------------------------------------------------------------------
# Copyright 2002-2009, Distributed Systems Architecture Group, Universidad
# Complutense de Madrid (dsa-research.org)
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
OpenNebula.org test suite.
"""
__docformat__ = "epytext"
import unittest
import sys
from libcloud.utils.py3 import httplib
from libcloud.compute.base import Node, NodeImage, NodeSize, NodeState
from libcloud.compute.drivers.opennebula import OpenNebulaNodeDriver
from libcloud.compute.drivers.opennebula import OpenNebulaNetwork
from libcloud.compute.drivers.opennebula import OpenNebulaNodeSize
from libcloud.compute.drivers.opennebula import ACTION
import libcloud.compute.drivers.opennebula
from libcloud.test.file_fixtures import ComputeFileFixtures
from libcloud.test import MockHttp
from libcloud.test.secrets import OPENNEBULA_PARAMS
libcloud.compute.drivers.opennebula.API_HOST = "dummy"
class OpenNebula_1_4_Tests(unittest.TestCase):
"""
OpenNebula.org test suite for OpenNebula v1.4.
"""
def setUp(self):
"""
Setup test environment.
"""
OpenNebulaNodeDriver.connectionCls.conn_class = OpenNebula_1_4_MockHttp
self.driver = OpenNebulaNodeDriver(*OPENNEBULA_PARAMS + ("1.4",), host="dummy")
def test_create_node(self):
"""
Test create_node functionality.
"""
image = NodeImage(id=5, name="Ubuntu 9.04 LAMP", driver=self.driver)
size = NodeSize(
id=1,
name="small",
ram=None,
disk=None,
bandwidth=None,
price=None,
driver=self.driver,
)
networks = list()
networks.append(
OpenNebulaNetwork(
id=5,
name="Network 5",
address="192.168.0.0",
size=256,
driver=self.driver,
)
)
networks.append(
OpenNebulaNetwork(
id=15,
name="Network 15",
address="192.168.1.0",
size=256,
driver=self.driver,
)
)
node = self.driver.create_node(
name="Compute 5", image=image, size=size, networks=networks
)
self.assertEqual(node.id, "5")
self.assertEqual(node.name, "Compute 5")
self.assertEqual(node.state, OpenNebulaNodeDriver.NODE_STATE_MAP["ACTIVE"])
self.assertIsNone(node.public_ips[0].name)
self.assertEqual(node.public_ips[0].id, "5")
self.assertEqual(node.public_ips[0].address, "192.168.0.1")
self.assertEqual(node.public_ips[0].size, 1)
self.assertIsNone(node.public_ips[1].name)
self.assertEqual(node.public_ips[1].id, "15")
self.assertEqual(node.public_ips[1].address, "192.168.1.1")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.private_ips, [])
self.assertEqual(node.image.id, "5")
self.assertEqual(node.image.extra["dev"], "sda1")
def test_destroy_node(self):
"""
Test destroy_node functionality.
"""
node = Node(5, None, None, None, None, self.driver)
ret = self.driver.destroy_node(node)
self.assertTrue(ret)
def test_list_nodes(self):
"""
Test list_nodes functionality.
"""
nodes = self.driver.list_nodes()
self.assertEqual(len(nodes), 3)
node = nodes[0]
self.assertEqual(node.id, "5")
self.assertEqual(node.name, "Compute 5")
self.assertEqual(node.state, OpenNebulaNodeDriver.NODE_STATE_MAP["ACTIVE"])
self.assertEqual(node.public_ips[0].id, "5")
self.assertIsNone(node.public_ips[0].name)
self.assertEqual(node.public_ips[0].address, "192.168.0.1")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[1].id, "15")
self.assertIsNone(node.public_ips[1].name)
self.assertEqual(node.public_ips[1].address, "192.168.1.1")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.private_ips, [])
self.assertEqual(node.image.id, "5")
self.assertEqual(node.image.extra["dev"], "sda1")
node = nodes[1]
self.assertEqual(node.id, "15")
self.assertEqual(node.name, "Compute 15")
self.assertEqual(node.state, OpenNebulaNodeDriver.NODE_STATE_MAP["ACTIVE"])
self.assertEqual(node.public_ips[0].id, "5")
self.assertIsNone(node.public_ips[0].name)
self.assertEqual(node.public_ips[0].address, "192.168.0.2")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[1].id, "15")
self.assertIsNone(node.public_ips[1].name)
self.assertEqual(node.public_ips[1].address, "192.168.1.2")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.private_ips, [])
self.assertEqual(node.image.id, "15")
self.assertEqual(node.image.extra["dev"], "sda1")
node = nodes[2]
self.assertEqual(node.id, "25")
self.assertEqual(node.name, "Compute 25")
self.assertEqual(node.state, NodeState.UNKNOWN)
self.assertEqual(node.public_ips[0].id, "5")
self.assertIsNone(node.public_ips[0].name)
self.assertEqual(node.public_ips[0].address, "192.168.0.3")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[1].id, "15")
self.assertIsNone(node.public_ips[1].name)
self.assertEqual(node.public_ips[1].address, "192.168.1.3")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.private_ips, [])
self.assertIsNone(node.image)
def test_list_images(self):
"""
Test list_images functionality.
"""
images = self.driver.list_images()
self.assertEqual(len(images), 2)
image = images[0]
self.assertEqual(image.id, "5")
self.assertEqual(image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(image.extra["size"], "2048")
self.assertEqual(image.extra["url"], "file:///images/ubuntu/jaunty.img")
image = images[1]
self.assertEqual(image.id, "15")
self.assertEqual(image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(image.extra["size"], "2048")
self.assertEqual(image.extra["url"], "file:///images/ubuntu/jaunty.img")
def test_list_sizes(self):
"""
Test list_sizes functionality.
"""
sizes = self.driver.list_sizes()
self.assertEqual(len(sizes), 3)
size = sizes[0]
self.assertEqual(size.id, "1")
self.assertEqual(size.name, "small")
self.assertIsNone(size.ram)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[1]
self.assertEqual(size.id, "2")
self.assertEqual(size.name, "medium")
self.assertIsNone(size.ram)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[2]
self.assertEqual(size.id, "3")
self.assertEqual(size.name, "large")
self.assertIsNone(size.ram)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
def test_list_locations(self):
"""
Test list_locations functionality.
"""
locations = self.driver.list_locations()
self.assertEqual(len(locations), 1)
location = locations[0]
self.assertEqual(location.id, "0")
self.assertEqual(location.name, "")
self.assertEqual(location.country, "")
def test_ex_list_networks(self):
"""
Test ex_list_networks functionality.
"""
networks = self.driver.ex_list_networks()
self.assertEqual(len(networks), 2)
network = networks[0]
self.assertEqual(network.id, "5")
self.assertEqual(network.name, "Network 5")
self.assertEqual(network.address, "192.168.0.0")
self.assertEqual(network.size, "256")
network = networks[1]
self.assertEqual(network.id, "15")
self.assertEqual(network.name, "Network 15")
self.assertEqual(network.address, "192.168.1.0")
self.assertEqual(network.size, "256")
def test_ex_node_action(self):
"""
Test ex_node_action functionality.
"""
node = Node(5, None, None, None, None, self.driver)
ret = self.driver.ex_node_action(node, ACTION.STOP)
self.assertTrue(ret)
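Each setUp above swaps the driver's connection class for a MockHttp double before exercising it, so no real network traffic happens. A minimal, library-free sketch of that pattern (the class names here are illustrative, not libcloud's real internals):

```python
class Connection:
    """Stand-in for a driver's real HTTP connection class."""

    def request(self, path):
        raise RuntimeError("would hit the network")


class Driver:
    # The driver looks its connection class up at call time, which is
    # exactly what makes the swap below effective.
    connectionCls = Connection

    def list_things(self):
        return self.connectionCls().request("/things")


class MockConnection(Connection):
    def request(self, path):
        # Canned fixture data instead of real I/O.
        return ["thing-1", "thing-2"]


# What `connectionCls.conn_class = ..._MockHttp` achieves in the tests:
Driver.connectionCls = MockConnection
things = Driver().list_things()
```

Because the attribute is looked up on the class at call time, the swap affects every driver instance created afterwards, which is why it lives in setUp.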
class OpenNebula_2_0_Tests(unittest.TestCase):
"""
OpenNebula.org test suite for OpenNebula v2.0 through v2.2.
"""
def setUp(self):
"""
Setup test environment.
"""
OpenNebulaNodeDriver.connectionCls.conn_class = OpenNebula_2_0_MockHttp
self.driver = OpenNebulaNodeDriver(*OPENNEBULA_PARAMS + ("2.0",), host="dummy")
def test_create_node(self):
"""
Test create_node functionality.
"""
image = NodeImage(id=5, name="Ubuntu 9.04 LAMP", driver=self.driver)
size = OpenNebulaNodeSize(
id=1,
name="small",
ram=1024,
cpu=1,
disk=None,
bandwidth=None,
price=None,
driver=self.driver,
)
networks = list()
networks.append(
OpenNebulaNetwork(
id=5,
name="Network 5",
address="192.168.0.0",
size=256,
driver=self.driver,
)
)
networks.append(
OpenNebulaNetwork(
id=15,
name="Network 15",
address="192.168.1.0",
size=256,
driver=self.driver,
)
)
context = {"hostname": "compute-5"}
node = self.driver.create_node(
name="Compute 5", image=image, size=size, networks=networks, context=context
)
self.assertEqual(node.id, "5")
self.assertEqual(node.name, "Compute 5")
self.assertEqual(node.state, OpenNebulaNodeDriver.NODE_STATE_MAP["ACTIVE"])
self.assertEqual(node.public_ips[0].id, "5")
self.assertEqual(node.public_ips[0].name, "Network 5")
self.assertEqual(node.public_ips[0].address, "192.168.0.1")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[0].extra["mac"], "02:00:c0:a8:00:01")
self.assertEqual(node.public_ips[1].id, "15")
self.assertEqual(node.public_ips[1].name, "Network 15")
self.assertEqual(node.public_ips[1].address, "192.168.1.1")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.public_ips[1].extra["mac"], "02:00:c0:a8:01:01")
self.assertEqual(node.private_ips, [])
self.assertTrue(
len([s for s in self.driver.list_sizes() if s.id == node.size.id]) == 1
)
self.assertEqual(node.image.id, "5")
self.assertEqual(node.image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(node.image.extra["type"], "DISK")
self.assertEqual(node.image.extra["target"], "hda")
context = node.extra["context"]
self.assertEqual(context["hostname"], "compute-5")
def test_destroy_node(self):
"""
Test destroy_node functionality.
"""
node = Node(5, None, None, None, None, self.driver)
ret = self.driver.destroy_node(node)
self.assertTrue(ret)
def test_list_nodes(self):
"""
Test list_nodes functionality.
"""
nodes = self.driver.list_nodes()
self.assertEqual(len(nodes), 3)
node = nodes[0]
self.assertEqual(node.id, "5")
self.assertEqual(node.name, "Compute 5")
self.assertEqual(node.state, OpenNebulaNodeDriver.NODE_STATE_MAP["ACTIVE"])
self.assertEqual(node.public_ips[0].id, "5")
self.assertEqual(node.public_ips[0].name, "Network 5")
self.assertEqual(node.public_ips[0].address, "192.168.0.1")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[0].extra["mac"], "02:00:c0:a8:00:01")
self.assertEqual(node.public_ips[1].id, "15")
self.assertEqual(node.public_ips[1].name, "Network 15")
self.assertEqual(node.public_ips[1].address, "192.168.1.1")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.public_ips[1].extra["mac"], "02:00:c0:a8:01:01")
self.assertEqual(node.private_ips, [])
self.assertTrue(
len([size for size in self.driver.list_sizes() if size.id == node.size.id])
== 1
)
self.assertEqual(node.size.id, "1")
self.assertEqual(node.size.name, "small")
self.assertEqual(node.size.ram, 1024)
self.assertTrue(node.size.cpu is None or isinstance(node.size.cpu, int))
self.assertTrue(node.size.vcpu is None or isinstance(node.size.vcpu, int))
self.assertEqual(node.size.cpu, 1)
self.assertIsNone(node.size.vcpu)
self.assertIsNone(node.size.disk)
self.assertIsNone(node.size.bandwidth)
self.assertIsNone(node.size.price)
self.assertTrue(
len(
[
image
for image in self.driver.list_images()
if image.id == node.image.id
]
)
== 1
)
self.assertEqual(node.image.id, "5")
self.assertEqual(node.image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(node.image.extra["type"], "DISK")
self.assertEqual(node.image.extra["target"], "hda")
context = node.extra["context"]
self.assertEqual(context["hostname"], "compute-5")
node = nodes[1]
self.assertEqual(node.id, "15")
self.assertEqual(node.name, "Compute 15")
self.assertEqual(node.state, OpenNebulaNodeDriver.NODE_STATE_MAP["ACTIVE"])
self.assertEqual(node.public_ips[0].id, "5")
self.assertEqual(node.public_ips[0].name, "Network 5")
self.assertEqual(node.public_ips[0].address, "192.168.0.2")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[0].extra["mac"], "02:00:c0:a8:00:02")
self.assertEqual(node.public_ips[1].id, "15")
self.assertEqual(node.public_ips[1].name, "Network 15")
self.assertEqual(node.public_ips[1].address, "192.168.1.2")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.public_ips[1].extra["mac"], "02:00:c0:a8:01:02")
self.assertEqual(node.private_ips, [])
self.assertTrue(
len([size for size in self.driver.list_sizes() if size.id == node.size.id])
== 1
)
self.assertEqual(node.size.id, "1")
self.assertEqual(node.size.name, "small")
self.assertEqual(node.size.ram, 1024)
self.assertTrue(node.size.cpu is None or isinstance(node.size.cpu, int))
self.assertTrue(node.size.vcpu is None or isinstance(node.size.vcpu, int))
self.assertEqual(node.size.cpu, 1)
self.assertIsNone(node.size.vcpu)
self.assertIsNone(node.size.disk)
self.assertIsNone(node.size.bandwidth)
self.assertIsNone(node.size.price)
self.assertTrue(
len(
[
image
for image in self.driver.list_images()
if image.id == node.image.id
]
)
== 1
)
self.assertEqual(node.image.id, "15")
self.assertEqual(node.image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(node.image.extra["type"], "DISK")
self.assertEqual(node.image.extra["target"], "hda")
context = node.extra["context"]
self.assertEqual(context["hostname"], "compute-15")
node = nodes[2]
self.assertEqual(node.id, "25")
self.assertEqual(node.name, "Compute 25")
self.assertEqual(node.state, NodeState.UNKNOWN)
self.assertEqual(node.public_ips[0].id, "5")
self.assertEqual(node.public_ips[0].name, "Network 5")
self.assertEqual(node.public_ips[0].address, "192.168.0.3")
self.assertEqual(node.public_ips[0].size, 1)
self.assertEqual(node.public_ips[0].extra["mac"], "02:00:c0:a8:00:03")
self.assertEqual(node.public_ips[1].id, "15")
self.assertEqual(node.public_ips[1].name, "Network 15")
self.assertEqual(node.public_ips[1].address, "192.168.1.3")
self.assertEqual(node.public_ips[1].size, 1)
self.assertEqual(node.public_ips[1].extra["mac"], "02:00:c0:a8:01:03")
self.assertEqual(node.private_ips, [])
self.assertIsNone(node.size)
self.assertIsNone(node.image)
context = node.extra["context"]
self.assertEqual(context, {})
def test_list_images(self):
"""
Test list_images functionality.
"""
images = self.driver.list_images()
self.assertEqual(len(images), 2)
image = images[0]
self.assertEqual(image.id, "5")
self.assertEqual(image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(image.extra["description"], "Ubuntu 9.04 LAMP Description")
self.assertEqual(image.extra["type"], "OS")
self.assertEqual(image.extra["size"], "2048")
image = images[1]
self.assertEqual(image.id, "15")
self.assertEqual(image.name, "Ubuntu 9.04 LAMP")
self.assertEqual(image.extra["description"], "Ubuntu 9.04 LAMP Description")
self.assertEqual(image.extra["type"], "OS")
self.assertEqual(image.extra["size"], "2048")
def test_list_sizes(self):
"""
Test list_sizes functionality.
"""
sizes = self.driver.list_sizes()
self.assertEqual(len(sizes), 4)
size = sizes[0]
self.assertEqual(size.id, "1")
self.assertEqual(size.name, "small")
self.assertEqual(size.ram, 1024)
self.assertTrue(size.cpu is None or isinstance(size.cpu, int))
self.assertTrue(size.vcpu is None or isinstance(size.vcpu, int))
self.assertEqual(size.cpu, 1)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[1]
self.assertEqual(size.id, "2")
self.assertEqual(size.name, "medium")
self.assertEqual(size.ram, 4096)
self.assertTrue(size.cpu is None or isinstance(size.cpu, int))
self.assertTrue(size.vcpu is None or isinstance(size.vcpu, int))
self.assertEqual(size.cpu, 4)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[2]
self.assertEqual(size.id, "3")
self.assertEqual(size.name, "large")
self.assertEqual(size.ram, 8192)
self.assertTrue(size.cpu is None or isinstance(size.cpu, int))
self.assertTrue(size.vcpu is None or isinstance(size.vcpu, int))
self.assertEqual(size.cpu, 8)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[3]
self.assertEqual(size.id, "4")
self.assertEqual(size.name, "custom")
self.assertEqual(size.ram, 0)
self.assertEqual(size.cpu, 0)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
def test_list_locations(self):
"""
Test list_locations functionality.
"""
locations = self.driver.list_locations()
self.assertEqual(len(locations), 1)
location = locations[0]
self.assertEqual(location.id, "0")
self.assertEqual(location.name, "")
self.assertEqual(location.country, "")
def test_ex_list_networks(self):
"""
Test ex_list_networks functionality.
"""
networks = self.driver.ex_list_networks()
self.assertEqual(len(networks), 2)
network = networks[0]
self.assertEqual(network.id, "5")
self.assertEqual(network.name, "Network 5")
self.assertEqual(network.address, "192.168.0.0")
self.assertEqual(network.size, "256")
network = networks[1]
self.assertEqual(network.id, "15")
self.assertEqual(network.name, "Network 15")
self.assertEqual(network.address, "192.168.1.0")
self.assertEqual(network.size, "256")
class OpenNebula_3_0_Tests(unittest.TestCase):
"""
OpenNebula.org test suite for OpenNebula v3.0.
"""
def setUp(self):
"""
Setup test environment.
"""
OpenNebulaNodeDriver.connectionCls.conn_class = OpenNebula_3_0_MockHttp
self.driver = OpenNebulaNodeDriver(*OPENNEBULA_PARAMS + ("3.0",), host="dummy")
def test_ex_list_networks(self):
"""
Test ex_list_networks functionality.
"""
networks = self.driver.ex_list_networks()
self.assertEqual(len(networks), 2)
network = networks[0]
self.assertEqual(network.id, "5")
self.assertEqual(network.name, "Network 5")
self.assertEqual(network.address, "192.168.0.0")
self.assertEqual(network.size, "256")
self.assertEqual(network.extra["public"], "YES")
network = networks[1]
self.assertEqual(network.id, "15")
self.assertEqual(network.name, "Network 15")
self.assertEqual(network.address, "192.168.1.0")
self.assertEqual(network.size, "256")
self.assertEqual(network.extra["public"], "NO")
def test_ex_node_set_save_name(self):
"""
Test ex_node_set_save_name functionality.
"""
image = NodeImage(id=5, name="Ubuntu 9.04 LAMP", driver=self.driver)
node = Node(5, None, None, None, None, self.driver, image=image)
ret = self.driver.ex_node_set_save_name(node, "test")
self.assertTrue(ret)
class OpenNebula_3_2_Tests(unittest.TestCase):
"""
OpenNebula.org test suite for OpenNebula v3.2.
"""
def setUp(self):
"""
Setup test environment.
"""
OpenNebulaNodeDriver.connectionCls.conn_class = OpenNebula_3_2_MockHttp
self.driver = OpenNebulaNodeDriver(*OPENNEBULA_PARAMS + ("3.2",), host="dummy")
def test_reboot_node(self):
"""
Test reboot_node functionality.
"""
image = NodeImage(id=5, name="Ubuntu 9.04 LAMP", driver=self.driver)
node = Node(5, None, None, None, None, self.driver, image=image)
ret = self.driver.reboot_node(node)
self.assertTrue(ret)
def test_list_sizes(self):
"""
Test list_sizes functionality.
"""
sizes = self.driver.list_sizes()
self.assertEqual(len(sizes), 3)
size = sizes[0]
self.assertEqual(size.id, "1")
self.assertEqual(size.name, "small")
self.assertEqual(size.ram, 1024)
self.assertTrue(size.cpu is None or isinstance(size.cpu, float))
self.assertTrue(size.vcpu is None or isinstance(size.vcpu, int))
self.assertEqual(size.cpu, 1)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[1]
self.assertEqual(size.id, "2")
self.assertEqual(size.name, "medium")
self.assertEqual(size.ram, 4096)
self.assertTrue(size.cpu is None or isinstance(size.cpu, float))
self.assertTrue(size.vcpu is None or isinstance(size.vcpu, int))
self.assertEqual(size.cpu, 4)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[2]
self.assertEqual(size.id, "3")
self.assertEqual(size.name, "large")
self.assertEqual(size.ram, 8192)
self.assertTrue(size.cpu is None or isinstance(size.cpu, float))
self.assertTrue(size.vcpu is None or isinstance(size.vcpu, int))
self.assertEqual(size.cpu, 8)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
class OpenNebula_3_6_Tests(unittest.TestCase):
"""
OpenNebula.org test suite for OpenNebula v3.6.
"""
def setUp(self):
"""
Setup test environment.
"""
OpenNebulaNodeDriver.connectionCls.conn_class = OpenNebula_3_6_MockHttp
self.driver = OpenNebulaNodeDriver(*OPENNEBULA_PARAMS + ("3.6",), host="dummy")
def test_create_volume(self):
new_volume = self.driver.create_volume(1000, "test-volume")
self.assertEqual(new_volume.id, "5")
self.assertEqual(new_volume.size, 1000)
self.assertEqual(new_volume.name, "test-volume")
def test_destroy_volume(self):
images = self.driver.list_images()
self.assertEqual(len(images), 2)
image = images[0]
ret = self.driver.destroy_volume(image)
self.assertTrue(ret)
def test_attach_volume(self):
nodes = self.driver.list_nodes()
node = nodes[0]
images = self.driver.list_images()
image = images[0]
ret = self.driver.attach_volume(node, image, "sda")
self.assertTrue(ret)
def test_detach_volume(self):
images = self.driver.list_images()
image = images[1]
ret = self.driver.detach_volume(image)
self.assertTrue(ret)
nodes = self.driver.list_nodes()
# node with only a single associated image
node = nodes[1]
ret = self.driver.detach_volume(node.image)
self.assertFalse(ret)
def test_list_volumes(self):
volumes = self.driver.list_volumes()
self.assertEqual(len(volumes), 2)
volume = volumes[0]
self.assertEqual(volume.id, "5")
self.assertEqual(volume.size, 2048)
self.assertEqual(volume.name, "Ubuntu 9.04 LAMP")
volume = volumes[1]
self.assertEqual(volume.id, "15")
self.assertEqual(volume.size, 1024)
self.assertEqual(volume.name, "Debian Sid")
class OpenNebula_3_8_Tests(unittest.TestCase):
"""
OpenNebula.org test suite for OpenNebula v3.8.
"""
def setUp(self):
"""
Setup test environment.
"""
OpenNebulaNodeDriver.connectionCls.conn_class = OpenNebula_3_8_MockHttp
self.driver = OpenNebulaNodeDriver(*OPENNEBULA_PARAMS + ("3.8",), host="dummy")
def test_list_sizes(self):
"""
Test list_sizes functionality.
"""
sizes = self.driver.list_sizes()
self.assertEqual(len(sizes), 3)
size = sizes[0]
self.assertEqual(size.id, "1")
self.assertEqual(size.name, "small")
self.assertEqual(size.ram, 1024)
self.assertEqual(size.cpu, 1)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[1]
self.assertEqual(size.id, "2")
self.assertEqual(size.name, "medium")
self.assertEqual(size.ram, 4096)
self.assertEqual(size.cpu, 4)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
size = sizes[2]
self.assertEqual(size.id, "3")
self.assertEqual(size.name, "large")
self.assertEqual(size.ram, 8192)
self.assertEqual(size.cpu, 8)
self.assertIsNone(size.vcpu)
self.assertIsNone(size.disk)
self.assertIsNone(size.bandwidth)
self.assertIsNone(size.price)
class OpenNebula_1_4_MockHttp(MockHttp):
"""
Mock HTTP server for testing v1.4 of the OpenNebula.org compute driver.
"""
fixtures = ComputeFileFixtures("opennebula_1_4")
def _compute(self, method, url, body, headers):
"""
Compute pool resources.
"""
if method == "GET":
body = self.fixtures.load("computes.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("compute_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _storage(self, method, url, body, headers):
"""
Storage pool resources.
"""
if method == "GET":
body = self.fixtures.load("storage.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("disk_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _network(self, method, url, body, headers):
"""
Network pool resources.
"""
if method == "GET":
body = self.fixtures.load("networks.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("network_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _compute_5(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _compute_15(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _compute_25(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_25.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _storage_5(self, method, url, body, headers):
"""
Storage entry resource.
"""
if method == "GET":
body = self.fixtures.load("disk_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _storage_15(self, method, url, body, headers):
"""
Storage entry resource.
"""
if method == "GET":
body = self.fixtures.load("disk_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _network_5(self, method, url, body, headers):
"""
Network entry resource.
"""
if method == "GET":
body = self.fixtures.load("network_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _network_15(self, method, url, body, headers):
"""
Network entry resource.
"""
if method == "GET":
body = self.fixtures.load("network_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
class OpenNebula_2_0_MockHttp(MockHttp):
"""
Mock HTTP server for testing v2.0 through v3.2 of the OpenNebula.org
compute driver.
"""
fixtures = ComputeFileFixtures("opennebula_2_0")
def _compute(self, method, url, body, headers):
"""
Compute pool resources.
"""
if method == "GET":
body = self.fixtures.load("compute_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("compute_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _storage(self, method, url, body, headers):
"""
Storage pool resources.
"""
if method == "GET":
body = self.fixtures.load("storage_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("storage_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _network(self, method, url, body, headers):
"""
Network pool resources.
"""
if method == "GET":
body = self.fixtures.load("network_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("network_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _compute_5(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _compute_15(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _compute_25(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_25.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _storage_5(self, method, url, body, headers):
"""
Storage entry resource.
"""
if method == "GET":
body = self.fixtures.load("storage_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _storage_15(self, method, url, body, headers):
"""
Storage entry resource.
"""
if method == "GET":
body = self.fixtures.load("storage_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _network_5(self, method, url, body, headers):
"""
Network entry resource.
"""
if method == "GET":
body = self.fixtures.load("network_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _network_15(self, method, url, body, headers):
"""
Network entry resource.
"""
if method == "GET":
body = self.fixtures.load("network_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
class OpenNebula_3_0_MockHttp(OpenNebula_2_0_MockHttp):
"""
Mock HTTP server for testing v3.0 of the OpenNebula.org compute driver.
"""
fixtures_3_0 = ComputeFileFixtures("opennebula_3_0")
def _network(self, method, url, body, headers):
"""
Network pool resources.
"""
if method == "GET":
body = self.fixtures_3_0.load("network_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures.load("network_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _network_5(self, method, url, body, headers):
"""
Network entry resource.
"""
if method == "GET":
body = self.fixtures_3_0.load("network_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _network_15(self, method, url, body, headers):
"""
Network entry resource.
"""
if method == "GET":
body = self.fixtures_3_0.load("network_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
class OpenNebula_3_2_MockHttp(OpenNebula_3_0_MockHttp):
"""
Mock HTTP server for testing v3.2 of the OpenNebula.org compute driver.
"""
fixtures_3_2 = ComputeFileFixtures("opennebula_3_2")
def _compute_5(self, method, url, body, headers):
"""
Compute entry resource.
"""
if method == "GET":
body = self.fixtures.load("compute_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _instance_type(self, method, url, body, headers):
"""
Instance type pool.
"""
if method == "GET":
body = self.fixtures_3_2.load("instance_type_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
class OpenNebula_3_6_MockHttp(OpenNebula_3_2_MockHttp):
"""
Mock HTTP server for testing v3.6 of the OpenNebula.org compute driver.
"""
fixtures_3_6 = ComputeFileFixtures("opennebula_3_6")
def _storage(self, method, url, body, headers):
if method == "GET":
body = self.fixtures.load("storage_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "POST":
body = self.fixtures_3_6.load("storage_5.xml")
return (httplib.CREATED, body, {}, httplib.responses[httplib.CREATED])
def _compute_5(self, method, url, body, headers):
if method == "GET":
body = self.fixtures_3_6.load("compute_5.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _compute_5_action(self, method, url, body, headers):
body = self.fixtures_3_6.load("compute_5.xml")
if method == "POST":
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "GET":
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _compute_15(self, method, url, body, headers):
if method == "GET":
body = self.fixtures_3_6.load("compute_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if method == "PUT":
body = ""
return (httplib.ACCEPTED, body, {}, httplib.responses[httplib.ACCEPTED])
if method == "DELETE":
body = ""
return (httplib.NO_CONTENT, body, {}, httplib.responses[httplib.NO_CONTENT])
def _storage_10(self, method, url, body, headers):
"""
Storage entry resource.
"""
if method == "GET":
body = self.fixtures_3_6.load("disk_10.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _storage_15(self, method, url, body, headers):
"""
Storage entry resource.
"""
if method == "GET":
body = self.fixtures_3_6.load("disk_15.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
class OpenNebula_3_8_MockHttp(OpenNebula_3_2_MockHttp):
"""
Mock HTTP server for testing v3.8 of the OpenNebula.org compute driver.
"""
fixtures_3_8 = ComputeFileFixtures("opennebula_3_8")
def _instance_type(self, method, url, body, headers):
"""
Instance type pool.
"""
if method == "GET":
body = self.fixtures_3_8.load("instance_type_collection.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _instance_type_small(self, method, url, body, headers):
"""
Small instance type.
"""
if method == "GET":
body = self.fixtures_3_8.load("instance_type_small.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _instance_type_medium(self, method, url, body, headers):
"""
Medium instance type pool.
"""
if method == "GET":
body = self.fixtures_3_8.load("instance_type_medium.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _instance_type_large(self, method, url, body, headers):
"""
Large instance type pool.
"""
if method == "GET":
body = self.fixtures_3_8.load("instance_type_large.xml")
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
if __name__ == "__main__":
sys.exit(unittest.main())

# rsfmri/examples/bias_registration.py (klarnemann/jagust_rsfmri, MIT license)
from rsfmri import utils
from rsfmri import register


def get_spm_mean():
    pass


def get_ants_mean():
    pass


def get_anat_aparc():
    pass


def biascorrect():
    pass


def coreg_mi():
    pass


def coreg_spm_ants_anat():
    pass

# python/source/__init__.py (dineshkumarsarangapani/leetcode-solutions, Apache-2.0 license)
from python.source.arrays.remove_duplicates_from_sorted_array import Solution
from python.source.arrays.max_profit import Solution

# kicker/tests/app/connectors/test_container_tags.py (omBratteng/mottak, Apache-2.0 license)
import pytest
from app.connectors.container_tag_params import ContainerTagParams


@pytest.fixture
def testobj_container_tag_params() -> ContainerTagParams:
    return ContainerTagParams(checksum='55c3d5fb0ccca55b3a6ab51971a5fe32d1626db9',
                              avscan='dcad1fc9f0adc563cd1ac63ed0e5990c7c7d43d7',
                              mailer='69beb8efaf117ff7628db8b01d9a349a2b426893',
                              unpack='eb9d0e0760dbbf553c06f67ef174a53d4a347c70',
                              arkade5='35b34abdfb554738172e46447d5d01d4deabf225',
                              artifact_writer='35be1bb590ee66ff2519329d126262120d42cac9',
                              close_dataset='e73e542ff248101aad79dd19b811b228d66c66d3',
                              open_dataset='e73e542ff248101aad79dd19b811b228d66c66d3',
                              transfer_archive='372ef772f28b7b3d5cb34ca639c51362731a8142',
                              save_dataset_in_mottak='372ef772f28b7b3d5cb34ca639c51362731a8142',
                              )


def test_container_tags(testobj_container_tag_params):
    expected = testobj_container_tag_params
    actual = ContainerTagParams(checksum='55c3d5fb0ccca55b3a6ab51971a5fe32d1626db9',
                                avscan='dcad1fc9f0adc563cd1ac63ed0e5990c7c7d43d7',
                                mailer='69beb8efaf117ff7628db8b01d9a349a2b426893',
                                unpack='eb9d0e0760dbbf553c06f67ef174a53d4a347c70',
                                arkade5='35b34abdfb554738172e46447d5d01d4deabf225',
                                artifact_writer='35be1bb590ee66ff2519329d126262120d42cac9',
                                close_dataset='e73e542ff248101aad79dd19b811b228d66c66d3',
                                open_dataset='e73e542ff248101aad79dd19b811b228d66c66d3',
                                transfer_archive='372ef772f28b7b3d5cb34ca639c51362731a8142',
                                save_dataset_in_mottak='372ef772f28b7b3d5cb34ca639c51362731a8142',
                                )
    assert expected == actual

# testing/multilabel-classifier-test.py (mldb, Apache-2.0 license)
#
# multilabel-classifier-test.py
# Mathieu Marquis Bolduc, March 6th 2017
# This file is part of MLDB. Copyright 2017 mldb.ai inc. All rights reserved.
#
import datetime, os
from random import random, gauss, seed
from mldb import mldb, MldbUnitTest, ResponseException
class MultiLabelClassifierTest(MldbUnitTest): # noqa
@classmethod
def setUpClass(self):
# Create toy dataset
seed(12345)
for dataset_id in ["toy", "toy2"]:
dataset_config = {
'type' : 'sparse.mutable',
'id' : dataset_id
}
dataset = mldb.create_dataset(dataset_config)
now = datetime.datetime.now()
for i in range(5000):
label = random() < 0.5
if label:
dataset.record_row("u%d" % i, [["feat1", 5, now],
["feat2", 0, now],
["label0", True, now]])
else:
dataset.record_row("u%d" % i, [["feat1", 0, now],
["feat2", 5, now],
["label1", True, now]])
dataset.commit()
dataset_config = {
'type' : 'sparse.mutable',
'id' : 'trivial'
}
dataset = mldb.create_dataset(dataset_config)
now = datetime.datetime.now()
numLabelExample = 5
for i in range(numLabelExample):
dataset.record_row("u%d" % i, [["feat1", 5, now],
["feat2", 0, now],
["label0", True, now]])
dataset.record_row("u%d" % (i+numLabelExample), [["feat1", 0, now],
["feat2", 5, now],
["label1", True, now]])
dataset.commit()
dataset_config = {
'type' : 'sparse.mutable',
'id' : 'trivial2'
}
dataset = mldb.create_dataset(dataset_config)
now = datetime.datetime.now()
numLabelExample = 20
for i in range(numLabelExample):
dataset.record_row("u%d" % (1+i*6), [["feat1", 5, now],
["feat2", 0, now],
["feat3", 0, now],
["label0", True, now]])
dataset.record_row("u%d" % (2+i*6), [["feat1", 0, now],
["feat2", 5, now],
["feat3", 0, now],
["label1", True, now]])
dataset.record_row("u%d" % (3+i*6), [["feat1", 0, now],
["feat2", 0, now],
["feat3", 5, now],
["label2", True, now]])
dataset.record_row("u%d" % (4+i*6), [["feat1", 5, now],
["feat2", 5, now],
["feat3", 0, now],
["label0", True, now],
["label1", True, now]])
dataset.record_row("u%d" % (5+i*6), [["feat1", 5, now],
["feat2", 0, now],
["feat3", 5, now],
["label0", True, now],
["label2", True, now]])
dataset.record_row("u%d" % (6+i*6), [["feat1", 0, now],
["feat2", 5, now],
["feat3", 5, now],
["label1", True, now],
["label2", True, now]])
dataset.commit()
dataset_config = {
'type' : 'sparse.mutable',
'id' : 'categoricalds'
}
dataset = mldb.create_dataset(dataset_config)
now = datetime.datetime.now()
seed(123456)
numLabelExample = 20
for i in range(numLabelExample):
dataset.record_row("u%d" % (1+i*3), [["feat1", gauss(5,5), now],
["feat2", gauss(3,2), now],
["feat3", gauss(3,2), now],
["label", "banane", now]])
dataset.record_row("u%d" % (2+i*3), [["feat1", gauss(3,2), now],
["feat2", gauss(5,5), now],
["feat3", gauss(3,2), now],
["label", "pomme", now]])
dataset.record_row("u%d" % (3+i*3), [["feat1", gauss(3,2), now],
["feat2", gauss(3,2), now],
["feat3", gauss(5,5), now],
["label", "orange", now]])
dataset.commit()
def test_random_simple(self):
conf = {
"type": "classifier.train",
"params": {
"trainingData": """
select {* EXCLUDING(label0, label1)} as features,
{label0, label1} as label from toy
""",
"modelFileUrl": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "random",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf)
res = mldb.query("SELECT classifyMe({features : {5 as feat1, 0 as feat2}}) as *")
self.assertTableResultEquals(res, [
[
"_rowName",
"scores.\"\"\"label0\"\"\"",
"scores.\"\"\"label1\"\"\""
],
[
"result",
1,
-1
]
])
conf = {
"type": "classifier.experiment",
"params": {
"experimentName": "my_test_exp",
"inputData": "select {* EXCLUDING(label0, label1)} as features, {label0, label1} as label from toy",
"testingDataOverride": """
select {* EXCLUDING(label0, label1)} as features,
{label0, label1} as label from toy
""",
"datasetFolds" : [
{
"trainingWhere": "rowHash() % 10 < 7",
"testingWhere": "rowHash() % 10 >= 7",
}],
"modelFileUrlPattern": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"equalizationFactor": 0.5,
"mode": "multilabel",
"multilabelStrategy": "random",
"recallOverN" : [1],
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
"outputAccuracyDataset": False
}
}
rez = mldb.put("/v1/procedures/rocket_science", conf)
rez = mldb.post("/v1/procedures/rocket_science/runs")
js_rez = rez.json()
self.assertEqual(
js_rez["status"]["folds"][0]["resultsTest"]["weightedStatistics"]["recallOverTopN"][0], 1.0)
def test_decompose_simple(self):
conf = {
"type": "classifier.train",
"params": {
"trainingData": "select {* EXCLUDING(label0, label1)} as features, {label0, label1} as label from toy",
"modelFileUrl": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "decompose",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf)
res = mldb.query("SELECT classifyMe({features : {5 as feat1, 0 as feat2}}) as *")
self.assertTableResultEquals(res, [
[
"_rowName",
"scores.\"\"\"label0\"\"\"",
"scores.\"\"\"label1\"\"\""
],
[
"result",
1,
-1
]
])
conf = {
"type": "classifier.experiment",
"params": {
"experimentName": "my_test_exp",
"inputData": "select {* EXCLUDING(label0, label1)} as features, {label0, label1} as label from toy",
"testingDataOverride": """
select {* EXCLUDING(label0, label1)} as features,
{label0, label1} as label from toy
""",
"datasetFolds" : [
{
"trainingWhere": "rowHash() % 10 < 7",
"testingWhere": "rowHash() % 10 >= 7",
}],
"modelFileUrlPattern": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"equalizationFactor": 0.5,
"mode": "multilabel",
"multilabelStrategy": "decompose",
"recallOverN" : [1],
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
"outputAccuracyDataset": False
}
}
rez = mldb.put("/v1/procedures/rocket_science", conf)
rez = mldb.post("/v1/procedures/rocket_science/runs")
js_rez = rez.json()
self.assertEqual(
js_rez["status"]["folds"][0]["resultsTest"]["weightedStatistics"]["recallOverTopN"][0], 1.0)
def test_onevsall_simple(self):
conf = {
"type": "classifier.train",
"params": {
"trainingData": """
select {* EXCLUDING(label0, label1)} as features,
{label0, label1} as label from trivial
""",
"modelFileUrl": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "one-vs-all",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf)
res = mldb.query("SELECT classifyMe({features : {5 as feat1, 0 as feat2}}) as *")
self.assertTableResultEquals(res, [
[
"_rowName",
"scores.\"\"\"label0\"\"\"",
"scores.\"\"\"label1\"\"\""
],
[
"result",
0.9999726414680481,
2.73847472271882e-05
]
])
def test_onevsall_combinaison(self):
conf = {
"type": "classifier.train",
"params": {
"trainingData": """
select {* EXCLUDING(label0, label1, label2)} as features,
{label0, label1, label2} as label from trivial2
""",
"modelFileUrl": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "one-vs-all",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf)
res = mldb.query("SELECT classifyMe({features : {5 as feat1, 0 as feat2, 5 as feat3}}) as *")
self.assertTableResultEquals(res, [
[
"_rowName",
"scores.\"\"\"label0\"\"\"",
"scores.\"\"\"label1\"\"\"",
"scores.\"\"\"label2\"\"\""
],
[
"result",
0.9999789595603943,
3.6201177863404155e-05,
0.9999789595603943
]
])
def test_accuracy_multilabel(self):
conf_classifier = {
"type": "classifier.train",
"params": {
"trainingData": """
select {* EXCLUDING(label0, label1, label2)} as features,
{label0, label1, label2} as label from trivial2
""",
"modelFileUrl": "file://build/x86_64/tmp/multilabel1.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "one-vs-all",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf_classifier)
accuracyConf = {
"type": "classifier.test",
"params": {
"testingData": """
select classifyMe({{* EXCLUDING(label0, label1, label2)} as features}) as score,
{label0, label1, label2} as label from trivial2
""",
"mode" : "multilabel",
"recallOverN" : [1, 2],
"runOnCreation": True
}
}
res = mldb.put("/v1/procedures/multilabel_accuracy", accuracyConf)
self.assertEqual(res.json()["status"]["firstRun"]["status"], {
"weightedStatistics": {
"coverageError": 1.333333333333333,
"recallOverTopN": [
0.6666666666666666,
1.0
]
},
"recallOverN": [
1,
2
],
"labelStatistics": {
"label0": {
"recallOverTopN": [
0.6666666666666666,
1.0
]
},
"label1": {
"recallOverTopN": [
0.6666666666666666,
1.0
]
},
"label2": {
"recallOverTopN": [
0.6666666666666666,
1.0
]
}
}
})
def test_precision_over_n_categorical(self):
conf_classifier = {
"type": "classifier.train",
"params": {
"trainingData": "select {* EXCLUDING(label)} as features, label from categoricalds",
"modelFileUrl": "file://build/x86_64/tmp/categorical1.cls",
"algorithm": "dt",
"mode": "categorical",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 2,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/categorical_train", conf_classifier)
accuracyConf = {
"type": "classifier.test",
"params": {
"testingData": """select classifyMe({{* EXCLUDING(label)} as features}) as score,
label from categoricalds
""",
"mode" : "categorical",
"recallOverN" : [2,3],
"runOnCreation": True
}
}
res = mldb.put("/v1/procedures/categorical_accuracy", accuracyConf)
self.assertEqual(res.json()["status"]["firstRun"]["status"], {
"weightedStatistics": {
"recall": 0.65,
"support": 60.0,
"precision": 0.8292682926829269,
"recallOverTopN": [
0.8333333333333334,
1.0
],
"f1Score": 0.6476980089190377,
"accuracy": 0.7666666666666667
},
"labelStatistics": {
"orange": {
"recall": 0.5,
"support": 20.0,
"precision": 1.0,
"recallOverTopN": [
0.5,
1.0
],
"f1Score": 0.6666666666666666,
"accuracy": 0.8333333333333334
},
"banane": {
"recall": 0.45,
"support": 20.0,
"precision": 1.0,
"recallOverTopN": [
1.0,
1.0
],
"f1Score": 0.6206896551724138,
"accuracy": 0.8166666666666667
},
"pomme": {
"recall": 1.0,
"support": 20.0,
"precision": 0.4878048780487805,
"recallOverTopN": [
1.0,
1.0
],
"f1Score": 0.6557377049180327,
"accuracy": 0.65
}
},
"recallOverN": [
2,
3
],
"confusionMatrix": [
{
"count": 9.0,
"actual": "banane",
"predicted": "banane"
},
{
"count": 11.0,
"actual": "banane",
"predicted": "pomme"
},
{
"count": 10.0,
"actual": "orange",
"predicted": "orange"
},
{
"count": 10.0,
"actual": "orange",
"predicted": "pomme"
},
{
"count": 20.0,
"actual": "pomme",
"predicted": "pomme"
}
]
})
def test_decompose_explain(self):
conf = {
"type": "classifier.train",
"params": {
"trainingData": "select {* EXCLUDING(label0, label1)} as features, {label0, label1} as label from toy",
"modelFileUrl": "file://build/x86_64/tmp/multilabel_decompose_explain.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "decompose",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1,
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf)
mldb.put("/v1/functions/explain", {
"type": "classifier.explain",
"params": {
"modelFileUrl": "file://build/x86_64/tmp/multilabel_decompose_explain.cls",
}
})
res = mldb.query("""SELECT explain({features : {5 as feat1, 0 as feat2},
label : 'label0'}) as *
""")
self.assertEqual(res,[
[
"_rowName",
"bias",
"explanation.feat2"
],
[
"result",
0.0012001146096736193,
0.9987998604774475
]
])
def test_onevsall_explain(self):
conf = {
"type": "classifier.train",
"params": {
"trainingData": """select {* EXCLUDING(label0, label1)} as features,
{label0, label1} as label from trivial
""",
"modelFileUrl": "file://build/x86_64/tmp/multilabel_onevsall_explain.cls",
"algorithm": "dt",
"mode": "multilabel",
"multilabelStrategy": "one-vs-all",
"functionName" : "classifyMe",
"configuration": {
"dt": {
"type": "decision_tree",
"max_depth": 8,
"verbosity": 0,
"update_alg": "gentle",
"random_feature_propn": 1
}
},
}
}
mldb.put("/v1/procedures/multilabel_train", conf)
mldb.put("/v1/functions/explain", {
"type": "classifier.explain",
"params": {
"modelFileUrl": "file://build/x86_64/tmp/multilabel_onevsall_explain.cls",
}
})
res = mldb.query("SELECT explain({features : {5 as feat1, 0 as feat2}, label : 'label0'}) as *")
self.assertEqual(res,[
[
"_rowName",
"bias",
"explanation.feat2"
],
[
"result",
0,
1
]
])
if __name__ == '__main__':
mldb.run_tests()
] | null | null | null | '''
Read databases for insights results
'''
# pylint: disable=invalid-name
# pylint: disable=bare-except
# pylint: disable=broad-except
# pylint: disable=redefined-outer-name
# pylint: disable=too-many-statements
# pylint: disable=too-many-branches
# pylint: disable=unused-argument
# pylint: disable=singleton-comparison
# pylint: disable=redundant-keyword-arg
# pylint: disable=too-many-arguments
# pylint: disable=too-many-locals
# pylint: disable=no-value-for-parameter
# pylint: disable=too-many-function-args
# pylint: disable=too-many-nested-blocks
# pylint: disable=unused-variable
# pylint: disable=import-error
# pylint: disable=too-many-lines
# pylint: disable=unreachable
import datetime
import logging
import traceback
import numpy as np
import MySQLdb as mysql
from ..qymatix import results
from ..qymatix.analytics.performance_analytics import goals, kam, multiparam
from ..qymatix.analytics.sales_analytics import sales
from ..infrastructure.mysql import connection
logger = logging.getLogger(__name__)
def getInsights(dbname, account='all', raw=False,
local=False, dbusername='', passwd='', username=''):
''' Read the results database, manipulate the data, and return it.
'''
dbname = 'data_{}'.format(dbname)
dbname_results = dbname
dbname_tasks = dbname.replace('tasks', 'data')
# username = ''
# passwd = ''
data = dict()
# try:
# account = account.decode('utf-8')
# except:
# pass
# account = account.encode('latin-1')
try:
# logging.debug(dbname)
datadb = dbname_results
mysql_connection = connection.MySQLConnection(datadb)
con = mysql_connection.connect()
cur = con.cursor()
if account == 'all':
data['plans per account'] = kam.plansPerAccount(cur, username=username)
data['actions per account'] = kam.actionsPerAccount(cur, username=username)
data['activity goals'] = kam.activityGoals(cur, account=account, username=username)
data['total sales plans'] = kam.totalSalesPlans(cur, account=account, username=username)
data['total plan goals'] = kam.totalPlanGoals(cur, account=account, username=username)
data['actions per day'] = kam.actionsPerDay(cur, account=account, username=username)
data['actions per month'] = kam.actionsPerMonth(cur, account=account, username=username)
data['actions per year'] = kam.actionsPerYear(cur, account=account, username=username)
data['goals per quarter'] = kam.goalsPerQuarter(cur, account=account, username=username)
data['total calls goal'] = kam.totalCallsGoal(cur, account=account, username=username)
data['total visits goal'] = kam.totalVisitsGoal(cur, account=account, username=username)
data['total offers goal'] = kam.totalOffersGoal(cur, account=account, username=username)
month = str(datetime.datetime.now().month)
try:
data['actions this month'] = data['actions per month'][month]
except:
data['actions this month'] = 0
data['actions QTD'] = kam.actionsQTD(cur, account=account, username=username)
data['actions MTD'] = kam.actionsMTD(cur, account=account, username=username)
data['actions YTD'] = kam.actionsYTD(cur, account=account, username=username)
today = str(datetime.datetime.now()).split(" ")[0]
firstday = str(datetime.date(datetime.datetime.now().year, 1, 1))
wd = np.busday_count(firstday, today) * 1.0
data['actions YTD date ratio'] = round(data['actions YTD'] / wd, 2)
except (
NameError, TypeError,
KeyError, ValueError,
AttributeError, IndexError
) as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
if account == 'all':
data['plans per account'] = "{}"
data['actions per account'] = "{}"
data['activity goals'] = 0
data['total sales plans'] = 0
data['total plan goals'] = 0
data['actions per day'] = 0
data['actions per month'] = 0
data['actions per year'] = 0
data['goals per quarter'] = 0
data['total calls goal'] = 0
data['total visits goal'] = 0
data['total offers goal'] = 0
data['actions this month'] = 0
data['actions QTD'] = 0
data['actions MTD'] = 0
data['actions YTD'] = 0
data['actions YTD date ratio'] = 0
raise
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
try:
mysql_connection = connection.MySQLConnection(dbname_tasks)
con = mysql_connection.connect()
cur = con.cursor()
today = datetime.datetime.now()
# list of all accounts
if account == 'all':
data['accounts'] = kam.accounts(cur, username)
# active accounts and sales in the last 3 months
data['active accounts'] = kam.activeAccounts(cur, username=username)
hoy = datetime.datetime.now()
_tmb = datetime.datetime(year=hoy.year, month=hoy.month, day=hoy.day)
try:
data['active accounts growth'] = 100. * (len(
data['active accounts'].keys()
) / len(kam.activeAccounts(
cur, account, username, today=_tmb
).keys()) - 1)
except:
data['active accounts growth'] = 0
data['lost accounts'] = [
a for a in data['accounts'] if a not in data['active accounts'].keys()
]
try:
data['actions-accounts ratio'] = round(
float(data['actions YTD']) / len(data['accounts']), 2
)
except:
data['actions-accounts ratio'] = 0.0
try:
data['actions-active accounts ratio'] = round(
float(data['actions YTD']) / len(
data['active accounts'].keys()
), 2
)
except:
data['actions-active accounts ratio'] = 0.0
try:
data['penetration ratio'] = round(
100 * float(len(
data['active accounts'].keys()
)) / len(data['accounts']), 2
)
except:
data['penetration ratio'] = 0.0
try:
data['sales YTD'] = round(
sales.salesYTD(
cur, account=account, username=username
), 2
)
except:
data['sales YTD'] = 0.0
try:
data['margin YTD'] = round(
sales.salesYTD(
cur, param='margin', account=account,
username=username), 2
)
except:
data['margin YTD'] = 0.0
try:
data['sales QTD'] = round(
sales.salesQTD(
cur, year=today.year, account=account,
username=username), 2
)
except:
data['sales QTD'] = 0.0
try:
data['margin QTD'] = round(
sales.salesQTD(
cur, param='margin', year=today.year,
account=account, username=username
), 2
)
except:
data['margin QTD'] = 0.0
try:
data['sales MTD'] = round(
sales.salesMTD(
cur, account=account, username=username
), 2
)
except:
data['sales MTD'] = 0.0
data['sales per quarter'] = sales.salesPerQuarter(
cur, param='price', year=today.year,
account=account, username=username
)
data['margin per quarter'] = sales.salesPerQuarter(
cur, param='margin', year=today.year,
account=account, username=username
)
data['monthly sales'] = multiparam.monthlyParam(
cur, param='price', year=today.year,
account=account, username=username
)
data['monthly sales last year'] = multiparam.monthlyParam(
cur, param='price', year=today.year - 1,
account=account, username=username
)
data['monthly margin'] = multiparam.monthlyParam(
cur, param='margin', year=today.year,
account=account, username=username
)
data['monthly margin last year'] = multiparam.monthlyParam(
cur, param='margin', year=today.year - 1,
account=account, username=username
)
s = 0
for d in data['monthly sales last year']:
s += d['sales']
data['sales last year'] = round(s, 2)
try:
data['sales growth YTD'] = round(
100 * data['sales YTD'] / data['sales last year'], 0
)
except:
data['sales growth YTD'] = 0.0
s = 0
if today.month > 1:
try:
data['sales growth month'] = round(
data['monthly sales'][today.month] /
data['monthly sales'][today.month - 1], 2
)
except:
data['sales growth month'] = 0.0
else:
for l in data['monthly sales last year']:
if l['month'] == 12:
sb = l['sales']
for l in data['monthly sales']:
if l['month'] == 12:
cs = l['sales']
try:
data['sales growth month'] = round(cs / sb, 2)
except:
data['sales growth month'] = 0.0
s = 0
for d in data['monthly margin last year']:
s += d['margin']
data['margin last year'] = round(s, 2)
try:
data['margin growth YTD'] = round(
100 * data['margin YTD'] / data['margin last year'], 0
)
except:
data['margin growth YTD'] = 0.0
s = 0
if today.month > 1:
try:
data['margin growth month'] = round(
data['monthly margin'][today.month] /
data['monthly margin'][today.month - 1], 2
)
except:
data['margin growth month'] = 0.0
else:
for l in data['monthly margin last year']:
if l['month'] == 12:
sb = l['margin']
for l in data['monthly margin']:
if l['month'] == 12:
cs = l['margin']
try:
data['margin growth month'] = round(cs / sb, 2)
except:
data['margin growth month'] = 0.0
# SALES
currentQuarter = (today.month - 1) // 3 + 1
salesCurrentQuarter = data['sales per quarter'][currentQuarter]
if currentQuarter == 1:
salesLastQuarter = round(
sales.salesPerQuarter(
cur, year=today.year - 1, param='price',
account=account, username=username)[4], 2
)
else:
salesLastQuarter = round(
data['sales per quarter'][currentQuarter - 1], 2
)
try:
data['sales growth QTD'] = round(
100 * salesCurrentQuarter / salesLastQuarter, 2
)
except:
data['sales growth QTD'] = 0.0
# MARGIN
currentQuarter = (today.month - 1) // 3 + 1
marginCurrentQuarter = data['margin per quarter'][currentQuarter]
if currentQuarter == 1:
marginLastQuarter = round(
sales.salesPerQuarter(cur, year=today.year - 1,
param='margin', account=account,
username=username)[4], 2
)
else:
marginLastQuarter = round(
data['margin per quarter'][currentQuarter - 1], 2
)
try:
data['margin growth QTD'] = round(
100 * marginCurrentQuarter / marginLastQuarter, 2
)
except:
data['margin growth QTD'] = 0.0
# PIPELINE
data['pipelines'] = sales.pipelines()
except mysql.Error:
# data = {}
data['sales YTD'] = 0
data['margin YTD'] = 0
data['sales QTD'] = 0
data['margin QTD'] = 0
data['sales MTD'] = 0
data['sales per quarter'] = 0
data['margin per quarter'] = 0
data['monthly sales'] = 0
data['monthly sales last year'] = 0
data['monthly margin'] = 0
data['monthly margin last year'] = 0
data['sales last year'] = 0
data['sales growth YTD'] = 0.0
data['sales growth month'] = 0.0
data['margin last year'] = 0.0
data['margin growth YTD'] = 0.0
data['margin growth month'] = 0.0
data['sales growth QTD'] = 0.0
data['margin growth QTD'] = 0.0
data['pipelines'] = 0.0
# re-raise only after resetting the defaults, mirroring get_insights_crm
raise
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
return data
def get_insights_crm(dbname, account='all', raw=False, local=False,
dbusername='', passwd='', username='', user=''):
''' Read the results database, manipulate the data, and return it.
'''
dbname = 'data_{}'.format(dbname)
dbname_results = dbname
dbname_tasks = dbname.replace('tasks', 'data')
data = dict()
try:
mysql_connection = connection.MySQLConnection(dbname_results)
con = mysql_connection.connect()
cur = con.cursor()
if account == 'all':
data['plans_per_account'] = kam.plansPerAccount(
cur, username=username
)
data['actions_per_account'] = kam.actionsPerAccount(
cur, username=username
)
data['activity_goals'] = kam.activityGoals(
cur, account=account, username=username
)
data['total_sales_plans'] = kam.totalSalesPlans(
cur, account=account, username=username
)
data['total_plan_goals'] = kam.totalPlanGoals(
cur, account=account, username=username
)
data['actions_per_day'] = kam.actionsPerDay(
cur, account=account, username=username
)
data['actions_per_month'] = kam.actionsPerMonth(
cur, account=account, username=username
)
data['actions_per_year'] = kam.actionsPerYear(
cur, account=account, username=username
)
data['goals_per_quarter'] = kam.goalsPerQuarter(
cur, account=account, username=username
)
data['total_calls_goal'] = kam.totalCallsGoal(
cur, account=account, username=username
)
data['total_visits_goal'] = kam.totalVisitsGoal(
cur, account=account, username=username
)
data['total_offers_goal'] = kam.totalOffersGoal(
cur, account=account, username=username
)
month = str(datetime.datetime.now().month)
try:
data['actions_this_month'] = data['actions_per_month'][month]
except:
data['actions_this_month'] = 0
data['actions_QTD'] = kam.actionsQTD(
cur, account=account, username=username
)
data['actions_MTD'] = kam.actionsMTD(
cur, account=account, username=username
)
data['actions_YTD'] = kam.actionsYTD(
cur, account=account, username=username
)
today = str(datetime.datetime.now()).split(" ")[0]
firstday = str(datetime.date(datetime.datetime.now().year, 1, 1))
wd = np.busday_count(firstday, today) * 1.0
data['actions_YTD_date_ratio'] = round(data['actions_YTD'] / wd, 2)
except Exception as exception:
if account == 'all':
data['plans_per_account'] = "{}"
data['actions_per_account'] = "{}"
data['activity_goals'] = 0
data['total_sales_plans'] = 0
data['total_plan_goals'] = 0
data['actions_per_day'] = 0
data['actions_per_month'] = 0
data['actions_per_year'] = 0
data['goals_per_quarter'] = 0
data['total_calls_goal'] = 0
data['total_visits_goal'] = 0
data['total_offers_goal'] = 0
data['actions_this_month'] = 0
data['actions_QTD'] = 0
data['actions_MTD'] = 0
data['actions_YTD'] = 0
data['actions_YTD_date_ratio'] = 0
raise
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
try:
mysql_connection = connection.MySQLConnection(dbname_tasks)
con = mysql_connection.connect()
cur = con.cursor()
today = datetime.datetime.now()
# list of all accounts
if account == 'all':
data['accounts'] = kam.accounts(cur, username)
data['accounts_name'] = kam.accounts_name(cur, username)
# active accounts and sales in the last 3 months
data['active_accounts'] = kam.activeAccountsCRM(cur, username=username)
hoy = datetime.datetime.now()
_tmb = datetime.datetime(year=hoy.year, month=hoy.month, day=hoy.day)
try:
data['active_accounts_growth'] = 100. * (len(data['active_accounts'].keys()) / len(
kam.activeAccounts(cur, account, username, today=_tmb).keys()) - 1)
except:
data['active_accounts_growth'] = 0
data['lost_accounts'] = [
a for a in data['accounts'] if a not in data['active_accounts'].keys()
]
try:
data['actions_accounts_ratio'] = round(
float(data['actions_YTD']) / len(data['accounts']), 2
)
except:
data['actions_accounts_ratio'] = 0.0
try:
data['actions_active_accounts_ratio'] = round(
float(data['actions_YTD']) / len(
data['active_accounts'].keys()
), 2
)
except:
data['actions_active_accounts_ratio'] = 0.0
try:
data['penetration_ratio'] = round(
100 * float(len(data['active_accounts'].keys())) / len(
data['accounts']), 2
)
except:
data['penetration_ratio'] = 0.0
# PIPELINE
# pp = round(sales.pipelines(cur), 2)
# try:
# pp = sales.pipelines(cur)
# except:
# pp = 0
data['pipelines'] = data['actions_YTD_date_ratio']
months = (
'january', 'february', 'march', 'april', 'may', 'june', 'july',
'august', 'september', 'october', 'november', 'december'
)
data['goals'] = goals.get_goals(con, groupby='year', orient='list')
cur = con.cursor()
if today.year - 1 not in data['goals'].keys():
goals.create_goal(cur, {'year': today.year - 1}, user=user)
if today.year not in data['goals'].keys():
goals.create_goal(cur, {'year': today.year}, user=user)
if today.year + 1 not in data['goals'].keys():
goals.create_goal(cur, {'year': today.year + 1}, user=user)
if today.year + 2 not in data['goals'].keys():
goals.create_goal(cur, {'year': today.year + 2}, user=user)
data['goals'] = goals.get_goals(con, groupby='year', orient='list')
data['total_goals'] = {}
for k in data['goals'].keys():
data['total_goals'][k] = {}
for l in months:
data['total_goals'][k][l] = np.sum(data['goals'][k][l])
data['total_goals'][k]['id'] = data['goals'][k]['id'][0]
except Exception as e:
# print("Error {0}: {1}".format(e.args[0], e.args[1]))
# data = {}
data['sales YTD'] = 0
data['margin YTD'] = 0
data['sales QTD'] = 0
data['margin QTD'] = 0
data['sales MTD'] = 0
data['sales per quarter'] = 0
data['margin per quarter'] = 0
data['monthly sales'] = 0
data['monthly sales last year'] = 0
data['monthly margin'] = 0
data['monthly margin last year'] = 0
data['sales last year'] = 0
data['sales growth YTD'] = 0.0
data['sales growth month'] = 0.0
data['margin last year'] = 0.0
data['margin growth YTD'] = 0.0
data['margin growth month'] = 0.0
data['sales growth QTD'] = 0.0
data['margin growth QTD'] = 0.0
data['pipelines'] = 0.0
raise
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
return data
def get_insights(dbname, account='all', raw=False, local=False,
dbusername='', passwd='', username='', user=''):
''' Read the results database, manipulate the data, and return it.
'''
dbname = 'data_{}'.format(dbname)
dbname_results = dbname
dbname_tasks = dbname.replace('tasks', 'data')
# username = ''
# passwd = ''
data = dict()
# try:
# account = account.decode('utf-8')
# except:
# pass
# account = account.encode('latin-1')
try:
mysql_connection = connection.MySQLConnection(dbname_results)
con = mysql_connection.connect()
cur = con.cursor()
if account == 'all':
data['plans_per_account'] = kam.plansPerAccount(
cur, username=username
)
data['actions_per_account'] = kam.actionsPerAccount(
cur, username=username
)
data['activity_goals'] = kam.activityGoals(
cur, account=account, username=username
)
data['total_sales_plans'] = kam.totalSalesPlans(
cur, account=account, username=username
)
data['total_plan_goals'] = kam.totalPlanGoals(
cur, account=account, username=username
)
data['actions_per_day'] = kam.actionsPerDay(
cur, account=account, username=username
)
data['actions_per_month'] = kam.actionsPerMonth(
cur, account=account, username=username
)
data['actions_per_year'] = kam.actionsPerYear(
cur, account=account, username=username
)
data['goals_per_quarter'] = kam.goalsPerQuarter(
cur, account=account, username=username
)
data['total_calls_goal'] = kam.totalCallsGoal(
cur, account=account, username=username
)
data['total_visits_goal'] = kam.totalVisitsGoal(
cur, account=account, username=username
)
data['total_offers_goal'] = kam.totalOffersGoal(
cur, account=account, username=username
)
month = str(datetime.datetime.now().month)
try:
data['actions_this_month'] = data['actions_per_month'][month]
except:
data['actions_this_month'] = 0
data['actions_QTD'] = kam.actionsQTD(
cur, account=account, username=username
)
data['actions_MTD'] = kam.actionsMTD(
cur, account=account, username=username
)
data['actions_YTD'] = kam.actionsYTD(
cur, account=account, username=username
)
today = str(datetime.datetime.now()).split(" ")[0]
firstday = str(datetime.date(datetime.datetime.now().year, 1, 1))
wd = np.busday_count(firstday, today) * 1.0
data['actions_YTD_date_ratio'] = round(data['actions_YTD'] / wd, 2)
except Exception as e:
# print("Error {0}: {1}".format(e.args[0], e.args[1]))
if account == 'all':
data['plans_per_account'] = "{}"
data['actions_per_account'] = "{}"
data['activity_goals'] = 0
data['total_sales_plans'] = 0
data['total_plan_goals'] = 0
data['actions_per_day'] = 0
data['actions_per_month'] = 0
data['actions_per_year'] = 0
data['goals_per_quarter'] = 0
data['total_calls_goal'] = 0
data['total_visits_goal'] = 0
data['total_offers_goal'] = 0
data['actions_this_month'] = 0
data['actions_QTD'] = 0
data['actions_MTD'] = 0
data['actions_YTD'] = 0
data['actions_YTD_date_ratio'] = 0
raise
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s",
exception,
extra={
'type': 'Login'
}
)
try:
mysql_connection = connection.MySQLConnection(dbname_tasks)
con = mysql_connection.connect()
cur = con.cursor()
today = datetime.datetime.now()
# list of all accounts
if account == 'all':
data['accounts'] = kam.accounts(cur, username)
data['accounts_name'] = kam.accounts_name(cur, username)
# active accounts and sales in the last 3 months
data['active_accounts'] = kam.activeAccounts(cur, username=username)
hoy = datetime.datetime.now()
_tmb = datetime.datetime(year=hoy.year, month=hoy.month, day=hoy.day)
try:
data['active_accounts_growth'] = 100. * (len(data['active_accounts'].keys()) / len(
kam.activeAccounts(cur, account, username, today=_tmb).keys()) - 1)
except:
data['active_accounts_growth'] = 0
data['lost_accounts'] = [
a for a in data['accounts'] if a not in data['active_accounts'].keys()
]
try:
data['actions_accounts_ratio'] = round(
float(data['actions_YTD']) / len(
data['accounts']), 2
)
except:
data['actions_accounts_ratio'] = 0.0
try:
data['actions_active_accounts_ratio'] = round(
float(data['actions_YTD']) / len(
data['active_accounts'].keys()
), 2
)
except:
data['actions_active_accounts_ratio'] = 0.0
try:
data['penetration_ratio'] = round(
100 * float(len(
data['active_accounts'].keys()
)) / len(data['accounts']), 2
)
except:
data['penetration_ratio'] = 0.0
try:
data['sales_YTD'] = sales.salesYTD(
cur, account=account, username=username
)
if data['sales_YTD'] is None:
data['sales_YTD'] = 0.0
except:
data['sales_YTD'] = 0.0
# print(data['sales_YTD'])
try:
data['margin_YTD'] = sales.salesYTD(
cur, param='margin', account=account, username=username
)
if data['margin_YTD'] is None:
data['margin_YTD'] = 0.0
except:
data['margin_YTD'] = 0.0
try:
data['sales_QTD'] = round(
sales.salesQTD(
cur, year=today.year, account=account, username=username
), 2
)
except:
data['sales_QTD'] = 0.0
try:
data['margin_QTD'] = round(
sales.salesQTD(
cur, param='margin', year=today.year, account=account,
username=username
), 2
)
except:
data['margin_QTD'] = 0.0
try:
data['sales_MTD'] = round(
sales.salesMTD(cur, account=account, username=username), 2
)
except:
data['sales_MTD'] = 0.0
# data['sales per quarter'] = sales.salesPerQuarter(
# cur, param='price', year=today.year,
# account=account, username=username
# )
# data['margin per quarter'] = sales.salesPerQuarter(
# cur, param='margin', year=today.year,
# account=account, username=username
# )
#
# data['monthly sales'] = multiparam.monthlyParam(
# cur, param='price', year=today.year,
# account=account, username=username
# )
# data['monthly sales last year'] = multiparam.monthlyParam(
# cur, param='price', year=today.year-1,
# account=account, username=username
# )
#
# data['monthly margin'] = multiparam.monthlyParam(
# cur, param='margin', year=today.year,
# account=account, username=username
# )
# data['monthly margin last year'] = multiparam.monthlyParam(
# cur, param='margin', year=today.year-1,
# account=account, username=username
# )
data['values_per_month'] = multiparam.values_per_month(
cur, param='margin', year=today.year,
account=account, username=username
)
data['values_per_quarter'] = sales.values_per_quarter(
cur, param='price', year=today.year,
account=account, username=username
)
s = 0
for d in data['values_per_month'][today.year - 1]['sales']:
s += d
data['sales_last_year'] = round(s, 2)
s = 0
for d in data['values_per_month'][today.year]['sales']:
s += d
data['sales_current_year'] = round(s, 2)
try:
data['sales_growth_YTD'] = round(
100 * data['sales_current_year'] /
data['sales_last_year'], 0
)
except:
data['sales_growth_YTD'] = 0.0
# raise
s = 0
if today.month > 1:
try:
data['sales growth month'] = round(
data['monthly sales'][today.month] /
data['monthly sales'][today.month - 1], 2
)
except:
data['sales growth month'] = 0.0
else:
try:
for l in data['monthly sales last year']:
if l['month'] == 12:
sb = l['sales']
for l in data['monthly sales']:
if l['month'] == 12:
cs = l['sales']
data['sales growth month'] = round(cs / sb, 2)
except:
data['sales growth month'] = 0.0
data['sales per quarter'] = sales.salesPerQuarter(
cur, param='price', year=today.year,
account=account, username=username
)
data['sales per quarter last year'] = sales.salesPerQuarter(
cur, param='price', year=today.year-1,
account=account, username=username
)
data['margin per quarter'] = sales.salesPerQuarter(
cur, param='margin', year=today.year,
account=account, username=username
)
data['margin per quarter last year'] = sales.salesPerQuarter(
cur, param='margin', year=today.year-1,
account=account, username=username
)
data['monthly sales'] = multiparam.monthlyParam(
cur, param='price', year=today.year,
account=account, username=username
)
data['monthly sales last year'] = multiparam.monthlyParam(
cur, param='price', year=today.year - 1,
account=account, username=username
)
data['monthly margin'] = multiparam.monthlyParam(
cur, param='margin', year=today.year,
account=account, username=username
)
data['monthly margin last year'] = multiparam.monthlyParam(
cur, param='margin', year=today.year - 1,
account=account, username=username
)
s = 0
for d in data['monthly margin last year']:
s += d['margin']
data['margin last year'] = round(s, 2)
s = 0
for d in data['values_per_month'][today.year - 1]['margin']:
# s += d['sales']
s += d
data['margin_last_year'] = round(s, 2)
# try:
# data['margin_growth_YTD'] = 100 * data['margin_YTD'] / data['margin_last_year']
# except:
# data['margin_growth_YTD'] = 0.0
# raise
s = 0
if today.month > 1:
try:
data['margin_growth_month'] = round(
data['monthly margin'][today.month] /
data['monthly margin'][today.month - 1], 2
)
except:
data['margin_growth_month'] = 0.0
else:
for l in data['monthly margin last year']:
if l['month'] == 12:
sb = l['margin']
for l in data['monthly margin']:
if l['month'] == 12:
cs = l['margin']
try:
data['margin_growth_month'] = round(cs / sb, 2)
except:
data['margin_growth_month'] = 0.0
# SALES
currentQuarter = (today.month - 1) // 3 + 1
data['sales_current_quarter'] = data[
'values_per_quarter'][today.year]['sales'][currentQuarter - 1]
if currentQuarter == 1:
data['sales_last_quarter'] = data[
'values_per_quarter'
][today.year - 1]['sales'][3]
else:
data['sales_last_quarter'] = data[
'values_per_quarter'
][today.year]['sales'][currentQuarter - 2]
try:
data['sales_growth_QTD'] = round(
100 * data['sales_current_quarter'] / data['sales_last_quarter'], 2
)
except:
data['sales_growth_QTD'] = 0.0
# MARGIN
data['margin_current_quarter'] = data[
'values_per_quarter'][today.year]['margin'][currentQuarter - 1]
if currentQuarter == 1:
data['margin_last_quarter'] = data['values_per_quarter'][today.year - 1]['margin'][3]
else:
data['margin_last_quarter'] = data[
'values_per_quarter'
][today.year]['margin'][currentQuarter - 2]
try:
data['margin_growth_QTD'] = round(
100 * data['margin_current_quarter'] /
data['margin_last_quarter'], 2
)
except:
data['margin_growth_QTD'] = 0.0
# PIPELINE
# pp = round(sales.pipelines(cur), 2)
try:
pp = sales.pipelines(cur)
except:
pp = 0
data['pipelines'] = pp
data['kpi'] = {}
data['kpi']['pipelines'] = pp
if account == 'all':
min_sales = sales.min_values(cur, username=username)
max_sales = sales.max_values(cur, username=username)
average_sales = sales.average_values(cur, username=username)
min_results = results.min_values(cur, username=username)
max_results = results.max_values(cur, username=username)
average_results = results.average_values(cur, username=username)
counts_ppb, counts_risk = results.count_ppb_risk(con, username=username)
data['counts_ppb'] = counts_ppb
data['counts_risk'] = counts_risk
data['max_values'] = {}
data['max_values'].update(max_sales)
data['max_values'].update(max_results)
data['min_values'] = {}
data['min_values'].update(min_sales)
data['min_values'].update(min_results)
data['average_values'] = {}
data['average_values'].update(average_sales)
data['average_values'].update(average_results)
data['normalized_average_values'] = {}
# ********* Fix Me ***********
# for k in data['max_values'].keys():
# data['normalized_average_values'][k] = data['average_values'][k] /
# data['max_values'][k]
months = (
'january', 'february', 'march', 'april', 'may', 'june',
'july', 'august', 'september', 'october', 'november', 'december'
)
data['goals'] = goals.get_goals(con, groupby='year', orient='list')
if today.year - 1 not in data['goals'].keys():
try:
goals.create_goal(con, {'year': today.year - 1}, user=user)
except:
pass
if today.year not in data['goals'].keys():
try:
goals.create_goal(con, {'year': today.year}, user=user)
except:
pass
if today.year + 1 not in data['goals'].keys():
try:
goals.create_goal(con, {'year': today.year + 1}, user=user)
except:
pass
if today.year + 2 not in data['goals'].keys():
try:
goals.create_goal(con, {'year': today.year + 2}, user=user)
except:
pass
data['goals'] = goals.get_goals(con, groupby='year', orient='list')
data['total_goals'] = {}
for k in data['goals'].keys():
data['total_goals'][k] = {}
for l in months:
data['total_goals'][k][l] = np.sum(data['goals'][k][l])
data['total_goals'][k]['id'] = data['goals'][k]['id'][0]
except Exception as exception:
data['sales YTD'] = 0
data['margin YTD'] = 0
data['sales QTD'] = 0
data['margin QTD'] = 0
data['sales MTD'] = 0
data['sales per quarter'] = 0
data['margin per quarter'] = 0
data['monthly sales'] = 0
data['monthly sales last year'] = 0
data['monthly margin'] = 0
data['monthly margin last year'] = 0
data['sales last year'] = 0
data['sales growth YTD'] = 0.0
data['sales growth month'] = 0.0
data['margin last year'] = 0.0
data['margin growth YTD'] = 0.0
data['margin growth month'] = 0.0
data['sales growth QTD'] = 0.0
data['margin growth QTD'] = 0.0
data['pipelines'] = 0.0
# raise
data["message"] = exception
data['traceback'] = traceback.format_exc()
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
return data
def getPerformance(username, account='all', raw=False, local=False, dbusername='', passwd=''):
''' Reads the results database, manipulates the data and returns it.
'''
dbname = 'data_{}'.format(username)
dbname_results = dbname
dbname_tasks = dbname.replace('tasks', 'data')
# username = ''
# passwd = ''
data = dict()
try:
account = account.decode('utf-8')
except:
pass
account = account.encode('latin-1')
try:
mysql_connection = connection.MySQLConnection(dbname_results)
con = mysql_connection.connect()
cur = con.cursor()
if account == 'all':
data['plans per account'] = kam.plansPerAccount(cur)
data['actions per account'] = kam.actionsPerAccount(cur)
data['activity goals'] = kam.activityGoals(cur, account=account)
data['total sales plans'] = kam.totalSalesPlans(cur, account=account)
data['total plan goals'] = kam.totalPlanGoals(cur, account=account)
data['actions per day'] = kam.actionsPerDay(cur, account=account)
data['actions per month'] = kam.actionsPerMonth(cur, account=account)
data['actions per year'] = kam.actionsPerYear(cur, account=account)
data['goals per quarter'] = kam.goalsPerQuarter(cur, account=account)
data['total calls goal'] = kam.totalCallsGoal(cur, account=account)
data['total visits goal'] = kam.totalVisitsGoal(cur, account=account)
data['total offers goal'] = kam.totalOffersGoal(cur, account=account)
month = str(datetime.datetime.now().month)
try:
data['actions this month'] = data['actions per month'][month]
except:
data['actions this month'] = 0
data['actions QTD'] = kam.actionsQTD(cur, account=account)
data['actions MTD'] = kam.actionsMTD(cur, account=account)
data['actions YTD'] = kam.actionsYTD(cur, account=account)
today = str(datetime.datetime.now()).split(" ")[0]
firstday = str(datetime.date(datetime.datetime.now().year, 1, 1))
wd = np.busday_count(firstday, today) * 1.0
data['actions YTD date ratio'] = round(data['actions YTD'] / wd, 2)
except:
if account == 'all':
data['plans per account'] = "{}"
data['actions per account'] = "{}"
data['activity goals'] = 0
data['total sales plans'] = 0
data['total plan goals'] = 0
data['actions per day'] = 0
data['actions per month'] = 0
data['actions per year'] = 0
data['goals per quarter'] = 0
data['total calls goal'] = 0
data['total visits goal'] = 0
data['total offers goal'] = 0
data['actions this month'] = 0
data['actions QTD'] = 0
data['actions MTD'] = 0
data['actions YTD'] = 0
data['actions YTD date ratio'] = 0
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
try:
mysql_connection = connection.MySQLConnection(dbname_tasks)
con = mysql_connection.connect()
cur = con.cursor()
today = datetime.datetime.now()
# list of all accounts
if account == 'all':
data['accounts'] = kam.accounts(cur)
# active accounts and sales in the last 3 months
data['active accounts'] = kam.activeAccounts(cur)
hoy = datetime.datetime.now()
_tmb = datetime.datetime(year=hoy.year, month=hoy.month, day=hoy.day)
try:
data['active accounts growth'] = 100. * (len(
data['active accounts'].keys()
) / len(
kam.activeAccounts(cur, today=_tmb).keys()
) - 1)
except:
data['active accounts growth'] = 0
data['lost accounts'] = [
a for a in data['accounts'] if a not in data['active accounts'].keys()
]
try:
data['actions-accounts ratio'] = round(
float(data['actions YTD']) / len(
data['accounts']
), 2
)
except:
data['actions-accounts ratio'] = 0.0
try:
data['actions-active accounts ratio'] = round(
float(data['actions YTD']) / len(
data['active accounts'].keys()
), 2
)
except:
data['actions-active accounts ratio'] = 0.0
try:
data['penetration ratio'] = round(
100 * float(
len(
data['active accounts'].keys())
) / len(data['accounts']), 2
)
except:
data['penetration ratio'] = 0.0
data['sales YTD'] = round(sales.salesYTD(cur, account=account), 2)
data['margin YTD'] = round(
sales.salesYTD(
cur, param='margin', account=account
), 2
)
data['sales QTD'] = round(
sales.salesQTD(
cur, year=today.year, account=account
), 2
)
data['margin QTD'] = round(
sales.salesQTD(
cur, param='margin', year=today.year, account=account
), 2
)
data['sales MTD'] = round(sales.salesMTD(cur, account=account), 2)
data['sales per quarter'] = sales.salesPerQuarter(
cur, param='price', year=today.year, account=account
)
data['margin per quarter'] = sales.salesPerQuarter(
cur, param='margin', year=today.year, account=account
)
data['monthly sales'] = multiparam.monthlyParam(
cur, param='price', year=today.year, account=account
)
data['monthly sales last year'] = multiparam.monthlyParam(
cur, param='price', year=today.year - 1, account=account
)
data['monthly margin'] = multiparam.monthlyParam(
cur, param='margin', year=today.year, account=account
)
data['monthly margin last year'] = multiparam.monthlyParam(
cur, param='margin', year=today.year - 1, account=account
)
s = 0
for d in data['monthly sales last year']:
s += d['sales']
data['sales last year'] = round(s, 2)
try:
data['sales growth YTD'] = round(
100 * data['sales YTD'] / data['sales last year'], 0
)
except:
data['sales growth YTD'] = 0.0
s = 0
if today.month > 1:
try:
data['sales growth month'] = round(
data['monthly sales'][today.month] /
data['monthly sales'][today.month - 1], 2
)
except:
data['sales growth month'] = 0.0
else:
for l in data['monthly sales last year']:
if l['month'] == 12:
sb = l['sales']
for l in data['monthly sales']:
if l['month'] == 12:
cs = l['sales']
try:
data['sales growth month'] = round(cs / sb, 2)
except:
data['sales growth month'] = 0.0
s = 0
for d in data['monthly margin last year']:
s += d['margin']
data['margin last year'] = round(s, 2)
try:
data['margin growth YTD'] = round(
100 * data['margin YTD'] / data['margin last year'], 0
)
except:
data['margin growth YTD'] = 0.0
s = 0
if today.month > 1:
try:
data['margin growth month'] = round(
data['monthly margin'][today.month] /
data['monthly margin'][today.month - 1], 2
)
except:
data['margin growth month'] = 0.0
else:
for l in data['monthly margin last year']:
if l['month'] == 12:
sb = l['margin']
for l in data['monthly margin']:
if l['month'] == 12:
cs = l['margin']
try:
data['margin growth month'] = round(cs / sb, 2)
except:
data['margin growth month'] = 0.0
# SALES
currentQuarter = (today.month - 1) // 3 + 1
salesCurrentQuarter = data['sales per quarter'][currentQuarter]
if currentQuarter == 1:
salesLastQuarter = round(
sales.salesPerQuarter(
cur, year=today.year - 1, param='price', account=account
)[4], 2
)
else:
salesLastQuarter = round(
data['sales per quarter'][currentQuarter - 1], 2
)
try:
data['sales growth QTD'] = round(
100 * salesCurrentQuarter / salesLastQuarter, 2
)
except:
data['sales growth QTD'] = 0.0
# MARGIN
currentQuarter = (today.month - 1) // 3 + 1
marginCurrentQuarter = data['margin per quarter'][currentQuarter]
if currentQuarter == 1:
marginLastQuarter = round(
sales.salesPerQuarter(
cur, year=today.year - 1, param='margin',
account=account)[4], 2
)
else:
marginLastQuarter = round(
data['margin per quarter'][currentQuarter - 1], 2
)
try:
data['margin growth QTD'] = round(
100 * marginCurrentQuarter / marginLastQuarter, 2
)
except:
data['margin growth QTD'] = 0.0
# PIPELINE
# data['pipelines'] = sales._pipelines()
data['pipelines'] = sales.pipelines(cur, username)
except:
data['sales YTD'] = 0
data['margin YTD'] = 0
data['sales QTD'] = 0
data['margin QTD'] = 0
data['sales MTD'] = 0
data['sales per quarter'] = 0
data['margin per quarter'] = 0
data['monthly sales'] = 0
data['monthly sales last year'] = 0
data['monthly margin'] = 0
data['monthly margin last year'] = 0
data['sales last year'] = 0
data['sales growth YTD'] = 0.0
data['sales growth month'] = 0.0
data['margin last year'] = 0.0
data['margin growth YTD'] = 0.0
data['margin growth month'] = 0.0
data['sales growth QTD'] = 0.0
data['margin growth QTD'] = 0.0
data['pipelines'] = 0.0
finally:
try:
con.close()
except Exception as exception:
logger.error(
"message %s, error %s",
exception,
traceback.format_exc(),
extra={
'type': 'Login'
}
)
return data
if __name__ == "__main__":
local = True
# Tasks database name
dbname = ''
username = 'martinmasip'
dbname = 'data_{}_data_test_2015_2016_copy_4_xlsx'.format(username)
dbname = '{}_data_test_2015_2016_copy_4_xlsx'.format(username)
passwd = 'Qymatix!!!'
dbusername = 'webadmin'
dbusername = 'webuser'
username = 'martin_masip'
username = 'qymatix_best'
dbname = username
# data = getInsights(dbname=dbname, local=local, account='Acrion',
# username=dbusername, passwd=passwd
# )
account = u'Krankenhaus Hetzelstift Neustadt/Weinstrasse'
account = 'St\xe4dtisches Klinikum Karlsruhe gGmbH'.decode('latin-1')
account = 'St\xe4dtisches Klinikum Karlsruhe gGmbH'
account = 'Klinikum Wolfsburg'
account = 'all'
# dbname = 'coldjet_qy'
# username = 'robert_gruen'
dbname = 'qy___test_com'
username = 'ep__mtm___ne_de'
# dbname = 'mtm___ne_de'
dbname = 'qymatix_best'
# username = 'chancho_babe__qymatix_best'
username = 'qymatix__aet_at'
account = 'all'
account = 852
# account = 1
username = 'martin_masip__qymatix_de'
dbname = 'qymatix_de'
username = 'admin'
dbname = 'aet_at'
# data = getInsights(dbname=dbname, local=False, account=account,
# username=dbusername, passwd=passwd
# )
# data = getInsights(dbname=dbname, local=False, account=account,
# dbusername='webuser', username=username, passwd=passwd
# )
data = get_insights(
dbname=dbname, local=False, account=account,
dbusername='webuser', username=username,
passwd=passwd
)
# data = get_insights_crm(dbname=dbname, local=False, account=account,
# dbusername='webuser', username=username, passwd=passwd
# )
# print(data['monthly sales'])
# print(json.dumps(data))
# data = json.dumps(data, encoding='latin-1')
# print(data['sales per quarter'])
# print(data['monthly sales'])
# print(data['values_per_month'])
# print(data['values_per_quarter'])
# print(data['sales YTD'])
# print(data['pipelines'])
# data = json.dumps(data, encoding='latin-1')
# print(data)
# dbname = 'demo'
# data = getPerformance(username=dbname, local=local, account='all',
# dbusername=dbusername, passwd=passwd
# )
# print(data)
# print(json.dumps(data))
| 33.861006 | 99 | 0.516503 | 5,587 | 53,839 | 4.891713 | 0.052264 | 0.027442 | 0.059568 | 0.080132 | 0.904281 | 0.886206 | 0.874973 | 0.868203 | 0.852689 | 0.813136 | 0 | 0.017177 | 0.357696 | 53,839 | 1,589 | 100 | 33.882316 | 0.773141 | 0.075893 | 0 | 0.720706 | 0 | 0 | 0.189799 | 0.012302 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00321 | false | 0.008828 | 0.007223 | 0 | 0.013644 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bc47dfc85c8eae8a67ff9c980cba9e447e7b9cac | 3,557 | py | Python | src/frr/tests/topotests/bgp_rfapi_basic_sanity/scripts/add_routes.py | zhouhaifeng/vpe | 9c644ffd561988e5740021ed26e0f7739844353d | [
"Apache-2.0"
] | null | null | null | src/frr/tests/topotests/bgp_rfapi_basic_sanity/scripts/add_routes.py | zhouhaifeng/vpe | 9c644ffd561988e5740021ed26e0f7739844353d | [
"Apache-2.0"
] | null | null | null | src/frr/tests/topotests/bgp_rfapi_basic_sanity/scripts/add_routes.py | zhouhaifeng/vpe | 9c644ffd561988e5740021ed26e0f7739844353d | [
"Apache-2.0"
] | null | null | null | from lib.lutil import luCommand
holddownFactorSet = luCommand(
"r1",
'vtysh -c "show running"',
"rfp holddown-factor",
"none",
"Holddown factor set",
)
if not holddownFactorSet:
to = "-1"
cost = ""
else:
to = "6"
cost = "cost 50"
luCommand(
"r1",
'vtysh -c "debug rfapi-dev open vn 10.0.0.1 un 1.1.1.1"',
"rfapi_set_response_cb: status 0",
"pass",
"Opened RFAPI",
)
luCommand(
"r1",
'vtysh -c "debug rfapi-dev query vn 10.0.0.1 un 1.1.1.1 target 11.11.11.11"',
"rc=2",
"pass",
"Clean query",
)
luCommand(
"r1",
'vtysh -c "debug rfapi-dev register vn 10.0.0.1 un 1.1.1.1 prefix 11.11.11.0/24 lifetime {}"'.format(
to
),
"",
"none",
"Prefix registered",
)
luCommand(
"r1",
'vtysh -c "show vnc registrations local"',
"1 out of 1",
"wait",
"Local registration",
)
luCommand("r1", 'vtysh -c "debug rfapi-dev response-omit-self off"', ".", "none")
luCommand(
"r1",
'vtysh -c "debug rfapi-dev query vn 10.0.0.1 un 1.1.1.1 target 11.11.11.11"',
"11.11.11.0/24",
"pass",
"Query self",
)
luCommand(
"r3",
'vtysh -c "debug rfapi-dev open vn 10.0.0.2 un 2.2.2.2"',
"rfapi_set_response_cb: status 0",
"pass",
"Opened RFAPI",
)
luCommand(
"r3",
'vtysh -c "debug rfapi-dev register vn 10.0.0.2 un 2.2.2.2 prefix 22.22.22.0/24 lifetime {}"'.format(
to
),
"",
"none",
"Prefix registered",
)
luCommand(
"r3",
'vtysh -c "show vnc registrations local"',
"1 out of 1",
"wait",
"Local registration",
)
luCommand("r3", 'vtysh -c "debug rfapi-dev response-omit-self on"', ".", "none")
luCommand(
"r3",
'vtysh -c "debug rfapi-dev query vn 10.0.0.2 un 2.2.2.2 target 22.22.22.22"',
"rc=2",
"pass",
"Self excluded",
)
luCommand(
"r3",
'vtysh -c "debug rfapi-dev open vn 10.0.1.2 un 2.1.1.2"',
"rfapi_set_response_cb: status 0",
"pass",
"Opened query only RFAPI",
)
luCommand(
"r3",
'vtysh -c "debug rfapi-dev query vn 10.0.1.2 un 2.1.1.2 target 22.22.22.22"',
"22.22.22.0/24",
"pass",
"See local",
)
luCommand(
"r4",
'vtysh -c "debug rfapi-dev open vn 10.0.0.3 un 3.3.3.3"',
"rfapi_set_response_cb: status 0",
"pass",
"Opened RFAPI",
)
luCommand(
"r4",
'vtysh -c "debug rfapi-dev register vn 10.0.0.3 un 3.3.3.3 prefix 33.33.33.0/24 lifetime {}"'.format(
to
),
"",
"none",
"Prefix registered",
)
luCommand(
"r4",
'vtysh -c "show vnc registrations local"',
"1 out of 1",
"wait",
"Local registration",
)
luCommand("r4", 'vtysh -c "debug rfapi-dev response-omit-self off"', ".", "none")
luCommand(
"r4",
'vtysh -c "debug rfapi-dev query vn 10.0.0.3 un 3.3.3.3 target 33.33.33.33"',
"33.33.33.0/24",
"pass",
"Query self",
)
luCommand(
"r4",
'vtysh -c "debug rfapi-dev register vn 10.0.0.3 un 3.3.3.3 prefix 11.11.11.0/24 lifetime {} {}"'.format(
to, cost
),
"",
"none",
"MP Prefix registered",
)
luCommand(
"r4",
'vtysh -c "show vnc registrations local"',
"2 out of 2",
"wait",
"Local registration",
)
luCommand(
"r4",
'vtysh -c "debug rfapi-dev query vn 10.0.0.3 un 3.3.3.3 target 11.11.11.11"',
"11.11.11.0/24",
"pass",
"Query self MP",
)
luCommand("r1", 'vtysh -c "show vnc registrations"', ".", "none")
luCommand("r3", 'vtysh -c "show vnc registrations"', ".", "none")
luCommand("r4", 'vtysh -c "show vnc registrations"', ".", "none")
| 22.23125 | 108 | 0.561428 | 564 | 3,557 | 3.519504 | 0.12234 | 0.075567 | 0.094207 | 0.137028 | 0.889673 | 0.869521 | 0.835264 | 0.75466 | 0.733501 | 0.63728 | 0 | 0.107715 | 0.245713 | 3,557 | 159 | 109 | 22.371069 | 0.632128 | 0 | 0 | 0.623377 | 0 | 0.090909 | 0.603599 | 0.02474 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.064935 | 0.006494 | 0 | 0.006494 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
bc59881d84c71e8e738d8c38a83d40ca9b551fe5 | 219 | py | Python | __init__.py | npmontgomery/wread | 2577a70fd38c692c0540567560149d6ef3b76613 | [
"MIT"
] | 1 | 2018-07-16T23:30:21.000Z | 2018-07-16T23:30:21.000Z | __init__.py | npmontgomery/wread | 2577a70fd38c692c0540567560149d6ef3b76613 | [
"MIT"
] | null | null | null | __init__.py | npmontgomery/wread | 2577a70fd38c692c0540567560149d6ef3b76613 | [
"MIT"
] | null | null | null | #from wread.wread import csv_append, csv_list_rows, get_csv_row, read_pickle,save_pickle,write_2_txt,append_txt_line
from wread.src.csv.csv import *
from wread.src.pickle.pickle import *
from wread.src.txt.txt import *
| 43.8 | 116 | 0.821918 | 40 | 219 | 4.225 | 0.425 | 0.213018 | 0.213018 | 0.213018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005 | 0.086758 | 219 | 4 | 117 | 54.75 | 0.84 | 0.525114 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bc6f1fb4b21e7205e73c05293610a02cee46752a | 7,755 | py | Python | cifar_exps/ours/local_config.py | snu-mllab/Deep-Hash-Table-CVPR19 | 62c811c37001302e6759a18d6143b8ad657e4910 | [
"MIT"
] | 12 | 2019-05-20T10:26:01.000Z | 2020-05-07T02:19:05.000Z | cifar_exps/ours/local_config.py | maestrojeong/Deep-Hash-Table-CVPR19 | 62c811c37001302e6759a18d6143b8ad657e4910 | [
"MIT"
] | 1 | 2019-06-21T11:50:15.000Z | 2019-06-24T05:38:27.000Z | cifar_exps/ours/local_config.py | snu-mllab/Deep-Hash-Table-CVPR19 | 62c811c37001302e6759a18d6143b8ad657e4910 | [
"MIT"
] | 2 | 2019-03-21T01:54:11.000Z | 2019-05-08T10:38:46.000Z | import os
import sys
sys.path.append(os.path.join(os.path.abspath(os.path.dirname(__file__)), '../..'))
from configs.parser import cifar_parser
from configs.path import EXPDIR
import numpy as np
import argparse
KEY = 'cifar_ours_triplet_64_2_1'
RESULT_DIR = EXPDIR+"{}/".format(KEY)
ID_STRUCTURE_DICT = {
'cifar_ours_npairs_32_2_1' : ('param', 'plamb2', 'dptype'),
'cifar_ours_npairs_32_2_2' : ('param', 'plamb2', 'dptype'),
'cifar_ours_npairs_32_2_3' : ('param', 'plamb2', 'dptype'),
'cifar_ours_npairs_32_2_4' : ('param', 'plamb2', 'dptype'),
'cifar_ours_triplet_64_2_1' : ('param', 'plamb2', 'dptype'),
'cifar_ours_triplet_64_2_2' : ('param', 'plamb2', 'dptype'),
'cifar_ours_triplet_64_2_3' : ('param', 'plamb2', 'dptype'),
'cifar_ours_triplet_64_2_4' : ('param', 'plamb2', 'dptype'),
}
ID_STRUCTURE = ID_STRUCTURE_DICT[KEY]
def local_cifar_parser():
parser = cifar_parser()
parser.add_argument("--nsclass", default = 64, help="the number of selected classes", type = int)
parser.add_argument("--label", default = 'dynamic', help="how to add label static or dynamic", type = str) # dynamic => label remapping, static => no label remapping
parser.add_argument("--plamb1", default = 100.0, help="lambda for pairwise cost", type = float)
parser.add_argument("--dtype", default = 'stair', help="decay type", type = str)
parser.add_argument("--dptype", default = 'a5', help="hash decay param type", type = str)
if KEY in ['cifar_ours_npairs_32_2_1']:
parser.add_argument("--d", default = 32, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 1, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_npairs_64/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_npairs_64/bestsave/', type=str)
parser.add_argument("--ltype", default = 'npair', help="loss type", type = str)
parser.add_argument("--param", default = 0.03, help="hash reg", type = float)
parser.add_argument("--plamb2", default = 0.07, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_npairs_32_2_2']:
parser.add_argument("--d", default = 32, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 2, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_npairs_64/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_npairs_64/bestsave/', type=str)
parser.add_argument("--ltype", default = 'npair', help="loss type", type = str)
parser.add_argument("--param", default = 0.01, help="hash reg", type = float)
parser.add_argument("--plamb2", default = 0.1, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_npairs_32_2_3']:
parser.add_argument("--d", default = 32, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 3, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_npairs_64/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_npairs_64/bestsave/', type=str)
parser.add_argument("--ltype", default = 'npair', help="loss type", type = str)
parser.add_argument("--param", default = 0.003, help="hash reg", type = float)
parser.add_argument("--plamb2", default = 0.1, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_npairs_32_2_4']:
parser.add_argument("--d", default = 32, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 4, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_npairs_64/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_npairs_64/bestsave/', type=str)
parser.add_argument("--ltype", default = 'npair', help="loss type", type = str)
parser.add_argument("--param", default = 0.001, help="hash reg", type = float)
parser.add_argument("--plamb2", default = 0.1, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_triplet_64_2_1']:
parser.add_argument("--d", default = 64, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 1, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_triplet_256/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_triplet_256/bestsave/', type=str)
parser.add_argument("--ltype", default = 'triplet', help="loss type", type = str)
parser.add_argument("--param", default = 0.5, help="hash margin alpha", type = float)
parser.add_argument("--plamb2", default = 1.0, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_triplet_64_2_2']:
parser.add_argument("--d", default = 64, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 2, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_triplet_256/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_triplet_256/bestsave/', type=str)
parser.add_argument("--ltype", default = 'triplet', help="loss type", type = str)
parser.add_argument("--param", default = 0.3, help="hash margin alpha", type = float)
parser.add_argument("--plamb2", default = 1.0, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_triplet_64_2_3']:
parser.add_argument("--d", default = 64, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 3, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_triplet_256/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_triplet_256/bestsave/', type=str)
parser.add_argument("--ltype", default = 'triplet', help="loss type", type = str)
parser.add_argument("--param", default = 0.5, help="hash margin alpha", type = float)
parser.add_argument("--plamb2", default = 1.0, help="lambda for pairwise cost", type = float)
elif KEY in ['cifar_ours_triplet_64_2_4']:
parser.add_argument("--d", default = 64, help="bucket d", type = int)
parser.add_argument("--k", default = 2, help="number of hierachical", type = int)
parser.add_argument("--sk", default = 4, help="sparse k", type = int)
parser.add_argument("--meta", default=EXPDIR+'cifar_metric_triplet_256/meta/meta.pkl', type=str)
parser.add_argument("--save", default=EXPDIR+'cifar_metric_triplet_256/bestsave/', type=str)
parser.add_argument("--ltype", default = 'triplet', help="loss type", type = str)
parser.add_argument("--param", default = 0.5, help="hash margin alpha", type = float)
parser.add_argument("--plamb2", default = 1.0, help="lambda for pairwise cost", type = float)
return parser
| 71.146789 | 169 | 0.658285 | 1,079 | 7,755 | 4.531047 | 0.093605 | 0.12702 | 0.239926 | 0.081816 | 0.885662 | 0.873798 | 0.860094 | 0.852322 | 0.794232 | 0.794232 | 0 | 0.031659 | 0.173179 | 7,755 | 108 | 170 | 71.805556 | 0.730817 | 0.007221 | 0 | 0.58 | 0 | 0 | 0.305483 | 0.126949 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01 | false | 0 | 0.06 | 0 | 0.08 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bc789186daa1e1974560304a45d37cc1c1b1c0ee | 178 | py | Python | src/tenzing/core/__init__.py | ieaves/tenzing | 92d39c1c3a5633d8074e0ffe8c2687c465aebbc8 | [
"MIT"
] | null | null | null | src/tenzing/core/__init__.py | ieaves/tenzing | 92d39c1c3a5633d8074e0ffe8c2687c465aebbc8 | [
"MIT"
] | null | null | null | src/tenzing/core/__init__.py | ieaves/tenzing | 92d39c1c3a5633d8074e0ffe8c2687c465aebbc8 | [
"MIT"
] | null | null | null | from tenzing.core.models import (tenzing_model, model_relation)
from tenzing.core import model_implementations
from tenzing.core import summary
from tenzing.core import plotting
| 35.6 | 63 | 0.859551 | 25 | 178 | 6 | 0.4 | 0.293333 | 0.4 | 0.42 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095506 | 178 | 4 | 64 | 44.5 | 0.931677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
bc84116339a8b9df7b6c60764ae412fc74bc4696 | 105 | py | Python | conan/tools/build/__init__.py | ShuangLiu1992/conan | b420ec1601febfa97f1f61d8da9ba083928ca7ea | [
"MIT"
] | 1 | 2022-03-06T16:49:00.000Z | 2022-03-06T16:49:00.000Z | conan/tools/build/__init__.py | ShuangLiu1992/conan | b420ec1601febfa97f1f61d8da9ba083928ca7ea | [
"MIT"
] | null | null | null | conan/tools/build/__init__.py | ShuangLiu1992/conan | b420ec1601febfa97f1f61d8da9ba083928ca7ea | [
"MIT"
] | null | null | null | from conan.tools.build.cpu import build_jobs
from conan.tools.build.cross_building import cross_building
| 35 | 59 | 0.866667 | 17 | 105 | 5.176471 | 0.529412 | 0.204545 | 0.318182 | 0.431818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07619 | 105 | 2 | 60 | 52.5 | 0.907216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
bc9afc6fd7d1f8ac8ef6e199bfacce54e70925a7 | 5,530 | py | Python | test_pomdp.py | n0whereRuoxi/aima-python | 777fa52b73830a534e27b33abf535933ace32a95 | [
"MIT"
] | 1 | 2019-11-12T21:13:47.000Z | 2019-11-12T21:13:47.000Z | test_pomdp.py | n0whereRuoxi/aima-python | 777fa52b73830a534e27b33abf535933ace32a95 | [
"MIT"
] | null | null | null | test_pomdp.py | n0whereRuoxi/aima-python | 777fa52b73830a534e27b33abf535933ace32a95 | [
"MIT"
] | null | null | null | from mdp import *
def test_pomdp_value_iteration():
t_prob = [
[#up
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#1
[0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#2
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#3
[0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#4
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#5
[0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#6
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#7
[0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#8
[0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0],#9
[0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0],#10
[0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0],#11
[0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0],#12
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0],#13
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0],#14
[0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0],#15
[0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],#16
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0],#17
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0],#18
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],#19
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0],#20
],
[#right
[0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#1
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#2
[0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#3
[0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#4
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#5
[0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#6
[0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0],#7
[0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0],#8
[0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0],#9
[0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0],#10
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0],#11
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0],#12
[0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0],#13
[0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],#14
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0],#15
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0],#16
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0],#17
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],#18
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1],#19
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1],#20
],
[#down
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#1
[0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#2
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#3
[0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#4
[0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0],#5
[0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0],#6
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0],#7
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0],#8
[0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0],#9
[0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0],#10
[0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0],#11
[0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],#12
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0],#13
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0],#14
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0],#15
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1],#16
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0],#17
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0],#18
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],#19
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1],#20
],
[#left
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#1
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#2
[0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#3
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#4
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#5
[0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0],#6
[0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0],#7
[0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0],#8
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0],#9
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0],#10
[0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0],#11
[0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0],#12
[0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0],#13
[0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],#14
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0],#15
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0],#16
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0],#17
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0],#18
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0],#19
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],#20
]
]
e_prob = [
[#up
],
[#right
],
[#down
],
[#left
],
]
rewards = [[5, -10], [-20, 5], [-1, -1]]
gamma = 0.95
actions = ('0', '1', '2', '3')
states = ('0', '1', '2', '3', '4', '5', '6', '7', '8', '9', '10', '11', '12', '13', '14',
'15', '16', '17', '18', '19', )
pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma)
utility = pomdp_value_iteration(pomdp, epsilon=5)
for _, v in utility.items():
sum_ = 0
for element in v:
sum_ += sum(element)
assert -9.76 < sum_ < -9.70 or 246.5 < sum_ < 248.5 or 0 < sum_ < 1
test_pomdp_value_iteration() | 44.596774 | 94 | 0.376311 | 1,787 | 5,530 | 1.154449 | 0.030778 | 1.326224 | 1.779932 | 2.109549 | 0.833737 | 0.833737 | 0.833737 | 0.833737 | 0.833737 | 0.833737 | 0 | 0.436173 | 0.259132 | 5,530 | 124 | 95 | 44.596774 | 0.067366 | 0.028571 | 0 | 0.756522 | 0 | 0 | 0.006433 | 0 | 0 | 0 | 0 | 0 | 0.008696 | 1 | 0.008696 | false | 0 | 0.008696 | 0 | 0.017391 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
bce33ab44cae83c035cabe36dea2bd6cf02ca3b4 | 18,095 | py | Python | yandex/cloud/mdb/postgresql/v1/cluster_service_pb2_grpc.py | IIKovalenko/python-sdk | 980e2c5d848eadb42799132b35a9f58ab7b27157 | [
"MIT"
] | 1 | 2019-06-07T10:45:58.000Z | 2019-06-07T10:45:58.000Z | yandex/cloud/mdb/postgresql/v1/cluster_service_pb2_grpc.py | IIKovalenko/python-sdk | 980e2c5d848eadb42799132b35a9f58ab7b27157 | [
"MIT"
] | null | null | null | yandex/cloud/mdb/postgresql/v1/cluster_service_pb2_grpc.py | IIKovalenko/python-sdk | 980e2c5d848eadb42799132b35a9f58ab7b27157 | [
"MIT"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from yandex.cloud.mdb.postgresql.v1 import cluster_pb2 as yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__pb2
from yandex.cloud.mdb.postgresql.v1 import cluster_service_pb2 as yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2
from yandex.cloud.operation import operation_pb2 as yandex_dot_cloud_dot_operation_dot_operation__pb2
class ClusterServiceStub(object):
"""A set of methods for managing PostgreSQL Cluster resources.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.Get = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Get',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.GetClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__pb2.Cluster.FromString,
)
self.List = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/List',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClustersRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClustersResponse.FromString,
)
self.Create = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Create',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.CreateClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Update = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Update',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.UpdateClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Delete = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Delete',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.DeleteClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Start = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Start',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.StartClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Stop = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Stop',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.StopClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Move = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Move',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.MoveClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Backup = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Backup',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.BackupClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.Restore = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/Restore',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.RestoreClusterRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.ListLogs = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/ListLogs',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterLogsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterLogsResponse.FromString,
)
self.ListOperations = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/ListOperations',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterOperationsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterOperationsResponse.FromString,
)
self.ListBackups = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/ListBackups',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterBackupsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterBackupsResponse.FromString,
)
self.ListHosts = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/ListHosts',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterHostsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterHostsResponse.FromString,
)
self.AddHosts = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/AddHosts',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.AddClusterHostsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.DeleteHosts = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/DeleteHosts',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.DeleteClusterHostsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
self.UpdateHosts = channel.unary_unary(
'/yandex.cloud.mdb.postgresql.v1.ClusterService/UpdateHosts',
request_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.UpdateClusterHostsRequest.SerializeToString,
response_deserializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.FromString,
)
class ClusterServiceServicer(object):
"""A set of methods for managing PostgreSQL Cluster resources.
"""
def Get(self, request, context):
"""Returns the specified PostgreSQL Cluster resource.
To get the list of available PostgreSQL Cluster resources, make a [List] request.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def List(self, request, context):
"""Retrieves the list of PostgreSQL Cluster resources that belong
to the specified folder.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Create(self, request, context):
"""Creates a PostgreSQL cluster in the specified folder.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Update(self, request, context):
"""Updates the specified PostgreSQL cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Delete(self, request, context):
"""Deletes the specified PostgreSQL cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Start(self, request, context):
"""Start the specified PostgreSQL cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Stop(self, request, context):
"""Stop the specified PostgreSQL cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Move(self, request, context):
"""Moves the specified PostgreSQL cluster to the specified folder.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Backup(self, request, context):
"""Creates a backup for the specified PostgreSQL cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Restore(self, request, context):
"""Creates a new PostgreSQL cluster using the specified backup.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListLogs(self, request, context):
"""Retrieves logs for the specified PostgreSQL cluster.
For more information about logs, see the [Logs](/docs/managed-postgresql/concepts/logs) section in the documentation.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListOperations(self, request, context):
"""Retrieves the list of Operation resources for the specified cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListBackups(self, request, context):
"""Retrieves the list of available backups for the specified PostgreSQL cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListHosts(self, request, context):
"""Retrieves a list of hosts for the specified cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def AddHosts(self, request, context):
"""Creates new hosts for a cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteHosts(self, request, context):
"""Deletes the specified hosts for a cluster.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def UpdateHosts(self, request, context):
"""Updates the specified hosts.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ClusterServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'Get': grpc.unary_unary_rpc_method_handler(
servicer.Get,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.GetClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__pb2.Cluster.SerializeToString,
),
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClustersRequest.FromString,
response_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClustersResponse.SerializeToString,
),
'Create': grpc.unary_unary_rpc_method_handler(
servicer.Create,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.CreateClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Update': grpc.unary_unary_rpc_method_handler(
servicer.Update,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.UpdateClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Delete': grpc.unary_unary_rpc_method_handler(
servicer.Delete,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.DeleteClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Start': grpc.unary_unary_rpc_method_handler(
servicer.Start,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.StartClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Stop': grpc.unary_unary_rpc_method_handler(
servicer.Stop,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.StopClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Move': grpc.unary_unary_rpc_method_handler(
servicer.Move,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.MoveClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Backup': grpc.unary_unary_rpc_method_handler(
servicer.Backup,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.BackupClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'Restore': grpc.unary_unary_rpc_method_handler(
servicer.Restore,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.RestoreClusterRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'ListLogs': grpc.unary_unary_rpc_method_handler(
servicer.ListLogs,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterLogsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterLogsResponse.SerializeToString,
),
'ListOperations': grpc.unary_unary_rpc_method_handler(
servicer.ListOperations,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterOperationsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterOperationsResponse.SerializeToString,
),
'ListBackups': grpc.unary_unary_rpc_method_handler(
servicer.ListBackups,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterBackupsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterBackupsResponse.SerializeToString,
),
'ListHosts': grpc.unary_unary_rpc_method_handler(
servicer.ListHosts,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterHostsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.ListClusterHostsResponse.SerializeToString,
),
'AddHosts': grpc.unary_unary_rpc_method_handler(
servicer.AddHosts,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.AddClusterHostsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'DeleteHosts': grpc.unary_unary_rpc_method_handler(
servicer.DeleteHosts,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.DeleteClusterHostsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
'UpdateHosts': grpc.unary_unary_rpc_method_handler(
servicer.UpdateHosts,
request_deserializer=yandex_dot_cloud_dot_mdb_dot_postgresql_dot_v1_dot_cluster__service__pb2.UpdateClusterHostsRequest.FromString,
response_serializer=yandex_dot_cloud_dot_operation_dot_operation__pb2.Operation.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'yandex.cloud.mdb.postgresql.v1.ClusterService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
| 55.676923 | 151 | 0.795689 | 2,074 | 18,095 | 6.431051 | 0.064127 | 0.047908 | 0.074524 | 0.090493 | 0.869321 | 0.861074 | 0.848403 | 0.791873 | 0.782351 | 0.709702 | 0 | 0.009078 | 0.135562 | 18,095 | 324 | 152 | 55.848765 | 0.843626 | 0.077701 | 0 | 0.358566 | 1 | 0 | 0.113125 | 0.058436 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075697 | false | 0 | 0.015936 | 0 | 0.099602 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |