hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c5c81be66863e31fe8a324d12e820c393d8c2e6e | 214 | py | Python | scripts/external_libs/elasticsearch7/elasticsearch/helpers/__init__.py | timgates42/trex-core | efe94752fcb2d0734c83d4877afe92a3dbf8eccd | [
"Apache-2.0"
] | 956 | 2015-06-24T15:04:55.000Z | 2022-03-30T06:25:04.000Z | scripts/external_libs/elasticsearch7/elasticsearch/helpers/__init__.py | angelyouyou/trex-core | fddf78584cae285d9298ef23f9f5c8725e16911e | [
"Apache-2.0"
] | 782 | 2015-09-20T15:19:00.000Z | 2022-03-31T23:52:05.000Z | scripts/external_libs/elasticsearch7/elasticsearch/helpers/__init__.py | angelyouyou/trex-core | fddf78584cae285d9298ef23f9f5c8725e16911e | [
"Apache-2.0"
] | 429 | 2015-06-27T19:34:21.000Z | 2022-03-23T11:02:51.000Z |
from .errors import BulkIndexError, ScanError
from .actions import expand_action, streaming_bulk, bulk, parallel_bulk
from .actions import scan, reindex
from .actions import _chunk_actions, _process_bulk_chunk
| 23.777778 | 71 | 0.831776 | 28 | 214 | 6.071429 | 0.535714 | 0.194118 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121495 | 214 | 8 | 72 | 26.75 | 0.904255 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c5e1efba7737a78ed613753973a75d60572c52dc | 44 | py | Python | mlswarm/infrastructure/services/__init__.py | lucasdavid/mlswarm-infrastructure | d52c4c6c6f41a85272acf098c7a152eb66aed337 | [
"MIT"
] | 2 | 2018-06-18T09:53:50.000Z | 2019-02-01T13:02:12.000Z | mlswarm/infrastructure/services/__init__.py | lucasdavid/mlswarm-infrastructure | d52c4c6c6f41a85272acf098c7a152eb66aed337 | [
"MIT"
] | null | null | null | mlswarm/infrastructure/services/__init__.py | lucasdavid/mlswarm-infrastructure | d52c4c6c6f41a85272acf098c7a152eb66aed337 | [
"MIT"
] | null | null | null | from .service_builder import ServiceBuilder
| 22 | 43 | 0.886364 | 5 | 44 | 7.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 44 | 1 | 44 | 44 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
68048cada33db149f4ee9b5060025502a701d6bb | 51 | py | Python | Chapter05/demo/__init__.py | jvstinian/Python-Reinforcement-Learning-Projects | 6c97c68351fc4af426cb5c3583d75aebfabac8aa | [
"MIT"
] | 114 | 2018-10-20T15:32:59.000Z | 2022-03-21T14:16:25.000Z | Chapter05/demo/__init__.py | jvstinian/Python-Reinforcement-Learning-Projects | 6c97c68351fc4af426cb5c3583d75aebfabac8aa | [
"MIT"
] | 11 | 2018-10-18T12:39:42.000Z | 2022-02-10T03:28:19.000Z | Chapter05/demo/__init__.py | jvstinian/Python-Reinforcement-Learning-Projects | 6c97c68351fc4af426cb5c3583d75aebfabac8aa | [
"MIT"
] | 72 | 2018-10-12T13:02:32.000Z | 2022-03-11T13:03:26.000Z | '''
Created on Nov 10, 2016
@author: a0096049
'''
| 8.5 | 23 | 0.627451 | 7 | 51 | 4.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.317073 | 0.196078 | 51 | 5 | 24 | 10.2 | 0.463415 | 0.823529 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a879a99e7d9d55d665a7d43858779953988691bb | 731 | py | Python | resource_tracker/models/__init__.py | LaudateCorpus1/squest | 98304f20c1d966fb3678d348ffd7c5be438bb6be | [
"Apache-2.0"
] | 112 | 2021-04-21T08:52:55.000Z | 2022-03-01T15:09:19.000Z | resource_tracker/models/__init__.py | LaudateCorpus1/squest | 98304f20c1d966fb3678d348ffd7c5be438bb6be | [
"Apache-2.0"
] | 216 | 2021-04-21T09:06:47.000Z | 2022-03-30T14:21:28.000Z | resource_tracker/models/__init__.py | LaudateCorpus1/squest | 98304f20c1d966fb3678d348ffd7c5be438bb6be | [
"Apache-2.0"
] | 21 | 2021-04-20T13:53:54.000Z | 2022-03-30T21:43:04.000Z | from resource_tracker.models.exceptions import ExceptionResourceTracker
from resource_tracker.models.resource import Resource
from resource_tracker.models.resource_attribute import ResourceAttribute
from resource_tracker.models.resource_text_attribute import ResourceTextAttribute
from resource_tracker.models.resource_group_attribute_definition import ResourceGroupAttributeDefinition
from resource_tracker.models.resource_group_text_attribute_definition import ResourceGroupTextAttributeDefinition
from resource_tracker.models.resource_group import ResourceGroup
from resource_tracker.models.resource_pool_attribute_definition import ResourcePoolAttributeDefinition
from resource_tracker.models.resource_pool import ResourcePool
| 73.1 | 113 | 0.926129 | 78 | 731 | 8.371795 | 0.24359 | 0.165391 | 0.261868 | 0.344564 | 0.43951 | 0.287902 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049248 | 731 | 9 | 114 | 81.222222 | 0.939568 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
765af13f74c9ed55c7ce4c7c2f44bc321a894bf1 | 59 | py | Python | mi/abc/__init__.py | gitter-badger/Mi.py | ef6611c93c8a5237ec9d51ff89e845b85771e070 | [
"MIT"
] | 13 | 2021-09-14T02:47:23.000Z | 2022-02-27T16:48:09.000Z | mi/abc/__init__.py | gitter-badger/Mi.py | ef6611c93c8a5237ec9d51ff89e845b85771e070 | [
"MIT"
] | 62 | 2021-08-28T10:56:55.000Z | 2022-03-30T06:47:28.000Z | mi/abc/__init__.py | gitter-badger/Mi.py | ef6611c93c8a5237ec9d51ff89e845b85771e070 | [
"MIT"
] | 3 | 2021-12-23T20:10:57.000Z | 2022-03-30T13:19:49.000Z | from .chat import *
from .ext import *
from .note import *
| 14.75 | 19 | 0.694915 | 9 | 59 | 4.555556 | 0.555556 | 0.487805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.20339 | 59 | 3 | 20 | 19.666667 | 0.87234 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7668f32defd7a9ed4e43a2e6123d7e149e477e34 | 18,140 | py | Python | burton/database/test/sqlitetests.py | Extensis/Burton | a948f045a021f468ef34d6e8e6b8a5caaa132e27 | [
"MIT"
] | 2 | 2018-01-09T23:32:35.000Z | 2018-08-10T23:48:33.000Z | burton/database/test/sqlitetests.py | Extensis/Burton | a948f045a021f468ef34d6e8e6b8a5caaa132e27 | [
"MIT"
] | null | null | null | burton/database/test/sqlitetests.py | Extensis/Burton | a948f045a021f468ef34d6e8e6b8a5caaa132e27 | [
"MIT"
] | 5 | 2017-03-23T16:49:46.000Z | 2022-02-18T12:06:59.000Z | import codecs
import cStringIO
import mock
import os
import sqlite3
import unittest
from burton import database
class SQLiteTests(unittest.TestCase):
def tearDown(self):
if os.path.exists('some_filename.db'):
os.remove('some_filename.db')
def test_write_string_mapping_for_platform(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db._get_current_time = mock.Mock(
return_value = "datetime('2010-12-02 02:20:00')"
)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
"SomeString" : "Translation for some string",
"OtherString" : "Translation for some other string",
}
)
cursor = db.dbh.cursor()
self.assertEquals(
cursor.execute("select * from translation_keys").fetchall(),
[
(1, u"SomeString", u"2010-12-02 02:20:00"),
(2, u"OtherString", u"2010-12-02 02:20:00")
],
)
self.assertEquals(
cursor.execute("select * from native_translations").fetchall(),
[
(1, 1, 1, u"Translation for some string"),
(2, 2, 1, u"Translation for some other string")
],
)
db._get_current_time = mock.Mock(
return_value = "datetime('2010-12-02 02:21:00')"
)
db.write_string_mapping_for_platform(
"Win",
{
"SomeString" : "Translation for some string",
"OtherString" : "Translation for some other string",
}
)
self.assertEquals(
cursor.execute("select * from translation_keys").fetchall(),
[
(1, u"SomeString", u"2010-12-02 02:21:00"),
(2, u"OtherString", u"2010-12-02 02:21:00")
],
)
self.assertEquals(
cursor.execute("select * from native_translations").fetchall(),
[
(1, 1, 1, u"Translation for some string"),
(2, 2, 1, u"Translation for some other string"),
(3, 1, 2, u"Translation for some string"),
(4, 2, 2, u"Translation for some other string")
],
)
db._get_current_time = mock.Mock(
return_value = "datetime('2010-12-02 02:22:00')"
)
db.write_string_mapping_for_platform(
"Mac",
{
"SomeString" : "New translation for some string",
"OtherString" : "Translation for some other string",
}
)
self.assertEquals(
cursor.execute("select * from translation_keys").fetchall(),
[
(1, u"SomeString", u"2010-12-02 02:22:00"),
(2, u"OtherString", u"2010-12-02 02:21:00")
],
)
self.assertEquals(
cursor.execute("select * from native_translations").fetchall(),
[
(1, 1, 1, u"New translation for some string"),
(2, 2, 1, u"Translation for some other string"),
(3, 1, 2, u"Translation for some string"),
(4, 2, 2, u"Translation for some other string")
],
)
db._get_current_time = mock.Mock(
return_value = "datetime('2010-12-02 02:23:00')"
)
db.write_string_mapping_for_platform(
"Win",
{
"SomeString" : "New translation for some string",
}
)
self.assertEquals(
cursor.execute("select * from translation_keys").fetchall(),
[
(1, u"SomeString", u"2010-12-02 02:23:00"),
(2, u"OtherString", u"2010-12-02 02:23:00")
],
)
self.assertEquals(
cursor.execute("select * from native_translations").fetchall(),
[
(1, 1, 1, u"New translation for some string"),
(2, 2, 1, u"Translation for some other string"),
(3, 1, 2, u"New translation for some string")
],
)
db.disconnect()
def test_write_string_mapping_for_platform_translates_params(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db._get_current_time = mock.Mock(
return_value = "datetime('2010-12-02 02:20:00')"
)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
"SomeString" : "%03d of %03.3lld for {0} %@",
}
)
cursor = db.dbh.cursor()
self.assertEquals(
cursor.execute("select * from translation_keys").fetchall(),
[
(1, u"SomeString", u"2010-12-02 02:20:00"),
],
)
self.assertEquals(
cursor.execute("select * from native_translations").fetchall(),
[
(1, 1, 1, "%03d of %03.3lld for {0} %@"),
],
)
self.assertEquals(
cursor.execute("select * from replaced_params").fetchall(),
[
(1, 1, 1, 0, u"%03d" ),
(2, 1, 1, 1, u"%03.3lld"),
(3, 1, 1, 2, u"{0}" ),
(4, 1, 1, 3, u"%@" ),
],
)
db.write_string_mapping_for_platform(
"Mac",
{
"SomeString" : "%03d of %03.3lld",
}
)
self.assertEquals(
cursor.execute("select * from native_translations").fetchall(),
[
(1, 1, 1, "%03d of %03.3lld"),
],
)
self.assertEquals(
cursor.execute("select * from replaced_params").fetchall(),
[
(1, 1, 1, 0, u"%03d" ),
(2, 1, 1, 1, u"%03.3lld"),
],
)
db.disconnect()
@mock.patch.object(os.path, "abspath")
def test_update_from_vcs(self, mock_function):
mock_function.return_value = "some_full_path"
vcs = mock.Mock()
db = database.SQLite("some_filename")
submodule_path = "submodule"
db.update_from_vcs(vcs, submodule_path)
vcs.add_file.assert_called_with("some_full_path", submodule_path)
@mock.patch.object(os.path, "exists")
def test_connect_loads_schema_if_new_database(self, mock_function):
mock_function.return_value = False
db = database.SQLite("some_filename")
db._save_database = mock.Mock()
orig_load_schema = db._load_schema
db._load_schema = mock.Mock(side_effect = orig_load_schema)
db._schema_file = mock.Mock(
return_value = cStringIO.StringIO("""create table test_table (
test_column INTEGER NOT NULL
);""")
)
db.connect()
db._load_schema.assert_called_with()
cursor = db.dbh.cursor()
cursor.execute("insert into test_table (test_column) values(1)")
cursor.close()
db.disconnect()
@mock.patch.object(os.path, "exists")
def test_connect_loads_existing_database(self, mock_function):
mock_function.return_value = True
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
orig_load_database = db._load_database
db._load_database = mock.Mock(side_effect = orig_load_database)
db._open_for_reading = mock.Mock(
return_value = cStringIO.StringIO("""create table test_table (
test_column INTEGER NOT NULL
);
insert into test_table(test_column) values(1);
""")
)
db.connect()
db._load_database.assert_called_with()
db._open_for_reading.assert_called_with("some_filename")
cursor = db.dbh.cursor()
cursor.execute("select test_column from test_table")
self.assertEquals(cursor.fetchall(), [(1,)])
cursor.close()
db.disconnect()
def test_disconnect_saves_existing_database(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
lines = []
def _write(line):
lines.append(line)
db.connect = mock.Mock(side_effect = _connect)
output_file = mock.Mock()
output_file.write = mock.Mock(side_effect = _write)
db._open_for_writing = mock.Mock(return_value = output_file)
db._remove_temporary_file = mock.Mock()
db.connect()
cursor = db.dbh.cursor()
cursor.execute(
"create table test_table (test_column INTEGER NOT NULL);"
)
cursor.execute("insert into test_table (test_column) values(1);")
db.disconnect()
db._open_for_writing.assert_called_with("some_filename")
db._remove_temporary_file.assert_called_with()
self.assertEquals(
"".join(lines),
"""BEGIN TRANSACTION;
CREATE TABLE test_table (test_column INTEGER NOT NULL);
INSERT INTO "test_table" VALUES(1);
COMMIT;
""".replace(" ", "")
)
def test_remove_old_unmapped_strings(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
"SomeString" : "Translation for some string",
"OtherString" : "Translation for some other string",
}
)
cursor = db.dbh.cursor()
self.assertEquals(
cursor.execute("""
select translation_key_no, translation_key
from translation_keys"""
).fetchall(),
[
(2, u"OtherString"),
(1, u"SomeString")
],
)
cursor.execute("""
delete from native_translations
where translation_key_no = 1
""")
db.remove_old_unmapped_strings()
self.assertEquals(
cursor.execute("""
select translation_key_no, translation_key
from translation_keys"""
).fetchall(),
[
(2, u"OtherString"),
(1, u"SomeString")
],
)
cursor.execute("""
update translation_keys
set last_updated = datetime('now', '-89 days')
""")
db.remove_old_unmapped_strings()
self.assertEquals(
cursor.execute("""
select translation_key_no, translation_key
from translation_keys"""
).fetchall(),
[
(2, u"OtherString"),
(1, u"SomeString")
],
)
cursor.execute("""
update translation_keys
set last_updated = datetime('now', '-91 days')
""")
db.remove_old_unmapped_strings()
self.assertEquals(
cursor.execute("""
select translation_key_no, translation_key
from translation_keys"""
).fetchall(),
[
(2, u"OtherString")
],
)
db.disconnect()
def test_get_platforms(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
u"SomeString" : u"Mac translation for some string",
u"OtherString" : u"Mac translation for some other string",
}
)
db.write_string_mapping_for_platform(
"Win",
{
u"SomeString" : u"Win translation for some string",
u"OtherString" : u"Win translation for some other string",
}
)
self.assertEquals(db.get_platforms(), [ u"Mac", u"Win" ])
db.disconnect()
def test_get_string_mapping_for_platform(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
u"SomeString" : u"Mac translation for some string",
u"OtherString" : u"Mac translation for some other string",
}
)
db.write_string_mapping_for_platform(
"Win",
{
u"SomeString" : u"Win translation for some string",
u"OtherString" : u"Win translation for some other string",
}
)
self.assertEquals(
db.get_string_mapping_for_platform("Mac"),
{
u"SomeString" : u"Mac translation for some string",
u"OtherString" : u"Mac translation for some other string",
}
)
self.assertEquals(
db.get_string_mapping_for_platform("Win"),
{
u"SomeString" : u"Win translation for some string",
u"OtherString" : u"Win translation for some other string",
}
)
db.disconnect()
def test_get_all_translation_keys(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
u"SomeString" : u"Mac translation for some string",
u"OtherString" : u"Mac translation for some other string",
}
)
self.assertEquals(
db.get_all_translation_keys(),
[ u"OtherString", u"SomeString", ],
)
db.disconnect()
def test_get_all_native_translations(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{
u"SomeString" : u"Mac translation for some string",
u"OtherString" : u"Mac translation for some other string"
}
)
self.assertEquals(
db.get_all_native_translations(),
[
u"Mac translation for some string",
u"Mac translation for some other string"
],
)
db.disconnect()
def test_get_native_translations_for_platform(self):
db = database.SQLite("some_filename")
def _connect(*args, **kwargs):
db.dbh = sqlite3.connect(":memory:")
def _disconnect(*args, **kwargs):
db.dbh.close()
db._connect = mock.Mock(side_effect = _connect)
db.disconnect = mock.Mock(side_effect = _disconnect)
db._get_current_time = mock.Mock(
return_value = "datetime('2010-12-02 02:20:00')"
)
db.connect()
db.write_string_mapping_for_platform(
"Mac",
{ "SomeString" : "%03d of %03.3lld for {0} %@", }
)
self.assertEquals(
db.get_native_translations_for_platform("Mac"),
[ "%03d of %03.3lld for {0} %@" ]
)
db.disconnect()
@mock.patch("__builtin__.open")
def test_open_for_reading(self, open_func):
db = database.SQLite("some_filename")
db._open_for_reading("filename")
open_func.assert_called_with("filename", "r")
@mock.patch.object(codecs, "open")
def test_open_for_writing(self, open_func):
db = database.SQLite("some_filename")
db._open_for_writing("filename")
open_func.assert_called_with("filename", "w", "utf-8")
@mock.patch.object(os.path, "exists")
def test_deletes_existing_temp_file_on_connect(self, exists_func):
exists_func.return_value = True
db = database.SQLite("some_filename")
db._remove_temporary_file = mock.Mock()
db._load_database = mock.Mock()
db.connect()
db._remove_temporary_file.assert_called_with()
| 29.835526 | 75 | 0.533572 | 1,903 | 18,140 | 4.862323 | 0.079348 | 0.060521 | 0.077813 | 0.042797 | 0.869124 | 0.829461 | 0.799957 | 0.754782 | 0.72852 | 0.708419 | 0 | 0.030683 | 0.349614 | 18,140 | 607 | 76 | 29.884679 | 0.753602 | 0 | 0 | 0.586639 | 0 | 0 | 0.240942 | 0.00128 | 0 | 0 | 0 | 0 | 0.070981 | 1 | 0.075157 | false | 0 | 0.014614 | 0 | 0.091858 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4f4ff57a02e20840df6f45a253f2a007e5a10498 | 133 | py | Python | core/dl_framework/__init__.py | MarcelMoczarski/template_project | 126ca7e1749158bf3adb00eddffb289b6b88fea3 | [
"MIT"
] | null | null | null | core/dl_framework/__init__.py | MarcelMoczarski/template_project | 126ca7e1749158bf3adb00eddffb289b6b88fea3 | [
"MIT"
] | null | null | null | core/dl_framework/__init__.py | MarcelMoczarski/template_project | 126ca7e1749158bf3adb00eddffb289b6b88fea3 | [
"MIT"
] | null | null | null | from . import callbacks
from . import data
from . import learner
from . import loss_functions
from . import model
from . import utils | 22.166667 | 28 | 0.781955 | 19 | 133 | 5.421053 | 0.473684 | 0.582524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172932 | 133 | 6 | 29 | 22.166667 | 0.936364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4f59a198c3bde0f50d5e15cd6fee42198b31cf3e | 85 | py | Python | xpart/linear_normal_form.py | xbuffat/xpart | 482208e6aa964e98337b93bac8f604b8789ac8cc | [
"MIT"
] | null | null | null | xpart/linear_normal_form.py | xbuffat/xpart | 482208e6aa964e98337b93bac8f604b8789ac8cc | [
"MIT"
] | null | null | null | xpart/linear_normal_form.py | xbuffat/xpart | 482208e6aa964e98337b93bac8f604b8789ac8cc | [
"MIT"
] | null | null | null | from xtrack.linear_normal_form import healy_symplectify, compute_linear_normal_form
| 28.333333 | 83 | 0.905882 | 12 | 85 | 5.916667 | 0.75 | 0.338028 | 0.450704 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070588 | 85 | 2 | 84 | 42.5 | 0.898734 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
4f6bb8ed8732eeead8a7c8d428954de424ee3d13 | 506 | py | Python | propsettings/setting_types/list_setting_type.py | mnicolas94/propsettings | 2ec905bd045bf45a45e644846b098b64fb283130 | [
"MIT"
] | null | null | null | propsettings/setting_types/list_setting_type.py | mnicolas94/propsettings | 2ec905bd045bf45a45e644846b098b64fb283130 | [
"MIT"
] | null | null | null | propsettings/setting_types/list_setting_type.py | mnicolas94/propsettings | 2ec905bd045bf45a45e644846b098b64fb283130 | [
"MIT"
] | null | null | null | from propsettings.setting_type import SettingType
class List(SettingType):
def __init__(self, list_type: type, elements_setting_type: SettingType = None):
self._type = list_type
self._elements_setting_type = elements_setting_type
@property
def type(self):
return self._type
@property
def elements_setting_type(self):
return self._elements_setting_type
def has_elements_setting_type(self):
return self._elements_setting_type is not None
| 25.3 | 83 | 0.729249 | 63 | 506 | 5.428571 | 0.285714 | 0.25731 | 0.388889 | 0.201754 | 0.304094 | 0.304094 | 0.304094 | 0.304094 | 0.304094 | 0 | 0 | 0 | 0.211462 | 506 | 19 | 84 | 26.631579 | 0.857143 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0 | 0.076923 | 0.230769 | 0.692308 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
96c514bd91738ca4d5d7995ce73db811c99d5bd8 | 53,257 | py | Python | api_tests/nodes/views/test_node_comments_list.py | h-ci-user01/osf.h-test | a61db2c639a26031aa5b7f58c4dd719919aa5ece | [
"Apache-2.0"
] | null | null | null | api_tests/nodes/views/test_node_comments_list.py | h-ci-user01/osf.h-test | a61db2c639a26031aa5b7f58c4dd719919aa5ece | [
"Apache-2.0"
] | 18 | 2020-03-24T15:26:02.000Z | 2022-03-08T21:30:39.000Z | api_tests/nodes/views/test_node_comments_list.py | h-ci-user01/osf.h-test | a61db2c639a26031aa5b7f58c4dd719919aa5ece | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
import pytest
import mock
from addons.wiki.tests.factories import NodeWikiFactory
from api.base.settings import osf_settings
from api.base.settings.defaults import API_BASE
from api_tests import utils as test_utils
from framework.auth import core
from osf.models import Guid
from osf_tests.factories import (
ProjectFactory,
RegistrationFactory,
AuthUserFactory,
CommentFactory,
)
from rest_framework import exceptions
@pytest.fixture()
def user():
return AuthUserFactory()

@pytest.mark.django_db
class NodeCommentsListMixin(object):

    @pytest.fixture()
    def user_non_contrib(self):
        return AuthUserFactory()

    @pytest.fixture()
    def project_private_dict(self):
        raise NotImplementedError

    @pytest.fixture()
    def project_public_dict(self):
        raise NotImplementedError

    @pytest.fixture()
    def registration_dict(self):
        raise NotImplementedError

    def test_return_comments(self, app, user, user_non_contrib, project_public_dict, project_private_dict, registration_dict):
        # test_return_public_node_comments_logged_out_user
        res = app.get(project_public_dict['url'])
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert len(comment_json) == 1
        assert project_public_dict['comment']._id in comment_ids

        # test_return_public_node_comments_logged_in_user
        res = app.get(project_public_dict['url'], auth=user_non_contrib.auth)
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert len(comment_json) == 1
        assert project_public_dict['comment']._id in comment_ids

        # test_return_private_node_comments_logged_out_user
        res = app.get(project_private_dict['url'], expect_errors=True)
        assert res.status_code == 401
        assert res.json['errors'][0]['detail'] == exceptions.NotAuthenticated.default_detail

        # test_return_private_node_comments_logged_in_non_contributor
        res = app.get(project_private_dict['url'], auth=user_non_contrib, expect_errors=True)
        assert res.status_code == 401
        assert res.json['errors'][0]['detail'] == exceptions.NotAuthenticated.default_detail

        # test_return_private_node_comments_logged_in_contributor
        res = app.get(project_private_dict['url'], auth=user.auth)
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert len(comment_json) == 1
        assert project_private_dict['comment']._id in comment_ids

        # test_return_registration_comments_logged_in_contributor
        res = app.get(registration_dict['url'], auth=user.auth)
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert len(comment_json) == 1
        assert registration_dict['comment']._id in comment_ids

    def test_return_both_deleted_and_undeleted_comments(self, app, user, project_private_dict, mock_update_search=None):
        deleted_comment = CommentFactory(node=project_private_dict['project'], user=user, target=project_private_dict['comment'].target, is_deleted=True)
        res = app.get(project_private_dict['url'], auth=user.auth)
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert project_private_dict['comment']._id in comment_ids
        assert deleted_comment._id in comment_ids

    def test_node_comments_pagination(self, app, user, project_public_dict):
        # test_node_comments_list_pagination
        url = '{}?filter[target]={}&related_counts=False'.format(project_public_dict['url'], project_public_dict['project']._id)
        res = app.get(url, user=user, auth=user.auth)
        assert res.status_code == 200
        assert res.json['links']['meta']['unread'] == 0

        # test_node_comments_list_updated_pagination
        url = '{}?filter[target]={}&related_counts=False&version=2.1'.format(project_public_dict['url'], project_public_dict['project']._id)
        res = app.get(url, user=user, auth=user.auth)
        assert res.status_code == 200
        assert res.json['meta']['unread'] == 0

@pytest.mark.django_db
class TestNodeCommentsList(NodeCommentsListMixin):

    @pytest.fixture()
    def project_private_dict(self, user):
        project_private = ProjectFactory(is_public=False, creator=user)
        comment_private = CommentFactory(node=project_private, user=user)
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        return {'project': project_private, 'comment': comment_private, 'url': url_private}

    @pytest.fixture()
    def project_public_dict(self, user):
        project_public = ProjectFactory(is_public=True, creator=user)
        comment_public = CommentFactory(node=project_public, user=user)
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        return {'project': project_public, 'comment': comment_public, 'url': url_public}

    @pytest.fixture()
    def registration_dict(self, user):
        registration = RegistrationFactory(creator=user)
        comment_registration = CommentFactory(node=registration, user=user)
        url_registration = '/{}registrations/{}/comments/'.format(API_BASE, registration._id)
        return {'registration': registration, 'comment': comment_registration, 'url': url_registration}

@pytest.mark.django_db
class TestNodeCommentsListFiles(NodeCommentsListMixin):

    @pytest.fixture()
    def project_private_dict(self, user):
        project_private = ProjectFactory(is_public=False, creator=user)
        file_private = test_utils.create_test_file(project_private, user)
        comment_private = CommentFactory(node=project_private, user=user, target=file_private.get_guid(), page='files')
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        return {'project': project_private, 'file': file_private, 'comment': comment_private, 'url': url_private}

    @pytest.fixture()
    def project_public_dict(self, user):
        project_public = ProjectFactory(is_public=True, creator=user)
        file_public = test_utils.create_test_file(project_public, user)
        comment_public = CommentFactory(node=project_public, user=user, target=file_public.get_guid(), page='files')
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        return {'project': project_public, 'file': file_public, 'comment': comment_public, 'url': url_public}

    @pytest.fixture()
    def registration_dict(self, user):
        registration = RegistrationFactory(creator=user)
        file_registration = test_utils.create_test_file(registration, user)
        comment_registration = CommentFactory(node=registration, user=user, target=file_registration.get_guid(), page='files')
        url_registration = '/{}registrations/{}/comments/'.format(API_BASE, registration._id)
        return {'registration': registration, 'file': file_registration, 'comment': comment_registration, 'url': url_registration}

    def test_comments_on_deleted_files_are_not_returned(self, app, user, project_private_dict):
        # Delete the commented-on file, then verify its comment is excluded
        project_private_dict['file'].delete(user=user)
        res = app.get(project_private_dict['url'], auth=user.auth)
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert project_private_dict['comment']._id not in comment_ids

@pytest.mark.django_db
class TestNodeCommentsListWiki(NodeCommentsListMixin):

    @pytest.fixture()
    def project_private_dict(self, user):
        project_private = ProjectFactory(is_public=False, creator=user)
        wiki_private = NodeWikiFactory(node=project_private, user=user)
        comment_private = CommentFactory(node=project_private, user=user, target=Guid.load(wiki_private._id), page='wiki')
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        return {'project': project_private, 'wiki': wiki_private, 'comment': comment_private, 'url': url_private}

    @pytest.fixture()
    def project_public_dict(self, user):
        project_public = ProjectFactory(is_public=True, creator=user)
        wiki_public = NodeWikiFactory(node=project_public, user=user)
        comment_public = CommentFactory(node=project_public, user=user, target=Guid.load(wiki_public._id), page='wiki')
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        return {'project': project_public, 'wiki': wiki_public, 'comment': comment_public, 'url': url_public}

    @pytest.fixture()
    def registration_dict(self, user):
        registration = RegistrationFactory(creator=user)
        wiki_registration = NodeWikiFactory(node=registration, user=user)
        comment_registration = CommentFactory(node=registration, user=user, target=Guid.load(wiki_registration._id), page='wiki')
        url_registration = '/{}registrations/{}/comments/'.format(API_BASE, registration._id)
        return {'registration': registration, 'wiki': wiki_registration, 'comment': comment_registration, 'url': url_registration}

    def test_comments_on_deleted_wikis_are_not_returned(self, app, user, project_private_dict, mock_update_search=None):
        # Delete the commented-on wiki page, then verify its comment is excluded
        project_private_dict['project'].delete_node_wiki(project_private_dict['wiki'].page_name, core.Auth(user))
        res = app.get(project_private_dict['url'], auth=user.auth)
        assert res.status_code == 200
        comment_json = res.json['data']
        comment_ids = [comment['id'] for comment in comment_json]
        assert project_private_dict['comment']._id not in comment_ids

@pytest.mark.django_db
class NodeCommentsCreateMixin(object):

    @pytest.fixture()
    def user_read_contrib(self):
        return AuthUserFactory()

    @pytest.fixture()
    def user_non_contrib(self):
        return AuthUserFactory()

    @pytest.fixture()
    def payload(self):
        raise NotImplementedError

    @pytest.fixture()
    def project_private_comment_private(self):
        raise NotImplementedError

    @pytest.fixture()
    def project_public_comment_private(self):
        raise NotImplementedError

    @pytest.fixture()
    def project_public_comment_public(self):
        raise NotImplementedError

    @pytest.fixture()
    def project_private_comment_public(self):
        raise NotImplementedError

    def test_node_comments(self, app, user, user_read_contrib, user_non_contrib, project_private_comment_private, project_private_comment_public, project_public_comment_public, project_public_comment_private):
        # test_private_node_private_comment_level_logged_in_admin_can_comment
        project_dict = project_private_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_private_node_private_comment_level_logged_in_read_contributor_can_comment
        project_dict = project_private_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_read_contrib.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_private_node_private_comment_level_non_contributor_cannot_comment
        project_dict = project_private_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_non_contrib.auth, expect_errors=True)
        assert res.status_code == 403
        assert res.json['errors'][0]['detail'] == exceptions.PermissionDenied.default_detail

        # test_private_node_private_comment_level_logged_out_user_cannot_comment
        project_dict = project_private_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], expect_errors=True)
        assert res.status_code == 401
        assert res.json['errors'][0]['detail'] == exceptions.NotAuthenticated.default_detail

        # test_private_node_with_public_comment_level_admin_can_comment
        # Admin contributors can still comment on a private project with comment_level == 'public'
        project_dict = project_private_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_private_node_with_public_comment_level_read_only_contributor_can_comment
        # Read-only contributors can still comment on a private project with comment_level == 'public'
        project_dict = project_private_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_read_contrib.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_private_node_with_public_comment_level_non_contributor_cannot_comment
        # Non-contributors cannot comment on a private project with comment_level == 'public'
        project_dict = project_private_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_non_contrib.auth, expect_errors=True)
        assert res.status_code == 403

        # test_private_node_with_public_comment_level_logged_out_user_cannot_comment
        # Logged-out users cannot comment on a private project with comment_level == 'public'
        project_dict = project_private_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], expect_errors=True)
        assert res.status_code == 401
        assert res.json['errors'][0]['detail'] == exceptions.NotAuthenticated.default_detail

        # test_public_project_with_public_comment_level_admin_can_comment
        # Admin contributors can still comment on a public project where any logged-in user can comment (comment_level == 'public')
        project_dict = project_public_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_public_project_with_public_comment_level_read_only_contributor_can_comment
        # Read-only contributors can still comment on a public project where any logged-in user can comment (comment_level == 'public')
        project_dict = project_public_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_read_contrib.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_public_project_with_public_comment_level_non_contributor_can_comment
        # Non-contributors can comment on a public project where any logged-in user can comment (comment_level == 'public')
        project_dict = project_public_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_non_contrib.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_public_project_with_public_comment_level_logged_out_user_cannot_comment
        # Logged-out users cannot comment on a public project where any logged-in user can comment (comment_level == 'public')
        project_dict = project_public_comment_public
        res = app.post_json_api(project_dict['url'], project_dict['payload'], expect_errors=True)
        assert res.status_code == 401
        assert res.json['errors'][0]['detail'] == exceptions.NotAuthenticated.default_detail

        # test_public_node_private_comment_level_admin_can_comment
        # Only contributors can comment on a public project with comment_level == 'private'
        project_dict = project_public_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_public_node_private_comment_level_read_only_contributor_can_comment
        project_dict = project_public_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_read_contrib.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == project_dict['payload']['data']['attributes']['content']

        # test_public_node_private_comment_level_non_contributor_cannot_comment
        project_dict = project_public_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], auth=user_non_contrib.auth, expect_errors=True)
        assert res.status_code == 403
        assert res.json['errors'][0]['detail'] == exceptions.PermissionDenied.default_detail

        # test_public_node_private_comment_level_logged_out_user_cannot_comment
        project_dict = project_public_comment_private
        res = app.post_json_api(project_dict['url'], project_dict['payload'], expect_errors=True)
        assert res.status_code == 401
        assert res.json['errors'][0]['detail'] == exceptions.NotAuthenticated.default_detail

@pytest.mark.django_db
class TestNodeCommentCreate(NodeCommentsCreateMixin):

    @pytest.fixture()
    def payload(self):
        def make_payload(target_id):
            return {
                'data': {
                    'type': 'comments',
                    'attributes': {
                        'content': 'This is a comment'
                    },
                    'relationships': {
                        'target': {
                            'data': {
                                'type': 'nodes',
                                'id': target_id
                            }
                        }
                    }
                }
            }
        return make_payload

    @pytest.fixture()
    def project_private_comment_private(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user, comment_level='private')
        project_private.add_contributor(user_read_contrib, permissions=['read'])
        project_private.save()
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        payload_private = payload(project_private._id)
        return {'project': project_private, 'url': url_private, 'payload': payload_private}

    @pytest.fixture()
    def project_public_comment_private(self, user, user_read_contrib, payload):
        project_public = ProjectFactory(is_public=True, creator=user, comment_level='private')
        project_public.add_contributor(user_read_contrib, permissions=['read'])
        project_public.save()
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        payload_public = payload(project_public._id)
        return {'project': project_public, 'url': url_public, 'payload': payload_public}

    @pytest.fixture()
    def project_public_comment_public(self, user, user_read_contrib, payload):
        """Public project configured so that any logged-in user can comment."""
        project_public = ProjectFactory(is_public=True, creator=user)
        project_public.add_contributor(user_read_contrib, permissions=['read'])
        project_public.save()
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        payload_public = payload(project_public._id)
        return {'project': project_public, 'url': url_public, 'payload': payload_public}

    @pytest.fixture()
    def project_private_comment_public(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user)
        project_private.add_contributor(user_read_contrib, permissions=['read'])
        project_private.save()
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        payload_private = payload(project_private._id)
        return {'project': project_private, 'url': url_private, 'payload': payload_private}

    def test_create_comment_errors(self, app, user, payload, project_private_comment_private):
        # test_create_comment_invalid_data
        project_dict = project_private_comment_private
        res = app.post_json_api(project_dict['url'], 'Incorrect data', auth=user.auth, expect_errors=True)
        assert res.status_code == 400

        # test_create_comment_no_relationships
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': '4:44'
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Request must include /data/relationships.'
        assert res.json['errors'][0]['source']['pointer'] == '/data/relationships'

        # test_create_comment_empty_relationships
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'Center for Closed Logic'
                },
                'relationships': {}
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Request must include /data/relationships.'
        assert res.json['errors'][0]['source']['pointer'] == '/data/relationships'

        # test_create_comment_relationship_is_a_list
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': [{'id': project_dict['project']._id}]
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == exceptions.ParseError.default_detail

        # test_create_comment_target_no_data_in_relationships
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {}
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Request must include /data.'
        assert res.json['errors'][0]['source']['pointer'] == 'data/relationships/target/data'

        # test_create_comment_no_target_key_in_relationships
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'data': {
                        'type': 'nodes',
                        'id': project_dict['project']._id
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == exceptions.ParseError.default_detail

        # test_create_comment_blank_target_id
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'nodes',
                            'id': ''
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Invalid comment target \'\'.'

        # test_create_comment_invalid_target_id
        project_dict = project_private_comment_private
        project_id = ProjectFactory()._id
        payload_project = payload(project_id)
        res = app.post_json_api(project_dict['url'], payload_project, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Invalid comment target \'' + str(project_id) + '\'.'

        # test_create_comment_invalid_target_type
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'Invalid',
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 409
        assert res.json['errors'][0]['detail'] == 'The target resource has a type of "nodes", but you set the json body\'s type field to "Invalid". You probably need to change the type field to match the target resource\'s type.'

        # test_create_comment_no_target_type_in_relationships
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Request must include /type.'

        # test_create_comment_no_type
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': '',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'nodes',
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'This field may not be blank.'
        assert res.json['errors'][0]['source']['pointer'] == '/data/type'

        # test_create_comment_no_content
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': ''
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'nodes',
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'This field may not be blank.'
        assert res.json['errors'][0]['source']['pointer'] == '/data/attributes/content'

        # test_create_comment_trims_whitespace
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': '   '
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'nodes',
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'This field may not be blank.'

        # test_create_comment_exceeds_max_length
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': ('c' * (osf_settings.COMMENT_MAXLENGTH + 3))
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'nodes',
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Ensure this field has no more than {} characters.'.format(str(osf_settings.COMMENT_MAXLENGTH))

        # test_create_comment_invalid_target_node
        url_fake = '/{}nodes/{}/comments/'.format(API_BASE, 'abcde')
        payload_fake = payload('abcde')
        res = app.post_json_api(url_fake, payload_fake, auth=user.auth, expect_errors=True)
        assert res.status_code == 404
        assert res.json['errors'][0]['detail'] == exceptions.NotFound.default_detail

    def test_create_comment_with_allowed_tags(self, app, user, project_private_comment_private):
        project_dict = project_private_comment_private
        payload = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': '<em>Logic</em> <strong>Reason</strong>'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'nodes',
                            'id': project_dict['project']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload, auth=user.auth)
        assert res.status_code == 201
        assert res.json['data']['attributes']['content'] == payload['data']['attributes']['content']

@pytest.mark.django_db
class TestFileCommentCreate(NodeCommentsCreateMixin):

    @pytest.fixture()
    def payload(self):
        def make_payload(target_id):
            return {
                'data': {
                    'type': 'comments',
                    'attributes': {
                        'content': 'This is a comment'
                    },
                    'relationships': {
                        'target': {
                            'data': {
                                'type': 'files',
                                'id': target_id
                            }
                        }
                    }
                }
            }
        return make_payload

    @pytest.fixture()
    def project_private_comment_private(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user, comment_level='private')
        project_private.add_contributor(user_read_contrib, permissions=['read'])
        project_private.save()
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        file_private = test_utils.create_test_file(project_private, user)
        payload_private = payload(file_private.get_guid()._id)
        return {'project': project_private, 'url': url_private, 'file': file_private, 'payload': payload_private}

    @pytest.fixture()
    def project_public_comment_private(self, user, user_read_contrib, payload):
        project_public = ProjectFactory(is_public=True, creator=user, comment_level='private')
        project_public.add_contributor(user_read_contrib, permissions=['read'])
        project_public.save()
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        file_public = test_utils.create_test_file(project_public, user)
        payload_public = payload(file_public.get_guid()._id)
        return {'project': project_public, 'url': url_public, 'file': file_public, 'payload': payload_public}

    @pytest.fixture()
    def project_public_comment_public(self, user, user_read_contrib, payload):
        """Public project configured so that any logged-in user can comment."""
        project_public = ProjectFactory(is_public=True, creator=user)
        project_public.add_contributor(user_read_contrib, permissions=['read'])
        project_public.save()
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        file_public = test_utils.create_test_file(project_public, user)
        payload_public = payload(file_public.get_guid()._id)
        return {'project': project_public, 'url': url_public, 'file': file_public, 'payload': payload_public}

    @pytest.fixture()
    def project_private_comment_public(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user)
        project_private.add_contributor(user_read_contrib, permissions=['read'])
        project_private.save()
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        file_private = test_utils.create_test_file(project_private, user)
        payload_private = payload(file_private.get_guid()._id)
        return {'project': project_private, 'url': url_private, 'file': file_private, 'payload': payload_private}

    def test_create_file_comment_errors(self, app, user, payload, project_private_comment_private):
        # test_create_file_comment_invalid_target_id
        project_dict = project_private_comment_private
        file = test_utils.create_test_file(ProjectFactory(), user)
        payload_req = payload(file._id)
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Invalid comment target \'' + str(file._id) + '\'.'

        # test_create_file_comment_invalid_target_type
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'Invalid',
                            'id': project_dict['file'].get_guid()._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 409
        assert res.json['errors'][0]['detail'] == 'The target resource has a type of "files", but you set the json body\'s type field to "Invalid". You probably need to change the type field to match the target resource\'s type.'

@pytest.mark.django_db
class TestWikiCommentCreate(NodeCommentsCreateMixin):

    @pytest.fixture()
    def payload(self):
        def make_payload(target_id):
            return {
                'data': {
                    'type': 'comments',
                    'attributes': {
                        'content': 'This is a comment'
                    },
                    'relationships': {
                        'target': {
                            'data': {
                                'type': 'wiki',
                                'id': target_id
                            }
                        }
                    }
                }
            }
        return make_payload

    @pytest.fixture()
    def project_private_comment_private(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user, comment_level='private')
        project_private.add_contributor(user_read_contrib, permissions=['read'])
        project_private.save()
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        wiki = NodeWikiFactory(node=project_private, user=user)
        payload_private = payload(wiki._id)
        return {'project': project_private, 'url': url_private, 'wiki': wiki, 'payload': payload_private}

    @pytest.fixture()
    def project_public_comment_private(self, user, user_read_contrib, payload):
        project_public = ProjectFactory(is_public=True, creator=user, comment_level='private')
        project_public.add_contributor(user_read_contrib, permissions=['read'])
        project_public.save()
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        wiki = NodeWikiFactory(node=project_public, user=user)
        payload_public = payload(wiki._id)
        return {'project': project_public, 'url': url_public, 'wiki': wiki, 'payload': payload_public}

    @pytest.fixture()
    def project_public_comment_public(self, user, user_read_contrib, payload):
        """Public project configured so that any logged-in user can comment."""
        project_public = ProjectFactory(is_public=True, creator=user)
        project_public.add_contributor(user_read_contrib, permissions=['read'])
        project_public.save()
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        wiki = NodeWikiFactory(node=project_public, user=user)
        payload_public = payload(wiki._id)
        return {'project': project_public, 'url': url_public, 'wiki': wiki, 'payload': payload_public}

    @pytest.fixture()
    def project_private_comment_public(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user)
        project_private.add_contributor(user_read_contrib, permissions=['read'])
        project_private.save()
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        wiki = NodeWikiFactory(node=project_private, user=user)
        payload_private = payload(wiki._id)
        return {'project': project_private, 'url': url_private, 'wiki': wiki, 'payload': payload_private}

    def test_create_wiki_comment_errors(self, app, user, payload, project_private_comment_private, mock_update_search=None):
        # test_create_wiki_comment_invalid_target_id
        project_dict = project_private_comment_private
        wiki = NodeWikiFactory(node=ProjectFactory(), user=user)
        payload_req = payload(wiki._id)
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 400
        assert res.json['errors'][0]['detail'] == 'Invalid comment target \'' + str(wiki._id) + '\'.'

        # test_create_wiki_comment_invalid_target_type
        project_dict = project_private_comment_private
        payload_req = {
            'data': {
                'type': 'comments',
                'attributes': {
                    'content': 'This is a comment'
                },
                'relationships': {
                    'target': {
                        'data': {
                            'type': 'Invalid',
                            'id': project_dict['wiki']._id
                        }
                    }
                }
            }
        }
        res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
        assert res.status_code == 409
        assert res.json['errors'][0]['detail'] == 'The target resource has a type of "wiki", but you set the json body\'s type field to "Invalid". You probably need to change the type field to match the target resource\'s type.'

@pytest.mark.django_db
class TestCommentRepliesCreate(NodeCommentsCreateMixin):

    @pytest.fixture()
    def payload(self):
        def make_payload(comment_id):
            return {
                'data': {
                    'type': 'comments',
                    'attributes': {
                        'content': 'This is a comment'
                    },
                    'relationships': {
                        'target': {
                            'data': {
                                'type': 'comments',
                                'id': comment_id
                            }
                        }
                    }
                }
            }
        return make_payload

    @pytest.fixture()
    def project_private_comment_private(self, user, user_read_contrib, payload):
        project_private = ProjectFactory.create(is_public=False, creator=user, comment_level='private')
        project_private.add_contributor(user_read_contrib, permissions=['read'], save=True)
        comment_private = CommentFactory(node=project_private, user=user)
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        payload_private = payload(comment_private._id)
        return {'project': project_private, 'comment': comment_private, 'url': url_private, 'payload': payload_private}

    @pytest.fixture()
    def project_public_comment_private(self, user, user_read_contrib, payload):
        project_public = ProjectFactory.create(is_public=True, creator=user, comment_level='private')
        project_public.add_contributor(user_read_contrib, permissions=['read'], save=True)
        comment_public = CommentFactory(node=project_public, user=user)
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
        payload_public = payload(comment_public._id)
        return {'project': project_public, 'comment': comment_public, 'url': url_public, 'payload': payload_public}

    @pytest.fixture()
    def project_private_comment_public(self, user, user_read_contrib, payload):
        project_private = ProjectFactory(is_public=False, creator=user)
        project_private.add_contributor(user_read_contrib, permissions=['read'], save=True)
        comment_private = CommentFactory(node=project_private, user=user)
        comment_reply = CommentFactory(node=project_private, target=Guid.load(comment_private._id), user=user)
        url_private = '/{}nodes/{}/comments/'.format(API_BASE, project_private._id)
        payload_private = payload(comment_reply._id)
        return {'project': project_private, 'comment': comment_private, 'reply': comment_reply, 'url': url_private, 'payload': payload_private}

    @pytest.fixture()
    def project_public_comment_public(self, user, user_read_contrib, payload):
        project_public = ProjectFactory(is_public=True, creator=user)
        project_public.add_contributor(user_read_contrib, permissions=['read'], save=True)
        comment_public = CommentFactory(node=project_public, user=user)
        comment_reply = CommentFactory(node=project_public, target=Guid.load(comment_public._id), user=user)
        url_public = '/{}nodes/{}/comments/'.format(API_BASE, project_public._id)
payload_public = payload(comment_reply._id)
return {'project': project_public, 'comment': comment_public, 'reply': comment_reply, 'url': url_public, 'payload': payload_public}
def test_create_comment_reply_invalid_target_id(self, app, user, payload, project_private_comment_private):
project_dict = project_private_comment_private
target_comment = CommentFactory(node=ProjectFactory(), user=user)
payload_req = payload(target_comment._id)
res = app.post_json_api(project_dict['url'], payload_req, auth=user.auth, expect_errors=True)
assert res.status_code == 400
assert res.json['errors'][0]['detail'] == 'Invalid comment target \'' + str(target_comment._id) + '\'.'
@pytest.mark.django_db
class TestCommentFiltering:
@pytest.fixture()
def project(self, user):
return ProjectFactory(creator=user)
@pytest.fixture()
def comment(self, user, project):
return CommentFactory(node=project, user=user, page='node')
@pytest.fixture()
def comment_deleted(self, user, project):
return CommentFactory(node=project, user=user, is_deleted=True, page='node')
@pytest.fixture()
def url_base(self, project):
return '/{}nodes/{}/comments/'.format(API_BASE, project._id)
@pytest.fixture()
def date_created_formatted(self, comment):
return comment.created.strftime('%Y-%m-%dT%H:%M:%S.%f')
@pytest.fixture()
def date_modified_formatted(self, user, comment):
comment.edit('Edited comment', auth=core.Auth(user), save=True)
return comment.modified.strftime('%Y-%m-%dT%H:%M:%S.%f')
def test_filtering(self, app, user, project, comment, comment_deleted, date_created_formatted, date_modified_formatted, url_base):
# test_node_comments_with_no_filter_returns_all_comments
res = app.get(url_base, auth=user.auth)
assert len(res.json['data']) == 2
# test_filtering_for_deleted_comments
assert comment
assert comment_deleted
url = url_base + '?filter[deleted]=True'
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert res.json['data'][0]['attributes']['deleted']
# test_filtering_for_non_deleted_comments
assert comment
assert comment_deleted
url = url_base + '?filter[deleted]=False'
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert not res.json['data'][0]['attributes']['deleted']
# test_filtering_comments_created_before_date
url = url_base + '?filter[date_created][lt]={}'.format(date_created_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 0
# test_filtering_comments_created_on_date
url = url_base + '?filter[date_created][eq]={}'.format(date_created_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
# test_filtering_comments_created_on_or_before_date
url = url_base + '?filter[date_created][lte]={}'.format(date_created_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
# test_filtering_comments_created_after_date
url = url_base + '?filter[date_created][gt]={}'.format(date_created_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
# test_filtering_comments_created_on_or_after_date
url = url_base + '?filter[date_created][gte]={}'.format(date_created_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 2
# test_filtering_comments_modified_before_date
url = url_base + '?filter[date_modified][lt]={}'.format(date_modified_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
# test_filtering_comments_modified_on_date
url = url_base + '?filter[date_modified][eq]={}'.format(date_modified_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
# test_filtering_comments_modified_after_date
url = url_base + '?filter[date_modified][gt]={}'.format(date_modified_formatted)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 0
# test_filtering_by_target_node
url = url_base + '?filter[target]=' + str(project._id)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 2
assert project._id in res.json['data'][0]['relationships']['target']['links']['related']['href']
assert project._id in res.json['data'][1]['relationships']['target']['links']['related']['href']
# test_filtering_by_target_no_results
url = url_base + '?filter[target]=' + 'fakeid'
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 0
# test_filtering_by_target_no_results_with_related_counts
url = '{}?filter[target]=fakeid&related_counts=True'.format(url_base)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 0
# test_filtering_by_page_node
url = url_base + '?filter[page]=node'
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 2
assert 'node' == res.json['data'][0]['attributes']['page']
assert 'node' == res.json['data'][1]['attributes']['page']
def test_filtering_for_comment_replies(self, app, user, project, comment, comment_deleted, url_base):
reply = CommentFactory(node=project, user=user, target=Guid.load(comment._id))
url = url_base + '?filter[target]=' + str(comment._id)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert comment._id in res.json['data'][0]['relationships']['target']['links']['related']['href']
def test_filtering_by_target_file(self, app, user, project, comment, comment_deleted, url_base):
test_file = test_utils.create_test_file(project, user)
target = test_file.get_guid()
file_comment = CommentFactory(node=project, user=user, target=target)
url = url_base + '?filter[target]=' + str(target._id)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert test_file._id in res.json['data'][0]['relationships']['target']['links']['related']['href']
def test_filtering_by_target_wiki(self, app, user, project, comment, comment_deleted, url_base):
test_wiki = NodeWikiFactory(node=project, user=user)
wiki_comment = CommentFactory(node=project, user=user, target=Guid.load(test_wiki._id), page='wiki')
url = url_base + '?filter[target]=' + str(test_wiki._id)
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert test_wiki.get_absolute_url() == res.json['data'][0]['relationships']['target']['links']['related']['href']
def test_filtering_by_page_files(self, app, user, project, comment, comment_deleted, url_base):
test_file = test_utils.create_test_file(project, user)
file_comment = CommentFactory(node=project, user=user, target=test_file.get_guid(), page='files')
url = url_base + '?filter[page]=files'
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert 'files' == res.json['data'][0]['attributes']['page']
def test_filtering_by_page_wiki(self, app, user, project, comment, comment_deleted, url_base):
test_wiki = NodeWikiFactory(node=project, user=user)
wiki_comment = CommentFactory(node=project, user=user, target=Guid.load(test_wiki._id), page='wiki')
url = url_base + '?filter[page]=wiki'
res = app.get(url, auth=user.auth)
assert len(res.json['data']) == 1
assert 'wiki' == res.json['data'][0]['attributes']['page']
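The filter URLs exercised in these tests follow the JSON:API convention of `filter[field]` and `filter[field][op]` query parameters. A minimal standalone sketch of how such querystrings are formed — the helper name and base path here are illustrative, not OSF code:

```python
try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode  # Python 2

def filter_url(base, field, value, op=None):
    # Build e.g. "?filter[date_created][gte]=<value>"; urlencode
    # percent-encodes the brackets, which the server decodes again.
    key = "filter[{0}]".format(field)
    if op is not None:
        key += "[{0}]".format(op)
    return base + "?" + urlencode({key: value})

print(filter_url("/v2/nodes/abc12/comments/", "deleted", "True"))
print(filter_url("/v2/nodes/abc12/comments/", "date_created", "2016-01-01", op="gte"))
```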
# ---- aptenodytes/__init__.py (repo: yongrenjie/aptenodytes, license: MIT) ----
from .main import *
from .molecules import *
# ---- mpf/tests/test_Autofire.py (repo: enteryourinitials/mpf, license: MIT) ----
from unittest.mock import MagicMock
from mpf.platforms.interfaces.driver_platform_interface import PulseSettings
from mpf.core.platform import SwitchSettings, DriverSettings
from mpf.tests.MpfTestCase import MpfTestCase
class TestAutofire(MpfTestCase):
def get_config_file(self):
return 'config.yaml'
def get_machine_path(self):
return 'tests/machine_files/autofire/'
def test_hw_rule_pulse(self):
self.machine.default_platform.set_pulse_on_hit_rule = MagicMock()
self.machine.autofires["ac_test"].enable()
self.machine.default_platform.set_pulse_on_hit_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test"].hw_switch, invert=False, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=True)
)
self.machine.default_platform.clear_hw_rule = MagicMock()
self.machine.autofires["ac_test"].disable()
self.machine.default_platform.clear_hw_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test"].hw_switch, invert=False, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=True))
def test_hw_rule_overwrites(self):
self.machine.default_platform.set_pulse_on_hit_rule = MagicMock()
self.machine.autofires["ac_test_overwrites"].enable()
self.machine.default_platform.set_pulse_on_hit_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test"].hw_switch, invert=False, debounce=True),
DriverSettings(hw_driver=self.machine.coils["c_test"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=False)
)
self.machine.default_platform.clear_hw_rule = MagicMock()
self.machine.autofires["ac_test_overwrites"].disable()
self.machine.default_platform.clear_hw_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test"].hw_switch, invert=False, debounce=True),
DriverSettings(hw_driver=self.machine.coils["c_test"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=False))
self.machine.default_platform.set_pulse_on_hit_rule = MagicMock()
self.machine.autofires["ac_test_overwrites2"].enable()
self.machine.default_platform.set_pulse_on_hit_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test_debounce_on"].hw_switch, invert=False, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test_recycle_off"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=10), hold_settings=None, recycle=True)
)
self.machine.default_platform.clear_hw_rule = MagicMock()
self.machine.autofires["ac_test_overwrites2"].disable()
self.machine.default_platform.clear_hw_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test_debounce_on"].hw_switch, invert=False, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test_recycle_off"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=10), hold_settings=None, recycle=True))
def test_hw_rule_coil_and_switch_defaults(self):
self.machine.default_platform.set_pulse_on_hit_rule = MagicMock()
self.machine.autofires["ac_test_defaults"].enable()
self.machine.default_platform.set_pulse_on_hit_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test_debounce_on"].hw_switch, invert=False, debounce=True),
DriverSettings(hw_driver=self.machine.coils["c_test_recycle_off"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=10), hold_settings=None, recycle=False)
)
self.machine.default_platform.clear_hw_rule = MagicMock()
self.machine.autofires["ac_test_defaults"].disable()
self.machine.default_platform.clear_hw_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test_debounce_on"].hw_switch, invert=False, debounce=True),
DriverSettings(hw_driver=self.machine.coils["c_test_recycle_off"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=10), hold_settings=None, recycle=False))
def test_hw_rule_pulse_inverted_switch(self):
self.machine.default_platform.set_pulse_on_hit_rule = MagicMock()
self.machine.autofires["ac_test_inverted"].enable()
self.machine.default_platform.set_pulse_on_hit_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test_nc"].hw_switch, invert=True, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test2"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=True)
)
self.machine.default_platform.clear_hw_rule = MagicMock()
self.machine.autofires["ac_test_inverted"].disable()
self.machine.default_platform.clear_hw_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test_nc"].hw_switch, invert=True, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test2"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=True))
def test_hw_rule_pulse_inverted_autofire(self):
self.machine.default_platform.set_pulse_on_hit_rule = MagicMock()
self.machine.autofires["ac_test_inverted2"].enable()
self.machine.default_platform.set_pulse_on_hit_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test"].hw_switch, invert=True, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test2"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=True)
)
self.machine.default_platform.clear_hw_rule = MagicMock()
self.machine.autofires["ac_test_inverted2"].disable()
self.machine.default_platform.clear_hw_rule.assert_called_once_with(
SwitchSettings(hw_switch=self.machine.switches["s_test"].hw_switch, invert=True, debounce=False),
DriverSettings(hw_driver=self.machine.coils["c_test2"].hw_driver,
pulse_settings=PulseSettings(power=1.0, duration=23), hold_settings=None, recycle=True))
def test_disabled(self):
"""Verify that a disabled autofire coil doesn't post 'playfield_active'."""
self.mock_event("playfield_active")
self.machine_run()
self.hit_and_release_switch("s_test_disabled")
self.machine_run()
self.assertEventNotCalled("playfield_active")
self.machine.autofires["ac_test_disabled"].enable()
self.hit_and_release_switch("s_test_disabled")
self.machine_run()
self.assertEventCalled("playfield_active", times=1)
self.machine.autofires["ac_test_disabled"].disable()
self.hit_and_release_switch("s_test_disabled")
self.machine_run()
self.assertEventCalled("playfield_active", times=1)
def test_timeout(self):
self.machine.autofires["ac_test_timeout"].enable()
self.machine_run()
# 9 hits are ok
for _ in range(9):
self.hit_and_release_switch("s_test")
self.machine_run()
self.assertTrue(self.machine.autofires["ac_test_timeout"]._enabled)
# 10th hit should disable it
self.hit_and_release_switch("s_test")
self.machine_run()
self.assertFalse(self.machine.autofires["ac_test_timeout"]._enabled)
# reenable after 500ms
self.advance_time_and_run(.6)
self.assertTrue(self.machine.autofires["ac_test_timeout"]._enabled)
        # expire the older hits
self.advance_time_and_run(1)
# 9 hits are ok
for _ in range(9):
self.hit_and_release_switch("s_test")
self.machine_run()
self.assertTrue(self.machine.autofires["ac_test_timeout"]._enabled)
# wait 1s
self.advance_time_and_run(1)
# another 9 hits are ok
for _ in range(9):
self.hit_and_release_switch("s_test")
self.machine_run()
self.assertTrue(self.machine.autofires["ac_test_timeout"]._enabled)
# 10th hit should disable it
self.hit_and_release_switch("s_test")
self.machine_run()
self.assertFalse(self.machine.autofires["ac_test_timeout"]._enabled)
self.advance_time_and_run(.2)
# disable manually while disabled by too many hits
self.machine.autofires["ac_test_timeout"].disable()
self.assertFalse(self.machine.autofires["ac_test_timeout"]._enabled)
# should not reenable
self.advance_time_and_run(.4)
self.assertFalse(self.machine.autofires["ac_test_timeout"]._enabled)
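The timeout behaviour tested above (a tenth hit inside a one-second window disables the autofire rule, which re-enables after a delay) amounts to a sliding-window hit counter. A rough standalone sketch of that idea — the class and parameter names are made up here, not taken from the mpf codebase:

```python
from collections import deque

class HitWindow:
    def __init__(self, max_hits, window):
        self.max_hits = max_hits
        self.window = window
        self._hits = deque()

    def record(self, now):
        """Return True while under the limit, False once it is exceeded."""
        self._hits.append(now)
        while self._hits and now - self._hits[0] > self.window:
            self._hits.popleft()  # drop hits older than the window
        return len(self._hits) <= self.max_hits

limiter = HitWindow(max_hits=9, window=1.0)
results = [limiter.record(t * 0.01) for t in range(10)]
print(results[8], results[9])  # 9th hit is ok, 10th trips the limit
```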
# ---- app/api/random_positions.py (repo: justinycho-business/battleships, license: MIT) ----
import random
def get_block_square_positions(currentlyoccupiedpositions):
positions = []
square_orientation_dic = [
"UpperLeft",
"UpperRight",
"LowerLeft",
"LowerRight"
]
block_square_not_positioned = True
list_of_block_square_indexes = []
while block_square_not_positioned:
block_square_not_positioned = False
        square_orientation = random.choice(square_orientation_dic)
        print(square_orientation)
        index_to_start = random.randrange(64)
        print(index_to_start)
        initial_block_position = index_to_start
        currentIdx = initial_block_position
        if square_orientation == "UpperLeft":
            list_of_block_square_indexes.append(currentIdx)
            list_of_block_square_indexes.append(currentIdx-8)
            list_of_block_square_indexes.append(currentIdx-9)
            list_of_block_square_indexes.append(currentIdx-1)
        elif square_orientation == "UpperRight":
            list_of_block_square_indexes.append(currentIdx)
            list_of_block_square_indexes.append(currentIdx-8)
            list_of_block_square_indexes.append(currentIdx-7)
            list_of_block_square_indexes.append(currentIdx+1)
        elif square_orientation == 'LowerLeft':
            list_of_block_square_indexes.append(currentIdx)
            list_of_block_square_indexes.append(currentIdx+8)
            list_of_block_square_indexes.append(currentIdx+7)
            list_of_block_square_indexes.append(currentIdx-1)
        elif square_orientation == 'LowerRight':
list_of_block_square_indexes.append(currentIdx)
list_of_block_square_indexes.append(currentIdx+8)
list_of_block_square_indexes.append(currentIdx+9)
list_of_block_square_indexes.append(currentIdx+1)
for i in list_of_block_square_indexes:
if (i > 63) or (i < 0) or (i in currentlyoccupiedpositions):
block_square_not_positioned = True
if block_square_not_positioned:
list_of_block_square_indexes = []
positions = list_of_block_square_indexes
return positions
def get_block_L_positions():
positions = []
l_orietation_dic = [
"DownDownLeft", "DownDownRight", "UpUpRight", "UpUpLeft",
"LeftDownDown", "LeftUpUp", "RightDownDown", "RightUpUp",
"UpLeftLeft", "UpRightRight", "DownRightRight", "DownLeftLeft",
"LeftLeftUp", "RightRightUp", "RightRightDown", "LeftLeftDown"
]
block_L_not_positioned = True
list_of_block_L_indexes = []
while block_L_not_positioned:
block_L_not_positioned = False
l_orietation = random.choice(l_orietation_dic)
print(l_orietation)
        index_to_start = random.randrange(64)
        print(index_to_start)
        initial_block_L_position = index_to_start
        currentIdx = initial_block_L_position
if l_orietation == "DownDownLeft":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx-1)
elif l_orietation == "DownDownRight":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx+1)
elif l_orietation == "UpUpRight":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx+1)
elif l_orietation == "UpUpLeft":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx-1)
elif l_orietation == "LeftDownDown":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx+8)
elif l_orietation == "LeftUpUp":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx-8)
elif l_orietation == "RightDownDown":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx+8)
elif l_orietation == "RightUpUp":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx-8)
elif l_orietation == "UpLeftLeft":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx-1)
elif l_orietation == "UpRightRight":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-8)
currentIdx -= 8
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx+1)
elif l_orietation == "DownRightRight":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx+1)
elif l_orietation == "DownLeftLeft":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+8)
currentIdx += 8
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx-1)
elif l_orietation == "LeftLeftUp":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx-8)
elif l_orietation == "RightRightUp":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx-8)
elif l_orietation == "RightRightDown":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx+1)
currentIdx += 1
list_of_block_L_indexes.append(currentIdx+8)
elif l_orietation == "LeftLeftDown":
list_of_block_L_indexes.append(currentIdx)
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx-1)
currentIdx -= 1
list_of_block_L_indexes.append(currentIdx+8)
for i in list_of_block_L_indexes:
if i > 63 or i < 0:
block_L_not_positioned = True
if block_L_not_positioned:
list_of_block_L_indexes = []
positions = list_of_block_L_indexes
return positions
def get_random_positions_for_blocks():
allpositions = []
dic_of_positions = {}
for i in get_block_L_positions():
allpositions.append(i)
dic_of_positions[i] = i
for j in get_block_square_positions(dic_of_positions):
allpositions.append(j)
dic_of_positions[j] = j
for x in get_block_line_positions(dic_of_positions):
allpositions.append(x)
dic_of_positions[x] = x
for z in get_block_line_positions(dic_of_positions):
allpositions.append(z)
dic_of_positions[z] = z
return allpositions
def get_block_line_positions(currentlyoccupiedpositions):
positions = []
line_orientation_dic = [
"Up",
"Down",
"Left",
"Right"
]
block_line_not_positioned = True
list_of_block_square_indexes = []
while block_line_not_positioned:
block_line_not_positioned = False
        line_orientation = random.choice(line_orientation_dic)
        print(line_orientation)
        index_to_start = random.randrange(64)
        print(index_to_start)
        initial_block_position = index_to_start
        currentIdx = initial_block_position
        if line_orientation == "Up":
            list_of_block_square_indexes.append(currentIdx)
            list_of_block_square_indexes.append(currentIdx-8)
            list_of_block_square_indexes.append(currentIdx-16)
            list_of_block_square_indexes.append(currentIdx-24)
        elif line_orientation == "Down":
            list_of_block_square_indexes.append(currentIdx)
            list_of_block_square_indexes.append(currentIdx+8)
            list_of_block_square_indexes.append(currentIdx+16)
            list_of_block_square_indexes.append(currentIdx+24)
        elif line_orientation == 'Left':
            list_of_block_square_indexes.append(currentIdx)
            list_of_block_square_indexes.append(currentIdx-1)
            list_of_block_square_indexes.append(currentIdx-2)
            list_of_block_square_indexes.append(currentIdx-3)
        elif line_orientation == 'Right':
list_of_block_square_indexes.append(currentIdx)
list_of_block_square_indexes.append(currentIdx+1)
list_of_block_square_indexes.append(currentIdx+2)
list_of_block_square_indexes.append(currentIdx+3)
for i in list_of_block_square_indexes:
if (i > 63) or (i < 0) or (i in currentlyoccupiedpositions):
block_line_not_positioned = True
if block_line_not_positioned:
list_of_block_square_indexes = []
positions = list_of_block_square_indexes
return positions
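All three placement helpers validate candidate cells with the same `0 <= i <= 63` range check on an 8x8 board. One caveat that check alone cannot catch: horizontal steps near a row edge wrap to the neighbouring row (index 8 minus 1 is 7, still on the board but on the previous row). A hedged sketch of a row-aware check — these helper names are not part of this module:

```python
def cells_stay_on_board(indexes):
    # Same range check the functions above use.
    return all(0 <= i <= 63 for i in indexes)

def cells_share_rows(indexes):
    # For a horizontal run, every cell must sit on a single row;
    # divmod(i, 8) maps a flat index to (row, col).
    rows = {divmod(i, 8)[0] for i in indexes}
    return len(rows) == 1

# A "Right" line starting at index 5 passes the range check but wraps rows:
print(cells_stay_on_board([5, 6, 7, 8]), cells_share_rows([5, 6, 7, 8]))
```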
# ---- tests/test_qmock.py (repo: fds-ajacobs/qmock-py, license: Apache-2.0) ----
from collections import OrderedDict
import signal
import sys
from threading import Thread
import unittest
import qmock
from qmock._python_compat import get_thread_id, mock
# arbitrary targets for qmock.patch() tests
import datetime, json, xml.etree.ElementTree
DATETIME_DATE = "datetime.date"
JSON_LOADS = "json.loads"
XML_ETREE_ELEMENTTREE = "xml.etree.ElementTree"
PY2 = sys.version_info[0] < 3
class QMockErrorsInThreadsTests(unittest.TestCase):
def test_str(self):
error = qmock.QMockErrorsInThreads(
[RuntimeError("foo"), ValueError("bar"), KeyError("baz")]
)
self.assertEqual(
str(error),
"Unhandled QMock errors raised in other threads:"
" ["
+ repr(RuntimeError("foo")) + ", "
+ repr(ValueError("bar")) + ", "
+ repr(KeyError("baz"))
+ "]"
)
class patchTests(unittest.TestCase):
def setUp(self):
self._assert_no_patches()
def tearDown(self):
self._assert_no_patches()
def _assert_no_patches(self):
self._assert_datetime_is_not_patched()
self._assert_json_is_not_patched()
self._assert_xml_etree_is_not_patched()
def _assert_datetime_is_not_patched(self):
self.assertEqual(
str(datetime.date(1, 2, 3)),
"0001-02-03"
)
def _assert_json_is_not_patched(self):
self.assertEqual(
json.loads("[1,2,3]"),
[1, 2, 3]
)
def _assert_xml_etree_is_not_patched(self):
self.assertEqual(
xml.etree.ElementTree.fromstring("<foo />").tag,
"foo"
)
def _force_unexpected_call_in_thread(self, qm):
try:
thread = Thread(target=qm.an_unknown_call)
thread.start()
# we expect the thread to die immediately because of an
# UnexpectedCall. the alarms are an abundance of caution.
signal.alarm(1)
thread.join()
signal.alarm(0)
except BaseException as ex:
self.fail("Thread setup caught: {0!r}".format(ex))
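The helper above relies on qmock recording the UnexpectedCall that kills the worker thread. The generic pattern underneath — stashing a worker's exception so the joining thread can inspect it — can be sketched standalone; the names here are illustrative, not qmock internals:

```python
import threading

class CatchingThread(threading.Thread):
    def run(self):
        self.error = None
        try:
            super(CatchingThread, self).run()
        except Exception as ex:
            self.error = ex  # stash it for whoever join()s us

def boom():
    raise KeyError("from worker")

t = CatchingThread(target=boom)
t.start()
t.join()
print(type(t.error).__name__)  # KeyError
```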
def _assert_thread_qmock_errors(self, errors_in_thread_error):
"""
QMockErrorsInThreads.errors should contain a single
UnexpectedCall raised in a different thread.
"""
qmock_errors_from_threads = errors_in_thread_error.errors
self.assertEqual(len(qmock_errors_from_threads), 1)
qmock_error_tid, qmock_error = qmock_errors_from_threads[0]
self.assertNotEqual(qmock_error_tid, get_thread_id())
self.assertIsInstance(qmock_error, qmock.UnexpectedCall)
def _assert_patched_func_error(self, errors_in_thread_error,
expected_func_error_type):
"""
in Python 3, when multiple exceptions are being handled at once,
each exception has a __context__ which is the last exception
raised before this one (or `None`, if this is the first
exception in the current batch of active exceptions).
so QMockErrorsInThreads.__context__ should be the exception
raised by the function/scope being patched.
"""
if PY2:
# Python 2 has no __context__
return
patched_func_error = errors_in_thread_error.__context__
if expected_func_error_type is None:
self.assertIsNone(patched_func_error)
else:
self.assertIsInstance(patched_func_error, expected_func_error_type)
def test_empty_function_decorator_succeeds(self):
@qmock.patch()
def foo(qm):
self._assert_no_patches()
qm.call_queue.push(qmock.call.bar(), 5)
self.assertEqual(qm.bar(), 5)
foo() # no raise == success
# a little silly because nothing is being patched, but just in case.
def test_empty_function_decorator_cleans_up_on_func_exception(self):
@qmock.patch()
def foo(qm):
self._assert_no_patches()
raise RuntimeError("TEST")
self.assertRaises(RuntimeError, foo)
def test_empty_function_decorator_raises_on_exit_if_queue_not_empty(self):
@qmock.patch()
def foo(qm):
self._assert_no_patches()
qm.call_queue.push(qmock.call.bar(), 5)
self.assertRaises(qmock.CallQueueNotEmpty, foo)
def test_empty_function_decorator_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
@qmock.patch()
def foo(qm):
self._assert_no_patches()
# would raise CallQueueNotEmpty if not handling RuntimeError
qm.call_queue.push(qmock.call.bar(), 5)
raise RuntimeError("TEST")
self.assertRaises(RuntimeError, foo)
def test_empty_function_decorator_raises_on_exit_if_errors_in_threads(self):
@qmock.patch()
def foo(qm):
self._assert_no_patches()
self._force_unexpected_call_in_thread(qm)
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_empty_function_decorator_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
@qmock.patch()
def foo(qm):
self._assert_no_patches()
# raises QMockErrorsInThreads on top of RuntimeError
self._force_unexpected_call_in_thread(qm)
raise RuntimeError("TEST")
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, RuntimeError)
def test_single_patch_function_decorator_succeeds(self):
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
self.assertEqual(datetime.date(1, 2, 3), 7)
self._assert_no_patches()
foo()
def test_single_patch_function_decorator_cleans_up_on_func_exception(self):
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
raise ValueError("TEST")
self._assert_no_patches()
self.assertRaises(ValueError, foo)
def test_single_patch_function_decorator_cleans_up_on_bad_patch(self):
@qmock.patch(dt="datetime.BAD")
def foo(qm):
self.fail("This test function should not run.")
self._assert_no_patches()
self.assertRaises(AttributeError, foo)
def test_single_patch_function_decorator_raises_on_exit_if_queue_not_empty(self):
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
self._assert_no_patches()
self.assertRaises(qmock.CallQueueNotEmpty, foo)
def test_single_patch_function_decorator_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
# would raise CallQueueNotEmpty if not handling ValueError
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
raise ValueError("TEST")
self._assert_no_patches()
self.assertRaises(ValueError, foo)
def test_single_patch_function_decorator_raises_on_exit_if_errors_in_threads(self):
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
self._force_unexpected_call_in_thread(qm)
self._assert_no_patches()
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_single_patch_function_decorator_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
# raises QMockErrorsInThreads on top of ValueError
self._force_unexpected_call_in_thread(qm)
raise ValueError("TEST")
self._assert_no_patches()
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, ValueError)
def test_multi_patch_function_decorator_succeeds(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self.assertEqual(datetime.date(1, 2, 3), "a")
self.assertEqual(xml.etree.ElementTree.fromstring("<foo />"), "b")
self.assertEqual(json.loads("[1,2,3]"), "c")
self._assert_no_patches()
foo()
def test_multi_patch_function_decorator_cleans_up_on_func_exception(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE)
def foo(qm):
raise KeyError("TEST")
self._assert_no_patches()
self.assertRaises(KeyError, foo)
def test_multi_patch_function_decorator_cleans_up_on_bad_patch(self):
@qmock.patch(dt=DATETIME_DATE, json="json.BAD", et=XML_ETREE_ELEMENTTREE)
def foo(qm):
self.fail("This test function should not run.")
self._assert_no_patches()
self.assertRaises(AttributeError, foo)
def test_multi_patch_function_decorator_raises_on_exit_if_queue_not_empty(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self._assert_no_patches()
self.assertRaises(qmock.CallQueueNotEmpty, foo)
def test_multi_patch_function_decorator_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE)
def foo(qm):
# would raise CallQueueNotEmpty if not handling KeyError
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
raise KeyError("TEST")
self._assert_no_patches()
self.assertRaises(KeyError, foo)
def test_multi_patch_function_decorator_raises_on_exit_if_errors_in_threads(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE)
def foo(qm):
self._force_unexpected_call_in_thread(qm)
self._assert_no_patches()
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_multi_patch_function_decorator_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE)
def foo(qm):
# raises QMockErrorsInThreads on top of KeyError
self._force_unexpected_call_in_thread(qm)
raise KeyError("TEST")
self._assert_no_patches()
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, KeyError)
def test_stacked_function_decorator_succeeds(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json=JSON_LOADS)
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self.assertEqual(datetime.date(1, 2, 3), "a")
self.assertEqual(xml.etree.ElementTree.fromstring("<foo />"), "b")
self.assertEqual(json.loads("[1,2,3]"), "c")
self._assert_no_patches()
foo()
def test_stacked_function_decorator_cleans_up_on_func_exception(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json=JSON_LOADS)
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
raise IndexError("TEST")
self._assert_no_patches()
self.assertRaises(IndexError, foo)
def test_stacked_function_decorator_cleans_up_on_bad_patch(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json="json.BAD")
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
self.fail("This test function should not run.")
self._assert_no_patches()
self.assertRaises(AttributeError, foo)
def test_stacked_function_decorator_raises_on_exit_if_queue_not_empty(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json=JSON_LOADS)
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self._assert_no_patches()
self.assertRaises(qmock.CallQueueNotEmpty, foo)
def test_stacked_function_decorator_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json=JSON_LOADS)
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
# would raise CallQueueNotEmpty if not handling IndexError
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
raise IndexError("TEST")
self._assert_no_patches()
self.assertRaises(IndexError, foo)
def test_stacked_function_decorator_raises_on_exit_if_errors_in_threads(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json=JSON_LOADS)
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
self._force_unexpected_call_in_thread(qm)
self._assert_no_patches()
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_stacked_function_decorator_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(json=JSON_LOADS)
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def foo(qm):
# raises QMockErrorsInThreads on top of IndexError
self._force_unexpected_call_in_thread(qm)
raise IndexError("TEST")
self._assert_no_patches()
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
foo()
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, IndexError)
def test_class_decorator_only_patches_test_methods(self):
@qmock.patch(dt=DATETIME_DATE)
class Foo(object):
fizz = "a"
test_buzz = "b"
def bar(foo_self):
self._assert_no_patches()
def test_baz(foo_self, qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
self.assertEqual(datetime.date(1, 2, 3), 7)
self._assert_no_patches()
f = Foo()
self._assert_no_patches()
self.assertEqual(f.fizz, "a")
self._assert_no_patches()
self.assertEqual(f.test_buzz, "b")
self._assert_no_patches()
f.bar()
self._assert_no_patches()
f.test_baz()
def test_mixed_decorator_patches(self):
@qmock.patch(dt=DATETIME_DATE, json=JSON_LOADS)
class Foo(object):
@qmock.patch(et=XML_ETREE_ELEMENTTREE)
def test_mixed(foo_self, qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self.assertEqual(datetime.date(1, 2, 3), "a")
self.assertEqual(xml.etree.ElementTree.fromstring("<foo />"), "b")
self.assertEqual(json.loads("[1,2,3]"), "c")
def test_no_cross_mix_between_methods(foo_self, qm):
self._assert_xml_etree_is_not_patched()
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self.assertEqual(datetime.date(1, 2, 3), "a")
self.assertEqual(json.loads("[1,2,3]"), "c")
@qmock.patch(et="xml.etree.BAD")
def test_bad_patch(foo_self, qm):
self.fail("This test function should not run.")
self._assert_no_patches()
f = Foo()
self._assert_no_patches()
f.test_mixed()
self._assert_no_patches()
f.test_no_cross_mix_between_methods()
self._assert_no_patches()
self.assertRaises(AttributeError, f.test_bad_patch)
def test_empty_context_manager_succeeds(self):
with qmock.patch() as qm:
self._assert_no_patches()
qm.call_queue.push(qmock.call.bar(), 5)
self.assertEqual(qm.bar(), 5)
# a little silly because nothing is being patched, but just in case.
def test_empty_context_manager_cleans_up_on_func_exception(self):
with self.assertRaises(RuntimeError):
with qmock.patch() as qm:
self._assert_no_patches()
raise RuntimeError("TEST")
def test_empty_context_manager_raises_on_exit_if_queue_not_empty(self):
with self.assertRaises(qmock.CallQueueNotEmpty):
with qmock.patch() as qm:
self._assert_no_patches()
qm.call_queue.push(qmock.call.bar(), 5)
def test_empty_context_manager_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
with self.assertRaises(RuntimeError):
with qmock.patch() as qm:
self._assert_no_patches()
# would raise CallQueueNotEmpty if not handling RuntimeError
qm.call_queue.push(qmock.call.bar(), 5)
raise RuntimeError("TEST")
def test_empty_context_manager_raises_on_exit_if_errors_in_threads(self):
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
with qmock.patch() as qm:
self._assert_no_patches()
self._force_unexpected_call_in_thread(qm)
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_empty_context_manager_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
with qmock.patch() as qm:
self._assert_no_patches()
# raises QMockErrorsInThreads on top of RuntimeError
self._force_unexpected_call_in_thread(qm)
raise RuntimeError("TEST")
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, RuntimeError)
def test_single_patch_context_manager_succeeds(self):
with qmock.patch(dt=DATETIME_DATE) as qm:
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
self.assertEqual(datetime.date(1, 2, 3), 7)
def test_single_patch_context_manager_cleans_up_on_func_exception(self):
with self.assertRaises(ValueError):
with qmock.patch(dt=DATETIME_DATE) as qm:
raise ValueError("TEST")
def test_single_patch_context_manager_cleans_up_on_bad_patch(self):
with self.assertRaises(AttributeError):
with qmock.patch(dt="datetime.BAD") as qm:
self.fail("This context should not be entered.")
def test_single_patch_context_manager_raises_on_exit_if_queue_not_empty(self):
with self.assertRaises(qmock.CallQueueNotEmpty):
with qmock.patch(dt=DATETIME_DATE) as qm:
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
def test_single_patch_context_manager_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
with self.assertRaises(ValueError):
with qmock.patch(dt=DATETIME_DATE) as qm:
# would raise CallQueueNotEmpty if not handling ValueError
qm.call_queue.push(qmock.call.dt(1, 2, 3), 7)
raise ValueError("TEST")
def test_single_patch_context_manager_raises_on_exit_if_errors_in_threads(self):
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
with qmock.patch(dt=DATETIME_DATE) as qm:
self._force_unexpected_call_in_thread(qm)
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_single_patch_context_manager_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
with qmock.patch(dt=DATETIME_DATE) as qm:
# raises QMockErrorsInThreads on top of ValueError
self._force_unexpected_call_in_thread(qm)
raise ValueError("TEST")
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, ValueError)
    def test_multi_patch_context_manager_succeeds(self):
with qmock.patch(
dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE
) as qm:
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
self.assertEqual(datetime.date(1, 2, 3), "a")
self.assertEqual(xml.etree.ElementTree.fromstring("<foo />"), "b")
self.assertEqual(json.loads("[1,2,3]"), "c")
def test_multi_patch_context_manager_cleans_up_on_func_exception(self):
with self.assertRaises(KeyError):
with qmock.patch(
dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE
) as qm:
raise KeyError("TEST")
def test_multi_patch_context_manager_cleans_up_on_bad_patch(self):
with self.assertRaises(AttributeError):
with qmock.patch(
dt=DATETIME_DATE, json="json.BAD", et=XML_ETREE_ELEMENTTREE
) as qm:
self.fail("This context should not be entered.")
def test_multi_patch_context_manager_raises_on_exit_if_queue_not_empty(self):
with self.assertRaises(qmock.CallQueueNotEmpty):
with qmock.patch(
dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE
) as qm:
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
def test_multi_patch_context_manager_doesnt_raise_on_exit_if_queue_not_empty_and_func_exception(self):
with self.assertRaises(KeyError):
with qmock.patch(
dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE
) as qm:
# would raise CallQueueNotEmpty if not handling KeyError
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.et.fromstring("<foo />"), "b")
qm.call_queue.push(qmock.call.json("[1,2,3]"), "c")
raise KeyError("TEST")
def test_multi_patch_context_manager_raises_on_exit_if_errors_in_threads(self):
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
with qmock.patch(
dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE
) as qm:
self._force_unexpected_call_in_thread(qm)
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, None)
def test_multi_patch_context_manager_still_raises_on_exit_if_errors_in_threads_and_func_exception(self):
with self.assertRaises(qmock.QMockErrorsInThreads) as assertion:
with qmock.patch(
dt=DATETIME_DATE, json=JSON_LOADS, et=XML_ETREE_ELEMENTTREE
) as qm:
# raises QMockErrorsInThreads on top of KeyError
self._force_unexpected_call_in_thread(qm)
raise KeyError("TEST")
self._assert_thread_qmock_errors(assertion.exception)
self._assert_patched_func_error(assertion.exception, KeyError)
#
# degenerate cases
#
def test_duplicate_patch_succeeds(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(dt=DATETIME_DATE)
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
self.assertEqual(datetime.date(1, 2, 3), "a")
self._assert_no_patches()
foo()
# this also indirectly tests that stacked patches are applied strictly
# bottom-up.
def test_same_patch_on_different_attr_is_weird(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(date=DATETIME_DATE)
def foo(qm):
# this is the wrong call to expect because the last patch for
# datetime.date was assigned to the `dt` attr
qm.call_queue.push(qmock.call.date(1, 2, 3), "a")
with self.assertRaises(qmock.UnexpectedCall):
datetime.date(1, 2, 3)
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
self.assertEqual(datetime.date(1, 2, 3), "a")
self._assert_no_patches()
foo()
def test_different_patch_on_same_attr_is_also_weird(self):
@qmock.patch(dt=DATETIME_DATE)
@qmock.patch(dt="datetime.datetime")
def foo(qm):
qm.call_queue.push(qmock.call.dt(1, 2, 3), "a")
qm.call_queue.push(qmock.call.dt(4, 5, 6), "b")
qm.call_queue.push(qmock.call.dt(7, 8, 9), "c")
self.assertEqual(datetime.date(1, 2, 3), "a")
self.assertEqual(datetime.datetime(4, 5, 6), "b")
self.assertEqual(datetime.date(7, 8, 9), "c")
self._assert_no_patches()
foo()
class QMockTests(unittest.TestCase):
def test_root_assigned_attributes(self):
qm = qmock.QMock()
qm.foo = 5
self.assertIs(qm.foo, 5) # retained across accesses
self.assertIsInstance(qm.foo, int)
self.assertRaises(TypeError, qm.foo) # not callable
def test_root_generated_attributes(self):
qm = qmock.QMock()
self.assertIsNot(qm.foo, qm.baz)
self.assertIs(qm.foo, qm.foo) # retained across accesses
self.assertIsInstance(qm.foo, qmock._qmock._CallProxy)
with self.assertRaises(qmock.UnexpectedCall):
qm.foo() # empty CallQueue
qm.call_queue.push(qmock.call.foo(), 5)
self.assertIs(qm.foo(), 5)
with self.assertRaises(qmock.UnexpectedCall):
qm.foo() # empty CallQueue
def test_nested_assigned_attributes(self):
qm = qmock.QMock()
qm.foo.bar = 5
self.assertIs(qm.foo.bar, 5) # retained across accesses
self.assertIsInstance(qm.foo.bar, int)
self.assertRaises(TypeError, qm.foo.bar) # not callable
self.assertIsNot(qm.foo, qm.baz)
self.assertIs(qm.foo, qm.foo) # retained across accesses
self.assertIsInstance(qm.foo, qmock._qmock._CallProxy)
with self.assertRaises(qmock.UnexpectedCall):
qm.foo() # empty CallQueue
def test_nested_generated_attributes(self):
qm = qmock.QMock()
self.assertIsNot(qm.foo.bar, qm.foo.baz)
self.assertIsNot(qm.foo.bar, qm.baz.bar)
self.assertIs(qm.foo.bar, qm.foo.bar) # retained across accesses
self.assertIsInstance(qm.foo.bar, qmock._qmock._CallProxy)
with self.assertRaises(qmock.UnexpectedCall):
qm.foo.bar() # empty CallQueue
self.assertIsNot(qm.foo, qm.baz)
self.assertIs(qm.foo, qm.foo) # retained across accesses
self.assertIsInstance(qm.foo, qmock._qmock._CallProxy)
with self.assertRaises(qmock.UnexpectedCall):
qm.foo() # empty CallQueue
qm.call_queue.push(qmock.call.foo.bar(), 5)
self.assertIs(qm.foo.bar(), 5)
with self.assertRaises(qmock.UnexpectedCall):
qm.foo.bar() # empty CallQueue
def test_assigned_attributes_are_attached(self):
qm = qmock.QMock()
m = mock.Mock()
qm.foo = m
with self.assertRaises(qmock.UnexpectedCall):
m()
def test_root_magic_methods(self):
qm = qmock.QMock()
with self.assertRaises(qmock.UnexpectedCall):
str(qm) # empty CallQueue
qm.call_queue.push(qmock.call.__getattr__("__str__")(qm), "test")
self.assertEqual(str(qm), "test")
with self.assertRaises(qmock.UnexpectedCall):
str(qm) # empty CallQueue
def test_nested_magic_methods(self):
qm = qmock.QMock()
with self.assertRaises(qmock.UnexpectedCall):
qm.foo < 5 # empty CallQueue
qm.call_queue.push(qmock.call.foo.__getattr__("__lt__")(qm.foo, 5), "test")
self.assertEqual((qm.foo < 5), "test")
with self.assertRaises(qmock.UnexpectedCall):
qm.foo < 5 # empty CallQueue
def test_magic_methods_are_always_the_same_object(self):
qm = qmock.QMock()
method = qm.__len__
self.assertIs(method, qm.__len__)
self.assertIs(method, qm.__len__)
self.assertIs(method, qm.__len__)
def test_magic_methods_are_unique(self):
qm = qmock.QMock()
self.assertIsNot(qm.__len__, qm.foo.__len__)
self.assertIsNot(qm.__len__, qm.bar.__len__)
self.assertIsNot(qm.foo.__len__, qm.bar.__len__)
qm.call_queue.assert_empty()
qm.call_queue.push(qmock.call.__getattr__("__len__")(qm), 1)
qm.call_queue.push(qmock.call.foo.__getattr__("__len__")(qm.foo), 2)
qm.call_queue.push(qmock.call.bar.__getattr__("__len__")(qm.bar), 3)
qm.call_queue.push(qmock.call.bar.__getattr__("__len__")(qm.bar), 4)
with self.assertRaises(qmock.UnexpectedCall):
len(qm.foo) # wrong call; expected len(qm)
self.assertEqual(len(qm.foo), 2)
with self.assertRaises(qmock.UnexpectedCall):
len(qm.foo) # wrong call; expected len(qm.bar)
self.assertEqual(len(qm.bar), 4)
qm.call_queue.assert_empty()
def test_can_be_a_context_manager(self):
qm = qmock.QMock()
qm.call_queue.assert_empty()
with self.assertRaises(qmock.UnexpectedCall):
with qm as foo: # empty CallQueue
pass
qm.call_queue.assert_empty()
qm.call_queue.push(qmock.call.__getattr__("__enter__")(qm), qm.foo)
qm.call_queue.push(qmock.call.foo(), 7357)
qm.call_queue.push(qmock.call.__getattr__("__exit__")(qm, None, None, None), None)
with qm as foo:
self.assertEqual(foo(), 7357)
qm.call_queue.assert_empty()
def test_mock_calls_returns_proxy(self):
qm = qmock.QMock()
self.assertIsInstance(qm.mock_calls, qmock._qmock._MockCallsProxy)
def test_eq(self):
alpha = qmock.QMock()
bravo = qmock.QMock()
self.assertTrue(alpha == alpha)
self.assertTrue(bravo == bravo)
self.assertFalse(alpha == bravo)
self.assertFalse(bravo == alpha)
def test_is_callable(self):
qm = qmock.QMock()
with self.assertRaises(qmock.UnexpectedCall):
qm() # empty CallQueue
qm.call_queue.push(qmock.call(), 5)
self.assertIs(qm(), 5)
def test_mock_return_assigned_attributes(self):
qm = qmock.QMock()
qm.foo = 5
self.assertIs(
qm.mock_return(qmock.call.foo),
5
)
qm = qmock.QMock()
qm.foo.return_value = 6
self.assertIs(
qm.mock_return(qmock.call.foo()),
6
)
qm = qmock.QMock()
qm.return_value = 7
self.assertIs(
qm.mock_return(qmock.call()),
7
)
qm = qmock.QMock()
qm.return_value.foo = 8
self.assertIs(
qm.mock_return(qmock.call().foo),
8
)
qm = qmock.QMock()
qm.return_value.foo.return_value.bar.return_value.baz.barf.return_value = 9
self.assertIs(
qm.mock_return(qmock.call(x=1).foo(y=2).bar(5).baz.barf(z={6: 7}, w=8)),
9
)
def test_mock_return_generated_attributes(self):
qm = qmock.QMock()
self.assertIs(
qm.mock_return(qmock.call.foo),
qm.foo
)
self.assertIs(
qm.mock_return(qmock.call.foo()),
qm.foo.return_value
)
self.assertIs(
qm.mock_return(qmock.call()),
qm.return_value
)
self.assertIs(
qm.mock_return(qmock.call().foo),
qm.return_value.foo
)
self.assertIs(
qm.mock_return(qmock.call(x=1).foo(y=2).bar(5).baz.barf(z={6: 7}, w=8)),
qm.return_value.foo.return_value.bar.return_value.baz.barf.return_value
)
def test_mock_return_null_call(self):
qm = qmock.QMock()
self.assertRaises(
AttributeError,
qm.mock_return,
qmock.call
)
class CallQueueTests(unittest.TestCase):
def test_push_attribute_call(self):
qm = qmock.QMock()
cq = qm.call_queue
self.assertRaises(
qmock.BadCall,
cq.push,
qmock.call.foo,
"bar"
)
self.assertEqual(len(cq.pop_errors), 0)
def test_push_function_call(self):
qm = qmock.QMock()
cq = qm.call_queue
self.assertEqual(len(cq._queue), 0)
cq.push(qmock.call.foo(), "bar")
self.assertEqual(
tuple(
(expected_call, self._copy_mock_side_effect(mock_result))
for expected_call, mock_result in cq._queue
),
(
(qmock.call.foo(), ("bar",)),
)
)
self.assertEqual(len(cq._queue), 1)
self.assertEqual(qm.foo(), "bar")
cq.assert_empty()
self.assertEqual(len(cq.pop_errors), 0)
def test_push_all_attribute_call(self):
qm = qmock.QMock()
cq = qm.call_queue
self.assertRaises(
qmock.BadCall,
cq.push_all,
qmock.call(x=1).foo(y=2).bar(5).baz.barf,
10
)
self.assertEqual(len(cq.pop_errors), 0)
def test_push_all_function_call(self):
qm = qmock.QMock()
cq = qm.call_queue
cq.push_all(qmock.call(x=1).foo(y=2).bar(5).baz.barf(z={6: 7}, w=8), 10)
self.assertEqual(
tuple(
(expected_call, self._copy_mock_side_effect(mock_result))
for expected_call, mock_result in cq._queue
),
(
(
qmock.call(x=1),
(qm.return_value,)
),
(
qmock.call(x=1).foo(y=2),
(qm.return_value.foo.return_value,)
),
(
qmock.call(x=1).foo(y=2).bar(5),
(qm.return_value.foo.return_value.bar.return_value,)
),
(
qmock.call(x=1).foo(y=2).bar(5).baz.barf(z={6: 7}, w=8),
(10,)
)
)
)
self.assertEqual(len(cq._queue), 4)
self.assertEqual(qm(x=1).foo(y=2).bar(5).baz.barf(z={6: 7}, w=8), 10)
cq.assert_empty()
self.assertEqual(len(cq.pop_errors), 0)
def test_pop_value_result(self):
qm = qmock.QMock()
cq = qm.call_queue
cq.push(qmock.call.foo(), 7357)
self.assertEqual(cq._pop(qmock.call.foo()), 7357)
cq.assert_empty()
self.assertEqual(len(cq.pop_errors), 0)
def test_pop_exception_result(self):
qm = qmock.QMock()
cq = qm.call_queue
cq.push(qmock.call.foo(), ValueError("test"))
with self.assertRaises(ValueError) as assertion:
cq._pop(qmock.call.foo())
self.assertEqual(str(assertion.exception), "test")
cq.assert_empty()
self.assertEqual(len(cq.pop_errors), 0)
def test_pop_raises_when_empty(self):
qm = qmock.QMock()
cq = qm.call_queue
self.assertRaises(qmock.UnexpectedCall, cq._pop, qmock.call.foo())
self.assertEqual(len(cq.pop_errors), 1)
record = cq.pop_errors[0]
self.assertEqual(record.thread_id, get_thread_id())
self.assertIsInstance(record.error, qmock.UnexpectedCall)
self.assertEqual(
str(record.error),
"Queue is empty. call: call.foo()"
)
def test_pop_raises_when_call_doesnt_match_expectation(self):
qm = qmock.QMock()
cq = qm.call_queue
cq.push(qmock.call.foo(), 7357)
self.assertRaises(qmock.UnexpectedCall, cq._pop, qmock.call.not_foo())
self.assertEqual(len(cq.pop_errors), 1)
record = cq.pop_errors[0]
self.assertEqual(record.thread_id, get_thread_id())
self.assertIsInstance(record.error, qmock.UnexpectedCall)
self.assertEqual(
str(record.error),
"Call does not match expectation. actual: call.not_foo(); expected: call.foo()"
)
def test_assert_empty(self):
qm = qmock.QMock()
cq = qm.call_queue
cq.assert_empty()
cq.push(qmock.call.foo(), "bar")
self.assertRaises(qmock.CallQueueNotEmpty, cq.assert_empty)
cq._pop(qmock.call.foo())
cq.assert_empty()
self.assertEqual(len(cq.pop_errors), 0)
def _copy_mock_side_effect(self, m):
"""
        mock.Mock.side_effect is stored as a <tupleiterator>, so
        iterating over it consumes it. We consume it, store a copy,
        re-populate the mock, and return the copy.
"""
side_effect = tuple(m.side_effect)
m.side_effect = side_effect
return side_effect
| 36.266605 | 112 | 0.631267 | 4,969 | 39,313 | 4.697122 | 0.057758 | 0.035476 | 0.03629 | 0.040488 | 0.843273 | 0.809854 | 0.784619 | 0.752442 | 0.729692 | 0.703985 | 0 | 0.010986 | 0.261389 | 39,313 | 1,083 | 113 | 36.300092 | 0.792816 | 0.058403 | 0 | 0.633005 | 0 | 0 | 0.027562 | 0.000571 | 0 | 0 | 0 | 0 | 0.331281 | 1 | 0.151478 | false | 0.001232 | 0.009852 | 0 | 0.171182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
# src/explanation/views.py (fleur101/predict-python)
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response
from src.explanation.explanation import explanation, explanation_temporal_stability
from src.explanation.models import Explanation, ExplanationTypes
from src.jobs.models import Job
import pandas as pd
@api_view(['GET'])
def get_lime(request, pk, explanation_target):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.LIME.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'lime' not in exp.results:
exp.results.update({'lime': dict()})
if explanation_target in exp.results['lime']:
return Response(exp.results['lime'][explanation_target], status=200)
else:
        error, result = explanation(exp.id, explanation_target)
        if error == 'True':
            return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
                            status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
        # cache only successful results so a failed request is retried next time
        exp.results['lime'].update({explanation_target: result})
        exp.save()
        return Response(result, status=200)
@api_view(['GET'])
def get_lime_temporal_stability(request, pk, explanation_target=None):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.LIME.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'lime_temporal' not in exp.results:
exp.results.update({'lime_temporal': dict()})
if explanation_target:
if explanation_target in exp.results['lime_temporal']:
return Response(exp.results['lime_temporal'][explanation_target], status=200)
else:
            error, result = explanation_temporal_stability(exp.id, explanation_target=explanation_target)
            if error == 'True':
                return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
                                status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
            # cache only successful results so a failed request is retried next time
            exp.results['lime_temporal'].update({explanation_target: result})
            exp.save()
            return Response(result, status=200)
    elif 'no_target' in exp.results['lime_temporal']:
return Response(exp.results['lime_temporal']['no_target'], status=200)
else:
        error, result = explanation_temporal_stability(exp.id, explanation_target=explanation_target)
        if error == 'True':
            return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
                            status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
        # cache only successful results so a failed request is retried next time
        exp.results['lime_temporal'].update({'no_target': result})
        exp.save()
        return Response(result, status=200)
@api_view(['GET'])
def get_shap_temporal_stability(request, pk, explanation_target=None):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.SHAP.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'shap_temporal' not in exp.results:
exp.results.update({'shap_temporal': dict()})
if explanation_target:
if explanation_target in exp.results['shap_temporal']:
return Response(exp.results['shap_temporal'][explanation_target], status=200)
else:
error, result = explanation_temporal_stability(exp.id, explanation_target=explanation_target)
#exp.results['shap_temporal'].update({explanation_target: pd.Series(result).to_json(orient='values')})
#exp.save()
if error == 'True':
return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
else:
return Response(result, status=200)
elif 'no_target' in explanation_target:
return Response(exp.results['shap_temporal']['no_target'], status=200)
else:
error, result = explanation_temporal_stability(exp.id, explanation_target=explanation_target)
#exp.results['shap_temporal'].update({'no_target': pd.Series(result).to_json(orient='values')})
#exp.save()
if error == 'True':
return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
else:
return Response(result, status=200)
@api_view(['GET'])
def get_temporal_stability(request, pk, explanation_target=None):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.TEMPORAL_STABILITY.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'temporal' not in exp.results:
exp.results.update({'temporal': dict()})
if explanation_target:
if explanation_target in exp.results['temporal']:
return Response(pd.read_json(exp.results['temporal'][explanation_target], typ='series',
orient='records'), status = 200)
else:
error, result = explanation_temporal_stability(exp.id, explanation_target=explanation_target)
exp.results['temporal'].update({explanation_target: result})
exp.save()
if error == 'True':
return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
else:
return Response(result, status=200)
elif 'no_target' in explanation_target:
return Response(exp.results['temporal']['no_target'], status=200)
else:
error, result = explanation_temporal_stability(exp.id, explanation_target=explanation_target)
exp.results['temporal'].update({'no_target': result})
exp.save()
if error == 'True':
return Response({'error': 'Explanation Target cannot be greater than ' + str(result)},
status=status.HTTP_416_REQUESTED_RANGE_NOT_SATISFIABLE)
else:
return Response(result, status=200)
@api_view(['GET'])
def get_shap(request, pk, explanation_target, prefix_target):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.SHAP.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'shap' not in exp.results:
exp.results.update({'shap': dict()})
if explanation_target not in exp.results['shap']:
exp.results['shap'] = {explanation_target: dict()}
if explanation_target in exp.results['shap'] and prefix_target in exp.results['shap'][explanation_target].keys():
return Response(pd.read_json(exp.results['shap'][explanation_target][prefix_target], typ='series', orient='records'), status=200)
else:
result = explanation(exp.id, explanation_target, prefix_target)
exp.results['shap'][explanation_target].update({prefix_target: pd.Series(result).to_json(orient='values')})
exp.save()
return Response(result, status=200)
@api_view(['GET'])
def get_skater(request, pk):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.SKATER.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'skater' in exp.results:
return Response(exp.results['skater'], status=200)
else:
result = explanation(exp.id, explanation_target = None)
exp.results['skater'] = result
exp.save()
return Response(result, status=200)
@api_view(['GET'])
def get_ice(request, pk, explanation_target):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.ICE.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'ice' not in exp.results:
exp.results.update({'ice': dict()})
if explanation_target in exp.results['ice']:
return Response(exp.results['ice'][explanation_target], status=200)
else:
result = explanation(exp.id, explanation_target)
exp.results['ice'].update({explanation_target: result})
exp.save()
return Response(result, status=200)
@api_view(['GET'])
def get_cmfeedback(request, pk, top_k):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.CMFEEDBACK.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
result = explanation(exp.id, int(top_k))
return Response(result, status=200)
@api_view(['POST'])
def get_retrain(request, pk):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.RETRAIN.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
target = request.data
result = explanation(exp.id, target)
return Response(result, status=200)
@api_view(['GET'])
def get_anchor(request, pk):
job = Job.objects.filter(pk=pk)[0]
exp, _ = Explanation.objects.get_or_create(type=ExplanationTypes.ANCHOR.value, split=job.split,
predictive_model=job.predictive_model, job=job)
exp.save()
if 'anchor' in exp.results:
return Response(exp.results['anchor'], status=200)
else:
result = explanation(exp.id, explanation_target=None)
exp.results['anchor'] = result
exp.save()
return Response(result, status=200)
# === tests/test_plot.py (davebulaval/python2latex, MIT) ===
import os
import shutil
from inspect import cleandoc

from python2latex.color import Color
from python2latex.document import Document
from python2latex.plot import Plot, LinePlot, MatrixPlot, _Plot


class TestPlot:
    def teardown(self):
        _Plot.plot_count = 0

    def test_default_plot(self):
        assert Plot(plot_name='plot_test').build() == cleandoc(r'''
            \begin{figure}[h!]
            \centering
            \begin{tikzpicture}
            \begin{axis}[grid style={dashed,gray!50}, axis y line*=left, axis x line*=bottom, every axis plot/.append style={line width=1.25pt, mark size=0pt}, width=.8\textwidth, height=.45\textwidth, grid=major]
            \end{axis}
            \end{tikzpicture}
            \end{figure}
            ''')
        os.remove('plot_test.csv')

    def test_add_plot_with_legend(self):
        plot = Plot(plot_name='plot_test')
        plot.add_plot(list(range(10)), list(range(10)), 'red', legend='Legend', line_width='2pt')
        assert plot.build() == cleandoc(r'''
            \begin{figure}[h!]
            \centering
            \begin{tikzpicture}
            \begin{axis}[grid style={dashed,gray!50}, axis y line*=left, axis x line*=bottom, every axis plot/.append style={line width=1.25pt, mark size=0pt}, width=.8\textwidth, height=.45\textwidth, grid=major]
            \addplot[red, line width=2pt] table[x=x0, y=y0, col sep=comma]{./plot_test.csv};
            \addlegendentry{Legend};
            \end{axis}
            \end{tikzpicture}
            \end{figure}
            ''')
        os.remove('plot_test.csv')

    def test_add_plot_without_legend(self):
        plot = Plot(plot_name='plot_test')
        plot.add_plot(list(range(10)), list(range(10)), 'red', line_width='2pt')
        assert plot.build() == cleandoc(r'''
            \begin{figure}[h!]
            \centering
            \begin{tikzpicture}
            \begin{axis}[grid style={dashed,gray!50}, axis y line*=left, axis x line*=bottom, every axis plot/.append style={line width=1.25pt, mark size=0pt}, width=.8\textwidth, height=.45\textwidth, grid=major]
            \addplot[red, forget plot, line width=2pt] table[x=x0, y=y0, col sep=comma]{./plot_test.csv};
            \end{axis}
            \end{tikzpicture}
            \end{figure}
            ''')
        os.remove('plot_test.csv')

    def test_add_plot_with_color_obj(self):
        plot = Plot(plot_name='plot_test')
        color = Color(.1, .2, .3, 'spam')
        plot.add_plot(list(range(10)), list(range(10)), color, legend='Legend', line_width='2pt')
        assert plot.build() == cleandoc(r'''
            \begin{figure}[h!]
            \centering
            \begin{tikzpicture}
            \begin{axis}[grid style={dashed,gray!50}, axis y line*=left, axis x line*=bottom, every axis plot/.append style={line width=1.25pt, mark size=0pt}, width=.8\textwidth, height=.45\textwidth, grid=major]
            \addplot[spam, line width=2pt] table[x=x0, y=y0, col sep=comma]{./plot_test.csv};
            \addlegendentry{Legend};
            \end{axis}
            \end{tikzpicture}
            \end{figure}
            ''')
        os.remove('plot_test.csv')

    def test_save_csv_to_right_path(self):
        filepath = './some_doc_path/'
        plotpath = filepath + 'plot_path/'
        plot_name = 'plot_name'
        plot = Plot([1, 2, 3], [1, 2, 3], plot_name=plot_name, plot_path=plotpath)
        plot.build()
        assert os.path.exists(plotpath + plot_name + '.csv')
        shutil.rmtree(filepath)

    def test_build_pdf_to_other_relative_path(self):
        filepath = './some_doc_path/'
        plotpath = filepath + 'plot_path/'
        doc_name = 'Doc name'
        plot_name = 'plot_name'
        doc = Document(doc_name, filepath=filepath)
        plot = doc.new(Plot([1, 2, 3], [1, 2, 3], plot_name=plot_name, plot_path=plotpath))
        try:
            doc.build(show_pdf=False)
            assert os.path.exists(filepath + doc_name + '.tex')
            assert os.path.exists(filepath + doc_name + '.pdf')
            assert os.path.exists(plotpath + plot_name + '.csv')
        finally:
            shutil.rmtree('./some_doc_path/')

    def test_add_matrix_plot(self):
        plot = Plot(plot_name='matrix_plot_test', grid=False, lines=False)
        plot.add_matrix_plot(list(range(10)), list(range(10)), [[i for i in range(10)] for _ in range(10)])
        assert plot.build() == cleandoc(r'''
            \begin{figure}[h!]
            \centering
            \begin{tikzpicture}
            \begin{axis}[grid style={dashed,gray!50}, axis y line*=left, axis x line*=bottom, colorbar, every axis plot/.append style={line width=0pt, mark size=0pt}, width=.8\textwidth, height=.45\textwidth, grid=none]
            \addplot[matrix plot*, point meta=explicit, mesh/rows=10, mesh/cols=10] table[x=x0, y=y0, meta=z0, col sep=comma]{./matrix_plot_test.csv};
            \end{axis}
            \end{tikzpicture}
            \end{figure}
            ''')
        os.remove('matrix_plot_test.csv')

    def test_build_pdf_with_matrix_plot(self):
        filepath = './some_doc_path/'
        plotpath = filepath + 'plot_path/'
        doc_name = 'Doc name'
        plot_name = 'plot_name'
        doc = Document(doc_name, filepath=filepath)
        X = list(range(10))
        Y = list(range(10))
        Z = [[i for i in range(10)] for _ in range(10)]
        plot = doc.new(Plot(plot_name=plot_name, plot_path=plotpath, grid=False, lines=False, enlargelimits='false'))
        plot.add_matrix_plot(X, Y, Z)
        try:
            doc.build(show_pdf=False)
            assert os.path.exists(filepath + doc_name + '.tex')
            assert os.path.exists(filepath + doc_name + '.pdf')
            assert os.path.exists(plotpath + plot_name + '.csv')
        finally:
            shutil.rmtree('./some_doc_path/')


class TestLinePlot:
    def teardown(self):
        _Plot.plot_count = 0

    def test_build_with_legend(self):
        lineplot = LinePlot([1, 2, 3], [4, 5, 6], 'red', 'dashed', legend='Legend', line_width='2pt')
        lineplot.plot_filepath = './some/path/file.csv'
        assert lineplot.build() == cleandoc(r"""
            \addplot[red, dashed, line width=2pt] table[x=x0, y=y0, col sep=comma]{./some/path/file.csv};
            \addlegendentry{Legend};
            """)

    def test_build_without_legend(self):
        lineplot = LinePlot([1, 2, 3], [4, 5, 6], 'red', 'dashed', line_width='2pt')
        lineplot.plot_filepath = './some/path/file.csv'
        assert lineplot.build() == cleandoc(r"""
            \addplot[red, dashed, forget plot, line width=2pt] table[x=x0, y=y0, col sep=comma]{./some/path/file.csv};
            """)

    def test_lineplot_id_number_correctly_increments(self):
        l1 = LinePlot([1], [2])
        l2 = LinePlot([1], [2])
        l3 = LinePlot([1], [2])
        assert l1.id_number == 0
        assert l2.id_number == 1
        assert l3.id_number == 2


class TestMatrixPlot:
    def teardown(self):
        _Plot.plot_count = 0

    def test_build_with_legend(self):
        lineplot = MatrixPlot([1, 2, 3], [4, 5, 6], [list(range(3)) for _ in range(3)])
        lineplot.plot_filepath = './some/path/file.csv'
        assert lineplot.build() == cleandoc(r"""
            \addplot[matrix plot*, point meta=explicit, mesh/rows=3, mesh/cols=3] table[x=x0, y=y0, meta=z0, col sep=comma]{./some/path/file.csv};
            """)
# === backend/posts/admin.py (Soumithri/website, MIT) ===
from django.contrib import admin
from .models import Post, Project, Author
admin.site.register(Post)
admin.site.register(Project)
admin.site.register(Author)
# === apps/gsuite/mail_syncer/tests/apiclient_utils_tests.py (Kpaubert/onlineweb4, MIT) ===
from django.conf import settings
from django.contrib.auth.models import Group
from django.test import TestCase, override_settings
from django_dynamic_fixture import G
from googleapiclient.errors import HttpError
from mock import patch

from apps.authentication.models import OnlineUser
from apps.gsuite.mail_syncer.tests.test_utils import create_http_error
from apps.gsuite.mail_syncer.utils import (
    check_amount_of_members_ow4_g_suite,
    get_appropriate_g_suite_group_names_for_user,
    get_excess_groups_for_user,
    get_excess_users_in_g_suite,
    get_g_suite_groups_for_user,
    get_g_suite_users_for_group,
    get_missing_g_suite_group_names_for_user,
    get_missing_ow4_users_for_g_suite,
    get_ow4_users_for_group,
    insert_email_into_g_suite_group,
    insert_ow4_user_into_g_suite_group,
)


class GSuiteAPIUtilsTestCase(TestCase):
    """Tests for ow4-side utils of G Suite app, like "get excess groups for user"."""

    def setUp(self):
        self.domain = "example.org"
        self.group = "dotkom"

    @patch("logging.Logger.info")
    @patch("apps.gsuite.mail_syncer.utils.setup_g_suite_client", autospec=True)
    def test_insert_ow4_user_into_g_suite_group(self, mocked_insert, mocked_logger):
        user = G(OnlineUser, online_mail="firstname.lastname")
        group_email = self.group + "@" + self.domain
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = True
        ow4_gsuite_sync["ENABLE_INSERT"] = True
        mocked_insert.return_value.members.return_value.insert.return_value.execute.return_value = {
            "email": user.online_mail
        }
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            resp = insert_email_into_g_suite_group(
                self.domain, self.group, user.online_mail
            )
            self.assertEqual(user.online_mail, resp.get("email"))
            mocked_logger.assert_called_with(
                f"Inserting '{user.online_mail}' into G Suite group '{group_email}'.",
                extra={"email": user.online_mail, "group": group_email},
            )

    @patch("logging.Logger.error")
    def test_insert_ow4_user_into_g_suite_group_no_online_mail(self, mocked_logger):
        user = G(OnlineUser, online_mail=None)
        insert_ow4_user_into_g_suite_group(self.domain, self.group, user)
        mocked_logger.assert_called_with(
            f"OW4 User '{user}' ({user.pk}) missing Online email address! "
            f"(current: '{user.online_mail}')",
            extra={"user": user, "group": self.group},
        )

    def test_get_ow4_users_for_group(self):
        group = G(Group)
        user1 = G(OnlineUser)
        user2 = G(OnlineUser)
        user3 = G(OnlineUser)
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            group.user_set.add(user1, user2, user3)
            resp = get_ow4_users_for_group(group.name)
            self.assertIn(user1, resp)
            self.assertIn(user2, resp)
            self.assertIn(user3, resp)

    def test_get_appropriate_g_suite_group_names_for_user(self):
        user = G(OnlineUser)
        G(Group, name="appkom")
        dotkom = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            dotkom.user_set.add(user)
            groups = get_appropriate_g_suite_group_names_for_user(self.domain, user)
            self.assertEqual(1, len(groups))
            self.assertIn(dotkom.name.lower(), groups)

    def test_check_amount_of_members_equal(self):
        g_suite_members = [{"email": "test@example.org"}]
        user = G(OnlineUser)
        group = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            group.user_set.add(user)
            self.assertTrue(
                check_amount_of_members_ow4_g_suite(
                    g_suite_members, group.user_set.all(), quiet=False
                )
            )

    @patch("logging.Logger.debug")
    def test_check_amount_of_members_ow4_dominates(self, mocked_logger):
        g_suite_members = [{"email": "test@example.org"}]
        user = G(OnlineUser)
        user2 = G(OnlineUser)
        group = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            group.user_set.add(user)
            group.user_set.add(user2)
            self.assertFalse(
                check_amount_of_members_ow4_g_suite(
                    g_suite_members, group.user_set.all(), quiet=False
                )
            )
            mocked_logger.assert_called_with(
                f"There are more users on OW4 ({group.user_set.count()}) than in G Suite ({len(g_suite_members)}). "
                "Need to update G Suite with new members."
            )

    @patch("logging.Logger.debug")
    def test_check_amount_of_members_gsuite_dominates(self, mocked_logger):
        g_suite_members = [
            {"email": "test@example.org"},
            {"email": "test2@example.org"},
        ]
        user = G(OnlineUser)
        group = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            group.user_set.add(user)
            self.assertFalse(
                check_amount_of_members_ow4_g_suite(
                    g_suite_members, group.user_set.all(), quiet=False
                )
            )
            mocked_logger.assert_called_with(
                f"There are more users in G Suite ({len(g_suite_members)}) than on OW4 ({group.user_set.count()}). "
                "Need to trim inactive users from G Suite."
            )

    def test_get_excess_users_in_gsuite(self):
        g_suite_members = [
            {"email": "test@example.org"},
            {"email": "test2@example.org"},
        ]
        user = G(OnlineUser, online_mail="test")
        G(OnlineUser, online_mail="test2")
        group = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            group.user_set.add(user)
            users = get_excess_users_in_g_suite(g_suite_members, group.user_set.all())
            self.assertIn({"email": "test2@example.org"}, users)

    def test_get_missing_ow4_users_for_g_suite(self):
        g_suite_members = [{"email": "test@%s" % self.domain}]
        user = G(OnlineUser, online_mail="test")
        user2 = G(OnlineUser, online_mail="test2")
        group = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["DOMAIN"] = self.domain
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            group.user_set.add(user, user2)
            users = get_missing_ow4_users_for_g_suite(
                g_suite_members, group.user_set.all()
            )
            self.assertIn(user2, users)

    @patch("apps.gsuite.mail_syncer.utils.get_g_suite_groups_for_user")
    def test_get_missing_g_suite_group_names_for_user(self, mocked_client):
        user = G(OnlineUser)
        dotkom = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        mocked_client.return_value = [{"name": "dotkom@" + self.domain}]
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            dotkom.user_set.add(user)
            groups = get_missing_g_suite_group_names_for_user(self.domain, user)
            self.assertEqual(1, len(groups))
            self.assertIn(dotkom.name.lower(), groups)

    @patch("apps.gsuite.mail_syncer.utils.get_g_suite_groups_for_user")
    def test_get_excess_groups_for_user(self, mocked_client):
        user = G(OnlineUser)
        dotkom = G(Group, name="dotkom")
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = False
        ow4_gsuite_sync["GROUPS"] = {"appkom": "appkom", "dotkom": "dotkom"}
        mocked_client.return_value = [{"name": "dotkom"}]
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            groups = get_excess_groups_for_user(self.domain, user)
            self.assertEqual(1, len(groups))
            self.assertIn(dotkom.name.lower(), groups)

    @patch("apps.gsuite.mail_syncer.utils.setup_g_suite_client", autospec=True)
    def test_get_g_suite_users_for_group(self, mocked_g_suite_client):
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = True
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            mocked_g_suite_client.return_value.members.return_value.list.return_value.execute.return_value.get.return_value = [
                {"email": "1@" + self.domain}
            ]
            resp = get_g_suite_users_for_group(
                self.domain, self.group, suppress_http_errors=True
            )
            self.assertEqual(1, len(resp))
            http_error = create_http_error(400, "Error", "Error")
            mocked_g_suite_client.return_value.members.return_value.list.return_value.execute.return_value.get.side_effect = (
                http_error
            )
            self.assertRaises(
                HttpError, lambda: get_g_suite_users_for_group(self.domain, self.group)
            )

    @patch("apps.gsuite.mail_syncer.utils.setup_g_suite_client", autospec=True)
    def test_get_g_suite_users_for_group_no_members(self, mocked_g_suite_client):
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = True
        mocked_g_suite_client.return_value.members.return_value.list.return_value.execute.return_value.get.return_value = (
            None
        )
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            resp = get_g_suite_users_for_group(self.domain, self.group)
            self.assertEqual(0, len(resp))

    @patch("apps.gsuite.mail_syncer.utils.setup_g_suite_client", autospec=True)
    def test_get_g_suite_groups_for_user(self, mocked_g_suite_client):
        user = G(
            OnlineUser,
            first_name="Test",
            last_name="Testesen",
            online_mail="test.testesen",
        )
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = True
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            mocked_g_suite_client.return_value.groups.return_value.list.return_value.execute.return_value.get.return_value = [
                {"name": "dotkom@" + self.domain}
            ]
            resp = get_g_suite_groups_for_user(
                self.domain, user, suppress_http_errors=True
            )
            self.assertEqual(1, len(resp))
            http_error = create_http_error(400, "Error", "Error")
            mocked_g_suite_client.return_value.groups.return_value.list.return_value.execute.return_value.get.side_effect = (
                http_error
            )
            self.assertRaises(
                HttpError, lambda: get_g_suite_groups_for_user(self.domain, user)
            )

    @patch("apps.gsuite.mail_syncer.utils.setup_g_suite_client", autospec=True)
    def test_get_g_suite_groups_for_user_no_members(self, mocked_g_suite_client):
        user = G(
            OnlineUser,
            first_name="Test",
            last_name="Testesen",
            online_mail="test.testesen",
        )
        ow4_gsuite_sync = settings.OW4_GSUITE_SYNC
        ow4_gsuite_sync["ENABLED"] = True
        mocked_g_suite_client.return_value.groups.return_value.list.return_value.execute.return_value.get.return_value = (
            None
        )
        with override_settings(OW4_GSUITE_SYNC=ow4_gsuite_sync):
            resp = get_g_suite_groups_for_user(self.domain, user)
            self.assertEqual(0, len(resp))
# === Hira/tester.py (homologus/2020-intermediate-class, MIT) ===
from Bio.Seq import Seq
from Bio import SeqIO
sequence = Seq("""GAAATTTGACAATTTCACAGGGGAATGTCCAAACTTTGTCTTCCCACTAAATTCTACAATCAAGACCATTCAACCACGTGTTGAAAAGAAAAAGCTTGAGGGTTTTATGGGTACGAATTCGATCTGTCTATCCTGTTGCATCACCAAATGAATGCAACCCAATGCACCTTTCGACGCTTATGAAGTGTGAACATTGTAGTGAAACTTCATGGCAAACTGGTGACTTCCTTAAAGCCACTTGTGAATTTTGTGGTACTGAAAATCAAGTCAAAGAAGGACCTACCACTTGTGGTTACCTTC""")
print(sequence.translate())
# === src/util/__init__.py (Stanford-ILIAD/lila, MIT) ===
| 16 | 31 | 0.84375 | 5 | 32 | 5.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
# === test_autolens/pipeline/phase/point_source/test_analysis_point_source.py (arfon/PyAutoLens, MIT) ===
from os import path
import autolens as al
import pytest

from autolens.mock import mock

pytestmark = pytest.mark.filterwarnings(
    "ignore:Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of "
    "`arr[seq]`. In the future this will be interpreted as an arrays index, `arr[np.arrays(seq)]`, which will result "
    "either in an error or a different result."
)

directory = path.dirname(path.realpath(__file__))


class TestFit:
    def test__fit_using_positions(
        self, positions_x2, positions_x2_noise_map, samples_with_result
    ):
        phase_positions_x2 = al.PhasePointSource(
            galaxies=dict(
                lens=al.GalaxyModel(redshift=0.5, light=al.lp.EllipticalSersic),
                source=al.GalaxyModel(redshift=1.0, point=al.ps.PointSource),
            ),
            search=mock.MockSearch(samples=samples_with_result),
            positions_solver=mock.MockPositionsSolver(model_positions=positions_x2),
        )

        result = phase_positions_x2.run(
            positions=positions_x2,
            positions_noise_map=positions_x2_noise_map,
            results=mock.MockResults(),
        )

        assert isinstance(result.instance.galaxies[0], al.Galaxy)
        assert isinstance(result.instance.galaxies[0], al.Galaxy)

    def test__figure_of_merit__matches_correct_fit_given_galaxy_profiles(
        self, positions_x2, positions_x2_noise_map
    ):
        lens_galaxy = al.Galaxy(
            redshift=0.5, light=al.ps.PointSource(centre=(0.0, 0.0))
        )

        phase_positions_x2 = al.PhasePointSource(
            galaxies=dict(lens=lens_galaxy),
            settings=al.SettingsPhasePositions(),
            search=mock.MockSearch(),
            positions_solver=mock.MockPositionsSolver(model_positions=positions_x2),
        )

        analysis = phase_positions_x2.make_analysis(
            positions=positions_x2,
            positions_noise_map=positions_x2_noise_map,
            results=mock.MockResults(),
        )

        instance = phase_positions_x2.model.instance_from_unit_vector([])
        fit_figure_of_merit = analysis.log_likelihood_function(instance=instance)

        tracer = analysis.tracer_for_instance(instance=instance)
        positions_solver = mock.MockPositionsSolver(model_positions=positions_x2)

        fit_positions = al.FitPositionsImage(
            positions=positions_x2,
            noise_map=positions_x2_noise_map,
            tracer=tracer,
            positions_solver=positions_solver,
        )

        assert fit_positions.chi_squared == 0.0
        assert fit_positions.log_likelihood == fit_figure_of_merit

        model_positions = al.Grid2DIrregular([(0.0, 1.0), (1.0, 2.0)])
        positions_solver = mock.MockPositionsSolver(model_positions=model_positions)

        phase_positions_x2 = al.PhasePointSource(
            galaxies=dict(lens=lens_galaxy),
            settings=al.SettingsPhasePositions(),
            search=mock.MockSearch(),
            positions_solver=positions_solver,
        )

        analysis = phase_positions_x2.make_analysis(
            positions=positions_x2,
            positions_noise_map=positions_x2_noise_map,
            results=mock.MockResults(),
        )

        instance = phase_positions_x2.model.instance_from_unit_vector([])
        fit_figure_of_merit = analysis.log_likelihood_function(instance=instance)

        fit_positions = al.FitPositionsImage(
            positions=positions_x2,
            noise_map=positions_x2_noise_map,
            tracer=tracer,
            positions_solver=positions_solver,
        )

        assert fit_positions.residual_map.in_list == [1.0, 1.0]
        assert fit_positions.chi_squared == 2.0
        assert fit_positions.log_likelihood == fit_figure_of_merit

    def test__figure_of_merit__includes_fit_fluxes(
        self, positions_x2, positions_x2_noise_map, fluxes_x2, fluxes_x2_noise_map
    ):
        lens_galaxy = al.Galaxy(
            redshift=0.5,
            sis=al.mp.SphericalIsothermal(einstein_radius=1.0),
            light=al.ps.PointSourceFlux(flux=1.0),
        )

        phase_positions_x2 = al.PhasePointSource(
            galaxies=dict(lens=lens_galaxy),
            settings=al.SettingsPhasePositions(),
            search=mock.MockSearch(),
            positions_solver=mock.MockPositionsSolver(model_positions=positions_x2),
        )

        analysis = phase_positions_x2.make_analysis(
            positions=positions_x2,
            positions_noise_map=positions_x2_noise_map,
            fluxes=fluxes_x2,
            fluxes_noise_map=fluxes_x2_noise_map,
            results=mock.MockResults(),
        )

        instance = phase_positions_x2.model.instance_from_unit_vector([])
        fit_figure_of_merit = analysis.log_likelihood_function(instance=instance)

        tracer = analysis.tracer_for_instance(instance=instance)
        positions_solver = mock.MockPositionsSolver(model_positions=positions_x2)

        fit_positions = al.FitPositionsImage(
            positions=positions_x2,
            noise_map=positions_x2_noise_map,
            tracer=tracer,
            positions_solver=positions_solver,
        )

        fit_fluxes = al.FitFluxes(
            fluxes=fluxes_x2,
            noise_map=fluxes_x2_noise_map,
            positions=positions_x2,
            tracer=tracer,
        )

        assert (
            fit_positions.log_likelihood + fit_fluxes.log_likelihood
            == fit_figure_of_merit
        )

        model_positions = al.Grid2DIrregular([(0.0, 1.0), (1.0, 2.0)])
        positions_solver = mock.MockPositionsSolver(model_positions=model_positions)

        phase_positions_x2 = al.PhasePointSource(
            galaxies=dict(lens=lens_galaxy),
            settings=al.SettingsPhasePositions(),
            search=mock.MockSearch(),
            positions_solver=positions_solver,
        )

        analysis = phase_positions_x2.make_analysis(
positions=positions_x2,
positions_noise_map=positions_x2_noise_map,
results=mock.MockResults(),
)
instance = phase_positions_x2.model.instance_from_unit_vector([])
fit_figure_of_merit = analysis.log_likelihood_function(instance=instance)
fit_positions = al.FitPositionsImage(
positions=positions_x2,
noise_map=positions_x2_noise_map,
tracer=tracer,
positions_solver=positions_solver,
)
assert fit_positions.residual_map.in_list == [1.0, 1.0]
assert fit_positions.chi_squared == 2.0
assert fit_positions.log_likelihood == fit_figure_of_merit
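The assertions above encode a simple chi-squared identity: with each model position offset from its observed counterpart by 1.0 and a unit noise-map, the residuals are `[1.0, 1.0]` and the chi-squared is 2.0. A standalone numpy sketch of that arithmetic (toy 1D positions, not the actual `Grid2DIrregular` fixtures used by the tests):

```python
import numpy as np

# Toy stand-ins for the test fixtures: each model position is offset from the
# observed position by exactly 1.0, and the noise-map is 1.0 everywhere.
observed = np.array([1.0, 2.0])
model = np.array([0.0, 1.0])
noise = np.array([1.0, 1.0])

residual_map = np.abs(observed - model)                    # -> [1.0, 1.0]
chi_squared = float(np.sum((residual_map / noise) ** 2))   # -> 2.0
```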
# --- File: run_tests.py (repos: luzfcb/sqlalchemy-access [MIT], wcc526/awvspy [Apache-2.0]) ---
from sqlalchemy.dialects import registry
registry.register("access", "sqlalchemy_access.pyodbc", "AccessDialect_pyodbc")
registry.register("access.pyodbc", "sqlalchemy_access.pyodbc", "AccessDialect_pyodbc")
from sqlalchemy.testing import runner
runner.main()
# --- File: hataripy/pest/__init__.py (repo: cclauss/hataripy, license: MIT) ---
from .tplarray import Util3dTpl
from .params import Params, zonearray2params
from .templatewriter import TemplateWriter
from .tplarray import Transient2dTpl, Util2dTpl, Util3dTpl
# --- File: hops/pylightcurve3/exoplanet_lc_fitting.py (repo: ExoWorldsSpies/hops, license: MIT) ---
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import numpy as np
import pickle
import matplotlib
if os.environ.get('DISPLAY', '') == '':
print('no display found. Using non-interactive Agg backend')
matplotlib.use('Agg')
else:
try:
matplotlib.use('TkAgg')
except ImportError:
print('matplotlib.pyplot has been already imported. Tk features will not be supported')
pass
import matplotlib.pyplot as plt
from ._0errors import *
from .exoplanet_orbit import transit_projected_distance, transit_duration
from .exoplanet_lc import transit_flux_drop
from .analysis_emcee_fitting import EmceeFitting
class TransitAndPolyFitting:
def __init__(self, data,
method, limb_darkening_coefficients,
rp_over_rs, period, sma_over_rs, eccentricity, inclination, periastron, mid_time,
iterations, walkers, burn, precision=3,
exp_time=0, time_factor=1, fit_first_order=False, fit_second_order=False,
fit_rp_over_rs=False, fit_period=False, fit_sma_over_rs=False,
fit_eccentricity=False, fit_inclination=False, fit_periastron=False, fit_mid_time=False,
counter='auto', function_to_call=None):
# TODO check input parameters
self.data = data
self.total_sets = len(self.data)
self.method = method
self.limb_darkening_coefficients = limb_darkening_coefficients
self.fit_ld = False
if isinstance(self.limb_darkening_coefficients, str):
if self.limb_darkening_coefficients == 'fit':
self.fit_ld = True
if self.method == 'linear':
self.limb_darkening_coefficients = [0.5]
elif self.method in ['quad', 'sqrt']:
self.limb_darkening_coefficients = [0.5, 0.5]
elif self.method == 'claret':
self.limb_darkening_coefficients = [0.5, 0.5, 0.5, 0.5]
if self.method == 'claret':
self.total_ldcs = 4
elif self.method in ['quad', 'sqrt']:
self.total_ldcs = 2
elif self.method == 'linear':
self.total_ldcs = 1
self.rp_over_rs = rp_over_rs
self.period = period
self.sma_over_rs = sma_over_rs
self.eccentricity = eccentricity
self.inclination = inclination
self.periastron = periastron
self.mid_time = mid_time
self.precision = precision
self.exp_time = exp_time / (60.0 * 60.0 * 24.0)
self.time_factor = time_factor
self.fit_first_order = fit_first_order
self.fit_second_order = fit_second_order
self.fit_rp_over_rs = fit_rp_over_rs
self.fit_period = fit_period
self.fit_sma_over_rs = fit_sma_over_rs
self.fit_eccentricity = fit_eccentricity
self.fit_inclination = fit_inclination
self.fit_periastron = fit_periastron
self.fit_mid_time = fit_mid_time
self.iterations = iterations
self.walkers = walkers
self.burn = burn
self.counter = counter
self.data_time = np.array([])
self.data_flux = np.array([])
self.data_flux_error = np.array([])
self.data_set_number = np.array([])
self.data_set_dt = np.array([])
for set_number, set_arrays in enumerate(self.data):
if set_number == 0:
time_shift = round((np.mean(set_arrays[0]) - self.mid_time) / self.period)
self.mid_time += time_shift * self.period
if self.fit_mid_time:
self.fit_mid_time[0] += time_shift * self.period
self.fit_mid_time[1] += time_shift * self.period
self.data_time = np.append(self.data_time, set_arrays[0])
self.data_flux = np.append(self.data_flux, set_arrays[1])
self.data_flux_error = np.append(self.data_flux_error, set_arrays[2])
self.data_set_number = np.append(self.data_set_number, np.ones_like(set_arrays[0]) * set_number)
self.data_set_dt = np.append(self.data_set_dt, set_arrays[0] - set_arrays[0][0])
self.data_set_number = np.int_(self.data_set_number)
self.names = []
self.print_names = []
self.limits1 = []
self.limits2 = []
self.initial = []
for set_number, set_arrays in enumerate(self.data):
max_limit = (10 * (max(set_arrays[1]) - min(set_arrays[1])) / (max(set_arrays[0]) - min(set_arrays[0])) /
np.mean(set_arrays[1]))
self.names.append('N{0}'.format(str(set_number)))
self.print_names.append('N_{0}'.format(str(set_number)))
self.initial.append(np.mean(set_arrays[1]))
self.limits1.append(0.9 * np.min(set_arrays[1]))
self.limits2.append(1.1 * np.max(set_arrays[1]))
self.names.append('L{0}'.format(str(set_number)))
self.print_names.append('L_{0}'.format(str(set_number)))
self.initial.append(0)
if self.fit_first_order:
self.limits1.append(-3)
self.limits2.append(3)
else:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
self.names.append('Q{0}'.format(str(set_number)))
self.print_names.append('Q_{0}'.format(str(set_number)))
self.initial.append(0)
if self.fit_second_order:
self.limits1.append(-3)
self.limits2.append(3)
else:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
self.names.append('ldc1')
self.print_names.append('ldc_1')
self.initial.append(self.limb_darkening_coefficients[0])
if self.fit_ld:
self.limits1.append(0.000001)
self.limits2.append(0.999999)
else:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
if self.method in ['claret', 'quad', 'sqrt']:
self.names.append('ldc2')
self.print_names.append('ldc_2')
self.initial.append(self.limb_darkening_coefficients[1])
if self.fit_ld:
self.limits1.append(0.000001)
self.limits2.append(0.999999)
else:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
if self.method == 'claret':
self.names.append('ldc3')
self.print_names.append('ldc_3')
self.initial.append(self.limb_darkening_coefficients[2])
if self.fit_ld:
self.limits1.append(0.000001)
self.limits2.append(0.999999)
else:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
self.names.append('ldc4')
self.print_names.append('ldc_4')
self.initial.append(self.limb_darkening_coefficients[3])
if self.fit_ld:
self.limits1.append(0.000001)
self.limits2.append(0.999999)
else:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
self.names += ['rp', 'P', 'a', 'e', 'i', 'w', 'mt']
        self.print_names += [r'R_\mathrm{p}/R_*', 'P', r'a/R_*', 'e', 'i', r'\omega', r'T_{mid}']
self.initial += [self.rp_over_rs, self.period, self.sma_over_rs, self.eccentricity,
self.inclination, self.periastron, self.mid_time]
limits = self.limits1 + [self.fit_rp_over_rs, self.fit_period, self.fit_sma_over_rs, self.fit_eccentricity,
self.fit_inclination, self.fit_periastron, self.fit_mid_time]
for var in range(3 * self.total_sets + self.total_ldcs, len(self.names)):
try:
self.initial[var] = float(self.initial[var])
            except (TypeError, ValueError):
                raise PyLCInputError('Improper value for {0}'.format(self.names[var]))
if limits[var] is False:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
elif limits[var] is None:
self.limits1.append(np.nan)
self.limits2.append(np.nan)
else:
try:
if len(np.array(limits[var])) != 2:
raise RuntimeError('Improper limits for {0}'.format(self.names[var]))
            except TypeError:
                raise RuntimeError('Improper limits for {0}'.format(self.names[var]))
if self.initial[var] < np.array(limits[var])[0] or self.initial[var] > np.array(limits[var])[1]:
raise RuntimeError('Initial value for {0} is outside the range of the prior.'.format(
self.names[var]))
else:
self.limits1.append(np.array(limits[var])[0])
self.limits2.append(np.array(limits[var])[1])
if self.exp_time == 0:
self.data_time_hr = self.data_time
else:
self.data_time_hr = np.array([])
for i in range(self.time_factor):
self.data_time_hr = np.append(self.data_time_hr, self.data_time - self.exp_time / 2.0 +
(i + 0.5) * self.exp_time / self.time_factor)
self.fitting = EmceeFitting([self.data_flux, self.data_flux_error],
self.full_model, self.initial, self.limits1, self.limits2,
self.walkers, self.iterations, self.burn,
names=self.names, print_names=self.print_names,
counter=self.counter, function_to_call=function_to_call)
self.results = 0
self.mcmc_run_complete = False
def detrend_model(self, *model_variables):
detrend_zero = np.array([model_variables[3 * xx] for xx in range(self.total_sets)])
detrend_zero = detrend_zero[self.data_set_number]
detrend_one = np.array([model_variables[3 * xx + 1] for xx in range(self.total_sets)])
detrend_one = detrend_one[self.data_set_number]
detrend_two = np.array([model_variables[3 * xx + 2] for xx in range(self.total_sets)])
detrend_two = detrend_two[self.data_set_number]
return detrend_zero * (1 + detrend_one * self.data_set_dt +
detrend_two * self.data_set_dt * self.data_set_dt)
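A minimal sketch of the per-set polynomial that `detrend_model` evaluates. The names here are hypothetical stand-ins: `n`, `l`, `q` correspond to the `N{i}`, `L{i}`, `Q{i}` parameters of one data set and `dt` to that set's `data_set_dt` (time elapsed since its first point):

```python
import numpy as np

def quadratic_trend(dt, n, l, q):
    # Out-of-transit flux trend: normalisation times a quadratic in elapsed time.
    return n * (1.0 + l * dt + q * dt * dt)

dt = np.array([0.0, 0.5, 1.0])
trend = quadratic_trend(dt, n=2.0, l=0.1, q=0.01)
# At dt=0 the trend reduces to the normalisation n; the linear and quadratic
# coefficients only bend the baseline away from it as time advances.
```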
def transit_model(self, *model_variables):
limb_darkening_coefficients = list(model_variables)[3 * self.total_sets: 3 * self.total_sets + self.total_ldcs]
rp_over_rs = list(model_variables)[3 * self.total_sets + self.total_ldcs]
z_over_rs = transit_projected_distance(*model_variables[3 * self.total_sets + self.total_ldcs + 1:],
time_array=self.data_time_hr)
transit_hr = transit_flux_drop(self.method, limb_darkening_coefficients, rp_over_rs, z_over_rs,
precision=self.precision)
return np.mean(np.reshape(transit_hr, (self.time_factor, len(self.data_time))), 0)
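The finite-exposure-time handling above can be sketched in isolation: each observation time is replaced by `time_factor` sub-exposure midpoints, the model is evaluated on the oversampled grid, and the reshape/mean collapses it back to one value per observation, mirroring how `data_time_hr` is built and consumed by `transit_model`. A toy version with an assumed linear model (not the actual transit flux drop):

```python
import numpy as np

def supersample(model, times, exp_time, time_factor):
    # Build the oversampled time grid by appending whole shifted copies of
    # `times`, exactly as data_time_hr is constructed in __init__.
    times_hr = np.concatenate([
        times - exp_time / 2.0 + (i + 0.5) * exp_time / time_factor
        for i in range(time_factor)
    ])
    flux_hr = model(times_hr)
    # Rows are sub-exposures, columns are observations; average over rows.
    return np.mean(np.reshape(flux_hr, (time_factor, len(times))), 0)

# For a linear model the symmetric sub-exposure offsets average out, so the
# supersampled flux equals the midpoint evaluation.
flux = supersample(lambda t: 2.0 * t, np.array([1.0, 2.0]), exp_time=0.1, time_factor=4)
```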
def full_model(self, *model_variables):
return self.detrend_model(*model_variables) * self.transit_model(*model_variables)
def run_mcmc(self):
self.fitting.run_mcmc()
def rerun_mcmc(self):
self.fitting.rerun_mcmc()
def get_results(self):
self.fitting.get_results()
self.results = self.fitting.results
self.mcmc_run_complete = True
self.results['input_series']['hjd'] = self.data_time
period = self.results['parameters']['P']['value']
mt = self.results['parameters']['mt']['value']
self.results['output_series']['phase'] = \
(self.data_time - mt) / period - np.round((self.data_time - mt) / period)
self.results['detrended_input_series'] = {
'hjd': self.results['input_series']['hjd'],
'value': self.results['input_series']['value'] / self.detrend_model(*self.results['parameters_final']),
'error': self.results['input_series']['error'] / self.detrend_model(*self.results['parameters_final'])}
self.results['detrended_output_series'] = {
'phase': self.results['output_series']['phase'],
'model': self.results['output_series']['model'] / self.detrend_model(*self.results['parameters_final']),
'residuals': (self.results['output_series']['residuals']
/ self.detrend_model(*self.results['parameters_final']))}
self.results['detrended_statistics'] = {ff: self.results['statistics'][ff] for ff in self.results['statistics']}
self.results['detrended_statistics']['res_std'] = np.std(self.results['detrended_output_series']['residuals'])
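The phase computation used in `get_results` can be sketched as a standalone helper (the `fold` function below is hypothetical): subtracting the mid-time, dividing by the period, and removing the nearest integer epoch maps every time stamp to a phase in [-0.5, 0.5]:

```python
import numpy as np

def fold(times, mid_time, period):
    # Epoch number of each time stamp relative to the fitted mid-transit time.
    epochs = (times - mid_time) / period
    # Removing the nearest integer leaves the fractional orbital phase.
    return epochs - np.round(epochs)

phase = fold(np.array([10.0, 13.5, 16.1]), mid_time=10.0, period=3.0)
```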
def save_all(self, export_file):
if not self.mcmc_run_complete:
raise PyLCProcessError('MCMC not completed')
pickle.dump(self.results, open(export_file, 'wb'))
def save_results(self, export_file):
self.fitting.save_results(export_file)
def plot_corner(self, export_file):
self.fitting.plot_corner(export_file)
def plot_traces(self, export_file):
self.fitting.plot_traces(export_file)
def plot_models(self, export_file, target=None, data_dates=None):
if not self.mcmc_run_complete:
raise PyLCProcessError('MCMC not completed')
if target is None:
target = ' '
if data_dates is None:
data_dates = list(map(str, ['set_{0}'.format(str(ff)) for ff in range(1, self.total_sets + 1)]))
for set_number in range(self.total_sets):
fig = plt.figure(set_number + 1)
fig.set_tight_layout(False)
self.results = {ff: self.results[ff] for ff in self.results}
period = self.results['parameters']['P']['value']
mt = self.results['parameters']['mt']['value']
mt += round((np.mean(self.data[set_number][0]) - mt) / period) * period
prediction = (self.mid_time +
round((np.mean(self.data[set_number][0]) - self.mid_time) / self.period) * self.period)
duration = transit_duration(self.rp_over_rs, self.period, self.sma_over_rs,
self.inclination, self.eccentricity, self.periastron)
ingress = prediction - duration / 2
egress = prediction + duration / 2
set_indices = np.where(self.data_set_number == set_number)
plt.subplot2grid((4, 1), (0, 0), rowspan=3)
plt.plot(self.results['output_series']['phase'][set_indices],
self.results['input_series']['value'][set_indices], 'ko', ms=2)
plt.plot(self.results['output_series']['phase'][set_indices],
self.results['output_series']['model'][set_indices], 'r-')
plt.ylim(min(self.results['output_series']['model'][set_indices])
- 5 * np.std(self.results['output_series']['residuals'][set_indices]),
max(self.results['output_series']['model'][set_indices])
+ 5 * np.std(self.results['output_series']['residuals'][set_indices]))
plt.yticks(plt.yticks()[0][1:])
plt.ylabel(r'$\mathrm{relative} \ \mathrm{flux}$', fontsize=15)
plt.ylim(min(self.results['output_series']['model'][set_indices])
- 5 * np.std(self.results['output_series']['residuals'][set_indices]),
max(self.results['output_series']['model'][set_indices])
+ 5 * np.std(self.results['output_series']['residuals'][set_indices]))
x_max = max(np.abs(self.results['output_series']['phase'][set_indices]) +
0.05 * (max(self.results['output_series']['phase'][set_indices]) -
min(self.results['output_series']['phase'][set_indices])))
plt.xlim(-x_max, x_max)
            plt.tick_params(labelbottom=False)
rpstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
r'$R_\mathrm{p}/R_* = ', self.results['parameters']['rp']['print_value'], '_{-',
self.results['parameters']['rp']['print_m_error'], '}', '^{+',
self.results['parameters']['rp']['print_p_error'], '}$')
mtstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
r'$T_\mathrm{HJD} = ', self.results['parameters']['mt']['print_value'], '_{-',
self.results['parameters']['mt']['print_m_error'], '}', '^{+',
self.results['parameters']['mt']['print_p_error'], '}$')
plt.text(plt.xlim()[0] + 0.5 * (plt.xlim()[-1] - plt.xlim()[0]),
plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
'{0}\n{1}'.format(rpstr, mtstr), ha='center', va='center', fontsize=10)
plt.axvline((ingress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
plt.text((ingress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
'{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{ingress}$', '\n',
r'$\mathrm{start}$'),
ha='right', va='top', fontsize=10)
plt.axvline((egress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
plt.text((egress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
'{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{egress}$', '\n',
r'$\mathrm{end}$'),
ha='left', va='top', fontsize=10)
plt.suptitle('{0}{1}{2}'.format(r'$\mathbf{', target, '}$'), fontsize=20)
plt.text(plt.xlim()[1], plt.ylim()[1], '{0}{1}{2}'.format(r'$', data_dates[set_number], '$'),
fontsize=12, ha='right', va='bottom')
plt.subplot(4, 1, 4)
plt.cla()
plt.plot(self.results['output_series']['phase'][set_indices],
self.results['output_series']['residuals'][set_indices], 'ko', ms=2)
plt.plot(self.results['output_series']['phase'][set_indices],
np.zeros_like(self.results['output_series']['phase'][set_indices]), 'r-')
plt.ylim(- 5 * np.std(self.results['output_series']['residuals'][set_indices]),
5 * np.std(self.results['output_series']['residuals'][set_indices]))
plt.xlabel(r'$\mathrm{phase}$', fontsize=15)
plt.ylabel(r'$\mathrm{residuals}$', fontsize=15)
plt.xlim(-x_max, x_max)
plt.text(plt.xlim()[0] + 0.02 * (plt.xlim()[-1] - plt.xlim()[0]),
plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
r'$\mathrm{rms}_\mathrm{res} = %.1e$' %
np.std(self.results['output_series']['residuals'][set_indices]), fontsize=10)
plt.subplots_adjust(left=0.15, right=0.975, bottom=0.12, top=0.9, hspace=0.0)
plt.savefig(os.path.join(os.path.split(export_file)[0],
'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
transparent=True)
plt.close('all')
def plot_detrended_models(self, export_file, target=None, data_dates=None, return_plot=False):
if not self.mcmc_run_complete:
raise PyLCProcessError('MCMC not completed')
if target is None:
target = ' '
if data_dates is None:
data_dates = list(map(str, ['set_{0}'.format(str(ff)) for ff in range(1, self.total_sets + 1)]))
for set_number in range(self.total_sets):
fig = plt.figure(set_number + 1)
fig.set_tight_layout(False)
self.results = {ff: self.results[ff] for ff in self.results}
period = self.results['parameters']['P']['value']
mt = self.results['parameters']['mt']['value']
mt += round((np.mean(self.data[set_number][0]) - mt) / period) * period
prediction = (self.mid_time +
round((np.mean(self.data[set_number][0]) - self.mid_time) / self.period) * self.period)
duration = transit_duration(self.rp_over_rs, self.period, self.sma_over_rs,
self.inclination, self.eccentricity, self.periastron)
ingress = prediction - duration / 2
egress = prediction + duration / 2
set_indices = np.where(self.data_set_number == set_number)
plt.subplot2grid((4, 1), (0, 0), rowspan=3)
plt.plot(self.results['detrended_output_series']['phase'][set_indices],
self.results['detrended_input_series']['value'][set_indices], 'ko', ms=2)
plt.plot(self.results['detrended_output_series']['phase'][set_indices],
self.results['detrended_output_series']['model'][set_indices], 'r-')
plt.ylim(min(self.results['detrended_output_series']['model'][set_indices])
- 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]),
max(self.results['detrended_output_series']['model'][set_indices])
+ 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]))
plt.yticks(plt.yticks()[0][1:])
plt.ylabel(r'$\mathrm{relative} \ \mathrm{flux}$', fontsize=15)
plt.ylim(min(self.results['detrended_output_series']['model'][set_indices])
- 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]),
max(self.results['detrended_output_series']['model'][set_indices])
+ 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]))
plt.ylim(-1.17647 * (- plt.ylim()[0] + 0.15 * plt.ylim()[1]), plt.ylim()[1])
x_max = max(np.abs(self.results['detrended_output_series']['phase'][set_indices]) +
0.05 * (max(self.results['detrended_output_series']['phase'][set_indices]) -
min(self.results['detrended_output_series']['phase'][set_indices])))
plt.xlim(-x_max, x_max)
            plt.tick_params(labelbottom=False)
rpstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
r'$R_\mathrm{p}/R_* = ', self.results['parameters']['rp']['print_value'], '_{-',
self.results['parameters']['rp']['print_m_error'], '}', '^{+',
self.results['parameters']['rp']['print_p_error'], '}$')
mtstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
r'$T_\mathrm{HJD} = ', self.results['parameters']['mt']['print_value'], '_{-',
self.results['parameters']['mt']['print_m_error'], '}', '^{+',
self.results['parameters']['mt']['print_p_error'], '}$')
plt.text(plt.xlim()[0] + 0.5 * (plt.xlim()[-1] - plt.xlim()[0]),
plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
'{0}{1}{2}'.format(rpstr, '\n', mtstr), ha='center', va='center', fontsize=10)
plt.axvline((ingress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
plt.text((ingress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
'{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{ingress}$', '\n',
r'$\mathrm{start}$'),
ha='right', va='top', fontsize=10)
plt.axvline((egress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
plt.text((egress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
'{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{egress}$', '\n',
r'$\mathrm{end}$'),
ha='left', va='top', fontsize=10)
plt.suptitle('{0}{1}{2}'.format(r'$\mathbf{', target, '}$'), fontsize=20)
plt.text(plt.xlim()[1], plt.ylim()[1], '{0}{1}{2}'.format(r'$', data_dates[set_number], '$'),
fontsize=12, ha='right', va='bottom')
plt.subplot(4, 1, 4)
plt.cla()
plt.plot(self.results['detrended_output_series']['phase'][set_indices],
self.results['detrended_output_series']['residuals'][set_indices], 'ko', ms=2)
plt.plot(self.results['detrended_output_series']['phase'][set_indices],
np.zeros_like(self.results['detrended_output_series']['phase'][set_indices]), 'r-')
plt.ylim(- 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]),
5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]))
plt.xlabel(r'$\mathrm{phase}$', fontsize=15)
plt.ylabel(r'$\mathrm{residuals}$', fontsize=15)
plt.xlim(-x_max, x_max)
plt.text(plt.xlim()[0] + 0.02 * (plt.xlim()[-1] - plt.xlim()[0]),
plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
r'$\mathrm{rms}_\mathrm{res} = %.1e$' %
np.std(self.results['detrended_output_series']['residuals'][set_indices]),
fontsize=10)
plt.subplots_adjust(left=0.15, right=0.975, bottom=0.12, top=0.9, hspace=0.0)
plt.savefig(os.path.join(os.path.split(export_file)[0],
'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
transparent=True)
if return_plot:
return [plt.figure(ff + 1) for ff in range(self.total_sets)]
else:
plt.close('all')
def save_models(self, export_file):
if not self.mcmc_run_complete:
raise PyLCProcessError('MCMC not completed')
for set_number in range(self.total_sets):
self.results = {ff: self.results[ff] for ff in self.results}
            set_indices = np.where(self.data_set_number == set_number)
np.savetxt(os.path.join(os.path.split(export_file)[0],
'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
np.swapaxes([self.results['input_series']['hjd'][set_indices],
self.results['output_series']['phase'][set_indices],
self.results['input_series']['value'][set_indices],
self.results['input_series']['error'][set_indices],
self.results['output_series']['model'][set_indices],
self.results['output_series']['residuals'][set_indices]
], 0, 1))
def save_detrended_models(self, export_file):
if not self.mcmc_run_complete:
raise PyLCProcessError('MCMC not completed')
for set_number in range(self.total_sets):
self.results = {ff: self.results[ff] for ff in self.results}
            set_indices = np.where(self.data_set_number == set_number)
np.savetxt(os.path.join(os.path.split(export_file)[0],
'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
np.swapaxes([self.results['detrended_input_series']['hjd'][set_indices],
self.results['detrended_output_series']['phase'][set_indices],
self.results['detrended_input_series']['value'][set_indices],
self.results['detrended_input_series']['error'][set_indices],
self.results['detrended_output_series']['model'][set_indices],
self.results['detrended_output_series']['residuals'][set_indices]
], 0, 1))
#
# class TransitAndHubbleFitting:
#
# def __init__(self, data,
# apply_up_down_stream_correction,
# exclude_initial_orbits, exclude_final_orbits, exclude_initial_orbit_points,
# first_orbit_ramp, second_order_ramp, mid_orbit_ramps,
# method, limb_darkening_coefficients,
# rp_over_rs, period, sma_over_rs, eccentricity, inclination, periastron, mid_time,
# iterations, walkers, burn, precision=3,
# exp_time=0, time_factor=1,
# fit_rp_over_rs=False, fit_period=False, fit_sma_over_rs=False,
# fit_eccentricity=False, fit_inclination=False, fit_periastron=False, fit_mid_time=False,
# counter=True, counter_window=False):
#
# # TODO check input parameters
#
# self.method = method
# self.limb_darkening_coefficients = limb_darkening_coefficients
# self.fit_ld = False
#
# if isinstance(self.limb_darkening_coefficients, str):
# if self.limb_darkening_coefficients == 'fit':
# self.fit_ld = True
# if self.method == 'linear':
# self.limb_darkening_coefficients = [0.5]
# elif self.method in ['quad', 'sqrt']:
# self.limb_darkening_coefficients = [0.5, 0.5]
# elif self.method == 'claret':
# self.limb_darkening_coefficients = [0.5, 0.5, 0.5, 0.5]
#
# if self.method == 'claret':
# self.total_ldcs = 4
# elif self.method in ['quad', 'sqrt']:
# self.total_ldcs = 2
# elif self.method == 'linear':
# self.total_ldcs = 1
#
# self.rp_over_rs = rp_over_rs
# self.period = period
# self.sma_over_rs = sma_over_rs
# self.eccentricity = eccentricity
# self.inclination = inclination
# self.periastron = periastron
# self.mid_time = mid_time
# self.precision = precision
# self.time_factor = time_factor
#
# self.fit_second_order_ramp = second_order_ramp
#
# self.fit_rp_over_rs = fit_rp_over_rs
# self.fit_period = fit_period
# self.fit_sma_over_rs = fit_sma_over_rs
# self.fit_eccentricity = fit_eccentricity
# self.fit_inclination = fit_inclination
# self.fit_periastron = fit_periastron
# self.fit_mid_time = fit_mid_time
#
# self.data = {}
# self.sets = ['set_{0}'.format(str(ff).zfill(2)) for ff in range(len(data))]
# self.total_sets = len(data)
# self.data_set_number = np.array([])
# self.data_time = np.array([])
# data_flux = np.array([])
# data_flux_error = np.array([])
#
# for set_number, set_arrays in enumerate(data):
#
# new_set_arrays = [ff for ff in set_arrays]
#
# # up-stream / down-stream correction
#
# star_y_position_array = new_set_arrays[1]
# spectrum_direction_array = new_set_arrays[2]
# scan_length_array = new_set_arrays[3]
#
# if apply_up_down_stream_correction:
# test1 = star_y_position_array[0] - 507
# test2 = test1 + spectrum_direction_array[0] * scan_length_array[0]
# if test1 * test2 < 0:
# apply_up_down_stream_correction = True
# else:
# apply_up_down_stream_correction = False
#
# if apply_up_down_stream_correction:
# for scan_direction in [1.0, -1.0]:
# fr = np.where(spectrum_direction_array == scan_direction)[0]
# if len(fr) > 0:
# zerofr = star_y_position_array.value[fr]
# sigmfr = scan_length_array.value[fr]
# begnfr = zerofr
# fitfr = np.poly1d(np.polyfit(begnfr, sigmfr, 1))
# for ii in range(4, len(new_set_arrays)):
# new_set_arrays[ii][fr] = new_set_arrays[ii][fr] * fitfr(begnfr[0]) / fitfr(begnfr)
#
# # exclude orbits / points
#
# heliocentric_julian_date_array = new_set_arrays[0]
#
# indices_to_remain = np.arange(len(heliocentric_julian_date_array))
#
# if exclude_initial_orbits > 0:
# htime = heliocentric_julian_date_array
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 30.0 / 60.0 / 24.0)[0]
# indices_to_remain = indices_to_remain[orbits[exclude_initial_orbits]:]
#
# if exclude_final_orbits > 0:
# htime = heliocentric_julian_date_array[indices_to_remain]
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 30.0 / 60.0 / 24.0)[0]
# indices_to_remain = indices_to_remain[:orbits[-exclude_final_orbits]]
#
# if exclude_initial_orbit_points > 0:
# htime = heliocentric_julian_date_array[indices_to_remain]
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 30.0 / 60.0 / 24.0)[0]
# indices_to_remain = np.delete(indices_to_remain,
# np.concatenate(
# [orbits + i for i in range(exclude_initial_orbit_points)]))
#
# new_set_arrays = [ff[indices_to_remain] for ff in new_set_arrays]
#
# # define hst orbital phases
#
# heliocentric_julian_date_array = new_set_arrays[0]
#
# if mid_orbit_ramps:
# htime = heliocentric_julian_date_array
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 30.0 / 60.0 / 24.0)[0]
# dumps = np.where(abs(htime - np.roll(htime, 1)) > 5.0 / 60.0 / 24.0)[0]
# dphase = np.zeros(len(htime))
# for i in range(1, len(dumps)):
# if dumps[i] not in orbits:
# if i != len(dumps) - 1:
# for j in range(dumps[i], dumps[i + 1]):
# dphase[j] = 1
# else:
# for j in range(dumps[i], len(dphase)):
# dphase[j] = 1
# else:
# htime = heliocentric_julian_date_array
# dphase = np.zeros(len(htime))
#
# if first_orbit_ramp:
# htime = heliocentric_julian_date_array
# if mid_orbit_ramps:
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 5.0 / 60.0 / 24.0)[0]
# else:
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 30.0 / 60.0 / 24.0)[0]
# orbits = htime[orbits]
# fphase = np.where(htime < orbits[1], 1, 0)
#
# else:
# htime = heliocentric_julian_date_array
# fphase = np.zeros(len(htime))
#
# htime = heliocentric_julian_date_array
# if mid_orbit_ramps:
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 5.0 / 60.0 / 24.0)[0]
# else:
# orbits = np.where(abs(htime - np.roll(htime, 1)) > 30.0 / 60.0 / 24.0)[0]
# t0s = htime[orbits]
# ophase = []
# for pp in t0s:
# ppp = htime - pp
# ppp = np.where(ppp < 0, 1000, ppp)
# ophase.append(ppp)
#
# ophase = np.min(ophase, 0)
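The commented-out logic above detects HST orbit boundaries as gaps larger than ~30 minutes between consecutive time stamps, then assigns each point a phase equal to the time since the most recent orbit start. A minimal NumPy sketch of the same idea (the time values are illustrative, not real data):

```python
import numpy as np

# Times in days: two "orbits" separated by a gap much larger than 30 minutes.
htime = np.array([0.000, 0.005, 0.010, 0.060, 0.065, 0.070])

gap = 30.0 / 60.0 / 24.0  # 30 minutes expressed in days

# An orbit starts wherever the step from the previous point exceeds the gap.
# np.roll pairs each time with its predecessor; index 0 always qualifies
# because it is compared against the last element of the array.
orbit_starts = np.where(np.abs(htime - np.roll(htime, 1)) > gap)[0]

# Phase = time since the most recent orbit start (as in the ophase loop above):
# negative offsets are masked with a large value before taking the minimum.
t0s = htime[orbit_starts]
deltas = [np.where(htime - t0 < 0, 1000.0, htime - t0) for t0 in t0s]
ophase = np.min(deltas, 0)
```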
#
# # outliers filter
#
# lightcurves = [ff for ff in new_set_arrays[6::2]]
# ica = FastICA(n_components=len(lightcurves), max_iter=1000)
# components = ica.fit_transform(np.array(lightcurves).T).T
#
# indices_to_remain = []
# for i in components:
# indices_to_remain.append(
# np.array(np.abs(i - np.median(i)) < 20 * np.median(np.abs(i - np.median(i)))))
# indices_to_remain = np.where(np.prod(indices_to_remain, 0))[0]
# indices_to_remain = np.sort(np.unique(np.array(indices_to_remain)))
#
# new_set_arrays = [ff[indices_to_remain] for ff in new_set_arrays]
# ophase = ophase[indices_to_remain]
# dphase = dphase[indices_to_remain]
# fphase = fphase[indices_to_remain]
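The outlier filter above keeps only points whose ICA components stay within 20 median-absolute-deviations of each component's median. The core masking step, sketched on a single synthetic array (the real code applies it per FastICA component and intersects the masks):

```python
import numpy as np

signal = np.array([1.0, 1.1, 0.9, 1.0, 50.0, 1.05])  # one obvious outlier

# Median absolute deviation (MAD): a robust spread estimate.
mad = np.median(np.abs(signal - np.median(signal)))

# Keep points within 20 MAD of the median.
keep = np.abs(signal - np.median(signal)) < 20 * mad

indices_to_remain = np.where(keep)[0]
```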
#
# # match forward and reverse scans
#
# spectrum_direction_array = new_set_arrays[2]
# flux_array = new_set_arrays[4]
#
# fr = np.where(spectrum_direction_array > 0)[0]
# if len(fr) != len(spectrum_direction_array):
#
# fr_out = np.where(spectrum_direction_array > 0)[0]
# rv_out = np.where(spectrum_direction_array < 0)[0]
# shift = np.mean(flux_array[fr_out]) / np.mean(flux_array[rv_out])
# for ii in range(4, len(new_set_arrays)):
# new_set_arrays[ii][fr] = new_set_arrays[ii][fr] / shift
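The scan-matching step above rescales forward-scan fluxes by the ratio of the mean forward flux to the mean reverse flux, so both scan directions share one baseline. A self-contained sketch with toy numbers:

```python
import numpy as np

scan = np.array([1.0, -1.0, 1.0, -1.0])   # +1 forward, -1 reverse
flux = np.array([1.02, 1.00, 1.02, 1.00])  # forward scans sit 2% higher

fr = np.where(scan > 0)[0]
rv = np.where(scan < 0)[0]

# Ratio of baselines; dividing the forward fluxes by it aligns the two.
shift = np.mean(flux[fr]) / np.mean(flux[rv])
flux_matched = flux.copy()
flux_matched[fr] = flux_matched[fr] / shift
```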
#
# if set_number == 0:
# time_shift = round((np.mean(new_set_arrays[0]) - self.mid_time) / self.period)
# self.mid_time += time_shift * self.period
# if self.fit_mid_time:
# self.fit_mid_time[0] += time_shift * self.period
# self.fit_mid_time[1] += time_shift * self.period
#
# data_flux = np.append(data_flux, new_set_arrays[4])
# data_flux_error = np.append(data_flux_error, new_set_arrays[5])
# self.data_time = np.append(self.data_time, new_set_arrays[0])
# self.data_set_number = np.int_(np.append(self.data_set_number,
# np.ones_like(new_set_arrays[0]) * set_number))
#
# if exp_time == 0:
# hjd_hd = new_set_arrays[0]
#
# else:
# hjd_hd = np.array([])
# for i in range(self.time_factor):
# hjd_hd = np.append(hjd_hd, new_set_arrays[0] - (exp_time / (60.0 * 60.0 * 24.0)) / 2.0 +
# (i + 0.5) * (exp_time / (60.0 * 60.0 * 24.0)) / self.time_factor)
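When `exp_time` is non-zero, each exposure is resampled into `self.time_factor` sub-exposure time stamps spread symmetrically across the exposure window (with `exp_time` converted from seconds to days). A sketch of that resampling for a single exposure, using hypothetical numbers:

```python
import numpy as np

hjd = np.array([100.0])  # one exposure mid-time, in days
exp_time = 86.4          # exposure duration in seconds (86.4 s = 0.001 d)
time_factor = 4          # number of sub-exposures per exposure

exp_days = exp_time / (60.0 * 60.0 * 24.0)

# Sub-exposure mid-times: start half an exposure before the stamp, then
# step by exp_days / time_factor, sampling each sub-interval at its centre.
hjd_hd = np.array([])
for i in range(time_factor):
    hjd_hd = np.append(hjd_hd,
                       hjd - exp_days / 2.0 + (i + 0.5) * exp_days / time_factor)
```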
#
# epoch = round((np.mean(new_set_arrays[0]) - self.mid_time) / self.period)
#
# self.data[self.sets[set_number]] = {'epoch': epoch,
# 'hjd': new_set_arrays[0],
# 'ophase': ophase,
# 'dphase': dphase,
# 'fphase': fphase,
# 'scan': new_set_arrays[2],
# 'hjd_hd': hjd_hd,
# 'flux': new_set_arrays[4],
# 'error': new_set_arrays[5],
# 'pindices': []}
#
# names = []
# print_names = []
# limits1 = []
# limits2 = []
# initial = []
#
# for set_number, set_name in enumerate(self.sets):
#
# flux = self.data[set_name]['flux']
# scan = self.data[set_name]['scan']
# fphase = self.data[set_name]['fphase']
# dphase = self.data[set_name]['dphase']
#
# # forward scans normalisation factors
# names.append('n_w_for_{0}'.format(str(set_number)))
# print_names.append('n{}{}{}'.format('w', 'for', str(set_number)))
# initial.append(np.max(flux))
# if (scan < 0).all():
# limits1.append(np.nan)
# limits2.append(np.nan)
# else:
# limits1.append(np.max(flux) * 0.99)
# limits2.append(np.max(flux) * 1.01)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# print(np.median(flux), np.max(flux), np.max(flux) * 0.99, np.max(flux) * 1.01)
#
# # reverse scans normalisation factors
# names.append('n_w_rev_{0}'.format(str(set_number)))
# print_names.append('n{}{}{}'.format('w', 'rev', str(set_number)))
# initial.append(np.max(flux))
# if (scan > 0).all():
# limits1.append(np.nan)
# limits2.append(np.nan)
# else:
# limits1.append(np.max(flux) * 0.99)
# limits2.append(np.max(flux) * 1.01)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # long term ramp - 1st order
# names.append('r_a1_{0}'.format(str(set_number)))
# print_names.append('r{}{}'.format('a1', str(set_number)))
# initial.append(0.001)
# limits1.append(-1.0)
# limits2.append(1.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # long term ramp - 2nd order
# names.append('r_a2_{0}'.format(str(set_number)))
# print_names.append('r{}{}'.format('a2', str(set_number)))
# initial.append(0.0)
# if self.fit_second_order_ramp:
# limits1.append(-1.0)
# limits2.append(1.0)
# else:
# limits1.append(np.nan)
# limits2.append(np.nan)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # short term ramp - amplitude
# names.append('r_b1_{0}'.format(str(set_number)))
# print_names.append('r{}{}'.format('b1', str(set_number)))
# initial.append(0.001)
# limits1.append(-1.0)
# limits2.append(1.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # short term mid-orbit ramp - amplitude
# names.append('mor_b1_{0}'.format(str(set_number)))
# print_names.append('mor{}{}'.format('b1', str(set_number)))
# initial.append(0.001)
# if np.sum(dphase ** 2) == 0:
# limits1.append(np.nan)
# limits2.append(np.nan)
# else:
# limits1.append(-1.0)
# limits2.append(1.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # short term first-orbit ramp - amplitude
# names.append('for_b1_{0}'.format(str(set_number)))
# print_names.append('for{}{}'.format('b1', str(set_number)))
# initial.append(0.001)
# if np.sum(fphase ** 2) == 0:
# limits1.append(np.nan)
# limits2.append(np.nan)
# else:
# limits1.append(-1.0)
# limits2.append(1.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # short term ramp - decay
# names.append('r_b2_{0}'.format(str(set_number)))
# print_names.append('r{}{}'.format('b2', str(set_number)))
# initial.append(250.0)
# limits1.append(50.0)
# limits2.append(500.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # short term mid-orbit ramp - decay
# names.append('mor_b2_{0}'.format(str(set_number)))
# print_names.append('mor{}{}'.format('b2', str(set_number)))
# initial.append(250.0)
# if np.sum(dphase ** 2) == 0:
# limits1.append(np.nan)
# limits2.append(np.nan)
# else:
# limits1.append(50.0)
# limits2.append(500.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # short term first-orbit ramp - decay
# names.append('for_b2_{0}'.format(str(set_number)))
# print_names.append('for{}{}'.format('b2', str(set_number)))
# initial.append(150.0)
# if np.sum(fphase ** 2) == 0:
# limits1.append(np.nan)
# limits2.append(np.nan)
# else:
# limits1.append(50.0)
# limits2.append(500.0)
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# # rp
# names.append('rp_{0}'.format(str(set_number)))
# print_names.append('Rp/R*{}'.format(str(set_number)))
# initial.append(self.rp_over_rs)
# limits1.append(self.fit_rp_over_rs[0])
# limits2.append(self.fit_rp_over_rs[1])
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# self.len_systematics = int(len(names) / self.total_sets)
#
# names.append('ldc1')
# print_names.append('ldc_1')
# initial.append(self.limb_darkening_coefficients[0])
# if self.fit_ld:
# limits1.append(0.000001)
# limits2.append(0.999999)
# else:
# limits1.append(np.nan)
# limits2.append(np.nan)
#
# for set_name in self.sets:
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# if self.method in ['claret', 'quad', 'sqrt']:
#
# names.append('ldc2')
# print_names.append('ldc_2')
# initial.append(self.limb_darkening_coefficients[1])
# if self.fit_ld:
# limits1.append(0.000001)
# limits2.append(0.999999)
# else:
# limits1.append(np.nan)
# limits2.append(np.nan)
#
# for set_name in self.sets:
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# if self.method == 'claret':
#
# names.append('ldc3')
# print_names.append('ldc_3')
# initial.append(self.limb_darkening_coefficients[2])
# if self.fit_ld:
# limits1.append(0.000001)
# limits2.append(0.999999)
# else:
# limits1.append(np.nan)
# limits2.append(np.nan)
#
# for set_name in self.sets:
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# names.append('ldc4')
# print_names.append('ldc_4')
# initial.append(self.limb_darkening_coefficients[3])
# if self.fit_ld:
# limits1.append(0.000001)
# limits2.append(0.999999)
# else:
# limits1.append(np.nan)
# limits2.append(np.nan)
#
# for set_name in self.sets:
# self.data[set_name]['pindices'].append(len(names) - 1)
#
# for pindex in range(len(names), len(names) + 6):
# for set_name in self.sets:
# self.data[set_name]['pindices'].append(pindex)
#
# for set_name in self.sets:
# self.data[set_name]['pindices'] = np.int_(self.data[set_name]['pindices'])
#
# names += ['P', 'a', 'e', 'i', 'w', 'mt']
# print_names += ['P', 'a/R_*', 'e', 'i', '\omega', 'T_0']
#
# initial += [self.period, self.sma_over_rs, self.eccentricity,
# self.inclination, self.periastron, self.mid_time]
#
# limits = limits1 + [self.fit_period, self.fit_sma_over_rs, self.fit_eccentricity,
# self.fit_inclination, self.fit_periastron, self.fit_mid_time]
#
# for var in range(self.len_systematics * self.total_sets + self.total_ldcs, len(names)):
#
# try:
# initial[var] = float(initial[var])
# except:
# raise RuntimeError('Improper value for {0}'.format(names[var]))
#
# if limits[var] is False:
# limits1.append(np.nan)
# limits2.append(np.nan)
#
# elif limits[var] is None:
# limits1.append(np.nan)
# limits2.append(np.nan)
#
# else:
# try:
# if len(np.array(limits[var])) != 2:
# raise RuntimeError('Improper limits for {0}'.format(names[var]))
# except:
# raise RuntimeError('Improper limits for {0}'.format(names[var]))
#
# if initial[var] < np.array(limits[var])[0] or initial[var] > np.array(limits[var])[1]:
# raise RuntimeError('Initial value for {0} is outside the range of the prior.'.format(
# names[var]))
# else:
# limits1.append(np.array(limits[var])[0])
# limits2.append(np.array(limits[var])[1])
#
# self.fitting = EmceeFitting([data_flux, data_flux_error],
# self.full_model, initial, limits1, limits2,
# walkers, iterations, burn,
# names=names, print_names=print_names,
# counter=counter, counter_window=counter_window, strech_prior=1.0)
#
# self.results = {}
# self.mcmc_run_complete = False
#
# def detrend_model(self, *model_variables):
#
# model = []
#
# for set_name in self.sets:
#
# (model_norm_f, model_norm_r, model_r_a1, model_r_a2, model_r_b1, model_mor_b1, model_for_b1, model_r_b2,
# model_mor_b2, model_for_b2, model_rp) = np.array(
# model_variables)[self.data[set_name]['pindices']][:self.len_systematics]
# model_period = np.array(
# model_variables)[self.data[set_name]['pindices']][self.len_systematics + self.total_ldcs]
# model_mid_time = np.array(model_variables)[self.data[set_name]['pindices']][-1]
#
# model_ophase = self.data[set_name]['ophase']
# model_dphase = self.data[set_name]['dphase']
# model_fphase = self.data[set_name]['fphase']
# model_scan = self.data[set_name]['scan']
# model_hjd = self.data[set_name]['hjd']
# model_epoch = self.data[set_name]['epoch']
# model_vtime = model_hjd - (model_mid_time + model_epoch * model_period)
#
# normalisation = np.where(model_scan > 0, model_norm_f, model_norm_r)
# detrend1 = (1.0 - model_r_a1 * model_vtime + model_r_a2 * (model_vtime ** 2))
# ramp_ampl = np.where(model_dphase == 0, model_r_b1, model_mor_b1)
# ramp_ampl = np.where(model_fphase == 0, ramp_ampl, model_for_b1)
# ramp_decay = np.where(model_dphase == 0, model_r_b2, model_mor_b2)
# ramp_decay = np.where(model_fphase == 0, ramp_decay, model_for_b2)
# detrend2 = 1.0 - ramp_ampl * np.exp(- ramp_decay * model_ophase)
#
# model.append(normalisation * detrend1 * detrend2)
#
# return np.concatenate(model)
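`detrend_model` above combines a per-scan-direction normalisation, a quadratic long-term trend in time from mid-transit, and an exponential short-term ramp in HST orbital phase. The functional form, evaluated on toy arrays (all coefficient values hypothetical):

```python
import numpy as np

vtime = np.array([-0.1, 0.0, 0.1])    # days from mid-transit
ophase = np.array([0.0, 0.01, 0.02])  # days since orbit start
norm = 1.0e6                          # flux normalisation (one scan direction)
r_a1, r_a2 = 0.001, 0.0               # long-term ramp coefficients
r_b1, r_b2 = 0.001, 250.0             # short-term ramp amplitude and decay

# Long-term trend: linear plus quadratic in time from mid-transit.
detrend1 = 1.0 - r_a1 * vtime + r_a2 * vtime ** 2
# Short-term ramp: exponential recovery within each orbit.
detrend2 = 1.0 - r_b1 * np.exp(-r_b2 * ophase)

model = norm * detrend1 * detrend2
```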
#
# def transit_model(self, *model_variables):
#
# model = []
#
# for set_name in self.sets:
# model_rp_over_rs = np.array(model_variables)[self.data[set_name]['pindices']][10]
#
# model_hjd = self.data[set_name]['hjd']
# model_hjd_hd = self.data[set_name]['hjd_hd']
#
# limb_darkening_coefficients = np.array(
# model_variables)[self.data[set_name]['pindices']][self.len_systematics:
# self.len_systematics + self.total_ldcs]
#
# z_over_rs = transit_projected_distance(*np.array(
# model_variables)[self.data[set_name]['pindices']][self.len_systematics + self.total_ldcs:],
# time_array=model_hjd_hd)
#
# transit_hr = transit_flux_drop(self.method, limb_darkening_coefficients, model_rp_over_rs, z_over_rs,
# precision=self.precision)
#
# model.append(np.mean(np.reshape(transit_hr, (self.time_factor, len(model_hjd))), 0))
#
# return np.concatenate(model)
#
# def full_model(self, *model_variables):
#
# return self.detrend_model(*model_variables) * self.transit_model(*model_variables)
#
# def run_mcmc(self):
#
# self.fitting.run_mcmc()
# self.results = self.fitting.results
# self.mcmc_run_complete = True
#
# self.results['input_series']['hjd'] = self.data_time
#
# period = self.results['parameters']['P']['value']
# mt = self.results['parameters']['mt']['value']
# self.results['output_series']['phase'] = \
# (self.data_time - mt) / period - np.round((self.data_time - mt) / period)
#
# self.results['detrended_input_series'] = {
# 'hjd': self.results['input_series']['hjd'],
# 'value': self.results['input_series']['value'] / self.detrend_model(*self.results['parameters_final']),
# 'error': self.results['input_series']['error'] / self.detrend_model(*self.results['parameters_final'])}
#
# self.results['detrended_output_series'] = {
# 'phase': self.results['output_series']['phase'],
# 'model': self.results['output_series']['model'] / self.detrend_model(*self.results['parameters_final']),
# 'residuals': (self.results['output_series']['residuals']
# / self.detrend_model(*self.results['parameters_final']))}
#
# self.results['detrended_statistics'] = {ff: self.results['statistics'][ff] for ff in self.results['statistics']}
# self.results['detrended_statistics']['res_std'] = np.std(self.results['detrended_output_series']['residuals'])
#
# def save_all(self, export_file):
#
# pickle.dump(self.results, open(export_file, 'wb'))
#
# def save_results(self, export_file):
#
# self.fitting.save_results(export_file)
#
# def plot_corner(self, export_file):
#
# self.fitting.plot_corner(export_file)
#
# def plot_traces(self, export_file):
#
# self.fitting.plot_traces(export_file)
#
# def plot_models(self, export_file, target=None, data_dates=None):
#
# if target is None:
# target = ' '
#
# if data_dates is None:
# data_dates = ['set_{0}'.format(str(ff)) for ff in range(1, self.total_sets + 1)]
#
# for set_number in range(self.total_sets):
#
# set_indices = np.where(self.data_set_number == set_number)
#
# fig = plt.figure(set_number + 1)
# fig.set_tight_layout(False)
#
# self.results = {ff: self.results[ff] for ff in self.results}
#
# period = self.results['parameters']['P']['value']
# mt = self.results['parameters']['mt']['value']
#
# mt += round((np.mean(self.data_time[set_indices]) - mt) / period) * period
#
# prediction = (self.mid_time +
# round((np.mean(self.data_time[set_indices]) - self.mid_time) / self.period) * self.period)
#
# duration = transit_duration(self.rp_over_rs, self.period, self.sma_over_rs,
# self.inclination, self.eccentricity, self.periastron)
#
# ingress = prediction - duration / 2
# egress = prediction + duration / 2
#
# plt.subplot2grid((4, 1), (0, 0), rowspan=3)
#
# plt.plot(self.results['output_series']['phase'][set_indices],
# self.results['input_series']['value'][set_indices], 'ko', ms=2)
# plt.plot(self.results['output_series']['phase'][set_indices],
# self.results['output_series']['model'][set_indices], 'r-')
#
# plt.ylim(min(self.results['output_series']['model'][set_indices])
# - 5 * np.std(self.results['output_series']['residuals'][set_indices]),
# max(self.results['output_series']['model'][set_indices])
# + 5 * np.std(self.results['output_series']['residuals'][set_indices]))
#
# plt.yticks(plt.yticks()[0][1:])
# plt.ylabel(r'$\mathrm{relative} \ \mathrm{flux}$', fontsize=15)
#
# plt.ylim(min(self.results['output_series']['model'][set_indices])
# - 5 * np.std(self.results['output_series']['residuals'][set_indices]),
# max(self.results['output_series']['model'][set_indices])
# + 5 * np.std(self.results['output_series']['residuals'][set_indices]))
#
# x_max = max(np.abs(self.results['output_series']['phase'][set_indices]) +
# 0.05 * (max(self.results['output_series']['phase'][set_indices]) -
# min(self.results['output_series']['phase'][set_indices])))
# plt.xlim(-x_max, x_max)
# plt.tick_params(labelbottom=False)
#
# rpstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
# r'$R_\mathrm{p}/R_* = ', self.results['parameters']['rp_{0}'.format(str(set_number))]['print_value'],
# '_{-', self.results['parameters']['rp_{0}'.format(str(set_number))]['print_m_error'], '}', '^{+',
# self.results['parameters']['rp_{0}'.format(str(set_number))]['print_p_error'], '}$')
# mtstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
# r'$T_\mathrm{HJD} = ', self.results['parameters']['mt']['print_value'], '_{-',
# self.results['parameters']['mt']['print_m_error'], '}', '^{+',
# self.results['parameters']['mt']['print_p_error'], '}$')
#
# plt.text(plt.xlim()[0] + 0.5 * (plt.xlim()[-1] - plt.xlim()[0]),
# plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
# '{0}\n{1}'.format(rpstr, mtstr), ha='center', va='center', fontsize=10)
#
# plt.axvline((ingress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
# plt.text((ingress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
# '{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{ingress}$', '\n',
# r'$\mathrm{start}$'),
# ha='right', va='top', fontsize=10)
# plt.axvline((egress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
# plt.text((egress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
# '{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{egress}$', '\n',
# r'$\mathrm{end}$'),
# ha='left', va='top', fontsize=10)
#
# plt.suptitle('{0}{1}{2}'.format(r'$\mathbf{', target, '}$'), fontsize=20)
# plt.text(plt.xlim()[1], plt.ylim()[1], '{0}{1}{2}'.format(r'$', data_dates[set_number], '$'),
# fontsize=12, ha='right', va='bottom')
#
# plt.subplot(4, 1, 4)
# plt.cla()
# plt.plot(self.results['output_series']['phase'][set_indices],
# self.results['output_series']['residuals'][set_indices], 'ko', ms=2)
# plt.plot(self.results['output_series']['phase'][set_indices],
# np.zeros_like(self.results['output_series']['phase'][set_indices]), 'r-')
#
# plt.ylim(- 5 * np.std(self.results['output_series']['residuals'][set_indices]),
# 5 * np.std(self.results['output_series']['residuals'][set_indices]))
#
# plt.xlabel(r'$\mathrm{phase}$', fontsize=15)
# plt.ylabel(r'$\mathrm{residuals}$', fontsize=15)
#
# plt.xlim(-x_max, x_max)
# plt.text(plt.xlim()[0] + 0.02 * (plt.xlim()[-1] - plt.xlim()[0]),
# plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
# r'$\mathrm{rms}_\mathrm{res} = %.1e$' %
# np.std(self.results['output_series']['residuals'][set_indices]), fontsize=10)
#
# plt.subplots_adjust(left=0.15, right=0.975, bottom=0.12, top=0.9, hspace=0.0)
#
# plt.savefig(os.path.join(os.path.split(export_file)[0],
# 'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
# transparent=True)
# plt.close('all')
#
# def plot_detrended_models(self, export_file, target=None, data_dates=None, return_plot=False):
#
# if target is None:
# target = ' '
#
# if data_dates is None:
# data_dates = ['set_{0}'.format(str(ff)) for ff in range(1, self.total_sets + 1)]
#
# for set_number in range(self.total_sets):
#
# set_indices = np.where(self.data_set_number == set_number)
#
# fig = plt.figure(set_number + 1)
# fig.set_tight_layout(False)
#
# self.results = {ff: self.results[ff] for ff in self.results}
#
# period = self.results['parameters']['P']['value']
# mt = self.results['parameters']['mt']['value']
# mt += round((np.mean(self.data_time[set_indices]) - mt) / period) * period
#
# prediction = (self.mid_time +
# round((np.mean(self.data_time[set_indices]) - self.mid_time) / self.period) * self.period)
#
# duration = transit_duration(self.rp_over_rs, self.period, self.sma_over_rs,
# self.inclination, self.eccentricity, self.periastron)
#
# ingress = prediction - duration / 2
# egress = prediction + duration / 2
#
# plt.subplot2grid((4, 1), (0, 0), rowspan=3)
#
# plt.plot(self.results['detrended_output_series']['phase'][set_indices],
# self.results['detrended_input_series']['value'][set_indices], 'ko', ms=2)
# plt.plot(self.results['detrended_output_series']['phase'][set_indices],
# self.results['detrended_output_series']['model'][set_indices], 'r-')
#
# plt.ylim(min(self.results['detrended_output_series']['model'][set_indices])
# - 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]),
# max(self.results['detrended_output_series']['model'][set_indices])
# + 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]))
#
# plt.yticks(plt.yticks()[0][1:])
# plt.ylabel(r'$\mathrm{relative} \ \mathrm{flux}$', fontsize=15)
#
# plt.ylim(min(self.results['detrended_output_series']['model'][set_indices])
# - 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]),
# max(self.results['detrended_output_series']['model'][set_indices])
# + 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]))
#
# plt.ylim(-1.17647 * (- plt.ylim()[0] + 0.15 * plt.ylim()[1]), plt.ylim()[1])
#
# x_max = max(np.abs(self.results['detrended_output_series']['phase'][set_indices]) +
# 0.05 * (max(self.results['detrended_output_series']['phase'][set_indices]) -
# min(self.results['detrended_output_series']['phase'][set_indices])))
# plt.xlim(-x_max, x_max)
# plt.tick_params(labelbottom=False)
#
# rpstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
# r'$R_\mathrm{p}/R_* = ', self.results['parameters']['rp_{0}'.format(str(set_number))]['print_value'],
# '_{-', self.results['parameters']['rp_{0}'.format(str(set_number))]['print_m_error'], '}', '^{+',
# self.results['parameters']['rp_{0}'.format(str(set_number))]['print_p_error'], '}$')
# mtstr = '{0}{1}{2}{3}{4}{5}{6}{7}'.format(
# r'$T_\mathrm{HJD} = ', self.results['parameters']['mt']['print_value'], '_{-',
# self.results['parameters']['mt']['print_m_error'], '}', '^{+',
# self.results['parameters']['mt']['print_p_error'], '}$')
#
# plt.text(plt.xlim()[0] + 0.5 * (plt.xlim()[-1] - plt.xlim()[0]),
# plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
# '{0}{1}{2}'.format(rpstr, '\n', mtstr), ha='center', va='center', fontsize=10)
#
# plt.axvline((ingress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
# plt.text((ingress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
# '{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{ingress}$', '\n',
# r'$\mathrm{start}$'),
# ha='right', va='top', fontsize=10)
# plt.axvline((egress - mt) / period, 0.3, 1.0, ls='--', c='k', lw=0.75)
# plt.text((egress - mt) / period, plt.ylim()[0] + 0.3 * (plt.ylim()[1] - plt.ylim()[0]),
# '{0}{1}{2}{3}{4}'.format(r'$\mathrm{predicted}$', '\n', r'$\mathrm{egress}$', '\n',
# r'$\mathrm{end}$'),
# ha='left', va='top', fontsize=10)
#
# plt.suptitle('{0}{1}{2}'.format(r'$\mathbf{', target, '}$'), fontsize=20)
# plt.text(plt.xlim()[1], plt.ylim()[1], '{0}{1}{2}'.format(r'$', data_dates[set_number], '$'),
# fontsize=12, ha='right', va='bottom')
#
# plt.subplot(4, 1, 4)
# plt.cla()
# plt.plot(self.results['detrended_output_series']['phase'][set_indices],
# self.results['detrended_output_series']['residuals'][set_indices], 'ko', ms=2)
# plt.plot(self.results['detrended_output_series']['phase'][set_indices],
# np.zeros_like(self.results['detrended_output_series']['phase'][set_indices]), 'r-')
#
# plt.ylim(- 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]),
# 5 * np.std(self.results['detrended_output_series']['residuals'][set_indices]))
#
# plt.xlabel(r'$\mathrm{phase}$', fontsize=15)
# plt.ylabel(r'$\mathrm{residuals}$', fontsize=15)
#
# plt.xlim(-x_max, x_max)
#
# plt.text(plt.xlim()[0] + 0.02 * (plt.xlim()[-1] - plt.xlim()[0]),
# plt.ylim()[0] + 0.07 * (plt.ylim()[-1] - plt.ylim()[0]),
# r'$\mathrm{rms}_\mathrm{res} = %.1e$' %
# np.std(self.results['detrended_output_series']['residuals'][set_indices]),
# fontsize=10)
#
# plt.subplots_adjust(left=0.15, right=0.975, bottom=0.12, top=0.9, hspace=0.0)
#
# plt.savefig(os.path.join(os.path.split(export_file)[0],
# 'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
# transparent=True)
# if return_plot:
# return [plt.figure(ff + 1) for ff in range(self.total_sets)]
# else:
# plt.close('all')
#
# def save_models(self, export_file):
#
# for set_number in range(self.total_sets):
#
# self.results = {ff: self.results[ff] for ff in self.results}
#
# set_indices = np.where(self.data_set_number == set_number)
#
# np.savetxt(os.path.join(os.path.split(export_file)[0],
# 'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
# np.swapaxes([self.results['input_series']['hjd'][set_indices],
# self.results['output_series']['phase'][set_indices],
# self.results['input_series']['value'][set_indices],
# self.results['input_series']['error'][set_indices],
# self.results['output_series']['model'][set_indices],
# self.results['output_series']['residuals'][set_indices]
# ], 0, 1))
#
# def save_detrended_models(self, export_file):
#
# for set_number in range(self.total_sets):
#
# self.results = {ff: self.results[ff] for ff in self.results}
#
# set_indices = np.where(self.data_set_number == set_number)
#
# np.savetxt(os.path.join(os.path.split(export_file)[0],
# 'set_{0}_{1}'.format(str(set_number + 1), os.path.split(export_file)[1])),
# np.swapaxes([self.results['detrended_input_series']['hjd'][set_indices],
# self.results['detrended_output_series']['phase'][set_indices],
# self.results['detrended_input_series']['value'][set_indices],
# self.results['detrended_input_series']['error'][set_indices],
# self.results['detrended_output_series']['model'][set_indices],
# self.results['detrended_output_series']['residuals'][set_indices]
# ], 0, 1))
| 47.943759 | 122 | 0.532617 | 8,374 | 69,902 | 4.243014 | 0.048722 | 0.066871 | 0.037151 | 0.03625 | 0.874363 | 0.833244 | 0.806394 | 0.783597 | 0.766598 | 0.737441 | 0 | 0.029128 | 0.307988 | 69,902 | 1,457 | 123 | 47.976664 | 0.705394 | 0.56821 | 0 | 0.4859 | 0 | 0 | 0.123341 | 0.029198 | 0 | 0 | 0 | 0.000686 | 0 | 1 | 0.032538 | false | 0.002169 | 0.030369 | 0.002169 | 0.073753 | 0.05423 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
# File: tests/_site/import_error_app/catalogue/import_error_module.py (repo: Jean1508/ya-madoa, license: BSD-3-Clause)
# On purpose raise ImportError
from django import NonExistingClass
class ImportErrorClass(NonExistingClass):
pass
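This tiny module is deliberately broken: importing it raises `ImportError` because `django` exports no `NonExistingClass`, which lets a test suite verify how an app loader reports modules that fail to import. The trigger can be reproduced directly (note that the failure occurs whether or not the package is installed, since a missing package raises `ModuleNotFoundError`, a subclass of `ImportError`):

```python
# Importing a name that does not exist raises ImportError either way:
# missing package -> ModuleNotFoundError (a subclass of ImportError),
# installed package without the name -> plain ImportError.
try:
    from django import NonExistingClass  # noqa: F401 - expected to fail
    failed = False
except ImportError:
    failed = True
```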
# File: Python/Tests/TestData/Grammar/Calls.py (repo: nanshuiyu/pytools, license: Apache-2.0)
fob()
fob(1)
fob(oar = 1)
fob(*oar)
fob(**oar)
fob(*oar, **baz)
fob(oar = 1, baz = 2)
fob(oar, baz)
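Calls.py is a grammar test fixture: it enumerates the call-argument forms the parser must accept (positional, keyword, `*iterable` unpacking, `**mapping` unpacking, and combinations). How those forms bind at runtime, using a stand-in `fob` that just echoes its arguments:

```python
def fob(*args, **kwargs):
    # Echo back how the call site's arguments were bound.
    return args, kwargs

oar = [1, 2]
baz = {'x': 3}

assert fob() == ((), {})
assert fob(1) == ((1,), {})
assert fob(oar=1) == ((), {'oar': 1})
assert fob(*oar) == ((1, 2), {})
assert fob(**baz) == ((), {'x': 3})
assert fob(*oar, **baz) == ((1, 2), {'x': 3})
```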
# File: JumpscalePortalClassic/portal/docgenerator/macros/pagebreak/1_main.py (repo: threefoldtech/jumpscale_portal_classic, license: Apache-2.0)
def main(j, params, service, tags, tasklet):
page = params.page
tags = params.tags
page.addPageBreak()
return params
def match(j, params, service, tags, tasklet):
return True
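Tasklets in the JumpScale portal follow a match/main protocol: the engine calls `match()` first and runs `main()` only when it returns True. A minimal sketch of such a dispatcher; the engine-side `run_tasklet` helper and the `Page` stub are my assumptions for illustration, not JumpScale code:

```python
import types

class Page:
    """Stub standing in for the portal page object."""
    def __init__(self):
        self.breaks = 0
    def addPageBreak(self):
        self.breaks += 1

def main(j, params, service, tags, tasklet):
    params.page.addPageBreak()
    return params

def match(j, params, service, tags, tasklet):
    return True

def run_tasklet(mod, j, params, service, tags, tasklet=None):
    # Engine-side dispatch: consult match() before executing main().
    if mod.match(j, params, service, tags, tasklet):
        return mod.main(j, params, service, tags, tasklet)
    return params

params = types.SimpleNamespace(page=Page(), tags=None)
mod = types.SimpleNamespace(main=main, match=match)
result = run_tasklet(mod, None, params, None, None)
```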
# File: padqc/compiler/__init__.py (repo: qis-unipr/padqc, license: Apache-2.0); also recorded as src/unv/__init__.py (repo: UnvLabs/Python, license: MIT)
from .compile import compile
fa671dcd24dc8f232001532ece907a12ef856fd8 | 94 | py | Python | vagga2lithos/main.py | tailhook/vagga2lithos | fc174479b953b382e097dd51643ac2b4c2d56dee | [
"Apache-2.0",
"MIT"
] | 5 | 2016-11-18T03:19:02.000Z | 2019-04-16T19:52:50.000Z | vagga2lithos/main.py | tailhook/vagga2lithos | fc174479b953b382e097dd51643ac2b4c2d56dee | [
"Apache-2.0",
"MIT"
] | 1 | 2018-03-15T18:23:50.000Z | 2018-03-15T18:23:50.000Z | vagga2lithos/main.py | tailhook/vagga2lithos | fc174479b953b382e097dd51643ac2b4c2d56dee | [
"Apache-2.0",
"MIT"
] | null | null | null | import click
@click.group()
def main():
pass
# load commands
from . import gen, update
| 9.4 | 25 | 0.670213 | 13 | 94 | 4.846154 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223404 | 94 | 9 | 26 | 10.444444 | 0.863014 | 0.138298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
d747be90fd0a556230fe9bd49bc10f0ed43b8306 | 29 | py | Python | code/tmp_rtrip/sqlite3/__init__.py | emilyemorehouse/ast-and-me | 3f58117512e125e1ecbe3c72f2f0d26adb80b7b3 | [
"MIT"
] | 24 | 2018-01-23T05:28:40.000Z | 2021-04-13T20:52:59.000Z | code/tmp_rtrip/sqlite3/__init__.py | emilyemorehouse/ast-and-me | 3f58117512e125e1ecbe3c72f2f0d26adb80b7b3 | [
"MIT"
] | 17 | 2017-12-21T18:32:31.000Z | 2018-12-18T17:09:50.000Z | code/tmp_rtrip/sqlite3/__init__.py | emilyemorehouse/ast-and-me | 3f58117512e125e1ecbe3c72f2f0d26adb80b7b3 | [
"MIT"
] | null | null | null | from sqlite3.dbapi2 import *
| 14.5 | 28 | 0.793103 | 4 | 29 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.137931 | 29 | 1 | 29 | 29 | 0.84 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d75a20a8ee214be41ea183e350258f791d82b15e | 135 | py | Python | framework/layers/__init__.py | lukovnikov/transformer_generalization | a538bfbba6877cd7a21e710f2535df2e9236ba52 | [
"MIT"
] | 47 | 2021-08-30T00:41:15.000Z | 2022-01-24T02:49:17.000Z | framework/layers/__init__.py | lukovnikov/transformer_generalization | a538bfbba6877cd7a21e710f2535df2e9236ba52 | [
"MIT"
] | null | null | null | framework/layers/__init__.py | lukovnikov/transformer_generalization | a538bfbba6877cd7a21e710f2535df2e9236ba52 | [
"MIT"
] | 5 | 2021-09-04T23:51:51.000Z | 2022-03-10T14:03:24.000Z | from .positional_encoding import PositionalEncoding, sinusoidal_pos_embedding
from .cross_entropy_label_smoothing import cross_entropy
| 45 | 77 | 0.911111 | 16 | 135 | 7.25 | 0.75 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 135 | 2 | 78 | 67.5 | 0.920635 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d7613c67dab581482379e6ee5a47c6fb6204d3d8 | 26 | py | Python | tests/__init__.py | zapp-oz/AutoGit | a2894af75ee51bf84e656d4f842901e9f5940b6d | [
"MIT"
] | null | null | null | tests/__init__.py | zapp-oz/AutoGit | a2894af75ee51bf84e656d4f842901e9f5940b6d | [
"MIT"
] | null | null | null | tests/__init__.py | zapp-oz/AutoGit | a2894af75ee51bf84e656d4f842901e9f5940b6d | [
"MIT"
] | null | null | null | from . import test_git_ops | 26 | 26 | 0.846154 | 5 | 26 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d76c33b1e8fbc893158b8149cd4916e80cdc6ba4 | 45 | py | Python | corehq/apps/hqwebapp/tests/__init__.py | johan--/commcare-hq | 86ee99c54f55ee94e4c8f2f6f30fc44e10e69ebd | [
"BSD-3-Clause"
] | null | null | null | corehq/apps/hqwebapp/tests/__init__.py | johan--/commcare-hq | 86ee99c54f55ee94e4c8f2f6f30fc44e10e69ebd | [
"BSD-3-Clause"
] | 1 | 2022-03-12T01:03:25.000Z | 2022-03-12T01:03:25.000Z | corehq/apps/hqwebapp/tests/__init__.py | johan--/commcare-hq | 86ee99c54f55ee94e4c8f2f6f30fc44e10e69ebd | [
"BSD-3-Clause"
] | null | null | null | from .test_hq_shared_tags import TestCaseTag
| 22.5 | 44 | 0.888889 | 7 | 45 | 5.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 45 | 1 | 45 | 45 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ad02f6a572e847592a66b596b845c260779c4e3e | 75 | py | Python | py_tdlib/constructors/notification_group_type_calls.py | Mr-TelegramBot/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 24 | 2018-10-05T13:04:30.000Z | 2020-05-12T08:45:34.000Z | py_tdlib/constructors/notification_group_type_calls.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 3 | 2019-06-26T07:20:20.000Z | 2021-05-24T13:06:56.000Z | py_tdlib/constructors/notification_group_type_calls.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 5 | 2018-10-05T14:29:28.000Z | 2020-08-11T15:04:10.000Z | from ..factory import Type
class notificationGroupTypeCalls(Type):
pass
| 12.5 | 39 | 0.8 | 8 | 75 | 7.5 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 75 | 5 | 40 | 15 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
ad0a6e266fa6266ac564b6e321125a23c42b0d64 | 128,605 | py | Python | test/water_clusters/make_input.py | susilehtola/aquarius | 9160e73bd7e3e0d8d97b10d00d9a4860aee709d2 | [
"BSD-3-Clause"
] | 18 | 2015-02-11T15:02:39.000Z | 2021-09-24T13:10:12.000Z | test/water_clusters/make_input.py | susilehtola/aquarius | 9160e73bd7e3e0d8d97b10d00d9a4860aee709d2 | [
"BSD-3-Clause"
] | 21 | 2015-06-23T13:32:29.000Z | 2022-02-15T20:14:42.000Z | test/water_clusters/make_input.py | susilehtola/aquarius | 9160e73bd7e3e0d8d97b10d00d9a4860aee709d2 | [
"BSD-3-Clause"
] | 8 | 2016-01-09T23:36:21.000Z | 2019-11-19T14:22:34.000Z | #!/usr/bin/python
import sys
if len(sys.argv) != 4:
    print("Usage: ./make_input.py <cluster> <basis> <method>")
    print("<cluster> can be: w1 w2 w3 w4 w5 w6cage w6book w6prism w6cyclic w7 w8s4 w8d2d")
    print("                  w9 w10 w11i434 w11i4412 w11i443 w11i515 w11i551 w12 w13 w14")
    print("                  w15 w16 w17int w17surf w18 w19 w20dode w20fused w20face w20edge")
    print("                  rubrene")
    print("<basis> can be 6-31G 6-311G cc-pVDZ cc-pVTZ cc-pVQZ aug-cc-pVDZ aug-cc-pVTZ aug-cc-pVQZ etc.")
    print("Note that <basis> is case sensitive and must correspond to a file in AQUARIUS/basis.")
    print("<method> can be ccd, ccsd, or ccsdt")
    sys.exit()
#------------
# Rubrene
#------------
def print_rubrene(file):
file.write('\tatom {C, 0.0000000000, 0.0000000000, 0.7386629749 },\n')
file.write('\tatom {C, -1.2483730887, 0.0888707092, 1.4320804162 },\n')
file.write('\tatom {C, -2.4387379231, 0.3562298522, 0.7287080969 },\n')
file.write('\tatom {C, -3.6963513178, 0.5360669578, 1.4047565975 },\n')
file.write('\tatom {C, -4.8565917531, 0.7574126966, 0.7195212871 },\n')
file.write('\tatom {C, -1.4047304272, -0.2925466203, 2.8766766348 },\n')
file.write('\tatom {C, -1.4608851165, -1.6566002050, 3.2010433539 },\n')
file.write('\tatom {C, -1.7156969530, -2.0726251992, 4.5078354542 },\n')
file.write('\tatom {C, -1.9273173957, -1.1294519924, 5.5139696863 },\n')
file.write('\tatom {C, -1.8847258306, 0.2305869075, 5.2017074868 },\n')
file.write('\tatom {C, -1.6318196804, 0.6445713746, 3.8942551073 },\n')
file.write('\tatom {H, -3.7136001469, 0.5055498407, 2.4859255482 },\n')
file.write('\tatom {H, -5.7860241262, 0.9079508679, 1.2609572299 },\n')
file.write('\tatom {H, -1.3092313361, -2.3929774307, 2.4170229637 },\n')
file.write('\tatom {H, -1.7566049358, -3.1338521657, 4.7371683252 },\n')
file.write('\tatom {H, -2.1277721044, -1.4503977356, 6.5320681783 },\n')
file.write('\tatom {H, -2.0489582393, 0.9729207845, 5.9774668901 },\n')
file.write('\tatom {H, -1.5985958139, 1.7035156402, 3.6590400764 },\n')
file.write('\tatom {C, 0.0000000000, 0.0000000000, -0.7355835217 },\n')
file.write('\tatom {C, 1.2483730887, -0.0888707092, 1.4320804162 },\n')
file.write('\tatom {C, -2.4372856050, 0.3712368198, -0.7172953151 },\n')
file.write('\tatom {C, -4.8542635050, 0.7786660597, -0.7038756087 },\n')
file.write('\tatom {C, 1.2487664643, -0.1076096437, -1.4258829382 },\n')
file.write('\tatom {C, -1.2487664643, 0.1076096437, -1.4258829382 },\n')
file.write('\tatom {C, 2.4387379231, -0.3562298522, 0.7287080969 },\n')
file.write('\tatom {C, 1.4047304272, 0.2925466203, 2.8766766348 },\n')
file.write('\tatom {C, -3.6924121455, 0.5742247089, -1.3916349861 },\n')
file.write('\tatom {H, -5.7816030635, 0.9471760195, -1.2436128745 },\n')
file.write('\tatom {C, 2.4372856050, -0.3712368198, -0.7172953151 },\n')
file.write('\tatom {C, 1.4161903836, 0.2432248114, -2.8773546944 },\n')
file.write('\tatom {C, -1.4161903836, -0.2432248114, -2.8773546944 },\n')
file.write('\tatom {C, 3.6963513178, -0.5360669578, 1.4047565975 },\n')
file.write('\tatom {C, 1.4608851165, 1.6566002050, 3.2010433539 },\n')
file.write('\tatom {C, 1.6318196804, -0.6445713746, 3.8942551073 },\n')
file.write('\tatom {H, -3.7063324227, 0.5748189620, -2.4732886534 },\n')
file.write('\tatom {C, 3.6924121455, -0.5742247089, -1.3916349861 },\n')
file.write('\tatom {C, 1.5338943044, 1.5975040689, -3.2246030980 },\n')
file.write('\tatom {C, 1.6007048957, -0.7204065916, -3.8785605231 },\n')
file.write('\tatom {C, -1.5338943044, -1.5975040689, -3.2246030980 },\n')
file.write('\tatom {C, -1.6007048957, 0.7204065916, -3.8785605231 },\n')
file.write('\tatom {C, 4.8565917531, -0.7574126966, 0.7195212871 },\n')
file.write('\tatom {H, 3.7136001469, -0.5055498407, 2.4859255482 },\n')
file.write('\tatom {C, 1.7156969530, 2.0726251992, 4.5078354542 },\n')
file.write('\tatom {H, 1.3092313361, 2.3929774307, 2.4170229637 },\n')
file.write('\tatom {C, 1.8847258306, -0.2305869075, 5.2017074868 },\n')
file.write('\tatom {H, 1.5985958139, -1.7035156402, 3.6590400764 },\n')
file.write('\tatom {C, 4.8542635050, -0.7786660597, -0.7038756087 },\n')
file.write('\tatom {H, 3.7063324227, -0.5748189620, -2.4732886534 },\n')
file.write('\tatom {C, 1.8050372220, 1.9795794408, -4.5385104659 },\n')
file.write('\tatom {H, 1.4175312323, 2.3531485506, -2.4529473087 },\n')
file.write('\tatom {C, 1.8705463877, -0.3404309422, -5.1929267030 },\n')
file.write('\tatom {H, 1.5225138636, -1.7726479350, -3.6245598182 },\n')
file.write('\tatom {C, -1.8050372220, -1.9795794408, -4.5385104659 },\n')
file.write('\tatom {H, -1.4175312323, -2.3531485506, -2.4529473087 },\n')
file.write('\tatom {C, -1.8705463877, 0.3404309422, -5.1929267030 },\n')
file.write('\tatom {H, -1.5225138636, 1.7726479350, -3.6245598182 },\n')
file.write('\tatom {H, 5.7860241262, -0.9079508679, 1.2609572299 },\n')
file.write('\tatom {C, 1.9273173957, 1.1294519924, 5.5139696863 },\n')
file.write('\tatom {H, 1.7566049358, 3.1338521657, 4.7371683252 },\n')
file.write('\tatom {H, 2.0489582393, -0.9729207845, 5.9774668901 },\n')
file.write('\tatom {H, 5.7816030635, -0.9471760195, -1.2436128745 },\n')
file.write('\tatom {C, 1.9731362015, 1.0109252482, -5.5283824054 },\n')
file.write('\tatom {H, 1.8931818096, 3.0338872143, -4.7857992482 },\n')
file.write('\tatom {H, 2.0016371726, -1.1023529188, -5.9559253229 },\n')
file.write('\tatom {C, -1.9731362015, -1.0109252482, -5.5283824054 },\n')
file.write('\tatom {H, -1.8931818096, -3.0338872143, -4.7857992482 },\n')
file.write('\tatom {H, -2.0016371726, 1.1023529188, -5.9559253229 },\n')
file.write('\tatom {H, 2.1277721044, 1.4503977356, 6.5320681783 },\n')
file.write('\tatom {H, 2.1867218297, 1.3050773503, -6.5519186432 },\n')
file.write('\tatom {H, -2.1867218297, -1.3050773503, -6.5519186432 },\n')
#------------
#Monomer *** C2v
#------------
def print_w1(file,ycor):
file.write('\tatom {O, 0.00000000, '+str(ycor)+', 0.11726921 },\n')
file.write('\tatom {H, 0.75698224, '+str(ycor)+', -0.46907685 },\n')
file.write('\tatom {H, -0.75698224, '+str(ycor)+', -0.46907685 },\n')
#------------
#Dimer *** Cs
#------------
def print_w2(file):
file.write('\tatom {O, -0.000545, 1.517541, 0.000000 },\n')
file.write('\tatom {H, 0.094538, 0.553640, 0.000000 },\n')
file.write('\tatom {H, 0.901237, 1.847958, 0.000000 },\n')
file.write('\tatom {O, -0.000545, -1.389760, 0.000000 },\n')
file.write('\tatom {H, -0.493527, -1.711924, 0.761014 },\n')
file.write('\tatom {H, -0.493527, -1.711924, -0.761014 },\n')
#-------------
#Trimer *** C1
#-------------
def print_w3(file):
file.write('\tatom {H, 1.218038, 0.017442, -0.022009 },\n')
file.write('\tatom {O, 1.295683, -0.951662, -0.092916 },\n')
file.write('\tatom {H, 1.961236, -1.203127, 0.552608 },\n')
file.write('\tatom {H, 0.093268, 2.241120, -0.596056 },\n')
file.write('\tatom {O, 0.179908, 1.594249, 0.109280 },\n')
file.write('\tatom {H, -0.624503, 1.046616, 0.055883 },\n')
file.write('\tatom {H, -2.030801, -1.071624, 0.569877 },\n')
file.write('\tatom {O, -1.476402, -0.636452, -0.082957 },\n')
file.write('\tatom {H, -0.610755, -1.079506, -0.027553 },\n')
#---------------
#Tetramer *** S4
#---------------
def print_w4(file):
file.write('\tatom {O, -1.367062, 1.364510, 0.007273 },\n')
file.write('\tatom {O, -1.364510, -1.367062, -0.007273 },\n')
file.write('\tatom {O, 1.364510, 1.367062, -0.007273 },\n')
file.write('\tatom {O, 1.367062, -1.364510, 0.007273 },\n')
file.write('\tatom {H, -0.395152, 1.503429, -0.005375 },\n')
file.write('\tatom {H, -1.503429, -0.395152, 0.005375 },\n')
file.write('\tatom {H, 1.503429, 0.395152, 0.005375 },\n')
file.write('\tatom {H, 0.395152, -1.503429, -0.005375 },\n')
file.write('\tatom {H, -1.687281, 1.875361, 0.755434 },\n')
file.write('\tatom {H, -1.875361, -1.687281, -0.755434 },\n')
file.write('\tatom {H, 1.875361, 1.687281, -0.755434 },\n')
file.write('\tatom {H, 1.687281, -1.875361, 0.755434 },\n')
#---------------
#Pentamer *** C1
#---------------
def print_w5(file):
file.write('\tatom {O, 2.289015, 0.225784, 0.175030 },\n')
file.write('\tatom {H, 1.837891, -0.638872, 0.046444 },\n')
file.write('\tatom {H, 2.811304, 0.122451, 0.974687 },\n')
file.write('\tatom {O, 0.929887, -2.095904, -0.167528 },\n')
file.write('\tatom {H, -0.037083, -1.936553, -0.084181 },\n')
file.write('\tatom {H, 1.034959, -2.583078, -0.988978 },\n')
file.write('\tatom {O, -1.718101, -1.549268, 0.073447 },\n')
file.write('\tatom {H, -1.882083, -0.580570, 0.056990 },\n')
file.write('\tatom {H, -2.170566, -1.871677, 0.857083 },\n')
file.write('\tatom {O, -1.987637, 1.157925, -0.077866 },\n')
file.write('\tatom {H, -1.103971, 1.590183, -0.076556 },\n')
file.write('\tatom {H, -2.534625, 1.699982, 0.496152 },\n')
file.write('\tatom {O, 0.498426, 2.249945, -0.063688 },\n')
file.write('\tatom {H, 1.178269, 1.547359, 0.044627 },\n')
file.write('\tatom {H, 0.773193, 2.742924, -0.841426 },\n')
#-------------------
#Hexamer_cage *** C1
#-------------------
def print_w6cage(file):
file.write('\tatom {O, .87746626, 1.70810837, .47631700 },\n')
file.write('\tatom {H, 1.69363812, 1.19357153, .28997545 },\n')
file.write('\tatom {H, 1.16537360, 2.60804843, .65262299 },\n')
file.write('\tatom {O, -.81592121, .61034772, -1.61581462 },\n')
file.write('\tatom {H, -.26718594, 1.17109579, -1.04349788 },\n')
file.write('\tatom {H, -.36582905, -.24649881, -1.56526881 },\n')
file.write('\tatom {O, -.63660726, -.48685974, 1.61880639 },\n')
file.write('\tatom {H, -.19581869, .37060555, 1.51121727 },\n')
file.write('\tatom {H, -1.53977900, -.32481815, 1.28377278 },\n')
file.write('\tatom {O, .57958746, -1.69528831, -.42798860 },\n')
file.write('\tatom {H, .43229451, -2.64323542, -.48817257 },\n')
file.write('\tatom {H, .09134000, -1.38625895, .38012967 },\n')
file.write('\tatom {O, 2.79390774, -.10315373, -.17926594 },\n')
file.write('\tatom {H, 3.44976591, -.44577598, .43415635 },\n')
file.write('\tatom {H, 2.14838286, -.82929443, -.30024187 },\n')
file.write('\tatom {O, -2.88225438, -.06267425, .06008357 },\n')
file.write('\tatom {H, -2.28540757, .26575834, -.64623972 },\n')
file.write('\tatom {H, -3.65812342, .50275705, .03758316 },\n')
#-------------------
#Hexamer_book *** C1
#-------------------
def print_w6book(file):
file.write('\tatom {O, .12690919, 1.55143405, .88294964 },\n')
file.write('\tatom {H, .97284357, 1.51744599, .37215837 },\n')
file.write('\tatom {H, .28507553, 2.15693315, 1.61278908 },\n')
file.write('\tatom {O, 2.40793689, 1.19494170, -.47593962 },\n')
file.write('\tatom {H, 2.47425116, .21052194, -.50791280 },\n')
file.write('\tatom {H, 2.52608509, 1.48774408, -1.38342225 },\n')
file.write('\tatom {O, 2.26093548, -1.48582803, -.45281407 },\n')
file.write('\tatom {H, 2.87900544, -2.06214604, .00406231 },\n')
file.write('\tatom {H, 1.40693434, -1.57230106, .03667764 },\n')
file.write('\tatom {O, -2.46634492, -1.39738447, -.44099696 },\n')
file.write('\tatom {H, -2.53313762, -.42400253, -.50763215 },\n')
file.write('\tatom {H, -3.27971184, -1.67686815, -.01242165 },\n')
file.write('\tatom {O, -2.30167782, 1.36485164, -.44449265 },\n')
file.write('\tatom {H, -1.44118213, 1.54197674, -.01689196 },\n')
file.write('\tatom {H, -2.29684418, 1.88395767, -1.25295402 },\n')
file.write('\tatom {O, -.03575629, -1.36737375, .93305473 },\n')
file.write('\tatom {H, -.05262084, -.41957364, 1.13558028 },\n')
file.write('\tatom {H, -.88407140, -1.52142902, .47340296 },\n')
#--------------------
#Hexamer_prism *** C1
#--------------------
def print_w6prism(file):
file.write('\tatom {O, -1.98809642, 1.07259854, -.17008272 },\n')
file.write('\tatom {H, -2.65432215, 1.75406534, -.29457427 },\n')
file.write('\tatom {H, -1.12410682, 1.54133723, -.16808174 },\n')
file.write('\tatom {O, -1.01210730, -1.16226196, 1.41783859 },\n')
file.write('\tatom {H, -1.00300319, -1.51670753, .50465897 },\n')
file.write('\tatom {H, -1.58053367, -.38562387, 1.30626941 },\n')
file.write('\tatom {O, 1.45258039, -.19062642, 1.46779070 },\n')
file.write('\tatom {H, .54824824, -.59584398, 1.56580550 },\n')
file.write('\tatom {H, 1.93276785, -.40081928, 2.27333290 },\n')
file.write('\tatom {O, -.94561210, -1.39646518, -1.35471457 },\n')
file.write('\tatom {H, -1.43675949, -.56245450, -1.30670239 },\n')
file.write('\tatom {H, -.02948938, -1.11988543, -1.51719299 },\n')
file.write('\tatom {O, 1.78328499, -.40814713, -1.26763066 },\n')
file.write('\tatom {H, 2.55647293, -.82448997, -1.65880940 },\n')
file.write('\tatom {H, 1.89247505, -.49176846, -.30058670 },\n')
file.write('\tatom {O, .57696197, 2.02004218, -.11301203 },\n')
file.write('\tatom {H, 1.02072837, 1.61701116, -.87156630 },\n')
file.write('\tatom {H, .94398492, 1.51061477, .62833682 },\n')
#---------------------
#Hexamer_Cyclic *** S6
#---------------------
def print_w6cyclic(file):
file.write('\tatom {O, .00023538, 2.69029255, .14950471 },\n')
file.write('\tatom {O, -2.32998012, -1.34494169, .14950441 },\n')
file.write('\tatom {O, 2.32998012, 1.34494169, -.14950441 },\n')
file.write('\tatom {O, -2.32974435, 1.34535112, -.14950375 },\n')
file.write('\tatom {O, -.00023538, -2.69029255, -.14950471 },\n')
file.write('\tatom {O, 2.32974435, -1.34535112, .14950375 },\n')
file.write('\tatom {H, .84303260, 2.19639176, .03463900 },\n')
file.write('\tatom {H, -2.32364786, -.36810840, .03463888 },\n')
file.write('\tatom {H, 2.32364786, .36810840, -.03463888 },\n')
file.write('\tatom {H, -1.48061559, 1.82828372, -.03463790 },\n')
file.write('\tatom {H, -.84303260, -2.19639176, -.03463900 },\n')
file.write('\tatom {H, 1.48061559, -1.82828372, .03463790 },\n')
file.write('\tatom {H, .12287628, 3.21232017, .94654204 },\n')
file.write('\tatom {H, -2.84339064, -1.49974629, .94654344 },\n')
file.write('\tatom {H, 2.84339064, 1.49974629, -.94654344 },\n')
file.write('\tatom {H, -2.72051496, 1.71257699, -.94654475 },\n')
file.write('\tatom {H, -.12287628, -3.21232017, -.94654204 },\n')
file.write('\tatom {H, 2.72051496, -1.71257699, .94654475 },\n')
#--------------
#Heptamer (n=7)
#--------------
def print_w7(file):
file.write('\tatom {O, -0.46306507, -2.84560143, 0.34712980 },\n')
file.write('\tatom {H, -0.31185448, -3.74723642, 0.05234746 },\n')
file.write('\tatom {H, -0.60259983, -2.31575342, -0.47038548 },\n')
file.write('\tatom {O, 1.84489062, 0.25067396, -1.20027152 },\n')
file.write('\tatom {H, 2.63515099, 0.41824386, -1.72121662 },\n')
file.write('\tatom {H, 1.46223315, 1.13476086, -0.99006607 },\n')
file.write('\tatom {O, -0.36089385, 1.06267739, 1.87503037 },\n')
file.write('\tatom {H, -0.44058089, 1.21621663, 2.82039947 },\n')
file.write('\tatom {H, 0.24249561, 0.27291766, 1.77033787 },\n')
file.write('\tatom {O, 1.25770268, -0.94282108, 1.34833194 },\n')
file.write('\tatom {H, 0.73624113, -1.73036585, 1.09041584 },\n')
file.write('\tatom {H, 1.67299069, -0.65240393, 0.51959506 },\n')
file.write('\tatom {O, 0.51098073, 2.46987019, -0.40854717 },\n')
file.write('\tatom {H, 0.33154661, 2.21905129, 0.51484431 },\n')
file.write('\tatom {H, -0.36738878, 2.41346155, -0.80745002 },\n')
file.write('\tatom {O, -0.68279808, -1.06009111, -1.69106587 },\n')
file.write('\tatom {H, -1.23658236, -0.35698007, -1.31005956 },\n')
file.write('\tatom {H, 0.20346941, -0.66506167, -1.72575016 },\n')
file.write('\tatom {O, -2.06529537, 1.02745574, -0.29419144 },\n')
file.write('\tatom {H, -1.67380330, 0.97232115, 0.60004746 },\n')
file.write('\tatom {H, -3.01776458, 0.99606215, -0.16747231 },\n')
#--------------
#Octamer *** S4
#--------------
def print_w8s4(file):
file.write('\tatom {O, 1.99300294, -.06309578, 1.47601417 },\n')
file.write('\tatom {O, -1.99300294, .06309578, 1.47601417 },\n')
file.write('\tatom {O, -.06309578, -1.99300294, -1.47601417 },\n')
file.write('\tatom {O, .06309578, 1.99300294, -1.47601417 },\n')
file.write('\tatom {H, 2.69237274, -.27127359, 2.10172647 },\n')
file.write('\tatom {H, -2.69237274, .27127359, 2.10172647 },\n')
file.write('\tatom {H, -.27127359, -2.69237274, -2.10172647 },\n')
file.write('\tatom {H, .27127359, 2.69237274, -2.10172647 },\n')
file.write('\tatom {H, 1.34051465, -.80936971, 1.52652079 },\n')
file.write('\tatom {H, -1.34051465, .80936971, 1.52652079 },\n')
file.write('\tatom {H, -.80936971, -1.34051465, -1.52652079 },\n')
file.write('\tatom {H, .80936971, 1.34051465, -1.52652079 },\n')
file.write('\tatom {O, 1.89520992, .06259188, -1.35912522 },\n')
file.write('\tatom {O, -1.89520992, -.06259188, -1.35912522 },\n')
file.write('\tatom {O, .06259188, -1.89520992, 1.35912522 },\n')
file.write('\tatom {O, -.06259188, 1.89520992, 1.35912522 },\n')
file.write('\tatom {H, 1.38132939, -.74888632, -1.51574883 },\n')
file.write('\tatom {H, -1.38132939, .74888632, -1.51574883 },\n')
file.write('\tatom {H, -.74888632, -1.38132939, 1.51574883 },\n')
file.write('\tatom {H, .74888632, 1.38132939, 1.51574883 },\n')
file.write('\tatom {H, 2.13830549, .01009435, -.41812902 },\n')
file.write('\tatom {H, -2.13830549, -.01009435, -.41812902 },\n')
file.write('\tatom {H, .01009435, -2.13830549, .41812902 },\n')
file.write('\tatom {H, -.01009435, 2.13830549, .41812902 },\n')
#---------------
#Octamer *** D2d
#---------------
def print_w8d2d(file):
file.write('\tatom {O, -1.46966769, 1.46966769, 1.34326600 },\n')
file.write('\tatom {O, 1.46966769, -1.46966769, 1.34326600 },\n')
file.write('\tatom {O, 1.46966769, 1.46966769, -1.34326600 },\n')
file.write('\tatom {O, -1.46966769, -1.46966769, -1.34326600 },\n')
file.write('\tatom {O, -1.36565412, 1.36565412, -1.32090835 },\n')
file.write('\tatom {O, 1.36565412, -1.36565412, -1.32090835 },\n')
file.write('\tatom {O, 1.36565412, 1.36565412, 1.32090835 },\n')
file.write('\tatom {O, -1.36565412, -1.36565412, 1.32090835 },\n')
file.write('\tatom {H, -2.10464162, 2.10464162, 1.68605609 },\n')
file.write('\tatom {H, 2.10464162, -2.10464162, 1.68605609 },\n')
file.write('\tatom {H, 2.10464162, 2.10464162, -1.68605609 },\n')
file.write('\tatom {H, -2.10464162, -2.10464162, -1.68605609 },\n')
file.write('\tatom {H, -1.52398844, 1.52398844, 0.35383543 },\n')
file.write('\tatom {H, 1.52398844, -1.52398844, 0.35383543 },\n')
file.write('\tatom {H, 1.52398844, 1.52398844, -0.35383543 },\n')
file.write('\tatom {H, -1.52398844, -1.52398844, -0.35383543 },\n')
file.write('\tatom {H, 1.51043211, 0.42340972, 1.51629003 },\n')
file.write('\tatom {H, -1.51043211, -0.42340972, 1.51629003 },\n')
file.write('\tatom {H, 0.42340972, -1.51043211, -1.51629003 },\n')
file.write('\tatom {H, -0.42340972, 1.51043211, -1.51629003 },\n')
file.write('\tatom {H, -1.51043211, 0.42340972, -1.51629003 },\n')
file.write('\tatom {H, 1.51043211, -0.42340972, -1.51629003 },\n')
file.write('\tatom {H, -0.42340972, -1.51043211, 1.51629003 },\n')
file.write('\tatom {H, 0.42340972, 1.51043211, 1.51629003 },\n')
#-------------
#Nonamer (n=9)
#-------------
def print_w9(file):
file.write('\tatom {O, -0.14423567, -3.30115048, 0.03928978 },\n')
file.write('\tatom {H, -0.32449295, -3.99879230, -0.59601201 },\n')
file.write('\tatom {H, -0.88532777, -2.65077353, -0.06170448 },\n')
file.write('\tatom {O, 1.49264094, 0.37393128, -1.75371713 },\n')
file.write('\tatom {H, 1.64172129, 1.15033975, -1.18630321 },\n')
file.write('\tatom {H, 1.74950109, -0.38438355, -1.19944121 },\n')
file.write('\tatom {O, -2.01172471, -1.40607776, -0.18016347 },\n')
file.write('\tatom {H, -1.80580932, -0.80898258, -0.92073297 },\n')
file.write('\tatom {H, -1.96360403, -0.83667527, 0.60810472 },\n')
file.write('\tatom {O, 1.98006129, -1.70338387, 0.11260098 },\n')
file.write('\tatom {H, 1.26037502, -2.38156679, 0.05360476 },\n')
file.write('\tatom {H, 2.79676709, -2.19376994, 0.24208099 },\n')
file.write('\tatom {O, 1.38609229, 2.47288083, 0.14383518 },\n')
file.write('\tatom {H, 0.40159688, 2.58065033, 0.06882204 },\n')
file.write('\tatom {H, 1.75448436, 3.35910354, 0.19570425 },\n')
file.write('\tatom {O, 1.14877099, 0.30157502, 1.95056315 },\n')
file.write('\tatom {H, 1.48584111, -0.43780643, 1.41385840 },\n')
file.write('\tatom {H, 1.39709310, 1.09729015, 1.44893216 },\n')
file.write('\tatom {O, -1.14485510, 0.46396139, -2.12809929 },\n')
file.write('\tatom {H, -1.37650220, 0.49988809, -3.06029252 },\n')
file.write('\tatom {H, -0.15553827, 0.39121296, -2.08870291 },\n')
file.write('\tatom {O, -1.51553094, 0.41505867, 1.93078837 },\n')
file.write('\tatom {H, -0.53309190, 0.32662137, 2.03998763 },\n')
file.write('\tatom {H, -1.88760438, 0.40212138, 2.81700243 },\n')
file.write('\tatom {O, -1.26689128, 2.40738706, -0.06733177 },\n')
file.write('\tatom {H, -1.40034323, 1.84974182, -0.85380574 },\n')
file.write('\tatom {H, -1.53577697, 1.83794512, 0.67494834 },\n')
#--------------
#Decamer (n=10)
#--------------
def print_w10(file):
file.write('\tatom {O, -1.55682959, 1.99913676, 1.22533588 },\n')
file.write('\tatom {H, -0.59298827, 2.17569825, 1.30499409 },\n')
file.write('\tatom {H, -1.99557150, 2.68355859, 1.73849182 },\n')
file.write('\tatom {O, 2.32177282, -0.76277855, -1.42669411 },\n')
file.write('\tatom {H, 1.59105553, -1.43353953, -1.36423833 },\n')
file.write('\tatom {H, 3.00556104, -1.16198054, -1.97170813 },\n')
file.write('\tatom {O, -1.61325150, 1.37598702, -1.60947581 },\n')
file.write('\tatom {H, -0.65665433, 1.53734850, -1.76954163 },\n')
file.write('\tatom {H, -1.75324957, 1.73531947, -0.71955514 },\n')
file.write('\tatom {O, 1.15495708, 2.25580534, 1.16165925 },\n')
file.write('\tatom {H, 1.56205957, 1.35383240, 1.29633057 },\n')
file.write('\tatom {H, 1.70314455, 2.86626909, 1.66309307 },\n')
file.write('\tatom {O, 2.20709492, -0.14872772, 1.32313726 },\n')
file.write('\tatom {H, 1.58553666, -0.84077746, 1.61628226 },\n')
file.write('\tatom {H, 2.42610789, -0.39646359, 0.40641496 },\n')
file.write('\tatom {O, 0.33228679, -2.46089610, -0.99370195 },\n')
file.write('\tatom {H, 0.30884918, -2.53777465, -0.02476291 },\n')
file.write('\tatom {H, -0.54269598, -2.09681688, -1.22506306 },\n')
file.write('\tatom {O, 1.09994277, 1.78719573, -1.72521459 },\n')
file.write('\tatom {H, 1.25026245, 2.11505418, -0.82532285 },\n')
file.write('\tatom {H, 1.57602755, 0.93864515, -1.74674617 },\n')
file.write('\tatom {O, 0.23664110, -2.08073213, 1.83761839 },\n')
file.write('\tatom {H, 0.16506577, -2.69854162, 2.57058807 },\n')
file.write('\tatom {H, -0.63561742, -1.60693505, 1.78751007 },\n')
file.write('\tatom {O, -2.04501919, -0.79447205, 1.43251978 },\n')
file.write('\tatom {H, -1.94086906, 0.17294968, 1.45336493 },\n')
file.write('\tatom {H, -2.25772766, -0.99361260, 0.50281358 },\n')
file.write('\tatom {O, -2.14305333, -1.18847857, -1.36396916 },\n')
file.write('\tatom {H, -2.85215622, -1.44851554, -1.95857066 },\n')
file.write('\tatom {H, -1.95986968, -0.22568719, -1.55342417 },\n')
#----------------------------
#Endecamer (n=11), 434 Isomer
#----------------------------
def print_w11i434(file):
file.write('\tatom {O, -2.08119667, -2.26673485, -0.52122025 },\n')
file.write('\tatom {H, -2.87105167, -2.78737932, -0.69063304 },\n')
file.write('\tatom {H, -2.37048203, -1.48052425, 0.00808991 },\n')
file.write('\tatom {O, 0.02632908, 2.46420106, 1.43136415 },\n')
file.write('\tatom {H, -0.79872021, 2.51272310, 0.91403477 },\n')
file.write('\tatom {H, 0.03392860, 1.56479884, 1.79448727 },\n')
file.write('\tatom {O, -0.10639962, -1.18952211, -2.30993496 },\n')
file.write('\tatom {H, -0.83122588, -1.60450407, -1.81428466 },\n')
file.write('\tatom {H, 0.70146823, -1.55647816, -1.91492397 },\n')
file.write('\tatom {O, -2.53660296, -0.13426118, 1.00652753 },\n')
file.write('\tatom {H, -1.76448175, -0.10778780, 1.59108568 },\n')
file.write('\tatom {H, -2.48980443, 0.70013958, 0.50025834 },\n')
file.write('\tatom {O, 0.15122048, -2.65188991, 1.10644184 },\n')
file.write('\tatom {H, -0.65274681, -2.72021834, 0.55885102 },\n')
file.write('\tatom {H, 0.88417011, -2.67183587, 0.46397298 },\n')
file.write('\tatom {O, 2.13888494, -2.13440481, -0.78805319 },\n')
file.write('\tatom {H, 2.93049477, -2.60943049, -1.05525823 },\n')
file.write('\tatom {H, 2.44486586, -1.33305502, -0.29341074 },\n')
file.write('\tatom {O, 2.64879913, 0.01324964, 0.70328241 },\n')
file.write('\tatom {H, 2.54120429, 0.87044707, 0.25037000 },\n')
file.write('\tatom {H, 1.93995918, 0.00174018, 1.36408007 },\n')
file.write('\tatom {O, 0.14754784, -0.31377914, 2.31635550 },\n')
file.write('\tatom {H, 0.20756983, -0.48066858, 3.26302056 },\n')
file.write('\tatom {H, 0.15063221, -1.21915761, 1.88819098 },\n')
file.write('\tatom {O, 1.91413653, 2.45179984, -0.46204184 },\n')
file.write('\tatom {H, 2.44614529, 3.25118653, -0.50821174 },\n')
file.write('\tatom {H, 1.25730486, 2.59531756, 0.26467692 },\n')
file.write('\tatom {O, -0.24381434, 1.54060642, -2.15574462 },\n')
file.write('\tatom {H, 0.59881902, 1.83276745, -1.77386015 },\n')
file.write('\tatom {H, -0.16451956, 0.56552049, -2.25620710 },\n')
file.write('\tatom {O, -2.06702886, 2.20623002, -0.40395822 },\n')
file.write('\tatom {H, -2.70789898, 2.81776225, -0.77656938 },\n')
file.write('\tatom {H, -1.43251084, 1.96924165, -1.14177635 },\n')
#-----------------------------
#Endecamer (n=11), 4412 Isomer
#-----------------------------
def print_w11i4412(file):
file.write('\tatom {O, 2.51766192, 0.53321759, -1.03874478 },\n')
file.write('\tatom {H, 1.86897322, -0.14656265, -1.30684503 },\n')
file.write('\tatom {H, 2.10586438, 1.37269025, -1.30696563 },\n')
file.write('\tatom {O, 0.09186913, -3.91493968, -1.43866648 },\n')
file.write('\tatom {H, -0.63507120, -4.36739448, -1.87482259 },\n')
file.write('\tatom {H, -0.09209872, -3.99592562, -0.48063299 },\n')
file.write('\tatom {O, 0.32703371, -1.16364690, -1.31684672 },\n')
file.write('\tatom {H, 0.13127551, -1.13520548, -0.35997468 },\n')
file.write('\tatom {H, 0.26100450, -2.11400425, -1.54213322 },\n')
file.write('\tatom {O, -3.14387363, 1.26147659, -0.42032986 },\n')
file.write('\tatom {H, -3.86548867, 0.69632883, -0.70791922 },\n')
file.write('\tatom {H, -2.44492123, 1.18076397, -1.12217955 },\n')
file.write('\tatom {O, 0.47853440, 2.65740808, 1.32203064 },\n')
file.write('\tatom {H, 1.16509807, 2.02396937, 1.59193637 },\n')
file.write('\tatom {H, -0.35810118, 2.21752307, 1.55341863 },\n')
file.write('\tatom {O, -0.38383453, -3.73443912, 1.27525999 },\n')
file.write('\tatom {H, -0.24715696, -2.78828153, 1.48248464 },\n')
file.write('\tatom {H, 0.10371081, -4.21710263, 1.94780289 },\n')
file.write('\tatom {O, 0.87004108, 2.83536075, -1.32288658 },\n')
file.write('\tatom {H, 0.68996872, 2.87849856, -0.35019650 },\n')
file.write('\tatom {H, 1.02267666, 3.74028357, -1.60867794 },\n')
file.write('\tatom {O, 2.30679745, 0.49649996, 1.58743619 },\n')
file.write('\tatom {H, 3.14934829, 0.37991303, 2.03505313 },\n')
file.write('\tatom {H, 2.50246697, 0.49675828, 0.60818176 },\n')
file.write('\tatom {O, -1.16215513, 1.03600785, -2.17353723 },\n')
file.write('\tatom {H, -0.50300472, 1.73372032, -2.01577333 },\n')
file.write('\tatom {H, -0.67893807, 0.20527170, -2.00103239 },\n')
file.write('\tatom {O, -1.82487473, 1.05756029, 1.85950069 },\n')
file.write('\tatom {H, -2.39730527, 1.23843669, 2.61044765 },\n')
file.write('\tatom {H, -2.40369356, 1.10993284, 1.05336169 },\n')
file.write('\tatom {O, -0.01673943, -1.01280507, 1.47458520 },\n')
file.write('\tatom {H, -0.70957776, -0.36504575, 1.70952012 },\n')
file.write('\tatom {H, 0.83050856, -0.56712969, 1.67070259 },\n')
#----------------------------
#Undecamer (n=11), 443 Isomer
#----------------------------
def print_w11i443(file):
file.write('\tatom {O, 1.73475650, -0.85921782, 0.23574306 },\n')
file.write('\tatom {H, 1.99137662, -1.80301680, 0.20404462 },\n')
file.write('\tatom {H, 1.09679422, -0.75299981, -0.49721552 },\n')
file.write('\tatom {O, -1.27206733, 2.80496529, -0.28891370 },\n')
file.write('\tatom {H, -0.60578303, 2.73702447, -0.99559039 },\n')
file.write('\tatom {H, -1.85281734, 2.03754617, -0.42166837 },\n')
file.write('\tatom {O, 1.80695587, -3.59833500, 0.10205169 },\n')
file.write('\tatom {H, 0.90799062, -3.57655532, -0.29039582 },\n')
file.write('\tatom {H, 2.30634942, -4.22659696, -0.42572346 },\n')
file.write('\tatom {O, -0.12145988, -0.34334066, -1.82442778 },\n')
file.write('\tatom {H, -1.01224699, -0.07152574, -1.53856364 },\n')
file.write('\tatom {H, 0.28597604, 0.48223462, -2.14348423 },\n')
file.write('\tatom {O, -0.68915241, -3.05982185, -0.92595481 },\n')
file.write('\tatom {H, -0.48761997, -2.30734376, -1.50128800 },\n')
file.write('\tatom {H, -1.21188699, -2.66470042, -0.20777354 },\n')
file.write('\tatom {O, -2.09346332, -1.53869676, 1.12610210 },\n')
file.write('\tatom {H, -2.65480582, -1.94698273, 1.79161381 },\n')
file.write('\tatom {H, -1.35671709, -1.08133288, 1.63037003 },\n')
file.write('\tatom {O, -0.07309071, -0.34388533, 2.26037947 },\n')
file.write('\tatom {H, -0.05911794, 0.62899126, 2.29089374 },\n')
file.write('\tatom {H, 0.65655267, -0.57380442, 1.64756926 },\n')
file.write('\tatom {O, -2.63803429, 0.33061937, -0.73147324 },\n')
file.write('\tatom {H, -2.61897591, -0.34347822, -0.01382400 },\n')
file.write('\tatom {H, -3.46549831, 0.19948696, -1.20336761 },\n')
file.write('\tatom {O, 0.18483579, 2.45864040, 1.92403542 },\n')
file.write('\tatom {H, 0.06120028, 3.16461555, 2.56459630 },\n')
file.write('\tatom {H, -0.41609360, 2.66417962, 1.16372425 },\n')
file.write('\tatom {O, 2.38148885, 1.84471789, 0.19406847 },\n')
file.write('\tatom {H, 1.79701667, 2.13193373, 0.91636607 },\n')
file.write('\tatom {H, 2.41047829, 0.87440971, 0.28367310 },\n')
file.write('\tatom {O, 0.93495023, 2.24386326, -1.99438054 },\n')
file.write('\tatom {H, 1.39152819, 2.77646464, -2.65162264 },\n')
file.write('\tatom {H, 1.56204030, 2.14910646, -1.22690927 },\n')
#----------------------------
#Undecamer (n=11), 515 Isomer
#----------------------------
def print_w11i515(file):
file.write('\tatom {O, -2.05673439, -2.24219020, -0.59070364 },\n')
file.write('\tatom {H, -2.85960922, -2.71216775, -0.83194231 },\n')
file.write('\tatom {H, -2.31170553, -1.28779385, -0.48591844 },\n')
file.write('\tatom {O, -0.98862801, 1.95925008, -1.83773516 },\n')
file.write('\tatom {H, -0.19283865, 1.37440617, -2.01112532 },\n')
file.write('\tatom {H, -1.29755299, 2.23666864, -2.70520182 },\n')
file.write('\tatom {O, 2.66280215, 0.39906851, 0.20966910 },\n')
file.write('\tatom {H, 2.08355904, 0.96354359, 0.80225504 },\n')
file.write('\tatom {H, 3.54563537, 0.77481870, 0.27490790 },\n')
file.write('\tatom {O, 1.06868130, 1.83557079, 1.69995394 },\n')
file.write('\tatom {H, 0.67959226, 2.57153073, 1.18638819 },\n')
file.write('\tatom {H, 0.30513953, 1.31840872, 2.01606330 },\n')
file.write('\tatom {O, -2.53128977, 0.33175948, -0.12780056 },\n')
file.write('\tatom {H, -2.13323649, 0.47156951, 0.74880817 },\n')
file.write('\tatom {H, -2.05400844, 0.93018332, -0.73176080 },\n')
file.write('\tatom {O, 1.08781287, 0.41236202, -2.19211365 },\n')
file.write('\tatom {H, 0.84687752, -0.54078875, -2.18986670 },\n')
file.write('\tatom {H, 1.72278337, 0.49296329, -1.46078319 },\n')
file.write('\tatom {O, -1.15112537, 0.25247267, 2.36996011 },\n')
file.write('\tatom {H, -1.55537396, 0.28585363, 3.24176955 },\n')
file.write('\tatom {H, -0.89254563, -0.69703881, 2.22643531 },\n')
file.write('\tatom {O, -0.49712938, -2.24331355, 1.75305582 },\n')
file.write('\tatom {H, -1.06939703, -2.43283192, 0.98789289 },\n')
file.write('\tatom {H, 0.40945281, -2.29856980, 1.40271776 },\n')
file.write('\tatom {O, 2.04540085, -2.23631883, 0.50229127 },\n')
file.write('\tatom {H, 2.37809812, -1.31209245, 0.44946491 },\n')
file.write('\tatom {H, 2.78652001, -2.76976241, 0.80287332 },\n')
file.write('\tatom {O, -0.06400653, 3.75127132, 0.05294777 },\n')
file.write('\tatom {H, -0.45593357, 3.21878933, -0.66844882 },\n')
file.write('\tatom {H, -0.76376844, 4.33868360, 0.35046421 },\n')
file.write('\tatom {O, 0.45431479, -2.24545424, -1.91816312 },\n')
file.write('\tatom {H, -0.44734756, -2.32049949, -1.56054149 },\n')
file.write('\tatom {H, 1.02921350, -2.45754244, -1.16678636 },\n')
#----------------------------
#Undecamer (n=11), 551 Isomer
#----------------------------
def print_w11i551(file):
file.write('\tatom {O, -0.44679161, -3.11807829, 0.07512616 },\n')
file.write('\tatom {H, -0.47083707, -4.07016462, 0.20484203 },\n')
file.write('\tatom {H, 0.42313859, -2.80647073, 0.44156921 },\n')
file.write('\tatom {O, -2.35251345, 0.21241407, -1.60292564 },\n')
file.write('\tatom {H, -3.05945992, 0.42843604, -2.21751584 },\n')
file.write('\tatom {H, -2.11917220, 1.06597654, -1.14137673 },\n')
file.write('\tatom {O, 1.31710665, 0.30426889, 2.33990874 },\n')
file.write('\tatom {H, 0.35015548, 0.47690198, 2.40278925 },\n')
file.write('\tatom {H, 1.67238486, 0.46144257, 3.21951216 },\n')
file.write('\tatom {O, -1.73684436, 2.32662913, -0.16337063 },\n')
file.write('\tatom {H, -0.92055454, 2.80785030, -0.41888592 },\n')
file.write('\tatom {H, -1.56262439, 1.96862966, 0.72107433 },\n')
file.write('\tatom {O, -2.38411323, -1.23726570, 0.79868253 },\n')
file.write('\tatom {H, -2.54009043, -0.83325760, -0.07504919 },\n')
file.write('\tatom {H, -1.77228445, -1.97564874, 0.62010912 },\n')
file.write('\tatom {O, -0.17324707, -1.45158813, -2.22676968 },\n')
file.write('\tatom {H, -0.36520499, -2.15007361, -1.57874892 },\n')
file.write('\tatom {H, -0.90468026, -0.81520501, -2.12113096 },\n')
file.write('\tatom {O, -1.36748202, 0.71922094, 2.22797547 },\n')
file.write('\tatom {H, -1.76670368, -0.05892347, 1.74309966 },\n')
file.write('\tatom {H, -1.93319318, 0.86376513, 2.99231898 },\n')
file.write('\tatom {O, 2.36975031, 1.61090235, -0.00285760 },\n')
file.write('\tatom {H, 2.40195538, 0.85131693, -0.61167306 },\n')
file.write('\tatom {H, 2.06201153, 1.23292448, 0.83660922 },\n')
file.write('\tatom {O, 0.62759029, 3.47102875, -0.89433062 },\n')
file.write('\tatom {H, 0.94894654, 4.34198164, -0.64844645 },\n')
file.write('\tatom {H, 1.31674156, 2.83691024, -0.58655002 },\n')
file.write('\tatom {O, 1.85407828, -2.07329571, 0.87955296 },\n')
file.write('\tatom {H, 2.20126219, -1.67906179, 0.05906797 },\n')
file.write('\tatom {H, 1.70832841, -1.31284908, 1.46909879 },\n')
file.write('\tatom {O, 2.29907574, -0.74002506, -1.56453742 },\n')
file.write('\tatom {H, 2.89440767, -0.90673150, -2.30064806 },\n')
file.write('\tatom {H, 1.38566882, -0.94096379, -1.89934988 },\n')
#----------------
#Dodecamer (n=12)
#----------------
def print_w12(file):
file.write('\tatom {O, 1.79799517, -2.87189360, -0.91374020 },\n')
file.write('\tatom {O, 0.96730604, -2.75911220, 1.62798799 },\n')
file.write('\tatom {O, 1.65380168, -0.07006642, -1.01974524 },\n')
file.write('\tatom {O, 1.02235809, 0.07175530, 1.65572815 },\n')
file.write('\tatom {O, -1.02223258, 0.06714841, -1.65231025 },\n')
file.write('\tatom {O, -1.65303657, -0.06953734, 1.02328922 },\n')
file.write('\tatom {O, -1.79714075, -2.87225082, 0.91489225 },\n')
file.write('\tatom {O, -0.96714604, -2.76268933, -1.62695366 },\n')
file.write('\tatom {O, -0.91046484, 2.86984708, -1.80205865 },\n')
file.write('\tatom {O, 1.62976535, 2.75963881, -0.96492508 },\n')
file.write('\tatom {O, 0.90962365, 2.87584732, 1.79706100 },\n')
file.write('\tatom {O, -1.63064745, 2.76057720, 0.96075175 },\n')
file.write('\tatom {H, 1.58187452, -2.90533857, 0.05372043 },\n')
file.write('\tatom {H, 2.45880812, -3.55319457, -1.06640225 },\n')
file.write('\tatom {H, -1.58013630, -2.90956053, -0.05228132 },\n')
file.write('\tatom {H, -2.45728026, -3.55363054, 1.07010099 },\n')
file.write('\tatom {H, 0.05632428, 2.90394112, -1.58314254 },\n')
file.write('\tatom {H, -1.06231290, 3.55393270, -2.46019459 },\n')
file.write('\tatom {H, -1.56680088, 2.89107809, -0.00131897 },\n')
file.write('\tatom {H, -1.92210192, 1.83983329, 1.06475175 },\n')
file.write('\tatom {H, 1.34757079, 0.06744042, 0.72858318 },\n')
file.write('\tatom {H, 1.14627218, 0.98941003, 1.95482520 },\n')
file.write('\tatom {H, -1.14676057, 0.98295160, -1.95675717 },\n')
file.write('\tatom {H, -1.34650335, 0.06810117, -0.72482129 },\n')
file.write('\tatom {H, 0.72810518, -0.06186014, -1.34872142 },\n')
file.write('\tatom {H, 1.95057642, -0.98833069, -1.14558354 },\n')
file.write('\tatom {H, -0.72703573, -0.06575415, 1.35156867 },\n')
file.write('\tatom {H, -1.95270073, -0.98745311, 1.14464678 },\n')
file.write('\tatom {H, 1.56681459, 2.89291882, -0.00316599 },\n')
file.write('\tatom {H, 1.92433921, 1.83958829, -1.06611724 },\n')
file.write('\tatom {H, -0.00511757, -2.89477284, -1.56679971 },\n')
file.write('\tatom {H, -1.07087869, -1.84139535, -1.91693517 },\n')
file.write('\tatom {H, -0.05748186, 2.90789148, 1.57927484 },\n')
file.write('\tatom {H, 1.06103881, 3.56065162, 2.45452965 },\n')
file.write('\tatom {H, 0.00532585, -2.89179177, 1.56768173 },\n')
file.write('\tatom {H, 1.07022772, -1.83898762, 1.92167783 },\n')
#----
#n=13
#----
def print_w13(file):
file.write('\tatom {O, -1.72139424, 0.02340552, -0.51045640 },\n')
file.write('\tatom {H, -1.41945833, -0.19918539, 0.39792084 },\n')
file.write('\tatom {H, -2.08410215, 0.92470087, -0.43342046 },\n')
file.write('\tatom {O, 0.79400268, -0.08355426, -1.61800002 },\n')
file.write('\tatom {H, 0.85565631, -0.95150633, -2.05523841 },\n')
file.write('\tatom {H, -0.14478345, -0.00666204, -1.33941210 },\n')
file.write('\tatom {O, 1.88539825, -0.54493346, 0.86793990 },\n')
file.write('\tatom {H, 2.21705660, 0.30661953, 1.20298967 },\n')
file.write('\tatom {H, 1.59162060, -0.34584233, -0.04812200 },\n')
file.write('\tatom {O, -0.64782869, -0.70695809, 1.91336684 },\n')
file.write('\tatom {H, -0.88546275, -1.64565072, 2.00988548 },\n')
file.write('\tatom {H, 0.29356130, -0.71474515, 1.63126766 },\n')
file.write('\tatom {O, -2.22834883, 2.72190764, 0.03221480 },\n')
file.write('\tatom {H, -3.04149905, 3.21574821, 0.17116033 },\n')
file.write('\tatom {H, -1.68594256, 3.26002244, -0.60000827 },\n')
file.write('\tatom {O, 2.04408786, 2.18396164, 1.49754959 },\n')
file.write('\tatom {H, 2.55607055, 2.75644150, 2.07583412 },\n')
file.write('\tatom {H, 1.10896828, 2.23076729, 1.82358766 },\n')
file.write('\tatom {O, -1.05731510, -3.42939005, 1.37368402 },\n')
file.write('\tatom {H, -1.46149517, -3.24864692, 0.48556168 },\n')
file.write('\tatom {H, -1.55071838, -4.16328069, 1.75065962 },\n')
file.write('\tatom {O, 0.73767926, -2.83126196, -2.17334751 },\n')
file.write('\tatom {H, 1.13682533, -3.08129264, -1.30009612 },\n')
file.write('\tatom {H, 1.13848219, -3.40938044, -2.82862428 },\n')
file.write('\tatom {O, -0.52701249, 2.09999519, 2.24761959 },\n')
file.write('\tatom {H, -1.14290984, 2.39298709, 1.55267539 },\n')
file.write('\tatom {H, -0.73272120, 1.15964649, 2.37541234 },\n')
file.write('\tatom {O, 1.66707873, 2.57920642, -1.30440517 },\n')
file.write('\tatom {H, 1.90731745, 2.52069656, -0.36318855 },\n')
file.write('\tatom {H, 1.48843837, 1.66087626, -1.56929781 },\n')
file.write('\tatom {O, -1.89504769, -2.73577319, -1.06875216 },\n')
file.write('\tatom {H, -1.10057794, -2.82795469, -1.62235556 },\n')
file.write('\tatom {H, -2.06282681, -1.77825357, -1.03951486 },\n')
file.write('\tatom {O, 1.59082367, -3.30613578, 0.31610668 },\n')
file.write('\tatom {H, 0.77253863, -3.47637118, 0.81436194 },\n')
file.write('\tatom {H, 1.92467277, -2.46573318, 0.67066967 },\n')
file.write('\tatom {O, -0.55535266, 4.04413293, -1.54810837 },\n')
file.write('\tatom {H, -0.69073489, 4.23058735, -2.48075953 },\n')
file.write('\tatom {H, 0.29356952, 3.53504382, -1.50101232 },\n')
#----
#n=14
#----
def print_w14(file):
file.write('\tatom {O, 0.16950902, -3.62933548, -1.48729544 },\n')
file.write('\tatom {H, 0.98770719, -3.06650603, -1.47350004 },\n')
file.write('\tatom {H, 0.36558718, -4.36990691, -2.06811647 },\n')
file.write('\tatom {O, -1.52323398, 3.36629878, 1.22298734 },\n')
file.write('\tatom {H, -2.14947133, 4.03669098, 1.51064704 },\n')
file.write('\tatom {H, -1.51379808, 3.40165005, 0.22855923 },\n')
file.write('\tatom {O, -1.38450391, 0.59271033, 1.38535936 },\n')
file.write('\tatom {H, -1.59520556, 1.53608251, 1.51638084 },\n')
file.write('\tatom {H, -0.41143176, 0.52234858, 1.52061238 },\n')
file.write('\tatom {O, -2.21575526, -2.09500069, -1.60759835 },\n')
file.write('\tatom {H, -1.43074012, -2.66745000, -1.67194669 },\n')
file.write('\tatom {H, -2.48611802, -2.16671557, -0.67923326 },\n')
file.write('\tatom {O, 2.15060733, -2.17256087, 1.68850765 },\n')
file.write('\tatom {H, 2.69633884, -2.61992730, 2.34129044 },\n')
file.write('\tatom {H, 1.29846370, -2.68196763, 1.64355580 },\n')
file.write('\tatom {O, 1.33448057, 3.27996304, 1.33570608 },\n')
file.write('\tatom {H, 0.38389554, 3.42353471, 1.48218741 },\n')
file.write('\tatom {H, 1.48036085, 2.35283381, 1.59102534 },\n')
file.write('\tatom {O, 1.58589874, 3.35889207, -1.32735114 },\n')
file.write('\tatom {H, 2.21639392, 4.01653613, -1.63431500 },\n')
file.write('\tatom {H, 1.58831112, 3.41125226, -0.33696278 },\n')
file.write('\tatom {O, 1.32875257, 0.45891643, 1.39808185 },\n')
file.write('\tatom {H, 1.70543594, -0.39526632, 1.68375482 },\n')
file.write('\tatom {H, 1.43974701, 0.46154170, 0.42083419 },\n')
file.write('\tatom {O, -1.29975022, 0.46035814, -1.41593042 },\n')
file.write('\tatom {H, -1.40711360, 0.48254384, -0.44404161 },\n')
file.write('\tatom {H, -1.67541854, -0.41580686, -1.66369858 },\n')
file.write('\tatom {O, -2.47970188, -1.87944259, 1.30541875 },\n')
file.write('\tatom {H, -2.20856342, -0.94697656, 1.45198105 },\n')
file.write('\tatom {H, -3.21435381, -2.03835724, 1.90502812 },\n')
file.write('\tatom {O, 2.31621387, -2.08571364, -1.18509816 },\n')
file.write('\tatom {H, 2.47470999, -2.11418880, -0.22679247 },\n')
file.write('\tatom {H, 2.14511234, -1.14607629, -1.37705261 },\n')
file.write('\tatom {O, -0.13052795, -3.47566330, 1.30871260 },\n')
file.write('\tatom {H, -0.09494258, -3.70864151, 0.36353304 },\n')
file.write('\tatom {H, -0.95412877, -2.96588919, 1.40500302 },\n')
file.write('\tatom {O, -1.26899342, 3.25761821, -1.42008891 },\n')
file.write('\tatom {H, -0.32004927, 3.39833567, -1.58010994 },\n')
file.write('\tatom {H, -1.42111317, 2.32652422, -1.66237216 },\n')
file.write('\tatom {O, 1.41628336, 0.55716143, -1.34563941 },\n')
file.write('\tatom {H, 0.44624959, 0.49805311, -1.50659495 },\n')
file.write('\tatom {H, 1.64976997, 1.48057920, -1.54747866 },\n')
#----
#n=15
#----
def print_w15(file):
file.write('\tatom {O, -0.41037728, 3.06338557, -2.19453935 },\n')
file.write('\tatom {H, -1.10225722, 3.15700229, -1.50322402 },\n')
file.write('\tatom {H, -0.55868201, 3.78135241, -2.81686194 },\n')
file.write('\tatom {O, -0.34214882, 0.20087415, -2.29863851 },\n')
file.write('\tatom {H, 0.52678420, 0.10051353, -1.84576600 },\n')
file.write('\tatom {H, -0.36804153, 1.13006533, -2.57601866 },\n')
file.write('\tatom {O, -2.23196288, 3.01297353, -0.14410577 },\n')
file.write('\tatom {H, -2.92179851, 3.66294562, 0.01954583 },\n')
file.write('\tatom {H, -1.73818286, 2.92285986, 0.71848955 },\n')
file.write('\tatom {O, -2.36727601, -2.73235161, -0.30448997 },\n')
file.write('\tatom {H, -2.67970705, -1.81681475, -0.35802889 },\n')
file.write('\tatom {H, -1.73980002, -2.80584683, -1.05791736 },\n')
file.write('\tatom {O, 1.60775457, -0.06745560, 1.64608103 },\n')
file.write('\tatom {H, 0.63325622, -0.07453825, 1.78914481 },\n')
file.write('\tatom {H, 1.88844716, 0.82956238, 1.89432294 },\n')
file.write('\tatom {O, -2.30250596, 0.14897737, -0.40481292 },\n')
file.write('\tatom {H, -2.58514101, 1.07692660, -0.37282923 },\n')
file.write('\tatom {H, -1.60883631, 0.13892784, -1.10403419 },\n')
file.write('\tatom {O, -1.14781983, -2.99211225, 2.01614355 },\n')
file.write('\tatom {H, -1.63218518, -2.97754098, 1.14324747 },\n')
file.write('\tatom {H, -1.57112536, -3.67596952, 2.54301254 },\n')
file.write('\tatom {O, 2.13161682, 2.66193894, -0.94063605 },\n')
file.write('\tatom {H, 1.30281644, 2.87721472, -1.40185307 },\n')
file.write('\tatom {H, 2.30155821, 1.73012958, -1.16100356 },\n')
file.write('\tatom {O, 2.03863122, -0.14910922, -1.03750931 },\n')
file.write('\tatom {H, 2.29027066, -1.06413710, -1.25777871 },\n')
file.write('\tatom {H, 1.89247851, -0.16323798, -0.06343480 },\n')
file.write('\tatom {O, -1.07959061, -0.19223930, 2.00451613 },\n')
file.write('\tatom {H, -1.53139246, -0.12237038, 1.13212592 },\n')
file.write('\tatom {H, -1.21282358, -1.11848311, 2.27583104 },\n')
file.write('\tatom {O, -0.54129318, -2.69118030, -2.36237717 },\n')
file.write('\tatom {H, -0.53101580, -1.76517125, -2.64617762 },\n')
file.write('\tatom {H, 0.36452643, -2.85595028, -2.04808200 },\n')
file.write('\tatom {O, 1.60493380, -2.91090837, 1.40237411 },\n')
file.write('\tatom {H, 1.80909034, -2.00751362, 1.69164607 },\n')
file.write('\tatom {H, 0.67031170, -3.03851425, 1.65171826 },\n')
file.write('\tatom {O, 1.87438878, 2.74300689, 1.71123562 },\n')
file.write('\tatom {H, 2.48187829, 3.37972572, 2.09828615 },\n')
file.write('\tatom {H, 2.01080637, 2.79193943, 0.72916891 },\n')
file.write('\tatom {O, -0.91062230, 2.61943091, 2.10670728 },\n')
file.write('\tatom {H, -1.02038252, 1.68324940, 2.34467938 },\n')
file.write('\tatom {H, 0.05362613, 2.74889220, 2.03849853 },\n')
file.write('\tatom {O, 2.07248122, -2.94084258, -1.22138117 },\n')
file.write('\tatom {H, 1.91566855, -3.02563145, -0.24488011 },\n')
file.write('\tatom {H, 2.71120402, -3.62293405, -1.44731277 },\n')
#----
#n=16
#----
def print_w16(file):
file.write('\tatom {O, 1.41406651, 1.47898306, 1.34080339 },\n')
file.write('\tatom {H, 1.49686076, 1.45044122, 0.36223864 },\n')
file.write('\tatom {H, 1.62064251, 2.40207704, 1.57178605 },\n')
file.write('\tatom {O, -1.41406651, -1.47898306, 1.34080339 },\n')
file.write('\tatom {H, -1.62064251, -2.40207704, 1.57178605 },\n')
file.write('\tatom {H, -1.49686076, -1.45044122, 0.36223864 },\n')
file.write('\tatom {O, 1.39000816, -4.15918450, 1.27951492 },\n')
file.write('\tatom {H, 0.45245028, -4.28804192, 1.50630410 },\n')
file.write('\tatom {H, 1.57669938, -3.23997897, 1.53183841 },\n')
file.write('\tatom {O, 1.33400994, -1.32197720, 1.41420070 },\n')
file.write('\tatom {H, 1.59995522, -0.42479230, 1.67545887 },\n')
file.write('\tatom {H, 0.35477253, -1.32600182, 1.49428525 },\n')
file.write('\tatom {O, -1.33400994, -1.32197720, -1.41420070 },\n')
file.write('\tatom {H, -1.59995522, -0.42479230, -1.67545887 },\n')
file.write('\tatom {H, -0.35477253, -1.32600182, -1.49428525 },\n')
file.write('\tatom {O, 1.39000816, 4.15918450, -1.27951492 },\n')
file.write('\tatom {H, 0.45245028, 4.28804192, -1.50630410 },\n')
file.write('\tatom {H, 1.57669938, 3.23997897, -1.53183841 },\n')
file.write('\tatom {O, -1.45801073, 4.28844406, -1.38716611 },\n')
file.write('\tatom {H, -1.52043672, 4.32892728, -0.39770335 },\n')
file.write('\tatom {H, -2.05158507, 4.96664049, -1.72209901 },\n')
file.write('\tatom {O, 1.45801073, -4.28844406, -1.38716611 },\n')
file.write('\tatom {H, 2.05158507, -4.96664049, -1.72209901 },\n')
file.write('\tatom {H, 1.52043672, -4.32892728, -0.39770335 },\n')
file.write('\tatom {O, -1.45801073, -4.28844406, 1.38716611 },\n')
file.write('\tatom {H, -1.52043672, -4.32892728, 0.39770335 },\n')
file.write('\tatom {H, -2.05158507, -4.96664049, 1.72209901 },\n')
file.write('\tatom {O, -1.39000816, 4.15918450, 1.27951492 },\n')
file.write('\tatom {H, -0.45245028, 4.28804192, 1.50630410 },\n')
file.write('\tatom {H, -1.57669938, 3.23997897, 1.53183841 },\n')
file.write('\tatom {O, 1.41406651, -1.47898306, -1.34080339 },\n')
file.write('\tatom {H, 1.49686076, -1.45044122, -0.36223864 },\n')
file.write('\tatom {H, 1.62064251, -2.40207704, -1.57178605 },\n')
file.write('\tatom {O, -1.41406651, 1.47898306, -1.34080339 },\n')
file.write('\tatom {H, -1.62064251, 2.40207704, -1.57178605 },\n')
file.write('\tatom {H, -1.49686076, 1.45044122, -0.36223864 },\n')
file.write('\tatom {O, 1.45801073, 4.28844406, 1.38716611 },\n')
file.write('\tatom {H, 1.52043672, 4.32892728, 0.39770335 },\n')
file.write('\tatom {H, 2.05158507, 4.96664049, 1.72209901 },\n')
file.write('\tatom {O, -1.39000816, -4.15918450, -1.27951492 },\n')
file.write('\tatom {H, -0.45245028, -4.28804192, -1.50630410 },\n')
file.write('\tatom {H, -1.57669938, -3.23997897, -1.53183841 },\n')
file.write('\tatom {O, 1.33400994, 1.32197720, -1.41420070 },\n')
file.write('\tatom {H, 1.59995522, 0.42479230, -1.67545887 },\n')
file.write('\tatom {H, 0.35477253, 1.32600182, -1.49428525 },\n')
file.write('\tatom {O, -1.33400994, 1.32197720, 1.41420070 },\n')
file.write('\tatom {H, -0.35477253, 1.32600182, 1.49428525 },\n')
file.write('\tatom {H, -1.59995522, 0.42479230, 1.67545887 },\n')
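#The atom lines emitted by these print_w* functions follow a fixed textual
#pattern, so downstream tooling can parse them back into tuples. A minimal
#sketch of such a parser (ATOM_RE and parse_atom_line are hypothetical names,
#not part of this file):

```python
import re

# Matches the '\tatom {El, x, y, z },' lines written by the print_w* helpers.
ATOM_RE = re.compile(
    r'atom \{(\w+), (-?\d+\.\d+), (-?\d+\.\d+), (-?\d+\.\d+) \}')

def parse_atom_line(line):
    """Return (element, x, y, z) for one atom line, or None if no match."""
    m = ATOM_RE.search(line)
    if m is None:
        return None
    element, x, y, z = m.groups()
    return element, float(x), float(y), float(z)
```

#Such a round-trip parser is useful, for example, to load a written geometry
#back into memory for computing O-O distances or centering the cluster.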
#---------------
#n=17 (Interior)
#---------------
def print_w17int(file):
file.write('\tatom {O, -0.01493103, -0.11597399, 0.08504794 },\n')
file.write('\tatom {H, -0.54830827, -0.79672051, 0.54924004 },\n')
file.write('\tatom {H, 0.26142249, 0.50039437, 0.79799914 },\n')
file.write('\tatom {O, -1.40600079, -2.02486637, 1.51449118 },\n')
file.write('\tatom {H, -1.97744278, -2.38235975, 0.80783724 },\n')
file.write('\tatom {H, -0.70270740, -2.69031204, 1.63619275 },\n')
file.write('\tatom {O, 0.69995502, 1.48523512, 2.19778076 },\n')
file.write('\tatom {H, 0.72668055, 2.39089850, 1.83196509 },\n')
file.write('\tatom {H, 1.62972317, 1.18983224, 2.22697611 },\n')
file.write('\tatom {O, 1.71882695, -1.36674740, -1.60938263 },\n')
file.write('\tatom {H, 1.92559761, -0.71100080, -2.29699013 },\n')
file.write('\tatom {H, 1.14742176, -0.86778248, -0.98574738 },\n')
file.write('\tatom {O, -0.82892133, 1.17597634, -2.17815471 },\n')
file.write('\tatom {H, -0.53754194, 0.70230668, -1.36963306 },\n')
file.write('\tatom {H, -0.02586849, 1.20790698, -2.72499002 },\n')
file.write('\tatom {O, -2.62491952, -2.63028076, -0.89040553 },\n')
file.write('\tatom {H, -3.36659624, -3.18288446, -1.15368648 },\n')
file.write('\tatom {H, -2.87552044, -1.70311641, -1.15589515 },\n')
file.write('\tatom {O, 3.22568486, 0.51580834, 1.60184690 },\n')
file.write('\tatom {H, 3.15887363, -0.44655664, 1.35468137 },\n')
file.write('\tatom {H, 4.07147791, 0.60462248, 2.05102255 },\n')
file.write('\tatom {O, 0.70024553, 3.77012031, 0.65875677 },\n')
file.write('\tatom {H, -0.17945986, 3.66970044, 0.19378959 },\n')
file.write('\tatom {H, 0.79432527, 4.70654786, 0.85464147 },\n')
file.write('\tatom {O, -3.27899902, 1.54471635, 0.80788480 },\n')
file.write('\tatom {H, -2.78624126, 1.14871181, 1.57540152 },\n')
file.write('\tatom {H, -4.12274660, 1.83521150, 1.16718828 },\n')
file.write('\tatom {O, -1.56776961, 3.32426908, -0.57758859 },\n')
file.write('\tatom {H, -2.20888987, 2.82483943, -0.03976389 },\n')
file.write('\tatom {H, -1.33373669, 2.69714016, -1.29006638 },\n')
file.write('\tatom {O, 2.63519585, 2.18068693, -0.67386812 },\n')
file.write('\tatom {H, 2.92050546, 1.60093117, 0.05353762 },\n')
file.write('\tatom {H, 2.01063301, 2.80736955, -0.26724530 },\n')
file.write('\tatom {O, -3.22522739, -0.12727675, -1.49339530 },\n')
file.write('\tatom {H, -3.31981827, 0.42898292, -0.69813308 },\n')
file.write('\tatom {H, -2.51643255, 0.30332052, -2.00099814 },\n')
file.write('\tatom {O, 0.79609848, -3.71439238, 1.25117393 },\n')
file.write('\tatom {H, 0.54477356, -3.77685417, 0.28988842 },\n')
file.write('\tatom {H, 0.93625080, -4.61883181, 1.54705505 },\n')
file.write('\tatom {O, 1.89827284, 1.08163047, -2.99757387 },\n')
file.write('\tatom {H, 2.22210990, 1.53460165, -2.17874904 },\n')
file.write('\tatom {H, 2.42264308, 1.43810302, -3.72002788 },\n')
file.write('\tatom {O, -1.89000594, 0.44196341, 2.79825366 },\n')
file.write('\tatom {H, -0.98912297, 0.80774296, 2.80111040 },\n')
file.write('\tatom {H, -1.77219591, -0.48917214, 2.54168817 },\n')
file.write('\tatom {O, 0.04909521, -3.57372894, -1.29036379 },\n')
file.write('\tatom {H, 0.60452184, -2.85690908, -1.64983740 },\n')
file.write('\tatom {H, -0.86368532, -3.23847512, -1.32330854 },\n')
file.write('\tatom {O, 3.05559597, -1.98916361, 0.75928244 },\n')
file.write('\tatom {H, 2.36653504, -2.59144463, 1.08898534 },\n')
file.write('\tatom {H, 2.80846963, -1.85697748, -0.17499773 },\n')
#--------------
#n=17 (Surface)
#--------------
def print_w17surf(file):
file.write('\tatom {O, 0.23692669, -4.65683721, 0.48774215 },\n')
file.write('\tatom {H, -0.51925947, -4.31137465, 1.03257353 },\n')
file.write('\tatom {H, 0.41856682, -5.54767870, 0.79964933 },\n')
file.write('\tatom {O, -2.08586139, -0.05117740, 0.04583617 },\n')
file.write('\tatom {H, -1.50756298, 0.28226837, -0.67576496 },\n')
file.write('\tatom {H, -2.50701356, -0.85944526, -0.30189210 },\n')
file.write('\tatom {O, 3.57542689, 0.17848797, -1.39307975 },\n')
file.write('\tatom {H, 4.46764291, 0.24313009, -1.74576954 },\n')
file.write('\tatom {H, 3.67350147, -0.03196931, -0.42827831 },\n')
file.write('\tatom {O, 1.54422957, -2.21962678, 0.54684888 },\n')
file.write('\tatom {H, 1.29549099, -3.16244212, 0.56351294 },\n')
file.write('\tatom {H, 1.37067696, -1.91517279, -0.36731664 },\n')
file.write('\tatom {O, 1.16595877, -1.22238147, -2.02560843 },\n')
file.write('\tatom {H, 2.05251088, -0.82228813, -2.06847549 },\n')
file.write('\tatom {H, 0.55256907, -0.47502704, -2.14975084 },\n')
file.write('\tatom {O, -0.57175754, 3.55391445, 2.10626722 },\n')
file.write('\tatom {H, -1.39115075, 3.18839567, 1.68000593 },\n')
file.write('\tatom {H, -0.87411109, 4.03719665, 2.88085344 },\n')
file.write('\tatom {O, -1.76131856, -3.43767851, 1.74722883 },\n')
file.write('\tatom {H, -2.32962276, -3.17202142, 1.00434061 },\n')
file.write('\tatom {H, -1.40135001, -2.59813718, 2.08528286 },\n')
file.write('\tatom {O, -0.63093208, -3.48491822, -1.94302346 },\n')
file.write('\tatom {H, -0.23829063, -4.03664137, -1.24468989 },\n')
file.write('\tatom {H, 0.04495724, -2.82066258, -2.15250417 },\n')
file.write('\tatom {O, -0.45520628, 1.05871738, -1.83625961 },\n')
file.write('\tatom {H, 0.30663738, 1.47711374, -1.36797287 },\n')
file.write('\tatom {H, -0.97787375, 1.81828666, -2.14816799 },\n')
file.write('\tatom {O, 0.29947130, 4.50820655, -0.44394424 },\n')
file.write('\tatom {H, 0.06150883, 4.39973198, 0.49330177 },\n')
file.write('\tatom {H, 0.98230459, 3.82762475, -0.57671799 },\n')
file.write('\tatom {O, -2.86315640, -2.59367438, -0.76458985 },\n')
file.write('\tatom {H, -2.08742294, -2.92276141, -1.28622298 },\n')
file.write('\tatom {H, -3.64710413, -2.87417039, -1.24542095 },\n')
file.write('\tatom {O, 3.47461432, -0.34625855, 1.22144547 },\n')
file.write('\tatom {H, 2.94262697, 0.30871211, 1.69912297 },\n')
file.write('\tatom {H, 2.90241444, -1.13387509, 1.17616012 },\n')
file.write('\tatom {O, -2.65823268, 2.60253978, 0.73991459 },\n')
file.write('\tatom {H, -2.56743584, 2.96397951, -0.15815229 },\n')
file.write('\tatom {H, -2.64863471, 1.63809077, 0.60365680 },\n')
file.write('\tatom {O, 1.21442507, 1.36709466, 2.01026056 },\n')
file.write('\tatom {H, 0.60933196, 0.60337976, 2.13396618 },\n')
file.write('\tatom {H, 0.67766394, 2.14516345, 2.24988227 },\n')
file.write('\tatom {O, -1.77828582, 3.54903590, -1.82004078 },\n')
file.write('\tatom {H, -2.15107605, 4.18713341, -2.43497442 },\n')
file.write('\tatom {H, -1.01812491, 4.00751351, -1.37534830 },\n')
file.write('\tatom {O, 1.68793160, 2.08055151, -0.57030338 },\n')
file.write('\tatom {H, 2.46208845, 1.61030075, -0.92691453 },\n')
file.write('\tatom {H, 1.59018131, 1.77711776, 0.36267192 },\n')
file.write('\tatom {O, -0.38604848, -0.89729623, 2.02255122 },\n')
file.write('\tatom {H, -1.00819445, -0.57102949, 1.33531575 },\n')
file.write('\tatom {H, 0.29367122, -1.38422097, 1.51134834 },\n')
#--------------
#n=18 (Surface)
#--------------
def print_w18(file):
file.write('\tatom {O, 2.05847074, -4.03432286, 1.31047808 },\n')
file.write('\tatom {H, 1.18052907, -4.40859785, 1.49696787 },\n')
file.write('\tatom {H, 1.97818927, -3.09758325, 1.55869811 },\n')
file.write('\tatom {O, -0.68550781, -4.83729972, 1.31015757 },\n')
file.write('\tatom {H, -0.70328898, -4.89129151, 0.31638823 },\n')
file.write('\tatom {H, -1.11637156, -5.63356464, 1.63343528 },\n')
file.write('\tatom {O, 2.22632201, -4.10379025, -1.35943877 },\n')
file.write('\tatom {H, 2.99202269, -4.58639542, -1.68332186 },\n')
file.write('\tatom {H, 2.27696713, -4.13004153, -0.36932902 },\n')
file.write('\tatom {O, -0.56478868, -4.71945768, -1.33892371 },\n')
file.write('\tatom {H, 0.38273906, -4.62558121, -1.53662053 },\n')
file.write('\tatom {H, -0.94951761, -3.85608537, -1.57625912 },\n')
file.write('\tatom {O, 1.33879793, -1.30403906, 1.34452743 },\n')
file.write('\tatom {H, 1.42232421, -1.30311909, 0.36540415 },\n')
file.write('\tatom {H, 1.48885980, -0.37546505, 1.60171284 },\n')
file.write('\tatom {O, -1.25874146, -2.12218316, 1.43553426 },\n')
file.write('\tatom {H, -0.33672027, -1.79275214, 1.53850102 },\n')
file.write('\tatom {H, -1.20250156, -3.08617156, 1.57815253 },\n')
file.write('\tatom {O, 1.36668067, -1.43692881, -1.41628685 },\n')
file.write('\tatom {H, 1.80905971, -2.28248970, -1.61108093 },\n')
file.write('\tatom {H, 0.40661967, -1.61694172, -1.54039121 },\n')
file.write('\tatom {O, 1.26137413, 1.45144982, 1.48016252 },\n')
file.write('\tatom {H, 0.29511972, 1.62130277, 1.53065802 },\n')
file.write('\tatom {H, 1.68061979, 2.30534131, 1.69188388 },\n')
file.write('\tatom {O, -1.29056761, -2.01779296, -1.36363982 },\n')
file.write('\tatom {H, -1.94815102, -1.32895426, -1.61450342 },\n')
file.write('\tatom {H, -1.35420222, -2.04415886, -0.38795315 },\n')
file.write('\tatom {O, -3.15828768, -0.19083915, 1.39452665 },\n')
file.write('\tatom {H, -2.55834478, -0.95735314, 1.52883846 },\n')
file.write('\tatom {H, -3.89144535, -0.31843982, 2.00328132 },\n')
file.write('\tatom {O, 1.42751515, 1.32626524, -1.29381126 },\n')
file.write('\tatom {H, 1.58986208, 0.39946517, -1.54887549 },\n')
file.write('\tatom {H, 1.44702551, 1.31497135, -0.31339551 },\n')
file.write('\tatom {O, 2.09948977, 4.11570775, 1.47267075 },\n')
file.write('\tatom {H, 2.84262212, 4.59723418, 1.84681981 },\n')
file.write('\tatom {H, 2.21399375, 4.14503392, 0.48656745 },\n')
file.write('\tatom {O, -1.41028187, 1.99418107, 1.23220378 },\n')
file.write('\tatom {H, -1.41183312, 2.03209564, 0.24932954 },\n')
file.write('\tatom {H, -2.07418914, 1.31583240, 1.45388990 },\n')
file.write('\tatom {O, -3.05625916, 0.04693188, -1.57157373 },\n')
file.write('\tatom {H, -2.52147097, 0.85316155, -1.67721407 },\n')
file.write('\tatom {H, -3.31630766, 0.05278117, -0.63821070 },\n')
file.write('\tatom {O, 2.11573205, 4.05999364, -1.19395405 },\n')
file.write('\tatom {H, 1.25583413, 4.44145521, -1.44153843 },\n')
file.write('\tatom {H, 2.05218034, 3.12604891, -1.45774007 },\n')
file.write('\tatom {O, -0.68334809, 4.72256431, 1.28552964 },\n')
file.write('\tatom {H, 0.25096857, 4.62581480, 1.53805303 },\n')
file.write('\tatom {H, -1.07856344, 3.85501436, 1.47993924 },\n')
file.write('\tatom {O, -1.18486931, 2.16167975, -1.49247911 },\n')
file.write('\tatom {H, -1.12164442, 3.11883584, -1.66399249 },\n')
file.write('\tatom {H, -0.26395756, 1.82978079, -1.56957718 },\n')
file.write('\tatom {O, -0.60255064, 4.89711964, -1.37747275 },\n')
file.write('\tatom {H, -0.70253095, 4.93857234, -0.39094048 },\n')
file.write('\tatom {H, -1.01614374, 5.69108795, -1.72765539 },\n')
#---------------
#n=19 (Interior)
#---------------
def print_w19(file):
file.write('\tatom {O, -1.78991610, 1.51321269, -1.17846555 },\n')
file.write('\tatom {H, -2.63160242, 1.02336537, -1.09895002 },\n')
file.write('\tatom {H, -1.15355746, 0.93276870, -0.69523130 },\n')
file.write('\tatom {O, 0.03256222, -0.00540767, 0.13005355 },\n')
file.write('\tatom {H, -0.35332650, -0.69045565, 0.71966335 },\n')
file.write('\tatom {H, 0.46829712, 0.62864405, 0.73986929 },\n')
file.write('\tatom {O, -0.85363152, -1.95142647, 1.87846743 },\n')
file.write('\tatom {H, -1.07489331, -2.75816424, 1.37330312 },\n')
file.write('\tatom {H, 0.02089709, -2.13528375, 2.27440873 },\n')
file.write('\tatom {O, 1.06520757, 1.83756967, 1.91953561 },\n')
file.write('\tatom {H, 1.23064218, 2.66505116, 1.42920907 },\n')
file.write('\tatom {H, 0.22568412, 1.99214094, 2.39570811 },\n')
file.write('\tatom {O, 1.73641639, -1.51936110, -1.41953903 },\n')
file.write('\tatom {H, 2.58546662, -1.04462702, -1.40085287 },\n')
file.write('\tatom {H, 1.15706906, -0.95156367, -0.86611967 },\n')
file.write('\tatom {O, -2.27744615, -2.15163174, -1.65043961 },\n')
file.write('\tatom {H, -1.59433767, -1.86047183, -2.29534636 },\n')
file.write('\tatom {H, -1.86597712, -2.85818716, -1.12471521 },\n')
file.write('\tatom {O, -1.55144695, 2.21537273, 2.78981610 },\n')
file.write('\tatom {H, -1.87911716, 2.47582356, 3.65565516 },\n')
file.write('\tatom {H, -2.06409590, 1.39428925, 2.54987946 },\n')
file.write('\tatom {O, 1.82080978, -2.38163840, 2.50934536 },\n')
file.write('\tatom {H, 2.22141601, -2.68867558, 3.32788975 },\n')
file.write('\tatom {H, 2.31922415, -1.55504333, 2.26618345 },\n')
file.write('\tatom {O, 2.08894260, 2.25089574, -1.78461216 },\n')
file.write('\tatom {H, 1.73510916, 2.91356802, -1.16536237 },\n')
file.write('\tatom {H, 1.34533352, 2.00852733, -2.36458106 },\n')
file.write('\tatom {O, -3.74023115, -0.30841120, -0.52601984 },\n')
file.write('\tatom {H, -4.65837809, -0.45104294, -0.77304433 },\n')
file.write('\tatom {H, -3.22439128, -1.05537029, -0.94802780 },\n')
file.write('\tatom {O, -2.83434488, -0.00223514, 2.14908380 },\n')
file.write('\tatom {H, -3.24076521, -0.08692872, 1.26703331 },\n')
file.write('\tatom {H, -2.21210378, -0.74986342, 2.19814876 },\n')
file.write('\tatom {O, 3.06440299, -0.12777184, 1.87382466 },\n')
file.write('\tatom {H, 3.37404420, 0.01256554, 0.96071756 },\n')
file.write('\tatom {H, 2.45581921, 0.61399336, 2.03583308 },\n')
file.write('\tatom {O, -1.09132068, -4.04631153, 0.09570936 },\n')
file.write('\tatom {H, -1.39785255, -4.95681845, 0.12755242 },\n')
file.write('\tatom {H, -0.09658303, -4.08390695, 0.02715018 },\n')
file.write('\tatom {O, 1.10324680, 4.00937316, 0.18579009 },\n')
file.write('\tatom {H, 0.10637553, 4.03389639, 0.23220530 },\n')
file.write('\tatom {H, 1.40258859, 4.92066185, 0.25049313 },\n')
file.write('\tatom {O, 3.71505267, 0.36947883, -0.85503621 },\n')
file.write('\tatom {H, 4.60499009, 0.55744977, -1.16647114 },\n')
file.write('\tatom {H, 3.14623717, 1.10223053, -1.20829228 },\n')
file.write('\tatom {O, -1.50650441, 3.76858590, 0.40424706 },\n')
file.write('\tatom {H, -1.64831105, 3.36993687, 1.28085968 },\n')
file.write('\tatom {H, -1.72104185, 3.03906481, -0.21217906 },\n')
file.write('\tatom {O, 1.53279435, -3.84248704, 0.05530571 },\n')
file.write('\tatom {H, 1.76399279, -3.48323592, 0.92861203 },\n')
file.write('\tatom {H, 1.70697057, -3.09136967, -0.54939068 },\n')
file.write('\tatom {O, -0.20825966, 1.56738878, -3.34458330 },\n')
file.write('\tatom {H, -0.91555776, 1.62269461, -2.65996381 },\n')
file.write('\tatom {H, -0.47499620, 2.15875012, -4.05447009 },\n')
file.write('\tatom {O, -0.29205805, -1.32680854, -3.36251707 },\n')
file.write('\tatom {H, -0.28330368, -0.37130281, -3.53034185 },\n')
file.write('\tatom {H, 0.52826469, -1.47084962, -2.85809049 },\n')
#------------------------
#n=20 Dodecahedron *** C1
#------------------------
#MP2/aug-cc-pVTZ delta E = 209.28 kcal/mol
def print_w20dode(file):
file.write('\tatom {O, 2.21363671, 3.19691705, 0.68721850 },\n')
file.write('\tatom {O, 3.65057365, 0.72957220, 0.88213303 },\n')
file.write('\tatom {O, 0.12378993, 3.10718406, 2.42529051 },\n')
file.write('\tatom {O, 1.04255648, 3.11293656, -1.89832518 },\n')
file.write('\tatom {O, 2.66039031, -0.78644391, 2.76500519 },\n')
file.write('\tatom {O, 0.36138214, 0.55170878, 3.71095844 },\n')
file.write('\tatom {O, 3.44677186, -0.54403736, -1.51310232 },\n')
file.write('\tatom {O, -2.09205528, 2.96824738, 1.03881140 },\n')
file.write('\tatom {O, 1.84970876, 0.88709177, -3.35373862 },\n')
file.write('\tatom {O, -1.57543181, 3.18469944, -1.67774247 },\n')
file.write('\tatom {O, 1.60954259, -3.04290753, 1.53036305 },\n')
file.write('\tatom {O, -1.84749768, -0.92916854, 3.16936612 },\n')
file.write('\tatom {O, 2.22513624, -3.09413850, -0.99837469 },\n')
file.write('\tatom {O, -3.57764895, 0.60042916, 1.46505684 },\n')
file.write('\tatom {O, -0.29188080, -0.56461668, -3.75259415 },\n')
file.write('\tatom {O, -2.48971046, 0.73765115, -2.78470750 },\n')
file.write('\tatom {O, -1.14768291, -3.15619582, 2.01547295 },\n')
file.write('\tatom {O, -0.08921585, -3.08042930, -2.39438070 },\n')
file.write('\tatom {O, -3.76178142, -0.88923220, -0.79127141 },\n')
file.write('\tatom {O, -2.21900716, -2.98260021, -0.54081277 },\n')
file.write('\tatom {H, 1.48860041, 3.23998239, 1.35127815 },\n')
file.write('\tatom {H, 2.75181797, 3.98041800, 0.83609425 },\n')
file.write('\tatom {H, 3.18060821, 1.57490928, 0.79640718 },\n')
file.write('\tatom {H, 3.59581278, 0.30392539, -0.00375389 },\n')
file.write('\tatom {H, -0.02806528, 3.77888401, 3.09713154 },\n')
file.write('\tatom {H, -0.72822476, 3.05806149, 1.90367024 },\n')
file.write('\tatom {H, 1.49860445, 3.10177065, -1.03934799 },\n')
file.write('\tatom {H, 1.37545271, 2.34187778, -2.39860402 },\n')
file.write('\tatom {H, 3.36310551, -0.92937961, 3.40603010 },\n')
file.write('\tatom {H, 3.06263115, -0.20906586, 2.05191259 },\n')
file.write('\tatom {H, 1.17914609, 0.11628774, 3.40878159 },\n')
file.write('\tatom {H, 0.34157730, 1.40897869, 3.25477828 },\n')
file.write('\tatom {H, 3.01239546, -1.40402339, -1.38931272 },\n')
file.write('\tatom {H, 2.91496939, -0.07998757, -2.18428610 },\n')
file.write('\tatom {H, -1.95886631, 3.07079722, 0.07599025 },\n')
file.write('\tatom {H, -2.61516021, 2.15456166, 1.14902557 },\n')
file.write('\tatom {H, 1.03290046, 0.33689971, -3.53115236 },\n')
file.write('\tatom {H, 2.23582115, 1.05936690, -4.21764861 },\n')
file.write('\tatom {H, -1.87075714, 3.96859381, -2.15053933 },\n')
file.write('\tatom {H, -0.58252407, 3.17084169, -1.77960923 },\n')
file.write('\tatom {H, 0.65842849, -3.07749678, 1.74545916 },\n')
file.write('\tatom {H, 1.96710620, -2.26663730, 2.00507084 },\n')
file.write('\tatom {H, -2.42366840, -0.37434822, 2.61890681 },\n')
file.write('\tatom {H, -1.06369958, -0.37190759, 3.38436481 },\n')
file.write('\tatom {H, 2.88227181, -3.79105733, -1.08977609 },\n')
file.write('\tatom {H, 1.99017082, -3.07948303, -0.02275854 },\n')
file.write('\tatom {H, -4.44765215, 0.74838340, 1.84831745 },\n')
file.write('\tatom {H, -3.73087280, 0.05697639, 0.65771307 },\n')
file.write('\tatom {H, -0.25280203, -1.42086828, -3.29625616 },\n')
file.write('\tatom {H, -1.10573897, -0.12324754, -3.42072712 },\n')
file.write('\tatom {H, -2.22307731, 1.58423051, -2.38534021 },\n')
file.write('\tatom {H, -2.93898411, 0.24358925, -2.07959638 },\n')
file.write('\tatom {H, -1.43114729, -3.86916980, 2.59572095 },\n')
file.write('\tatom {H, -1.42737601, -2.30951496, 2.47633957 },\n')
file.write('\tatom {H, -0.13586670, -3.83927742, -2.98398146 },\n')
file.write('\tatom {H, 0.77934548, -3.15444337, -1.93321293 },\n')
file.write('\tatom {H, -4.59832291, -1.22537007, -1.12737541 },\n')
file.write('\tatom {H, -3.17257126, -1.69388848, -0.69954170 },\n')
file.write('\tatom {H, -1.46763166, -3.03149906, -1.15878165 },\n')
file.write('\tatom {H, -1.83957475, -3.10072588, 0.35232980 },\n')
#-----------------------
#n=20 Fused Cubes *** C2
#-----------------------
#MP2/aug-cc-pVTZ delta E = 217.88 kcal/mol
def print_w20fused(file):
file.write('\tatom {O, -0.00011170, 1.89124892, 5.56014624 },\n')
file.write('\tatom {O, 2.01365707, -0.13154392, 5.65472774 },\n')
file.write('\tatom {O, 0.13154392, 2.01365707, -5.65472774 },\n')
file.write('\tatom {O, 1.89124892, 0.00011170, -5.56014624 },\n')
file.write('\tatom {O, 1.94859527, 0.02560379, -2.73064080 },\n')
file.write('\tatom {O, -1.94859527, -0.02560379, -2.73064080 },\n')
file.write('\tatom {O, 1.94851071, 0.02903527, 2.85767511 },\n')
file.write('\tatom {O, -1.94851071, -0.02903527, 2.85767511 },\n')
file.write('\tatom {O, 0.00011170, -1.89124892, 5.56014624 },\n')
file.write('\tatom {O, -2.01365707, 0.13154392, 5.65472774 },\n')
file.write('\tatom {O, -0.13154392, -2.01365707, -5.65472774 },\n')
file.write('\tatom {O, -1.89124892, -0.00011170, -5.56014624 },\n')
file.write('\tatom {O, -1.95562639, 0.07896636, 0.06622882 },\n')
file.write('\tatom {O, -0.02560379, 1.94859527, 2.73064080 },\n')
file.write('\tatom {O, -0.02903527, 1.94851071, -2.85767511 },\n')
file.write('\tatom {O, 0.02903527, -1.94851071, -2.85767511 },\n')
file.write('\tatom {O, 0.07896636, 1.95562639, -0.06622882 },\n')
file.write('\tatom {O, -0.07896636, -1.95562639, -0.06622882 },\n')
file.write('\tatom {O, 0.02560379, -1.94859527, 2.73064080 },\n')
file.write('\tatom {O, 1.95562639, -0.07896636, 0.06622882 },\n')
file.write('\tatom {H, 0.79989605, 1.35280911, 5.68906262 },\n')
file.write('\tatom {H, 0.06198380, 2.20521176, 4.64309683 },\n')
file.write('\tatom {H, 1.32913877, -0.84798431, 5.70007128 },\n')
file.write('\tatom {H, 2.66479384, -0.33786677, 6.33131988 },\n')
file.write('\tatom {H, 0.33786677, 2.66479384, -6.33131988 },\n')
file.write('\tatom {H, 0.84798431, 1.32913877, -5.70007128 },\n')
file.write('\tatom {H, 1.35280911, -0.79989605, -5.68906262 },\n')
file.write('\tatom {H, 2.20521176, -0.06198380, -4.64309683 },\n')
file.write('\tatom {H, 2.31173867, 0.01944289, -1.82913594 },\n')
file.write('\tatom {H, 1.32886723, 0.78738104, -2.74525907 },\n')
file.write('\tatom {H, -2.31173867, -0.01944289, -1.82913594 },\n')
file.write('\tatom {H, -1.32886723, -0.78738104, -2.74525907 },\n')
file.write('\tatom {H, 2.25842437, -0.00766951, 3.78016786 },\n')
file.write('\tatom {H, 1.32470791, 0.78779803, 2.83722371 },\n')
file.write('\tatom {H, -1.32470791, -0.78779803, 2.83722371 },\n')
file.write('\tatom {H, -2.25842437, 0.00766951, 3.78016786 },\n')
file.write('\tatom {H, -0.06198380, -2.20521176, 4.64309683 },\n')
file.write('\tatom {H, -0.79989605, -1.35280911, 5.68906262 },\n')
file.write('\tatom {H, -1.32913877, 0.84798431, 5.70007128 },\n')
file.write('\tatom {H, -2.66479384, 0.33786677, 6.33131988 },\n')
file.write('\tatom {H, -0.84798431, -1.32913877, -5.70007128 },\n')
file.write('\tatom {H, -0.33786677, -2.66479384, -6.33131988 },\n')
file.write('\tatom {H, -1.35280911, 0.79989605, -5.68906262 },\n')
file.write('\tatom {H, -2.20521176, 0.06198380, -4.64309683 },\n')
file.write('\tatom {H, -1.31495188, 0.82267561, 0.04672494 },\n')
file.write('\tatom {H, -2.30533139, 0.08017087, 0.97354165 },\n')
file.write('\tatom {H, -0.01944289, 2.31173867, 1.82913594 },\n')
file.write('\tatom {H, -0.78738104, 1.32886723, 2.74525907 },\n')
file.write('\tatom {H, -0.78779803, 1.32470791, -2.83722371 },\n')
file.write('\tatom {H, 0.00766951, 2.25842437, -3.78016786 },\n')
file.write('\tatom {H, -0.00766951, -2.25842437, -3.78016786 },\n')
file.write('\tatom {H, 0.78779803, -1.32470791, -2.83722371 },\n')
file.write('\tatom {H, 0.82267561, 1.31495188, -0.04672494 },\n')
file.write('\tatom {H, 0.08017087, 2.30533139, -0.97354165 },\n')
file.write('\tatom {H, -0.08017087, -2.30533139, -0.97354165 },\n')
file.write('\tatom {H, -0.82267561, -1.31495188, -0.04672494 },\n')
file.write('\tatom {H, 0.01944289, -2.31173867, 1.82913594 },\n')
file.write('\tatom {H, 0.78738104, -1.32886723, 2.74525907 },\n')
file.write('\tatom {H, 2.30533139, -0.08017087, 0.97354165 },\n')
file.write('\tatom {H, 1.31495188, -0.82267561, 0.04672494 },\n')
#------------------------------------------
#n=20 Face-sharing pentagonal prisms *** C1
#------------------------------------------
#MP2/aug-cc-pVTZ delta E = 218.45 kcal/mol
def print_w20face(file):
file.write('\tatom {O, 0.52945249, -1.32341961, -2.30158189 },\n')
file.write('\tatom {H, 0.57230430, -0.42717289, -2.66988776 },\n')
file.write('\tatom {H, 1.20070409, -1.32066723, -1.58236699 },\n')
file.write('\tatom {O, 1.15049608, -4.32687653, 2.04924411 },\n')
file.write('\tatom {H, 1.57542699, -5.00739601, 2.57892027 },\n')
file.write('\tatom {H, 1.60949740, -4.34253049, 1.16189634 },\n')
file.write('\tatom {O, 0.43806402, -4.20372761, -2.32782506 },\n')
file.write('\tatom {H, 0.44787429, -3.28142680, -2.62541858 },\n')
file.write('\tatom {H, -0.45932379, -4.33561864, -1.97710108 },\n')
file.write('\tatom {O, 2.35781728, -1.30466145, -0.25541273 },\n')
file.write('\tatom {H, 2.72470105, -0.40693996, -0.27646674 },\n')
file.write('\tatom {H, 1.86671290, -1.35451305, 0.59543217 },\n')
file.write('\tatom {O, 0.96735180, -1.54034611, 2.09783423 },\n')
file.write('\tatom {H, 0.00404971, -1.51159356, 1.90135129 },\n')
file.write('\tatom {H, 1.13621383, -2.47031437, 2.33758076 },\n')
file.write('\tatom {O, -1.62143181, -4.23624316, 1.51666252 },\n')
file.write('\tatom {H, -0.68246122, -4.36821795, 1.74577177 },\n')
file.write('\tatom {H, -1.80536731, -3.31735200, 1.76945516 },\n')
file.write('\tatom {O, 2.31463680, -4.17464923, -0.30755498 },\n')
file.write('\tatom {H, 1.67602063, -4.26997138, -1.04828109 },\n')
file.write('\tatom {H, 2.63435439, -3.26358668, -0.39274644 },\n')
file.write('\tatom {O, -1.70526253, -1.39791294, 1.52170484 },\n')
file.write('\tatom {H, -1.80917617, -1.40893550, 0.54278325 },\n')
file.write('\tatom {H, -1.98575126, -0.50890908, 1.79217826 },\n')
file.write('\tatom {O, -1.95876773, -1.53963091, -1.19559330 },\n')
file.write('\tatom {H, -1.06418216, -1.49619064, -1.60411429 },\n')
file.write('\tatom {H, -2.24140869, -2.46507811, -1.31525881 },\n')
file.write('\tatom {O, -2.14759768, -4.31870400, -1.09600433 },\n')
file.write('\tatom {H, -2.81592119, -4.98319666, -1.28498287 },\n')
file.write('\tatom {H, -1.97069280, -4.37663278, -0.12078047 },\n')
file.write('\tatom {O, 0.47462142, 4.36989186, -2.26500408 },\n')
file.write('\tatom {H, 0.64332081, 5.05859373, -2.91474547 },\n')
file.write('\tatom {H, 1.15332550, 4.48635526, -1.56420801 },\n')
file.write('\tatom {O, 1.04744580, 1.25382475, 2.07656068 },\n')
file.write('\tatom {H, 1.20137037, 0.34809551, 2.39608938 },\n')
file.write('\tatom {H, 1.50864481, 1.29478677, 1.20898789 },\n')
file.write('\tatom {O, 0.36246279, 1.51681328, -2.28083815 },\n')
file.write('\tatom {H, -0.51589979, 1.43165665, -1.84405298 },\n')
file.write('\tatom {H, 0.40074983, 2.44105957, -2.57548934 },\n')
file.write('\tatom {O, 2.25856065, 4.37564134, -0.17806446 },\n')
file.write('\tatom {H, 1.75221775, 4.32182058, 0.68054568 },\n')
file.write('\tatom {H, 2.95320011, 5.02386318, -0.02770403 },\n')
file.write('\tatom {O, 0.90843697, 4.07580627, 2.06792242 },\n')
file.write('\tatom {H, -0.05337760, 4.21506749, 1.98795378 },\n')
file.write('\tatom {H, 1.00425702, 3.15139572, 2.35266656 },\n')
file.write('\tatom {O, -1.65298960, 1.41022398, 1.65281057 },\n')
file.write('\tatom {H, -0.68490739, 1.37364478, 1.82298417 },\n')
file.write('\tatom {H, -1.90699229, 2.32674767, 1.85706401 },\n')
file.write('\tatom {O, 2.29898253, 1.52003111, -0.34503957 },\n')
file.write('\tatom {H, 2.58424894, 2.44853098, -0.33251102 },\n')
file.write('\tatom {H, 1.61192718, 1.49454437, -1.04947491 },\n')
file.write('\tatom {O, -1.86712930, 4.21917278, 1.62359977 },\n')
file.write('\tatom {H, -2.47864145, 4.86605383, 1.98706675 },\n')
file.write('\tatom {H, -1.98950935, 4.24471119, 0.63900806 },\n')
file.write('\tatom {O, -2.09773771, 4.06193394, -1.02689536 },\n')
file.write('\tatom {H, -2.29133429, 3.12966308, -1.22336514 },\n')
file.write('\tatom {H, -1.26407666, 4.24453134, -1.49235316 },\n')
file.write('\tatom {O, -2.05469277, 1.24139297, -1.04290791 },\n')
file.write('\tatom {H, -2.34397200, 0.33343012, -1.23979408 },\n')
file.write('\tatom {H, -1.92043436, 1.25492362, -0.06791205 },\n')
#------------------------------------------
#n=20 Edge-sharing pentagonal prisms *** C1
#------------------------------------------
#MP2/aug-cc-pVTZ delta E = 220.32 kcal/mol
def print_w20edge(file):
file.write('\tatom {O, -1.39543275, 2.31657627, 1.37046631 },\n')
file.write('\tatom {H, -0.92662369, 1.46817589, 1.52206631 },\n')
file.write('\tatom {H, -2.32442996, 2.12591716, 1.59673517 },\n')
file.write('\tatom {O, -1.32982068, 2.51637315, -1.37954570 },\n')
file.write('\tatom {H, -1.36882754, 2.54005044, -0.39789671 },\n')
file.write('\tatom {H, -0.65200515, 3.17800652, -1.61031691 },\n')
file.write('\tatom {O, -0.01733366, -0.02086307, 1.42288405 },\n')
file.write('\tatom {H, -0.39379156, -0.91808476, 1.55793838 },\n')
file.write('\tatom {H, 0.95427978, -0.07116580, 1.55922563 },\n')
file.write('\tatom {O, 2.70141424, 0.02680068, 1.34938542 },\n')
file.write('\tatom {H, 2.75656934, 0.02328537, 0.36773137 },\n')
file.write('\tatom {H, 2.96101409, 0.93225671, 1.60318599 },\n')
file.write('\tatom {O, -4.03268277, -1.36044802, -1.47184098 },\n')
file.write('\tatom {H, -4.78522154, -1.60751488, -2.01683073 },\n')
file.write('\tatom {H, -3.97489677, -0.37132077, -1.52599292 },\n')
file.write('\tatom {O, 0.89946758, -4.24052125, 1.48696795 },\n')
file.write('\tatom {H, 1.06572432, -5.00858815, 2.04076342 },\n')
file.write('\tatom {H, 1.73078849, -3.69863929, 1.51444552 },\n')
file.write('\tatom {O, 3.12169441, -2.79383799, -1.56662762 },\n')
file.write('\tatom {H, 3.69601882, -3.27637364, -2.16793827 },\n')
file.write('\tatom {H, 2.24825705, -3.26645164, -1.58986551 },\n')
file.write('\tatom {O, -0.08687104, 0.08739608, -1.30493909 },\n')
file.write('\tatom {H, -0.52989350, 0.93922436, -1.51509713 },\n')
file.write('\tatom {H, -0.08003457, 0.07934656, -0.31956773 },\n')
file.write('\tatom {O, -1.39615176, -2.28999690, -1.40973862 },\n')
file.write('\tatom {H, -2.33616644, -2.12129946, -1.60528195 },\n')
file.write('\tatom {H, -0.96468677, -1.41450889, -1.52550547 },\n')
file.write('\tatom {O, -4.01405126, 1.35720135, 1.50706655 },\n')
file.write('\tatom {H, -4.75134581, 1.59164800, 2.07791242 },\n')
file.write('\tatom {H, -3.93675624, 0.36921416, 1.55883498 },\n')
file.write('\tatom {O, 0.78524496, -4.05592492, -1.37112947 },\n')
file.write('\tatom {H, 0.73646675, -4.29406759, -0.42971134 },\n')
file.write('\tatom {H, -0.02972066, -3.55028342, -1.54040588 },\n')
file.write('\tatom {O, 2.63041510, -0.04600633, -1.40445880 },\n')
file.write('\tatom {H, 2.89071090, -0.95485093, -1.64456265 },\n')
file.write('\tatom {H, 1.65651502, -0.01485283, -1.52765948 },\n')
file.write('\tatom {O, 3.13906504, 2.71354684, -1.28081039 },\n')
file.write('\tatom {H, 3.10239157, 1.76228567, -1.48429784 },\n')
file.write('\tatom {H, 3.32533139, 2.75043550, -0.32691209 },\n')
file.write('\tatom {O, -1.23232183, -2.45173072, 1.34110243 },\n')
file.write('\tatom {H, -0.60298141, -3.16346609, 1.56144181 },\n')
file.write('\tatom {H, -1.30491352, -2.48082581, 0.36147818 },\n')
file.write('\tatom {O, 0.92255596, 4.17488519, -1.52804515 },\n')
file.write('\tatom {H, 1.11093701, 4.93710992, -2.08306670 },\n')
file.write('\tatom {H, 1.74878456, 3.62425684, -1.53602828 },\n')
file.write('\tatom {O, 3.11084510, -2.76903827, 1.29472370 },\n')
file.write('\tatom {H, 3.09043382, -1.82589836, 1.53137113 },\n')
file.write('\tatom {H, 3.30883978, -2.77464518, 0.34210251 },\n')
file.write('\tatom {O, -3.90092000, 1.30179294, -1.35091239 },\n')
file.write('\tatom {H, -3.09681655, 1.79496369, -1.58474282 },\n')
file.write('\tatom {H, -4.03470630, 1.48678957, -0.40518202 },\n')
file.write('\tatom {O, 3.12357086, 2.78007575, 1.58228565 },\n')
file.write('\tatom {H, 2.24950517, 3.25165854, 1.58673980 },\n')
file.write('\tatom {H, 3.69419494, 3.28034834, 2.17303947 },\n')
file.write('\tatom {O, 0.80305370, 4.06443682, 1.32921270 },\n')
file.write('\tatom {H, -0.02269679, 3.58096491, 1.50807247 },\n')
file.write('\tatom {H, 0.76332278, 4.27139378, 0.37953183 },\n')
file.write('\tatom {O, -3.82938205, -1.29987979, 1.37976151 },\n')
file.write('\tatom {H, -3.00615211, -1.77303605, 1.58824604 },\n')
file.write('\tatom {H, -3.98755690, -1.48869245, 0.43855016 },\n')
#--------------
#n=21 (Surface)
#--------------
def print_w21(file):
file.write('\tatom {O, 2.88102880, 1.35617710, -1.35936865 },\n')
file.write('\tatom {H, 2.58653411, 2.28696609, -1.45609052 },\n')
file.write('\tatom {H, 3.32093709, 1.29343465, -0.49116710 },\n')
file.write('\tatom {O, 0.32179453, 0.62629846, -0.94851226 },\n')
file.write('\tatom {H, 1.26447032, 0.88843717, -1.07026883 },\n')
file.write('\tatom {H, -0.21718807, 1.36563667, -1.31919284 },\n')
file.write('\tatom {O, -0.16007111, -1.63872066, -2.31840571 },\n')
file.write('\tatom {H, 0.01823460, -0.80812840, -1.82332661 },\n')
file.write('\tatom {H, 0.10093174, -2.33288028, -1.67894566 },\n')
file.write('\tatom {O, -2.84753775, -0.70659788, 1.22536498 },\n')
file.write('\tatom {H, -2.82066138, -1.51379494, 1.77020749 },\n')
file.write('\tatom {H, -2.78672686, -1.02520290, 0.29442139 },\n')
file.write('\tatom {O, -1.28347305, 2.64257815, -1.71714330 },\n')
file.write('\tatom {H, -1.51430450, 2.88443923, -0.79572381 },\n')
file.write('\tatom {H, -2.09726640, 2.24298480, -2.07172804 },\n')
file.write('\tatom {O, -3.72092200, 1.16535937, -2.05155378 },\n')
file.write('\tatom {H, -3.96467209, 1.40975690, -1.12274906 },\n')
file.write('\tatom {H, -4.48373842, 1.39703848, -2.59023807 },\n')
file.write('\tatom {O, -2.75707813, -1.44066167, -1.41017665 },\n')
file.write('\tatom {H, -3.20125609, -0.67001218, -1.80694890 },\n')
file.write('\tatom {H, -1.88535531, -1.49090783, -1.86154221 },\n')
file.write('\tatom {O, -2.18430745, -4.00503366, -0.11745556 },\n')
file.write('\tatom {H, -2.58170529, -3.32141555, -0.67897787 },\n')
file.write('\tatom {H, -1.24901286, -4.03122679, -0.38234526 },\n')
file.write('\tatom {O, 3.76378263, 0.84987648, 1.25359107 },\n')
file.write('\tatom {H, 4.64925014, 0.99382151, 1.60153466 },\n')
file.write('\tatom {H, 3.59269678, -0.12230330, 1.37734803 },\n')
file.write('\tatom {O, 3.25939067, -1.73586556, 1.54809767 },\n')
file.write('\tatom {H, 2.38789220, -1.97346725, 1.90398296 },\n')
file.write('\tatom {H, 3.34047461, -2.24000005, 0.71725832 },\n')
file.write('\tatom {O, 0.47067808, -2.08139855, 2.05406984 },\n')
file.write('\tatom {H, -0.31565776, -2.51643817, 2.42960695 },\n')
file.write('\tatom {H, 0.20916882, -1.14318846, 1.97673663 },\n')
file.write('\tatom {O, 0.84341794, 4.45219154, 1.19543335 },\n')
file.write('\tatom {H, 1.22169208, 3.69174389, 1.70700542 },\n')
file.write('\tatom {H, 1.12534416, 5.24681772, 1.65820509 },\n')
file.write('\tatom {O, 3.29073454, -3.06649080, -0.92760728 },\n')
file.write('\tatom {H, 4.01092916, -3.62238167, -1.23916202 },\n')
file.write('\tatom {H, 3.16110064, -2.37099393, -1.62514329 },\n')
file.write('\tatom {O, -2.13311585, -3.15170259, 2.41174304 },\n')
file.write('\tatom {H, -2.19514114, -3.57941119, 1.51889439 },\n')
file.write('\tatom {H, -2.56149431, -3.75503610, 3.02609742 },\n')
file.write('\tatom {O, 1.49292018, 3.74546296, -1.50486304 },\n')
file.write('\tatom {H, 1.31323368, 4.10353119, -0.61794466 },\n')
file.write('\tatom {H, 0.61187301, 3.56898981, -1.86464476 },\n')
file.write('\tatom {O, -0.48284790, 0.54531195, 1.58871855 },\n')
file.write('\tatom {H, -0.15462414, 0.54530283, 0.65343103 },\n')
file.write('\tatom {H, -1.36928803, 0.11358922, 1.52480652 },\n')
file.write('\tatom {O, 2.72236852, -1.18342053, -2.72120243 },\n')
file.write('\tatom {H, 1.76763800, -1.29146509, -2.86748821 },\n')
file.write('\tatom {H, 2.83269882, -0.26932242, -2.40792605 },\n')
file.write('\tatom {O, 1.64182263, 2.24659436, 2.46745517 },\n')
file.write('\tatom {H, 2.41153336, 1.77257628, 2.10024166 },\n')
file.write('\tatom {H, 0.91471507, 1.60671414, 2.41570272 },\n')
file.write('\tatom {O, -4.10080206, 1.73028823, 0.54526604 },\n')
file.write('\tatom {H, -3.83492913, 0.90816710, 0.99225907 },\n')
file.write('\tatom {H, -3.44369737, 2.38297040, 0.83399222 },\n')
file.write('\tatom {O, -1.60618492, 3.09549014, 1.00019438 },\n')
file.write('\tatom {H, -0.94029673, 3.78712081, 1.16680637 },\n')
file.write('\tatom {H, -1.18360076, 2.28420300, 1.34174432 },\n')
file.write('\tatom {O, 0.57200422, -3.46363889, -0.35742274 },\n')
file.write('\tatom {H, 0.57764102, -2.95313709, 0.47874423 },\n')
file.write('\tatom {H, 1.51287263, -3.60041770, -0.56712755 },\n')
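Each print_wN function above hard-codes one cluster geometry as tab-indented 'atom {El, x, y, z}' records written line by line. As a minimal sketch of the same record format, a generic writer could take (element, x, y, z) tuples instead of literal strings; write_atoms and the sample water-monomer geometry below are illustrative additions, not part of the original script.

```python
import io

def write_atoms(file, atoms):
    # Illustrative helper (not in the original script): emit each
    # (element, x, y, z) tuple in the same tab-indented record format
    # used by the print_wN functions above, with 8 decimal places.
    for element, x, y, z in atoms:
        file.write('\tatom {%s, %.8f, %.8f, %.8f },\n' % (element, x, y, z))

# Sample geometry for demonstration only: a single water monomer.
water = [
    ('O', 0.00000000, 0.00000000, 0.11779600),
    ('H', 0.00000000, 0.75545300, -0.47116400),
    ('H', 0.00000000, -0.75545300, -0.47116400),
]

buf = io.StringIO()
write_atoms(buf, water)
print(buf.getvalue(), end='')
```

Writing to a file object (here an io.StringIO buffer) keeps the helper usable both with open files and with in-memory buffers for testing.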
#----------------------
#n=64 (Graham Fletcher)
#----------------------
def print_w64(file):
file.write('\tatom { O, 5.3341355614, 2.2822875147, 2.5569123279 },\n')
file.write('\tatom { H, 6.1242446995, 2.3804428156, 3.0638359324 },\n')
file.write('\tatom { H, 4.6164505920, 2.2847341851, 3.1699382266 },\n')
file.write('\tatom { O, -4.9828260569, 6.5645399222, -1.4305048434 },\n')
file.write('\tatom { H, -5.7327448357, 6.5144561802, -2.0014618453 },\n')
file.write('\tatom { H, -4.9037228691, 5.7222103356, -1.0120529257 },\n')
file.write('\tatom { O, -3.6227640662, -2.0094049926, -3.5842035387 },\n')
file.write('\tatom { H, -3.5704397802, -2.5983493456, -2.8484826866 },\n')
file.write('\tatom { H, -4.5389553668, -1.8307904854, -3.7240921110 },\n')
file.write('\tatom { O, 3.1392579772, 2.4144865880, -4.5706452556 },\n')
file.write('\tatom { H, 2.8184642359, 3.2470917478, -4.8784414178 },\n')
file.write('\tatom { H, 3.3782568871, 2.5337043355, -3.6653579510 },\n')
file.write('\tatom { O, -6.0971764862, -1.1475458408, -3.2398107026 },\n')
file.write('\tatom { H, -5.8105777311, -0.6248900075, -2.5079838873 },\n')
file.write('\tatom { H, -6.8392216912, -1.6486367323, -2.9412218779 },\n')
file.write('\tatom { O, -4.7437455608, 0.2406638067, -1.4425658527 },\n')
file.write('\tatom { H, -5.1439053474, 0.8961107449, -0.8938036906 },\n')
file.write('\tatom { H, -4.1189937486, 0.6893944851, -1.9895616530 },\n')
file.write('\tatom { O, -1.3905594340, 0.6003032181, 1.6523328323 },\n')
file.write('\tatom { H, -1.0076438736, -0.2571121741, 1.5569757518 },\n')
file.write('\tatom { H, -1.0031627095, 1.1459707668, 0.9867134529 },\n')
file.write('\tatom { O, -2.9506051416, -2.0575020924, 2.2939955486 },\n')
file.write('\tatom { H, -2.0162828847, -2.1275710915, 2.4080600466 },\n')
file.write('\tatom { H, -3.0925674007, -1.6978547015, 1.4329619109 },\n')
file.write('\tatom { O, -1.4892138488, 4.9887537665, -3.2733984712 },\n')
file.write('\tatom { H, -0.5572623487, 5.0949966526, -3.1682447519 },\n')
file.write('\tatom { H, -1.8837342524, 5.8159639054, -3.0476652410 },\n')
file.write('\tatom { O, 0.9956287780, 2.8403164571, -1.8268350733 },\n')
file.write('\tatom { H, 1.9163773315, 2.6332485532, -1.8118872182 },\n')
file.write('\tatom { H, 0.9290176036, 3.7698996404, -1.9762221903 },\n')
file.write('\tatom { O, -2.5570429685, 7.1678424798, -2.1041651892 },\n')
file.write('\tatom { H, -2.4346193763, 8.0661032604, -2.3668788894 },\n')
file.write('\tatom { H, -3.4798399701, 7.0565619038, -1.9400287177 },\n')
file.write('\tatom { O, 1.9364331416, 0.8087408534, 1.8554569521 },\n')
file.write('\tatom { H, 1.4707676717, 0.8191552420, 2.6763867084 },\n')
file.write('\tatom { H, 1.9844358212, 1.7036938663, 1.5594277280 },\n')
file.write('\tatom { O, -0.8237869838, 6.5422485997, -0.2172900020 },\n')
file.write('\tatom { H, -1.0773967082, 5.8227010871, 0.3384184997 },\n')
file.write('\tatom { H, -1.5172520267, 6.6558571044, -0.8474310942 },\n')
file.write('\tatom { O, -1.1636685123, 4.5053778469, 1.4336238815 },\n')
file.write('\tatom { H, -0.9421345969, 3.7303124294, 0.9426302185 },\n')
file.write('\tatom { H, -2.0469731124, 4.3864670806, 1.7442871514 },\n')
file.write('\tatom { O, -0.4673817162, -0.9650433991, -1.0753723663 },\n')
file.write('\tatom { H, -1.3292040099, -0.9969136435, -0.6918034996 },\n')
file.write('\tatom { H, -0.4411881520, -0.1946487172, -1.6200597862 },\n')
file.write('\tatom { O, -2.7498569237, 0.5798645061, -3.2061586058 },\n')
file.write('\tatom { H, -2.9949221552, -0.3106733578, -3.4004877068 },\n')
file.write('\tatom { H, -1.8071593424, 0.6147993766, -3.2374520413 },\n')
file.write('\tatom { O, -2.9697982126, -1.3746742901, -0.2808320442 },\n')
file.write('\tatom { H, -3.6357151582, -0.7980251231, -0.6198140699 },\n')
file.write('\tatom { H, -3.0855526381, -2.2060771209, -0.7123974507 },\n')
file.write('\tatom { O, 1.8148526783, 0.1245102111, -4.6237838221 },\n')
file.write('\tatom { H, 2.2429426431, 0.9556874312, -4.7532514430 },\n')
file.write('\tatom { H, 2.4707150376, -0.4704553354, -4.2970744432 },\n')
file.write('\tatom { O, -0.8261033555, 2.4323079020, -0.0832596083 },\n')
file.write('\tatom { H, -1.5269390763, 2.5208697741, -0.7092504559 },\n')
file.write('\tatom { H, -0.0196120782, 2.4648804414, -0.5725338701 },\n')
file.write('\tatom { O, -1.3817049691, -5.0352329774, -2.4232812447 },\n')
file.write('\tatom { H, -0.7684293116, -4.3182706978, -2.4504199904 },\n')
file.write('\tatom { H, -1.4545340129, -5.3634550894, -3.3052362490 },\n')
file.write('\tatom { O, 1.9215826085, -1.0921763125, 0.1197749095 },\n')
file.write('\tatom { H, 1.0849945087, -1.2270160735, -0.2959377849 },\n')
file.write('\tatom { H, 1.7847739979, -0.4579975452, 0.8053251625 },\n')
file.write('\tatom { O, -1.0018623482, -5.1120322982, -5.0837566752 },\n')
file.write('\tatom { H, -1.3933176067, -5.5582077871, -5.8176290536 },\n')
file.write('\tatom { H, -1.1664244625, -4.1908496790, -5.2071274443 },\n')
file.write('\tatom { O, 3.5172957950, 2.3202722362, -1.9688149465 },\n')
file.write('\tatom { H, 3.9976007522, 2.8596853436, -1.3611809269 },\n')
file.write('\tatom { H, 3.7014504726, 1.4233801808, -1.7395767524 },\n')
file.write('\tatom { O, -2.6102841193, 3.0430679155, -1.9350734213 },\n')
file.write('\tatom { H, -2.8695431759, 2.3270909601, -2.4927831979 },\n')
file.write('\tatom { H, -2.2135983071, 3.6922269672, -2.4937426848 },\n')
file.write('\tatom { O, 0.9098841921, 5.4361164481, -2.0327307748 },\n')
file.write('\tatom { H, 1.8021902964, 5.7117315130, -1.8959659902 },\n')
file.write('\tatom { H, 0.3966346177, 5.8268238497, -1.3436726824 },\n')
file.write('\tatom { O, -0.1396389829, 0.8002592786, -2.9481938727 },\n')
file.write('\tatom { H, 0.1803973363, 1.6461091339, -2.6780212784 },\n')
file.write('\tatom { H, 0.4842490367, 0.4598789293, -3.5693085246 },\n')
file.write('\tatom { O, -4.3860363431, 4.2720576892, -0.3433107406 },\n')
file.write('\tatom { H, -3.7336474223, 3.8267809447, -0.8600270757 },\n')
file.write('\tatom { H, -4.0487873372, 4.3197159157, 0.5369564095 },\n')
file.write('\tatom { O, -0.3551237102, -1.8527769523, 2.0684277226 },\n')
file.write('\tatom { H, -0.0920198226, -2.5657488890, 1.5086725835 },\n')
file.write('\tatom { H, 0.2994391739, -1.7831738698, 2.7448752784 },\n')
file.write('\tatom { O, -3.4203294336, -3.7199007883, -1.3994806141 },\n')
file.write('\tatom { H, -2.7432687765, -4.3024393603, -1.7046299754 },\n')
file.write('\tatom { H, -3.7574552271, -4.0910677065, -0.5998183277 },\n')
file.write('\tatom { O, -0.0519252874, -2.8278717444, -2.8209994698 },\n')
file.write('\tatom { H, -0.5501145679, -2.5647894176, -3.5782798566 },\n')
file.write('\tatom { H, -0.2137034959, -2.1819475862, -2.1520538463 },\n')
file.write('\tatom { O, 3.9326972172, -0.1447916343, -1.2323267086 },\n')
file.write('\tatom { H, 4.6265233238, -0.0736418998, -0.5963852781 },\n')
file.write('\tatom { H, 3.1862113709, -0.5043840273, -0.7802972336 },\n')
file.write('\tatom { O, 3.7401140949, -0.5820784924, 3.2346232262 },\n')
file.write('\tatom { H, 3.4109667678, 0.0010845580, 2.5694455449 },\n')
file.write('\tatom { H, 4.4887698990, -1.0213950985, 2.8639564478 },\n')
file.write('\tatom { O, 1.6035214145, 7.5332271460, 0.5500656116 },\n')
file.write('\tatom { H, 0.7804662926, 7.4203968594, 0.1020387745 },\n')
file.write('\tatom { H, 1.6492220601, 8.4396260189, 0.8093550024 },\n')
file.write('\tatom { O, 4.4003916363, 3.8226598525, 0.0069188393 },\n')
file.write('\tatom { H, 5.2612591295, 3.6986968994, 0.3735505340 },\n')
file.write('\tatom { H, 3.7823167909, 3.6072717418, 0.6869702355 },\n')
file.write('\tatom { O, -3.5834193506, 3.9852493421, 2.1840448160 },\n')
file.write('\tatom { H, -3.7291009199, 3.0645550013, 2.3322921269 },\n')
file.write('\tatom { H, -3.7812015565, 4.4252777695, 2.9953004084 },\n')
file.write('\tatom { O, 3.4936095435, -3.1058523194, 0.9291805120 },\n')
file.write('\tatom { H, 3.3712294837, -3.7693386248, 0.2691142293 },\n')
file.write('\tatom { H, 2.9197575006, -2.3908543652, 0.7047919818 },\n')
file.write('\tatom { O, -3.7164915636, -0.3806762095, 4.1363149244 },\n')
file.write('\tatom { H, -4.3366135314, -0.6984719984, 4.7729726312 },\n')
file.write('\tatom { H, -3.5133359239, -1.1073635653, 3.5692772474 },\n')
file.write('\tatom { O, -3.8937397475, -4.4355904336, 1.1000366022 },\n')
file.write('\tatom { H, -3.6344581178, -3.7269260008, 1.6669988693 },\n')
file.write('\tatom { H, -3.4115082598, -5.1971956457, 1.3798378660 },\n')
file.write('\tatom { O, 5.7005647273, 0.3190826622, 0.6496630329 },\n')
file.write('\tatom { H, 5.5135869586, 0.9928593935, 1.2836525316 },\n')
file.write('\tatom { H, 5.8969239381, -0.4650189811, 1.1370090366 },\n')
file.write('\tatom { O, 0.3840099694, -4.1137118300, 0.7819986172 },\n')
file.write('\tatom { H, 1.0860882843, -4.1283857388, 0.1513250327 },\n')
file.write('\tatom { H, -0.1978841279, -4.8218905579, 0.5566974790 },\n')
file.write('\tatom { O, -3.8748255878, 1.3976416485, 2.1437263758 },\n')
file.write('\tatom { H, -3.0861576019, 1.0303095317, 1.7777367885 },\n')
file.write('\tatom { H, -4.0796022953, 0.8893530343, 2.9122233757 },\n')
file.write('\tatom { O, 5.6156770347, -2.0418229061, 1.9133661087 },\n')
file.write('\tatom { H, 6.2490773326, -2.6191226940, 2.3088499537 },\n')
file.write('\tatom { H, 4.9780399179, -2.5908451139, 1.4857307772 },\n')
file.write('\tatom { O, -0.6231302264, -6.3802894242, -0.3460579667 },\n')
file.write('\tatom { H, 0.1618029347, -6.8320433206, -0.6119090598 },\n')
file.write('\tatom { H, -1.0156082384, -6.0333732407, -1.1312258633 },\n')
file.write('\tatom { O, -3.3285808216, 4.9913954097, 4.6233326842 },\n')
file.write('\tatom { H, -2.4414687619, 4.6895031048, 4.7363445590 },\n')
file.write('\tatom { H, -3.3912517639, 5.8217440407, 5.0677108542 },\n')
file.write('\tatom { O, -1.4311111886, 1.2081374480, 4.1811948741 },\n')
file.write('\tatom { H, -1.4049614958, 1.0234543269, 3.2559452902 },\n')
file.write('\tatom { H, -2.1760032331, 0.7446962855, 4.5293853176 },\n')
file.write('\tatom { O, 3.2736711954, 6.0672313090, -1.0775623608 },\n')
file.write('\tatom { H, 3.0040606059, 6.6267462176, -0.3668362173 },\n')
file.write('\tatom { H, 3.7758538289, 5.3655843574, -0.6949625541 },\n')
file.write('\tatom { O, 2.3742924087, -3.5008642086, -3.2237852560 },\n')
file.write('\tatom { H, 2.9884781739, -2.8117693661, -3.4207690065 },\n')
file.write('\tatom { H, 1.5253394643, -3.0948867370, -3.1507123820 },\n')
file.write('\tatom { O, -0.9668503726, 3.7985207601, 4.3145144878 },\n')
file.write('\tatom { H, -1.1550049065, 2.8768954373, 4.3925086527 },\n')
file.write('\tatom { H, -0.8388155195, 3.9724514381, 3.3956926064 },\n')
file.write('\tatom { O, 2.4466289510, -4.6602701167, -0.8846294398 },\n')
file.write('\tatom { H, 2.3411313009, -5.5896999919, -1.0107595591 },\n')
file.write('\tatom { H, 2.4976467404, -4.2739068426, -1.7442799849 },\n')
file.write('\tatom { O, 2.5305065060, 3.3990198177, 1.7380162868 },\n')
file.write('\tatom { H, 2.8139586713, 3.1626575111, 2.6067315359 },\n')
file.write('\tatom { H, 1.9901367993, 4.1675619150, 1.8286919663 },\n')
file.write('\tatom { O, 2.0604372566, -4.3421927104, 2.9544881392 },\n')
file.write('\tatom { H, 1.3133508954, -4.4899706086, 2.3969021675 },\n')
file.write('\tatom { H, 2.7485484168, -4.0061983788, 2.4026826530 },\n')
file.write('\tatom { O, -1.4732015151, -2.5280373896, -4.9717166508 },\n')
file.write('\tatom { H, -1.1453908577, -1.8601111236, -5.5524860802 },\n')
file.write('\tatom { H, -2.3506759588, -2.2741409052, -4.7341169248 },\n')
file.write('\tatom { O, 1.1220256376, 5.5421086354, 2.2765980287 },\n')
file.write('\tatom { H, 0.2228618163, 5.3651637598, 2.0506038086 },\n')
file.write('\tatom { H, 1.3717300800, 6.3212767322, 1.8060417384 },\n')
file.write('\tatom { O, 3.1317154301, 2.5492758922, 4.1628199793 },\n')
file.write('\tatom { H, 2.7683876916, 3.2785666929, 4.6392717988 },\n')
file.write('\tatom { H, 2.6043521159, 1.7961017823, 4.3761158982 },\n')
file.write('\tatom { O, 1.1372281976, 0.6885852150, 4.3813162371 },\n')
file.write('\tatom { H, 1.2494192006, -0.2236936306, 4.5958831050 },\n')
file.write('\tatom { H, 0.2369780196, 0.9023734278, 4.5676564399 },\n')
file.write('\tatom { O, -0.1016378554, -0.6926890421, -6.2527824769 },\n')
file.write('\tatom { H, 0.1702908401, -0.6992191681, -7.1566024789 },\n')
file.write('\tatom { H, 0.6406347495, -0.4004969649, -5.7482692223 },\n')
file.write('\tatom { O, -2.3285161725, -6.5511603803, 1.6473226256 },\n')
file.write('\tatom { H, -1.6896078368, -6.6733975054, 0.9634127793 },\n')
file.write('\tatom { H, -2.6495824264, -7.4090128354, 1.8751033974 },\n')
file.write('\tatom { O, 1.4527752549, -5.8880275983, -4.0675040894 },\n')
file.write('\tatom { H, 0.6343187338, -5.6701343479, -4.4840733017 },\n')
file.write('\tatom { H, 1.9234004643, -5.0778176061, -3.9537054302 },\n')
file.write('\tatom { O, 1.5983976737, -1.9057788553, 3.9163341508 },\n')
file.write('\tatom { H, 2.4379449939, -1.5554994781, 3.6646535812 },\n')
file.write('\tatom { H, 1.6912564086, -2.8448470595, 3.9364989058 },\n')
file.write('\tatom { O, 1.6232603136, 4.6427291861, 4.8129279939 },\n')
file.write('\tatom { H, 0.7443920087, 4.3339913767, 4.9650789773 },\n')
file.write('\tatom { H, 1.5954824854, 5.1634302425, 4.0261769175 },\n')
file.write('\tatom { O, 1.6169447310, -7.0971132635, -1.7472363134 },\n')
file.write('\tatom { H, 1.4837903359, -6.7909864629, -2.6300923283 },\n')
file.write('\tatom { H, 2.0775965422, -7.9184244668, -1.8114723640 },\n')
file.write('\tatom { O, 3.8562197281, -1.2815791583, -3.5491264929 },\n')
file.write('\tatom { H, 4.0398199763, -0.8915958844, -2.7094346155 },\n')
file.write('\tatom { H, 4.6633269012, -1.2650987669, -4.0381917871 },\n')
file.write('\tatom { O, -5.6995689859, 1.9293266748, 0.2651776192 },\n')
file.write('\tatom { H, -5.5321355459, 2.8335349273, 0.0524526011 },\n')
file.write('\tatom { H, -5.1861533360, 1.7315023768, 1.0320859394 },\n')
file.write('\tatom { O, -0.8806597750, 9.1048813981, 0.6911891481 },\n')
file.write('\tatom { H, -1.5189059755, 9.4074827100, 1.3172503972 },\n')
file.write('\tatom { H, -1.0704878524, 8.1947118324, 0.5286003513 },\n')
def print_geom(file,cluster):
    file.write('molecule\n{\n')
    file.write('\tcoords cartesian,\n')
    if cluster == "rubrene":
        print_rubrene(file)
    if cluster == "w1":
        print_w1(file,0.0)
    elif cluster == "w2":
        print_w2(file)
    elif cluster == "w3":
        print_w3(file)
    elif cluster == "w4":
        print_w4(file)
    elif cluster == "w5":
        print_w5(file)
    elif cluster == "w6cage":
        print_w6cage(file)
    elif cluster == "w6book":
        print_w6book(file)
    elif cluster == "w6prism":
        print_w6prism(file)
    elif cluster == "w6cyclic":
        print_w6cyclic(file)
    elif cluster == "w7":
        print_w7(file)
    elif cluster == "w8s4":
        print_w8s4(file)
    elif cluster == "w8d2d":
        print_w8d2d(file)
    elif cluster == "w9":
        print_w9(file)
    elif cluster == "w10":
        print_w10(file)
    elif cluster == "w11i434":
        print_w11i434(file)
    elif cluster == "w11i4412":
        print_w11i4412(file)
    elif cluster == "w11i443":
        print_w11i443(file)
    elif cluster == "w11i515":
        print_w11i515(file)
    elif cluster == "w11i551":
        print_w11i551(file)
    elif cluster == "w12":
        print_w12(file)
    elif cluster == "w13":
        print_w13(file)
    elif cluster == "w14":
        print_w14(file)
    elif cluster == "w15":
        print_w15(file)
    elif cluster == "w16":
        print_w16(file)
    elif cluster == "w17int":
        print_w17int(file)
    elif cluster == "w17surf":
        print_w17surf(file)
    elif cluster == "w18":
        print_w18(file)
    elif cluster == "w19":
        print_w19(file)
    elif cluster == "w20dode":
        print_w20dode(file)
    elif cluster == "w20fused":
        print_w20fused(file)
    elif cluster == "w20face":
        print_w20face(file)
    elif cluster == "w20edge":
        print_w20edge(file)
    elif cluster == "w21":
        print_w21(file)
    elif cluster == "w64":
        print_w64(file)
    elif cluster[0] == "w":
        n = int(cluster[1:])
        for i in range(0,n):
            print_w1(file,1.0*i)

def print_basis(file,basis):
    file.write('\tbasis\n')
    file.write('\t\tbasis_set '+basis+'\n},\n')

def print_scf(file):
    file.write('1eints,\n')
    file.write('2eints,\n')
    file.write('localaoscf\n{\n')
    file.write('\tconvergence 1e-9,\n')
    file.write('\tfrozen_core on,\n')
    file.write('\tmax_iterations 100,\n')
    file.write('\tdiis\n\t{\n')
    file.write('\t\torder 6,\n')
    file.write('\t\tstart 8\n\t}\n},\n')
    file.write('aomoints,\n')

def print_cc(file,method):
    file.write(method+'\n{\n')
    file.write('\tdiis order 5,\n')
    file.write('\tconvergence 1e-9,\n')
    file.write('\tmax_iterations 5\n}\n')
cluster = str(sys.argv[1])
basis = str(sys.argv[2])
method = str(sys.argv[3])
name = cluster+'_'+basis+'_'+method
filename = name+'.aq'
file = open(filename,'w')
print(name)
print_geom(file,cluster)
print_basis(file,basis)
print_scf(file)
print_cc(file,method)
file.close()
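# The writer functions above can be unit-tested against an in-memory buffer
# instead of a real file. A minimal sketch (print_cc is repeated verbatim so
# the snippet is self-contained; 'ccsd' is just a sample method string):

```python
import io

def print_cc(file, method):
    # Same emitter as defined above, repeated so this sketch runs standalone.
    file.write(method + '\n{\n')
    file.write('\tdiis order 5,\n')
    file.write('\tconvergence 1e-9,\n')
    file.write('\tmax_iterations 5\n}\n')

# Capture the emitted input-deck fragment in memory.
buf = io.StringIO()
print_cc(buf, 'ccsd')
print(buf.getvalue())
```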
# src/whylogs/proto/__init__.py (bernease/whylogs-python), Apache-2.0
"""
Auto-generated protobuf class definitions.
Protobuf allows us to serialize/deserialize classes across languages
"""
from .messages_pb2 import *
from .summaries_pb2 import *
from .constraints_pb2 import *
# soccminer/environment.py (M3SOulu/soccminer), MIT
import sys
class Platform:
    @staticmethod
    def fetch_platform():
        return sys.platform

    @staticmethod
    def is_windows_platform():
        return Platform.fetch_platform() == "win32"

    @staticmethod
    def is_unix_platform():
        if Platform.fetch_platform() == "aix" or Platform.fetch_platform() == "linux":
            return True
        else:
            return False
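# A quick sketch of how the checks above classify sys.platform values.
# classify() is illustrative only (it mirrors the string comparisons in
# Platform applied to sample strings) and is not part of the soccminer module:

```python
def classify(platform_string):
    # Same comparisons as Platform.is_windows_platform()/is_unix_platform(),
    # but over an arbitrary value instead of the live sys.platform.
    if platform_string == "win32":
        return "windows"
    if platform_string in ("aix", "linux"):
        return "unix"
    return "other"

print(classify("linux"), classify("win32"), classify("darwin"))  # -> unix windows other
```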
# st/clitests/rollback_spec.py (RakeshVaghasiya/cortx-s3server), Apache-2.0
#
# Copyright (c) 2020 Seagate Technology LLC and/or its Affiliates
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# For any questions about this software or licensing,
# please email opensource@seagate.com or cortx-questions@seagate.com.
#
#!/usr/bin/python3.6
from framework import Config
from framework import S3PyCliTest
from s3cmd import S3cmdTest
from s3fi import S3fiTest
from jclient import JClientTest
from s3client_config import S3ClientConfig
from s3kvstool import S3kvTest
import s3kvs
import yaml
# Helps debugging
# Config.log_enabled = True
# Config.dummy_run = True
# Config.client_execution_timeout = 300 * 1000
# Config.request_timeout = 300 * 1000
# Config.socket_timeout = 300 * 1000
# Enable retry flag to limit retries on failure
Config.s3cmd_max_retries = 2
# Set time_readable_format to False if you want to display the time in milliseconds.
# Config.time_readable_format = False
# TODO
# DNS-compliant bucket names should not contain underscores or other special characters.
# The allowed characters are [a-zA-Z0-9.-]*
#
# Add validations to S3 server and write system tests for the same.
# ***MAIN ENTRY POINT
# Run before all to setup the test environment.
print("Configuring LDAP")
S3PyCliTest('Before_all').before_all()
# Set pathstyle =false to run jclient for partial multipart upload
S3ClientConfig.pathstyle = False
S3ClientConfig.access_key_id = 'AKIAJPINPFRBTPAYOGNA'
S3ClientConfig.secret_key = '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'
# Path style tests.
Config.config_file = "pathstyle.s3cfg"
# ************ Create bucket Fail ************
# Note: We clean kvs entries using cqlsh(cassandra-kvs) for this test to work
S3fiTest('s3cmd enable FI create index fail').enable_fi("enable", "always", "motr_idx_create_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot create bucket').create_bucket("seagatebucket").execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_idx_create_fail").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI PUT KV').enable_fi("enable", "always", "motr_kv_put_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot create bucket').create_bucket("seagatebucket").execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_put_fail").execute_test().command_is_successful()
# ************ Create bucket ************
S3cmdTest('s3cmd can create bucket').create_bucket("seagatebucket").execute_test().command_is_successful()
# ************ List buckets ************
S3cmdTest('s3cmd can list buckets').list_buckets().execute_test().command_is_successful().command_response_should_have('s3://seagatebucket')
# ************ Multi delete empty bucket test *********
JClientTest('Jclient multiple delete should succeed when objects not present').delete_multiple_objects("seagatebucket", ["8kfile", "700Kfile", "18MBfile"]).execute_test().command_is_successful()
# ************ 18MB FILE Multipart Rollback TEST ***********
# function to cleanup multipart upload
def clean_18mb_multipart():
    result = S3cmdTest('s3cmd can list multipart uploads in progress').list_multipart_uploads("seagatebucket").execute_test()
    if '18MBfile' in result.status.stdout:
        upload_id = result.status.stdout.split('\n')[2].split('\t')[2]
        S3cmdTest('S3cmd can abort multipart upload').abort_multipart("seagatebucket", "18MBfile", upload_id).execute_test().command_is_successful()
    else:
        raise AssertionError("Failed to find multipart info.")
    return
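# The hard-coded split('\n')[2].split('\t')[2] above assumes the 18MBfile
# entry always sits on the third line of the s3cmd listing. A
# position-independent parse could look like this (extract_upload_id and the
# sample listing below are hypothetical, not part of the test framework):

```python
def extract_upload_id(listing, object_name):
    """Return the upload-id field for object_name from a tab-separated
    multipart listing, or None when the object is not listed."""
    for line in listing.splitlines():
        fields = line.split('\t')
        if len(fields) >= 3 and fields[1].endswith('/' + object_name):
            return fields[2]
    return None

# Hypothetical listing in the tab-separated shape the code above slices.
sample = 'Initiated\tPath\tId\n2020-01-01T00:00\ts3://seagatebucket/18MBfile\tVXBJRA\n'
print(extract_upload_id(sample, '18MBfile'))  # -> VXBJRA
```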
S3fiTest('s3cmd enable FI create index fail').enable_fi("enable", "always", "motr_idx_create_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd can upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd should not have objects after rollback').list_objects('seagatebucket').execute_test().command_is_successful().command_response_should_not_have('18MBfile')
S3fiTest('s3cmd can disable Fault injection').disable_fi("motr_idx_create_fail").execute_test().command_is_successful()
# Set second rollback checkpoint in multipart upload
S3fiTest('s3cmd enable FI create index fail').enable_fi_enablen("enable", "motr_idx_create_fail", "1").execute_test().command_is_successful()
S3cmdTest('s3cmd can upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd should not have objects after rollback').list_objects('seagatebucket').execute_test().command_is_successful().command_response_should_not_have('18MBfile')
S3fiTest('s3cmd can disable Fault injection').disable_fi("motr_idx_create_fail").execute_test().command_is_successful()
is_object_leak_track_enabled = yaml.safe_load(open("/opt/seagate/cortx/s3/conf/s3config.yaml"))["S3_SERVER_CONFIG"]["S3_SERVER_ENABLE_OBJECT_LEAK_TRACKING"]
fi_off="2"
if is_object_leak_track_enabled:
    fi_off="4"
S3fiTest('s3cmd enable FI PUT KV').enable_fi_offnonm("enable", "motr_kv_put_fail", fi_off, "99").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_put_fail").execute_test().command_is_successful()
clean_18mb_multipart()
S3fiTest('s3cmd enable FI GET KV').enable_fi_offnonm("enable", "motr_kv_get_fail", "3", "99").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
clean_18mb_multipart()
S3fiTest('s3cmd enable FI GET KV').enable_fi_offnonm("enable", "motr_kv_get_fail", "5", "99").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
clean_18mb_multipart()
S3fiTest('s3cmd enable FI GET KV').enable_fi_offnonm("enable", "motr_kv_get_fail", "9", "99").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
clean_18mb_multipart()
S3fiTest('s3cmd enable FI GET KV').enable_fi_offnonm("enable", "motr_kv_get_fail", "19", "99").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
clean_18mb_multipart()
S3fiTest('s3cmd enable FI fail_save_part_mdata').enable_fi("enable", "always", "fail_save_part_mdata").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3cmdTest('s3cmd should not have objects after rollback').list_objects('seagatebucket').execute_test().command_is_successful().command_response_should_not_have('18MBfile')
S3fiTest('s3cmd can disable Fault injection').disable_fi("fail_save_part_mdata").execute_test().command_is_successful()
# ************ auth FI ***************
S3fiTest('s3cmd enable FI auth').enable_fi("enable", "always", "fake_authentication_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 3k file').upload_test("seagatebucket", "3kfile", 3000).execute_test(negative_case=True).command_should_fail().command_error_should_have("InvalidAccessKeyId")
JClientTest('JClient cannot upload 3k file').put_object("seagatebucket", "3kfile", 3000, chunked=True).execute_test(negative_case=True).command_should_fail().command_error_should_have("InvalidAccessKeyId")
S3fiTest('s3cmd disable Fault injection').disable_fi("fake_authentication_fail").execute_test().command_is_successful()
#S3cmdTest('Stop s3authserver service').stop_s3authserver_test().execute_test().command_is_successful().command_is_successful()
#S3cmdTest('s3cmd cannot upload 3k file').upload_test("seagatebucket", "3kfile", 3000).execute_test(negative_case=True).command_should_fail().command_error_should_have("ServiceUnavailable")
#S3cmdTest('Start s3authserver service').start_s3authserver_test().execute_test().command_is_successful().command_is_successful()
# ************ OBJ open FI ***************
S3fiTest('s3cmd enable FI Obj open').enable_fi("enable", "always", "motr_obj_open_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot download 18MB nonexistent file').download_test("seagatebucket", "18MBfile").execute_test(negative_case=True).command_should_fail().command_error_should_have("Not Found")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_open_fail").execute_test().command_is_successful()
result = S3cmdTest('s3cmd can list multipart uploads in progress').list_multipart_uploads("seagatebucket").execute_test()
result.command_response_should_have('18MBfile')
upload_id = result.status.stdout.split('\n')[2].split('\t')[2]
S3cmdTest('S3cmd can abort multipart upload').abort_multipart("seagatebucket", "18MBfile", upload_id). execute_test().command_is_successful()
# ************ OBJ open FI ***************
S3cmdTest('s3cmd can upload 3k file').upload_test("seagatebucket", "3kfile", 3000).execute_test().command_is_successful()
S3fiTest('s3cmd enable FI Obj open').enable_fi("enable", "always", "motr_obj_open_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot download 3k file').download_test("seagatebucket", "3kfile").execute_test(negative_case=True).command_error_should_have("Internal Server Error")
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_open_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd can delete 3k file').delete_test("seagatebucket", "3kfile").execute_test().command_is_successful()
# ************ OBJ open FI ***************
S3cmdTest('s3cmd can upload file-overwrite file').upload_test("seagatebucket", "file-overwrite", 3000).execute_test().command_is_successful()
S3fiTest('s3cmd enable FI Obj open').enable_fi("enable", "always", "motr_obj_open_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload file-overwrite file').upload_test("seagatebucket", "file-overwrite", 18000000).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_open_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd list old file-overwrite object').list_objects('seagatebucket').execute_test().command_is_successful().command_response_should_have('3000')
S3cmdTest('s3cmd can delete file-overwrite file').delete_test("seagatebucket", "file-overwrite").execute_test().command_is_successful()
# ************ OBJ create FI ***************
S3fiTest('s3cmd enable FI Obj create').enable_fi("enable", "always", "motr_obj_create_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 3k file').upload_test("seagatebucket", "3kfile", 3000).execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_create_fail").execute_test().command_is_successful()
#************* PUT KV FI ***************
S3fiTest('s3cmd enable FI PUT KV').enable_fi("enable", "always", "motr_kv_put_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot upload 3k file').upload_test("seagatebucket", "3kfile", 3000).execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_put_fail").execute_test().command_is_successful()
#************** upload objects *************
S3cmdTest('s3cmd upload 3k file').upload_test("seagatebucket", "3kfile", 3000).execute_test().command_is_successful()
S3cmdTest('s3cmd upload 18MB file').upload_test("seagatebucket", "18MBfile", 18000000).execute_test().command_is_successful()
# **************** OBJ DELETE FI ****************
S3fiTest('s3cmd enable FI OBJ Delete').enable_fi("enable", "always", "motr_kv_delete_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot delete 3k file').delete_test("seagatebucket", "3kfile").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot delete 18MB file').delete_test("seagatebucket", "18MBfile").execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_delete_fail").execute_test().command_is_successful()
#**************** GET KV FI ****************
S3fiTest('s3cmd enable FI GET KV').enable_fi("enable", "always", "motr_kv_get_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot download 3k file').download_test("seagatebucket", "3kfile").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot download 18MB file').download_test("seagatebucket", "18MBfile").execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
# **************** OBJ DELETE FI ****************
S3fiTest('s3cmd enable FI OBJ Delete').enable_fi("enable", "always", "motr_obj_delete_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd can delete 3k file').delete_test("seagatebucket", "3kfile").execute_test().command_is_successful()
S3cmdTest('s3cmd can delete 18MB file').delete_test("seagatebucket", "18MBfile").execute_test().command_is_successful()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_delete_fail").execute_test().command_is_successful()
# ************ Multiple Delete bucket TEST ************
file_name = "3kfile"
for num in range(0, 2):
    new_file_name = '%s%d' % (file_name, num)
    S3cmdTest('s3cmd can upload 3k file').upload_test("seagatebucket", new_file_name, 3000).execute_test().command_is_successful()
S3fiTest('s3cmd enable fail_fetch_bucket_info').enable_fi("enable", "always", "fail_fetch_bucket_info").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot delete multiple objects').multi_delete_test("seagatebucket").execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("fail_fetch_bucket_info").execute_test().command_is_successful()
S3fiTest('s3cmd enable fail_fetch_objects_info').enable_fi("enable", "always", "fail_fetch_objects_info").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot delete multiple objects').multi_delete_test("seagatebucket").execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("fail_fetch_objects_info").execute_test().command_is_successful()
S3fiTest('s3cmd enable fail_delete_objects_metadata').enable_fi("enable", "always", "fail_delete_objects_metadata").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot delete multiple objects').multi_delete_test("seagatebucket").execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('s3cmd disable Fault injection').disable_fi("fail_delete_objects_metadata").execute_test().command_is_successful()
S3cmdTest('s3cmd can delete multiple objects').multi_delete_test("seagatebucket").execute_test().command_is_successful().command_response_should_have('delete: \'s3://seagatebucket/3kfile0\'').command_response_should_have('delete: \'s3://seagatebucket/3kfile1\'')
# This test will leave stale objects in motr.
S3fiTest('s3cmd enable FI OBJ Delete').enable_fi("enable", "always", "motr_obj_delete_fail").execute_test().command_is_successful()
file_name = "3kfile"
for num in range(0, 2):
    new_file_name = '%s%d' % (file_name, num)
    S3cmdTest('s3cmd can upload 3k file').upload_test("seagatebucket", new_file_name, 3000).execute_test().command_is_successful()
S3cmdTest('s3cmd can delete multiple objects').multi_delete_test("seagatebucket").execute_test().command_is_successful().command_response_should_have('delete: \'s3://seagatebucket/3kfile0\'').command_response_should_have('delete: \'s3://seagatebucket/3kfile1\'')
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_delete_fail").execute_test().command_is_successful()
# ************ Cleanup bucket + Object ************
S3cmdTest('s3cmd can delete bucket').delete_bucket("seagatebucket").execute_test().command_is_successful()
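The fluent chains used throughout this script (`enable_fi(...).execute_test().command_is_successful()`) rely on every step returning the test object itself. A minimal sketch of that builder style, with class and method names of our own invention (the real `S3cmdTest`/`S3fiTest` framework is not shown here):

```python
import subprocess

class MiniCliTest:
    """Toy stand-in for the S3cmdTest/S3fiTest fluent style used above:
    every method returns self so setup and assertions can be chained."""

    def __init__(self, description):
        self.description = description
        self.cmd = None
        self.status = None

    def with_cli_cmd(self, *args):
        # Record the command; execution is deferred to execute_test().
        self.cmd = list(args)
        return self

    def execute_test(self, negative_case=False):
        # negative_case mirrors the flag above: the caller expects the
        # command to fail and will assert on that with command_should_fail().
        self.negative_case = negative_case
        self.status = subprocess.run(self.cmd, capture_output=True, text=True)
        return self

    def command_is_successful(self):
        assert self.status.returncode == 0, self.description
        return self

    def command_should_fail(self):
        assert self.status.returncode != 0, self.description
        return self

    def command_response_should_have(self, text):
        assert text in self.status.stdout, self.description
        return self
```

Usage mirrors the chains above, e.g. `MiniCliTest('echo works').with_cli_cmd('echo', 'hello').execute_test().command_is_successful().command_response_should_have('hello')`.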
# ******************* multipart and partial parts *********************
S3cmdTest('s3cmd can create bucket').create_bucket("seagatebucket").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI Obj create').enable_fi("enable", "always", "motr_obj_create_fail").execute_test().command_is_successful()
JClientTest('Jclient cannot upload partial parts.').partial_multipart_upload("seagatebucket", "18MBfile", 18000000, 1, 2).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_create_fail").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI Obj open').enable_fi("enable", "always", "motr_obj_open_fail").execute_test().command_is_successful()
JClientTest('Jclient cannot upload partial parts.').partial_multipart_upload("seagatebucket", "18MBfile", 18000000, 1, 2).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_obj_open_fail").execute_test().command_is_successful()
result = JClientTest('Jclient can list all multipart uploads.').list_multipart("seagatebucket").execute_test()
result.command_response_should_have('18MBfile')
upload_id = result.status.stdout.split("id - ")[1]
JClientTest('Jclient can abort multipart upload').abort_multipart("seagatebucket", "18MBfile", upload_id)\
.execute_test().command_is_successful()
S3fiTest('s3cmd enable FI PUT KV').enable_fi("enable", "always", "motr_kv_put_fail").execute_test().command_is_successful()
JClientTest('Jclient cannot upload partial parts.').partial_multipart_upload("seagatebucket", "18MBfile", 18000000, 1, 2).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_put_fail").execute_test().command_is_successful()
JClientTest('Jclient can upload partial parts to test abort and list multipart.').partial_multipart_upload("seagatebucket", "18MBfile", 18000000, 1, 2).execute_test().command_is_successful()
S3fiTest('s3cmd enable FI GET KV').enable_fi("enable", "always", "motr_kv_get_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot list multipart uploads in progress').list_multipart_uploads("seagatebucket").execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
result = S3cmdTest('s3cmd can list multipart uploads in progress').list_multipart_uploads("seagatebucket").execute_test()
result.command_response_should_have('18MBfile')
upload_id = result.status.stdout.split('\n')[2].split('\t')[2]
S3fiTest('s3cmd enable FI GET KV').enable_fi("enable", "always", "motr_kv_get_fail").execute_test().command_is_successful()
result = S3cmdTest('S3cmd cannot list parts of multipart upload.').list_parts("seagatebucket", "18MBfile", upload_id).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI GET KV').enable_fi_offnonm("enable", "motr_kv_get_fail", "4", "99").execute_test().command_is_successful()
result = S3cmdTest('S3cmd cannot list parts of multipart upload.').list_parts("seagatebucket", "18MBfile", upload_id).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI GET KV').enable_fi("enable", "always", "motr_kv_get_fail").execute_test().command_is_successful()
S3cmdTest('S3cmd cannot abort multipart upload').abort_multipart("seagatebucket", "18MBfile", upload_id).execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI fail_remove_part_mindex').enable_fi("enable", "always", "fail_remove_part_mindex").execute_test().command_is_successful()
S3cmdTest('S3cmd can abort multipart upload').abort_multipart("seagatebucket", "18MBfile", upload_id).execute_test().command_is_successful()
S3fiTest('s3cmd can disable Fault injection').disable_fi("fail_remove_part_mindex").execute_test().command_is_successful()
S3cmdTest('s3cmd can test the multipart was aborted.').list_multipart_uploads('seagatebucket').execute_test().command_is_successful().command_response_should_not_have('18MBfile')
S3cmdTest('s3cmd can delete bucket').delete_bucket("seagatebucket").execute_test().command_is_successful()
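The upload-id extraction above (`result.status.stdout.split('\n')[2].split('\t')[2]`) is purely positional: it assumes the third line of the `s3cmd multipart` listing is the first upload row and that the upload id is its third tab-separated field. A sketch of that assumption against a hypothetical sample (the real header and column layout may differ):

```python
def extract_upload_id(stdout):
    """Positional parse matching the test above: line 3 of the listing
    is the first upload row, tab-separated field 3 is the upload id."""
    return stdout.split('\n')[2].split('\t')[2]

# Hypothetical text shaped like `s3cmd multipart s3://bucket` output.
sample_listing = (
    "s3://seagatebucket/\n"
    "Initiated\tPath\tId\n"
    "2020-01-01T00:00\ts3://seagatebucket/18MBfile\tUPLOADID123\n"
)
```

Any change to the header line count or column order silently breaks this parse, which is why the negative-FI tests above re-list uploads before using the id.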
# ******************************************************************
# *************** Otherwise-unused FI APIs *************
# NOTE: Remove these tests if these FI APIs become used by any test above in the future.
S3fiTest('s3cmd enable FI random test').enable_fi_random("enable", "unused_fail", "10").execute_test().command_is_successful()
S3fiTest('s3cmd disable Fault injection').disable_fi("unused_fail").execute_test().command_is_successful()
S3fiTest('s3cmd enable FI once test').enable_fi("enable", "once", "unused_fail").execute_test().command_is_successful()
S3fiTest('s3cmd disable Fault injection').disable_fi("unused_fail").execute_test().command_is_successful()
# ************ Negative ACL/Policy TESTS ************
S3cmdTest('s3cmd can create bucket').create_bucket("seagatebucket").execute_test().command_is_successful()
S3cmdTest('s3cmd can upload 3k file with default acl').upload_test("seagatebucket", "3kfile", 3000).execute_test().command_is_successful()
S3fiTest('s3cmd enable FI PUT KV').enable_fi("enable", "always", "motr_kv_put_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd cannot set acl on bucket').setacl_bucket("seagatebucket","read:123").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot set acl on object').setacl_object("seagatebucket","3kfile", "read:123").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot revoke acl on bucket').revoke_acl_bucket("seagatebucket","read:123").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot revoke acl on object').revoke_acl_object("seagatebucket","3kfile","read:123").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot set policy on bucket').setpolicy_bucket("seagatebucket","policy.txt").execute_test(negative_case=True).command_should_fail()
S3cmdTest('s3cmd cannot delete policy on bucket').delpolicy_bucket("seagatebucket").execute_test(negative_case=True).command_should_fail()
S3fiTest('s3cmd disable Fault injection').disable_fi("motr_kv_put_fail").execute_test().command_is_successful()
S3cmdTest('s3cmd can delete 3kfile after setting acl').delete_test("seagatebucket", "3kfile").execute_test().command_is_successful()
S3cmdTest('s3cmd can delete bucket after setting policy/acl').delete_bucket("seagatebucket").execute_test().command_is_successful()
# ************************************************
# Path style tests.
pathstyle_values = [True, False]
for i, val in enumerate(pathstyle_values):
S3ClientConfig.pathstyle = val
print("\nPath style = " + str(val) + "\n")
# ************ Create bucket ************
JClientTest('Jclient can create bucket').create_bucket("seagatebucket").execute_test().command_is_successful()
# ************ List buckets ************
JClientTest('Jclient can list buckets').list_buckets().execute_test().command_is_successful().command_response_should_have('seagatebucket')
# ************ OBJ Create FI: CHUNK UPLOAD ************
S3fiTest('S3Fi enable FI Obj create').enable_fi("enable", "always", "motr_obj_create_fail")\
.execute_test().command_is_successful()
JClientTest('JClient cannot upload 3k file').put_object("seagatebucket", "3kfile", 3000, chunked=True)\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
JClientTest('JClient cannot upload 18MB file').put_object("seagatebucket", "18MBfile", 18000000, chunked=True)\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_obj_create_fail").execute_test().command_is_successful()
# ************ OBJ Write FI: CHUNK UPLOAD ************
S3fiTest('S3Fi enable FI Obj write').enable_fi("enable", "always", "motr_obj_write_fail")\
.execute_test().command_is_successful()
JClientTest('JClient cannot upload 3k file').put_object("seagatebucket", "3kfile", 3000, chunked=True)\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
JClientTest('JClient cannot upload 18MB file').put_object("seagatebucket", "18MBfile", 18000000, chunked=True)\
.execute_test(negative_case=True).command_should_fail()
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_obj_write_fail").execute_test().command_is_successful()
# ************ OBJ Create FI: Multipart ************
S3fiTest('S3Fi enable FI Obj create').enable_fi("enable", "always", "motr_obj_create_fail")\
.execute_test().command_is_successful()
JClientTest('JClient cannot upload 18MB file (Multipart)').put_object_multipart("seagatebucket", "18MBfile", 18000000, 15)\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_obj_create_fail").execute_test().command_is_successful()
# ************ OBJ Open FI ************
S3fiTest('S3Fi enable FI Obj open').enable_fi("enable", "always", "motr_obj_open_fail")\
.execute_test().command_is_successful()
JClientTest('Jclient cannot download non existent 3kfile file').get_object("seagatebucket", "3kfile").execute_test(negative_case=True).command_should_fail().command_error_should_have("NoSuchKey")
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_obj_open_fail").execute_test().command_is_successful()
# ************ OBJ Write FI: Multipart ************
S3fiTest('S3Fi enable FI Obj write').enable_fi("enable", "always", "motr_obj_write_fail")\
.execute_test().command_is_successful()
JClientTest('JClient cannot upload 18MB file (Multipart)').put_object_multipart("seagatebucket", "18MBfile", 18000000, 15)\
.execute_test(negative_case=True).command_should_fail() #.command_error_should_have("Multipart upload failed")
JClientTest('JClient cannot upload 18MB file (Multipart)').put_object_multipart("seagatebucket", "18MBfile", 18000000, 15, chunked=True)\
.execute_test(negative_case=True).command_should_fail() #.command_error_should_have("Multipart upload failed")
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_obj_write_fail").execute_test().command_is_successful()
# ************ Partial Multipart Upload ************
JClientTest('JClient can upload parts of 18MB file').partial_multipart_upload("seagatebucket", "18MBfile", 18000000, 5, 2)\
.execute_test().command_is_successful()
# ************ OBJ LIST FI: Partial Multipart ************
result = JClientTest('Jclient can list all multipart uploads.').list_multipart("seagatebucket").execute_test()
result.command_response_should_have('18MBfile')
upload_id = result.status.stdout.split("id - ")[1]
print(upload_id)
    S3fiTest('S3Fi enable FI get KV').enable_fi("enable", "always", "motr_kv_get_fail").execute_test().command_is_successful()
JClientTest('Jclient cannot list all multipart uploads.').list_multipart("seagatebucket")\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
JClientTest('Jclient cannot list parts of multipart upload.').list_parts("seagatebucket", "18MBfile", upload_id)\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_kv_get_fail").execute_test().command_is_successful()
# ************ OBJ DELETE FI: Multipart ************
S3fiTest('S3Fi enable FI delete').enable_fi("enable", "always", "motr_kv_delete_fail")\
.execute_test().command_is_successful()
JClientTest('Jclient cannot abort multipart upload.').abort_multipart("seagatebucket", "18MBfile", upload_id)\
.execute_test(negative_case=True).command_should_fail().command_error_should_have("InternalError")
S3fiTest('S3Fi disable Fault injection').disable_fi("motr_kv_delete_fail").execute_test().command_is_successful()
# ************ Abort multipart upload ************
JClientTest('Jclient can abort multipart upload.').abort_multipart("seagatebucket", "18MBfile", upload_id)\
.execute_test().command_is_successful()
# ************ Delete bucket TEST ************
JClientTest('Jclient can delete bucket').delete_bucket("seagatebucket").execute_test().command_is_successful()
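The `S3ClientConfig.pathstyle` toggle driving the loop above selects between the two S3 addressing schemes. A minimal sketch of the URL difference (the helper name and endpoint host are illustrative, not taken from the framework):

```python
def object_url(bucket, key, endpoint="s3.seagate.com", pathstyle=True):
    """Path-style addressing puts the bucket in the URL path;
    virtual-hosted style puts it in the host name."""
    if pathstyle:
        return "http://%s/%s/%s" % (endpoint, bucket, key)
    return "http://%s.%s/%s" % (bucket, endpoint, key)
```

Running the whole suite once per scheme checks that the server resolves the bucket correctly from either the path or the Host header.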

# ---- templates/django/__APPNAME__/settings/__init__.py (repo: ba1dr/tplgenerator, MIT) ----
from __future__ import absolute_import
from .settings import *
# from .celery_settings import app as celery_app

# ---- slinga/__init__.py (repo: hemebond/slinga, MIT) ----
from .slinga import app

# ---- mw2fcitx/utils/normalize.py (repo: outloudvi/mw2fcitx, Unlicense) ----
def normalize(word):
return word.strip()

# ---- examples/test_pytest.py (repo: danibaena/expects, Apache-2.0) ----
# -*- coding: utf-8 -*-
from expects import *
def test_failing():
expect("foo").to(equal("bar"))

# ---- usr/callbacks/logical/logical.py (repo: uwitec/LEHome, Apache-2.0) ----
#!/usr/bin/env python
# encoding: utf-8
from util.log import *
from lib.model import Callback
class logical_callback(Callback.Callback):
def callback(self, aValue, bValue):
DEBUG("logical callback invoke.")
return aValue and bValue
class and_callback(Callback.Callback):
def callback(self, aValue, bValue):
# import pdb
# pdb.set_trace()
return aValue and bValue
class or_callback(Callback.Callback):
def callback(self, aValue, bValue):
return aValue or bValue

# ---- packages/watchmen-meta/src/watchmen_meta/console/__init__.py (repo: Indexical-Metrics-Measure-Advisory/watchmen, MIT) ----
from .connected_space_graphic_service import ConnectedSpaceGraphicService
from .connected_space_service import ConnectedSpaceService
from .dashboard_service import DashboardService
from .report_service import ReportService
from .subject_service import SubjectService

# ---- utils/processors/utils_blue.py (repo: sy-wada/blue_benchmark_with_transformers, Apache-2.0) ----
# coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
GLUE processors and helpers.
Import from https://github.com/huggingface/transformers/blob/master/src/transformers/data/processors/glue.py
and Modify to fit BLUE datasets.
__version__ = "2.5.1"
BlueBERT processors are imported from
https://github.com/ncbi-nlp/bluebert/blob/master/bluebert/run_bluebert.py
"""
import logging
import os
import csv
from .tokenization import convert_to_unicode
from .file_utils import is_tf_available
from .utils import InputExample, InputFeatures  # DataProcessor is redefined locally below
if is_tf_available():
import tensorflow as tf
logger = logging.getLogger(__name__)
def blue_convert_examples_to_features(
examples,
tokenizer,
max_length=512,
task=None,
label_list=None,
output_mode=None,
pad_on_left=False,
pad_token=0,
pad_token_segment_id=0,
mask_padding_with_zero=True,
):
"""
Loads a data file into a list of ``InputFeatures``
Args:
examples: List of ``InputExamples`` or ``tf.data.Dataset`` containing the examples.
tokenizer: Instance of a tokenizer that will tokenize the examples
max_length: Maximum example length
task: GLUE task
label_list: List of labels. Can be obtained from the processor using the ``processor.get_labels()`` method
output_mode: String indicating the output mode. Either ``regression`` or ``classification``
pad_on_left: If set to ``True``, the examples will be padded on the left rather than on the right (default)
pad_token: Padding token
pad_token_segment_id: The segment ID for the padding token (It is usually 0, but can vary such as for XLNet where it is 4)
mask_padding_with_zero: If set to ``True``, the attention mask will be filled by ``1`` for actual values
and by ``0`` for padded values. If set to ``False``, inverts it (``1`` for padded values, ``0`` for
actual values)
Returns:
If the ``examples`` input is a ``tf.data.Dataset``, will return a ``tf.data.Dataset``
containing the task-specific features. If the input is a list of ``InputExamples``, will return
a list of task-specific ``InputFeatures`` which can be fed to the model.
"""
is_tf_dataset = False
if is_tf_available() and isinstance(examples, tf.data.Dataset):
is_tf_dataset = True
if task is not None:
processor = glue_processors[task]()
if label_list is None:
label_list = processor.get_labels()
logger.info("Using label list %s for task %s" % (label_list, task))
if output_mode is None:
output_mode = glue_output_modes[task]
logger.info("Using output mode %s for task %s" % (output_mode, task))
label_map = {label: i for i, label in enumerate(label_list)}
features = []
for (ex_index, example) in enumerate(examples):
len_examples = 0
if is_tf_dataset:
example = processor.get_example_from_tensor_dict(example)
example = processor.tfds_map(example)
len_examples = tf.data.experimental.cardinality(examples)
else:
len_examples = len(examples)
if ex_index % 10000 == 0:
logger.info("Writing example %d/%d" % (ex_index, len_examples))
inputs = tokenizer.encode_plus(example.text_a, example.text_b, add_special_tokens=True, max_length=max_length,)
input_ids, token_type_ids = inputs["input_ids"], inputs["token_type_ids"]
# The mask has 1 for real tokens and 0 for padding tokens. Only real
# tokens are attended to.
attention_mask = [1 if mask_padding_with_zero else 0] * len(input_ids)
# Zero-pad up to the sequence length.
padding_length = max_length - len(input_ids)
if pad_on_left:
input_ids = ([pad_token] * padding_length) + input_ids
attention_mask = ([0 if mask_padding_with_zero else 1] * padding_length) + attention_mask
token_type_ids = ([pad_token_segment_id] * padding_length) + token_type_ids
else:
input_ids = input_ids + ([pad_token] * padding_length)
attention_mask = attention_mask + ([0 if mask_padding_with_zero else 1] * padding_length)
token_type_ids = token_type_ids + ([pad_token_segment_id] * padding_length)
assert len(input_ids) == max_length, "Error with input length {} vs {}".format(len(input_ids), max_length)
assert len(attention_mask) == max_length, "Error with input length {} vs {}".format(
len(attention_mask), max_length
)
assert len(token_type_ids) == max_length, "Error with input length {} vs {}".format(
len(token_type_ids), max_length
)
if output_mode == "classification":
label = label_map[example.label]
elif output_mode == "regression":
label = float(example.label)
else:
raise KeyError(output_mode)
if ex_index < 5:
logger.info("*** Example ***")
logger.info("guid: %s" % (example.guid))
logger.info("tokens: %s", " ".join(tokenizer.convert_ids_to_tokens(input_ids)))
logger.info("input_ids: %s" % " ".join([str(x) for x in input_ids]))
logger.info("attention_mask: %s" % " ".join([str(x) for x in attention_mask]))
logger.info("token_type_ids: %s" % " ".join([str(x) for x in token_type_ids]))
logger.info("label: %s (id = %d)" % (example.label, label))
features.append(
InputFeatures(
input_ids=input_ids, attention_mask=attention_mask, token_type_ids=token_type_ids, label=label
)
)
if is_tf_available() and is_tf_dataset:
def gen():
for ex in features:
yield (
{
"input_ids": ex.input_ids,
"attention_mask": ex.attention_mask,
"token_type_ids": ex.token_type_ids,
},
ex.label,
)
return tf.data.Dataset.from_generator(
gen,
({"input_ids": tf.int32, "attention_mask": tf.int32, "token_type_ids": tf.int32}, tf.int64),
(
{
"input_ids": tf.TensorShape([None]),
"attention_mask": tf.TensorShape([None]),
"token_type_ids": tf.TensorShape([None]),
},
tf.TensorShape([]),
),
)
return features
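The left/right padding arithmetic in the function above can be exercised in isolation. This standalone sketch (the helper name is ours) reproduces the same scheme with toy token ids:

```python
def pad_sequence(input_ids, max_length, pad_token=0, pad_on_left=False,
                 mask_padding_with_zero=True):
    """Mirror the padding used above: real tokens get attention mask 1
    (when mask_padding_with_zero), padding positions get 0."""
    attention_mask = [1 if mask_padding_with_zero else 0] * len(input_ids)
    padding_length = max_length - len(input_ids)
    if pad_on_left:
        # XLNet-style: padding goes before the real tokens.
        input_ids = [pad_token] * padding_length + input_ids
        attention_mask = [0 if mask_padding_with_zero else 1] * padding_length + attention_mask
    else:
        # BERT-style: padding goes after the real tokens.
        input_ids = input_ids + [pad_token] * padding_length
        attention_mask = attention_mask + [0 if mask_padding_with_zero else 1] * padding_length
    return input_ids, attention_mask
```

For `[101, 7, 102]` padded to length 5, right padding yields ids `[101, 7, 102, 0, 0]` with mask `[1, 1, 1, 0, 0]`, and left padding yields ids `[0, 0, 101, 7, 102]` with mask `[0, 0, 1, 1, 1]`.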
def convert_multi_label_examples_to_features(
examples,
tokenizer,
max_length=512,
task=None,
label_list=None,
output_mode=None,
pad_on_left=False,
pad_token=0,
pad_token_segment_id=0,
mask_padding_with_zero=True,
):
"""
Loads a data file into a list of ``InputFeatures``
Args:
examples: List of ``InputExamples`` or ``tf.data.Dataset`` containing the examples.
tokenizer: Instance of a tokenizer that will tokenize the examples
max_length: Maximum example length
task: GLUE task
label_list: List of labels. Can be obtained from the processor using the ``processor.get_labels()`` method
output_mode: String indicating the output mode. Either ``regression`` or ``classification``
pad_on_left: If set to ``True``, the examples will be padded on the left rather than on the right (default)
pad_token: Padding token
pad_token_segment_id: The segment ID for the padding token (It is usually 0, but can vary such as for XLNet where it is 4)
mask_padding_with_zero: If set to ``True``, the attention mask will be filled by ``1`` for actual values
and by ``0`` for padded values. If set to ``False``, inverts it (``1`` for padded values, ``0`` for
actual values)
Returns:
If the ``examples`` input is a ``tf.data.Dataset``, will return a ``tf.data.Dataset``
containing the task-specific features. If the input is a list of ``InputExamples``, will return
a list of task-specific ``InputFeatures`` which can be fed to the model.
"""
is_tf_dataset = False
if is_tf_available() and isinstance(examples, tf.data.Dataset):
is_tf_dataset = True
if task is not None:
processor = glue_processors[task]()
if label_list is None:
label_list = processor.get_labels()
logger.info("Using label list %s for task %s" % (label_list, task))
if output_mode is None:
output_mode = glue_output_modes[task]
logger.info("Using output mode %s for task %s" % (output_mode, task))
# label_map = {label: i for i, label in enumerate(label_list)}
features = []
for (ex_index, example) in enumerate(examples):
len_examples = 0
if is_tf_dataset:
example = processor.get_example_from_tensor_dict(example)
example = processor.tfds_map(example)
len_examples = tf.data.experimental.cardinality(examples)
else:
len_examples = len(examples)
if ex_index % 10000 == 0:
logger.info("Writing example %d/%d" % (ex_index, len_examples))
inputs = tokenizer.encode_plus(example.text_a, example.text_b, add_special_tokens=True, max_length=max_length,)
input_ids, token_type_ids = inputs["input_ids"], inputs["token_type_ids"]
# The mask has 1 for real tokens and 0 for padding tokens. Only real
# tokens are attended to.
attention_mask = [1 if mask_padding_with_zero else 0] * len(input_ids)
# Zero-pad up to the sequence length.
padding_length = max_length - len(input_ids)
if pad_on_left:
input_ids = ([pad_token] * padding_length) + input_ids
attention_mask = ([0 if mask_padding_with_zero else 1] * padding_length) + attention_mask
token_type_ids = ([pad_token_segment_id] * padding_length) + token_type_ids
else:
input_ids = input_ids + ([pad_token] * padding_length)
attention_mask = attention_mask + ([0 if mask_padding_with_zero else 1] * padding_length)
token_type_ids = token_type_ids + ([pad_token_segment_id] * padding_length)
assert len(input_ids) == max_length, "Error with input length {} vs {}".format(len(input_ids), max_length)
assert len(attention_mask) == max_length, "Error with input length {} vs {}".format(
len(attention_mask), max_length
)
assert len(token_type_ids) == max_length, "Error with input length {} vs {}".format(
len(token_type_ids), max_length
)
# if output_mode == "classification":
# label = label_map[example.label]
# elif output_mode == "regression":
# label = float(example.label)
# else:
# raise KeyError(output_mode)
label = example.label
if ex_index < 5:
logger.info("*** Example ***")
logger.info("guid: %s" % (example.guid))
logger.info("tokens: %s", " ".join(tokenizer.convert_ids_to_tokens(input_ids)))
logger.info("input_ids: %s" % " ".join([str(x) for x in input_ids]))
logger.info("attention_mask: %s" % " ".join([str(x) for x in attention_mask]))
logger.info("token_type_ids: %s" % " ".join([str(x) for x in token_type_ids]))
logger.info("label: %s " % (','.join(['{}_{}'.format(i, l) for i, l in enumerate(label)])))
features.append(
InputFeatures(
input_ids=input_ids, attention_mask=attention_mask, token_type_ids=token_type_ids, label=label
)
)
if is_tf_available() and is_tf_dataset:
def gen():
for ex in features:
yield (
{
"input_ids": ex.input_ids,
"attention_mask": ex.attention_mask,
"token_type_ids": ex.token_type_ids,
},
ex.label,
)
return tf.data.Dataset.from_generator(
gen,
({"input_ids": tf.int32, "attention_mask": tf.int32, "token_type_ids": tf.int32}, tf.int64),
(
{
"input_ids": tf.TensorShape([None]),
"attention_mask": tf.TensorShape([None]),
"token_type_ids": tf.TensorShape([None]),
},
tf.TensorShape([]),
),
)
return features
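The right-/left-padding logic in the feature-conversion function above can be condensed into a small standalone helper. This is an illustrative sketch only; the function and argument names here are not part of the module's API:

```python
def pad_to_length(input_ids, max_length, pad_token=0,
                  pad_on_left=False, mask_padding_with_zero=True):
    """Pad token ids to max_length and build the matching attention mask."""
    # Real tokens get attention 1 (or 0 when mask_padding_with_zero is False).
    attention_mask = [1 if mask_padding_with_zero else 0] * len(input_ids)
    padding_length = max_length - len(input_ids)
    pad_ids = [pad_token] * padding_length
    pad_mask = [0 if mask_padding_with_zero else 1] * padding_length
    if pad_on_left:
        return pad_ids + input_ids, pad_mask + attention_mask
    return input_ids + pad_ids, attention_mask + pad_mask
```

For example, `pad_to_length([5, 6, 7], 5)` returns `([5, 6, 7, 0, 0], [1, 1, 1, 0, 0])`.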
class DataProcessor(object):
"""Base class for data converters for sequence classification data sets."""
def get_train_examples(self, data_dir):
"""Gets a collection of `InputExample`s for the train set."""
raise NotImplementedError()
def get_dev_examples(self, data_dir):
"""Gets a collection of `InputExample`s for the dev set."""
raise NotImplementedError()
def get_test_examples(self, data_dir):
"""Gets a collection of `InputExample`s for prediction."""
raise NotImplementedError()
def get_labels(self):
"""Gets the list of labels for this data set."""
raise NotImplementedError()
@classmethod
def _read_tsv(cls, input_file, quotechar=None):
"""Reads a tab separated value file."""
with open(input_file, "r") as f:
reader = csv.reader(f, delimiter="\t", quotechar=quotechar)
lines = []
for line in reader:
lines.append(line)
return lines
class BlueBERTProcessor(DataProcessor):
"""Processor for the BLUE data set."""
def get_train_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "train.tsv")), "train")
def get_dev_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "dev.tsv")), "dev")
def get_test_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "test.tsv")), "test")
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
examples = []
for (i, line) in enumerate(lines):
# skip header
if i == 0:
continue
guid = line[0]
text_a = convert_to_unicode(line[1])
if set_type == "test":
# MODIFY:
# We add the option "--predict" to calculate metrics and to describe outputs.
# label = self.get_labels()[-1]
try:
label = convert_to_unicode(line[2])
except IndexError:
logging.exception(line)
exit(1)
else:
try:
label = convert_to_unicode(line[2])
except IndexError:
logging.exception(line)
exit(1)
examples.append(InputExample(guid=guid, text_a=text_a, text_b=None, label=label))
return examples
#ADD:
def get_y_true(self, data_dir, set_type, quotechar=None):
"""Read labels for evaluation."""
input_file = os.path.join(data_dir, "{}.tsv".format(set_type))
with open(input_file, "r") as f:
reader = csv.reader(f, delimiter="\t", quotechar=quotechar)
labels = []
for i, line in enumerate(reader):
# skip header
if i == 0:
continue
labels.append(convert_to_unicode(line[2]))
return labels
class ChemProtProcessor(BlueBERTProcessor):
def get_labels(self):
"""See base class."""
return ["CPR:3", "CPR:4", "CPR:5", "CPR:6", "CPR:9", "false"]
class DDI2013Processor(BlueBERTProcessor):
def get_labels(self):
return ["DDI-advise", "DDI-effect", "DDI-int", "DDI-mechanism", 'DDI-false']
class I2b2_2010_Processor(BlueBERTProcessor):
def get_labels(self):
return ['PIP', 'TeCP', 'TeRP', 'TrAP', 'TrCP', 'TrIP', 'TrNAP', 'TrWP', 'false']
class StsProcessor(DataProcessor):
"""Processor for the STS-B data set."""
def get_example_from_tensor_dict(self, tensor_dict):
"""See base class."""
return InputExample(
tensor_dict["idx"].numpy(),
tensor_dict["sentence1"].numpy().decode("utf-8"),
tensor_dict["sentence2"].numpy().decode("utf-8"),
str(tensor_dict["label"].numpy()),
)
def get_train_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "train.tsv")), "train")
def get_dev_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "dev.tsv")), "dev")
# ADDED
def get_test_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "test.tsv")), "test")
def get_labels(self):
"""See base class."""
return [None]
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
examples = []
for (i, line) in enumerate(lines):
if i == 0:
continue
guid = "%s-%s" % (set_type, convert_to_unicode(line[0]))
text_a = convert_to_unicode(line[-3])
text_b = convert_to_unicode(line[-2])
label = float(line[-1])
examples.append(InputExample(guid=guid, text_a=text_a, text_b=text_b, label=label))
return examples
#ADD:
def get_y_true(self, data_dir, set_type, quotechar=None):
"""Read labels for evaluation."""
input_file = os.path.join(data_dir, "{}.tsv".format(set_type))
with open(input_file, "r") as f:
reader = csv.reader(f, delimiter="\t", quotechar=quotechar)
labels = []
for i, line in enumerate(reader):
# skip header
if i == 0:
continue
labels.append(convert_to_unicode(line[-1]))
return labels
class HoCProcessor(DataProcessor):
"""Processor for the HoC data set."""
def get_train_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "train.tsv")), "train")
def get_dev_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "dev.tsv")), "dev")
def get_test_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "test.tsv")), "test")
def get_labels(self):
"""See base class."""
return list(range(10))
# return ['activating invasion and metastasis', 'avoiding immune destruction',
# 'cellular energetics', 'enabling replicative immortality', 'evading growth suppressors',
# 'genomic instability and mutation', 'inducing angiogenesis', 'resisting cell death',
# 'sustaining proliferative signaling', 'tumor promoting inflammation']
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
#ADD:
# convert the format of 'labels' from str to list.
def convert_str_to_list(labels):
cols = labels.split(',')
res = [int(v[-1]) for v in cols]
return res
examples = []
for (i, line) in enumerate(lines):
# skip the header row
if i == 0:
continue
guid = "%s-%s" % (set_type, i)
# if set_type == "test":
# text_a = tokenization.convert_to_unicode(line[1])
# label = "0"
# else:
# text_a = tokenization.convert_to_unicode(line[3])
# label = tokenization.convert_to_unicode(line[1])
label = convert_str_to_list(line[0])
text_a = convert_to_unicode(line[1])
examples.append(
InputExample(guid=guid, text_a=text_a, text_b=None, label=label))
return examples
class MedNLIProcessor(DataProcessor):
def get_train_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "train.tsv")), "train")
def get_dev_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "dev.tsv")), "dev")
def get_test_examples(self, data_dir):
"""See base class."""
return self._create_examples(
self._read_tsv(os.path.join(data_dir, "test.tsv")), "test")
def get_labels(self):
"""See base class."""
return ['contradiction', 'entailment', 'neutral']
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
examples = []
for (i, line) in enumerate(lines):
if i == 0:
continue
guid = "%s-%s" % (set_type, convert_to_unicode(line[0]))
text_a = convert_to_unicode(line[-3])
text_b = convert_to_unicode(line[-2])
label = convert_to_unicode(line[-1])
# guid = line[1]
# text_a = convert_to_unicode(line[2])
# text_b = convert_to_unicode(line[3])
# if set_type == "test":
# label = self.get_labels()[-1]
# else:
# label = convert_to_unicode(line[0])
examples.append(
InputExample(guid=guid, text_a=text_a, text_b=text_b, label=label))
return examples
#ADD:
def get_y_true(self, data_dir, set_type, quotechar=None):
"""Read labels for evaluation."""
input_file = os.path.join(data_dir, "{}.tsv".format(set_type))
with open(input_file, "r") as f:
reader = csv.reader(f, delimiter="\t", quotechar=quotechar)
labels = []
for i, line in enumerate(reader):
# skip header
if i == 0:
continue
labels.append(convert_to_unicode(line[-1]))
return labels
blue_processors = {
"medsts": StsProcessor,
"biosses": StsProcessor,
"ddi2013": DDI2013Processor,
"chemprot": ChemProtProcessor,
"i2b2_2010": I2b2_2010_Processor,
"hoc": HoCProcessor,
"mednli": MedNLIProcessor,
}
blue_output_modes = {
"medsts": "regression",
"biosses": "regression",
"ddi2013": "classification",
"chemprot": "classification",
"i2b2_2010": "classification",
"hoc": "classification",
"mednli": "classification",
}
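For reference, `HoCProcessor` above parses multi-label strings such as `"0_1,1_0,2_1"` via its nested `convert_str_to_list` helper. A standalone re-implementation (illustrative only, not the module's exported API) behaves like this:

```python
def convert_str_to_list(labels):
    """Turn a HoC label string like '0_1,1_0,2_1' into a list of 0/1 flags."""
    # "3_1" -> 1, "4_0" -> 0: keep only the trailing 0/1 flag of each column.
    return [int(col[-1]) for col in labels.split(",")]

print(convert_str_to_list("0_1,1_0,2_1"))  # [1, 0, 1]
```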
# ---- File: stock_management/products/models/__init__.py | repo: hitenjadeja/stock-management @ fe542efc7a7b4f870f280cc20f52d7d92c45fc7f | license: MIT ----
from model_product import Product
from model_location import Location
from model_stock import Stock
from model_warehouse import Warehouse
# ---- File: usaspending_api/disaster/tests/integration/test_cfda_loans.py | repo: jbuendiallc/usaspending-api @ f827870cbca4b6a6e16f1c5272bb2ff73a113d76 | license: CC0-1.0 ----
import pytest
from rest_framework import status
from usaspending_api.search.tests.data.utilities import setup_elasticsearch_test
url = "/api/v2/disaster/cfda/loans/"
@pytest.mark.django_db
def test_correct_response_defc_no_results(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, award_type_codes=["07", "08"], def_codes=["N"])
expected_results = []
assert resp.status_code == status.HTTP_200_OK
assert resp.json()["results"] == expected_results
@pytest.mark.django_db
def test_correct_response_single_defc(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, def_codes=["L"])
expected_results = [
{
"code": "20.200",
"award_count": 1,
"description": "CFDA 2",
"face_value_of_loan": 30.0,
"id": 200,
"obligation": 20.0,
"outlay": 0.0,
"resource_link": "www.example.com/200",
},
{
"code": "10.100",
"award_count": 1,
"description": "CFDA 1",
"face_value_of_loan": 3.0,
"id": 100,
"obligation": 2.0,
"outlay": 0.0,
"resource_link": None,
},
]
assert resp.status_code == status.HTTP_200_OK
assert resp.json()["results"] == expected_results
@pytest.mark.django_db
def test_correct_response_multiple_defc(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, def_codes=["L", "M"])
expected_results = [
{
"code": "20.200",
"award_count": 2,
"description": "CFDA 2",
"face_value_of_loan": 330.0,
"id": 200,
"obligation": 220.0,
"outlay": 100.0,
"resource_link": "www.example.com/200",
},
{
"code": "10.100",
"award_count": 1,
"description": "CFDA 1",
"face_value_of_loan": 3.0,
"id": 100,
"obligation": 2.0,
"outlay": 0.0,
"resource_link": None,
},
]
assert resp.status_code == status.HTTP_200_OK
assert resp.json()["results"] == expected_results
@pytest.mark.django_db
def test_correct_response_with_query(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, def_codes=["L", "M"], query="GIBBERISH")
expected_results = []
assert resp.status_code == status.HTTP_200_OK
assert resp.json()["results"] == expected_results
resp = helpers.post_for_spending_endpoint(client, url, def_codes=["L", "M"], query="2")
expected_results = [
{
"code": "20.200",
"award_count": 2,
"description": "CFDA 2",
"face_value_of_loan": 330.0,
"id": 200,
"obligation": 220.0,
"outlay": 100.0,
"resource_link": "www.example.com/200",
}
]
assert resp.status_code == status.HTTP_200_OK
assert resp.json()["results"] == expected_results
@pytest.mark.django_db
def test_invalid_defc(client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, def_codes=["ZZ"])
assert resp.status_code == status.HTTP_400_BAD_REQUEST
assert resp.data["detail"] == "Field 'filter|def_codes' is outside valid values ['L', 'M', 'N']"
@pytest.mark.django_db
def test_invalid_defc_type(client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, def_codes="100")
assert resp.status_code == status.HTTP_400_BAD_REQUEST
assert resp.data["detail"] == "Invalid value in 'filter|def_codes'. '100' is not a valid type (array)"
@pytest.mark.django_db
def test_missing_defc(client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url)
assert resp.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY
assert resp.data["detail"] == "Missing value: 'filter|def_codes' is a required field"
@pytest.mark.django_db
def test_pagination_page_and_limit(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, def_codes=["L", "M"], page=2, limit=1)
expected_results = {
"results": [
{
"code": "10.100",
"award_count": 1,
"description": "CFDA 1",
"face_value_of_loan": 3.0,
"id": 100,
"obligation": 2.0,
"outlay": 0.0,
"resource_link": None,
}
],
"page_metadata": {
"hasNext": False,
"hasPrevious": True,
"limit": 1,
"next": None,
"page": 2,
"previous": 1,
"total": 2,
},
"messages": [
"Notice! API Request to sort on 'id' field isn't fully "
"implemented. Results were actually sorted using 'description' "
"field."
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_results
@pytest.mark.django_db
def test_invalid_award_type_codes(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(client, url, award_type_codes=["ZZ", "08"], def_codes=["L", "M"])
assert resp.status_code == status.HTTP_400_BAD_REQUEST
assert resp.data["detail"] == "Field 'filter|award_type_codes' is outside valid values ['07', '08']"
@pytest.mark.django_db
def test_correct_response_with_award_type_codes(
client, monkeypatch, helpers, elasticsearch_award_index, cfda_awards_and_transactions
):
setup_elasticsearch_test(monkeypatch, elasticsearch_award_index)
resp = helpers.post_for_spending_endpoint(
client, url, award_type_codes=["07"], def_codes=["L", "M"], sort="description"
)
expected_results = {
"results": [
{
"code": "20.200",
"award_count": 1,
"description": "CFDA 2",
"face_value_of_loan": 30.0,
"id": 200,
"obligation": 20.0,
"outlay": 0.0,
"resource_link": "www.example.com/200",
},
{
"code": "10.100",
"award_count": 1,
"description": "CFDA 1",
"face_value_of_loan": 3.0,
"id": 100,
"obligation": 2.0,
"outlay": 0.0,
"resource_link": None,
},
],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"limit": 10,
"next": None,
"page": 1,
"previous": None,
"total": 2,
},
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_results
resp = helpers.post_for_spending_endpoint(
client, url, award_type_codes=["08"], def_codes=["L", "M"], sort="description"
)
expected_results = {
"results": [
{
"code": "20.200",
"award_count": 1,
"description": "CFDA 2",
"face_value_of_loan": 300.0,
"id": 200,
"obligation": 200.0,
"outlay": 100.0,
"resource_link": "www.example.com/200",
}
],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"limit": 10,
"next": None,
"page": 1,
"previous": None,
"total": 1,
},
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_results
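The `page_metadata` fields asserted in the tests above follow the usual page/limit/total arithmetic. A hedged standalone sketch of that derivation (not the API's actual implementation):

```python
import math

def page_metadata(page, limit, total):
    """Derive pagination flags from the requested page, page size, and total rows."""
    last_page = max(math.ceil(total / limit), 1)
    return {
        "page": page,
        "limit": limit,
        "total": total,
        "hasNext": page < last_page,
        "hasPrevious": page > 1,
        "next": page + 1 if page < last_page else None,
        "previous": page - 1 if page > 1 else None,
    }
```

This matches the fixtures above: `page_metadata(2, 1, 2)` gives `hasNext` False, `hasPrevious` True, `previous` 1.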
# ---- File: AutoTicketsBot/__init__.py | repo: y95847frank/AutomatedTicketBot @ 66754758430c7a1240b69259e32fcb452639c134 | license: MIT ----
from .AutoTicketsBot import *
from .util import *
# ---- File: CodeForces/Round541Div2/A.py | repo: takaaki82/Java-Lessons @ c4f11462bf84c091527dde5f25068498bfb2cc49 | license: MIT ----
w1, h1, w2, h2 = map(int, input().split())
print(w2 + h2 * 2 + 2 + w1 + 2 * h1 + 2 + w1 - w2)
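The `w2` terms in the printed expression above cancel, so the answer reduces to `2 * (w1 + h1 + h2) + 4`. A quick numeric cross-check of that simplification (helper names here are illustrative):

```python
def original(w1, h1, w2, h2):
    # The expression exactly as printed above.
    return w2 + h2 * 2 + 2 + w1 + 2 * h1 + 2 + w1 - w2

def simplified(w1, h1, w2, h2):
    # Algebraic simplification: the +w2 and -w2 cancel.
    return 2 * (w1 + h1 + h2) + 4

assert all(
    original(w1, h1, w2, h2) == simplified(w1, h1, w2, h2)
    for w1 in range(1, 5) for h1 in range(1, 5)
    for w2 in range(1, 5) for h2 in range(1, 5)
)
```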
# ---- File: Smart-Licensing-Dashboard-Backend/WBXTeamsMeetingRoom/test/test_teamroommeetingcreation.py | repo: bhavanaraya/ciscodashboard @ 50ac8fd57e3dbfd215f012bdaa1c8e581f14fcf1 | license: CECILL-B ----
"""
Copyright (c) 2019 Cisco and/or its affiliates.
This software is licensed to you under the terms of the Cisco Sample
Code License, Version 1.0 (the "License"). You may obtain a copy of the
License at
https://developer.cisco.com/docs/licenses
All use of the material herein must be in accordance with the terms of
the License. All rights not expressly granted by the License are
reserved. Unless required by applicable law or agreed to separately in
writing, software distributed under the License is distributed on an "AS
IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
or implied.
"""
import unittest
import json
import WBXTeamsMeetingRoom as wbxtm_meeting
__author__ = "Tim Taylor <timtayl@cisco.com>"
__contributors__ = []
__copyright__ = "Copyright (c) 2019 Cisco and/or its affiliates."
__license__ = "Cisco Sample Code License, Version 1.0"
class WBXTeamsMeetingRoomTest(unittest.TestCase):
def test_WBXTeamsMeetingRoomExists(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
self.assertIsNotNone(meeting_room_maker, 'test_WBXTeamsMeetingRoomExits should return an object')
def test_creates_correct_object(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
self.assertIsInstance(meeting_room_maker, wbxtm_meeting.WBXTeamsMeetingRoom,
'test_creates_correct_object should return correct object.\nExpected: {}\n' \
'Result: {}'.format(type(wbxtm_meeting.WBXTeamsMeetingRoom), type(meeting_room_maker)))
def test_message_json_creation_is_correct(self):
meeting_room_maker= wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = {
"toPersonEmail": "timtayl@cisco.com",
"markdown": "This is the **Smart Dashboard Bot**. Welcome to the Smart Licensing Dashboard! Please stand by while we get things setup."
}
result = json.loads(meeting_room_maker.message_json())
self.assertEqual(result, expected, 'test_message_json_creation_is_correct should return correct message json.\n'
'Expected: {}\nResult: {}'.format(expected, result))
def test_message_creation_response_returns_proper_personId_returns_string(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = "somePersonIDalskd3"
input_json = {
"id": "someID",
"roomId": "someMeetingRoomID1q3qerwerqew",
"toPersonEmail": "timtayl@cisco.com",
"roomType": "direct",
"text": "This is the Smart Dashboard Bot. Welcome to the Smart Licensing Dashboard! Please stand by while we get things setup.",
"personId": "somePersonID",
"personEmail": "SLDBot@webex.bot",
"markdown": "some Mark Down",
"html": "<p>Some html</p>",
"created": "2019-06-20T15:05:09.544Z"
}
result = meeting_room_maker.roomId_from_response_json(input_json)
self.assertIsInstance(result, str, 'test_message_creation_response_returns_proper_personId_returns_string.\n'
'Expected: {}\nResult: {}'.format(type(""), type(result)))
def test_message_creation_response_parsing_returns_roomID_sending_dict(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = "someMeetingRoomID1q3qerwerqew"
input_json = {
"id": "someID",
"roomId": "someMeetingRoomID1q3qerwerqew",
"toPersonEmail": "timtayl@cisco.com",
"roomType": "direct",
"text": "This is the Smart Dashboard Bot. Welcome to the Smart Licensing Dashboard! Please stand by while we get things setup.",
"personId": "somePersonID",
"personEmail": "SLDBot@webex.bot",
"markdown": "some Mark Down",
"html": "<p>Some html</p>",
"created": "2019-06-20T15:05:09.544Z"
}
result = meeting_room_maker.roomId_from_response_json(input_json)
self.assertEqual(result, expected, 'test_message_creation_response_parsing_returns_roomID.\nExpected: {}\n'
'Result: {}'.format(expected, result))
def test_message_creation_response_parsing_returns_roomID_sending_string(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = "someMeetingRoomID1q3qerwerqew"
input_json = {
"id": "someID",
"roomId": "someMeetingRoomID1q3qerwerqew",
"toPersonEmail": "timtayl@cisco.com",
"roomType": "direct",
"text": "This is the Smart Dashboard Bot. Welcome to the Smart Licensing Dashboard! Please stand by while we get things setup.",
"personId": "somePersonID",
"personEmail": "SLDBot@webex.bot",
"markdown": "some Mark Down",
"html": "<p>Some html</p>",
"created": "2019-06-20T15:05:09.544Z"
}
result = meeting_room_maker.roomId_from_response_json(json.dumps(input_json))
self.assertEqual(result, expected, 'test_message_creation_response_parsing_returns_roomID.\nExpected: {}\n'
'Result: {}'.format(expected, result))
def test_membership_check_response_returns_proper_personId_returns_string(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = "somePersonIDalskd3"
input_json = {
"items": [
{
"id": "someID1",
"roomId": "someMeetingRoomID1q3qerwerqew",
"personId": "somePersonIDalskd3",
"personEmail": "timtayl@cisco.com",
"personDisplayName": "Tim Taylor",
"personOrgId": "somePersonORgID1",
"created": "2019-06-07T17:14:33.919Z"
},
{
"id": "someID2",
"roomId": "someMeetingRoomID1q3qerwerqew",
"personId": "somePersonIDaladskfpkj4",
"personEmail": "SLDBot@webex.bot",
"personDisplayName": "SmartLicensingBot",
"personOrgId": "somePersonORgID2",
"created": "2019-06-07T17:14:33.919Z"
}
]
}
result = meeting_room_maker.personId_from_response_json(input_json)
self.assertIsInstance(result, str, 'test_membership_check_response_returns_proper_personId_returns_string.\n'
'Expected: {}\nResult: {}'.format(type(""), type(result)))
def test_membership_check_response_returns_proper_personId_sending_dict(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = "somePersonIDalskd3"
input_json = {
"items": [
{
"id": "someID1",
"roomId": "someMeetingRoomID1q3qerwerqew",
"personId": "somePersonIDalskd3",
"personEmail": "timtayl@cisco.com",
"personDisplayName": "Tim Taylor",
"personOrgId": "somePersonORgID1",
"created": "2019-06-07T17:14:33.919Z"
},
{
"id": "someID2",
"roomId": "someMeetingRoomID1q3qerwerqew",
"personId": "somePersonIDaladskfpkj4",
"personEmail": "SLDBot@webex.bot",
"personDisplayName": "SmartLicensingBot",
"personOrgId": "somePersonORgID2",
"created": "2019-06-07T17:14:33.919Z"
}
]
}
result = meeting_room_maker.personId_from_response_json(input_json)
self.assertEqual(result, expected,
'test_membership_check_response_returns_proper_personId_sending_dict returns '
'correct value.\nExpected: {}\nResult: {}'.format(expected, result))
def test_membership_check_response_returns_proper_personId_sending_string(self):
meeting_room_maker = wbxtm_meeting.WBXTeamsMeetingRoom('sample_bot_token', 'timtayl@cisco.com')
expected = "somePersonIDalskd3"
input_json = {
"items": [
{
"id": "someID1",
"roomId": "someMeetingRoomID1q3qerwerqew",
"personId": "somePersonIDalskd3",
"personEmail": "timtayl@cisco.com",
"personDisplayName": "Tim Taylor",
"personOrgId": "somePersonORgID1",
"created": "2019-06-07T17:14:33.919Z"
},
{
"id": "someID2",
"roomId": "someMeetingRoomID1q3qerwerqew",
"personId": "somePersonIDaladskfpkj4",
"personEmail": "SLDBot@webex.bot",
"personDisplayName": "SmartLicensingBot",
"personOrgId": "somePersonORgID2",
"created": "2019-06-07T17:14:33.919Z"
}
]
}
result = meeting_room_maker.personId_from_response_json(json.dumps(input_json))
self.assertEqual(result, expected,
'test_membership_check_response_returns_proper_personId_sending_string returns '
'correct value.\nExpected: {}\nResult: {}'.format(expected, result))
if __name__ == '__main__':
unittest.main()
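As the tests above exercise, `roomId_from_response_json` is expected to accept either a parsed dict or a raw JSON string. A standalone sketch of that dual-input pattern (an illustrative re-implementation, not the class method itself):

```python
import json

def roomId_from_response_json(response):
    """Return the roomId from a Webex Teams message-creation response."""
    # Accept either a parsed dict or a raw JSON string, as both test variants do.
    if isinstance(response, str):
        response = json.loads(response)
    return response["roomId"]

print(roomId_from_response_json({"roomId": "r1"}))  # r1
```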
# ---- File: gary/coordinates/__init__.py | repo: adrn/gary-old @ 065b371534baa03deeb860893640068d90ba5881 | license: MIT ----
from .core import *
from .sgr import *
from .orphan import *
from .propermotion import *
from .velocity_transforms import *
from .poincarepolar import *
from .quaternion import *
| 22.375 | 34 | 0.765363 | 22 | 179 | 6.181818 | 0.454545 | 0.441176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156425 | 179 | 7 | 35 | 25.571429 | 0.900662 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
# ---- File: uncertify/utils/io.py | repo: matthaeusheer/uncertify @ dfc2df16fb07ee8d7d17906827e0f0c8b2747532 | license: MIT ----
from pathlib import Path
def load_pytorch_checkpoint(checkpoint_path: Path):
    pass
# ---- File: src/originexample/technology/__init__.py | repo: project-origin/example-backend @ 13d9b528533dcaada8b0f0b93bbe2ef6a25c38ae | license: MIT ----
from .models import Technology
cb9f1c2ded9b63f36a7f394156ac4d869aea5864 | 28,099 | py | Python | capsul/pipeline/test/test_pipeline_parameters.py | denisri/capsul | ea1b41f08ab1acc95e50d90916c1e282807874ca | [
"CECILL-B"
] | null | null | null | capsul/pipeline/test/test_pipeline_parameters.py | denisri/capsul | ea1b41f08ab1acc95e50d90916c1e282807874ca | [
"CECILL-B"
] | null | null | null | capsul/pipeline/test/test_pipeline_parameters.py | denisri/capsul | ea1b41f08ab1acc95e50d90916c1e282807874ca | [
"CECILL-B"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import print_function
from __future__ import absolute_import
import os
import json
import shutil
import unittest
import tempfile
from datetime import date, time, datetime
import sys
from capsul.api import Process, Pipeline
from capsul.pipeline.pipeline_tools import save_pipeline_parameters, load_pipeline_parameters
from traits.api import Float, File, String, Int, List, TraitListObject, Time, Date, Undefined, TraitError
import six
def load_pipeline_dictionary(filename):
"""
Re-implements just the loading part of load_pipeline_parameters so the
tests can check the raw values stored in the JSON dictionary.
:param filename: the JSON filename
"""
if filename:
kwargs = {}
if sys.version_info[0] >= 3:
kwargs['encoding'] = 'utf8'
with open(filename, 'r', **kwargs) as file:
dic = json.load(file)
return dic
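For context, a minimal standalone sketch (independent of capsul; the helper name `roundtrip` is ours) of the JSON layout the assertions below rely on, with every parameter value nested under a single `"pipeline_parameters"` key:

```python
import json
import os
import tempfile


def roundtrip(params):
    """Dump params in the {'pipeline_parameters': {...}} layout and read them back."""
    folder = tempfile.mkdtemp()
    path = os.path.join(folder, "params.json")
    with open(path, "w") as f:
        json.dump({"pipeline_parameters": params}, f)
    with open(path, "r") as f:
        return json.load(f)["pipeline_parameters"]
```

JSON round-tripping preserves ints, floats, strings and lists, but dates and times are not serializable as-is, which is why the date/time test below ends up comparing against string representations.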
#############################################################
# TEST PROCESSES DEFINITION #
#############################################################
class TestInt(Process):
def __init__(self):
super(TestInt, self).__init__()
self.add_trait("in_1", Int(output=False))
self.add_trait("in_2", Int(output=False))
self.add_trait("out", Int(output=True))
def _run_process(self):
self.out = self.in_1 + self.in_2
class TestFloat(Process):
def __init__(self):
super(TestFloat, self).__init__()
self.add_trait("in_1", Float(output=False))
self.add_trait("in_2", Float(output=False))
self.add_trait("out", Float(output=True))
def _run_process(self):
self.out = self.in_1 - self.in_2
class TestString(Process):
def __init__(self):
super(TestString, self).__init__()
self.add_trait("in_1", String(output=False))
self.add_trait("in_2", String(output=False))
self.add_trait("out", String(output=True))
def _run_process(self):
self.out = self.in_1 + self.in_2
class TestFile(Process):
def __init__(self):
super(TestFile, self).__init__()
self.add_trait("in_1", File(output=False))
self.add_trait("in_2", File(output=False))
self.add_trait("out", List(File(), output=True))
def _run_process(self):
self.out = [self.in_1, self.in_2]
class TestListInt(Process):
def __init__(self):
super(TestListInt, self).__init__()
self.add_trait("in_1", List(Int(), output=False))
self.add_trait("in_2", List(Int(), output=False))
self.add_trait("out", List(Int(), output=True))
def _run_process(self):
l = []
for idx, i in enumerate(self.in_1):
l.append(i + self.in_2[idx])
self.out = l
class TestListFloat(Process):
def __init__(self):
super(TestListFloat, self).__init__()
self.add_trait("in_1", List(Float(), output=False))
self.add_trait("in_2", List(Float(), output=False))
self.add_trait("out", List(Float(), output=True))
def _run_process(self):
l = []
for idx, i in enumerate(self.in_1):
l.append(i - self.in_2[idx])
self.out = l
class TestListString(Process):
def __init__(self):
super(TestListString, self).__init__()
self.add_trait("in_1", List(String(), output=False))
self.add_trait("in_2", List(String(), output=False))
self.add_trait("out", List(String(), output=True))
def _run_process(self):
l = []
for idx, i in enumerate(self.in_1):
l.append(i + self.in_2[idx])
self.out = l
class TestListFile(Process):
def __init__(self):
super(TestListFile, self).__init__()
self.add_trait("in_1", List(File(), output=False))
self.add_trait("in_2", List(File(), output=False))
self.add_trait("out", List(File(), output=True))
def _run_process(self):
self.out = [self.in_1[0], self.in_2[0]]
class TestListList(Process):
def __init__(self):
super(TestListList, self).__init__()
self.add_trait("in_1", List(List(Int()), output=False))
self.add_trait("in_2", List(List(Int()), output=False))
self.add_trait("out", List(Int(), output=True))
def _run_process(self):
l = []
for idx, i in enumerate(self.in_1):
l.append(i[0] + self.in_2[idx][0])
self.out = l
class TestDateTime(Process):
def __init__(self):
super(TestDateTime, self).__init__()
self.add_trait("in_1", Date(output=False))
self.add_trait("in_2", Time(output=False))
self.add_trait("out", List(output=True))
def _run_process(self):
self.out = [self.in_1, self.in_2]
#############################################################
# UNITTESTS DEFINITION #
#############################################################
class TestPipelineMethods(unittest.TestCase):
"""
Class executing the unit tests of load_pipeline_parameters and save_pipeline_parameters
"""
def setUp(self):
"""
Called before every unit test.
Creates a temporary folder containing the JSON file used by the test.
"""
self.temp_folder = tempfile.mkdtemp()
self.path = os.path.join(self.temp_folder, "test.json")
def tearDown(self):
"""
Called after every unit test.
Deletes the temporary folder created for the test.
"""
shutil.rmtree(self.temp_folder)
def test_int(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestInt())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = 2
in_2 = 4
out = 6
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), int)
self.assertEqual(type(pipeline1.in_2), int)
self.assertEqual(type(pipeline1.out), int)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), int)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), int)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), int)
def test_float(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestFloat())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = 2.0
in_2 = 4.0
out = -2.0
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), float)
self.assertEqual(type(pipeline1.in_2), float)
self.assertEqual(type(pipeline1.out), float)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), float)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), float)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), float)
def test_string(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestString())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = "This is "
in_2 = "a test"
out = "This is " + "a test"
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), str)
self.assertEqual(type(pipeline1.in_2), str)
self.assertEqual(type(pipeline1.out), str)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), six.text_type)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), six.text_type)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), six.text_type)
def test_file(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestFile())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = '/tmp/yolo.nii'
in_2 = '/tmp/yolo2.nii'
out = ['/tmp/yolo.nii', '/tmp/yolo2.nii']
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), six.text_type)
self.assertEqual(type(pipeline1.in_2), six.text_type)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), six.text_type)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), six.text_type)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), six.text_type)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
def test_list_int(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestListInt())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = [2, 4, 5]
in_2 = [4, 8, 9]
out = [6, 12, 14]
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), TraitListObject)
self.assertEqual(type(pipeline1.in_2), TraitListObject)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.in_1):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), int)
for idx, element in enumerate(pipeline1.in_2):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), int)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), int)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
for idx, element in enumerate(dic["pipeline_parameters"]["in_1"]):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), int)
for idx, element in enumerate(dic["pipeline_parameters"]["in_2"]):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), int)
for idx, element in enumerate(dic["pipeline_parameters"]["out"]):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), int)
def test_list_float(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestListFloat())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = [2.0, 4.0, 5.0]
in_2 = [4.0, 8.0, 9.0]
out = [-2.0, -4.0, -4.0]
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), TraitListObject)
self.assertEqual(type(pipeline1.in_2), TraitListObject)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.in_1):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), float)
for idx, element in enumerate(pipeline1.in_2):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), float)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), float)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
for idx, element in enumerate(dic["pipeline_parameters"]["in_1"]):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), float)
for idx, element in enumerate(dic["pipeline_parameters"]["in_2"]):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), float)
for idx, element in enumerate(dic["pipeline_parameters"]["out"]):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), float)
def test_list_string(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestListString())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = ["hello ", "hey "]
in_2 = ["salut", "coucou"]
out = ["hello salut", "hey coucou"]
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), TraitListObject)
self.assertEqual(type(pipeline1.in_2), TraitListObject)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.in_1):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), str)
for idx, element in enumerate(pipeline1.in_2):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), str)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), str)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
for idx, element in enumerate(dic["pipeline_parameters"]["in_1"]):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), six.text_type)
for idx, element in enumerate(dic["pipeline_parameters"]["in_2"]):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), six.text_type)
for idx, element in enumerate(dic["pipeline_parameters"]["out"]):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), six.text_type)
def test_list_file(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestListFile())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = ["/tmp/yolo.txt", "/tmp/yolo2.txt"]
in_2 = ["/tmp/yolo.nii", "/tmp/yolo2.nii"]
out = ["/tmp/yolo.txt", "/tmp/yolo.nii"]
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), TraitListObject)
self.assertEqual(type(pipeline1.in_2), TraitListObject)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.in_1):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), six.text_type)
for idx, element in enumerate(pipeline1.in_2):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), six.text_type)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), six.text_type)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
for idx, element in enumerate(dic["pipeline_parameters"]["in_1"]):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), six.text_type)
for idx, element in enumerate(dic["pipeline_parameters"]["in_2"]):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), six.text_type)
for idx, element in enumerate(dic["pipeline_parameters"]["out"]):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), six.text_type)
def test_list_list(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestListList())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = [[1, 1, 1], [2, 2, 2], [3, 3, 3]]
in_2 = [[2, 2, 2], [3, 3, 3], [4, 4, 4]]
out = [3, 5, 7]
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, in_1)
self.assertEqual(pipeline1.in_2, in_2)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.in_1), TraitListObject)
self.assertEqual(type(pipeline1.in_2), TraitListObject)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.in_1):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), TraitListObject)
for idx, element in enumerate(pipeline1.in_2):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), TraitListObject)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), int)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], in_1)
self.assertEqual(dic["pipeline_parameters"]["in_2"], in_2)
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), list)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
for idx, element in enumerate(dic["pipeline_parameters"]["in_1"]):
self.assertEqual(element, in_1[idx])
self.assertEqual(type(element), list)
for idx, element in enumerate(dic["pipeline_parameters"]["in_2"]):
self.assertEqual(element, in_2[idx])
self.assertEqual(type(element), list)
for idx, element in enumerate(dic["pipeline_parameters"]["out"]):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), int)
def test_date_time(self):
class Pipeline1(Pipeline):
def pipeline_definition(self):
# Create processes
self.add_process("node_1", TestDateTime())
# Exports
self.export_parameter("node_1", "in_1", "in_1")
self.export_parameter("node_1", "in_2", "in_2")
self.export_parameter("node_1", "out", "out")
in_1 = date(2008, 6, 5)
in_2 = time(14, 4, 5)
out = ['2008-06-05', '14:04:05']
pipeline1 = Pipeline1()
pipeline1.in_1 = in_1
pipeline1.in_2 = in_2
pipeline1()
save_pipeline_parameters(self.path, pipeline1)
# Reinitializing pipeline and loading parameters
pipeline1 = Pipeline1()
load_pipeline_parameters(self.path, pipeline1)
self.assertEqual(pipeline1.in_1, None)
self.assertEqual(pipeline1.in_2, None)
self.assertEqual(pipeline1.out, out)
self.assertEqual(type(pipeline1.out), TraitListObject)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, out[idx])
self.assertEqual(type(element), six.text_type)
# Verifying the dictionary
dic = load_pipeline_dictionary(self.path)
self.assertEqual(dic["pipeline_parameters"]["in_1"], six.text_type(in_1))
self.assertEqual(dic["pipeline_parameters"]["in_2"], six.text_type(in_2))
self.assertEqual(dic["pipeline_parameters"]["out"], out)
self.assertEqual(type(dic["pipeline_parameters"]["in_1"]), six.text_type)
self.assertEqual(type(dic["pipeline_parameters"]["in_2"]), six.text_type)
self.assertEqual(type(dic["pipeline_parameters"]["out"]), list)
for idx, element in enumerate(pipeline1.out):
self.assertEqual(element, six.text_type(out[idx]))
self.assertEqual(type(element), six.text_type)
# a function test*() has to be defined in a test module in order to be
# taken into account by the main test module capsul.test.test_capsul
def test():
suite = unittest.TestLoader().loadTestsFromTestCase(TestPipelineMethods)
runtime = unittest.TextTestRunner(verbosity=2).run(suite)
return runtime.wasSuccessful()
if __name__ == '__main__':
test()
| 35.840561 | 105 | 0.61536 | 3,375 | 28,099 | 4.907259 | 0.053333 | 0.166647 | 0.104396 | 0.069436 | 0.879 | 0.855271 | 0.841867 | 0.813489 | 0.782695 | 0.775752 | 0 | 0.029331 | 0.248941 | 28,099 | 783 | 106 | 35.886335 | 0.755449 | 0.059148 | 0 | 0.69962 | 0 | 0 | 0.096832 | 0 | 0.001901 | 0 | 0 | 0 | 0.34981 | 1 | 0.08365 | false | 0 | 0.024715 | 0 | 0.152091 | 0.001901 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1de87f45aa5eddeb764b1954037128c438044bab | 113 | py | Python | tests/test_dice.py | Beltro39/ci-me-dice-on-demand | 48ca68da2ba897482624039ed469ca14a1a69df7 | [
"MIT"
] | null | null | null | tests/test_dice.py | Beltro39/ci-me-dice-on-demand | 48ca68da2ba897482624039ed469ca14a1a69df7 | [
"MIT"
] | null | null | null | tests/test_dice.py | Beltro39/ci-me-dice-on-demand | 48ca68da2ba897482624039ed469ca14a1a69df7 | [
"MIT"
] | null | null | null | import unittest
import app
def test_test():
assert app.test() == "Works2!"
#assert "Works!" == "Works!"
| 16.142857 | 34 | 0.619469 | 14 | 113 | 4.928571 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011111 | 0.20354 | 113 | 6 | 35 | 18.833333 | 0.755556 | 0.238938 | 0 | 0 | 0 | 0 | 0.082353 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
38487c314c15d4bad825dab4546ec7f23ac689e6 | 28 | py | Python | methods/raft/model/__init__.py | awaelchli/torch-optical-flow | 1f48d95b8f3412052f7c35eb2ec1fa7cb739efe1 | [
"MIT"
] | null | null | null | methods/raft/model/__init__.py | awaelchli/torch-optical-flow | 1f48d95b8f3412052f7c35eb2ec1fa7cb739efe1 | [
"MIT"
] | null | null | null | methods/raft/model/__init__.py | awaelchli/torch-optical-flow | 1f48d95b8f3412052f7c35eb2ec1fa7cb739efe1 | [
"MIT"
] | 1 | 2021-11-14T09:13:03.000Z | 2021-11-14T09:13:03.000Z | from model.raft import RAFT
| 14 | 27 | 0.821429 | 5 | 28 | 4.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 28 | 1 | 28 | 28 | 0.958333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
69c1e61166db7ff7dbc9f4aa028f7c8407a9a40a | 31 | py | Python | app/__init__.py | chengxianga2008/abn_amro | 66172747328b33a591ea4e4fcbb902cb823b91e0 | [
"BSD-2-Clause"
] | null | null | null | app/__init__.py | chengxianga2008/abn_amro | 66172747328b33a591ea4e4fcbb902cb823b91e0 | [
"BSD-2-Clause"
] | null | null | null | app/__init__.py | chengxianga2008/abn_amro | 66172747328b33a591ea4e4fcbb902cb823b91e0 | [
"BSD-2-Clause"
] | null | null | null | from .core import daily_summary | 31 | 31 | 0.870968 | 5 | 31 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 31 | 1 | 31 | 31 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
69ca3c7f1b94eaf970a1efb39d86b5d093ac421d | 82 | py | Python | Kasa/__init__.py | Hotlynn2/kasa | aa4d18723451608bd4f008552d645b2b38b7daba | [
"MIT"
] | 1 | 2021-03-28T18:32:07.000Z | 2021-03-28T18:32:07.000Z | Kasa/__init__.py | Hotlynn2/kasa | aa4d18723451608bd4f008552d645b2b38b7daba | [
"MIT"
] | null | null | null | Kasa/__init__.py | Hotlynn2/kasa | aa4d18723451608bd4f008552d645b2b38b7daba | [
"MIT"
] | null | null | null | from .preprocessing import *
from .berttokenizer import *
from .trainbert import * | 27.333333 | 28 | 0.792683 | 9 | 82 | 7.222222 | 0.555556 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134146 | 82 | 3 | 29 | 27.333333 | 0.915493 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3888171cad13e1ba81220af2a99577d7590f3b2c | 71 | py | Python | python/src/test/resources/pyfunc/math_log10_test.py | maropu/lljvm-translator | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 70 | 2017-12-12T10:54:00.000Z | 2022-03-22T07:45:19.000Z | python/src/test/resources/pyfunc/math_log10_test.py | maropu/lljvm-as | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 14 | 2018-02-28T01:29:46.000Z | 2019-12-10T01:42:22.000Z | python/src/test/resources/pyfunc/math_log10_test.py | maropu/lljvm-as | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 4 | 2019-07-21T07:58:25.000Z | 2021-02-01T09:46:59.000Z | import math
def math_log10_test(x, y):
return 2 * y + math.log10(x)
| 14.2 | 30 | 0.676056 | 14 | 71 | 3.285714 | 0.642857 | 0.391304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087719 | 0.197183 | 71 | 4 | 31 | 17.75 | 0.719298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 6 |
38974a0e5002dfbe4f9e89c704a3bb19b7dd1c25 | 17,456 | py | Python | userbot/plugins/animazioni3.py | Kazutettoh/strafattinoh-bot | e8ab44b6e720c8133fd43695355fabf20d37fe1c | [
"MIT"
] | null | null | null | userbot/plugins/animazioni3.py | Kazutettoh/strafattinoh-bot | e8ab44b6e720c8133fd43695355fabf20d37fe1c | [
"MIT"
] | null | null | null | userbot/plugins/animazioni3.py | Kazutettoh/strafattinoh-bot | e8ab44b6e720c8133fd43695355fabf20d37fe1c | [
"MIT"
] | null | null | null | """
Commands:
.avast
.avast1
.call
.hack
.linux
.macos
.stock
.windows
"""
import asyncio
from telethon import events
from platform import uname
from userbot import CMD_HELP, ALIVE_NAME
from userbot.utils import admin_cmd
DEFAULTUSER = str(ALIVE_NAME) if ALIVE_NAME else "I'M STUPID"
@borg.on(admin_cmd(pattern=f"avast", outgoing=True))
async def _(event):
if event.fwd_from:
return
animation_interval = 0.1
animation_ttl = range(0, 11)
await event.edit("avast")
animation_chars = [
"`Downloading File..`",
"`File Downloaded....`",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 0%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 4%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 8%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 20%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 36%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 52%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 84%\n█████████████████████▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 100%\n█████████████████████████ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nScan: 01 of 01 Files Scanned...\n\nSTATUS: No Virus Detected...`"
]
for i in animation_ttl:
await asyncio.sleep(animation_interval)
await event.edit(animation_chars[i % 11])
@borg.on(admin_cmd(pattern=f"avast1", outgoing=True))
async def _(event):
if event.fwd_from:
return
animation_interval = 1
animation_ttl = range(0, 11)
await event.edit("avast1")
animation_chars = [
"`Downloading File..`",
"`File Downloaded....`",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 0%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 4%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 8%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 20%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 36%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 52%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 84%\n█████████████████████▒▒▒▒ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nFile Scanned... 100%\n█████████████████████████ `",
"`Avast Security Checkup\n\n\nAccount: User Pro\nExpiry: 31/12/2099\n\nScan: 01 of 01 Files Scanned...\n\nSTATUS: ⚠️Virus Detected⚠️\nINFO: Trojan, Spyware, Adware`"
]
for i in animation_ttl:
await asyncio.sleep(animation_interval)
await event.edit(animation_chars[i % 11])
@borg.on(admin_cmd(pattern="call", outgoing=True))
async def _(event):
    if event.fwd_from:
        return
    animation_interval = 3
    animation_ttl = range(0, 18)
    # input_str = event.pattern_match.group(1)
    # if input_str == "call":
    await event.edit("call")
    animation_chars = [
        "`Chiamata alla sede di Telegram...`",
        "`Chiamata Connessa`",
        "`Telegram: Salve, risponde la sede di Telegram. Chi è lei?`",
        f"{DEFAULTUSER}:`Salve sono` {DEFAULTUSER} ,`Devo parlare con il mio socio ,Pavel Durov`",
        "`User Autorizzato`",
        "`Chiamata a Pavel Durov` `+3969696969`",
        "`Chiamata Connessa`",
        f"{DEFAULTUSER}:`Banna questo account da Telegram`",
        "`Pavel: Posso sapere chi sei?`",
        f"{DEFAULTUSER}:`Yo bro, sono il tuo socio`",
        "`Pavel: OMG!!! Ma è da tanto che non ci vediamo, bro...\nMi assicurerò io che l'account venga bloccato entro 24 ore.`",
        f"{DEFAULTUSER}:`Grazie, a dopo bro.`",
        "`Pavel: Ma va bro, telegram è nostro. Chiamami quando sei libero`",
        f"{DEFAULTUSER}:`C'è qualche problema bro?🤔`",
        "`Pavel: Sì bro, c'è un bug in telegram v8.6.9.\nNon sono in grado di risolverlo. Mi, aiuti a correggere il bug?`",
        f"{DEFAULTUSER}:`Inviami tutto in chat, risolverò il bug.`",
        "`Pavel: Grazie bro \nCi sentiamo :)`",
        "`Chiamata Disconnessa.`"
    ]
    for i in animation_ttl:
        await asyncio.sleep(animation_interval)
        await event.edit(animation_chars[i % 18])
@borg.on(admin_cmd(pattern="hack", outgoing=True))
async def _(event):
    if event.fwd_from:
        return
    animation_interval = 2
    animation_ttl = range(0, 12)
    # input_str = event.pattern_match.group(1)
    # if input_str == "hack":
    await event.edit("hack")
    animation_chars = [
        "**Connessione a Telegram Data Center**",
        f"Target Selected By Hacker: {DEFAULTUSER}",
        "`Hacking... 0%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)",
        "`Hacking... 4%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package",
        "`Hacking... 8%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)",
        "`Hacking... 20%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)\nBuilding wheel for Tg-Bruteforcing (setup.py): finished with status 'done'",
        "`Hacking... 36%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)\nBuilding wheel for Tg-Bruteforcing (setup.py): finished with status 'done'\nCreated wheel for telegram: filename=Telegram-Data-Sniffer-0.0.1-py3-none-any.whl size=1306 sha256=cb224caad7fe01a6649188c62303cd4697c1869fa12d280570bb6ac6a88e6b7e",
        "`Hacking... 52%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)\nBuilding wheel for Tg-Bruteforcing (setup.py): finished with status 'done'\nCreated wheel for telegram: filename=Telegram-Data-Sniffer-0.0.1-py3-none-any.whl size=1306 sha256=cb224caad7fe01a6649188c62303cd4697c1869fa12d280570bb6ac6a88e6b7e\n Stored in directory: /app/.cache/pip/wheels/a2/9f/b5/650dd4d533f0a17ca30cc11120b176643d27e0e1f5c9876b5b",
        "`Hacking... 84%\n█████████████████████▒▒▒▒ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)\nBuilding wheel for Tg-Bruteforcing (setup.py): finished with status 'done'\nCreated wheel for telegram: filename=Telegram-Data-Sniffer-0.0.1-py3-none-any.whl size=1306 sha256=cb224caad7fe01a6649188c62303cd4697c1869fa12d280570bb6ac6a88e6b7e\n Stored in directory: /app/.cache/pip/wheels/a2/9f/b5/650dd4d533f0a17ca30cc11120b176643d27e0e1f5c9876b5b\n\n **Successfully Hacked Telegram Server Database**",
        "`Hacking... 100%\n█████████HACKED███████████ `\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)\nBuilding wheel for Tg-Bruteforcing (setup.py): finished with status 'done'\nCreated wheel for telegram: filename=Telegram-Data-Sniffer-0.0.1-py3-none-any.whl size=1306 sha256=cb224caad7fe01a6649188c62303cd4697c1869fa12d280570bb6ac6a88e6b7e\n Stored in directory: /app/.cache/pip/wheels/a2/9f/b5/650dd4d533f0a17ca30cc11120b176643d27e0e1f5c9876b5b\n\n **Successfully Hacked Telegram Server Database**\n\n\n🔹Output: Generating.....",
        f"`Account Hackerato...\n\nPaga 699€ a` {DEFAULTUSER} o @strafattino .`Per Rimuovere questo VIRUS`\n\n\n TERMINAL:\nDownloading Bruteforce-Telegram-0.1.tar.gz (9.3 kB)\nCollecting Data Package\n Downloading Telegram-Data-Sniffer-7.1.1-py2.py3-none-any.whl (82 kB)\nBuilding wheel for Tg-Bruteforcing (setup.py): finished with status 'done'\nCreated wheel for telegram: filename=Telegram-Data-Sniffer-0.0.1-py3-none-any.whl size=1306 sha256=cb224caad7fe01a6649188c62303cd4697c1869fa12d280570bb6ac6a88e6b7e\n Stored in directory: /app/.cache/pip/wheels/a2/9f/b5/650dd4d533f0a17ca30cc11120b176643d27e0e1f5c9876b5b\n\n **Successfully Hacked Telegram Server Database**\n\n\n🔹**Output:** Successful"
    ]
    for i in animation_ttl:
        await asyncio.sleep(animation_interval)
        await event.edit(animation_chars[i % 12])
@borg.on(admin_cmd(pattern="linux", outgoing=True))
async def _(event):
    if event.fwd_from:
        return
    animation_interval = 0.5
    animation_ttl = range(0, 11)
    # input_str = event.pattern_match.group(1)
    # if input_str == "linux":
    await event.edit("linux")
    animation_chars = [
        "`Connessione a Linux...`",
        "`Inizializza Linux Login.`",
        "`Loading Linux... 0%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Linux... 3%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Linux... 9%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Linux... 23%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Linux... 39%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Linux... 69%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Linux... 89%\n█████████████████████▒▒▒▒ `",
        "`Loading Linux... 100%\n█████████████████████████ `",
        "`Welcome...\n\nStock OS: Symbian OS\nCurrent OS: Linux`\n\n**My PC Specs:**\n\n **CPU:** __2.9GHz Intel Core i9-8950HK (hexa-core, 12MB cache, up to 4.8GHz)__\n\n**Graphics:** __Nvidia GeForce GTX 1080 OC (8GB GDDR5X)__\n\n**RAM:** __32GB DDR4 (2,666MHz)__\n\n**Screen:** __17.3-inch, QHD (2,560 x 1,440) 120Hz G-Sync__\n\n**Storage:** __512GB PCIe SSD, 1TB HDD (7,200 rpm)__\n\n**Ports:** __2 x USB 3.0, 1 x USB-C 3.0, 1 x USB-C (Thunderbolt 3), HDMI, mini DisplayPort, Ethernet, headphone jack, microphone jack__\n\n**Connectivity:** __Killer 1550 802.11ac Wi-Fi, Bluetooth 5.0__\n\n**Camera:** __Alienware FHD camera, Tobii IR Eye-tracking with Windows Hello__\n\n**Size:** __16.7 x 13.1 x 1.18 inches (42.4 x 33.2 x 2.99cm; W x D x H)__"
    ]
    for i in animation_ttl:
        await asyncio.sleep(animation_interval)
        await event.edit(animation_chars[i % 11])
@borg.on(admin_cmd(pattern="macos", outgoing=True))
async def _(event):
    if event.fwd_from:
        return
    animation_interval = 0.5
    animation_ttl = range(0, 11)
    # input_str = event.pattern_match.group(1)
    # if input_str == "macos":
    await event.edit("macos")
    animation_chars = [
        "`Connessione a Hackintosh...`",
        "`Inizializza Hackintosh Login.`",
        "`Loading Hackintosh... 0%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Hackintosh... 3%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Hackintosh... 9%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Hackintosh... 23%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Hackintosh... 39%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Hackintosh... 69%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Hackintosh... 89%\n█████████████████████▒▒▒▒ `",
        "`Loading Hackintosh... 100%\n█████████████████████████ `",
        "`Welcome...\n\nStock OS: Symbian OS\nCurrent OS: Hackintosh`\n\n**My PC Specs:**\n\n **CPU:** __2.9GHz Intel Core i9-8950HK (hexa-core, 12MB cache, up to 4.8GHz)__\n\n**Graphics:** __Nvidia GeForce GTX 1080 OC (8GB GDDR5X)__\n\n**RAM:** __32GB DDR4 (2,666MHz)__\n\n**Screen:** __17.3-inch, QHD (2,560 x 1,440) 120Hz G-Sync__\n\n**Storage:** __512GB PCIe SSD, 1TB HDD (7,200 rpm)__\n\n**Ports:** __2 x USB 3.0, 1 x USB-C 3.0, 1 x USB-C (Thunderbolt 3), HDMI, mini DisplayPort, Ethernet, headphone jack, microphone jack__\n\n**Connectivity:** __Killer 1550 802.11ac Wi-Fi, Bluetooth 5.0__\n\n**Camera:** __Alienware FHD camera, Tobii IR Eye-tracking with Windows Hello__\n\n**Size:** __16.7 x 13.1 x 1.18 inches (42.4 x 33.2 x 2.99cm; W x D x H)__"
    ]
    for i in animation_ttl:
        await asyncio.sleep(animation_interval)
        await event.edit(animation_chars[i % 11])
@borg.on(admin_cmd(pattern="stock", outgoing=True))
async def _(event):
    if event.fwd_from:
        return
    animation_interval = 0.5
    animation_ttl = range(0, 11)
    # input_str = event.pattern_match.group(1)
    # if input_str == "stock":
    await event.edit("stock")
    animation_chars = [
        "`Connessione a Symbian OS...`",
        "`Inizializza Symbian OS Login.`",
        "`Loading Symbian OS... 0%\n█████████████████████████ `",
        "`Loading Symbian OS... 3%\n█████████████████████▒▒▒▒ `",
        "`Loading Symbian OS... 9%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Symbian OS... 23%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Symbian OS... 39%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Symbian OS... 69%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Symbian OS... 89%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Symbian OS... 100%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Welcome...\n\nStock OS: Symbian OS\nCurrent OS: Symbian OS`\n\n**My PC Specs:**\n\n **CPU:** __2.9GHz Intel Core i9-8950HK (hexa-core, 12MB cache, up to 4.8GHz)__\n\n**Graphics:** __Nvidia GeForce GTX 1080 OC (8GB GDDR5X)__\n\n**RAM:** __32GB DDR4 (2,666MHz)__\n\n**Screen:** __17.3-inch, QHD (2,560 x 1,440) 120Hz G-Sync__\n\n**Storage:** __512GB PCIe SSD, 1TB HDD (7,200 rpm)__\n\n**Ports:** __2 x USB 3.0, 1 x USB-C 3.0, 1 x USB-C (Thunderbolt 3), HDMI, mini DisplayPort, Ethernet, headphone jack, microphone jack__\n\n**Connectivity:** __Killer 1550 802.11ac Wi-Fi, Bluetooth 5.0__\n\n**Camera:** __Alienware FHD camera, Tobii IR Eye-tracking with Windows Hello__\n\n**Size:** __16.7 x 13.1 x 1.18 inches (42.4 x 33.2 x 2.99cm; W x D x H)__"
    ]
    for i in animation_ttl:
        await asyncio.sleep(animation_interval)
        await event.edit(animation_chars[i % 11])
@borg.on(admin_cmd(pattern="windows", outgoing=True))
async def _(event):
    if event.fwd_from:
        return
    animation_interval = 0.5
    animation_ttl = range(0, 11)
    # input_str = event.pattern_match.group(1)
    # if input_str == "windows":
    await event.edit("windows")
    animation_chars = [
        "`Connessione a Windows 10...`",
        "`Inizializza Windows 10 Login.`",
        "`Loading Windows 10... 0%\n▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Windows 10... 3%\n█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Windows 10... 9%\n██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Windows 10... 23%\n█████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Windows 10... 39%\n█████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Windows 10... 69%\n█████████████▒▒▒▒▒▒▒▒▒▒▒▒ `",
        "`Loading Windows 10... 89%\n█████████████████████▒▒▒▒ `",
        "`Loading Windows 10... 100%\n█████████████████████████ `",
        "`Welcome...\n\nStock OS: Symbian OS\nCurrent OS: Windows 10`\n\n**My PC Specs:**\n\n **CPU:** __2.9GHz Intel Core i9-8950HK (hexa-core, 12MB cache, up to 4.8GHz)__\n\n**Graphics:** __Nvidia GeForce GTX 1080 OC (8GB GDDR5X)__\n\n**RAM:** __32GB DDR4 (2,666MHz)__\n\n**Screen:** __17.3-inch, QHD (2,560 x 1,440) 120Hz G-Sync__\n\n**Storage:** __512GB PCIe SSD, 1TB HDD (7,200 rpm)__\n\n**Ports:** __2 x USB 3.0, 1 x USB-C 3.0, 1 x USB-C (Thunderbolt 3), HDMI, mini DisplayPort, Ethernet, headphone jack, microphone jack__\n\n**Connectivity:** __Killer 1550 802.11ac Wi-Fi, Bluetooth 5.0__\n\n**Camera:** __Alienware FHD camera, Tobii IR Eye-tracking with Windows Hello__\n\n**Size:** __16.7 x 13.1 x 1.18 inches (42.4 x 33.2 x 2.99cm; W x D x H)__"
    ]
    for i in animation_ttl:
        await asyncio.sleep(animation_interval)
        await event.edit(animation_chars[i % 11])
2a16d1f1e10a6df0758b91d080a65afcafce61f8 | 3,631 | py | Python | tests/test_modules/test_arcface.py | plutoyuxie/mmgeneration | 0a7f5d16c970de1766ebf049d7a0264fe506504b | ["Apache-2.0"] | null | null | null | tests/test_modules/test_arcface.py | plutoyuxie/mmgeneration | 0a7f5d16c970de1766ebf049d7a0264fe506504b | ["Apache-2.0"] | null | null | null | tests/test_modules/test_arcface.py | plutoyuxie/mmgeneration | 0a7f5d16c970de1766ebf049d7a0264fe506504b | ["Apache-2.0"] | null | null | null
from copy import deepcopy
import pytest
import torch
from mmgen.models.architectures import IDLossModel
# yapf:disable
from mmgen.models.architectures.arcface.model_irse import Backbone
# yapf:enable
class TestArcFace:

    @classmethod
    def setup_class(cls):
        cls.default_cfg = dict(
            input_size=224,
            num_layers=50,
            mode='ir',
            drop_ratio=0.4,
            affine=True)

    def test_arcface_cpu(self):
        model = Backbone(**self.default_cfg)
        x = torch.randn((2, 3, 224, 224))
        y = model(x)
        assert y.shape == (2, 512)

        # test different input size
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(input_size=112))
        model = Backbone(**cfg)
        x = torch.randn((2, 3, 112, 112))
        y = model(x)
        assert y.shape == (2, 512)

        # test different num_layers
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(num_layers=50))
        model = Backbone(**cfg)
        x = torch.randn((2, 3, 224, 224))
        y = model(x)
        assert y.shape == (2, 512)

        # test different mode
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(mode='ir_se'))
        model = Backbone(**cfg)
        x = torch.randn((2, 3, 224, 224))
        y = model(x)
        assert y.shape == (2, 512)

        # test different drop ratio
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(drop_ratio=0.8))
        model = Backbone(**cfg)
        x = torch.randn((2, 3, 224, 224))
        y = model(x)
        assert y.shape == (2, 512)

        # test affine=False
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(affine=False))
        model = Backbone(**cfg)
        x = torch.randn((2, 3, 224, 224))
        y = model(x)
        assert y.shape == (2, 512)
    @pytest.mark.skipif(not torch.cuda.is_available(), reason='requires cuda')
    def test_arcface_cuda(self):
        model = Backbone(**self.default_cfg).cuda()
        x = torch.randn((2, 3, 224, 224)).cuda()
        y = model(x)
        assert y.shape == (2, 512)

        # test different input size
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(input_size=112))
        model = Backbone(**cfg).cuda()
        x = torch.randn((2, 3, 112, 112)).cuda()
        y = model(x)
        assert y.shape == (2, 512)

        # test different num_layers
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(num_layers=50))
        model = Backbone(**cfg).cuda()
        x = torch.randn((2, 3, 224, 224)).cuda()
        y = model(x)
        assert y.shape == (2, 512)

        # test different mode
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(mode='ir_se'))
        model = Backbone(**cfg).cuda()
        x = torch.randn((2, 3, 224, 224)).cuda()
        y = model(x)
        assert y.shape == (2, 512)

        # test different drop ratio
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(drop_ratio=0.8))
        model = Backbone(**cfg).cuda()
        x = torch.randn((2, 3, 224, 224)).cuda()
        y = model(x)
        assert y.shape == (2, 512)

        # test affine=False
        cfg = deepcopy(self.default_cfg)
        cfg.update(dict(affine=False))
        model = Backbone(**cfg).cuda()
        x = torch.randn((2, 3, 224, 224)).cuda()
        y = model(x)
        assert y.shape == (2, 512)

        # test loss model
        id_loss_model = IDLossModel()
        x1 = torch.randn((2, 3, 224, 224)).cuda()
        x2 = torch.randn((2, 3, 224, 224)).cuda()
        y, _ = id_loss_model(pred=x1, gt=x2)
        assert y >= 0
2a2b78e6147109360dd2adb3dd25ac9a990ef925 | 29 | py | Python | mydip/__init__.py | kommunium/dip-lab | 2c8e08a994fb34b87da55da48a7b72b7c13d9c81 | ["MIT"] | null | null | null | mydip/__init__.py | kommunium/dip-lab | 2c8e08a994fb34b87da55da48a7b72b7c13d9c81 | ["MIT"] | 10 | 2020-02-18T00:37:32.000Z | 2022-03-12T00:17:58.000Z | mydip/__init__.py | kommunium/dip-lab | 2c8e08a994fb34b87da55da48a7b72b7c13d9c81 | ["MIT"] | null | null | null
import numpy as np
import PIL
2a5b0dc0ffe6c61566679afaae519f65c95f4c02 | 14,155 | py | Python | loaderscript.py | sayanmutd/Project-DeepView | 3c9fd134085b38e42f3439e2eda97f4a1606c9e6 | ["MIT"] | 4 | 2019-05-11T12:22:11.000Z | 2020-06-22T05:28:18.000Z | loaderscript.py | sayanmutd/Project-DeepView | 3c9fd134085b38e42f3439e2eda97f4a1606c9e6 | ["MIT"] | null | null | null | loaderscript.py | sayanmutd/Project-DeepView | 3c9fd134085b38e42f3439e2eda97f4a1606c9e6 | ["MIT"] | 2 | 2019-05-11T12:22:17.000Z | 2021-12-18T22:32:38.000Z
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Sat Aug 11 13:29:25 2018
@author: rishav
"""
import json
import pandas as pd
import os
from sklearn.metrics import accuracy_score
import pickle
# creating dummy row
file = open('Violent/rgb_000350_keypoints.json', 'r')
datastore = json.load(file)
person1 = datastore['people'][0]['pose_keypoints_2d']
person2 = datastore['people'][1]['pose_keypoints_2d']
rperson1 = []
rperson2 = []
counter = 1
for dat in person1:
    if counter % 3 != 0:
        rperson1.append(dat)
    counter = counter + 1
counter = 1
# print(len(rperson1))
for dat in person2:
    if counter % 3 != 0:
        rperson2.append(dat)
    counter = counter + 1
datamerge = rperson1 + rperson2
datamerge.append(1)
df = pd.DataFrame([datamerge])
df2 = pd.DataFrame([datamerge])
df3 = pd.DataFrame([datamerge])
df4 = pd.DataFrame([datamerge])
# print(df)
def load_folder(df, folder, label):
    """Load OpenPose keypoint JSONs from `folder`, drop every third value
    (the confidence score), and append the merged two-person keypoint row
    with the given class `label` to `df`.

    Consolidates the previously duplicated per-folder blocks into a single
    function with passed parameters, as the original TODO comment requested.
    """
    global person1, person2
    for name in os.listdir(folder):
        if name.endswith(".json"):
            with open(os.path.join(folder, name), 'r') as f:
                datastore = json.load(f)
            try:
                person1 = datastore['people'][0]['pose_keypoints_2d']
                person2 = datastore['people'][1]['pose_keypoints_2d']
            except (IndexError, KeyError):
                # fewer than two people detected: reuse the previous keypoints
                print()
            rperson1 = []
            rperson2 = []
            counter = 1
            for dat in person1:
                if counter % 3 != 0:
                    rperson1.append(dat)
                counter = counter + 1
            counter = 1
            for dat in person2:
                if counter % 3 != 0:
                    rperson2.append(dat)
                counter = counter + 1
            datamerge = rperson1 + rperson2
            datamerge.append(label)
            df.loc[len(df)] = datamerge


# violent folders -> label 1, non-violent folders -> label 0
load_folder(df, "Violent", 1)
load_folder(df, "NonViolent", 0)
load_folder(df, "NV2", 0)
load_folder(df, "NV3", 0)
load_folder(df, "NV4", 0)
load_folder(df, "NV5", 0)
load_folder(df, "NV6", 0)
load_folder(df, "V2", 1)
load_folder(df, "V3", 1)
load_folder(df, "V4", 1)
load_folder(df, "V5", 1)
load_folder(df, "V6", 1)
load_folder(df2, "V7", 1)
load_folder(df3, "NV7", 0)
load_folder(df4, "TestO", 0)
df = df.sort_values(by=[72])
df = df.reset_index(drop=True)
X = df.iloc[363:, 0:72].values
y = df.iloc[363:, 72].values
XTEST = df2.iloc[1:, 0:72].values
yTEST = df3.iloc[1:, 0:72].values
Custom = df4.iloc[1:, 0:72].values

# Splitting the dataset into the Training set and Test set
# (sklearn.cross_validation was removed in newer scikit-learn versions)
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Feature Scaling
# from sklearn.preprocessing import MinMaxScaler
# sc = MinMaxScaler(feature_range=(0, 1))
# X_train = sc.fit_transform(X_train)
# X_test = sc.transform(X_test)

# Fitting a classifier to the Training set
# from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
# Linear SVM: classifier = SVC(kernel='linear', random_state=0)
# Sigmoid SVM: classifier = SVC(kernel='sigmoid', random_state=42)
# Polynomial SVM: classifier = SVC(kernel='poly', random_state=0, degree=3)
classifier = GaussianNB()
classifier.fit(X_train, y_train)

filename = "RBF.sav"
pickle.dump(classifier, open(filename, 'wb'))

# Visualising the Test set results
predicted_test = classifier.predict(X_test)
predicted_train = classifier.predict(X_train)
predicted_violent = classifier.predict(XTEST)
predicted_nonviolent = classifier.predict(yTEST)
predicted_Custom = classifier.predict(Custom)

# get the accuracy scores
test_accuracy = accuracy_score(y_test, predicted_test)
train_accuracy = accuracy_score(y_train, predicted_train)
print(test_accuracy, train_accuracy)
2a92f3938aa70fe13b295276ca4cac3ad5cb4fdd | 100 | py | Python | dual_rocks/user_profile/context_processors.py | dual-rocks/dual.rocks | 6231833fcc36839b1dc6de79edda99d9d15c2cfe | ["MIT"] | null | null | null | dual_rocks/user_profile/context_processors.py | dual-rocks/dual.rocks | 6231833fcc36839b1dc6de79edda99d9d15c2cfe | ["MIT"] | 10 | 2020-02-18T00:37:32.000Z | 2022-03-12T00:17:58.000Z | dual_rocks/user_profile/context_processors.py | dual-rocks/dual.rocks | 6231833fcc36839b1dc6de79edda99d9d15c2cfe | ["MIT"] | null | null | null
def current_profile(request):
    return {
        'current_profile': request.current_profile
    }
aa625cc2c3be65bfc42c780631044500aac6d0d9 | 8,045 | py | Python | code/trainUtils.py | AnilOsmanTur/Classifying-The-ID-Visibility | 58516ffa91bd15e968a54fc4e7a21730ceda8e36 | ["MIT"] | null | null | null | code/trainUtils.py | AnilOsmanTur/Classifying-The-ID-Visibility | 58516ffa91bd15e968a54fc4e7a21730ceda8e36 | ["MIT"] | null | null | null | code/trainUtils.py | AnilOsmanTur/Classifying-The-ID-Visibility | 58516ffa91bd15e968a54fc4e7a21730ceda8e36 | ["MIT"] | null | null | null
# -*- coding: utf-8 -*-
"""
Created on Sat Jan 18 17:19:52 2020
@author: anilosmantur
"""
import numpy as np
import time
import torch
import torch.optim as optim
from Model import CardModel
from sklearn import metrics
def init_model(path='../artifacts/modified_mobilenet_v2_features_state_dict.pth', load_model=False, cuda=False, lr=1e-3, decay_points=[], decay=0.1):
    print('initializing Model...')
    model = CardModel(path=path, load=(not load_model))
    if load_model:
        model.load_state_dict(torch.load(path), strict=True)
    if cuda:
        model.cuda()
    optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=lr, betas=(0.9, 0.999))
    scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=decay_points, gamma=decay)
    print('Model loaded...')
    return model, optimizer, scheduler
def calculate_predictions(out):
predicted = np.argmax(out, axis=1)
return predicted
def train(model, optimizer, loader, epoch_i, cuda=False):
model.train()
loss = 0.0
avg = 0.0
start = time.time()
loss_hist = []
predictions = []
labels = []
for batch_idx, data in enumerate(loader, 1):
batch_time_s = time.time()
y = torch.squeeze(data['label'])
x = data['image']
if cuda:
x = x.cuda()
y = y.cuda()
optimizer.zero_grad()
out = model(x)
loss = model.criter(out, y)
loss.backward()
optimizer.step()
out = out.cpu().detach().numpy()
preds = calculate_predictions(out)
predictions += list(preds)
y = y.cpu().detach().numpy()
labels += list(y)
# metrics
accuracy = metrics.accuracy_score(y, preds)
precision = metrics.precision_score(y, preds, average='weighted',zero_division=1)
recall = metrics.recall_score(y, preds, average='weighted',zero_division=1)
f1_score = metrics.f1_score(y, preds, average='weighted',zero_division=1)
loss = float(loss.detach())
loss_hist.append(loss)
avg += loss
spent_time = time.time() - batch_time_s
out_str = '\rTRAIN Epoch: {} Loss: {:.6f} Acc: {:5.2f} preci: {:.3f} recall: {:.3f} f1 score: {:.3f} time: {:.2f}{}'.format(
epoch_i, loss, 100*accuracy, precision, recall, f1_score, spent_time, 10*' ')
print('\r'+out_str, end='')
labels = np.array(labels)
predictions = np.array(predictions)
accuracy = metrics.accuracy_score(labels, predictions)
precision = metrics.precision_score(labels, predictions, average='weighted',zero_division=1)
recall = metrics.recall_score(labels, predictions, average='weighted',zero_division=1)
f1_score = metrics.f1_score(labels, predictions, average='weighted',zero_division=1)
total_time = time.time() - start
avg /= len(loader)
out_str = 'TRAIN Epoch: {} Avg Loss: {:.6f} Acc: {:5.2f} preci: {:.3f} recall: {:.3f} f1 score: {:.3f} time: {:.2f}{}'.format(
epoch_i, avg, 100*accuracy, precision, recall, f1_score, total_time, 10*' ')
print('\r'+out_str)
return loss_hist, avg, accuracy, precision, recall, f1_score
def validation(model, loader, epoch_i, cuda=False, type_t='VAL'):
with torch.no_grad():
model.eval()
loss = 0.0
avg = 0.0
start = time.time()
loss_hist = []
predictions = []
labels = []
for batch_idx, data in enumerate(loader, 1):
batch_time_s = time.time()
y = torch.squeeze(data['label'])
x = data['image']
if cuda:
x = x.cuda()
y = y.cuda()
out = model(x)
loss = float(model.criter(out, y).detach_())
loss_hist.append(loss)
avg += loss
out = out.cpu().detach().numpy()
preds = calculate_predictions(out)
predictions += list(preds)
y = y.cpu().detach().numpy()
labels += list(y)
# metrics
accuracy = metrics.accuracy_score(y, preds)
precision = metrics.precision_score(y, preds, average='weighted',zero_division=1)
recall = metrics.recall_score(y, preds, average='weighted',zero_division=1)
f1_score = metrics.f1_score(y, preds, average='weighted',zero_division=1)
spent_time = time.time() - batch_time_s
out_str = '\r{} Epoch: {} Loss: {:.6f} Acc: {:5.2f} preci: {:.3f} recall: {:.3f} f1 score: {:.3f} time: {:.2f}{}'.format(
type_t, epoch_i, loss, 100*accuracy, precision, recall, f1_score, spent_time, 10*' ')
print('\r'+out_str, end='')
labels = np.array(labels)
predictions = np.array(predictions)
accuracy = metrics.accuracy_score(labels, predictions)
precision = metrics.precision_score(labels, predictions, average='weighted',zero_division=1)
recall = metrics.recall_score(labels, predictions, average='weighted',zero_division=1)
f1_score = metrics.f1_score(labels, predictions, average='weighted',zero_division=1)
total_time = time.time() - start
avg /= len(loader)
out_str = '{} Epoch: {} Avg Loss: {:.6f} Acc: {:5.2f} preci: {:.3f} recall: {:.3f} f1 score: {:.3f} time: {:.2f}{}'.format(
type_t, epoch_i, avg, 100*accuracy, precision, recall, f1_score, total_time, 10*' ')
print('\r'+out_str)
return loss_hist, avg, accuracy, precision, recall, f1_score
def predict_from_loader(model, loader, cuda=False):
with torch.no_grad():
model.eval()
loss = 0.0
avg = 0.0
start = time.time()
loss_hist = []
predictions = []
labels = []
for batch_idx, data in enumerate(loader, 1):
batch_time_s = time.time()
y = torch.squeeze(data['label'])
x = data['image']
if cuda:
x = x.cuda()
y = y.cuda()
out = model(x)
loss = float(model.criter(out, y).detach_())
loss_hist.append(loss)
avg += loss
out = out.cpu().detach().numpy()
preds = calculate_predictions(out)
predictions += list(preds)
y = y.cpu().detach().numpy()
labels += list(y)
# metrics
accuracy = metrics.accuracy_score(y, preds)
precision = metrics.precision_score(y, preds, average='weighted',zero_division=1)
recall = metrics.recall_score(y, preds, average='weighted',zero_division=1)
f1_score = metrics.f1_score(y, preds, average='weighted',zero_division=1)
spent_time = time.time() - batch_time_s
out_str = '\r{:5.2f}% {}/{} Loss: {:.6f} Acc: {:5.2f} preci: {:.3f} recall: {:.3f} f1 score: {:.3f} time: {:.2f}{}'.format(
100*batch_idx/len(loader), batch_idx, len(loader), loss, 100*accuracy, precision, recall, f1_score, spent_time, 10*' ')
print('\r'+out_str, end='')
labels = np.array(labels)
predictions = np.array(predictions)
accuracy = metrics.accuracy_score(labels, predictions)
precision = metrics.precision_score(labels, predictions, average='weighted',zero_division=1)
recall = metrics.recall_score(labels, predictions, average='weighted',zero_division=1)
f1_score = metrics.f1_score(labels, predictions, average='weighted',zero_division=1)
total_time = time.time() - start
avg /= len(loader)
out_str = 'Avg Loss: {:.6f} Acc: {:5.2f} preci: {:.3f} recall: {:.3f} f1 score: {:.3f} time: {:.2f}{}'.format(
avg, 100*accuracy, precision, recall, f1_score, total_time, 10*' ')
print('\r'+out_str)
return labels, predictions | 39.436275 | 153 | 0.576134 | 976 | 8,045 | 4.605533 | 0.143443 | 0.040489 | 0.076085 | 0.10812 | 0.798888 | 0.789544 | 0.789544 | 0.782647 | 0.782647 | 0.775306 | 0 | 0.027821 | 0.280671 | 8,045 | 204 | 154 | 39.436275 | 0.74892 | 0.013052 | 0 | 0.740506 | 0 | 0.037975 | 0.113115 | 0.007314 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031646 | false | 0 | 0.037975 | 0 | 0.101266 | 0.050633 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
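The core prediction step shared by `train`, `validation`, and `predict_from_loader` above is a per-row argmax over the network's class scores, followed by metric computation against the labels. A self-contained sketch of that logic (the two-sample batch is made up for illustration):

```python
import numpy as np

# Sketch of what calculate_predictions does in trainUtils.py: the model
# emits one score per class, and the predicted label is the index of the
# largest score in each row.
def calculate_predictions(out):
    return np.argmax(out, axis=1)

out = np.array([[0.1, 2.0, -1.0],   # largest score at index 1
                [3.0, 0.2,  0.5]])  # largest score at index 0
preds = calculate_predictions(out)
print(preds.tolist())  # [1, 0]

# Plain accuracy against ground-truth labels, as a stand-in for the
# sklearn.metrics calls used in the training loop:
labels = np.array([1, 2])
accuracy = float((preds == labels).mean())
print(accuracy)  # 0.5
```

In the real module the same `preds`/`labels` arrays are also fed to `metrics.precision_score`, `metrics.recall_score`, and `metrics.f1_score` with `average='weighted'`.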
aaad629bc99d200a6033466bedfde4c405906ab6 | 57,874 | py | Python | freecad/Commands.py | Halfmuh/ef | 22fcb912da2c8b904fc9b5542d65718e6a0883f6 | [
"MIT"
] | 13 | 2016-08-09T13:38:35.000Z | 2021-11-14T11:25:57.000Z | freecad/Commands.py | Halfmuh/ef | 22fcb912da2c8b904fc9b5542d65718e6a0883f6 | [
"MIT"
] | 18 | 2018-04-21T19:26:34.000Z | 2019-02-14T10:51:37.000Z | freecad/Commands.py | Halfmuh/ef | 22fcb912da2c8b904fc9b5542d65718e6a0883f6 | [
"MIT"
] | 7 | 2017-02-22T09:12:42.000Z | 2019-07-07T07:49:02.000Z | import os
import FreeCAD, FreeCADGui
import Part
from math import *
from pivy import coin
from PySide import QtGui, QtCore
import subprocess
class CreateEfConfig():
"""Create objects for new ef config"""
def GetResources(self):
moddir = os.path.expanduser("~") + "/.FreeCAD/Mod/ef"
return {'Pixmap' : moddir + '/icons/new_conf_template.svg',
'Accel' : "Shift+N", # a default shortcut (optional)
'MenuText': "New minimal ef config",
'ToolTip' : "New minimal ef config"}
def Activated(self):
ef_conf_group = FreeCAD.ActiveDocument.addObject(
"App::DocumentObjectGroup", "ef_conf" )
time_grid_conf = ef_conf_group.newObject(
"App::FeaturePython", "Time grid" )
TimeGridConfigPart( time_grid_conf )
outfile_conf = ef_conf_group.newObject(
"App::FeaturePython", "Output filename")
OutputFilenameConfigPart( outfile_conf )
particle_interaction_conf = ef_conf_group.newObject(
"App::FeaturePython", "Particle interaction model")
ParticleInteractionModelConfigPart( particle_interaction_conf )
spat_mesh_conf = ef_conf_group.newObject(
"App::FeaturePython", "Spatial mesh" )
SpatialMeshConfigPart( spat_mesh_conf )
boundary_cond_conf = ef_conf_group.newObject(
"App::FeaturePython", "Boundary conditions" )
BoundaryConditionsConfigPart( boundary_cond_conf )
magn_field_conf = ef_conf_group.newObject(
"App::FeaturePython", "Magnetic field" )
MagneticFieldConfigPart( magn_field_conf )
run_ef = ef_conf_group.newObject(
"App::FeaturePython", "Run_Ef" )
RunEfConfig( run_ef )
FreeCAD.ActiveDocument.recompute()
return
def IsActive(self):
return (FreeCAD.ActiveDocument is not None)
class AddSourceRegion():
"""Add box-shaped source of particles"""
def GetResources(self):
moddir = os.path.expanduser("~") + "/.FreeCAD/Mod/ef"
return {'Pixmap' : moddir + '/icons/add_box_source.svg',
'Accel' : "Shift+S", # a default shortcut (optional)
'MenuText': "Add box-shaped source of particles",
'ToolTip' : "Add box-shaped source of particles"}
def Activated(self):
for ef_conf_group in self.selected_ef_conf_groups:
source_conf = ef_conf_group.newObject(
"App::FeaturePython", "Source" )
ParticleSourceConfigPart( source_conf )
FreeCAD.ActiveDocument.recompute()
return
def IsActive(self):
# Add source only if ef-group is selected
# todo: check if selected object is ef-conf group
# or directly belongs to ef-conf group
sel = FreeCADGui.Selection.getSelection()
self.selected_ef_conf_groups = []
active = False
for obj in sel:
if "ef" in obj.Name:
self.selected_ef_conf_groups.append( obj )
active = True
else:
for parent_obj in obj.InList:
if "ef" in parent_obj.Name:
self.selected_ef_conf_groups.append( parent_obj )
active = True
return active
class AddCylindricalSource():
"""Add cylindrical-shaped source of particles"""
def GetResources(self):
moddir = os.path.expanduser("~") + "/.FreeCAD/Mod/ef"
return {'Pixmap' : moddir + '/icons/add_cylindrical_source.ico',
'Accel' : "Shift+C", # a default shortcut (optional)
'MenuText': "Add cylindrical-shaped source of particles",
'ToolTip' : "Add cylindrical-shaped source of particles"}
def Activated(self):
for ef_conf_group in self.selected_ef_conf_groups:
source_conf = ef_conf_group.newObject(
"App::FeaturePython", "Cylindrical Source" )
ParticleCylindricalSourceConfigPart( source_conf )
FreeCAD.ActiveDocument.recompute()
return
def IsActive(self):
# Add source only if ef-group is selected
# todo: check if selected object is ef-conf group
# or directly belongs to ef-conf group
sel = FreeCADGui.Selection.getSelection()
self.selected_ef_conf_groups = []
active = False
for obj in sel:
if "ef" in obj.Name:
self.selected_ef_conf_groups.append( obj )
active = True
else:
for parent_obj in obj.InList:
if "ef" in parent_obj.Name:
self.selected_ef_conf_groups.append( parent_obj )
active = True
return active
class AddInnerRegionBox():
"""Add box inner region"""
def GetResources(self):
moddir = os.path.expanduser("~") + "/.FreeCAD/Mod/ef"
return {'Pixmap' : moddir + '/icons/add_box_inner_region.svg',
'Accel' : "Shift+R", # a default shortcut (optional)
'MenuText': "Add box-shaped inner region",
'ToolTip' : "Add box-shaped inner region"}
def Activated(self):
for ef_conf_group in self.selected_ef_conf_groups:
inner_reg_conf = ef_conf_group.newObject(
"App::FeaturePython", "Inner_region_box" )
InnerRegionBoxConfigPart( inner_reg_conf )
FreeCAD.ActiveDocument.recompute()
return
def IsActive(self):
# Add source only if ef-group is selected
# todo: check if selected object is ef-conf group
# or directly belongs to ef-conf group
sel = FreeCADGui.Selection.getSelection()
self.selected_ef_conf_groups = []
active = False
for obj in sel:
if "ef" in obj.Name:
self.selected_ef_conf_groups.append( obj )
active = True
else:
for parent_obj in obj.InList:
if "ef" in parent_obj.Name:
self.selected_ef_conf_groups.append( parent_obj )
active = True
return active
class GenerateConfFile():
"""Generate .conf file suitable for ef"""
def GetResources(self):
moddir = os.path.expanduser("~") + "/.FreeCAD/Mod/ef"
return {'Pixmap' : moddir + '/icons/generate_config.svg',
'Accel' : "Shift+G", # a default shortcut (optional)
'MenuText': "Generate .conf file",
'ToolTip' : "Generate .conf file"}
def IsActive(self):
# Add source only if ef-group is selected
# todo: check if selected object is ef-conf group
# or directly belongs to ef-conf group
sel = FreeCADGui.Selection.getSelection()
self.selected_ef_conf_groups = []
active = False
for obj in sel:
if "ef" in obj.Name:
self.selected_ef_conf_groups.append( obj )
active = True
else:
for parent_obj in obj.InList:
if "ef" in parent_obj.Name:
self.selected_ef_conf_groups.append( parent_obj )
active = True
return active
def Activated(self):
for ef_grp in self.selected_ef_conf_groups:
### Generate and write config
config_text = self.generate_config_text( ef_grp )
config_filename = self.write_config( config_text, ef_grp.Name )
def generate_config_text( self, ef_group ):
config_text = []
config_text.append( "; Generated by FreeCAD module\n" )
config_text.append( "\n" )
objects_in_grp = ef_group.Group
for obj in objects_in_grp:
config_text.extend( obj.Proxy.generate_config_part() )
return config_text
def write_config( self, config_text, ef_group_name ):
default_dialog_path = "./"
default_conf_name = ef_group_name + ".conf"
conf_filename, filename_filter = QtGui.QFileDialog.getSaveFileName(
None, "Generate ef config",
default_dialog_path + default_conf_name,
"*.conf" )
if conf_filename == "":
FreeCAD.Console.PrintMessage( "Config generation aborted: "
"file to write was not selected" + "\n" )
else:
with open( conf_filename, 'w') as f:
f.writelines( config_text )
return conf_filename
class RunEf():
"""Run Ef"""
def GetResources(self):
moddir = os.path.expanduser("~") + "/.FreeCAD/Mod/ef"
return {'Pixmap' : moddir + '/icons/run_ef.svg',
'Accel' : "Shift+S", # a default shortcut (optional)
'MenuText': "Run Ef",
'ToolTip' : "Run Ef"}
def Activated(self):
for ef_conf_group in self.selected_ef_conf_groups:
# todo: generate config to temp file in temp directory, run ef on this config
# Rename 'command' to 'ef_command'
run_ef = ef_conf_group.getObject("Run_Ef")
freecad_workdir = os.getcwd()
os.chdir( run_ef.change_workdir_to )
stdout = subprocess.Popen( run_ef.command, shell = True,
stdout = subprocess.PIPE ).stdout.read()
FreeCAD.Console.PrintMessage( stdout )
# https://stackoverflow.com/questions/803265/getting-realtime-output-using-subprocess
# realtime output for subprocess
os.chdir( freecad_workdir )
FreeCAD.ActiveDocument.recompute()
return
def IsActive(self):
# Add source only if ef-group is selected
# todo: check if selected object is ef-conf group
# or directly belongs to ef-conf group
sel = FreeCADGui.Selection.getSelection()
self.selected_ef_conf_groups = []
active = False
for obj in sel:
if "ef" in obj.Name:
self.selected_ef_conf_groups.append( obj )
active = True
else:
for parent_obj in obj.InList:
if "ef" in parent_obj.Name:
self.selected_ef_conf_groups.append( parent_obj )
active = True
return active
###
class TimeGridConfigPart:
"""Properties and representation of time_grid config part"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString", "total_time",
"Time grid", "Total simulation time" ).total_time = "1.0"
obj.addProperty(
"App::PropertyString", "time_save_step",
"Time grid", "Time step between checkpoints" ).time_save_step = "1e-3"
obj.addProperty(
"App::PropertyString", "time_step_size",
"Time grid", "Time step" ).time_step_size = "1e-5"
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
        '''Executed when document is recomputed. This method is mandatory'''
return
def updateData(self, fp, prop):
'''If a property of the handled feature has changed
we have the chance to handle this here'''
return
def attach(self, obj):
''' Setup the scene sub-graph of the view provider, this method is mandatory '''
# todo: represent time grid as text on 3d-screen
# self.text = coin.SoGroup()
# self.t1 = coin.SoAsciiText()
# self.t1.string = "arghk"
# self.text.addChild( self.t1 )
return
def generate_config_part( self ):
conf_part = []
conf_part.append( "[TimeGrid]\n" )
export_property_names = [ "total_time",
"time_save_step",
"time_step_size" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
def __getstate__(self):
'''When saving the document this object gets stored using Python's json module.
Since we have some un-serializable parts
here -- the Coin stuff -- we must define this method
to return a tuple of all serializable objects or None.'''
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
'''When restoring the serialized object from document
we have the chance to set some internals here.
Since no data were serialized nothing needs to be done here.'''
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
class OutputFilenameConfigPart():
"""Properties and representation of output_filename config part"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString", "output_filename_suffix",
"Output filename", "Output filename extension").output_filename_suffix = ".h5"
obj.addProperty(
"App::PropertyString", "output_filename_prefix",
"Output filename", "Output filename basename").output_filename_prefix = "out_"
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
        '''Executed when document is recomputed. This method is mandatory'''
return
def updateData(self, fp, prop):
'''If a property of the handled feature has changed
we have the chance to handle this here'''
return
def attach(self, obj):
'''Setup the scene sub-graph of the view provider, this method is mandatory'''
# todo: represent output_filename as text on 3d-screen
return
def generate_config_part( self ):
conf_part = []
conf_part.append( "[OutputFilename]\n" )
export_property_names = [ "output_filename_suffix", "output_filename_prefix" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
def __getstate__(self):
'''When saving the document this object gets stored using Python's json module.
Since we have some un-serializable parts
here -- the Coin stuff -- we must define this method
to return a tuple of all serializable objects or None.'''
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
'''When restoring the serialized object from document
we have the chance to set some internals here.
Since no data were serialized nothing needs to be done here.'''
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
class ParticleInteractionModelConfigPart():
"""Properties and representation of output_filename config part"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyEnumeration",
"particle_interaction_model",
"Base",
"Interaction of particles").particle_interaction_model = ["noninteracting", "PIC"]
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
        '''Executed when document is recomputed. This method is mandatory'''
return
def updateData(self, fp, prop):
'''If a property of the handled feature has changed
we have the chance to handle this here'''
return
def attach(self, obj):
'''Setup the scene sub-graph of the view provider, this method is mandatory'''
return
def generate_config_part( self ):
conf_part = []
conf_part.append( "[ParticleInteractionModel]\n" )
export_property_names = [ "particle_interaction_model" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
def __getstate__(self):
'''When saving the document this object gets stored using Python's json module.
Since we have some un-serializable parts
here -- the Coin stuff -- we must define this method
to return a tuple of all serializable objects or None.'''
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
'''When restoring the serialized object from document
we have the chance to set some internals here.
Since no data were serialized nothing needs to be done here.'''
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
class SpatialMeshConfigPart():
"""Properties and representation of spatial_mesh config part"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString", "grid_x_size",
"Spatial mesh", "Computational volume X-size" ).grid_x_size = "1.0"
obj.addProperty(
"App::PropertyString", "grid_x_step",
"Spatial mesh", "X-step size" ).grid_x_step = "0.1"
obj.addProperty(
"App::PropertyString", "grid_y_size",
"Spatial mesh", "Computational volume Y-size" ).grid_y_size = "1.0"
obj.addProperty(
"App::PropertyString", "grid_y_step",
"Spatial mesh", "Y-step size" ).grid_y_step = "0.1"
obj.addProperty(
"App::PropertyString", "grid_z_size",
"Spatial mesh", "Computational volume Z-size" ).grid_z_size = "1.0"
obj.addProperty(
"App::PropertyString", "grid_z_step",
"Spatial mesh", "Z-step size" ).grid_z_step = "0.1"
obj.addProperty("Part::PropertyPartShape", "Shape",
"Spatial mesh", "Computational volume box")
obj.ViewObject.addProperty("App::PropertyColor", "Color",
"Spatial mesh", "Volume box color").Color=(1.0,0.0,0.0)
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
        '''Executed when document is recomputed. This method is mandatory'''
return
def attach(self, obj):
self.shaded = coin.SoGroup()
self.wireframe = coin.SoGroup()
self.color = coin.SoBaseColor()
self.trans = coin.SoTranslation()
self.box = coin.SoCube()
self.shaded.addChild( self.color )
self.shaded.addChild( self.trans )
self.shaded.addChild( self.box )
obj.addDisplayMode( self.shaded, "Shaded" );
style = coin.SoDrawStyle()
style.style = coin.SoDrawStyle.LINES
self.wireframe.addChild( style )
self.wireframe.addChild( self.color )
self.wireframe.addChild( self.trans )
self.wireframe.addChild( self.box )
obj.addDisplayMode( self.wireframe, "Wireframe" );
self.onChanged( obj, "Color" )
return
def updateData(self, obj, prop ):
"Executed when propery in field 'data' is changed"
# todo: recompute only 'prop'
x_size = float( obj.getPropertyByName("grid_x_size") )
y_size = float( obj.getPropertyByName("grid_y_size") )
z_size = float( obj.getPropertyByName("grid_z_size") )
self.trans.translation.setValue( [ x_size/2, y_size/2, z_size/2 ] )
self.box.width.setValue( x_size )
self.box.height.setValue( y_size )
self.box.depth.setValue( z_size )
def getDisplayModes(self,obj):
"Return a list of display modes."
modes=[]
modes.append("Shaded")
modes.append("Wireframe")
return modes
def getDefaultDisplayMode(self):
'''Return the name of the default display mode.
It must be defined in getDisplayModes.'''
return "Wireframe"
def setDisplayMode(self,mode):
return mode
def onChanged(self, vp, prop):
"Executed if any property is changed"
if prop == "Color":
c = vp.getPropertyByName("Color")
self.color.rgb.setValue( c[0], c[1], c[2] )
def __getstate__(self):
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
def generate_config_part( self ):
conf_part = []
conf_part.append( "[SpatialMesh]\n" )
export_property_names = [ "grid_x_size", "grid_x_step",
"grid_y_size", "grid_y_step",
"grid_z_size", "grid_z_step" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
class BoundaryConditionsConfigPart():
"""Properties and representation of boundary_conditions config part"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString",
"boundary_phi_left",
"Boundary conditions",
"Potential on left boundary").boundary_phi_left = "0.0"
obj.addProperty(
"App::PropertyString",
"boundary_phi_right",
"Boundary conditions",
"Potential on right boundary").boundary_phi_right = "0.0"
obj.addProperty(
"App::PropertyString",
"boundary_phi_top",
"Boundary conditions",
"Potential on top boundary").boundary_phi_top = "0.0"
obj.addProperty(
"App::PropertyString",
"boundary_phi_bottom",
"Boundary conditions",
"Potential on bottom boundary").boundary_phi_bottom = "0.0"
obj.addProperty(
"App::PropertyString",
"boundary_phi_near",
"Boundary conditions",
"Potential on near boundary").boundary_phi_near = "0.0"
obj.addProperty(
"App::PropertyString",
"boundary_phi_far",
"Boundary conditions",
"Potential on far boundary").boundary_phi_far = "0.0"
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
        '''Executed when document is recomputed. This method is mandatory'''
return
def updateData(self, fp, prop):
'''If a property of the handled feature has changed
we have the chance to handle this here'''
return
def attach(self, obj):
'''Setup the scene sub-graph of the view provider, this method is mandatory'''
return
def generate_config_part( self ):
conf_part = []
conf_part.append( "[BoundaryConditions]\n" )
export_property_names = [ "boundary_phi_left", "boundary_phi_right",
"boundary_phi_top" , "boundary_phi_bottom",
"boundary_phi_near", "boundary_phi_far" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
def __getstate__(self):
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
class MagneticFieldConfigPart():
"""Properties and representation of magnetic_field config part"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString",
"magnetic_field_x",
"External magnetic field",
"Field magnitude along X axis").magnetic_field_x = "0.0"
obj.addProperty(
"App::PropertyString",
"magnetic_field_y",
"External magnetic field",
"Field magnitude along Y axis").magnetic_field_y = "0.0"
obj.addProperty(
"App::PropertyString",
"magnetic_field_z",
"External magnetic field",
"Field magnitude along Z axis").magnetic_field_z = "0.0"
obj.addProperty(
"App::PropertyString",
"speed_of_light",
"External magnetic field",
"Speed of light").speed_of_light = "3.0e10"
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
        '''Executed when document is recomputed. This method is mandatory'''
return
def updateData(self, fp, prop):
'''If a property of the handled feature has changed we
have the chance to handle this here'''
return
def attach(self, obj):
'''Setup the scene sub-graph of the view provider, this method is mandatory'''
return
def generate_config_part( self ):
conf_part = []
conf_part.append( "[ExternalMagneticField]\n" )
export_property_names = [ "magnetic_field_x", "magnetic_field_y",
"magnetic_field_z", "speed_of_light" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
def __getstate__(self):
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
class ParticleSourceConfigPart():
"""Particle source region"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyEnumeration",
"individual_charge_or_total_current",
"Base",
"Specify particles' charge or total source current")
obj.individual_charge_or_total_current = ["Particles' charge", "Source current"]
obj.addProperty(
"App::PropertyEnumeration",
"mass_or_charge_to_mass",
"Base",
"Specify particles' mass or charge-to-mass ratio")
obj.mass_or_charge_to_mass = [ "Mass", "Charge-to-mass" ]
obj.addProperty(
"App::PropertyString",
"initial_number_of_particles",
"Number of particles",
"Initial number of particles" ).initial_number_of_particles = "1000"
obj.addProperty(
"App::PropertyString",
"particles_to_generate_each_step",
"Number of particles",
"Number of particles to add at each time step" ).particles_to_generate_each_step = "1000"
obj.addProperty(
"App::PropertyString",
"current",
"Number of particles",
"I = q * N / dt" ).current = "10" # default value is unimportant; it will be recalculated.
obj.addProperty(
"App::PropertyString",
"box_x_left",
"Position",
"Position of the left side of the source" ).box_x_left = "0.6"
obj.addProperty(
"App::PropertyString",
"box_x_right",
"Position",
"Position of the right side of the source" ).box_x_right = "0.4"
obj.addProperty(
"App::PropertyString",
"box_y_bottom",
"Position",
"Position of the bottom side of the source" ).box_y_bottom = "0.4"
obj.addProperty(
"App::PropertyString",
"box_y_top",
"Position",
"Position of the top side of the source" ).box_y_top = "0.6"
obj.addProperty(
"App::PropertyString",
"box_z_near",
"Position",
"Position of the near side of the source" ).box_z_near = "0.4"
obj.addProperty(
"App::PropertyString",
"box_z_far",
"Position",
"Position of the far side of the source" ).box_z_far = "0.6"
obj.addProperty(
"App::PropertyString",
"mean_momentum_x",
"Momentum",
"Mean momentum in X direction" ).mean_momentum_x = "1.0"
obj.addProperty(
"App::PropertyString",
"mean_momentum_y",
"Momentum",
"Mean momentum in Y direction" ).mean_momentum_y = "1.0"
obj.addProperty(
"App::PropertyString",
"mean_momentum_z",
"Momentum",
"Mean momentum in Z direction" ).mean_momentum_z = "1.0"
obj.addProperty(
"App::PropertyString",
"temperature",
"Momentum",
"Temperature" ).temperature = "1.0"
obj.addProperty(
"App::PropertyString",
"charge",
"Particle properties",
"Particles' charge" ).charge = "1.0"
obj.addProperty(
"App::PropertyString",
"mass",
"Particle properties",
"Particles' mass (calculated automatically from q and q/m)" ).mass = "1.0"
obj.addProperty(
"App::PropertyString",
"charge_to_mass_ratio",
"Particle properties",
"Particles' charge to mass ratio" ).charge_to_mass_ratio = "1.0"
obj.ViewObject.addProperty(
"App::PropertyColor", "Color",
"Spatial mesh", "Volume box color").Color=(0.0, 0.0, 1.0)
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute( self, obj ):
        '''Executed when document is recomputed. This method is mandatory'''
dt = self.get_time_step( obj )
N = int( obj.particles_to_generate_each_step )
individual_charge_or_total_current = obj.getPropertyByName(
"individual_charge_or_total_current")
mass_or_charge_to_mass = obj.getPropertyByName(
"mass_or_charge_to_mass")
if individual_charge_or_total_current == "Particles' charge":
# todo: make certain fields read-only
# e.g., obj.setEditorMode( "current", readonly )
q = float( obj.charge )
I = q * N / dt
obj.current = str( I )
elif individual_charge_or_total_current == "Source current":
I = float( obj.current )
q = I * dt / N
obj.charge = str( q )
if mass_or_charge_to_mass == "Mass":
# todo: make certain fields read-only
# e.g., obj.setEditorMode( "current", readonly )
m = float( obj.mass )
q_to_m = q / m
obj.charge_to_mass_ratio = str( q_to_m )
elif mass_or_charge_to_mass == "Charge-to-mass":
q_to_m = float( obj.charge_to_mass_ratio )
m = abs( 1 / q_to_m * q )
obj.mass = str( m )
return
def attach(self, obj):
self.shaded = coin.SoGroup()
self.wireframe = coin.SoGroup()
self.trans = coin.SoTranslation()
self.color = coin.SoBaseColor()
self.box = coin.SoCube()
self.shaded.addChild( self.color )
self.shaded.addChild( self.trans )
self.shaded.addChild( self.box )
obj.addDisplayMode( self.shaded, "Shaded" )
style = coin.SoDrawStyle()
style.style = coin.SoDrawStyle.LINES
self.wireframe.addChild( style )
self.wireframe.addChild( self.color )
self.wireframe.addChild( self.trans )
self.wireframe.addChild( self.box )
obj.addDisplayMode( self.wireframe, "Wireframe" )
self.onChanged( obj, "Color" )
return
def updateData( self, obj, prop ):
"Executed when a property in the 'data' field is changed"
# todo: move charge-current recomputation here from 'execute'
x0 = float( obj.getPropertyByName("box_x_right") )
y0 = float( obj.getPropertyByName("box_y_bottom") )
z0 = float( obj.getPropertyByName("box_z_near") )
xlen = float( obj.getPropertyByName("box_x_left") ) - x0
ylen = float( obj.getPropertyByName("box_y_top") ) - y0
zlen = float( obj.getPropertyByName("box_z_far") ) - z0
self.trans.translation.setValue( [ x0 + xlen / 2,
y0 + ylen / 2,
z0 + zlen / 2 ] )
self.box.width.setValue( xlen )
self.box.height.setValue( ylen )
self.box.depth.setValue( zlen )
return
def getDisplayModes(self,obj):
"Return a list of display modes."
modes=[]
modes.append("Shaded")
modes.append("Wireframe")
return modes
def getDefaultDisplayMode(self):
'''Return the name of the default display mode.
It must be defined in getDisplayModes.'''
return "Wireframe"
def setDisplayMode(self,mode):
return mode
def onChanged(self, vp, prop):
"Executed if any property is changed"
if prop == "Color":
c = vp.getPropertyByName("Color")
self.color.rgb.setValue( c[0], c[1], c[2] )
def __getstate__(self):
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
def get_time_step( self, obj ):
# todo: to get dt, get object named "Time_grid" from the first
# group which the source belongs to.
# Instead, pass reference to "Time_grid" object in the source-constructor.
# todo: source properties are not recomputed when dt is changed. do something about it.
dt = float( obj.InList[0].getObject("Time_grid").getPropertyByName("time_step_size") )
return dt
def generate_config_part( self ):
conf_part = []
source_name = self.doc_object.getPropertyByName( "Label" )
conf_part.append( "[ParticleSourceBox.{0}]\n".format( source_name ) )
export_property_names = [ "initial_number_of_particles",
"particles_to_generate_each_step",
"box_x_left", "box_x_right",
"box_y_bottom", "box_y_top",
"box_z_near", "box_z_far",
"mean_momentum_x",
"mean_momentum_y",
"mean_momentum_z",
"temperature",
"charge",
"mass" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
comments = [ "individual_charge_or_total_current",
"mass_or_charge_to_mass",
"charge_to_mass_ratio",
"current" ]
for x in comments:
conf_part.append(
";{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
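The `execute` method above keeps charge, current, mass and charge-to-mass ratio mutually consistent. A standalone sketch of those relations with hypothetical values (not tied to any FreeCAD object):

```python
# Hypothetical values; in the macro these come from the object's
# string properties and from the "Time_grid" time step.
dt = 1e-3      # time step
N = 1000       # particles generated per step
q = 1.5e-9     # individual particle charge

I = q * N / dt           # total source current: 1.5e-3
q_back = I * dt / N      # inverting the relation recovers the charge

q_to_m = 1.0e8           # charge-to-mass ratio
m = abs(q / q_to_m)      # mass recovered from q and q/m, as in execute()

print(I, q_back, m)
```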
class ParticleCylindricalSourceConfigPart():
"""Particle cylindrical source region"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyEnumeration",
"individual_charge_or_total_current",
"Base",
"Specify particles' charge or total source current")
obj.individual_charge_or_total_current = ["Particles' charge", "Source current"]
obj.addProperty(
"App::PropertyEnumeration",
"mass_or_charge_to_mass",
"Base",
"Specify particles' mass or charge-to-mass ratio")
obj.mass_or_charge_to_mass = [ "Mass", "Charge-to-mass" ]
obj.addProperty(
"App::PropertyString",
"initial_number_of_particles",
"Number of particles",
"Initial number of particles" ).initial_number_of_particles = "1000"
obj.addProperty(
"App::PropertyString",
"particles_to_generate_each_step",
"Number of particles",
"Number of particles to add at each time step" ).particles_to_generate_each_step = "1000"
obj.addProperty(
"App::PropertyString",
"current",
"Number of particles",
"I = q * N / dt" ).current = "10" # default value is unimportant;
# it will be recalculated.
### Size
obj.addProperty(
"App::PropertyEnumeration",
"cylinder_axis_direction",
"Size",
"Direction of the cylinder axis").cylinder_axis_direction = ["X", "Y", "Z"]
obj.addProperty(
"App::PropertyString",
"cylinder_length",
"Size",
"Cylinder axis length").cylinder_length = "0.5"
obj.addProperty(
"App::PropertyString",
"cylinder_radius",
"Size",
"Cylinder radius" ).cylinder_radius = "0.05"
###
obj.addProperty(
"App::PropertyString",
"cylinder_axis_start_x",
"Size",
"X coordinate of the cylinder axis start point" ).cylinder_axis_start_x = "0.0"
obj.addProperty(
"App::PropertyString",
"cylinder_axis_start_y",
"Size",
"Y coordinate of the cylinder axis start point" ).cylinder_axis_start_y = "0.0"
obj.addProperty(
"App::PropertyString",
"cylinder_axis_start_z",
"Size",
"Z coordinate of the cylinder axis start point" ).cylinder_axis_start_z = "0.0"
obj.addProperty(
"App::PropertyString",
"cylinder_axis_end_x",
"Size",
"X coordinate of the cylinder axis end point (computed automatically)" ).cylinder_axis_end_x = "0.2"
obj.addProperty(
"App::PropertyString",
"cylinder_axis_end_y",
"Size",
"Y coordinate of the cylinder axis end point (computed automatically)" ).cylinder_axis_end_y = "0.2"
obj.addProperty(
"App::PropertyString",
"cylinder_axis_end_z",
"Size",
"Z coordinate of the cylinder axis end point (computed automatically)" ).cylinder_axis_end_z = "0.3"
###
obj.addProperty(
"App::PropertyString",
"mean_momentum_x",
"Momentum",
"Mean momentum in X direction" ).mean_momentum_x = "1.0"
obj.addProperty(
"App::PropertyString",
"mean_momentum_y",
"Momentum",
"Mean momentum in Y direction" ).mean_momentum_y = "1.0"
obj.addProperty(
"App::PropertyString",
"mean_momentum_z",
"Momentum",
"Mean momentum in Z direction" ).mean_momentum_z = "1.0"
obj.addProperty(
"App::PropertyString",
"temperature",
"Momentum",
"Temperature" ).temperature = "1.0"
obj.addProperty(
"App::PropertyString",
"charge",
"Particle properties",
"Particles' charge" ).charge = "1.0"
obj.addProperty(
"App::PropertyString",
"mass",
"Particle properties",
"Particles' mass (calculated automatically from q and q/m)" ).mass = "1.0"
obj.addProperty(
"App::PropertyString",
"charge_to_mass_ratio",
"Particle properties",
"Particles' charge to mass ratio" ).charge_to_mass_ratio = "1.0"
obj.ViewObject.addProperty(
"App::PropertyColor", "Color",
"Spatial mesh", "Volume box color").Color=(0.0, 0.0, 1.0)
# hide axis-end
obj.setEditorMode( "cylinder_axis_end_x", 1 )
obj.setEditorMode( "cylinder_axis_end_y", 1 )
obj.setEditorMode( "cylinder_axis_end_z", 1 )
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute( self, obj ):
'''Executed when the document is recomputed. This method is mandatory.'''
dt = self.get_time_step( obj )
N = int( obj.particles_to_generate_each_step )
individual_charge_or_total_current = obj.getPropertyByName(
"individual_charge_or_total_current")
mass_or_charge_to_mass = obj.getPropertyByName(
"mass_or_charge_to_mass")
if individual_charge_or_total_current == "Particles' charge":
# todo: make certain fields read-only
# e.g., obj.setEditorMode( "current", readonly )
q = float( obj.charge )
I = q * N / dt
obj.current = str( I )
elif individual_charge_or_total_current == "Source current":
I = float( obj.current )
q = I * dt / N
obj.charge = str( q )
if mass_or_charge_to_mass == "Mass":
# todo: make certain fields read-only
# e.g., obj.setEditorMode( "current", readonly )
m = float( obj.mass )
q_to_m = q / m
obj.charge_to_mass_ratio = str( q_to_m )
elif mass_or_charge_to_mass == "Charge-to-mass":
q_to_m = float( obj.charge_to_mass_ratio )
m = abs( 1 / q_to_m * q )
obj.mass = str( m )
cylinder_axis_direction = obj.cylinder_axis_direction
cylinder_length = float( obj.cylinder_length )
cylinder_axis_start_x = float( obj.cylinder_axis_start_x )
cylinder_axis_start_y = float( obj.cylinder_axis_start_y )
cylinder_axis_start_z = float( obj.cylinder_axis_start_z )
if cylinder_axis_direction == 'X':
obj.cylinder_axis_end_x = str( cylinder_axis_start_x + cylinder_length )
obj.cylinder_axis_end_y = str( cylinder_axis_start_y )
obj.cylinder_axis_end_z = str( cylinder_axis_start_z )
elif cylinder_axis_direction == 'Y':
obj.cylinder_axis_end_x = str( cylinder_axis_start_x )
obj.cylinder_axis_end_y = str( cylinder_axis_start_y + cylinder_length )
obj.cylinder_axis_end_z = str( cylinder_axis_start_z )
elif cylinder_axis_direction == 'Z':
obj.cylinder_axis_end_x = str( cylinder_axis_start_x )
obj.cylinder_axis_end_y = str( cylinder_axis_start_y )
obj.cylinder_axis_end_z = str( cylinder_axis_start_z + cylinder_length )
return
def attach(self, obj):
self.trans = coin.SoTranslation()
self.rot_xyz = coin.SoRotationXYZ()
self.color = coin.SoBaseColor()
self.cyl = coin.SoCylinder()
self.shaded = coin.SoGroup()
self.shaded.addChild( self.color )
self.shaded.addChild( self.trans )
self.shaded.addChild( self.rot_xyz )
self.shaded.addChild( self.cyl )
obj.addDisplayMode( self.shaded, "Shaded" )
style = coin.SoDrawStyle()
style.style = coin.SoDrawStyle.LINES
self.wireframe = coin.SoGroup()
self.wireframe.addChild( style )
self.wireframe.addChild( self.color )
self.wireframe.addChild( self.trans )
self.wireframe.addChild( self.rot_xyz )
self.wireframe.addChild( self.cyl )
obj.addDisplayMode( self.wireframe, "Wireframe" )
self.onChanged( obj, "Color" )
return
def updateData( self, obj, prop ):
"Executed when a property in the 'data' field is changed"
cylinder_axis_direction = obj.getPropertyByName("cylinder_axis_direction")
cylinder_length = float( obj.getPropertyByName("cylinder_length") )
cylinder_radius = float( obj.getPropertyByName("cylinder_radius") )
cylinder_axis_start_x = float( obj.getPropertyByName("cylinder_axis_start_x") )
cylinder_axis_start_y = float( obj.getPropertyByName("cylinder_axis_start_y") )
cylinder_axis_start_z = float( obj.getPropertyByName("cylinder_axis_start_z") )
if cylinder_axis_direction == 'X':
self.rot_xyz.axis.setValue( 2 )
self.rot_xyz.angle.setValue( pi / 2 )
self.trans.translation.setValue(
[ cylinder_axis_start_x + cylinder_length / 2,
cylinder_axis_start_y,
cylinder_axis_start_z ] )
elif cylinder_axis_direction == 'Y':
self.rot_xyz.axis.setValue( 1 )
self.rot_xyz.angle.setValue( 0 )
self.trans.translation.setValue(
[ cylinder_axis_start_x,
cylinder_axis_start_y + cylinder_length / 2,
cylinder_axis_start_z ] )
elif cylinder_axis_direction == 'Z':
self.rot_xyz.axis.setValue( 0 )
self.rot_xyz.angle.setValue( pi / 2 )
self.trans.translation.setValue(
[ cylinder_axis_start_x,
cylinder_axis_start_y,
cylinder_axis_start_z + cylinder_length / 2 ] )
self.cyl.radius.setValue( cylinder_radius )
self.cyl.height.setValue( cylinder_length )
return
def getDisplayModes(self,obj):
"Return a list of display modes."
modes=[]
modes.append("Shaded")
modes.append("Wireframe")
return modes
def getDefaultDisplayMode(self):
'''Return the name of the default display mode.
It must be defined in getDisplayModes.'''
return "Wireframe"
def setDisplayMode(self,mode):
return mode
def onChanged(self, vp, prop):
"Executed if any property is changed"
if prop == "Color":
c = vp.getPropertyByName("Color")
self.color.rgb.setValue( c[0], c[1], c[2] )
def __getstate__(self):
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
def get_time_step( self, obj ):
# todo: to get dt, get object named "Time_grid" from the first
# group which the source belongs to.
# Instead, pass reference to "Time_grid" object in the source-constructor.
# todo: source properties are not recomputed when dt is changed.
# do something about it.
dt = float(
obj.InList[0].getObject("Time_grid").getPropertyByName("time_step_size") )
return dt
def generate_config_part( self ):
conf_part = []
source_name = self.doc_object.getPropertyByName( "Label" )
conf_part.append( "[ParticleSourceCylinder.{0}]\n".format( source_name ) )
export_property_names = [ "initial_number_of_particles",
"particles_to_generate_each_step",
"cylinder_axis_start_x",
"cylinder_axis_start_y",
"cylinder_axis_start_z",
"cylinder_axis_end_x",
"cylinder_axis_end_y",
"cylinder_axis_end_z",
"cylinder_radius",
"mean_momentum_x",
"mean_momentum_y",
"mean_momentum_z",
"temperature",
"charge",
"mass" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
comments = [ "individual_charge_or_total_current",
"mass_or_charge_to_mass",
"charge_to_mass_ratio",
"current",
"cylinder_axis_direction",
"cylinder_length" ]
for x in comments:
conf_part.append(
";{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
class InnerRegionBoxConfigPart():
"""Box inner region"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString",
"box_x_right",
"Position",
"Box right side position" ).box_x_right = "0.1"
obj.addProperty(
"App::PropertyString",
"box_x_left",
"Position",
"Box left side position" ).box_x_left = "0.9"
obj.addProperty(
"App::PropertyString",
"box_y_bottom",
"Position",
"Box bottom side position" ).box_y_bottom = "0.1"
obj.addProperty(
"App::PropertyString",
"box_y_top",
"Position",
"Box top side position" ).box_y_top = "0.9"
obj.addProperty(
"App::PropertyString",
"box_z_near",
"Position",
"Box near side position" ).box_z_near = "0.1"
obj.addProperty(
"App::PropertyString",
"box_z_far",
"Position",
"Box far side position" ).box_z_far = "0.2"
obj.addProperty(
"App::PropertyString",
"potential",
"Potential",
"Inner region potential" ).potential = "0.0"
obj.ViewObject.addProperty(
"App::PropertyColor", "Color",
"Inner region color", "Inner region color").Color=(0.5, 0.5, 0.0)
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
return
def attach(self, obj):
self.shaded = coin.SoGroup()
self.wireframe = coin.SoGroup()
self.trans = coin.SoTranslation()
self.color = coin.SoBaseColor()
self.box = coin.SoCube()
self.shaded.addChild( self.color )
self.shaded.addChild( self.trans )
self.shaded.addChild( self.box )
obj.addDisplayMode( self.shaded, "Shaded" )
style = coin.SoDrawStyle()
style.style = coin.SoDrawStyle.LINES
self.wireframe.addChild( style )
self.wireframe.addChild( self.color )
self.wireframe.addChild( self.trans )
self.wireframe.addChild( self.box )
obj.addDisplayMode( self.wireframe, "Wireframe" )
self.onChanged( obj, "Color" )
return
def updateData(self, obj, prop ):
"Executed when a property in the 'data' field is changed"
x0 = float( obj.getPropertyByName("box_x_right") )
y0 = float( obj.getPropertyByName("box_y_bottom") )
z0 = float( obj.getPropertyByName("box_z_near") )
xlen = float( obj.getPropertyByName("box_x_left") ) - x0
ylen = float( obj.getPropertyByName("box_y_top") ) - y0
zlen = float( obj.getPropertyByName("box_z_far") ) - z0
self.trans.translation.setValue( [ x0 + xlen / 2,
y0 + ylen / 2,
z0 + zlen / 2 ] )
self.box.width.setValue( xlen )
self.box.height.setValue( ylen )
self.box.depth.setValue( zlen )
def getDisplayModes(self,obj):
"Return a list of display modes."
modes=[]
modes.append("Shaded")
modes.append("Wireframe")
return modes
def getDefaultDisplayMode(self):
'''Return the name of the default display mode.
It must be defined in getDisplayModes.'''
return "Shaded"
def setDisplayMode(self,mode):
return mode
def onChanged(self, vp, prop):
"Executed if any property is changed"
if prop == "Color":
c = vp.getPropertyByName("Color")
self.color.rgb.setValue( c[0],c[1],c[2] )
def __getstate__(self):
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
def generate_config_part( self ):
conf_part = []
conf_part.append( "[InnerRegionBox.{0}]\n".format(
self.doc_object.getPropertyByName( "Label" ) ) )
export_property_names = [ "box_x_left", "box_x_right",
"box_y_bottom", "box_y_top",
"box_z_near", "box_z_far",
"potential" ]
for x in export_property_names:
conf_part.append(
"{0} = {1}\n".format( x, self.doc_object.getPropertyByName( x ) ) )
conf_part.append("\n")
return conf_part
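`generate_config_part` emits an INI-style section for each region. A minimal sketch of the same pattern with a hypothetical label and property values standing in for the FreeCAD document object:

```python
# Hypothetical label and property values (the real ones are read via
# getPropertyByName on the document object).
label = "Electrode1"
props = {"box_x_left": "0.9", "box_x_right": "0.1", "potential": "0.0"}

conf_part = ["[InnerRegionBox.{0}]\n".format(label)]
for name in ("box_x_left", "box_x_right", "potential"):
    conf_part.append("{0} = {1}\n".format(name, props[name]))
conf_part.append("\n")

print("".join(conf_part))
```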
class RunEfConfig:
"""Parameters to run computation"""
def __init__( self, obj ):
obj.addProperty(
"App::PropertyString", "current_workdir",
"Run Parameters", "Path to working directory" ).current_workdir = os.getcwd()
obj.setEditorMode( "current_workdir", 1 )
obj.addProperty(
"App::PropertyString", "change_workdir_to",
"Run Parameters", "Change the working directory to this path" ).change_workdir_to = "/tmp/"
obj.addProperty(
"App::PropertyString", "command",
"Run Parameters", "Command to execute" ).command = "./ef.out test.conf"
obj.Proxy = self
obj.ViewObject.Proxy = self
self.doc_object = obj
self.view_object = obj.ViewObject
def execute(self, fp):
'''Executed when the document is recomputed. This method is mandatory.'''
return
def updateData(self, fp, prop):
'''If a property of the handled feature has changed
we have the chance to handle this here'''
return
def attach(self, obj):
''' Set up the scene sub-graph of the view provider; this method is mandatory '''
return
def __getstate__(self):
'''When saving the document this object gets stored using Python's json module.
Since we have some un-serializable parts
here -- the Coin stuff -- we must define this method
to return only the serializable state (here, a dict with the object's name) or None.'''
doc_object_name = self.doc_object.Name
return { "doc_object_name": doc_object_name }
def __setstate__(self, state):
'''When restoring the serialized object from the document
we have the chance to set some internals here.
Here the document-object reference is restored from the stored name.'''
doc_object_name = state[ "doc_object_name" ]
self.doc_object = FreeCAD.ActiveDocument.getObject( doc_object_name )
self.view_object = self.doc_object.ViewObject
return None
def generate_config_part( self ):
# no need to add something to config; return empty list
# todo: avoid calling this method for this class
return []
FreeCADGui.addCommand( 'CreateEfConfig', CreateEfConfig() )
FreeCADGui.addCommand( 'AddSourceRegion', AddSourceRegion() )
FreeCADGui.addCommand( 'AddCylindricalSource', AddCylindricalSource() )
FreeCADGui.addCommand( 'AddInnerRegionBox', AddInnerRegionBox() )
FreeCADGui.addCommand( 'GenerateConfFile', GenerateConfFile() )
FreeCADGui.addCommand( 'RunEf', RunEf() )
| 38.685829 | 102 | 0.578429 | 6,358 | 57,874 | 5.048128 | 0.067789 | 0.031967 | 0.037606 | 0.06278 | 0.816737 | 0.785643 | 0.758038 | 0.736073 | 0.683855 | 0.66946 | 0 | 0.006333 | 0.323323 | 57,874 | 1,495 | 103 | 38.711706 | 0.813237 | 0.120745 | 0 | 0.768166 | 0 | 0 | 0.201945 | 0.028585 | 0 | 0 | 0 | 0.005351 | 0 | 1 | 0.093426 | false | 0 | 0.006055 | 0.006055 | 0.192042 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2aa63dd82f508e9240a1d2542bd18c69d46d55de | 134 | py | Python | src/Domains/__init__.py | MarcelFox/api-modelo | 1ca862446893d0f0d079cde1b10931b8fd188c57 | [
"CC0-1.0"
] | 1 | 2020-09-29T14:55:08.000Z | 2020-09-29T14:55:08.000Z | src/Domains/__init__.py | MarcelFox/api-modelo | 1ca862446893d0f0d079cde1b10931b8fd188c57 | [
"CC0-1.0"
] | null | null | null | src/Domains/__init__.py | MarcelFox/api-modelo | 1ca862446893d0f0d079cde1b10931b8fd188c57 | [
"CC0-1.0"
] | null | null | null | from .File.Controller import FileController
from .File.Service import FileService
# from .File.Repository import RepositoryController
| 33.5 | 51 | 0.850746 | 15 | 134 | 7.6 | 0.6 | 0.210526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097015 | 134 | 3 | 52 | 44.666667 | 0.942149 | 0.365672 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2aac48af090d8f87d2477053e500b4d733aa29aa | 466 | py | Python | ports/esp32/modules/ili9341.py | amirgon/lv_mpy | 9b4e5b35d809380efd397f0287aa22957071d978 | [
"MIT"
] | 150 | 2020-05-24T17:42:24.000Z | 2022-03-28T12:47:53.000Z | ports/esp32/modules/ili9341.py | amirgon/lv_mpy | 9b4e5b35d809380efd397f0287aa22957071d978 | [
"MIT"
] | 24 | 2020-05-19T10:46:39.000Z | 2022-01-25T22:47:44.000Z | ports/esp32/modules/ili9341.py | amirgon/lv_mpy | 9b4e5b35d809380efd397f0287aa22957071d978 | [
"MIT"
] | 81 | 2020-05-19T03:57:34.000Z | 2022-03-18T03:34:08.000Z | ##############################################################################
#
# Wrapper module for backward compatibility with the new ILI9XXX library.
#
##############################################################################
print("""
***************************************
* This library is obsolete now!
* Please use the ili9XXX library instead:
*
* from ili9XXX import ili9341
*
***************************************
""")
from ili9XXX import ili9341
| 24.526316 | 78 | 0.356223 | 28 | 466 | 5.928571 | 0.714286 | 0.168675 | 0.204819 | 0.289157 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.098712 | 466 | 18 | 79 | 25.888889 | 0.366667 | 0.148069 | 0 | 0.2 | 0 | 0 | 0.790598 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0.1 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6300e0b878545e3f2ee0f5a213236791c6ae0cae | 44 | py | Python | tradester/feeds/__init__.py | wrieg123/tradester | 440210940f80e94fde4d43841c729f63b05f597d | [
"MIT"
] | 5 | 2020-11-11T14:54:59.000Z | 2020-11-13T04:00:25.000Z | tradester/feeds/__init__.py | wrieg123/tradester | 440210940f80e94fde4d43841c729f63b05f597d | [
"MIT"
] | null | null | null | tradester/feeds/__init__.py | wrieg123/tradester | 440210940f80e94fde4d43841c729f63b05f597d | [
"MIT"
] | null | null | null | from .active import *
from .static import *
| 14.666667 | 21 | 0.727273 | 6 | 44 | 5.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 44 | 2 | 22 | 22 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
63126686eebae63ae76aa5c37157b70be17b435d | 3,571 | py | Python | imcsdk/mometa/storage/StorageControllerHealth.py | ecoen66/imcsdk | b10eaa926a5ee57cea7182ae0adc8dd1c818b0ab | [
"Apache-2.0"
] | 31 | 2016-06-14T07:23:59.000Z | 2021-09-12T17:17:26.000Z | imcsdk/mometa/storage/StorageControllerHealth.py | sthagen/imcsdk | 1831eaecb5960ca03a8624b1579521749762b932 | [
"Apache-2.0"
] | 109 | 2016-05-25T03:56:56.000Z | 2021-10-18T02:58:12.000Z | imcsdk/mometa/storage/StorageControllerHealth.py | sthagen/imcsdk | 1831eaecb5960ca03a8624b1579521749762b932 | [
"Apache-2.0"
] | 67 | 2016-05-17T05:53:56.000Z | 2022-03-24T15:52:53.000Z | """This module contains the general information for StorageControllerHealth ManagedObject."""
from ...imcmo import ManagedObject
from ...imccoremeta import MoPropertyMeta, MoMeta
from ...imcmeta import VersionMeta
class StorageControllerHealthConsts:
pass
class StorageControllerHealth(ManagedObject):
"""This is StorageControllerHealth class."""
consts = StorageControllerHealthConsts()
naming_props = set([])
mo_meta = {
"classic": MoMeta("StorageControllerHealth", "storageControllerHealth", "controller-health", VersionMeta.Version2013e, "OutputOnly", 0xf, [], ["admin", "read-only", "user"], ['storageController'], [], ["Get"]),
"modular": MoMeta("StorageControllerHealth", "storageControllerHealth", "controller-health", VersionMeta.Version2013e, "OutputOnly", 0xf, [], ["admin", "read-only", "user"], ['storageController'], [], ["Get"])
}
prop_meta = {
"classic": {
"child_action": MoPropertyMeta("child_action", "childAction", "string", VersionMeta.Version2013e, MoPropertyMeta.INTERNAL, None, None, None, None, [], []),
"dn": MoPropertyMeta("dn", "dn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, 0x2, 0, 255, None, [], []),
"health": MoPropertyMeta("health", "health", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"id": MoPropertyMeta("id", "id", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"rn": MoPropertyMeta("rn", "rn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, 0x4, 0, 255, None, [], []),
"status": MoPropertyMeta("status", "status", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, 0x8, None, None, None, ["", "created", "deleted", "modified", "removed"], []),
},
"modular": {
"child_action": MoPropertyMeta("child_action", "childAction", "string", VersionMeta.Version2013e, MoPropertyMeta.INTERNAL, None, None, None, None, [], []),
"dn": MoPropertyMeta("dn", "dn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, 0x2, 0, 255, None, [], []),
"health": MoPropertyMeta("health", "health", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"id": MoPropertyMeta("id", "id", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"rn": MoPropertyMeta("rn", "rn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, 0x4, 0, 255, None, [], []),
"status": MoPropertyMeta("status", "status", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_ONLY, 0x8, None, None, None, ["", "created", "deleted", "modified", "removed"], []),
},
}
prop_map = {
"classic": {
"childAction": "child_action",
"dn": "dn",
"health": "health",
"id": "id",
"rn": "rn",
"status": "status",
},
"modular": {
"childAction": "child_action",
"dn": "dn",
"health": "health",
"id": "id",
"rn": "rn",
"status": "status",
},
}
def __init__(self, parent_mo_or_dn, **kwargs):
self._dirty_mask = 0
self.child_action = None
self.health = None
self.id = None
self.status = None
ManagedObject.__init__(self, "StorageControllerHealth", parent_mo_or_dn, **kwargs)
| 46.376623 | 218 | 0.605713 | 317 | 3,571 | 6.706625 | 0.22082 | 0.151458 | 0.163688 | 0.242709 | 0.752587 | 0.735654 | 0.735654 | 0.735654 | 0.735654 | 0.735654 | 0 | 0.036891 | 0.218146 | 3,571 | 76 | 219 | 46.986842 | 0.72457 | 0.035284 | 0 | 0.509091 | 0 | 0 | 0.20769 | 0.033498 | 0 | 0 | 0.006991 | 0 | 0 | 1 | 0.018182 | false | 0.018182 | 0.054545 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6317ca905effd6bb713f614fbc8008b2f6731854 | 41 | py | Python | botTest.py | JacobJW/MakeNutritionGreatAgain | d26df0f039cf6120f1ab5e533a0e58879382e911 | [
"MIT"
] | null | null | null | botTest.py | JacobJW/MakeNutritionGreatAgain | d26df0f039cf6120f1ab5e533a0e58879382e911 | [
"MIT"
] | null | null | null | botTest.py | JacobJW/MakeNutritionGreatAgain | d26df0f039cf6120f1ab5e533a0e58879382e911 | [
"MIT"
] | null | null | null | def excite(s: str):
return s.upper()
| 13.666667 | 20 | 0.609756 | 7 | 41 | 3.571429 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.219512 | 41 | 2 | 21 | 20.5 | 0.78125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
631b5d482c8dc668a5f99c21645ce3e6e61a9ceb | 10,197 | py | Python | knee/evaluation.py | Yifei-Liu/knee | 7c2c7092a2c2dc4c4dac5ebc3b623c5725e0339b | [
"MIT"
] | null | null | null | knee/evaluation.py | Yifei-Liu/knee | 7c2c7092a2c2dc4c4dac5ebc3b623c5725e0339b | [
"MIT"
] | null | null | null | knee/evaluation.py | Yifei-Liu/knee | 7c2c7092a2c2dc4c4dac5ebc3b623c5725e0339b | [
"MIT"
] | null | null | null | # coding: utf-8
__author__ = 'Mário Antunes'
__version__ = '0.1'
__email__ = 'mariolpantunes@gmail.com'
__status__ = 'Development'
import enum
import math
import logging
import numpy as np
import knee.linear_fit as lf
logger = logging.getLogger(__name__)
class Strategy(enum.Enum):
"""
Enum data type that represents the strategy of MAE, MSE and RMSE
"""
knees = 'knees'
expected = 'expected'
best = 'best'
worst = 'worst'
def __str__(self):
return self.value
def get_neighbourhood_points(points: np.ndarray, a: int, b: int, t: float) -> tuple:
"""Get the neighbourhood (closest points) from a to b.
The neighbourhood is defined as the longest straight line (defined by R2).
Args:
points (np.ndarray): numpy array with the points (x, y)
a (int): the initial point of the search
b (int): the left limit of the search
t (float): R2 threshold
Returns:
tuple: (neighbourhood index, r2, slope)
"""
x = points[:, 0]
y = points[:, 1]
return get_neighbourhood(x, y, a, b, t)
def get_neighbourhood(x: np.ndarray, y: np.ndarray, a: int, b: int, t: float = 0.7) -> tuple:
"""Get the neighbourhood (closest points) from a to b.
The neighbourhood is defined as the longest straight line (defined by R2).
Args:
x (np.ndarray): the value of the points in the x axis coordinates
y (np.ndarray): the value of the points in the y axis coordinates
a (int): the initial point of the search
b (int): the left limit of the search
t (float): R2 threshold
Returns:
tuple: (neighbourhood index, r2, slope)
"""
r2 = 1.0
i = a - 1
_, slope = lf.linear_fit(x[i:a+1], y[i:a+1])
while r2 > t and i > b:
previous_res = (i, r2, slope)
i -= 1
coef = lf.linear_fit(x[i:a+1], y[i:a+1])
r2 = lf.linear_r2(x[i:a+1], y[i:a+1], coef)
_, slope = coef
if r2 > t:
return i, r2, slope
else:
return previous_res
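A self-contained sketch of the same left-walk idea, using `numpy.polyfit` in place of `knee.linear_fit` (the helper name and the data are illustrative, not part of the library):

```python
import numpy as np

def neighbourhood_sketch(x, y, a, b, t=0.7):
    """Walk left from index a towards b while the linear fit keeps R^2 > t."""
    # Two points always fit a line exactly, mirroring the r2 = 1.0 start.
    best = (a - 1, 1.0, np.polyfit(x[a - 1:a + 1], y[a - 1:a + 1], 1)[0])
    for i in range(a - 2, b - 1, -1):
        slope, intercept = np.polyfit(x[i:a + 1], y[i:a + 1], 1)
        pred = slope * x[i:a + 1] + intercept
        ss_res = np.sum((y[i:a + 1] - pred) ** 2)
        ss_tot = np.sum((y[i:a + 1] - np.mean(y[i:a + 1])) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        if r2 > t:
            best = (i, r2, slope)
        else:
            break
    return best

x = np.arange(8, dtype=float)
y = np.array([0.0, 0.0, 0.0, 0.0, 4.0, 8.0, 12.0, 16.0])  # kink at index 3
res = neighbourhood_sketch(x, y, 7, 0, t=0.99)
print(res)  # the walk stops at the kink: index 3
```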
def accuracy_knee(points: np.ndarray, knees: np.ndarray) -> tuple:
"""Compute the accuracy heuristic for a set of knees.
The heuristic is based on the average distances along the X and Y axes, the slope and the R2.
This version uses the left neighbourhood of each knee.
Args:
points (np.ndarray): numpy array with the points (x, y)
knees (np.ndarray): knees indexes
Returns:
tuple: (average_x, average_y, average_slope, average_coeffients, cost)
"""
x = points[:, 0]
y = points[:, 1]
total_x = math.fabs(x[-1] - x[0])
total_y = math.fabs(y[-1] - y[0])
distances_x = []
distances_y = []
slopes = []
coeffients = []
previous_knee = 0
for i in range(len(knees)):
idx, r2, slope = get_neighbourhood(x, y, knees[i], previous_knee)
delta_x = x[idx] - x[knees[i]]
delta_y = y[idx] - y[knees[i]]
distances_x.append(math.fabs(delta_x))
distances_y.append(math.fabs(delta_y))
slopes.append(math.fabs(slope))
coeffients.append(r2)
previous_knee = knees[i]
slopes = np.array(slopes)
slopes = slopes/slopes.max()
coeffients = np.array(coeffients)
coeffients = coeffients/coeffients.max()
distances_x = np.array(distances_x)/total_x
distances_y = np.array(distances_y)/total_y
average_x = np.average(distances_x)
average_y = np.average(distances_y)
average_slope = np.average(slopes)
average_coeffients = np.average(coeffients)
#p = slopes * distances_y * coeffients
p = slopes * distances_y
#cost = (average_x * average_y) / (average_slope)
cost = average_x / np.average(p)
return average_x, average_y, average_slope, average_coeffients, cost
def accuracy_trace(points: np.ndarray, knees: np.ndarray) -> tuple:
"""Compute the accuracy heuristic for a set of knees.
The heuristic is based on the average distances along the X and Y axes, the slope and the R2.
This version uses the points from the current knee back to the previous one.
Args:
points (np.ndarray): numpy array with the points (x, y)
knees (np.ndarray): knees indexes
Returns:
tuple: (average_x, average_y, average_slope, average_coeffients, cost)
"""
x = points[:, 0]
y = points[:, 1]
distances_x = []
distances_y = []
slopes = []
coeffients = []
total_x = math.fabs(x[-1] - x[0])
total_y = math.fabs(y[-1] - y[0])
previous_knee_x = x[knees[0]]
previous_knee_y = y[knees[0]]
delta_x = x[0] - previous_knee_x
delta_y = y[0] - previous_knee_y
distances_x.append(math.fabs(delta_x))
distances_y.append(math.fabs(delta_y))
coef = lf.linear_fit(x[0:knees[0]+1], y[0:knees[0]+1])
r2 = lf.linear_r2(x[0:knees[0]+1], y[0:knees[0]+1], coef)
coeffients.append(r2)
_, slope = coef
slopes.append(math.fabs(slope))
for i in range(1, len(knees)):
knee_x = x[knees[i]]
knee_y = y[knees[i]]
delta_x = previous_knee_x - knee_x
delta_y = previous_knee_y - knee_y
coef = lf.linear_fit(x[knees[i-1]:knees[i]+1],
y[knees[i-1]:knees[i]+1])
r2 = lf.linear_r2(x[knees[i-1]:knees[i]+1],
y[knees[i-1]:knees[i]+1], coef)
distances_x.append(math.fabs(delta_x))
distances_y.append(math.fabs(delta_y))
_, slope = coef
slopes.append(math.fabs(slope))
coeffients.append(r2)
previous_knee_x = knee_x
previous_knee_y = knee_y
distances_x = np.array(distances_x)/total_x
distances_y = np.array(distances_y)/total_y
slopes = np.array(slopes)
slopes = slopes/slopes.max()
coeffients = np.array(coeffients)
coeffients = coeffients/coeffients.max()
coeffients[coeffients < 0] = 0.0
p = slopes * distances_y * coeffients
#p = slopes * distances_y
average_x = np.average(distances_x)
average_y = np.average(distances_y)
average_slope = np.average(slopes)
average_coeffients = np.average(coeffients)
cost = average_x / np.average(p)
return average_x, average_y, average_slope, average_coeffients, cost
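Both heuristics above rate each inter-knee segment by the slope and R2 of a straight-line fit. The following is a minimal self-contained sketch of that per-segment step, with `np.polyfit` standing in for the `lf.linear_fit`/`lf.linear_r2` helpers, which are not shown in this excerpt:

```python
import numpy as np

def segment_fits(points: np.ndarray, knees: list) -> tuple:
    """Fit a line to every segment [previous_knee, knee] and report the
    absolute slope and R2 of each fit (a sketch, not the real lf module)."""
    x, y = points[:, 0], points[:, 1]
    bounds = [0] + list(knees)
    slopes, r2s = [], []
    for left, right in zip(bounds[:-1], bounds[1:]):
        xs, ys = x[left:right + 1], y[left:right + 1]
        slope, intercept = np.polyfit(xs, ys, 1)
        residuals = ys - (slope * xs + intercept)
        ss_res = np.sum(residuals ** 2)
        ss_tot = np.sum((ys - ys.mean()) ** 2)
        r2s.append(1.0 - ss_res / ss_tot if ss_tot > 0 else 1.0)
        slopes.append(abs(slope))
    return np.array(slopes), np.array(r2s)

# two perfectly linear pieces: slope 2 up to index 4, slope 0.5 afterwards
pts = np.array([[i, 2.0 * i] for i in range(5)] +
               [[i, 8.0 + 0.5 * (i - 4)] for i in range(5, 10)], dtype=float)
slopes, r2s = segment_fits(pts, [4, 9])
```

Both segments fit perfectly here, so each R2 is 1 and the recovered absolute slopes are 2.0 and 0.5.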
def mae(points: np.ndarray, knees: np.ndarray, expected: np.ndarray, s: Strategy = Strategy.expected) -> float:
"""
Estimates the worst case Mean Absolute Error (MAE) for the given
knee and expected points.
    Supports different size arrays, and estimates the MAE based
    on the worst case.
    It uses the euclidean distance to find the closest point,
    and computes the error based on that point.
Args:
points (np.ndarray): numpy array with the points (x, y)
knees (np.ndarray): knees indexes
expected (np.ndarray): numpy array with the expected knee points (x, y)
s (Strategy): enum that controls the point matching (default Strategy.expected)
Returns:
float: the worst case MAE
"""
# get the knee points
knee_points = points[knees]
error = 0.0
if s is Strategy.knees:
a = knee_points
b = expected
elif s is Strategy.expected:
a = expected
b = knee_points
elif s is Strategy.best:
if len(expected) <= len(knee_points):
a = expected
b = knee_points
else:
a = knee_points
b = expected
else:
if len(expected) >= len(knee_points):
a = expected
b = knee_points
else:
a = knee_points
b = expected
for p in a:
distances = np.linalg.norm(b-p, axis=1)
idx = np.argmin(distances)
error += np.sum(np.abs(p-b[idx]))
return error / (len(a)*2.0)
def mse(points: np.ndarray, knees: np.ndarray, expected: np.ndarray, s: Strategy = Strategy.expected) -> float:
"""
Estimates the worst case Mean Squared Error (MSE) for the given
knee and expected points.
    Supports different size arrays, and estimates the MSE based
    on the worst case.
    It uses the euclidean distance to find the closest point,
    and computes the error based on that point.
Args:
points (np.ndarray): numpy array with the points (x, y)
knees (np.ndarray): knees indexes
expected (np.ndarray): numpy array with the expected knee points (x, y)
s (Strategy): enum that controls the point matching (default Strategy.expected)
Returns:
float: the worst case MSE
"""
# get the knee points
knee_points = points[knees]
error = 0.0
if s is Strategy.knees:
a = knee_points
b = expected
elif s is Strategy.expected:
a = expected
b = knee_points
elif s is Strategy.best:
if len(expected) <= len(knee_points):
a = expected
b = knee_points
else:
a = knee_points
b = expected
else:
if len(expected) >= len(knee_points):
a = expected
b = knee_points
else:
a = knee_points
b = expected
for p in a:
distances = np.linalg.norm(b-p, axis=1)
idx = np.argmin(distances)
error += np.sum(np.square(p-b[idx]))
return error / (len(a)*2.0)
def rmse(points: np.ndarray, knees: np.ndarray, expected: np.ndarray, s: Strategy = Strategy.expected) -> float:
"""
Estimates the worst case Root Mean Squared Error (RMSE) for the given
knee and expected points.
    Supports different size arrays, and estimates the RMSE based
    on the worst case.
    It uses the euclidean distance to find the closest point,
    and computes the error based on that point.
Args:
points (np.ndarray): numpy array with the points (x, y)
knees (np.ndarray): knees indexes
expected (np.ndarray): numpy array with the expected knee points (x, y)
s (Strategy): enum that controls the point matching (default Strategy.expected)
Returns:
float: the worst case RMSE
"""
return math.sqrt(mse(points, knees, expected, s))
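`mae`, `mse`, and `rmse` all share the same nearest-neighbour matching step. A self-contained sketch of that matching, using only NumPy:

```python
import numpy as np

def closest_point_mae(a: np.ndarray, b: np.ndarray) -> float:
    """Match every point in `a` with its nearest point in `b` and average
    the per-coordinate absolute error (hence the division by 2 * len(a))."""
    error = 0.0
    for p in a:
        idx = np.argmin(np.linalg.norm(b - p, axis=1))
        error += np.sum(np.abs(p - b[idx]))
    return error / (len(a) * 2.0)

expected = np.array([[1.0, 1.0], [5.0, 5.0]])
detected = np.array([[1.5, 1.0], [4.0, 5.0], [9.0, 9.0]])
# (1,1)->(1.5,1) gives 0.5; (5,5)->(4,5) gives 1.0; (0.5 + 1.0) / 4 = 0.375
result = closest_point_mae(expected, detected)  # → 0.375
```

Note the asymmetry the `Strategy` enum controls: iterating over `expected` versus `knees` generally gives different errors, since the spare points of the larger set never contribute.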
# Common/Python/Data-Structures/Hashs/__init__.py
# (repo: MattiKemp/Data-Structures-And-Algorithms, license: Unlicense)
from . import HashMap
from . import Set
# tests/integration/mongodb/factory/ar_conn/ar_other.py
# (repo: RaenonX/Jelly-Bot-API, license: MIT)
import time
from flags import AutoReplyContentType
from models import AutoReplyContentModel, AutoReplyModuleModel
from models.ar import UniqueKeywordCountEntry
from mongodb.factory.ar_conn import AutoReplyManager, AutoReplyModuleManager
from tests.base import TestModelMixin
from ._base_ar import TestAutoReplyManagerBase
__all__ = ["TestAutoReplyManagerOther"]
class TestAutoReplyManagerOther(TestAutoReplyManagerBase.TestClass, TestModelMixin):
def test_get_count_call(self):
mdl = AutoReplyManager.add_conn(**self.get_mdl_1_args()).model
self.assertEqual(mdl.called_count, 0)
for i in range(1, 4):
AutoReplyManager.get_responses(
self.get_mdl_1().keyword.content, self.get_mdl_1().keyword.content_type, self.get_mdl_1().channel_oid,
update_async=False
)
mdl = AutoReplyModuleManager.find_one_casted({
AutoReplyModuleModel.KEY_KW_CONTENT: self.get_mdl_1().keyword.content,
AutoReplyModuleModel.KEY_KW_TYPE: self.get_mdl_1().keyword.content_type
})
self.assertEqual(mdl.called_count, i)
def test_get_after_add_multi(self):
AutoReplyManager.add_conn(**self.get_mdl_1_args())
AutoReplyManager.add_conn(**self.get_mdl_4_args())
AutoReplyManager.add_conn(**self.get_mdl_6_args())
resp = AutoReplyManager.get_responses(
self.get_mdl_1().keyword.content, self.get_mdl_1().keyword.content_type,
self.get_mdl_1().channel_oid)[0][0]
resp_expected = self.get_mdl_1().responses[0]
self.assertModelEqual(resp, resp_expected)
resp = AutoReplyManager.get_responses(
self.get_mdl_4().keyword.content, self.get_mdl_4().keyword.content_type,
self.get_mdl_4().channel_oid)[0][0]
resp_expected = self.get_mdl_4().responses[0]
self.assertModelEqual(resp, resp_expected)
resp = AutoReplyManager.get_responses(
self.get_mdl_6().keyword.content, self.get_mdl_6().keyword.content_type,
self.get_mdl_6().channel_oid)[0][0]
resp_expected = self.get_mdl_6().responses[0]
self.assertModelEqual(resp, resp_expected)
def test_get_on_cooldown(self):
AutoReplyManager.add_conn(**self.get_mdl_3_args())
# Call once to record last used time
AutoReplyManager.get_responses(
self.get_mdl_3().keyword.content, self.get_mdl_3().keyword.content_type, self.get_mdl_3().channel_oid,
update_async=False)
resp = AutoReplyManager.get_responses(
self.get_mdl_3().keyword.content, self.get_mdl_3().keyword.content_type, self.get_mdl_3().channel_oid,
update_async=False)
self.assertEqual(resp, [])
mdl = AutoReplyModuleManager.find_one_casted({
AutoReplyModuleModel.KEY_KW_CONTENT: self.get_mdl_3().keyword.content,
AutoReplyModuleModel.KEY_KW_TYPE: self.get_mdl_3().keyword.content_type
})
self.assertEqual(mdl.called_count, 1)
def test_get_after_cooldown(self):
AutoReplyManager.add_conn(**self.get_mdl_3_args())
# Call once to record last used time
AutoReplyManager.get_responses(
self.get_mdl_3().keyword.content, self.get_mdl_3().keyword.content_type, self.get_mdl_3().channel_oid,
update_async=False
)
time.sleep(1.1) # Cooldown of model #3 is 1 sec
resp = AutoReplyManager.get_responses(
self.get_mdl_3().keyword.content, self.get_mdl_3().keyword.content_type, self.get_mdl_3().channel_oid,
update_async=False
)
self.assertEqual(resp, [(self.get_mdl_3().responses[0], False)])
mdl = AutoReplyModuleManager.find_one_casted({
AutoReplyModuleModel.KEY_KW_CONTENT: self.get_mdl_3().keyword.content,
AutoReplyModuleModel.KEY_KW_TYPE: self.get_mdl_3().keyword.content_type
})
self.assertEqual(mdl.called_count, 2)
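The real cooldown logic lives in the auto-reply module model, outside this excerpt. Conceptually it is a per-module gate, which the two tests above exercise from both sides; a hypothetical stand-in:

```python
import time

class CooldownGate:
    """Toy stand-in for the module cooldown: a response fires only if at
    least `cooldown` seconds have passed since the previous firing."""

    def __init__(self, cooldown: float):
        self.cooldown = cooldown
        self.last_used = None

    def try_fire(self, now=None) -> bool:
        now = time.time() if now is None else now
        if self.last_used is not None and now - self.last_used < self.cooldown:
            return False  # still cooling down -> no response
        self.last_used = now
        return True

gate = CooldownGate(cooldown=1.0)
assert gate.try_fire(now=100.0)      # first call fires
assert not gate.try_fire(now=100.5)  # within cooldown, suppressed
assert gate.try_fire(now=101.2)      # cooldown elapsed, fires again
```

Note that in the tests above the suppressed call still leaves `called_count` untouched, while a successful call both responds and records the last-used time.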
def test_get_list_by_keyword_including_inactive(self):
expected_oids = self._add_call_module_kw_a()
for idx, actual_mdl in enumerate(AutoReplyManager.get_conn_list(self.channel_oid, "A", active_only=False)):
with self.subTest(expected=expected_oids[idx], actual=actual_mdl):
self.assertEqual(expected_oids[idx], actual_mdl.id)
def test_get_list_by_keyword_active_only(self):
expected_oids = self._add_call_module_kw_a()[2:]
for idx, actual_mdl in enumerate(AutoReplyManager.get_conn_list(self.channel_oid, "A")):
with self.subTest(expected=expected_oids[idx], actual=actual_mdl):
self.assertEqual(expected_oids[idx], actual_mdl.id)
def test_get_list_by_oids(self):
mdl_oids = self._add_call_module_kw_a()
expected_kw_resp = (
(AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT)]),
(AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="C", ContentType=AutoReplyContentType.TEXT)]),
(AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="D", ContentType=AutoReplyContentType.TEXT)])
)
for mdl, kw_resp in zip(AutoReplyManager.get_conn_list_oids(mdl_oids), expected_kw_resp):
kw, resp = kw_resp
self.assertEqual(mdl.keyword, kw)
self.assertEqual(mdl.responses, resp)
def test_get_response_no_redirect(self):
AutoReplyManager.add_conn(**self.get_mdl_17_args())
self.assertEqual(AutoReplyManager.get_responses("A", AutoReplyContentType.TEXT, self.channel_oid),
[(AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT), True)])
def test_get_response_should_redirect(self):
AutoReplyManager.add_conn(**self.get_mdl_1_args())
self.assertEqual(AutoReplyManager.get_responses("A", AutoReplyContentType.TEXT, self.channel_oid),
[(AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT), False)])
def test_get_response_multiple_responses(self):
AutoReplyManager.add_conn(Keyword=AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
Responses=[AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT),
AutoReplyContentModel(Content="C", ContentType=AutoReplyContentType.TEXT),
AutoReplyContentModel(Content="D",
ContentType=AutoReplyContentType.TEXT)],
ChannelOid=self.channel_oid, CreatorOid=self.CREATOR_OID)
self.assertEqual(AutoReplyManager.get_responses("A", AutoReplyContentType.TEXT, self.channel_oid),
[(AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT), False),
(AutoReplyContentModel(Content="C", ContentType=AutoReplyContentType.TEXT), False),
(AutoReplyContentModel(Content="D", ContentType=AutoReplyContentType.TEXT), False)])
def test_stats_module_count_length_limited(self):
self._add_call_module_kw_a()
expected_kw_resp = {
"1": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT)]),
"2": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="C", ContentType=AutoReplyContentType.TEXT)]),
}
# Not using `zip()` to ensure that exceptions will be raised if len(actual) > len(expected)
# which means the tests is not completed
for rk, mdl in AutoReplyManager.get_module_count_stats(self.channel_oid, 2):
kw, resp = expected_kw_resp[rk]
self.assertEqual(kw, mdl.keyword)
self.assertEqual(resp, mdl.responses)
def test_stats_module_count_length_overlimit(self):
self._add_call_module_kw_a()
expected_kw_resp = {
"1": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT)]),
"2": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="C", ContentType=AutoReplyContentType.TEXT)]),
"3": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="D", ContentType=AutoReplyContentType.TEXT)]),
}
for rk, mdl in AutoReplyManager.get_module_count_stats(self.channel_oid, 5):
kw, resp = expected_kw_resp[rk]
self.assertEqual(kw, mdl.keyword)
self.assertEqual(resp, mdl.responses)
def test_stats_module_count_length_no_limit(self):
self._add_call_module_kw_a()
expected_kw_resp = {
"1": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="B", ContentType=AutoReplyContentType.TEXT)]),
"2": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="C", ContentType=AutoReplyContentType.TEXT)]),
"3": (AutoReplyContentModel(Content="A", ContentType=AutoReplyContentType.TEXT),
[AutoReplyContentModel(Content="D", ContentType=AutoReplyContentType.TEXT)]),
}
for rk, mdl in AutoReplyManager.get_module_count_stats(self.channel_oid):
kw, resp = expected_kw_resp[rk]
self.assertEqual(kw, mdl.keyword)
self.assertEqual(resp, mdl.responses)
def test_stats_unique_count_length_limited(self):
self._add_call_module_multi()
expected = [
UniqueKeywordCountEntry("B", AutoReplyContentType.TEXT, 9, 1, "1"),
UniqueKeywordCountEntry("A", AutoReplyContentType.TEXT, 7, 3, "2"),
UniqueKeywordCountEntry("C", AutoReplyContentType.TEXT, 0, 2, "3")
]
result = AutoReplyManager.get_unique_keyword_count_stats(self.channel_oid, 3)
self.assertEqual(result.limit, 3)
self.assertEqual(result.data, expected)
def test_stats_unique_count_length_overlimit(self):
self._add_call_module_multi()
expected = [
UniqueKeywordCountEntry("B", AutoReplyContentType.TEXT, 9, 1, "1"),
UniqueKeywordCountEntry("A", AutoReplyContentType.TEXT, 7, 3, "2"),
UniqueKeywordCountEntry("C", AutoReplyContentType.TEXT, 0, 2, "T3"),
UniqueKeywordCountEntry("D", AutoReplyContentType.TEXT, 0, 1, "T3")
]
result = AutoReplyManager.get_unique_keyword_count_stats(self.channel_oid, 5)
self.assertEqual(result.limit, 5)
self.assertEqual(result.data, expected)
def test_stats_unique_count_length_no_limit(self):
self._add_call_module_multi()
expected = [
UniqueKeywordCountEntry("B", AutoReplyContentType.TEXT, 9, 1, "1"),
UniqueKeywordCountEntry("A", AutoReplyContentType.TEXT, 7, 3, "2"),
UniqueKeywordCountEntry("C", AutoReplyContentType.TEXT, 0, 2, "T3"),
UniqueKeywordCountEntry("D", AutoReplyContentType.TEXT, 0, 1, "T3")
]
result = AutoReplyManager.get_unique_keyword_count_stats(self.channel_oid)
self.assertIsNone(result.limit)
self.assertEqual(result.data, expected)
# osm_multiplex/tests/test_lstm_preprocessing.py
# (repo: SoftwareDevEngResearch/osm_multiplex, license: MIT)
# third-party libraries
import pandas as pd
import pytest
# local imports
from .. import lstm_preprocessing
class TestSpatialGrouping:
"""Tests the output of a single location for the record"""
def test_selection_1(self):
"""Select the default, which is choosing the location from dataset 1"""
data_list = [[11.22, 33.44, 55.66, 77.88], [99.00, 11.22, 33.44, 55.66]]
target_list = [[11.22, 33.44], [99.00, 11.22]]
data = pd.DataFrame(data_list, columns=['lat1', 'lon1', 'lat2', 'lon2'])
target = pd.DataFrame(target_list, columns=['lat', 'lon'])
test = lstm_preprocessing.spatial_grouping(data)
assert test.equals(target)
def test_selection_2(self):
"""Select the location from dataset 2"""
data_list = [[11.22, 33.44, 55.66, 77.88], [99.00, 11.22, 33.44, 55.66]]
target_list = [[55.66, 77.88], [33.44, 55.66]]
data = pd.DataFrame(data_list, columns=['lat1', 'lon1', 'lat2', 'lon2'])
target = pd.DataFrame(target_list, columns=['lat', 'lon'])
test = lstm_preprocessing.spatial_grouping(data, location_selection='2')
assert test.equals(target)
def test_selection_osm(self):
"""Select the location by finding the nearest OSM node to the average"""
data_list = [[44.594487, -123.262589, 44.562769, -123.267733],
[44.594528, -123.261476, 44.563046, -123.268784]]
target_list = [[36921149, 44.57822, -123.264745],
[36921149, 44.57822, -123.264745]]
data = pd.DataFrame(data_list, columns=['lat1', 'lon1', 'lat2', 'lon2'])
target = pd.DataFrame(target_list, columns=['osm_id', 'lat', 'lon'])
test = lstm_preprocessing.spatial_grouping(data, location_selection='osm')
assert test.equals(target)
class TestAssignOsm:
"""Find the nearest OSM node to the average of the two dataset locations"""
def test_find_nearest(self):
data_list = [[44.594487, -123.262589, 44.562769, -123.267733],
[44.594528, -123.261476, 44.563046, -123.268784]]
target_list = [[36921149, 44.57822, -123.264745],
[36921149, 44.57822, -123.264745]]
data = pd.DataFrame(data_list, columns=['lat1', 'lon1', 'lat2', 'lon2'])
target = pd.DataFrame(target_list, columns=['osm_id', 'lat', 'lon'])
test = lstm_preprocessing.assign_osm(data)
assert test.equals(target)
class TestOccupancyLevel:
"""Tests the output of occupancy levels for both grouped and single user data"""
def test_both_individual(self):
"""Both datasets have individual identifiers"""
data_list = [['bike1', 'scooter1']]
target_list = [['bike1', 'scooter1', 1, 1]]
data = pd.DataFrame(data_list, columns=['element_id1', 'element_id2'])
target = pd.DataFrame(target_list, columns=['element_id1', 'element_id2', 'occupancy1', 'occupancy2'])
test = lstm_preprocessing.occupancy_level(data)
assert test.equals(target)
def test_both_grouped(self):
"""Both datasets have grouped counts"""
data_list = [['bike1', 'scooter1', 1519330080, 1519330081, 2, 1, 3, 1],
['bike1', 'scooter1', 1519330085, 1519330086, 3, 0, 2, 1],
['bike1', 'scooter1', 1519430080, 1519430081, 3, 1, 4, 2],
['bike1', 'scooter1', 1519430085, 1519430086, 1, 2, 0, 1]]
target_list = [['bike1', 'scooter1', '2018-02-22 20:08:00', '2018-02-22 20:08:01', 1, 2],
['bike1', 'scooter1', '2018-02-22 20:08:05', '2018-02-22 20:08:06', 4, 3],
['bike1', 'scooter1', '2018-02-23 23:54:40', '2018-02-23 23:54:41', 2, 2],
['bike1', 'scooter1', '2018-02-23 23:54:45', '2018-02-23 23:54:46', 1, 1]]
data = pd.DataFrame(data_list, columns=['element_id1', 'element_id2', 'timestamp1', 'timestamp2', 'boardings1', 'alightings1', 'boardings2', 'alightings2'])
target = pd.DataFrame(target_list, columns=['element_id1', 'element_id2', 'timestamp1', 'timestamp2', 'occupancy1', 'occupancy2'])
target['timestamp1'] = pd.to_datetime(target['timestamp1'])
target['timestamp2'] = pd.to_datetime(target['timestamp2'])
test = lstm_preprocessing.occupancy_level(data)
assert test.equals(target)
class TestDailyCumulative:
"""Test the cumulative sum of grouped data to derive occupancy"""
def test_summing_1_timestamp(self):
"""Test cumulative sum for dataset 1 with timestamp"""
data_list = [['bob1', 1519330080, 2, 1], ['bob1', 1519330085, 3, 0], ['bob1', 1519430080, 3, 1], ['bob1', 1519430085, 1, 2]]
target_list = [['bob1', '2018-02-22 20:08:00', 1], ['bob1', '2018-02-22 20:08:05', 4], ['bob1', '2018-02-23 23:54:40', 2], ['bob1', '2018-02-23 23:54:45', 1]]
data = pd.DataFrame(data_list, columns=['element_id1', 'timestamp1', 'boardings1', 'alightings1'])
target = pd.DataFrame(target_list, columns=['element_id1', 'timestamp1', 'occupancy1'])
target['timestamp1'] = pd.to_datetime(target['timestamp1'])
test = lstm_preprocessing.daily_cumulative(data, '1')
assert test.equals(target)
def test_summing_2_timestamp(self):
"""Test cumulative sum for dataset 2 with timestamp"""
data_list = [['bob2', 1519330080, 2, 1], ['bob2', 1519330085, 3, 0], ['bob2', 1519430080, 3, 1], ['bob2', 1519430085, 1, 2]]
target_list = [['bob2', '2018-02-22 20:08:00', 1], ['bob2', '2018-02-22 20:08:05', 4], ['bob2', '2018-02-23 23:54:40', 2], ['bob2', '2018-02-23 23:54:45', 1]]
data = pd.DataFrame(data_list, columns=['element_id2', 'timestamp2', 'boardings2', 'alightings2'])
target = pd.DataFrame(target_list, columns=['element_id2', 'timestamp2', 'occupancy2'])
target['timestamp2'] = pd.to_datetime(target['timestamp2'])
test = lstm_preprocessing.daily_cumulative(data, '2')
assert test.equals(target)
def test_summing_1_session(self):
"""Test cumulative sum for dataset 1 with session times"""
data_list = [['bob1', 1519330080, 1519330081, 2, 1],
['bob1', 1519330085, 1519330086, 3, 0],
['bob1', 1519430080, 1519430081, 3, 1],
['bob1', 1519430085, 1519430086, 1, 2]]
target_list = [['bob1', '2018-02-22 20:08:00', '2018-02-22 20:08:01', 1],
['bob1', '2018-02-22 20:08:05', '2018-02-22 20:08:06', 4],
['bob1', '2018-02-23 23:54:40', '2018-02-23 23:54:41', 2],
['bob1', '2018-02-23 23:54:45', '2018-02-23 23:54:46', 1]]
data = pd.DataFrame(data_list, columns=['element_id1', 'session_start1', 'session_end1', 'boardings1', 'alightings1'])
target = pd.DataFrame(target_list, columns=['element_id1', 'session_start1', 'session_end1', 'occupancy1'])
target['session_start1'] = pd.to_datetime(target['session_start1'])
target['session_end1'] = pd.to_datetime(target['session_end1'])
test = lstm_preprocessing.daily_cumulative(data, '1')
assert test.equals(target)
def test_summing_2_session(self):
"""Test cumulative sum for dataset 2 with session times"""
data_list = [['bob2', 1519330080, 1519330081, 2, 1],
['bob2', 1519330085, 1519330086, 3, 0],
['bob2', 1519430080, 1519430081, 3, 1],
['bob2', 1519430085, 1519430086, 1, 2]]
target_list = [['bob2', '2018-02-22 20:08:00', '2018-02-22 20:08:01', 1],
['bob2', '2018-02-22 20:08:05', '2018-02-22 20:08:06', 4],
['bob2', '2018-02-23 23:54:40', '2018-02-23 23:54:41', 2],
['bob2', '2018-02-23 23:54:45', '2018-02-23 23:54:46', 1]]
data = pd.DataFrame(data_list, columns=['element_id2', 'session_start2', 'session_end2', 'boardings2', 'alightings2'])
target = pd.DataFrame(target_list, columns=['element_id2', 'session_start2', 'session_end2', 'occupancy2'])
target['session_start2'] = pd.to_datetime(target['session_start2'])
target['session_end2'] = pd.to_datetime(target['session_end2'])
test = lstm_preprocessing.daily_cumulative(data, '2')
assert test.equals(target)
def test_invalid_identifier(self):
"""Tests if exception is raised when identifier parameter is not valid"""
data_list = [['bob2', 1519330080, 1519330081, 2, 1],
['bob2', 1519330085, 1519330086, 3, 0],
['bob2', 1519430080, 1519430081, 3, 1],
['bob2', 1519430085, 1519430086, 1, 2]]
target_list = [['bob2', '2018-02-22 20:08:00', '2018-02-22 20:08:01', 1],
['bob2', '2018-02-22 20:08:05', '2018-02-22 20:08:06', 4],
['bob2', '2018-02-23 23:54:40', '2018-02-23 23:54:41', 2],
['bob2', '2018-02-23 23:54:45', '2018-02-23 23:54:46', 1]]
data = pd.DataFrame(data_list, columns=['element_id2', 'session_start2', 'session_end2', 'boardings2', 'alightings2'])
with pytest.raises(Exception):
lstm_preprocessing.daily_cumulative(data, '3')
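The expectations in these tests imply that `daily_cumulative` computes occupancy as a running sum of boardings minus alightings, restarting at each calendar day. A hypothetical pandas sketch of that behaviour (not the real implementation, which is not shown here):

```python
import pandas as pd

def daily_occupancy(df: pd.DataFrame) -> pd.DataFrame:
    """Sketch of the behaviour the tests above describe: occupancy is the
    cumulative sum of boardings - alightings, reset every calendar day."""
    out = df.copy()
    out["timestamp"] = pd.to_datetime(out["timestamp"], unit="s")
    delta = out["boardings"] - out["alightings"]
    out["occupancy"] = delta.groupby(out["timestamp"].dt.date).cumsum()
    return out.drop(columns=["boardings", "alightings"])

data = pd.DataFrame(
    [[1519330080, 2, 1], [1519330085, 3, 0],
     [1519430080, 3, 1], [1519430085, 1, 2]],
    columns=["timestamp", "boardings", "alightings"])
occupancy = daily_occupancy(data)["occupancy"].tolist()  # → [1, 4, 2, 1]
```

The two unix timestamps fall on different days (2018-02-22 and 2018-02-23 UTC), which is why the running sum restarts at 2 on the third row rather than continuing from 4.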
class TestTimeGrouping:
"""Tests the grouping of records into specified time intervals"""
def test_timestamp1_session2_interval15_selection1(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
target_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
['2018-02-23 23:45:00', 44.44, 55.55, 5, 8]]
data = pd.DataFrame(data_list, columns=['timestamp1', 'session_start2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target = pd.DataFrame(target_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target['time'] = pd.to_datetime(target['time'])
target_multi = target.set_index(['time', 'lat', 'lon'])
test = lstm_preprocessing.time_grouping(data, interval='15T', time_selection='1')
assert test.equals(target_multi)
def test_timestamp1_session2_interval15_selection2(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
target_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
['2018-02-23 23:45:00', 44.44, 55.55, 5, 8]]
data = pd.DataFrame(data_list, columns=['timestamp1', 'session_start2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target = pd.DataFrame(target_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target['time'] = pd.to_datetime(target['time'])
target_multi = target.set_index(['time', 'lat', 'lon'])
test = lstm_preprocessing.time_grouping(data, interval='15T', time_selection='2')
assert test.equals(target_multi)
def test_session1_timestamp2_interval60_selection2(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
target_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
['2018-02-23 23:00:00', 44.44, 55.55, 5, 8]]
data = pd.DataFrame(data_list, columns=['session_start1', 'timestamp2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target = pd.DataFrame(target_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target['time'] = pd.to_datetime(target['time'])
target_multi = target.set_index(['time', 'lat', 'lon'])
test = lstm_preprocessing.time_grouping(data, interval='60T', time_selection='2')
assert test.equals(target_multi)
def test_session1_timestamp2_interval60_selection1(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
target_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
['2018-02-23 23:00:00', 44.44, 55.55, 5, 8]]
data = pd.DataFrame(data_list, columns=['session_start1', 'timestamp2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target = pd.DataFrame(target_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target['time'] = pd.to_datetime(target['time'])
target_multi = target.set_index(['time', 'lat', 'lon'])
test = lstm_preprocessing.time_grouping(data, interval='60T', time_selection='1')
assert test.equals(target_multi)
def test_session1_timestamp2_interval30_selectionavg(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
target_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
['2018-02-23 23:30:00', 44.44, 55.55, 5, 8]]
data = pd.DataFrame(data_list, columns=['session_start1', 'timestamp2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target = pd.DataFrame(target_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target['time'] = pd.to_datetime(target['time'])
target_multi = target.set_index(['time', 'lat', 'lon'])
test = lstm_preprocessing.time_grouping(data, interval='30T', time_selection='avg')
assert test.equals(target_multi)
def test_timestamp1_session2_interval30_selectionavg(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
target_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
['2018-02-23 23:30:00', 44.44, 55.55, 5, 8]]
data = pd.DataFrame(data_list, columns=['timestamp1', 'session_start2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target = pd.DataFrame(target_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
target['time'] = pd.to_datetime(target['time'])
target_multi = target.set_index(['time', 'lat', 'lon'])
test = lstm_preprocessing.time_grouping(data, interval='30T', time_selection='avg')
assert test.equals(target_multi)
def test_invalid_time_selection(self):
data_list = [[1519330080, 1519330090, 44.44, 55.55, 2, 3],
[1519330081, 1519330030, 44.44, 55.55, 1, 4],
[1519430080, 1519430090, 44.44, 55.55, 3, 2],
[1519430081, 1519430030, 44.44, 55.55, 2, 6]]
data = pd.DataFrame(data_list, columns=['timestamp1', 'session_start2', 'lat', 'lon', 'occupancy1', 'occupancy2'])
with pytest.raises(Exception):
lstm_preprocessing.time_grouping(data, interval='30T', time_selection='3')
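The grouping these tests expect can be approximated by flooring each record's time to the interval and summing occupancy per `(time, lat, lon)`. A hypothetical sketch (the real `time_grouping` additionally selects between the two datasets' time columns; `"15min"` is the modern spelling of the `'15T'` alias used in the tests):

```python
import pandas as pd

def time_grouping_sketch(df: pd.DataFrame, interval: str = "15min") -> pd.DataFrame:
    """Floor each record to the interval and sum occupancy per location."""
    out = df.copy()
    out["time"] = pd.to_datetime(out["timestamp"], unit="s").dt.floor(interval)
    return out.groupby(["time", "lat", "lon"])[["occupancy1", "occupancy2"]].sum()

data = pd.DataFrame(
    [[1519330080, 44.44, 55.55, 2, 3],
     [1519330081, 44.44, 55.55, 1, 4]],
    columns=["timestamp", "lat", "lon", "occupancy1", "occupancy2"])
grouped = time_grouping_sketch(data)  # one 15-minute bucket at 2018-02-22 20:00
```

The resulting frame is indexed by the same `['time', 'lat', 'lon']` MultiIndex the tests build with `set_index` on their target frames.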
class TestWeeklyDifferenceDataframes:
    def test_two_weeks(self):
        data_list = [['2018-02-22 20:00:00', 44.44, 55.55, 3, 7],
                     ['2018-02-23 23:30:00', 44.44, 55.55, 5, 8],
                     ['2018-02-24 00:00:00', 44.44, 55.55, 4, 3],
                     ['2018-02-24 00:30:00', 44.44, 55.55, 2, 3],
                     ['2018-02-24 01:00:00', 66.66, 77.77, 5, 5],
                     ['2018-02-24 01:30:00', 66.66, 77.77, 3, 3],
                     ['2018-02-24 02:00:00', 66.66, 77.77, 7, 8],
                     ['2018-02-24 02:30:00', 66.66, 77.77, 3, 5],
                     ['2018-02-24 03:00:00', 66.66, 77.77, 4, 5],
                     ['2018-03-24 03:30:00', 44.44, 55.55, 7, 8],
                     ['2018-03-24 04:00:00', 44.44, 55.55, 6, 5],
                     ['2018-03-24 04:30:00', 44.44, 55.55, 9, 8],
                     ['2018-03-24 05:00:00', 44.44, 55.55, 2, 2],
                     ['2018-03-24 05:30:00', 44.44, 55.55, 8, 8],
                     ['2018-03-24 06:00:00', 44.44, 55.55, 6, 5],
                     ['2018-03-24 06:30:00', 44.44, 55.55, 7, 8],
                     ['2018-03-24 07:00:00', 44.44, 55.55, 2, 4],
                     ['2018-03-24 07:30:00', 44.44, 55.55, 5, 4]]
        data = pd.DataFrame(data_list, columns=['time', 'lat', 'lon', 'occupancy1', 'occupancy2'])
        data['time'] = pd.to_datetime(data['time'])
        data_multi = data.set_index(['time', 'lat', 'lon'])
        test = lstm_preprocessing.weekly_dataframes(data_multi)
        assert test is not None  # needs a stronger assertion: a dictionary of DataFrames cannot be hashed for direct comparison
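The comment above notes that there is no easy way to hash or directly compare a dictionary of DataFrames. A minimal sketch of a stronger assertion, comparing the dicts key by key with `pandas.testing.assert_frame_equal` (the `dict_of_frames_equal` helper and the `week_*` sample dicts are hypothetical, not part of `lstm_preprocessing`):

```python
import pandas as pd
from pandas.testing import assert_frame_equal


def dict_of_frames_equal(left, right):
    """True when both dicts have the same keys and an equal DataFrame under each key."""
    if left.keys() != right.keys():
        return False
    for key in left:
        try:
            # Compares values, dtypes, index, and column labels.
            assert_frame_equal(left[key], right[key])
        except AssertionError:
            return False
    return True


# Hypothetical outputs shaped like a weekly_dataframes() result.
week_a = {'Saturday': pd.DataFrame({'occupancy1': [3, 5]})}
week_b = {'Saturday': pd.DataFrame({'occupancy1': [3, 5]})}
week_c = {'Saturday': pd.DataFrame({'occupancy1': [9, 5]})}
```

With this helper, the test could assert `dict_of_frames_equal(test, expected)` against a hand-built expected dict instead of only checking for `None`.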
# posts/tests.py (BattleMageBro/yatube, BSD-3-Clause)
from django.test import TestCase, Client, override_settings
from .models import User, Post, Group, Follow
from django.urls import reverse
TEST_CACHE = {
    'default': {
        'BACKEND': 'django.core.cache.backends.dummy.DummyCache',
    }
}
# Create your tests here.
class ViewsTests(TestCase):
    def setUp(self):
        self.client = Client()
        self.user = User.objects.create_user(username='pupa', email='pupa@mail.ru', password='12345678')
        self.post = Post.objects.create(text="Привет, давно не виделись!", author=self.user)

    def test_profile_view(self):
        response = self.client.get('/pupa/')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['post_count'], 1)
        self.assertIsInstance(response.context['user_profile'], User)
        self.assertEqual(response.context['user_profile'].username, self.user.username)

    def test_new_post(self):
        self.client.login(username='pupa', password='12345678')
        response = self.client.post('/new/', {'text': 'Создаю Новый пост!'}, follow=True)
        self.assertRedirects(response, '/')
        response = self.client.get(reverse('profile', kwargs={'username': self.user.username}))
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['post_count'], 2)
        self.assertIsInstance(response.context['user_profile'], User)
        self.assertEqual(response.context['user_profile'].username, self.user.username)

    def test_new_post_no_auth(self):
        response = self.client.post(reverse('new_post'), follow=True)
        self.assertRedirects(response, '/auth/login/?next=%2Fnew%2F')

    def test_triple_post_index(self):
        self.client.login(username='pupa', password='12345678')
        response = self.client.get('/')
        self.assertContains(response, self.post.text, count=1)

    def test_triple_post_profile(self):
        self.client.force_login(self.user)
        response = self.client.get(reverse('profile', kwargs={'username': self.user.username}))
        self.assertContains(response, self.post.text, count=1)

    def test_triple_post_post_view(self):
        self.client.force_login(self.user)
        response = self.client.get(reverse('posts', kwargs={'username': self.user.username, 'post_id': self.post.id}))
        self.assertContains(response, self.post.text, count=1)

    @override_settings(CACHES=TEST_CACHE)
    def test_triple_post_edit_index(self):
        self.client.login(username='pupa', password='12345678')
        self.client.post(f'/{self.user.username}/{self.post.id}/edit/', {'text': 'О, это снова Вы!'}, follow=False)
        response = self.client.get('/')
        self.assertContains(response, 'О, это снова Вы!', count=1)

    @override_settings(CACHES=TEST_CACHE)
    def test_triple_post_edit_profile(self):
        self.client.force_login(self.user)
        self.client.post(f'/{self.user.username}/{self.post.id}/edit/', {'text': 'О, это снова Вы!'}, follow=False)
        response = self.client.get(reverse('profile', kwargs={'username': self.user.username}))
        self.assertContains(response, 'О, это снова Вы!', count=1)

    @override_settings(CACHES=TEST_CACHE)
    def test_triple_post_edit_post_view(self):
        self.client.force_login(self.user)
        self.client.post(f'/{self.user.username}/{self.post.id}/edit/', {'text': 'О, это снова Вы!'}, follow=False)
        response = self.client.get(reverse('posts', kwargs={'username': self.user.username, 'post_id': self.post.id}))
        self.assertContains(response, 'О, это снова Вы!', count=1)
class FollowTest(TestCase):
    def setUp(self):
        self.client = Client()
        self.user1 = User.objects.create_user(username='pupa', email='pupa@mail.ru', password='12345')
        self.user2 = User.objects.create_user(username='lupa', email='lupa@mail.ru', password='54321')
        self.user3 = User.objects.create_user(username='kamon', email='kam@mail.ru', password='15243')
        self.post = Post.objects.create(text='Для всех моих подписчиков', author=self.user2)

    def test_follow(self):
        self.client.force_login(self.user1)
        self.client.get(reverse('profile_follow', kwargs={'username': self.user2.username}))
        self.assertEqual(Follow.objects.count(), 1)

    def test_unfollow(self):
        self.client.force_login(self.user1)
        self.client.get(reverse('profile_follow', kwargs={'username': self.user2.username}))
        self.client.get(reverse('profile_unfollow', kwargs={'username': self.user2.username}))
        self.assertEqual(Follow.objects.count(), 0)

    def test_follow_post(self):
        self.client.force_login(self.user1)
        self.client.get(reverse('profile_follow', kwargs={'username': self.user2.username}))
        response = self.client.get('/follow/')
        self.assertContains(response, self.post.text, status_code=200)
        self.client.force_login(self.user3)
        response = self.client.get('/follow/')
        self.assertNotContains(response, self.post.text, status_code=200)
class CommentTest(TestCase):
    def setUp(self):
        self.client = Client()
        self.user = User.objects.create_user(username='pupa', email='pupa@mail.ru', password='12345')
        self.post = Post.objects.create(text='Для комментиков=)', author=self.user)

    def test_add_comment_no_auth(self):
        response = self.client.post(reverse('add_comment', kwargs={'username': self.user.username, 'post_id': self.post.id}), follow=True)
        self.assertRedirects(response, '/auth/login/?next=%2Fpupa%2F1%2Fcomment%2F')

    def test_add_comment(self):
        self.client.force_login(self.user)
        response = self.client.post(reverse('add_comment', kwargs={'username': self.user.username, 'post_id': self.post.id}), {'text': 'Комментик=)'}, follow=True)
        self.assertContains(response, 'Комментик=)')
class ImageTest(TestCase):
    def setUp(self):
        self.client = Client()
        self.user = User.objects.create_user(username='pupa', email='pupa@mail.ru', password='12345')
        self.post = Post.objects.create(text='Картиночка подъехала', author=self.user)
        self.group = Group.objects.create(title='Любители картинок', slug='Love', description='lovelove')

    @override_settings(CACHES=TEST_CACHE)
    def test_image_post(self):
        self.client.force_login(self.user)
        with open('media/tests/Test.jpg', 'rb') as fp:
            response = self.client.post(reverse('post_edit', kwargs={'username': self.user.username, 'post_id': self.post.id}), {'text': self.post.text, 'image': fp, 'group': self.group.id}, follow=True)
        response_dec = response.content.decode('utf-8')
        self.assertIn('<img', response_dec)
        response = self.client.get('/')
        response_dec = response.content.decode('utf-8')
        self.assertIn('<img', response_dec)
        response = self.client.get(reverse('profile', kwargs={'username': self.user.username}))
        response_dec = response.content.decode('utf-8')
        self.assertIn('<img', response_dec)
# electronics_lib/DigikeyCapacitorTable.py (lab11/PolymorphicBlocks, BSD-3-Clause)
from .DigikeyTable import *
class DigikeyCapacitorTable(DigikeyTable, CapacitorTable):
    pass
# Local_debug.py (Doreamonsky/Shnu, Apache-2.0)
#!/usr/bin/python
# -*- coding: UTF-8 -*-
import Function_adapter
print Function_adapter.MyAdapter().run('courses_list_by_keywords', '数学')
# vizbert/extract/__init__.py (daemon/vizbert, MIT)
from .base import *
from .transformer import *
# pirates/leveleditor/worldData/pvpShipIsland2.py (itsyaboyrocket/pirates, BSD-3-Clause)
# uncompyle6 version 3.2.0
# Python bytecode 2.4 (62061)
# Decompiled from: Python 2.7.14 (v2.7.14:84471935ed, Sep 16 2017, 20:19:30) [MSC v.1500 32 bit (Intel)]
# Embedded file name: pirates.leveleditor.worldData.pvpShipIsland2
from pandac.PandaModules import Point3, VBase3, Vec4, Vec3
objectStruct = {'AmbientColors': {0: Vec4(0.207843, 0.243137, 0.447059, 1), 2: Vec4(0.666667, 0.721569, 0.792157, 1), 4: Vec4(0.721569, 0.611765, 0.619608, 1), 6: Vec4(0.207843, 0.243137, 0.447059, 1), 8: Vec4(0.384314, 0.419608, 0.564706, 1)}, 'DirectionalColors': {0: Vec4(0.956863, 0.909804, 0.894118, 1), 2: Vec4(1, 1, 1, 1), 4: Vec4(0.439216, 0.176471, 0, 1), 6: Vec4(0.513726, 0.482353, 0.639216, 1), 8: Vec4(0.447059, 0.439216, 0.537255, 1)}, 'FogColors': {0: Vec4(0.172549, 0.180392, 0.290196, 1), 2: Vec4(0.894118, 0.894118, 1, 1), 4: Vec4(0.231373, 0.203922, 0.184314, 1), 6: Vec4(0.172549, 0.180392, 0.290196, 1), 8: Vec4(0.129412, 0.137255, 0.203922, 1)}, 'FogRanges': {0: 0.000699999975040555, 2: 0.00019999999494757503, 4: 0.00039999998989515007, 6: 0.000699999975040555, 8: 0.0}, 'Objects': {'1196970035.53sdnaik': {'Type': 'Island', 'Name': 'pvpShipIsland2', 'File': '', 'Environment': 'OpenSky', 'Minimap': False, 'Objects': {'1201551808.32kmuller': {'Type': 'Dinghy', 'Aggro Radius': '20.0000', 'GridPos': Point3(42.304, 291.037, 0.81), 'Hpr': VBase3(-149.207, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(-8.509, -280.365, 0.81), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1201551834.96kmuller': {'Type': 'Dinghy', 'Aggro Radius': '20.0000', 'GridPos': Point3(8.739, -231.852, 0.612), 'Hpr': VBase3(-136.695, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(-64.569, 222.851, 0.612), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1201551889.37kmuller': {'Type': 'Player Spawn Node', 'GridPos': Point3(62.095, -172.567, 3.658), 'Hpr': VBase3(129.217, 0.0, 0.0), 'Index': 1, 'Pos': Point3(-101.998, 152.419, 3.658), 'Scale': VBase3(1.0, 1.0, 1.0), 'Spawnables': 'All', 'Visual': {'Color': (0.5, 0.5, 0.5, 1), 'Model': 'models/misc/smiley'}}, '1201551915.76kmuller': {'Type': 'Player Spawn Node', 'GridPos': Point3(405.977, 55.544, 6.652), 'Hpr': VBase3(69.496, 0.0, 0.0), 'Index': 1, 
'Pos': Point3(492.088, 16.466, 2.257), 'Scale': VBase3(1.0, 1.0, 1.0), 'Spawnables': 'All', 'Visual': {'Color': (0.5, 0.5, 0.5, 1), 'Model': 'models/misc/smiley'}}, '1201558997.82kmuller': {'Type': 'Player Spawn Node', 'GridPos': Point3(186.876, -349.348, 1.25), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Index': -1, 'Pos': Point3(-265.84, 293.761, 1.25), 'Scale': VBase3(1.0, 1.0, 1.0), 'Spawnables': 'All', 'Visual': {'Color': (0.5, 0.5, 0.5, 1), 'Model': 'models/misc/smiley'}}, '1201559007.18kmuller': {'Type': 'Player Spawn Node', 'GridPos': Point3(-229.903, -58.379, 1.835), 'Hpr': VBase3(-147.259, 0.0, 0.0), 'Index': -1, 'Pos': Point3(208.951, 112.263, 1.835), 'Scale': VBase3(1.0, 1.0, 1.0), 'Spawnables': 'All', 'Visual': {'Color': (0.5, 0.5, 0.5, 1), 'Model': 'models/misc/smiley'}}, '1201559028.84kmuller': {'Type': 'Player Spawn Node', 'GridPos': Point3(-454.921, -16.539, 56.948), 'Hpr': VBase3(-117.589, 0.0, 0.0), 'Index': -1, 'Pos': Point3(-417.183, 182.286, 1.761), 'Scale': VBase3(1.0, 1.0, 1.0), 'Spawnables': 'All', 'Visual': {'Color': (0.5, 0.5, 0.5, 1), 'Model': 'models/misc/smiley'}}, '1202414940.17akelts': {'Type': 'Building Exterior', 'File': '', 'ExtUid': '1202414940.17akelts0', 'GridPos': Point3(91.131, -360.491, 0.586), 'Hpr': VBase3(-104.772, 0.0, 0.0), 'Objects': {}, 'Pos': Point3(-175.635, 327.736, 0.586), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Door': 'models/buildings/shanty_guildhall_door', 'Model': 'models/buildings/shanty_repairshop_exterior', 'SignFrame': 'models/buildings/sign1_shanty_a_frame', 'SignImage': 'models/buildings/sign1_eng_a_icon_shipwright'}}, '1203114290.45akelts': {'Type': 'Dinghy', 'Aggro Radius': '20.0000', 'GridPos': Point3(420.918, 330.9, -0.561), 'Hpr': VBase3(167.783, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(-328.363, -422.9, -0.561), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1203114330.25akelts': {'Type': 'Dinghy', 'Aggro Radius': '20.0000', 'GridPos': 
Point3(-246.537, -87.543, -0.021), 'Hpr': VBase3(162.527, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(218.035, 144.585, -0.021), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1203114365.52akelts': {'Type': 'Building Exterior', 'File': '', 'ExtUid': '1203114365.52akelts0', 'GridPos': Point3(-374.828, 18.538, 5.551), 'Hpr': VBase3(-127.2, -0.354, -1.185), 'Objects': {'1210373727.44akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-180.0, 0.0, 0.0), 'Pos': Point3(-0.277, -13.756, 1.561), 'Scale': VBase3(1.0, 1.0, 1.0)}}, 'Pos': Point3(368.179, 72.692, 5.551), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.89, 0.82, 0.807843137254902, 1.0), 'Door': 'models/buildings/shanty_guildhall_door', 'Model': 'models/buildings/shanty_guildhall_exterior', 'SignFrame': '', 'SignImage': 'models/buildings/sign1_eng_a_icon_barber'}}, '1203114365.58akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'GridPos': Point3(-345.867, -4.901, 7.173), 'Hpr': VBase3(142.807, 1.185, -0.354), 'Pos': Point3(357.407, 81.242, 6.653), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1203114419.42akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-370.33, 40.493, 50.666), 'Hpr': VBase3(-36.749, 10.145, 0.354), 'Objects': {}, 'Pos': Point3(369.126, 50.301, 50.666), 'Scale': VBase3(0.487, 0.487, 0.487), 'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1203114463.95akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-353.885, 53.931, 46.962), 'Hpr': VBase3(-36.894, -2.657, 0.493), 'Objects': {}, 'Pos': Point3(356.42, 33.283, 46.962), 'Scale': VBase3(0.487, 0.487, 0.487), 'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1203114579.48akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-337.185, 67.472, 48.024), 'Hpr': VBase3(-35.933, -14.761, 0.771), 'Objects': {}, 'Pos': Point3(343.492, 16.105, 48.024), 'Scale': VBase3(0.487, 0.487, 0.487), 
'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1203114754.3akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-317.877, 77.461, 54.14), 'Hpr': VBase3(-36.042, -20.792, -2.004), 'Pos': Point3(327.174, 1.741, 54.14), 'Scale': VBase3(0.481, 0.481, 0.481), 'Visual': {'Model': 'models/props/shanty_rope_bridge_post'}}, '1203114815.91akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-323.541, 84.152, 53.696), 'Hpr': VBase3(-36.042, -20.792, -2.004), 'Pos': Point3(334.289, -3.381, 53.696), 'Scale': VBase3(0.481, 0.481, 0.481), 'Visual': {'Model': 'models/props/shanty_rope_bridge_post'}}, '1203114862.61akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-367.832, 37.147, 51.229), 'Hpr': VBase3(136.33, -14.593, 0.248), 'Pos': Point3(365.892, 52.943, 51.229), 'Scale': VBase3(0.5, 0.5, 0.5), 'Visual': {'Model': 'models/props/shanty_rope_bridge_post'}}, '1203114895.23akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-373.097, 43.489, 51.215), 'Hpr': VBase3(141.84, -16.787, -4.14), 'Pos': Point3(372.535, 48.063, 51.215), 'Scale': VBase3(0.5, 0.5, 0.5), 'Visual': {'Model': 'models/props/shanty_rope_bridge_post'}}, '1203115389.72akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'GridPos': Point3(-0.277, -13.756, 1.023), 'Hpr': VBase3(104.0, 0.0, 0.0), 'Pos': Point3(-3.059, 13.414, 1.023), 'Scale': VBase3(1.0, 1.0, 1.0), 'TargetUIDs': []}, '1203115424.56akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(133.745, -12.642, 1.974), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-132.831, -20.089, 1.974), 'Scale': VBase3(0.643, 0.643, 0.819), 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_med'}}, '1203115489.25akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(297.026, -66.639, 10.052), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-304.324, -7.198, 10.052), 
'Scale': VBase3(0.795, 0.795, 1.013), 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_med2'}}, '1203115525.55akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(375.859, 82.108, 1.086), 'Hpr': VBase3(-104.124, 0.0, 0.0), 'Pos': Point3(-344.831, -170.598, 1.086), 'Scale': VBase3(0.456, 0.418, 0.71), 'Visual': {'Color': (0.42, 0.42, 0.33725490196078434, 1.0), 'Model': 'models/props/mound_light_lrg'}}, '1203115567.8akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-587.491, 312.528, -68.942), 'Hpr': VBase3(119.847, 0.0, 0.0), 'Pos': Point3(645.647, -161.118, -68.942), 'Scale': VBase3(1.183, 0.52, 0.922), 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_lrg'}}, '1203115686.2akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(297.269, -154.801, 13.946), 'Hpr': VBase3(170.573, 0.0, 0.0), 'Pos': Point3(-325.889, 78.287, 12.489), 'Scale': VBase3(4.557, 4.557, 5.809), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/rock_group_1_sphere'}}, '1203115759.17akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-157.444, 3.644, -7.311), 'Pos': Point3(-151.091, 9.738, 14.816), 'Scale': VBase3(4.774, 4.774, 5.353), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1203115821.75akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(46.003, -344.332, -12.418), 'Hpr': VBase3(-94.124, 0.0, 0.0), 'Pos': Point3(-127.938, 322.975, -12.418), 'Scale': VBase3(8.653, 8.653, 11.031), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1203115874.03akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(139.745, -358.255, -6.141), 'Hpr': VBase3(-32.416, 2.665, 
-0.967), 'Pos': Point3(-222.264, 313.806, -6.141), 'Scale': VBase3(5.931, 5.931, 7.562), 'Visual': {'Color': (0.592156862745098, 0.56, 0.4823529411764706, 1.0), 'Model': 'models/props/rock_group_4_sphere'}}, '1203115947.03akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-129.25, 211.145, 50.427), 'Hpr': VBase3(-157.196, 0.865, 0.447), 'Pos': Point3(174.89, -179.16, 50.079), 'Scale': VBase3(9.385, 9.385, 11.96), 'Visual': {'Color': (0.35, 0.38, 0.3058823529411765, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1203449603.97akelts': {'Type': 'Townsperson', 'Category': 'Commoner', 'AnimSet': 'drunk', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'models/char/pls_zero', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'GridPos': Point3(-374.192, 13.788, 34.359), 'Hpr': VBase3(-138.897, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '6.7229', 'Pos': Point3(365.602, 88.526, 34.033), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Villager', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1203450135.28akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-400.776, 18.298, 6.526), 'Hpr': VBase3(144.474, 0.0, 0.0), 'Pos': Point3(393.298, 79.202, 6.526), 'Scale': VBase3(1.207, 1.207, 1.207), 'Visual': {'Color': (0.85, 0.81, 0.7529411764705882, 1.0), 'Model': 'models/props/rock_group_1_floor'}}, '1203450324.03akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(351.431, -9.522, 54.483), 'Hpr': VBase3(178.296, 2.576, -9.081), 'Objects': {}, 'Pos': Point3(-343.296, -75.78, 54.483), 'Scale': VBase3(5.002, 
5.002, 5.002), 'Visual': {'Color': (0.3, 0.35, 0.3, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1203450370.69akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(361.155, -135.951, 48.755), 'Hpr': VBase3(-21.612, -2.135, 5.69), 'Pos': Point3(-390.559, 40.931, 49.518), 'Scale': VBase3(8.698, 8.698, 8.698), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1203450461.55akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(449.294, 105.752, 64.957), 'Hpr': VBase3(-5.986, 0.294, -10.694), 'Objects': {}, 'Pos': Point3(-409.993, -223.993, 59.841), 'Scale': VBase3(4.115, 4.115, 8.62), 'Visual': {'Color': (0.33, 0.37, 0.3176470588235294, 1.0), 'Model': 'models/props/rock_group_2_sphere'}}, '1203450511.61akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(322.479, 227.694, 71.845), 'Hpr': VBase3(27.235, 0.0, 0.0), 'Pos': Point3(-257.816, -298.945, 70.312), 'Scale': VBase3(3.53, 3.53, 4.525), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/rock_group_2_sphere'}}, '1203450582.5akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-6.969, 263.14, 4.355), 'Hpr': VBase3(-40.113, 4.317, -0.99), 'Pos': Point3(92.861, -245.087, 6.142), 'Scale': VBase3(9.619, 9.619, 9.619), 'Visual': {'Color': (0.5098039215686274, 0.5372549019607843, 0.4392156862745098, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1203450682.05akelts': {'Type': 'Building Exterior', 'File': '', 'ExtUid': '1203450682.05akelts0', 'GridPos': Point3(353.227, -178.718, 9.399), 'Hpr': VBase3(141.203, -1.558, 3.891), 'Pos': Point3(-385.97, 87.956, 9.399), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Door': 'models/buildings/shanty_guildhall_door', 'Model': 'models/buildings/shanty_leanto_A', 'SignImage': 'models/buildings/sign1_eng_a_icon_barber'}}, '1203450740.25akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 
'GridPos': Point3(333.47, -178.47, 10.924), 'Hpr': VBase3(47.086, 0.0, 0.0), 'Pos': Point3(-366.74, 92.495, 10.924), 'Scale': VBase3(1.22, 1.0, 1.566), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1203450762.45akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(357.347, -195.137, 7.301), 'Hpr': VBase3(-146.231, 0.0, 0.0), 'Pos': Point3(-393.94, 102.891, 7.301), 'Scale': VBase3(2.302, 1.0, 1.885), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1203450800.73akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(340.836, -195.321, 9.411), 'Hpr': VBase3(162.554, 1.558, 3.797), 'Pos': Point3(-377.964, 107.063, 9.411), 'Scale': VBase3(1.404, 1.0, 1.566), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1203450923.8akelts': {'Type': 'Building Exterior', 'File': '', 'ExtUid': '1203450923.8akelts0', 'GridPos': Point3(341.222, -202.965, 8.089), 'Hpr': VBase3(144.815, 0.0, 0.0), 'Pos': Point3(-380.188, 114.387, 8.089), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6862745098039216, 0.5, 0.32941176470588235, 1.0), 'Door': 'models/buildings/shanty_guildhall_door', 'Model': 'models/buildings/shanty_signpost', 'SignFrame': 'models/buildings/sign1_shanty_a_frame', 'SignImage': 'models/buildings/sign1_eng_a_icon_weapons'}}, '1203451028.89akelts': {'Type': 'Townsperson', 'Category': 'Shipwright', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'GridPos': Point3(100.489, -340.513, 0.586), 'Hpr': VBase3(167.771, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '12.0000', 'Pos': Point3(-179.882, 306.088, 0.586), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': 
VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Villager', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1203451240.81akelts': {'Type': 'Townsperson', 'Category': 'Gunsmith', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'GridPos': Point3(-376.905, 109.44, 9.91), 'Hpr': VBase3(-77.345, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '6.0361', 'Pos': Point3(-376.905, 109.44, 9.91), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Walk', 'StartFrame': '0', 'Team': 'Player', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1203458158.53akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-30.979, -42.476, -11.649), 'Hpr': VBase3(-1.111, 0.0, 7.587), 'Objects': {}, 'Pos': Point3(19.783, 48.709, -11.649), 'Scale': VBase3(8.698, 8.698, 16.174), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1203458673.23akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-355.289, 97.381, 3.857), 'Hpr': VBase3(147.426, 0.124, 0.966), 'Pos': Point3(367.145, -11.612, 2.787), 'Scale': VBase3(4.598, 4.598, 5.857), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1203459119.77akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-186.002, 32.546, 3.202), 'Hpr': 
VBase3(-89.834, 0.746, -0.627), 'Pos': Point3(156.072, 19.876, 2.711), 'Scale': VBase3(4.305, 4.305, 3.492), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_1_floor'}}, '1203459408.59akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(424.827, 249.321, 1.111), 'Hpr': VBase3(-25.022, 0.0, 0.0), 'Pos': Point3(-271.045, -331.897, -3.821), 'Scale': VBase3(22.905, 22.905, 28.86), 'Visual': {'Color': (0.29, 0.31, 0.24705882352941178, 1.0), 'Model': 'models/props/rock_3_sphere'}}, '1203460758.22akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(363.854, -188.935, 8.585), 'Hpr': VBase3(-146.382, 0.0, 0.0), 'Pos': Point3(-398.753, 95.299, 8.585), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/prop_group02'}}, '1203460800.05akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(358.97, -190.561, 8.857), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-394.408, 98.058, 8.857), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/prop_group_A'}}, '1203460826.44akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(352.288, -173.771, 10.092), 'Hpr': VBase3(130.335, 0.0, 0.0), 'Pos': Point3(-383.863, 83.383, 10.092), 'Scale': VBase3(0.69, 0.69, 0.69), 'Visual': {'Color': (0.30000001192092896, 0.30000001192092896, 0.30000001192092896, 1.0), 'Model': 'models/props/prop_group_B'}}, '1203460870.59akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(348.056, -178.842, 10.229), 'Hpr': VBase3(150.939, 0.0, 0.0), 'Pos': Point3(-380.983, 89.327, 10.229), 'Scale': VBase3(0.631, 0.631, 0.631), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/prop_group_D'}}, '1203460917.47akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(351.876, -190.764, 9.379), 'Hpr': 
VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-387.574, 99.971, 9.379), 'Scale': VBase3(0.558, 0.558, 0.558), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/prop_group_E'}}, '1203460945.09akelts': {'Type': 'Ship_Props', 'DisableCollision': True, 'GridPos': Point3(347.804, -184.813, 10.016), 'Hpr': VBase3(158.209, 0.0, 0.0), 'Pos': Point3(-382.183, 95.182, 10.016), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/cannon_broken_prop'}}, '1203461019.36akelts': {'Type': 'Ship_Props', 'DisableCollision': True, 'GridPos': Point3(337.989, -181.537, 10.92), 'Hpr': VBase3(124.125, -2.653, 4.877), 'Pos': Point3(-371.867, 94.378, 10.92), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.7, 0.7, 0.7, 1.0), 'Model': 'models/props/cannon_stack_01'}}, '1203461073.42akelts': {'Type': 'Ship_Props', 'DisableCollision': True, 'GridPos': Point3(340.919, -184.568, 10.57), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-375.443, 96.61, 10.57), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/cannonball_stack_square'}}, '1203461085.86akelts': {'Type': 'Ship_Props', 'DisableCollision': True, 'GridPos': Point3(343.709, -188.02, 10.21), 'Hpr': VBase3(155.309, 0.0, 0.0), 'Pos': Point3(-378.986, 99.284, 10.21), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/cannonball_stack_square'}}, '1203461128.5akelts': {'Type': 'Ship_Props', 'DisableCollision': False, 'GridPos': Point3(345.912, -196.447, 9.393), 'Hpr': VBase3(116.749, 0.0, 0.0), 'Pos': Point3(-383.162, 106.928, 9.393), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/cannon_stack_02'}}, '1203461179.58akelts': {'Type': 'Ship_Props', 'DisableCollision': True, 'GridPos': Point3(339.897, -188.054, 10.496), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-375.295, 100.24, 10.496), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/cannonball_stack_triangle'}}, '1203461241.98akelts': {'Type': 'Ship_Props', 
'DisableCollision': True, 'GridPos': Point3(80.016, -299.305, 0.909), 'Hpr': VBase3(-67.45, 72.134, -81.692), 'Pos': Point3(-150.048, 271.057, 0.909), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/anchor'}}, '1203461317.66akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(81.515, -305.43, -2.556), 'Hpr': VBase3(-114.141, 0.0, 0.0), 'Pos': Point3(-152.984, 276.637, -2.556), 'Scale': VBase3(2.901, 2.8, 2.541), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_cube'}}, '1203461390.23akelts': {'Type': 'Ship_Props', 'DisableCollision': True, 'GridPos': Point3(356.1, -191.262, 12.71), 'Hpr': VBase3(37.994, 25.166, 8.484), 'Pos': Point3(-391.793, 99.432, 12.71), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/wheel_wallprop'}}, '1203461510.47akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(367.133, -187.329, 7.703), 'Hpr': VBase3(-62.834, 0.0, 0.0), 'Pos': Point3(-401.547, 92.947, 7.703), 'Scale': VBase3(0.858, 1.158, 1.814), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1203464268.55akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-248.872, 56.43, 53.321), 'Hpr': VBase3(80.59, 0.0, 0.0), 'Pos': Point3(255.131, 5.454, 53.321), 'Scale': VBase3(0.277, 0.277, 0.277), 'Visual': {'Color': (0.6666666666666666, 0.6862745098039216, 0.6196078431372549, 1.0), 'Model': 'models/props/mound_light_lrg'}}, '1203464436.53akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(634.943, 474.118, -28.164), 'Hpr': VBase3(60.475, 0.0, 0.0), 'Pos': Point3(-501.383, -613.641, -28.164), 'Scale': VBase3(2.127, 4.157, 1.658), 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_med'}}, '1203464470.73akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-480.084, -150.307, -10.216), 'Holiday': '', 'Hpr': VBase3(-79.144, 0.0, 0.0), 'Pos': 
Point3(429.461, 261.985, -10.216), 'Scale': VBase3(0.795, 0.795, 1.328), 'VisSize': '', 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_med2'}}, '1203464494.17akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-364.214, -186.04, -46.772), 'Hpr': VBase3(119.847, 0.0, 0.0), 'Pos': Point3(308.388, 268.625, -46.772), 'Scale': VBase3(0.496, 0.218, 0.387), 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_lrg'}}, '1203464564.14akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-397.039, -82.45, -12.623), 'Hpr': VBase3(29.204, -2.384, -1.67), 'Pos': Point3(344.939, 241.389, -14.999), 'Scale': VBase3(10.312, 10.312, 13.142), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1204059355.98akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(378.365, -98.236, 55.701), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-390.891, 3.783, 55.701), 'Scale': VBase3(0.63, 0.63, 0.63), 'Visual': {'Model': 'models/vegetation/jungle_tree_a'}}, '1204059414.83akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(436.083, 152.175, 68.61), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-386.315, -253.153, 67.418), 'Scale': VBase3(0.886, 0.886, 0.886), 'Visual': {'Model': 'models/vegetation/jungle_tree_b'}}, '1204059436.81akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(434.287, 150.665, 153.061), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-384.938, -251.253, 153.061), 'Scale': VBase3(2.376, 2.376, 2.376), 'Visual': {'Model': 'models/vegetation/gen_tree_canopy'}}, '1204059462.52akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(373.505, -95.759, 108.721), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-385.576, 2.556, 108.721), 'Scale': 
VBase3(1.822, 1.822, 1.822), 'Visual': {'Model': 'models/vegetation/gen_tree_canopy'}}, '1204059672.03akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(429.505, 171.393, 65.983), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-385.97, -260.881, 65.983), 'Scale': VBase3(1.425, 1.425, 1.425), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059711.06akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(418.896, 149.38, 70.456), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-369.404, -245.119, 70.456), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059712.09akelts': {'Type': 'Jungle_Props', 'DisableCollision': True, 'GridPos': Point3(442.281, 124.117, 68.171), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-399.117, -227.428, 68.171), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059713.08akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(389.753, 118.221, 74.347), 'Hpr': VBase3(-49.308, 0.0, 0.0), 'Pos': Point3(-349.575, -208.999, 74.347), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059716.5akelts': {'Type': 'Jungle_Props', 'DisableCollision': True, 'GridPos': Point3(320.185, 136.75, 72.585), 'Hpr': VBase3(94.213, 0.0, 0.0), 'Pos': Point3(-263.212, -198.348, 69.211), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059717.45akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(308.702, -18.82, 50.535), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-304.085, -56.421, 50.535), 'Scale': VBase3(1.581, 1.581, 
1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059719.28akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(334.166, -11.613, 55.404), 'Hpr': VBase3(-25.785, 0.0, 0.0), 'Pos': Point3(-327.049, -69.574, 55.404), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059720.05akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(366.624, -93.04, 54.728), 'Hpr': VBase3(46.146, 0.0, 0.0), 'Pos': Point3(-378.242, 1.582, 54.728), 'Scale': VBase3(2.142, 2.142, 2.142), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059724.55akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(365.774, -126.676, 52.054), 'Hpr': VBase3(-74.251, 0.0, 0.0), 'Pos': Point3(-402.137, 14.514, 56.562), 'Scale': VBase3(1.447, 1.447, 1.447), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059729.58akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(303.472, 235.871, 70.807), 'Hpr': VBase3(-114.87, 0.0, 0.0), 'Pos': Point3(-237.395, -302.281, 70.807), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059732.36akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(332.631, 217.935, 72.079), 'Hpr': VBase3(154.057, 0.0, 0.0), 'Pos': Point3(-270.378, -293.44, 72.079), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059735.73akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-135.625, 167.256, 59.099), 'Hpr': VBase3(-91.621, 0.0, 0.0), 'Pos': Point3(172.059, -129.477, 59.099), 
'Scale': VBase3(3.215, 3.215, 3.215), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059738.44akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-311.632, 72.085, 54.836), 'Hpr': VBase3(34.577, -2.462, 2.937), 'Pos': Point3(319.814, 5.447, 54.836), 'Scale': VBase3(2.543, 2.543, 2.543), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059742.03akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-204.7, 64.536, 59.859), 'Hpr': VBase3(-62.086, 0.0, 0.0), 'Pos': Point3(214.232, -13.098, 59.859), 'Scale': VBase3(1.31, 1.31, 1.31), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204059934.52akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(416.271, 106.319, 70.657), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-383.295, -227.551, 69.176), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059949.56akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(356.348, 227.762, 71.117), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-290.662, -307.205, 71.117), 'Scale': VBase3(1.786, 1.786, 1.786), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059953.34akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(362.621, -33.439, 55.184), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-359.939, -55.28, 55.184), 'Scale': VBase3(1.842, 1.842, 1.842), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059956.41akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(443.025, 46.973, 51.688), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': 
Point3(-418.501, -152.755, 51.688), 'Scale': VBase3(1.416, 1.416, 1.416), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059960.55akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(86.091, 6.299, 44.858), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(-82.01, -26.939, 44.858), 'Scale': VBase3(1.716, 1.716, 1.716), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059962.81akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(177.224, -122.102, 59.848), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059966.33akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(10.566, 200.632, 54.196), 'Hpr': VBase3(17.523, 4.607, 0.0), 'Pos': Point3(-0.087, -202.037, 51.809), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1204059991.02akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-71.176, 148.108, 57.32), 'Hpr': VBase3(-39.873, -6.618, 0.0), 'Pos': Point3(103.179, -147.109, 58.268), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_b'}}, '1204060027.94akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(427.862, -97.178, 59.092), 'Hpr': VBase3(-39.873, 0.0, 0.0), 'Pos': Point3(-438.662, -9.218, 59.092), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_b'}}, '1204060031.36akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(287.315, 208.111, 69.716), 'Hpr': 
VBase3(-39.873, 0.0, 0.0), 'Pos': Point3(-238.041, -273.163, 70.456), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_b'}}, '1204060033.45akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(81.244, 50.514, 45.215), 'Hpr': VBase3(-39.873, 0.0, 0.0), 'Pos': Point3(-66.61, -68.668, 45.215), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_b'}}, '1204060035.7akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-176.185, 67.495, 58.068), 'Hpr': VBase3(-39.873, 0.0, 0.0), 'Pos': Point3(187.28, -22.867, 58.068), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_b'}}, '1204060039.77akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(288.688, -139.356, 15.846), 'Hpr': VBase3(-107.761, 0.0, 0.0), 'Pos': Point3(-313.826, 65.377, 15.846), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_c'}}, '1204060065.31akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(375.752, -140.106, 53.385), 'Hpr': VBase3(-38.721, 0.0, 0.0), 'Pos': Point3(-398.485, 45.042, 53.385), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_c'}}, '1204060071.86akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(450.016, 164.31, 64.582), 'Hpr': VBase3(86.499, 0.0, 0.0), 'Pos': Point3(-396.898, -268.298, 64.582), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_c'}}, '1204060074.63akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(301.719, 
233.301, 72.271), 'Hpr': VBase3(-141.78, 0.0, 0.0), 'Pos': Point3(-236.316, -299.363, 72.271), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_c'}}, '1204060078.2akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-205.21, 62.988, 60.031), 'Hpr': VBase3(22.412, 0.0, 0.0), 'Pos': Point3(214.353, -11.472, 60.031), 'Scale': VBase3(1.784, 1.784, 1.784), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_c'}}, '1204060082.56akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-160.916, 25.381, 7.156), 'Hpr': VBase3(-21.407, 0.0, 0.0), 'Pos': Point3(173.497, 29.662, 5.825), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_c'}}, '1204060470.2akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-264.603, 85.234, 60.517), 'Hpr': VBase3(-59.9, 0.0, 0.0), 'Pos': Point3(277.363, -18.689, 60.517), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204060505.48akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-312.242, 71.998, 54.357), 'Hpr': VBase3(40.51, -1.437, 10.232), 'Pos': Point3(320.385, 5.679, 54.357), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204060511.89akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(10.953, 200.962, 54.248), 'Hpr': VBase3(-142.944, 0.0, 0.0), 'Pos': Point3(-0.594, -202.56, 53.115), 'Scale': VBase3(2.628, 2.628, 2.628), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204060515.3akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': 
Point3(421.854, 142.243, 70.26), 'Hpr': VBase3(-108.413, 0.0, 0.0), 'Pos': Point3(-374.911, -240.073, 70.26), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204060521.22akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(349.687, -40.622, 54.939), 'Hpr': VBase3(-59.227, 0.0, 0.0), 'Pos': Point3(-349.127, -45.182, 54.939), 'Scale': VBase3(2.331, 2.331, 2.331), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204060524.36akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(410.33, 220.943, 59.965), 'Hpr': VBase3(-42.318, 0.0, 0.0), 'Pos': Point3(-273.769, -210.245, 71.375), 'Scale': VBase3(2.365, 2.365, 2.365), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204060821.13akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-50.908, 145.155, 55.857), 'Hpr': VBase3(17.523, 0.0, 0.0), 'Pos': Point3(127.591, -134.276, 58.11), 'Scale': VBase3(3.621, 3.621, 3.621), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_a'}}, '1204060855.45akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(378.983, 48.666, 51.35), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(-355.952, -138.905, 51.35), 'Scale': VBase3(0.862, 0.862, 0.862), 'Visual': {'Color': (0.9098039215686274, 0.91, 0.8823529411764706, 1.0), 'Model': 'models/props/cliff_jungle_high'}}, '1204060956.09akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(391.222, 24.473, 84.037), 'Hpr': VBase3(-173.107, 0.0, 0.0), 'Pos': Point3(-373.68, -118.391, 84.037), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204061028.02akelts': {'Type': 
'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(363.031, 32.413, 120.982), 'Hpr': VBase3(100.37, 0.0, 0.0), 'Pos': Point3(-344.406, -119.275, 120.982), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204061061.27akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-253.02, 39.159, 4.524), 'Hpr': VBase3(177.321, 0.0, 0.0), 'Pos': Point3(254.978, 23.215, 4.524), 'Scale': VBase3(0.517, 0.806, 1.208), 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_low'}}, '1204134204.59akelts': {'Type': 'Wall_Hangings', 'DisableCollision': False, 'GridPos': Point3(-361.086, 2.343, 33.233), 'Hpr': VBase3(-127.141, 0.0, 0.0), 'Pos': Point3(350.927, 85.082, 33.233), 'Scale': VBase3(1.265, 1.265, 1.265), 'Visual': {'Model': 'models/props/flag_hanging_spanish'}}, '1204134436.16akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(331.941, -187.044, 10.127), 'Hpr': VBase3(137.858, 0.0, 0.0), 'Pos': Point3(-367.331, 101.184, 10.127), 'Scale': VBase3(1.312, 1.312, 2.355), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1204134561.33akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-386.568, 32.712, 34.035), 'Hpr': VBase3(51.145, 36.797, 3.04), 'Objects': {}, 'Pos': Point3(382.999, 61.779, 34.035), 'Scale': VBase3(0.487, 0.487, 0.487), 'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1204134680.02akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-397.012, 46.512, 21.197), 'Hpr': VBase3(51.774, 26.09, 2.71), 'Objects': {}, 'Pos': Point3(396.471, 50.915, 21.197), 'Scale': VBase3(0.487, 0.487, 0.487), 'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1204134735.98akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-408.907, 61.839, 11.764), 'Hpr': VBase3(52.383, 13.484, 2.503), 'Objects': {}, 
'Pos': Point3(411.721, 38.921, 11.764), 'Scale': VBase3(0.489, 0.485, 0.485), 'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1204134910.91akelts': {'Type': 'Stairs', 'DisableCollision': False, 'GridPos': Point3(-424.124, 81.44, 5.888), 'Hpr': VBase3(-127.584, -16.059, -2.35), 'Objects': {}, 'Pos': Point3(431.228, 23.584, 5.888), 'Scale': VBase3(1.408, 1.354, 1.0), 'Visual': {'Color': (0.7900000214576721, 0.7799999713897705, 0.699999988079071, 1.0), 'Model': 'models/buildings/landing_double'}}, '1204135098.84akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-426.176, 75.798, 7.394), 'Hpr': VBase3(67.466, 0.0, 0.0), 'Pos': Point3(431.854, 29.555, 7.394), 'Scale': VBase3(0.467, 0.467, 0.467), 'Visual': {'Model': 'models/props/shanty_rope_bridge_post'}}, '1204135135.33akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(-418.286, 82.472, 7.667), 'Hpr': VBase3(41.78, 0.0, 0.0), 'Pos': Point3(425.813, 21.17, 7.667), 'Scale': VBase3(0.467, 0.467, 0.467), 'Visual': {'Model': 'models/props/shanty_rope_bridge_post'}}, '1204237124.22akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'GridPos': Point3(302.232, 58.179, 61.849), 'Hpr': VBase3(-24.821, 0.0, 0.0), 'Pos': Point3(-249.919, -140.602, 65.961), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1204237124.2akelts': {'Type': 'Building Exterior', 'File': 'pvpShipIsland2_int_tavern', 'ExtUid': '1204237124.2akelts0', 'GridPos': Point3(279.715, 79.133, 65.009), 'Hpr': VBase3(155.008, 0.0, 0.0), 'Objects': {'1210373723.53akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-179.829, 0.0, 0.0), 'Pos': Point3(-0.498, -4.914, 0.952), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1210373725.94akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(-6.626, 20.947, 1.006), 'Scale': VBase3(1.0, 1.0, 1.0)}}, 'Pos': Point3(-252.262, -144.452, 65.009), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Door': 
'models/buildings/shanty_guildhall_door', 'Model': 'models/buildings/shanty_tavern_exterior', 'SignFrame': 'models/buildings/sign1_shanty_a_frame', 'SignImage': 'models/buildings/sign1_eng_a_icon_tavern'}}, '1204237223.91akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(300.519, 59.182, 55.163), 'Hpr': VBase3(49.106, -7.069, 0.377), 'Objects': {}, 'Pos': Point3(-277.275, -130.126, 55.163), 'Scale': VBase3(4.034, 4.034, 4.034), 'Visual': {'Color': (0.3, 0.35, 0.3, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1204237285.34akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(297.582, 77.276, 60.49), 'Hpr': VBase3(139.493, 0.0, 0.0), 'Pos': Point3(-270.048, -146.972, 60.49), 'Scale': VBase3(2.085, 2.085, 2.085), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1204237296.23akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(310.905, 61.79, 58.761), 'Hpr': VBase3(89.852, 0.0, 0.0), 'Pos': Point3(-286.721, -135.169, 58.761), 'Scale': VBase3(2.365, 2.365, 2.365), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1204237768.98akelts': {'Type': 'Bridge', 'DisableCollision': False, 'GridPos': Point3(273.809, 71.426, 65.813), 'Hpr': VBase3(155.336, 32.331, 0.0), 'Pos': Point3(-248.396, -135.545, 65.813), 'Scale': VBase3(0.357, 0.357, 0.357), 'Visual': {'Model': 'models/props/shanty_rope_bridge'}}, '1204237956.03akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': True, 'GridPos': Point3(-20.251, 72.751, -29.831), 'Hpr': VBase3(-175.338, 0.0, 0.0), 'Pos': Point3(37.25, -65.691, -29.831), 'Scale': VBase3(0.512, 0.798, 0.891), 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_low'}}, '1208535913.67akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(155.008, 0.0, 0.0), 'Pos': Point3(-255.062, -166.14, 66.015), 'Scale': 
VBase3(1.0, 1.0, 1.0)}, '1208535916.31akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'GridPos': Point3(-6.626, 20.841, 1.006), 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(-6.626, 20.947, 1.006), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1208551336.89akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-90.0, 0.0, 0.0), 'Pos': Point3(-0.277, -13.756, 1.023), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1208551339.09akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(-6.626, 20.841, 1.006), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1208796482.19akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-180.0, 0.0, 0.0), 'Pos': Point3(-0.277, -13.756, 1.561), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1208796484.09akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(-6.626, 20.841, 1.006), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1208796484.17akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-90.0, 0.0, 0.0), 'Pos': Point3(-0.277, -13.756, 1.023), 'Scale': VBase3(1.0, 1.0, 1.0), 'TargetUIDs': []}, '1208808622.33akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(373.505, -95.759, 108.721), 'Hpr': VBase3(-120.335, 0.0, 0.0), 'Pos': Point3(-389.709, 3.032, 83.613), 'Scale': VBase3(2.475, 2.475, 2.475), 'Visual': {'Model': 'models/vegetation/gen_tree_canopy'}}, '1208808661.47akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(434.287, 150.665, 153.061), 'Hpr': VBase3(165.025, 0.0, 0.0), 'Pos': Point3(-381.695, -252.656, 120.228), 'Scale': VBase3(3.921, 3.921, 3.921), 'Visual': {'Model': 'models/vegetation/gen_tree_canopy'}}, '1208808732.92akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(378.365, -98.236, 55.701), 'Hpr': VBase3(-166.0, 0.0, 0.0), 'Pos': Point3(32.495, -74.501, 48.622), 'Scale': VBase3(0.63, 0.63, 0.63), 'Visual': {'Model': 
'models/vegetation/jungle_tree_a'}}, '1208808760.67akelts': {'Type': 'Tree', 'DisableCollision': False, 'GridPos': Point3(434.287, 150.665, 153.061), 'Hpr': VBase3(165.025, 0.0, 0.0), 'Pos': Point3(32.367, -75.712, 86.151), 'Scale': VBase3(2.483, 2.483, 2.483), 'Visual': {'Model': 'models/vegetation/gen_tree_canopy'}}, '1208808898.64akelts': {'Type': 'Tree', 'DisableCollision': False, 'Hpr': VBase3(47.91, 0.0, 0.0), 'Pos': Point3(-261.705, -126.077, 59.113), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/fern_tree_c'}}, '1208808939.75akelts': {'Type': 'Tree', 'DisableCollision': False, 'Hpr': VBase3(125.315, 0.0, 0.0), 'Pos': Point3(143.923, -142.808, 58.141), 'Scale': VBase3(1.618, 1.618, 1.618), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/fern_tree_b'}}, '1208808963.02akelts': {'Type': 'Tree', 'DisableCollision': False, 'Hpr': VBase3(22.689, 0.0, 0.0), 'Pos': Point3(188.042, -1.397, 55.796), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/vegetation/fern_tree_e'}}, '1208808997.63akelts': {'Type': 'Tree', 'DisableCollision': False, 'Hpr': VBase3(22.689, 0.0, 0.0), 'Pos': Point3(-95.527, -45.699, 44.614), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/fern_tree_e'}}, '1208809009.36akelts': {'Type': 'Tree', 'DisableCollision': False, 'Hpr': VBase3(22.689, 0.0, 0.0), 'Pos': Point3(-251.118, -214.264, 68.142), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/fern_tree_d'}}, '1208809176.83akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-152.604, 9.484, -0.719), 'Pos': Point3(-393.582, -127.052, 106.869), 'Scale': VBase3(0.768, 0.768, 0.768), 
'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208809279.59akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(170.168, 6.682, 0.702), 'Pos': Point3(-345.264, -108.754, 77.239), 'Scale': VBase3(0.611, 0.611, 0.611), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208809368.11akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-89.621, 5.143, -0.989), 'Pos': Point3(-380.451, -75.759, 89.566), 'Scale': VBase3(0.768, 0.768, 0.768), 'Visual': {'Model': 'models/vegetation/jungle_vine_b_200'}}, '1208809472.88akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-89.723, -0.81, -0.985), 'Pos': Point3(-405.017, -228.593, 101.856), 'Scale': VBase3(0.499, 0.499, 0.499), 'Visual': {'Model': 'models/vegetation/jungle_vine_a_200'}}, '1208809510.83akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-88.774, 4.11, -1.05), 'Pos': Point3(-378.834, -85.961, 66.476), 'Scale': VBase3(0.768, 0.768, 0.768), 'Visual': {'Model': 'models/vegetation/jungle_vine_b'}}, '1208809617.05akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-50.908, 145.155, 55.857), 'Hpr': VBase3(7.579, -15.377, -21.436), 'Pos': Point3(43.315, -214.982, 98.948), 'Scale': VBase3(3.621, 3.621, 3.621), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_a'}}, '1208809694.8akelts': {'Type': 'Tree', 'DisableCollision': False, 'Hpr': VBase3(176.031, 3.973, 3.375), 'Pos': Point3(-416.074, -131.345, 48.889), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/fern_tree_e'}}, '1208809918.53akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-138.094, 0.0, 0.0), 'Pos': Point3(7.654, -39.037, 1.044), 'Scale': VBase3(4.081, 3.438, 3.438), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208809947.38akelts': {'Type': 'Collision Barrier', 
'DisableCollision': False, 'Hpr': VBase3(135.041, 0.0, 0.0), 'Pos': Point3(70.424, -43.033, 1.374), 'Scale': VBase3(4.466, 3.438, 3.438), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208809958.27akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(176.825, 0.0, 0.0), 'Pos': Point3(38.433, -26.713, 1.435), 'Scale': VBase3(3.411, 3.438, 3.438), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208809992.67akelts': {'Type': 'Tree', 'DisableCollision': True, 'Hpr': VBase3(96.609, 0.0, 0.0), 'Pos': Point3(64.792, -40.461, 3.8), 'Scale': VBase3(1.618, 1.618, 1.618), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/fern_tree_b'}}, '1208810053.52akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(26.944, -20.071, 27.201), 'Pos': Point3(-324.041, -128.121, 124.495), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1208810085.05akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(-44.557, -6.521, -19.024), 'Pos': Point3(-384.556, -122.727, 70.807), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1208810147.31akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(-44.557, -6.521, -32.391), 'Pos': Point3(-319.36, 28.518, 83.478), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1208810201.73akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(126.499, 14.847, -27.086), 'Pos': Point3(-273.456, 
11.863, 112.912), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1208810269.45akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(2.536, 27.623, 19.921), 'Pos': Point3(-289.021, -16.521, 115.903), 'Scale': VBase3(1.764, 1.764, 1.764), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1208810310.83akelts': {'Type': 'Jungle_Props', 'DisableCollision': True, 'GridPos': Point3(334.166, -11.613, 55.404), 'Hpr': VBase3(-25.785, 0.0, 0.0), 'Pos': Point3(-316.557, -36.24, 78.313), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1208810354.47akelts': {'Type': 'Bush', 'DisableCollision': False, 'Hpr': VBase3(89.22, 0.0, 0.0), 'Pos': Point3(-268.512, -167.831, 64.81), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/bush_a'}}, '1208810406.28akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(42.896, 0.0, 0.0), 'Pos': Point3(-263.945, -170.113, 65.123), 'Scale': VBase3(1.751, 1.528, 2.106), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208810445.06akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(-120.367, 0.0, 0.0), 'Pos': Point3(-227.166, -141.207, 60.731), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/bush_b'}}, '1208810495.61akelts': {'Type': 'Bush', 'DisableCollision': False, 'Hpr': VBase3(5.202, 0.0, 0.0), 'Pos': Point3(-227.969, -151.672, 61.043), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/bush_a'}}, '1208810525.34akelts': {'Type': 'Bush', 'DisableCollision': False, 'Hpr': VBase3(-177.316, 0.0, 0.0), 'Pos': Point3(-237.907, -168.971, 64.36), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 
'models/vegetation/bush_c'}}, '1208810579.08akelts': {'Type': 'Bush', 'DisableCollision': False, 'Hpr': VBase3(116.707, 0.0, 0.0), 'Pos': Point3(-289.16, -214.182, 71.69), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/bush_d'}}, '1208810621.84akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(67.606, -13.881, -3.556), 'Pos': Point3(-415.072, -184.363, 63.056), 'Scale': VBase3(1.551, 1.551, 1.551), 'Visual': {'Model': 'models/vegetation/bush_d'}}, '1208810658.95akelts': {'Type': 'Bush', 'DisableCollision': False, 'Hpr': VBase3(143.61, -4.213, 1.999), 'Pos': Point3(-372.745, -218.333, 70.758), 'Scale': VBase3(1.551, 1.551, 1.551), 'Visual': {'Model': 'models/vegetation/bush_d'}}, '1208810807.19akelts': {'Type': 'Bush', 'DisableCollision': False, 'Hpr': VBase3(16.68, 0.933, -4.568), 'Pos': Point3(-409.053, -250.846, 63.391), 'Scale': VBase3(1.494, 1.494, 1.494), 'Visual': {'Model': 'models/vegetation/bush_a'}}, '1208810877.0akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-14.393, 0.0, -0.641), 'Pos': Point3(-376.852, -191.819, 113.669), 'Scale': VBase3(0.967, 0.967, 0.967), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208810934.53akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-14.379, 1.321, -0.642), 'Pos': Point3(-361.836, -195.187, 110.096), 'Scale': VBase3(0.991, 0.991, 0.991), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208811023.73akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(320.185, 136.75, 72.585), 'Hpr': VBase3(-28.499, 0.0, 0.0), 'Pos': Point3(-309.069, -193.754, 138.788), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1208811068.98akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(26.995, 0.0, 0.0), 'Pos': Point3(-253.548, -274.59, 71.384), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/bush_d'}}, 
'1208811084.52akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(114.363, 0.0, 0.0), 'Pos': Point3(-260.596, -277.698, 71.062), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/bush_a'}}, '1208811112.3akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(114.363, 0.0, 0.0), 'Pos': Point3(-255.875, -312.572, 73.767), 'Scale': VBase3(1.64, 1.64, 1.64), 'Visual': {'Model': 'models/vegetation/bush_f'}}, '1208811262.97akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-104.857, 8.588, 0.554), 'Pos': Point3(70.866, -289.357, 41.508), 'Scale': VBase3(1.204, 1.204, 1.204), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208811381.53akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-140.417, 8.37, -0.751), 'Pos': Point3(96.405, -188.832, 124.112), 'Scale': VBase3(1.204, 1.204, 1.204), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208811429.42akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-156.372, 4.411, 1.348), 'Pos': Point3(49.123, -205.952, 87.759), 'Scale': VBase3(1.204, 1.204, 1.204), 'Visual': {'Model': 'models/vegetation/moss_b'}}, '1208811768.8akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-144.406, -3.176, -2.17), 'Pos': Point3(130.64, -158.195, 55.691), 'Scale': VBase3(4.774, 4.774, 3.344), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811772.16akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-164.65, -7.956, -5.292), 'Pos': Point3(25.948, -214.116, 50.396), 'Scale': VBase3(4.774, 4.774, 5.112), 'Visual': {'Color': (0.396078431372549, 0.42, 0.3803921568627451, 1.0), 'Model': 'models/props/rock_group_4_sphere'}}, '1208811773.16akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': 
VBase3(17.497, -8.36, -3.957), 'Pos': Point3(6.864, -72.534, 43.129), 'Scale': VBase3(4.774, 4.774, 6.087), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1208811773.94akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-142.664, -17.303, 4.712), 'Pos': Point3(-72.831, -222.386, 41.214), 'Scale': VBase3(3.184, 3.184, 3.243), 'Visual': {'Color': (0.396078431372549, 0.42, 0.3411764705882353, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811776.03akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-161.098, 4.104, -7.064), 'Pos': Point3(-105.811, -43.819, 40.714), 'Scale': VBase3(3.622, 3.622, 4.618), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_1_sphere'}}, '1208811776.83akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-18.803, -7.562, -6.653), 'Pos': Point3(-130.606, -50.685, 41.261), 'Scale': VBase3(4.774, 4.774, 6.087), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811777.47akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-124.532, -11.815, -8.258), 'Pos': Point3(-201.207, -171.957, 58.879), 'Scale': VBase3(3.248, 3.231, 3.649), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_4_sphere'}}, '1208811779.14akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-6.284, -22.385, -2.016), 'Pos': Point3(217.209, -16.696, 52.7), 'Scale': VBase3(4.774, 4.774, 6.087), 'Visual': {'Color': (0.40784313725490196, 0.4666666666666667, 0.4666666666666667, 
1.0), 'Model': 'models/props/rock_group_4_sphere'}}, '1208811781.34akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-168.086, -1.639, -2.347), 'Pos': Point3(-343.132, -111.009, 54.7), 'Scale': VBase3(2.781, 2.781, 4.177), 'Visual': {'Color': (0.3, 0.33, 0.3, 1.0), 'Model': 'models/props/rock_group_2_floor'}}, '1208811783.25akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(148.652, 2.572, -1.384), 'Pos': Point3(-275.718, 60.876, 16.415), 'Scale': VBase3(4.774, 4.774, 4.154), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_2_floor'}}, '1208811785.25akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-105.1, -3.598, -12.609), 'Pos': Point3(-253.606, -58.999, 43.976), 'Scale': VBase3(4.774, 4.774, 4.447), 'Visual': {'Color': (0.31, 0.35, 0.33725490196078434, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1208811787.06akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-78.513, -6.5, -4.951), 'Pos': Point3(-207.322, -279.756, 65.605), 'Scale': VBase3(3.573, 3.573, 4.555), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1208811787.67akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(76.484, 3.8, 7.232), 'Pos': Point3(-198.1, -264.836, 62.839), 'Scale': VBase3(2.14, 2.14, 2.729), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_1_sphere'}}, '1208811788.55akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-81.471, 0.288, 2.826), 'Objects': {}, 'Pos': Point3(-347.282, -359.77, 2.243), 
'Scale': VBase3(4.774, 4.774, 2.502), 'Visual': {'Color': (0.51, 0.56, 0.4117647058823529, 1.0), 'Model': 'models/props/rock_group_3_floor'}}, '1208811791.69akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-144.212, -3.487, 1.627), 'Objects': {}, 'Pos': Point3(-439.805, -46.958, 51.483), 'Scale': VBase3(4.774, 4.774, 3.13), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811792.56akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-103.908, 5.818, -0.009), 'Pos': Point3(-440.195, -134.889, 4.905), 'Scale': VBase3(5.184, 5.184, 3.887), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1208811794.17akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-53.521, 0.061, 4.353), 'Objects': {}, 'Pos': Point3(-284.11, -361.855, -5.75), 'Scale': VBase3(4.774, 4.774, 6.592), 'Visual': {'Color': (0.51, 0.54, 0.4392156862745098, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811795.17akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-54.019, -2.465, 3.047), 'Pos': Point3(-312.574, -358.779, -1.533), 'Scale': VBase3(4.774, 4.774, 2.621), 'Visual': {'Color': (0.38, 0.37, 0.3254901960784314, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811800.08akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(72.206, 1.431, 3.479), 'Pos': Point3(-95.254, 45.803, 6.346), 'Scale': VBase3(4.774, 4.774, 5.603), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_2_sphere'}}, '1208811801.2akelts': {'Type': 'Rock', 'DisableCollision': False, 
'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(13.261, -4.517, 3.1), 'Pos': Point3(-127.097, 70.881, 8.023), 'Scale': VBase3(5.456, 5.456, 6.378), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1208811802.09akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(164.573, 7.369, -3.526), 'Pos': Point3(-246.819, 286.594, 1.077), 'Scale': VBase3(2.204, 2.204, 1.801), 'Visual': {'Color': (0.59, 0.59, 0.49, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811803.81akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-44.474, -0.255, -3.966), 'Pos': Point3(262.736, 64.252, 4.217), 'Scale': VBase3(4.774, 4.774, 8.168), 'Visual': {'Color': (0.4, 0.44, 0.4, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1208811804.59akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(82.765, 2.384, -5.346), 'Pos': Point3(399.969, 43.546, 5.943), 'Scale': VBase3(2.154, 2.154, 1.875), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208811805.2akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Holiday': '', 'Hpr': VBase3(162.426, 1.496, 3.991), 'Pos': Point3(340.371, 163.849, -8.271), 'Scale': VBase3(5.11, 5.11, 6.515), 'VisSize': '', 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1208811806.67akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-169.822, -0.966, 6.1), 'Pos': Point3(407.35, 4.291, 2.735), 'Scale': VBase3(4.774, 4.774, 3.419), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 
'models/props/rock_group_4_floor'}}, '1208811976.0akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-109.949, 0.17, 14.322), 'Objects': {}, 'Pos': Point3(-59.92, -209.918, 43.193), 'Scale': VBase3(3.184, 3.184, 3.243), 'Visual': {'Color': (0.41, 0.43, 0.37254901960784315, 1.0), 'Model': 'models/props/rock_group_1_floor'}}, '1208812034.0akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-119.258, -1.685, -7.991), 'Pos': Point3(-212.892, -172.3, 61.45), 'Scale': VBase3(3.248, 3.248, 2.144), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_floor'}}, '1208817172.09akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(322.479, 227.694, 71.845), 'Hpr': VBase3(39.268, 0.0, -3.187), 'Pos': Point3(-336.356, -361.238, 2.893), 'Scale': VBase3(1.529, 1.529, 1.96), 'Visual': {'Color': (0.54, 0.53, 0.4392156862745098, 1.0), 'Model': 'models/props/rock_group_2_floor'}}, '1208817254.27akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-132.321, 8.184, -1.918), 'Pos': Point3(194.072, 26.594, 40.152), 'Scale': VBase3(1.204, 1.204, 1.204), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1208817508.89akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(120.805, -3.248, -1.273), 'Pos': Point3(210.092, -42.115, 59.432), 'Scale': VBase3(1.86, 1.86, 2.372), 'Visual': {'Color': (0.36, 0.41, 0.40784313725490196, 1.0), 'Model': 'models/props/rock_group_1_floor'}}, '1208817704.22akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(113.954, -0.344, 3.212), 'Pos': Point3(50.036, -76.127, 48.039), 'Scale': VBase3(1.551, 1.551, 1.551), 'Visual': {'Model': 'models/vegetation/bush_d'}}, '1208817733.41akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(-96.242, 1.914, -2.603), 'Pos': 
Point3(92.744, -66.711, 51.079), 'Scale': VBase3(1.551, 1.551, 1.551), 'Visual': {'Model': 'models/vegetation/bush_d'}}, '1208817744.48akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(-44.302, -0.872, -7.43), 'Pos': Point3(102.097, -61.139, 51.013), 'Scale': VBase3(1.551, 1.551, 1.551), 'Visual': {'Model': 'models/vegetation/bush_a'}}, '1208817783.52akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-30.979, -42.476, -11.649), 'Hpr': VBase3(-127.745, 8.254, -2.493), 'Objects': {}, 'Pos': Point3(126.418, -59.327, 50.686), 'Scale': VBase3(4.171, 4.171, 7.842), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1208817904.31akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(104.912, -5.654, -1.867), 'Pos': Point3(-259.361, -212.369, 66.971), 'Scale': VBase3(2.14, 2.14, 2.729), 'Visual': {'Color': (0.4, 0.45, 0.4, 1.0), 'Model': 'models/props/rock_group_2_sphere'}}, '1208818191.44akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-49.797, -4.814, -2.614), 'Pos': Point3(-55.816, -11.111, 3.947), 'Scale': VBase3(5.456, 5.456, 5.724), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1208818217.42akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(361.155, -135.951, 48.755), 'Hpr': VBase3(80.349, -8.033, -1.43), 'Pos': Point3(-468.491, 13.451, 0.893), 'Scale': VBase3(10.925, 10.925, 10.376), 'Visual': {'Color': (0.29, 0.31, 0.25, 1.0), 'Model': 'models/props/rock_group_3_sphere'}}, '1208818540.86akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-20.251, 72.751, -29.831), 'Hpr': VBase3(102.073, 0.0, 0.0), 'Pos': Point3(-267.897, 2.215, -9.048), 'Scale': VBase3(0.462, 0.798, 1.253), 'Visual': {'Color': 
(1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_low'}}, '1208818602.13akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(0.358, 7.602, 15.859), 'Pos': Point3(-240.404, 22.504, 61.208), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1208818653.27akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(-40.754, -17.65, 30.399), 'Pos': Point3(-259.388, -9.174, 86.627), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1208818734.8akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-20.251, 72.751, -29.831), 'Hpr': VBase3(-157.802, 0.0, 0.0), 'Pos': Point3(92.958, -191.914, 27.716), 'Scale': VBase3(0.877, 0.877, 0.877), 'Visual': {'Color': (0.95, 1.0, 0.95, 1.0), 'Model': 'models/props/cliff_jungle_high'}}, '1208824971.89akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(76.961, 2.914, -5.078), 'Pos': Point3(414.891, 42.336, 5.732), 'Scale': VBase3(1.342, 1.342, 1.711), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_3_floor'}}, '1208825047.78akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(411.062, 47.92, 6.511), 'Hpr': VBase3(142.674, 0.0, 0.0), 'Pos': Point3(411.115, 48.001, 6.511), 'Scale': VBase3(1.562, 1.562, 1.562), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208825097.28akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(398.265, 45.218, 6.763), 'Hpr': VBase3(-87.855, 0.0, 0.0), 'Pos': Point3(398.265, 45.218, 6.763), 'Scale': VBase3(0.827, 1.562, 1.562), 'Visual': {'Model': 
'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208825126.22akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(411.062, 47.92, 6.511), 'Hpr': VBase3(-40.0, 0.0, 0.0), 'Pos': Point3(403.249, 37.278, 6.682), 'Scale': VBase3(1.288, 1.562, 1.562), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208825212.38akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(411.062, 47.92, 6.511), 'Hpr': VBase3(-152.232, 0.0, 0.0), 'Pos': Point3(401.61, 50.943, 6.664), 'Scale': VBase3(0.817, 1.562, 1.562), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1208825312.16akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(411.062, 47.92, 6.511), 'Hpr': VBase3(-40.0, 0.0, 0.0), 'Pos': Point3(380.069, -12.385, 6.999), 'Scale': VBase3(5.23, 5.23, 5.23), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1208825386.56akelts': {'Type': 'Rock', 'DisableCollision': True, 'Hpr': VBase3(23.665, 0.0, 0.0), 'Pos': Point3(391.334, 0.09, 2.161), 'Scale': VBase3(7.302, 7.302, 13.415), 'Visual': {'Color': (0.42, 0.42, 0.4235294117647059, 1.0), 'Model': 'models/props/rock_4_floor'}}, '1208892989.37akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-20.251, 72.751, -29.831), 'Hpr': VBase3(153.104, -12.027, 0.0), 'Pos': Point3(444.135, -28.345, -18.929), 'Scale': VBase3(0.512, 0.86, 0.891), 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_low'}}, '1208893039.39akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-253.02, 39.159, 4.524), 'Hpr': VBase3(11.801, 0.0, 0.0), 'Pos': Point3(374.794, -157.153, -11.709), 'Scale': VBase3(0.732, 0.737, 0.857), 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_high'}}, '1208893077.73akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-253.02, 39.159, 4.524), 
'Hpr': VBase3(35.111, 0.0, 0.0), 'Pos': Point3(152.822, -265.197, -2.141), 'Scale': VBase3(1.117, 1.629, 1.308), 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_high'}}, '1208893136.55akelts': {'Type': 'Jungle_Props_large', 'DisableCollision': False, 'GridPos': Point3(-253.02, 39.159, 4.524), 'Hpr': VBase3(42.802, 0.0, 0.0), 'Pos': Point3(250.256, -179.188, -1.664), 'Scale': VBase3(0.958, 1.397, 1.34), 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/cliff_jungle_low'}}, '1208906898.45akelts': {'Type': 'Shanty Tents', 'Hpr': VBase3(-151.762, 0.0, 0.0), 'Pos': Point3(45.696, -178.938, 55.527), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/buildings/shanty_tent_house_facade'}}, '1208906917.51akelts': {'Type': 'Shanty Tents', 'Hpr': VBase3(-151.762, 0.0, 0.0), 'Pos': Point3(45.449, -178.828, 54.51), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/buildings/shanty_tent_house_body'}}, '1208908188.33akelts': {'Type': 'Furniture', 'DisableCollision': False, 'Hpr': VBase3(178.593, 0.0, 0.0), 'Objects': {}, 'Pos': Point3(34.017, -172.752, 55.733), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.39, 0.42, 0.32941176470588235, 1.0), 'Model': 'models/props/bench_shanty_2'}}, '1208908264.25akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'Hpr': VBase3(42.9, 0.0, 0.0), 'Pos': Point3(48.883, -162.468, 55.873), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6784313725490196, 0.72, 0.7372549019607844, 1.0), 'Model': 'models/props/prop_group01'}}, '1208908273.23akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'Hpr': VBase3(91.545, 0.0, 0.0), 'Pos': Point3(57.282, -162.22, 55.76), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.7254901960784313, 0.86, 1.0, 1.0), 'Model': 'models/props/prop_group02'}}, '1208908304.17akelts': {'Type': 'Prop_Groups', 'DisableCollision': False, 'Hpr': VBase3(-72.238, 0.0, 0.0), 'Pos': Point3(19.111, -187.636, 55.085), 'Scale': VBase3(1.0, 
1.0, 1.0), 'Visual': {'Color': (0.6, 0.65, 0.7, 1.0), 'Model': 'models/props/prop_group_A'}}, '1208908477.73akelts': {'Type': 'Townsperson', 'Category': 'Commoner', 'AnimSet': 'axe_chop', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'GridPos': Point3(30.598, -172.947, 55.786), 'Hpr': VBase3(116.347, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '6.3795', 'Pos': Point3(65.15, -167.037, 55.695), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Player', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1208980254.31akelts': {'Type': 'Log_Stack', 'DisableCollision': False, 'Hpr': VBase3(27.629, 0.0, 0.0), 'Pos': Point3(60.091, -169.355, 55.448), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/gen_log02'}}, '1208985003.56akelts': {'Type': 'Furniture', 'DisableCollision': True, 'Hpr': VBase3(-72.826, 0.0, 0.0), 'Pos': Point3(54.273, -180.641, 55.071), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/bed_shantyB'}}, '1208985185.81akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(0.0, 0.0, -7.77), 'Pos': Point3(88.1, -75.872, 51.497), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985214.58akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(28.731, -3.753, 1.341), 'Pos': Point3(64.877, -82.739, 50.505), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985234.61akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(11.467, 
-0.188, 5.517), 'Pos': Point3(-83.433, -39.622, 44.559), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985272.09akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-30.18, -3.812, 3.995), 'Pos': Point3(-65.186, -47.953, 43.053), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985300.27akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(82.607, 5.162, 6.442), 'Pos': Point3(-232.561, -177.407, 63.937), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985322.83akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(65.033, 2.97, 3.21), 'Pos': Point3(-275.027, -207.207, 69.057), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/grass_18feet'}}, '1208985344.06akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(65.033, 2.97, 3.21), 'Pos': Point3(-274.887, -207.771, 70.266), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/vegetation/grass_8feet'}}, '1208985360.09akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(77.525, 3.594, 2.492), 'Pos': Point3(-265.904, -185.256, 66.955), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/grass_8feet'}}, '1208985376.95akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(77.525, 3.594, -4.742), 'Pos': Point3(-360.566, -215.821, 70.994), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985400.34akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(33.262, 5.88, -0.894), 'Pos': Point3(-361.461, -226.025, 69.61), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 
'models/vegetation/grass_8feet'}}, '1208985418.14akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-40.745, 2.488, 3.514), 'Pos': Point3(-369.478, -287.912, 63.513), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985454.22akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-86.597, -1.881, 3.859), 'Pos': Point3(-363.642, -251.563, 68.713), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985479.5akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-1.089, 14.411, -6.92), 'Pos': Point3(-316.819, -299.837, 67.223), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985517.81akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-2.635, 4.975, -1.055), 'Pos': Point3(246.537, -24.978, 59.993), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985544.53akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-148.536, -3.532, -2.452), 'Pos': Point3(272.652, -18.391, 59.866), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208985576.95akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(34.877, 3.488, 1.698), 'Pos': Point3(287.409, -14.742, 58.774), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208986572.17akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-124.66, -1.746, -0.668), 'Pos': Point3(148.161, -34.548, 52.598), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208986622.69akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(56.594, 1.761, 2.697), 'Pos': Point3(132.288, -56.048, 53.743), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208986646.89akelts': 
{'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-142.604, -0.775, -0.615), 'Pos': Point3(112.824, -74.133, 54.446), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1208987108.2akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(-39.598, -1.124, -3.029), 'Pos': Point3(70.991, -74.959, 48.354), 'Scale': VBase3(1.246, 1.246, 1.246), 'Visual': {'Model': 'models/vegetation/bush_c'}}, '1208987212.84akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(10.566, 200.632, 54.196), 'Hpr': VBase3(17.523, 20.035, 0.0), 'Pos': Point3(91.882, -70.332, 51.882), 'Scale': VBase3(2.025, 2.025, 2.025), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1208991300.13akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(634.943, 474.118, -28.164), 'Hpr': VBase3(110.042, 0.0, 0.0), 'Pos': Point3(-204.673, -607.881, -41.891), 'Scale': VBase3(0.964, 1.112, 0.751), 'Visual': {'Color': (0.5899999737739563, 0.5899999737739563, 0.49000000953674316, 1.0), 'Model': 'models/props/mound_light_med2'}}, '1209140730.86akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(375.859, 82.108, 1.086), 'Hpr': VBase3(-171.208, 0.0, 0.0), 'Pos': Point3(-416.916, -181.892, -16.305), 'Scale': VBase3(0.761, 0.697, 1.033), 'Visual': {'Color': (0.42, 0.42, 0.33725490196078434, 1.0), 'Model': 'models/props/mound_light_med2'}}, '1209141015.28akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(375.859, 82.108, 1.086), 'Hpr': VBase3(90.399, 0.0, 0.0), 'Pos': Point3(-317.008, -321.238, -0.51), 'Scale': VBase3(0.247, 0.292, 0.554), 'Visual': {'Color': (0.42, 0.42, 0.33725490196078434, 1.0), 'Model': 'models/props/mound_light_lrg'}}, '1209141425.34akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(39.251, -4.673, -0.399), 'Pos': Point3(-246.574, -305.059, 2.045), 
'Scale': VBase3(10.453, 10.453, 13.328), 'Visual': {'Color': (0.33, 0.35, 0.2784313725490196, 1.0), 'Model': 'models/props/rock_4_sphere'}}, '1209141691.17akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(-6.969, 263.14, 4.355), 'Hpr': VBase3(-43.967, -0.265, 8.111), 'Pos': Point3(-66.052, -233.064, 4.105), 'Scale': VBase3(14.765, 14.765, 14.765), 'Visual': {'Color': (0.32, 0.34, 0.2784313725490196, 1.0), 'Model': 'models/props/rock_3_sphere'}}, '1209141904.02akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-33.642, 4.487, -0.17), 'Pos': Point3(-346.824, -342.296, 36.594), 'Scale': VBase3(0.991, 0.991, 0.991), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1209142119.19akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-186.002, 32.546, 3.202), 'Hpr': VBase3(-13.372, -0.435, -0.872), 'Pos': Point3(197.41, 31.295, -2.443), 'Scale': VBase3(6.958, 6.958, 11.792), 'Visual': {'Color': (0.38, 0.4, 0.3254901960784314, 1.0), 'Model': 'models/props/rock_2_sphere'}}, '1209146067.22akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(322.479, 227.694, 71.845), 'Hpr': VBase3(-169.429, -14.405, 5.089), 'Pos': Point3(-372.955, -303.33, 59.802), 'Scale': VBase3(2.157, 2.157, 2.025), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/rock_group_2_floor'}}, '1209146181.89akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(170.771, 6.668, -7.352), 'Pos': Point3(-30.527, -67.842, 37.103), 'Scale': VBase3(5.456, 5.456, 6.378), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1209160185.02akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(411.062, 47.92, 6.511), 'Hpr': VBase3(-40.0, 0.0, 0.0), 'Pos': Point3(402.251, 10.911, 6.754), 'Scale': VBase3(2.494, 2.494, 2.494), 'Visual': {'Model': 
'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209160274.7akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-163.903, 8.208, -4.85), 'Pos': Point3(207.547, 35.805, 43.107), 'Scale': VBase3(1.204, 1.204, 1.204), 'Visual': {'Model': 'models/vegetation/moss_a'}}, '1209160379.91akelts': {'Type': 'Tree', 'DisableCollision': True, 'Hpr': VBase3(139.806, 0.0, 0.0), 'Pos': Point3(132.217, -53.285, 53.637), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.800000011920929, 0.800000011920929, 0.800000011920929, 1.0), 'Model': 'models/vegetation/fern_tree_c'}}, '1209160555.83akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(10.566, 200.632, 54.196), 'Hpr': VBase3(-157.393, 31.58, 2.796), 'Pos': Point3(90.158, -59.378, 48.938), 'Scale': VBase3(2.617, 2.617, 2.617), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_fern_b'}}, '1209160596.69akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(10.566, 200.632, 54.196), 'Hpr': VBase3(17.523, -1.265, 5.946), 'Pos': Point3(-15.443, -69.225, 43.767), 'Scale': VBase3(1.581, 1.581, 1.581), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1209160629.69akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-43.413, 9.327, -11.599), 'Pos': Point3(-25.643, -78.851, 41.809), 'Scale': VBase3(3.465, 3.465, 6.651), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_1_sphere'}}, '1209160682.22akelts': {'Type': 'Grass', 'DisableCollision': False, 'Hpr': VBase3(-36.445, 1.806, -11.462), 'Pos': Point3(12.577, -87.779, 46.959), 'Scale': VBase3(2.014, 2.014, 2.014), 'Visual': {'Model': 'models/vegetation/grass_18feet'}}, '1209160719.84akelts': {'Type': 'Tree - Animated', 'DisableCollision': True, 'Hpr': VBase3(-11.599, 0.0, 0.0), 'Pos': Point3(-18.532, 
-47.773, 40.314), 'Scale': VBase3(1.367, 1.367, 1.367), 'SubObjs': {'Top Model': {'Visual': {'Animate': 'models/vegetation/fern_short_leaf_c_idle', 'Attach': ['trunk', 'def_trunk_attach'], 'Model': 'models/vegetation/fern_short_leaf_d_hi', 'PartName': 'leaf'}}}, 'Visual': {'Animate': 'models/vegetation/fern_short_trunk_d_idle', 'Model': 'models/vegetation/fern_short_trunk_e_hi', 'PartName': 'trunk'}}, '1209160827.09akelts': {'Type': 'Tree - Animated', 'DisableCollision': True, 'Hpr': VBase3(-11.599, 0.0, 0.0), 'Pos': Point3(-444.551, -208.964, 60.502), 'Scale': VBase3(1.367, 1.367, 1.367), 'SubObjs': {'Top Model': {'Visual': {'Animate': 'models/vegetation/fern_leaf_a_idle', 'Attach': ['trunk', 'def_trunk_attach'], 'Model': 'models/vegetation/fern_leaf_a_hi', 'PartName': 'leaf'}}}, 'Visual': {'Animate': 'models/vegetation/fern_trunk_a_idle', 'Model': 'models/vegetation/fern_trunk_a_hi', 'PartName': 'trunk'}}, '1209160837.58akelts': {'Type': 'Tree - Animated', 'DisableCollision': True, 'Hpr': VBase3(-11.599, 0.0, 0.0), 'Pos': Point3(-449.297, 15.15, 58.09), 'Scale': VBase3(1.367, 1.367, 1.367), 'SubObjs': {'Top Model': {'Visual': {'Animate': 'models/vegetation/fern_short_leaf_c_idle', 'Attach': ['trunk', 'def_trunk_attach'], 'Model': 'models/vegetation/fern_short_leaf_c_hi', 'PartName': 'leaf'}}}, 'Visual': {'Animate': 'models/vegetation/fern_short_trunk_d_idle', 'Model': 'models/vegetation/fern_short_trunk_d_hi', 'PartName': 'trunk'}}, '1209160864.06akelts': {'Type': 'Tree - Animated', 'DisableCollision': True, 'Hpr': VBase3(1.669, 0.0, 0.0), 'Pos': Point3(-359.654, 38.796, 48.389), 'Scale': VBase3(1.367, 1.367, 1.367), 'SubObjs': {'Top Model': {'Visual': {'Animate': 'models/vegetation/fern_short_leaf_c_idle', 'Attach': ['trunk', 'def_trunk_attach'], 'Model': 'models/vegetation/fern_short_leaf_c_hi', 'PartName': 'leaf'}}}, 'Visual': {'Animate': 'models/vegetation/fern_short_trunk_d_idle', 'Model': 'models/vegetation/fern_short_trunk_d_hi', 'PartName': 'trunk'}}, 
'1209160896.47akelts': {'Type': 'Tree - Animated', 'DisableCollision': True, 'Hpr': VBase3(36.166, 0.0, 0.0), 'Pos': Point3(254.104, 78.285, 6.637), 'Scale': VBase3(1.367, 1.367, 1.367), 'SubObjs': {'Top Model': {'Visual': {'Animate': 'models/vegetation/fern_short_leaf_c_idle', 'Attach': ['trunk', 'def_trunk_attach'], 'Model': 'models/vegetation/fern_short_leaf_c_hi', 'PartName': 'leaf'}}}, 'Visual': {'Animate': 'models/vegetation/fern_short_trunk_d_idle', 'Model': 'models/vegetation/fern_short_trunk_d_hi', 'PartName': 'trunk'}}, '1209160973.97akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(253.975, 78.719, 6.253), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_tube'}}, '1209161079.81akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(-155.962, 302.425, 9.518), 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(-155.962, 302.425, 9.518), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/prop_group_A'}}, '1209161092.23akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(-200.172, 324.753, 9.442), 'Hpr': VBase3(89.343, 0.0, 0.0), 'Pos': Point3(-200.173, 324.753, 9.442), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/prop_group_B'}}, '1209161143.61akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(-179.92, 309.712, 7.794), 'Hpr': VBase3(-12.669, 0.0, 0.0), 'Pos': Point3(-179.92, 309.713, 7.794), 'Scale': VBase3(5.45, 2.8, 2.541), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209161288.52akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-246.642, -281.739, 69.556), 'Scale': VBase3(4.657, 4.657, 4.657), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161340.83akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(322.479, 227.694, 
71.845), 'Hpr': VBase3(-170.939, 2.793, 4.934), 'Pos': Point3(-431.925, -244.179, 58.528), 'Scale': VBase3(6.838, 6.838, 6.42), 'Visual': {'Color': (0.5, 0.5, 0.5, 1.0), 'Model': 'models/props/rock_3_sphere'}}, '1209161440.44akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-144.282, -1.023, -1.323), 'Objects': {}, 'Pos': Point3(-365.81, 0.376, 51.849), 'Scale': VBase3(4.774, 4.774, 3.13), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_2_sphere'}}, '1209161477.69akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-0.015, 0.058, 1.671), 'Objects': {}, 'Pos': Point3(-409.894, 17.425, 56.048), 'Scale': VBase3(7.288, 7.288, 4.778), 'Visual': {'Color': (0.4000000059604645, 0.4000000059604645, 0.4000000059604645, 1.0), 'Model': 'models/props/rock_4_floor'}}, '1209161531.03akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-449.28, 14.967, 58.265), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_tube'}}, '1209161544.22akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-438.996, -8.506, 57.961), 'Scale': VBase3(1.801, 1.801, 1.801), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161579.78akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-359.211, -56.039, 55.448), 'Scale': VBase3(1.571, 1.571, 1.571), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161595.84akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-347.367, -45.613, 55.072), 'Scale': VBase3(1.972, 1.972, 1.972), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161621.64akelts': 
{'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(0.169, -201.995, 52.27), 'Scale': VBase3(1.778, 1.778, 1.778), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161638.92akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-66.046, -68.342, 45.16), 'Scale': VBase3(2.372, 2.372, 2.372), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161654.92akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(102.793, -146.8, 58.166), 'Scale': VBase3(2.372, 2.372, 2.372), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161657.44akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(128.546, -133.474, 58.518), 'Scale': VBase3(2.954, 2.954, 2.954), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161671.08akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(186.798, -22.386, 56.978), 'Scale': VBase3(2.524, 2.524, 2.524), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1209161686.61akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-13.52, 0.0, 0.0), 'Pos': Point3(273.201, -21.641, 59.928), 'Scale': VBase3(1.174, 1.174, 1.416), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209161710.03akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(14.597, 0.128, -5.824), 'Pos': Point3(82.206, -75.41, 50.078), 'Scale': VBase3(6.705, 2.85, 2.85), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209487843.45akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-129.25, 211.145, 50.427), 'Hpr': VBase3(140.127, 0.0, 0.974), 'Pos': Point3(275.008, -69.766, 53.784), 'Scale': VBase3(11.352, 
11.352, 20.379), 'Visual': {'Color': (0.35, 0.38, 0.3058823529411765, 1.0), 'Model': 'models/props/rock_4_sphere'}}, '1209487882.58akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-129.25, 211.145, 50.427), 'Hpr': VBase3(-103.375, 0.871, -0.434), 'Pos': Point3(263.141, -79.951, 54.999), 'Scale': VBase3(4.386, 4.386, 2.489), 'Visual': {'Color': (0.35, 0.38, 0.3058823529411765, 1.0), 'Model': 'models/props/rock_3_sphere'}}, '1209488311.5akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(-129.25, 211.145, 50.427), 'Hpr': VBase3(97.047, 11.992, 12.247), 'Pos': Point3(341.282, -16.655, 48.54), 'Scale': VBase3(11.352, 11.352, 25.765), 'Visual': {'Color': (0.33, 0.39, 0.29411764705882354, 1.0), 'Model': 'models/props/rock_4_sphere'}}, '1209488450.56akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(10.238, 5.609, -4.441), 'Pos': Point3(397.278, -167.751, 10.629), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1209488472.47akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(10.238, 5.609, -4.441), 'Pos': Point3(177.615, -288.955, 76.916), 'Scale': VBase3(2.879, 2.879, 2.879), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_d'}}, '1209488483.22akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(1.748, 40.407, 14.612), 'Pos': Point3(351.268, -169.859, 19.653), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1209488503.27akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(0.238, 27.161, -5.002), 'Pos': Point3(380.178, -175.757, 
38.526), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1209488515.72akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(212.329, -282.354, 30.754), 'Hpr': VBase3(0.237, 27.161, 12.904), 'Pos': Point3(212.329, -282.354, 30.754), 'Scale': VBase3(2.077, 2.077, 2.077), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1209488525.33akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(2.517, 23.367, 28.041), 'Pos': Point3(272.536, -231.389, 52.873), 'Scale': VBase3(2.543, 2.543, 2.543), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1209488617.81akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(212.329, -282.354, 30.754), 'Hpr': VBase3(-47.475, 1.66, 9.606), 'Pos': Point3(182.742, -297.231, -5.17), 'Scale': VBase3(1.933, 1.933, 1.933), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1209488652.98akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(212.329, -282.354, 30.754), 'Hpr': VBase3(-47.475, 1.659, 9.606), 'Pos': Point3(330.473, -171.451, -6.221), 'Scale': VBase3(1.933, 1.933, 1.933), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1209488661.19akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'GridPos': Point3(212.329, -282.354, 30.754), 'Hpr': VBase3(-47.475, 1.659, 9.606), 'Pos': Point3(474.988, -113.093, -7.282), 'Scale': VBase3(1.933, 1.933, 1.933), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1209493265.22akelts': {'Type': 'Vines', 'DisableCollision': False, 'Hpr': VBase3(-113.398, -4.875, -0.412), 'Pos': Point3(323.594, 
-34.059, 75.885), 'Scale': VBase3(1.204, 1.204, 1.204), 'Visual': {'Model': 'models/vegetation/moss_b'}}, '1209493343.03akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(-248.872, 56.43, 53.321), 'Hpr': VBase3(-172.407, 0.0, 0.0), 'Pos': Point3(344.013, -68.194, 70.628), 'Scale': VBase3(0.22, 0.22, 0.239), 'Visual': {'Color': (0.42, 0.49, 0.35294117647058826, 1.0), 'Model': 'models/props/mound_light_lrg'}}, '1209494722.42akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(155.008, 0.0, 0.0), 'Pos': Point3(37.059, -28.77, 5.11), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1209494725.34akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(155.008, 0.0, 0.0), 'Pos': Point3(37.059, -28.77, 5.11), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1209494725.63akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-127.2, -0.354, -1.185), 'Pos': Point3(37.059, -28.77, 5.11), 'Scale': VBase3(1.0, 1.0, 1.0), 'TargetUIDs': []}, '1209494966.0akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(155.393, 0.0, 0.0), 'Pos': Point3(-255.159, -127.722, 59.195), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495088.03akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'Hpr': VBase3(32.31, 0.0, 0.0), 'Pos': Point3(-256.78, -131.618, 59.952), 'Scale': VBase3(0.447, 0.447, 0.447), 'Visual': {'Model': 'models/vegetation/jungle_plant_b'}}, '1209495118.52akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'Hpr': VBase3(-61.885, 0.0, 0.0), 'Pos': Point3(-257.172, -131.645, 59.971), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/jungle_fern_a'}}, '1209495145.86akelts': {'Type': 'Jungle_Props', 'DisableCollision': False, 'Hpr': VBase3(32.31, 0.0, 0.0), 'Pos': Point3(-257.161, -131.858, 60.009), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/vegetation/jungle_fern_c'}}, 
'1209495327.36akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-16.321, 0.0, 0.0), 'Pos': Point3(-274.376, -215.222, 71.008), 'Scale': VBase3(1.547, 1.547, 1.547), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495367.36akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(-130.758, 0.0, 0.0), 'Pos': Point3(-265.362, -191.332, 69.757), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6000000238418579, 0.6000000238418579, 0.6000000238418579, 1.0), 'Model': 'models/vegetation/bush_b'}}, '1209495433.5akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(126.526, 0.0, 0.0), 'Pos': Point3(-261.988, -195.283, 67.94), 'Scale': VBase3(1.216, 1.547, 1.547), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495577.41akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-35.753, 0.0, 0.0), 'Pos': Point3(-75.592, -245.691, 4.261), 'Scale': VBase3(1.979, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495606.77akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(35.201, 0.0, 0.0), 'Pos': Point3(-57.12, -243.858, 4.318), 'Scale': VBase3(2.619, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495623.19akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(73.064, 0.0, 0.0), 'Pos': Point3(-44.884, -230.551, 4.319), 'Scale': VBase3(1.309, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495656.72akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-69.933, 0.0, 0.0), 'Pos': Point3(-84.664, -236.663, 4.305), 'Scale': VBase3(0.785, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495679.92akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-169.896, 0.0, 0.0), 'Pos': 
Point3(-66.491, -226.737, 40.084), 'Scale': VBase3(1.521, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495696.53akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-114.055, 0.0, 0.0), 'Pos': Point3(-75.191, -231.723, 40.01), 'Scale': VBase3(0.854, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209495721.25akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(158.79, 0.0, 0.0), 'Pos': Point3(-55.338, -226.91, 39.976), 'Scale': VBase3(0.854, 1.979, 1.979), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1209507473.88akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-24.821, 0.0, 0.0), 'Pos': Point3(-249.918, -140.602, 65.961), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1209507475.38akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(155.008, 0.0, 0.0), 'Pos': Point3(-255.062, -166.141, 66.015), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1209507476.16akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(142.807, 1.185, -0.354), 'Pos': Point3(357.407, 81.242, 6.653), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1210017193.42akelts': {'Type': 'Jungle_Props', 'DisableCollision': True, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(1.392, -0.182, 7.522), 'Pos': Point3(19.089, -40.189, 20.17), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_c'}}, '1210017223.89akelts': {'Type': 'Jungle_Props', 'DisableCollision': True, 'GridPos': Point3(-142.421, 161.349, 59.848), 'Hpr': VBase3(-44.557, -6.521, -11.794), 'Pos': Point3(44.922, -38.239, 7.918), 'Scale': VBase3(1.435, 1.435, 1.435), 'Visual': {'Color': (1.0, 1.0, 0.8799999952316284, 1.0), 'Model': 'models/vegetation/jungle_plant_a'}}, '1210017262.44akelts': {'Type': 'Tree - Animated', 'DisableCollision': True, 'Hpr': 
VBase3(71.785, 0.0, 0.0), 'Pos': Point3(25.245, -31.096, 1.557), 'Scale': VBase3(1.367, 1.367, 1.367), 'SubObjs': {'Top Model': {'Visual': {'Animate': 'models/vegetation/fern_short_leaf_c_idle', 'Attach': ['trunk', 'def_trunk_attach'], 'Model': 'models/vegetation/fern_short_leaf_d_hi', 'PartName': 'leaf'}}}, 'Visual': {'Animate': 'models/vegetation/fern_trunk_a_idle', 'Model': 'models/vegetation/fern_trunk_a_hi', 'PartName': 'trunk'}}, '1210017823.92akelts': {'Type': 'Bush', 'DisableCollision': True, 'Hpr': VBase3(135.643, 0.869, 3.112), 'Pos': Point3(13.659, -38.481, 3.898), 'Scale': VBase3(1.551, 1.551, 1.551), 'Visual': {'Model': 'models/vegetation/bush_f'}}, '1210017888.88akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(-251.946, -134.4, 62.218), 'Hpr': VBase3(75.504, 0.0, 0.0), 'Pos': Point3(-251.947, -134.4, 62.218), 'Scale': VBase3(1.0, 1.0, 1.821), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210017910.42akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(-251.847, -134.015, 62.218), 'Hpr': VBase3(-112.592, 0.0, 0.0), 'Pos': Point3(-244.564, -137.166, 61.987), 'Scale': VBase3(1.038, 1.0, 1.821), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210018066.84akelts': {'Type': 'Prop_Groups', 'DisableCollision': False, 'Hpr': VBase3(-53.61, -7.082, -8.069), 'Pos': Point3(-233.941, -140.802, 59.899), 'Scale': VBase3(0.947, 0.947, 0.947), 'Visual': {'Color': (0.81, 0.78, 0.7333333333333333, 1.0), 'Model': 'models/props/prop_group_H'}}, '1210018194.59akelts': {'Type': 'Crate', 'DisableCollision': True, 'Hpr': VBase3(-117.317, 0.0, 0.0), 'Pos': Point3(-241.587, -140.234, 60.102), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.7019607843137254, 0.73, 0.6745098039215687, 1.0), 'Model': 'models/props/crates_group_1'}}, '1210018229.25akelts': {'Type': 'Crate', 'DisableCollision': True, 'Hpr': VBase3(144.285, 0.0, 0.0), 'Pos': Point3(-237.053, 
-142.568, 60.168), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.8, 0.82, 0.7686274509803922, 1.0), 'Model': 'models/props/crates_group_2'}}, '1210018327.55akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(-251.847, -134.015, 62.218), 'Hpr': VBase3(-112.592, 0.0, 0.0), 'Pos': Point3(-228.241, -144.497, 59.732), 'Scale': VBase3(0.727, 0.727, 0.727), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_tube'}}, '1210018399.53akelts': {'Type': 'Townsperson', 'Category': 'Commoner', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'Hpr': VBase3(0.0, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '6.7229', 'Pos': Point3(-225.996, -133.769, 58.773), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Walk', 'StartFrame': '0', 'Team': 'Player', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1210018587.17akelts': {'Type': 'Furniture', 'DisableCollision': False, 'Hpr': VBase3(0.0, -3.702, 0.0), 'Pos': Point3(365.553, 88.805, 34.002), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.75, 0.73, 0.6823529411764706, 1.0), 'Model': 'models/props/chair_shanty'}}, '1210018730.95akelts': {'Type': 'Furniture', 'DisableCollision': False, 'GridPos': Point3(367.558, 85.376, 33.907), 'Hpr': VBase3(143.779, 2.988, 2.187), 'Pos': Point3(367.558, 85.377, 33.907), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.75, 0.73, 0.6823529411764706, 1.0), 'Model': 'models/props/table_shanty_2'}}, '1210018803.45akelts': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'GridPos': 
Point3(369.89, 86.872, 36.791), 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(369.89, 86.872, 36.792), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/winebottle_A'}}, '1210018806.36akelts': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'GridPos': Point3(366.932, 82.924, 37.046), 'Hpr': VBase3(127.203, 0.73, -0.998), 'Pos': Point3(366.932, 82.924, 37.046), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/winebottle_B'}}, '1210018832.0akelts': {'Type': 'Cups', 'DisableCollision': False, 'Hpr': VBase3(-127.2, -0.354, -1.185), 'Pos': Point3(368.358, 85.514, 36.875), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/beerstein'}}, '1210018838.19akelts': {'Type': 'Cups', 'DisableCollision': False, 'GridPos': Point3(367.707, 84.2, 36.964), 'Hpr': VBase3(-82.374, 0.0, 0.0), 'Pos': Point3(367.706, 84.2, 36.964), 'Scale': VBase3(0.443, 0.443, 0.443), 'Visual': {'Model': 'models/props/cup_tin'}}, '1210018876.3akelts': {'Type': 'Prop_Groups', 'DisableCollision': False, 'GridPos': Point3(361.028, 85.78, 49.797), 'Hpr': VBase3(-102.398, 0.0, 0.0), 'Pos': Point3(361.028, 85.78, 49.797), 'Scale': VBase3(0.639, 0.639, 0.639), 'Visual': {'Color': (0.800000011920929, 0.800000011920929, 0.800000011920929, 1.0), 'Model': 'models/props/prop_group_C'}}, '1210018938.13akelts': {'Type': 'Prop_Groups', 'DisableCollision': False, 'GridPos': Point3(381.373, 75.1, 49.771), 'Hpr': VBase3(58.938, 1.198, 0.308), 'Pos': Point3(381.373, 75.101, 49.771), 'Scale': VBase3(0.639, 0.639, 0.639), 'Visual': {'Model': 'models/props/prop_group_B'}}, '1210018967.11akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 'GridPos': Point3(374.945, 53.671, 34.579), 'Hpr': VBase3(42.768, -0.081, 1.097), 'Pos': Point3(374.945, 53.671, 34.579), 'Scale': VBase3(0.866, 0.866, 0.866), 'Visual': {'Color': (0.67, 0.71, 0.7294117647058823, 1.0), 'Model': 'models/props/prop_group_A'}}, '1210018986.64akelts': {'Type': 'Prop_Groups', 'DisableCollision': True, 
'GridPos': Point3(363.019, 68.391, 34.731), 'Hpr': VBase3(169.961, -0.222, -0.132), 'Pos': Point3(363.019, 68.391, 34.73), 'Scale': VBase3(0.866, 0.866, 0.866), 'Visual': {'Color': (0.52, 0.56, 0.4392156862745098, 1.0), 'Model': 'models/props/prop_group_A'}}, '1210019029.05akelts': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'GridPos': Point3(365.81, 60.824, 37.802), 'Hpr': VBase3(-105.601, -0.506, -0.976), 'Pos': Point3(365.81, 60.824, 37.802), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/winebottle_B'}}, '1210019045.88akelts': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'GridPos': Point3(365.692, 59.988, 37.787), 'Hpr': VBase3(21.597, 0.259, -0.005), 'Pos': Point3(365.691, 59.988, 37.788), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/winebottle_A'}}, '1210019110.2akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'GridPos': Point3(365.28, 83.918, 36.929), 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(365.28, 83.918, 36.93), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/lamp_table_hurricane_candle'}}, '1210352028.39akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(-24.821, 0.0, 0.0), 'Pos': Point3(-249.918, -140.602, 65.961), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1210352029.53akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator_2', 'Hpr': VBase3(155.008, 0.0, 0.0), 'Pos': Point3(-255.062, -166.141, 66.015), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1210352034.06akelts': {'Type': 'Door Locator Node', 'Name': 'door_locator', 'Hpr': VBase3(142.807, 1.185, -0.354), 'Pos': Point3(357.407, 81.242, 6.653), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1210352211.14akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-455.967, -32.178, 2.633), 'Scale': VBase3(5.225, 5.225, 5.225), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1210352223.39akelts': {'Type': 'Collision Barrier', 
'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-449.488, 30.519, 4.783), 'Scale': VBase3(4.118, 4.118, 2.416), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_tube'}}, '1210352237.56akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-469.793, 6.73, 4.417), 'Scale': VBase3(2.399, 2.399, 2.399), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1210353447.2akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-66.77, 0.0, 0.0), 'Pos': Point3(52.623, -166.21, 55.787), 'Scale': VBase3(1.941, 1.941, 1.941), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1210354351.36akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(85.441, 0.0, 0.0), 'Pos': Point3(282.365, -15.541, 59.381), 'Scale': VBase3(1.174, 1.174, 1.416), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210354363.27akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(29.095, 0.0, 0.0), 'Pos': Point3(280.303, -22.024, 59.813), 'Scale': VBase3(0.427, 1.174, 1.416), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210354405.92akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(29.095, 0.0, 0.0), 'Pos': Point3(230.144, -24.458, 59.865), 'Scale': VBase3(0.427, 1.174, 1.416), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210354500.67akelts': {'Type': 'Dinghy', 'Aggro Radius': '20.0000', 'GridPos': Point3(-246.537, -87.543, -0.021), 'Hpr': VBase3(89.833, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(530.695, 29.685, 0.115), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1210354564.08akelts': {'Type': 'Dinghy', 'Aggro Radius': '20.0000', 'GridPos': Point3(8.739, -231.852, 0.612), 'Hpr': VBase3(-136.695, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(-286.469, 318.366, 
0.91), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1210354586.39akelts': {'Type': 'Dinghy', 'Aggro Radius': '19.8795', 'GridPos': Point3(8.739, -231.852, 0.612), 'Hpr': VBase3(-65.781, 0.0, 0.0), 'Location': 'Water', 'Pos': Point3(-464.754, 160.717, 0.509), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/shipparts/dingy-geometry_High'}}, '1210356705.42akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'GridPos': Point3(411.062, 47.92, 6.511), 'Hpr': VBase3(-52.401, 0.0, 0.0), 'Pos': Point3(339.491, -13.104, 54.117), 'Scale': VBase3(2.023, 2.023, 2.023), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_tube'}}, '1210358375.59akelts': {'Type': 'Crate', 'DisableCollision': True, 'Hpr': VBase3(12.252, 0.0, 0.0), 'Pos': Point3(60.212, -165.713, 55.673), 'Scale': VBase3(0.832, 0.832, 0.832), 'Visual': {'Model': 'models/props/crate'}}, '1210612073.56akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(51.889, 0.0, 0.0), 'Pos': Point3(-253.468, -343.007, 2.973), 'Scale': VBase3(3.106, 4.657, 4.657), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210612087.31akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(87.795, 0.0, 0.0), 'Pos': Point3(-244.103, -321.737, 2.983), 'Scale': VBase3(2.049, 4.657, 4.657), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210612184.25akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-3.518, 0.0, 0.0), 'Pos': Point3(-280.058, -353.778, 3.092), 'Scale': VBase3(3.602, 4.657, 4.657), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210612337.41akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-6.919, 0.0, 0.0), 'Pos': Point3(-287.009, -370.072, -5.466), 'Scale': VBase3(2.242, 2.242, 2.242), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_tube'}}, '1210612360.47akelts': 
{'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-6.919, 0.0, 0.0), 'Pos': Point3(-308.201, -345.997, 5.321), 'Scale': VBase3(2.941, 2.941, 2.941), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1210613067.88akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-248.226, -303.587, 7.437), 'Scale': VBase3(2.512, 2.512, 2.512), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1210614180.44akelts': {'Type': 'Rock', 'DisableCollision': True, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-153.888, -21.83, -4.597), 'Pos': Point3(-221.129, -286.486, 0.739), 'Scale': VBase3(3.573, 3.573, 4.555), 'Visual': {'Color': (0.22, 0.25, 0.22745098039215686, 1.0), 'Model': 'models/props/rock_group_5_sphere'}}, '1210614230.94akelts': {'Type': 'Rock', 'DisableCollision': False, 'GridPos': Point3(146.575, -48.43, 17.034), 'Hpr': VBase3(-150.378, -10.878, 0.083), 'Pos': Point3(-229.862, -293.394, 5.386), 'Scale': VBase3(2.597, 2.597, 1.992), 'Visual': {'Color': (0.28, 0.32, 0.3058823529411765, 1.0), 'Model': 'models/props/rock_group_1_floor'}}, '1210978646.84kmuller': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(-163.082, 0.0, 0.0), 'Pos': Point3(-298.715, 81.667, 14.685), 'Scale': VBase3(5.033, 4.636, 7.816), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210978775.87kmuller': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Hpr': VBase3(88.931, 0.0, 0.0), 'Pos': Point3(-242.472, 47.559, 17.589), 'Scale': VBase3(2.49, 2.134, 7.212), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1210978826.82kmuller': {'Type': 'Jungle_Props', 'DisableCollision': False, 'Hpr': VBase3(4.725, -17.22, 15.599), 'Pos': Point3(-264.896, 70.594, 17.81), 'Scale': VBase3(2.16, 2.16, 2.16), 'Visual': {'Model': 'models/vegetation/jungle_fern_b'}}, '1210978861.2kmuller': {'Type': 'Collision Barrier', 
'DisableCollision': False, 'Hpr': VBase3(138.584, 0.0, 0.0), 'Pos': Point3(-258.687, 74.265, 16.954), 'Scale': VBase3(4.496, 4.859, 7.409), 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1211929216.0WDIG1': {'Type': 'Townsperson', 'Category': 'Commoner', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'DNA': '1211929216.0WDIG1', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'HelpID': 'SHIP_PVP_HELP_SPAINISH_A', 'Hpr': VBase3(80.821, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '12.0000', 'Pos': Point3(-209.305, 197.314, 10.122), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Player', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1211929472.0WDIG': {'Type': 'Townsperson', 'Category': 'Commoner', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'DNA': '1211929472.0WDIG', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'HelpID': 'SHIP_PVP_HELP_SPAINISH_B', 'Hpr': VBase3(67.311, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '12.0000', 'Pos': Point3(327.55, 81.314, 6.205), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Player', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 
'spawnTimeEnd': 0.0}, '1216249894.26aapatel': {'Type': 'Townsperson', 'Category': 'PvPRewards', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'HelpID': 'NONE', 'Holiday': '', 'Hpr': VBase3(-42.606, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '12.0000', 'Pos': Point3(-321.07, 88.839, 14.807), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PRIVATEER_TATTOOS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Villager', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'VisSize': '', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1259623459.69caoconno': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(443.924, 227.799, -6.924), 'Scale': VBase3(5.736, 5.736, 5.736), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1259623493.53caoconno': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(32.551, 0.0, 0.0), 'Pos': Point3(387.081, 216.027, -17.705), 'Scale': VBase3(1.952, 1.952, 4.363), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1259623521.7caoconno': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(80.439, 0.0, 0.0), 'Pos': Point3(381.85, 177.893, -9.613), 'Scale': VBase3(2.279, 2.624, 5.355), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1259623563.36caoconno': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(322.954, 164.741, -1.891), 'Scale': VBase3(2.252, 2.252, 2.252), 
'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1259623576.63caoconno': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-90.398, 0.0, 0.0), 'Pos': Point3(300.603, 216.858, -18.215), 'Scale': VBase3(2.122, 2.122, 4.718), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1259623610.39caoconno': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-179.301, 0.0, 0.0), 'Pos': Point3(358.649, 267.315, -20.517), 'Scale': VBase3(6.469, 6.469, 10.613), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}}, 'PVPTeam': '2', 'Team': 2, 'Undockable': False, 'Visibility': 'Grid', 'Visual': {'Model': 'models/islands/pir_m_are_isl_pvpSpanish'}}}, 'Node Links': [], 'Layers': {'Collisions': ['1184008208.59kmuller', '1184016064.62kmuller', '1184013852.84kmuller', '1185822696.06kmuller', '1184006140.32kmuller', '1184002350.98kmuller', '1184007573.29kmuller', '1184021176.59kmuller', '1184005963.59kmuller', '1188324241.31akelts', '1184006537.34kmuller', '1184006605.81kmuller', '1187139568.33kmuller', '1188324186.98akelts', '1184006730.66kmuller', '1184007538.51kmuller', '1184006188.41kmuller', '1184021084.27kmuller', '1185824396.94kmuller', '1185824250.16kmuller', '1185823630.52kmuller', '1185823760.23kmuller', '1185824497.83kmuller', '1185824751.45kmuller', '1187739103.34akelts', '1188323993.34akelts', '1184016538.29kmuller', '1185822200.97kmuller', '1184016225.99kmuller', '1195241421.34akelts', '1195242796.08akelts', '1184020642.13kmuller', '1195237994.63akelts', '1184020756.88kmuller', '1184020833.4kmuller', '1185820992.97kmuller', '1185821053.83kmuller', '1184015068.54kmuller', '1184014935.82kmuller', '1185821432.88kmuller', '1185821701.86kmuller', '1195240137.55akelts', '1195241539.38akelts', '1195238422.3akelts', '1195238473.22akelts', '1185821453.17kmuller', '1184021269.96kmuller', '1185821310.89kmuller', 
'1185821165.59kmuller', '1185821199.36kmuller', '1185822035.98kmuller', '1184015806.59kmuller', '1185822059.48kmuller', '1185920461.76kmuller', '1194984449.66akelts', '1185824206.22kmuller', '1184003446.23kmuller', '1184003254.85kmuller', '1184003218.74kmuller', '1184002700.44kmuller', '1186705073.11kmuller', '1187658531.86akelts', '1186705214.3kmuller', '1185824927.28kmuller', '1184014204.54kmuller', '1184014152.84kmuller']}, 'ObjectIds': {'1196970035.53sdnaik': '["Objects"]["1196970035.53sdnaik"]', '1201551808.32kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201551808.32kmuller"]', '1201551834.96kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201551834.96kmuller"]', '1201551889.37kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201551889.37kmuller"]', '1201551915.76kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201551915.76kmuller"]', '1201558997.82kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201558997.82kmuller"]', '1201559007.18kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201559007.18kmuller"]', '1201559028.84kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1201559028.84kmuller"]', '1202414940.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1202414940.17akelts"]', '1202414940.17akelts0': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1202414940.17akelts"]', '1203114290.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114290.45akelts"]', '1203114330.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114330.25akelts"]', '1203114365.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114365.52akelts"]', '1203114365.52akelts0': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114365.52akelts"]', '1203114365.58akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114365.58akelts"]', '1203114419.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114419.42akelts"]', '1203114463.95akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114463.95akelts"]', '1203114579.48akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114579.48akelts"]', '1203114754.3akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114754.3akelts"]', '1203114815.91akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114815.91akelts"]', '1203114862.61akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114862.61akelts"]', '1203114895.23akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114895.23akelts"]', '1203115389.72akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115389.72akelts"]', '1203115424.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115424.56akelts"]', '1203115489.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115489.25akelts"]', '1203115525.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115525.55akelts"]', '1203115567.8akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115567.8akelts"]', '1203115686.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115686.2akelts"]', '1203115759.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115759.17akelts"]', '1203115821.75akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115821.75akelts"]', '1203115874.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115874.03akelts"]', '1203115947.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203115947.03akelts"]', '1203449603.97akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203449603.97akelts"]', '1203450135.28akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450135.28akelts"]', '1203450324.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450324.03akelts"]', '1203450370.69akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450370.69akelts"]', '1203450461.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450461.55akelts"]', '1203450511.61akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450511.61akelts"]', '1203450582.5akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450582.5akelts"]', '1203450682.05akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450682.05akelts"]', '1203450682.05akelts0': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450682.05akelts"]', '1203450740.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450740.25akelts"]', '1203450762.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450762.45akelts"]', '1203450800.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450800.73akelts"]', '1203450923.8akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450923.8akelts"]', '1203450923.8akelts0': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203450923.8akelts"]', '1203451028.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203451028.89akelts"]', '1203451240.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203451240.81akelts"]', '1203458158.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203458158.53akelts"]', '1203458673.23akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203458673.23akelts"]', '1203459119.77akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203459119.77akelts"]', '1203459408.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203459408.59akelts"]', '1203460758.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203460758.22akelts"]', '1203460800.05akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203460800.05akelts"]', '1203460826.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203460826.44akelts"]', '1203460870.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203460870.59akelts"]', '1203460917.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203460917.47akelts"]', '1203460945.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203460945.09akelts"]', '1203461019.36akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461019.36akelts"]', '1203461073.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461073.42akelts"]', '1203461085.86akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461085.86akelts"]', '1203461128.5akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461128.5akelts"]', '1203461179.58akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461179.58akelts"]', '1203461241.98akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461241.98akelts"]', '1203461317.66akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461317.66akelts"]', '1203461390.23akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461390.23akelts"]', '1203461510.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203461510.47akelts"]', '1203464268.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203464268.55akelts"]', '1203464436.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203464436.53akelts"]', '1203464470.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203464470.73akelts"]', '1203464494.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203464494.17akelts"]', '1203464564.14akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203464564.14akelts"]', '1204059355.98akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059355.98akelts"]', '1204059414.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059414.83akelts"]', '1204059436.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059436.81akelts"]', '1204059462.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059462.52akelts"]', '1204059672.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059672.03akelts"]', '1204059711.06akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059711.06akelts"]', '1204059712.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059712.09akelts"]', '1204059713.08akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059713.08akelts"]', '1204059716.5akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059716.5akelts"]', '1204059717.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059717.45akelts"]', '1204059719.28akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059719.28akelts"]', '1204059720.05akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059720.05akelts"]', '1204059724.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059724.55akelts"]', '1204059729.58akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059729.58akelts"]', '1204059732.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059732.36akelts"]', '1204059735.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059735.73akelts"]', '1204059738.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059738.44akelts"]', '1204059742.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059742.03akelts"]', '1204059934.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059934.52akelts"]', '1204059949.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059949.56akelts"]', '1204059953.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059953.34akelts"]', '1204059956.41akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059956.41akelts"]', '1204059960.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059960.55akelts"]', '1204059962.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059962.81akelts"]', '1204059966.33akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059966.33akelts"]', '1204059991.02akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204059991.02akelts"]', '1204060027.94akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060027.94akelts"]', '1204060031.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060031.36akelts"]', '1204060033.45akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060033.45akelts"]', '1204060035.7akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060035.7akelts"]', '1204060039.77akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060039.77akelts"]', '1204060065.31akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060065.31akelts"]', '1204060071.86akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060071.86akelts"]', '1204060074.63akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060074.63akelts"]', '1204060078.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060078.2akelts"]', '1204060082.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060082.56akelts"]', '1204060470.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060470.2akelts"]', '1204060505.48akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060505.48akelts"]', '1204060511.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060511.89akelts"]', '1204060515.3akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060515.3akelts"]', '1204060521.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060521.22akelts"]', '1204060524.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060524.36akelts"]', '1204060821.13akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060821.13akelts"]', '1204060855.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060855.45akelts"]', '1204060956.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204060956.09akelts"]', '1204061028.02akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204061028.02akelts"]', '1204061061.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204061061.27akelts"]', '1204134204.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204134204.59akelts"]', '1204134436.16akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204134436.16akelts"]', '1204134561.33akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1204134561.33akelts"]', '1204134680.02akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204134680.02akelts"]', '1204134735.98akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204134735.98akelts"]', '1204134910.91akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204134910.91akelts"]', '1204135098.84akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204135098.84akelts"]', '1204135135.33akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204135135.33akelts"]', '1204237124.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237124.22akelts"]', '1204237124.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237124.2akelts"]', '1204237124.2akelts0': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237124.2akelts"]', '1204237223.91akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237223.91akelts"]', '1204237285.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237285.34akelts"]', '1204237296.23akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237296.23akelts"]', '1204237768.98akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237768.98akelts"]', '1204237956.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237956.03akelts"]', '1208535913.67akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208535913.67akelts"]', '1208535916.31akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208535916.31akelts"]', '1208551336.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208551336.89akelts"]', '1208551339.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208551339.09akelts"]', '1208796482.19akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208796482.19akelts"]', '1208796484.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208796484.09akelts"]', '1208796484.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208796484.17akelts"]', '1208808622.33akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808622.33akelts"]', '1208808661.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808661.47akelts"]', '1208808732.92akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808732.92akelts"]', '1208808760.67akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808760.67akelts"]', '1208808898.64akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808898.64akelts"]', '1208808939.75akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808939.75akelts"]', '1208808963.02akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808963.02akelts"]', '1208808997.63akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208808997.63akelts"]', '1208809009.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809009.36akelts"]', '1208809176.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809176.83akelts"]', '1208809279.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809279.59akelts"]', '1208809368.11akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809368.11akelts"]', '1208809472.88akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809472.88akelts"]', '1208809510.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809510.83akelts"]', '1208809617.05akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809617.05akelts"]', '1208809694.8akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809694.8akelts"]', '1208809918.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809918.53akelts"]', '1208809947.38akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809947.38akelts"]', '1208809958.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809958.27akelts"]', '1208809992.67akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208809992.67akelts"]', '1208810053.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810053.52akelts"]', '1208810085.05akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810085.05akelts"]', '1208810147.31akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810147.31akelts"]', '1208810201.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810201.73akelts"]', '1208810269.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810269.45akelts"]', '1208810310.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810310.83akelts"]', '1208810354.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810354.47akelts"]', '1208810406.28akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810406.28akelts"]', '1208810445.06akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810445.06akelts"]', '1208810495.61akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810495.61akelts"]', '1208810525.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810525.34akelts"]', '1208810579.08akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810579.08akelts"]', '1208810621.84akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810621.84akelts"]', '1208810658.95akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810658.95akelts"]', '1208810807.19akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810807.19akelts"]', '1208810877.0akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810877.0akelts"]', '1208810934.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208810934.53akelts"]', '1208811023.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811023.73akelts"]', '1208811068.98akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811068.98akelts"]', '1208811084.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811084.52akelts"]', '1208811112.3akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811112.3akelts"]', '1208811262.97akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811262.97akelts"]', '1208811381.53akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811381.53akelts"]', '1208811429.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811429.42akelts"]', '1208811768.8akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811768.8akelts"]', '1208811772.16akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811772.16akelts"]', '1208811773.16akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811773.16akelts"]', '1208811773.94akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811773.94akelts"]', '1208811776.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811776.03akelts"]', '1208811776.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811776.83akelts"]', '1208811777.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811777.47akelts"]', '1208811779.14akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811779.14akelts"]', '1208811781.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811781.34akelts"]', '1208811783.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811783.25akelts"]', '1208811785.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811785.25akelts"]', '1208811787.06akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811787.06akelts"]', '1208811787.67akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811787.67akelts"]', '1208811788.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811788.55akelts"]', '1208811791.69akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811791.69akelts"]', '1208811792.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811792.56akelts"]', '1208811794.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811794.17akelts"]', '1208811795.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811795.17akelts"]', '1208811800.08akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811800.08akelts"]', '1208811801.2akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811801.2akelts"]', '1208811802.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811802.09akelts"]', '1208811803.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811803.81akelts"]', '1208811804.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811804.59akelts"]', '1208811805.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811805.2akelts"]', '1208811806.67akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811806.67akelts"]', '1208811976.0akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208811976.0akelts"]', '1208812034.0akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208812034.0akelts"]', '1208817172.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817172.09akelts"]', '1208817254.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817254.27akelts"]', '1208817508.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817508.89akelts"]', '1208817704.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817704.22akelts"]', '1208817733.41akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817733.41akelts"]', '1208817744.48akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817744.48akelts"]', '1208817783.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817783.52akelts"]', '1208817904.31akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208817904.31akelts"]', '1208818191.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208818191.44akelts"]', '1208818217.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208818217.42akelts"]', '1208818540.86akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208818540.86akelts"]', '1208818602.13akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208818602.13akelts"]', '1208818653.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208818653.27akelts"]', '1208818734.8akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208818734.8akelts"]', '1208824971.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208824971.89akelts"]', '1208825047.78akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208825047.78akelts"]', '1208825097.28akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208825097.28akelts"]', '1208825126.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208825126.22akelts"]', '1208825212.38akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208825212.38akelts"]', '1208825312.16akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208825312.16akelts"]', '1208825386.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208825386.56akelts"]', '1208892989.37akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208892989.37akelts"]', '1208893039.39akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208893039.39akelts"]', '1208893077.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208893077.73akelts"]', '1208893136.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208893136.55akelts"]', '1208906898.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208906898.45akelts"]', '1208906917.51akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208906917.51akelts"]', '1208908188.33akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208908188.33akelts"]', '1208908264.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208908264.25akelts"]', '1208908273.23akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208908273.23akelts"]', '1208908304.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208908304.17akelts"]', '1208908477.73akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208908477.73akelts"]', '1208980254.31akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208980254.31akelts"]', '1208985003.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985003.56akelts"]', '1208985185.81akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985185.81akelts"]', '1208985214.58akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985214.58akelts"]', '1208985234.61akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985234.61akelts"]', '1208985272.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985272.09akelts"]', '1208985300.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985300.27akelts"]', '1208985322.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985322.83akelts"]', '1208985344.06akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985344.06akelts"]', '1208985360.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985360.09akelts"]', '1208985376.95akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985376.95akelts"]', '1208985400.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985400.34akelts"]', '1208985418.14akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985418.14akelts"]', '1208985454.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985454.22akelts"]', '1208985479.5akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985479.5akelts"]', '1208985517.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985517.81akelts"]', '1208985544.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985544.53akelts"]', '1208985576.95akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208985576.95akelts"]', '1208986572.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208986572.17akelts"]', '1208986622.69akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208986622.69akelts"]', '1208986646.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208986646.89akelts"]', '1208987108.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208987108.2akelts"]', '1208987212.84akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1208987212.84akelts"]', '1208991300.13akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1208991300.13akelts"]', '1209140730.86akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209140730.86akelts"]', '1209141015.28akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209141015.28akelts"]', '1209141425.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209141425.34akelts"]', '1209141691.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209141691.17akelts"]', '1209141904.02akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209141904.02akelts"]', '1209142119.19akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209142119.19akelts"]', '1209146067.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209146067.22akelts"]', '1209146181.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209146181.89akelts"]', '1209160185.02akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160185.02akelts"]', '1209160274.7akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160274.7akelts"]', '1209160379.91akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160379.91akelts"]', '1209160555.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160555.83akelts"]', '1209160596.69akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160596.69akelts"]', '1209160629.69akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160629.69akelts"]', '1209160682.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160682.22akelts"]', '1209160719.84akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160719.84akelts"]', '1209160827.09akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160827.09akelts"]', '1209160837.58akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160837.58akelts"]', '1209160864.06akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160864.06akelts"]', '1209160896.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160896.47akelts"]', '1209160973.97akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1209160973.97akelts"]', '1209161079.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161079.81akelts"]', '1209161092.23akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161092.23akelts"]', '1209161143.61akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161143.61akelts"]', '1209161288.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161288.52akelts"]', '1209161340.83akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161340.83akelts"]', '1209161440.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161440.44akelts"]', '1209161477.69akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161477.69akelts"]', '1209161531.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161531.03akelts"]', '1209161544.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161544.22akelts"]', '1209161579.78akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161579.78akelts"]', '1209161595.84akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161595.84akelts"]', '1209161621.64akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161621.64akelts"]', '1209161638.92akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161638.92akelts"]', '1209161654.92akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161654.92akelts"]', '1209161657.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161657.44akelts"]', '1209161671.08akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161671.08akelts"]', '1209161686.61akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161686.61akelts"]', '1209161710.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209161710.03akelts"]', '1209487843.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209487843.45akelts"]', '1209487882.58akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209487882.58akelts"]', '1209488311.5akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488311.5akelts"]', '1209488450.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488450.56akelts"]', '1209488472.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488472.47akelts"]', '1209488483.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488483.22akelts"]', '1209488503.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488503.27akelts"]', '1209488515.72akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488515.72akelts"]', '1209488525.33akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488525.33akelts"]', '1209488617.81akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488617.81akelts"]', '1209488652.98akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488652.98akelts"]', '1209488661.19akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209488661.19akelts"]', '1209493265.22akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209493265.22akelts"]', '1209493343.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209493343.03akelts"]', '1209494722.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209494722.42akelts"]', '1209494725.34akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209494725.34akelts"]', '1209494725.63akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209494725.63akelts"]', '1209494966.0akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209494966.0akelts"]', '1209495088.03akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495088.03akelts"]', '1209495118.52akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495118.52akelts"]', '1209495145.86akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495145.86akelts"]', '1209495327.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495327.36akelts"]', '1209495367.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495367.36akelts"]', '1209495433.5akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495433.5akelts"]', '1209495577.41akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495577.41akelts"]', '1209495606.77akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495606.77akelts"]', '1209495623.19akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495623.19akelts"]', '1209495656.72akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495656.72akelts"]', '1209495679.92akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495679.92akelts"]', '1209495696.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495696.53akelts"]', '1209495721.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209495721.25akelts"]', '1209507473.88akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209507473.88akelts"]', '1209507475.38akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209507475.38akelts"]', '1209507476.16akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1209507476.16akelts"]', '1210017193.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210017193.42akelts"]', '1210017223.89akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210017223.89akelts"]', '1210017262.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210017262.44akelts"]', '1210017823.92akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210017823.92akelts"]', '1210017888.88akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210017888.88akelts"]', '1210017910.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210017910.42akelts"]', '1210018066.84akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018066.84akelts"]', '1210018194.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018194.59akelts"]', '1210018229.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018229.25akelts"]', '1210018327.55akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018327.55akelts"]', '1210018399.53akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018399.53akelts"]', '1210018587.17akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018587.17akelts"]', '1210018730.95akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018730.95akelts"]', '1210018803.45akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018803.45akelts"]', '1210018806.36akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018806.36akelts"]', '1210018832.0akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018832.0akelts"]', '1210018838.19akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018838.19akelts"]', '1210018876.3akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018876.3akelts"]', '1210018938.13akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018938.13akelts"]', '1210018967.11akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018967.11akelts"]', '1210018986.64akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210018986.64akelts"]', '1210019029.05akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210019029.05akelts"]', '1210019045.88akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210019045.88akelts"]', '1210019110.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210019110.2akelts"]', '1210352028.39akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210352028.39akelts"]', '1210352029.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210352029.53akelts"]', '1210352034.06akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210352034.06akelts"]', '1210352211.14akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210352211.14akelts"]', '1210352223.39akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210352223.39akelts"]', '1210352237.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210352237.56akelts"]', '1210353447.2akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210353447.2akelts"]', '1210354351.36akelts': 
'["Objects"]["1196970035.53sdnaik"]["Objects"]["1210354351.36akelts"]', '1210354363.27akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210354363.27akelts"]', '1210354405.92akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210354405.92akelts"]', '1210354500.67akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210354500.67akelts"]', '1210354564.08akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210354564.08akelts"]', '1210354586.39akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210354586.39akelts"]', '1210356705.42akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210356705.42akelts"]', '1210358375.59akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210358375.59akelts"]', '1210373723.53akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237124.2akelts"]["Objects"]["1210373723.53akelts"]', '1210373725.94akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1204237124.2akelts"]["Objects"]["1210373725.94akelts"]', '1210373727.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1203114365.52akelts"]["Objects"]["1210373727.44akelts"]', '1210612073.56akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210612073.56akelts"]', '1210612087.31akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210612087.31akelts"]', '1210612184.25akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210612184.25akelts"]', '1210612337.41akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210612337.41akelts"]', '1210612360.47akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210612360.47akelts"]', '1210613067.88akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210613067.88akelts"]', '1210614180.44akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210614180.44akelts"]', '1210614230.94akelts': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210614230.94akelts"]', '1210978646.84kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210978646.84kmuller"]', 
'1210978775.87kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210978775.87kmuller"]', '1210978826.82kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210978826.82kmuller"]', '1210978861.2kmuller': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1210978861.2kmuller"]', '1211929216.0WDIG1': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1211929216.0WDIG1"]', '1211929472.0WDIG': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1211929472.0WDIG"]', '1216249894.26aapatel': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1216249894.26aapatel"]', '1259623459.69caoconno': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1259623459.69caoconno"]', '1259623493.53caoconno': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1259623493.53caoconno"]', '1259623521.7caoconno': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1259623521.7caoconno"]', '1259623563.36caoconno': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1259623563.36caoconno"]', '1259623576.63caoconno': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1259623576.63caoconno"]', '1259623610.39caoconno': '["Objects"]["1196970035.53sdnaik"]["Objects"]["1259623610.39caoconno"]'}}
extraInfo = {'camPos': Point3(-317.436, 96.8385, 20.2355), 'camHpr': VBase3(146.894, -10.8979, 0), 'focalLength': 0.791999995708, 'skyState': 2, 'fog': 0} | 23,842.142857 | 166,452 | 0.666838 | 23,325 | 166,895 | 4.718671 | 0.082872 | 0.015936 | 0.014855 | 0.11746 | 0.66203 | 0.516295 | 0.478835 | 0.410547 | 0.374567 | 0.363565 | 0 | 0.300607 | 0.070955 | 166,895 | 7 | 166,453 | 23,842.142857 | 0.409234 | 0.001318 | 0 | 0 | 0 | 0 | 0.510647 | 0.244836 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
934678c3c01f03f5a0a5fc9b6fd2c6f69ded1c17 | 136 | py | Python | src/UQpy/reliability/__init__.py | SURGroup/UncertaintyQuantification | a94c8db47d07134ea2b3b0a3ca53ca818532c3e6 | [
"MIT"
] | null | null | null | src/UQpy/reliability/__init__.py | SURGroup/UncertaintyQuantification | a94c8db47d07134ea2b3b0a3ca53ca818532c3e6 | [
"MIT"
] | null | null | null | src/UQpy/reliability/__init__.py | SURGroup/UncertaintyQuantification | a94c8db47d07134ea2b3b0a3ca53ca818532c3e6 | [
"MIT"
] | null | null | null | from UQpy.reliability.SubsetSimulation import SubsetSimulation
from UQpy.reliability.taylor_series import *
from . import TaylorSeries
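This `__init__.py` follows a common re-export pattern: it pulls `SubsetSimulation` and the `taylor_series` star-exports up to the package level. A minimal sketch of how `from ... import *` interacts with `__all__`, simulated on a stand-in module so it runs without UQpy installed (the `FORM`/`SORM` names are illustrative assumptions, not UQpy's confirmed contents):

```python
import types

# Build a stand-in submodule resembling what a `taylor_series` module might export.
taylor_series = types.ModuleType("taylor_series")
taylor_series.FORM = type("FORM", (), {})
taylor_series.SORM = type("SORM", (), {})
taylor_series._private_helper = object()          # not listed in __all__
taylor_series.__all__ = ["FORM", "SORM"]

# `from pkg.taylor_series import *` copies exactly the names in __all__;
# we emulate that copy explicitly here.
exported = {name: getattr(taylor_series, name) for name in taylor_series.__all__}
print(sorted(exported))  # ['FORM', 'SORM'] — `_private_helper` is excluded
```

Defining `__all__` in the submodule keeps star-imports from leaking private helpers into the package namespace.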
| 27.2 | 62 | 0.860294 | 15 | 136 | 7.733333 | 0.533333 | 0.137931 | 0.327586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095588 | 136 | 4 | 63 | 34 | 0.943089 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
934745fda52dc20c4927f5177c133911e979dc44 | 167 | py | Python | mysite/mysite/modelmysite/author/models.py | lz1988/django-web2015 | 79bcc9fc83b487915da6230e0ab7d5c599a33a9d | [
"BSD-2-Clause"
] | null | null | null | mysite/mysite/modelmysite/author/models.py | lz1988/django-web2015 | 79bcc9fc83b487915da6230e0ab7d5c599a33a9d | [
"BSD-2-Clause"
] | null | null | null | mysite/mysite/modelmysite/author/models.py | lz1988/django-web2015 | 79bcc9fc83b487915da6230e0ab7d5c599a33a9d | [
"BSD-2-Clause"
] | null | null | null | #filename models.py
from django.db import models
class author(models.Model):
name = models.CharField(max_length=30)
address = models.CharField(max_length=30)
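The model above declares two 30-character text fields. Instantiating a Django model needs a configured settings module, so as a runnable sketch here is a plain-Python analogue of the same data shape (the `Author` dataclass and sample values are illustrative assumptions, not part of the original project):

```python
from dataclasses import dataclass

@dataclass
class Author:
    name: str      # analogue of models.CharField(max_length=30)
    address: str   # analogue of models.CharField(max_length=30)

# Construct and read back a record, mirroring author(name=..., address=...).
a = Author(name="Jane Doe", address="42 Main St")
print(a.name)     # Jane Doe
```

Note that Django enforces `max_length` at the form/database layer, not at Python attribute assignment, so the dataclass analogue carries the length limit only as a comment.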
| 23.857143 | 45 | 0.760479 | 24 | 167 | 5.208333 | 0.666667 | 0.24 | 0.288 | 0.384 | 0.416 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.137725 | 167 | 6 | 46 | 27.833333 | 0.840278 | 0.107784 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
936c3b70a6fca1149f5615adb2c8c08f13d4f5ae | 35 | py | Python | mdm_inventory/address/tests/factories/__init__.py | TeamWalls/mdm-backend-django | 4e23f9abc8531eb786d5e6cf958c9ffa8acd6b1d | [
"MIT"
] | null | null | null | mdm_inventory/address/tests/factories/__init__.py | TeamWalls/mdm-backend-django | 4e23f9abc8531eb786d5e6cf958c9ffa8acd6b1d | [
"MIT"
] | null | null | null | mdm_inventory/address/tests/factories/__init__.py | TeamWalls/mdm-backend-django | 4e23f9abc8531eb786d5e6cf958c9ffa8acd6b1d | [
"MIT"
] | null | null | null | from .address import AddressFactory | 35 | 35 | 0.885714 | 4 | 35 | 7.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 35 | 1 | 35 | 35 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4f035a3805e762ff5933f530ad54542c68d67376 | 52,500 | py | Python | if/py/gen-py/fbnet/command_runner_asyncio/CommandRunner/ttypes.py | vdonga/FCR | 59c4c27d6974f55730cd9f6d219214c090928c7c | [
"BSD-3-Clause"
] | null | null | null | if/py/gen-py/fbnet/command_runner_asyncio/CommandRunner/ttypes.py | vdonga/FCR | 59c4c27d6974f55730cd9f6d219214c090928c7c | [
"BSD-3-Clause"
] | null | null | null | if/py/gen-py/fbnet/command_runner_asyncio/CommandRunner/ttypes.py | vdonga/FCR | 59c4c27d6974f55730cd9f6d219214c090928c7c | [
"BSD-3-Clause"
] | null | null | null | #
# Autogenerated by Thrift
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
# @generated
#
from __future__ import absolute_import
import six
from thrift.util.Recursive import fix_spec
from thrift.Thrift import *
from thrift.protocol.TProtocol import TProtocolException
import fb303_asyncio.fb303.ttypes
import pprint
import warnings
from thrift import Thrift
from thrift.transport import TTransport
from thrift.protocol import TBinaryProtocol
from thrift.protocol import TCompactProtocol
from thrift.protocol import THeaderProtocol
fastproto = None
if not '__pypy__' in sys.builtin_module_names:
try:
from thrift.protocol import fastproto
except:
pass
all_structs = []
UTF8STRINGS = bool(0) or sys.version_info.major >= 3
__all__ = ['UTF8STRINGS', 'SessionType', 'FBNetDataException', 'UnsupportedDeviceException', 'SessionException', 'UnsupportedCommandException', 'InstanceOverloaded', 'SessionData', 'Device', 'CommandResult', 'Session']
class SessionType:
SSH = 1
SSH_NETCONF = 2
_VALUES_TO_NAMES = {
1: "SSH",
2: "SSH_NETCONF",
}
_NAMES_TO_VALUES = {
"SSH": 1,
"SSH_NETCONF": 2,
}
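Thrift-generated enums like `SessionType` carry two reverse lookup tables so code can translate between wire integers and readable names. A sketch of the typical usage, with the class re-declared locally so the example runs without the generated module or the thrift runtime:

```python
# Minimal re-declaration of the generated enum for illustration.
class SessionType:
    SSH = 1
    SSH_NETCONF = 2
    _VALUES_TO_NAMES = {1: "SSH", 2: "SSH_NETCONF"}
    _NAMES_TO_VALUES = {"SSH": 1, "SSH_NETCONF": 2}

# Decode an integer received off the wire into a readable name...
wire_value = 2
name = SessionType._VALUES_TO_NAMES[wire_value]
print(name)  # SSH_NETCONF

# ...and the two tables are mutual inverses, so encoding round-trips.
assert SessionType._NAMES_TO_VALUES[name] == wire_value
```

This is why generated Thrift code keeps both dicts even though they are redundant: lookups in either direction stay O(1).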
class FBNetDataException(TException):
"""
Attributes:
- message
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.message = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('FBNetDataException')
if self.message != None:
oprot.writeFieldBegin('message', TType.STRING, 1)
oprot.writeString(self.message.encode('utf-8')) if UTF8STRINGS and not isinstance(self.message, bytes) else oprot.writeString(self.message)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __str__(self):
return repr(self)
def __repr__(self):
L = []
padding = ' ' * 4
if self.message is not None:
value = pprint.pformat(self.message, indent=0)
value = padding.join(value.splitlines(True))
L.append(' message=%s' % (value))
if 'message' not in self.__dict__:
message = getattr(self, 'message', None)
if message:
L.append('message=%r' % message)
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
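Each generated exception ends with the same boilerplate: structural equality via `__dict__` comparison, `__ne__` as its negation, and an explicit restoration of identity hashing (defining `__eq__` would otherwise set `__hash__` to `None` on Python 3). A sketch of that pattern on a minimal stand-in class, runnable without the thrift runtime:

```python
class FakeException(Exception):
    """Stand-in illustrating the generated equality/hash boilerplate."""

    def __init__(self, message=None):
        self.message = message

    def __eq__(self, other):
        # Same type and identical attribute dicts => equal (field-by-field).
        if not isinstance(other, self.__class__):
            return False
        return self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

    # Python 3 sets __hash__ = None when __eq__ is defined; restore
    # identity-based hashing exactly as the generated code does.
    __hash__ = object.__hash__

print(FakeException("boom") == FakeException("boom"))  # True
print(FakeException("a") != FakeException("b"))        # True
```

One consequence of restoring `object.__hash__` is that two equal instances can hash differently, so these exceptions behave sanely as dict keys only under identity, not value, semantics.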
class UnsupportedDeviceException(TException):
"""
Attributes:
- message
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.message = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('UnsupportedDeviceException')
if self.message != None:
oprot.writeFieldBegin('message', TType.STRING, 1)
oprot.writeString(self.message.encode('utf-8')) if UTF8STRINGS and not isinstance(self.message, bytes) else oprot.writeString(self.message)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __str__(self):
return repr(self)
def __repr__(self):
L = []
padding = ' ' * 4
if self.message is not None:
value = pprint.pformat(self.message, indent=0)
value = padding.join(value.splitlines(True))
L.append(' message=%s' % (value))
if 'message' not in self.__dict__:
message = getattr(self, 'message', None)
if message:
L.append('message=%r' % message)
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class SessionException(TException):
"""
Attributes:
- message
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.message = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('SessionException')
if self.message != None:
oprot.writeFieldBegin('message', TType.STRING, 1)
oprot.writeString(self.message.encode('utf-8')) if UTF8STRINGS and not isinstance(self.message, bytes) else oprot.writeString(self.message)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __str__(self):
return repr(self)
def __repr__(self):
L = []
padding = ' ' * 4
if self.message is not None:
value = pprint.pformat(self.message, indent=0)
value = padding.join(value.splitlines(True))
L.append(' message=%s' % (value))
if 'message' not in self.__dict__:
message = getattr(self, 'message', None)
if message:
L.append('message=%r' % message)
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class UnsupportedCommandException(TException):
"""
Attributes:
- message
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.message = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('UnsupportedCommandException')
if self.message != None:
oprot.writeFieldBegin('message', TType.STRING, 1)
oprot.writeString(self.message.encode('utf-8')) if UTF8STRINGS and not isinstance(self.message, bytes) else oprot.writeString(self.message)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __str__(self):
return repr(self)
def __repr__(self):
L = []
padding = ' ' * 4
if self.message is not None:
value = pprint.pformat(self.message, indent=0)
value = padding.join(value.splitlines(True))
L.append(' message=%s' % (value))
if 'message' not in self.__dict__:
message = getattr(self, 'message', None)
if message:
L.append('message=%r' % message)
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class InstanceOverloaded(TException):
"""
Attributes:
- message
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.message = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('InstanceOverloaded')
if self.message != None:
oprot.writeFieldBegin('message', TType.STRING, 1)
oprot.writeString(self.message.encode('utf-8')) if UTF8STRINGS and not isinstance(self.message, bytes) else oprot.writeString(self.message)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __str__(self):
return repr(self)
def __repr__(self):
L = []
padding = ' ' * 4
if self.message is not None:
value = pprint.pformat(self.message, indent=0)
value = padding.join(value.splitlines(True))
L.append(' message=%s' % (value))
if 'message' not in self.__dict__:
message = getattr(self, 'message', None)
if message:
L.append('message=%r' % message)
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class SessionData:
"""
Attributes:
- subsystem
- exec_command
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.subsystem = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.exec_command = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('SessionData')
if self.subsystem != None:
oprot.writeFieldBegin('subsystem', TType.STRING, 1)
oprot.writeString(self.subsystem.encode('utf-8')) if UTF8STRINGS and not isinstance(self.subsystem, bytes) else oprot.writeString(self.subsystem)
oprot.writeFieldEnd()
if self.exec_command != None:
oprot.writeFieldBegin('exec_command', TType.STRING, 2)
oprot.writeString(self.exec_command.encode('utf-8')) if UTF8STRINGS and not isinstance(self.exec_command, bytes) else oprot.writeString(self.exec_command)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __repr__(self):
L = []
padding = ' ' * 4
if self.subsystem is not None:
value = pprint.pformat(self.subsystem, indent=0)
value = padding.join(value.splitlines(True))
L.append(' subsystem=%s' % (value))
if self.exec_command is not None:
value = pprint.pformat(self.exec_command, indent=0)
value = padding.join(value.splitlines(True))
L.append(' exec_command=%s' % (value))
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class Device:
"""
Attributes:
- hostname
- username
- password
- console
- mgmt_ip
- command_prompts
- ip_address
- session_type
- session_data
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.hostname = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 10:
if ftype == TType.STRING:
self.username = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 11:
if ftype == TType.STRING:
self.password = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 13:
if ftype == TType.STRING:
self.console = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 14:
if ftype == TType.BOOL:
self.mgmt_ip = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 15:
if ftype == TType.MAP:
self.command_prompts = {}
(_ktype1, _vtype2, _size0 ) = iprot.readMapBegin()
if _size0 >= 0:
for _i4 in six.moves.range(_size0):
_key5 = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
_val6 = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
self.command_prompts[_key5] = _val6
else:
while iprot.peekMap():
_key7 = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
_val8 = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
self.command_prompts[_key7] = _val8
iprot.readMapEnd()
else:
iprot.skip(ftype)
elif fid == 16:
if ftype == TType.STRING:
self.ip_address = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 17:
if ftype == TType.I32:
self.session_type = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 18:
if ftype == TType.STRUCT:
self.session_data = SessionData()
self.session_data.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
if self.hostname == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'hostname' was not found in serialized data! Struct: Device")
if self.username == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'username' was not found in serialized data! Struct: Device")
if self.password == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'password' was not found in serialized data! Struct: Device")
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('Device')
if self.hostname != None:
oprot.writeFieldBegin('hostname', TType.STRING, 1)
oprot.writeString(self.hostname.encode('utf-8')) if UTF8STRINGS and not isinstance(self.hostname, bytes) else oprot.writeString(self.hostname)
oprot.writeFieldEnd()
if self.username != None:
oprot.writeFieldBegin('username', TType.STRING, 10)
oprot.writeString(self.username.encode('utf-8')) if UTF8STRINGS and not isinstance(self.username, bytes) else oprot.writeString(self.username)
oprot.writeFieldEnd()
if self.password != None:
oprot.writeFieldBegin('password', TType.STRING, 11)
oprot.writeString(self.password.encode('utf-8')) if UTF8STRINGS and not isinstance(self.password, bytes) else oprot.writeString(self.password)
oprot.writeFieldEnd()
if self.console != None and self.console != self.thrift_spec[13][4]:
oprot.writeFieldBegin('console', TType.STRING, 13)
oprot.writeString(self.console.encode('utf-8')) if UTF8STRINGS and not isinstance(self.console, bytes) else oprot.writeString(self.console)
oprot.writeFieldEnd()
if self.mgmt_ip != None and self.mgmt_ip != self.thrift_spec[14][4]:
oprot.writeFieldBegin('mgmt_ip', TType.BOOL, 14)
oprot.writeBool(self.mgmt_ip)
oprot.writeFieldEnd()
if self.command_prompts != None:
oprot.writeFieldBegin('command_prompts', TType.MAP, 15)
oprot.writeMapBegin(TType.STRING, TType.STRING, len(self.command_prompts))
for kiter9,viter10 in self.command_prompts.items():
oprot.writeString(kiter9.encode('utf-8')) if UTF8STRINGS and not isinstance(kiter9, bytes) else oprot.writeString(kiter9)
oprot.writeString(viter10.encode('utf-8')) if UTF8STRINGS and not isinstance(viter10, bytes) else oprot.writeString(viter10)
oprot.writeMapEnd()
oprot.writeFieldEnd()
if self.ip_address != None:
oprot.writeFieldBegin('ip_address', TType.STRING, 16)
oprot.writeString(self.ip_address.encode('utf-8')) if UTF8STRINGS and not isinstance(self.ip_address, bytes) else oprot.writeString(self.ip_address)
oprot.writeFieldEnd()
if self.session_type != None:
oprot.writeFieldBegin('session_type', TType.I32, 17)
oprot.writeI32(self.session_type)
oprot.writeFieldEnd()
if self.session_data != None:
oprot.writeFieldBegin('session_data', TType.STRUCT, 18)
self.session_data.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __repr__(self):
L = []
padding = ' ' * 4
if self.hostname is not None:
value = pprint.pformat(self.hostname, indent=0)
value = padding.join(value.splitlines(True))
L.append(' hostname=%s' % (value))
if self.username is not None:
value = pprint.pformat(self.username, indent=0)
value = padding.join(value.splitlines(True))
L.append(' username=%s' % (value))
if self.password is not None:
value = pprint.pformat(self.password, indent=0)
value = padding.join(value.splitlines(True))
L.append(' password=%s' % (value))
if self.console is not None:
value = pprint.pformat(self.console, indent=0)
value = padding.join(value.splitlines(True))
L.append(' console=%s' % (value))
if self.mgmt_ip is not None:
value = pprint.pformat(self.mgmt_ip, indent=0)
value = padding.join(value.splitlines(True))
L.append(' mgmt_ip=%s' % (value))
if self.command_prompts is not None:
value = pprint.pformat(self.command_prompts, indent=0)
value = padding.join(value.splitlines(True))
L.append(' command_prompts=%s' % (value))
if self.ip_address is not None:
value = pprint.pformat(self.ip_address, indent=0)
value = padding.join(value.splitlines(True))
L.append(' ip_address=%s' % (value))
if self.session_type is not None:
value = pprint.pformat(self.session_type, indent=0)
value = padding.join(value.splitlines(True))
L.append(' session_type=%s' % (value))
if self.session_data is not None:
value = pprint.pformat(self.session_data, indent=0)
value = padding.join(value.splitlines(True))
L.append(' session_data=%s' % (value))
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class CommandResult:
"""
Attributes:
- output
- status
- command
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.output = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.status = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.command = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
if self.output == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'output' was not found in serialized data! Struct: CommandResult")
if self.status == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'status' was not found in serialized data! Struct: CommandResult")
if self.command == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'command' was not found in serialized data! Struct: CommandResult")
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('CommandResult')
if self.output != None:
oprot.writeFieldBegin('output', TType.STRING, 1)
oprot.writeString(self.output.encode('utf-8')) if UTF8STRINGS and not isinstance(self.output, bytes) else oprot.writeString(self.output)
oprot.writeFieldEnd()
if self.status != None:
oprot.writeFieldBegin('status', TType.STRING, 2)
oprot.writeString(self.status.encode('utf-8')) if UTF8STRINGS and not isinstance(self.status, bytes) else oprot.writeString(self.status)
oprot.writeFieldEnd()
if self.command != None:
oprot.writeFieldBegin('command', TType.STRING, 3)
oprot.writeString(self.command.encode('utf-8')) if UTF8STRINGS and not isinstance(self.command, bytes) else oprot.writeString(self.command)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __repr__(self):
L = []
padding = ' ' * 4
if self.output is not None:
value = pprint.pformat(self.output, indent=0)
value = padding.join(value.splitlines(True))
L.append(' output=%s' % (value))
if self.status is not None:
value = pprint.pformat(self.status, indent=0)
value = padding.join(value.splitlines(True))
L.append(' status=%s' % (value))
if self.command is not None:
value = pprint.pformat(self.command, indent=0)
value = padding.join(value.splitlines(True))
L.append(' command=%s' % (value))
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
class Session:
"""
Attributes:
- id
- name
- hostname
"""
thrift_spec = None
thrift_field_annotations = None
thrift_struct_annotations = None
__init__ = None
@staticmethod
def isUnion():
return False
def read(self, iprot):
if (isinstance(iprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0)
self.checkRequired()
return
if (isinstance(iprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(iprot, THeaderProtocol.THeaderProtocolAccelerate) and iprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastproto is not None:
fastproto.decode(self, iprot.trans, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2)
self.checkRequired()
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I64:
self.id = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.name = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.hostname = iprot.readString().decode('utf-8') if UTF8STRINGS else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
self.checkRequired()
def checkRequired(self):
if self.id == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'id' was not found in serialized data! Struct: Session")
if self.name == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'name' was not found in serialized data! Struct: Session")
if self.hostname == None:
raise TProtocolException(TProtocolException.MISSING_REQUIRED_FIELD, "Required field 'hostname' was not found in serialized data! Struct: Session")
return
def write(self, oprot):
if (isinstance(oprot, TBinaryProtocol.TBinaryProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_BINARY_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=0))
return
if (isinstance(oprot, TCompactProtocol.TCompactProtocolAccelerated) or (isinstance(oprot, THeaderProtocol.THeaderProtocolAccelerate) and oprot.get_protocol_id() == THeaderProtocol.THeaderProtocol.T_COMPACT_PROTOCOL)) and self.thrift_spec is not None and fastproto is not None:
oprot.trans.write(fastproto.encode(self, [self.__class__, self.thrift_spec, False], utf8strings=UTF8STRINGS, protoid=2))
return
oprot.writeStructBegin('Session')
if self.id != None:
oprot.writeFieldBegin('id', TType.I64, 1)
oprot.writeI64(self.id)
oprot.writeFieldEnd()
if self.name != None:
oprot.writeFieldBegin('name', TType.STRING, 2)
oprot.writeString(self.name.encode('utf-8')) if UTF8STRINGS and not isinstance(self.name, bytes) else oprot.writeString(self.name)
oprot.writeFieldEnd()
if self.hostname != None:
oprot.writeFieldBegin('hostname', TType.STRING, 3)
oprot.writeString(self.hostname.encode('utf-8')) if UTF8STRINGS and not isinstance(self.hostname, bytes) else oprot.writeString(self.hostname)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def __repr__(self):
L = []
padding = ' ' * 4
if self.id is not None:
value = pprint.pformat(self.id, indent=0)
value = padding.join(value.splitlines(True))
L.append(' id=%s' % (value))
if self.name is not None:
value = pprint.pformat(self.name, indent=0)
value = padding.join(value.splitlines(True))
L.append(' name=%s' % (value))
if self.hostname is not None:
value = pprint.pformat(self.hostname, indent=0)
value = padding.join(value.splitlines(True))
L.append(' hostname=%s' % (value))
return "%s(%s)" % (self.__class__.__name__, "\n" + ",\n".join(L) if L else '')
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# Override the __hash__ function for Python3 - t10434117
if not six.PY2:
__hash__ = object.__hash__
all_structs.append(FBNetDataException)
FBNetDataException.thrift_spec = (
None, # 0
(1, TType.STRING, 'message', True, None, 2, ), # 1
)
FBNetDataException.thrift_struct_annotations = {
}
FBNetDataException.thrift_field_annotations = {
}
def FBNetDataException__init__(self, message=None,):
self.message = message
FBNetDataException.__init__ = FBNetDataException__init__
def FBNetDataException__setstate__(self, state):
state.setdefault('message', None)
self.__dict__ = state
FBNetDataException.__getstate__ = lambda self: self.__dict__.copy()
FBNetDataException.__setstate__ = FBNetDataException__setstate__
all_structs.append(UnsupportedDeviceException)
UnsupportedDeviceException.thrift_spec = (
None, # 0
(1, TType.STRING, 'message', True, None, 2, ), # 1
)
UnsupportedDeviceException.thrift_struct_annotations = {
}
UnsupportedDeviceException.thrift_field_annotations = {
}
def UnsupportedDeviceException__init__(self, message=None,):
self.message = message
UnsupportedDeviceException.__init__ = UnsupportedDeviceException__init__
def UnsupportedDeviceException__setstate__(self, state):
state.setdefault('message', None)
self.__dict__ = state
UnsupportedDeviceException.__getstate__ = lambda self: self.__dict__.copy()
UnsupportedDeviceException.__setstate__ = UnsupportedDeviceException__setstate__
all_structs.append(SessionException)
SessionException.thrift_spec = (
None, # 0
(1, TType.STRING, 'message', True, None, 2, ), # 1
)
SessionException.thrift_struct_annotations = {
}
SessionException.thrift_field_annotations = {
}
def SessionException__init__(self, message=None,):
self.message = message
SessionException.__init__ = SessionException__init__
def SessionException__setstate__(self, state):
state.setdefault('message', None)
self.__dict__ = state
SessionException.__getstate__ = lambda self: self.__dict__.copy()
SessionException.__setstate__ = SessionException__setstate__
all_structs.append(UnsupportedCommandException)
UnsupportedCommandException.thrift_spec = (
None, # 0
(1, TType.STRING, 'message', True, None, 2, ), # 1
)
UnsupportedCommandException.thrift_struct_annotations = {
}
UnsupportedCommandException.thrift_field_annotations = {
}
def UnsupportedCommandException__init__(self, message=None,):
self.message = message
UnsupportedCommandException.__init__ = UnsupportedCommandException__init__
def UnsupportedCommandException__setstate__(self, state):
state.setdefault('message', None)
self.__dict__ = state
UnsupportedCommandException.__getstate__ = lambda self: self.__dict__.copy()
UnsupportedCommandException.__setstate__ = UnsupportedCommandException__setstate__
all_structs.append(InstanceOverloaded)
InstanceOverloaded.thrift_spec = (
None, # 0
(1, TType.STRING, 'message', True, None, 2, ), # 1
)
InstanceOverloaded.thrift_struct_annotations = {
}
InstanceOverloaded.thrift_field_annotations = {
}
def InstanceOverloaded__init__(self, message=None,):
self.message = message
InstanceOverloaded.__init__ = InstanceOverloaded__init__
def InstanceOverloaded__setstate__(self, state):
state.setdefault('message', None)
self.__dict__ = state
InstanceOverloaded.__getstate__ = lambda self: self.__dict__.copy()
InstanceOverloaded.__setstate__ = InstanceOverloaded__setstate__
all_structs.append(SessionData)
SessionData.thrift_spec = (
None, # 0
(1, TType.STRING, 'subsystem', True, None, 1, ), # 1
(2, TType.STRING, 'exec_command', True, None, 1, ), # 2
)
SessionData.thrift_struct_annotations = {
}
SessionData.thrift_field_annotations = {
}
def SessionData__init__(self, subsystem=None, exec_command=None,):
self.subsystem = subsystem
self.exec_command = exec_command
SessionData.__init__ = SessionData__init__
def SessionData__setstate__(self, state):
state.setdefault('subsystem', None)
state.setdefault('exec_command', None)
self.__dict__ = state
SessionData.__getstate__ = lambda self: self.__dict__.copy()
SessionData.__setstate__ = SessionData__setstate__
all_structs.append(Device)
Device.thrift_spec = (
None, # 0
(1, TType.STRING, 'hostname', True, None, 0, ), # 1
None, # 2
None, # 3
None, # 4
None, # 5
None, # 6
None, # 7
None, # 8
None, # 9
(10, TType.STRING, 'username', True, None, 0, ), # 10
(11, TType.STRING, 'password', True, None, 0, ), # 11
None, # 12
(13, TType.STRING, 'console', True, "", 1, ), # 13
(14, TType.BOOL, 'mgmt_ip', None, False, 1, ), # 14
(15, TType.MAP, 'command_prompts', (TType.STRING,True,TType.STRING,True), None, 1, ), # 15
(16, TType.STRING, 'ip_address', True, None, 1, ), # 16
(17, TType.I32, 'session_type', SessionType, None, 1, ), # 17
(18, TType.STRUCT, 'session_data', [SessionData, SessionData.thrift_spec, False], None, 1, ), # 18
)
Device.thrift_struct_annotations = {
}
Device.thrift_field_annotations = {
}
def Device__init__(self, hostname=None, username=None, password=None, console=Device.thrift_spec[13][4], mgmt_ip=Device.thrift_spec[14][4], command_prompts=None, ip_address=None, session_type=None, session_data=None,):
self.hostname = hostname
self.username = username
self.password = password
self.console = console
self.mgmt_ip = mgmt_ip
self.command_prompts = command_prompts
self.ip_address = ip_address
self.session_type = session_type
self.session_data = session_data
Device.__init__ = Device__init__
def Device__setstate__(self, state):
state.setdefault('hostname', None)
state.setdefault('username', None)
state.setdefault('password', None)
state.setdefault('console', "")
state.setdefault('mgmt_ip', False)
state.setdefault('command_prompts', None)
state.setdefault('ip_address', None)
state.setdefault('session_type', None)
state.setdefault('session_data', None)
self.__dict__ = state
Device.__getstate__ = lambda self: self.__dict__.copy()
Device.__setstate__ = Device__setstate__
all_structs.append(CommandResult)
CommandResult.thrift_spec = (
None, # 0
(1, TType.STRING, 'output', True, None, 0, ), # 1
(2, TType.STRING, 'status', True, None, 0, ), # 2
(3, TType.STRING, 'command', True, None, 0, ), # 3
)
CommandResult.thrift_struct_annotations = {
}
CommandResult.thrift_field_annotations = {
}
def CommandResult__init__(self, output=None, status=None, command=None,):
self.output = output
self.status = status
self.command = command
CommandResult.__init__ = CommandResult__init__
def CommandResult__setstate__(self, state):
state.setdefault('output', None)
state.setdefault('status', None)
state.setdefault('command', None)
self.__dict__ = state
CommandResult.__getstate__ = lambda self: self.__dict__.copy()
CommandResult.__setstate__ = CommandResult__setstate__
all_structs.append(Session)
Session.thrift_spec = (
None, # 0
(1, TType.I64, 'id', None, None, 0, ), # 1
(2, TType.STRING, 'name', True, None, 0, ), # 2
(3, TType.STRING, 'hostname', True, None, 0, ), # 3
)
Session.thrift_struct_annotations = {
}
Session.thrift_field_annotations = {
}
def Session__init__(self, id=None, name=None, hostname=None,):
self.id = id
self.name = name
self.hostname = hostname
Session.__init__ = Session__init__
def Session__setstate__(self, state):
state.setdefault('id', None)
state.setdefault('name', None)
state.setdefault('hostname', None)
self.__dict__ = state
Session.__getstate__ = lambda self: self.__dict__.copy()
Session.__setstate__ = Session__setstate__
fix_spec(all_structs)
del all_structs
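A standalone illustration (with a hypothetical class name) of the equality and hashing pattern repeated in the generated structs above: `__eq__` compares instances by `__dict__`, and because defining `__eq__` suppresses the inherited hash in Python 3, the generated code restores identity hashing with `__hash__ = object.__hash__`.

```python
class Struct:
    """Minimal sketch of the generated-struct equality/hash pattern."""

    def __init__(self, message=None):
        self.message = message

    def __eq__(self, other):
        # Value equality: same class and same attribute dictionary.
        if not isinstance(other, self.__class__):
            return False
        return self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

    # Defining __eq__ sets __hash__ to None in Python 3; restore
    # identity-based hashing so instances stay usable in sets/dicts.
    __hash__ = object.__hash__


a, b = Struct('x'), Struct('x')
assert a == b and a is not b      # equal by value, distinct objects
assert isinstance(hash(a), int)   # still hashable despite custom __eq__
```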
# desafio109/teste109.py (marcelocmedeiros/RevisaoPython, MIT)
import moeda109
p = float(input('Digite o preço: R$ '))
# passing True (3rd parameter) asks for every price to come back formatted
print(f'A metade de {moeda109.moeda(p)} é {moeda109.metade(p, True)}')
print(f'O dobro de {moeda109.moeda(p)} é {moeda109.dobro(p, True)}')
# parameters: p = price, taxa = 10 (or any rate) and formatado = True to get the price formatted
print(f'Aumentando 10% de {moeda109.moeda(p)} é {moeda109.aumentar(p, 10, True)}')
print(f'Diminuindo 20% de {moeda109.moeda(p)} é {moeda109.diminuir(p, 20, True)}')
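The `moeda109` module itself is not shown in this file. A hypothetical sketch of what its helpers might look like, inferred only from how they are called above (names and formatting are assumptions):

```python
# Hypothetical sketch of the moeda109 helpers used above -- the real
# module is not included here, so signatures and behavior are assumed.
def moeda(preco, simbolo='R$'):
    """Format a price as Brazilian-style currency text (comma decimal)."""
    return f'{simbolo}{preco:.2f}'.replace('.', ',')

def metade(preco, formatado=False):
    res = preco / 2
    return moeda(res) if formatado else res

def dobro(preco, formatado=False):
    res = preco * 2
    return moeda(res) if formatado else res

def aumentar(preco, taxa=0, formatado=False):
    res = preco + preco * taxa / 100
    return moeda(res) if formatado else res

def diminuir(preco, taxa=0, formatado=False):
    res = preco - preco * taxa / 100
    return moeda(res) if formatado else res
```

With these definitions, `metade(10, True)` returns the formatted string while `metade(10)` returns the bare number, matching the `formatado` flag described in the comments above.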
# --- modelzoo/migrations/0025_auto_20201015_1628.py (SuperElastix/ElastixModelZooWebsite, Apache-2.0) ---
# Generated by Django 3.0.3 on 2020-10-15 14:28
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('modelzoo', '0024_auto_20201014_1425'),
]
operations = [
migrations.RemoveField(
model_name='model',
name='parameter_file1',
),
migrations.RemoveField(
model_name='model',
name='parameter_file10',
),
migrations.RemoveField(
model_name='model',
name='parameter_file11',
),
migrations.RemoveField(
model_name='model',
name='parameter_file12',
),
migrations.RemoveField(
model_name='model',
name='parameter_file13',
),
migrations.RemoveField(
model_name='model',
name='parameter_file14',
),
migrations.RemoveField(
model_name='model',
name='parameter_file15',
),
migrations.RemoveField(
model_name='model',
name='parameter_file16',
),
migrations.RemoveField(
model_name='model',
name='parameter_file17',
),
migrations.RemoveField(
model_name='model',
name='parameter_file18',
),
migrations.RemoveField(
model_name='model',
name='parameter_file19',
),
migrations.RemoveField(
model_name='model',
name='parameter_file2',
),
migrations.RemoveField(
model_name='model',
name='parameter_file20',
),
migrations.RemoveField(
model_name='model',
name='parameter_file21',
),
migrations.RemoveField(
model_name='model',
name='parameter_file22',
),
migrations.RemoveField(
model_name='model',
name='parameter_file23',
),
migrations.RemoveField(
model_name='model',
name='parameter_file24',
),
migrations.RemoveField(
model_name='model',
name='parameter_file25',
),
migrations.RemoveField(
model_name='model',
name='parameter_file26',
),
migrations.RemoveField(
model_name='model',
name='parameter_file27',
),
migrations.RemoveField(
model_name='model',
name='parameter_file28',
),
migrations.RemoveField(
model_name='model',
name='parameter_file29',
),
migrations.RemoveField(
model_name='model',
name='parameter_file3',
),
migrations.RemoveField(
model_name='model',
name='parameter_file30',
),
migrations.RemoveField(
model_name='model',
name='parameter_file31',
),
migrations.RemoveField(
model_name='model',
name='parameter_file32',
),
migrations.RemoveField(
model_name='model',
name='parameter_file33',
),
migrations.RemoveField(
model_name='model',
name='parameter_file34',
),
migrations.RemoveField(
model_name='model',
name='parameter_file4',
),
migrations.RemoveField(
model_name='model',
name='parameter_file5',
),
migrations.RemoveField(
model_name='model',
name='parameter_file6',
),
migrations.RemoveField(
model_name='model',
name='parameter_file7',
),
migrations.RemoveField(
model_name='model',
name='parameter_file8',
),
migrations.RemoveField(
model_name='model',
name='parameter_file9',
),
migrations.AlterField(
model_name='model',
name='content',
field=models.CharField(blank=True, choices=[('Head', 'Head'), ('Brain', 'Brain'), ('Neck', 'Neck'), ('Carotid', 'Carotid'), ('Chest', 'Chest'), ('Lung', 'Lung'), ('Cardiac', 'Cardiac'), ('Abdomen', 'Abdomen'), ('Liver', 'Liver'), ('Cervix', 'Cervix'), ('Prostate', 'Prostate'), ('Pelvis', 'Pelvis'), ('Knee', 'Knee'), ('Cervical', 'Cervix'), ('Pelvic', 'Pelvis')], default='', max_length=15),
),
migrations.AlterField(
model_name='model',
name='modality',
field=models.CharField(blank=True, choices=[('CT', 'CT'), ('Ultrasound', 'Ultrasound'), ('MRI', 'MRI'), ('PET', 'PET'), ('X-Ray', 'X-Ray'), ('MR', 'MRI'), ('US', 'Ultrasound')], default='', max_length=15),
),
]
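The 34 near-identical `RemoveField` operations above differ only in the field name, so the list could have been generated instead of written out by hand (auto-generated migrations are normally left as `makemigrations` emits them; this is just a compaction sketch, with Django itself not required to build the name list):

```python
# Names of the 34 removed fields, parameter_file1 .. parameter_file34.
removed_fields = [f'parameter_file{i}' for i in range(1, 35)]

# In the actual migration each name would feed one operation, e.g.:
# operations = [
#     migrations.RemoveField(model_name='model', name=name)
#     for name in removed_fields
# ]
```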
# --- Plot_modules/plottingSHIPcal.py (mihaipx/SHIPcal, MIT) ---
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Sat May 19 14:53:58 2018
@author: miguel
"""
from matplotlib import pyplot as plt
from matplotlib.sankey import Sankey
import numpy as np
import pandas as pd
from iapws import IAPWS97
from General_modules.func_General import bar_MPa,thermalOil,moltenSalt
import io
import base64
import os
def SankeyPlot(sender,origin,lang,Production_max,Production_lim,Perd_term_anual,DNI_anual_irradiation,Area,num_loops,imageQlty,plotPath,**kwargs):
#Proportions for Sankey
raw_potential=DNI_anual_irradiation*Area*num_loops/1000 #MWh
Utilization=(Production_max-Production_lim)/1000 # in MWh
Utilization_ratio=100*(Utilization/raw_potential) #Utilization ratio
Thermal_loss=Perd_term_anual/1000 #Thermal loss over lim production
Thermal_loss_ratio=100*(Thermal_loss/raw_potential) #Thermal loss ratio over raw potential
Global_eff=100*(Production_lim/1000)/raw_potential
Optic_loss_ratio=Global_eff-Thermal_loss_ratio-Utilization_ratio
Production=Production_lim/1000 # in MWh
sankeyDict={'Production':Production,'raw_potential':raw_potential,'Thermal_loss':Thermal_loss,'Utilization':Utilization}
fig = plt.figure(figsize=(8, 8))
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
ax = fig.add_subplot(1, 1, 1, xticks=[], yticks=[],title="Diagrama Sankey producción solar")
if lang=="eng":
ax = fig.add_subplot(1, 1, 1, xticks=[], yticks=[],title="Solar production - Sankey diagram")
sankey = Sankey(ax=ax, unit=None)
if lang=="spa":
sankey.add(flows=[raw_potential/Production, -raw_potential/Production], # DNI_anual_irradiation/(Production*2) is what should go here; raw*1.1 was used only to make the diagram display better
fc='#F2FA52',
pathlengths = [0.95,0.375],
patchlabel='\n\n\n'+(str(int(DNI_anual_irradiation))+' kWh/m2 - Radiación solar en el emplazamiento'),
trunklength=2,
labels=['',' \n\n'+(str(int(raw_potential))+' MWh \n Radiación Solar*Area de colectores ('+str(int(Area*num_loops))+' m2)')],
label='Radiación solar',
orientations=[0, 0],
rotation=-90)
if lang=="eng":
sankey.add(flows=[raw_potential/Production, -raw_potential/Production], # DNI_anual_irradiation/(Production*2) is what should go here; raw*1.1 was used only to make the diagram display better
fc='#F2FA52',
pathlengths = [0.95,0.375],
patchlabel='\n\n\n'+(str(int(DNI_anual_irradiation))+' kWh/m2 - Solar radiation at the location'),
trunklength=2,
labels=['',' \n\n'+(str(int(raw_potential))+' MWh \n Solar Radiation*Area of collectors ('+str(int(Area*num_loops))+' m2)')],
label='Solar radiation',
orientations=[0, 0],
rotation=-90)
if lang=="spa":
sankey.add(flows=[raw_potential/Production,-(raw_potential-Production-Thermal_loss-Utilization)/Production,-Utilization/Production,-Thermal_loss/Production, -Production/Production],
fc='#FA9E52',
pathlengths = [3,0.6,0.6,0.3,0.3],
label='Instalación solar',
labels=['', (''+str(round(raw_potential-Utilization-Production-Thermal_loss,1))+' MWh - No puede concentrarse'),(' '+str(round(Utilization,1))+' MWh - No puede utilizarse por la industria \n (no hay consumo cuando se produce)'),('\n '+str(round(Thermal_loss,1))+' MWh - Pérdidas térmicas'), ' '+(str(round(Production,1))+' MWh - Producción neta')],
orientations=[0, 1,1,1, 0],rotation=-90, prior=0, connect=(1, 0))
if lang=="eng":
sankey.add(flows=[raw_potential/Production,-(raw_potential-Production-Thermal_loss-Utilization)/Production,-Utilization/Production,-Thermal_loss/Production, -Production/Production],
fc='#FA9E52',
pathlengths = [3,0.6,0.6,0.3,0.3],
label='Solar plant',
labels=['', (''+str(round(raw_potential-Utilization-Production-Thermal_loss,1))+' MWh - Spillage'),(' '+str(round(Utilization,1))+' MWh - Industry cannot use the energy \n (There is no demand when it is produced)'),('\n '+str(round(Thermal_loss,1))+' MWh - Thermal losses'), ' '+(str(round(Production,1))+' MWh - Net production')],
orientations=[0, 1,1,1, 0],rotation=-90, prior=0, connect=(1, 0))
diagrams = sankey.finish()
plt.legend(loc='upper left')
plt.tight_layout()
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64,sankeyDict
if origin==-1:
fig.savefig(str(plotPath)+'Sankey.png', format='png', dpi=imageQlty)
return 0,sankeyDict
if origin==0:
return 0,sankeyDict
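Every plotting function in this module serializes its figure through the same in-memory buffer and base64 step. Stripped to its core (stdlib only, with a plain bytes payload standing in for the PNG that `plt.savefig` would produce):

```python
import io
import base64

def payload_to_base64(payload: bytes) -> str:
    """Mirror of the buffer-encode step used by the plot functions above:
    write into an in-memory buffer, base64-encode, and strip newlines."""
    f = io.BytesIO()
    f.write(payload)  # in the real code: plt.savefig(f, format="png")
    encoded = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
    f.close()
    return encoded

image_base64 = payload_to_base64(b'\x89PNG fake image bytes')
```

Using `io.BytesIO` avoids touching the filesystem, which is why the web-origin branches (`origin==-2`/`-3`) return the string directly instead of calling `fig.savefig` with a path.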
def mollierPlotST(sender,origin,lang,type_integration,in_s,out_s,T_in_flag,T_in_C,T_in_C_AR,T_out_C,outProcess_s,T_out_process_C,P_op_bar,x_design,plotPath,imageQlty,**kwargs):
P_op_Mpa=P_op_bar/10
sat_liq=IAPWS97(P=P_op_Mpa, x=0)
sat_vap=IAPWS97(P=P_op_Mpa, x=1)
mollier=pd.read_csv(os.path.dirname(__file__)+'/mollierWater.csv',sep=',',encoding = "ISO-8859-1",header=None)
processEntropy=[]
processEntropy.append(in_s)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
processEntropy.append(sat_liq.s)
processEntropy.append(out_s)
processTemperature=[]
if T_in_flag==1:
processTemperature.append(T_in_C)
else:
processTemperature.append(np.average(T_in_C_AR))
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
processTemperature.append(sat_liq.T-273)
processTemperature.append(T_out_C)
if type_integration=="SL_S_FWS" or type_integration=="SL_S_FW":
processEntropy2=[]
processTemperature2=[]
processEntropy2.append(out_s)
processEntropy2.append(sat_liq.s)
processEntropy2.append(sat_vap.s)
processTemperature2.append(T_out_C)
processTemperature2.append(sat_vap.T-273)
processTemperature2.append(sat_vap.T-273)
if type_integration=="SL_L_RF":
processEntropy2=[]
processTemperature2=[]
processEntropy2.append(out_s)
processEntropy2.append(outProcess_s)
processTemperature2.append(T_out_C)
processTemperature2.append(T_out_process_C)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
processEntropy2=[]
processTemperature2=[]
processEntropy2.append(out_s)
processEntropy2.append(outProcess_s)
processTemperature2.append(T_out_C)
processTemperature2.append(T_out_C)
i=0
s=0.1
s_max=8
s_step=0.1
P_isobar=P_op_bar #bar
isobar=pd.DataFrame(np.zeros((int((s_max-s)/s_step),3)), columns=["T","h","s"])
while (s<s_max):
stateIsobar=IAPWS97(P=bar_MPa(P_isobar), s=s)
isobar["T"][i]=stateIsobar.T-273
isobar["h"][i]=stateIsobar.h
isobar["s"][i]=s
s=s+s_step
i=i+1
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
plt.text(7.5, 350, str(int(P_isobar))+"bar", size=10, color='k', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="spa":
plt.title("Diagrama T-s en una fila de colectores")
if lang=="eng":
plt.title("T-s diagram in one array of solar collectors")
plt.plot(processEntropy,processTemperature, color='r',lw=3,markersize=50,zorder=10 )
if type_integration=="SL_S_FWS" or type_integration=="SL_S_FW":
plt.plot(processEntropy2,processTemperature2, color='m',lw=2,markersize=50,zorder=10 )
if type_integration=="SL_L_RF":
plt.plot(processEntropy2,processTemperature2, color='m',lw=2,markersize=50,zorder=10 )
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
plt.plot(processEntropy2,processTemperature2, color='m',lw=2,markersize=50,zorder=10 )
plt.plot(mollier[1],mollier[0],':',lw=2)
plt.plot(isobar["s"],isobar["T"])
if lang=="spa":
if T_in_flag==1:
plt.text(processEntropy[0]-1.5, processTemperature[0], "Entrada "+str(int(T_in_C))+"ºC", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
else:
plt.text(processEntropy[0]-1.5, processTemperature[0], "Entrada "+str(int(np.average(T_in_C_AR)))+"ºC", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
if T_in_flag==1:
plt.text(processEntropy[0]-1.5, processTemperature[0], "Input "+str(int(T_in_C))+"ºC", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
else:
plt.text(processEntropy[0]-1.5, processTemperature[0], "Input "+str(int(np.average(T_in_C_AR)))+"ºC", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
if lang=="spa":
plt.text(processEntropy[2]-1, processTemperature[2]+20, "Salida "+str(int(P_op_bar))+"bar x="+str(x_design), size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(processEntropy2[1], processTemperature2[1]-15, "Salida "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy[2]-1, processTemperature[2]+20, "Output "+str(int(P_op_bar))+"bar x="+str(x_design), size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(processEntropy2[1], processTemperature2[1]-15, "Output "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
else:
if lang=="spa":
plt.text(processEntropy[1]-1.3, processTemperature[1]+10, "Salida "+str(int(T_out_C))+"ºC", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy[1]-1.3, processTemperature[1]+10, "Output "+str(int(T_out_C))+"ºC", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if type_integration=="SL_S_FWS" or type_integration=="SL_S_FW":
if lang=="spa":
plt.text(processEntropy2[2], processTemperature2[2]+20, "Salida "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy2[2], processTemperature2[2]+20, "Output "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if type_integration=="SL_L_RF":
if lang=="spa":
plt.text(processEntropy2[1]-1.3, processTemperature2[1]+10, "Salida "+str(int(T_out_process_C))+"ºC", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy2[1]-1.3, processTemperature2[1]+10, "Output "+str(int(T_out_process_C))+"ºC", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="spa":
plt.text(-2,200, "Liquido" , size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(4.5,20, "Liquido + Vapor", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(10,200, "Vapor", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
# plt.scatter(modules["s"],modules["T"])
plt.xlabel(r'Entropía (kJ/K/kg)')
plt.ylabel(r'Temperatura (C)')
if lang=="eng":
plt.text(-2,200, "Liquid" , size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(4.5,20, "Liquid + Steam", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(10,200, "Steam", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
# plt.scatter(modules["s"],modules["T"])
plt.xlabel(r'Entropy (kJ/K/kg)')
plt.ylabel(r'Temperature (C)')
axes = plt.gca()
axes.set_ylim([0,400])
axes.set_xlim([-3,11])
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin== -1:
fig.savefig(str(plotPath)+'Mollier.png', format='png', dpi=imageQlty) #Save image for the report
def mollierPlotSH(sender,origin,lang,type_integration,h_in,h_out,hProcess_out,outProcess_h,in_s,out_s,T_in_flag,T_in_C,T_in_C_AR,T_out_C,outProcess_s,T_out_process_C,P_op_bar,x_design,plotPath,imageQlty,**kwargs):
mollier=pd.read_csv(os.path.dirname(__file__)+'/mollierWater.csv',sep=',',encoding = "ISO-8859-1",header=None)
P_op_Mpa=P_op_bar/10
sat_liq=IAPWS97(P=P_op_Mpa, x=0)
sat_vap=IAPWS97(P=P_op_Mpa, x=1)
processEntropy=[]
processEntropy.append(in_s)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
processEntropy.append(sat_liq.s)
processEntropy.append(out_s)
processEnthalpy=[]
processEnthalpy.append(h_in)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
processEnthalpy.append(sat_liq.h)
processEnthalpy.append(h_out)
if type_integration=="SL_S_FWS" or type_integration=="SL_S_FW":
processEntropy2=[]
processEnthalpy2=[]
processEntropy2.append(out_s)
processEntropy2.append(sat_vap.s)
processEnthalpy2.append(h_out)
processEnthalpy2.append(sat_vap.h)
if type_integration=="SL_L_RF":
processEntropy2=[]
processEnthalpy2=[]
processEntropy2.append(out_s)
processEntropy2.append(outProcess_s)
processEnthalpy2.append(h_out)
processEnthalpy2.append(hProcess_out)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
processEntropy2=[]
processEnthalpy2=[]
processEntropy2.append(out_s)
processEntropy2.append(outProcess_s)
processEnthalpy2.append(h_out)
processEnthalpy2.append(outProcess_h)
i=0
s=0.1
s_max=8
s_step=0.1
P_isobar=P_op_bar #bar
isobar=pd.DataFrame(np.zeros((int((s_max-s)/s_step),3)), columns=["T","h","s"])
while (s<s_max):
stateIsobar=IAPWS97(P=bar_MPa(P_isobar), s=s)
isobar["h"][i]=stateIsobar.h
isobar["s"][i]=s
s=s+s_step
i=i+1
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
plt.text(5, IAPWS97(P=bar_MPa(P_isobar),s=5).h-500, str(int(P_isobar))+"bar", size=10, color='k', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="spa":
plt.title("Diagrama h-s en una fila de colectores")
if lang=="eng":
plt.title("H-s diagram in one array of solar collectors")
if type_integration=="SL_S_FWS" or type_integration=="SL_S_FW":
plt.plot(processEntropy2,processEnthalpy2, color='m',lw=2,markersize=50,zorder=10 )
if type_integration=="SL_L_RF":
plt.plot(processEntropy2,processEnthalpy2, color='m',lw=2,markersize=50,zorder=10 )
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
plt.plot(processEntropy2,processEnthalpy2, color='m',lw=2,markersize=50,zorder=10 )
plt.plot(processEntropy,processEnthalpy, color='r',lw=3,markersize=50,zorder=10 )
plt.plot(mollier[1],mollier[2],':',lw=2)
plt.plot(isobar["s"],isobar["h"])
if lang=="spa":
plt.text(processEntropy[0]-1.8, processEnthalpy[0]+100, "Entrada "+str(int(h_in))+" kJ/kg", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy[0]-1.8, processEnthalpy[0]+100, "Inlet "+str(int(h_in))+" kJ/kg", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if type_integration=="SL_S_PD" or type_integration=="SL_S_PDS":
if lang=="spa":
plt.text(processEntropy[2]+1.8, processEnthalpy[2]-100, "Salida "+str(int(P_op_bar))+"bar x="+str(x_design), size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(processEntropy2[1]+1.8, processEnthalpy2[1]-100, "Salida "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy[2]+1.8, processEnthalpy[2]-100, "Output "+str(int(P_op_bar))+"bar x="+str(x_design), size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(processEntropy2[1]+1.8, processEnthalpy2[1]-100, "Output "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
else:
if lang=="spa":
plt.text(processEntropy[1]-1.5, processEnthalpy[1]+150, "Salida "+str(int(h_out))+" kJ/kg", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy[1]-1.5, processEnthalpy[1]+150, "Output "+str(int(h_out))+" kJ/kg", size=10, color='r', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if type_integration=="SL_S_FWS" or type_integration=="SL_S_FW":
if lang=="spa":
plt.text(processEntropy2[1]+1.8, processEnthalpy2[1]-100, "Salida "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(processEntropy2[1]+1.8, processEnthalpy2[1]-300, "Salida "+str(int(sat_vap.h))+" kJ/kg", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy2[1]+1.8, processEnthalpy2[1]-100, "Output "+str(int(P_op_bar))+"bar Sat.", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(processEntropy2[1]+1.8, processEnthalpy2[1]-300, "Output "+str(int(sat_vap.h))+" kJ/kg", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if type_integration=="SL_L_RF":
if lang=="spa":
plt.text(processEntropy2[1]-1.5, processEnthalpy2[1]+150, "Salida "+str(int(hProcess_out))+" kJ/kg", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="eng":
plt.text(processEntropy2[1]-1.5, processEnthalpy2[1]+150, "Output "+str(int(hProcess_out))+" kJ/kg", size=10, color='m', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
if lang=="spa":
plt.text(-1,1500, "Liquido" , size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(4.5,200, "Liquido + Vapor", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(10,2000, "Vapor", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
# plt.scatter(modules["s"],modules["T"])
plt.xlabel(r'Entropía (kJ/K/kg)')
plt.ylabel(r'Entalpía (kJ/Kg)')
if lang=="eng":
plt.text(-1,1500, "Liquid" , size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(4.5,200, "Liquid + Steam", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
plt.text(10,2000, "Steam", size=10, color='b', ha='center', va='center', horizontalalignment='center', verticalalignment='center', rotation= 0)
# plt.scatter(modules["s"],modules["T"])
plt.xlabel(r'Entropy (kJ/K/kg)')
plt.ylabel(r'Enthalpy (kJ/Kg)')
axes = plt.gca()
axes.set_ylim([0,3000])
axes.set_xlim([-3,11])
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'Mollier2.png', format='png', dpi=imageQlty) #Save for the report
def thetaAnglesPlot(sender,origin,step_sim,steps_sim,theta_i_deg,theta_transv_deg,plotPath,imageQlty,**kwargs):
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
fig.suptitle('Ángulos theta', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1 .plot(step_sim, theta_i_deg,'.r-',label="Ang_incidencia")
ax1 .plot(step_sim, theta_transv_deg,'.b-',label="Incidencia_transversal")
ax1 .axhline(y=0,xmin=0,xmax=steps_sim,c="blue",linewidth=0.5,zorder=0)
ax1.set_xlabel('Simulación (hora del año)')
ax1.set_ylabel('Grados')
plt.legend( loc='upper left', borderaxespad=0.)
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'tetha.png', format='png', dpi=imageQlty)
def IAMAnglesPlot(sender,origin,step_sim,IAM_long,IAM_t,IAM,plotPath,imageQlty,**kwargs):
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
fig.suptitle('IAMs', fontsize=14, fontweight='bold')
ax2 = fig.add_subplot(111)
ax2 .plot(step_sim, IAM_long,'.-',color = 'b',label="IAM_long")
ax2 .plot(step_sim, IAM_t,'.-',color = 'r',label="IAM_transv")
ax2 .plot(step_sim, IAM,'.-',color = '#39B8E3',label="IAM")
ax2.set_xlabel('Simulación (hora del año)')
ax2.set_ylabel('Grados')
plt.legend(loc='upper left', borderaxespad=0.)
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'IAM.png', format='png', dpi=imageQlty)
def demandVsRadiation(sender,origin,lang,step_sim,Demand,Q_prod,Q_prod_lim,Q_prod_rec,steps_sim,DNI,plotPath,imageQlty,**kwargs):
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Demanda vs Radiación solar', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1 .plot(step_sim, Demand,'.k-',label="Demanda")
ax1 .plot(step_sim, Q_prod,'.r-',label="Produccion solar total")
ax1 .plot(step_sim, Q_prod_lim,'.b-',label="Produccion util")
ax1 .plot(step_sim, Q_prod_rec,'.g-',label="Produccion Rec")
ax1 .axhline(y=0,xmin=0,xmax=steps_sim,c="blue",linewidth=0.5,zorder=0)
ax1.set_xlabel('Simulación (hora del año)')
ax1.set_ylabel('Demanda - kWh',color="blue")
ax1.set_ylim([0,max(np.max(Q_prod),np.max(Demand))*1.2])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
ax2 .plot(step_sim, DNI,'.-',color = 'orange',label="DNI")
ax2.set_ylabel('Radiación solar - W/m2',color='red')
ax2.set_ylim([0,np.max(DNI)*1.2])
plt.legend(loc='upper right', borderaxespad=0.)
if lang=="eng":
fig.suptitle('Demand vs Solar Radiation', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1 .plot(step_sim, Demand,'.k-',label="Demand")
ax1 .plot(step_sim, Q_prod,'.r-',label="Total solar production")
ax1 .plot(step_sim, Q_prod_lim,'.b-',label="Net production")
ax1 .plot(step_sim, Q_prod_rec,'.g-',label="Production Rec")
ax1 .axhline(y=0,xmin=0,xmax=steps_sim,c="blue",linewidth=0.5,zorder=0)
ax1.set_xlabel('simulation time (hour of the year)')
ax1.set_ylabel('Demand - kWh',color="blue")
ax1.set_ylim([0,max(np.max(Q_prod),np.max(Demand))*1.2])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
ax2 .plot(step_sim, DNI,'.-',color = 'orange',label="DNI")
ax2.set_ylabel('Solar Radiation - W/m2',color='red')
ax2.set_ylim([0,np.max(DNI)*1.2])
plt.legend(loc='upper right', borderaxespad=0.)
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'demandProduction.png', format='png', dpi=imageQlty)
def rhoTempPlotSalt(sender,origin,lang,T_out_C,plotPath,imageQlty,**kwargs):
rhoList=[]
CpList=[]
T_step=[]
for T in range(100+273,600+273,5):
T_step.append(T-273)
[rho,Cp,k,Dv]=moltenSalt(T)
rhoList.append(rho)
CpList.append(Cp)
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Propiedades de las sales fundidas', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1 .plot(np.arange(len(rhoList)), rhoList,'.k-',label="Densidad")
[rho,Cp,k,Dv]=moltenSalt(T_out_C+273)
plt .hlines(y=rho,xmin=0,xmax=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),color="#362510",linewidth=1,zorder=0)
plt .axvline(x=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),c="r")
ax1.set_xlabel('Temperatura ºC')
ax1.set_ylabel('Densidad - kg/m3')
ax2 = ax1.twinx()
ax2 .plot(np.arange(len(rhoList)), CpList,'.b-',label="Calor específico")
ax2.set_ylabel('Calor Específico - KJ/kgK', color="blue")
plt.xticks(list(np.arange(len(rhoList)))[1::8], T_step[1::8])
plt .hlines(y=Cp,xmin=min(range(len(CpList)), key=lambda i: abs(CpList[i]-Cp)),xmax=len(T_step),color="blue",linewidth=1,zorder=0)
if lang=="eng":
fig.suptitle('Molten Salt properties', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1 .plot(np.arange(len(rhoList)), rhoList,'.k-',label="Density")
[rho,Cp,k,Dv]=moltenSalt(T_out_C+273)
plt .hlines(y=rho,xmin=0,xmax=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),color="#362510",linewidth=1,zorder=0)
plt .axvline(x=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),c="r")
ax1.set_xlabel('Temperature ºC')
ax1.set_ylabel('Density - kg/m3')
ax2 = ax1.twinx()
ax2 .plot(np.arange(len(rhoList)), CpList,'.b-',label="Specific heat")
ax2.set_ylabel('Specific heat - kJ/kgK', color="blue")
plt.xticks(list(np.arange(len(rhoList)))[1::8], T_step[1::8])
plt .hlines(y=Cp,xmin=min(range(len(CpList)), key=lambda i: abs(CpList[i]-Cp)),xmax=len(T_step),color="blue",linewidth=1,zorder=0)
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'Salt.png', format='png', dpi=imageQlty)
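The property-plot functions above repeatedly mark the operating point by scanning for the sample nearest to a target value with `min(range(len(...)), key=...)`. A minimal, self-contained sketch of that idiom (the name `nearest_index` is illustrative, not from this module):

```python
def nearest_index(values, target):
    """Return the index of the element of `values` closest to `target`."""
    return min(range(len(values)), key=lambda i: abs(values[i] - target))

# Temperatures sampled every 5 degrees, as in the plotting loops above
temps = list(range(100, 600, 5))
idx = nearest_index(temps, 212)  # closest sample is 210, at index 22
```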
def rhoTempPlotOil(sender,origin,lang,T_out_C,plotPath,imageQlty,**kwargs):
rhoList=[]
CpList=[]
T_step=[]
for T in range(-20+273,320+273,5):
T_step.append(T-273)
[rho,Cp,k_av,Dv_av,Kv_av,thermalDiff_av,Prant_av]=thermalOil(T)
rhoList.append(rho)
CpList.append(Cp)
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Propiedades del Aceite térmico', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(len(rhoList)), rhoList,'.k-',label="Densidad")
[rho,Cp,k_av,Dv_av,Kv_av,thermalDiff_av,Prant_av]=thermalOil(T_out_C+273)
plt.hlines(y=rho,xmin=0,xmax=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),color="#362510",linewidth=1,zorder=0)
plt.axvline(x=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),c="r")
ax1.set_xlabel('Temperatura ºC')
ax1.set_ylabel('Densidad - kg/m3')
ax2 = ax1.twinx()
ax2.plot(np.arange(len(rhoList)), CpList,'.b-',label="Calor específico")
ax2.set_ylabel('Calor Específico - kJ/kgK', color="blue")
plt.xticks(list(np.arange(len(rhoList)))[1::8], T_step[1::8])
plt.hlines(y=Cp,xmin=min(range(len(CpList)), key=lambda i: abs(CpList[i]-Cp)),xmax=len(T_step),color="blue",linewidth=1,zorder=0)
if lang=="eng":
fig.suptitle('Thermal Oil properties', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(len(rhoList)), rhoList,'.k-',label="Density")
[rho,Cp,k_av,Dv_av,Kv_av,thermalDiff_av,Prant_av]=thermalOil(T_out_C+273)
plt.hlines(y=rho,xmin=0,xmax=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),color="#362510",linewidth=1,zorder=0)
plt.axvline(x=min(range(len(rhoList)), key=lambda i: abs(rhoList[i]-rho)),c="r")
ax1.set_xlabel('Temperature ºC')
ax1.set_ylabel('Density - kg/m3')
ax2 = ax1.twinx()
ax2.plot(np.arange(len(rhoList)), CpList,'.b-',label="Specific heat")
ax2.set_ylabel('Specific heat - kJ/kgK', color="blue")
plt.xticks(list(np.arange(len(rhoList)))[1::8], T_step[1::8])
plt.hlines(y=Cp,xmin=min(range(len(CpList)), key=lambda i: abs(CpList[i]-Cp)),xmax=len(T_step),color="blue",linewidth=1,zorder=0)
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'Oil.png', format='png', dpi=imageQlty)
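For web origins (origin -2/-3 and the SHIPcal sender) every plot is returned as a base64-encoded PNG string instead of being written to disk. The encoding pattern used above, isolated into a helper (a sketch; `fig_to_base64` is a hypothetical name, and the Agg backend is forced so it runs headless):

```python
import io
import base64
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, as a web server would use
import matplotlib.pyplot as plt

def fig_to_base64(fig):
    """Serialise a matplotlib figure to an in-memory PNG and base64-encode it."""
    buf = io.BytesIO()
    fig.savefig(buf, format='png', facecolor=(0.95, 0.95, 0.95))
    plt.close(fig)  # release the figure; the functions above use plt.clf()
    return base64.b64encode(buf.getvalue()).decode('utf-8')

fig = plt.figure()
plt.plot([0, 1], [1, 0])
encoded = fig_to_base64(fig)  # base64 of a PNG starts with 'iVBORw0KGgo'
```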
def viscTempPlotSalt(sender,origin,lang,T_out_C,plotPath,imageQlty,**kwargs):
DvList=[]
T_step=[]
for T in range(100+273,600+273,5):
T_step.append(T-273)
[rho,Cp,k,Dv]=moltenSalt(T)
DvList.append(Dv*1e3)
DvList=DvList[4:]
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Viscosidad de las sales fundidas', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(len(DvList)), DvList,'.k-',label="Viscosidad dinámica")
[rho,Cp,k,Dv]=moltenSalt(T_out_C+273)
plt.hlines(y=Dv*1e3,xmin=0,xmax=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3)),color="#362510",linewidth=1,zorder=0)
plt.axvline(x=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3)),c="r")
ax1.set_xlabel('Temperatura ºC')
ax1.set_ylabel('Viscosidad dinámica*1e3 - Ns/m2')
ax1.set_yscale('log')
plt.xticks(list(np.arange(len(T_step)))[1::8], T_step[1::8])
if lang=="eng":
fig.suptitle('Molten Salt Viscosity', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(len(DvList)), DvList,'.k-',label="Dynamic viscosity")
[rho,Cp,k,Dv]=moltenSalt(T_out_C+273)
plt.hlines(y=Dv*1e3,xmin=0,xmax=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,color="#362510",linewidth=1,zorder=0)
plt.axvline(x=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,c="r")
ax1.set_xlabel('Temperature ºC')
ax1.set_ylabel('Dynamic viscosity*1e3 - Ns/m2')
ax1.set_yscale('log')
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'Salt2.png', format='png', dpi=imageQlty)
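The property curves are plotted against sample index, and `plt.xticks` then swaps the index positions for real temperatures. The slicing that pairs tick positions with labels, on its own:

```python
import numpy as np

T_step = list(range(100, 600, 5))                           # sampled temperatures, °C
positions = [int(i) for i in np.arange(len(T_step))[1::8]]  # every 8th index, starting at 1
labels = T_step[1::8]                                       # the matching temperature labels
# plt.xticks(positions, labels) relabels an index-based x-axis in °C
```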
def viscTempPlotOil(sender,origin,lang,T_out_C,plotPath,imageQlty,**kwargs):
DvList=[]
KvList=[]
T_step=[]
for T in range(-20+273,320+273,5):
T_step.append(T-273)
[rho,Cp,k,Dv,Kv,thermalDiff,Prant]=thermalOil(T)
DvList.append(Dv*1e3)
KvList.append(Kv*1e6)
DvList=DvList[4:]
KvList=KvList[4:]
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Viscosidad del Aceite térmico', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(len(DvList)), DvList,'.k-',label="Viscosidad dinámica")
[rho,Cp,k,Dv,Kv,thermalDiff,Prant]=thermalOil(T_out_C+273)
plt.hlines(y=Dv*1e3,xmin=0,xmax=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,color="#362510",linewidth=1,zorder=0)
plt.axvline(x=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,c="r")
ax1.set_xlabel('Temperatura ºC')
ax1.set_ylabel('Viscosidad dinámica*1e3 - Ns/m2')
ax1.set_yscale('log')
ax2 = ax1.twinx()
ax2.plot(np.arange(len(KvList)), KvList,'.b-',label="Viscosidad cinemática")
ax2.set_ylabel('Viscosidad cinemática*1e6 - m2/s', color="blue")
ax2.set_yscale('log')
plt.xticks(list(np.arange(len(T_step)))[1::8], T_step[1::8])
plt.hlines(y=Kv*1e6,xmin=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,xmax=len(T_step),color="blue",linewidth=1,zorder=0)
if lang=="eng":
fig.suptitle('Thermal Oil Viscosity', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(len(DvList)), DvList,'.k-',label="Dynamic viscosity")
[rho,Cp,k,Dv,Kv,thermalDiff,Prant]=thermalOil(T_out_C+273)
plt.hlines(y=Dv*1e3,xmin=0,xmax=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,color="#362510",linewidth=1,zorder=0)
plt.axvline(x=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,c="r")
ax1.set_xlabel('Temperature ºC')
ax1.set_ylabel('Dynamic viscosity*1e3 - Ns/m2')
ax1.set_yscale('log')
ax2 = ax1.twinx()
ax2.plot(np.arange(len(KvList)), KvList,'.b-',label="Kinematic viscosity")
ax2.set_ylabel('Kinematic viscosity*1e6 - m2/s', color="blue")
ax2.set_yscale('log')
plt.xticks(list(np.arange(len(T_step)))[1::8], T_step[1::8])
plt.hlines(y=Kv*1e6,xmin=min(range(len(DvList)), key=lambda i: abs(DvList[i]-Dv*1e3))+4,xmax=len(T_step),color="blue",linewidth=1,zorder=0)
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'Oil2.png', format='png', dpi=imageQlty)
def flowRatesPlot(sender,origin,step_sim,steps_sim,flow_rate_kgs,flow_rate_rec,num_loops,flowDemand,flowToHx,flowToMix,m_dot_min_kgs,T_in_K,T_toProcess_C,T_out_K,T_alm_K,plotPath,imageQlty,**kwargs):
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
fig.suptitle('Caudales & temperaturas', fontsize=14, fontweight='bold')
ax1 = fig.add_subplot(111)
ax1.plot(step_sim, flow_rate_kgs,'m:',label="Caudal solar array")
ax1.plot(step_sim, flow_rate_rec,'g:',label="Caudal recirculación array")
ax1.plot(step_sim, flow_rate_kgs*num_loops,'.m-',label="Caudal solar SF")
ax1.plot(step_sim, flow_rate_rec*num_loops,'.g-',label="Caudal recirculación SF")
ax1.plot(step_sim, flowDemand,'.k-',label="Caudal demanda")
ax1.plot(step_sim, flowToHx,'.y-',label="Caudal flowToHx")
ax1.plot(step_sim, flowToMix,'.-',color='#6BD703',label="Caudal flowToMix")
ax1.axhline(y=m_dot_min_kgs,xmin=0,xmax=steps_sim,c="black",linewidth=0.5,zorder=0)
ax1.set_ylim([0,(np.max(flow_rate_kgs*num_loops))*1.1])
ax1.set_xlabel('Simulación (hora del año)')
ax1.set_ylabel('Caudal - kg/s')
plt.legend(bbox_to_anchor=(1.15, .5), loc=2, borderaxespad=0.)
ax2 = ax1.twinx()
ax2.plot(step_sim, T_in_K-273,'-',color = '#1F85DE',label="Temp_in Solar")
ax2.plot(step_sim, T_toProcess_C,'-',color = 'brown',label="Temp to Process")
ax2.plot(step_sim, T_out_K-273,'-',color = 'red',label="Temp_out Solar")
ax2.plot(step_sim, T_alm_K-273,':',color = 'orange',label="Temp_alm")
ax2.set_ylabel('Temp - C')
ax2.set_ylim([0,(np.max([np.max(T_toProcess_C)+273,np.max(T_out_K)])-273)*1.1])
plt.legend(bbox_to_anchor=(1.15, 1), loc=2, borderaxespad=0.)
# output1=pd.DataFrame(flow_rate_kgs)
# output1.columns=['Flow_rate']
# output2=pd.DataFrame(T_in_K)
# output2.columns=['T_in_K']
# output3=pd.DataFrame(T_out_K)
# output3.columns=['T_out_K']
# output_excel_FlowratesTemps=pd.concat([output1,output2,output3], axis=1)
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'flowrates.png', format='png', dpi=imageQlty)
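flowRatesPlot overlays mass flows (left axis) and temperatures (right axis) on one chart via `ax.twinx()`. A stripped-down sketch of that dual-axis pattern, with synthetic data:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend for the sketch
import matplotlib.pyplot as plt

hours = np.arange(24)
flow = np.clip(np.sin(hours / 23 * np.pi), 0, None)  # kg/s, synthetic daily profile
temp_C = 60 + 40 * flow                              # °C, synthetic

fig, ax1 = plt.subplots()
ax1.plot(hours, flow, '.k-', label='Flow rate')
ax1.set_ylabel('Caudal - kg/s')
ax2 = ax1.twinx()  # second y-axis sharing the same x-axis
ax2.plot(hours, temp_C, 'r-', label='Temperature')
ax2.set_ylabel('Temp - C', color='red')
```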
def prodWinterPlot(sender,origin,lang,Demand,Q_prod,Q_prod_lim,type_integration,Q_charg,Q_discharg,DNI,plotPath,imageQlty,**kwargs):
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Producción solar primera semana Enero', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(167), DNI[0:167],color='#CA6A16',linestyle='solid',label="Radiación solar")
ax1.set_xlabel('simulación (hora del año)')
ax1.set_ylabel('Radiación Solar - W/m2')
ax1.set_ylim([0,1200])
plt.legend(loc='upper left', borderaxespad=0.,frameon=False)
ax2 = ax1.twinx()
ax2.fill_between( np.arange(167), Demand[0:167], color="grey", alpha=0.2,label="Demanda")
ax2.plot(np.arange(167), Demand[0:167],'.-',color = '#362510',label="Demanda")
if sender =='CIMAV':
ax2.plot(np.arange(167), Q_prod[0:167],'.-',color = 'red',label="Disipación")
else:
ax2.plot(np.arange(167), Q_prod[0:167],'.-',color = 'red',label="Desenfoque")
ax2.plot(np.arange(167), Q_prod_lim[0:167],'.-',color = 'blue',label="Producción solar")
if type_integration=="SL_L_PS" or type_integration=='SL_S_FWS':
ax2.plot(np.arange(167), Q_charg[0:167],'.-',color = '#FFAE00',label="Carga")
ax2.plot(np.arange(167), Q_discharg[0:167],'.-',color = '#2EAD23',label="Descarga")
ax2.set_ylabel('Producción y Demanda - kWh')
ax2.set_ylim([0,np.max(Q_prod)*4.2])
plt.legend(bbox_to_anchor=(1.00, 1), loc=1, borderaxespad=0.)
# plt.subplots_adjust(left=0, bottom=0, right=1, top=1, wspace=0, hspace=0)
plt.tight_layout()
if lang=="eng":
fig.suptitle('Solar production first week January', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
ax1.plot(np.arange(167), DNI[0:167],color='#CA6A16',linestyle='solid',label="Solar Radiation")
ax1.set_xlabel('simulation time (hour of the year)')
ax1.set_ylabel('Solar Radiation - W/m2')
ax1.set_ylim([0,1200])
plt.legend(loc='upper left', borderaxespad=0.,frameon=False)
ax2 = ax1.twinx()
plt.fill_between( np.arange(167), Demand[0:167], color="grey", alpha=0.2,label="Demand")
ax2.plot(np.arange(167), Demand[0:167],'.-',color = '#362510',label="Demand")
ax2.plot(np.arange(167), Q_prod[0:167],'.-',color = 'red',label="Defocused")
ax2.plot(np.arange(167), Q_prod_lim[0:167],'.-',color = 'blue',label="Solar production")
if type_integration=="SL_L_PS" or type_integration=='SL_S_FWS':
ax2.plot(np.arange(167), Q_charg[0:167],'.-',color = '#FFAE00',label="Charge")
ax2.plot(np.arange(167), Q_discharg[0:167],'.-',color = '#2EAD23',label="Discharge")
ax2.set_ylabel('Production & Demand - kWh')
ax2.set_ylim([0,np.max(Q_prod)*4.2])
plt.legend(bbox_to_anchor=(1.00, 1), loc=1, borderaxespad=0.)
# plt.subplots_adjust(left=0, bottom=0, right=1, top=1, wspace=0, hspace=0)
plt.tight_layout()
# output4=pd.DataFrame(DNI)
# output4.columns=['DNI']
# output5=pd.DataFrame(Demand)
# output5.columns=['Demand']
# output6=pd.DataFrame(Q_prod)
# output6.columns=['Q_prod']
# output_excel_Prod_wee_Jan=pd.concat([output1,output2,output3,output4,output5,output6], axis=1)
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'produccion_solar1weekWinter.png', format='png', dpi=imageQlty)
def prodSummerPlot(sender,origin,lang,Demand,Q_prod,Q_prod_lim,type_integration,Q_charg,Q_discharg,DNI,plotPath,imageQlty,**kwargs):
fig = plt.figure()
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Producción solar primera semana Junio', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
ax1.plot((np.arange(3624,3624+167,1)), DNI[3624:3791],color='#CA6A16',linestyle='solid',label="Radiación solar")
ax1.set_xlabel('simulación (hora del año)')
ax1.set_ylabel('Radiación Solar - W/m2')
ax1.set_ylim([0,1200])
plt.legend(loc='upper left', borderaxespad=0.,frameon=False)
ax2 = ax1.twinx()
ax2.fill_between( np.arange(3624,3624+167,1), Demand[3624:3791], color="grey", alpha=0.2,label="Demanda")
ax2.plot((np.arange(3624,3624+167,1)), Demand[3624:3791],'.-',color = '#362510',label="Demanda")
if sender =='CIMAV':
ax2.plot((np.arange(3624,3624+167,1)), Q_prod[3624:3791],'.-',color = 'red',label="Disipación")
else:
ax2.plot((np.arange(3624,3624+167,1)), Q_prod[3624:3791],'.-',color = 'red',label="Desenfoque")
ax2.plot((np.arange(3624,3624+167,1)), Q_prod_lim[3624:3791],'.-',color = 'blue',label="Producción solar")
if type_integration=="SL_L_PS" or type_integration=='SL_S_FWS':
ax2.plot((np.arange(3624,3624+167,1)), Q_charg[3624:3791],'.-',color = '#FFAE00',label="Carga")
ax2.plot((np.arange(3624,3624+167,1)), Q_discharg[3624:3791],'.-',color = '#2EAD23',label="Descarga")
ax2.set_ylabel('Producción y Demanda - kWh')
ax2.set_ylim([0,np.max(Q_prod)*4.2])
plt.legend(bbox_to_anchor=(1.00, 1), loc=1, borderaxespad=0.)
plt.tight_layout()
if lang=="eng":
fig.suptitle('Solar production first week of June', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
ax1.plot((np.arange(3624,3624+167,1)), DNI[3624:3791],color='#CA6A16',linestyle='solid',label="Solar Radiation")
ax1.set_xlabel('simulation time (hour of the year)')
ax1.set_ylabel('Solar Radiation - W/m2')
ax1.set_ylim([0,1200])
plt.legend(loc='upper left', borderaxespad=0.,frameon=False)
ax2 = ax1.twinx()
ax2.fill_between( np.arange(3624,3624+167,1), Demand[3624:3791], color="grey", alpha=0.2,label="Demand")
ax2.plot((np.arange(3624,3624+167,1)), Demand[3624:3791],'.-',color = '#362510',label="Demand")
ax2.plot((np.arange(3624,3624+167,1)), Q_prod[3624:3791],'.-',color = 'red',label="Defocused")
ax2.plot((np.arange(3624,3624+167,1)), Q_prod_lim[3624:3791],'.-',color = 'blue',label="Solar Production")
if type_integration=="SL_L_PS" or type_integration=='SL_S_FWS':
ax2.plot((np.arange(3624,3624+167,1)), Q_charg[3624:3791],'.-',color = '#FFAE00',label="Charge")
ax2.plot((np.arange(3624,3624+167,1)), Q_discharg[3624:3791],'.-',color = '#2EAD23',label="Discharge")
ax2.set_ylabel('Production & Demand - kWh')
ax2.set_ylim([0,np.max(Q_prod)*4.2])
plt.legend(bbox_to_anchor=(1.00, 1), loc=1, borderaxespad=0.)
plt.tight_layout()
# output4=pd.DataFrame(DNI)
# output4.columns=['DNI']
# output5=pd.DataFrame(Demand)
# output5.columns=['Demand']
# output6=pd.DataFrame(Q_prod)
# output6.columns=['Q_prod']
#
# output_excel_Prod_week_Jun=pd.concat([output1,output2,output3,output4,output5,output6], axis=1)
if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'produccion_solar1weekSummer.png', format='png', dpi=imageQlty)
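The summer plots slice hours 3624 to 3791: 3624 is 151 days × 24 h, i.e. the first hour of June 1st in a non-leap year, and the window spans the following week. The index arithmetic in isolation:

```python
import numpy as np

days_before_june = 31 + 28 + 31 + 30 + 31  # Jan..May, non-leap year
start = days_before_june * 24              # 3624, first hour of June 1st
week = np.arange(start, start + 167)       # the window used by the plots above
```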
def productionSolar(sender,origin,lang,step_sim,DNI,m_dot_min_kgs,steps_sim,Demand,Q_prod,Q_prod_lim,Q_charg,Q_discharg,type_integration,plotPath,imageQlty,**kwargs):
fig = plt.figure(figsize=(14, 3.5))
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Producción anual', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
ax1.plot(step_sim, DNI,'.r-',label="Radiación solar")
# ax1.axhline(y=m_dot_min_kgs,xmin=0,xmax=steps_sim,c="black",linewidth=0.5,zorder=0)
ax1.set_xlabel('Simulación (hora del año)')
ax1.set_ylabel('Solar radiation - W/m2',color='red')
legend = plt.legend(bbox_to_anchor=(0.12, -.07), loc=1, borderaxespad=0.)
ax2 = ax1.twinx()
ax2.plot(step_sim, Demand,'.-',color = '#362510',label="Demanda")
ax2.plot(step_sim, Q_prod,'.-',color = '#831896',label="Producción solar")
ax2.plot(step_sim, Q_prod_lim,'.-',color = 'blue',label="Energía suministrada")
if type_integration=="SL_L_PS" or type_integration=="SL_S_FWS":
ax2.plot(step_sim, Q_charg,'.-',color = '#FFAE00',label="Carga")
ax2.plot(step_sim, Q_discharg,'.-',color = '#2EAD23',label="Descarga")
ax2.set_ylabel('Producción & Demanda - kWh')
ax2.set_ylim([0,np.max(Q_prod)*2])
plt.legend(bbox_to_anchor=(1.00, 1), loc=1, borderaxespad=0.)
plt.tight_layout()
if lang=="eng":
fig.suptitle('Annual Production', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
ax1.plot(step_sim, DNI,'.r-',label="Solar Radiation")
# ax1.axhline(y=m_dot_min_kgs,xmin=0,xmax=steps_sim,c="black",linewidth=0.5,zorder=0)
ax1.set_xlabel('simulation time (hour of the year)')
ax1.set_ylabel('Solar radiation - W/m2',color='red')
legend = plt.legend(bbox_to_anchor=(0.12, -.07), loc=1, borderaxespad=0.)
ax2 = ax1.twinx()
ax2.plot(step_sim, Demand,'.-',color = '#362510',label="Demand")
ax2.plot(step_sim, Q_prod,'.-',color = '#831896',label="Solar production")
ax2.plot(step_sim, Q_prod_lim,'.-',color = 'blue',label="Net production")
if type_integration=="SL_L_PS" or type_integration=="SL_S_FWS":
ax2.plot(step_sim, Q_charg,'.-',color = '#FFAE00',label="Charge")
ax2.plot(step_sim, Q_discharg,'.-',color = '#2EAD23',label="Discharge")
ax2.set_ylabel('Production & Demand - kWh')
ax2.set_ylim([0,np.max(Q_prod)*2])
plt.legend(bbox_to_anchor=(1.00, 1), loc=1, borderaxespad=0.)
plt.tight_layout()
# output4=pd.DataFrame(DNI)
# output4.columns=['DNI']
# output5=pd.DataFrame(Demand)
# output5.columns=['Demand']
# output6=pd.DataFrame(Q_prod)
# output6.columns=['Q_prod']
#
# output_excel_Prod_annual=pd.concat([output1,output2,output3,output4,output5,output6], axis=1)
# fig.savefig('/home/miguel/Desktop/Python_files/PLAT_VIRT/fresnel/Report/images/produccion_solar.png', format='png', dpi=imageQlty)
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'produccion_solar.png', format='png', dpi=imageQlty)
def storageWinter(sender,origin,lang,Q_prod,Q_charg,Q_prod_lim,Q_useful,Demand,Q_defocus,Q_discharg,type_integration,T_alm_K,SOC,plotPath,imageQlty,**kwargs):
fig = plt.figure(figsize=(14, 3.5))
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Almacenamiento primera semana Enero', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
plt.fill_between( np.arange(167), Demand[0:167], color="grey", alpha=0.2)
plt.bar(np.arange(167), np.array(Q_prod[0:167])-np.array(Q_charg[0:167]),color = 'blue',label="Producción Solar",align='center')
# ax1 .plot(np.arange(167), Q_prod_lim[0:167],color = 'blue',label="Energía suministrada",linewidth=4)
# ax1 .plot(np.arange(167), Q_useful[0:167],color = 'green',label="Energía útil",linewidth=2)
ax1.plot(np.arange(167), Demand[0:167],color = '#362510',label="Demanda",linewidth=2.0)
if sender =='CIMAV':
plt.bar(np.arange(167), Q_defocus[0:167],color = 'red',label="Disipación",bottom=np.array(Q_prod[0:167])-np.array(Q_defocus[0:167]),align='center')
else:
plt.bar(np.arange(167), Q_defocus[0:167],color = 'red',label="Desenfoque",bottom=np.array(Q_prod[0:167])-np.array(Q_defocus[0:167]),align='center')
plt.bar(np.arange(167), Q_charg[0:167],color = '#FFAE00',label="Carga",bottom=np.array(Q_prod[0:167])-np.array(Q_charg[0:167])-np.array(Q_defocus[0:167]),align='center')
plt.bar(np.arange(167), Q_discharg[0:167],color = '#2EAD23',label="Descarga",bottom=np.array(Q_prod[0:167]),align='center')
ax1.set_ylabel('Producción & Demanda - kWh')
ax1.set_ylim([0,np.max([np.max(Q_prod[0:167]),np.max(Demand[0:167])])*1.1])
ax1.set_xlim([0,167])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
if type_integration=="SL_L_S" or type_integration=="SL_L_S_PH":
ax2.plot(np.arange(167), np.array(T_alm_K[0:167])-273,'r',label="Temperatura",linewidth=2.0)
ax2.plot(np.arange(167), SOC[0:167],color='orange',linestyle=':',label="Carga del almacenamiento",linewidth=2.0)
ax2.set_xlabel('simulación (hora del año)')
ax2.set_ylabel('Estado de carga almacenamiento %',color = '#CA6A16')
ax2.set_ylim([0,101])
ax2.set_xlim([0,167])
if lang=="eng":
fig.suptitle('Storage during the first week of January', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
plt.fill_between( np.arange(167), Demand[0:167], color="grey", alpha=0.2)
plt.bar(np.arange(167), np.array(Q_prod[0:167])-np.array(Q_charg[0:167]),color = 'blue',label="Solar Production",align='center')
# ax1 .plot(np.arange(167), Q_prod_lim[0:167],color = 'blue',label="Net production",linewidth=4)
# ax1 .plot(np.arange(167), Q_useful[0:167],color = 'green',label="Useful energy",linewidth=2)
ax1.plot(np.arange(167), Demand[0:167],color = '#362510',label="Demand",linewidth=2.0)
plt.bar(np.arange(167), Q_defocus[0:167],color = 'red',label="Defocused",bottom=np.array(Q_prod[0:167])-np.array(Q_defocus[0:167]),align='center')
plt.bar(np.arange(167), Q_charg[0:167],color = '#FFAE00',label="Charge",bottom=np.array(Q_prod[0:167])-np.array(Q_charg[0:167])-np.array(Q_defocus[0:167]),align='center')
plt.bar(np.arange(167), Q_discharg[0:167],color = '#2EAD23',label="Discharge",bottom=np.array(Q_prod[0:167]),align='center')
ax1.set_ylabel('Production & Demand - kWh')
ax1.set_ylim([0,np.max([np.max(Q_prod[0:167]),np.max(Demand[0:167])])*1.1])
ax1.set_xlim([0,167])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
if type_integration=="SL_L_S" or type_integration=="SL_L_S_PH":
ax2.plot(np.arange(167), np.array(T_alm_K[0:167])-273,'r',label="Temperature",linewidth=2.0)
ax2.plot(np.arange(167), SOC[0:167],color='orange',linestyle=':',label="Storage's state of charge",linewidth=2.0)
ax2.set_xlabel('simulation time (hour of the year)')
ax2.set_ylabel("Storage's state of charge %",color = '#CA6A16')
ax2.set_ylim([0,101])
ax2.set_xlim([0,167])
plt.tight_layout()
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'almacenamiento_Enero.png', format='png', dpi=imageQlty)
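The storage figures build one stacked column per hour with the `bottom=` argument of `plt.bar`: direct production from ground level, with charge, defocus and discharge placed at offsets above it, mirroring the `bottom=` expressions used above. The offset arithmetic alone, with synthetic arrays:

```python
import numpy as np

# Synthetic hourly energies, kWh
Q_prod = np.array([10.0, 12.0, 8.0])
Q_charg = np.array([2.0, 3.0, 1.0])
Q_defocus = np.array([1.0, 0.0, 2.0])
Q_discharg = np.array([0.0, 0.0, 4.0])

direct = Q_prod - Q_charg                      # first bar, drawn from zero
charge_bottom = Q_prod - Q_charg - Q_defocus   # charge segment sits here
defocus_bottom = Q_prod - Q_defocus            # defocused energy above the charge
discharg_bottom = Q_prod                       # discharge drawn on top of production
# e.g. plt.bar(x, Q_charg, bottom=charge_bottom); plt.bar(x, Q_discharg, bottom=discharg_bottom)
```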
def storageSummer(sender,origin,lang,Q_prod,Q_charg,Q_prod_lim,Q_useful,Demand,Q_defocus,Q_discharg,type_integration,T_alm_K,SOC,plotPath,imageQlty,**kwargs):
fig = plt.figure(figsize=(14, 3.5))
# np.array(...) wrapping is needed because Django passes Q_prod, Q_prod_lim, ... in as lists
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
if lang=="spa":
fig.suptitle('Almacenamiento primera semana Junio', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
plt.fill_between( np.arange(3624,3624+167,1), Demand[3624:3791], color="grey", alpha=0.2)
plt.bar((np.arange(3624,3624+167,1)), np.array(Q_prod[3624:3791])-np.array(Q_charg[3624:3791]),color = 'blue',label="Producción Solar",align='center')
#ax1 .plot((np.arange(3624,3624+167,1)), Q_prod_lim[3624:3791],color = 'blue',label="Energía suministrada",linewidth=4)
#ax1 .plot((np.arange(3624,3624+167,1)), Q_useful[3624:3791],color = 'green',label="Energía útil",linewidth=2)
ax1.plot((np.arange(3624,3624+167,1)), Demand[3624:3791],color = '#362510',label="Demanda",linewidth=2.0)
if sender =='CIMAV':
plt.bar((np.arange(3624,3624+167,1)), Q_defocus[3624:3791],color = 'red',label="Disipación",bottom=np.array(Q_prod[3624:3791])-np.array(Q_defocus[3624:3791]),align='center')
else:
plt.bar((np.arange(3624,3624+167,1)), Q_defocus[3624:3791],color = 'red',label="Desenfoque",bottom=np.array(Q_prod[3624:3791])-np.array(Q_defocus[3624:3791]),align='center')
plt.bar((np.arange(3624,3624+167,1)), Q_charg[3624:3791],color = '#FFAE00',label="Carga",bottom=np.array(Q_prod[3624:3791])-np.array(Q_charg[3624:3791])-np.array(Q_defocus[3624:3791]),align='center')
plt.bar((np.arange(3624,3624+167,1)), Q_discharg[3624:3791],color = '#2EAD23',label="Descarga",bottom=Q_prod[3624:3791],align='center')
ax1.set_ylabel('Producción & Demanda - kWh')
ax1.set_ylim([0,np.max([np.max(Q_prod[3624:3791]),np.max(Demand[3624:3791])])*1.1])
ax1.set_xlim([3624,3624+167])
ax1.legend(loc='upper left', borderaxespad=0.).set_zorder(99)
ax2 = ax1.twinx()
if type_integration=="SL_L_S" or type_integration=="SL_L_S_PH":
ax2.plot((np.arange(3624,3624+167,1)), np.array(T_alm_K[3624:3791])-273,'r',label="Temperatura",linewidth=2.0,zorder=11)
ax2.plot((np.arange(3624,3624+167,1)), SOC[3624:3791],color='orange',linestyle=':',label="Carga del almacenamiento",linewidth=2.0,zorder=11)
ax2.set_xlabel('simulación (hora del año)')
ax2.set_ylabel('Estado de carga almacenamiento %',color = '#CA6A16')
ax2.set_ylim([0,101])
ax2.set_xlim([3624,3624+167])
if lang=="eng":
fig.suptitle('Storage during the first week of June', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
plt.fill_between( np.arange(3624,3624+167,1), Demand[3624:3791], color="grey", alpha=0.2)
plt.bar((np.arange(3624,3624+167,1)), np.array(Q_prod[3624:3791])-np.array(Q_charg[3624:3791]),color = 'blue',label="Solar Production",align='center')
#ax1 .plot((np.arange(3624,3624+167,1)), Q_prod_lim[3624:3791],color = 'blue',label="Net Production",linewidth=4)
#ax1 .plot((np.arange(3624,3624+167,1)), Q_useful[3624:3791],color = 'green',label="Useful energy",linewidth=2)
ax1.plot((np.arange(3624,3624+167,1)), Demand[3624:3791],color = '#362510',label="Demand",linewidth=2.0)
plt.bar((np.arange(3624,3624+167,1)), Q_defocus[3624:3791],color = 'red',label="Defocused",bottom=np.array(Q_prod[3624:3791])-np.array(Q_defocus[3624:3791]),align='center')
plt.bar((np.arange(3624,3624+167,1)), Q_charg[3624:3791],color = '#FFAE00',label="Charge",bottom=np.array(Q_prod[3624:3791])-np.array(Q_charg[3624:3791])-np.array(Q_defocus[3624:3791]),align='center')
plt.bar((np.arange(3624,3624+167,1)), Q_discharg[3624:3791],color = '#2EAD23',label="Discharge",bottom=Q_prod[3624:3791],align='center')
ax1.set_ylabel('Production & Demand - kWh')
ax1.set_ylim([0,np.max([np.max(Q_prod[3624:3791]),np.max(Demand[3624:3791])])*1.1])
ax1.set_xlim([3624,3624+167])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
ax2.plot((np.arange(3624,3624+167,1)), SOC[3624:3791],color='orange',linestyle=':',label="Storage's state of charge",linewidth=2.0)
ax2.set_xlabel('simulation time (hour of the year)')
ax2.set_ylabel("Storage's state of charge %",color = '#CA6A16')
ax2.set_ylim([0,101])
ax2.set_xlim([3624,3624+167])
plt.tight_layout()
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'almacenamiento_Junio.png', format='png', dpi=imageQlty)
def storageNonAnnual(sender,origin,SOC,Q_useful,Q_prod,Q_charg,Q_prod_lim,step_sim,Demand,Q_defocus,Q_discharg,steps_sim,plotPath,imageQlty,**kwargs):
fig = plt.figure(figsize=(14, 3.5))
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
fig.suptitle('Almacenamiento', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
plt.bar(step_sim, Q_prod-Q_charg,color = '#1F85DE',label="Producción Solar",align='center')
ax1.plot(step_sim, Q_prod_lim,color = 'blue',label="Energía suministrada",linewidth=4)
ax1.plot(step_sim, Q_useful,color = 'green',label="Energía útil",linewidth=2)
ax1.plot(step_sim, Demand,color = '#362510',label="Demanda")
if sender =='CIMAV':
plt.bar(step_sim, Q_defocus,color = 'red',label="Disipación",bottom=Q_prod-Q_defocus,align='center')
else:
plt.bar(step_sim, Q_defocus,color = 'red',label="Desenfoque",bottom=Q_prod-Q_defocus,align='center')
plt.bar(step_sim, Q_charg,color = '#FFAE00',label="Carga",bottom=Q_prod-Q_charg-Q_defocus,align='center')
plt.bar(step_sim, Q_discharg,color = '#2EAD23',label="Descarga",bottom=Q_prod,align='center')
ax1.set_ylabel('Producción & Demanda - kWh')
ax1.set_ylim([0,max(np.max(Q_prod),np.max(Demand))*1.2])
ax1.set_xlim([0,steps_sim])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
ax2.plot(step_sim, SOC,'.r-',label="Carga del almacenamiento")
ax2.set_xlabel('simulación (hora del año)')
ax2.set_ylabel('Estado de carga almacenamiento %',color = '#CA6A16')
ax2.set_ylim([0,101])
ax2.set_xlim([0,steps_sim])
plt.tight_layout()
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'almacenamiento_Anual.png', format='png', dpi=imageQlty)
def storageNonAnnualSL_S_PDR(sender,origin,SOC,Q_useful,Q_prod_steam,Q_prod,Q_drum,Q_charg,Q_prod_lim,step_sim,Demand,Q_defocus,Q_discharg,steps_sim,plotPath,imageQlty,**kwargs):
fig = plt.figure(figsize=(14, 3.5))
if origin==-2 or origin == -3:
fig.patch.set_alpha(0)
fig.suptitle('Almacenamiento', fontsize=14, fontweight='bold',y=1)
ax1 = fig.add_subplot(111)
plt.bar(step_sim, Q_prod-Q_charg,color = '#1F85DE',label="Producción solar en el campo",align='center')
plt.bar(step_sim, Q_prod_steam,color = '#7EE4E9',label="Producción de Vapor",align='center')
ax1.plot(step_sim, Q_prod_lim,color = 'blue',label="Energía suministrada",linewidth=4)
ax1.plot(step_sim, Q_useful,color = 'green',label="Energía útil",linewidth=2)
ax1.plot(step_sim, Demand,color = '#362510',label="Demanda")
plt.bar(step_sim, Q_drum,color = '#FFAE00',label="Energía al drum",bottom=Q_prod_steam,align='center')
plt.bar(step_sim, Q_defocus,color = 'red',label="Desenfoque",bottom=Q_prod_steam+Q_drum,align='center')
ax1.set_ylabel('Producción & Demanda - kWh')
ax1.set_ylim([0,max(np.max(Q_prod_steam+Q_drum+Q_defocus),np.max(Demand))*1.2])
ax1.set_xlim([0,steps_sim])
plt.legend(loc='upper left', borderaxespad=0.)
ax2 = ax1.twinx()
ax2.plot(step_sim, SOC,'.r-',label="Carga del almacenamiento")
ax2.set_xlabel('simulación (hora del año)')
ax2.set_ylabel('Estado de carga almacenamiento %',color = '#CA6A16')
ax2.set_ylim([0,101])
ax2.set_xlim([0,steps_sim])
plt.tight_layout()
if origin==-2 or origin == -3:
f = io.BytesIO() # Python 3
plt.savefig(f, format="png", facecolor=(0.95,0.95,0.95))
plt.clf()
image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
f.close()
return image_base64
if origin==-1:
fig.savefig(str(plotPath)+'almacenamiento_Anual.png', format='png', dpi=imageQlty)
def financePlot(sender,origin,lang,n_years_sim,Acum_FCF,FCF,m_dot_min_kgs,steps_sim,AmortYear,Selling_price,plotPath,imageQlty,**kwargs):
    fig = plt.figure()
    if origin==-2 or origin == -3:
        fig.patch.set_alpha(0)
    if lang=="spa":
        fig.suptitle('Estudio financiero', fontsize=14, fontweight='bold')
        ax1 = fig.add_subplot(111)
        ax1.plot(np.arange(n_years_sim), Acum_FCF, '.k-', label="Cash Flow acumulado")
        ax1.plot(np.arange(n_years_sim), FCF, '.b-', label="Cash Flow")
        ax1.axhline(y=m_dot_min_kgs, xmin=0, xmax=steps_sim, c="black", linewidth=0.5, zorder=0)
        ax1.set_xlabel('años')
        if origin == -3:
            ax1.set_ylabel('$')
        else:
            ax1.set_ylabel('€')
        plt.legend(bbox_to_anchor=(1.15, .5), loc=2, borderaxespad=0.)
        plt.text(int(AmortYear), -Selling_price, "Año de retorno= "+str(int(AmortYear)))
    if lang=="eng":
        fig.suptitle('Financial study', fontsize=14, fontweight='bold')
        ax1 = fig.add_subplot(111)
        ax1.plot(np.arange(n_years_sim), Acum_FCF, '.k-', label="Accumulated Free Cash Flows")
        ax1.plot(np.arange(n_years_sim), FCF, '.b-', label="Free Cash Flows")
        ax1.axhline(y=m_dot_min_kgs, xmin=0, xmax=steps_sim, c="black", linewidth=0.5, zorder=0)
        ax1.set_xlabel('years')
        if origin == -3:
            ax1.set_ylabel('$')
        else:
            ax1.set_ylabel('€')
        plt.legend(bbox_to_anchor=(1.15, .5), loc=2, borderaxespad=0.)
        plt.text(int(AmortYear), -Selling_price, "Payback period= "+str(int(AmortYear)))
    # plt.text(1,Acum_FCF[n_years_sim-1]*.55,"IRR: "+ str(round(IRR,2))+"%", bbox={'boxstyle':'square', 'color':'#A0D8EB'})
    # plt.text(1,Acum_FCF[n_years_sim-1]*.35,"Solar_fraction: "+ str(round(solar_fraction_lim,1))+"%", bbox={'boxstyle':'square', 'color':'#A0D8EB'})
    # plt.text(1,Acum_FCF[n_years_sim-1]*.85,"Energy_bill: "+ str(round(energy_bill))+"€", bbox={'boxstyle':'square', 'color':'#A0D8EB'})
    # plt.text(1,Acum_FCF[n_years_sim-1]*.7,"Savings: "+ str(round(Solar_savings_lim))+"€", bbox={'boxstyle':'square', 'color':'#A0D8EB'})
    # plt.text(1,Acum_FCF[n_years_sim-1]*.45,"Investment: "+ str(round(Selling_price))+"€", bbox={'boxstyle':'square', 'color':'#A0D8EB'})
    # plt.text(1,Acum_FCF[n_years_sim-1],"Solar Irradiation: "+ str(round(DNI_anual_irradiation,1))+"kWh/m2", bbox={'boxstyle':'square', 'color':'#A0D8EB'})
    # ax2 = ax1.twinx()
    # ax2.plot(step_sim, Demand,'.-',color = 'red',label="Demand")
    # ax2.plot(step_sim, Q_prod,'.-',color = '#617824',label="Q_prod")
    # ax2.plot(step_sim, Q_prod_lim,'.-',color = 'blue',label="Q_prod_lim")
    # ax2.set_ylabel('QProd vs DEmand - kWh')
    plt.legend(bbox_to_anchor=(0, 1), loc=2, borderaxespad=0.)
    if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
        f = io.BytesIO()  # in-memory PNG buffer (Python 3)
        plt.savefig(f, format="png", facecolor=(0.95, 0.95, 0.95))
        plt.clf()
        image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
        f.close()
        return image_base64
    if origin==-1:
        fig.savefig(str(plotPath)+'finance.png', format='png', dpi=imageQlty)
def arraysMonth(Q_prod, Q_prod_lim, DNI, Demand, **kwargs):
    # Resumen mensual: sum each hourly series over the 12 calendar months.
    # month_edges holds the cumulative hour index at the start of each month
    # in a non-leap year (8760 h total). Note the original per-hour loop used
    # range(0, 8759) and silently dropped the last hour of the year; the
    # slices below include it.
    month_edges = [0, 744, 1416, 2160, 2880, 3624, 4344, 5088, 5832, 6552, 7296, 8016, 8760]
    Q_prod = np.asarray(Q_prod, dtype=float)
    Q_prod_lim = np.asarray(Q_prod_lim, dtype=float)
    DNI = np.asarray(DNI, dtype=float)
    Demand = np.asarray(Demand, dtype=float)
    bounds = list(zip(month_edges[:-1], month_edges[1:]))
    array_de_meses = [np.sum(Q_prod[a:b]) for a, b in bounds]
    array_de_meses_lim = [np.sum(Q_prod_lim[a:b]) for a, b in bounds]
    array_de_DNI = [np.sum(DNI[a:b]) for a, b in bounds]
    array_de_demd = [np.sum(Demand[a:b]) for a, b in bounds]
    array_de_fraction = np.zeros(12)
    return array_de_meses, array_de_meses_lim, array_de_DNI, array_de_demd, array_de_fraction
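The month boundaries hard-coded above (744, 1416, 2160, ...) are the cumulative hour counts of a non-leap year. A self-contained, stdlib-only sketch of the same monthly aggregation over an 8760-hour series (names are illustrative, not the module's API):

```python
# Cumulative hour index at the start of each month, non-leap year
# (assumption: matches the 744, 1416, ... boundaries used above).
MONTH_EDGES = [0, 744, 1416, 2160, 2880, 3624, 4344, 5088, 5832, 6552, 7296, 8016, 8760]

def monthly_sums(hourly):
    """Sum an 8760-element hourly series into 12 monthly totals."""
    return [sum(hourly[a:b]) for a, b in zip(MONTH_EDGES[:-1], MONTH_EDGES[1:])]
```

With a series of all ones this returns the hours per month (744 for January, 672 for February, ...), totalling 8760.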
def prodMonths(sender,origin,Q_prod,Q_prod_lim,DNI,Demand,lang,plotPath,imageQlty,**kwargs):
    array_de_meses, array_de_meses_lim, array_de_DNI, array_de_demd, array_de_fraction = arraysMonth(Q_prod, Q_prod_lim, DNI, Demand)
    for m in range(0, 12):
        if array_de_demd[m] == 0:
            array_de_fraction[m] = 0
        else:
            array_de_fraction[m] = 100*array_de_meses[m]/array_de_demd[m]
    output1 = pd.DataFrame(array_de_meses)
    output1.columns = ['Prod.mensual']
    output2 = pd.DataFrame(array_de_DNI)/1000
    output2.columns = ['DNI']
    output3 = pd.DataFrame(array_de_demd)
    output3.columns = ['Demanda']
    output4 = pd.DataFrame(array_de_meses_lim)
    output4.columns = ['Prod.mensual_lim']
    output_excel = pd.concat([output1, output2, output3, output4], axis=1)
    meses = ["Ene", "Feb", "Mar", "Abr", "May", "Jun", "Jul", "Ago", "Sep", "Oct", "Nov", "Dic"]
    meses_index = np.arange(0, 12)
    fig, ax = plt.subplots()
    if origin==-2 or origin == -3:
        fig.patch.set_alpha(0)
    if lang=="spa":
        meses = ["Ene", "Feb", "Mar", "Abr", "May", "Jun", "Jul", "Ago", "Sep", "Oct", "Nov", "Dic"]
        fig.suptitle('Producción & Demanda energía de proceso', fontsize=14, fontweight='bold')
        ax.set_ylabel('Producción y Demanda en kWh', color='black')
        ax.bar(meses_index, output3['Demanda'], width=0.8, color='#362510', label="Demanda")
        if sender == 'CIMAV':
            ax.bar(meses_index, output1['Prod.mensual'], width=0.8, color='red', label="Disipada")
        else:
            ax.bar(meses_index, output1['Prod.mensual'], width=0.8, color='red', label="Desenfocada")
        ax.bar(meses_index, output4['Prod.mensual_lim'], width=0.8, color='blue', label="Producción solar")
        plt.legend(loc=9, bbox_to_anchor=(0.5, -0.05), ncol=3)
        ax2 = ax.twinx()
        ax2.plot([0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5, 10.5, 11.5], output2['DNI'], '-', color='#CA6A16', label="Radiación solar", linewidth=2.0)
        ax2.set_ylabel('Radiación solar [kWh/m2]', color='#CA6A16')
        ax.set_xticks(meses_index+.4)  # place the x ticks at the middle of each bar (bar width is 0.8)
        ax.set_xticklabels(meses)  # replace the x tick labels with the month names
        plt.legend(loc='upper right', borderaxespad=0., frameon=True)
    if lang=="eng":
        meses = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
        fig.suptitle('Production & Demand process energy', fontsize=14, fontweight='bold')
        ax.set_ylabel('Production & Demand kWh', color='black')
        ax.bar(meses_index, output3['Demanda'], width=0.8, color='#362510', label="Demand")
        ax.bar(meses_index, output1['Prod.mensual'], width=0.8, color='red', label="Defocused")
        ax.bar(meses_index, output4['Prod.mensual_lim'], width=0.8, color='blue', label="Solar production")
        plt.legend(loc=9, bbox_to_anchor=(0.5, -0.05), ncol=3)
        ax2 = ax.twinx()
        ax2.plot([0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5, 10.5, 11.5], output2['DNI'], '-', color='#CA6A16', label="Solar Radiation", linewidth=2.0)
        ax2.set_ylabel('Solar Radiation [kWh/m2]', color='#CA6A16')
        ax.set_xticks(meses_index+.4)  # place the x ticks at the middle of each bar (bar width is 0.8)
        ax.set_xticklabels(meses)  # replace the x tick labels with the month names
        plt.legend(loc='upper right', borderaxespad=0., frameon=True)
    if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
        f = io.BytesIO()  # in-memory PNG buffer (Python 3)
        plt.savefig(f, format="png", facecolor=(0.95, 0.95, 0.95))
        plt.clf()
        image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
        f.close()
        return image_base64
    if origin==-1:
        fig.savefig(str(plotPath)+'prodMonths.png', format='png', dpi=imageQlty)
    if origin==0:
        plt.show()
    return output_excel
def arrays_Savings_Month(Q_prod_lim, Demand, Fuel_price, Boiler_eff, **kwargs):
    # Resumen mensual de ahorros: hourly fuel cost avoided by the solar
    # production, the hourly fuel bill of the demand, and their per-hour
    # ratio (0 whenever the demand is 0), summed over each calendar month.
    # month_edges holds the cumulative hour index at the start of each month
    # in a non-leap year. Note the original per-hour loop used range(0, 8759)
    # and silently dropped the last hour of the year; the slices below include it.
    month_edges = [0, 744, 1416, 2160, 2880, 3624, 4344, 5088, 5832, 6552, 7296, 8016, 8760]
    sav_lim = Fuel_price*np.asarray(Q_prod_lim, dtype=float)/Boiler_eff
    demd = Fuel_price*np.asarray(Demand, dtype=float)
    frac = np.divide(sav_lim, demd, out=np.zeros_like(sav_lim), where=demd != 0)
    bounds = list(zip(month_edges[:-1], month_edges[1:]))
    array_de_meses_lim = [np.sum(sav_lim[a:b]) for a, b in bounds]
    array_de_demd = [np.sum(demd[a:b]) for a, b in bounds]
    array_de_fraction = [np.sum(frac[a:b]) for a, b in bounds]
    return array_de_meses_lim, array_de_demd, array_de_fraction
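Each month block above guards the hourly ratio with an explicit `demand == 0` check before dividing. The same guarded element-wise division can be sketched in a few stdlib-only lines (function name and data are illustrative):

```python
def safe_fraction(savings, demand):
    """Element-wise savings/demand, returning 0.0 wherever demand is 0."""
    return [s / d if d != 0 else 0.0 for s, d in zip(savings, demand)]
```

For example, `safe_fraction([1, 2, 0], [2, 0, 5])` yields `[0.5, 0.0, 0.0]`; with NumPy arrays the equivalent is `np.divide(sav, demd, out=np.zeros_like(sav), where=demd != 0)`.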
def savingsMonths(sender,origin,Q_prod_lim,Demand,Fuel_price,Boiler_eff,lang,plotPath,imageQlty,**kwargs):
    array_de_meses_lim, array_de_demd, array_de_fraction = arrays_Savings_Month(Q_prod_lim, Demand, Fuel_price, Boiler_eff)
    output2 = pd.DataFrame(array_de_fraction)
    output2.columns = ['Fraccion']
    output3 = pd.DataFrame(array_de_demd)
    output3.columns = ['Demanda']
    output4 = pd.DataFrame(array_de_meses_lim)
    output4.columns = ['Ahorro mensual']
    output_excel = pd.concat([output3, output4], axis=1)
    meses = ["Ene", "Feb", "Mar", "Abr", "May", "Jun", "Jul", "Ago", "Sep", "Oct", "Nov", "Dic"]
    meses_index = np.arange(0, 12)
    fig = plt.figure(figsize=(10, 5))
    ax = fig.add_subplot(111)
    if origin==-2 or origin == -3:
        fig.patch.set_alpha(0)
    if lang=="spa":
        meses = ["Ene", "Feb", "Mar", "Abr", "May", "Jun", "Jul", "Ago", "Sep", "Oct", "Nov", "Dic"]
        fig.suptitle('Ahorro solar', fontsize=14, fontweight='bold')
        if origin == -3:
            ax.set_ylabel('Ahorro solar / Factura actual $')
        else:
            ax.set_ylabel('Ahorro solar / Factura actual €')
        ax.bar(meses_index, output3['Demanda'], width=0.8, color='#362510', label="Factura mensual")
        ax.bar(meses_index, output4['Ahorro mensual'], width=0.8, color='blue', label="Ahorro solar")
        ax.set_xticks(meses_index)
        ax.set_xticklabels(meses)  # replace the x tick labels with the month names
        L = plt.legend(loc=9, bbox_to_anchor=(0.5, -0.05), ncol=3)
    if lang=="eng":
        meses = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
        fig.suptitle('Solar savings', fontsize=14, fontweight='bold')
        if origin == -3:
            ax.set_ylabel('Solar savings / Monthly energy cost $')
        else:
            ax.set_ylabel('Solar savings / Monthly energy cost €')
        ax.bar(meses_index, output3['Demanda'], width=0.8, color='#362510', label="Monthly energy cost")
        ax.bar(meses_index, output4['Ahorro mensual'], width=0.8, color='blue', label="Solar savings")
        ax.set_xticks(meses_index)
        ax.set_xticklabels(meses)  # replace the x tick labels with the month names
        L = plt.legend(loc=9, bbox_to_anchor=(0.5, -0.05), ncol=3)
    if origin==-2 or origin == -3 or (origin==1 and sender=='SHIPcal'):
        f = io.BytesIO()  # in-memory PNG buffer (Python 3)
        plt.savefig(f, format="png", facecolor=(0.95, 0.95, 0.95))
        plt.clf()
        image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
        f.close()
        return image_base64
    if origin==-1:
        fig.savefig(str(plotPath)+'savMonths.png', format='png', dpi=imageQlty)
    if origin==0:
        plt.show()
    return output_excel
def SL_S_PDR_Plot(sender,origin,step_sim,steps_sim,SD_min_energy,SD_max_energy,Q_prod,Q_prod_steam,SD_energy,T_in_K,T_out_K,T_SD_K,plotPath,imageQlty,**kwargs):
    fig = plt.figure()
    if origin==-2 or origin == -3:
        fig.patch.set_alpha(0)
    fig.suptitle('Direct steam Generation RECIRCULATION', fontsize=14, fontweight='bold')
    ax1 = fig.add_subplot(111)
    ax1.plot(step_sim, Q_prod, 'm:', label="Producción solar")
    ax1.plot(step_sim, Q_prod_steam, 'g:', label="Producción vapor")
    ax1.plot(step_sim, SD_energy, color='orange', label="Energía en SD")
    ax1.axhline(y=SD_min_energy, xmin=0, xmax=steps_sim, c="black", linewidth=0.5, zorder=0)
    ax1.axhline(y=SD_max_energy, xmin=0, xmax=steps_sim, c="black", linewidth=0.5, zorder=0)
    ax1.set_ylim([0, max(SD_max_energy, max(Q_prod_steam))*1.1])
    ax1.set_xlabel('Simulación (hora del año)')
    ax1.set_ylabel('Energía - kWh')
    plt.legend(bbox_to_anchor=(1.15, .5), loc=2, borderaxespad=0.)
    ax2 = ax1.twinx()
    ax2.plot(step_sim, T_in_K-273, '-', color='#1F85DE', label="Temp_in Solar")
    ax2.plot(step_sim, T_out_K-273, '-', color='red', label="Temp_out Solar")
    ax2.plot(step_sim, T_SD_K-273, ':', color='orange', label="Temp_alm")
    ax2.set_ylabel('Temp - C')
    ax2.set_ylim([0, (np.max([np.max(T_SD_K), np.max(T_out_K)])-273)*1.2])
    plt.legend(bbox_to_anchor=(1.15, 1), loc=2, borderaxespad=0.)
    # output1=pd.DataFrame(flow_rate_kgs)
    # output1.columns=['Flow_rate']
    # output2=pd.DataFrame(T_in_K)
    # output2.columns=['T_in_K']
    # output3=pd.DataFrame(T_out_K)
    # output3.columns=['T_out_K']
    # output_excel_FlowratesTemps=pd.concat([output1,output2,output3], axis=1)
    if origin==-2 or origin == -3:
        f = io.BytesIO()  # in-memory PNG buffer (Python 3)
        plt.savefig(f, format="png", facecolor=(0.95, 0.95, 0.95))
        plt.clf()
        image_base64 = base64.b64encode(f.getvalue()).decode('utf-8').replace('\n', '')
        f.close()
        return image_base64
    if origin==-1:
        fig.savefig(str(plotPath)+'flowrates.png', format='png', dpi=imageQlty)
| 52.355581 | 549 | 0.625008 | 13,641 | 90,994 | 4.021406 | 0.04941 | 0.014128 | 0.016844 | 0.009224 | 0.875984 | 0.843973 | 0.825124 | 0.794407 | 0.779824 | 0.76267 | 0 | 0.066527 | 0.197002 | 90,994 | 1,737 | 550 | 52.385723 | 0.68413 | 0.058301 | 0 | 0.627424 | 0 | 0 | 0.132318 | 0.002384 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017313 | false | 0 | 0.006233 | 0 | 0.043629 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
35820aee046bab9529ab1971f2993f8434856f87 | 113 | py | Python | smff/analysis/__init__.py | ismael2395/Research-NoiseBias | 1511b65d23f03bbe7d55b114984740ab9d75110f | [
"MIT"
] | 1 | 2017-11-20T22:25:12.000Z | 2017-11-20T22:25:12.000Z | smff/analysis/__init__.py | ismael2395/ShapeMeasurementFisherFormalism | 1511b65d23f03bbe7d55b114984740ab9d75110f | [
"MIT"
] | null | null | null | smff/analysis/__init__.py | ismael2395/ShapeMeasurementFisherFormalism | 1511b65d23f03bbe7d55b114984740ab9d75110f | [
"MIT"
] | 1 | 2016-08-11T23:33:09.000Z | 2016-08-11T23:33:09.000Z | from . import fisher
from . import gparameters
from . import images
from . import models
from . import readfits
| 16.142857 | 25 | 0.769912 | 15 | 113 | 5.8 | 0.466667 | 0.574713 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185841 | 113 | 6 | 26 | 18.833333 | 0.945652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
35879403098135051ed74d76f0d62005b32eafb5 | 11,705 | py | Python | pesummary/tests/summaryplots_test.py | pesummary/pesummary | 99e3c450ecbcaf5a23564d329bdf6e0080f6f2a8 | [
"MIT"
] | 1 | 2021-08-03T05:58:20.000Z | 2021-08-03T05:58:20.000Z | pesummary/tests/summaryplots_test.py | pesummary/pesummary | 99e3c450ecbcaf5a23564d329bdf6e0080f6f2a8 | [
"MIT"
] | 1 | 2020-06-13T13:29:35.000Z | 2020-06-15T12:45:04.000Z | pesummary/tests/summaryplots_test.py | pesummary/pesummary | 99e3c450ecbcaf5a23564d329bdf6e0080f6f2a8 | [
"MIT"
] | 3 | 2021-07-08T08:31:28.000Z | 2022-03-31T14:08:58.000Z | # Licensed under an MIT style license -- see LICENSE.md
import os
import shutil
from glob import glob
from pesummary.core.command_line import command_line
from pesummary.gw.command_line import insert_gwspecific_option_group
from pesummary.gw.inputs import GWInput
from pesummary.cli.summaryplots import _GWPlotGeneration as GWPlotGeneration
from pesummary.gw.file.meta_file import GWMetaFile
from pesummary.cli.summarypages import _GWWebpageGeneration as GWWebpageGeneration
from .base import make_result_file, get_list_of_plots, data_dir
import pytest
__author__ = ["Charlie Hoy <charlie.hoy@ligo.org>"]
class TestPlotGeneration(object):
    def setup(self):
        directories = ["./.outdir_bilby", "./.outdir_lalinference",
                       "./.outdir_comparison", "./.outdir_add_to_existing2",
                       ".outdir_comparison_no_comparison",
                       ".outdir_add_to_existing_no_comparison"]
        for i in directories:
            if os.path.isdir(i):
                shutil.rmtree(i)
            os.makedirs(i)

    def test_plot_generation_for_bilby_structure(self):
        with open("./.outdir_bilby/psd.dat", "w") as f:
            f.writelines(["1.00 3.44\n"])
            f.writelines(["100.00 4.00\n"])
            f.writelines(["1000.00 5.00\n"])
            f.writelines(["2000.00 6.00\n"])
        with open("./.outdir_bilby/calibration.dat", "w") as f:
            f.writelines(["1.0 2.0 3.0 4.0 5.0 6.0 7.0\n"])
            f.writelines(["2000.0 2.0 3.0 4.0 5.0 6.0 7.0"])
        parser = command_line()
        insert_gwspecific_option_group(parser)
        make_result_file(
            gw=True, extension="hdf5", bilby=True, outdir="./.outdir_bilby/",
            n_samples=10
        )
        os.rename("./.outdir_bilby/test.h5", "./.outdir_bilby/bilby_example.h5")
        default_arguments = [
            "--approximant", "IMRPhenomPv2",
            "--webdir", "./.outdir_bilby",
            "--samples", "./.outdir_bilby/bilby_example.h5",
            "--config", data_dir + "/config_bilby.ini",
            "--psd", "./.outdir_bilby/psd.dat",
            "--calibration", "./.outdir_bilby/calibration.dat",
            "--labels", "H10", "--no_ligo_skymap", "--disable_expert"]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        plots = sorted(glob("./.outdir_bilby/plots/*.png"))
        expected_plots = get_list_of_plots(
            gw=True, label="H1", outdir=".outdir_bilby", psd=True,
            calibration=False
        )
        for i, j in zip(expected_plots, plots):
            print(i, j)
        assert all(i == j for i, j in zip(sorted(expected_plots), sorted(plots)))

    def test_plot_generation_for_lalinference_structure(self):
        parser = command_line()
        insert_gwspecific_option_group(parser)
        make_result_file(
            gw=True, extension="hdf5", lalinference=True,
            outdir="./.outdir_lalinference/", n_samples=10
        )
        os.rename(
            "./.outdir_lalinference/test.hdf5",
            "./.outdir_lalinference/lalinference_example.h5"
        )
        default_arguments = [
            "--approximant", "IMRPhenomPv2",
            "--webdir", "./.outdir_lalinference",
            "--samples", "./.outdir_lalinference/lalinference_example.h5",
            "--config", data_dir + "/config_lalinference.ini",
            "--labels", "H10", "--no_ligo_skymap", "--disable_expert"]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        plots = sorted(glob("./.outdir_lalinference/plots/*.png"))
        expected_plots = get_list_of_plots(
            gw=True, label="H1", outdir=".outdir_lalinference"
        )
        assert all(i == j for i, j in zip(sorted(expected_plots), sorted(plots)))

    def test_plot_generation_for_comparison(self):
        parser = command_line()
        insert_gwspecific_option_group(parser)
        make_result_file(
            gw=True, extension="hdf5", lalinference=True,
            outdir="./.outdir_comparison/", n_samples=10
        )
        os.rename(
            "./.outdir_comparison/test.hdf5",
            "./.outdir_comparison/lalinference_example.h5"
        )
        make_result_file(
            gw=True, extension="hdf5", bilby=True, outdir="./.outdir_comparison/",
            n_samples=10
        )
        os.rename(
            "./.outdir_comparison/test.h5",
            "./.outdir_comparison/bilby_example.h5"
        )
        default_arguments = [
            "--approximant", "IMRPhenomPv2", "IMRPhenomP",
            "--webdir", "./.outdir_comparison",
            "--samples", "./.outdir_comparison/bilby_example.h5",
            "./.outdir_comparison/lalinference_example.h5",
            "--labels", "H10", "H11", "--no_ligo_skymap", "--disable_expert"]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        plots = sorted(glob("./.outdir_comparison/plots/*.png"))
        expected_plots = get_list_of_plots(
            gw=True, label="H1", number=2, outdir=".outdir_comparison"
        )
        for i, j in zip(sorted(plots), sorted(expected_plots)):
            print(i, j)
        assert all(i == j for i, j in zip(sorted(plots), sorted(expected_plots)))

    def test_plot_generation_for_add_to_existing(self):
        parser = command_line()
        insert_gwspecific_option_group(parser)
        make_result_file(
            gw=True, extension="hdf5", lalinference=True,
            outdir="./.outdir_add_to_existing2/", n_samples=10
        )
        os.rename(
            "./.outdir_add_to_existing2/test.hdf5",
            "./.outdir_add_to_existing2/lalinference_example.h5"
        )
        make_result_file(
            gw=True, extension="hdf5", bilby=True,
            outdir="./.outdir_add_to_existing2/", n_samples=10
        )
        os.rename(
            "./.outdir_add_to_existing2/test.h5",
            "./.outdir_add_to_existing2/bilby_example.h5"
        )
        default_arguments = [
            "--approximant", "IMRPhenomPv2",
            "--webdir", "./.outdir_add_to_existing2",
            "--samples", "./.outdir_add_to_existing2/bilby_example.h5",
            "--labels", "H10", "--no_ligo_skymap", "--disable_expert"]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        webpage = GWWebpageGeneration(inputs)
        webpage.generate_webpages()
        meta_file = GWMetaFile(inputs)
        parser = command_line()
        insert_gwspecific_option_group(parser)
        default_arguments = [
            "--approximant", "IMRPhenomP",
            "--existing_webdir", "./.outdir_add_to_existing2",
            "--samples", "./.outdir_add_to_existing2/lalinference_example.h5",
            "--labels", "H11", "--no_ligo_skymap", "--disable_expert"]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        plots = sorted(glob("./.outdir_add_to_existing2/plots/*.png"))
        expected_plots = get_list_of_plots(
            gw=True, label="H1", number=2, outdir=".outdir_add_to_existing2"
        )
        assert all(i == j for i, j in zip(sorted(plots), sorted(expected_plots)))

    def test_plot_generation_for_multiple_without_comparison(self):
        parser = command_line()
        insert_gwspecific_option_group(parser)
        make_result_file(
            gw=True, extension="hdf5", lalinference=True,
            outdir="./.outdir_comparison_no_comparison/", n_samples=10
        )
        os.rename(
            "./.outdir_comparison_no_comparison/test.hdf5",
            "./.outdir_comparison_no_comparison/lalinference_example.h5"
        )
        make_result_file(
            gw=True, extension="hdf5", bilby=True,
            outdir="./.outdir_comparison_no_comparison/", n_samples=10
        )
        os.rename(
            "./.outdir_comparison_no_comparison/test.h5",
            "./.outdir_comparison_no_comparison/bilby_example.h5"
        )
        default_arguments = [
            "--approximant", "IMRPhenomPv2", "IMRPhenomP",
            "--webdir", "./.outdir_comparison_no_comparison",
            "--samples", "./.outdir_comparison_no_comparison/bilby_example.h5",
            "./.outdir_comparison_no_comparison/lalinference_example.h5",
            "--labels", "H10", "H11", "--no_ligo_skymap",
            "--disable_comparison", "--disable_expert"
        ]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        # glob the directory this test actually writes to (the original
        # globbed "./.outdir_comparison", a copy-paste slip)
        plots = sorted(glob("./.outdir_comparison_no_comparison/plots/*.png"))
        expected_plots = get_list_of_plots(
            gw=True, label="H1", number=2, outdir=".outdir_comparison_no_comparison",
            comparison=False
        )
        for i, j in zip(sorted(plots), sorted(expected_plots)):
            print(i, j)
        assert all(i == j for i, j in zip(sorted(plots), sorted(expected_plots)))

    def test_plot_generation_for_add_to_existing_without_comparison(self):
        parser = command_line()
        insert_gwspecific_option_group(parser)
        make_result_file(
            gw=True, extension="hdf5", lalinference=True,
            outdir="./.outdir_add_to_existing_no_comparison/", n_samples=10
        )
        os.rename(
            "./.outdir_add_to_existing_no_comparison/test.hdf5",
            "./.outdir_add_to_existing_no_comparison/lalinference_example.h5"
        )
        make_result_file(
            gw=True, extension="hdf5", bilby=True,
            outdir="./.outdir_add_to_existing_no_comparison/", n_samples=10
        )
        os.rename(
            "./.outdir_add_to_existing_no_comparison/test.h5",
            "./.outdir_add_to_existing_no_comparison/bilby_example.h5"
        )
        default_arguments = [
            "--approximant", "IMRPhenomPv2",
            "--webdir", "./.outdir_add_to_existing_no_comparison",
            "--samples", "./.outdir_add_to_existing_no_comparison/bilby_example.h5",
            "--labels", "H10", "--no_ligo_skymap", "--disable_expert"]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        webpage = GWWebpageGeneration(inputs)
        webpage.generate_webpages()
        meta_file = GWMetaFile(inputs)
        parser = command_line()
        insert_gwspecific_option_group(parser)
        default_arguments = [
            "--approximant", "IMRPhenomP",
            "--existing_webdir", "./.outdir_add_to_existing_no_comparison",
            "--samples", "./.outdir_add_to_existing_no_comparison/lalinference_example.h5",
            "--labels", "H11", "--no_ligo_skymap",
            "--disable_comparison", "--disable_expert"
        ]
        opts = parser.parse_args(default_arguments)
        inputs = GWInput(opts)
        webpage = GWPlotGeneration(inputs)
        webpage.generate_plots()
        plots = sorted(glob("./.outdir_add_to_existing_no_comparison/plots/*.png"))
        expected_plots = get_list_of_plots(
            gw=True, label="H1", number=2, outdir=".outdir_add_to_existing_no_comparison",
            comparison=False
        )
        assert all(i == j for i, j in zip(sorted(plots), sorted(expected_plots)))
| 43.191882 | 91 | 0.614267 | 1,274 | 11,705 | 5.324961 | 0.110675 | 0.020637 | 0.042158 | 0.038325 | 0.828567 | 0.802919 | 0.774175 | 0.746167 | 0.737323 | 0.712117 | 0 | 0.019768 | 0.256642 | 11,705 | 270 | 92 | 43.351852 | 0.759913 | 0.004528 | 0 | 0.515625 | 0 | 0.007813 | 0.29897 | 0.202403 | 0 | 0 | 0 | 0 | 0.023438 | 1 | 0.027344 | false | 0 | 0.042969 | 0 | 0.074219 | 0.011719 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
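The `setup` method in the test class above recreates each output directory from scratch (remove if present, then create). That pattern can be factored into a small stdlib-only helper; the directory name below is illustrative:

```python
import os
import shutil

def fresh_dir(path):
    """Remove any stale copy of `path`, then create it empty."""
    if os.path.isdir(path):
        shutil.rmtree(path)
    os.makedirs(path)
    return path

scratch = fresh_dir("./.outdir_example")
```

Calling it repeatedly is safe, since an existing directory is deleted before `os.makedirs` runs (which would otherwise raise `FileExistsError`).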
35bb38862ef69c170a0d860cade779b1208e450c | 146 | py | Python | OpenGLCffi/GLX/EXT/SGIX/swap_group.py | cydenix/OpenGLCffi | c78f51ae5e6b655eb2ea98f072771cf69e2197f3 | [
"MIT"
] | null | null | null | OpenGLCffi/GLX/EXT/SGIX/swap_group.py | cydenix/OpenGLCffi | c78f51ae5e6b655eb2ea98f072771cf69e2197f3 | [
"MIT"
] | null | null | null | OpenGLCffi/GLX/EXT/SGIX/swap_group.py | cydenix/OpenGLCffi | c78f51ae5e6b655eb2ea98f072771cf69e2197f3 | [
"MIT"
] | null | null | null | from OpenGLCffi.GLX import params
@params(api='glx', prms=['dpy', 'drawable', 'member'])
def glXJoinSwapGroupSGIX(dpy, drawable, member):
pass
| 20.857143 | 54 | 0.726027 | 18 | 146 | 5.888889 | 0.722222 | 0.207547 | 0.320755 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109589 | 146 | 6 | 55 | 24.333333 | 0.815385 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
ea2ab2739fa57d60f95b344b324fe270168f1baa | 122 | py | Python | emailmgr/defaults.py | RileyGibbs/django-emailmgr | 82dae79aceab20ac2146103067d31b01ee51731a | [
"BSD-3-Clause"
] | 6 | 2015-10-12T09:02:56.000Z | 2022-02-25T12:55:01.000Z | emailmgr/defaults.py | Pradip369/django-emailmgr | eae29514fded1200607b93759064338e91e6d6b4 | [
"BSD-3-Clause"
] | null | null | null | emailmgr/defaults.py | Pradip369/django-emailmgr | eae29514fded1200607b93759064338e91e6d6b4 | [
"BSD-3-Clause"
] | 6 | 2015-09-14T20:33:49.000Z | 2020-09-07T17:28:33.000Z | from django.conf import settings
EMAIL_MGR_TEMPLATE_PATH = getattr(settings, "EMAIL_MGR_TEMPLATE_PATH", "emailmgr")
| 13.555556 | 82 | 0.795082 | 16 | 122 | 5.6875 | 0.6875 | 0.285714 | 0.351648 | 0.527473 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122951 | 122 | 8 | 83 | 15.25 | 0.850467 | 0 | 0 | 0 | 0 | 0 | 0.264957 | 0.196581 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
ea60b69d9c0ccc94194dfb00331393c46fe62c2e | 30 | py | Python | ditto/tickets/api/__init__.py | Kvoti/ditto | eb4efb241e54bf679222d14afeb71d9d5441c122 | [
"BSD-3-Clause"
] | null | null | null | ditto/tickets/api/__init__.py | Kvoti/ditto | eb4efb241e54bf679222d14afeb71d9d5441c122 | [
"BSD-3-Clause"
] | 9 | 2015-11-10T15:17:22.000Z | 2015-11-12T11:07:02.000Z | ditto/users/api/__init__.py | Kvoti/ditto | eb4efb241e54bf679222d14afeb71d9d5441c122 | [
"BSD-3-Clause"
] | null | null | null | from .urls import urlpatterns
| 15 | 29 | 0.833333 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.961538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ea6c92723996192c40d1ed3ba531dc76134f0dfd | 79 | py | Python | toir/formats/mapdata/__init__.py | FistingUranus/innocence-r | 786e1fca75155027e5875363f0b17e7c3cdefced | [
"MIT"
] | 2 | 2021-06-26T16:44:58.000Z | 2021-09-09T22:32:13.000Z | toir/formats/mapdata/__init__.py | FistingUranus/innocence-r | 786e1fca75155027e5875363f0b17e7c3cdefced | [
"MIT"
] | 4 | 2021-08-29T18:12:17.000Z | 2022-03-28T08:54:29.000Z | toir/formats/mapdata/__init__.py | FistingUranus/innocence-r | 786e1fca75155027e5875363f0b17e7c3cdefced | [
"MIT"
] | 3 | 2021-07-20T01:00:19.000Z | 2021-09-09T22:32:14.000Z | from .extract import extract_map_data
from .recompile import recompile_map_data | 39.5 | 41 | 0.886076 | 12 | 79 | 5.5 | 0.5 | 0.212121 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088608 | 79 | 2 | 41 | 39.5 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
576adeb519172c497782aee89c7469149a1fe2cf | 6,195 | py | Python | Source Code/playlist_downloader.py | moiSentineL/Youtube-Download-Tools | 27e43b5cdd6e7ea87125144e7f8e8c05abdec1ec | [
"Apache-2.0"
] | 1 | 2021-04-02T07:44:57.000Z | 2021-04-02T07:44:57.000Z | Source Code/playlist_downloader.py | moiSentineL/Youtube-Download-Tools | 27e43b5cdd6e7ea87125144e7f8e8c05abdec1ec | [
"Apache-2.0"
] | null | null | null | Source Code/playlist_downloader.py | moiSentineL/Youtube-Download-Tools | 27e43b5cdd6e7ea87125144e7f8e8c05abdec1ec | [
"Apache-2.0"
] | null | null | null |
__author__ = "moiSentinel"
__license__ = "Apache License 2.0"
__version__ = "1.0.1"
__maintainer__ = "moiSentinel"
__status__ = "In Progress"
from pytube import Playlist
import os
print("\nmoiSentineL's Youtube Bulk Video Downloader\nStable 1.0.1 / YDT 2.1.1")
print("This program will download playlists from Youtube.\n")
def download():
try:
userinputlink = input("Enter Youtube Playlist Link:\n>> ")
        p = Playlist(userinputlink)
choosefileformat = int(input("\nChoose File Format:\n1 for .mp4 (video).\n2 for .mp3 (audio).\n>> "))
if choosefileformat == 1:
            choosequality = int(input("\nChoose Video Quality:\n1 for 720p.\n2 for 480p.\n3 for 360p.\n>> "))
if choosequality == 1:
try:
                    destinationinput = input("\nEnter path for the file to save (e.g. E:\\Movies), press enter for default directory:\n>> ")
if destinationinput == '':
if not os.path.exists('downloads'):
os.makedirs('downloads')
currentdirectory = os.getcwd()
                        destination = os.path.join(currentdirectory, 'downloads')
else:
destination = destinationinput
                    print('Playlist found.\nName: ', p.title)
proceed = input('Proceed? (y/n)\n>> ').lower()
if proceed == 'y':
for video in p.videos:
                            # fetch the stream once: the YouTube object has no filesize, its stream does
                            stream = video.streams.filter(res='720p').first()
                            filesize = format(int(stream.filesize)/1000000, ".2f") + " MB"
                            stream.download(destination)
                            print(video.title + " has been successfully downloaded with the size of ", filesize)
else:
pass
except Exception as e:
print('Something went wrong\n')
print(e.args)
elif choosequality == 2:
try:
                    destinationinput = input("\nEnter path for the file to save (e.g. E:\\Movies), press enter for default directory:\n>> ")
if destinationinput == '':
if not os.path.exists('downloads'):
os.makedirs('downloads')
currentdirectory = os.getcwd()
                        destination = os.path.join(currentdirectory, 'downloads')
else:
destination = destinationinput
                    print('Playlist found.\nName: ', p.title)
proceed = input('Proceed? (y/n)\n>> ').lower()
if proceed == 'y':
for video in p.videos:
                            # fetch the stream once: the YouTube object has no filesize, its stream does
                            stream = video.streams.filter(res='480p').first()
                            filesize = format(int(stream.filesize)/1000000, ".2f") + " MB"
                            stream.download(destination)
                            print(video.title + " has been successfully downloaded with the size of ", filesize)
else:
pass
except Exception as e:
print('Something went wrong\n')
print(e.args)
else:
try:
                    destinationinput = input("\nEnter path for the file to save (e.g. E:\\Movies), press enter for default directory:\n>> ")
if destinationinput == '':
if not os.path.exists('downloads'):
os.makedirs('downloads')
currentdirectory = os.getcwd()
                        destination = os.path.join(currentdirectory, 'downloads')
else:
destination = destinationinput
                    print('Playlist found.\nName: ', p.title)
proceed = input('Proceed? (y/n)\n>> ').lower()
if proceed == 'y':
for video in p.videos:
                            # fetch the 360p stream once: the YouTube object has no filesize, its stream does
                            stream = video.streams.filter(res='360p').first()
                            filesize = format(int(stream.filesize)/1000000, ".2f") + " MB"
                            stream.download(destination)
                            print(video.title + " has been successfully downloaded with the size of ", filesize)
else:
pass
except Exception as e:
print('Something went wrong\n')
print(e.args)
else:
try:
                destinationinput = input("\nEnter path for the file to save (e.g. E:\\Movies), press enter for default directory:\n>> ")
if destinationinput == '':
if not os.path.exists('downloads'):
os.makedirs('downloads')
currentdirectory = os.getcwd()
                    destination = os.path.join(currentdirectory, 'downloads')
else:
destination = destinationinput
                print('Playlist found.\nName: ', p.title)
proceed = input('Proceed? (y/n)\n>> ').lower()
if proceed == 'y':
for video in p.videos:
videos = video.streams.filter(only_audio=True).first()
filesize = format(int(videos.filesize)/1000000, ".2f") + " MB"
out_file = videos.download(output_path=destination)
base, ext = os.path.splitext(out_file)
new_file = base + '.mp3'
os.rename(out_file, new_file)
print(videos.title + " has been successfully downloaded with the size of " , filesize)
except Exception as e:
print('Something went wrong\n')
print(e.args)
print('\nTask Completed!')
except Exception as e:
print('Something went wrong\n')
print(e.args)
download()
k = input('Press <Enter> key to exit.')
| 48.398438 | 140 | 0.465214 | 557 | 6,195 | 5.127469 | 0.219031 | 0.022409 | 0.029762 | 0.031513 | 0.739496 | 0.739496 | 0.739496 | 0.739496 | 0.739496 | 0.739496 | 0 | 0.020519 | 0.433575 | 6,195 | 127 | 141 | 48.779528 | 0.793388 | 0 | 0 | 0.75 | 0 | 0.053571 | 0.229311 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008929 | false | 0.026786 | 0.017857 | 0 | 0.026786 | 0.1875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
57858020cbd5000e8cd268ae1a2abc15fd688c0c | 24 | py | Python | catbars/__init__.py | ConstantinLenoir/catbars | de03ca9fdca0b8d0fc24929537639a55eff3c711 | [
"MIT"
] | 1 | 2020-03-25T20:23:37.000Z | 2020-03-25T20:23:37.000Z | catbars/__init__.py | ConstantinLenoir/catbars | de03ca9fdca0b8d0fc24929537639a55eff3c711 | [
"MIT"
] | null | null | null | catbars/__init__.py | ConstantinLenoir/catbars | de03ca9fdca0b8d0fc24929537639a55eff3c711 | [
"MIT"
] | null | null | null |
from .bars import Bars
| 8 | 22 | 0.75 | 4 | 24 | 4.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 24 | 2 | 23 | 12 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
17e4a840e32dccad6d96ea2c54579ae0d30241a2 | 51 | py | Python | tests/unit_tests/rf_client_test_resources/primary_dependencies/secondary_dependencies/tertiary_dependencies/Lib2.py | adiroiban/robotframework-remoterunner | 2815672823872c6e5e014131bc0e7f622e9a986e | [
"MIT"
] | null | null | null | tests/unit_tests/rf_client_test_resources/primary_dependencies/secondary_dependencies/tertiary_dependencies/Lib2.py | adiroiban/robotframework-remoterunner | 2815672823872c6e5e014131bc0e7f622e9a986e | [
"MIT"
] | null | null | null | tests/unit_tests/rf_client_test_resources/primary_dependencies/secondary_dependencies/tertiary_dependencies/Lib2.py | adiroiban/robotframework-remoterunner | 2815672823872c6e5e014131bc0e7f622e9a986e | [
"MIT"
] | null | null | null |
def Keyword_34543534():
print('Keyword 1') | 12.75 | 24 | 0.627451 | 6 | 51 | 5.166667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 0.235294 | 51 | 4 | 25 | 12.75 | 0.564103 | 0 | 0 | 0 | 0 | 0 | 0.191489 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
aa0884ebce4cd0ed3594a121f8cb2fa7003e4dbc | 13,096 | py | Python | register/tests/test_translations.py | SmartElect/SmartElect | d6d35f2fa8f60e756ad5247f8f0a5f05830e92f8 | [
"Apache-2.0"
] | 23 | 2015-10-28T14:08:23.000Z | 2021-09-11T21:38:41.000Z | register/tests/test_translations.py | SmartElect/SmartElect | d6d35f2fa8f60e756ad5247f8f0a5f05830e92f8 | [
"Apache-2.0"
] | 4 | 2019-12-05T20:36:10.000Z | 2020-06-05T18:41:54.000Z | register/tests/test_translations.py | SmartElect/SmartElect | d6d35f2fa8f60e756ad5247f8f0a5f05830e92f8 | [
"Apache-2.0"
] | 11 | 2015-10-28T15:49:56.000Z | 2021-09-14T14:18:36.000Z | # -*- coding: utf-8 -*-
import datetime
from unittest.mock import patch
from django.conf import settings
from django.test.utils import override_settings
from civil_registry.tests.factories import CitizenFactory
from libya_elections import constants
from libya_elections.utils import get_random_number_string
from libya_elections.phone_numbers import get_random_phone_number
from polling_reports.models import StaffPhone
from register.tests.factories import RegistrationFactory, RegistrationCenterFactory, \
SMSFactory
from voting.tests.factories import RegistrationPeriodFactory
from .. import utils
from ..tests.base import LibyaRapidTest, TranslationTest, FUTURE_DAY, PAST_DAY
@override_settings(OUTGOING_MESSAGE_LANGUAGE='ar',
LANGUAGE_CODE='en')
@patch.object(utils, "tool_1_enabled")
class ResponseTest(TranslationTest, LibyaRapidTest):
# Test response messages (Arabic)
def setUp(self):
self.number = "919-999-9999"
self.center = RegistrationCenterFactory()
self.conn = self.create_connection(data={'identity': self.number})
self.citizen = CitizenFactory()
self.good_nid = self.citizen.national_id
self.bad_nid = get_random_number_string(length=constants.NID_LENGTH)
self.short_nid = get_random_number_string(length=constants.NID_LENGTH - 1)
self.good_center_id = self.center.center_id
self.bad_center_id = get_random_number_string(length=constants.CENTER_ID_LENGTH)
self.long_center_id = get_random_number_string(length=constants.CENTER_ID_LENGTH + 1)
self.fields = {'to_addr': settings.REGISTRATION_SHORT_CODE}
RegistrationPeriodFactory(start_time=PAST_DAY, end_time=FUTURE_DAY)
def test_garbage(self, registration_open):
self.receive("PING", self.conn, fields=self.fields)
expected = self.translate(constants.MESSAGE_INCORRECT) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_garbage_enhanced(self, registration_open):
for i in range(1, 5):
# last iteration should get an enhanced message
self.receive("PING", self.conn, fields=self.fields)
expected = self.translate(constants.MESSAGE_INCORRECT, enhanced=True) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_wrong_length_nid(self, registration_open):
msg = "{nid}#{center}".format(nid=self.short_nid, center=self.good_center_id)
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_NID_WRONG_LENGTH) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_wrong_length_nid_enhanced(self, registration_open):
msg = "{nid}#{center}".format(nid=self.short_nid, center=self.good_center_id)
for i in range(1, 5):
# last iteration should get an enhanced message
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_NID_WRONG_LENGTH, enhanced=True) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_wrong_length_center_id(self, registration_open):
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.long_center_id)
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_CENTER_ID_WRONG_LENGTH) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_wrong_length_center_id_enhanced(self, registration_open):
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.long_center_id)
for i in range(1, 5):
# last iteration should get an enhanced message
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_CENTER_ID_WRONG_LENGTH, enhanced=True)
self.assertEqual(self.get_last_response_message(), expected)
def test_wrong_length_nid_query(self, registration_open):
msg = "{nid}".format(nid=self.short_nid)
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.VOTER_QUERY_NID_WRONG_LENGTH)
self.assertEqual(self.get_last_response_message(), expected)
def test_citizen_under_18(self, registration_open):
self.citizen.birth_date = datetime.datetime.today()
self.citizen.save()
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.good_center_id)
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_NID_INVALID) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_center_does_not_exist(self, registration_open):
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.bad_center_id)
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_CENTER_ID_INVALID) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_nid_does_not_exist(self, registration_open):
msg = "{nid}#{center}".format(nid=self.bad_nid, center=self.good_center_id)
self.receive(msg, self.conn, fields=self.fields)
expected = self.translate(constants.RESPONSE_NID_INVALID) # arabic
self.assertEqual(self.get_last_response_message(), expected)
@override_settings(MAX_REGISTRATIONS_PER_PHONE=5)
def test_good_registration(self, registration_open):
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.good_center_id)
self.receive(msg, self.conn, fields=self.fields)
context = {'person': str(self.citizen), 'centre': self.center.name,
'code': self.center.center_id}
expected = self.translate(constants.MESSAGE_1, context=context) # arabic
self.assertEqual(self.get_last_response_message(), expected)
@override_settings(MAX_REGISTRATIONS_PER_PHONE=5)
def test_good_registration_enhanced(self, registration_open):
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.good_center_id)
for i in range(1, 5):
# last iteration should get an enhanced message
self.receive(msg, self.conn, fields=self.fields)
context = {'person': str(self.citizen), 'centre': self.center.name,
'code': self.center.center_id}
expected = self.translate(constants.MESSAGE_1, context=context, enhanced=True) # arabic
self.assertEqual(self.get_last_response_code(), constants.MESSAGE_1)
self.assertEqual(self.get_last_response_message(), expected)
def test_good_update(self, registration_open):
new_center = RegistrationCenterFactory()
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.good_center_id)
self.receive(msg, self.conn, fields=self.fields) # registers
msg = "{nid}#{center}".format(nid=self.good_nid, center=new_center.center_id)
self.receive(msg, self.conn, fields=self.fields) # updates
context = {'person': str(self.citizen), 'centre': new_center.name,
'code': new_center.center_id}
# 1st update - message 1
expected = self.translate(constants.MESSAGE_1, context=context) # arabic
self.assertEqual(self.get_last_response_message(), expected)
# 2nd update - message 4
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.good_center_id)
self.receive(msg, self.conn, fields=self.fields) # updates again
context = {'person': str(self.citizen), 'centre': new_center.name,
'code': self.good_center_id}
expected = self.translate(constants.MESSAGE_4, context=context) # arabic
# 3rd and final update - message 5
msg = "{nid}#{center}".format(nid=self.good_nid, center=new_center.center_id)
self.receive(msg, self.conn, fields=self.fields) # updates
context = {'person': str(self.citizen), 'centre': new_center.name,
'code': new_center.center_id}
expected = self.translate(constants.MESSAGE_5, context=context) # arabic
def test_attempt_update_wrong_from_number(self, registration_open):
# create a valid registration
sms = SMSFactory(from_number=self.number, citizen=self.citizen)
RegistrationFactory(
citizen=self.citizen,
registration_center=self.center,
archive_time=None,
sms=sms)
# try to register at a new center with a new number
new_center = RegistrationCenterFactory()
new_number = '919-888-8888'
msg = "{nid}#{center}".format(nid=self.good_nid, center=new_center.center_id)
new_conn = self.create_connection(data={'identity': new_number})
self.receive(msg, new_conn, fields=self.fields)
# message should have the existing number in it (not new_number)
context = {'centre': self.center.name,
'number': self.number[-4:]}
expected = self.translate(constants.MESSAGE_2, context=context) # arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_attempt_update_wrong_from_number_same_center(self, registration_open):
# create a valid registration
sms = SMSFactory(from_number=self.number, citizen=self.citizen)
RegistrationFactory(
citizen=self.citizen,
registration_center=self.center,
archive_time=None,
sms=sms)
# try to register at same center with a new number
new_number = '919-888-8888'
msg = "{nid}#{center}".format(nid=self.good_nid, center=self.center.center_id)
new_conn = self.create_connection(data={'identity': new_number})
self.receive(msg, new_conn, fields=self.fields)
# message should have the existing number in it (not new_number)
context = {'centre': self.center.name,
'number': self.number[-4:]}
expected = self.translate(constants.MESSAGE_2, context=context) # arabic
self.assertEqual(self.get_last_response_message(), expected)
@override_settings(OUTGOING_MESSAGE_LANGUAGE='ar')
@override_settings(LANGUAGE_CODE='en')
class ResponseVoterQueryTest(TranslationTest, LibyaRapidTest):
def setUp(self):
self.number = get_random_phone_number()
self.center = RegistrationCenterFactory()
self.citizen = CitizenFactory()
self.staffphone = StaffPhone.objects.create(phone_number=self.number,
registration_center=self.center)
self.conn = self.create_connection(data={'identity': self.number})
self.good_nid = self.citizen.national_id
self.bad_nid = get_random_number_string(length=constants.NID_LENGTH)
self.short_nid = get_random_number_string(length=constants.NID_LENGTH - 1)
self.fields = {'to_addr': settings.REGISTRATION_SHORT_CODE}
def test_wrong_length_nid(self):
msg = "{nid}".format(nid=self.short_nid)
self.receive(msg, self.conn, fields=self.fields)
self.assertEqual(self.get_last_response_code(), constants.VOTER_QUERY_NID_WRONG_LENGTH)
expected = self.translate(constants.VOTER_QUERY_NID_WRONG_LENGTH) # Arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_citizen_registered(self):
# citizen has been registered
RegistrationFactory(citizen=self.citizen, registration_center=self.center,
archive_time=None)
# let's query for the registration
msg = "{nid}".format(nid=self.good_nid)
self.receive(msg, self.conn, fields=self.fields)
self.assertEqual(self.get_last_response_code(), constants.VOTER_QUERY_REGISTERED_AT)
context = {"person": str(self.citizen), "centre": self.center.name,
"code": self.center.center_id}
expected = self.translate(constants.VOTER_QUERY_REGISTERED_AT, context) # Arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_citizen_not_registered(self):
# let's query for the registration
citizen2 = CitizenFactory() # unregistered
msg = "{nid}".format(nid=citizen2.national_id)
self.receive(msg, self.conn, fields=self.fields)
self.assertEqual(self.get_last_response_code(), constants.VOTER_QUERY_NOT_REGISTERED)
context = {'person': str(citizen2)}
expected = self.translate(constants.VOTER_QUERY_NOT_REGISTERED, context) # Arabic
self.assertEqual(self.get_last_response_message(), expected)
def test_nlid_does_not_exist(self):
msg = "{nid}".format(nid=self.bad_nid)
self.receive(msg, self.conn, fields=self.fields)
self.assertEqual(self.get_last_response_code(), constants.VOTER_QUERY_NOT_FOUND)
expected = self.translate(constants.VOTER_QUERY_NOT_FOUND) # Arabic
self.assertEqual(self.get_last_response_message(), expected)
| 53.453061 | 96 | 0.701665 | 1,628 | 13,096 | 5.405405 | 0.103808 | 0.029091 | 0.051818 | 0.06 | 0.810682 | 0.797159 | 0.764659 | 0.742614 | 0.736932 | 0.717273 | 0 | 0.006241 | 0.192425 | 13,096 | 244 | 97 | 53.672131 | 0.825832 | 0.066967 | 0 | 0.588832 | 0 | 0 | 0.038847 | 0 | 0 | 0 | 0 | 0 | 0.121827 | 1 | 0.106599 | false | 0 | 0.06599 | 0 | 0.182741 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
aa303fe9e82d5b91e080a3ff4a02dc376ead3fc4 | 756 | py | Python | pymontecarlo/options/model/__init__.py | pymontecarlo/pymontecarlo | 87050041724feb17f1ccff5794e9830c3209244e | [
"Apache-2.0"
] | 5 | 2018-04-10T07:15:06.000Z | 2021-07-01T15:40:29.000Z | pymontecarlo/options/model/__init__.py | pymontecarlo/pymontecarlo | 87050041724feb17f1ccff5794e9830c3209244e | [
"Apache-2.0"
] | 73 | 2015-09-04T09:48:29.000Z | 2022-01-03T17:49:01.000Z | pymontecarlo/options/model/__init__.py | pymontecarlo/pymontecarlo | 87050041724feb17f1ccff5794e9830c3209244e | [
"Apache-2.0"
] | 4 | 2016-05-17T12:57:20.000Z | 2021-01-31T10:55:24.000Z | """
Models.
"""
from pymontecarlo.options.model.base import *
from pymontecarlo.options.model.bremsstrahlung_emission import *
from pymontecarlo.options.model.direction_cosine import *
from pymontecarlo.options.model.elastic_cross_section import *
from pymontecarlo.options.model.energy_loss import *
from pymontecarlo.options.model.fluorescence import *
from pymontecarlo.options.model.inelastic_cross_section import *
from pymontecarlo.options.model.ionization_cross_section import *
from pymontecarlo.options.model.ionization_potential import *
from pymontecarlo.options.model.mass_absorption_coefficient import *
from pymontecarlo.options.model.photon_scattering_cross_section import *
from pymontecarlo.options.model.random_number_generator import *
| 44.470588 | 72 | 0.858466 | 90 | 756 | 7.022222 | 0.311111 | 0.303797 | 0.436709 | 0.531646 | 0.699367 | 0.322785 | 0.322785 | 0.177215 | 0 | 0 | 0 | 0 | 0.068783 | 756 | 16 | 73 | 47.25 | 0.897727 | 0.009259 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a4bc4533b229603f4b09e164bb0cb8232bc01688 | 55 | py | Python | involution_pytorch/__init__.py | rish-16/involution-pytorch | 90766013ee1a74fc7fd7db0822a61e2c338754b1 | [
"MIT"
] | 1 | 2021-07-06T21:07:44.000Z | 2021-07-06T21:07:44.000Z | involution_pytorch/__init__.py | rish-16/involution-pytorch | 90766013ee1a74fc7fd7db0822a61e2c338754b1 | [
"MIT"
] | null | null | null | involution_pytorch/__init__.py | rish-16/involution-pytorch | 90766013ee1a74fc7fd7db0822a61e2c338754b1 | [
"MIT"
] | null | null | null | from involution_pytorch.involution_pytorch import Inv2d | 55 | 55 | 0.927273 | 7 | 55 | 7 | 0.714286 | 0.693878 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019231 | 0.054545 | 55 | 1 | 55 | 55 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3532ab40a1d326d0caed24b9c91a549ff872bbdd | 546 | py | Python | opennmt/layers/__init__.py | mfomicheva/OpenNMT-tf | a367676a16f9e77f76bc58e138e78614eb4add1e | [
"MIT"
] | 4 | 2020-06-21T13:56:27.000Z | 2021-05-07T06:03:35.000Z | opennmt/layers/__init__.py | mfomicheva/OpenNMT-tf | a367676a16f9e77f76bc58e138e78614eb4add1e | [
"MIT"
] | 1 | 2020-06-22T23:38:33.000Z | 2020-06-23T02:06:45.000Z | opennmt/layers/__init__.py | mfomicheva/OpenNMT-tf | a367676a16f9e77f76bc58e138e78614eb4add1e | [
"MIT"
] | 2 | 2021-04-15T08:51:30.000Z | 2022-03-08T07:44:32.000Z | """Module defining reusable and model specific layers."""
from opennmt.layers.common import Dense
from opennmt.layers.reducer import SumReducer
from opennmt.layers.reducer import MultiplyReducer
from opennmt.layers.reducer import ConcatReducer
from opennmt.layers.reducer import JoinReducer
from opennmt.layers.bridge import CopyBridge
from opennmt.layers.bridge import ZeroBridge
from opennmt.layers.bridge import DenseBridge
from opennmt.layers.position import PositionEmbedder
from opennmt.layers.position import SinusoidalPositionEncoder
| 34.125 | 61 | 0.857143 | 67 | 546 | 6.985075 | 0.358209 | 0.235043 | 0.363248 | 0.205128 | 0.574786 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093407 | 546 | 15 | 62 | 36.4 | 0.945455 | 0.093407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
The top public SQL queries from the community will appear here once available.